CN110880188B - Calibration method, calibration device and calibration system for near-eye display optical system - Google Patents

Info

Publication number: CN110880188B
Application number: CN201811036951.9A
Authority: CN (China)
Prior art keywords: image, optical system, source, test target, test
Legal status: Active (application granted)
Other versions: CN110880188A (application publication)
Inventors: 陈强元; 冉成荣; 孙杰; 陈远; 胡增新
Current assignee: Sunny Optical Zhejiang Research Institute Co Ltd
Original assignee: Sunny Optical Zhejiang Research Institute Co Ltd
Application filed by Sunny Optical Zhejiang Research Institute Co Ltd
Priority: CN201811036951.9A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01: Head-up displays
    • G02B 27/017: Head mounted
    • G02B 2027/0178: Eyeglass type

Abstract

A calibration method, a calibration device and a calibration system for a near-eye display optical system are provided. The calibration method comprises the following steps: acquiring an image of a test image formed on a display unit by an image source; obtaining, based on the image of the test image, the position correspondence between the position of the source image of the test image in the image source and the imaging surface of a detection camera; processing an image of a test target acquired by the detection camera based on this correspondence, to obtain position information of the position of the source image corresponding to the test target image in the image source relative to the imaging surface of the detection camera; and processing, with a virtual imaging model, this position information together with the position information of the test target relative to the near-eye display optical system, to obtain the virtual image imaging parameters of the near-eye display optical system. Automatic calibration is thereby realized.

Description

Calibration method, calibration device and calibration system for near-eye display optical system
Technical Field
The invention relates to the field of near-eye display optical systems, in particular to a calibration method, a calibration device and a calibration system for a near-eye display optical system.
Background
In recent years, near-eye display optical devices such as Virtual Reality (VR) and Augmented Reality (AR) devices have created unique sensory experiences for users. Before such products are put into service, the performance parameters of the many sensors on the device, and the relative positions among those sensors, must be calibrated to ensure that the user has a good visual experience.
Take an augmented-reality head-mounted display device of the kind commonly available on the market as an example. Owing to manufacturing errors and other factors, the performance parameters of the device's sensors and the positional relationships between those sensors often deviate from their design values. As a result, fusion between the virtual image and the physical environment suffers from many problems, such as deviations in the size of the virtual image and inaccuracies in its displayed position, which give the user a poor visual experience.
Therefore, before the head-mounted display device goes to market, its relevant parameters need to be calibrated. Many calibration methods exist for AR head-mounted display devices, but they typically calibrate only a single parameter at a time, so calibration efficiency is low. Some methods even require manual assistance, which raises cost and hinders the development of calibration toward industrial automation.
Therefore, there is an urgent need for a highly integrated and automated calibration method, calibration apparatus and calibration system for a near-eye display optical system.
Disclosure of Invention
The invention mainly aims to provide a calibration method, a calibration device and a calibration system for a near-eye display optical system, wherein the calibration system can realize full-automatic calibration of the near-eye display optical system.
Another object of the present invention is to provide a calibration method, a calibration apparatus and a calibration system for a near-eye display optical system, wherein the calibration system can calibrate various parameters of the near-eye display optical system. In other words, the calibration system provided by the invention has relatively high integration level.
Another objective of the present invention is to provide a calibration method, a calibration device and a calibration system for a near-eye display optical system, wherein no manual step is required in the calibration process of the near-eye display optical system, so that, on the one hand, the calibration process can be fully automated and calibration efficiency improved; on the other hand, the stability and accuracy of the calibration result are enhanced.
Another object of the present invention is to provide a calibration method, a calibration apparatus and a calibration system for a near-eye display optical system, wherein by the calibration system, various parameters of the near-eye display optical system can be integrally calibrated rather than individually calibrated, in such a way, error accumulation caused during individual calibration of various parameters can be effectively avoided.
Another objective of the present invention is to provide a calibration method, a calibration device and a calibration system for a near-eye display optical system, wherein various parameters of the near-eye display optical system are correlated with each other, and the calibration system is used to integrally calibrate various parameters of the near-eye display optical system, so that the relationship among various parameters of the near-eye display optical system can be effectively ensured to be consistent, and the calibration precision is improved.
Other advantages and features of the invention will become apparent from the following description and may be realized by means of the instrumentalities and combinations particularly pointed out in the appended claims.
To achieve at least one of the above objects or advantages, the present invention provides a calibration method for a near-eye display optical system, including:
acquiring an image of a test image when an entrance pupil surface of a detection camera is aligned with an exit pupil surface of a near-eye display optical system, wherein the test image is an image formed on a display unit of the near-eye display optical system through an image source of the near-eye display optical system, and the image source has a source image of the test image;
based on the image of the test image, obtaining the position corresponding relation between the position of the source image of the test image in the image source and the imaging surface of the detection camera;
processing an image of a test target acquired by the detection camera based on a position correspondence between a position of a source image of the test image in the image source and an imaging plane of the detection camera to obtain position information of a position of the source image of the test target corresponding to the image of the test target in the image source relative to the imaging plane of the detection camera;
obtaining position information of the test target relative to the near-eye display optical system; and
processing position information of a test target source image corresponding to the test target image in the image source on an imaging surface of a detection camera and position information of the test target relative to the near-eye display optical system by using a virtual imaging model to obtain virtual image imaging parameters of the near-eye display optical system, wherein the virtual image imaging parameters refer to position parameter information of an image formed by the image source on the display unit.
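The patent does not tie the position correspondence in the steps above to any particular model. As one hedged, illustrative sketch (the planar-mapping assumption and all function names are ours, not the patent's), the mapping between source-image pixel positions and the detection camera's imaging plane can be estimated as a homography from matched test-pattern points:

```python
import numpy as np

def fit_homography(src, dst):
    """Direct linear transform: find H (3x3) with dst ~ H @ src in
    homogeneous coordinates.  src, dst: Nx2 matched points, N >= 4."""
    rows = []
    for (x, y), (u, v) in zip(np.asarray(src, float), np.asarray(dst, float)):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of the stacked constraint matrix.
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_homography(H, pts):
    """Map Nx2 source-image pixel positions onto the camera imaging plane."""
    pts = np.asarray(pts, float)
    ph = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return ph[:, :2] / ph[:, 2:3]
```

In practice a robust estimator (e.g. OpenCV's `cv2.findHomography` with RANSAC) would replace the bare DLT so that outlier feature matches are rejected.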
In an embodiment of the present invention, processing an image of a test target collected by a detection camera based on a position correspondence between a position of a source image of the test image in the image source and an imaging plane of the detection camera to obtain position information of a position of the source image of the test target corresponding to the image of the test target in the image source relative to the imaging plane of the detection camera includes:
extracting feature points in the test target image; and
based on the position correspondence between the position of the source image of the test image in the image source and the imaging surface of the detection camera, obtaining position information of the positions, in the image source, of the feature points of the test target image relative to the imaging surface of the detection camera, thereby obtaining the position information of the position of the test target source image corresponding to the test target image in the image source relative to the imaging surface of the detection camera.
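The patent does not prescribe a particular feature extractor for the step above. As one hedged example (the 3x3-window choice and the function name are ours), an integer feature location found in the test target image can be refined to subpixel accuracy with a simple separable parabolic fit:

```python
import numpy as np

def refine_peak(img, x, y):
    """Refine an integer feature location (x, y) to subpixel accuracy by
    fitting a 1-D parabola through the 3x3 neighbourhood, separately in
    x and y, and taking the parabola vertex as the offset."""
    win = np.asarray(img, float)[y - 1:y + 2, x - 1:x + 2]

    def vertex(a, b, c):
        # Vertex of the parabola through (-1, a), (0, b), (1, c).
        d = a - 2.0 * b + c
        return 0.0 if d == 0 else 0.5 * (a - c) / d

    dx = vertex(win[1, 0], win[1, 1], win[1, 2])
    dy = vertex(win[0, 1], win[1, 1], win[2, 1])
    return x + dx, y + dy
```

Subpixel localization matters here because the position correspondence is solved from these feature coordinates, so any extraction error propagates directly into the calibrated parameters.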
In an embodiment of the present invention, obtaining the position information of the test target relative to the near-eye display optical system includes: obtaining position information of the test target relative to the near-eye display optical system based on an internal parameter and an external parameter of a tracking camera of the near-eye display optical system.
In an embodiment of the present invention, before obtaining the position information of the test target relative to the near-eye display optical system based on the intrinsic parameters and the extrinsic parameters of the tracking camera, the method includes: processing the images of the test target acquired by the tracking camera to obtain the intrinsic and extrinsic parameters of the tracking camera.
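Solving a tracking camera's intrinsic and extrinsic parameters from images of a test target is classical camera calibration. As a sketch of only the extrinsic part (assuming a planar target and already-known intrinsics K; these assumptions and the use of Zhang's decomposition are ours, not something the patent specifies), the target pose can be recovered from the target-plane-to-image homography:

```python
import numpy as np

def pose_from_homography(K, H):
    """Recover tracking-camera extrinsics (R, t) for a planar test target
    from intrinsics K and the target-plane-to-image homography H, using
    the decomposition H ~ K [r1 r2 t] (Zhang's calibration method)."""
    B = np.linalg.inv(K) @ H
    lam = 1.0 / np.linalg.norm(B[:, 0])   # scale fixed by ||r1|| = 1
    r1, r2, t = lam * B[:, 0], lam * B[:, 1], lam * B[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    # With noisy data R is only approximately a rotation; project it back
    # onto SO(3) with an SVD.
    U, _, Vt = np.linalg.svd(R)
    return U @ Vt, t
```

A full pipeline (e.g. OpenCV's `cv2.calibrateCamera`) would additionally estimate K itself and lens distortion from several target views; the sketch above covers only the per-view pose.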
In an embodiment of the present invention, the calibration method further includes: and acquiring the position information of the entrance pupil surface of the detection camera relative to the display unit by using an eyeball tracking sensor of the near-eye display optical system.
In an embodiment of the present invention, the calibration method further includes: and outputting the position information of the entrance pupil surface of the detection camera relative to the display unit, the internal parameters and the external parameters of the tracking camera, and the virtual image imaging parameters.
In an embodiment of the present invention, the near-eye display optical system is an augmented reality near-eye display optical system.
According to another aspect of the present invention, there is also provided a calibration apparatus for a near-eye display optical system, including:
a test image acquisition module, configured to acquire an image of a test image when the entrance pupil surface of a detection camera is aligned with the exit pupil surface of the near-eye display optical system, wherein the test image is an image formed on a display unit of the near-eye display optical system by an image source of the near-eye display optical system, and the image source has a source image of the test image;
a registration module, configured to obtain, based on the image of the test image, the position correspondence between the position of the source image of the test image in the image source and the imaging surface of the detection camera;
a virtual image surface position solving module, configured to process an image of a test target acquired by the detection camera based on a position correspondence relationship between a position of a source image of the test image in the image source and an imaging surface of the detection camera, so as to obtain position information of a position of the source image of the test target corresponding to the test target image in the image source relative to the imaging surface of the detection camera;
the target position solving module is used for obtaining the position information of the test target relative to the near-eye display optical system; and
the virtual image imaging parameter acquiring module is configured to process, by using a virtual imaging model, position information of a position of the test target source image corresponding to the test target image in the image source relative to an imaging surface of the detection camera and position information of the test target relative to the near-eye display optical system, so as to obtain virtual image imaging parameters of the near-eye display optical system, where the virtual image imaging parameters refer to position parameter information of an image of the image source formed in the display unit.
In an embodiment of the present invention, the virtual image plane position solving module is configured to: extracting feature points in the test target image; and obtaining the position information of the position of the feature point in the test target image in the image source relative to the imaging plane of the detection camera based on the position corresponding relation between the position of the source image of the test image in the image source and the imaging plane of the detection camera, so as to obtain the position information of the position of the source image of the test target corresponding to the test target image in the image source relative to the imaging plane of the detection camera.
In an embodiment of the invention, the target position solving module is configured to obtain the position information of the test target relative to the near-eye display optical system based on an internal parameter and an external parameter of a tracking camera of the near-eye display optical system.
In an embodiment of the invention, the target position solving module is further configured to process the image of the test target acquired by the tracking camera to obtain the internal parameter and the external parameter of the tracking camera.
In an embodiment of the present invention, the calibration apparatus further includes a pupil distance information obtaining module, configured to: and acquiring the position information of the entrance pupil surface of the detection camera relative to the display unit by using an eyeball tracking sensor of the near-eye display optical system.
In an embodiment of the present invention, the calibration apparatus further includes an output module, configured to: and outputting the position information of the entrance pupil surface of the detection camera relative to the display unit, the internal parameters and the external parameters of the tracking camera, and the virtual image imaging parameters.
In an embodiment of the present invention, the near-eye display optical system is an augmented reality near-eye display optical system.
According to another aspect of the present invention, there is also provided a calibration system for a near-eye display optical system, comprising:
a test target;
a detection camera for acquiring a test image projected by the near-eye display optical system and an image of the test target;
a motion platform for driving the entrance pupil surface of the detection camera into alignment with the exit pupil surface of the near-eye display optical system; and
calibration apparatus, wherein the calibration apparatus comprises:
a processor; and
a memory, wherein computer program instructions are stored in the memory, which computer program instructions, when executed by the processor, cause the processor to perform the calibration method as described above.
According to another aspect of the present invention, there is also provided a computer readable storage medium having stored thereon computer program instructions operable, when executed by a computing device, to perform the calibration method as described above.
Further objects and advantages of the invention will be fully apparent from the ensuing description and drawings.
These and other objects, features and advantages of the present invention will become more fully apparent from the following detailed description, the accompanying drawings and the claims.
Drawings
FIG. 1 illustrates a flow chart of a calibration method for a near-eye display optical system according to a preferred embodiment of the invention.
Fig. 2 is a perspective view illustrating the AR head-mounted display device calibrated in the calibration method according to the preferred embodiment of the invention.
Fig. 3 is a perspective view illustrating a visual effect of augmented reality of the AR head-mounted display device calibrated in the calibration method according to the preferred embodiment of the invention.
Fig. 4 illustrates a block diagram of a calibration apparatus for a near-eye display optical system according to a preferred embodiment of the present invention.
FIG. 5 is a perspective view of a calibration system for a near-eye display optical system according to a preferred embodiment of the invention.
Detailed Description
The following description is presented to disclose the invention so as to enable any person skilled in the art to practice the invention. The preferred embodiments in the following description are given by way of example only, and other obvious variations will occur to those skilled in the art. The basic principles of the invention, as defined in the following description, may be applied to other embodiments, variations, modifications, equivalents, and other technical solutions without departing from the spirit and scope of the invention.
It will be understood by those skilled in the art that in the present disclosure, the terms "longitudinal," "lateral," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like are used in an orientation or positional relationship indicated in the drawings for ease of description and simplicity of description, and do not indicate or imply that the referenced devices or components must be constructed and operated in a particular orientation and thus are not to be considered limiting.
It is understood that the terms "a" and "an" should be interpreted as meaning "at least one" or "one or more," i.e., that a quantity of one element may be one in one embodiment, while a quantity of another element may be plural in other embodiments, and the terms "a" and "an" should not be interpreted as limiting the quantity.
Summary of the application
As described above, due to errors in device manufacturing processes and the like, before a near-eye display optical device (e.g., an AR head-mounted display device) is put into service, performance parameters of sensors and a positional relationship between the sensors need to be calibrated to ensure that a user has a good visual experience.
Existing calibration methods for near-eye display optical systems have a low level of integration and typically calibrate each parameter individually. Taking the AR head-mounted display device as an example, the parameters usually calibrated include: the internal and external parameters of the tracking camera, the imaging parameters of the virtual image, the pupil distance parameter, and so on.
In the existing calibration process of the AR head-mounted display device, firstly, internal and external parameters of a tracking camera of the AR head-mounted display device need to be calibrated in an off-line manner, that is, the tracking camera is used to shoot an image of a test target in an off-line manner to solve the internal and external parameters. After obtaining the internal and external parameters of the tracking camera, further calibrating the imaging parameters of the virtual image of the head-mounted display device. The process is as follows: firstly, moving the head-mounted display device or the test target to enable a virtual image of the test target on a display unit of the head-mounted display device, a test target real object and a human eye pupil to be in the same straight line so as to obtain a group of matching point pair information (the test target real object, the virtual image of the test target on the display unit of the head-mounted display device and the human eye pupil); and repeatedly acquiring multiple groups of matching point pair information and substituting the matching point pair information into the related virtual imaging model to obtain the imaging parameters of the virtual image of the head-mounted display equipment.
Another common practice is: fixing a detection camera and the head-mounted display device so as to observe the relative position relationship between the virtual image of the test target on the display unit of the head-mounted display device and the test target object by using the detection camera, and enabling the virtual image of the test target on the display unit of the head-mounted display device to coincide with the test target object by moving the test target object.
Both of these approaches have a number of drawbacks. First, each requires a complicated and precise adjustment to confirm that the virtual image of the test target on the display unit of the head-mounted display device coincides with (or is collinear with) the physical test target. As those skilled in the art will appreciate, judging whether two objects in three-dimensional space are coincident or collinear is difficult and prone to deviation. In other words, neither approach can guarantee the accuracy of the virtual image imaging parameters of the head-mounted display device.
Second, both methods require the operator to sequentially input multiple sets of matching point pair information used to measure the virtual image imaging parameters of the head-mounted display device. On the one hand, having an operator record and enter the relevant data adds extra cost; on the other hand, the operator will inevitably make mistakes while recording and entering the data, leading to erroneous virtual image imaging parameter calibration results.
In the first method, the operator must visually judge whether the virtual image of the test target on the display unit of the head-mounted display device, the physical test target and the pupil of the eye lie on one straight line, and then record the matching point pair information. Because an operator cannot keep the head still for long, in practice only one set of matching point pair information is usually obtained at a time; to gather enough data for the solution, the operator must move to several positions to collect multiple sets. These operations are tedious and error-prone, leading not only to low calibration efficiency but also to unstable calibration results. As those skilled in the art will appreciate, AR head-mounted display devices are manufactured in large batches on an actual production line, and such a manual calibration procedure clearly cannot meet the demands of large-batch calibration.
In addition, various calibration parameters of the head-mounted display device are often correlated. Therefore, if a detection deviation occurs in one parameter during the process of calibrating each parameter individually, or when the head-mounted display device is moved to calibrate another parameter, the relative position between the sensors of the head-mounted display device changes, and these factors will cause the accumulation of errors, and the accuracy of the calibration result will be reduced.
In view of this technical problem, the basic idea of the invention is, first, to use a detection camera in place of the human eye to calibrate the near-eye display optical system; second, to directly solve the image-plane position information of the test target image by using the correspondence between the imaging plane of the detection camera and the plane of the virtual image formed on the display unit by the near-eye display optical system; and then, after obtaining the position information of the test target relative to the near-eye display optical system, to substitute the image-plane position information of the test target image and the position information of the test target relative to the near-eye display optical system into the constructed virtual imaging model, so as to obtain the virtual image imaging parameters of the near-eye display optical system.
Based on the above, the invention provides a calibration method for a near-eye display optical system, which comprises the steps of firstly, when an entrance pupil surface of a detection camera is aligned with an exit pupil surface of the near-eye display optical system, acquiring an image of a test image formed on a display unit by an image source of the near-eye display optical system; then, based on the image of the test image, obtaining the position corresponding relation between the position of the source image of the test image in the image source and the imaging surface of the detection camera; then, processing the image of the test target acquired by the detection camera based on the position corresponding relation between the position of the source image of the test image in the image source and the imaging surface of the detection camera to obtain the position information of the position of the source image of the test target corresponding to the image of the test target in the image source relative to the imaging surface of the detection camera; then, position information of the test target relative to the near-eye display optical system is obtained, and further, position information of an image plane of the test target image relative to an imaging plane of a detection camera and position information of the test target relative to the near-eye display optical system are processed by a virtual imaging model to obtain virtual image imaging parameters of the near-eye display optical system, wherein the virtual image imaging parameters refer to position parameter information of an image formed by the image source on the display unit. In this way, a fully automated calibration of the near-eye display optical system is achieved and has a relatively high degree of integration.
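The patent does not disclose the internal form of its virtual imaging model. Purely to illustrate the kind of computation involved, the sketch below assumes the virtual image lies on a fronto-parallel plane at a known apparent distance from the detection camera's entrance pupil (this assumption and every name are ours): virtual-image feature pixels are back-projected to 3D and summarized into position parameters.

```python
import numpy as np

def virtual_image_points(K, pixels, distance):
    """Back-project detection-camera pixels of virtual-image features to
    3D points, assuming the virtual image lies on the plane z = distance
    in the camera frame (the camera sits at the exit pupil)."""
    Kinv = np.linalg.inv(K)
    rays = [Kinv @ np.array([u, v, 1.0]) for u, v in pixels]
    return np.array([r / r[2] * distance for r in rays])

def virtual_image_extent(pts3d):
    """A crude position-parameter summary: centre and bounding size of
    the virtual-image plane points."""
    pts3d = np.asarray(pts3d)
    return pts3d.mean(axis=0), pts3d.max(axis=0) - pts3d.min(axis=0)
```

A real system would instead fit the virtual image plane's full pose from the measured correspondences rather than assume it fronto-parallel; the sketch only conveys the geometry of back-projection.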
Having described the general principles of the present invention, various non-limiting embodiments of the present invention will now be described in detail with reference to the accompanying drawings.
Exemplary calibration method
FIG. 1 illustrates a flow chart of a calibration method for a near-eye display optical system according to a preferred embodiment of the invention. As shown in Fig. 1, the calibration method for a near-eye display optical system according to the preferred embodiment of the present invention includes: S110, when an entrance pupil surface of a detection camera is aligned with an exit pupil surface of a near-eye display optical system, acquiring an image of a test image, wherein the test image is an image formed on a display unit of the near-eye display optical system through an image source of the near-eye display optical system, and the image source is provided with a source image of the test image; S120, obtaining a position corresponding relation between the position of a source image of the test image in the image source and an imaging surface of the detection camera based on the image of the test image; S130, processing the image of the test target acquired by the detection camera based on the position corresponding relation between the position of the source image of the test image in the image source and the imaging surface of the detection camera to obtain the position information of the position of the source image of the test target corresponding to the image of the test target in the image source relative to the imaging surface of the detection camera; S140, obtaining position information of the test target relative to the near-eye display optical system; and S150, processing, by using a virtual imaging model, position information of a position of the test target source image corresponding to the test target image in the image source relative to an imaging plane of the detection camera and position information of the test target relative to the near-eye display optical system to obtain virtual image imaging parameters of the near-eye display optical system, where the virtual image imaging parameters refer to position parameter information of an image formed by the image source on the display unit.
In particular, in the preferred embodiment of the present invention, the near-eye display optical system is an AR (Augmented Reality) near-eye display optical system, and more specifically, the calibration method is used for an AR head-mounted display device.
Fig. 2 is a perspective view illustrating the AR head-mounted display device calibrated in the calibration method according to the preferred embodiment of the invention. As shown in Fig. 2, the AR head-mounted display device includes a tracking camera 11, an eye tracking sensor 12, a display unit 13 and an image source (not shown), wherein the tracking camera 11 is used for obtaining position information of a viewed entity, and the eye tracking sensor 12 is used for obtaining the gazing direction of the human eye. The image source is used for generating a virtual image on the display unit 13. The display unit 13 has a special property: after the user puts on the head-mounted display device, the user can see through the display unit 13 not only real objects in the real world but also the virtual image projected onto the display unit 13 by the head-mounted display device, bringing the user the distinctive visual experience of a virtual object existing in real space, as shown in Fig. 3. In a specific implementation, the image source may be implemented as an OLED (Organic Light-Emitting Diode) screen or an LED (Light-Emitting Diode) screen, and the display unit 13 as an AR lens.
In the process of fusing the virtual image projected by the head-mounted display device with a real object in real space, the position information of the real object in space is first obtained by analyzing the spatial environment with the tracking camera 11 and the other sensors mounted on the head-mounted display device; then, based on this position information, the position at which the virtual image generated by the head-mounted display device should be presented on the display unit 13 is solved. This fusion process requires the internal and external parameters of the tracking camera 11, the virtual image imaging parameters (parameters of the position of the image formed by the head-mounted display device on the display unit 13), and other relevant parameters.
However, due to errors in the device manufacturing process and the like, the performance parameters of the sensors of the head-mounted display device and their mutual positional relationships often deviate from the design values, which causes problems in fusing the virtual image with the physical environment, such as deviations in virtual image size and misalignment of the virtual image display position. This is exactly why the head-mounted display device must be calibrated: its relevant parameters are determined (this process is also referred to as the calibration process).
More specifically, in step S110, when the entrance pupil surface of the detection camera is aligned with the exit pupil surface of the near-eye display optical system, an image of a test image is acquired, wherein the test image is formed on the display unit 13 of the near-eye display optical system by an image source of the near-eye display optical system, and the image source has a source image of the test image. In other words, with the entrance pupil surface of the detection camera and the exit pupil surface of the head-mounted display device aligned, the detection camera captures the image of the test image formed on the display unit 13 by the image source. Here, the entrance pupil surface of the detection camera refers to the common entrance surface through which the light beams emitted from all points on the object plane enter the detection camera. The exit pupil surface of the head-mounted display device refers to the common exit surface from which the light beams emitted from each point on the object plane exit through the final aperture after passing through the entire optical system of the head-mounted display device.
When the entrance pupil surface of the detection camera is aligned with the exit pupil surface of the head-mounted display device, the light beam emitted from the test image projected on the display unit 13 by the head-mounted display device passes through the exit pupil surface of the head-mounted display device and then through the entrance pupil surface of the detection camera, so that the image of the test image is finally captured by the detection camera. Preferably, in implementation, the relative position between the detection camera and the display unit 13 of the head-mounted display device is adjusted so that the entrance pupil surface of the detection camera coincides with the exit pupil surface of the head-mounted display device, allowing the image of the test image to be captured by the detection camera to the maximum extent.
Accordingly, the distance (the interpupillary distance parameter) between the entrance pupil surface of the detection camera and the exit pupil surface of the head-mounted display device, that is, the position information of the entrance pupil surface of the detection camera with respect to the display unit 13 can be obtained by the eye tracking sensor of the near-eye display optical system. This is one of the calibration parameters of the head-mounted display device.
It should be noted that, in the preferred embodiment of the present invention, the detection camera is placed at the exit pupil position of the head-mounted display device to replace the observation of the pupil of the human eye, so that the calibration error caused by human factors can be effectively reduced. Meanwhile, the automatic calibration of the head-mounted display equipment before leaving the factory is facilitated.
In step S120, based on the image of the test image, the position correspondence between the position of the source image of the test image in the image source and the imaging plane of the detection camera is obtained. Here, the imaging plane of the detection camera refers to the plane on which the image of the test image is formed inside the detection camera in step S110, that is, the plane defined by the photosensitive chip of the detection camera. The source image of the test image refers to the real image of the test image formed by the image source, which, through the optical action of the near-eye optical system (e.g., waveguide optics), forms the virtual image of the test image presented on the display unit 13.
It should be understood by those skilled in the art that, for the head-mounted display device, the position of the source image of the test image in the image source is a preset value, and the position of the image of the test image on the imaging plane of the detection camera can be obtained by existing image data processing methods. Therefore, the position correspondence between the position of the source image of the test image in the image source and the imaging plane of the detection camera can be obtained, and can be expressed by the formula (u, v) = F1(X, Y), where (X, Y) represents the position coordinates of the image of the test image on the imaging plane of the detection camera, (u, v) represents the position of the source image of the test image in the image source, and F1 represents the functional mapping between the two.
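The patent does not specify a closed form for F1. If the imaging plane and the image source are related by a planar projective transform, F1 can be fitted as a homography from a handful of point correspondences. The sketch below is a minimal numpy implementation under that assumption (the function names are illustrative, not from the patent):

```python
import numpy as np

def fit_homography(img_pts, src_pts):
    """Fit F1 as a planar homography mapping detection-camera
    imaging-plane coordinates (X, Y) to image-source coordinates
    (u, v), using the direct linear transform (>= 4 points)."""
    A = []
    for (X, Y), (u, v) in zip(img_pts, src_pts):
        A.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        A.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    # Homography is the right null vector of A (last row of V^T).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_F1(H, X, Y):
    """Map a point on the detection camera's imaging plane to its
    corresponding source-image position (u, v) in the image source."""
    u, v, w = H @ np.array([X, Y, 1.0])
    return u / w, v / w
```

With correspondences taken from the captured test image, `apply_F1` then plays the role of (u, v) = F1(X, Y) in the later steps.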
It is worth mentioning that the purpose of establishing the position correspondence between the position of the source image of the test image in the image source and the imaging plane of the detection camera is to eliminate, by means of this functional mapping, the operation of manually verifying that the virtual image of a test target on the display unit 13 of the head-mounted display device coincides (or is collinear) with the physical test target. The technical principle is explained in detail below.
In step S130, the image of the test target collected by the detection camera is processed based on the position correspondence between the position of the source image of the test image in the image source and the imaging plane of the detection camera, so as to obtain the position information of the position in the image source of the test target source image corresponding to the test target image, relative to the imaging plane of the detection camera. That is, an image of the test target is first acquired by the detection camera, and this image is then processed using the position correspondence obtained in step S120. Here, the test target is placed within the field of view of the detection camera, preferably with the central region of the test target on the optical axis of the detection camera.
It is worth mentioning that the test target is a physical object located in real space, whereas the test image described in steps S110 and S120 is a virtual image presented on the display unit 13. Moreover, the pattern of the test image can be any test pattern and need not be consistent with the test target.
As previously described, in step S120 the position correspondence between the position of the source image of the test image in the image source and the imaging plane of the detection camera is determined, i.e., (u, v) = F1(X, Y). Therefore, once the position (X, Y) of the test target image on the imaging plane of the detection camera is known, this correspondence can be used to solve for the position information (u, v) of the test target source image, corresponding to the test target image, in the image source relative to the imaging plane of the detection camera. That is, in the preferred embodiment of the present invention, the position information of the test target image (a virtual image) formed on the display unit 13 of the head-mounted display device can be solved indirectly through this functional mapping. In this way, the operation of manually verifying that the virtual image of the test target on the display unit 13 coincides (or is collinear) with the physical test target can be eliminated. This resolves the biggest difficulty of existing calibration methods: the alignment among the physical test target, the virtual image of the test target on the display unit 13, and the pupil of the human eye.
In a specific implementation, feature points (e.g., peripheral corner points) of the test target image may be extracted; then, based on the position correspondence between the position of the source image of the test image in the image source and the imaging plane of the detection camera, the position information of those feature points in the image source relative to the imaging plane of the detection camera is obtained, thereby obtaining the position information of the test target source image corresponding to the test target image. In other words, the feature points are used to characterize the test target image so as to reduce the amount of computation. In other embodiments of the present application, other types of feature points may also be used, for example the center point of the test target image; this is not intended to limit the scope of the present application.
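As a toy illustration of the feature-point options just mentioned, the sketch below extracts the four peripheral (bounding-box) corner points and the centroid of a bright test-target region from a grayscale array with numpy. A production system would use a proper corner detector; the threshold value and the bright-target assumption are illustrative only:

```python
import numpy as np

def extract_feature_points(img, thresh=128):
    """Return four peripheral (bounding-box) corner points and the
    centroid of the bright test-target region, as (x, y) values."""
    # Pixels whose intensity exceeds the threshold are treated
    # as belonging to the target.
    ys, xs = np.nonzero(img >= thresh)
    corners = [(xs.min(), ys.min()), (xs.max(), ys.min()),
               (xs.min(), ys.max()), (xs.max(), ys.max())]
    center = (xs.mean(), ys.mean())
    return corners, center
```

Each extracted point (X, Y) would then be pushed through the F1 correspondence of step S120 to obtain its source-image position (u, v).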
In step S140, position information of the test target relative to the near-eye display optical system is obtained. In this embodiment of the present application, the AR head-mounted display device includes the tracking camera 11; accordingly, in step S140, the three-dimensional spatial position relationship between the test target and the display unit 13 of the head-mounted display device may be obtained based on the internal and external parameters of the tracking camera 11.
To improve calibration accuracy, the internal and external parameters of the tracking camera 11 are typically recalibrated (this is the second calibration parameter) before solving for the position information of the test target relative to the head-mounted display device. The specific process is as follows: first, an image of the test target is acquired by the tracking camera 11; then, the acquired test target image is processed (for example, the coordinates of feature points in the image are extracted); finally, in combination with a preset world coordinate system, the feature points in the test target image are substituted into a preset imaging model to solve for the internal and external parameters of the tracking camera 11.
Accordingly, after the internal and external parameters of the tracking camera 11 are obtained, the three-dimensional spatial position relationship (x, y, z) between the test target and the display unit 13 of the head-mounted display device can be obtained using those parameters together with the test target image captured by the tracking camera 11.
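The patent does not fix a particular algorithm for recovering the tracking camera's parameters; in practice a standard planar-target routine (e.g., OpenCV's camera calibration) would typically be used. As a self-contained illustration of "substituting feature points into an imaging model," the sketch below solves the combined 3×4 projection matrix (intrinsics times extrinsics) from world/image correspondences by direct linear transformation; all names are illustrative:

```python
import numpy as np

def estimate_projection_matrix(world_pts, img_pts):
    """Solve the 3x4 camera projection matrix P (intrinsic matrix
    times extrinsic pose) from >= 6 non-coplanar world/image
    correspondences by direct linear transformation (DLT)."""
    A = []
    for (Xw, Yw, Zw), (x, y) in zip(world_pts, img_pts):
        A.append([Xw, Yw, Zw, 1, 0, 0, 0, 0, -x*Xw, -x*Yw, -x*Zw, -x])
        A.append([0, 0, 0, 0, Xw, Yw, Zw, 1, -y*Xw, -y*Yw, -y*Zw, -y])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    P = Vt[-1].reshape(3, 4)
    # Fix the projective scale (assumes P[2, 3] != 0).
    return P / P[2, 3]

def project(P, pt3d):
    """Project a 3-D world point to pixel coordinates with P."""
    x, y, w = P @ np.array([*pt3d, 1.0])
    return x / w, y / w
```

Given the solved P, the spatial relationship (x, y, z) between the test target and the device can then be triangulated or back-projected from the tracking camera's image of the target.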
It will be appreciated by those skilled in the art that in other implementations of the present disclosure, the head-mounted display device may use other techniques to obtain the position information of the test target relative to the near-eye display optical system, for example a TOF camera module or the like; this is not intended to limit the scope of the present application. It should be appreciated that when another distance-measurement means is employed to obtain this position information, calibration of the internal and external parameters of the tracking camera 11 is no longer necessary. In other words, in the present application, the calibration of the internal and external parameters of the tracking camera 11 is an optional item.
In step S150, position information of a test target source image corresponding to the test target image in the image source with respect to an imaging plane of a detection camera and position information of the test target with respect to the near-eye display optical system are processed by a virtual imaging model to obtain virtual image imaging parameters of the near-eye display optical system, where the virtual image imaging parameters refer to position parameter information of an image formed on the display unit 13 by the image source.
That is, the position information (u, v) of the position in the image source of the test target source image corresponding to the test target image, relative to the imaging plane of the detection camera, acquired in step S130, and the three-dimensional spatial position relationship (x, y, z) between the test target and the head-mounted display device acquired in step S140, are input to the constructed virtual imaging model, and the virtual image imaging parameters are obtained by iterative fitting. This process can be formulated as (u, v) = F2(x, y, z), where (u, v) represents the position information of the test target source image in the image source relative to the imaging plane of the detection camera, (x, y, z) represents the three-dimensional spatial position relationship between the test target and the display unit 13 of the head-mounted display device, and F2 represents the functional mapping between them.
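The patent leaves the form of the virtual imaging model F2 open. As one plausible assumption, a pinhole-style model u = fx·x/z + cx, v = fy·y/z + cy makes the fitting linear, so the parameters can be recovered by least squares rather than a full nonlinear iteration. The sketch below is valid only under that assumed model:

```python
import numpy as np

def fit_virtual_imaging_model(target_pts_3d, source_pts):
    """Fit an assumed pinhole-style virtual imaging model
        u = fx * x/z + cx,   v = fy * y/z + cy
    to (x, y, z) -> (u, v) samples by linear least squares,
    returning (fx, fy, cx, cy)."""
    xs = np.array([[x / z, 1.0] for x, y, z in target_pts_3d])
    ys = np.array([[y / z, 1.0] for x, y, z in target_pts_3d])
    us = np.array([u for u, v in source_pts])
    vs = np.array([v for u, v in source_pts])
    (fx, cx), *_ = np.linalg.lstsq(xs, us, rcond=None)
    (fy, cy), *_ = np.linalg.lstsq(ys, vs, rcond=None)
    return fx, fy, cx, cy
```

A model with distortion or a full projective F2 would require nonlinear optimization, matching the iterative fitting described above.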
It should be noted that in the preferred embodiment of the present invention, the calibration of the virtual image imaging parameters and the calibration of the internal and external parameters of the tracking camera 11 share the same test target. In other words, the calibration method provided by the invention is highly integrated and simple to operate.
In summary, the foregoing describes the process of calibrating the AR head-mounted display device by the calibration method provided by the present invention, which obtains the calibration parameters of the head-mounted display device (including the position information of the entrance pupil plane of the detection camera with respect to the display unit 13, the internal and external parameters of the tracking camera 11, and the virtual image imaging parameters) automatically and in a highly integrated manner.
It should be appreciated that although the calibration method is described above as an example for the calibration of the AR head-mounted display device, one skilled in the art will appreciate that the calibration method may also be applied to other types of devices for augmented reality near-eye display optical systems. Even in other embodiments of the present invention, the calibration principle and spirit of the calibration method can be applied to a Virtual Reality (VR) near-eye display optical system. The present application is not limited in this respect.
Schematic calibration device
Fig. 4 illustrates a block diagram of a calibration apparatus for a near-eye display optical system according to a preferred embodiment of the present invention.
As shown in fig. 4, the calibration apparatus 400 for a near-eye display optical system according to the preferred embodiment of the present invention includes:

a test image obtaining module 410, configured to obtain an image of a test image when the entrance pupil surface of the detection camera is aligned with the exit pupil surface of the near-eye display optical system, where the test image is formed on a display unit of the near-eye display optical system by an image source of the near-eye display optical system, and the image source has a source image of the test image;

a registration unit 420, configured to obtain, based on the image of the test image, the position correspondence between the position of the source image of the test image in the image source and the imaging plane of the detection camera;

a virtual image surface position solving module 430, configured to process, based on that position correspondence, the image of the test target acquired by the detection camera, so as to obtain the position information of the position in the image source of the test target source image corresponding to the test target image, relative to the imaging plane of the detection camera;

a target position solving module 440, configured to obtain the position information of the test target relative to the near-eye display optical system; and

a virtual image imaging parameter obtaining module 450, configured to process, with a virtual imaging model, the position information of the test target source image corresponding to the test target image in the image source relative to the imaging plane of the detection camera and the position information of the test target relative to the near-eye display optical system, to obtain the virtual image imaging parameters of the near-eye display optical system, where the virtual image imaging parameters refer to position parameter information of the image formed by the image source on the display unit.
In an example, in the above calibration apparatus 400, the virtual image plane position solving module 430 is further configured to: extracting feature points in the test target image; and obtaining the position information of the position of the feature point in the test target image in the image source relative to the imaging plane of the detection camera based on the position corresponding relation between the position of the source image of the test image in the image source and the imaging plane of the detection camera, so as to obtain the position information of the position of the source image of the test target corresponding to the test target image in the image source relative to the imaging plane of the detection camera.
In one example, in the calibration apparatus 400, the target position solving module 440 is configured to obtain the position information of the test target relative to the near-eye display optical system based on the internal parameter and the external parameter of the tracking camera of the near-eye display optical system.
In one example, in the calibration apparatus 400, the target position solving module is further configured to process the image of the test target acquired by the tracking camera to obtain the internal parameter and the external parameter of the tracking camera.
In an example, in the calibration apparatus 400, the calibration apparatus 400 further includes a pupil distance information obtaining module 460, configured to: and acquiring the position information of the entrance pupil surface of the detection camera relative to the display unit by using an eyeball tracking sensor of the near-eye display optical system.
In one example, in the calibration apparatus 400, the calibration apparatus 400 further includes an output module 470 for: and outputting the position information of the entrance pupil surface of the detection camera relative to the display unit, the internal parameters and the external parameters of the tracking camera, and the virtual image imaging parameters.
In one example, in the above calibration apparatus 400, the near-eye display optical system is an augmented reality near-eye display optical system.
Here, it will be understood by those skilled in the art that the specific functions and operations of the respective units and modules in the above-described calibration apparatus 400 have been described in detail in the calibration method for a near-eye display optical system described above with reference to fig. 1 to 3, and thus, a repetitive description thereof will be omitted.
As described above, the calibration apparatus according to the embodiment of the present application can be implemented in various terminal devices, such as a server of a calibration system for a near-eye display optical system. In one example, the calibration apparatus according to the embodiment of the present application may be integrated into the terminal device as a software module and/or a hardware module. For example, the calibration means may be a software module in the operating system of the terminal device, or may be an application developed for the terminal device; of course, the calibration device can also be one of many hardware modules of the terminal device.
Alternatively, in another example, the calibration apparatus and the terminal device may also be separate terminal devices, and the calibration apparatus may be connected to the terminal device through a wired and/or wireless network and transmit the interaction information according to an agreed data format.
Exemplary calibration System
FIG. 5 is a perspective view of a calibration system for a near-eye display optical system according to a preferred embodiment of the invention.
As shown in fig. 5, the calibration system 500 for a near-eye display optical system according to the preferred embodiment of the present invention includes: a test target 510, a detection camera 520, a motion platform 530, and a calibration apparatus 540. The detection camera 520 is used to capture an image of the test image projected by the near-eye display optical system and an image of the test target 510. The motion platform 530 is configured to drive the entrance pupil plane of the detection camera 520 into alignment with the exit pupil plane of the near-eye display optical system.
The calibration apparatus 540 includes: a processor 541 and a memory 542, wherein in the memory 542 are stored computer program instructions that, when executed by the processor 541, cause the processor to perform a calibration method for a near-eye display optical system as described above.
Illustrative computer program product
In addition to the above-described methods and apparatus, embodiments of the present application may also be a computer program product comprising computer program instructions that, when executed by a processor, cause the processor to perform the steps in the calibration method for a near-eye display optical system according to various embodiments of the present application described in the "exemplary methods" section above of this specification.
The computer program product may be written with program code for performing the operations of embodiments of the present application in any combination of one or more programming languages, including object-oriented programming languages such as Java or C++ and conventional procedural programming languages such as the "C" programming language or similar. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present application may also be a computer-readable storage medium having stored thereon computer program instructions that, when executed by a processor, cause the processor to perform the steps in the calibration method for a near-eye display optical system according to various embodiments of the present application described in the "exemplary methods" section above in this specification.
The computer readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The basic principles of the present application have been described above with reference to specific embodiments, but it should be noted that advantages, effects, etc. mentioned in the present application are only examples and are not limiting, and the advantages, effects, etc. must not be considered to be possessed by various embodiments of the present application. Furthermore, the foregoing disclosure of specific details is provided for purposes of illustration and understanding only, and is not intended to limit the application to the details which are set forth in order to provide a thorough understanding of the present application.
The block diagrams of devices, apparatuses, and systems referred to in this application are given only as illustrative examples and are not intended to require or imply that the connections, arrangements, and configurations must be made in the manner shown in the block diagrams. These devices, apparatuses, and systems may be connected, arranged, and configured in any manner, as will be appreciated by one skilled in the art. Words such as "including," "comprising," "having," and the like are open-ended words that mean "including, but not limited to," and are used interchangeably therewith. The words "or" and "and" as used herein mean, and are used interchangeably with, the word "and/or," unless the context clearly dictates otherwise. The word "such as" is used herein to mean, and is used interchangeably with, the phrase "such as but not limited to."
It should also be noted that in the devices, apparatuses, and methods of the present application, the components or steps may be decomposed and/or recombined. These decompositions and/or recombinations are to be considered as equivalents of the present application.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present application. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the application. Thus, the present application is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, the description is not intended to limit embodiments of the application to the form disclosed herein. While a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, alterations, additions and sub-combinations thereof.
It will be appreciated by persons skilled in the art that the embodiments of the invention described above and shown in the drawings are given by way of example only and are not limiting of the invention. The objects of the invention have been fully and effectively accomplished. The functional and structural principles of the present invention have been shown and described in the embodiments, and any variations or modifications may be made to the embodiments of the present invention without departing from the principles described.

Claims (17)

1. A calibration method for a near-eye display optical system, comprising:
acquiring an image of a test image when an entrance pupil surface of a detection camera is aligned with an exit pupil surface of a near-eye display optical system, wherein the test image is an image formed on a display unit of the near-eye display optical system through an image source of the near-eye display optical system, and the image source has a source image of the test image;
based on the image of the test image, obtaining the position corresponding relation between the position of the source image of the test image in the image source and the imaging surface of the detection camera;
processing an image of a test target acquired by the detection camera based on a position corresponding relation between a position of a source image of the test image in the image source and an imaging plane of the detection camera to obtain position information of a position of the source image of the test target corresponding to the image of the test target in the image source relative to the imaging plane of the detection camera;
obtaining positional information of the test target relative to the near-eye display optical system; and
processing position information of a test target source image corresponding to the test target image in the image source relative to an imaging plane of a detection camera and position information of the test target relative to the near-eye display optical system by using a virtual imaging model to obtain virtual image imaging parameters of the near-eye display optical system, wherein the virtual image imaging parameters refer to position parameter information of an image formed by the image source on the display unit.
2. The calibration method according to claim 1, wherein processing the image of the test target acquired by the inspection camera based on the positional correspondence between the position of the source image of the test image in the image source and the imaging plane of the inspection camera to obtain positional information of the position of the source image of the test target corresponding to the test target image in the image source with respect to the imaging plane of the inspection camera comprises:
extracting feature points in the test target image; and
based on the position corresponding relation between the position of the source image of the test image in the image source and the imaging surface of the detection camera, the position information between the position of the feature point in the test target image in the image source relative to the imaging surface of the detection camera is obtained, so that the position information between the position of the test target source image corresponding to the test target image in the image source relative to the imaging surface of the detection camera is obtained.
3. The calibration method according to claim 1, wherein obtaining the position information of the test target relative to the near-eye display optical system comprises:
obtaining the position information of the test target relative to the near-eye display optical system based on intrinsic and extrinsic parameters of a tracking camera of the near-eye display optical system.
4. The calibration method according to claim 3, wherein, before obtaining the position information of the test target relative to the near-eye display optical system based on the intrinsic and extrinsic parameters of the tracking camera, the method further comprises: processing images of the test target acquired by the tracking camera to obtain the intrinsic and extrinsic parameters of the tracking camera.
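Claim 4 recovers the tracking camera's intrinsic and extrinsic parameters from images of the test target, which in practice is plane-based camera calibration. One standard building block of such calibration is estimating the target-plane-to-image homography; a minimal direct linear transform (DLT) sketch (the function name and normalization are illustrative choices, not the patent's method) is:

```python
import numpy as np

def dlt_homography(src, dst):
    """Direct linear transform: estimate the 3x3 homography mapping
    src -> dst from at least 4 planar point correspondences."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Two linear constraints per correspondence on the 9 entries of H
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the right null vector of A (last row of V^T)
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]
```

From several such homographies at different target poses, the intrinsic matrix and per-view extrinsics can then be recovered by plane-based calibration.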
5. The calibration method according to claim 1 or 2, further comprising: acquiring position information of the entrance pupil surface of the detection camera relative to the display unit by using an eyeball tracking sensor of the near-eye display optical system.
6. The calibration method according to claim 3 or 4, further comprising: acquiring position information of the entrance pupil surface of the detection camera relative to the display unit by using an eyeball tracking sensor of the near-eye display optical system.
7. The calibration method according to claim 6, further comprising: outputting the position information of the entrance pupil surface of the detection camera relative to the display unit, the intrinsic and extrinsic parameters of the tracking camera, and the virtual image imaging parameters.
8. The calibration method according to claim 7, wherein the near-eye display optical system is an augmented reality near-eye display optical system.
9. A calibration apparatus for a near-eye display optical system, comprising:
a test image acquisition module, configured to acquire an image of a test image when an entrance pupil surface of a detection camera is aligned with an exit pupil surface of the near-eye display optical system, wherein the test image is an image formed on a display unit of the near-eye display optical system from an image source of the near-eye display optical system, and the image source is provided with a source image of the test image;
a registration unit, configured to obtain, based on the image of the test image, a position correspondence between the position of the source image of the test image in the image source and the imaging plane of the detection camera;
a virtual image plane position solving module, configured to process an image of a test target acquired by the detection camera based on the position correspondence between the position of the source image of the test image in the image source and the imaging plane of the detection camera, to obtain position information of the test target source image corresponding to the test target image in the image source relative to the imaging plane of the detection camera;
a target position solving module, configured to obtain position information of the test target relative to the near-eye display optical system; and
a virtual image imaging parameter acquiring module, configured to process, by using a virtual imaging model, the position information of the test target source image corresponding to the test target image in the image source relative to the imaging plane of the detection camera and the position information of the test target relative to the near-eye display optical system, to obtain virtual image imaging parameters of the near-eye display optical system, wherein the virtual image imaging parameters refer to position parameter information of the image formed by the image source on the display unit.
10. The calibration apparatus according to claim 9, wherein the virtual image plane position solving module is configured to:
extract feature points in the test target image; and
obtain, based on the position correspondence between the position of the source image of the test image in the image source and the imaging plane of the detection camera, position information of the positions of the feature points in the test target image in the image source relative to the imaging plane of the detection camera, thereby obtaining the position information of the test target source image corresponding to the test target image in the image source relative to the imaging plane of the detection camera.
11. The calibration apparatus of claim 10, wherein the target position solving module is configured to:
obtain the position information of the test target relative to the near-eye display optical system based on intrinsic and extrinsic parameters of a tracking camera of the near-eye display optical system.
12. The calibration device of claim 11, wherein the target position solving module is further configured to:
process images of the test target acquired by the tracking camera to obtain the intrinsic and extrinsic parameters of the tracking camera.
13. The calibration apparatus according to claim 12, further comprising an interpupillary distance information obtaining module, configured to: acquire position information of the entrance pupil surface of the detection camera relative to the display unit by using an eyeball tracking sensor of the near-eye display optical system.
14. The calibration apparatus according to any one of claims 11-13, further comprising an output module, configured to: output the position information of the entrance pupil surface of the detection camera relative to the display unit, the intrinsic and extrinsic parameters of the tracking camera, and the virtual image imaging parameters.
15. The calibration device of claim 14, wherein the near-eye display optical system is an augmented reality near-eye display optical system.
16. A calibration system for a near-eye display optical system, comprising:
a test target;
a detection camera for acquiring a test image projected by the near-eye display optical system and an image of the test target;
a motion platform for driving the entrance pupil surface of the detection camera into alignment with the exit pupil surface of the near-eye display optical system; and
a calibration apparatus, wherein the calibration apparatus comprises:
a processor; and
a memory in which computer program instructions are stored, which, when executed by the processor, cause the processor to carry out the calibration method according to any one of claims 1-8.
17. A computer-readable storage medium having computer program instructions stored thereon which, when executed by a computing device, cause the computing device to perform the calibration method according to any one of claims 1-8.
CN201811036951.9A 2018-09-06 2018-09-06 Calibration method, calibration device and calibration system for near-eye display optical system Active CN110880188B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811036951.9A CN110880188B (en) 2018-09-06 2018-09-06 Calibration method, calibration device and calibration system for near-eye display optical system

Publications (2)

Publication Number Publication Date
CN110880188A CN110880188A (en) 2020-03-13
CN110880188B true CN110880188B (en) 2022-07-01

Family

ID=69727161

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811036951.9A Active CN110880188B (en) 2018-09-06 2018-09-06 Calibration method, calibration device and calibration system for near-eye display optical system

Country Status (1)

Country Link
CN (1) CN110880188B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111586290B (en) * 2020-04-07 2021-06-15 延锋伟世通电子科技(上海)有限公司 Position calibration method for camera for optical test of vehicle-mounted head-up display
CN111652946B (en) * 2020-06-17 2024-05-24 Oppo广东移动通信有限公司 Display calibration method and device, equipment and storage medium
CN113252309A (en) * 2021-04-19 2021-08-13 苏州市计量测试院 Testing method and testing device for near-to-eye display equipment and storage medium
CN113155036B (en) * 2021-04-25 2023-03-21 歌尔光学科技有限公司 Testing method and testing system for binocular projection assembly offset
US11961258B2 (en) 2022-01-26 2024-04-16 Industrial Technology Research Institute Calibration method for optical see-through display and calibration system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106204431A (en) * 2016-08-24 2016-12-07 中国科学院深圳先进技术研究院 The display packing of intelligent glasses and device
CN106970711A (en) * 2017-04-27 2017-07-21 上海欢米光学科技有限公司 The method and apparatus that VR display devices are aligned with display terminal screen
CN107884160A (en) * 2017-09-25 2018-04-06 杭州浙大三色仪器有限公司 Virtual image photoelectric measuring instrument
CN207424365U (en) * 2017-10-27 2018-05-29 广东烨嘉光电科技股份有限公司 A kind of lens group structure that VR shootings are realized using mobile phone dual camera
CN108124152A (en) * 2017-12-26 2018-06-05 华勤通讯技术有限公司 The distortion measurement method and system of head-mounted display apparatus
CN108462867A (en) * 2017-12-29 2018-08-28 无锡易维视显示技术有限公司 The system and method for automatic Calibration tracking mode bore hole stereoscopic display equipment

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"CALIBRATION OF FISHEYE CAMERA USING ENTRANCE PUPL";Peter Fasogbon et al.;《arXiv》;20190731;全文 *
"光学投射头盔显示器标定综述";罗斌 等;《计算机辅助设计与图形学学报》;20090430;第21卷(第4期);全文 *
"双目光栅投影关键技术研究";邱运春;《中国优秀硕士学位论文全文数据库 信息科技辑》;20160315(第03期);全文 *

Also Published As

Publication number Publication date
CN110880188A (en) 2020-03-13

Similar Documents

Publication Publication Date Title
CN110880188B (en) Calibration method, calibration device and calibration system for near-eye display optical system
US11544874B2 (en) System and method for calibration of machine vision cameras along at least three discrete planes
CN110967166B (en) Detection method, detection device and detection system of near-eye display optical system
CN110084854B (en) System and method for runtime determination of camera calibration errors
JP5858433B2 (en) Gaze point detection method and gaze point detection device
CN109859272B (en) Automatic focusing binocular camera calibration method and device
Semeniuta Analysis of camera calibration with respect to measurement accuracy
EP2682710B1 (en) Apparatus and method for three-dimensional measurement and robot system comprising said apparatus
US20190033064A1 (en) Three-dimensional measurement device mobile geometry verification
JP2009053147A (en) Three-dimensional measuring method and three-dimensional measuring device
EP3709270A1 (en) Registration of individual 3d frames
US9990739B1 (en) Method and device for fisheye camera automatic calibration
JP3138080B2 (en) Automatic calibration device for vision sensor
CN113034612A (en) Calibration device and method and depth camera
CN111780715A (en) Visual ranging method
CN107850425A (en) Method for measuring artifact
KR101535801B1 (en) Process inspection device, method and system for assembling process in product manufacturing using depth map sensors
CN113196165A (en) Information projection system, control device, and information projection method
JP2023510738A (en) Method of moving the coordinate system of the 3D camera to the incident position of the 2D camera
Yamauchi et al. Calibration of a structured light system by observing planar object from unknown viewpoints
US10162189B2 (en) Device, system and method for the visual alignment of a pipettor tip and a reference point marker
CN112529856A (en) Method for determining the position of an operating object, robot and automation system
US10967517B2 (en) Information processing apparatus, method, and storage medium for presenting information for calibration
Karan Accuracy improvements of consumer-grade 3D sensors for robotic applications
JP7329427B2 (en) lens meter

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20200313

Assignee: Zhejiang Shunwei Technology Co.,Ltd.

Assignor: SUNNY OPTICAL (ZHEJIANG) RESEARCH INSTITUTE Co.,Ltd.

Contract record no.: X2024330000055

Denomination of invention: Calibration method, calibration device, and calibration system for near eye display optical systems

Granted publication date: 20220701

License type: Common License

Record date: 20240515