US20190052851A1 - System and Method for Recalibrating a Projector System - Google Patents

Info

Publication number
US20190052851A1
Authority
US
United States
Prior art keywords
projector
cameras
recalibratable
projector system
patterns
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/970,764
Inventor
Sascha Korl
Vinod KHARE
Yujia Zhang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hilti AG
Original Assignee
Hilti AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hilti AG filed Critical Hilti AG
Priority to US15/970,764 (published as US20190052851A1)
Priority to US16/637,965 (published as US11050983B2)
Priority to EP18753078.7A (published as EP3665898A1)
Priority to PCT/EP2018/070168 (published as WO2019029991A1)
Publication of US20190052851A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179 Video signal processing therefor
    • H04N9/3185 Geometric adjustment, e.g. keystone or convergence
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2513 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G01C11/025 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures by scanning the object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/08 Projecting images onto non-planar surfaces, e.g. geodetic screens
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H04N5/247
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141 Constructional details thereof
    • H04N9/317 Convergence or focusing systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3191 Testing thereof
    • H04N9/3194 Testing thereof including sensor feedback
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20004 Adaptive image processing
    • G06T2207/20012 Locally adaptive
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30244 Camera pose
    • G06T3/005
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141 Constructional details thereof
    • H04N9/315 Modulator illumination systems
    • H04N9/3161 Modulator illumination systems using laser light sources

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Projection Apparatus (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The invention is directed to a system and method for recalibrating a projector system. The projector may be calibrated using one or more cameras arranged near or adjacent to a projector of a projector system. The one or more cameras may be calibrated once and made mechanically stable so that they do not move relative to each other. During operation of the projector system, the projector may project one or more specific patterns onto a work surface, where the one or more cameras capture each of the projected patterns. All the images of the patterns captured by the one or more cameras may be used to determine projector calibration parameters in order to recalibrate the projector.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of and claims priority to U.S. application Ser. No. 15/674,755, filed Aug. 11, 2017, the content of which is expressly incorporated herein by reference in its entirety.
  • BACKGROUND OF THE INVENTION
  • The invention relates to determining projector calibration parameters in order to recalibrate a projector system and project an image onto a work surface at the correct position based on the determined projector calibration parameters.
  • Projection mapping, which may also be known as video mapping or spatial augmented reality, is a projection technology used to turn physical objects (often irregularly shaped objects) into a display surface for image and video projection. The objects may be complex industrial landscapes, such as buildings, small indoor objects or theatrical stages. Using software, a two or three-dimensional object is spatially mapped on a virtual program that mimics the real environment that is to be projected on. The software may interact with a projector to fit any desired image onto a surface of that object.
  • A position enabled projector (PEP) is a tool that projects an image, such as a blueprint, onto a work surface at its true position with true scale. Being able to correctly project the position of the image onto a surface may involve determining various parameters, such as the position or orientation of the projector itself, the characteristics of the surface that the image is being projected onto, and where each of the projected pixels of the image will appear on the surface.
  • The projection of image points on a surface at their true positions along with true scale is particularly important if a specific task to be performed requires precision and accuracy. For instance, a blueprint may be projected onto a work surface so as to allow a construction worker to drill holes at various specified positions on the surface based on the information provided by the blueprint. In order to project the blueprint at the correct position on the work surface, the projector needs to know where each of the projected pixels of the image will appear on the work surface. But off-the-shelf projectors or other types of projector modules are not designed and built for this purpose; they are inherently unstable, and the projector components (such as the light source, mirror array, etc.) physically move relative to their initially installed positions with time, temperature, and mechanical stress (projector intrinsics). Moreover, the mechanical fixing(s) between the projector and the positioning system may move as well (projector extrinsics).
  • One solution for the above-described problem may be to recalibrate the whole projector/positioning system offline and on a well-defined calibration jig to measure the projector intrinsic and extrinsic parameters, which can then be used to correctly project the image again. The disadvantage of this approach, however, is that the recalibration procedure requires specific setup, equipment, software, and also takes time and extra effort, thereby rendering this approach practically unfeasible for a projector to be used on a construction site.
  • In at least that regard, there is a need to recalibrate a projector system in the field and/or in real time based on determined projector calibration parameters in order to project an image at its correct position.
  • SUMMARY OF THE INVENTION
  • In accordance with one or more aspects of the present invention, the invention is directed to a system and method for recalibrating a projector system.
  • By way of example, the projector may be calibrated using one or more cameras arranged near or adjacent to a projector of a projector system. During operation of the projector system, the projector may project one or more specific patterns onto a work surface, where the one or more cameras may be configured to capture each of the projected patterns on the surface. All of the captured images of the patterns may be used to determine projector calibration parameters in order to properly recalibrate the projector and project an image at the correct position based on the determined projector calibration parameters.
  • The one or more cameras of the projector system may be calibrated only once off-site (such as in a factory where the projector system is produced and/or assembled) and may also be installed in a mechanically stable manner so that they do not move or change positions relative to each other. In at least that regard, the positional relationship between the one or more cameras is a known parameter. Moreover, stabilizing the one or more cameras is an easier requirement than stabilizing the projector itself since the cameras may be easier to install and also smaller in size than the projector. Thus, cameras are easier to make stable than the projector when assembling and/or producing the projector system.
  • Other objects, advantages and novel features of the present invention will become apparent from the following detailed description of one or more preferred embodiments when considered in conjunction with the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIGS. 1 and 2 illustrate projector systems in accordance with one or more principles of the present invention.
  • FIG. 3 illustrates a flow diagram in accordance with one or more principles of the present invention.
  • DETAILED DESCRIPTION OF THE DRAWINGS
  • The present invention is directed to correctly and accurately projecting, using a projector system, a two-dimensional image (e.g., a construction-related blueprint) onto an uneven work surface, such as corrugated steel sheet, and automatically recalibrating the projector system (either by the projector system on-site itself or the user on-site) so that the pixels of the image actually appear where they are supposed to appear on the work surface. Surface profiling and projecting an image onto the profiled surface by a projector system are described in U.S. application Ser. No. 15/639,308 (Attorney Docket No. 105440.70050), filed Jun. 30, 2017, the contents of which are incorporated herein by reference in its entirety.
  • In one embodiment of the present invention, a projector system may include a projector and one or more cameras, and the components of the whole projector system, including the cameras, may be calibrated and installed/stabilized, only once, at the off-site location (e.g., a factory) where the projector system is produced and/or assembled. For instance, it may be easier to mechanically stabilize the one or more cameras of the projector system than the projector itself. Thus, the positional relation between the one or more cameras is a known (and, importantly, a calibrated) parameter that should not change due to time or external stresses as the projector system is operated in the field, such as on a construction site.
  • During operation of the projector system, one or more patterns may be projected onto the work surface. It is not necessary to know the exact location and geometry of the work surface. The projected one or more patterns may include a specific shape (e.g., square, rectangle, triangle, etc.) or a combination of different types of shapes. As the one or more patterns are projected onto the work surface, the one or more cameras may capture each of the projected patterns.
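  • By way of illustration only, the following minimal sketch renders one such pattern as a regular grid of bright squares at an assumed projector resolution; the resolution, spacing, and square size are placeholder assumptions, not values prescribed by the invention.

```python
import numpy as np

def make_grid_pattern(width=1920, height=1080, spacing=120, square=20):
    """Render a simple calibration pattern: bright squares on a dark background.

    The returned list of square centers gives the projector-side pixel
    coordinates (u'_p, v'_p) of the eventual correspondences.
    """
    pattern = np.zeros((height, width), dtype=np.uint8)
    centers = []
    for cy in range(spacing // 2, height, spacing):
        for cx in range(spacing // 2, width, spacing):
            pattern[cy - square // 2:cy + square // 2,
                    cx - square // 2:cx + square // 2] = 255
            centers.append((cx, cy))
    return pattern, centers

pattern, projector_points = make_grid_pattern()
```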
  • All the captured images of the patterns are then used to determine projector calibration parameters. In one example, the correspondences between points from all cameras and the projector are computed. Thereafter, the unknown projector calibration parameters may be solved from an equation system, which will be further described below. The computed projector calibration parameters may then be used for projecting a positionally correct image onto the work surface. Thus, the projection of the image at the correct position based on the projector calibration parameters relates at least to the recalibration of the projector system. The determination of the projector calibration parameters may be performed on-site and/or in real time so that time and costs associated with recalibrating the projector system are saved.
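  • One simple way to obtain the camera-side points of these correspondences, sketched below as an assumption rather than the specific method of the invention, is to threshold each captured image and take the centroid of every bright blob; matching the detected blobs to the known projector-side grid then yields the point correspondences used in the equation system.

```python
import numpy as np
from scipy import ndimage

def detect_blob_centers(image, threshold=128):
    """Return (u', v') centroids of bright projected blobs in a camera image."""
    mask = image > threshold
    labels, count = ndimage.label(mask)                       # label connected blobs
    centers = ndimage.center_of_mass(mask, labels, range(1, count + 1))
    return [(cx, cy) for cy, cx in centers]                   # (row, col) -> (u, v)
```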
  • One of the numerous advantages of the present invention, which is further described below, is that the determination of the projector calibration parameters is based on known parameters (e.g., the positional relationship between the multiple cameras in relation to the projector) that will not change, or will not easily change, over time and/or under external stresses, such as temperature or mechanical damage or stress. Moreover, the recalibration of the projector system can be performed on-site, in real-time, and/or simultaneously while the projector system is profiling the surface.
  • The invention described herein may be implemented on and executed by one or more computing devices and/or one or more processors. For instance, the projector system may have computing capabilities, by way of example, one or more processors, central processing units (CPUs), etc. As will be further described below, the computing associated with determining projector calibration parameters according to aspect(s) of the present invention may be executed by computing hardware in the projector system itself. Alternatively, the processing may be performed by a separate portable computing device, such as a laptop, tablet computer, or any other suitable type of mobile computing device that can be operated by a user.
  • FIG. 1 illustrates an example projector system 110 in accordance with one or more principles of the present invention. As shown, the projector system 110 includes one or more processors 112, memory 114 (which includes instructions 116 and data 118), at least one projector 119, at least one camera 120, and at least one interface 121. The processor 112 may instruct the components of the projector system 110 to perform various tasks based on the processing of information and/or data that may have been previously stored or have been received, such as based on the instructions 116 and/or data 118 stored in memory 114. The processor(s) 112 may be a standard processor, such as a central processing unit (CPU), graphics processor, or may be a dedicated processor, such as an application-specific integrated circuit (ASIC) or a field programmable gate array (FPGA).
  • The instructions 116 may be one or more sets of computer-executable instructions (e.g., software) that can be implemented by the processor 112. Data 118 may include various types of information (which can be retrieved, manipulated and/or stored by the processor 112), such as information to be projected by the projector 119, information captured by the one or more cameras 120, etc.
  • The projector 119 may be any object, apparatus, or device with a system of lenses that is used to project rays of light that form an image that is generated by a processor or computing device for projection. The one or more cameras 120 may each be an optical instrument, apparatus, or device that is used for recording or capturing images, which may be still photographs and/or sequences of images constituting a video or movie. While FIG. 1 illustrates a camera, it may be understood that any type of image capturing device may be used, such as a thermographic camera (infrared), etc.
  • Interface 121 may be any component that allows interfacing with an operator or user. For example, interface 121 may be a device, port, or a connection that allows a user to communicate with the projector system 110, including but not limited to a touch-sensitive screen, microphone, camera, and may also include one or more input/output ports, such as a universal serial bus (USB) drive, various card readers, etc. The interface 121 may also include hardware and/or equipment for surface profiling, such as one or more sensors, one or more range meters, etc.
  • Additionally, the projector system 110 may be configured to communicate with other computing devices via network 130. For example, the projector system 110 may communicate with other projector systems and mobile computing devices (e.g., laptops, tablet computers, smartphones). The network 130 may be any type of network, such as a LAN, WAN, Wi-Fi, or Bluetooth network.
  • Although the processing related to determining the projector calibration parameters is described as being carried out by the one or more processors 112 of the projector system 110, it may be understood that the processing may be performed by external computing devices and/or hardware, such as a mobile computing device, that may be communicating with the projector system 110 via the network 130.
  • FIG. 2 illustrates the various components of a projector system 200 in accordance with one or more principles of the present invention. As shown, the projector system 200 may be configured in a similar manner to the projector system 110 of FIG. 1. For example, the projector system 200 may include at least two different cameras 202 and 204, which are arranged on opposing sides of a projector 206. The projector 206 may project an image, such as a construction-related blueprint for drilling various holes, onto a work surface 208, which in FIG. 2 may be a corrugated steel sheet. And for determining projector calibration parameters, the projector 206 may also be configured to project one or more patterns, as described above. The two cameras 202 and 204 may each capture images of the projection of each of the patterns onto the work surface 208, as also shown in FIG. 2.
  • In addition, FIG. 2 illustrates a pattern generator for generating and projecting the various patterns and a calibration process or routine for determining the projector calibration parameters (which will be further described below), which are depicted inside of the dashed box, as shown in FIG. 2. The dashed box represents that the above-described pattern generation and calibration process/routine for determining the projector calibration parameters may be performed or executed by one or more processors and/or one or more computing devices. For instance, the one or more processors may be the one or more processors 112 illustrated in FIG. 1. In that regard, the one or more processors 112 may be configured to generate one or more specific patterns associated with the calibration process, as described above, and the one or more processors 112 may then be configured to determine all requisite projector calibration parameters for recalibrating the projector system in order to accurately project images onto the work surface.
  • Moreover, as described above, the two cameras 202 and 204 may be arranged in the projector system 200 in a mechanically stable manner relative to each other and also relative to the projector 206. It is understood that mechanical stability may be provided in any fashion, such as by way of screws, fasteners, adhesives, etc. The overall effect of mechanical stability is that the cameras 202 and 204 will not move or change in position regardless of time, temperature, and mechanical stress, especially compared to the position and stability of the projector 206. To that end, the two cameras 202 and 204 may also be smaller in size than the projector 206 in order to achieve the above-described effect.
  • The calibration process/routine for acquiring, analyzing, and/or determining the various projector calibration parameters will now be described. According to an exemplary embodiment of the present invention, a set of parameters may include intrinsic parameters and extrinsic parameters. For instance, the intrinsic parameters (for a linear projector model, e.g., pinhole model) may include:
      • (i) projector focal lengths in x-direction and y-direction (fx, fy)
      • (ii) principal point in x-direction and y-direction (tx, ty), and
      • (iii) skew coefficient between x-axis and y-axis (s).
        Moreover, the extrinsic parameters may include:
      • (i) translation vector between the projector and the one or more cameras (which may be known as a “camera assembly”) (t), and
      • (ii) rotation matrix between projector and camera assembly (R).
        In instances where the camera-projector model is more complex, the set of parameters may also include a set of distortion coefficients, etc. Not all calibration parameters are required to be estimated at the same time; it may be understood that only a subset of the parameters can be derived at a time.
  • In at least that regard, the complete camera-projector calibration matrix including the intrinsic and extrinsic parameters may be represented as follows:

  • C=K*E=K*[R|t]
      • where
  • K = [ fx   s    tu
          0    fy   tv
          0    0    1  ]
  • and is the intrinsic calibration parameter matrix. The variables R and t are the rotation matrix and translation vector between a camera or projector and a world coordinate system, respectively. The variables tu and tv denote the translation between optical axis and image sensor coordinate system. The variable C is a [3×4] calibration matrix.
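  • As a concrete illustration of this representation, the short sketch below assembles K from placeholder intrinsic values and composes the [3×4] matrix C = K[R|t]; the numeric values are illustrative assumptions, not calibration results.

```python
import numpy as np

# Placeholder intrinsics (illustrative values only).
fx, fy = 1400.0, 1400.0   # focal lengths in x and y (pixels)
tu, tv = 960.0, 540.0     # principal point
s = 0.0                   # skew coefficient

K = np.array([[fx, s,  tu],
              [0., fy, tv],
              [0., 0., 1.]])

# Placeholder extrinsics: rotation and translation to the world coordinate system.
R = np.eye(3)
t = np.array([[0.10], [0.00], [0.00]])

C = K @ np.hstack([R, t])   # complete 3x4 calibration matrix C = K [R | t]
```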
  • Once the complete calibration matrix C is known, the intrinsic calibration matrix K, the rotation matrix R and the translation vector t may be determined by standard matrix decomposition or other types of suitable decomposition techniques.
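  • A minimal sketch of such a decomposition is given below, assuming C was formed from an intrinsic matrix with a positive diagonal and a proper rotation; a matrix recovered only up to scale would additionally need to be normalized first.

```python
import numpy as np
from scipy.linalg import rq

def decompose_calibration_matrix(C):
    """Split a 3x4 matrix C = K [R | t] into K, R, and t via RQ decomposition."""
    M = C[:, :3]
    K, R = rq(M)                          # M = K @ R, with K upper triangular
    signs = np.diag(np.sign(np.diag(K)))  # resolve the RQ sign ambiguity
    K, R = K @ signs, signs @ R
    t = np.linalg.solve(K, C[:, 3])       # since the last column of C is K @ t
    return K, R, t

# K_rec, R_rec, t_rec = decompose_calibration_matrix(C)  # C from the previous sketch
```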
  • With respect to a projected point k on the work surface, the projected point may be represented by the variable Pk and has homogeneous coordinates:

  • Pk = (xk yk zk 1)T
  • For example, the projected point originates from the pixel of the projector image, which may be represented by:

  • Pp = Cp Pk  (1)
  • The same projected point appears on a camera image plane (e.g., for the first camera) represented by:

  • Pc1 = Cc1 Pk  (2)
  • The projected point appears on a different camera image plane (e.g., for the second camera) represented by:

  • Pc2 = Cc2 Pk  (3)
  • where Pp, Pc1, and Pc2 are of the form P = (u v w)T.
  • It may be understood that the two dimensional image coordinates that are observable are represented by:

  • (u/w, v/w) = (u′, v′)
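  • The following short sketch illustrates equations (1) to (3) and the division by w with made-up matrices and a made-up surface point; it is a numerical illustration of the model, not data from the invention.

```python
import numpy as np

def project(C, P_k):
    """Apply a 3x4 calibration matrix to a homogeneous point and return (u', v')."""
    u, v, w = C @ P_k
    return u / w, v / w

# Placeholder matrices standing in for C_p, C_c1, C_c2 (illustrative only).
C_p  = np.hstack([np.eye(3), np.zeros((3, 1))])
C_c1 = np.hstack([np.eye(3), np.array([[ 0.1], [0.0], [0.0]])])
C_c2 = np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])

P_k = np.array([0.5, 0.2, 2.0, 1.0])   # a made-up surface point (x, y, z, 1)
u_p,  v_p  = project(C_p,  P_k)        # equation (1): projector pixel
u_c1, v_c1 = project(C_c1, P_k)        # equation (2): first camera pixel
u_c2, v_c2 = project(C_c2, P_k)        # equation (3): second camera pixel
```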
  • By way of example, writing out the above-described matrix equations (1) to (3) results in six different scalar equations:

  • u′p (Cp(3) Pk) − Cp(1) Pk = 0

  • v′p (Cp(3) Pk) − Cp(2) Pk = 0

  • u′c1 (Cc1(3) Pk) − Cc1(1) Pk = 0

  • v′c1 (Cc1(3) Pk) − Cc1(2) Pk = 0

  • u′c2 (Cc2(3) Pk) − Cc2(1) Pk = 0

  • v′c2 (Cc2(3) Pk) − Cc2(2) Pk = 0
  • where C(n) is the n-th row of the matrix C.
  • The camera calibration matrices Cc1 and Cc2 are known based on how the cameras were calibrated off-site (e.g., at the factory). The pixel coordinates of the projector u′p, v′p and the cameras u′c1, v′c1 and u′c2, v′c2 are known from the prior step of determining these correspondences. However, the locations Pk of the points on the work surface are not known and the projector calibration matrix Cp is not known.
  • In order to determine the locations Pk of the points on the work surface and the projector calibration matrix Cp, any solver for (e.g., nonlinear) overdetermined equation systems, such as the Levenberg-Marquardt algorithm, may be used to find a solution for the projector calibration matrix Cp. Moreover, as a byproduct, the point locations Pk can also be solved and a three-dimensional depth map of the work surface may be obtained. The determined projector calibration parameters and/or locations of the points may then be used to recalibrate the projector system, if needed.
  • It may be understood that in the case where only the correspondence of one point P1 is present, there is a system of six equations and 15 unknowns. For every point that is added, six new equations are generated with three more new unknowns. As such, at least four points may be needed, so that 24 equations for 24 unknowns are generated to obtain a unique solution to the equation system. Practically, many more points may be used, which makes the equation system overdetermined. Moreover, it may be understood that the above-described determination principles apply to a different number of cameras. And depending on the number of cameras, more or fewer correspondence points may be needed for the solution to become unique.
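  • A minimal numerical sketch of this solving step follows, using scipy's least-squares routine (method='lm' selects its Levenberg-Marquardt implementation). It stacks the six scalar residuals per point and solves jointly for the twelve entries of Cp and the point locations Pk; the matrices, points, and noise levels are synthetic placeholders, and the initial guess is assumed to come from the factory calibration.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

def residuals(x, C_c1, C_c2, obs):
    """Six scalar equations per correspondence point.

    x packs the 12 entries of the unknown projector matrix C_p followed by the
    3D coordinates (x_k, y_k, z_k) of every surface point P_k.
    """
    n = len(obs)
    C_p = x[:12].reshape(3, 4)
    pts = x[12:].reshape(n, 3)
    res = []
    for k, (proj_uv, cam1_uv, cam2_uv) in enumerate(obs):
        P = np.append(pts[k], 1.0)
        for C, (u, v) in ((C_p, proj_uv), (C_c1, cam1_uv), (C_c2, cam2_uv)):
            res.append(u * (C[2] @ P) - C[0] @ P)   # u'(C(3)Pk) - C(1)Pk = 0
            res.append(v * (C[2] @ P) - C[1] @ P)   # v'(C(3)Pk) - C(2)Pk = 0
    return np.asarray(res)

def pixel(C, P):
    u, v, w = C @ np.append(P, 1.0)
    return (u / w, v / w)

# Synthetic ground truth (placeholders, not values from the invention).
K = np.array([[1400., 0., 960.], [0., 1400., 540.], [0., 0., 1.]])
C_c1 = K @ np.hstack([np.eye(3), [[-0.2], [0.], [0.]]])      # known camera matrices
C_c2 = K @ np.hstack([np.eye(3), [[ 0.2], [0.], [0.]]])
C_p_true = K @ np.hstack([np.eye(3), [[0.], [0.05], [0.]]])  # "true" projector matrix
points = rng.uniform([-0.5, -0.5, 1.5], [0.5, 0.5, 2.5], size=(8, 3))
obs = [(pixel(C_p_true, P), pixel(C_c1, P), pixel(C_c2, P)) for P in points]

# Initial guess: factory calibration plus rough point estimates, slightly perturbed.
x0 = np.concatenate([
    (C_p_true * (1 + 0.01 * rng.normal(size=(3, 4)))).ravel(),
    (points + 0.05 * rng.normal(size=points.shape)).ravel(),
])
fit = least_squares(residuals, x0, method='lm', args=(C_c1, C_c2, obs))
C_p_est = fit.x[:12].reshape(3, 4)
# Compare up to the overall scale of C_p, which the equations leave free.
print(np.abs(C_p_est / C_p_est[2, 2] - C_p_true / C_p_true[2, 2]).max())
```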
  • Variants and/or alternative embodiments of the above-described embodiment of the present invention will now be described.
  • In one embodiment, projector recalibration may be implemented and combined with the step of surface profiling (as further described in U.S. application Ser. No. 15/639,308, as set forth above). In that regard, recalibration and surface profiling may be executed by the projector system simultaneously. Accordingly, for example, as an alternative to projecting specific patterns, other types of patterns (e.g., decoded patterns) used by a structured-light approach to surface profiling may be used as input for solving the equation system(s) for the unknown projector calibration parameters.
  • In another embodiment, an alternative to projecting specific patterns onto the work surface to determine the projector calibration parameters is using the regular user image (e.g., construction-related blueprint image) that is projected onto the surface during operation on-site (for example, at a construction site).
  • In yet another embodiment, the change in projector calibration parameters due to temperature or mechanical stress may be relatively small in practical situations, which results in small amounts of parameter deviation from the factory calibration parameters. In order to account for these small deviations, the solution for the equation system above may be constrained to be close to a given set of calibration parameters, which can be derived, for instance, from the actual factory calibration.
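  • One straightforward way to realize such a constraint, sketched below as an implementation assumption rather than a prescribed method, is to append weighted penalty residuals that pull the solution toward the factory parameters; the weight is an illustrative tuning value.

```python
import numpy as np

def residuals_constrained(x, C_c1, C_c2, obs, C_p_factory, weight=1e-3):
    """Reprojection residuals plus a soft pull toward the factory calibration.

    Reuses the residuals() function from the earlier solver sketch; the weight
    trades off measurement fit against closeness to the factory parameters.
    """
    res = residuals(x, C_c1, C_c2, obs)
    prior = weight * (x[:12] - C_p_factory.ravel())
    return np.concatenate([res, prior])
```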
  • In a further embodiment, a positioning system (which includes the one or more cameras) may separately be attached or snapped onto an off-the-shelf projector. If there is at least one camera with an overlapping field of view with the projector, the techniques described in the present invention may be used to determine the intrinsic (e.g., focal length, etc.) and extrinsic (e.g., relative position of projector to positioning system) projector calibration parameters. In at least that regard, the off-the-shelf projector may be used to project a positionally correct image on the work surface.
  • In yet a further embodiment, the present invention may be applied to any projector-camera system, not only in the visual light spectrum, but also for example in the infrared spectrum, as numerous commercial three-dimensional sensors use projector-camera pairs (such as infrared).
  • FIG. 3 illustrates a flow diagram 300 of recalibrating a projector system in accordance with one or more principles of the present invention. It may be understood that the steps of the flow diagram 300 may be performed or executed by one or more processors of a computing device, whether via the example projector system 110 of FIG. 1 or via the one or more processors 112 of the projector system 110. Moreover, it may be understood that the order of the steps in FIG. 3 is not limited thereto; the steps may be arranged in any suitable order.
  • In step 302, a projector of the projector system may project one or more specific patterns onto a work surface. In step 304, images of each of the projected patterns may be captured by one or more cameras of the projector system. As described above, the specific patterns may include any suitable geometric shape, e.g., rectangles, squares, triangles, etc., or any suitable combination thereof.
  • In step 306, analysis is performed by the one or more processors on the captured images. The analysis may involve and include ascertaining various parameters from the projected patterns in order to solve the equation system(s), described above, along with the parameters that are already known, such as the positional relationship of the cameras with respect to each other and the projector, etc.
  • In step 308, the projector calibration parameters are determined based on the analysis in step 306, and in step 310, the projector system recalibrates in order to project an image onto the work surface at a correct position based on the determined projector calibration parameters.
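  • Expressed as Python-style pseudocode, the flow of FIG. 3 might look like the driver routine below; the object interfaces (generate_patterns, project, capture, apply_calibration) and helper callables are hypothetical names, not APIs defined by the invention.

```python
def recalibrate(projector, cameras, find_correspondences, solve_calibration):
    """Steps 302-310 of FIG. 3 as a simple driver loop (hypothetical interfaces)."""
    patterns = projector.generate_patterns()                     # step 302
    captures = []
    for pattern in patterns:
        projector.project(pattern)
        captures.append([cam.capture() for cam in cameras])      # step 304
    correspondences = find_correspondences(patterns, captures)   # step 306
    calibration = solve_calibration(correspondences)             # step 308
    projector.apply_calibration(calibration)                     # step 310
    return calibration
```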
  • Numerous advantages of the present invention include, but are not limited to: the projector not having to be stable over time and temperature; recalibration of the projector may be performed in the field; recalibration may be done without user intervention; no specific calibration setup is required; recalibration and surface profiling may be performed simultaneously; and the projector focus may be adjusted manually to show a crisp image and the projector may automatically adjust to this focus setting so as to guarantee correct position and projection. An overarching advantage of the present invention is that recalibrating the projector system, for example on-site and in real-time, will increase overall accuracy of the projector system, especially in fields where precision and accuracy of the projection of an image is required. The recalibration is, thus, performed essentially in the hands of the user and/or operator, without having to send the projector off-site for recalibration.
  • The foregoing invention has been set forth merely to illustrate the invention and is not intended to be limiting. Since modifications of the disclosed embodiments incorporating the spirit and substance of the invention may occur to persons skilled in the art, the invention should be construed to include everything within the scope of the appended claims and equivalents thereof. Although the present disclosure uses terminology and acronyms that may not be familiar to the layperson, those skilled in the art will be familiar with the terminology and acronyms used herein.

Claims (19)

What is claimed is:
1. A recalibratable projector system, comprising:
one or more cameras;
a projector; and
at least one processor configured to execute stored instructions to:
project one or more patterns on to a work surface via the projector,
capture images of the projected one or more patterns via the one or more cameras,
perform analysis on the captured images of the one or more patterns,
determine projector calibration parameters based at least in part on the performed analysis on the images, and
recalibrate the projector system such that an image is projected onto the work surface by the projector at a correct position based on the determined projector calibration parameters.
2. The recalibratable projector system of claim 1, further comprising one or more of the following: (i) at least one laser scanner and (ii) at least one surface profiling sensor, wherein the one or more cameras, the at least one laser scanner, the at least one surface profiling sensor, and the projector are configured to profile the work surface.
3. The recalibratable projector system of claim 1, wherein the one or more cameras are mechanically affixed to the projector system and calibrated relative to each other and relative to the projector.
4. The recalibratable projector system of claim 3, wherein the calibration of the one or more cameras is performed only once and is a known parameter.
5. The recalibratable projector system of claim 3, wherein the one or more cameras are mechanically affixed to the projector system via one or more of the following: (i) a screw, (ii) a fastener, and (iii) an adhesive.
6. The recalibratable projector system of claim 1, wherein the one or more patterns includes specific geometric shapes.
7. The recalibratable projector system of claim 1, wherein the at least one processor is further configured to profile the work surface, and wherein the profiling of the work surface and recalibration of the projector system are performed simultaneously.
8. The recalibratable projector system of claim 1, wherein the determination of the projector calibration parameters includes calculating correspondences between points from the one or more cameras and the projector.
9. The recalibratable projector system of claim 1, wherein the determination of the projector calibration parameters includes solving at least one equation system for unknown parameters.
10. The recalibratable projector system of claim 9, wherein the at least one equation system is solved based on a Levenberg-Marquardt algorithm.
11. The recalibratable projector system of claim 9, wherein the determination of the projector calibration parameters includes obtaining intrinsic and extrinsic parameters.
12. The recalibratable projector system of claim 11, wherein the intrinsic parameters includes one or more of the following: (i) projector focal lengths in an x-direction and a y-direction, (ii) principal point in the x-direction and the y-direction, and (iii) skew coefficient between an x-axis and a y-axis.
13. The recalibratable projector system of claim 11, wherein the extrinsic parameters include one or more of the following: (i) translation vector between the projector and the one or more cameras and (ii) rotation matrix between the projector and the one or more cameras.
14. The recalibratable projector system of claim 1, wherein the one or more patterns includes the image itself.
15. The recalibratable projector system of claim 1, wherein the one or more cameras includes an infrared camera.
16. The recalibratable projector system of claim 1, wherein the image is a blueprint of a construction-related task.
17. The recalibratable projector system of claim 1, wherein the projector is a position-enabled projector.
18. A method for recalibrating a projector system, the method comprising the steps of:
projecting, by at least one processor, one or more patterns onto a work surface via a projector;
capturing, by the at least one processor, images of the projected one or more patterns via one or more cameras;
performing, by the at least one processor, analysis on the captured images of the one or more patterns;
determining, by the at least one processor, projector calibration parameters based at least in part on the performed analysis on the images;
recalibrating, by the at least one processor, the projector system such that an image is projected onto the work surface by the projector at a correct position based on the determined projector calibration parameters; and
using the projected, recalibrated image in a construction task.
19. A non-transitory computer-readable medium comprising a set of executable instructions, the set of executable instructions when executed by at least one processor causes the at least one processor to perform a method for recalibrating a projector system, the method comprising the steps of:
projecting one or more patterns onto a work surface via a projector;
capturing images of the projected one or more patterns via one or more cameras;
performing analysis on the captured images of the one or more patterns;
determining projector calibration parameters based at least in part on the performed analysis on the images; and
recalibrating the projector system such that an image is projected onto the work surface by the projector at a correct position based on the determined projector calibration parameters.
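By way of illustration only (the claims do not mandate any particular model), the intrinsic parameters of claim 12 and the extrinsic parameters of claim 13 are conventionally combined in a pinhole projection model. Under that assumed model, with focal lengths f_x and f_y, principal point (c_x, c_y), skew coefficient s, and rotation R and translation t between the projector and one of the cameras, a 3-D point (X, Y, Z) on the work surface maps to a projector pixel (u, v) as:

```latex
\lambda \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
  = K \,[\, R \mid t \,]
    \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix},
\qquad
K = \begin{bmatrix} f_x & s & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}.
```

Stacking these equations over the pattern points recovered from the camera-to-projector correspondences of claim 8 yields the over-determined equation system in the unknown parameters recited in claim 9.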
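The sketch below shows one hypothetical way the solve of claims 9 and 10 could be carried out numerically, assuming a single camera, a chessboard-style projected pattern, and OpenCV plus SciPy as the numerical back end; the helper names, pattern size, and ten-parameter layout are illustrative assumptions, not a description of the claimed system's implementation.

```python
# Illustrative sketch only -- not the claimed implementation.
# Estimates projector calibration parameters from captured images of a
# projected pattern, then refines them with Levenberg-Marquardt (claims 9-10).
import numpy as np
import cv2
from scipy.optimize import least_squares

PATTERN_SIZE = (9, 6)  # assumed chessboard-style projected pattern

def detect_pattern_points(captured_image):
    """Locate the projected pattern's features in a captured camera image
    (the 'analysis on the captured images' of claim 1)."""
    found, corners = cv2.findChessboardCorners(captured_image, PATTERN_SIZE)
    if not found:
        raise RuntimeError("projected pattern not detected in captured image")
    return corners.reshape(-1, 2).astype(np.float64)

def reprojection_residuals(params, object_points, image_points):
    """Difference between observed pattern points and points reprojected with the
    current estimate: fx, fy, cx, cy plus a Rodrigues rotation and a translation."""
    fx, fy, cx, cy = params[:4]
    rvec, tvec = params[4:7], params[7:10]
    K = np.array([[fx, 0.0, cx],
                  [0.0, fy, cy],
                  [0.0, 0.0, 1.0]])
    projected, _ = cv2.projectPoints(object_points, rvec, tvec, K, None)
    return (projected.reshape(-1, 2) - image_points).ravel()

def recalibrate(object_points, image_points, initial_params):
    """Solve the over-determined equation system for the unknown intrinsic and
    extrinsic parameters using the Levenberg-Marquardt algorithm."""
    result = least_squares(reprojection_residuals, initial_params, method="lm",
                           args=(object_points, image_points))
    return result.x  # refined fx, fy, cx, cy, rotation vector, translation vector
```

In such a sketch, object_points would hold the known 3-D coordinates of the pattern on the work surface (obtained, for example, with the profiling components of claim 2), image_points the features returned by detect_pattern_points, and the returned vector the refined focal lengths, principal point, and projector pose used to reproject the image at the correct position as in claim 1.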
US15/970,764 2017-08-11 2018-05-03 System and Method for Recalibrating a Projector System Abandoned US20190052851A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US15/970,764 US20190052851A1 (en) 2017-08-11 2018-05-03 System and Method for Recalibrating a Projector System
US16/637,965 US11050983B2 (en) 2017-08-11 2018-07-25 System and method for recalibrating a projector system
EP18753078.7A EP3665898A1 (en) 2017-08-11 2018-07-25 System and method for recalibrating a projector system
PCT/EP2018/070168 WO2019029991A1 (en) 2017-08-11 2018-07-25 System and method for recalibrating a projector system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201715674755A 2017-08-11 2017-08-11
US15/970,764 US20190052851A1 (en) 2017-08-11 2018-05-03 System and Method for Recalibrating a Projector System

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US201715674755A Continuation 2017-08-11 2017-08-11

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US201715674755A Continuation 2017-08-11 2017-08-11
US16/637,965 Continuation US11050983B2 (en) 2017-08-11 2018-07-25 System and method for recalibrating a projector system

Publications (1)

Publication Number Publication Date
US20190052851A1 true US20190052851A1 (en) 2019-02-14

Family

ID=63168369

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/970,764 Abandoned US20190052851A1 (en) 2017-08-11 2018-05-03 System and Method for Recalibrating a Projector System
US16/637,965 Active US11050983B2 (en) 2017-08-11 2018-07-25 System and method for recalibrating a projector system

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/637,965 Active US11050983B2 (en) 2017-08-11 2018-07-25 System and method for recalibrating a projector system

Country Status (3)

Country Link
US (2) US20190052851A1 (en)
EP (1) EP3665898A1 (en)
WO (1) WO2019029991A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4015991A1 (en) * 2020-12-15 2022-06-22 Robotic Eyes GmbH Method for visualizing a plan in real dimensions and for designing an object
WO2023010565A1 (en) * 2021-08-06 2023-02-09 中国科学院深圳先进技术研究院 Method and apparatus for calibrating monocular speckle structured light system, and terminal

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6310650B1 (en) 1998-09-23 2001-10-30 Honeywell International Inc. Method and apparatus for calibrating a tiled display
US6437823B1 (en) 1999-04-30 2002-08-20 Microsoft Corporation Method and system for calibrating digital cameras
US6618076B1 (en) * 1999-12-23 2003-09-09 Justsystem Corporation Method and apparatus for calibrating projector-camera system
US7292269B2 (en) 2003-04-11 2007-11-06 Mitsubishi Electric Research Laboratories Context aware projector
JP4286068B2 (en) 2003-06-03 2009-06-24 大塚電子株式会社 Screen quality evaluation method
US8106949B2 (en) 2009-03-26 2012-01-31 Seiko Epson Corporation Small memory footprint light transport matrix capture
NL2007577A (en) * 2010-11-10 2012-05-14 Asml Netherlands Bv Optimization of source, mask and projection optics.
US9200899B2 (en) 2012-03-22 2015-12-01 Virtek Vision International, Inc. Laser projection system and method
CN103369211A (en) * 2012-03-27 2013-10-23 中强光电股份有限公司 Photographing device and automatic projection correction method of projection device
JP6016068B2 (en) 2012-05-16 2016-10-26 株式会社リコー Image projection apparatus, control method therefor, and program
US9121692B2 (en) * 2013-03-13 2015-09-01 Trimble Navigation Limited Method and apparatus for projection of BIM information
US8872924B1 (en) * 2013-06-04 2014-10-28 Ati Technologies Ulc Imaging based auto display grid configuration system and method
US10210607B1 (en) * 2015-04-08 2019-02-19 Wein Holding LLC Digital projection system and method for workpiece assembly
EP3284078B1 (en) 2015-04-17 2024-03-27 Tulip Interfaces Inc. Augmented interface authoring
JP6525772B2 (en) * 2015-06-30 2019-06-05 キヤノン株式会社 Image processing apparatus, image processing method, radiation imaging system, and image processing program
FR3040867A1 (en) 2015-09-11 2017-03-17 Thales Sa MIRE AND METHOD FOR CALIBRATING AN X-RAY IMAGING SYSTEM

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10401716B2 (en) * 2017-10-24 2019-09-03 Canon Kabushiki Kaisha Calibration of projection systems
US11132479B1 (en) * 2017-12-29 2021-09-28 II John Tyson Augmented reality system for component assembly and archival baseline clone
US11443524B2 (en) * 2018-12-10 2022-09-13 Motional Ad Llc Systems and methods for validating sensor calibration
US20220392232A1 (en) * 2018-12-10 2022-12-08 Motional Ad Llc Systems and methods for validating sensor calibration
US11431949B2 (en) * 2019-03-11 2022-08-30 Sony Group Corporation Image processing apparatus, image processing method, and program
US11940539B2 (en) 2019-12-18 2024-03-26 Motional Ad Llc Camera-to-LiDAR calibration and validation
CN110933391A (en) * 2019-12-20 2020-03-27 成都极米科技股份有限公司 Calibration parameter compensation method and device for projection system and readable storage medium
US20210248948A1 (en) * 2020-02-10 2021-08-12 Ebm Technologies Incorporated Luminance Calibration System and Method of Mobile Device Display for Medical Images
US11580893B2 (en) * 2020-02-10 2023-02-14 Ebm Technologies Incorporated Luminance calibration system and method of mobile device display for medical images
US11451755B2 (en) * 2020-02-26 2022-09-20 Seiko Epson Corporation Method for controlling electronic instrument and electronic instrument
CN114205483A (en) * 2022-02-17 2022-03-18 杭州思看科技有限公司 Scanner precision calibration method and device and computer equipment

Also Published As

Publication number Publication date
EP3665898A1 (en) 2020-06-17
WO2019029991A1 (en) 2019-02-14
US20200186768A1 (en) 2020-06-11
US11050983B2 (en) 2021-06-29

Similar Documents

Publication Publication Date Title
US11050983B2 (en) System and method for recalibrating a projector system
KR102129103B1 (en) System and method for calibration of machine vision cameras along at least three discrete planes
US20150116691A1 (en) Indoor surveying apparatus and method
US8520067B2 (en) Method for calibrating a measuring system
CN203084734U (en) System for regenerating virtual object
US20150377606A1 (en) Projection system
WO2006077665A1 (en) Projection device, control method for projection device, composite projection system, control program for projection device, and recording medium having projection device control program recorded therein
WO2020188799A1 (en) Camera calibration device, camera calibration method, and non-transitory computer-readable medium having program stored thereon
CN109906471B (en) Real-time three-dimensional camera calibration
KR102419427B1 (en) Calibration for vision systems
JP6670974B1 (en) Robot coordinate system alignment method, alignment system, and alignment device
JP2021146499A (en) System and method for three-dimensional calibration of vision system
JP2019139030A (en) Method and device for projecting information related to measurement result on surface of three-dimensional measurement target object
JP2021507209A (en) Machine vision system with computer-generated reference objects
CN114463436A (en) Calibration method, system, equipment and storage medium of galvanometer scanning device
Borrmann et al. Spatial projection of thermal data for visual inspection
CN115482276A (en) High-precision calibration method based on phase shift deflection measurement system
WO2021039313A1 (en) Projection method, projection device, and projection system
JP2005186193A (en) Calibration method and three-dimensional position measuring method for robot
US20200151848A1 (en) System and Method for Surface Profiling
WO2023135764A1 (en) Robot device provided with three-dimensional sensor and method for controlling robot device
TWM547661U (en) System for correcting 3D visual coordinate
WO2022181500A1 (en) Simulation device using three-dimensional position information obtained from output from vision sensor
Chen et al. A novel camera calibration method based on known rotations and translations
JP2022158835A (en) Estimation device of interval of fixture

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION