WO2017006779A1 - Information processing apparatus and method, and projection imaging apparatus and information processing method - Google Patents
Information processing apparatus and method, and projection imaging apparatus and information processing method
- Publication number
- WO2017006779A1 (PCT/JP2016/068753)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- projection
- unit
- imaging
- posture
- imaging apparatus
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1423—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
- G06F3/1446—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display display composed of modules, e.g. video walls
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/14—Details
- G03B21/20—Lamp housings
- G03B21/2006—Lamp housings characterised by the light source
- G03B21/2033—LED or laser light sources
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0025—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for optical correction, e.g. distorsion, aberration
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0025—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for optical correction, e.g. distorsion, aberration
- G02B27/0068—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for optical correction, e.g. distorsion, aberration having means for controlling the degree of correction, e.g. using phase modulators, movable elements
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
- G09G3/002—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to project the image of a two-dimensional display, such as an array of light emitting or modulating elements or a CRT
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/74—Projection arrangements for image reproduction, e.g. using eidophor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
- H04N9/3147—Multi-projection systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
- H04N9/3185—Geometric adjustment, e.g. keystone or convergence
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/0693—Calibration of display systems
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0464—Positioning
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0464—Positioning
- G09G2340/0471—Vertical positioning
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0464—Positioning
- G09G2340/0478—Horizontal positioning
Definitions
- The present technology relates to an information processing apparatus and method, and to a projection imaging apparatus and information processing method, and in particular to an information processing apparatus and method, a projection imaging apparatus, and an information processing method that can suppress an increase in the amount of processing related to updating correction information used for geometric correction and the like.
- The posture of a projector and the shape of the projection surface may change while an image such as content is being projected. To cope with this, online sensing, a technique for obtaining the correspondence relationship between pixels while an image such as content continues to be projected, has been proposed. Proposed online sensing methods include a method of embedding a Gray code in the projected image (see, for example, Non-Patent Document 1), a method using image feature amounts such as SIFT, and a method using invisible light such as infrared light.
- Conventionally, however, the correction information has been updated for all projectors. That is, as described above, all the processes such as pixel correspondence calculation, projector posture estimation, and projection plane shape estimation are performed for all projectors. For example, even when the postures of only some projectors change, all of these processes are performed for all projectors. As a result, the amount of processing related to updating the correction information used for geometric correction and the like increases, which may increase the processing time.
- the present technology has been proposed in view of such a situation, and an object thereof is to suppress an increase in the amount of processing related to updating correction information used for geometric correction or the like.
- An information processing apparatus of the present technology includes: a corresponding point detection unit that, for projection imaging apparatuses each having a projection unit that projects an image onto a projection plane and an imaging unit that images the projection plane, detects corresponding points between pixels of the projection unit and pixels of the imaging unit between a projection imaging apparatus whose posture has changed and a projection imaging apparatus whose posture has not changed; and a relative posture estimation unit that estimates, based on the corresponding points between the pixels of the projection unit and the pixels of the imaging unit detected by the corresponding point detection unit, the relative posture of the projection imaging apparatus whose posture has changed with respect to the projection imaging apparatus whose posture has not changed.
- the corresponding point detection unit can detect a corresponding point between the pixel of the projection unit of the projection imaging apparatus whose posture has not changed and the pixel of the imaging unit of the projection imaging device whose posture has changed.
- the corresponding point detection unit can detect a corresponding point between the pixel of the projection unit of the projection imaging apparatus whose posture has changed and the pixel of the imaging unit of the projection imaging device whose posture has not changed.
- The information processing apparatus may further include: a corresponding point change detection unit that detects a change in the corresponding points between the pixels of the projection unit and the pixels of the imaging unit of a projection imaging apparatus; a posture change detection unit that detects a change in the posture of the projection imaging apparatus based on the sensor output of a sensor unit that detects at least one of the position and the direction of the projection imaging apparatus; and a change determination unit that determines whether or not the posture of the projection imaging apparatus has changed based on the detection result of the corresponding point change detection unit and the detection result of the posture change detection unit. In this case, the corresponding point detection unit may detect the corresponding points between the projection imaging apparatus determined by the change determination unit to have a changed posture and the projection imaging apparatus determined by the change determination unit not to have a changed posture, and the relative posture estimation unit may estimate the relative posture with respect to the projection imaging apparatus determined by the change determination unit not to have a changed posture.
- The corresponding point detection unit may be further configured to detect corresponding points between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus whose posture has changed, and a projection plane shape estimation unit that estimates the shape of the projection plane based on those corresponding points detected by the corresponding point detection unit may be further provided.
- The corresponding point detection unit may be further configured to detect corresponding points between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus whose posture has not changed, and a projection plane shape estimation unit that estimates the shape of the projection plane based on those corresponding points detected by the corresponding point detection unit may be further provided.
- The projection plane shape estimation unit may estimate the shape of the projection plane only for a portion whose shape is unknown within the range onto which an image is projected by the projection unit of the projection imaging apparatus whose posture has changed.
- An information processing method of the present technology is an information processing method in which, for projection imaging apparatuses each having a projection unit that projects an image onto a projection plane and an imaging unit that images the projection plane, corresponding points between the pixels of the projection unit and the pixels of the imaging unit are detected between a projection imaging apparatus whose posture has changed and a projection imaging apparatus whose posture has not changed, and the relative posture of the projection imaging apparatus whose posture has changed with respect to the projection imaging apparatus whose posture has not changed is estimated based on the detected corresponding points.
- Another information processing apparatus of the present technology includes: a corresponding point detection unit that, for a projection imaging apparatus having a projection unit that projects an image onto a projection plane and an imaging unit that images the projection plane, detects corresponding points between the pixels of the projection unit and the pixels of the imaging unit of a projection imaging apparatus whose posture has not changed; and a projection plane shape estimation unit that estimates the shape of the projection plane based on the corresponding points detected by the corresponding point detection unit.
- the projection surface shape estimation unit can estimate the shape of the projection surface for a part whose shape is unknown.
- Another information processing method of the present technology is an information processing method in which, for a projection imaging apparatus having a projection unit that projects an image onto a projection plane and an imaging unit that images the projection plane, corresponding points between the pixels of the projection unit and the pixels of the imaging unit of a projection imaging apparatus whose posture has not changed are detected, and the shape of the projection plane is estimated based on the detected corresponding points.
- A projection imaging apparatus of the present technology includes: a projection unit that projects an image onto a projection plane; an imaging unit that images the projection plane; a sensor unit that detects at least one of position and direction; a determination unit that determines whether or not the posture has changed based on the sensor output of the sensor unit; a corresponding point detection unit that, when the determination unit determines that the posture has changed, detects corresponding points between the pixels of the projection unit and the pixels of the imaging unit with another projection imaging apparatus, also including a projection unit, an imaging unit, and a sensor unit, whose posture has not changed; and a relative posture estimation unit that estimates the relative posture with respect to the other projection imaging apparatus based on the corresponding points detected by the corresponding point detection unit.
- the corresponding point detection unit can detect a corresponding point between the pixel of the projection unit of the other projection imaging apparatus and the pixel of the imaging unit of the projection imaging apparatus itself.
- the corresponding point detection unit can detect corresponding points between the pixels of the projection unit of the projection imaging apparatus itself and the pixels of the imaging unit of the other projection imaging apparatus.
- The projection imaging apparatus may further include: a corresponding point change detection unit that detects a change in the corresponding points between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus itself; and a posture change detection unit that detects a change in the posture of the projection imaging apparatus itself based on the sensor output of the sensor unit. In this case, the determination unit may determine whether or not the posture of the projection imaging apparatus itself has changed based on the detection result of the corresponding point change detection unit and the detection result of the posture change detection unit.
- The corresponding point detection unit may be further configured to detect corresponding points between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus itself, and a projection plane shape estimation unit that estimates the shape of the projection plane based on those corresponding points detected by the corresponding point detection unit may be further provided.
- the projection plane shape estimation unit can estimate the shape of the projection plane for a portion whose shape is unknown within a range in which an image is projected by the projection unit of the projection imaging apparatus itself.
- An information processing method of the present technology is an information processing method for a projection imaging apparatus including a projection unit that projects an image onto a projection plane, an imaging unit that images the projection plane, and a sensor unit that detects at least one of position and direction, in which whether or not the posture has changed is determined based on the sensor output of the sensor unit and, when it is determined that the posture has changed, corresponding points between the pixels of the projection unit and the pixels of the imaging unit are detected with another projection imaging apparatus, also including a projection unit, an imaging unit, and a sensor unit, whose posture has not changed, and the relative posture with respect to the other projection imaging apparatus is estimated based on the detected corresponding points.
- Another projection imaging apparatus of the present technology includes: a projection unit that projects an image onto a projection plane; an imaging unit that images the projection plane; a sensor unit that detects at least one of position and direction; a determination unit that determines whether or not the posture has changed based on the sensor output of the sensor unit; a corresponding point detection unit that, when the determination unit determines that the posture has not changed, detects corresponding points between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus itself; and a projection plane shape estimation unit that estimates the shape of the projection plane based on the corresponding points detected by the corresponding point detection unit.
- Another information processing method of the present technology is an information processing method for a projection imaging apparatus including a projection unit that projects an image onto a projection plane, an imaging unit that images the projection plane, and a sensor unit that detects at least one of position and direction, in which whether or not the posture has changed is determined based on the sensor output of the sensor unit and, when it is determined that the posture has not changed, corresponding points between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus itself are detected, and the shape of the projection plane is estimated based on the detected corresponding points.
- In the present technology, for projection imaging apparatuses each having a projection unit that projects an image onto a projection plane and an imaging unit that images the projection plane, corresponding points between the pixels of the projection unit and the pixels of the imaging unit are detected between a projection imaging apparatus whose posture has changed and a projection imaging apparatus whose posture has not changed, and the relative posture of the projection imaging apparatus whose posture has changed with respect to the projection imaging apparatus whose posture has not changed is estimated based on the detected corresponding points.
- Also in the present technology, for a projection imaging apparatus having a projection unit that projects an image onto a projection plane and an imaging unit that images the projection plane, corresponding points between the pixels of the projection unit and the pixels of the imaging unit of a projection imaging apparatus whose posture has not changed are detected, and the shape of the projection plane is estimated based on the detected corresponding points.
- Also in the present technology, a projection imaging apparatus includes a projection unit that projects an image onto a projection plane, an imaging unit that images the projection plane, and a sensor unit that detects at least one of position and direction; whether or not the posture has changed is determined based on the sensor output of the sensor unit, and, when it is determined that the posture has changed, corresponding points between the pixels of the projection unit and the pixels of the imaging unit are detected with another projection imaging apparatus, also including a projection unit, an imaging unit, and a sensor unit, whose posture has not changed, and the relative posture with respect to the other projection imaging apparatus is estimated based on the detected corresponding points.
- Also in the present technology, a projection imaging apparatus includes a projection unit that projects an image onto a projection plane, an imaging unit that images the projection plane, and a sensor unit that detects at least one of position and direction; whether or not the posture has changed is determined based on the sensor output of the sensor unit, and, when it is determined that the posture has not changed, corresponding points between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus itself are detected, and the shape of the projection plane is estimated based on the detected corresponding points.
- According to the present technology, information can be processed. In particular, according to the present technology, it is possible to suppress an increase in the amount of processing related to updating correction information used for geometric correction and the like.
- a projection imaging system 100 shown in FIG. 1 is a system that projects an image using a plurality of projection imaging devices 102.
- the projection imaging system 100 includes a control device 101, projection imaging devices 102-1 to 102-4, and communication cables 103-1 to 103-4.
- the projection imaging device 102-1 to the projection imaging device 102-4 are connected to the control device 101 via the communication cable 103-1 to the communication cable 103-4, respectively.
- the control device 101 communicates with each of the projection imaging device 102-1 to the projection imaging device 102-4 to control these operations.
- For example, the control device 101 causes each of the projection imaging devices 102-1 to 102-4 to project an image onto the projection plane 104 or to capture the image projected on the projection plane 104 (the projection image).
- The control device 101 can, for example, calibrate the relative posture (for example, the rotation component (relative direction) and the translation component (relative position)) of the projection unit 111 and the imaging unit 112 of each of the projection imaging devices 102-1 to 102-4. Further, for example, the control device 101 can use the calibration result to perform correction (alignment, geometric correction, and the like) on the images projected by the projection imaging devices 102-1 to 102-4.
- Each of the projection imaging devices 102-1 to 102-4 is controlled by the control device 101 and projects an image onto the projection plane 104 or captures the projection plane 104 (for example, a projection image projected on the projection plane 104). That is, the projection imaging devices 102-1 to 102-4 have the same configuration and the same functions. These projection imaging devices 102-1 to 102-4 are referred to as the projection imaging device 102 when there is no need to distinguish them from one another.
- the projection imaging apparatus 102 includes a projection unit 111 and an imaging unit 112. That is, the projection imaging apparatus 102-1 includes a projection unit 111-1 and an imaging unit 112-1.
- the projection imaging apparatus 102-2 includes a projection unit 111-2 and an imaging unit 112-2.
- the projection imaging apparatus 102-3 includes a projection unit 111-3 and an imaging unit 112-3.
- the projection imaging apparatus 102-4 includes a projection unit 111-4 and an imaging unit 112-4.
- the projection units 111-1 to 111-4 each project an image. That is, the projection units 111-1 to 111-4 have the same configuration and the same function.
- the projection units 111-1 to 111-4 are referred to as the projection unit 111 when it is not necessary to distinguish between them.
- each of the imaging units 112-1 to 112-4 captures a subject and obtains a captured image. That is, the imaging unit 112-1 to the imaging unit 112-4 have the same configuration and the same function.
- the imaging units 112-1 to 112-4 are referred to as the imaging unit 112 when it is not necessary to distinguish them from each other.
- each of the communication cables 103-1 to 103-4 is a communication medium corresponding to a predetermined communication standard. That is, the communication cables 103-1 to 103-4 have the same configuration and the same function.
- the communication cables 103-1 to 103-4 are referred to as communication cables 103 when there is no need to distinguish between them.
- the communication cable 103 is, for example, a cable compliant with HDMI (registered trademark) (High-Definition Multimedia Interface).
- the communication standard supported by the communication cable 103 is arbitrary, and may correspond to a communication standard other than HDMI (registered trademark) such as a display port (DisplayPort).
- the projection surface 104 is an example of a target (projection target) on which the projection imaging apparatus 102 projects an image.
- the projection surface 104 may be a flat surface, a curved surface, a surface that is partially or entirely uneven, or a plurality of surfaces.
- the color of the projection surface 104 is arbitrary, and may be composed of a plurality of colors, or may have a pattern or a pattern.
- the projection surface 104 may be formed on an arbitrary object.
- the projection surface 104 may be formed on a planar object such as a so-called screen or wall surface.
- the projection surface 104 may be formed in a three-dimensional structure.
- For example, it may be formed on the wall surface of a structure such as a building, a station building, or a castle, on a natural object such as a rock, on an artificial object such as a sign or a statue, or on furniture such as a fence, a chair, or a desk.
- It may also be formed on a living thing such as a person, an animal, or a plant.
- the projection surface 104 may be formed on a plurality of surfaces such as a wall, floor, ceiling, etc. of the room space.
- the projection surface 104 may be formed as a solid, or may be formed as a liquid or a gas. For example, it may be formed on a water surface such as a pond or a pool, a water surface such as a waterfall or fountain, or a gas such as mist or gas. Further, the projection plane 104 may move, deform, or change color. The projection surface 104 may be formed on a plurality of objects such as a room wall, furniture and a person, a plurality of buildings, a castle wall and a fountain, and the like.
- A projection image 105-1 on the projection plane 104 is an image projected by the projection unit 111-1 of the projection imaging device 102-1.
- the projected image 105-2 is an image projected by the projection unit 111-2 of the projection imaging apparatus 102-2.
- the projection image 105-3 is an image projected by the projection unit 111-3 of the projection imaging apparatus 102-3.
- the projection image 105-4 is an image projected by the projection unit 111-4 of the projection imaging apparatus 102-4.
- Although the position of each projection image 105 on the projection plane 104 is arbitrary, in the example of FIG. 1 the projection images 105 together form one projection area (an area onto which an image is projected) on the projection plane 104.
- the projection image 105-1 forms the upper left part of the one projection area
- the projection image 105-2 forms the lower left part of the one projection area
- the projection image 105-3 forms the upper right part of the one projection area.
- the projection image 105-4 forms the lower right part of the one projection area.
- The projection images 105 are arranged such that parts of them overlap one another.
- Each projection imaging device 102 (projection unit 111) is arranged in such a posture (position and projection direction) as to realize such one projection region.
- the control device 101 can control each projection imaging device 102 to perform so-called projection mapping.
- the control apparatus 101 can project an image from each projection imaging apparatus 102 and project it as one projection image on the above-described one projection area.
- the control device 101 can project each input image 106 on the projection surface 104 as one corrected projection image 107 by causing the projection imaging devices 102 to cooperate with each other.
- For example, the control device 101 divides one input image 106 into two in the vertical direction and two in the horizontal direction (four parts in total), supplies each partial image to the corresponding projection imaging device 102, and causes it to be projected onto the projection plane 104. These projected images are combined on the projection plane 104 to form one projection image.
- In this way, a projection image having a resolution higher than that of a single projection unit 111 (in other words, a projection image larger than the maximum size of an image that a single projection unit 111 can project) can be realized.
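- As a rough illustration of this splitting (not taken from the patent; the function name and image sizes below are hypothetical, and a practical system would also leave overlap margins for the blending described later), one input image could be divided into four partial images as follows:

```python
import numpy as np

def split_into_quadrants(image: np.ndarray) -> dict:
    """Split one input image into 2x2 partial images, one per projection
    imaging device (upper-left, lower-left, upper-right, lower-right)."""
    h, w = image.shape[:2]
    half_h, half_w = h // 2, w // 2
    return {
        "upper_left":  image[:half_h, :half_w],
        "lower_left":  image[half_h:, :half_w],
        "upper_right": image[:half_h, half_w:],
        "lower_right": image[half_h:, half_w:],
    }

# Example: a 2160x3840 input image yields four 1080x1920 partial images,
# each supplied to one projection unit 111.
frame = np.zeros((2160, 3840, 3), dtype=np.uint8)
parts = split_into_quadrants(frame)
print({name: part.shape for name, part in parts.items()})
```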
- <Image correction> When the projection images are combined in this way, the images projected by the individual projection units 111 must be joined without any sense of incongruity so as to form one projection image (the corrected projection image 107). Therefore, the control device 101 performs correction processing such as alignment, geometric correction, and image quality correction (for example, luminance, color, and resolution) on each image projected by each projection imaging device 102. In particular, blending processing or the like is performed on the overlap regions onto which a plurality of projection units 111 project, so that an image with no sense of incongruity is obtained.
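- The blending method is not detailed here; as one common, minimal approach (an assumption, not the patent's prescribed method), complementary linear weights can be applied across the overlap band so that the two projectors' contributions always sum to one. The sketch below assumes a purely horizontal overlap and illustrative sizes:

```python
import numpy as np

def horizontal_blend_weights(width: int, overlap: int, side: str) -> np.ndarray:
    """Per-column blend weights for one projector's image.

    Inside the overlap band shared with the neighbouring projector the weight
    ramps linearly between 1 and 0, so that the two contributions sum to 1
    and the seam is not visible."""
    w = np.ones(width, dtype=np.float32)
    ramp = np.linspace(1.0, 0.0, overlap, dtype=np.float32)
    if side == "right":              # overlap band at the right edge
        w[width - overlap:] = ramp
    else:                            # overlap band at the left edge
        w[:overlap] = ramp[::-1]
    return w

# Two 1920-pixel-wide projections sharing a 200-pixel overlap:
left = horizontal_blend_weights(1920, 200, side="right")
right = horizontal_blend_weights(1920, 200, side="left")
# Within the overlap the weights are complementary: left[-200:] + right[:200] == 1.
```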
- To perform such correction, the control device 101 obtains the correspondence relationship between the pixels of the projection units 111 and the pixels of the imaging units 112, and uses that correspondence relationship to estimate the posture of each projection imaging device 102 (projection unit 111) and the shape of the projection plane 104.
- the relative posture between the projection imaging apparatuses 102 is estimated by pattern projection (Structured Light) from the projection unit 111 between the plurality of projection imaging apparatuses 102, imaging of the projection plane 104 by the imaging unit 112, and the like.
- shape estimation (depth sensing) of the projection plane 104 is performed by pattern projection (Structured Light) from the projection unit 111 and imaging of the projection plane 104 by the imaging unit 112 in each projection imaging apparatus 102.
- When performing projection mapping as described above, the control device 101 obtains the correspondence between the pixels of the projection units 111 and the imaging units 112 for all the projection imaging devices 102 before projecting an image such as content, estimates the posture of each projection imaging device 102 (projection unit 111) and the shape of the projection plane 104, and sets the correction information. Then, when projecting an image such as content, the control device 101 corrects the image according to the correction information.
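- This initial flow can be summarized as in the sketch below. The container and function names are hypothetical simplifications; the patent does not prescribe a particular data structure:

```python
from dataclasses import dataclass, field

# Hypothetical container for the correction information that the control
# device 101 sets up before content projection; field names are illustrative.
@dataclass
class CorrectionInfo:
    correspondences: dict = field(default_factory=dict)  # pixel correspondences per device
    poses: dict = field(default_factory=dict)            # posture per projection imaging device 102
    surface_shape: object = None                          # shape of the projection plane 104
    warp_maps: dict = field(default_factory=dict)         # geometric-correction maps per projector

def initial_sensing(devices, detect, estimate_pose, estimate_shape, build_warp):
    """Initial sensing: run every step for every device, then build the
    per-projector correction maps applied while content is projected."""
    info = CorrectionInfo()
    for d in devices:
        info.correspondences[d] = detect(d, devices)
        info.poses[d] = estimate_pose(d, info.correspondences[d])
    info.surface_shape = estimate_shape(info.correspondences, info.poses)
    for d in devices:
        info.warp_maps[d] = build_warp(d, info.poses, info.surface_shape)
    return info
```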
- each projection imaging apparatus 102 is calibrated in advance. For example, in each projection imaging apparatus 102, characteristics relating to the projection of the projection unit 111 and characteristics relating to imaging of the imaging unit 112 are clarified (known). Further, in each projection imaging apparatus 102, the attitude (position and projection direction) of the projection unit 111 and the attitude (position and imaging direction) of the imaging unit 112 are fixed and known. That is, in each projection imaging apparatus 102, the relationship between the postures of the projection unit 111 and the imaging unit 112 (also referred to as a relative posture) is known.
- the “projection direction” indicates a direction in which the projection unit 111 projects an image.
- the “imaging direction” indicates a direction in which the imaging unit 112 images a subject.
- Such calibration of internal variables and external variables (calibration, estimation of various variables, etc.) of each projection imaging device 102 is performed in advance using a dedicated device or the like.
- the relative posture between the projection imaging devices 102 is clarified (known). That is, the relative posture between the projection units 111, the relative posture between the imaging units 112, and the relative posture between the projection unit 111 and the imaging unit 112 are clarified between the projection imaging devices 102.
- the position and shape of the projection plane 104 (relative position and shape from each projection unit 111) are also clarified (known). That is, the position and shape of the projection image 105 by the projection imaging device 102 (projection unit 111) on the projection surface 104 are also clarified.
- In this way, the control device 101 can improve the resolution of the projection image by having the plurality of projection imaging devices 102 cooperate (in other words, it can increase the image size while suppressing a reduction in resolution (image quality)).
- The input image 106 (corrected projection image 107) may be a moving image or a still image.
- the corrected projected image 107 is realized by correcting the image to be projected according to the preset correction information. Therefore, if the environment changes during the image projection, the correction information becomes an inappropriate value, and the image quality of the corrected projection image 107 may be reduced.
- For example, if the projection imaging device 102-3 moves (a positional shift occurs), the relative posture between the projection imaging device 102-3 and the projection plane 104 and the like may change. As a result, the state (position, size, shape, image quality, and so on) of the projection image 105-3 may change, and therefore the state (position, size, shape, image quality, and so on) of the partial image 107-3 of the corrected projection image 107 included in the projection image 105-3 may also change. Consequently, the image quality of the corrected projection image 107 may deteriorate, for example through distortion or a partial change in image quality, or the image may be split into a plurality of images and fail to form a single projection image.
- Also, for example, suppose that while each projection imaging device 102 is projecting an image and the corrected projection image 107 is being projected on the projection plane 104, a three-dimensional object 121 is newly placed in front of the position onto which the projection imaging device 102-2 projects. In that case, at least part of the projection image 105-2 projected by the projection imaging device 102-2 is projected onto the three-dimensional object 121. That is, as seen from the projection imaging device 102-2, the shape of the projection plane 104 has changed. For this reason, the state (position, size, shape, image quality, and so on) of the projection image 105-2 may change, and therefore the state (position, size, shape, image quality, and so on) of the partial image 107-2 of the corrected projection image 107 included in the projection image 105-2 may also change. Consequently, the image quality of the corrected projection image 107 may deteriorate, for example through distortion or a partial change in image quality, or the image may be split into a plurality of images and fail to form a single projection image.
- the projection area of the projection surface 104 by the projection unit 111-1 of the projection imaging apparatus 102-1 is a range of P0L to P0R.
- The projection area of the projection unit 111-2 of the projection imaging device 102-2 on the projection plane 104 is the range P1L to P1R. The overlap region, where both projection areas overlap, is therefore the range indicated by the double-headed arrow 130 (the range P1L to P0R), and it is this overlap region that needs to be identified.
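- In one dimension, identifying the overlap region amounts to intersecting the two projection ranges; a minimal sketch with illustrative (made-up) coordinates:

```python
def overlap_range(range_a, range_b):
    """Overlap of two 1-D projection ranges on the projection plane.

    For example, projection unit 111-1 covers (P0L, P0R) and projection unit
    111-2 covers (P1L, P1R); their overlap is (max(P0L, P1L), min(P0R, P1R))."""
    lo = max(range_a[0], range_b[0])
    hi = min(range_a[1], range_b[1])
    return (lo, hi) if lo < hi else None  # None means the ranges do not overlap

# With illustrative coordinates P0L=0.0, P0R=1.2, P1L=0.9, P1R=2.0, the
# overlap region (double-headed arrow 130) is (0.9, 1.2), i.e. P1L to P0R.
print(overlap_range((0.0, 1.2), (0.9, 2.0)))
```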
- The pixel correspondence relationship indicates, as shown in FIG. 5, which pixel of the imaging unit 112 corresponds to which pixel of the projection unit 111. For example, in FIG. 5, assume that light emitted from the projection unit 111-1 (arrow 133) is reflected at point X on the projection plane 104 and received by the imaging unit 112-2 (arrow 134).
- If the correspondence between the pixel of the projection unit 111 that emitted the light and the pixel of the imaging unit 112 that received that light (that is, which pixel of the projection unit 111 each pixel of the imaging unit 112 corresponds to) can be grasped, the overlap region can be detected.
- As a method for acquiring such a pixel correspondence, there is, for example, a method using a Gray code.
- Specifically, predetermined pattern images such as those shown in FIG. 6A are projected from the projection unit 111 while being switched in time series, and each pattern is captured by the imaging unit 112. When the capture of all the patterns is completed, "1" (white) or "0" (black) is detected for each captured pattern at each pixel of the imaging unit 112, and, as shown in FIG. 6B, the projector pixel position is obtained by decoding the resulting pattern of "1"s and "0"s. In this way, the pixel correspondence can be acquired.
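- A minimal sketch of this decoding step is given below, assuming the captured patterns have already been thresholded to 1/0 at each camera pixel; the bit order (most significant pattern first) is an assumption, not something the patent specifies:

```python
import numpy as np

def decode_gray_code(captures: np.ndarray) -> np.ndarray:
    """Decode a stack of thresholded captures into projector coordinates.

    captures: array of shape (num_bits, H, W) holding 1 (white) or 0 (black)
    for each projected Gray-code pattern at every camera pixel. Returns the
    decoded projector column (or row) index per camera pixel."""
    num_bits = captures.shape[0]
    # Gray code -> binary: b[0] = g[0], b[i] = b[i-1] XOR g[i]
    binary = np.empty_like(captures)
    binary[0] = captures[0]
    for i in range(1, num_bits):
        binary[i] = np.bitwise_xor(binary[i - 1], captures[i])
    # Binary bits (MSB first) -> integer coordinate
    weights = 2 ** np.arange(num_bits - 1, -1, -1)
    return np.tensordot(weights, binary, axes=1)

# Example: 10 patterns resolve 2**10 = 1024 projector columns; decoding both a
# column and a row sequence gives the full projector pixel for each camera pixel.
captures = (np.random.rand(10, 480, 640) > 0.5).astype(np.uint8)
projector_columns = decode_gray_code(captures)
```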
- Conventionally, the correction information has been updated for all the projection imaging devices 102. That is, as described above, all the processes such as pixel correspondence calculation, posture estimation of the projection imaging devices 102 (projection units 111), and shape estimation of the projection plane 104 have been performed for all the projection imaging devices 102. For example, even when the postures of only some of the projection imaging devices 102 change, all of these processes are performed for all the projection imaging devices 102. As a result, unnecessary processing, such as redundant processing, may occur for the projection imaging devices 102 whose posture has not changed (in which no positional shift has occurred).
- the amount of processing related to updating correction information used for geometric correction or the like increases unnecessarily, and processing time increases unnecessarily.
- The amount of processing may also increase unnecessarily as the number of cooperating projection imaging devices 102 increases.
- the processing related to the update of the correction information used for such geometric correction can be performed while projecting an image of content or the like by using online sensing.
- In addition, with online sensing the quality of the projected image may be degraded. For example, in the case of the Gray code method, the pattern image is projected superimposed on an image such as content, so depending on that image, the pattern or its switching may become visible to the observer. Such an influence may increase as the processing time of the processing related to updating the correction information described above increases.
- the control device 101 performs each process of detecting corresponding points of the pixels of the projection unit 111 and the imaging unit 112 between the projection imaging device 102, estimating the posture of the projection imaging device 102, and estimating the shape of the projection plane 104. This is performed for all projection imaging apparatuses 102. That is, in the initial sensing, the control device 101 executes all the processes related to the update of correction information used for geometric correction and the like for all the projection imaging devices 102.
- Also, for example, when the postures of all the projection imaging devices 102 have changed while an image such as content is being projected, the control device 101 performs each process of detecting corresponding points between the pixels of the projection units 111 and the imaging units 112 among the projection imaging devices 102, estimating the postures of the projection imaging devices 102, and estimating the shape of the projection plane 104 for all the projection imaging devices 102. That is, in this case, as in initial sensing, the control device 101 executes all the processes related to updating the correction information used for geometric correction and the like for all the projection imaging devices 102.
- In contrast, when the postures of only some of the projection imaging devices 102 change while an image such as content is being projected, the control device 101 only needs to execute processing related to those projection imaging devices 102 whose posture has changed. In this case, therefore, the control device 101 detects corresponding points between the pixels of the projection units 111 and the pixels of the imaging units 112 among the projection imaging devices 102 only for the parts related to the projection imaging devices 102 whose posture has changed.
- the control device 101 estimates the posture only for the projection imaging device 102 whose posture has changed.
- Furthermore, the control device 101 estimates the shape of the projection plane 104 only for the portion whose shape is unknown within the range onto which the projection imaging device 102 whose posture has changed projects.
- For example, suppose that the projection imaging device 102-1 and the projection imaging device 102-2 are arranged such that the projection area of the projection unit 111-1 on the projection plane 104 and the projection area of the projection unit 111-2 on the projection plane 104 overlap in the portion indicated by the double-headed arrow 141. If the projection imaging device 102-1 then moves, the portion indicated by the double-headed arrow 142 becomes the overlap region, which is narrower than before the movement.
- the control apparatus 101 only needs to perform processing relating to the moved projection imaging apparatus 102-1.
- the control device 101 detects pixel corresponding points between the projection unit 111-2 of the projection imaging device 102-2 and the imaging unit 112-1 of the projection imaging device 102-1.
- the relative posture between the projection unit 111-2 and the imaging unit 112-1 can be estimated.
- the projection imaging apparatus 102 has been calibrated, and the relative posture between the projection unit 111-1 and the imaging unit 112-1 is known.
- Therefore, the relative posture between the projection unit 111-2 and the projection unit 111-1 can also be estimated. That is, the relative posture between the projection imaging devices 102 can be estimated by detecting the pixel corresponding points between the projection unit 111-2 and the imaging unit 112-1.
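- The patent does not prescribe a particular algorithm for this estimation. As one conventional approach (an assumption here), the projection unit can be treated as an inverse camera and the relative posture recovered from the corresponding points via an essential matrix, for example with OpenCV; the translation obtained this way is known only up to scale unless additional information, such as the already-known projection plane shape, is used to fix it:

```python
import cv2
import numpy as np

def estimate_relative_pose(proj_pts, cam_pts, K_proj, K_cam):
    """Estimate rotation R and unit-scale translation t of the imaging unit
    relative to the projection unit from matching pixel coordinates.

    proj_pts, cam_pts: (N, 2) float arrays of corresponding pixels in the
    projection unit and the imaging unit.
    K_proj, K_cam: 3x3 intrinsic matrices, known from prior calibration."""
    # Normalize both pixel sets so a single essential matrix applies.
    proj_n = cv2.undistortPoints(proj_pts.reshape(-1, 1, 2), K_proj, None)
    cam_n = cv2.undistortPoints(cam_pts.reshape(-1, 1, 2), K_cam, None)
    E, inliers = cv2.findEssentialMat(proj_n, cam_n, np.eye(3),
                                      method=cv2.RANSAC, threshold=1e-3)
    _, R, t, _ = cv2.recoverPose(E, proj_n, cam_n, np.eye(3), mask=inliers)
    return R, t  # t is determined only up to scale from correspondences alone
```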
- Regarding the shape of the projection plane 104, it can be seen from the projection area of the projection unit 111-1 on the projection plane 104 and the projection area of the projection unit 111-2 on the projection plane 104 that at least the range indicated by the double-headed arrow 151 on the projection plane 104 is already known.
- the projection imaging apparatus 102-1 moves as shown in FIG. 9B, the projection area changes from the state shown in FIG. 9A, but falls within the range indicated by the double arrow 151. Therefore, since the shape of the projection area of the projection plane 104 by the projection imaging apparatus 102-1 after movement is known, it is not necessary to estimate the shape of the projection plane 104.
- the projection imaging apparatus 102-1 moves as indicated by C in FIG. 9, the projection area extends beyond the range indicated by the double arrow 151. That is, there is a possibility that the range indicated by the double arrow 152 in FIG. 9C of the projection plane 104 is unknown. If the shape in this range is unknown, the control device 101 only needs to estimate the shape of the projection plane 104 for only the portion whose shape is unknown.
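- A sketch of estimating only this unknown portion is given below, assuming the postures (here, 3x4 projection matrices) are already known and that a mask marks which corresponding points fall inside the already-known range; triangulation with OpenCV is one possible choice, not the patent's prescribed method:

```python
import cv2
import numpy as np

def estimate_unknown_surface(proj_pts, cam_pts, P_proj, P_cam, known_mask):
    """Triangulate projection-plane points only where the shape is unknown.

    proj_pts, cam_pts: (N, 2) matching pixel coordinates (projector / camera).
    P_proj, P_cam: 3x4 projection matrices derived from the estimated postures.
    known_mask: boolean (N,) array, True where the surface shape is already
    known (e.g. inside the range indicated by the double-headed arrow 151)."""
    todo = ~known_mask
    if not np.any(todo):
        return np.empty((0, 3))  # nothing new to estimate
    pts4 = cv2.triangulatePoints(P_proj, P_cam,
                                 proj_pts[todo].T.astype(np.float64),
                                 cam_pts[todo].T.astype(np.float64))
    return (pts4[:3] / pts4[3]).T  # Nx3 points on the projection plane
```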
- In this way, the control device 101 can suppress the execution of unnecessary processing. That is, the control device 101 can suppress an increase in the amount of processing (processing time) related to updating the correction information used for geometric correction and the like.
- Also, for example, when the shape of the projection plane 104 changes while an image such as content is being projected, the control device 101 detects corresponding points between the pixels of the projection units 111 and the imaging units 112 among the projection imaging devices 102 only for the parts related to the projection imaging devices 102 whose corresponding points have changed.
- the control device 101 estimates the shape of the projection surface 104 only for a portion where the shape of the projection surface 104 is unknown due to a shape change. In this case, since the posture estimation of the projection imaging apparatus 102 is unnecessary, it is omitted.
- the control device 101 can suppress the execution of unnecessary processing. That is, the control device 101 can suppress an increase in the amount of processing (processing time) related to updating correction information used for geometric correction or the like.
- Furthermore, when the posture of some of the projection imaging devices 102 changes and the shape of the projection plane 104 also changes while an image such as content is being projected, the corresponding points of the pixels related to the projection imaging devices 102 whose posture has changed change, and the corresponding points of the pixels corresponding to the portions where the shape of the projection plane 104 has changed also change; the other corresponding points do not change. Therefore, in this case, the control device 101 detects corresponding points between the pixels of the projection units 111 and the imaging units 112 among the projection imaging devices 102 only for the parts related to the projection imaging devices 102 whose posture has changed and the parts whose corresponding points have changed.
- the control device 101 performs posture estimation only for the projection imaging device 102 whose posture has changed.
- control device 101 estimates the shape of the projection surface 104 only for a portion where the shape of the projection surface 104 is unknown. Regardless of whether the change in the posture of the projection imaging apparatus 102 or the change in the shape of the projection plane 104 is a factor, it is not necessary to estimate the shape of a part whose shape is already known. That is, it is only necessary to perform shape estimation for unknown parts.
- In this way, the control device 101 can suppress the execution of unnecessary processing. That is, the control device 101 can suppress an increase in the amount of processing (processing time) related to updating the correction information used for geometric correction and the like. The sketch below illustrates this kind of selective updating.
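- The change categories and the function below are hypothetical simplifications, not structures defined in the patent; they only organize which processes would be rerun in each of the cases above:

```python
from enum import Enum, auto

class Change(Enum):
    NONE = auto()     # nothing changed for this device
    POSTURE = auto()  # this device moved
    SURFACE = auto()  # its corresponding points changed although it did not move

def plan_update(changes: dict) -> dict:
    """Decide which processes to rerun, given per-device change flags.

    Only devices whose posture changed need posture re-estimation; the
    projection-plane shape is re-estimated only where it became unknown;
    devices with no change are left untouched."""
    moved = [d for d, c in changes.items() if c is Change.POSTURE]
    reshaped = [d for d, c in changes.items() if c is Change.SURFACE]
    return {
        "detect_correspondences_for": moved + reshaped,
        "reestimate_posture_for": moved,
        "reestimate_surface_shape": bool(moved or reshaped),  # unknown portions only
    }

# Example: device 1 moved, device 2 is unchanged, and the surface in front of
# device 3 changed (e.g. a three-dimensional object 121 was placed there).
print(plan_update({1: Change.POSTURE, 2: Change.NONE, 3: Change.SURFACE}))
```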
- In any of these cases, the control device 101 can thus suppress an increase in the amount of processing (processing time) related to updating the correction information used for geometric correction and the like.
- FIG. 10 is a block diagram illustrating a main configuration example of the control device 101 which is an embodiment of the information processing device to which the present technology is applied.
- The control device 101 includes a control unit 201. The control unit 201 performs processing related to the control of each unit of the control device 101 and of the projection imaging devices 102.
- the input / output interface 210 is connected to the control unit 201 via a bus.
- An input unit 211, an output unit 212, a storage unit 213, a communication unit 214, and a drive 215 are connected to the input / output interface 210.
- the input unit 211 includes an input device that accepts external information such as user input.
- the input unit 211 includes operation buttons, a touch panel, a camera, a microphone, an input terminal, and the like.
- Various sensors such as an acceleration sensor, an optical sensor, and a temperature sensor may be included in the input unit 211.
- the output unit 212 includes an output device that outputs information such as images and sounds.
- the output unit 212 includes a display, a speaker, an output terminal, and the like.
- the storage unit 213 includes, for example, a hard disk, a RAM disk, and a nonvolatile memory.
- the communication unit 214 includes a network interface, for example.
- the communication unit 214 is connected to the communication cable 103 and communicates with other devices connected via the communication cable 103.
- the drive 215 drives a removable medium 221 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
- The control unit 201 includes a processing control unit 231, a projection control unit 232, an imaging control unit 233, a sensor control unit 234, an image projection unit 235, a corresponding point change detection unit 236, a sensor information change detection unit 237, a change determination unit 238, a corresponding point detection unit 239, a posture estimation unit 240, a projection surface shape estimation unit 241, and a correction information update unit 242.
- the processing control unit 231 performs processing related to control of processing related to correction information update.
- the projection control unit 232 performs processing related to the control of the projection unit 111.
- the imaging control unit 233 performs processing related to the control of the imaging unit 112.
- the sensor control unit 234 performs processing related to control of the sensor (sensor unit 312) included in each projection imaging apparatus 102.
- the image projection unit 235 performs processing related to the projection of images such as content.
- the corresponding point change detection unit 236 performs processing related to detection of changes in corresponding points of pixels between the projection unit 111 and the imaging unit 112 between the projection imaging devices 102.
- the sensor information change detection unit 237 performs processing related to detection of changes in sensor information.
- the change determination unit 238 performs processing related to determination of what information has changed.
- Corresponding point detection unit 239 performs processing related to detection of corresponding points of pixels between the projection unit 111 and the imaging unit 112 between the projection imaging devices 102.
- the posture estimation unit 240 performs processing related to posture estimation of the projection imaging apparatus 102.
- the projection surface shape estimation unit 241 performs processing related to the shape estimation of the projection surface 104.
- the correction information update unit 242 performs processing related to the correction information update.
- each of these processing units can be realized by software.
- the control unit 201 has a configuration such as a CPU (Central Processing Unit), a ROM (Read Only Memory), and a RAM (Random Access Memory) that can execute programs and process data.
- the control unit 201 performs various processes by, for example, loading a program read from the storage unit 213 or the like into a built-in RAM or the like, or processing data read from the storage unit 213 or the like.
- each processing unit of the processing control unit 231 to the correction information update unit 242 is realized as a functional block. That is, the control unit 201 realizes various functions of the processing control unit 231 to the correction information update unit 242 by executing a program.
- FIG. 11 shows an example of the appearance of the projection imaging apparatus 102.
- The projection imaging device 102 includes the projection unit 111 and the imaging unit 112, and its housing is provided with optical devices such as a projection port (lens mechanism) 301 for projecting an image and an imaging port (lens mechanism) 302 for imaging a subject.
- the projection imaging device 102 may be any size device, but may be a portable (small) device, for example.
- a battery 303 may be provided in the housing of the projection imaging apparatus 102. By providing the battery 303, the projection imaging apparatus 102 can be driven without an external power supply, so that the degree of freedom of the installation position can be improved.
- FIG. 12 is a block diagram illustrating a main configuration example of the projection imaging apparatus 102.
- the projection imaging apparatus 102 includes a control unit 311, a projection unit 111, an imaging unit 112, a sensor unit 312, an input unit 321, an output unit 322, a storage unit 323, a communication unit 324, and a drive 325.
- the control unit 311 includes, for example, a CPU, a ROM, a RAM, and the like, and controls each processing unit in the apparatus, and executes various processes necessary for the control such as image processing.
- The sensor unit 312 includes a sensor that can detect a change in the posture of the projection imaging device 102, such as an acceleration sensor or an angular velocity sensor. This sensor may sense any information as long as a change in the posture of the projection imaging device 102 can be detected from its output.
- the sensor unit 312 is controlled by the control unit 311 to sense a predetermined parameter, and supplies the detection result to the control unit 311.
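- As an illustration only (the patent does not specify how the sensor output is evaluated), a posture change could be flagged when the measured acceleration deviates from the at-rest, gravity-only reading or when any angular velocity is observed; the thresholds below are made-up values:

```python
import numpy as np

def posture_changed(accel, gyro,
                    accel_threshold=0.5,   # m/s^2 deviation from the at-rest magnitude
                    gyro_threshold=0.05):  # rad/s
    """Tiny sketch of a posture-change check on sensor unit 312 output.

    accel, gyro: latest 3-axis readings as length-3 arrays. The device is
    considered to have moved if the acceleration deviates noticeably from the
    gravity-only magnitude or if any rotation is measured."""
    gravity = 9.81
    accel_deviation = abs(np.linalg.norm(accel) - gravity)
    rotating = np.linalg.norm(gyro) > gyro_threshold
    return accel_deviation > accel_threshold or rotating

# A device lying still reports roughly (0, 0, 9.81) m/s^2 and no rotation.
print(posture_changed(np.array([0.0, 0.1, 9.8]), np.array([0.0, 0.0, 0.0])))  # False
print(posture_changed(np.array([0.2, 0.1, 9.8]), np.array([0.0, 0.2, 0.0])))  # True (rotating)
```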
- the projection unit 111 is controlled by the control unit 311 to perform processing related to image projection.
- the projection unit 111 projects the image supplied from the control unit 311 onto the outside of the projection imaging apparatus 102 (for example, the projection plane 104). That is, the projection unit 111 realizes a projection function.
- the projection unit 111 projects an image by using laser light as a light source and scanning the laser light using a MEMS mirror.
- the light source of the projection unit 111 is arbitrary, and is not limited to the laser beam, but may be an LED (Light Emitting Diode), xenon, or the like. Details of the projection unit 111 will be described later.
- the imaging unit 112 is controlled by the control unit 311 to capture a subject outside the apparatus, generate a captured image, and supply the captured image to the control unit 311. That is, the imaging unit 112 implements an imaging function. For example, the imaging unit 112 captures a projection image projected on the projection plane 104 by the projection unit 111.
- the input unit 321 includes an input device that accepts external information such as user input.
- the input unit 321 includes operation buttons, a touch panel, a camera, a microphone, an input terminal, and the like.
- Various sensors such as an optical sensor and a temperature sensor may be included in the input unit 321.
- the output unit 322 includes an output device that outputs information such as images and sounds.
- the output unit 322 includes a display, a speaker, an output terminal, and the like.
- the storage unit 323 includes, for example, a hard disk, a RAM disk, a nonvolatile memory, and the like.
- the communication unit 324 is composed of a network interface, for example.
- the communication unit 324 is connected to the communication cable 103 and communicates with other devices connected via the communication cable 103.
- the drive 325 drives a removable medium 331 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
- FIG. 13 is a block diagram illustrating a main configuration example of the projection unit 111.
- the projection unit 111 includes a video processor 351, a laser driver 352, a laser output unit 353-1, a laser output unit 353-2, a laser output unit 353-3, a mirror 354-1, a mirror 354-2, a mirror 354-3, a MEMS (Micro Electro Mechanical Systems) driver 355, and a MEMS mirror 356.
- the video processor 351 holds an image supplied from the control unit 311 and performs necessary image processing on the image.
- the video processor 351 supplies the projected image to the laser driver 352 and the MEMS driver 355.
- the laser driver 352 controls the laser output unit 353-1 to the laser output unit 353-3 so as to project the image supplied from the video processor 351.
- the laser output units 353-1 to 353-3 output laser beams having different colors (wavelength ranges) such as red, blue, and green. That is, the laser driver 352 controls the laser output of each color so that the image supplied from the video processor 351 is projected.
- the laser output unit 353-1 to the laser output unit 353-3 are referred to as a laser output unit 353 when it is not necessary to distinguish between them.
- the mirror 354-1 reflects the laser beam output from the laser output unit 353-1 and guides it to the MEMS mirror 356.
- the mirror 354-2 reflects the laser beam output from the laser output unit 353-2 and guides it to the MEMS mirror 356.
- the mirror 354-3 reflects the laser beam output from the laser output unit 353-3 and guides it to the MEMS mirror 356. Note that the mirrors 354-1 to 354-3 are referred to as mirrors 354 when there is no need to distinguish them from each other.
- the MEMS driver 355 controls the drive of the mirror of the MEMS mirror 356 so as to project the image supplied from the video processor 351.
- the MEMS mirror 356 scans the laser light of each color, for example as shown in the example of FIG. 14, by driving the mirror mounted on the MEMS in accordance with the control of the MEMS driver 355.
- This laser light is output from the projection port to the outside of the apparatus, and is irradiated onto the projection surface 104, for example.
- the image supplied from the video processor 351 is projected onto the projection plane 104.
- the number of laser output units 353 may be four or more, or two or less. That is, the number of laser beams output from the projection imaging apparatus 102 (projection unit 111) may be two or less, or four or more.
- the number of colors of the laser light output from the projection imaging apparatus 102 (projection unit 111) is also arbitrary, and may be two colors or less, or four colors or more.
- the configurations of the mirror 354 and the MEMS mirror 356 are also arbitrary, and are not limited to the example of FIG. Of course, the scanning pattern of the laser beam is arbitrary.
- the control unit 201 of the control device 101 first performs initial sensing. As described above with reference to FIG. 7, in the initial sensing, all the processes related to the update of correction information used for geometric correction and the like are executed for all projection imaging apparatuses 102. After the initial sensing is performed, the image projection unit 235 controls the projection imaging apparatus 102 via the projection control unit 232 to project an image such as content. The control unit 311 of the projection imaging apparatus 102 controls the projection unit 111 according to the control, and projects an image such as content supplied from the control apparatus 101 on the projection plane 104.
- the control device 101 performs correction information update processing in parallel with the image projection, and updates correction information used for geometric correction or the like as necessary.
- This correction information update process is executed for each projection imaging apparatus 102.
- the correction information update process may be performed for each projection imaging apparatus 102 one by one, or the correction information update process may be performed for a plurality of projection imaging apparatuses 102 in parallel.
- the processing order is arbitrary, and may be a fixed order or random.
- the correction information update process may be performed on all the projection imaging apparatuses 102.
- In step S101, a change in the corresponding points of the pixels is detected.
- the corresponding point change detection unit 236 controls the projection control unit 232 and the imaging control unit 233 to perform this detection process. That is, the corresponding point change detection unit 236 acquires an image projected by the projection unit 111 from the projection control unit 232. Further, the corresponding point change detection unit 236 causes the projection imaging apparatus 102 to image the projection plane 104 via the imaging control unit 233, and acquires the captured image.
- By aligning these images, the corresponding point change detection unit 236 monitors the corresponding points between the pixels of the projection unit 111 and the pixels of the imaging unit 112, between the projection imaging apparatus 102 to be processed and the other projection imaging apparatuses 102, and detects any change in those corresponding points.
- the sensor information change detection unit 237 detects a change in sensor information obtained by the sensor unit 312 of the projection imaging apparatus 102 to be processed.
- In step S102, the sensor information change detection unit 237 controls the sensor control unit 234 to perform this detection process. That is, the sensor information change detection unit 237 acquires, via the sensor control unit 234, the sensor output from the sensor unit 312 of the projection imaging apparatus 102 to be processed. The sensor information change detection unit 237 then detects a temporal change in the sensor information based on that sensor output.
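- The document does not specify how the temporal change in the sensor output is evaluated; the following is a minimal sketch of one plausible approach, in which the latest accelerometer/gyroscope sample is compared with the previous one and a change is flagged when the difference exceeds a tolerance. The packet layout (the "accel"/"gyro" keys) and the threshold values are illustrative assumptions, not values taken from this document.

```python
import numpy as np

def sensor_change_detected(prev_sample, curr_sample,
                           accel_tol=0.5,   # m/s^2, assumed tolerance
                           gyro_tol=0.05):  # rad/s, assumed tolerance
    """Flag a posture change when the sensor output differs noticeably from
    the previous sample (hypothetical packet format with 3-axis readings)."""
    d_accel = np.linalg.norm(np.asarray(curr_sample["accel"], float) -
                             np.asarray(prev_sample["accel"], float))
    d_gyro = np.linalg.norm(np.asarray(curr_sample["gyro"], float) -
                            np.asarray(prev_sample["gyro"], float))
    return d_accel > accel_tol or d_gyro > gyro_tol

# Example: a small jolt on one gyro axis is flagged as a posture change.
prev = {"accel": [0.0, 0.0, 9.8], "gyro": [0.0, 0.0, 0.0]}
curr = {"accel": [0.1, 0.0, 9.8], "gyro": [0.2, 0.0, 0.0]}
print(sensor_change_detected(prev, curr))  # True
```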
- In step S103, the change determination unit 238 determines whether or not the attitude of the projection imaging apparatus 102 to be processed has changed. If a change in the sensor information is detected in step S102 and it is determined that the attitude of the projection imaging apparatus 102 to be processed has changed, the process proceeds to step S104.
- In step S104, the corresponding point detection unit 239 detects the corresponding points of the pixels related to the projection imaging apparatus 102 to be processed.
- In step S105, the posture estimation unit 240 estimates the posture of the projection imaging apparatus 102 to be processed using the processing result of step S104 (the detected corresponding points of the pixels). That is, the posture estimation unit 240 estimates the relative posture of the projection imaging apparatus 102 to be processed with respect to another projection imaging apparatus 102 whose posture has not changed.
- In step S106, the projection plane shape estimation unit 241 estimates the shape of the projection plane 104.
- When the process of step S106 ends, the process proceeds to step S110.
- If no change in the sensor information is detected in step S102 and it is determined in step S103 that the attitude of the projection imaging apparatus 102 to be processed has not changed, the process proceeds to step S107.
- In step S107, the change determination unit 238 determines whether or not the shape of the projection plane 104 has changed, based on the processing result of step S101. For example, when the corresponding points of the pixels related to the projection imaging apparatus 102 to be processed have changed even though the attitude of that apparatus has not changed, it can be determined that the shape of the projection plane 104 has changed. When it is determined that the shape of the projection plane 104 has changed, the process proceeds to step S108.
- In step S108, the corresponding point detection unit 239 detects the corresponding points of the pixels related to the projection imaging apparatus 102 to be processed. In this case, since the posture of the projection imaging apparatus 102 to be processed has not changed, the posture estimation is omitted.
- In step S109, the projection plane shape estimation unit 241 estimates the shape of the projection plane 104.
- When the process of step S109 ends, the process proceeds to step S110.
- In step S110, when information has been updated, added, or deleted by the above processes, the correction information update unit 242 updates the correction information so that the update, addition, or deletion is reflected in the correction information.
- When the process of step S110 is completed, the correction information update process ends. If it is determined in step S107 that the shape of the projection plane 104 has not changed, the correction information update process also ends. In this case, the posture of the projection imaging apparatus 102 to be processed has not changed, and the shape of the portion of the projection plane 104 related to that apparatus has not changed either, so all processes related to updating the correction information used for geometric correction and the like are omitted.
- As described above, in step S104 the corresponding point detection unit 239 detects the corresponding points between the pixels of the projection unit 111 and the pixels of the imaging unit 112, between the projection imaging apparatus 102 whose posture has changed and the projection imaging apparatus 102 whose posture has not changed.
- In step S105, the posture estimation unit 240 estimates the relative posture of the projection imaging apparatus 102 whose posture has changed with respect to the projection imaging apparatus 102 whose posture has not changed, based on the detected corresponding points of the pixels.
- In this way, the control device 101 can suppress the execution of unnecessary processing, for example by omitting estimation of the relative posture between projection imaging apparatuses 102 whose postures have not changed. That is, the control device 101 can suppress an increase in the amount of processing (processing time) related to updating the correction information used for geometric correction and the like.
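- As a rough illustration of the branching described in steps S101 to S110, the sketch below runs only the processing that the detected change factor requires. The function names are hypothetical placeholders standing in for the corresponding point detection, posture estimation, projection plane shape estimation, and correction update described above; they are not APIs defined in this document.

```python
def detect_corresponding_points(device, with_other_devices):
    # Placeholder for the pattern projection / capture / decoding steps.
    return [((0, 0), (10, 20))]

def estimate_relative_posture(device, points):
    # Placeholder for the relative posture estimation of step S105.
    return "pose"

def estimate_projection_plane_shape(device, points):
    # Placeholder for the shape estimation of steps S106 / S109.
    return "shape"

def update_correction(device, **info):
    # Placeholder for reflecting the new information in the correction data (step S110).
    print(device, "correction info updated:", sorted(info))

def correction_info_update(device, sensor_changed, points_changed):
    """Run only what the change factor requires (cf. steps S101-S110)."""
    if sensor_changed:            # posture changed -> steps S104-S106
        pts = detect_corresponding_points(device, with_other_devices=True)
        pose = estimate_relative_posture(device, pts)
        shape = estimate_projection_plane_shape(device, pts)
        update_correction(device, points=pts, posture=pose, shape=shape)
    elif points_changed:          # only the screen changed -> steps S108-S109
        pts = detect_corresponding_points(device, with_other_devices=False)
        shape = estimate_projection_plane_shape(device, pts)
        update_correction(device, points=pts, shape=shape)
    # otherwise nothing changed, and every update-related process is skipped

correction_info_update("device-1", sensor_changed=True, points_changed=False)
```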
- the corresponding point change detection unit 236 detects a change in the corresponding point between the pixel of the projection unit 111 and the pixel of the imaging unit 112 of the projection imaging apparatus 102 in step S101.
- the sensor information change detection unit 237 detects a change in the attitude of the projection imaging device based on the sensor output of the sensor unit that detects at least one of the position and direction of the projection imaging device 102.
- The change determination unit 238 then determines, in step S103, whether or not the attitude of the projection imaging apparatus 102 has changed, based on the detection result of the corresponding point change detection unit 236 and the detection result of the sensor information change detection unit 237.
- In step S104, the corresponding point detection unit 239 detects the corresponding points between the projection imaging apparatus 102 that the change determination unit 238 has determined to have changed in posture and the projection imaging apparatus 102 that the change determination unit 238 has determined not to have changed in posture.
- In step S105, the posture estimation unit 240 estimates the relative posture of the projection imaging apparatus 102 that the change determination unit 238 has determined to have changed in posture with respect to the projection imaging apparatus 102 that the change determination unit 238 has determined not to have changed in posture.
- As described above, the control device 101 can suppress the execution of unnecessary processing, for example by omitting estimation of the relative posture between projection imaging apparatuses 102 whose postures have not changed. That is, the control device 101 can suppress an increase in the amount of processing (processing time) related to updating the correction information used for geometric correction and the like.
- In step S104, the corresponding point detection unit 239 further detects the corresponding points between the pixels of the projection unit 111 and the pixels of the imaging unit 112 of the projection imaging apparatus 102 whose posture has changed. Then, in step S106, the projection plane shape estimation unit 241 estimates the shape of the projection plane 104 based on the corresponding points, detected by the corresponding point detection unit 239, between the pixels of the projection unit 111 and the pixels of the imaging unit 112 of the projection imaging apparatus 102 whose posture has changed.
- In this way, the control device 101 can perform shape estimation after identifying a region that may include a part whose shape is unknown, for example the projection range of the projection imaging apparatus 102 whose posture has changed. It is therefore possible to suppress the execution of unnecessary processing, for example by omitting shape estimation for portions whose shape is already known. That is, the control device 101 can suppress an increase in the amount of processing (processing time) related to updating the correction information used for geometric correction and the like.
- Further, in step S108, the corresponding point detection unit 239 detects the corresponding points between the pixels of the projection unit 111 and the pixels of the imaging unit 112 of the projection imaging apparatus 102 whose posture has not changed.
- Then, in step S109, the projection plane shape estimation unit 241 estimates the shape of the projection plane 104 based on the corresponding points, detected by the corresponding point detection unit 239, between the pixels of the projection unit 111 and the pixels of the imaging unit 112 of the projection imaging apparatus 102 whose posture has not changed.
- In this way, even when the posture of the projection imaging apparatus 102 has not changed, the control device 101 can detect a change in the shape of the projection plane 104 from the change in the corresponding points of the pixels, and can estimate that shape. It is therefore possible to suppress the execution of unnecessary processing, for example by omitting shape estimation for portions whose shape is already known. That is, the control device 101 can suppress an increase in the amount of processing (processing time) related to updating the correction information used for geometric correction and the like.
- In other words, the corresponding point detection unit 239 of the control device 101 detects the corresponding points between the pixels of the projection unit 111 and the pixels of the imaging unit 112 of the projection imaging apparatus 102 whose posture has not changed, and the projection plane shape estimation unit 241 estimates the shape of the projection plane 104 based on the corresponding points thus detected by the corresponding point detection unit 239.
- In this way, the control device 101 can identify the part where the shape of the projection plane 104 has changed and can estimate the shape of the projection plane 104 for that part. It is therefore possible to suppress the execution of unnecessary processing, for example by omitting shape estimation for portions whose shape is already known. That is, the control device 101 can suppress an increase in the amount of processing (processing time) related to updating the correction information used for geometric correction and the like.
- In step S131, the corresponding point detection unit 239 controls, via the projection control unit 232, the projection unit 111 of another projection imaging apparatus 102 that is not the processing target, and causes it to project a predetermined pattern image.
- In step S132, the corresponding point detection unit 239 controls, via the imaging control unit 233, the imaging unit 112 of the projection imaging apparatus 102 to be processed, and causes it to capture the projection image (pattern image) projected by the process of step S131.
- In step S133, the corresponding point detection unit 239 detects the corresponding points between the pixels of the projection unit 111 of the other projection imaging apparatus 102 and the pixels of the imaging unit 112 of the projection imaging apparatus 102 to be processed. That is, the corresponding point detection unit 239 detects the corresponding points based on the pattern image projected in step S131 and the captured image (containing the pattern image) obtained in step S132.
- In step S134, the corresponding point detection unit 239 controls, via the projection control unit 232, the projection unit 111 of the projection imaging apparatus 102 to be processed, and causes it to project a predetermined pattern image.
- In step S135, the corresponding point detection unit 239 controls, via the imaging control unit 233, the imaging unit 112 of the projection imaging apparatus 102 to be processed, and causes it to capture the projection image (pattern image) projected by the process of step S134.
- In step S136, the corresponding point detection unit 239 detects the corresponding points between the pixels of the projection unit 111 and the pixels of the imaging unit 112 of the projection imaging apparatus 102 to be processed. That is, the corresponding point detection unit 239 detects the corresponding points based on the pattern image projected in step S134 and the captured image (containing the pattern image) obtained in step S135.
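- The document only states that a predetermined pattern image is projected and captured; one concrete way to obtain the pixel correspondence from such pattern images is Gray-code structured light, which is also mentioned earlier as an online-sensing option. The sketch below decodes, for each camera pixel, the projector column index from a stack of idealized binary captures; the pattern format and the threshold are assumptions for illustration.

```python
import numpy as np

def gray_to_binary(g):
    """Convert Gray-coded integers (numpy array) to plain binary indices."""
    b = g.copy()
    shift = 1
    while (g >> shift).any():
        b ^= g >> shift
        shift += 1
    return b

def decode_projector_columns(captures, threshold=0.5):
    """captures: list of HxW float images, one per Gray-code bit plane
    (most significant bit first). Returns an HxW map of projector column
    indices, i.e. the corresponding projector pixel of each camera pixel."""
    bits = [(img > threshold).astype(np.uint32) for img in captures]
    gray = np.zeros_like(bits[0])
    for b in bits:
        gray = (gray << 1) | b
    return gray_to_binary(gray)

# Toy example: 2 camera pixels observing an 8-column projector (3 bit planes).
caps = [np.array([[0.0, 1.0]]),   # MSB plane
        np.array([[1.0, 1.0]]),
        np.array([[0.0, 1.0]])]
print(decode_projector_columns(caps))  # [[3 5]] -> projector column per camera pixel
```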
- In this way, the corresponding point detection unit 239 detects the corresponding points of the pixels between the imaging unit 112 of the projection imaging apparatus 102 whose posture has changed and the projection unit 111 of the projection imaging apparatus 102 whose posture has not changed.
- In addition, the corresponding points between the pixels of the projection unit 111 and the pixels of the imaging unit 112 of the projection imaging apparatus 102 whose posture has changed are also detected.
- As a result, the posture estimation unit 240 can estimate the posture of the projection imaging apparatus 102 whose posture has changed, and the shape of the projection plane 104 can be estimated. That is, the control device 101 can suppress an increase in the amount of processing (processing time) related to updating the correction information used for geometric correction and the like.
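- The estimation algorithm for the relative posture in step S105 is not prescribed here. Assuming calibrated devices whose corresponding points can be expressed in a common pixel frame, one common choice is to estimate an essential matrix from the correspondences and decompose it into a rotation and a (scale-ambiguous) translation, for example with OpenCV as sketched below; the variable names are illustrative assumptions.

```python
import numpy as np
import cv2

def estimate_relative_pose(pts_moved, pts_fixed, K):
    """Relative rotation R and unit-scale translation t of the moved unit with
    respect to the stationary one, from matched pixel coordinates (Nx2 arrays).
    K is a shared 3x3 intrinsic matrix assumed to apply to both views."""
    p1 = np.asarray(pts_moved, dtype=np.float64)
    p2 = np.asarray(pts_fixed, dtype=np.float64)
    E, mask = cv2.findEssentialMat(p1, p2, K, method=cv2.RANSAC,
                                   prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, p1, p2, K, mask=mask)
    return R, t

# Hypothetical usage with correspondences gathered as in steps S131-S136:
# R, t = estimate_relative_pose(pixels_in_moved_unit, pixels_in_fixed_unit, K_assumed)
```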
- When the projection plane shape estimation process is started, in step S151 the projection plane shape estimation unit 241 estimates the range of the projection plane 104 onto which the projection unit 111 of the projection imaging apparatus 102 to be processed projects an image. For example, the projection plane shape estimation unit 241 estimates this projection range based on the attitude of the projection imaging apparatus 102 to be processed estimated by the process of step S105 (FIG. 15).
- In step S152, the projection plane shape estimation unit 241 determines whether or not a portion of the projection plane 104 whose shape is unknown is included in the estimated projection range. If it is determined that an unknown portion is included, the process proceeds to step S153.
- In step S153, the projection plane shape estimation unit 241 estimates the shape of the projection plane 104 for the unknown portion. For example, the projection plane shape estimation unit 241 estimates the shape of the projection plane 104 for that unknown portion based on the corresponding points between the pixels of the projection unit 111 and the pixels of the imaging unit 112 of the projection imaging apparatus 102 to be processed, detected by the process of step S136 (FIG. 16).
- When the shape of the projection plane 104 has been estimated, the projection plane shape estimation process ends and the process returns to FIG. 15. If it is determined in step S152 that no unknown portion is included in the projection range, the projection plane shape estimation process likewise ends and the process returns to FIG. 15. That is, in this case, the process of step S153 (projection plane shape estimation) is omitted.
- In this way, the projection plane shape estimation unit 241 estimates the shape of the projection plane 104 for the portion whose shape is unknown within the range onto which an image is projected by the projection unit 111 of the projection imaging apparatus 102 whose posture has changed. Therefore, the control device 101 can further suppress an increase in the amount of processing (processing time) related to updating the correction information used for geometric correction and the like.
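- As one way to realize the shape estimation of step S153, assuming that the projection matrices of the projection unit and the imaging unit are known from calibration and the estimated postures, each projector-camera correspondence inside the unknown portion can be triangulated into a 3D point on the projection plane. The data structures below (a correspondence map and a set of unknown pixels) are illustrative assumptions, not structures defined in this document.

```python
import numpy as np

def triangulate(P_a, P_b, uv_a, uv_b):
    """Linear (DLT) triangulation of one correspondence; P_a, P_b are 3x4
    projection matrices (intrinsics times [R|t]) assumed to be known."""
    A = np.vstack([
        uv_a[0] * P_a[2] - P_a[0],
        uv_a[1] * P_a[2] - P_a[1],
        uv_b[0] * P_b[2] - P_b[0],
        uv_b[1] * P_b[2] - P_b[1],
    ])
    X = np.linalg.svd(A)[2][-1]
    return X[:3] / X[3]

def estimate_unknown_shape(correspondences, unknown_pixels, P_proj, P_cam):
    """Reconstruct 3D points of the projection plane only where its shape is
    still unknown. correspondences: camera pixel -> projector pixel."""
    return {cam: triangulate(P_proj, P_cam, proj, cam)
            for cam, proj in correspondences.items() if cam in unknown_pixels}

# Toy example: camera at the origin, projector shifted 0.2 m to the side.
K = np.array([[500., 0., 320.], [0., 500., 240.], [0., 0., 1.]])
P_cam = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P_proj = K @ np.hstack([np.eye(3), [[-0.2], [0.], [0.]]])
corr = {(320., 240.): (270., 240.)}
print(estimate_unknown_shape(corr, {(320., 240.)}, P_proj, P_cam))  # point near (0, 0, 2)
```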
- When the corresponding point detection process is started, in step S171 the corresponding point detection unit 239 controls, via the projection control unit 232, the projection unit 111 of the projection imaging apparatus 102 to be processed, and causes it to project a predetermined pattern image.
- In step S172, the corresponding point detection unit 239 controls, via the imaging control unit 233, the imaging unit 112 of the projection imaging apparatus 102 to be processed, and causes it to capture the projection image (pattern image) projected by the process of step S171.
- In step S173, the corresponding point detection unit 239 detects the corresponding points between the pixels of the projection unit 111 and the pixels of the imaging unit 112 of the projection imaging apparatus 102 to be processed. That is, the corresponding point detection unit 239 detects the corresponding points based on the pattern image projected in step S171 and the captured image (containing the pattern image) obtained in step S172.
- the control device 101 can further suppress an increase in the processing amount (processing time) related to the update of correction information used for geometric correction or the like.
- When the projection plane shape estimation process is started, in step S191 the projection plane shape estimation unit 241 identifies, based on the processing result of step S173 (FIG. 18), the changed portion of the corresponding points between the pixels of the projection unit 111 and the pixels of the imaging unit 112 of the projection imaging apparatus 102 to be processed.
- In step S192, the projection plane shape estimation unit 241 estimates the shape of the projection plane 104 for the portion where the shape of the projection plane 104 is unknown (the changed portion of the corresponding points of the pixels identified in step S191). This process is executed in the same manner as the process of step S153 (FIG. 17). When the shape of the projection plane 104 has been estimated, the projection plane shape estimation process ends and the process returns to FIG. 15.
- the projection plane shape estimation unit 241 estimates the shape of the projection plane 104 for a portion whose shape is unknown. Therefore, the control device 101 can further suppress an increase in the processing amount (processing time) related to the update of correction information used for geometric correction or the like.
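- A minimal sketch of how the changed portion of the corresponding points (step S191) might be identified, assuming the correspondences are stored as a per-camera-pixel map of projector coordinates; the array layout and the tolerance are assumptions for illustration.

```python
import numpy as np

def changed_region(prev_map, curr_map, tol=1.0):
    """Boolean mask of camera pixels whose corresponding projector coordinate
    moved by more than tol pixels between two detections.
    prev_map / curr_map: H x W x 2 arrays of projector (u, v) per pixel."""
    return np.linalg.norm(curr_map - prev_map, axis=-1) > tol

prev = np.zeros((2, 2, 2))
curr = prev.copy()
curr[0, 1] = [3.0, 0.0]            # one pixel's correspondence shifted
print(changed_region(prev, curr))  # only that pixel is flagged as changed/unknown
```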
- As described above, the control device 101 can perform only the processing that corresponds to the change factor, and can thus suppress an increase in the amount of processing related to updating the correction information used for geometric correction and the like.
- <2. Second Embodiment> <Corresponding point detection process>
- In the above description, when the corresponding points of the pixels between the projection unit 111 and the imaging unit 112 are obtained between projection imaging apparatuses 102, the pattern image is projected from the projection unit 111 of the other projection imaging apparatus 102 and the projection image is captured by the imaging unit 112 of the projection imaging apparatus 102 to be processed. However, the projection and imaging of this image may be reversed. That is, the pattern image may be projected from the projection unit 111 of the projection imaging apparatus 102 to be processed, and the projection image may be captured by the imaging unit 112 of another projection imaging apparatus 102.
- corresponding points between the pixels of the imaging unit 112 of another projection imaging apparatus 102 and the pixels of the projection unit 111 of the projection imaging apparatus 102 to be processed may be detected.
- the attitude of the projection imaging apparatus 102 to be processed can be estimated from this corresponding point as in the case of the first embodiment.
- In step S211, the corresponding point detection unit 239 controls, via the projection control unit 232, the projection unit 111 of the projection imaging apparatus 102 to be processed, and causes it to project a predetermined pattern image.
- In step S212, the corresponding point detection unit 239 controls, via the imaging control unit 233, the imaging unit 112 of the projection imaging apparatus 102 to be processed and the imaging unit 112 of another projection imaging apparatus 102 that is not the processing target, and causes each of them to capture the projection image (pattern image) projected by the process of step S211.
- In step S213, the corresponding point detection unit 239 detects the corresponding points between the pixels of the imaging unit 112 of the other projection imaging apparatus 102 and the pixels of the projection unit 111 of the projection imaging apparatus 102 to be processed. That is, the corresponding point detection unit 239 detects the corresponding points based on the pattern image projected in step S211 and the captured image (containing the pattern image) obtained in step S212.
- In step S214, the corresponding point detection unit 239 detects the corresponding points between the pixels of the projection unit 111 and the pixels of the imaging unit 112 of the projection imaging apparatus 102 to be processed. That is, the corresponding point detection unit 239 detects the corresponding points based on the pattern image projected in step S211 and the captured image (containing the pattern image) obtained in step S212.
- In this way, the corresponding point detection unit 239 detects the corresponding points between the pixels of the projection unit 111 of the projection imaging apparatus 102 whose posture has changed and the pixels of the imaging unit 112 of the projection imaging apparatus 102 whose posture has not changed. Also in this case, as in the first embodiment, the control device 101 can perform only the processing that corresponds to the change factor, and can suppress an increase in the amount of processing related to updating the correction information used for geometric correction and the like.
- configuration examples of the projection imaging apparatus and the projection imaging system to which the present technology is applied are not limited to the above-described examples.
- the control device 101 may be omitted as in the projection imaging system 400 shown in FIG. That is, the process related to the update of correction information used for geometric correction or the like may be performed by an apparatus other than the control apparatus 101.
- the projection imaging apparatus 102 may perform this.
- the projection imaging apparatus 102 performs processing related to updating correction information used for the above-described geometric correction or the like.
- an input image is input to each projection imaging device 102, and each projection imaging device 102 corrects the image.
- the projection imaging apparatuses 102 are connected to each other by a communication cable such as an HDMI (registered trademark) cable, and various types of information are shared between the projection imaging apparatuses 102. Therefore, any projection imaging apparatus 102 can perform processing related to the update of correction information used for the above-described geometric correction or the like.
- For example, one of the projection imaging apparatuses 102-1 to 102-4 may perform the processing related to updating the correction information used for the above-described geometric correction and the like, and the processing result may be shared with the other projection imaging apparatuses 102. Further, a plurality of projection imaging apparatuses 102 may cooperate to perform the processing related to updating the correction information used for the above-described geometric correction and the like, and the processing result may be shared by each projection imaging apparatus 102.
- Even when the projection imaging apparatus performs the processing related to updating the correction information used for geometric correction and the like in this way, the processing can be performed in the same manner as in the first embodiment or the second embodiment.
- In other words, the projection imaging apparatus 102 in this case includes the projection unit 111 that projects an image onto the projection plane 104, the imaging unit 112 that captures the projection plane 104, the sensor unit 312 that detects at least one of position and direction, the change determination unit 238 that determines whether or not the posture has changed based on the sensor output of the sensor unit 312, the corresponding point detection unit 239 that, when the change determination unit 238 determines that the posture has changed, detects the corresponding points between the pixels of the projection unit 111 and the pixels of the imaging unit 112 with respect to another projection imaging apparatus 102 that includes the projection unit 111, the imaging unit 112, and the sensor unit 312 and whose posture has not changed, and the posture estimation unit 240 that estimates the relative posture with respect to the other projection imaging apparatus 102 based on those corresponding points.
- In this way, as in the case of the control device 101, only the processing corresponding to the change factor is performed, and an increase in the amount of processing related to updating the correction information used for geometric correction and the like can be suppressed.
- the corresponding point detection unit 239 may detect corresponding points between the pixels of the projection unit 111 of another projection imaging apparatus 102 and the pixels of the imaging unit 112 of the projection imaging apparatus 102 itself.
- the corresponding point detection unit 239 may detect corresponding points between the pixels of the projection unit 111 of the projection imaging apparatus 102 itself and the pixels of the imaging unit 112 of another projection imaging apparatus 102.
- Furthermore, the projection imaging apparatus 102 may include a corresponding point change detection unit 236 that detects a change in the corresponding points between the pixels of the projection unit 111 and the pixels of the imaging unit 112 of the projection imaging apparatus 102 itself, and a sensor information change detection unit 237 that detects a change in the posture of the projection imaging apparatus 102 itself based on the sensor output, and the change determination unit 238 may be configured to determine whether or not the posture of the projection imaging apparatus 102 itself has changed based on the detection result of the corresponding point change detection unit 236 and the detection result of the sensor information change detection unit 237.
- The corresponding point detection unit 239 may further detect corresponding points between the pixels of the projection unit 111 and the pixels of the imaging unit 112 of the projection imaging apparatus 102 itself. In addition, a projection plane shape estimation unit 241 that estimates the shape of the projection plane 104 based on the corresponding points between the pixels of the projection unit 111 and the pixels of the imaging unit 112 of the projection imaging apparatus 102 itself detected by the corresponding point detection unit 239 may be further provided.
- The projection plane shape estimation unit 241 may estimate the shape of the projection plane 104 for a portion of the projection plane 104 whose shape is unknown within the range onto which an image is projected by the projection unit 111 of the projection imaging apparatus 102 itself.
- Further, the projection imaging apparatus 102 may include the projection unit 111 that projects an image onto the projection plane, the imaging unit 112 that captures the projection plane, the sensor unit 312 that detects at least one of position and direction, the change determination unit 238 that determines whether or not the posture has changed based on the sensor output of the sensor unit 312, the corresponding point detection unit 239 that, when the change determination unit 238 determines that the posture has not changed, detects the corresponding points between the pixels of the projection unit 111 and the pixels of the imaging unit 112 of the projection imaging apparatus 102 itself, and the projection plane shape estimation unit 241 that estimates the shape of the projection plane 104 based on the corresponding points detected by the corresponding point detection unit 239.
- the configurations of the respective projection imaging apparatuses may be different from each other.
- The projection imaging system 410 includes a projection imaging apparatus 411, a projection imaging apparatus 412, and a projection imaging apparatus 413 that are communicably connected to a network 414, which is a network of an arbitrary communication medium.
- the projection imaging device 411 includes two projection units 111 (projection unit 111-1-1 and projection unit 111-1-2) and one imaging unit 112-1.
- the projection imaging device 412 has one projection unit 111-2 and one imaging unit 112-2.
- the projection imaging device 413 includes one projection unit 111-3 and two imaging units 112 (an imaging unit 112-3-1 and an imaging unit 112-3-2).
- Even with such a configuration, processing can be performed in the same manner as in the first embodiment and the second embodiment. That is, also in the case of the projection imaging system 410, it is possible to suppress an increase in the amount of processing related to updating the correction information used for geometric correction and the like.
- This recording medium is constituted by the removable medium 221, on which the program is recorded and which is distributed separately from the apparatus main body in order to deliver the program to the user.
- the removable medium 221 includes a magnetic disk (including a flexible disk) and an optical disk (including a CD-ROM and a DVD). Further, magneto-optical disks (including MD (Mini-Disc)) and semiconductor memories are also included.
- Similarly, this recording medium is constituted by the removable medium 331, on which the program is recorded and which is distributed separately from the apparatus main body in order to deliver the program to the user.
- the removable medium 331 includes a magnetic disk (including a flexible disk) and an optical disk (including a CD-ROM and a DVD). Furthermore, a magneto-optical disk (including MD), a semiconductor memory, and the like are also included.
- the program stored in the removable medium 331 can be read and installed in the storage unit 323.
- This program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
- In the case of the control device 101, for example, the program can be received by the communication unit 214 and installed in the storage unit 213.
- In the case of the projection imaging apparatus 102, the program can be received by the communication unit 324 and installed in the storage unit 323.
- this program can also be installed in advance in a storage unit or ROM.
- the program can be installed in advance in a ROM (not shown) or the like built in the storage unit 213 or the control unit 201.
- the program may be installed in advance in a ROM (not shown) or the like built in the storage unit 323 or the control unit 311.
- The program executed by the computer may be a program in which processing is performed in time series in the order described in this specification, or may be a program in which processing is performed in parallel or at necessary timing, such as when a call is made.
- In this specification, the steps describing the program recorded on the recording medium include not only processes performed in time series in the described order but also processes executed in parallel or individually, without necessarily being processed in time series.
- Each step described above can be executed by each of the devices described above or by any device other than those devices.
- In that case, the device that executes the process may have the functions (functional blocks and the like) necessary for executing that process.
- Information necessary for processing may be transmitted to the apparatus as appropriate.
- In this specification, a system means a set of a plurality of components (devices, modules (parts), and the like), regardless of whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
- the configuration described as one device (or processing unit) may be divided and configured as a plurality of devices (or processing units).
- the configurations described above as a plurality of devices (or processing units) may be combined into a single device (or processing unit).
- a configuration other than that described above may be added to the configuration of each device (or each processing unit).
- A part of the configuration of a certain device (or processing unit) may be included in the configuration of another device (or another processing unit).
- the present technology can take a configuration of cloud computing in which one function is shared by a plurality of devices via a network and is jointly processed.
- each step described in the above flowchart can be executed by one device or can be shared by a plurality of devices.
- the plurality of processes included in the one step can be executed by being shared by a plurality of apparatuses in addition to being executed by one apparatus.
- The present technology is not limited to this, and can also be implemented as any configuration mounted on such a device or on a device constituting the system, for example, a processor as a system LSI (Large Scale Integration), a module using a plurality of processors, a unit using a plurality of modules, a set obtained by further adding other functions to the unit (that is, a partial configuration of the apparatus), and the like.
- Note that the present technology can also take the following configurations.
- (1) An information processing apparatus including:
a corresponding point detection unit that, for projection imaging apparatuses each having a projection unit that projects an image onto a projection plane and an imaging unit that captures the projection plane, detects corresponding points between pixels of the projection unit and pixels of the imaging unit between the projection imaging apparatus whose posture has changed and the projection imaging apparatus whose posture has not changed; and
a relative posture estimation unit that estimates a relative posture of the projection imaging apparatus whose posture has changed with respect to the projection imaging apparatus whose posture has not changed, based on the corresponding points between the pixels of the projection unit and the pixels of the imaging unit detected by the corresponding point detection unit between the projection imaging apparatus whose posture has changed and the projection imaging apparatus whose posture has not changed.
- (2) The information processing apparatus according to (1), wherein the corresponding point detection unit detects corresponding points between the pixels of the projection unit of the projection imaging apparatus whose posture has not changed and the pixels of the imaging unit of the projection imaging apparatus whose posture has changed.
- (3) The information processing apparatus according to (1) or (2), wherein the corresponding point detection unit detects corresponding points between the pixels of the projection unit of the projection imaging apparatus whose posture has changed and the pixels of the imaging unit of the projection imaging apparatus whose posture has not changed.
- (4) The information processing apparatus according to any one of (1) to (3), further including:
a corresponding point change detection unit that detects a change in the corresponding points between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus;
a posture change detection unit that detects a posture change of the projection imaging apparatus based on a sensor output of a sensor unit that detects at least one of a position and a direction of the projection imaging apparatus; and
a change determination unit that determines whether or not the posture of the projection imaging apparatus has changed based on a detection result of the corresponding point change detection unit and a detection result of the posture change detection unit,
wherein the corresponding point detection unit is configured to detect the corresponding points between the projection imaging apparatus determined by the change determination unit to have changed in posture and the projection imaging apparatus determined by the change determination unit not to have changed in posture, and
the relative posture estimation unit is configured to estimate the relative posture of the projection imaging apparatus determined by the change determination unit to have changed in posture with respect to the projection imaging apparatus determined by the change determination unit not to have changed in posture.
- (5) The information processing apparatus according to any one of (1) to (4), wherein the corresponding point detection unit is further configured to detect corresponding points between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus whose posture has changed, and the information processing apparatus further includes a projection plane shape estimation unit that estimates a shape of the projection plane based on the corresponding points, detected by the corresponding point detection unit, between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus whose posture has changed.
- (6) The information processing apparatus according to any one of (1) to (5), wherein the corresponding point detection unit is further configured to detect corresponding points between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus whose posture has not changed, and the information processing apparatus further includes a projection plane shape estimation unit that estimates a shape of the projection plane based on the corresponding points, detected by the corresponding point detection unit, between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus whose posture has not changed.
- (7) The information processing apparatus according to (5) or (6), wherein the projection plane shape estimation unit estimates the shape of the projection plane for a portion whose shape is unknown within a range onto which an image is projected by the projection unit of the projection imaging apparatus whose posture has changed.
- (8) An information processing method including: for projection imaging apparatuses each having a projection unit that projects an image onto a projection plane and an imaging unit that captures the projection plane, detecting corresponding points between pixels of the projection unit and pixels of the imaging unit between the projection imaging apparatus whose posture has changed and the projection imaging apparatus whose posture has not changed; and estimating a relative posture of the projection imaging apparatus whose posture has changed with respect to the projection imaging apparatus whose posture has not changed, based on the detected corresponding points between the pixels of the projection unit and the pixels of the imaging unit between the projection imaging apparatus whose posture has changed and the projection imaging apparatus whose posture has not changed.
- (9) An information processing apparatus including: a corresponding point detection unit that, for a projection imaging apparatus having a projection unit that projects an image onto a projection plane and an imaging unit that captures the projection plane, detects corresponding points between pixels of the projection unit and pixels of the imaging unit of the projection imaging apparatus whose posture has not changed; and a projection plane shape estimation unit that estimates a shape of the projection plane based on the corresponding points, detected by the corresponding point detection unit, between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus whose posture has not changed.
- (10) The information processing apparatus according to (9), wherein the projection plane shape estimation unit estimates the shape of the projection plane for a portion whose shape is unknown.
- (11) An information processing method including: for a projection imaging apparatus having a projection unit that projects an image onto a projection plane and an imaging unit that captures the projection plane, detecting corresponding points between pixels of the projection unit and pixels of the imaging unit of the projection imaging apparatus whose posture has not changed; and estimating a shape of the projection plane based on the detected corresponding points.
- (12) A projection imaging apparatus including: a projection unit that projects an image onto a projection plane; an imaging unit that captures the projection plane; a sensor unit that detects at least one of a position and a direction; a determination unit that determines whether or not a posture has changed based on a sensor output of the sensor unit; a corresponding point detection unit that, when the determination unit determines that the posture has changed, detects corresponding points between pixels of the projection unit and pixels of the imaging unit with respect to another projection imaging apparatus that includes the projection unit, the imaging unit, and the sensor unit and whose posture has not changed; and a relative posture estimation unit that estimates a relative posture with respect to the other projection imaging apparatus based on the corresponding points detected by the corresponding point detection unit.
- (13) The projection imaging apparatus according to (12), wherein the corresponding point detection unit detects corresponding points between the pixels of the projection unit of the other projection imaging apparatus and the pixels of the imaging unit of the projection imaging apparatus itself.
- (14) The projection imaging apparatus according to (12) or (13), wherein the corresponding point detection unit detects corresponding points between the pixels of the projection unit of the projection imaging apparatus itself and the pixels of the imaging unit of the other projection imaging apparatus.
- (15) The projection imaging apparatus according to any one of (12) to (14), further including: a corresponding point change detection unit that detects a change in the corresponding points between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus itself; and a posture change detection unit that detects a posture change of the projection imaging apparatus itself based on the sensor output of the sensor unit, wherein the determination unit is configured to determine whether or not the posture of the projection imaging apparatus itself has changed based on a detection result of the corresponding point change detection unit and a detection result of the posture change detection unit.
- (16) The projection imaging apparatus according to any one of (12) to (15), wherein the corresponding point detection unit is further configured to detect corresponding points between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus itself, and the projection imaging apparatus further includes a projection plane shape estimation unit that estimates a shape of the projection plane based on the corresponding points between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus itself detected by the corresponding point detection unit.
- (17) The projection imaging apparatus according to (16), wherein the projection plane shape estimation unit estimates the shape of the projection plane for a portion of the projection plane whose shape is unknown within a range onto which an image is projected by the projection unit of the projection imaging apparatus itself.
- (18) An information processing method for a projection imaging apparatus including a projection unit that projects an image onto a projection plane, an imaging unit that captures the projection plane, and a sensor unit that detects at least one of a position and a direction, the method including: determining whether or not a posture has changed based on a sensor output of the sensor unit; detecting, when it is determined that the posture has changed, corresponding points between pixels of the projection unit and pixels of the imaging unit with respect to another projection imaging apparatus that includes the projection unit, the imaging unit, and the sensor unit and whose posture has not changed; and estimating a relative posture with respect to the other projection imaging apparatus based on the detected corresponding points.
- (19) A projection imaging apparatus including: a projection unit that projects an image onto a projection plane; an imaging unit that captures the projection plane; a sensor unit that detects at least one of a position and a direction; a determination unit that determines whether or not a posture has changed based on a sensor output of the sensor unit; a corresponding point detection unit that, when the determination unit determines that the posture has not changed, detects corresponding points between pixels of the projection unit and pixels of the imaging unit of the projection imaging apparatus itself; and a projection plane shape estimation unit that estimates a shape of the projection plane based on the corresponding points detected by the corresponding point detection unit.
- (20) An information processing method for a projection imaging apparatus including a projection unit that projects an image onto a projection plane, an imaging unit that captures the projection plane, and a sensor unit that detects at least one of a position and a direction, the method including: determining whether or not a posture has changed based on a sensor output of the sensor unit; detecting, when it is determined that the posture has not changed, corresponding points between pixels of the projection unit and pixels of the imaging unit of the projection imaging apparatus itself; and estimating a shape of the projection plane based on the detected corresponding points.
- 100 projection imaging system, 101 control device, 102 projection imaging device, 103 communication cable, 104 projection plane, 105 projection image, 106 input image, 107 corrected projection image, 111 projection unit, 112 imaging unit, 121 solid object, 201 control unit, 231 processing control unit, 232 projection control unit, 233 imaging control unit, 234 sensor control unit, 235 image projection unit, 236 corresponding point change detection unit, 237 sensor information change detection unit, 238 change determination unit, 239 corresponding point detection unit, 240 attitude estimation unit, 241 projection surface shape estimation unit, 242 correction information update unit, 301 projection port, 302 imaging port, 303 battery, 311 control unit, 312 sensor unit, 400 projection imaging system, 410 projection imaging system, 411 projection imaging apparatus, 412 projection imaging apparatus, 413 projection imaging apparatus, 414 network
Abstract
Description
1. First Embodiment (Projection Imaging System)
2. Second Embodiment (Corresponding Point Detection Process)
3. Third Embodiment (Projection Imaging System)
<Projection imaging system>
FIG. 1 shows a main configuration example of a projection imaging system to which a control device, which is an embodiment of an information processing apparatus to which the present technology is applied, and a projection imaging apparatus, which is an embodiment of a projection imaging apparatus to which the present technology is applied, are applied. The projection imaging system 100 shown in FIG. 1 is a system that projects images using a plurality of projection imaging apparatuses 102.
In FIG. 1, the projection image 105-1 on the projection plane 104 is an image projected by the projection unit 111-1 of the projection imaging apparatus 102-1. The projection image 105-2 is an image projected by the projection unit 111-2 of the projection imaging apparatus 102-2. The projection image 105-3 is an image projected by the projection unit 111-3 of the projection imaging apparatus 102-3. The projection image 105-4 is an image projected by the projection unit 111-4 of the projection imaging apparatus 102-4. When it is not necessary to distinguish the projection images 105-1 to 105-4 from one another, they are referred to as projection images 105.
The control device 101 can control each projection imaging apparatus 102 to perform so-called projection mapping. For example, the control device 101 can cause each projection imaging apparatus 102 to project an image so that the images are projected as one projection image onto the single projection region described above.
When combining projection images in this way, the control device 101 performs correction processing such as alignment, geometric correction, and image quality correction (for example, luminance, color, resolution, and the like) on each image projected by each projection imaging apparatus 102, so that the projection images projected by the respective projection units 111 are appropriately combined without incongruity to form a single projection image (corrected projection image 107). In particular, blending processing or the like is performed on the overlap regions projected from a plurality of projection units 111 so as to obtain an image without incongruity.
How the images are corrected in these corrections is set according to the posture (position (translational component) and direction (rotational component)) of each projection imaging apparatus 102 (projection unit 111), the shape of the projection plane 104, and the like. In other words, the control device 101 estimates the posture of each projection imaging apparatus 102 (projection unit 111) and the shape of the projection plane 104, and sets, based on the estimation results, how each image is to be corrected (sets setting information indicating how the correction is to be made).
When performing projection mapping as described above, the control device 101 obtains, before projecting an image such as content, the pixel correspondence between the projection unit 111 and the imaging unit 112 for all projection imaging apparatuses 102, estimates the posture of each projection imaging apparatus 102 (projection unit 111) and the shape of the projection plane 104, and sets the correction information. Then, when projecting an image such as content, the control device 101 corrects the image in accordance with that correction information.
As described above, the corrected projection image 107 is realized by correcting the image to be projected in accordance with the correction information set in advance. Therefore, if the environment changes during image projection, the correction information may become an inappropriate value, and the image quality of the corrected projection image 107 may be degraded.
As described above, when a single picture is projected using a plurality of projection imaging apparatuses 102, it is necessary to sense, using the imaging units 112, the portions where the projections of the respective projection units 111 overlap on the projection plane 104, that is, the overlap regions, and to grasp how they overlap.
However, with this method, in order to obtain the pixel correspondence, it is necessary to project a pattern image such as that shown in FIG. 6, and the image projection has to be interrupted each time. In other words, viewing by observers watching the projection image has to be disturbed. Therefore, a technique called "online sensing", in which the pixel correspondence is obtained while projecting an image such as content, is conceivable. Examples of such online sensing include a method of embedding a gray code or the like in the projection image, a method using image feature amounts such as SIFT, and a method using invisible light such as Infrared (infrared light).
Therefore, as in the table shown in FIG. 7, the processing to be executed as the processing related to updating the correction information used for geometric correction and the like is selected according to the change factor.
FIG. 10 is a block diagram showing a main configuration example of the control device 101, which is an embodiment of the information processing apparatus to which the present technology is applied.
FIG. 11 shows an example of the appearance of the projection imaging apparatus 102. As described above, the projection imaging apparatus 102 has the projection unit 111 and the imaging unit 112, and its housing is provided with optical devices such as a projection port (lens mechanism) 301 for projecting an image and an imaging port (lens mechanism) 302 for imaging a subject. The projection imaging apparatus 102 may be of any size, but may be, for example, a portable (small) apparatus. In that case, as shown in FIG. 11, a battery 303 may be provided in the housing of the projection imaging apparatus 102. By providing the battery 303, the projection imaging apparatus 102 can be driven without an external power supply, so that the degree of freedom of its installation position can be improved.
FIG. 12 is a block diagram showing a main configuration example of the projection imaging apparatus 102.
FIG. 13 is a block diagram showing a main configuration example of the projection unit 111. As shown in FIG. 13, the projection unit 111 includes a video processor 351, a laser driver 352, laser output units 353-1 to 353-3, mirrors 354-1 to 354-3, a MEMS (Micro Electro Mechanical Systems) driver 355, and a MEMS mirror 356.
Next, an example of the flow of the correction information update process executed by the control unit 201 of the control device 101 will be described with reference to the flowchart of FIG. 15.
Next, an example of the flow of the corresponding point detection process executed in step S104 of FIG. 15 will be described with reference to the flowchart of FIG. 16.
Next, an example of the flow of the projection plane shape estimation process executed in step S106 of FIG. 15 will be described with reference to the flowchart of FIG. 17.
Next, an example of the flow of the corresponding point detection process executed in step S108 of FIG. 15 will be described with reference to the flowchart of FIG. 18.
Next, an example of the flow of the projection plane shape estimation process executed in step S109 of FIG. 15 will be described with reference to the flowchart of FIG. 19.
<Corresponding point detection process>
In the above description, when the corresponding points of the pixels between the projection unit 111 and the imaging unit 112 are obtained between projection imaging apparatuses 102, the pattern image is projected from the projection unit 111 of another projection imaging apparatus 102 and the projection image is captured by the imaging unit 112 of the projection imaging apparatus 102 to be processed; however, the projection and imaging of this image may be reversed. That is, the pattern image may be projected from the projection unit 111 of the projection imaging apparatus 102 to be processed, and the projection image may be captured by the imaging unit 112 of another projection imaging apparatus 102. In other words, corresponding points between the pixels of the imaging unit 112 of another projection imaging apparatus 102 and the pixels of the projection unit 111 of the projection imaging apparatus 102 to be processed may be detected. The posture of the projection imaging apparatus 102 to be processed can also be estimated from these corresponding points in the same manner as in the first embodiment.
<Configuration example of projection imaging system>
Note that configuration examples of the projection imaging apparatus and the projection imaging system to which the present technology is applied are not limited to the examples described above. For example, as in the projection imaging system 400 shown in A of FIG. 21, the control device 101 may be omitted. That is, the processing related to updating the correction information used for geometric correction and the like may be performed by a device other than the control device 101. For example, the projection imaging apparatus 102 may perform it.
The series of processes described above can be executed by hardware or by software. When the series of processes described above is executed by software, a program constituting the software is installed from a network or a recording medium.
In this specification, a system means a set of a plurality of components (devices, modules (parts), and the like), regardless of whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
(1) An information processing apparatus including: a corresponding point detection unit that, for projection imaging apparatuses each having a projection unit that projects an image onto a projection plane and an imaging unit that captures the projection plane, detects corresponding points between pixels of the projection unit and pixels of the imaging unit between the projection imaging apparatus whose posture has changed and the projection imaging apparatus whose posture has not changed; and a relative posture estimation unit that estimates a relative posture of the projection imaging apparatus whose posture has changed with respect to the projection imaging apparatus whose posture has not changed, based on the corresponding points between the pixels of the projection unit and the pixels of the imaging unit detected by the corresponding point detection unit between the projection imaging apparatus whose posture has changed and the projection imaging apparatus whose posture has not changed.
(2) The information processing apparatus according to (1), wherein the corresponding point detection unit detects corresponding points between the pixels of the projection unit of the projection imaging apparatus whose posture has not changed and the pixels of the imaging unit of the projection imaging apparatus whose posture has changed.
(3) The information processing apparatus according to (1) or (2), wherein the corresponding point detection unit detects corresponding points between the pixels of the projection unit of the projection imaging apparatus whose posture has changed and the pixels of the imaging unit of the projection imaging apparatus whose posture has not changed.
(4) The information processing apparatus according to any one of (1) to (3), further including: a corresponding point change detection unit that detects a change in the corresponding points between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus; a posture change detection unit that detects a posture change of the projection imaging apparatus based on a sensor output of a sensor unit that detects at least one of a position and a direction of the projection imaging apparatus; and a change determination unit that determines whether or not the posture of the projection imaging apparatus has changed based on a detection result of the corresponding point change detection unit and a detection result of the posture change detection unit, wherein the corresponding point detection unit is configured to detect the corresponding points between the projection imaging apparatus determined by the change determination unit to have changed in posture and the projection imaging apparatus determined by the change determination unit not to have changed in posture, and the relative posture estimation unit is configured to estimate the relative posture of the projection imaging apparatus determined by the change determination unit to have changed in posture with respect to the projection imaging apparatus determined by the change determination unit not to have changed in posture.
(5) The information processing apparatus according to any one of (1) to (4), wherein the corresponding point detection unit is further configured to detect corresponding points between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus whose posture has changed, and the information processing apparatus further includes a projection plane shape estimation unit that estimates a shape of the projection plane based on the corresponding points, detected by the corresponding point detection unit, between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus whose posture has changed.
(6) The information processing apparatus according to any one of (1) to (5), wherein the corresponding point detection unit is further configured to detect corresponding points between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus whose posture has not changed, and the information processing apparatus further includes a projection plane shape estimation unit that estimates a shape of the projection plane based on the corresponding points, detected by the corresponding point detection unit, between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus whose posture has not changed.
(7) The information processing apparatus according to (5) or (6), wherein the projection plane shape estimation unit estimates the shape of the projection plane for a portion whose shape is unknown within a range onto which an image is projected by the projection unit of the projection imaging apparatus whose posture has changed.
(8) An information processing method including: for projection imaging apparatuses each having a projection unit that projects an image onto a projection plane and an imaging unit that captures the projection plane, detecting corresponding points between pixels of the projection unit and pixels of the imaging unit between the projection imaging apparatus whose posture has changed and the projection imaging apparatus whose posture has not changed; and estimating a relative posture of the projection imaging apparatus whose posture has changed with respect to the projection imaging apparatus whose posture has not changed, based on the detected corresponding points between the pixels of the projection unit and the pixels of the imaging unit between the projection imaging apparatus whose posture has changed and the projection imaging apparatus whose posture has not changed.
(9) An information processing apparatus including: a corresponding point detection unit that, for a projection imaging apparatus having a projection unit that projects an image onto a projection plane and an imaging unit that captures the projection plane, detects corresponding points between pixels of the projection unit and pixels of the imaging unit of the projection imaging apparatus whose posture has not changed; and a projection plane shape estimation unit that estimates a shape of the projection plane based on the corresponding points, detected by the corresponding point detection unit, between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus whose posture has not changed.
(10) The information processing apparatus according to (9), wherein the projection plane shape estimation unit estimates the shape of the projection plane for a portion whose shape is unknown.
(11) An information processing method including: for a projection imaging apparatus having a projection unit that projects an image onto a projection plane and an imaging unit that captures the projection plane, detecting corresponding points between pixels of the projection unit and pixels of the imaging unit of the projection imaging apparatus whose posture has not changed; and estimating a shape of the projection plane based on the detected corresponding points.
(12) A projection imaging apparatus including: a projection unit that projects an image onto a projection plane; an imaging unit that captures the projection plane; a sensor unit that detects at least one of a position and a direction; a determination unit that determines whether or not a posture has changed based on a sensor output of the sensor unit; a corresponding point detection unit that, when the determination unit determines that the posture has changed, detects corresponding points between pixels of the projection unit and pixels of the imaging unit with respect to another projection imaging apparatus that includes the projection unit, the imaging unit, and the sensor unit and whose posture has not changed; and a relative posture estimation unit that estimates a relative posture with respect to the other projection imaging apparatus based on the corresponding points detected by the corresponding point detection unit.
(13) The projection imaging apparatus according to (12), wherein the corresponding point detection unit detects corresponding points between the pixels of the projection unit of the other projection imaging apparatus and the pixels of the imaging unit of the projection imaging apparatus itself.
(14) The projection imaging apparatus according to (12) or (13), wherein the corresponding point detection unit detects corresponding points between the pixels of the projection unit of the projection imaging apparatus itself and the pixels of the imaging unit of the other projection imaging apparatus.
(15) The projection imaging apparatus according to any one of (12) to (14), further including: a corresponding point change detection unit that detects a change in the corresponding points between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus itself; and a posture change detection unit that detects a posture change of the projection imaging apparatus itself based on the sensor output of the sensor unit, wherein the determination unit is configured to determine whether or not the posture of the projection imaging apparatus itself has changed based on a detection result of the corresponding point change detection unit and a detection result of the posture change detection unit.
(16) The projection imaging apparatus according to any one of (12) to (15), wherein the corresponding point detection unit is further configured to detect corresponding points between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus itself, and the projection imaging apparatus further includes a projection plane shape estimation unit that estimates a shape of the projection plane based on the corresponding points between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus itself detected by the corresponding point detection unit.
(17) The projection imaging apparatus according to (16), wherein the projection plane shape estimation unit estimates the shape of the projection plane for a portion of the projection plane whose shape is unknown within a range onto which an image is projected by the projection unit of the projection imaging apparatus itself.
(18) An information processing method for a projection imaging apparatus including a projection unit that projects an image onto a projection plane, an imaging unit that captures the projection plane, and a sensor unit that detects at least one of a position and a direction, the method including: determining whether or not a posture has changed based on a sensor output of the sensor unit; detecting, when it is determined that the posture has changed, corresponding points between pixels of the projection unit and pixels of the imaging unit with respect to another projection imaging apparatus that includes the projection unit, the imaging unit, and the sensor unit and whose posture has not changed; and estimating a relative posture with respect to the other projection imaging apparatus based on the detected corresponding points.
(19) A projection imaging apparatus including: a projection unit that projects an image onto a projection plane; an imaging unit that captures the projection plane; a sensor unit that detects at least one of a position and a direction; a determination unit that determines whether or not a posture has changed based on a sensor output of the sensor unit; a corresponding point detection unit that, when the determination unit determines that the posture has not changed, detects corresponding points between pixels of the projection unit and pixels of the imaging unit of the projection imaging apparatus itself; and a projection plane shape estimation unit that estimates a shape of the projection plane based on the corresponding points detected by the corresponding point detection unit.
(20) An information processing method for a projection imaging apparatus including a projection unit that projects an image onto a projection plane, an imaging unit that captures the projection plane, and a sensor unit that detects at least one of a position and a direction, the method including: determining whether or not a posture has changed based on a sensor output of the sensor unit; detecting, when it is determined that the posture has not changed, corresponding points between pixels of the projection unit and pixels of the imaging unit of the projection imaging apparatus itself; and estimating a shape of the projection plane based on the detected corresponding points.
Claims (20)
- An information processing apparatus comprising: a corresponding point detection unit that, for projection imaging apparatuses each having a projection unit that projects an image onto a projection surface and an imaging unit that images the projection surface, detects corresponding points between pixels of the projection unit and pixels of the imaging unit between a projection imaging apparatus whose posture has changed and a projection imaging apparatus whose posture has not changed; and a relative posture estimation unit that estimates a relative posture of the projection imaging apparatus whose posture has changed with respect to the projection imaging apparatus whose posture has not changed, on the basis of the corresponding points, detected by the corresponding point detection unit, between the pixels of the projection unit and the pixels of the imaging unit between the projection imaging apparatus whose posture has changed and the projection imaging apparatus whose posture has not changed.
- The information processing apparatus according to claim 1, wherein the corresponding point detection unit detects corresponding points between pixels of the projection unit of the projection imaging apparatus whose posture has not changed and pixels of the imaging unit of the projection imaging apparatus whose posture has changed.
- The information processing apparatus according to claim 1, wherein the corresponding point detection unit detects corresponding points between pixels of the projection unit of the projection imaging apparatus whose posture has changed and pixels of the imaging unit of the projection imaging apparatus whose posture has not changed.
- The information processing apparatus according to claim 1, further comprising: a corresponding point change detection unit that detects a change in the corresponding points between pixels of the projection unit and pixels of the imaging unit of the projection imaging apparatus; a posture change detection unit that detects a posture change of the projection imaging apparatus on the basis of sensor output of a sensor unit, included in the projection imaging apparatus, that detects at least one of position and direction; and a change determination unit that determines whether or not the posture of the projection imaging apparatus has changed on the basis of the detection result of the corresponding point change detection unit and the detection result of the posture change detection unit, wherein the corresponding point detection unit is configured to detect the corresponding points between the projection imaging apparatus determined by the change determination unit to have changed in posture and the projection imaging apparatus determined by the change determination unit not to have changed in posture, and the relative posture estimation unit is configured to estimate the relative posture of the projection imaging apparatus determined by the change determination unit to have changed in posture with respect to the projection imaging apparatus determined by the change determination unit not to have changed in posture.
- The information processing apparatus according to claim 1, wherein the corresponding point detection unit is further configured to detect corresponding points between pixels of the projection unit and pixels of the imaging unit of the projection imaging apparatus whose posture has changed, the information processing apparatus further comprising a projection surface shape estimation unit that estimates a shape of the projection surface on the basis of the corresponding points, detected by the corresponding point detection unit, between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus whose posture has changed.
- The information processing apparatus according to claim 1, wherein the corresponding point detection unit is further configured to detect corresponding points between pixels of the projection unit and pixels of the imaging unit of the projection imaging apparatus whose posture has not changed, the information processing apparatus further comprising a projection surface shape estimation unit that estimates a shape of the projection surface on the basis of the corresponding points, detected by the corresponding point detection unit, between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus whose posture has not changed.
- The information processing apparatus according to claim 5 or claim 6, wherein the projection surface shape estimation unit estimates the shape of the projection surface for a portion whose shape is unknown within a range of the projection surface onto which an image is projected by the projection unit of the projection imaging apparatus whose posture has changed.
- An information processing method comprising: for projection imaging apparatuses each having a projection unit that projects an image onto a projection surface and an imaging unit that images the projection surface, detecting corresponding points between pixels of the projection unit and pixels of the imaging unit between a projection imaging apparatus whose posture has changed and a projection imaging apparatus whose posture has not changed; and estimating a relative posture of the projection imaging apparatus whose posture has changed with respect to the projection imaging apparatus whose posture has not changed, on the basis of the detected corresponding points between the pixels of the projection unit and the pixels of the imaging unit between the projection imaging apparatus whose posture has changed and the projection imaging apparatus whose posture has not changed.
- An information processing apparatus comprising: a corresponding point detection unit that, for a projection imaging apparatus having a projection unit that projects an image onto a projection surface and an imaging unit that images the projection surface, detects corresponding points between pixels of the projection unit and pixels of the imaging unit of the projection imaging apparatus whose posture has not changed; and a projection surface shape estimation unit that estimates a shape of the projection surface on the basis of the corresponding points, detected by the corresponding point detection unit, between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus whose posture has not changed.
- The information processing apparatus according to claim 9, wherein the projection surface shape estimation unit estimates the shape of the projection surface for a portion whose shape is unknown.
- An information processing method comprising: for a projection imaging apparatus having a projection unit that projects an image onto a projection surface and an imaging unit that images the projection surface, detecting corresponding points between pixels of the projection unit and pixels of the imaging unit of the projection imaging apparatus whose posture has not changed; and estimating a shape of the projection surface on the basis of the detected corresponding points between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus whose posture has not changed.
- A projection imaging apparatus comprising: a projection unit that projects an image onto a projection surface; an imaging unit that images the projection surface; a sensor unit that detects at least one of position and direction; a determination unit that determines whether or not a posture has changed on the basis of sensor output of the sensor unit; a corresponding point detection unit that, when the determination unit determines that the posture has changed, detects corresponding points between pixels of the projection unit and pixels of the imaging unit with another projection imaging apparatus that includes the projection unit, the imaging unit, and the sensor unit and whose posture has not changed; and a relative posture estimation unit that estimates a relative posture with respect to the other projection imaging apparatus on the basis of the corresponding points detected by the corresponding point detection unit.
- The projection imaging apparatus according to claim 12, wherein the corresponding point detection unit detects corresponding points between pixels of the projection unit of the other projection imaging apparatus and pixels of the imaging unit of the projection imaging apparatus itself.
- The projection imaging apparatus according to claim 12, wherein the corresponding point detection unit detects corresponding points between pixels of the projection unit of the projection imaging apparatus itself and pixels of the imaging unit of the other projection imaging apparatus.
- The projection imaging apparatus according to claim 12, further comprising: a corresponding point change detection unit that detects a change in the corresponding points between pixels of the projection unit and pixels of the imaging unit of the projection imaging apparatus itself; and a posture change detection unit that detects a posture change of the projection imaging apparatus itself on the basis of the sensor output of the sensor unit, wherein the determination unit is configured to determine whether or not the posture of the projection imaging apparatus itself has changed on the basis of the detection result of the corresponding point change detection unit and the detection result of the posture change detection unit.
- The projection imaging apparatus according to claim 12, wherein the corresponding point detection unit is further configured to detect corresponding points between pixels of the projection unit and pixels of the imaging unit of the projection imaging apparatus itself, the projection imaging apparatus further comprising a projection surface shape estimation unit that estimates a shape of the projection surface on the basis of the corresponding points, detected by the corresponding point detection unit, between the pixels of the projection unit and the pixels of the imaging unit of the projection imaging apparatus itself.
- The projection imaging apparatus according to claim 16, wherein the projection surface shape estimation unit estimates the shape of the projection surface for a portion whose shape is unknown within a range of the projection surface onto which an image is projected by the projection unit of the projection imaging apparatus itself.
- An information processing method for a projection imaging apparatus including a projection unit that projects an image onto a projection surface, an imaging unit that images the projection surface, and a sensor unit that detects at least one of position and direction, the method comprising: determining whether or not a posture has changed on the basis of sensor output of the sensor unit; when it is determined that the posture has changed, detecting corresponding points between pixels of the projection unit and pixels of the imaging unit with another projection imaging apparatus that includes the projection unit, the imaging unit, and the sensor unit and whose posture has not changed; and estimating a relative posture with respect to the other projection imaging apparatus on the basis of the detected corresponding points.
- A projection imaging apparatus comprising: a projection unit that projects an image onto a projection surface; an imaging unit that images the projection surface; a sensor unit that detects at least one of position and direction; a determination unit that determines whether or not a posture has changed on the basis of sensor output of the sensor unit; a corresponding point detection unit that, when the determination unit determines that the posture has not changed, detects corresponding points between pixels of the projection unit and pixels of the imaging unit of the projection imaging apparatus itself; and a projection surface shape estimation unit that estimates a shape of the projection surface on the basis of the corresponding points detected by the corresponding point detection unit.
- An information processing method for a projection imaging apparatus including a projection unit that projects an image onto a projection surface, an imaging unit that images the projection surface, and a sensor unit that detects at least one of position and direction, the method comprising: determining whether or not a posture has changed on the basis of sensor output of the sensor unit; when it is determined that the posture has not changed, detecting corresponding points between pixels of the projection unit and pixels of the imaging unit of the projection imaging apparatus itself; and estimating a shape of the projection surface on the basis of the detected corresponding points.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/573,050 US10802384B2 (en) | 2015-07-08 | 2016-06-24 | Information processing apparatus and method, and projection imaging apparatus and information processing method |
JP2017527173A JP6915537B2 (ja) | 2015-07-08 | 2016-06-24 | Information processing apparatus and method, and projection imaging apparatus and information processing method |
US16/942,220 US11526072B2 (en) | 2015-07-08 | 2020-07-29 | Information processing apparatus and method, and projection imaging apparatus and information processing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015136727 | 2015-07-08 | ||
JP2015-136727 | 2015-07-08 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/573,050 A-371-Of-International US10802384B2 (en) | 2015-07-08 | 2016-06-24 | Information processing apparatus and method, and projection imaging apparatus and information processing method |
US16/942,220 Continuation US11526072B2 (en) | 2015-07-08 | 2020-07-29 | Information processing apparatus and method, and projection imaging apparatus and information processing method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017006779A1 true WO2017006779A1 (ja) | 2017-01-12 |
Family
ID=57685152
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/068753 WO2017006779A1 (ja) | 2015-07-08 | 2016-06-24 | 情報処理装置および方法、並びに、投影撮像装置および情報処理方法 |
Country Status (3)
Country | Link |
---|---|
US (2) | US10802384B2 (ja) |
JP (1) | JP6915537B2 (ja) |
WO (1) | WO2017006779A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020218028A1 (ja) * | 2019-04-25 | 2020-10-29 | ソニー株式会社 | 画像処理装置、画像処理方法、プログラム、および画像処理システム |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107580779B (zh) | 2015-05-06 | 2020-01-03 | 杜比实验室特许公司 | 图像投影中的热补偿 |
JP2019040003A (ja) * | 2017-08-24 | 2019-03-14 | セイコーエプソン株式会社 | プロジェクターおよびプロジェクターの制御方法 |
US11109006B2 (en) * | 2017-09-14 | 2021-08-31 | Sony Corporation | Image processing apparatus and method |
JP7103387B2 (ja) | 2020-06-16 | 2022-07-20 | セイコーエプソン株式会社 | 画像投射システムの調整要否判定方法、画像投射システム、及び画像投射制御装置 |
JP7200978B2 (ja) * | 2020-06-23 | 2023-01-10 | セイコーエプソン株式会社 | 画像投射システムの調整要否判定方法、画像投射システム、及び画像投射制御装置 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005079939A (ja) * | 2003-09-01 | 2005-03-24 | Mitsubishi Electric Corp | プロジェクター装置 |
JP2005229282A (ja) * | 2004-02-12 | 2005-08-25 | Seiko Epson Corp | プロジェクタおよびマルチプロジェクションディスプレイ |
JP2007295375A (ja) * | 2006-04-26 | 2007-11-08 | Nippon Telegr & Teleph Corp <Ntt> | 投影映像補正装置及び投影映像補正プログラム |
JP2009290412A (ja) * | 2008-05-28 | 2009-12-10 | Nikon Corp | プロジェクタ及びマルチプロジェクションシステム |
JP2011170174A (ja) * | 2010-02-19 | 2011-09-01 | Nippon Telegr & Teleph Corp <Ntt> | 光学投影安定化装置、光学投影安定化方法およびプログラム |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3994290B2 (ja) * | 2005-01-17 | 2007-10-17 | セイコーエプソン株式会社 | 画像処理システム、プロジェクタ、プログラム、情報記憶媒体および画像処理方法 |
US7857461B2 (en) * | 2007-11-06 | 2010-12-28 | Panasonic Corporation | Projector and projection method |
JP2012173378A (ja) * | 2011-02-18 | 2012-09-10 | Seiko Epson Corp | プロジェクター及びその制御方法 |
US9369658B2 (en) * | 2014-01-20 | 2016-06-14 | Lenovo (Singapore) Pte. Ltd. | Image correction of surface projected image |
JP2016100698A (ja) * | 2014-11-19 | 2016-05-30 | 株式会社リコー | 校正装置、校正方法、プログラム |
- 2016-06-24: US application US15/573,050 filed, granted as US10802384B2 (Active)
- 2016-06-24: JP application JP2017527173A filed, granted as JP6915537B2 (Active)
- 2016-06-24: PCT application PCT/JP2016/068753 filed, published as WO2017006779A1 (Application Filing)
- 2020-07-29: US application US16/942,220 filed, granted as US11526072B2 (Active)
Also Published As
Publication number | Publication date |
---|---|
JPWO2017006779A1 (ja) | 2018-04-19 |
JP6915537B2 (ja) | 2021-08-04 |
US11526072B2 (en) | 2022-12-13 |
US20200355994A1 (en) | 2020-11-12 |
US10802384B2 (en) | 2020-10-13 |
US20180164670A1 (en) | 2018-06-14 |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16821248; Country of ref document: EP; Kind code of ref document: A1
 | WWE | Wipo information: entry into national phase | Ref document number: 15573050; Country of ref document: US
 | ENP | Entry into the national phase | Ref document number: 2017527173; Country of ref document: JP; Kind code of ref document: A
 | NENP | Non-entry into the national phase | Ref country code: DE
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 16821248; Country of ref document: EP; Kind code of ref document: A1