WO2019176594A1 - Projection control device, projection apparatus, projection control method, and projection control program - Google Patents

Projection control device, projection apparatus, projection control method, and projection control program

Info

Publication number
WO2019176594A1
Authority
WO
WIPO (PCT)
Prior art keywords
projection
correction
unit
shape
overlapping
Prior art date
Application number
PCT/JP2019/008190
Other languages
French (fr)
Japanese (ja)
Inventor
晶啓 石塚
Original Assignee
FUJIFILM Corporation
Priority date
Filing date
Publication date
Application filed by FUJIFILM Corporation
Priority to JP2020506397A (JPWO2019176594A1)
Publication of WO2019176594A1

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/74Projection arrangements for image reproduction, e.g. using eidophor

Definitions

  • The present invention relates to a projection control device, a projection device, a projection control method, and a projection control program.
  • A projection device having a photographing function is known. Patent Documents 1 to 3 describe projection devices in which the projection image projected by a projection unit is photographed by a photographing unit, and it is detected, based on the captured image thus obtained, that an object such as a human hand overlaps the projection image.
  • Patent Document 1 further describes that, when an object is detected to overlap the projection image, the portion of the projection image that overlaps the object is corrected to improve the appearance of the projection image.
  • In an apparatus that can perform such correction with high accuracy, assume that a person holds a hand with all fingers closed over the projection image. The correction then renders the hand almost invisible. If, from this state, the person opens only the index finger of the hand held over the projection image, the index finger becomes recognizable in the captured image, and a correction that erases the index finger is performed in turn.
  • Now suppose that such an apparatus is given a so-called gesture operation function, which recognizes that a person's hand has taken a specific shape and performs specific processing. When high-accuracy correction is performed as described above, only the index finger can be detected by object recognition on the corrected captured image, even though a hand with a raised index finger actually overlaps the projection image.
  • Consequently, if the specific shape to be recognized is, for example, a hand with the index finger raised, the gesture performed by the person is not recognized accurately. Patent Documents 1 to 3 do not recognize this problem that such gesture recognition becomes impossible.
  • Patent Document 1: JP 2012-208439 A; Patent Document 2: JP 2012-113564 A; Patent Document 3: JP 2013-257686 A
  • The present invention has been made in view of the above circumstances, and an object thereof is to provide a projection control device capable of achieving both improved projection image quality and improved gesture recognition accuracy, a projection device including the projection control device, a projection control method, and a projection control program.
  • The projection control device of the present invention comprises: an overlapping region detection unit that detects an overlapping region where an object overlaps the projection image, based on captured image data obtained by a photographing unit photographing the projection image projected from a projection unit; a correction unit that controls the projection unit to correct the overlapping region and stores a correction history indicating the content of the correction; a recognition unit that recognizes the shape of the object overlapping the projection image based on the correction history; and a display control unit that changes the projection image based on the shape of the object recognized by the recognition unit.
  • The projection device of the present invention includes the above projection control device and the projection unit. The projection device may also include the photographing unit in addition to the projection control device and the projection unit.
  • The projection control method of the present invention comprises: an overlapping region detection step of detecting an overlapping region where an object overlaps the projection image, based on captured image data obtained by a photographing unit photographing the projection image projected from a projection unit; a correction step of controlling the projection unit to correct the overlapping region and storing a correction history indicating the content of the correction; a recognition step of recognizing the shape of the object overlapping the projection image based on the correction history; and a display control step of changing the projection image based on the shape of the object recognized in the recognition step.
  • The projection control program of the present invention causes a computer to execute the above overlapping region detection step, correction step, recognition step, and display control step.
  • According to the present invention, it is possible to provide a projection control device capable of achieving both improvement in the quality of a projection image and improvement in gesture recognition accuracy, a projection device including the projection control device, a projection control method, and a projection control program.
  • FIG. 1 is a schematic diagram showing the schematic configuration of a projector 100 according to an embodiment of the projection device of the present invention.
  • FIG. 2 is a schematic diagram showing a configuration example of the display unit 1 of the projector 100 shown in FIG. 1.
  • FIG. 3 is a block diagram showing the internal configuration of the system control unit 10 of the projector 100 shown in FIG. 1.
  • FIG. 4 is a functional block diagram of the control unit 11 shown in FIG. 3.
  • FIG. 5 is a flowchart for explaining the operation of the projector 100 shown in FIG. 1.
  • FIGS. 6 to 10 are schematic diagrams showing transition states of the projection image when the user of the projector 100 changes the projection image by a gesture.
  • FIG. 11 is a flowchart for explaining a modification of the operation of the projector 100 shown in FIG. 1.
  • FIG. 12 is a schematic diagram showing a state in which the shapes of correction regions based on two correction histories are combined.
  • FIG. 1 is a schematic diagram showing a schematic configuration of a projector 100 which is an embodiment of the projection apparatus of the present invention.
  • FIG. 2 is a schematic diagram illustrating a configuration example of the display unit 1 of the projector 100 illustrated in FIG. 1.
  • The projector 100 is configured to project an image onto a screen SC and to photograph a range including the projection image projected onto the screen SC.
  • The projector 100 includes a display unit 1, a projection optical system 2, a common optical system 3, an optical member 4, a photographing optical system 5, an image sensor 6, and a system control unit 10 that performs overall control.
  • The display unit 1, the projection optical system 2, the optical member 4, and the common optical system 3 constitute a projection unit 20. The image sensor 6, the photographing optical system 5, the optical member 4, and the common optical system 3 constitute a photographing unit 30.
  • The display unit 1 displays a projection image based on input projection image data. As shown in FIG. 2, the display unit 1 includes a light source unit 50 and a light modulation element 44.
  • The light source unit 50 includes an R light source 41r (a red light source that emits red light), a G light source 41g (a green light source that emits green light), a B light source 41b (a blue light source that emits blue light), a dichroic prism 43, a collimator lens 42r provided between the R light source 41r and the dichroic prism 43, a collimator lens 42g provided between the G light source 41g and the dichroic prism 43, and a collimator lens 42b provided between the B light source 41b and the dichroic prism 43.
  • The dichroic prism 43 is an optical member for guiding the light emitted from each of the R light source 41r, the G light source 41g, and the B light source 41b onto the same optical path. Specifically, the dichroic prism 43 transmits the red light collimated by the collimator lens 42r and emits it to the light modulation element 44, reflects the green light collimated by the collimator lens 42g and emits it to the light modulation element 44, and reflects the blue light collimated by the collimator lens 42b and emits it to the light modulation element 44.
  • The optical member having such a function is not limited to a dichroic prism; for example, a cross dichroic mirror may be used.
  • Each of the R light source 41r, the G light source 41g, and the B light source 41b uses a light emitting element such as a laser or an LED (Light Emitting Diode).
  • The number of light sources included in the light source unit 50 may be one, two, or four or more.
  • The light modulation element 44 spatially modulates the light emitted from the dichroic prism 43 based on the image data, and emits the spatially modulated image light (red image light, blue image light, and green image light) to the projection optical system 2 shown in FIG. 1.
  • FIG. 2 shows an example in which a DMD (Digital Micromirror Device) is used as the light modulation element 44, but it is also possible to use, for example, an LCOS (liquid crystal on silicon) device, a MEMS (micro-electro-mechanical systems) element, or a liquid crystal display element.
  • The display unit 1 may instead display an image using a self-luminous organic EL (electro-luminescence) display element and cause the displayed image to enter the projection optical system 2, or it may display an image by scanning with laser light.
  • The optical member 4 is composed of, for example, a half mirror, a beam splitter, or a polarizing member. It transmits the image light that has passed through the projection optical system 2 and guides it to the common optical system 3, and reflects the subject light that has passed through the common optical system 3 and guides it to the photographing optical system 5.
  • The projection optical system 2 is an optical system into which the image light from the display unit 1 is incident, and includes at least one lens. The light that has passed through the projection optical system 2 enters the optical member 4, passes through it, and enters the common optical system 3.
  • The common optical system 3 is an optical system that projects the light that has passed through the projection optical system 2 onto the screen SC, which is the projection target, and that forms an image of the subject on the screen SC side (the subject in a range including the projection image projected by the projection unit 20). It includes at least one lens.
  • The subject light incident on the common optical system 3 from the screen SC side passes through the common optical system 3, is reflected by the optical member 4, and enters the photographing optical system 5.
  • The photographing optical system 5 is an optical system for forming an image, on the image sensor 6, of the subject light that has passed through the common optical system 3 and been reflected by the optical member 4. It is disposed in front of the image sensor 6, focuses the subject light from the optical member 4 onto the image sensor 6, and includes at least one lens.
  • The image sensor 6 is, for example, a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
  • The photographing range of the photographing unit 30 is set to a range including the projection range of the image on the screen SC by the projection unit 20. Therefore, the photographing unit 30 can photograph the entire projection image projected by the projection unit 20.
  • FIG. 3 is a block diagram showing an internal configuration of the system control unit 10 of the projector 100 shown in FIG.
  • The system control unit 10 includes a control unit 11, a ROM (Read Only Memory) 12, and a RAM (Random Access Memory) 13.
  • In terms of hardware structure, the control unit 11 is realized by one or more of various types of processors. These include a CPU (Central Processing Unit), which is a general-purpose processor that executes a program to perform various processes; a programmable logic device (PLD) such as an FPGA (Field Programmable Gate Array), which is a processor whose circuit configuration can be changed after manufacture; and a dedicated electric circuit such as an ASIC (Application Specific Integrated Circuit), which is a processor having a circuit configuration designed exclusively for executing specific processing.
  • More specifically, the structure of each of these processors is an electric circuit in which circuit elements such as semiconductor elements are combined.
  • The control unit 11 may be composed of one of these processors, or of a combination of two or more processors of the same or different types (for example, a combination of a plurality of FPGAs, or a combination of a CPU and an FPGA).
  • By controlling the projection image data input to the light modulation element 44 of the display unit 1, the control unit 11 controls the content of the projection image projected from the projection unit 20 onto the screen SC, as well as the luminance and saturation of each pixel of the projection image.
  • The control unit 11 also controls the image sensor 6 to execute photographing of the subject by the image sensor 6.
  • FIG. 4 is a functional block diagram of the control unit 11 shown in FIG.
  • By executing an application program including a projection control program, the control unit 11 functions as a projection control device comprising an image processing unit 11A, an overlapping region detection unit 11B, a correction unit 11C, a recognition unit 11D, and a display control unit 11E.
  • The image processing unit 11A acquires the captured image signal obtained by the image sensor 6 photographing the subject (a range including the projection image projected by the projection unit 20) and processes it to generate captured image data.
  • Based on the captured image data generated by the image processing unit 11A, the overlapping region detection unit 11B detects an overlapping region where an object such as a human hand overlaps the projection image projected on the screen SC by the projection unit 20.
  • For example, the overlapping region detection unit 11B determines, for each region, the similarity between the projection image data input to the light modulation element 44 of the display unit 1 and the captured image data generated by the image processing unit 11A, and detects a region whose similarity is equal to or less than a threshold as an overlapping region.
  • The similarity is, for example, an index indicating the degree to which the positions or numbers of feature points, such as edges, contained in the images coincide. When an image is projected onto the flat screen SC, as opposed to onto a three-dimensional object, the positions and numbers of the feature points in the photographed image differ, so the presence or absence of an overlapping region can be determined from this degree of coincidence.
  • Alternatively, since the portion of the projection image that falls on an object becomes distorted, the overlapping region detection unit 11B may detect a region where distortion occurs in the captured image data as an overlapping region.
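  • As a concrete illustration of this detection scheme, the sketch below compares edge feature points block by block, assuming aligned 8-bit grayscale images. The block size, Canny thresholds, and similarity threshold are illustrative assumptions, not values taken from this publication.

    import cv2
    import numpy as np

    def detect_overlap_regions(projection_img, captured_img, block=32, threshold=0.5):
        # Block-wise similarity between the projection image data and the
        # captured image. Edge maps stand in for the "feature points such as
        # edges"; a block whose edge similarity is at or below the threshold
        # is flagged as part of an overlapping region.
        edges_proj = cv2.Canny(projection_img, 100, 200)
        edges_capt = cv2.Canny(captured_img, 100, 200)
        h, w = edges_proj.shape
        mask = np.zeros((h, w), dtype=bool)
        for y in range(0, h, block):
            for x in range(0, w, block):
                p = edges_proj[y:y+block, x:x+block] > 0
                c = edges_capt[y:y+block, x:x+block] > 0
                union = np.count_nonzero(p | c)
                if union == 0:
                    continue  # featureless block: no evidence either way
                similarity = np.count_nonzero(p & c) / union
                if similarity <= threshold:
                    mask[y:y+block, x:x+block] = True
        return mask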
  • When an overlapping region is detected by the overlapping region detection unit 11B, the correction unit 11C controls the projection unit 20 to correct the overlapping region, and stores a correction history indicating the content of the correction in the ROM 12.
  • As described above, in the overlapping region of the projection image data, the positions or numbers of the feature points deviate greatly from those of the original data, or distortion has occurred. The correction unit 11C corrects the overlapping region of the projection image by changing at least one of the luminance and the saturation of the overlapping region in the projection image data, or by correcting the distortion of the overlapping region. By this correction, the overlapping region that would otherwise appear in the projection image on the screen SC is made to look the same as when no object is present.
  • When such a correction is performed, the correction unit 11C generates correction history information indicating which pixels in the projection image data were corrected, and stores this information in the ROM 12. The correction history information includes, for example, the time at which the correction was performed and the coordinates of the corrected pixels (information indicating the corrected region).
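  • A minimal sketch of such a correction-history entry, together with the two-entry retention rule described in the operation flow below, might look as follows. The field and function names are assumptions introduced for illustration; the publication only specifies the kinds of content stored.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class CorrectionHistory:
        # One correction history: when the correction was performed and
        # which pixels of the projection image data were corrected.
        corrected_at: datetime
        pixels: set  # set of (x, y) coordinates of corrected pixels

    def store_history(rom: list, entry: CorrectionHistory) -> None:
        # Keep only the two newest entries; older entries are overwritten,
        # mirroring the behavior described for step S5 below.
        rom.append(entry)
        rom.sort(key=lambda e: e.corrected_at)
        del rom[:-2]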
  • The recognition unit 11D recognizes the shape of the object overlapping the projection image projected by the projection unit 20, based on the correction history stored in the ROM 12. This recognition method is described later.
  • The display control unit 11E changes the projection image projected from the projection unit 20 onto the screen SC, based on the shape of the object recognized by the recognition unit 11D.
  • Specifically, the display control unit 11E determines whether the shape of the object recognized by the recognition unit 11D is a predetermined specific shape and, if so, changes the projection image projected from the projection unit 20 onto the screen SC. The display control unit 11E changes the projection image by switching the projection image data input to the light modulation element 44 of the display unit 1 to different data.
  • This specific shape is stored in advance in the ROM 12 as a shape for instructing the projector 100 to change the projection image (for example, an instruction to change the displayed image during a presentation, or an instruction to start the motion of an animation included in the displayed image).
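  • The publication does not specify how the recognized shape is compared with the stored specific shape. One simple possibility, shown below purely as an assumption, is an intersection-over-union test between the recognized pixel region and a stored template region; a practical implementation would first need to normalize for the hand's position and scale.

    def shape_matches(shape: set, template: set, min_iou: float = 0.8) -> bool:
        # Hypothetical specific-shape test: intersection-over-union between
        # the recognized region and a template region stored in the ROM 12.
        # The metric and the 0.8 threshold are illustrative assumptions.
        if not shape or not template:
            return False
        return len(shape & template) / len(shape | template) >= min_iou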
  • FIG. 5 is a flowchart for explaining the operation of the projector 100 shown in FIG. 1. FIGS. 6 to 10 are schematic diagrams illustrating transition states of the projection image when the user of the projector 100 changes the projection image by a gesture.
  • First, the control unit 11 causes the projection unit 20 to project a projection image based on the projection image data designated by the user onto the screen SC (step S1). FIG. 6 shows a state in which the projection image 40 is projected on the screen SC in step S1.
  • When projection is started in step S1, the control unit 11 causes the photographing unit 30 to photograph the projection image projected on the screen SC (step S2).
  • The captured image signal output from the image sensor 6 by the photographing in step S2 is processed by the image processing unit 11A to generate captured image data, and the overlapping region detection unit 11B acquires this captured image data (step S3).
  • The overlapping region detection unit 11B compares the acquired captured image data with the projection image data currently input to the display unit 1, and detects an overlapping region based on the comparison result (step S4). If no overlapping region is detected (step S4: NO), the process returns to step S2.
  • If an overlapping region is detected (step S4: YES), the correction unit 11C corrects the overlapping region in the projection image data currently input to the display unit 1, and the projection unit 20 projects a projection image based on the corrected projection image data onto the screen SC. In addition, the correction unit 11C stores correction history information indicating the content of the correction in the ROM 12 (step S5).
  • FIG. 7 shows a state in which a human hand H (a hand with all fingers closed) has been inserted in front of the range where the projection image 40 is projected on the screen SC after step S1. In this state, an overlapping region Ha (the hatched area) overlapping the hand H is detected in the projection image 40 in step S4.
  • When the overlapping region Ha is detected in this way, the portion corresponding to the overlapping region Ha in the projection image data from which the projection image 40 is generated is corrected by the process of step S5, and, as shown in FIG. 8, the hand H becomes invisible under the corrected projection image 40x.
  • At this time, the coordinates of the pixels detected as the overlapping region Ha among the pixels of the projection image data, together with the time at which the correction was performed, are stored in the ROM 12 as a correction history in step S5. Note that only the two most recent correction histories are stored in the ROM 12, and older correction histories are overwritten.
  • After step S5, the recognition unit 11D determines whether two correction histories are stored in the ROM 12 (step S6).
  • If two correction histories are not stored (step S6: NO), the recognition unit 11D recognizes the shape of the overlapping region detected in step S4 as the shape of the object overlapping the projection image projected on the screen SC (step S7).
  • Next, the recognition unit 11D determines whether the shape of the object recognized in step S7 is the specific shape (step S8). If it determines that the shape is not the specific shape (step S8: NO), the process returns to step S2.
  • Here, assume that the registered specific shape is the shape of a hand in which only the index finger is opened from a state where all fingers are closed (the state of the hand H in FIG. 7). If the overlapping region detected in step S4 is the one shown in FIG. 7, the determination in step S8 is NO, and the process returns to step S2.
  • Thereafter, when the index finger F of the hand H shown in FIG. 8 is opened and the hand changes to the state shown in FIG. 9, an overlapping region Hb between the index finger F and the projection image 40x is detected in step S4, the overlapping region Hb is corrected in step S5, and correction history information indicating that the overlapping region Hb was corrected is stored in the ROM 12. At this point, two correction histories are stored in the ROM 12, so the determination in step S6 is YES.
  • If two correction histories are stored (step S6: YES), the recognition unit 11D recognizes, as the shape of the object overlapping the projection image projected on the screen SC, the shape obtained by combining the shape of the overlapping region detected in step S4 (the overlapping region Hb in FIG. 9) with the shape of the correction region based on the older of the two correction histories (the information indicating the correction content of the overlapping region Ha in FIG. 8) (step S10).
  • That is, in step S10, the shape obtained by combining the overlapping region Ha shown in FIG. 7 and the overlapping region Hb shown in FIG. 9 is recognized as the shape of the object.
  • The shape obtained by combining the overlapping region Ha shown in FIG. 7 and the overlapping region Hb shown in FIG. 9 matches the shape of a hand in which only the index finger is opened from a state where all fingers are closed. The determination in step S8 is therefore YES, and the process of step S9 is performed.
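  • The recognition rule of steps S7 and S10 can be sketched as follows, reusing the CorrectionHistory type from the earlier sketch. Representing a region shape as a set of pixel coordinates is an assumption; the publication does not prescribe a data representation.

    def recognize_object_shape(detected_overlap: set, histories: list) -> set:
        # Step S7: with fewer than two correction histories, the detected
        # overlapping region itself is taken as the object shape.
        if len(histories) < 2:
            return detected_overlap
        # Step S10: combine the detected overlap (e.g. the index finger Hb)
        # with the correction region of the OLDER history (e.g. the closed
        # hand Ha) by taking the union of the two pixel sets.
        oldest = min(histories, key=lambda e: e.corrected_at)
        return detected_overlap | oldest.pixels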
  • In step S9, the display control unit 11E changes the projection image data input to the display unit 1 to data designated in advance, and erases the correction histories stored in the ROM 12.
  • After step S9, the process returns to step S1, and the changed projection image 40A is projected onto the screen SC as shown in FIG. 10. At this time, the region recognized as the specific shape in the changed projection image (the region obtained by combining the overlapping region Ha and the overlapping region Hb) is given the same correction as that performed by the correction unit 11C, so the region of the projection image 40A overlapping the hand H remains corrected and the hand H cannot be seen.
  • As described above, with the projector 100, when an object overlaps the projection image projected on the screen SC, the overlapping region overlapping the object is corrected, so that a display in which the observer does not feel the presence of the object is possible.
  • Moreover, even when the correction unit 11C performs correction so accurately that the hand H cannot be seen, the shape of the hand held in front of the screen SC can be recognized based on the correction history information stored in the ROM 12, and operation by gesture can continue.
  • The projector 100 can thus improve both the quality of the projection image and the accuracy of gesture recognition.
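  • Putting the pieces together, the flowchart of FIG. 5 corresponds to a control loop of roughly the following shape. The projector facade and the helpers detect_overlap and correct_region are hypothetical names introduced for illustration; store_history, recognize_object_shape, and shape_matches are the sketches given earlier.

    def run_projector(projector, specific_shape: set, rom: list) -> None:
        projector.project(projector.current_image)            # step S1
        while True:
            captured = projector.photograph()                 # steps S2-S3
            overlap = detect_overlap(projector.current_image, captured)  # step S4
            if not overlap:
                continue                                      # step S4: NO
            store_history(rom, correct_region(projector, overlap))       # step S5
            shape = recognize_object_shape(overlap, rom)      # steps S6, S7/S10
            if shape_matches(shape, specific_shape):          # step S8
                projector.change_image()                      # step S9
                rom.clear()                                   # erase histories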
  • FIG. 11 is a flowchart for explaining a modification of the operation of the projector 100 shown in FIG.
  • The flowchart shown in FIG. 11 is obtained by changing step S10 of the flowchart shown in FIG. 5 to step S10a. In FIG. 11, the same processes as those in FIG. 5 are denoted by the same reference numerals, and their description is omitted.
  • In step S10a, the recognition unit 11D recognizes, as the shape of the object overlapping the projection image projected on the screen SC, the shape obtained by combining the shapes of the correction regions based on each of the two correction histories stored in the ROM 12.
  • Specifically, when the correction history of the overlapping region Ha shown in FIG. 7 and the correction history of the overlapping region Hb shown in FIG. 9 are stored in the ROM 12, the recognition unit 11D combines the shape of the correction region Hax (the region represented by the set of corrected pixels) based on the correction history of the overlapping region Ha with the shape of the correction region based on the correction history of the overlapping region Hb, as shown in FIG. 12, and recognizes the combined shape as the shape of the object. If this shape is the specific shape, the process of step S9 is performed.
  • In this way as well, the recognition unit 11D can recognize the shape of the object in front of the screen SC based on the two most recent correction histories stored in the ROM 12.
  • In the flowcharts of FIG. 5 and FIG. 11, the shape of the object is recognized using two correction histories, but three or more correction histories may be stored, and the shape of the object may be recognized by combining three or more shapes based on them. In this way, a more complicated shape can be recognized.
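  • A sketch of the step S10a variant, under the same pixel-set assumption: the object shape is simply the union of the correction regions of all stored correction histories, which extends naturally to three or more histories as noted above.

    def recognize_from_histories(histories: list) -> set:
        # Step S10a: combine the correction regions of every stored
        # correction history (two in FIG. 11; more for complex shapes).
        shape: set = set()
        for entry in histories:
            shape |= entry.pixels
        return shape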
  • The photographing unit 30 may instead be provided outside the projector 100. In that case, the system control unit 10 is communicably connected to a photographing unit 30 prepared separately from the projector 100.
  • Alternatively, the projection unit 20, the photographing unit 30, and an electronic device such as a personal computer may be prepared separately, the electronic device being installed so that it can communicate with the projection unit 20 and the photographing unit 30 and being given the functions of the system control unit 10. With this configuration, the system can be realized merely by enhancing a general-purpose projector. On the other hand, with the configuration of FIG. 1, in which the projection unit 20, the photographing unit 30, and the system control unit 10 are all provided in the same housing, the overlapping region can be corrected with high accuracy, and the cost required for system construction can be reduced compared with the case where the projection unit 20, the photographing unit 30, and the electronic device are prepared individually.
  • A projection control device comprising: an overlapping region detection unit that detects an overlapping region where an object overlaps the projection image, based on captured image data obtained by a photographing unit photographing the projection image projected from a projection unit; a correction unit that controls the projection unit to correct the overlapping region and stores a correction history indicating the content of the correction; a recognition unit that recognizes the shape of the object overlapping the projection image based on the correction history; and a display control unit that changes the projection image based on the shape of the object recognized by the recognition unit.
  • The projection control device may be one in which the recognition unit recognizes the shape of the object overlapping the projection image based on the overlapping region detected by the overlapping region detection unit and the correction history stored by the correction unit.
  • The projection control device may be one in which the recognition unit recognizes, as the shape of the object overlapping the projection image, a shape obtained by combining the shape of the correction region based on the correction history with the shape of the overlapping region.
  • The projection control device may be one in which the recognition unit recognizes the shape of the object overlapping the projection image based on a plurality of correction histories stored by the correction unit.
  • The projection control device may be one in which the recognition unit recognizes, as the shape of the object overlapping the projection image, a shape obtained by combining the shapes of the correction regions based on each of the plurality of correction histories.
  • A projection device comprising the projection control device according to any one of (1) to (6) above and the projection unit.
  • A projection control method comprising: an overlapping region detection step of detecting an overlapping region where an object overlaps the projection image, based on captured image data obtained by a photographing unit photographing the projection image projected from a projection unit; a correction step of controlling the projection unit to correct the overlapping region and storing a correction history indicating the content of the correction; a recognition step of recognizing the shape of the object overlapping the projection image based on the correction history; and a display control step of changing the projection image based on the shape of the object recognized in the recognition step.
  • The projection control method may be one in which the recognition step recognizes the shape of the object overlapping the projection image based on the overlapping region detected in the overlapping region detection step and the correction history stored in the correction step.
  • The projection control method according to (9) may be one in which the recognition step recognizes the shape of the object overlapping the projection image based on the plurality of correction histories stored in the correction step.
  • The projection control method according to (12) may be one in which the recognition step recognizes, as the shape of the object overlapping the projection image, a shape obtained by combining the shapes of the correction regions based on each of the plurality of correction histories.
  • A projection control program for causing a computer to execute: the above overlapping region detection step; the correction step; the recognition step; and the display control step of changing the projection image based on the shape of the object recognized in the recognition step.
  • According to the present invention, it is possible to provide a projection control device capable of achieving both improvement in the quality of a projection image and improvement in gesture recognition accuracy, a projection device including the projection control device, a projection control method, and a projection control program.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Projection Apparatus (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention provides a projection control device capable of achieving both improvement in the quality of a projected image and improvement in gesture recognition accuracy, a projection apparatus provided therewith, a projection control method, and a projection control program. A system control unit (10) is provided with: an overlap region detection unit (11B) which detects an overlap region overlapping an object within the projected image, on the basis of photographed image data obtained by photographing, with a photographing unit (30), the projected image projected from a projection unit (20); a correction unit (11C) which corrects the overlap region by controlling the projection unit (20) and stores a correction history indicating the contents of the correction in a ROM (12); a recognition unit (11D) which recognizes the shape of the object overlapping the projected image on the basis of the correction history in the ROM (12); and a display control unit (11E) which changes the projected image on the basis of the shape of the object recognized by the recognition unit (11D).

Description

投影制御装置、投影装置、投影制御方法、及び投影制御プログラムProjection control apparatus, projection apparatus, projection control method, and projection control program
 本発明は、投影制御装置、投影装置、投影制御方法、及び投影制御プログラムに関する。 The present invention relates to a projection control device, a projection device, a projection control method, and a projection control program.
 撮影機能を有する投影装置が知られている。特許文献1-3には、投影部によって投影された投影像を撮影部によって撮影し、撮影して得た撮影像に基づいて、投影像に人物の手等の物体が重なっていることを検出する投影装置が記載されている。 A projection device having a photographing function is known. In Patent Documents 1-3, the projection image projected by the projection unit is photographed by the photographing unit, and based on the photographed image obtained by photographing, it is detected that an object such as a human hand overlaps the projected image. A projection device is described.
 また、特許文献1には、投影像に物体が重なっていることが検出されると、この物体と重なる投影像の部分を補正して、投影像の見え方を改善することが記載されている。 Further, Patent Document 1 describes that when it is detected that an object overlaps the projection image, the portion of the projection image that overlaps the object is corrected to improve the appearance of the projection image. .
日本国特開2012-208439号公報Japanese Unexamined Patent Publication No. 2012-208439 日本国特開2012-113564号公報Japanese Unexamined Patent Publication No. 2012-113564 日本国特開2013-257686号公報Japanese Unexamined Patent Publication No. 2013-257686
 特許文献1に記載されているような補正を高精度に行うことのできる装置において、例えば、人物が指をすべて閉じた状態の手を投影像にかざした場合を想定する。この場合、上記の補正によって人物の手はほぼ見えなくなるよう補正される。この状態から、人物が、投影像にかざしている手の人差し指だけを開くと、撮影像によってこの人差し指の認識が可能になるため、再び、この人差し指を消すような補正が行われる。 In an apparatus capable of performing correction as described in Patent Document 1 with high accuracy, for example, a case is assumed in which a person holds a hand with all fingers closed over a projected image. In this case, the above correction is performed so that the human hand is almost invisible. In this state, when the person opens only the index finger of the hand held over the projected image, the index finger can be recognized from the photographed image, and thus correction for removing the index finger is performed again.
 このような補正を行う装置において、人物の手の形が特定の形になったことを認識して特定の処理を行う、いわゆるジェスチャ操作の機能を付加した場合を想定する。上述したように高精度の補正が行われると、実際には、人差し指が立てられた手が投影像に重なっているのにも関わらず、補正後の撮影像による物体の認識では、人差し指のみしか検出することができない。 Suppose that a device for performing such correction adds a so-called gesture operation function that recognizes that the shape of a person's hand has a specific shape and performs specific processing. As described above, when high-accuracy correction is performed, in reality, only the index finger is recognized when recognizing an object based on the corrected image, even though the hand with the index finger overlapped the projected image. It cannot be detected.
 このため、例えば、ジェスチャ認識される特定の形が、人差し指が立てられた手の形であった場合には、人物の行ったジェスチャ動作が正確に認識されないことになる。特許文献1-3には、こういったジェスチャ認識ができなくなるという課題については認識されていない。 For this reason, for example, when the specific shape recognized by the gesture is the shape of the hand with the index finger raised, the gesture operation performed by the person cannot be accurately recognized. Patent Documents 1-3 do not recognize the problem that such gesture recognition becomes impossible.
 本発明は、上記事情に鑑みてなされたものであり、投影像の品質向上とジェスチャ認識精度の向上とを両立させることのできる投影制御装置、これを備える投影装置、投影制御方法、及び投影制御プログラムを提供することを目的とする。 The present invention has been made in view of the above circumstances, and a projection control apparatus capable of achieving both improvement in the quality of a projected image and improvement in gesture recognition accuracy, a projection apparatus including the projection control apparatus, a projection control method, and projection control. The purpose is to provide a program.
 本発明の投影制御装置は、投影部から投影された投影像を撮影する撮影部、により上記投影像を撮影して得られる撮影画像データに基づいて、上記投影像における物体と重なっている重複領域を検出する重複領域検出部と、上記投影部を制御して、上記重複領域の補正を行い、上記補正の内容を示す補正履歴を記憶する補正部と、上記補正履歴に基づいて、上記投影像に重なる物体の形状を認識する認識部と、上記認識部により認識された上記物体の形状に基づいて、上記投影像を変更する表示制御部と、を備えるものである。 The projection control apparatus according to the present invention includes an overlapping region that overlaps an object in the projected image based on captured image data obtained by capturing the projected image by a capturing unit that captures the projected image projected from the projecting unit. An overlapping area detecting unit for detecting the correction area, correcting the overlapping area, correcting the overlapping area, storing a correction history indicating the content of the correction, and the projection image based on the correction history A recognition unit for recognizing the shape of an object overlapping with the display unit, and a display control unit for changing the projection image based on the shape of the object recognized by the recognition unit.
 本発明の投影装置は、上記投影制御装置と、上記投影部と、を備えるものである。 The projection device of the present invention includes the projection control device and the projection unit.
 本発明の投影装置は、上記投影制御装置と、上記投影部と、上記撮影部と、を備えるものである。 The projection device of the present invention includes the projection control device, the projection unit, and the photographing unit.
 本発明の投影制御方法は、投影部から投影された投影像を撮影する撮影部、により上記投影像を撮影して得られる撮影画像データに基づいて、上記投影像における物体と重なっている重複領域を検出する重複領域検出ステップと、上記投影部を制御して、上記重複領域の補正を行い、上記補正の内容を示す補正履歴を記憶する補正ステップと、上記補正履歴に基づいて、上記投影像に重なる物体の形状を認識する認識ステップと、上記認識ステップにより認識された上記物体の形状に基づいて、上記投影像を変更する表示制御ステップと、を備えるものである。 According to the projection control method of the present invention, an overlapping region that overlaps an object in the projected image based on captured image data obtained by capturing the projected image by a capturing unit that captures the projected image projected from the projecting unit. An overlapping area detecting step for detecting the correction area, a correction step for controlling the projection unit to correct the overlapping area and storing a correction history indicating the content of the correction, and the projection image based on the correction history. A recognition step for recognizing the shape of the object that overlaps the display, and a display control step for changing the projected image based on the shape of the object recognized by the recognition step.
 本発明の投影制御プログラムは、投影部から投影された投影像を撮影する撮影部、により上記投影像を撮影して得られる撮影画像データに基づいて、上記投影像における物体と重なっている重複領域を検出する重複領域検出ステップと、上記投影部を制御して、上記重複領域の補正を行い、上記補正の内容を示す補正履歴を記憶する補正ステップと、上記補正履歴に基づいて、上記投影像に重なる物体の形状を認識する認識ステップと、上記認識ステップにより認識された上記物体の形状に基づいて、上記投影像を変更する表示制御ステップと、をコンピュータに実行させるためのものである。 The projection control program of the present invention includes an overlapping region that overlaps an object in the projected image based on captured image data obtained by capturing the projected image by a capturing unit that captures the projected image projected from the projecting unit. An overlapping area detecting step for detecting the correction area, a correction step for controlling the projection unit to correct the overlapping area and storing a correction history indicating the content of the correction, and the projection image based on the correction history. And a display control step for changing the projection image based on the shape of the object recognized by the recognition step.
 本発明によれば、投影像の品質向上とジェスチャ認識精度の向上とを両立させることのできる投影制御装置、これを備える投影装置、投影制御方法、及び投影制御プログラムを提供することができる。 According to the present invention, it is possible to provide a projection control device capable of achieving both improvement in the quality of a projection image and improvement in gesture recognition accuracy, a projection device including the projection control device, a projection control method, and a projection control program.
本発明の投影装置の一実施形態であるプロジェクタ100の概略構成を示す模式図である。1 is a schematic diagram showing a schematic configuration of a projector 100 that is an embodiment of a projection apparatus of the present invention. 図1に示すプロジェクタ100の表示部1の構成例を示す模式図である。It is a schematic diagram which shows the structural example of the display part 1 of the projector 100 shown in FIG. 図1に示すプロジェクタ100のシステム制御部10の内部構成を示すブロック図である。It is a block diagram which shows the internal structure of the system control part 10 of the projector 100 shown in FIG. 図3に示す制御部11の機能ブロック図である。It is a functional block diagram of the control part 11 shown in FIG. 図1に示すプロジェクタ100の動作を説明するためのフローチャートである。3 is a flowchart for explaining an operation of projector 100 shown in FIG. 1. プロジェクタ100の使用者がジェスチャによって投影像の変更を行う場合の投影像の遷移状態を示す模式図である。It is a schematic diagram which shows the transition state of a projection image in case the user of the projector 100 changes a projection image by gesture. プロジェクタ100の使用者がジェスチャによって投影像の変更を行う場合の投影像の遷移状態を示す模式図である。It is a schematic diagram which shows the transition state of a projection image in case the user of the projector 100 changes a projection image by gesture. プロジェクタ100の使用者がジェスチャによって投影像の変更を行う場合の投影像の遷移状態を示す模式図である。It is a schematic diagram which shows the transition state of a projection image in case the user of the projector 100 changes a projection image by gesture. プロジェクタ100の使用者がジェスチャによって投影像の変更を行う場合の投影像の遷移状態を示す模式図である。It is a schematic diagram which shows the transition state of a projection image in case the user of the projector 100 changes a projection image by gesture. プロジェクタ100の使用者がジェスチャによって投影像の変更を行う場合の投影像の遷移状態を示す模式図である。It is a schematic diagram which shows the transition state of a projection image in case the user of the projector 100 changes a projection image by gesture. 図1に示すプロジェクタ100の動作の変形例を説明するためのフローチャートである。10 is a flowchart for explaining a modification of the operation of projector 100 shown in FIG. 1. 2つの補正履歴に基づく補正領域の形状を組み合わせた状態を示す模式図である。It is a schematic diagram which shows the state which combined the shape of the correction area | region based on two correction log | history.
 図1は、本発明の投影装置の一実施形態であるプロジェクタ100の概略構成を示す模式図である。図2は、図1に示すプロジェクタ100の表示部1の構成例を示す模式図である。 FIG. 1 is a schematic diagram showing a schematic configuration of a projector 100 which is an embodiment of the projection apparatus of the present invention. FIG. 2 is a schematic diagram illustrating a configuration example of the display unit 1 of the projector 100 illustrated in FIG. 1.
 プロジェクタ100は、スクリーンSCに画像を投影すると共に、スクリーンSCに投影された投影像を含む範囲を撮影可能に構成されている。 The projector 100 is configured to project an image onto the screen SC and to capture a range including the projected image projected onto the screen SC.
 プロジェクタ100は、表示部1と、投影光学系2と、共通光学系3と、光学部材4と、撮影光学系5と、撮像素子6と、全体を統括制御するシステム制御部10と、を備える。表示部1、投影光学系2、光学部材4、及び共通光学系3により投影部20が構成されている。撮像素子6、撮影光学系5、光学部材4、及び共通光学系3により撮影部30が構成されている。 The projector 100 includes a display unit 1, a projection optical system 2, a common optical system 3, an optical member 4, a photographing optical system 5, an image sensor 6, and a system control unit 10 that performs overall control. . The display unit 1, the projection optical system 2, the optical member 4, and the common optical system 3 constitute a projection unit 20. An imaging unit 30 is configured by the imaging element 6, the imaging optical system 5, the optical member 4, and the common optical system 3.
 表示部1は、入力される投影用の画像データに基づいて投影用の画像を表示するものである。図2に示すように、表示部1は、光源ユニット50と、光変調素子44と、を備える。 The display unit 1 displays a projection image based on the input projection image data. As shown in FIG. 2, the display unit 1 includes a light source unit 50 and a light modulation element 44.
 光源ユニット50は、赤色光を出射する赤色光源であるR光源41rと、緑色光を出射する緑色光源であるG光源41gと、青色光を出射する青色光源であるB光源41bと、ダイクロイックプリズム43と、R光源41rとダイクロイックプリズム43の間に設けられたコリメータレンズ42rと、G光源41gとダイクロイックプリズム43の間に設けられたコリメータレンズ42gと、B光源41bとダイクロイックプリズム43の間に設けられたコリメータレンズ42bと、を備えている。 The light source unit 50 includes an R light source 41r that is a red light source that emits red light, a G light source 41g that is a green light source that emits green light, a B light source 41b that is a blue light source that emits blue light, and a dichroic prism 43. A collimator lens 42 r provided between the R light source 41 r and the dichroic prism 43, a collimator lens 42 g provided between the G light source 41 g and the dichroic prism 43, and a B light source 41 b and the dichroic prism 43. A collimator lens 42b.
 ダイクロイックプリズム43は、R光源41r、G光源41g、及びB光源41bの各々から出射される光を同一光路に導くための光学部材である。すなわち、ダイクロイックプリズム43は、コリメータレンズ42rによって平行光化された赤色光を透過させて光変調素子44に出射する。また、ダイクロイックプリズム43は、コリメータレンズ42gによって平行光化された緑色光を反射させて光変調素子44に出射する。さらに、ダイクロイックプリズム43は、コリメータレンズ42bによって平行光化された青色光を反射させて光変調素子44に出射する。このような機能を持つ光学部材としては、ダイクロイックプリズムに限らない。例えば、クロスダイクロイックミラーを用いてもよい。 The dichroic prism 43 is an optical member for guiding light emitted from each of the R light source 41r, the G light source 41g, and the B light source 41b to the same optical path. That is, the dichroic prism 43 transmits the red light that has been collimated by the collimator lens 42 r and emits the red light to the light modulation element 44. The dichroic prism 43 reflects the green light that has been collimated by the collimator lens 42 g and emits it to the light modulation element 44. Further, the dichroic prism 43 reflects the blue light that has been collimated by the collimator lens 42 b and emits it to the light modulation element 44. The optical member having such a function is not limited to the dichroic prism. For example, a cross dichroic mirror may be used.
 R光源41r、G光源41g、及びB光源41bは、それぞれ、レーザ又はLED(Light Emitting Diode)等の発光素子が用いられる。光源ユニット50に含まれる光源の数は1つ、2つ、又は4つ以上であってもよい。 Each of the R light source 41r, the G light source 41g, and the B light source 41b uses a light emitting element such as a laser or an LED (Light Emitting Diode). The number of light sources included in the light source unit 50 may be one, two, or four or more.
 光変調素子44は、ダイクロイックプリズム43から出射された光を画像データに基づいて空間変調し、空間変調した画像光(赤色画像光、青色画像光、及び緑色画像光)を図1の投影光学系2に出射する。 The light modulation element 44 spatially modulates the light emitted from the dichroic prism 43 based on the image data, and the spatially modulated image light (red image light, blue image light, and green image light) is the projection optical system in FIG. 2 is emitted.
 図2は、光変調素子44としてDMD(Digital Micromirror Device)を用いた例であるが、光変調素子44としては、例えば、LCOS(Liquid crystal on silicon)、MEMS(Micro Electro Mechanical Systems)素子、又は液晶表示素子等を用いることも可能である。 FIG. 2 shows an example in which a DMD (Digital Micromirror Device) is used as the light modulation element 44. As the light modulation element 44, for example, LCOS (Liquid crystal on silicon), MEMS (Micro Electro Mechanical System element, or the like). It is also possible to use a liquid crystal display element or the like.
 表示部1は、自発光型の有機EL(electro-luminescence)表示素子を用いて画像を表示し、表示した画像を投影光学系2に入射させるものであってもよい。また、レーザ光を走査することで画像の表示を行うものを用いてもよい。 The display unit 1 may display an image using a self-luminous organic EL (electro-luminescence) display element, and may cause the displayed image to enter the projection optical system 2. Alternatively, an apparatus that displays an image by scanning with laser light may be used.
 光学部材4は、例えばハーフミラー、ビームスプリッタ―、又は偏光部材等によって構成されている。光学部材4は、投影光学系2を通過した画像光を透過させて共通光学系3に導き、且つ、共通光学系3を通過した被写体光を反射させて撮影光学系5に導く。 The optical member 4 is composed of, for example, a half mirror, a beam splitter, or a polarizing member. The optical member 4 transmits the image light that has passed through the projection optical system 2 and guides it to the common optical system 3, and reflects the subject light that has passed through the common optical system 3 and guides it to the photographing optical system 5.
 投影光学系2は、表示部1からの画像光が入射される光学系であり、少なくとも1つのレンズを含む。投影光学系2を通過した光は光学部材4に入射され、光学部材4を透過して共通光学系3に入射される。 The projection optical system 2 is an optical system into which image light from the display unit 1 is incident, and includes at least one lens. The light that has passed through the projection optical system 2 enters the optical member 4, passes through the optical member 4, and enters the common optical system 3.
 共通光学系3は、投影光学系2を通過した光を投影対象物であるスクリーンSCに投影し、且つ、スクリーンSC側の被写体(投影部20によって投影された投影像を含む範囲の被写体)を結像させる光学系であり、少なくとも1つのレンズを含んで構成されている。 The common optical system 3 projects the light that has passed through the projection optical system 2 onto the screen SC that is a projection target, and subjects the screen SC (subjects in a range including the projection image projected by the projection unit 20). An optical system that forms an image, and includes at least one lens.
 スクリーンSC側から共通光学系3に入射した被写体光は、共通光学系3を通過し、光学部材4にて反射されて、撮影光学系5に入射される。 The subject light incident on the common optical system 3 from the screen SC side passes through the common optical system 3, is reflected by the optical member 4, and enters the photographing optical system 5.
 撮影光学系5は、共通光学系3を通過して光学部材4にて反射した被写体光を撮像素子6に結像させるための光学系である。撮影光学系5は、撮像素子6の前方に配置されており、光学部材4を透過した被写体光を集光して撮像素子6に結像させる。撮影光学系5は、少なくとも1つのレンズを含んで構成されている。 The photographing optical system 5 is an optical system for forming an image of subject light that has passed through the common optical system 3 and reflected by the optical member 4 on the image sensor 6. The photographing optical system 5 is disposed in front of the image sensor 6 and focuses the subject light transmitted through the optical member 4 to form an image on the image sensor 6. The photographing optical system 5 includes at least one lens.
 撮像素子6は、CCD(Charge Coupled Device)イメージセンサ又はCMOS(Complementary Metal Oxide Semiconductor)イメージセンサ等が用いられる。 The image sensor 6 is a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
 撮影部30による撮影範囲は、投影部20によるスクリーンSCへの画像の投影範囲を含む範囲に設定されている。したがって、撮影部30は、投影部20によって投影された投影像の全体を撮影することができる。 The imaging range by the imaging unit 30 is set to a range including the projection range of the image on the screen SC by the projection unit 20. Therefore, the photographing unit 30 can photograph the entire projection image projected by the projection unit 20.
 図3は、図1に示すプロジェクタ100のシステム制御部10の内部構成を示すブロック図である。 FIG. 3 is a block diagram showing an internal configuration of the system control unit 10 of the projector 100 shown in FIG.
 システム制御部10は、制御部11と、ROM(Read Only Memory)12と、RAM(Random Accsess Memory)13と、を備える。 The system control unit 10 includes a control unit 11, a ROM (Read Only Memory) 12, and a RAM (Random Access Memory) 13.
 制御部11は、ハードウェア的な構造は、各種のプロセッサである。各種のプロセッサとしては、プログラムを実行して各種処理を行う汎用的なプロセッサであるCPU(Central Prosessing Unit)、FPGA(Field Programmable Gate Array)等の製造後に回路構成を変更可能なプロセッサであるプログラマブルロジックデバイス(Programmable Logic Device:PLD)、又はASIC(Application Specific Integrated Circuit)等の特定の処理を実行させるために専用に設計された回路構成を有するプロセッサである専用電気回路等が含まれる。 The control unit 11 has various hardware structures. As various processors, programmable logic, which is a processor whose circuit configuration can be changed after manufacturing, such as a CPU (Central Processing Unit) and an FPGA (Field Programmable Gate Array), which are general-purpose processors that execute programs and perform various processes Examples include a dedicated electrical circuit that is a processor having a circuit configuration that is specifically designed to execute a specific process such as a device (Programmable Logic Device: PLD) or an ASIC (Application Specific Integrated Circuit).
 これら各種のプロセッサの構造は、より具体的には、半導体素子等の回路素子を組み合わせた電気回路である。 More specifically, the structures of these various processors are electric circuits in which circuit elements such as semiconductor elements are combined.
 制御部11は、各種のプロセッサのうちの1つで構成されてもよいし、同種又は異種の2つ以上のプロセッサの組み合わせ(例えば、複数のFPGAの組み合わせ又はCPUとFPGAの組み合わせ)で構成されてもよい。 The control unit 11 may be composed of one of various processors, or a combination of two or more processors of the same type or different types (for example, a combination of a plurality of FPGAs or a combination of a CPU and an FPGA). May be.
 制御部11は、表示部1の光変調素子44に入力する投影用の画像データを制御することで、投影部20からスクリーンSCに投影させる投影像の内容、投影像の各画素の輝度、又は投影像の各画素の彩度等を制御する。また、制御部11は、撮像素子6を制御して、撮像素子6による被写体の撮影を実行する。 The control unit 11 controls the image data for projection input to the light modulation element 44 of the display unit 1, so that the content of the projection image projected from the projection unit 20 onto the screen SC, the luminance of each pixel of the projection image, or The saturation of each pixel of the projected image is controlled. In addition, the control unit 11 controls the image sensor 6 and performs shooting of the subject by the image sensor 6.
 図4は、図3に示す制御部11の機能ブロック図である。制御部11は、投影制御プログラムを含むアプリケーションプログラムを実行することにより、画像処理部11A、重複領域検出部11B、補正部11C、認識部11D、及び表示制御部11Eを備える投影制御装置として機能する。 FIG. 4 is a functional block diagram of the control unit 11 shown in FIG. The control unit 11 functions as a projection control apparatus including an image processing unit 11A, an overlapping area detection unit 11B, a correction unit 11C, a recognition unit 11D, and a display control unit 11E by executing an application program including a projection control program. .
 画像処理部11Aは、撮像素子6が被写体(投影部20によって投影された投影像を含む範囲)を撮像して得た撮像画像信号を取得し、これを処理して撮影画像データを生成する。 The image processing unit 11A acquires a captured image signal obtained by the imaging element 6 capturing an image of a subject (a range including a projection image projected by the projection unit 20), and processes this to generate captured image data.
 重複領域検出部11Bは、画像処理部11Aによって生成された撮影画像データに基づいて、投影部20によってスクリーンSCに投影された投影像における人の手等の物体と重なっている重複領域を検出する。 Based on the captured image data generated by the image processing unit 11A, the overlapping region detection unit 11B detects an overlapping region that overlaps an object such as a human hand in the projection image projected on the screen SC by the projection unit 20. .
 例えば、重複領域検出部11Bは、表示部1の光変調素子44に入力している投影用の画像データと、画像処理部11Aによって生成された撮影画像データとで領域毎の類似度の判定を行い、類似度が閾値以下となる領域を重複領域として検出する。類似度は、例えば、画像に含まれるエッジ等の特徴点の位置又は数の一致度合いを示す指標である。画像が平面であるスクリーンSCに投影された状態と、この画像が立体物に投影された状態とでは、その画像を撮影したときに、その画像に含まれる特徴点の位置がずれたり、特徴点の数が増減したりするため、この一致度を見ることで、重複領域の有無を判定可能である。 For example, the overlapping area detection unit 11B determines the similarity for each area from the image data for projection input to the light modulation element 44 of the display unit 1 and the captured image data generated by the image processing unit 11A. The region where the similarity is equal to or less than the threshold is detected as an overlapping region. The similarity is, for example, an index indicating the degree of coincidence of the positions or numbers of feature points such as edges included in the image. When the image is projected onto a plane screen SC and when the image is projected onto a three-dimensional object, the position of the feature points included in the image may be shifted or Therefore, the presence / absence of an overlapping region can be determined by looking at the degree of coincidence.
 When an object overlaps the projection image, the portion of the projection image that falls on the object is distorted. Exploiting this, the overlapping-region detection unit 11B may instead detect a region of the captured image data in which distortion has occurred as an overlapping region.
 When the overlapping-region detection unit 11B detects an overlapping region, the correction unit 11C controls the projection unit 20 to correct the overlapping region, and stores a correction history indicating the content of the correction in the ROM 12.
 As described above, in the overlapping region of the projection image data, the positions or the number of feature points deviate greatly from the original data, or distortion has occurred. The correction unit 11C corrects the overlapping region of the projection image by changing at least one of the luminance and the saturation of the overlapping region in the projection image data, or by applying distortion correction to the overlapping region. By this correction, the overlapping region that would otherwise appear in the projection image on the screen SC is made to look the same as when no object is present.
 When the correction unit 11C performs such a correction, it generates correction-history information indicating which pixels of the projection image data were corrected and stores this information in the ROM 12. The correction-history information includes, for example, the time at which the correction was performed and the coordinates of the corrected pixels (information indicating the corrected region).
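 A minimal sketch of the correction and its history record follows. The record layout (a timestamp plus a boolean pixel mask in place of an explicit coordinate list) and the fixed luminance gain are assumptions, standing in for whatever compensation the correction unit 11C actually applies.

```python
import time
from dataclasses import dataclass

import numpy as np

@dataclass
class CorrectionRecord:
    """One correction-history entry: when the correction ran and which
    pixels of the projection image data it touched."""
    timestamp: float
    corrected_mask: np.ndarray  # True at corrected pixel coordinates

def correct_overlap(image, mask, luminance_gain=0.5):
    """Return the corrected projection data plus its history record.
    Only the luminance of the overlap region is changed here; the text
    also allows saturation changes or distortion correction."""
    corrected = image.astype(np.float64)
    corrected[mask] *= luminance_gain
    return corrected.astype(image.dtype), CorrectionRecord(time.time(), mask.copy())
```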
 The recognition unit 11D recognizes the shape of an object overlapping the projection image projected by the projection unit 20 on the basis of the correction history stored in the ROM 12. This recognition method is described later.
 The display control unit 11E changes the projection image projected from the projection unit 20 onto the screen SC on the basis of the shape of the object recognized by the recognition unit 11D.
 Specifically, the display control unit 11E determines whether the shape of the object recognized by the recognition unit 11D is a predetermined specific shape and, if so, changes the projection image projected from the projection unit 20 onto the screen SC. The display control unit 11E changes the projection image by switching the projection image data input to the light modulation element 44 of the display unit 1 to different data.
 This specific shape is stored in advance in the ROM 12 as a shape for instructing the projector 100 to change the projection image (for example, an instruction to change the displayed image during a presentation, or an instruction to start an animation contained in the displayed image).
 FIG. 5 is a flowchart for explaining the operation of the projector 100 shown in FIG. 1. FIGS. 6 to 10 are schematic diagrams showing transition states of the projection image when the user of the projector 100 changes the projection image by a gesture.
 First, the control unit 11 causes the projection unit 20 to project a projection image based on the projection image data designated by the user onto the screen SC (step S1). FIG. 6 shows the state in which the projection image 40 is projected onto the screen SC in step S1.
 When projection starts in step S1, the control unit 11 causes the imaging unit 30 to photograph the projection image projected onto the screen SC (step S2).
 When the imaging element 6 outputs a captured image signal as a result of the photographing in step S2, the image processing unit 11A processes the signal to generate captured image data, and the overlapping-region detection unit 11B acquires this captured image data (step S3).
 The overlapping-region detection unit 11B compares the acquired captured image data with the projection image data currently input to the display unit 1 and detects an overlapping region from the comparison result. If no overlapping region is detected (step S4: NO), the process returns to step S2.
 If an overlapping region is detected (step S4: YES), the correction unit 11C corrects the overlapping-region portion of the projection image data currently input to the display unit 1 and causes the projection unit 20 to project a projection image based on the corrected projection image data onto the screen SC. The correction unit 11C also stores correction-history information indicating the content of this correction in the ROM 12 (step S5).
 FIG. 7 shows a state in which, after step S1, a human hand H (with all fingers closed) has been inserted in front of the range of the screen SC onto which the projection image 40 is projected. In the state shown in FIG. 7, the overlapping region Ha (the hatched region) of the projection image 40 that overlaps the hand H is detected in step S4.
 When the overlapping region Ha is detected in this way, the portion corresponding to the overlapping region Ha in the projection image data underlying the projection image 40 is corrected by the processing of step S5, and, as shown in FIG. 8, the hand H becomes invisible in the corrected projection image 40x.
 When the correction shown in FIG. 8 is performed, the coordinates of the pixels of the projection image data detected as the overlapping region Ha and the time at which the correction was performed are stored in the ROM 12 as a correction history in step S5. Note that the ROM 12 stores only the two most recent correction histories; older correction histories are overwritten.
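 The overwrite behavior of this two-entry history store can be reproduced with a bounded deque; storing each record as a (timestamp, mask) tuple is our simplification of the ROM 12 layout, not part of the disclosure.

```python
from collections import deque

# The ROM 12 keeps only the two newest correction histories, so appending
# a third record silently discards the oldest one.
correction_histories = deque(maxlen=2)

def store_history(timestamp, corrected_mask):
    correction_histories.append((timestamp, corrected_mask))
```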
 After step S5, the recognition unit 11D determines whether two correction histories are stored in the ROM 12 (step S6). If two correction histories are not stored (step S6: NO), the recognition unit 11D recognizes the shape of the overlapping region detected in step S4 as the shape of the object overlapping the projection image projected onto the screen SC (step S7).
 The recognition unit 11D then determines whether the shape of the object recognized in step S7 is the specific shape described above (step S8). If the recognized shape is determined not to be the specific shape (step S8: NO), the process returns to step S2.
 For example, assume that the specific shape is registered as the shape of a hand in which only the index finger has been extended from the state in which all fingers are closed (the state of the hand H in FIG. 7). If the overlapping region detected in step S4 is the one shown in FIG. 7, the determination in step S8 is NO and the process returns to step S2.
 Subsequently, when the index finger F of the hand H shown in FIG. 8 is extended and the state changes to that shown in FIG. 9, the overlapping region Hb between the index finger F and the projection image 40x is detected in step S4, this overlapping region Hb is corrected in step S5, and correction-history information indicating that the overlapping region Hb was corrected is stored in the ROM 12. In this state, two correction histories are stored in the ROM 12, so the determination in step S6 is YES.
 When two correction histories are stored in the ROM 12 in this way (step S6: YES), the recognition unit 11D recognizes, as the shape of the object overlapping the projection image projected onto the screen SC, the shape obtained by combining the shape of the overlapping region detected in step S4 (the overlapping region Hb in FIG. 9) with the shape of the correction region based on the older of the two correction histories (the information indicating the correction of the overlapping region Ha in FIG. 8) (step S10).
 That is, in step S10, the shape obtained by combining the overlapping region Ha shown in FIG. 7 with the overlapping region Hb shown in FIG. 9 is recognized as the shape of the object. This combined shape matches the shape of a hand in which only the index finger has been extended from the all-fingers-closed state. The determination in step S8 is therefore YES, and the processing of step S9 is performed.
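 Step S10 might be sketched as follows: the newly detected overlap mask (Hb) is merged with the region restored from the older of the two stored histories (Ha), and the union is compared with the registered specific shape. Intersection-over-union is our assumed comparison metric, and the (timestamp, mask) tuples follow the deque sketch above; the disclosure does not fix how the shapes are matched.

```python
import numpy as np

def recognize_combined_shape(new_overlap, histories, specific_shape, min_iou=0.8):
    """histories: iterable of (timestamp, mask) tuples, assumed non-empty
    (two entries in the step-S10 scenario). Returns True when the
    combined region matches the registered specific shape."""
    older_mask = min(histories, key=lambda rec: rec[0])[1]  # region Ha
    combined = np.logical_or(new_overlap, older_mask)       # Ha plus Hb
    intersection = np.logical_and(combined, specific_shape).sum()
    union = np.logical_or(combined, specific_shape).sum()
    return union > 0 and intersection / union >= min_iou
```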
 In step S9, the display control unit 11E changes the projection image data input to the display unit 1 to data designated in advance and erases the correction histories stored in the ROM 12. After step S9, the process returns to step S1, and, as shown in FIG. 10, the changed projection image 40A is projected onto the screen SC.
 Note that when the display control unit 11E changes the projection image data in step S9, it applies, to the region of the changed projection image recognized as the specific shape (the region combining the overlapping region Ha and the overlapping region Hb), the same correction as that performed by the correction unit 11C. As a result, as shown in FIG. 10, the region of the projection image 40A that overlaps the hand H remains corrected so that the hand H cannot be seen.
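 The handling of the recognized region in step S9 might be sketched as below, reusing the assumed luminance correction from the earlier snippet; the gain value is again a stand-in for the actual compensation of the correction unit 11C.

```python
import numpy as np

def switch_projection_image(new_image, recognized_mask, luminance_gain=0.5):
    """Swap in the pre-designated image data while keeping the region
    recognized as the specific shape (Ha plus Hb) corrected, so the hand
    stays invisible in the changed projection image 40A."""
    out = new_image.astype(np.float64)
    out[recognized_mask] *= luminance_gain
    return out.astype(new_image.dtype)
```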
 As described above, according to the projector 100, when an object overlaps the projection image projected onto the screen SC, the overlapping region is corrected, so a display that does not make the viewer aware of the object's presence becomes possible.
 Once such a correction has been performed, the overlapping region is no longer detected from the captured image data obtained by the imaging unit 30, even though the overlapping region continues to exist. However, as shown in FIG. 9, when a new overlapping region Hb is detected while a correction history indicating that the overlapping region Ha was corrected is stored, it can be determined, from the shape combining this overlapping region Hb with the correction region based on the correction history, that the human hand H has taken the specific shape.
 Therefore, even when the correction unit 11C performs correction so accurate that the hand H becomes invisible, the shape of a human hand held in front of the screen SC can be recognized on the basis of the correction-history information stored in the ROM 12, and gesture operation can continue. In this way, the projector 100 achieves both improved projection-image quality and improved gesture-recognition accuracy.
 FIG. 11 is a flowchart for explaining a modification of the operation of the projector 100 shown in FIG. 1. The flowchart of FIG. 11 replaces step S10 of the flowchart of FIG. 5 with step S10a. In FIG. 11, steps identical to those in FIG. 5 are given the same reference numerals and their description is omitted.
 In step S10a, the recognition unit 11D recognizes the shape obtained by combining the shapes of the correction regions based on each of the two correction histories stored in the ROM 12 as the shape of the object overlapping the projection image projected onto the screen SC.
 For example, assume that the ROM 12 stores the correction history of the overlapping region Ha shown in FIG. 7 and the correction history of the overlapping region Hb shown in FIG. 9. In this case, as shown in FIG. 12, if the shape obtained by combining the shape of the correction region Hax (the region represented by the set of corrected pixels) based on the correction history of the overlapping region Ha with the shape of the correction region Hbx (the region represented by the set of corrected pixels) based on the correction history of the overlapping region Hb matches the specific shape, the processing of step S9 is performed.
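 This step-S10a variant rebuilds the object shape from the stored correction regions alone. A sketch under the same (timestamp, mask) assumption follows; passing three or more masks gives the extension discussed below.

```python
import numpy as np

def shape_from_histories(histories, specific_shape, min_iou=0.8):
    """Union the correction regions (Hax, Hbx, ...) from every stored
    history and compare the result with the registered specific shape;
    masks are assumed to be boolean arrays of equal size."""
    combined = np.zeros_like(specific_shape, dtype=bool)
    for _timestamp, mask in histories:
        combined |= mask
    intersection = np.logical_and(combined, specific_shape).sum()
    union = np.logical_or(combined, specific_shape).sum()
    return union > 0 and intersection / union >= min_iou
```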
 In this way, the recognition unit 11D can also determine the shape of the object in front of the screen SC on the basis of the two most recent correction histories stored in the ROM 12.
 In the example of FIG. 11, the shape of the object is recognized using two correction histories in step S10a; however, the shape of the object may instead be recognized by combining three or more shapes based on three or more correction histories. This makes it possible to recognize more complex shapes.
 The projector 100 described above comprises the projection unit 20, the imaging unit 30, and the system control unit 10, but the imaging unit 30 may instead be provided externally. In that case, the system control unit 10 is communicably connected to an imaging unit 30 prepared separately from the projector 100.
 Alternatively, the projection unit 20, the imaging unit 30, and an electronic device such as a personal computer may be prepared and installed separately, with the electronic device communicably connected to the projection unit 20 and the imaging unit 30 and given the functions of the system control unit 10. According to this configuration, the system can be realized merely by modifying a general-purpose projector, which keeps down the cost of building the system. On the other hand, with the configuration shown in FIG. 1, in which the projection unit 20, the imaging unit 30, and the system control unit 10 are all contained in the same housing, the overlapping region can be corrected with high accuracy, and the cost of building the system can be reduced compared with preparing the projection unit 20, the imaging unit 30, and an electronic device separately.
 As described above, this specification discloses the following.
(1)
 A projection control device comprising:
 an overlapping-region detection unit that detects an overlapping region of a projection image that overlaps an object, on the basis of captured image data obtained by an imaging unit that photographs the projection image projected from a projection unit;
 a correction unit that controls the projection unit to correct the overlapping region and stores a correction history indicating the content of the correction;
 a recognition unit that recognizes, on the basis of the correction history, the shape of an object overlapping the projection image; and
 a display control unit that changes the projection image on the basis of the shape of the object recognized by the recognition unit.
(2)
 The projection control device according to (1), wherein the recognition unit recognizes the shape of the object overlapping the projection image on the basis of the overlapping region detected by the overlapping-region detection unit and the correction history stored by the correction unit.
(3)
 The projection control device according to (2), wherein the recognition unit recognizes, as the shape of the object overlapping the projection image, the shape obtained by combining the shape of the correction region based on the correction history with the shape of the overlapping region.
(4)
 The projection control device according to (1), wherein the recognition unit recognizes the shape of the object overlapping the projection image on the basis of a plurality of the correction histories stored by the correction unit.
(5)
 The projection control device according to (4), wherein the recognition unit recognizes, as the shape of the object overlapping the projection image, the shape obtained by combining the shapes of the correction regions based on each of the plurality of correction histories.
(6)
 The projection control device according to any one of (1) to (5), wherein the correction history includes information indicating the region of the projection image in which the correction was performed.
(7)
 A projection device comprising the projection control device according to any one of (1) to (6) and the projection unit.
(8)
 A projection device comprising the projection control device according to any one of (1) to (6), the projection unit, and the imaging unit.
(9)
 A projection control method comprising:
 an overlapping-region detection step of detecting an overlapping region of a projection image that overlaps an object, on the basis of captured image data obtained by an imaging unit that photographs the projection image projected from a projection unit;
 a correction step of controlling the projection unit to correct the overlapping region and storing a correction history indicating the content of the correction;
 a recognition step of recognizing, on the basis of the correction history, the shape of an object overlapping the projection image; and
 a display control step of changing the projection image on the basis of the shape of the object recognized in the recognition step.
(10)
 The projection control method according to (9), wherein the recognition step recognizes the shape of the object overlapping the projection image on the basis of the overlapping region detected in the overlapping-region detection step and the correction history stored in the correction step.
(11)
 The projection control method according to (10), wherein the recognition step recognizes, as the shape of the object overlapping the projection image, the shape obtained by combining the shape of the correction region based on the correction history with the shape of the overlapping region.
(12)
 The projection control method according to (9), wherein the recognition step recognizes the shape of the object overlapping the projection image on the basis of a plurality of the correction histories stored in the correction step.
(13)
 The projection control method according to (12), wherein the recognition step recognizes, as the shape of the object overlapping the projection image, the shape obtained by combining the shapes of the correction regions based on each of the plurality of correction histories.
(14)
 The projection control method according to any one of (9) to (13), wherein the correction history includes information indicating the region of the projection image in which the correction was performed.
(15)
 A projection control program for causing a computer to execute:
 an overlapping-region detection step of detecting an overlapping region of a projection image that overlaps an object, on the basis of captured image data obtained by an imaging unit that photographs the projection image projected from a projection unit;
 a correction step of controlling the projection unit to correct the overlapping region and storing a correction history indicating the content of the correction;
 a recognition step of recognizing, on the basis of the correction history, the shape of an object overlapping the projection image; and
 a display control step of changing the projection image on the basis of the shape of the object recognized in the recognition step.
 Various embodiments have been described above with reference to the drawings, but it goes without saying that the present invention is not limited to these examples. It is clear that those skilled in the art can conceive of various changes and modifications within the scope of the claims, and these naturally belong to the technical scope of the present invention. Furthermore, the constituent elements of the above embodiments may be combined arbitrarily without departing from the spirit of the invention.
 This application is based on a Japanese patent application filed on March 16, 2018 (Japanese Patent Application No. 2018-049878), the contents of which are incorporated herein by reference.
 According to the present invention, it is possible to provide a projection control device capable of achieving both improved projection-image quality and improved gesture-recognition accuracy, as well as a projection device including the same, a projection control method, and a projection control program.
DESCRIPTION OF SYMBOLS
100 Projector
1 Display unit
50 Light source unit
41r R light source
41g G light source
41b B light source
42r, 42g, 42b Collimator lens
43 Dichroic prism
44 Light modulation element
2 Projection optical system
3 Common optical system
4 Optical member
5 Imaging optical system
6 Imaging element
10 System control unit
11 Control unit
11A Image processing unit
11B Overlapping-region detection unit
11C Correction unit
11D Recognition unit
11E Display control unit
12 ROM
13 RAM
20 Projection unit
30 Imaging unit
SC Screen
40, 40x, 40A Projection image
H Hand
F Index finger
Ha, Hb Overlapping region
Hbx, Hax Shape based on correction history

Claims (15)

  1.  A projection control device comprising:
     an overlapping-region detection unit that detects an overlapping region of a projection image that overlaps an object, on the basis of captured image data obtained by an imaging unit that photographs the projection image projected from a projection unit;
     a correction unit that controls the projection unit to correct the overlapping region and stores a correction history indicating the content of the correction;
     a recognition unit that recognizes, on the basis of the correction history, the shape of an object overlapping the projection image; and
     a display control unit that changes the projection image on the basis of the shape of the object recognized by the recognition unit.
  2.  The projection control device according to claim 1, wherein the recognition unit recognizes the shape of the object overlapping the projection image on the basis of the overlapping region detected by the overlapping-region detection unit and the correction history stored by the correction unit.
  3.  The projection control device according to claim 2, wherein the recognition unit recognizes, as the shape of the object overlapping the projection image, the shape obtained by combining the shape of the correction region based on the correction history with the shape of the overlapping region.
  4.  The projection control device according to claim 1, wherein the recognition unit recognizes the shape of the object overlapping the projection image on the basis of a plurality of the correction histories stored by the correction unit.
  5.  The projection control device according to claim 4, wherein the recognition unit recognizes, as the shape of the object overlapping the projection image, the shape obtained by combining the shapes of the correction regions based on each of the plurality of correction histories.
  6.  The projection control device according to any one of claims 1 to 5, wherein the correction history includes information indicating the region of the projection image in which the correction was performed.
  7.  A projection device comprising the projection control device according to any one of claims 1 to 6 and the projection unit.
  8.  A projection device comprising the projection control device according to any one of claims 1 to 6, the projection unit, and the imaging unit.
  9.  A projection control method comprising:
     an overlapping-region detection step of detecting an overlapping region of a projection image that overlaps an object, on the basis of captured image data obtained by an imaging unit that photographs the projection image projected from a projection unit;
     a correction step of controlling the projection unit to correct the overlapping region and storing a correction history indicating the content of the correction;
     a recognition step of recognizing, on the basis of the correction history, the shape of an object overlapping the projection image; and
     a display control step of changing the projection image on the basis of the shape of the object recognized in the recognition step.
  10.  The projection control method according to claim 9, wherein the recognition step recognizes the shape of the object overlapping the projection image on the basis of the overlapping region detected in the overlapping-region detection step and the correction history stored in the correction step.
  11.  The projection control method according to claim 10, wherein the recognition step recognizes, as the shape of the object overlapping the projection image, the shape obtained by combining the shape of the correction region based on the correction history with the shape of the overlapping region.
  12.  The projection control method according to claim 9, wherein the recognition step recognizes the shape of the object overlapping the projection image on the basis of a plurality of the correction histories stored in the correction step.
  13.  The projection control method according to claim 12, wherein the recognition step recognizes, as the shape of the object overlapping the projection image, the shape obtained by combining the shapes of the correction regions based on each of the plurality of correction histories.
  14.  The projection control method according to any one of claims 9 to 13, wherein the correction history includes information indicating the region of the projection image in which the correction was performed.
  15.  A projection control program for causing a computer to execute:
     an overlapping-region detection step of detecting an overlapping region of a projection image that overlaps an object, on the basis of captured image data obtained by an imaging unit that photographs the projection image projected from a projection unit;
     a correction step of controlling the projection unit to correct the overlapping region and storing a correction history indicating the content of the correction;
     a recognition step of recognizing, on the basis of the correction history, the shape of an object overlapping the projection image; and
     a display control step of changing the projection image on the basis of the shape of the object recognized in the recognition step.
PCT/JP2019/008190 2018-03-16 2019-03-01 Projection control device, projection apparatus, projection control method, and projection control program WO2019176594A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020506397A JPWO2019176594A1 (en) 2018-03-16 2019-03-01 Projection control device, projection device, projection control method, and projection control program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-049878 2018-03-16
JP2018049878 2018-03-16

Publications (1)

Publication Number Publication Date
WO2019176594A1 true WO2019176594A1 (en) 2019-09-19

Family

ID=67907166

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/008190 WO2019176594A1 (en) 2018-03-16 2019-03-01 Projection control device, projection apparatus, projection control method, and projection control program

Country Status (2)

Country Link
JP (1) JPWO2019176594A1 (en)
WO (1) WO2019176594A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113615168A (en) * 2020-02-28 2021-11-05 松下电器(美国)知识产权公司 Smart window device, image display method, and program
CN114040097A (en) * 2021-10-27 2022-02-11 苏州金螳螂文化发展股份有限公司 Large-scene interactive action capturing system based on multi-channel image acquisition and fusion

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003131319A (en) * 2001-10-25 2003-05-09 Seiko Epson Corp Optical transmission and reception device
JP2009064110A (en) * 2007-09-04 2009-03-26 Canon Inc Image projection device and control method therefor
JP2009135921A (en) * 2007-11-06 2009-06-18 Panasonic Corp Image projection apparatus, and image projection method
JP2012208439A (en) * 2011-03-30 2012-10-25 Sony Corp Projection device, projection method and projection program
JP2013257686A (en) * 2012-06-12 2013-12-26 Sony Corp Projection type image display apparatus, image projecting method, and computer program

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4747232B2 (en) * 2006-09-06 2011-08-17 独立行政法人産業技術総合研究所 Small portable terminal
US9241143B2 (en) * 2008-01-29 2016-01-19 At&T Intellectual Property I, L.P. Output correction for visual projection devices

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003131319A (en) * 2001-10-25 2003-05-09 Seiko Epson Corp Optical transmission and reception device
JP2009064110A (en) * 2007-09-04 2009-03-26 Canon Inc Image projection device and control method therefor
JP2009135921A (en) * 2007-11-06 2009-06-18 Panasonic Corp Image projection apparatus, and image projection method
JP2012208439A (en) * 2011-03-30 2012-10-25 Sony Corp Projection device, projection method and projection program
JP2013257686A (en) * 2012-06-12 2013-12-26 Sony Corp Projection type image display apparatus, image projecting method, and computer program

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113615168A (en) * 2020-02-28 2021-11-05 松下电器(美国)知识产权公司 Smart window device, image display method, and program
US20210407465A1 (en) * 2020-02-28 2021-12-30 Panasonic Intellectual Property Corporation Of America Smart window device, image display method, and recording medium
US11847994B2 (en) * 2020-02-28 2023-12-19 Panasonic Intellectual Property Corporation Of America Smart window device, image display method, and recording medium
CN114040097A (en) * 2021-10-27 2022-02-11 苏州金螳螂文化发展股份有限公司 Large-scene interactive action capturing system based on multi-channel image acquisition and fusion

Also Published As

Publication number Publication date
JPWO2019176594A1 (en) 2021-02-04

Similar Documents

Publication Publication Date Title
RU2579154C1 (en) Projector and projector control method
CN104660946B (en) Projector and its control method
US8403500B2 (en) Projector and method of controlling projector
US9529422B2 (en) Image display and photographing system, photographing device, display device, image display and photographing method, and computer-readable storage medium for computer program
CN104658462A (en) Porjector and method of controlling projector
WO2019176594A1 (en) Projection control device, projection apparatus, projection control method, and projection control program
JP2005318652A (en) Projector with distortion correcting function
JP7062751B2 (en) Projection control device, projection device, projection control method, and projection control program
WO2020250739A1 (en) Projection control device, projection apparatus, projection control method, and projection control program
JP2012181264A (en) Projection device, projection method, and program
WO2012147368A1 (en) Image capturing apparatus
JP2012181721A (en) Position input device, projector, control method for projector, and display system
JP2012053227A (en) Projection type video display device
CN114175624B (en) Control device, projection system, control method, and storage medium
US10474020B2 (en) 2019-11-12 Display apparatus and method for controlling display apparatus to display an image with an orientation based on a user's position
JP6903824B2 (en) Projection device and its control method and control program
JP6427888B2 (en) Image display system, image display apparatus, and image display method
JP6057407B2 (en) Touch position input device and touch position input method
JP7228112B2 (en) PROJECTION CONTROL DEVICE, PROJECTION DEVICE, PROJECTION METHOD AND PROGRAM
JP2003087689A (en) Image correcting device for projector
JPH0564061A (en) Video camera
JP6960522B2 (en) Projection control device, projection control method, projection control program, projection system
US10860144B2 (en) Projector and method for controlling projector
JP2018055410A (en) Indicator for image display device and image display system
JP2016154282A (en) Projector and control method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19767312

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020506397

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19767312

Country of ref document: EP

Kind code of ref document: A1