WO2021114502A1 - Projector and projection method - Google Patents

Projector and projection method Download PDF

Info

Publication number
WO2021114502A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
projector
target
mechanical system
optical
Prior art date
Application number
PCT/CN2020/079170
Other languages
English (en)
French (fr)
Inventor
钟波
肖适
王鑫
余金清
Original Assignee
成都极米科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 成都极米科技股份有限公司 filed Critical 成都极米科技股份有限公司
Priority to US17/599,571 priority Critical patent/US20220196836A1/en
Priority to EP20897688.6A priority patent/EP4075193A4/en
Priority to JP2021576768A priority patent/JP2022523277A/ja
Publication of WO2021114502A1 publication Critical patent/WO2021114502A1/zh

Classifications

    • G01S17/06 Systems determining position data of a target
    • G01S17/10 Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G01S17/32 Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/894 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G01S7/497 Means for monitoring or calibrating
    • G03B21/008 Projectors using an electronic spatial light modulator but not peculiar thereto, using micromirror devices
    • G03B21/14 Projectors or projection-type viewers; Accessories therefor; Details
    • G03B21/142 Adjusting of projection optics
    • G03B21/26 Projecting separately subsidiary matter simultaneously with main image
    • G03B21/54 Accessories
    • H04N9/3141 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]; Constructional details thereof
    • H04N9/315 Modulator illumination systems
    • H04N9/3164 Modulator illumination systems using multiple light sources
    • H04N9/3185 Video signal processing therefor; Geometric adjustment, e.g. keystone or convergence
    • H04N9/3194 Testing thereof including sensor feedback

Definitions

  • This application relates to the field of projection technology, and specifically to a projector and a projection method.
  • the purpose of the embodiments of the present application is to provide a projector and a projection method that enable the projector to recognize the surrounding three-dimensional scene.
  • an embodiment provides a projector, including:
  • the first light source arranged on the housing is used to emit target light of a specified wavelength
  • a transflective sheet arranged in the optical-mechanical system is used to transmit visible light and to reflect the target light entering the optical-mechanical system, where the target light entering the optical-mechanical system is the target light emitted by the first light source and reflected back after encountering an obstacle;
  • the light sensor is a time-of-flight sensor
  • the first light source is an infrared light source
  • the target light is infrared light
  • the first infrared light source emits modulated near-infrared light, which is reflected after encountering an obstacle;
  • the time-of-flight sensor calculates the time difference or phase difference between emission and detection of the infrared light and converts it into the distance of the captured obstacle, generating depth information and thereby the three-dimensional coordinates of the obstacle.
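  • as a rough illustration of the time-difference case above, the round-trip time of a pulse converts to distance as d = c·Δt/2 (a minimal sketch for illustration only; the timestamps are invented and nothing here is taken from the patent):

```python
# Minimal sketch of direct (pulsed) time-of-flight ranging.
# All numeric values are illustrative assumptions, not from the patent.

C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_from_time_of_flight(t_emit_s: float, t_detect_s: float) -> float:
    """Convert emit/detect timestamps of a light pulse into a distance.

    The pulse travels to the obstacle and back, so the one-way
    distance is half of the round-trip path.
    """
    round_trip_s = t_detect_s - t_emit_s
    return C * round_trip_s / 2.0

# A pulse detected 20 ns after emission corresponds to roughly 3 m.
print(distance_from_time_of_flight(0.0, 20e-9))  # ~2.998 m
```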
  • the optical-mechanical system further includes:
  • a digital micromirror device arranged inside the housing, where the digital micromirror device is located on a side away from the lens of the optical-mechanical system;
  • the transflective sheet is arranged between the lens of the optical-mechanical system and the digital micromirror device.
  • the photosensitive surface of the light sensor is perpendicular to the light path of the reflected light of the transflective sheet.
  • in the projector provided by the embodiments of the present application, making the photosensitive surface of the light sensor perpendicular to the optical path of the light reflected by the transflective sheet allows the light sensor to better receive the reflected target light, which makes the detection data more accurate.
  • the photosensitive surface of the light sensor is perpendicular to the digital micromirror device, and the included angle between the digital micromirror device and the transflective sheet is 45°;
  • the included angle between the transflective sheet and the photosensitive surface of the light sensor is 45°.
  • the angle between the transflective sheet and the outgoing light passing through the optical-mechanical system is 45°, and the light-sensing surface of the light sensor is perpendicular to the optical path of the light reflected by the transflective sheet.
  • the first light source is arranged on a first surface of the housing, and the first surface is a surface on which a lens of the optomechanical system is arranged.
  • with the above installation angles, the light-sensing surface of the light sensor is perpendicular to the optical path of the light reflected by the transflective sheet, so that the light sensor can better detect the reflected target light.
  • an embodiment provides a projection method, which is applied to the projector according to any one of the foregoing embodiments, and the projection method includes:
  • emitting target light of a specified wavelength through the first light source of the projector; determining first depth image data of the projection plane through the light sensor of the projector; and correcting the projection picture of the projector according to the first depth image data.
  • in an optional implementation, the method further includes:
  • determining second depth image data in a target area through the light sensor; determining, according to the second depth image data, an indication action appearing in the target area; and executing the instruction associated with the indication action.
  • the projection method provided by the embodiment of the present application can also recognize the user's actions, so as to realize the interaction with the user and improve the practicability of the projector.
  • the method further includes:
  • the projection method provided by the embodiment of the present application can also calibrate the light sensor, so that the test data of the light sensor can be more accurate.
  • the projector and projection method provided by the embodiments of the present application add a light sensor to the projector; this light sensor can detect the round-trip propagation time of light that encounters an obstacle, so that the coordinate information of the obstacle can be determined.
  • the solution in the embodiments of the present application can thus perceive surrounding three-dimensional information, so that the projector can adapt to more three-dimensional scenes. Further, since the projector of the embodiments of the present application can perceive the surrounding three-dimensional scene, it is suitable for projecting images onto three-dimensional surfaces for display.
  • FIG. 1 is a schematic structural diagram of a projector provided by an embodiment of the application.
  • FIG. 2 is a schematic diagram of another structure of a projector provided by an embodiment of the application.
  • FIG. 3 is a schematic block diagram of a projector provided by an embodiment of the application.
  • FIG. 4 is a schematic diagram of the optical path of the projector provided by an embodiment of the application.
  • FIG. 5 is a flowchart of a projection method provided by an embodiment of the application.
  • FIG. 6 is a partial flowchart of a projection method provided by an embodiment of the application.
  • Icons: 100 - projector; 110 - housing; 120 - optical-mechanical system; 121 - lens; 122 - second light source; 123 - digital micromirror device; 130 - transflective sheet; 140 - first light source; 150 - light sensor; 160 - memory; 170 - storage controller; 180 - processor; 190 - peripheral interface; C1 - screen.
  • the projector 100 in this embodiment includes: a housing 110, an optical-mechanical system 120 installed inside the housing 110, a first light source 140 arranged on the housing 110, and a light sensor 150 installed inside the housing 110.
  • the optical-mechanical system 120 in this embodiment is used to project the image to be projected onto the target display surface.
  • the target display surface can be a curtain, a wall, etc.
  • the optical-mechanical system 120 may include: a lens 121, a second light source 122, and a digital micromirror device 123 (Digital Micromirror Device, DMD).
  • a through hole is provided on the first surface of the housing 110 of the projector 100;
  • the first end of the lens 121 of the optical-mechanical system 120 is exposed on the surface of the housing 110 through the through hole;
  • the second end of the lens 121 is arranged inside the housing 110.
  • the projector 100 may further include a first protective cover configured to cover the first end of the lens 121.
  • the digital micro-mirror device 123 of the optical-mechanical system 120 is located on the side where the second end of the lens 121 of the optical-mechanical system 120 is located.
  • the digital micromirror device 123 is a device that uses digital voltage signals to control micromirrors to perform mechanical movement so as to realize optical functions.
  • the second light source 122 may also be installed on the side where the second end of the lens 121 of the optical mechanical system 120 is located.
  • the projector 100 may further include a plurality of adjustment knobs (not numbered in the figure), and the adjustment knobs are used to adjust the focal length of the lens 121 of the optical-mechanical system 120.
  • the first light source 140 may be installed on the first surface of the housing 110 for emitting target light of a specified wavelength.
  • the first light source 140 may be installed on the first surface of the outer surface of the housing 110.
  • a through hole may be provided on the first surface of the outer surface of the housing 110; the first light source 140 may be installed inside the housing 110 with its light-emitting surface facing the through hole, so that the target light emitted by the first light source 140 can exit through the through hole.
  • the first light source 140 may also pass through the through hole, so that the light-emitting surface of the first light source 140 is exposed on the outer surface of the housing 110.
  • the projector 100 may further include a second protective cover.
  • the second protective cover can be used to cover the light-emitting surface of the first light source 140.
  • the first light source 140 is an infrared light source, and the target light emitted by the first light source 140 is infrared light.
  • the wavelength of the target light may be in the range of 760 nanometers to 1 millimeter.
  • the first light source 140 may perform high-frequency modulation on the light before emitting it.
  • the first light source 140 may be an LED (Light Emitting Diode, light emitting diode) or a laser.
  • the laser may be a laser diode or a VCSEL (Vertical Cavity Surface Emitting Laser).
  • the first light source 140 emits high-performance pulsed light, and the pulse rate of the pulsed light can reach about 100 MHz.
  • the transflective sheet 130 of the projector 100 may be disposed between the lens 121 of the optical-mechanical system 120 and the digital micromirror device 123.
  • the angle between the transflective sheet 130 and the side of the digital micromirror device 123 close to the lens 121 of the optical-mechanical system 120 is 45°.
  • the included angle between the transflective sheet 130 and the outgoing light passing through the optical-mechanical system 120 is 45°.
  • the included angle between the transflective sheet 130 and the light emitted by the first light source 140 is 45°.
  • the transflective sheet 130 can transmit visible light and can reflect invisible light.
  • the target light emitted by the first light source 140 may be an invisible light.
  • the target light may be infrared light.
  • the emitted light from the second light source 122 is visible light. Therefore, the transflective sheet 130 can transmit the light emitted by the second light source 122 and can reflect the target light emitted by the first light source 140.
  • the transflective sheet 130 can be used to reflect the target light entering the optical-mechanical system 120.
  • the target light entering the optical-mechanical system 120 is the target light emitted by the first light source 140 and reflected back after encountering an obstacle.
  • the obstacle may be the display surface of the image projected by the projector 100.
  • the obstacle may also be a person interacting with the projector 100.
  • the light sensor 150 included in the projector 100 can be used to detect the target light reflected by the transflective sheet 130.
  • the light-sensing surface of the light sensor 150 in this embodiment corresponds to the reflective surface of the transflective sheet 130, so that the light sensor 150 can detect the target light reflected by the transflective sheet 130.
  • the light sensor 150 in this embodiment may be installed on the light path of the light reflected by the transflective sheet 130.
  • the photosensitive surface of the light sensor 150 is perpendicular to the reflected light path of the transflective sheet 130.
  • the photosensitive surface of the light sensor 150 may be perpendicular to the digital micromirror device 123, and the angle between the digital micromirror device 123 and the transflective sheet 130 is 45°, so that the angle between the transflective sheet 130 and the photosensitive surface of the light sensor 150 is 45°.
  • the photosensitive surface of the light sensor 150 being perpendicular to the digital micromirror device 123 may mean that the photosensitive surface of the light sensor 150 is perpendicular to the side of the digital micromirror device 123 facing the lens 121 of the optical-mechanical system 120.
  • the light sensor 150 determines the position information of the obstacle according to the first time when the first light source 140 emits the target light and the second time when the light sensor 150 detects the target light reflected by the transflective sheet 130.
  • the light sensor 150 is a time of flight (Time of flight, TOF for short) sensor.
  • the angle between the transflective sheet 130 and the outgoing light passing through the optical-mechanical system 120 is 45°, and the photosensitive surface of the light sensor 150 is perpendicular to the reflected light path of the transflective sheet 130.
  • the included angle between the transflective sheet 130 and the light-emitting surface of the second light source 122 may be 45°.
  • FIG. 3 is a schematic block diagram of the projector 100.
  • the projector 100 may include a memory 160, a storage controller 170, a processor 180, and a peripheral interface 190.
  • Those of ordinary skill in the art can understand that the structure shown in FIG. 3 is only for illustration, and does not limit the structure of the projector 100.
  • the projector 100 may also include more or fewer components than those shown in FIG. 3, or have a configuration different from that shown in FIG. 1.
  • the aforementioned components of the memory 160, the storage controller 170, the processor 180, and the peripheral interface 190 are directly or indirectly electrically connected to each other to realize data transmission or interaction.
  • these components can be electrically connected to each other through one or more communication buses or signal lines.
  • the aforementioned processor 180 is used to execute the executable module stored in the memory 160.
  • the memory 160 may be, but is not limited to, random access memory (RAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), and the like.
  • the memory 160 is used to store a program, and the processor 180 executes the program after receiving an execution instruction.
  • the method performed by the projector 100 defined by any process disclosed in any embodiment of the present application may be applied to the processor 180, or implemented by the processor 180.
  • the aforementioned processor 180 may be an integrated circuit chip with signal processing capability.
  • the above-mentioned processor 180 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • it can implement or execute the methods, steps, and logical block diagrams disclosed in the embodiments of the present application.
  • the general-purpose processor may be a microprocessor or the processor may also be any conventional processor or the like.
  • peripheral interface 190 couples various input/output devices to the processor 180 and the memory 160.
  • the peripheral interface 190, the processor 180, and the storage controller 170 may be implemented in a single chip. In some other instances, they can be implemented by independent chips.
  • the projector 100 may also include a control circuit for controlling the projection-related parameters of the projector 100.
  • the projection-related parameters may be projection brightness parameters and projection screen parameters.
  • the control circuit can also project an image frame according to the video signal.
  • with the projector 100 provided by this embodiment, the three-dimensional environment around the projector 100 can be perceived.
  • the light sensor 150 and the lens 121 of the optical-mechanical system 120 in this embodiment can form a depth camera, which can obtain three-dimensional environment data around the projector 100.
  • the depth camera in this embodiment shares one lens 121 with the optical-mechanical system 120 of the projector 100, so a projector 100 with three-dimensional perception is realized with few modifications to the existing projector 100.
  • the working principle of the projector 100 will be described below by taking the display interface of the projector as a screen as an example.
  • as shown in FIG. 4, the target light L1 emitted by the first light source 140 reaches the screen C1 and is reflected back into the lens 121;
  • when the target light L1 then encounters the transflective sheet 130, which can reflect the target light, it is reflected to the light sensor 150;
  • the light sensor 150 obtains the distance to the screen C1 from the flight time of the detected light pulse of the target light;
  • from the determined distance to the screen C1, the light sensor 150 can then determine the three-dimensional coordinates of the screen C1.
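  • the step from a per-pixel distance to three-dimensional coordinates can be illustrated with a standard pinhole back-projection (a sketch under assumed intrinsics; the patent does not specify a camera model, so fx, fy, cx, cy and the image size are hypothetical):

```python
import numpy as np

def depth_to_points(depth_m: np.ndarray, fx: float, fy: float,
                    cx: float, cy: float) -> np.ndarray:
    """Back-project a depth map (metres per pixel) into an (H, W, 3)
    array of XYZ coordinates in the camera frame (pinhole model)."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) / fx * depth_m
    y = (v - cy) / fy * depth_m
    return np.stack([x, y, depth_m], axis=-1)

# Hypothetical intrinsics for a 640x480 sensor; a flat screen 2 m away.
depth = np.full((480, 640), 2.0)
points = depth_to_points(depth, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
print(points[240, 320])  # [0. 0. 2.] on the optical axis
```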
  • the outgoing light L2 emitted by the second light source 122 can pass through the transflective sheet 130 and the lens 121 in sequence to be projected onto the screen C1 and form an image on the screen C1.
  • the projector 100 in this embodiment can be used to execute each step in each method provided in the embodiment of the present application.
  • the following describes the implementation process of the projection method in detail through several embodiments.
  • FIG. 5 is a flowchart of a projection method provided by an embodiment of the present application. The specific process shown in FIG. 5 will be described in detail below.
  • Step 201: target light of a specified wavelength is emitted through the first light source of the projector.
  • the first light source continuously emits the target light.
  • Step 202: determine the first depth image data of the projection plane through the light sensor of the projector.
  • the light sensor and the optical-mechanical system multiplex one lens; that is, the lens of the optical-mechanical system and the light sensor can be combined to form a depth camera.
  • since the optical-mechanical system and the light sensor share one lens, the optical centers of the optical-mechanical system and the depth camera coincide.
  • as far as the projector is concerned, the original projection equipment does not need to be modified; as far as the depth camera is concerned, as long as infrared light can be detected, the depth camera can detect three-dimensional positions.
  • through the reflection of the transflective sheet, the light sensor can detect the target light returning into the lens, which realizes that the light sensor of the depth camera can detect the target light.
  • the first light source continuously emits target light.
  • the target light is reflected back into the lens of the optical-mechanical system when it encounters an obstacle, and is reflected to the light sensor when it encounters the transflective sheet; the light sensor then receives the target light reflected from the transflective sheet.
  • the light sensor obtains the distance of the detected obstacle by measuring the round-trip flight time of the target light.
  • each pixel in the image in the first depth image data may indicate the distance between the object represented by the pixel and the light sensor.
  • the distance detection is realized by detecting the phase shift of the light wave of the target light.
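  • the phase-shift variant can be sketched as follows; the relation d = c·φ/(4π·f_mod) is the standard continuous-wave ToF formula rather than anything specified by the patent, and the 100 MHz figure merely echoes the pulse rate mentioned for the first light source:

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_from_phase_shift(phase_rad: float, f_mod_hz: float) -> float:
    """Distance from the phase shift of a continuously modulated wave:
    d = c * phi / (4 * pi * f_mod)."""
    return C * phase_rad / (4.0 * math.pi * f_mod_hz)

f_mod = 100e6                    # 100 MHz modulation (illustrative)
ambiguity_m = C / (2.0 * f_mod)  # ~1.5 m: range before the phase wraps
print(distance_from_phase_shift(math.pi / 2.0, f_mod))  # 0.375 m
print(ambiguity_m)
```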
  • Step 203: correct the projection image of the projector according to the first depth image data.
  • the first depth image data may be an image of a display interface that the projector needs to project.
  • the display interface that needs to be projected may be any projection medium that can display a projected image, such as a screen, a wall, or the like.
  • the display interface corresponding to the first depth image data may be a plane.
  • the correction of the projection image may have the following multiple situations.
  • in one example, if the central axis of the lens of the projector's optical-mechanical system is perpendicular to the display interface to be projected onto, and the extension of that central axis intersects the midpoint of the display interface, the center point of the image to be projected can be projected perpendicular to the display interface, so that the projected image is the required rectangle.
  • in another example, if the angle between the central axis of the lens and the display interface to be projected onto is less than ninety degrees, the display image projected directly onto the display interface may be trapezoidal.
  • keystone correction can be performed on the projection screen.
  • the image to be projected can be pixel-matched with the aforementioned first depth image data and deformed according to the pixel matching, so that the image to be projected is evenly distributed in the first depth image data. Further, the deformed image can be projected onto the display interface to be projected onto, so that the projected image meets the viewing needs of the human eye.
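  • one common way to realize such pixel matching and deformation is a projective pre-warp (homography); the sketch below uses OpenCV, which the patent does not mention, and the corner coordinates are invented for illustration:

```python
import cv2
import numpy as np

# Corners of the source frame, and the quadrilateral the depth data says
# those corners land on. All coordinates are invented for illustration.
src = np.float32([[0, 0], [1280, 0], [1280, 720], [0, 720]])
dst = np.float32([[40, 10], [1230, 60], [1250, 700], [20, 690]])

# Pre-warp with the inverse mapping so the picture lands as a rectangle.
H = cv2.getPerspectiveTransform(dst, src)
frame = np.zeros((720, 1280, 3), dtype=np.uint8)  # stand-in for a real frame
corrected = cv2.warpPerspective(frame, H, (1280, 720))
```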
  • the first depth image data may be a projection carrier image
  • the carrier may be a three-dimensional model.
  • the three-dimensional model may be a mountain-shaped model with protrusions and grooves.
  • for another example, in medical teaching, when a distribution map of an animal's internal organs needs to be projected, the three-dimensional model may be an animal-shaped model.
  • the animal-shaped model can be determined from the first depth image data, and the first depth image data can be recognized to determine the distribution area of each part of the animal-shaped model in the image. Further, the pixel values of the first depth image data can be read to determine the coordinate information of each part of the animal-shaped model. Further, the image to be displayed is deformed according to the parts of the animal-shaped model, so that the image to be displayed matches each part of the animal-shaped model; exemplarily, the liver in the image to be displayed may correspond to the liver part of the animal-shaped model. Further, the deformed image is projected onto the animal-shaped model.
  • through image recognition of the first depth image data, the method in the embodiments of the present application can determine the planar visual appearance of the display interface used to display the projection picture; by reading the pixel values of the image pixels of the first depth image data, the relative position of the display interface and the projector can be obtained.
  • from the planar visual appearance and the relative position of the display interface and the projector, the mapping relationship between the image to be projected and the display interface can be determined, and the image to be projected can be deformed according to this mapping relationship, thereby correcting the projection picture of the projector. It can be understood that the application scenarios of the method in this embodiment are not limited to the above scenarios; any scenario that requires correcting the projection picture based on a depth image can use the projection method of the embodiments of the present application.
  • the interaction with the user can also be realized by using the above-mentioned projector.
  • the projection method in this embodiment may further include the following steps.
  • Step 204: determine the second depth image data in the target area through the light sensor.
  • the second depth image data may be one depth image or multiple depth images.
  • Step 205: determine, according to the second depth image data, an indication action that appears in the target area.
  • the above-mentioned instruction action may be an “OK” action, for example, an “OK” gesture, nodding of the head, swinging the hand downward, and the like.
  • the above-mentioned instruction action may also be a “page turning” action, for example, a hand swinging to the left or right.
  • the above-mentioned instruction action may also be a “cut” action, a “shooting” action, etc. in the game.
  • in this embodiment, to determine whether an indication action exists in the second depth image data, it is first necessary to determine whether a user has actually performed the specified action in front of the projector; confirming that the actor is a three-dimensional object prevents an indication action appearing in a video on some display from causing the projector to execute a wrong instruction.
  • first, it can be determined from the second depth image data whether the captured object in the second depth image data is a flat image or a three-dimensional indicator.
  • the three-dimensional indicator can be a user's hand, a pointing stick, a physical model, and the like.
  • exemplarily, the pixel values of the image pixels in the second depth image data can be used to determine whether the captured object is a three-dimensional indicator.
  • second, if the captured object is determined to be a three-dimensional indicator, the shape of the captured object can be recognized.
  • in one example, the indication action the projector needs to receive is a confirmation; since "confirm" can be expressed by a static action, a single frame of depth image can be recognized to determine whether the shape of the captured object is the "confirm" action.
  • in another example, the indication action the projector needs to receive may be whether the fruit in the picture needs to be "cut"; since the "cut" action is expressed by a dynamic action, whether the action performed by the captured object is "cut" can be determined from changes in the captured object's position across multiple frames of depth images.
  • in another example, the indication action the projector needs to receive may be whether a page needs to be turned; since the page-turning action is expressed by a dynamic action, changes in the captured object's position across multiple frames of depth images can be used to determine whether the captured object moves continuously from one position to another; if so, the user has performed a page-turning action.
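  • the static/dynamic distinction drawn in these examples can be sketched as follows (the segmentation threshold and helper names are hypothetical; the patent does not prescribe a particular recognition algorithm):

```python
import numpy as np

def hand_centroid(depth_frame: np.ndarray, max_range_m: float = 1.5):
    """Centroid (row, col) of pixels closer than max_range_m, or None.
    A crude stand-in for real hand segmentation."""
    mask = (depth_frame > 0) & (depth_frame < max_range_m)
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

def detect_page_turn(frames, min_shift_px: float = 80.0) -> bool:
    """Dynamic gesture: the tracked centroid drifts sideways by at least
    min_shift_px over the frame sequence (a static gesture would instead
    be classified from the shape in a single frame)."""
    centroids = [c for c in (hand_centroid(f) for f in frames) if c is not None]
    if len(centroids) < 2:
        return False
    horizontal_shift = centroids[-1][1] - centroids[0][1]
    return abs(horizontal_shift) >= min_shift_px
```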
  • Step 206: execute the instruction associated with the indication action.
  • if the recognized indication action matches a corresponding instruction, the corresponding instruction can be executed.
  • the instruction may be a page turning action when the display image needs to be switched; the instruction may be a confirmation action to confirm whether a certain video needs to be played; the instruction may also be a game action instruction during game interaction.
  • since the method in the embodiments of the present application uses a projector with three-dimensional perception capability, interaction with the user can be realized.
  • for an ordinary projector without three-dimensional perception capability, occasions requiring three-dimensional perception, such as automatic keystone correction and virtual touch, require an additional three-dimensional sensing device, such as a TOF (time-of-flight) camera.
  • however, if a three-dimensional sensing device is merely added outside the projector, it must be calibrated against the projector's optical system every time it is used; this calibration process is very complicated, and the calibration result becomes invalid after the three-dimensional sensing device is replaced. The projector described here is more convenient to use and does not need to be calibrated before every use, which improves the convenience of the projector.
  • the projection method in this embodiment may further include: calibrating the light sensor of the projector.
  • a third-party camera can be used to assist in calibration, and the third-party camera can collect visible light and infrared light.
  • the relative position of the third-party camera and the projector can be kept unchanged.
  • Step a: the projector projects a preset image onto the target plane, and the above third-party camera captures the projected image of the preset image; the position coordinates of the preset image in the projected image are calculated.
  • Step b: an infrared light source emits infrared light to project an infrared pattern onto the above target plane; the depth camera composed of the light sensor and the lens of the optical-mechanical system, and the above third-party camera, each image the infrared light; the transformation relationship between the depth camera coordinates and the third-party camera coordinates is then calculated.
  • Step c: using the transformation relationship obtained in Step b, the position coordinates of the preset image in the projected image obtained in Step a are transformed into the coordinate system of the depth camera composed of the light sensor and the lens of the optical-mechanical system; this yields the imaging position of the projector's projected image in the depth camera, so that the light sensor can be calibrated.
  • since the depth sensor formed by the light sensor shares the same lens with the projector, the imaging position of the projector's projected image in the depth camera does not change; therefore, a single calibration suffices for long-term use, no additional calibration is needed on subsequent uses, and the convenience of the projector is improved.
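  • the coordinate hand-off in Steps a to c amounts to composing two mappings; the sketch below realizes it with a planar homography between the two cameras (an assumption that holds for a flat target plane), using OpenCV and invented point data:

```python
import cv2
import numpy as np

# Step a: corners of the preset image as seen by the third-party camera
# (pixel coordinates; all values invented for illustration).
pts_third_party = np.float32([[100, 120], [980, 135], [990, 700], [90, 690]])

# Step b: the same infrared pattern imaged by both cameras yields point
# pairs, giving a third-party-camera -> depth-camera mapping.
ir_third_party = np.float32([[150, 200], [800, 210], [810, 600], [140, 590]])
ir_depth_cam   = np.float32([[130, 180], [770, 195], [785, 580], [120, 565]])
H_third_to_depth = cv2.getPerspectiveTransform(ir_third_party, ir_depth_cam)

# Step c: move the Step-a coordinates into the depth camera frame; this
# is where the projected image lands for the depth sensor.
pts_depth = cv2.perspectiveTransform(
    pts_third_party.reshape(-1, 1, 2), H_third_to_depth)
print(pts_depth.reshape(-1, 2))
```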
  • the embodiments of the present application also provide a computer-readable storage medium having a computer program stored on the computer-readable storage medium, and the computer program executes the steps of the projection method described in the above method embodiment when the computer program is run by a processor.
  • the computer program product of the projection method provided by the embodiment of the present application includes a computer-readable storage medium storing program code.
  • the instructions included in the program code can be used to execute the steps of the projection method described in the above method embodiment. Please refer to the above method embodiment, which will not be repeated here.
  • the various steps in the above projection method embodiments can be implemented by software modules; the functional modules can be integrated to form an independent part, each module can exist alone, or two or more modules can be integrated to form an independent part.
  • if the functions are implemented in the form of software functional modules and sold or used as independent products, they can be stored in a computer-readable storage medium.
  • the technical solution of the present application, in essence, or the part contributing to the prior art, or part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the embodiments of the present application.
  • the aforementioned storage media include: USB flash drives, removable hard disks, read-only memory (ROM), random access memory (RAM), magnetic disks, optical discs, and other media that can store program code.
  • relational terms such as first and second are used only to distinguish one entity or operation from another and do not necessarily require or imply any such actual relationship or order between these entities or operations.
  • the terms "include", "comprise", or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device; without further limitation, an element defined by the phrase "including ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes the element.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Geometry (AREA)
  • Optics & Photonics (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Projection Apparatus (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A projector (100), including: a housing (110); an optical-mechanical system (120) arranged inside the housing (110); a first light source (140) arranged on the housing (110) and used to emit target light of a specified wavelength; a transflective sheet (130) arranged in the optical-mechanical system (120) and used to transmit visible light and to reflect the target light entering the optical-mechanical system (120), the target light entering the optical-mechanical system (120) being the target light emitted by the first light source (140) and reflected back after encountering an obstacle; and a light sensor (150) used to detect the target light reflected by the transflective sheet (130) and to determine position information of the obstacle according to a first time at which the first light source (140) emits the target light and a second time at which the light sensor (150) detects the target light reflected by the transflective sheet (130). A projection method is also provided.

Description

Projector and projection method
Technical Field
This application relates to the field of projection technology, and in particular to a projector and a projection method.
Background
Since a projector generally has only a projection function, and the projected picture of a typical projector targets a flat projection background, the usage scenarios of projectors are very limited.
Summary
In view of this, the purpose of the embodiments of the present application is to provide a projector and a projection method that enable the projector to recognize the surrounding three-dimensional scene.
In a first aspect, an embodiment provides a projector, including:
a housing;
an optical-mechanical system arranged inside the housing;
a first light source arranged on the housing and used to emit target light of a specified wavelength;
a transflective sheet arranged in the optical-mechanical system and used to transmit visible light and to reflect the target light entering the optical-mechanical system, where the target light entering the optical-mechanical system is the target light emitted by the first light source and reflected back after encountering an obstacle; and
a light sensor used to detect the target light reflected by the transflective sheet, and to determine position information of the obstacle according to a first time at which the first light source emits the target light and a second time at which the light sensor detects the target light reflected by the transflective sheet.
In an optional implementation, the light sensor is a time-of-flight sensor, the first light source is an infrared light source, and the target light is infrared light.
In the projector provided by the embodiments of the present application, the infrared first light source emits modulated near-infrared light, which is reflected after encountering an obstacle; the time-of-flight sensor calculates the time difference or phase difference between emission and detection of the infrared light and converts it into the distance of the captured obstacle, generating depth information and thereby the three-dimensional coordinates of the obstacle.
In an optional implementation, the optical-mechanical system further includes:
a digital micromirror device arranged inside the housing, the digital micromirror device being located on a side away from the lens of the optical-mechanical system;
the transflective sheet is arranged between the lens of the optical-mechanical system and the digital micromirror device.
In an optional implementation, the photosensitive surface of the light sensor is perpendicular to the optical path of the light reflected by the transflective sheet.
In the projector provided by the embodiments of the present application, making the photosensitive surface of the light sensor perpendicular to the optical path of the light reflected by the transflective sheet allows the light sensor to better receive the reflected target light, which makes the detection data more accurate.
In an optional implementation, the photosensitive surface of the light sensor is perpendicular to the digital micromirror device, and the included angle between the digital micromirror device and the transflective sheet is 45°;
the included angle between the transflective sheet and the photosensitive surface of the light sensor is 45°.
In an optional implementation, the included angle between the transflective sheet and the outgoing light passing through the optical-mechanical system is 45°, and the photosensitive surface of the light sensor is perpendicular to the reflected optical path of the transflective sheet.
In an optional implementation, the first light source is arranged on a first surface of the housing, the first surface being the surface on which the lens of the optical-mechanical system is arranged.
With the projector provided by the embodiments of the present application, the above installation angles make the photosensitive surface of the light sensor perpendicular to the optical path of the light reflected by the transflective sheet, so that the light sensor can better detect the reflected target light.
In a second aspect, an embodiment provides a projection method applied to the projector of any one of the foregoing implementations, the projection method including:
emitting target light of a specified wavelength through the first light source of the projector;
determining first depth image data of the projection plane through the light sensor of the projector;
correcting the projection picture of the projector according to the first depth image data.
In an optional implementation, the method further includes:
determining second depth image data in a target area through the light sensor;
determining, according to the second depth image data, an indication action appearing in the target area;
executing an instruction associated with the indication action.
The projection method provided by the embodiments of the present application can also recognize the user's actions, thereby enabling interaction with the user and improving the practicability of the projector.
In an optional implementation, the method further includes:
calibrating the light sensor of the projector.
The projection method provided by the embodiments of the present application can also calibrate the light sensor, so that the measurement data of the light sensor is more accurate.
The projector and projection method provided by the embodiments of the present application add a light sensor to the projector; this light sensor can detect the round-trip propagation time of light that encounters an obstacle, so that the coordinate information of the obstacle can be determined and three-dimensional information about the surrounding environment can be obtained. Compared with the prior art, in which a projector has only a projection function and its projected picture targets a flat projection background, the solution in the embodiments of the present application can perceive surrounding three-dimensional information, so that the projector can adapt to more three-dimensional scenes. Further, since the projector of the embodiments of the present application can perceive the surrounding three-dimensional scene, it is suitable for projecting pictures onto three-dimensional surfaces for display.
To make the above objects, features, and advantages of the present application more comprehensible, embodiments are described in detail below with reference to the accompanying drawings.
Brief Description of the Drawings
To explain the technical solutions of the embodiments of the present application more clearly, the drawings required in the embodiments are briefly introduced below. It should be understood that the following drawings show only certain embodiments of the present application and therefore should not be regarded as limiting the scope; for those of ordinary skill in the art, other related drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic structural diagram of a projector provided by an embodiment of the present application.
FIG. 2 is another schematic structural diagram of a projector provided by an embodiment of the present application.
FIG. 3 is a schematic block diagram of a projector provided by an embodiment of the present application.
FIG. 4 is a schematic diagram of the optical path of a projector provided by an embodiment of the present application.
FIG. 5 is a flowchart of a projection method provided by an embodiment of the present application.
FIG. 6 is a partial flowchart of a projection method provided by an embodiment of the present application.
Reference numerals: 100 - projector; 110 - housing; 120 - optical-mechanical system; 121 - lens; 122 - second light source; 123 - digital micromirror device; 130 - transflective sheet; 140 - first light source; 150 - light sensor; 160 - memory; 170 - storage controller; 180 - processor; 190 - peripheral interface; C1 - screen.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the accompanying drawings.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it need not be further defined or explained in subsequent drawings. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only to distinguish the descriptions and cannot be understood as indicating or implying relative importance.
Embodiment 1
To facilitate understanding of this embodiment, the projector 100 that performs the projection method disclosed in the embodiments of the present application is first introduced in detail.
As shown in FIGS. 1 and 2, the projector 100 in this embodiment includes: a housing 110, an optical-mechanical system 120 installed inside the housing 110, a first light source 140 arranged on the housing 110, and a light sensor 150 installed inside the housing 110.
The optical-mechanical system 120 in this embodiment is used to project an image to be projected onto a target display surface. The target display surface may be a screen, a wall, or the like.
In this embodiment, the optical-mechanical system 120 may include: a lens 121, a second light source 122, and a digital micromirror device 123 (Digital Micromirror Device, DMD).
Exemplarily, a through hole is provided on a first surface of the housing 110 of the projector 100; a first end of the lens 121 of the optical-mechanical system 120 is exposed on the surface of the housing 110 through the through hole, and a second end of the lens 121 is arranged inside the housing 110.
Optionally, the projector 100 may further include a first protective cover used to cover the first end of the lens 121.
Exemplarily, the digital micromirror device 123 of the optical-mechanical system 120 is located on the side where the second end of the lens 121 of the optical-mechanical system 120 is located. The digital micromirror device 123 is a device that uses digital voltage signals to control micromirrors to perform mechanical movement so as to realize optical functions.
Optionally, the second light source 122 may also be installed on the side where the second end of the lens 121 of the optical-mechanical system 120 is located.
Optionally, as shown in FIG. 1, the projector 100 may further include a plurality of adjustment knobs (not numbered in the figures) used to adjust the focal length of the lens 121 of the optical-mechanical system 120.
In this embodiment, as shown in FIG. 1, the first light source 140 may be installed on the first surface of the housing 110 and used to emit target light of a specified wavelength.
Exemplarily, as shown in FIG. 1 or 2, the first light source 140 may be installed on the first surface of the outer surface of the housing 110. Exemplarily, a through hole may be provided on the first surface of the outer surface of the housing 110; the first light source 140 may be installed inside the housing 110 with its light-emitting surface facing the through hole, so that the target light emitted by the first light source 140 can exit through the through hole. The first light source 140 may also pass through the through hole, so that the light-emitting surface of the first light source 140 is exposed on the outer surface of the housing 110.
Optionally, the projector 100 may further include a second protective cover, which may be used to cover the light-emitting surface of the first light source 140.
In one implementation, the first light source 140 is an infrared light source, and the target light emitted by the first light source 140 is infrared light; in this case, the wavelength of the target light may be in the range of 760 nanometers to 1 millimeter.
Optionally, the first light source 140 may perform high-frequency modulation on the light before emitting it. Exemplarily, the first light source 140 may be an LED (light-emitting diode) or a laser. The laser may be a laser diode or a VCSEL (vertical-cavity surface-emitting laser). The first light source 140 emits high-performance pulsed light, and the pulse rate of the pulsed light can reach about 100 MHz.
Referring again to FIG. 2, the transflective sheet 130 of the projector 100 may be arranged between the lens 121 of the optical-mechanical system 120 and the digital micromirror device 123.
Exemplarily, the included angle between the transflective sheet 130 and the side of the digital micromirror device 123 close to the lens 121 of the optical-mechanical system 120 may be 45°.
Optionally, the included angle between the transflective sheet 130 and the outgoing light passing through the optical-mechanical system 120 is 45°. Exemplarily, the included angle between the transflective sheet 130 and the light emitted by the first light source 140 is 45°.
Optionally, the transflective sheet 130 can transmit visible light and reflect invisible light.
In this embodiment, the target light emitted by the first light source 140 may be invisible light. Exemplarily, the target light may be infrared light. The light emitted by the second light source 122 is visible light. Therefore, the transflective sheet 130 can transmit the light emitted by the second light source 122 and reflect the target light emitted by the first light source 140.
In this embodiment, the transflective sheet 130 may be used to reflect the target light entering the optical-mechanical system 120, where the target light entering the optical-mechanical system 120 is the target light emitted by the first light source 140 and reflected back after encountering an obstacle.
Exemplarily, the obstacle may be the display surface onto which the projector 100 projects an image. The obstacle may also be a person interacting with the projector 100.
In this embodiment, the light sensor 150 included in the projector 100 may be used to detect the target light reflected by the transflective sheet 130.
Exemplarily, the light-receiving surface of the light sensor 150 in this embodiment corresponds to the reflective surface of the transflective sheet 130, so that the light sensor 150 can detect the target light reflected by the transflective sheet 130.
Exemplarily, the light sensor 150 in this embodiment may be installed on the optical path of the light reflected by the transflective sheet 130. Optionally, the photosensitive surface of the light sensor 150 is perpendicular to the reflected optical path of the transflective sheet 130.
Optionally, the photosensitive surface of the light sensor 150 may be perpendicular to the digital micromirror device 123, and the included angle between the digital micromirror device 123 and the transflective sheet 130 is 45°, so that the included angle between the transflective sheet 130 and the photosensitive surface of the light sensor 150 is 45°. Exemplarily, the photosensitive surface of the light sensor 150 being perpendicular to the digital micromirror device 123 may mean that the photosensitive surface of the light sensor 150 is perpendicular to the side of the digital micromirror device 123 facing the lens 121 of the optical-mechanical system 120.
The light sensor 150 determines the position information of the obstacle according to the first time at which the first light source 140 emits the target light and the second time at which the light sensor 150 detects the target light reflected by the transflective sheet 130.
Optionally, the light sensor 150 is a time-of-flight (TOF) sensor.
Optionally, the included angle between the transflective sheet 130 and the outgoing light passing through the optical-mechanical system 120 is 45°, and the photosensitive surface of the light sensor 150 is perpendicular to the reflected optical path of the transflective sheet 130. Exemplarily, the included angle between the transflective sheet 130 and the light-emitting surface of the second light source 122 may be 45°.
FIG. 3 is a schematic block diagram of the projector 100. The projector 100 may include a memory 160, a storage controller 170, a processor 180, and a peripheral interface 190. Those of ordinary skill in the art can understand that the structure shown in FIG. 3 is only illustrative and does not limit the structure of the projector 100. For example, the projector 100 may also include more or fewer components than shown in FIG. 3, or have a configuration different from that shown in FIG. 1.
The above memory 160, storage controller 170, processor 180, and peripheral interface 190 are electrically connected to one another, directly or indirectly, to realize data transmission or interaction. For example, these components may be electrically connected to one another through one or more communication buses or signal lines. The above processor 180 is used to execute the executable modules stored in the memory 160.
The memory 160 may be, but is not limited to, random access memory (RAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), and the like. The memory 160 is used to store a program, and the processor 180 executes the program after receiving an execution instruction. The method performed by the projector 100 defined by any process disclosed in any embodiment of the present application may be applied to the processor 180, or implemented by the processor 180.
The above processor 180 may be an integrated circuit chip with signal processing capability. The above processor 180 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. It can implement or execute the methods, steps, and logical block diagrams disclosed in the embodiments of the present application. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The above peripheral interface 190 couples various input/output devices to the processor 180 and the memory 160. In some embodiments, the peripheral interface 190, the processor 180, and the storage controller 170 may be implemented in a single chip. In some other instances, they may each be implemented by an independent chip.
Optionally, the projector 100 may also include a control circuit used to control projection-related parameters of the projector 100. Exemplarily, the projection-related parameters may be projection brightness parameters and projection picture parameters. Exemplarily, the control circuit may also project image frames according to a video signal.
The projector 100 provided by this embodiment can perceive the three-dimensional environment around the projector 100. Further, the light sensor 150 and the lens 121 of the optical-mechanical system 120 in this embodiment can form a depth camera, and this depth camera can obtain three-dimensional environment data around the projector 100. The depth camera in this embodiment shares one lens 121 with the optical-mechanical system 120 of the projector 100, so a projector 100 with three-dimensional perception is realized with few modifications to the existing projector 100.
The working principle of the projector 100 is described below, taking a screen as an example of the display interface of the projector.
As shown in FIG. 4, after the target light L1 emitted by the first light source 140 reaches the screen C1, the target light L1 is reflected back into the lens 121; when the target light L1 encounters the transflective sheet 130, which can reflect the target light, it is reflected to the light sensor 150, and the light sensor 150 obtains the distance to the screen C1 from the flight time of the detected light pulse of the target light. Further, from the determined distance to the screen C1, the light sensor 150 can determine the three-dimensional coordinates of the screen C1.
Further, since visible light can pass through the transflective sheet 130, the outgoing light L2 emitted by the second light source 122 can pass through the transflective sheet 130 and the lens 121 in sequence to be projected onto the screen C1 and form an image on the screen C1.
The projector 100 in this embodiment can be used to perform each step of each method provided by the embodiments of the present application. The implementation of the projection method is described in detail below through several embodiments.
Embodiment 2
Please refer to FIG. 5, which is a flowchart of a projection method provided by an embodiment of the present application. The specific process shown in FIG. 5 is described in detail below.
Step 201: target light of a specified wavelength is emitted through the first light source of the projector.
In this embodiment, after the projector is started, the first light source may continuously emit the target light.
Step 202: first depth image data of the projection plane is determined through the light sensor of the projector.
In this embodiment, the light sensor and the optical-mechanical system multiplex one lens; that is, the lens of the optical-mechanical system and the light sensor can be combined into a depth camera. In this embodiment, since the optical-mechanical system and the light sensor share one lens, the optical centers of the optical-mechanical system and the depth camera coincide. In this embodiment, as far as the projector is concerned, the original projection device does not need to be modified; as far as the depth camera is concerned, as long as infrared light can be detected, the depth camera can detect three-dimensional positions. In this embodiment, the reflection of the transflective sheet allows the light sensor to detect the target light returning into the lens, which realizes that the light sensor of the depth camera can detect the target light.
In this embodiment, the first light source continuously emits target light; the target light is reflected back into the lens of the optical-mechanical system after encountering an obstacle, and is reflected to the light sensor after encountering the transflective sheet; the light sensor then receives the target light reflected from the transflective sheet. The light sensor obtains the distance of the detected obstacle by measuring the round-trip flight time of the target light.
In this embodiment, each pixel in the image of the first depth image data may represent the distance between the object represented by that pixel and the light sensor.
In this embodiment, distance detection is realized by detecting the phase shift of the light wave of the target light.
Step 203: the projection picture of the projector is corrected according to the first depth image data.
In one usage scenario, the first depth image data may be an image of the display interface onto which the projector needs to project. Exemplarily, the display interface to be projected onto may be any projection medium that can display a projected image, such as a screen or a wall. In this usage scenario, the display interface corresponding to the first depth image data may be a plane; in this case, the correction of the projection picture may involve the following situations.
In one example, if the central axis of the lens of the projector's optical-mechanical system is perpendicular to the display interface to be projected onto, and the extension of the central axis of the lens intersects the midpoint of the display interface, the center point of the image to be projected can be projected perpendicular to the display interface, so that the projected picture is the required rectangle.
In one example, if the included angle between the central axis of the lens of the projector's optical-mechanical system and the display interface to be projected onto is less than ninety degrees, the display image projected directly onto the display interface may be trapezoidal. Exemplarily, keystone correction may be performed on the projection picture. Optionally, the image to be projected may be pixel-matched with the above first depth image data and deformed according to the pixel matching, so that the image to be projected is evenly distributed in the first depth image data. Further, the deformed image may be projected onto the display interface, so that the projected picture suits the viewing needs of the human eye.
In another usage scenario, the first depth image data may be an image of a projection carrier, and the carrier may be a three-dimensional model. For example, if the scenery on a mountain needs to be projected, the three-dimensional model may be a mountain-shaped model with protrusions and grooves. For another example, in medical teaching, if a distribution map of an animal's internal organs needs to be projected, the three-dimensional model may be an animal-shaped model.
In one example, if a distribution map of an animal's internal organs needs to be projected, the animal-shaped model can be determined from the first depth image data, and the first depth image data can be recognized to determine the distribution area of each part of the animal-shaped model in the image. Further, the pixel values of the first depth image data can be read to determine the coordinate information of each part of the animal-shaped model. Further, the image to be displayed is deformed according to the parts of the animal-shaped model, so that the image to be displayed matches each part of the animal-shaped model. Exemplarily, the liver in the image to be displayed may correspond to the liver part of the animal-shaped model. Further, the deformed image is projected onto the animal-shaped model.
Through image recognition of the first depth image data, the method in the embodiments of the present application can determine the planar visual appearance of the display interface used to display the projection picture; by reading the pixel values of the image pixels of the first depth image data, the relative position of the display interface and the projector can be obtained. From the planar visual appearance and the relative position of the display interface and the projector, the mapping relationship between the image to be projected and the display interface can be determined, and the image to be projected can be deformed according to this mapping relationship, thereby correcting the projection picture of the projector. It can be understood that the application scenarios of the method in this embodiment are not limited to the above scenarios; the projection method in the embodiments of the present application can be used in any scenario in which the projection picture needs to be corrected based on a depth image.
Since the method in the embodiments of the present application uses the projector with three-dimensional perception capability of Embodiment 1, interaction with the user can also be realized by using the above projector.
In this embodiment, please refer to FIG. 6; the projection method in this embodiment may further include the following steps.
Step 204: second depth image data in a target area is determined through the light sensor.
In this embodiment, the second depth image data may be one depth image or multiple depth images.
Step 205: an indication action appearing in the target area is determined according to the second depth image data.
Exemplarily, the above indication action may be a "confirm" action, for example, an "OK" gesture, a nod of the head, a downward swing of the hand, and the like. Exemplarily, the above indication action may also be a "page turning" action, for example, a hand swinging to the left or to the right. Exemplarily, the above indication action may also be a "cut" action or a "shoot" action in a game, and the like.
In this embodiment, to determine whether an indication action exists in the second depth image data, it is first necessary to determine whether a user has performed the specified action in front of the projector. Confirming that the actor is a three-dimensional object avoids the situation in which the above indication action appears in a video shown on some display and causes the projector to execute a wrong instruction.
First, it can be determined from the second depth image data whether the captured object in the second depth image data is a flat image or a three-dimensional indicator. The three-dimensional indicator may be a user's hand, a pointing stick, a physical model, or the like. Exemplarily, whether the captured object in the second depth image data is a three-dimensional indicator can be determined from the pixel values of the image pixels in the second depth image data.
Second, if it is determined that the captured object is a three-dimensional indicator, the shape of the captured object can be recognized. In one example, the indication action that the current projector needs to receive is a confirmation; since "confirm" can be determined by a static action, one frame of depth image can be recognized to determine whether the shape of the captured object is the "confirm" action. In another example, the indication action that the current projector needs to receive may be whether the fruit in the picture needs to be "cut"; since the "cut" action is expressed by a dynamic action, whether the action performed by the captured object is "cut" can be determined from changes in the position of the captured object across multiple frames of depth images. In another example, the indication action that the current projector needs to receive may be whether a page needs to be turned; since the page-turning action is expressed by a dynamic action, changes in the position of the captured object across multiple frames of depth images can be used to determine whether the action performed by the captured object is a continuous movement from one position to another; if so, the user has performed a page-turning action.
Step 206: an instruction associated with the indication action is executed.
In this embodiment, if the recognized indication action matches a corresponding instruction, the corresponding instruction can be executed.
Exemplarily, the instruction may be a page-turning action when the displayed image needs to be switched; the instruction may be a confirmation action confirming whether a certain video needs to be played; the instruction may also be a game action instruction during game interaction, and the like.
Since the method in the embodiments of the present application uses a projector with three-dimensional perception capability, interaction with the user can be realized. For an ordinary projector without three-dimensional perception capability, occasions requiring three-dimensional perception, such as automatic keystone correction and virtual touch, require an additional three-dimensional sensing device, such as a TOF (time-of-flight) camera. However, if a three-dimensional sensing device is merely added outside the projector, it must be calibrated against the projector's optical system every time it is used; this calibration process is very complicated, and the calibration result becomes invalid after the three-dimensional sensing device is replaced. The projector in the embodiments of the present application is more convenient to use and does not need to be calibrated every time it is used, which improves the convenience of the projector.
In this embodiment, the projection method may further include: calibrating the light sensor of the projector.
Optionally, a third-party camera can be used to assist calibration; this third-party camera can capture visible light and infrared light. During calibration, the relative position of the third-party camera and the projector can be kept unchanged.
Step a: the projector projects a preset image onto a target plane, and the above third-party camera captures the projected image of the preset image; the position coordinates of the preset image in the projected image are calculated.
Step b: an infrared light source emits infrared light to project an infrared pattern onto the above target plane; the depth camera composed of the light sensor and the lens of the optical-mechanical system, and the above third-party camera, each image the infrared light; the transformation relationship between the depth camera coordinates and the third-party camera coordinates is then calculated.
Step c: using the transformation relationship obtained in Step b, the position coordinates of the preset image in the projected image obtained in Step a are transformed into the coordinate system of the depth camera composed of the light sensor and the lens of the optical-mechanical system; this yields the imaging position of the projector's projected image in the depth camera, so that the light sensor can be calibrated.
Since the depth sensor formed by the light sensor shares the same lens with the projector, the imaging position of the projector's projected image in the depth camera does not change. Therefore, a single calibration suffices for long-term use, and no additional calibration is needed when the projector is used again, which improves the convenience of the projector.
In addition, an embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored; when the computer program is run by a processor, the steps of the projection method described in the above method embodiments are executed.
The computer program product of the projection method provided by the embodiments of the present application includes a computer-readable storage medium storing program code; the instructions included in the program code can be used to execute the steps of the projection method described in the above method embodiments. For details, refer to the above method embodiments, which will not be repeated here.
In addition, the various steps in the above projection method embodiments can be implemented by software modules; the functional modules can be integrated to form an independent part, each module can exist alone, or two or more modules can be integrated to form an independent part.
If the functions are implemented in the form of software functional modules and sold or used as independent products, they can be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage media include: USB flash drives, removable hard disks, read-only memory (ROM), random access memory (RAM), magnetic disks, optical discs, and other media that can store program code. It should be noted that, in this document, relational terms such as first and second are used only to distinguish one entity or operation from another and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "include", "comprise", or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "including ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes the element.
The above are only preferred embodiments of the present application and are not intended to limit the present application; for those skilled in the art, the present application may have various modifications and changes. Any modification, equivalent replacement, improvement, or the like made within the spirit and principles of the present application shall be included in the protection scope of the present application. It should be noted that similar reference numerals and letters denote similar items in the drawings; therefore, once an item is defined in one drawing, it need not be further defined or explained in subsequent drawings.
The above are only specific implementations of the present application, but the protection scope of the present application is not limited thereto; any person skilled in the art could readily conceive of changes or replacements within the technical scope disclosed in the present application, and these shall all be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

  1. A projector, comprising:
    a housing;
    an optical-mechanical system arranged inside the housing;
    a first light source arranged on the housing and configured to emit target light of a specified wavelength;
    a transflective sheet arranged in the optical-mechanical system and configured to transmit visible light and to reflect the target light entering the optical-mechanical system, wherein the target light entering the optical-mechanical system is the target light emitted by the first light source and reflected back after encountering an obstacle; and
    a light sensor configured to detect the target light reflected by the transflective sheet, and to determine position information of the obstacle according to a first time at which the first light source emits the target light and a second time at which the light sensor detects the target light reflected by the transflective sheet.
  2. The projector according to claim 1, wherein the light sensor is a time-of-flight sensor, the first light source is an infrared light source, and the target light is infrared light.
  3. The projector according to claim 1, wherein the optical-mechanical system further comprises:
    a digital micromirror device arranged inside the housing, the digital micromirror device being located on a side away from the lens of the optical-mechanical system;
    wherein the transflective sheet is arranged between the lens of the optical-mechanical system and the digital micromirror device.
  4. The projector according to claim 3, wherein the photosensitive surface of the light sensor is perpendicular to the optical path of the light reflected by the transflective sheet.
  5. The projector according to claim 4, wherein the photosensitive surface of the light sensor is perpendicular to the digital micromirror device, and the included angle between the digital micromirror device and the transflective sheet is 45°;
    the included angle between the transflective sheet and the photosensitive surface of the light sensor is 45°.
  6. The projector according to claim 1, wherein the included angle between the transflective sheet and the outgoing light passing through the optical-mechanical system is 45°, and the photosensitive surface of the light sensor is perpendicular to the reflected optical path of the transflective sheet.
  7. The projector according to claim 1, wherein the first light source is arranged on a first surface of the housing, the first surface being the surface on which the lens of the optical-mechanical system is arranged.
  8. A projection method, applied to the projector according to any one of claims 1-6, the projection method comprising:
    emitting target light of a specified wavelength through the first light source of the projector;
    determining first depth image data of a projection plane through the light sensor of the projector; and
    correcting the projection picture of the projector according to the first depth image data.
  9. The projection method according to claim 8, further comprising:
    determining second depth image data in a target area through the light sensor;
    determining, according to the second depth image data, an indication action appearing in the target area; and
    executing an instruction associated with the indication action.
  10. The projection method according to claim 8, further comprising:
    calibrating the light sensor of the projector.
PCT/CN2020/079170 2019-12-13 2020-03-13 Projector and projection method WO2021114502A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US17/599,571 US20220196836A1 (en) 2019-12-13 2020-03-13 Projector and projection method
EP20897688.6A EP4075193A4 (en) 2019-12-13 2020-03-13 PROJECTOR AND PROJECTION METHOD
JP2021576768A JP2022523277A (ja) 2019-12-13 2020-03-13 プロジェクター及び投影方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911289136.8 2019-12-13
CN201911289136.8A CN111123625B (zh) 2019-12-13 2019-12-13 Projector and projection method

Publications (1)

Publication Number Publication Date
WO2021114502A1 true WO2021114502A1 (zh) 2021-06-17

Family

ID=70498888

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/079170 WO2021114502A1 (zh) 2019-12-13 2020-03-13 投影仪及投影方法

Country Status (5)

Country Link
US (1) US20220196836A1 (zh)
EP (1) EP4075193A4 (zh)
JP (1) JP2022523277A (zh)
CN (1) CN111123625B (zh)
WO (1) WO2021114502A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115273711A (zh) * 2021-04-29 2022-11-01 中强光电股份有限公司 Projection device and automatic projection adjustment method thereof

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002006397A (ja) * 2000-06-22 2002-01-09 ソニー株式会社 Image display device
JP4535714B2 (ja) * 2003-11-19 2010-09-01 Necディスプレイソリューションズ株式会社 Projector
CN101256264A (zh) * 2008-03-26 2008-09-03 清华大学深圳研究生院 Dual-purpose optical lens module for projection and imaging
JP2011007948A (ja) * 2009-06-24 2011-01-13 シャープ株式会社 Projector and portable device with built-in projector
JP5349365B2 (ja) * 2010-02-18 2013-11-20 三菱電機株式会社 Image projection device and image display device
JP5018988B2 (ja) * 2011-08-04 2012-09-05 カシオ計算機株式会社 Projection device, operation control method for projection device, and program
US9132346B2 (en) * 2012-04-04 2015-09-15 Kenneth J. Huebner Connecting video objects and physical objects for handheld projectors
US9462255B1 (en) * 2012-04-18 2016-10-04 Amazon Technologies, Inc. Projection and camera system for augmented reality environment
US8933974B1 (en) * 2012-09-25 2015-01-13 Rawles Llc Dynamic accommodation of display medium tilt
US9304379B1 (en) * 2013-02-14 2016-04-05 Amazon Technologies, Inc. Projection display intensity equalization
US9778546B2 (en) * 2013-08-15 2017-10-03 Mep Tech, Inc. Projector for projecting visible and non-visible images
CN106255938B (zh) * 2014-02-28 2019-12-17 惠普发展公司,有限责任合伙企业 Calibration of sensor and projector
JP6385729B2 (ja) * 2014-06-13 2018-09-05 株式会社東芝 Image processing device and image projection device
JP6372266B2 (ja) * 2014-09-09 2018-08-15 ソニー株式会社 Projection display device and function control method
JP2016122179A (ja) * 2014-12-25 2016-07-07 パナソニックIpマネジメント株式会社 Projection device and projection method
US10962764B2 (en) * 2016-04-01 2021-03-30 Intel Corporation Laser projector and camera
JP2018031803A (ja) * 2016-08-22 2018-03-01 富士通株式会社 Projector device
KR20180134256A (ko) * 2017-06-08 2018-12-18 샐터스 주식회사 Image projection apparatus
CN108549187A (zh) * 2018-04-24 2018-09-18 歌尔科技有限公司 Control method and apparatus for an interactive projection lamp, and interactive projection lamp
CN110491316A (zh) * 2019-07-08 2019-11-22 青岛小鸟看看科技有限公司 Projector and projection control method thereof

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102207667A (zh) * 2010-03-31 2011-10-05 香港应用科技研究院有限公司 Interactive projection device
CN103024324A (zh) * 2012-12-10 2013-04-03 Tcl通力电子(惠州)有限公司 Short-focus projection system
CN103974048A (zh) * 2014-04-28 2014-08-06 京东方科技集团股份有限公司 Method and apparatus for controlling projection of a wearable device, and wearable device
US20170212640A1 (en) * 2014-05-23 2017-07-27 Piqs Technology (Shenzhen) Limited Interactive display systems
CN105824173A (zh) * 2015-01-27 2016-08-03 财团法人工业技术研究院 Interactive projector and operation method thereof for determining depth information of an object
CN108027441A (zh) * 2015-09-08 2018-05-11 微视公司 Mixed-mode depth detection
CN107561833A (zh) * 2017-09-13 2018-01-09 明基电通有限公司 Projector
CN108683895A (zh) * 2018-04-16 2018-10-19 广景视睿科技(深圳)有限公司 Interactive projector and interactive projection method
CN108762483A (zh) * 2018-04-16 2018-11-06 广景视睿科技(深圳)有限公司 Interactive projector and interactive projection method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4075193A4 *

Also Published As

Publication number Publication date
CN111123625B (zh) 2021-05-18
JP2022523277A (ja) 2022-04-21
CN111123625A (zh) 2020-05-08
EP4075193A4 (en) 2023-12-27
EP4075193A1 (en) 2022-10-19
US20220196836A1 (en) 2022-06-23

Similar Documents

Publication Publication Date Title
TWI758368B Distance sensor including an adjustable-focus imaging sensor
JP6494863B2 Eye tracking with a prism
EP3903171A1 Head mounted display calibration using portable docking station with calibration target
US20240012474A1 Head-mounted display device and operating method of the same
KR102463712B1 Virtual touch recognition apparatus and method for correcting recognition error thereof
US20130002823A1 Image generating apparatus and method
CN111083453B Projection apparatus, method, and computer-readable storage medium
JP6047763B2 User interface device and projector device
WO2021208582A1 Calibration apparatus, calibration system, electronic device, and calibration method
US12003898B2 Projector and projection method
WO2023087947A1 Projection device and correction method
KR101523046B1 Distance measuring apparatus based on image processing
KR20190000052A Light emitting apparatus and ToF (Time of Flight) module using the same
TW201329508A Eye protection apparatus and method
WO2021114502A1 Projector and projection method
US9841847B2 Projection device and projection method, for projecting a first image based on a position of a moving object and a second image without depending on the position
US20220239871A1 Projector Focusing Method and Projector Focusing System Capable of Projecting High Resolution Images at Arbitrary Positions
CN115086622B Projector and correction method thereof
US20150185321A1 Image Display Device
WO2022196109A1 Measurement device, measurement method, and information processing device
JP6740614B2 Object detection device and image display device including the object detection device
CN113661433B Head-mounted display device and operation method thereof
CN114760454A Projection device and trigger correction method
TWI535288B Depth camera system
US11080874B1 Apparatuses, systems, and methods for high-sensitivity active illumination imaging

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20897688

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021576768

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020897688

Country of ref document: EP

Effective date: 20220713