CN114885136B - Projection apparatus and image correction method - Google Patents

Projection apparatus and image correction method

Info

Publication number
CN114885136B
Authority
CN
China
Prior art keywords
correction
projection
image
gray value
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210325721.4A
Other languages
Chinese (zh)
Other versions
CN114885136A (en)
Inventor
王昊
郑晴晴
卢平光
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Publication of CN114885136A publication Critical patent/CN114885136A/en
Priority to PCT/CN2022/122670 priority Critical patent/WO2023087947A1/en
Application granted granted Critical
Publication of CN114885136B publication Critical patent/CN114885136B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • H04N9/317Convergence or focusing systems
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • G03B21/53Means for automatic focusing, e.g. to compensate thermal effects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3185Geometric adjustment, e.g. keystone or convergence

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Geometry (AREA)
  • General Physics & Mathematics (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Projection Apparatus (AREA)
  • Automatic Focus Adjustment (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Focusing (AREA)

Abstract

Some embodiments of the present application provide a projection device and an image correction method. The projection device includes a light engine, a camera, and a controller. When an image correction instruction is received, the light engine of the projection device projects a correction chart card containing correction feature points onto a projection surface, and the camera photographs the chart card to obtain a correction image. The projection device may determine a first position of the correction feature points in the correction image and derive a projection relationship from the first position, the projection relationship being the mapping relationship between the play content projected by the light engine and the projection surface. Based on the projection relationship, the projection device determines target position information for the region to be projected on the projection surface and controls the light engine to project the play content into that region, thereby achieving image correction. Because the projection device can determine the mapping relationship between the light engine and the projection surface from the correction chart card, it can locate the region to be projected accurately, correct the projected image accurately, and improve the user experience.

Description

Projection apparatus and image correction method
The present application claims priority to Chinese patent application No. 202111355866.0, entitled "A projection device and display control method based on geometric correction", filed with the Chinese Patent Office on November 16, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of projection devices, and in particular, to a projection device and an image correction method.
Background
A projection device is a display device that can project an image or video onto a screen. By refraction through an optical lens assembly, the projection device directs laser light of specific colors onto the screen to form a specific image.
In use, if the position of the projection device shifts or the device is not perpendicular to the projection surface, the device cannot project the image exactly onto the preset projection area; the projected picture becomes trapezoidal and the user experience suffers. To address this, the image correction function of the projection device can adjust the display position and shape of the projected image so as to correct the trapezoidal distortion.
With existing projection devices, the user corrects the image by controlling the device to adjust its projection angle, thereby controlling the display position and shape of the projected image. However, this approach requires the user to select the adjustment direction manually; the process is cumbersome, the projected image cannot be corrected accurately, and the user experience is poor.
Disclosure of Invention
The present application provides a projection device and an image correction method, which solve the problems in the related art that the projected image cannot be corrected accurately and the user experience is poor.
In a first aspect, the present application provides a projection device comprising a light engine, a camera, and a controller. The light engine is configured to project play content onto a projection surface; the camera is configured to capture the image displayed on the projection surface; and the controller is configured to perform the following steps:
in response to an image correction instruction, controlling the light engine to project a correction chart card onto the projection surface, and controlling the camera to photograph the correction chart card to obtain a correction image, the correction chart card comprising correction feature points;
determining a first position of the correction feature points in the correction image, and acquiring a projection relationship according to the first position, the projection relationship being the mapping relationship between the play content projected by the light engine and the projection surface;
determining target position information of a region to be projected on the projection surface according to the projection relationship; and
controlling the light engine to project play content into the region to be projected according to the target position information.
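The projection relationship described in the steps above is, in effect, a planar mapping between light-engine coordinates and projection-surface coordinates, and such a mapping between two planes can be modeled as a 3x3 homography determined by four feature-point correspondences. The following is a minimal illustrative sketch in plain Python, not the patent's implementation; all function names are assumptions:

```python
def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def solve_homography(src, dst):
    """Estimate the 3x3 homography H mapping src[i] -> dst[i] from exactly
    four point correspondences (direct linear transform, H[2][2] fixed to 1)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = gauss_solve(A, b)
    return [[h[0], h[1], h[2]], [h[3], h[4], h[5]], [h[6], h[7], 1.0]]

def apply_homography(H, pt):
    """Map a point (x, y) through H with perspective division."""
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```

With such a mapping in hand, any point of the play content can be located on the projection surface, which is the basis for determining the position of the region to be projected.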
In a second aspect, the present application provides an image correction method applied to a projection device, the method comprising:
in response to an image correction instruction, controlling the light engine to project a correction chart card onto the projection surface, and controlling the camera to photograph the correction chart card to obtain a correction image, the correction chart card comprising correction feature points;
determining a first position of the correction feature points in the correction image, and acquiring a projection relationship according to the first position, the projection relationship being the mapping relationship between the play content projected by the light engine and the projection surface;
determining target position information of a region to be projected on the projection surface according to the projection relationship; and
controlling the light engine to project play content into the region to be projected according to the target position information.
It can be seen from the above technical solutions that the present application provides a projection device and an image correction method. When an image correction instruction is received, the light engine of the projection device projects a correction chart card containing correction feature points onto the projection surface, and the camera photographs the chart card to obtain a correction image. The projection device may determine a first position of the correction feature points in the correction image and derive the projection relationship from the first position, the projection relationship being the mapping relationship between the play content projected by the light engine and the projection surface. The projection device then determines target position information for the region to be projected on the projection surface according to the projection relationship, and controls the light engine to project the play content into that region, thereby achieving image correction. Because the projection device can determine the mapping relationship between the light engine and the projection surface from the correction chart card, it can locate the region to be projected accurately, correct the projected image accurately, and improve the user experience.
Drawings
In order to illustrate the technical solution of the present application more clearly, the drawings needed in the embodiments are briefly described below. It will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 illustrates a schematic layout of a projection device in some embodiments;
FIG. 2 illustrates a schematic diagram of a projection device optical path in some embodiments;
FIG. 3 illustrates a schematic circuit architecture of a projection device in some embodiments;
FIG. 4 illustrates a schematic diagram of the structure of a projection device in some embodiments;
FIG. 5 illustrates a schematic circuit diagram of a projection device in some embodiments;
FIG. 6 illustrates a schematic diagram of a projection device position change in some embodiments;
FIG. 7 illustrates a system frame diagram of a projection device implementing display control in some embodiments;
FIG. 8 illustrates an interactive flow diagram for components of a projection device in some embodiments;
FIG. 9 illustrates a schematic diagram of a calibration chart in some embodiments;
FIG. 10 illustrates a schematic diagram of a calibration chart in some embodiments;
FIG. 11 illustrates a flow diagram for acquiring a projection relationship in some embodiments;
FIG. 12 illustrates a schematic diagram of a calibration chart in some embodiments;
FIG. 13 illustrates a schematic diagram of an image before and after correction in some embodiments;
FIG. 14 illustrates a schematic diagram of HSV values for each color in some embodiments;
FIG. 15 illustrates a schematic diagram of gray value curves in some embodiments;
FIG. 16 illustrates a schematic diagram of gray value curves in some embodiments;
FIG. 17 illustrates a schematic diagram of gray value curves in some embodiments;
FIG. 18 illustrates a flow diagram of an image correction method in one embodiment.
Detailed Description
For the purposes of making the objects, embodiments, and advantages of the present application more apparent, exemplary embodiments of the present application will be described below with reference to the accompanying drawings, in which exemplary embodiments of the application are shown. It should be understood that the exemplary embodiments described are merely some, not all, of the embodiments of the application.
Based on the exemplary embodiments described herein, all other embodiments obtained by one of ordinary skill in the art without inventive effort fall within the scope of the appended claims. Furthermore, while the present disclosure is described in terms of one or more exemplary embodiments, it should be understood that each aspect of the disclosure can be practiced separately from the others. It should be noted that the brief description of terminology in the present application is intended only to facilitate understanding of the embodiments described below, not to limit them. Unless otherwise indicated, these terms should be construed according to their ordinary and customary meaning.
The terms "first", "second", "third", and the like in the description, the claims, and the above drawings are used to distinguish between similar objects or entities and do not necessarily describe a particular sequence or chronological order unless otherwise indicated. It is to be understood that terms so used are interchangeable under appropriate circumstances.
The terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements explicitly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware or/and software code that is capable of performing the function associated with that element.
The embodiment of the application can be applied to various types of projection devices. Hereinafter, a projector will be exemplified as a projection apparatus.
A projector is a device that can project images or videos onto a screen. Connected through different interfaces to a computer, a broadcast network, the Internet, a VCD (Video Compact Disc) player, a DVD (Digital Versatile Disc) player, a game console, a DV camcorder, and the like, it can play the corresponding video signals. Projectors are widely used in homes, offices, schools, entertainment venues, and the like.
FIG. 1 illustrates a schematic layout of a projection device in some embodiments. In some embodiments, the present application provides a projection system comprising a projection screen 1 and a projection device 2. The projection screen 1 is fixed at a first position, and the projection device 2 is placed at a second position so that the projected image coincides with the projection screen 1.
Fig. 2 illustrates a schematic diagram of the optical path of a projection device in some embodiments.
The embodiment of the application provides a projection device, which comprises a laser light source 100, a light machine 200, a lens 300 and a projection medium 400. The laser light source 100 provides illumination for the optical machine 200, and the optical machine 200 modulates the light beam of the light source, outputs the modulated light beam to the lens 300 for imaging, and projects the imaged light beam onto the projection medium 400 to form a projection screen.
In some embodiments, the laser light source 100 of the projection device includes a laser assembly and an optical lens assembly; the light beam emitted by the laser assembly passes through the optical lens assembly to provide illumination for the light engine. The optical lens assembly requires a high level of environmental cleanliness and a hermetic grade of sealing, whereas the chamber in which the laser assembly is mounted can be sealed to a lower, dustproof grade, reducing the sealing cost.
In some embodiments, the light engine 200 of the projection device may be implemented to include a blue light engine, a green light engine, a red light engine, a heat dissipation system, a circuit control system, and the like. It should be noted that, in some embodiments, the light emitting component of the projector may also be implemented by an LED light source.
In some embodiments, the present application provides a projection device comprising a trichromatic light engine and a controller. The trichromatic light engine, which integrates a blue light engine, a green light engine, and a red light engine, modulates the laser light that renders the pixels of the user interface. The controller is configured to: acquire an average gray value of the user interface; and when the average gray value is greater than a first threshold and has remained so for longer than a time threshold, reduce the working current of the red light engine by a preset gradient value so as to reduce the heat generated by the trichromatic light engine. It has been found that reducing the operating current of the red light engine integrated in the trichromatic light engine controls its overheating, and thereby controls overheating of the trichromatic light engine and of the projection device.
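The threshold-and-gradient rule above can be sketched as follows. This is a minimal illustration only: the class name, the concrete thresholds, the step size, and the current floor are assumptions, not values from the patent.

```python
import time

class RedLaserThermalGuard:
    """Illustrative sketch: step the red light engine's working current
    down when the user interface's average gray value stays above a
    threshold for longer than a time threshold."""

    def __init__(self, gray_threshold=200, time_threshold_s=60.0,
                 gradient_ma=50, min_current_ma=300):
        self.gray_threshold = gray_threshold      # first threshold on average gray value
        self.time_threshold_s = time_threshold_s  # required duration above the threshold
        self.gradient_ma = gradient_ma            # preset gradient to reduce current by
        self.min_current_ma = min_current_ma      # assumed safety floor for the current
        self._above_since = None                  # when the gray value first exceeded it

    def update(self, avg_gray, red_current_ma, now=None):
        """Return the (possibly reduced) red-laser working current in mA."""
        now = time.monotonic() if now is None else now
        if avg_gray > self.gray_threshold:
            if self._above_since is None:
                self._above_since = now
            elif now - self._above_since > self.time_threshold_s:
                # Bright UI sustained too long: step the current down once.
                red_current_ma = max(self.min_current_ma,
                                     red_current_ma - self.gradient_ma)
                self._above_since = now  # restart the timer after each step
        else:
            self._above_since = None     # dim UI resets the duration counter
        return red_current_ma
```

In a real device the returned value would feed the red laser's driving circuit; here it is simply returned so the rule can be exercised in isolation.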
The light engine 200 may be implemented as a trichromatic engine integrating a blue light engine, a green light engine, and a red light engine.
The following description will take the implementation of the optical engine 200 of the projection device as including a blue optical engine, a green optical engine, and a red optical engine as an example.
In some embodiments, the optical system of the projection device is composed of a light source part and a light engine part. The light source part provides illumination for the light engine; the light engine part modulates the illumination beam provided by the light source, and the modulated beam finally exits through the lens to form the projection picture.
FIG. 3 illustrates a schematic circuit architecture of a projection device in some embodiments. In some embodiments, the projection device may include a display control circuit 10, a laser light source 20, at least one laser driving assembly 30, and at least one brightness sensor 40; the laser light source 20 may include at least one laser in one-to-one correspondence with the at least one laser driving assembly 30. Here, "at least one" means one or more, and "a plurality" means two or more.
Based on the circuit architecture, the projection device can realize adaptive adjustment. For example, by providing the luminance sensor 40 in the light-emitting path of the laser light source 20, the luminance sensor 40 can detect the first luminance value of the laser light source and send the first luminance value to the display control circuit 10.
The display control circuit 10 may obtain a second brightness value corresponding to the driving current of each laser, and determine that a laser has a COD (catastrophic optical damage) fault when the difference between that laser's second brightness value and its first brightness value is greater than a difference threshold. The display control circuit then adjusts the current control signal of the laser driving component corresponding to that laser until the difference is less than or equal to the difference threshold, thereby eliminating the COD fault. In this way, the projection device can eliminate a laser's COD fault in time, reduce the laser damage rate, and improve the image display effect of the projection device.
In some embodiments, the laser light source 20 includes three lasers, which may be a blue laser 201, a red laser 202, and a green laser 203, respectively, in a one-to-one correspondence with the laser driving assembly 30. The blue laser 201 is used for emitting blue laser light, the red laser 202 is used for emitting red laser light, and the green laser 203 is used for emitting green laser light. In some embodiments, the laser driving assembly 30 may be implemented to include a plurality of sub-laser driving assemblies, each corresponding to a different color laser.
The display control circuit 10 is used to output a primary color enable signal and a primary color current control signal to the laser driving assembly 30 to drive the lasers to emit light. The display control circuit 10 is connected to each laser driving assembly 30 and is configured to output at least one enable signal corresponding to the three primary colors of each frame of the displayed images and transmit it to the corresponding laser driving assembly 30, and likewise to output at least one current control signal corresponding to the three primary colors of each frame and transmit it to the corresponding laser driving assembly 30. For example, the display control circuit 10 may be a microcontroller unit (MCU), also referred to as a single-chip microcomputer, and the current control signal may be a pulse width modulation (PWM) signal.
In some embodiments, blue laser 201, red laser 202, and green laser 203 are each coupled to laser drive assembly 30. The laser driving assembly 30 may provide a corresponding driving current to the blue laser 201 in response to the blue PWM signal and the enable signal transmitted by the display control circuit 10. The blue laser 201 is configured to emit light when driven by the driving current.
The brightness sensor is arranged in the light-emitting path of the laser light source, usually at one side of the path, so that it does not block the light path. As shown in FIG. 2, at least one brightness sensor 40 is provided in the light-emitting path of the laser light source 20, and each brightness sensor is connected to the display control circuit 10 to detect a first brightness value of one laser and send it to the display control circuit 10.
In some embodiments, the display control circuit 10 may obtain, from the correspondence, a second luminance value corresponding to a driving current of each laser, where the driving current is a current actual working current of the laser, and the second luminance value corresponding to the driving current is a luminance value that can be emitted when the laser is working normally under the driving of the driving current. The difference threshold may be a fixed value stored in advance in the display control circuit 10.
In some embodiments, the display control circuit 10 may reduce the duty cycle of the current control signal of the laser driving assembly 30 corresponding to the laser when adjusting the current control signal of the laser driving assembly 30 corresponding to the laser, thereby reducing the driving current of the laser.
In some embodiments, the brightness sensor 40 may detect a first brightness value of the blue laser 201 and send it to the display control circuit 10. The display control circuit 10 may obtain the driving current of the blue laser 201 and look up the second luminance value corresponding to that current in the current-to-luminance correspondence. It then checks whether the difference between the second luminance value and the first luminance value is greater than the difference threshold; if so, the blue laser 201 has a COD fault, and the display control circuit 10 may reduce the current control signal of the laser driving component 30 corresponding to the blue laser 201. The display control circuit 10 may then acquire the first luminance value of the blue laser 201 and the second luminance value corresponding to its driving current again, and reduce the current control signal again if the difference is still greater than the difference threshold. This cycle repeats until the difference is less than or equal to the difference threshold, thereby eliminating the COD fault of the blue laser 201 by reducing its driving current.
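The detect-and-reduce cycle just described can be sketched as a loop. The callables below are hypothetical stand-ins for sensor and driver access, not a real driver API, and the iteration cap is an added safety assumption:

```python
def eliminate_cod_fault(read_first_luminance, read_drive_current,
                        expected_luminance, reduce_duty_cycle,
                        diff_threshold, max_iterations=100):
    """Keep stepping the PWM duty cycle down while the expected
    ("second") luminance for the present drive current exceeds the
    measured ("first") luminance by more than the threshold."""
    for _ in range(max_iterations):
        first = read_first_luminance()                 # from the brightness sensor
        second = expected_luminance(read_drive_current())
        if second - first <= diff_threshold:
            return True                                # fault cleared (or none present)
        reduce_duty_cycle()                            # lowers the laser's drive current
    return False                                       # could not clear within the budget
```

The loop terminates either when the brightness gap falls within the threshold, mirroring the cycle in the text, or when the iteration budget is exhausted.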
Fig. 4 shows a schematic structural diagram of a projection device in some embodiments.
In some embodiments, the laser light source 20 in the projection device may include a blue laser 201, a red laser 202, and a green laser 203 that are separately disposed, and such a projection device may also be referred to as a three-color projection device. The blue laser 201, the red laser 202, and the green laser 203 are all multi-chip LD (MCL) packaged lasers, which are small in size and facilitate a compact arrangement of the optical paths.
In some embodiments, the controller includes at least one of a central processing unit (CPU), a video processor, an audio processor, a graphics processing unit (GPU), RAM (Random Access Memory), ROM (Read-Only Memory), first to n-th input/output interfaces, a communication bus, and the like.
In some embodiments, the projection device may be configured with a camera that cooperates with it to regulate and control the projection process. For example, the camera may be implemented as a 3D camera or a binocular camera; when implemented as a binocular camera, it specifically includes a left camera and a right camera. The binocular camera can capture the image and play content presented on the screen corresponding to the projector, i.e., the projection surface; this image or play content is projected by the light engine built into the projector.
When the projector is moved to a new position, its projection angle and its distance to the projection surface change, deforming the projected image into a trapezoid or another distorted shape. Based on the image captured by the camera and the included angle between the light engine and the projection surface, the projector controller can perform automatic keystone correction so that the projected image is displayed correctly.
Fig. 5 shows a schematic circuit configuration of the projection apparatus in some embodiments. In some embodiments, the laser drive assembly 30 may include a drive circuit 301, a switching circuit 302, and an amplification circuit 303. The driving circuit 301 may be a driving chip. The switching circuit 302 may be a metal-oxide-semiconductor (MOS) transistor. The driving circuit 301 is connected to the switching circuit 302, the amplifying circuit 303, and the corresponding laser included in the laser light source 20, respectively. The driving circuit 301 is configured to output a driving current to a corresponding laser in the laser light source 20 through the VOUT terminal based on a current control signal sent from the display control circuit 10, and transmit a received enable signal to the switching circuit 302 through the ENOUT terminal. The lasers may include n sub lasers in series, namely sub lasers LD1 to LDn, respectively. n is a positive integer greater than 0.
The switch circuit 302 is connected in series in the current path of the laser, and is used for controlling the current path to be conducted when the received enabling signal is at an effective potential. The amplifying circuit 303 is connected to the detection node E in the current path of the laser light source 20 and the display control circuit 10, respectively, for converting the detected driving current of the laser assembly into a driving voltage, amplifying the driving voltage, and transmitting the amplified driving voltage to the display control circuit 10. The display control circuit 10 is further configured to determine the amplified driving voltage as a driving current of the laser, and obtain a second luminance value corresponding to the driving current.
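The conversion the display control circuit performs, from the amplified sampling voltage back to a driving current, follows from V_E = I x R1 at the detection node E. The patent does not state the amplifier topology, so the gain expression below (e.g. 1 + R4/R3 for a non-inverting stage built from A1 and the resistors) is an assumption used only for illustration:

```python
def drive_current_from_feedback(v_amplified, r1_ohms, gain):
    """Recover the laser drive current (A) from the amplified sampling
    voltage, assuming V_E = I * R1 and V_amp = gain * V_E.  The gain
    value is an assumption; the patent does not give the topology."""
    return v_amplified / (gain * r1_ohms)
```

For example, with a 0.1-ohm sampling resistor and a gain of 10, an amplified voltage of 2.0 V corresponds to a 2.0 A drive current.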
In some embodiments, the amplifying circuit 303 may include: a first operational amplifier A1, a first resistor (also called a sampling power resistor) R1, a second resistor R2, a third resistor R3 and a fourth resistor R4.
In some embodiments, the display control circuit 10, the driving circuit 301, the switching circuit 302, and the amplifying circuit 303 form a closed loop that provides feedback adjustment of the laser driving current. The display control circuit 10 can thus adjust the driving current of a laser in time, based on the difference between its second and first brightness values, that is, adjust the laser's actual light-emitting brightness in time, avoiding prolonged COD failure of the laser and improving the accuracy of its light-emission control. It should be noted that if the laser light source 20 includes one blue laser, one red laser, and one green laser, the blue laser 201 may be disposed at the L1 position, the red laser 202 at the L2 position, and the green laser 203 at the L3 position.
The laser light at the L1 position is transmitted once through the fourth dichroic plate 604, reflected once by the fifth dichroic plate 605, and enters the first lens 901. The light efficiency at the L1 position is P1 = Pt x Pf, where Pt denotes the transmittance of the fourth dichroic plate and Pf denotes the reflectance of the fifth dichroic plate.
In some embodiments, among the three positions L1, L2, and L3, the laser light at the L3 position has the highest light efficiency and the laser light at the L1 position has the lowest. The maximum optical power output by the blue laser 201 is Pb = 4.5 watts (W), by the red laser 202 Pr = 2.5 W, and by the green laser 203 Pg = 1.5 W; that is, the blue laser 201 outputs the highest maximum optical power, the red laser 202 the next highest, and the green laser 203 the lowest. Accordingly, the green laser 203 is disposed at the L3 position, the red laser 202 at the L2 position, and the blue laser 201 at the L1 position; that is, the green laser 203 is placed in the optical path with the highest light efficiency, ensuring that the projection apparatus obtains the highest light efficiency.
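The placement rule above, lowest-power laser into the highest-efficiency path, can be sketched as a simple sort-and-pair. The power values mirror the text; the per-position efficiency numbers are illustrative assumptions only:

```python
def assign_lasers_to_positions(position_efficiency, laser_power):
    """Pair the laser with the lowest maximum optical power with the
    highest-efficiency optical path position, and so on down the list."""
    # Positions sorted from highest to lowest light efficiency.
    positions = sorted(position_efficiency, key=position_efficiency.get,
                       reverse=True)
    # Lasers sorted from lowest to highest maximum optical power.
    lasers = sorted(laser_power, key=laser_power.get)
    return dict(zip(positions, lasers))
```

With the example values from the text, this reproduces the stated layout: green at L3, red at L2, blue at L1.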
In some embodiments, the controller includes at least one of a central processing unit (Central Processing Unit, CPU), a video processor, an audio processor, a graphics processor (Graphics Processing Unit, GPU), RAM (Random Access Memory), ROM (Read-Only Memory), first to nth interfaces for input/output, a communication bus (Bus), and the like. In some embodiments, after being started, the projection device may directly enter the display interface of the signal source selected last time, or the signal source selection interface. The signal source may be a preset video-on-demand program, or at least one of an HDMI interface, a live TV interface, and the like; after the user selects a different signal source, the projector displays the content obtained from that source.
After the user turns on the projection device, the projection device can project preset content onto the projection surface, which may be a wall or a curtain, and the projection image is displayed on the projection surface for the user to watch. However, when the user places the projection device at an improper position, the projection device may not be perpendicular to the projection surface, so the projected image is displayed as a trapezoid or otherwise deformed image. Alternatively, during use the user may touch the projection device and change its position, altering the projection angle and the distance to the projection surface, which also deforms the projection image into a trapezoid or other malformed shape. In these cases, image correction processing must be performed so that the projection apparatus can project an image of standard shape on the projection surface for the user to view. Fig. 6 shows a schematic diagram of some embodiments in which the position of the projection device has changed. The projection device is initially at position A and projects onto a suitable projection area, which is rectangular and can normally cover a corresponding rectangular curtain completely and accurately. When the projector moves from position A to position B, a deformed projection image is often produced, for example a trapezoidal image, so that the projection image no longer coincides with the rectangular curtain.
In the related art, a user can manually control the projection device to adjust the projection angle, thereby controlling the display position and shape of the projection image and achieving image correction. However, this manual correction process is cumbersome and cannot correct the projected image accurately. Alternatively, the projection device can perform automatic keystone correction based on a deep-learning neural network by coupling the included angle between the optical machine and the projection surface to the correct display of the projection image. However, this correction method is slow, and a large amount of scene data is required for model training to reach a given precision, so it is not suitable for scenarios where the user's device must be corrected immediately and quickly.
The projection equipment provided by the embodiment of the application can quickly and accurately correct the image so as to solve the problems.
In some embodiments, the projection device may include an optical engine, a camera, and a controller. The optical machine is used for projecting the preset playing content to the projection surface, and the projection surface can be a wall surface or a curtain.
The camera is used to capture the image displayed on the projection surface. The camera may include a lens assembly in which a photosensitive element and a lens are disposed. The lens refracts incoming light through a plurality of lens elements so that the light from the scene is focused onto the photosensitive element. Depending on the camera's specification, the photosensitive element may be based on a charge-coupled device or a complementary metal-oxide-semiconductor detection principle; it converts the optical signal into an electrical signal through a photosensitive material and outputs the converted electrical signal as image data. The image captured by the camera may be used for image correction of the projection device.
The projection device has an image correction function. When the projection image projected by the projection device in the projection plane has a deformed image such as a trapezoid, the projection device can automatically correct the projection image to obtain an image with a regular shape, which can be a rectangular image, so that the image correction is realized.
Specifically, when receiving the image correction instruction, the projection device may turn on the image correction function to correct the projection image. The image correction instruction refers to a control instruction for triggering the projection device to automatically perform image correction.
In some embodiments, the image correction instruction may be an instruction that is actively entered by a user. For example, the projection device may project an image on the projection surface after the power of the projection device is turned on. At this time, the user can press an image correction switch preset in the projection device or an image correction key on a remote controller matched with the projection device, so that the projection device starts an image correction function and automatically performs image correction on the projection image.
In some embodiments, the image correction instructions may also be automatically generated according to a control program built into the projection device.
The projection device can actively generate an image correction instruction after being started. For example, when the projection device detects a first video signal input after power-on, an image correction instruction may be generated, triggering the image correction function.
Or the projection equipment can automatically generate an image correction instruction in the working process. In consideration of the fact that a user may actively move the projection device or inadvertently touch the projection device during use of the projection device, the placement posture or the setting position of the projection device may be changed, and the projection image may also become a trapezoidal image. At this time, in order to secure viewing experience of the user, the projection apparatus may automatically perform image correction.
Specifically, the projection device may detect its own situation in real time. When the projection device detects that the self-placement posture or the setting position is changed, an image correction instruction can be generated, so that an image correction function is triggered.
In some embodiments, the projection device can monitor device movement in real time through its built-in components and feed the monitoring results back to the controller in real time, so that the controller starts the image correction function as soon as movement is detected and the projection image is corrected immediately after the projection device moves. For example, the projection device may be configured with a gyroscope or a TOF (Time of Flight) sensor, and the controller receives monitoring data from the gyroscope or the TOF sensor to determine whether the projection device is moving.
After determining that the projection device is moving, the controller may generate an image correction instruction and turn on the image correction function.
The time-of-flight sensor realizes distance measurement and position-movement monitoring by means of its transmitted light pulses; its measurement accuracy does not decrease as the measurement distance increases, and it has strong anti-interference capability.
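As a rough sketch of how such monitoring might work, distance follows from half the pulse round-trip time, and a movement flag can be raised when consecutive readings differ noticeably. The 2 cm threshold and the helper names are assumptions, not values from the patent.

```python
# Sketch: time-of-flight ranging and a simple movement check.
# The 2 cm threshold and the function names are illustrative assumptions.
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance to the projection surface: half the pulse round trip."""
    return C * round_trip_s / 2.0

def moved(prev_m: float, curr_m: float, threshold_m: float = 0.02) -> bool:
    """Flag a position change when consecutive readings differ noticeably."""
    return abs(curr_m - prev_m) > threshold_m

d = tof_distance(20e-9)   # a 20 ns round trip is about 3 m
```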
FIG. 7 illustrates a system frame diagram of a projection device implementing display control in some embodiments.
In some embodiments, the projector provided by the application has long-focus micro-projection characteristics and includes a controller. Through preset algorithms, the controller can perform display control on the optical machine's picture to realize automatic keystone correction, automatic curtain fitting, automatic obstacle avoidance, automatic focusing, eye protection, and other functions for the displayed picture. With a display control method based on geometric correction, the projector can be moved flexibly in long-focus micro-projection scenarios. Each time the device is moved, problems such as projection picture distortion, foreign objects occluding the projection plane, or the projection picture straying from the curtain may arise; the controller can control the projector's automatic display correction function so that the projector automatically resumes normal display.
In some embodiments, the geometry-correction-based display control system includes an application service layer (APK Service: Android application package service), a service layer, and an underlying algorithm library.
The application service layer implements interaction between the projector and the user. Based on the displayed user interface, the user can configure the projector's parameters and display pictures; when the projector's display is abnormal, the controller can automatically correct the displayed picture by coordinating and calling the algorithm services corresponding to the various functions.
The service layer may include a correction service, a camera service, a time-of-flight (TOF) service, and so on. Upward, these services correspond to the application service layer (APK Service) and implement the specific functions of the projector's differently configured services; downward, the service layer connects to the algorithm library and to data acquisition services such as the camera and the time-of-flight sensor, encapsulating the complex logic of the bottom layer.
The underlying algorithm library provides the correction service with the control algorithms the projector needs to realize its various functions, and the library can perform various mathematical operations based on OpenCV, providing basic computing capability for the correction service. OpenCV is a cross-platform computer vision and machine learning software library released as open source under the BSD license, and it can run in a variety of existing operating system environments.
In some embodiments, the projector is configured with a gyroscope sensor. While the device is moving, the gyroscope sensor can sense the position change and actively collect movement data. The collected data are then sent through the system framework layer to the application service layer to support user interface interaction and the application data required during program interaction; the collected data may also be used for the controller's data calls when implementing algorithm services.
In some embodiments, the projector is configured with a time-of-flight (TOF) sensor; after the sensor acquires the corresponding data, the data are sent to the corresponding time-of-flight service of the service layer.
After the time-of-flight service obtains the data, it sends them through the process communication framework to the application service layer, where they are used for the controller's data calls, the user interface, program applications, and other interactions.
In some embodiments, the projector is configured with a camera for capturing images, which may be implemented as a binocular camera, or a depth camera, or a 3D camera, or the like;
The data collected by the camera are sent to the camera service, which then sends the captured image data to the process communication framework and/or the projector correction service. The projector correction service can receive the camera data sent by the camera service, and the controller can call the corresponding control algorithms in the algorithm library according to the different functions to be realized.
In some embodiments, the algorithm service exchanges data with the application service through the process communication framework and then feeds the calculation results back to the correction service through the same framework. The correction service sends the obtained calculation results to the projector operating system to generate control signaling, which is sent to the optical machine control driver to control the optical machine's working condition and realize automatic correction of the displayed image.
In some embodiments, the projection device may correct the projected image when an image correction instruction is detected.
To correct the projection image, an association among the distance, the horizontal included angle, and the offset angle may be created in advance. The projection device controller then determines the current included angle between the optical machine and the projection surface by acquiring the current distance from the optical machine to the projection surface and combining it with the association, thereby correcting the projection image. Here, the included angle is the angle between the optical axis of the optical machine and the projection surface.
However, in some complex environments, the association created in advance may not fit every situation, which can cause the correction of the projection image to fail.
In order to accurately correct the projection image, the projection device can redetermine the position of the area to be projected, so that the play content is projected to the area to be projected, and the correction of the projection image is realized.
FIG. 8 illustrates an interactive flow diagram of components of a projection device in some embodiments.
In some embodiments, the projection device may perform trapezoidal correction on the projected image when the image correction instruction is acquired. Specifically, the correction function can be realized on the projection image through a preset trapezoidal correction algorithm. The projection device may project a correction chart card in the projection surface and determine positional information of the projection surface based on the correction chart card. Based on the positional information of the projection surface, a projection relationship between the projection surface and the projection device can be determined. In the embodiment of the application, the projection relation refers to a projection relation of the projection device for projecting an image onto the projection surface, specifically, a mapping relation between a play content projected by an optical machine of the projection device and the projection surface. After determining the projection relationship between the projection surface and the projection device, the projection device can determine the position information of the area to be projected, so as to perform projection and realize image correction.
It should be noted that, in the embodiment of the present application, a conversion matrix between the projection surface and the optical machine coordinate system under the world coordinate system may be constructed based on a binocular camera. This conversion matrix is the homography between the projection image on the projection surface and the chart card played by the optical machine; the homography is also referred to as the projection relationship, and any shape conversion between the projection image and the played chart card can be implemented using it.
In some embodiments, the projection device may first project the correction map card when the image correction instruction is acquired. Specifically, the controller may control the optical engine to project the calibration chart onto the projection surface.
After the correction chart card is projected, the controller can also control the camera to shoot the correction chart card displayed in the projection surface, so as to obtain a correction image.
The correction chart card may contain a plurality of correction feature points, so that the correction image captured by the camera also contains all the correction feature points in the correction chart card. The position information of the projection surface can be determined from these correction feature points. It should be noted that the position of a plane can be determined once the positions of three points in the plane are known. Therefore, in order to determine the position information of the projection surface, the positions of at least three points on the projection surface must be determined, i.e., the correction chart card must include at least three correction feature points. The position information of the projection surface is then determined from those at least three correction feature points.
In some embodiments, the calibration chart may include a pattern and a color feature preset by a user. The correction chart card may be a checkerboard chart card, and the checkerboard chart card is set to be a checkerboard chart with black and white intervals, as shown in fig. 9, and correction characteristic points contained in the checkerboard chart card are corner points of a rectangle. The pattern in the correction chart card can also be configured as a ring chart card, including a ring pattern, as shown in fig. 10, where correction feature points included in the ring chart card are corresponding solid points on each ring. In some embodiments, the correction chart card may also be configured as a combination of the two types of patterns described above, or may also be configured as other patterns having identifiable feature points.
FIG. 11 illustrates a flow diagram for acquiring a projection relationship in some embodiments.
In some embodiments, after the optical engine projects the correction chart card onto the projection surface, the controller may control the camera to shoot the correction chart card, so as to obtain a correction image, so as to obtain the position of the correction feature point.
Specifically, the camera may be a binocular camera, with the two cameras disposed on either side of the optical machine. The correction chart card is captured by the binocular camera: the left camera captures a first correction image, and the right camera captures a second correction image.
The controller may perform image recognition processing on the correction image to obtain the first position of each correction feature point in the correction image. In the embodiment of the present application, the first position refers to the coordinate information of the correction feature point in the image coordinate system corresponding to the correction image.
The image coordinate system is defined as follows: the center of the image is taken as the origin of coordinates, and the X and Y axes are parallel to the two sides of the image. In the embodiment of the application, the image coordinate system may be set as follows: for the projection area preset by the user, the center point of the projection area is taken as the origin, the horizontal direction as the X axis, and the vertical direction as the Y axis. This image coordinate system may be set in advance according to the preset projection area.
Coordinate information of correction feature points in the correction image can be determined from the image coordinate system.
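A minimal sketch of mapping the camera's pixel coordinates (origin at the top-left corner, Y pointing down) into the centered image coordinate system described above; the function name and the resolution are illustrative:

```python
# Sketch: converting camera pixel coordinates (origin at top-left, Y down)
# into the image coordinate system above (origin at the image centre,
# X right, Y up). The 1920x1080 resolution is illustrative.
def pixel_to_image_coords(u, v, width, height):
    x = u - width / 2.0
    y = height / 2.0 - v        # flip the Y axis so it points up
    return x, y

centre = pixel_to_image_coords(960, 540, 1920, 1080)   # -> (0.0, 0.0)
```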
For the same correction feature point, its first position in the first correction image captured by the left camera may differ from its first position in the second correction image captured by the right camera. From these two first positions of the same correction feature point, its coordinates in the camera coordinate system of either the left or the right camera can be determined.
In the embodiment of the application, the camera coordinate system is defined as follows: the optical center of the camera is taken as the origin, the optical axis as the Z axis, and the plane through the origin parallel to the projection surface as the XOY plane, establishing a spatial rectangular coordinate system.
For a binocular camera, the coordinates of the correction feature point are determined in the camera coordinate system of one of the two cameras; the embodiment of the application takes the left camera as an example.
Specifically, the coordinate information of the correction feature point in the camera coordinate system of the left camera can be determined according to the first position of the same correction feature point in the two correction images, and the coordinate information is set as P (x, y, z).
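For a rectified binocular pair, P(x, y, z) can be recovered from the two first positions by simple disparity triangulation. This sketch assumes rectified images, a focal length f in pixels, and a baseline b in metres, all illustrative values rather than parameters from the patent:

```python
# Sketch: recovering P(x, y, z) in the left-camera coordinate system by
# disparity triangulation. Assumes rectified images; the focal length f
# (pixels) and baseline b (metres) are illustrative values.
def triangulate(xl, yl, xr, f=1000.0, b=0.06):
    """(xl, yl): first position in the left image; xr: x in the right image."""
    disparity = xl - xr                 # same row in rectified images
    z = f * b / disparity               # depth along the optical axis
    return (xl * z / f, yl * z / f, z)

P = triangulate(120.0, 40.0, 100.0)     # disparity 20 px -> z = 3.0 m
```

In practice this step would use the cameras' full calibrated intrinsics and extrinsics (e.g. OpenCV's triangulation routines) rather than the simplified pinhole model shown here.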
In some embodiments, after acquiring the coordinate information of the correction feature point in the camera coordinate system of the left camera, the controller may convert the coordinate information into the coordinate of the correction feature point in the opto-mechanical coordinate system.
In the embodiment of the application, the optical machine coordinate system is defined as follows: the optical center of the optical machine is taken as the origin, the optical axis as the Z axis, and the plane through the origin parallel to the projection surface as the XOY plane, establishing a spatial rectangular coordinate system. It should be noted that the optical machine coordinate system and the camera coordinate system can be converted into each other, so the coordinates of the correction feature points in the camera coordinate system can be converted into coordinates in the optical machine coordinate system. Specifically, the correction feature points can be converted between the two coordinate systems using the external parameters between the optical machine and the camera. These external parameters are device parameters marked on the device housing or in the specification when the projection device is manufactured and shipped; they are usually set based on the projection device's functions, assembly, manufacture, and parts, apply to all projection devices of the same model, and may include a rotation matrix and a translation matrix between the optical machine and the camera.
According to the external parameters between the optical machine and the camera, the conversion relationship between the optical machine coordinate system and the camera coordinate system can be determined, and the coordinates of the correction feature points in the optical machine coordinate system can then be obtained.
The conversion formula is as follows:
P′(x′,y′,z′)=RRR*P+TTT (1)
Wherein:
p '(x', y ', z') is the coordinate of the correction feature point in the opto-mechanical coordinate system.
RRR is the rotation matrix between the optical machine and the camera, and TTT is the translation matrix between the optical machine and the camera.
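A numeric sketch of formula (1); the extrinsics below (identity rotation and a 5 cm translation) are assumptions for illustration, not calibration data:

```python
# Numeric sketch of formula (1): converting a correction feature point
# from the camera coordinate system to the optical machine coordinate
# system. The extrinsics here are illustrative assumptions.
import numpy as np

RRR = np.eye(3)                     # rotation between the two devices
TTT = np.array([0.05, 0.0, 0.0])    # translation (metres), assumed

P = np.array([0.36, 0.12, 3.0])     # feature point in camera coordinates
P_prime = RRR @ P + TTT             # formula (1)
```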
In some embodiments, after the coordinates of the correction feature points in the optical-mechanical coordinate system are obtained, a projection plane equation of the projection plane in the optical-mechanical coordinate system may be determined.
It should be noted that coordinate information of at least three points is required to determine the position of a plane. Accordingly, the controller may acquire the first positions of at least three correction feature points in the correction image and determine their coordinate information in the camera coordinate system from those first positions. The coordinate information in the camera coordinate system can then be converted into coordinate information in the optical machine coordinate system.
After determining the coordinate information of at least three correction feature points in the optical machine coordinate system, the controller can fit the coordinates of the correction feature points, so as to obtain a projection plane equation of the projection plane in the optical machine coordinate system. The projection plane equation can be expressed as:
z=ax+by+c (2)
or, equivalently, as the following formula:
ax+by-z+c=0 (3)
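Given the opto-mechanical coordinates of at least three correction feature points, the coefficients a, b, c of formula (2) can be fitted by least squares. A minimal sketch with synthetic points (the fitting method and sample data are illustrative; the patent does not prescribe a particular solver):

```python
# Sketch: least-squares fit of the projection plane equation
# z = a*x + b*y + c (formula (2)) from the opto-mechanical coordinates
# of the correction feature points. The sample points are synthetic.
import numpy as np

def fit_plane(points):
    """points: (N, 3) array-like of (x, y, z), N >= 3; returns (a, b, c)."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return tuple(coeffs)

# Points sampled from z = 0.1*x + 0.2*y + 3:
a, b, c = fit_plane([(0, 0, 3.0), (1, 0, 3.1), (0, 1, 3.2), (1, 1, 3.3)])
```

With exactly three points the fit is exact; with more points the least-squares solution averages out per-point noise.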
In some embodiments, after determining the projection plane equation of the projection surface in the optical machine coordinate system, the controller may obtain from it a conversion matrix between the optical machine coordinate system and the world coordinate system; this conversion matrix is used to characterize the projection relationship.
In the embodiment of the application, the world coordinate system is set as follows: the image coordinate system is taken as an XOY plane, namely the projection plane is taken as an XOY plane, wherein the origin is the center point of a projection area preset by a user. The Z axis is set in the direction perpendicular to the projection plane, and a space coordinate system is established.
When acquiring the conversion matrix between the optical machine coordinate system and the world coordinate system, the controller may determine the representation of the projection surface in the world coordinate system and its representation in the optical machine coordinate system, respectively.
Specifically, the controller may first determine a unit normal vector of the projection surface in the world coordinate system.
Since the projection plane itself is the XOY plane of the world coordinate system, the unit normal vector of the projection plane under the world coordinate system can be expressed as:
m=(0,0,1)^T (4)
The controller may further obtain the unit normal vector of the projection surface in the optical machine coordinate system from the projection plane equation of the projection surface in that coordinate system. For the plane z = ax + by + c, the unit normal vector in the optical machine coordinate system may be expressed (up to sign) as:
n=(a,b,-1)/√(a²+b²+1) (5)
According to the unit normal vector of the projection surface under two coordinate systems, a conversion matrix between an optical machine coordinate system and a world coordinate system can be obtained, and the interrelationship is expressed as the following formula:
m=R1*n (6)
wherein: r1 represents a transformation matrix between the opto-mechanical coordinate system and the world coordinate system.
The conversion matrix can represent the mapping relation between the playing content projected by the optical machine and the projection surface.
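One way to realize formula (6) is to build the rotation that maps the projection plane's unit normal n in the optical machine coordinate system onto m = (0, 0, 1)^T in the world coordinate system, e.g. via the Rodrigues rotation formula. This is a sketch under that construction, with illustrative plane coefficients; the patent does not prescribe this particular numerical method:

```python
# Sketch of formula (6): construct the rotation R1 with m = R1 @ n,
# where n is the plane's unit normal in optical machine coordinates and
# m = (0, 0, 1)^T, using the Rodrigues rotation formula. The plane
# coefficients below are illustrative.
import numpy as np

def rotation_between(n, m):
    """Rotation matrix R with R @ n == m, for unit vectors n and m."""
    n, m = np.asarray(n, float), np.asarray(m, float)
    v = np.cross(n, m)                  # rotation axis (unnormalized)
    s, c = np.linalg.norm(v), float(np.dot(n, m))
    if s < 1e-12:                       # parallel or anti-parallel normals
        return np.eye(3) if c > 0 else np.diag([1.0, -1.0, -1.0])
    K = np.array([[0, -v[2], v[1]],
                  [v[2], 0, -v[0]],
                  [-v[1], v[0], 0]])    # cross-product matrix of v
    return np.eye(3) + K + K @ K * ((1 - c) / s**2)

a, b = 0.1, 0.2                                        # plane z = a*x + b*y + c
n = np.array([a, b, -1.0]) / np.sqrt(a*a + b*b + 1)    # unit normal of the plane
R1 = rotation_between(n, np.array([0.0, 0.0, 1.0]))    # m = R1 @ n
```

Note that aligning normals fixes the rotation only up to a spin about the normal; a full implementation would also fix the in-plane orientation, e.g. from the chart card's horizontal axis.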
After determining the mapping relationship, the controller can convert the coordinates of a point between the world coordinate system and the optical machine coordinate system. For a target area already determined on the projection surface, its coordinate representation in the world coordinate system can be determined directly. The controller can convert that representation into the coordinate representation in the optical machine coordinate system according to the conversion matrix, thereby determining the position information of the target area for the projection device; the play content is then projected directly into the target area, and image correction is achieved.
In the related art, when determining the equation of a plane through a feature chart card, all the feature points in the chart card are generally identified together and fitted together to determine the equation representing the plane. However, camera distortion occurs during shooting, and in the captured picture the distortion is greater the farther a region is from the center of the picture. For feature points extracted in regions with larger distortion, the coordinates obtained by identification are less accurate; if these feature points are used in the fit, the resulting plane equation is also inaccurate, which degrades the accuracy of image correction and produces a poor result.
In some embodiments, in order to improve accuracy of image correction, when an image correction instruction is acquired, the controller may project a preset correction chart card into the projection surface, where the correction chart card may include a plurality of correction areas set by a user. FIG. 12 illustrates a schematic diagram of a calibration chart in some embodiments. Three correction areas may be included in the correction chart card: a first correction region, a second correction region, and a third correction region.
The three correction areas may be arranged in sequence along the direction from the center of the correction chart card to its edge. The first correction area is located in the middle of the correction chart card, such as area A in the figure. In order to determine the projection plane equation, the first correction area must include at least three correction feature points; it may be a checkerboard pattern or a circular-ring pattern. The second correction region lies between the first and third correction regions, such as the four B regions in the figure, comprising correction areas B1 to B4. The second correction regions may all have the same area, which may be smaller than that of the first correction region. Each second correction area contains a pattern preset by the user that includes at least three correction feature points; the patterns of the second correction areas may be the same or different, and the embodiment of the application is not limited in this respect. The third correction area is located at the edge of the correction chart card, such as the 16 C regions in the figure, comprising correction areas C1 to C16. The third correction regions may all have the same area, smaller than that of a second correction region. Each third correction area includes at least three correction feature points and may contain a pattern preset by the user; the patterns of the third correction areas may be the same or different.
Because camera distortion occurs during shooting, the first correction area suffers the least distortion of the three, so the projection plane equation determined from the correction feature points in the first correction area is the most accurate. The second correction region is distorted more than the first, so the accuracy of a projection plane equation determined from its correction feature points is lower than that of the first correction region. The third correction region is distorted the most, so the accuracy of a projection plane equation determined from its correction feature points is the lowest.
Accordingly, the controller may select correction feature points in the correction image as follows:
since the distortion condition of the middle region of the corrected image photographed by the camera is the weakest, the middle region can be preferentially identified. Specifically, the controller may identify the first correction region in the corrected image first, so as to obtain position information of at least three correction feature points in the first correction region. The projection plane equation can be determined based on the positional information of the at least three correction feature points.
When the optical machine projects the correction chart card, part of the projection may be blocked by the environment so that it cannot be displayed on the projection surface. For example, the first correction area may be blocked and therefore absent from the correction image captured by the camera, which prevents the subsequent acquisition of the projection plane equation. Therefore, if the controller does not recognize the first correction region, it may recognize the other correction regions.
Since the second correction regions suffer less distortion than the third correction regions, they can be identified preferentially. Specifically, the controller may identify a second correction region in the correction image and obtain the position information of at least three correction feature points in it. Because each second correction area contains at least three correction feature points, identifying any one second correction area is sufficient to obtain the position information of at least three correction feature points; alternatively, all second correction areas may be identified.
If all the second correction areas are also blocked, i.e. the controller cannot identify any second correction area in the correction image, it may identify a third correction area. Specifically, the controller may identify a third correction region in the correction image and obtain the position information of at least three correction feature points in it. Either any one third correction area or all third correction areas may be identified to obtain the position information of at least three correction feature points. The projection plane equation can then be determined based on this position information.
In the above method, the controller identifies each correction region individually, that is, at least three correction feature points identified belong to the same correction region.
In some embodiments, the controller may also identify all correction areas in the correction image, so that the position information of at least three correction feature points is obtained from any of the correction areas. These correction feature points may belong to the same correction region or to different kinds of correction regions.
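The center-outward fallback described above can be sketched as follows; the detector callables and coordinates are hypothetical stand-ins, as the patent does not prescribe an implementation.

```python
# Hedged sketch of the fallback: try the correction regions in order of
# increasing distortion and stop at the first one that yields enough
# feature points to fit a plane. Detector callables are hypothetical.
def select_correction_points(*detectors):
    for detect in detectors:
        points = detect()           # list of (x, y) image coordinates
        if len(points) >= 3:        # at least three points determine a plane
            return points
    return []                       # every correction region is occluded

# Usage: the first (A) region is blocked, so the B-region detector wins.
points = select_correction_points(
    lambda: [],                                   # A region occluded
    lambda: [(120, 80), (200, 80), (160, 140)],   # B region visible
    lambda: [(10, 10), (630, 10), (10, 470)],     # C region (unused here)
)
```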
In some embodiments, considering that the accuracy of image correction differs depending on which correction areas are recognized, multiple recognition modes may be provided for the projection device.
For example, three recognition modes may be provided. In the first recognition mode, the controller recognizes only the first correction region and acquires the projection plane equation from its correction feature points to perform image correction. In this mode, the accuracy of image correction is the highest, but the first correction region contains few correction feature points compared with the entire correction image, so the adaptability of correction feature point recognition is low: once the first correction region is occluded, no correction feature points can be identified.
In the second recognition mode, the controller may recognize the first and second correction regions and perform image correction using correction feature points from both. The accuracy of image correction may be lower than in the first recognition mode, but the adaptability of correction feature point recognition is higher: when the first correction region is blocked, correction feature points can still be identified in the second correction regions.
In the third recognition mode, the controller may recognize all correction areas, so correction feature points in all of them can be identified. The accuracy of image correction may be lower than in the first and second recognition modes, but the adaptability of correction feature point recognition is the highest: even if both the first and second correction regions are blocked, correction feature points can still be identified in the third correction regions.
The user may select different recognition modes as desired.
In some embodiments, after determining the projection relationship, i.e. the conversion matrix between the optical machine coordinate system and the world coordinate system, the controller may determine the target position information of the region to be projected in the projection plane using the projection relationship. In the embodiments of the application, the target position information refers to the position information of the area to be projected in the optical machine coordinate system.
The controller may first determine the position information, in the world coordinate system, of the region to be projected in the projection plane, and then convert it into position information in the optical machine coordinate system according to the conversion matrix. The controller may then control the optical machine to project the play content to the target position, i.e. into the area to be projected.
In some embodiments, the controller may first determine a specific region to be projected when acquiring target position information of the region to be projected.
The area to be projected may be a preset projection area. Specifically, it may be set by a professional after-sales technician or be a fixed projection area set by the user: the projection device is arranged at a placement position that achieves the optimal projection effect during operation and projects to a display area in the projection surface.
Since the preset projection region in the projection plane is predetermined, its position information in the projection plane, i.e. in the image coordinate system, can be determined directly. The projection area is generally set as a rectangular area, whose position information can be represented by the coordinates of its four vertices. Let the coordinates of one vertex of the rectangular region be (x, y). Since the projection plane is the XOY plane of the world coordinate system and the origin is the center point of the preset projection area, the vertex coordinates are represented in the world coordinate system as (x, y, 0).
The controller may convert the coordinate representation in the world coordinate system into a coordinate representation in the optical machine coordinate system according to the conversion matrix, so that the vertex coordinates in the optical machine coordinate system are (x', y', z'). The controller may further obtain the coordinates of all vertices of the preset projection area in the optical machine coordinate system, thereby determining the position information of the projection area in the optical machine coordinate system. The controller can then control the optical machine to project the play content to this position, realizing image correction.
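As an illustrative sketch of this conversion (not the patent's own implementation), a homogeneous 4x4 matrix T mapping world coordinates to optical machine coordinates can be applied to each rectangle vertex; the matrix values below are invented stand-ins.

```python
import numpy as np

def world_to_light_engine(T, vertex):
    """Map a world-coordinate point (x, y, 0) into the optical machine frame
    using a homogeneous 4x4 conversion matrix T."""
    p = np.array([*vertex, 1.0])     # homogeneous coordinates
    q = T @ p
    return q[:3] / q[3]              # back to Cartesian (x', y', z')

# A pure translation stands in for the real conversion matrix.
T = np.array([[1.0, 0.0, 0.0, 0.5],
              [0.0, 1.0, 0.0, -0.2],
              [0.0, 0.0, 1.0, 2.0],
              [0.0, 0.0, 0.0, 1.0]])
corner = world_to_light_engine(T, (0.3, 0.4, 0.0))   # one vertex (x, y, 0)
```

Each of the four rectangle vertices would be mapped this way to obtain the position information of the projection area in the optical machine coordinate system.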
In some embodiments, when determining the area to be projected, the controller may acquire the maximum inscribed rectangle of the display area of the correction chart card in the correction image, and determine the maximum inscribed rectangle as the area to be projected.
The aspect ratio of the maximum inscribed rectangle may be the same as that of the correction chart card. Within the display area of the correction chart card in the correction image, a rectangle with the aspect ratio of the correction chart card is continuously enlarged until all four of its vertices lie on the edge of the display area, thereby obtaining the maximum inscribed rectangle.
The controller can acquire the position information of the maximum inscribed rectangle under the coordinate system of the optical machine, and control the optical machine to project the playing content into the maximum inscribed rectangle, so as to realize image correction.
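One simple way to realize this step, sketched under assumptions the patent does not state (a convex display area, an axis-aligned rectangle centered at the vertex centroid), is to grow a rectangle of the chart card's aspect ratio until a vertex leaves the display area:

```python
def inside(poly, p):
    """Point-in-convex-polygon test via consistent cross-product signs."""
    sign = 0
    n = len(poly)
    for i in range(n):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
        cross = (x2 - x1) * (p[1] - y1) - (y2 - y1) * (p[0] - x1)
        if cross != 0:
            if sign == 0:
                sign = 1 if cross > 0 else -1
            elif (cross > 0) != (sign > 0):
                return False
    return True

def max_inscribed_rect(poly, aspect, step=0.01):
    """Grow a centered rectangle of the given width/height ratio until one
    vertex exits the display area; returns (width, height)."""
    cx = sum(x for x, _ in poly) / len(poly)
    cy = sum(y for _, y in poly) / len(poly)
    w = 0.0
    while True:
        half_w = (w + step) / 2
        half_h = (w + step) / (2 * aspect)
        corners = [(cx - half_w, cy - half_h), (cx + half_w, cy - half_h),
                   (cx + half_w, cy + half_h), (cx - half_w, cy + half_h)]
        if not all(inside(poly, c) for c in corners):
            return w, w / aspect
        w += step

# Usage on a square display area with a 16:9 chart card ratio.
w, h = max_inscribed_rect([(0, 0), (4, 0), (4, 4), (0, 4)], 16 / 9)
```

In practice the display area is a perspective-distorted quadrilateral rather than a square, but the same growth procedure applies.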
Fig. 13 shows a schematic diagram before and after image correction in some embodiments.
In some embodiments, when the projection device projects the play content onto the projection surface, if an obstacle exists between the projection device and the projection surface, part of the play content cannot reach the projection surface. Therefore, the projection device can perform obstacle avoidance processing to avoid obstacles, determine an accurate projection area, and ensure that the play content projected into the projection area is complete.
Specifically, the projection device has an obstacle avoidance function that the user can turn on or off. When the function is off, the projection device performs no obstacle avoidance processing. When the function is on, the controller can control the optical machine to project an obstacle avoidance image card onto the projection surface and control the camera to capture it, obtaining an obstacle avoidance image. The obstacle avoidance image card may be a pure white card.
In some embodiments, when detecting that the obstacle avoidance function of the projection device is turned on, the controller may combine the trapezoidal correction and obstacle avoidance processes.
The controller can control the optical machine to project the obstacle avoidance image card onto the projection surface, control the camera to capture it, and store the resulting obstacle avoidance image for subsequent obstacle avoidance processing. After the obstacle avoidance image card has been captured, the controller can control the optical machine to project the correction chart card onto the projection surface and control the camera to capture it, so that trapezoidal correction is performed according to the correction image.
In some embodiments, after controlling the camera to capture the obstacle avoidance image corresponding to the obstacle avoidance image card, the controller may preprocess the obstacle avoidance image to ensure that only the display area corresponding to the obstacle avoidance image card remains in it. The controller can then perform obstacle avoidance processing according to the obstacle avoidance image to determine the area to be projected in the projection plane.
In the related art, when an obstacle lies inside the captured image, the outline of the obstacle is generally extracted by binarization to realize obstacle avoidance. However, binarization uses a fixed threshold; when the user's scene changes (such as from day to night), the brightness of the light and the gray level of the obstacle change, and a fixed threshold obviously cannot adapt to all scenes, so obstacle avoidance cannot be realized accurately.
In some embodiments, the controller may perform HSV preprocessing on the obstacle avoidance image. Specifically, the obstacle avoidance image can be converted into HSV space, which defines the HSV values of the various colors, so the HSV value of each pixel in the obstacle avoidance image can be determined. Fig. 14 shows a schematic of the HSV values of each color in some embodiments. It can be seen that, for white, the HSV values are:
0 < H < 180, 0 < S < 30, 221 < V < 255
That is, since the obstacle avoidance image card is a pure white card, the HSV values of its pixels fall within the white HSV value range.
The controller can detect whether the HSV value of each pixel in the obstacle avoidance image falls within the white HSV value range.
In some embodiments, to account for errors, the controller may preset an HSV threshold slightly wider than the white HSV value range, for example: 0 < H < 180, 0 < S < 50, 200 < V < 255.
The controller can detect whether the HSV value of each pixel in the obstacle avoidance image meets the preset HSV threshold. Pixels that do not meet the threshold are set as first-type pixels, and pixels that do are set as second-type pixels.
The controller may detect whether the number of first-type pixels satisfies a preset first number condition, which may be set as: the number of first-type pixels does not exceed half of the total number of pixels.
If the condition is satisfied, the controller may set the gray values of all the first type pixel points to a preset value, and the preset value may be 0. If the condition is not satisfied, i.e. the number of first type pixel points is greater than half of the total number, no processing is performed.
When the projection device projects onto other backgrounds (such as yellow or blue), HSV conversion alone would treat most of the projection area as obstacles and cause erroneous obstacle avoidance; judging the number of pixels outside the HSV threshold range in this way avoids that special case.
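The per-pixel threshold check and the first number condition can be sketched in NumPy, operating on an image already converted to OpenCV-style HSV (H in 0-180, S and V in 0-255). The function and variable names are illustrative, and the relaxed threshold follows the example range above.

```python
import numpy as np

def mask_non_white(hsv, gray):
    """Zero out first-type (non-white) pixels in the gray image, but only
    when they are at most half of all pixels (first number condition)."""
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    white = (h < 180) & (s < 50) & (v > 200)   # relaxed white HSV threshold
    first_type = ~white                        # pixels outside the range
    out = gray.copy()
    if first_type.sum() <= first_type.size / 2:
        out[first_type] = 0                    # preset gray value 0
    return out

# Usage: three white-card pixels and one dark obstacle pixel.
hsv = np.array([[[90, 10, 230], [90, 10, 230]],
                [[90, 10, 230], [90, 40, 50]]], dtype=np.uint8)
gray = np.full((2, 2), 128, dtype=np.uint8)
masked = mask_non_white(hsv, gray)
```

If more than half the pixels fall outside the range (e.g. a colored wall), the image is left untouched, matching the special case discussed above.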
In some embodiments, after HSV preprocessing the obstacle avoidance image, the controller may further perform obstacle avoidance processing on the obstacle avoidance image.
The controller can perform gray scale processing on the obstacle avoidance image and count the number of pixels corresponding to each gray value, obtaining a gray value curve as shown in fig. 15. The gray value curve represents the relation between gray value and pixel count, with the X-axis representing the gray value and the Y-axis the number of pixels.
The controller may first determine the gray value with the largest number of pixels and set it as the first gray value, such as point A in the figure. Taking the first gray value as the center, the nearest gray value minimum is then searched for on each side. In the figure, point B is the minimum point on the left and point C is the minimum point on the right, so the two gray value minima corresponding to point B and point C are determined.
The controller may then determine the gray value interval between these two gray value minima.
In some embodiments, after a gray value minimum is obtained, the controller may check it against a preset condition: the number of pixels at the gray value minimum must be smaller than a preset proportion of the number of pixels at the first gray value. The preset proportion may be set by the skilled person, for example 2/3, in which case the preset condition becomes: the number of pixels at the gray value minimum is less than 2/3 of the number of pixels at the first gray value.
When a detected gray value minimum satisfies the preset condition, it may be retained. If it does not, it is discarded, and the search continues in the same direction for the next nearest gray value minimum until one satisfying the preset condition is found.
Fig. 16 shows a schematic of a gray value curve in some embodiments. Point A is the gray value with the largest number of pixels, i.e. the first gray value. To the left of point A there are two minimum points, B1 and B2. However, the number of pixels at B1 is significantly larger than 2/3 of the number of pixels at A, so B1 cannot be taken as a gray value minimum and the search continues further leftward. Point B2 satisfies the condition, so B2 can be taken as a gray value minimum. On the right side of point A there is a minimum point C that satisfies the preset condition and can be taken as a gray value minimum. Therefore, points B2 and C correspond to the two gray value minima.
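The search for the nearest valid minimum on each side of the first gray value, skipping shallow minima such as B1, can be sketched as follows; `hist` is a 256-entry pixel count per gray value, and the 2/3 proportion follows the example above.

```python
def nearest_valid_minimum(hist, peak, direction):
    """Walk from the peak in +1/-1 direction and return the first local
    minimum whose count is below 2/3 of the peak count (None if absent)."""
    limit = hist[peak] * 2 / 3
    i = peak + direction
    while 0 < i < len(hist) - 1:
        is_min = hist[i] < hist[i - 1] and hist[i] < hist[i + 1]
        if is_min and hist[i] < limit:
            return i
        i += direction
    return None

def first_gray_interval(hist):
    peak = max(range(len(hist)), key=hist.__getitem__)   # first gray value
    return (nearest_valid_minimum(hist, peak, -1),
            nearest_valid_minimum(hist, peak, +1))

# Usage on a synthetic histogram: a too-shallow minimum at gray value 90
# (count 250, above 2/3 of the peak count 300) is skipped.
hist = [0] * 256
hist[78:92] = [150, 120, 100, 140, 180, 220, 260,
               280, 275, 270, 265, 260, 250, 270]
hist[92:101] = [272, 275, 278, 281, 284, 287, 290, 295, 300]  # peak at 100
hist[101:120] = list(range(290, 63, -12))                     # falling edge
hist[120], hist[121] = 50, 70                                 # right minimum
interval = first_gray_interval(hist)
```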
In some embodiments, after determining the gray value minima on the left and right sides of the first gray value, the gray value interval between the two minima may be further determined, and the area formed by all pixels whose gray values fall within this interval is determined as the projection area.
The controller may acquire a maximum inscribed rectangle in the projection region, and take the maximum inscribed rectangle as the region to be projected.
In some embodiments, considering that the shooting environment is complex, the gray value interval corresponding to the selected first gray value may contain only a few pixels, resulting in a small selected area to be projected.
Therefore, after acquiring the gray value interval corresponding to the first gray value (denoted the first gray value interval), the controller may detect whether the number of pixels in the first gray value interval satisfies a preset second number condition: the number of pixels in the first gray value interval is smaller than a preset proportion of the total number of pixels. The preset proportion may be 1/30, i.e. the second number condition may be: the number of pixels in the first gray value interval is less than 1/30 of the total number of pixels.
If the condition is not met, the area corresponding to all pixels in the first gray value interval can still be used as the projection area, from which the area to be projected is further obtained.
If the condition is met, the controller can acquire the second gray value and the third gray value, i.e. the gray values with the largest number of pixels on the two sides of the first gray value interval.
Specifically, gray values generally range from 0 to 255. Let the first gray value interval be (a, b). The whole gray value range can then be divided into three intervals: (0, a), (a, b) and (b, 255). In the first gray value interval (a, b), the gray value with the largest number of pixels is the first gray value.
The gray values with the largest number of pixels are searched for on the left and right of the first gray value interval: in the interval (0, a), the gray value with the largest number of pixels is the second gray value; in the interval (b, 255), it is the third gray value.
Taking the second gray value and the third gray value as centers respectively, the nearest gray value minima on the left and right of each center are determined, and the interval between the two minima is taken as the gray value interval of that center. The specific determination method is as described above and is not repeated here.
Therefore, the second gray value interval corresponding to the second gray value and the third gray value interval corresponding to the third gray value can be obtained.
Fig. 17 shows a schematic diagram of gray value curves in some embodiments.
Point A1 is the gray value with the largest number of pixels, i.e. the first gray value. The gray value minimum points corresponding to point A1 are points B1 and C1. Upon determining that the number of pixels in the interval between B1 and C1 satisfies the second number condition, the second gray value A2 with the largest number of pixels is searched for on the left of point B1, and the third gray value A3 with the largest number of pixels on the right of point C1. The gray value minimum points corresponding to point A2 are points B2 and C2; that is, the interval between B2 and C2 is the second gray value interval. The gray value minimum points corresponding to point A3 are points B3 and C3; that is, the interval between B3 and C3 is the third gray value interval.
The controller may count the number of pixels in the second gray value interval and the third gray value interval, and determine, among the first, second and third gray value intervals, the one with the largest number of pixels as the target gray value interval. The area formed by all pixels corresponding to the target gray value interval can be determined as the projection area, based on which the controller further determines the area to be projected.
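A sketch of this fallback selection, with `interval_of` standing in for the minimum-search routine described earlier (stubbed in the usage below) and the 1/30 proportion taken from the example above; all names are illustrative.

```python
def choose_target_interval(hist, first_interval, interval_of):
    """Return the first interval if it is populous enough (second number
    condition), otherwise the most populous of the three candidate intervals."""
    total = sum(hist)
    lo, hi = first_interval
    if sum(hist[lo:hi + 1]) >= total / 30:
        return first_interval
    candidates = [first_interval]
    if lo > 0:                                   # second gray value: left peak
        p2 = max(range(lo), key=hist.__getitem__)
        candidates.append(interval_of(hist, p2))
    if hi < len(hist) - 1:                       # third gray value: right peak
        p3 = max(range(hi + 1, len(hist)), key=hist.__getitem__)
        candidates.append(interval_of(hist, p3))
    return max(candidates, key=lambda iv: sum(hist[iv[0]:iv[1] + 1]))

# Usage: the first interval (8, 10) holds only 4 of 180 pixels, so the
# denser interval around the left peak wins. The interval_of stub simply
# returns one gray value on either side of the peak.
hist = [0, 0, 5, 100, 5, 0, 0, 0, 1, 2, 1, 0, 0, 0, 3, 60, 3, 0, 0, 0]
target = choose_target_interval(hist, (8, 10), lambda h, p: (p - 1, p + 1))
```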
In some embodiments, after determining the region to be projected in the projection plane, position information of the region to be projected in the world coordinate system may be acquired. According to the projection relation, the controller can convert the position information under the world coordinate system into the position information under the optical machine coordinate system to obtain the target position information.
According to the target position information, the controller can control the optical machine to project the playing content to the area to be projected, so that image correction is realized.
The embodiment of the application also provides an image correction method, which is applied to projection equipment, as shown in fig. 18, and comprises the following steps:
Step 1801, in response to the image correction command, the optical machine is controlled to project the correction chart card onto the projection surface, and the camera is controlled to shoot the correction chart card, so as to obtain a correction image. The correction chart card comprises correction characteristic points.
Step 1802, determining a first position of a correction feature point in a correction image, and acquiring a projection relationship according to the first position; the projection relation refers to the mapping relation between the playing content projected by the optical machine and the projection surface.
Step 1803, determining target position information of the area to be projected in the projection plane according to the projection relation.
Step 1804, controlling the optical machine to project the playing content to the area to be projected according to the target position information.
For identical or similar parts among the embodiments in this specification, reference may be made to one another; they are not repeated here.
It will be apparent to those skilled in the art that the techniques of the embodiments of the present invention may be implemented by software plus a necessary general-purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention, in essence or in the parts contributing to the prior art, may be embodied in the form of a software product stored in a storage medium, such as a ROM/RAM, a magnetic disk, or an optical disk, including several instructions that cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments, or in some parts of the embodiments, of the present invention.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. The illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (9)

1. A projection device, comprising:
an optical machine configured to project the play content to a projection surface;
A camera configured to capture an image displayed in the projection surface;
a controller configured to:
responding to an image correction instruction, controlling the optical machine to project a correction chart card to the projection surface, and controlling the camera to shoot the correction chart card to obtain a correction image; the correction chart card comprises correction characteristic points;
Determining a first position of the correction feature point in the correction image, and acquiring a projection relation according to the first position; the projection relation is a mapping relation between the playing content projected by the optical machine and the projection surface;
Controlling the optical machine to project an obstacle avoidance image card to the projection surface, and controlling the camera to shoot the obstacle avoidance image card to obtain an obstacle avoidance image; carrying out gray scale processing on the obstacle avoidance image, and counting the number of pixel points corresponding to each gray scale value to generate a gray scale value curve;
Acquiring target gray value minima on the left side and the right side of a first gray value in the gray value curve; the first gray value is the gray value with the largest number of pixel points; the target gray value minimum value is a gray value minimum value which meets a preset condition and is nearest to the first gray value; the preset conditions are as follows: the number of the pixel points corresponding to the minimum gray value is smaller than that of the pixel points corresponding to the first gray value of the preset proportion;
acquiring a first gray value interval between the target gray value minima, and determining the areas of all pixel points corresponding to the first gray value interval as projection areas; acquiring the maximum inscribed rectangle in the projection area and taking the maximum inscribed rectangle as an area to be projected;
determining target position information of a region to be projected in the projection surface according to the projection relation;
And controlling the optical machine to project playing content to the area to be projected according to the target position information.
2. The projection device of claim 1, wherein the controller is configured to:
In the step of performing the acquisition of the projection relationship according to the first position,
Acquiring a first coordinate of the correction feature point under a camera coordinate system according to the first position;
Converting the first coordinate into a second coordinate of the correction feature point under an optical machine coordinate system;
Acquiring a projection plane equation of the projection plane under an optical machine coordinate system according to the second coordinate;
And obtaining a conversion matrix of the optical machine coordinate system and the world coordinate system according to the projection plane equation, wherein the conversion matrix is used for representing the projection relation.
3. The projection apparatus according to claim 2, wherein the first position is coordinate information of the correction feature point in an image coordinate system corresponding to the correction image; the controller is configured to:
In the step of performing the acquisition of the conversion matrix of the optical machine coordinate system and the world coordinate system based on the projection plane equation,
Determining a first unit normal vector of the projection surface under the world coordinate system;
obtaining a second unit normal vector of the projection surface under the optical machine coordinate system according to the projection surface equation;
and obtaining the conversion matrix of the optical machine coordinate system and the world coordinate system according to the first unit normal vector and the second unit normal vector.
4. The projection device of claim 2, wherein the controller is configured to:
In performing the step of determining the first position of the correction feature point in the correction image,
Acquiring first positions of at least three correction feature points in the correction image;
In the step of performing the acquisition of the projection plane equation of the projection plane in the opto-mechanical coordinate system based on the second coordinates,
And fitting the second coordinates of the at least three correction characteristic points to obtain a projection plane equation of the projection plane under an optical machine coordinate system.
5. The projection device of claim 4, wherein the correction map card comprises: a first correction region, a second correction region, and a third correction region; the first correction area is positioned in the middle area of the correction chart card, the third correction area is positioned in the edge area of the correction chart card, and the second correction area is positioned between the first correction area and the third correction area; the controller is configured to:
in the step of performing the acquisition of the first positions of the at least three correction feature points in the correction image,
Identifying the first correction region in the correction image to obtain first positions of at least three correction characteristic points in the first correction region;
Identifying the second correction area in the correction image based on the fact that the first correction area is not identified, and obtaining first positions of at least three correction feature points in the second correction area;
And identifying the third correction area in the correction image based on the fact that the second correction area is not identified, and obtaining first positions of at least three correction feature points in the third correction area.
6. The projection device of claim 3, wherein the controller is configured to:
in the step of performing determination of target position information of an area to be projected in the projection plane according to the projection relation,
Determining a region to be projected in the projection surface;
And acquiring the position information of the region to be projected under the world coordinate system, and converting the position information into the position information under the optical machine coordinate system according to the projection relation, wherein the position information under the optical machine coordinate system is the target position information.
7. The projection device of claim 6, wherein the controller is configured to:
before the step of performing gray-scale processing on the obstacle avoidance image,
acquire the HSV value of each pixel point in the obstacle avoidance image;
according to the HSV value of each pixel point, count the first-type pixel points that do not meet a preset HSV threshold; and
if the number of first-type pixel points meets a preset first quantity condition, set the gray values of all first-type pixel points to a preset value; otherwise, perform no processing.
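The pre-filtering of claim 7 might look as follows; the hue range, count threshold, and preset gray value are illustrative assumptions, since the patent does not disclose its HSV threshold:

```python
import numpy as np

def suppress_off_hue_pixels(hsv, gray, hue_range=(35, 85), min_count=2, preset=0):
    """Force the gray value of 'first-type' pixels (hue outside the allowed
    range) to a preset value, but only when enough such pixels exist.
    hue_range, min_count, and preset are illustrative, not the patent's values."""
    hue = hsv[..., 0]
    first_type = (hue < hue_range[0]) | (hue > hue_range[1])
    if first_type.sum() >= min_count:                  # preset first quantity condition
        gray = gray.copy()
        gray[first_type] = preset
    return gray

hsv = np.zeros((2, 2, 3), dtype=np.uint8)
hsv[..., 0] = [[10, 50], [90, 60]]                     # two pixels fall outside 35..85
gray = np.full((2, 2), 100, dtype=np.uint8)
out = suppress_off_hue_pixels(hsv, gray)
```

Zeroing the off-hue pixels before the histogram step keeps colored obstacles from contaminating the gray value curve that claim 9 builds.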
8. The projection device of claim 6, wherein the controller is configured to:
in the step of determining the area of all pixel points corresponding to the first gray value interval as the projection area,
judge whether the number of pixel points corresponding to the first gray value interval meets a preset second quantity condition;
if not, determine the area of all pixel points corresponding to the first gray value interval as the projection area; and
if so, acquire the second gray value and the third gray value having the largest numbers of pixel points on the two sides of the first gray value interval, acquire the second gray value interval and the third gray value interval corresponding to the second gray value and the third gray value respectively, and determine, as the projection area, the area of all pixel points corresponding to whichever of the three gray value intervals contains the largest number of pixel points.
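The final selection step of claim 8 reduces to picking, among candidate gray-value intervals, the one covering the most pixels. A minimal sketch, with inclusive interval bounds and illustrative values:

```python
import numpy as np

def pick_projection_interval(gray, intervals):
    """Return the candidate gray-value interval covering the most pixels.
    Interval bounds are inclusive; all numeric values here are illustrative."""
    counts = [int(((gray >= lo) & (gray <= hi)).sum()) for lo, hi in intervals]
    return intervals[int(np.argmax(counts))]

gray = np.array([30, 120, 125, 130, 200], dtype=np.uint8)
best = pick_projection_interval(gray, [(0, 50), (100, 150), (151, 255)])
```

Here the middle interval wins because three of the five pixels fall inside it, matching the claim's "gray value interval with the largest number of pixel points".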
9. An image correction method applied to a projection device, the method comprising:
in response to an image correction instruction, controlling an optical machine to project a correction chart card onto a projection surface and controlling a camera to photograph the correction chart card to obtain a correction image, the correction chart card comprising correction feature points;
determining first positions of the correction feature points in the correction image and acquiring a projection relation according to the first positions, the projection relation being a mapping relation between the play content projected by the optical machine and the projection surface;
controlling the optical machine to project an obstacle avoidance chart card onto the projection surface and controlling the camera to photograph the obstacle avoidance chart card to obtain an obstacle avoidance image; performing gray-scale processing on the obstacle avoidance image and counting the number of pixel points corresponding to each gray value to generate a gray value curve;
acquiring the target gray value minima on the left and right sides of a first gray value in the gray value curve, the first gray value being the gray value with the largest number of pixel points, and each target gray value minimum being the gray value minimum nearest to the first gray value that meets a preset condition, the preset condition being that the number of pixel points corresponding to the gray value minimum is smaller than a preset proportion of the number of pixel points corresponding to the first gray value;
acquiring a first gray value interval between the target gray value minima and determining the area of all pixel points corresponding to the first gray value interval as a projection area; acquiring the maximum inscribed rectangle in the projection area as the area to be projected;
determining target position information of the area to be projected in the projection surface according to the projection relation; and
controlling the optical machine to project play content to the area to be projected according to the target position information.
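The histogram step of claim 9 — locating the peak gray value and the nearest qualifying minimum on each side — can be sketched as follows; `ratio` is an assumed stand-in for the patent's "preset proportion", and all names are illustrative:

```python
import numpy as np

def target_gray_minima(gray, ratio=0.1):
    """Find the peak gray value ('first gray value'), then the nearest local
    minimum on each side whose pixel count is below ratio * peak count.
    ratio stands in for the patent's preset proportion."""
    hist = np.bincount(np.asarray(gray, dtype=np.uint8).ravel(), minlength=256)
    peak = int(np.argmax(hist))                        # first gray value
    limit = ratio * hist[peak]                         # preset-condition threshold

    def nearest(step):
        g = peak + step
        while 0 < g < 255:
            local_min = hist[g] <= hist[g - 1] and hist[g] <= hist[g + 1]
            if local_min and hist[g] < limit:
                return g                               # target gray value minimum
            g += step
        return min(max(g, 0), 255)                     # fall back to the histogram edge

    return nearest(-1), nearest(+1)

# A flat projection surface dominates at gray 128; small obstacles sit at 60 and 200.
gray = np.concatenate([np.full(1000, 128), np.full(10, 60), np.full(10, 200)])
lo, hi = target_gray_minima(gray)                      # bounds of the first gray value interval
```

The pixels whose gray values fall in `[lo, hi]` would form the projection area, inside which the maximum inscribed rectangle is then taken as the area to be projected.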
CN202210325721.4A 2021-11-16 2022-03-29 Projection apparatus and image correction method Active CN114885136B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/122670 WO2023087947A1 (en) 2021-11-16 2022-09-29 Projection device and correction method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111355866 2021-11-16
CN2021113558660 2021-11-16

Publications (2)

Publication Number Publication Date
CN114885136A CN114885136A (en) 2022-08-09
CN114885136B true CN114885136B (en) 2024-05-28

Family

ID=80658581

Family Applications (13)

Application Number Title Priority Date Filing Date
CN202210006233.7A Pending CN114466173A (en) 2021-11-16 2022-01-05 Projection equipment and projection display control method for automatically throwing screen area
CN202210050600.3A Pending CN114205570A (en) 2021-11-16 2022-01-17 Projection equipment and display control method for automatically correcting projected image
CN202210168263.8A Pending CN114401390A (en) 2021-11-16 2022-02-23 Projection equipment and projection image correction method based on optical machine camera calibration
CN202210325721.4A Active CN114885136B (en) 2021-11-16 2022-03-29 Projection apparatus and image correction method
CN202210343443.5A Active CN114885137B (en) 2021-11-16 2022-03-31 Projection equipment and automatic focusing method
CN202210343444.XA Pending CN114885138A (en) 2021-11-16 2022-03-31 Projection equipment and automatic focusing method
CN202210345204.3A Pending CN114727079A (en) 2021-11-16 2022-03-31 Projection equipment and focusing method based on position memory
CN202210389054.6A Pending CN114827563A (en) 2021-11-16 2022-04-13 Projection apparatus and projection region correction method
CN202210583357.1A Active CN115022606B (en) 2021-11-16 2022-05-25 Projection equipment and obstacle avoidance projection method
CN202210709400.4A Active CN115174877B (en) 2021-11-16 2022-06-21 Projection device and focusing method thereof
CN202280063192.3A Pending CN118104230A (en) 2021-11-16 2022-09-29 Projection equipment and display control method
CN202280063350.5A Pending CN118077192A (en) 2021-11-16 2022-11-16 Projection equipment and projection area correction method
CN202280063329.5A Pending CN118104231A (en) 2021-11-16 2022-11-16 Projection apparatus and projection image correction method


Country Status (2)

Country Link
CN (13) CN114466173A (en)
WO (3) WO2023087950A1 (en)

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023087948A1 (en) * 2021-11-16 2023-05-25 海信视像科技股份有限公司 Projection device and display control method
CN115002432A (en) * 2022-05-30 2022-09-02 海信视像科技股份有限公司 Projection equipment and obstacle avoidance projection method
WO2023087960A1 (en) * 2021-11-16 2023-05-25 海信视像科技股份有限公司 Projection device and focusing method
CN114760454A (en) * 2022-05-24 2022-07-15 海信视像科技股份有限公司 Projection equipment and trigger correction method
CN118104229A (en) * 2021-11-16 2024-05-28 海信视像科技股份有限公司 Projection equipment and display control method of projection image
WO2023087947A1 (en) * 2021-11-16 2023-05-25 海信视像科技股份有限公司 Projection device and correction method
CN114466173A (en) * 2021-11-16 2022-05-10 海信视像科技股份有限公司 Projection equipment and projection display control method for automatically throwing screen area
CN114640832A (en) * 2022-02-11 2022-06-17 厦门聚视智创科技有限公司 Automatic correction method for projected image
CN115002429B (en) * 2022-05-07 2023-03-24 深圳市和天创科技有限公司 Projector capable of automatically calibrating projection position based on camera calculation
CN115002430A (en) * 2022-05-17 2022-09-02 深圳市当智科技有限公司 Projection method, projector, and computer-readable storage medium
CN114885142B (en) * 2022-05-27 2024-05-17 海信视像科技股份有限公司 Projection equipment and method for adjusting projection brightness
CN115314689A (en) * 2022-08-05 2022-11-08 深圳海翼智新科技有限公司 Projection correction method, projection correction device, projector and computer program product
CN115314691B (en) * 2022-08-09 2023-05-09 北京淳中科技股份有限公司 Image geometric correction method and device, electronic equipment and storage medium
CN115061415B (en) * 2022-08-18 2023-01-24 赫比(成都)精密塑胶制品有限公司 Automatic process monitoring method and device and computer readable storage medium
CN115474032B (en) * 2022-09-14 2023-10-03 深圳市火乐科技发展有限公司 Projection interaction method, projection device and storage medium
CN115529445A (en) * 2022-09-15 2022-12-27 海信视像科技股份有限公司 Projection equipment and projection image quality adjusting method
WO2024066776A1 (en) * 2022-09-29 2024-04-04 海信视像科技股份有限公司 Projection device and projection-picture processing method
CN115361540B (en) * 2022-10-20 2023-01-24 潍坊歌尔电子有限公司 Method and device for self-checking abnormal cause of projected image, projector and storage medium
CN115760620B (en) * 2022-11-18 2023-10-20 荣耀终端有限公司 Document correction method and device and electronic equipment
WO2024124978A1 (en) * 2022-12-12 2024-06-20 海信视像科技股份有限公司 Projection device and projection method
CN116723395A (en) * 2023-04-21 2023-09-08 深圳市橙子数字科技有限公司 Non-inductive focusing method and device based on camera
CN116993879B (en) * 2023-07-03 2024-03-12 广州极点三维信息科技有限公司 Method for automatically avoiding obstacle and distributing light, electronic equipment and storage medium
CN116886881A (en) * 2023-07-26 2023-10-13 深圳市极鑫科技有限公司 Projector based on omnidirectional trapezoidal technology
CN117278735B (en) * 2023-09-15 2024-05-17 山东锦霖智能科技集团有限公司 Immersive image projection equipment
CN117830437B (en) * 2024-03-01 2024-05-14 中国科学院长春光学精密机械与物理研究所 Device and method for calibrating internal and external parameters of large-view-field long-distance multi-view camera

Citations (7)

Publication number Priority date Publication date Assignee Title
JP2006109088A (en) * 2004-10-05 2006-04-20 Olympus Corp Geometric correction method in multi-projection system
JP2016014712A (en) * 2014-07-01 2016-01-28 キヤノン株式会社 Shading correction value calculation device and shading correction value calculation method
CN110336987A (en) * 2019-04-03 2019-10-15 北京小鸟听听科技有限公司 A kind of projector distortion correction method, device and projector
CN112598589A (en) * 2020-12-17 2021-04-02 青岛海信激光显示股份有限公司 Laser projection system and image correction method
CN112995625A (en) * 2021-02-23 2021-06-18 峰米(北京)科技有限公司 Trapezoidal correction method and device for projector
CN112995624A (en) * 2021-02-23 2021-06-18 峰米(北京)科技有限公司 Trapezoidal error correction method and device for projector
WO2021143330A1 (en) * 2020-01-15 2021-07-22 浙江大学 Projector out-of-focus correction method based on edge perception

Family Cites Families (62)

Publication number Priority date Publication date Assignee Title
JP2005031267A (en) * 2003-07-09 2005-02-03 Sony Corp Picture projection device and picture projection method
JP3951984B2 (en) * 2003-08-22 2007-08-01 日本電気株式会社 Image projection method and image projection apparatus
JP4984968B2 (en) * 2007-02-28 2012-07-25 カシオ計算機株式会社 Projection apparatus, abnormality control method and program
US7872637B2 (en) * 2007-04-25 2011-01-18 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. System and method for tracking a laser spot on a projected computer screen image
JP4831219B2 (en) * 2008-10-29 2011-12-07 セイコーエプソン株式会社 Projector and projector control method
CN102236784A (en) * 2010-05-07 2011-11-09 株式会社理光 Screen area detection method and system
CN102681312B (en) * 2011-03-16 2015-06-24 宏瞻科技股份有限公司 Human eye safety protection system of laser projection system
JP2013033206A (en) * 2011-07-06 2013-02-14 Ricoh Co Ltd Projection display device, information processing device, projection display system, and program
CN103293836A (en) * 2012-02-27 2013-09-11 联想(北京)有限公司 Projection method and electronic device
CN103002240B (en) * 2012-12-03 2016-11-23 深圳创维数字技术有限公司 A kind of method and apparatus setting avoiding obstacles projection
JP6201359B2 (en) * 2013-03-22 2017-09-27 カシオ計算機株式会社 Projection system, projection method, and projection program
JP2015128242A (en) * 2013-12-27 2015-07-09 ソニー株式会社 Image projection device and calibration method of the same
CN103905762B (en) * 2014-04-14 2017-04-19 上海索广电子有限公司 Method for automatically detecting projection picture for projection module
CN103942796B (en) * 2014-04-23 2017-04-12 清华大学 High-precision projector and camera calibration system and method
JP6186599B1 (en) * 2014-12-25 2017-08-30 パナソニックIpマネジメント株式会社 Projection device
CN104536249B (en) * 2015-01-16 2016-08-24 努比亚技术有限公司 The method and apparatus of regulation projector focal length
CN104835143A (en) * 2015-03-31 2015-08-12 中国航空无线电电子研究所 Rapid projector system parameter calibration method
JP2016197768A (en) * 2015-04-02 2016-11-24 キヤノン株式会社 Image projection system and control method of projection image
WO2016194191A1 (en) * 2015-06-04 2016-12-08 日立マクセル株式会社 Projection-type picture display apparatus and picture display method
CN105208308B (en) * 2015-09-25 2018-09-04 广景视睿科技(深圳)有限公司 A kind of method and system for the best projection focus obtaining projecting apparatus
EP3416370B1 (en) * 2016-03-23 2023-07-26 Huawei Technologies Co., Ltd. Photography focusing method, device, and apparatus for terminal
CN107318007A (en) * 2016-04-27 2017-11-03 中兴通讯股份有限公司 The method and device of projected focus
CN107547881B (en) * 2016-06-24 2019-10-11 上海顺久电子科技有限公司 A kind of auto-correction method of projection imaging, device and laser television
CN106713879A (en) * 2016-11-25 2017-05-24 重庆杰夫与友文化创意有限公司 Obstacle avoidance projection method and apparatus
KR101820905B1 (en) * 2016-12-16 2018-01-22 씨제이씨지브이 주식회사 An image-based projection area automatic correction method photographed by a photographing apparatus and a system therefor
CN109215082B (en) * 2017-06-30 2021-06-22 杭州海康威视数字技术股份有限公司 Camera parameter calibration method, device, equipment and system
CN109426060A (en) * 2017-08-21 2019-03-05 深圳光峰科技股份有限公司 Projector automatic focusing method and projector
CN107479168A (en) * 2017-08-22 2017-12-15 深圳市芯智科技有限公司 A kind of projector that can realize rapid focus function and focusing method
KR101827221B1 (en) * 2017-09-07 2018-02-07 주식회사 조이펀 Mixed Reality Content Providing Device with Coordinate System Calibration and Method of Coordinate System Calibration using it
CN109856902A (en) * 2017-11-30 2019-06-07 中强光电股份有限公司 Projection arrangement and Atomatic focusing method
CN110058483B (en) * 2018-01-18 2022-06-10 深圳光峰科技股份有限公司 Automatic focusing system, projection equipment, automatic focusing method and storage medium
WO2019203002A1 (en) * 2018-04-17 2019-10-24 ソニー株式会社 Information processing device and method
CN110769214A (en) * 2018-08-20 2020-02-07 成都极米科技股份有限公司 Automatic tracking projection method and device based on frame difference
CN111147732B (en) * 2018-11-06 2021-07-20 浙江宇视科技有限公司 Focusing curve establishing method and device
CN109544643B (en) * 2018-11-21 2023-08-11 北京佳讯飞鸿电气股份有限公司 Video camera image correction method and device
CN109495729B (en) * 2018-11-26 2023-02-10 青岛海信激光显示股份有限公司 Projection picture correction method and system
CN110769225B (en) * 2018-12-29 2021-11-09 成都极米科技股份有限公司 Projection area obtaining method based on curtain and projection device
CN110769226B (en) * 2019-02-27 2021-11-09 成都极米科技股份有限公司 Focusing method and focusing device of ultra-short-focus projector and readable storage medium
CN110769227A (en) * 2019-02-27 2020-02-07 成都极米科技股份有限公司 Focusing method and focusing device of ultra-short-focus projector and readable storage medium
CN110336951A (en) * 2019-08-26 2019-10-15 厦门美图之家科技有限公司 Contrast formula focusing method, device and electronic equipment
CN110636273A (en) * 2019-10-15 2019-12-31 歌尔股份有限公司 Method and device for adjusting projection picture, readable storage medium and projector
CN112799275B (en) * 2019-11-13 2023-01-06 青岛海信激光显示股份有限公司 Focusing method and focusing system of ultra-short-focus projection lens and projector
CN111028297B (en) * 2019-12-11 2023-04-28 凌云光技术股份有限公司 Calibration method of surface structured light three-dimensional measurement system
CN111050150B (en) * 2019-12-24 2021-12-31 成都极米科技股份有限公司 Focal length adjusting method and device, projection equipment and storage medium
CN111050151B (en) * 2019-12-26 2021-08-17 成都极米科技股份有限公司 Projection focusing method and device, projector and readable storage medium
CN110996085A (en) * 2019-12-26 2020-04-10 成都极米科技股份有限公司 Projector focusing method, projector focusing device and projector
CN113554709A (en) * 2020-04-23 2021-10-26 华东交通大学 Camera-projector system calibration method based on polarization information
CN111429532B (en) * 2020-04-30 2023-03-31 南京大学 Method for improving camera calibration accuracy by utilizing multi-plane calibration plate
CN113301314B (en) * 2020-06-12 2023-10-24 阿里巴巴集团控股有限公司 Focusing method, projector, imaging apparatus, and storage medium
CN112050751B (en) * 2020-07-17 2022-07-22 深圳大学 Projector calibration method, intelligent terminal and storage medium
CN111932571B (en) * 2020-09-25 2021-01-22 歌尔股份有限公司 Image boundary identification method and device and computer readable storage medium
CN112584113B (en) * 2020-12-02 2022-08-30 深圳市当智科技有限公司 Wide-screen projection method and system based on mapping correction and readable storage medium
CN112904653A (en) * 2021-01-26 2021-06-04 四川长虹电器股份有限公司 Focusing method and focusing device for projection equipment
CN113099198B (en) * 2021-03-19 2023-01-10 深圳市火乐科技发展有限公司 Projection image adjusting method and device, storage medium and electronic equipment
CN112804507B (en) * 2021-03-19 2021-08-31 深圳市火乐科技发展有限公司 Projector correction method, projector correction system, storage medium, and electronic device
CN112689136B (en) * 2021-03-19 2021-07-02 深圳市火乐科技发展有限公司 Projection image adjusting method and device, storage medium and electronic equipment
CN113038105B (en) * 2021-03-26 2022-10-18 歌尔股份有限公司 Projector adjusting method and adjusting apparatus
CN113160339B (en) * 2021-05-19 2024-04-16 中国科学院自动化研究所苏州研究院 Projector calibration method based on Molaque law
CN113286134A (en) * 2021-05-25 2021-08-20 青岛海信激光显示股份有限公司 Image correction method and shooting equipment
CN113473095B (en) * 2021-05-27 2022-10-21 广景视睿科技(深圳)有限公司 Method and device for obstacle avoidance dynamic projection
CN113489961B (en) * 2021-09-08 2022-03-22 深圳市火乐科技发展有限公司 Projection correction method, projection correction device, storage medium and projection equipment
CN114466173A (en) * 2021-11-16 2022-05-10 海信视像科技股份有限公司 Projection equipment and projection display control method for automatically throwing screen area


Also Published As

Publication number Publication date
CN118104230A (en) 2024-05-28
CN115022606B (en) 2024-05-17
CN114205570A (en) 2022-03-18
CN114401390A (en) 2022-04-26
CN114885136A (en) 2022-08-09
CN114885138A (en) 2022-08-09
WO2023088329A1 (en) 2023-05-25
CN115174877A (en) 2022-10-11
CN114885137B (en) 2024-05-31
CN118077192A (en) 2024-05-24
CN115022606A (en) 2022-09-06
CN115174877B (en) 2024-05-28
CN118104231A (en) 2024-05-28
CN114885137A (en) 2022-08-09
WO2023088304A1 (en) 2023-05-25
CN114727079A (en) 2022-07-08
CN114466173A (en) 2022-05-10
WO2023087950A1 (en) 2023-05-25
CN114827563A (en) 2022-07-29

Similar Documents

Publication Publication Date Title
CN114885136B (en) Projection apparatus and image correction method
US6172706B1 (en) Video camera with automatic zoom adjustment based on distance between user&#39;s eyes
WO2023087947A1 (en) Projection device and correction method
JP5108093B2 (en) Imaging apparatus and imaging method
JP2007078821A (en) Projector, projecting method and program
CN116055696A (en) Projection equipment and projection method
CN114866751A (en) Projection equipment and trigger correction method
CN115002432A (en) Projection equipment and obstacle avoidance projection method
CN115883803A (en) Projection equipment and projection picture correction method
CN115002433A (en) Projection equipment and ROI (region of interest) feature region selection method
JP2012181264A (en) Projection device, projection method, and program
CN116320335A (en) Projection equipment and method for adjusting projection picture size
CN114928728A (en) Projection apparatus and foreign matter detection method
CN114760454A (en) Projection equipment and trigger correction method
CN115529445A (en) Projection equipment and projection image quality adjusting method
CN115623181A (en) Projection equipment and projection picture moving method
CN116246553A (en) Projection apparatus and projection method
CN115604445A (en) Projection equipment and projection obstacle avoidance method
CN114885142B (en) Projection equipment and method for adjusting projection brightness
WO2023087951A1 (en) Projection device, and display control method for projected image
WO2023087948A1 (en) Projection device and display control method
CN115243021A (en) Projection equipment and obstacle avoidance projection method
WO2024066776A9 (en) Projection device and projection-picture processing method
WO2023115857A1 (en) Laser projection device, and projection image correction method
CN118075435A (en) Projection equipment and instruction response method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant