WO2023087960A1 - Projection device and focusing method - Google Patents

Projection device and focusing method

Info

Publication number
WO2023087960A1
Authority
WO
WIPO (PCT)
Prior art keywords
focusing
focus
projection device
controller
projection
Prior art date
Application number
PCT/CN2022/123540
Other languages
English (en)
French (fr)
Inventor
王英俊
陈先义
张伟
何营昊
郑晴晴
岳国华
唐高明
卢平光
李雨欣
孙超
甄凌云
Original Assignee
海信视像科技股份有限公司 (Hisense Visual Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN202210345204.3A (CN114727079A)
Priority claimed from CN202210625987.0A (CN115002433A)
Application filed by 海信视像科技股份有限公司 (Hisense Visual Technology Co., Ltd.)
Publication of WO2023087960A1

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • G03B21/53Means for automatic focusing, e.g. to compensate thermal effects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]

Definitions

  • the present application relates to the technical field of display devices, and in particular to a projection device and a focusing method.
  • a projection device is a display device that can project images or video onto a screen.
  • the projection device can project the laser light of a specific color onto the screen through the refraction of the optical lens assembly to form a specific image.
  • in order to adapt to complex application scenarios and screens of different specifications, it is necessary to adjust the focal length of the optical lens assembly of the projection device. For example, the user can manually adjust the distance between the lenses in the optical lens assembly while observing the sharpness of the picture projected by the projection device, so that the overall focal length of the optical lens assembly changes. As the user adjusts, the clarity of the projected image changes, and the adjustment stops when the clarity meets the user's needs.
  • the manual focusing process is cumbersome and inconvenient for users to use.
  • some projection devices also support an auto-focus function.
  • the automatic focusing function adjusts the focus by providing a driving motor that drives some of the lenses in the optical lens assembly to move.
  • the projection device also detects the sharpness of the projected picture, and controls the driving motor to start or stop according to the detected sharpness, so as to realize automatic focusing.
  • the above-mentioned automatic focusing method takes a long time and has poor anti-interference ability, which reduces user experience.
  • the first aspect of the embodiments of the present application provides a projection device, including: an optical machine configured to project content onto a projection surface; a lens including an optical assembly and a driving motor, the driving motor being connected to the optical assembly to adjust the focal length of the optical assembly; a memory configured to store position memory information; and a controller configured to: acquire a focusing instruction input by a user; in response to the focusing instruction, extract the position memory information, which includes the current position of the optical assembly and the reliability of the current position; in response to the reliability being a first value, calculate a first focusing amount based on the current position and send a first focusing instruction to the driving motor, the first focusing instruction being used to control the driving motor to move the position of the optical assembly according to the first focusing amount; and in response to the reliability being a second value, the second value being smaller than the first value, calculate a second focusing amount based on the starting position of the focusing interval and send a second focusing instruction to the driving motor.
  • the second aspect of the embodiments of the present application provides a focusing method for a projection device, the projection device including an optical machine, a lens, and a controller, wherein the lens includes an optical assembly and a driving motor, the driving motor being connected to the optical assembly to adjust its focal length. The focusing method includes: obtaining a focusing instruction input by a user; in response to the focusing instruction, extracting the position memory information, which includes the current position of the optical assembly and the reliability of the current position; in response to the reliability being a first value, calculating a first focusing amount based on the current position and sending a first focusing instruction to the driving motor, the first focusing instruction being used to control the driving motor to move the position of the optical assembly according to the first focusing amount; and in response to the reliability being a second value, the second value being smaller than the first value, calculating a second focusing amount based on the starting position of the focusing interval and sending a second focusing instruction to the driving motor, the second focusing instruction being used to control the driving motor to move the position of the optical assembly according to the second focusing amount.
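The reliability-based branching described in the two aspects above can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: the reliability constants, the focus-interval start value, and the function names are all assumptions.

```python
# Illustrative sketch (not the patent's code) of the reliability-based focusing
# decision: the constants, interval start, and names below are assumptions.

RELIABLE = 1        # "first value": the stored position can be trusted
UNRELIABLE = 0      # "second value" (smaller): fall back to the interval start

FOCUS_INTERVAL_START = 200   # assumed starting position of the focusing interval (steps)

def plan_focus(position_memory, target_position):
    """Return (start, focusing_amount) for the drive motor."""
    if position_memory["reliability"] == RELIABLE:
        # First focusing amount: calculated from the remembered current position.
        start = position_memory["current_position"]
    else:
        # Second focusing amount: calculated from the focus-interval start.
        start = FOCUS_INTERVAL_START
    return start, target_position - start

# Reliable memory at step 500, target 620: move +120 steps from 500.
print(plan_focus({"current_position": 500, "reliability": RELIABLE}, 620))
```

Starting from a trusted remembered position shortens the search, while the fallback to the interval start keeps the method correct when the memory cannot be trusted.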
  • FIG. 1 is a schematic diagram of a projection state of a projection device in an embodiment of the present application.
  • FIG. 2 is a schematic structural diagram of a projection device in an embodiment of the present application.
  • FIG. 3 is a schematic diagram of the optical-mechanical architecture of the projection device in an embodiment of the present application.
  • FIG. 4 is a schematic diagram of the optical path of the projection device in an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a circuit structure of a projection device according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of a system framework of a projection device in an embodiment of the present application.
  • FIG. 7 is a schematic diagram of the lens structure of the projection device in an embodiment of the present application.
  • FIG. 8 is a schematic diagram of the lens projection optical path in an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of a distance sensor and a camera in an embodiment of the present application.
  • FIG. 10 is a schematic flow chart of a focusing method based on position memory in an embodiment of the present application.
  • FIG. 11 is a schematic diagram of the focusing process based on the first focusing amount in an embodiment of the present application.
  • FIG. 12 is a schematic flow chart of calculating the first focusing amount in an embodiment of the present application.
  • FIG. 13 is a schematic diagram of the focusing process based on the second focusing amount in an embodiment of the present application.
  • FIG. 14 is a schematic diagram of data interaction between functional modules in an embodiment of the present application.
  • FIG. 15 is a schematic flow chart of setting the reliability in an embodiment of the present application.
  • FIG. 16 is a schematic diagram of the multi-stage focusing process in an embodiment of the present application.
  • FIG. 17 is a time sequence flow chart of the focusing method based on position memory in an embodiment of the present application.
  • FIG. 18 is a schematic diagram of the signaling interaction sequence of a projection device implementing the anti-eye (eye protection) function according to another embodiment of the present application.
  • FIG. 19 is a schematic diagram of the signaling interaction sequence of a projection device implementing a display screen correction function according to another embodiment of the present application.
  • FIG. 20 is a schematic flow diagram of a projection device implementing an autofocus algorithm according to another embodiment of the present application.
  • FIG. 21 is a schematic flow diagram of a projection device implementing keystone correction and obstacle avoidance algorithms according to another embodiment of the present application.
  • FIG. 22 is a schematic flow diagram of a projection device implementing a screen-entry algorithm according to another embodiment of the present application.
  • FIG. 23 is a schematic flow diagram of a projection device implementing the anti-eye algorithm according to another embodiment of the present application.
  • FIG. 24 is a schematic flowchart of a method for selecting an ROI feature region according to other embodiments of the present application.
  • FIG. 25 exemplarily shows a schematic diagram of contour region classification in the ROI feature region selection method in some embodiments.
  • FIG. 26 exemplarily shows a flowchart of a method for selecting an ROI feature region in some embodiments.
  • FIG. 27 exemplarily shows a schematic diagram of a fine focusing process in some embodiments.
  • FIG. 28 exemplarily shows a schematic diagram of the process of the ROI feature region selection method in some embodiments.
  • FIG. 29 exemplarily shows a timing relationship diagram of the ROI feature region selection method in an embodiment of the present application.
  • FIG. 30 exemplarily shows a schematic flowchart of an automatic focusing method in some embodiments.
  • FIG. 31 exemplarily shows a flowchart of an autofocus method in some embodiments.
  • FIG. 32 exemplarily shows a flowchart of an autofocus method in some embodiments.
  • FIG. 33 exemplarily shows a flowchart of an autofocus method in some embodiments.
  • FIG. 34 exemplarily shows a schematic diagram of a fine-focusing process in an autofocus method in some embodiments.
  • FIG. 35 exemplarily shows a timing diagram of an autofocus method in an embodiment of the present application.
  • a projection device is a device that can project images or videos onto a screen.
  • the projection device can be connected to computers, radio and television networks, the Internet, VCD (Video Compact Disc) players, DVD (Digital Versatile Disc) players, game consoles, DV cameras, etc. through different interfaces to play the corresponding video signals.
  • Projection equipment is widely used in homes, offices, schools and entertainment venues, etc.
  • FIG. 1 is a schematic diagram of an arrangement of a projection device according to an embodiment of the present application
  • FIG. 2 is a schematic diagram of an optical path of a projection device according to an embodiment of the present application.
  • a projection device of the present application includes a projection screen 1 and a device 2 for projection.
  • the projection screen 1 is fixed at the first position, and the device 2 for projection is placed at the second position, so that the projected picture coincides with the projection screen 1 .
  • the projection device includes a laser light source 100 , an optical machine 200 , a lens 300 , and a projection medium 400 .
  • the laser light source 100 provides illumination for the optical machine 200.
  • the optical machine 200 modulates the light beam and outputs it to the lens 300 for imaging, projecting it onto the projection medium 400 to form a projection picture.
  • the laser light source 100 of the projection device includes a laser component 110 and an optical lens component 120 , the light beam emitted by the laser component 110 can pass through the optical lens component 120 to provide illumination for the optical machine.
  • the optical lens assembly 120 requires a higher level of environmental cleanliness and airtight sealing, while the chamber where the laser component 110 is installed can be sealed to a lower dustproof level to reduce sealing costs.
  • FIG. 3 is a schematic diagram of a circuit structure of a projection device according to an embodiment of the present application.
  • the projection device may include a display control circuit 10, a laser light source 20, at least one laser driver assembly 30, and at least one brightness sensor 40.
  • the laser light source 20 may include at least one laser corresponding one-to-one with the at least one laser driver assembly 30. Here, "at least one" means one or more, and "a plurality" means two or more.
  • the projection device can realize adaptive adjustment. For example, by setting the brightness sensor 40 in the light output path of the laser light source 20 , the brightness sensor 40 can detect the first brightness value of the laser light source and send the first brightness value to the display control circuit 10 .
  • the display control circuit 10 can obtain the second brightness value corresponding to the driving current of each laser. When it determines that the difference between the second brightness value of a laser and the first brightness value of that laser is greater than a difference threshold, it determines that a COD (catastrophic optical damage) fault has occurred in that laser. The display control circuit can then adjust the current control signal of the laser's driver assembly until the difference is less than or equal to the difference threshold, thereby eliminating the COD fault of, for example, the blue laser. The projection device can thus eliminate laser COD failures in time, reduce the damage rate of the lasers, and improve the image display effect of the projection device.
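The brightness-comparison logic described above can be sketched as follows. The difference threshold, the current-reduction step, and all names are illustrative assumptions, not values taken from the patent.

```python
# Illustrative sketch of the COD-fault check described above; the difference
# threshold and current-reduction step are assumed values, not from the patent.

DIFF_THRESHOLD = 50        # brightness gap that signals a possible COD fault
CURRENT_STEP_MA = 100      # assumed reduction of the drive current per adjustment, in mA

def check_and_adjust(first_brightness, second_brightness, drive_current_ma):
    """Compare measured (first) and expected (second) brightness; lower the
    current control signal while the gap exceeds the threshold."""
    if second_brightness - first_brightness > DIFF_THRESHOLD:
        return True, drive_current_ma - CURRENT_STEP_MA   # COD fault suspected
    return False, drive_current_ma

print(check_and_adjust(400, 480, 2000))   # prints (True, 1900): fault flagged, current reduced
```

In a real controller this check would run in a loop, reducing the current each iteration until the brightness gap falls back under the threshold.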
  • FIG. 4 is a schematic structural diagram of a projection device according to an embodiment of the present application.
  • the laser light source 20 in the projection device may include a blue laser 201, a red laser 202 and a green laser 203 which are set independently, and the projection device may also be called a three-color projection device. The blue laser 201, the red laser 202 and the green laser 203 are all MCL packaged lasers, which are small in size and conducive to a compact arrangement of optical paths.
  • the laser light source can also be a monochromatic laser or a dual-color laser.
  • the projection device may be configured with a camera for cooperating with the projection device to achieve adjustment and control of the projection process.
  • the camera configured on the projection device can be specifically implemented as a 3D camera or a binocular camera; when implemented as a binocular camera, it specifically includes a left camera and a right camera. The binocular camera can capture the picture corresponding to the projection device, that is, the image and playback content presented on the projection surface, which is projected by the built-in optical machine of the projection device.
  • the controller of the projection device can, based on the images captured by the camera, correct the display of the projected image according to the included angle between the optical machine and the projection surface, thereby realizing automatic keystone correction.
  • FIG. 5 is a schematic diagram of a circuit structure of a projection device according to an embodiment of the present application.
  • the laser driving component 30 may include a driving circuit 301 , a switching circuit 302 and an amplifying circuit 303 .
  • the driving circuit 301 may be a driving chip.
  • the switch circuit 302 may be a metal-oxide-semiconductor (MOS) transistor.
  • the driving circuit 301 is respectively connected with the switch circuit 302 , the amplification circuit 303 and the corresponding laser included in the laser light source 20 .
  • the driving circuit 301 is used to output the driving current to the corresponding laser in the laser light source 20 through the VOUT terminal based on the current control signal sent by the display control circuit 10 , and transmit the received enabling signal to the switch circuit 302 through the ENOUT terminal.
  • the display control circuit 10 is further configured to determine the driving current of the laser based on the amplified driving voltage, and to obtain a second brightness value corresponding to that driving current.
  • the amplifying circuit 303 may include: a first operational amplifier A1, a first resistor (also known as a sampling power resistor) R1, a second resistor R2, a third resistor R3 and a fourth resistor R4.
  • the display control circuit 10 is further configured to restore the current control signal of the laser driver assembly corresponding to the laser to its initial value when the difference between the second brightness value of the laser and the first brightness value of the laser is less than or equal to the difference threshold. The initial value is the magnitude of the PWM (pulse width modulation) current control signal supplied to the laser in the normal state. Therefore, when a COD failure occurs in a laser, it can be quickly identified, and measures to reduce the driving current can be taken in time to limit continued damage to the laser and help it recover by itself. The whole process requires no disassembly or human intervention, which improves the reliability of the laser light source and ensures the projection display quality of laser projection equipment.
  • the controller includes at least one of a central processing unit (CPU), a video processor, an audio processor, a graphics processing unit (GPU), RAM (random access memory), ROM (read-only memory), first to n-th interfaces for input/output, a communication bus, and the like.
  • the system is divided into four layers, which from top to bottom are: the Applications layer (abbreviated as "application layer"), the Application Framework layer (abbreviated as "framework layer"), the Android Runtime and system library layer (referred to as the "system runtime layer"), and the kernel layer.
  • after the projection device is started, it can directly enter the display interface of the signal source selected last time, or the signal source selection interface, where the signal source can be a preset video-on-demand program, an HDMI (High-Definition Multimedia Interface) interface, a live TV interface, etc.
  • after the user selects a signal source, the projector can display the content obtained from that signal source.
  • FIG. 6 is a schematic diagram of a system framework for realizing display control by a projection device according to an embodiment of the present application.
  • the projection device has the characteristics of telephoto micro-projection, and its controller can control the projected image through preset algorithms to realize functions such as automatic keystone correction, automatic screen entry, automatic obstacle avoidance, automatic focusing, and anti-eye (eye protection) functionality.
  • the projection device is configured with a gyroscope sensor. During movement of the device, the gyroscope sensor can sense the positional movement and actively collect movement data; the collected data is then sent to the application service layer through the system framework layer to support user-interface interaction and the application data needed during application interaction, and the collected data can also be used in data calls by the controller when implementing algorithm services.
  • the projection device is configured with a time-of-flight sensor. After the time-of-flight sensor collects the corresponding data, the data is sent to the time-of-flight service in the service layer; after the time-of-flight service obtains the data, it sends the data to the application service layer through the process communication framework, where it is used in data calls by the controller, the user interface, and application programs.
  • the projection device is configured with a camera for collecting images, which can be implemented as a binocular camera, a depth camera, a 3D camera, etc. The data collected by the camera is sent to the camera service, which then sends the collected image data to the process communication framework and/or the projection device correction service. The correction service can receive the camera data sent by the camera service, and the controller can call the corresponding control algorithm in the algorithm library for each function to be realized.
  • data interaction is performed with the application service through the process communication framework, and the calculation result is then fed back to the correction service through the process communication framework. The correction service sends the obtained calculation result to the operating system of the projection device to generate a control signal command, and sends the control signal to the optical-mechanical control driver, which controls the working conditions of the optical machine and realizes automatic correction of the displayed image.
  • the projection device uses an automatic focusing algorithm: it obtains the current object distance using its configured laser ranging module to calculate an initial focal length and a search range; the projection device then drives the camera to take pictures and uses the corresponding algorithm to evaluate sharpness.
  • the projection device searches for the best possible focal length based on the search algorithm, then repeats the above steps of photographing and sharpness evaluation, and finally finds the optimal focal length through sharpness comparison to complete automatic focusing.
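The sharpness ("definition") evaluation mentioned above can be implemented with many metrics; one common choice is gradient energy, since a well-focused image has larger intensity differences between neighbouring pixels. The pure-Python sketch below is only one such metric and is not necessarily the one the device uses.

```python
# One common sharpness ("definition") measure is gradient energy: a well-focused
# image has larger intensity differences between neighbouring pixels. This
# pure-Python sketch stands in for whatever metric the device actually uses.

def sharpness(image):
    """Sum of squared horizontal and vertical pixel differences of a 2D grayscale grid."""
    h, w = len(image), len(image[0])
    score = 0
    for y in range(h):
        for x in range(w):
            if x + 1 < w:
                score += (image[y][x + 1] - image[y][x]) ** 2
            if y + 1 < h:
                score += (image[y + 1][x] - image[y][x]) ** 2
    return score

blurry = [[10, 12, 14], [12, 14, 16], [14, 16, 18]]
sharp = [[0, 255, 0], [255, 0, 255], [0, 255, 0]]
print(sharpness(blurry) < sharpness(sharp))   # prints True: the sharp image scores higher
```

In practice, the variance of a Laplacian filter response is another widely used focus measure with the same monotonic behavior.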
  • after the projection device is started and the user moves the device, the projection device automatically completes calibration and refocuses. The controller first detects whether the auto-focus function is turned on; if it is not, the controller ends the auto-focus service; if it is, the projection device obtains the detection distance from the time-of-flight sensor through the middleware for calculation.
  • the controller queries a preset mapping table according to the obtained distance to obtain the focal length of the projection device; the middleware then obtains this focal length and sets it on the optical machine of the projection device. After the optical machine emits laser light at that focal length, the camera executes a photographing command, and the controller judges whether focusing is complete according to the captured image and an evaluation function.
  • if the judgment result meets the preset completion conditions, the auto-focusing process ends; if not, the middleware fine-tunes the focal length parameters of the optical machine and sets the adjusted focal length parameters on the optical machine again, repeating the steps of photographing and sharpness evaluation until the optimal focal length is found through sharpness comparison and automatic focusing is completed.
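The loop above — distance measurement, table lookup, then iterative fine-tuning by sharpness — can be sketched end to end as follows. The table values, step size, and all names are invented for illustration; the real device's mapping table and evaluation function are not published here.

```python
# End-to-end sketch of the loop above: map the measured distance to an initial
# focal position via a preset table, then hill-climb on sharpness. Table values,
# step size, and names are invented for illustration.

FOCUS_TABLE = {1.0: 300, 2.0: 500, 3.0: 650}   # distance (m) -> motor position (steps)

def initial_position(distance_m):
    """Pick the table entry whose distance is closest to the measurement."""
    return FOCUS_TABLE[min(FOCUS_TABLE, key=lambda d: abs(d - distance_m))]

def autofocus(distance_m, sharpness_at, step=10, max_iter=50):
    """Fine-tune from the table position until sharpness stops improving."""
    pos = initial_position(distance_m)
    best = sharpness_at(pos)
    direction = 1
    for _ in range(max_iter):
        candidate = pos + direction * step
        score = sharpness_at(candidate)
        if score > best:
            pos, best = candidate, score   # keep climbing in this direction
        elif direction == 1:
            direction = -1                 # try the other direction once
        else:
            break                          # neither direction improves: done
    return pos

# With a (synthetic) sharpness peak at step 540, starting from the 2 m entry (500):
print(autofocus(2.1, lambda p: -abs(p - 540)))   # prints 540
```

The table lookup keeps the search short, and the hill climb only has to cover the residual error between the table's coarse estimate and the true best position.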
  • Fig. 7 is a schematic diagram of a lens structure of a projection device in some embodiments.
  • the lens 300 of the projection device may further include an optical assembly 310 and a driving motor 320 .
  • the optical assembly 310 is a lens group composed of one or more lenses, which can refract the light emitted by the optical machine 200 so that the light can be transmitted to the projection surface to form a projected content image.
  • the optical assembly 310 may include a lens barrel and a plurality of lenses disposed in the lens barrel. According to whether a lens's position can be moved, the lenses in the optical assembly 310 can be divided into a movable lens 311 and a fixed lens 312: by changing the position of the movable lens 311, the distance between the movable lens 311 and the fixed lens 312 is adjusted, changing the overall focal length of the optical assembly 310. Therefore, the driving motor 320, by connecting to the movable lens 311 in the optical assembly 310, can drive the movable lens 311 to change position, realizing the auto-focus function.
  • the focusing process described in some embodiments of the present application refers to changing the position of the moving lens 311 through the driving motor 320, thereby adjusting the distance between the moving lens 311 and the fixed lens 312, that is, adjusting the position of the image plane. According to the imaging principle of the lens combination in the optical assembly 310, adjusting the focal length is really an adjustment of the image distance; but in terms of the overall structure of the optical assembly 310, adjusting the position of the moving lens 311 is equivalent to adjusting the overall focal length of the optical assembly 310. Therefore, for convenience of description, the following embodiments describe this process as adjusting the focal length.
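The image-distance remark above follows the thin-lens relation 1/f = 1/u + 1/v. The numbers below are assumed, purely to show the effect: a tiny change in the lens-to-chip distance u moves the in-focus image plane v a long way, which is why small lens movements suffice for focusing.

```python
# The "image distance" remark above follows the thin-lens relation
# 1/f = 1/u + 1/v. The numbers below are assumptions, purely illustrative.

def image_distance(f_mm, u_mm):
    """Image distance v solving 1/f = 1/u + 1/v."""
    return 1.0 / (1.0 / f_mm - 1.0 / u_mm)

# Assumed f = 20 mm; imaging chip 21 mm from the lens:
print(round(image_distance(20.0, 21.0), 2))   # prints 420.0 (image plane 420 mm away)
# Chip at 20.5 mm instead:
print(round(image_distance(20.0, 20.5), 2))   # prints 820.0
```

A 0.5 mm lens movement doubled the in-focus distance in this toy example, illustrating why the drive motor only needs a short mechanical stroke.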
  • the driving motor 320 can be connected to the moving lens 311 through a specific transmission mechanism.
  • the transmission principle of the transmission mechanism can be any transmission structure that converts the rotation action into the movement action.
  • for example, a worm gear transmission structure, a ball screw transmission structure, a threaded screw transmission structure, etc.
  • in a threaded screw transmission structure, the outer edge of the moving lens 311 is provided with a frame, and the frame may be provided with threads.
  • the power output shaft of the driving motor 320 is connected to a screw, and the screw engages with the thread on the frame, so that the rotation output by the driving motor 320 is converted into movement of the frame, thereby driving the moving lens 311 to move within the lens barrel.
  • the projection device can rotate the driving motor 320 by a specific angle or number of turns to bring the moving lens 311 to a corresponding position.
  • the driving motor 320 can be a stepper motor, a servo motor, or another motor whose rotation angle can be controlled.
  • the controller 500 of the projection device may send a movement instruction to the drive motor 320 , and the movement instruction may include angle data required to control the rotation of the drive motor 320 .
  • the movement instruction sent by the controller 500 may include a pulse signal corresponding to the rotation angle; after the movement instruction is sent to the driving motor 320, the driving motor 320 can parse the pulse signal from the instruction and rotate according to the pulse signal.
  • the driving motor may also be an ultrasonic motor or a voice coil motor, which will not be described in detail here.
  • the corresponding relationship between the moving distance of the moving lens 311 and the rotation angle of the driving motor 320 may be calculated in advance according to the internal structure of the projection device.
  • the corresponding relationship between the moving distance and the rotation angle can be a linear relationship, which is affected by the transmission ratio of the transmission mechanism.
  • during focusing, the projection device can first calculate the target position of the moving lens 311, and then calculate the distance the moving lens 311 needs to move by taking the difference with its current position. Then, according to the corresponding relationship between moving distance and rotation angle, the angle through which the driving motor 320 needs to turn is calculated, so as to generate a movement command and send it to the driving motor 320.
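The position-difference-to-rotation computation above can be sketched as follows. The screw lead (lens travel per motor revolution) and the step resolution are assumed figures, not taken from the patent.

```python
# Sketch of the distance-to-rotation computation described above. The screw lead
# (lens travel per motor revolution) and step resolution are assumed figures.

LEAD_MM_PER_REV = 0.5   # lens travel per full revolution of the screw (assumption)
STEPS_PER_REV = 200     # common stepper resolution: 1.8 degrees per step

def move_command(current_mm, target_mm):
    """Return (steps, degrees) the drive motor must turn; sign gives direction."""
    distance = target_mm - current_mm           # difference with the current position
    revolutions = distance / LEAD_MM_PER_REV    # linear relation via the transmission
    return round(revolutions * STEPS_PER_REV), revolutions * 360.0

print(move_command(2.0, 3.0))   # prints (400, 720.0): 1 mm of travel = 2 revolutions
```

The linear relation holds as long as the transmission ratio is constant, as the text notes; a different lead or gear ratio only changes the two constants.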
  • FIG. 8 is a schematic diagram of the optical path of lens projection in some embodiments. As shown in FIG. 8, for convenience of description, the end of the moving lens 311's travel closest to the optical machine is called the proximal end, and the end farthest from the optical machine is called the distal end; the overall movement stroke of the moving lens 311 is the distance between the proximal end and the distal end.
  • the actual focus range of the projection device may be within the travel range of the moving lens 311 .
  • for example, the drive motor 320 can be adjusted forward by 300 steps from the proximal end to meet the focusing requirement of the projection device at the shortest projection distance;
  • the actual focus range may then be the position interval corresponding to the drive motor 320 in the range of 300 to 900 steps.
  • the adjustment range of the projection device may be based on the actual focus range, with some adjustment margin added. For example, for the position interval corresponding to 300 to 900 steps, after adding an adjustment margin of 100 steps, the position 200 steps from the proximal end and the position at 1000 steps can be set as the adjustment start and end points, forming the final focus range.
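The margin arithmetic in the example above can be written out as a small helper; the numbers are the text's own example values (300 to 900 steps, 100-step margin).

```python
# The margin arithmetic from the example above (300-900 steps, 100-step margin),
# written out as a helper; the numbers are the text's own example values.

def final_focus_range(near_steps, far_steps, margin):
    """Extend the actual focus interval by an adjustment margin on both ends."""
    return near_steps - margin, far_steps + margin

print(final_focus_range(300, 900, 100))   # prints (200, 1000)
```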
  • the lens of the projection device needs to be adjusted to different focal lengths so as to project a clear image on the projection surface.
  • the distance between the projection device and the projection surface varies with where the user places the device, and different distances require different focal lengths. Therefore, in order to adapt to different usage scenarios, the projection device needs to adjust the focal length of the optical assembly 310.
  • the projection device can support a manual focus function: there can be interactive buttons on the projection device, or the projection device can be equipped with a remote control, and the user can interact with the projection device through the focus buttons on the device or on the remote control. During this interaction, the controller 500 of the projection device generates a movement command according to the user's button operation and sends it to the driving motor, so as to control the driving motor to drive the moving lens 311 and change the focal length of the optical assembly.
• the projection device can generate a movement command for controlling the moving lens 311 to move away from the optical machine and send the movement command to the drive motor; after receiving the movement command, the drive motor rotates in the forward (clockwise) direction to output torque and drives the moving lens 311 away from the optical machine.
• the clarity of the picture projected by the projection device will change, and the user can choose to continue or stop the focusing operation according to the picture clarity until the user is satisfied with the picture effect.
  • the controller 500 of the projection device may implement corresponding control on the driving motor 320 according to a preset interaction rule. That is, in some embodiments, the controller 500 can determine the rotation angle of the driving motor according to the duration of the user's key pressing. Then, when the focus adjustment amount is large, the user can press the focus adjustment button for a long time; and when the focus adjustment amount is small, the user can press the focus adjustment button for a short time.
  • the controller 500 of the projection device can also determine the rotation angle of the driving motor according to the number of key presses by the user. For example, when the projection device is in the rough adjustment state, the adjustment amount of each button operation is set to 100 steps, corresponding to one rotation of the drive motor, then when the user adjusts 300 steps to the far end, the user needs to press the "forward" button three times in a row.
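The rough-adjustment rule above (100 steps per press, so three presses for a 300-step move) can be sketched as below; the helper function and the ceiling-division formulation are illustrative assumptions, not details from the text.

```python
STEPS_PER_PRESS = 100  # rough adjustment state: one press = 100 steps (one motor rotation)

# Hypothetical helper: how many consecutive presses of the "forward" button
# are needed to cover a desired step count in the rough adjustment state.
def presses_needed(target_steps, steps_per_press=STEPS_PER_PRESS):
    return -(-target_steps // steps_per_press)  # ceiling division
```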
  • the projection device can also automatically adjust the focal length according to the projection effect.
• a focusing method based on the relationship between separation distance and focal length is provided. That is, the projection device can detect the distance between the optical machine and the projection surface. Since different distances require different focal lengths to present a clear picture, after detecting the distance to the projection surface the projection device can determine the appropriate focal length, and then send a movement instruction to the driving motor so as to adjust the overall focal length of the optical assembly to that value.
  • Fig. 9 is a schematic structural diagram of a distance sensor and a camera in some embodiments.
  • the projection device can also have a built-in or external camera 700 , and the camera 700 can take images of images projected by the projection device to obtain projection content images.
  • the projection device checks the definition of the projected content image to determine whether the current lens focal length is appropriate, and adjusts the focal length if it is not appropriate.
  • the projection device can continuously adjust the lens position and take pictures, and find the focus position by comparing the clarity of the front and rear position pictures, so as to adjust the moving lens 311 in the optical assembly to Suitable location.
• the controller 500 may first control the driving motor 320 to gradually move the moving lens 311 from the focus start position to the focus end position, and continuously obtain projected content images through the camera 700 during this period. Then, by performing definition detection on multiple projected content images, the position with the highest definition is determined, and finally the driving motor 320 is controlled to adjust the moving lens 311 from the focus end position back to the position with the highest definition, completing automatic focusing.
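The sweep described above can be sketched as follows. `capture` and `sharpness` are hypothetical stand-ins for the camera 700 and the definition-detection step; the motor-control details are omitted, so this is an outline of the comparison logic only.

```python
# Minimal autofocus-sweep sketch: step through every lens position from the
# focus start to the focus end, score a captured image at each position, and
# report the position with the highest definition (the motor then returns
# there to complete focusing).
def autofocus_sweep(positions, capture, sharpness):
    best_pos, best_score = None, float("-inf")
    for pos in positions:                 # drive motor visits each position
        score = sharpness(capture(pos))   # definition detection on the image
        if score > best_score:
            best_pos, best_score = pos, score
    return best_pos
```

A real implementation would use an image-based focus measure (for example variance of a high-pass filter response) as `sharpness`.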
  • Fig. 10 is a schematic flowchart of a focusing method based on position memory in some embodiments.
  • the focusing method can memorize the position of the driving motor 320 and perform focusing based on the memorized position, so as to reduce the movement stroke of the driving motor 320 and increase the focusing speed.
  • the focusing method may be applied to a projection device, and in order to meet the implementation of the focusing method, the projection device may include an optical machine 200 , a lens 300 , a memory, and a controller 500 . Wherein, as shown in FIG. 10, the controller 500 can be used to execute the program steps corresponding to the focusing method, including the following:
  • the focusing instruction input by the user may include a manual focusing instruction and an automatic focusing instruction.
  • the manual focus instruction can be input through a physical button on the projection device, or a physical button on a remote controller supporting the projection device.
  • the auto-focus instruction may be actively input by the user. For example, after turning on the power of the projection device, the user can press the auto-focus button on the projection device or the remote controller supporting the projection device to make the projection device automatically focus, that is, to obtain the auto-focus instruction.
  • the auto-focus instruction can also be automatically generated according to the built-in control program of the projection device. For example, when the projection device detects the first video signal input after being powered on, it may trigger auto-focusing, and then generate an auto-focusing instruction. Also for example, when the projection device detects that its own posture or setting position has changed, in order to eliminate the influence of the change process, the projection device can automatically adjust the focus after detecting that the posture or setting position has changed, that is, generate an automatic adjustment focus command.
  • the projection device may extract position memory information from the memory.
  • a position storage module can be pre-configured in the projection device, and the position storage module can record the rotation of the drive motor 320 in real time during the focusing process.
  • the position memory information can be generated by recording the position after focusing each time the focus is performed.
  • the focusing process capable of triggering real-time recording may be a manual focusing process or an automatic focusing process.
  • the position memory information includes the current position of the optical component 310 and the reliability of the current position.
  • the current position is the position of the optical assembly 310 after each focus adjustment, and the current position will not change when no focus adjustment is performed.
• the reliability is used to characterize the confidence level of the current position, that is, whether the recorded current position can be used in the subsequent focusing process; setting the reliability thus provides an evaluation of the recorded current position.
  • a temporary variable may be initialized first.
  • the temporary variable is used to record the moving direction and the number of moving steps of the optical assembly 310 driven by the driving motor 320 during the focusing process. That is, the projection device can update the focusing amount during the focusing process to the temporary variable in real time.
  • the current position in the position memory information is updated according to the temporary variable, and the reliability of the current position is set according to whether an abnormal situation occurs during the focusing process.
  • the controller 500 can notify the location storage module that the focus will be adjusted, and the location storage module will set the location reliability to 0 after receiving the notification, and wait for the completion of the focus adjustment process.
  • the projection device receives user operation keys again, and if the user adjusts the focus through the direction keys, the position of the lens 311 is moved according to the user's requirements, and the user's operation process, including the moving direction and the number of moving steps, is memorized in real time through temporary variables.
  • the projection device can judge whether the current location is available according to the credibility of the location memory information.
  • the reliability may represent the validity of the memory information of the current location through a specific numerical value. That is, when the reliability is the first value, it means that the current location memory information is valid; when the reliability is the second value, it means that the current location memory information is invalid.
  • the first value may be set to be greater than the second value. For example, setting the first value to 1 means that the current memory information is valid; setting the second value to 0 means that the current memory information is invalid.
  • the validity of the current location memory information can also be indicated by other numerical values. For example, when the reliability is an odd number, it indicates that the location memory information is valid; when the reliability is an even number, it indicates that the location memory information is invalid.
  • FIG. 11 is a schematic diagram of the focusing process based on the first focusing amount in some embodiments.
• the projection device can calculate the first focusing amount based on the current position, and send a first focusing instruction to the drive motor 320 .
  • the first focus instruction is used to control the driving motor 320 to move the position of the optical assembly according to the first focus amount.
  • the projection device may respond to the manual focus instruction and extract position memory information.
  • the extracted position memory information includes that the current position is a position 350 steps away from the near end, and the reliability corresponding to the current position is 1.
• the "forward" arrow key corresponding to the manual focus instruction corresponds to a focusing amount of 100 steps. Therefore, the projection device may calculate the first focusing amount based on the current position, that is, add 100 steps to the 350 steps.
  • the controller 500 of the projection device may respond to the auto-focusing instruction and acquire the separation distance through the distance sensor 600 .
  • the distance sensor 600 may be a sensor device based on the time of flight (Time of Flight, TOF) principle, such as laser radar and infrared radar capable of detecting the target distance.
  • the distance sensor 600 can be set at the position of the optical machine 200, including the signal transmitting end and the receiving end.
• the transmitting end of the distance sensor 600 can transmit a wireless signal toward the projection surface. The flight time of the signal is calculated from the transmission time and the time at which the receiving end receives the signal; combined with the propagation speed, this gives the actual flight distance of the wireless signal, from which the distance between the projection surface and the optical machine is calculated.
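The time-of-flight computation implied above can be sketched as follows. The speed constant and the function shape are illustrative assumptions; the key point is that the signal travels to the projection surface and back, so the one-way separation distance is half the total flight distance.

```python
SIGNAL_SPEED = 3.0e8  # m/s, assumed propagation speed of the wireless signal

# Hypothetical sketch of the TOF distance calculation: flight time multiplied
# by the propagation speed gives the round-trip distance; halving it gives the
# separation between the optical machine and the projection surface.
def tof_distance(emit_time_s, receive_time_s, speed=SIGNAL_SPEED):
    flight_time = receive_time_s - emit_time_s
    return speed * flight_time / 2.0
```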
• Figure 12 is a schematic flow chart of calculating the first focusing amount in some embodiments, as shown in Figure 12, including: step 1201, the projection device can first obtain the separation distance; step 1202, query the preset focus distance comparison table from the memory, wherein the focus distance comparison table includes a mapping relationship between the separation distance and the target focus position.
  • the projection device can query the focal length corresponding to the current separation distance from the comparison table.
  • the focal length data can be expressed as the distance of the moving lens 311 relative to the near or far end of the stroke, and the corresponding direction, angle and number of turns that the drive motor 320 needs to rotate;
  • Step 1203 query the target focus position in the focus distance comparison table according to the separation distance.
  • Step 1204 obtaining the current position;
  • Step 1205 calculating a first focusing amount in combination with the target focusing position and the current position, wherein the first focusing amount is the difference between the target focusing position and the current position value.
• Step 1206, generate a first focus instruction according to the first focus amount. That is, after obtaining the separation distance, the projection device can call the stored comparison table of separation distance and focal length, generate the first focusing instruction according to the focal length data obtained from the query, and send the first focusing instruction to the drive motor 320 so as to control the drive motor 320 to drive the moving lens 311 to the target position.
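Steps 1201-1206 can be sketched as follows. The table contents are invented for illustration, except that the 2 m entry is chosen to match the worked example in the text (current position 350 steps, target 560 steps, first focusing amount 210 steps); the nearest-entry lookup is also an assumption of the sketch.

```python
# Hypothetical focus distance comparison table: separation distance (m)
# mapped to the target focus position (steps from the near end).
FOCUS_DISTANCE_TABLE = {1.0: 300, 2.0: 560, 3.0: 760}

def first_focus_amount(separation_m, current_position, table=FOCUS_DISTANCE_TABLE):
    # step 1203: query the target focus position for the nearest tabulated distance
    nearest = min(table, key=lambda d: abs(d - separation_m))
    target_position = table[nearest]
    # step 1205: first focusing amount = target position - current position
    return target_position - current_position
```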
  • the target position of the optical component 310 corresponding to the best focal length can be determined to be 560 steps from the near end through the focus distance comparison table.
  • the current position is extracted, which is 350 steps from the near end.
  • a first focusing instruction is sent to the driving motor 320 according to the first focusing amount, so as to control the driving motor 320 to drive the optical assembly 310 to move forward 210 steps to reach the target position.
  • Fig. 13 is a schematic diagram of a focusing process based on a second focusing amount in some embodiments.
  • the projection device can calculate the second focus amount, and send a second focus instruction to the drive motor 320 .
• the second focus instruction is used to control the driving motor 320 to move the position of the optical assembly 310 according to the second focus amount.
• the projection device may first control the driving motor to move the optical assembly to the starting position of the focusing interval at the first rate, and then calculate a second focusing amount, where the second focusing amount is the difference between the starting position and the target focusing position. A second focusing command is then generated according to the second focusing amount and sent to the driving motor, so as to control the driving motor to move the optical assembly to the target focusing position at a second rate, wherein the second rate is less than the first rate.
• the projection device can first quickly move the optical component to the starting position of the focusing interval at the first rate, such as the near end of the travel of the optical component, and then relatively slowly move the optical component to the target focusing position at the second rate, thereby reducing the influence of inertia and stroke clearance on focusing accuracy during fast movement.
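The two-rate strategy above can be sketched as below. `move_motor`, the rate values, and the return value are assumptions of the sketch; the essential point is the fast coarse move to the interval start followed by a slower move over the second focusing amount.

```python
# Hypothetical two-rate focusing: a fast move to the start of the focusing
# interval (e.g. the near end of the travel), then a slower move over the
# second focusing amount to the target, limiting the effect of inertia and
# stroke clearance on focusing accuracy.
def two_rate_focus(current, interval_start, target, move_motor,
                   fast_rate=1000, slow_rate=200):
    move_motor(interval_start - current, fast_rate)  # first (faster) rate
    second_amount = target - interval_start          # second focusing amount
    move_motor(second_amount, slow_rate)             # second (slower) rate
    return second_amount
```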
• after obtaining the focusing instruction, the projection device can obtain the current position by reading the position memory information, and, when the reliability of the current position is the first value, directly calculate the first focusing amount based on the current position. The drive motor 320 of the projection device can thus adjust the focus directly from the current focusing position, shortening the adjustment stroke of the focusing process, reducing the influence of accumulated errors on focusing accuracy, and improving the speed and accuracy of the focusing process.
• the projection device can also record the operation of the drive motor 320 and the movement of the optical assembly 310 during this focusing process, and update the stored position memory information after the focusing process is over.
• the projection device may initialize a temporary variable after obtaining the focusing instruction, and use the temporary variable to record the first or second focusing amount; it then updates the current position in the position memory information according to the temporary variable and sets the reliability of the current position.
  • the projection device may adopt different information update methods according to different focusing methods.
• while the optical machine projects the manual focusing interface, the projection device can receive the key information input by the user, analyze the moving direction and number of moving steps in the key information, and store them in a temporary variable, so as to update the current position using the temporary variable.
• since the automatic focusing process does not need to judge the user's buttons, the movement of the driving motor 320 is completed automatically according to the automatic focusing strategy, and only the movement information of the optical component 310 needs to be recorded.
  • Figure 14 is a schematic diagram of the data interaction of each functional module in some embodiments, as shown in Figure 14, in some embodiments, the controller of the projection device can also be divided into multiple functional modules according to different functional purposes, such as focusing control module, auto focus module and manual focus module. Wherein, the focus control module can trigger an automatic focus process or a manual focus process according to system program settings or user operations.
  • the user can trigger corresponding operations according to the options of the UI interface projected by the projection device.
• the focus control module only needs to set the corresponding parameters for the auto-focus module according to the scene, such as conventional focus, keystone-correction trigger, calibration trigger, etc.; the auto-focus module then completes the focusing process by itself and memorizes the motor position.
  • the focus control module can control the focus motor through the manual focus module, and the automatic focus module sets the manual focus flag, and sets the position reliability corresponding to the position memory information to 0.
• the actual operation of the driving motor 320 is memorized in real time through the temporary variable, and the change in the moving position of the optical component 310 driven by the driving motor 320 is set in real time to the auto-focus module, forming a double memory and ensuring the reliability of the position memory information to the greatest extent.
  • the automatic focusing module can receive the instruction issued by the focusing control module, and execute the corresponding focusing strategy.
• the position reliability in the memory can be obtained; if the position reliability is not 0, it is set to 0. While the user manually adjusts the focus, the module receives the driving-motor movement information issued by the focus control module and stores it temporarily.
• the automatic focus module can compare and verify the motor movement information stored in this module with the information stored in the focus control module, so that the position reliability of the position storage module can be set according to the verification result and it can be decided whether to update the position memory information.
• the manual focus module can receive the operation instructions issued by the focus control module, such as the rotation direction and number of steps of the drive motor, control the movement of the drive motor through the driver integrated circuit (Integrated Circuit, IC) chip, and return the actual number of steps to the focus control module.
  • the actual number of moving steps can be obtained by detecting the limit switch set on the travel path of the optical component.
• the motor control unit controls the drive motor 320 to move in a step-by-step manner. If the total number of steps moved so far is less than the number of steps set by the user, then before each single step, the level of the corresponding general-purpose input/output (General-purpose Input/Output, GPIO) pin is detected to determine whether the start or end of the adjustable interval has been reached.
• if the limit has not been reached, the drive motor 320 is controlled to drive the optical assembly to move and the step count is incremented by 1; otherwise the movement stops and the current actual step count is returned.
• for example, if the motor control unit receives an instruction to move 20 steps forward and detects, when it is ready to move the 16th step, that the end of the adjustable interval has been reached, it stops executing at that point and directly returns 16 as the actual number of moving steps.
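The step-by-step loop above can be sketched as follows. `at_limit` models the GPIO level check and `step_once` the single-step drive; both are hypothetical stand-ins, and the exact count reported at the limit may differ by one from the worked example depending on when the check fires.

```python
# Hypothetical per-step move loop: before every single step, check the GPIO
# limit (at_limit); stop at the limit and return the actual steps moved.
def move_steps(requested, at_limit, step_once):
    moved = 0
    while moved < requested:
        if at_limit(moved):   # reached start/end of the adjustable interval
            break
        step_once()           # drive the motor one step
        moved += 1
    return moved
```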
• Fig. 15 is a schematic flowchart of setting reliability in some embodiments, as shown in Fig. 15, including: step 1501, in order to set the reliability of the recorded current position, the projection device can generate a broadcast message by monitoring the focusing process. A specified field in the broadcast message indicates whether the focusing process is valid: if the field indicates that the focusing process is valid, the broadcast message is the first broadcast message; if it indicates that the focusing process is invalid, the broadcast message is the second broadcast message. For the manual focusing process, the projection device can monitor key information.
• if no key information is monitored within the preset monitoring period, the second broadcast message is generated; if key information is detected within the preset monitoring period, the first broadcast message is generated, and, in response to an exit-button operation in the key information, the updated position replaces the current position in the position memory information.
• Step 1502, according to the content of the generated broadcast message, in response to the broadcast message being the first broadcast message, use the temporary variable and the current position to calculate the updated position; Step 1503, use the updated position to replace the current position in the position memory information; Step 1504, set the reliability of the position memory information to the first value.
  • Step 1505 in response to the broadcast message being the second broadcast message, or no broadcast message is generated within the preset receiving period, clearing the temporary variable; Step 1506, setting the reliability of the location memory information to the second value.
• when the user enters the manual focus interface, the focus control module first notifies the automatic focus module that manual focus will be performed, and sets the position reliability of the position storage module to 0; it then monitors abnormal-information broadcasts from the automatic focus module, such as timeouts. At the same time, it receives the user's key information, initializes the temporary variables, and temporarily stores the operation of the drive motor 320, including the moving direction and the number of steps.
  • the user can perform the focusing operation by pressing the up and down direction buttons.
• when the focusing control module receives valid keys, that is, the up and down direction keys, it can first convert the key information into the corresponding rotation direction and number of steps of the drive motor 320, and pass them to the manual focusing module.
  • the manual focus module drives the driving motor 320 to move the optical assembly 310 in a specified direction for a specified number of steps. After the movement is completed, the projection device returns the actual number of moving steps to the focusing control module.
• after the focus control module receives the moving direction and number of steps returned by the manual focus module this time, it sends them to the automatic focus module synchronously, and then updates the temporary variable based on the existing temporary position information and the position information returned by the manual focus module this time. That is, if in the focus control module the number of forward moves in the temporary variable is A, the accumulated forward steps are AA, the number of reverse moves is B, and the accumulated reverse steps are BB, the temporary variable is recorded as [A, AA, B, BB]; if the information returned by the manual focusing module this time is a forward move of 20 steps, the temporary variable of the focusing control module is updated to [A+1, AA+20, B, BB].
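The [A, AA, B, BB] bookkeeping above can be sketched as follows; the function shape is an illustrative assumption, with A/AA counting forward moves and their accumulated steps and B/BB doing the same for reverse moves.

```python
# Hypothetical temporary-variable update for the focus control module:
# temp = [A, AA, B, BB] where A/AA are the count and accumulated steps of
# forward moves, and B/BB the count and accumulated steps of reverse moves.
def update_temp(temp, direction, steps):
    a, aa, b, bb = temp
    if direction == "forward":
        a, aa = a + 1, aa + steps
    else:
        b, bb = b + 1, bb + steps
    return [a, aa, b, bb]
```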
• when the focus control module receives the return button, it executes the exit operation, that is, it notifies the automatic focus module that manual focus has been completed and issues the temporarily memorized movement of the current focus position. It then monitors for the normal-reception broadcast sent by the auto-focus module; if this broadcast is received, the temporary variable is cleared normally and the process ends; otherwise an exception process is triggered, the position memory information is not updated, and its reliability is set to the second value. It should be pointed out that when a standby broadcast is monitored during the manual focusing process, this is equivalent to the user performing an exit operation, and the processing is the same as in the above example.
• when the projection device judges that the input time meets its own timeout condition, or monitors the exception-query broadcast sent by the automatic focus module, it enters exception processing and does not update the position memory information.
• in order to prevent the focus control module from exiting abnormally during manual focusing, which would lose the movement information of the drive motor 320 and make the stored position memory information unavailable, the projection device can detect each single movement of the drive motor in response to the first focusing instruction, and use the single movement information to update the temporary variable.
  • the end instruction is acquired, wherein the end instruction is actively input by the user, or is automatically generated according to the focusing process.
  • accumulating a plurality of single movement information within the detection period to obtain an accumulated movement amount.
• if the actual position of the optical component corresponding to the accumulated movement amount is consistent with the target focusing position, the reliability of the current position is set to the first value and the temporary variable is used to update the position memory information. If the actual position of the optical component corresponding to the accumulated movement amount is inconsistent with the target focusing position, the reliability of the current position is set to the second value and the temporary variable is cleared.
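The verification step above can be sketched as follows, reusing the earlier convention that the first value 1 marks the position memory as valid and the second value 0 as invalid; the function shape and return convention are assumptions of the sketch.

```python
FIRST_VALUE, SECOND_VALUE = 1, 0  # reliability: valid / invalid

# Hypothetical check: accumulate the single movement records within the
# detection period, compare the resulting position with the target focus
# position, and set the reliability of the current position accordingly.
def verify_position(start_position, single_moves, target_position):
    accumulated = sum(single_moves)            # accumulated movement amount
    actual = start_position + accumulated
    if actual == target_position:
        return FIRST_VALUE, actual             # update position memory
    return SECOND_VALUE, None                  # clear the temporary variable
```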
  • the automatic focusing module needs to back up the movement information of the optical assembly 310 driven by the driving motor 320 synchronously. For example, after receiving the manual focus notification issued by the focus control module, the automatic focus module first sets the manual focus start flag, and at the same time initializes a temporary variable for backing up and storing the movement information driven by the drive motor 320 . Then read the reliability corresponding to the position memory information stored in the memory, if it is not 0, then set the reliability to 0 to ensure that the position reliability is accurate.
• the automatic focus module receives a piece of single movement information from the focus control module, updates the temporary variable, stores the movement information locally in real time, and waits for the next message from the focus control module.
• when the automatic focusing module receives the notification of the completion of manual focusing and the accumulated single movement information from the focusing control module, it sends a normal-reception broadcast and compares the received information with the movement information it has stored. If the comparison results are consistent, the position memory information in the memory is updated directly, that is, the current position and reliability are updated, and the local temporary variable is cleared. If the comparison shows the information is inconsistent, exception processing is triggered and the position memory information is not updated.
• after the automatic focus module receives the manual focus start notification, if it receives another manual focus start notification before receiving the manual focus end notification, this is regarded as an exception: exception processing is triggered and the reliability is set to 0.
  • the auto-focus module does not receive valid information from the focus control module beyond the set timeout period, it is also considered to be abnormal, triggers exception processing, and does not update the position memory information.
  • the abnormal situation may be affected by various factors in the actual focusing process, and the projection device may determine whether the abnormal situation occurs by detecting various parameters in the focusing process. For example, each time the focus control module actually completes a user operation, it will synchronously deliver the movement information to the automatic focus module. Therefore, if the automatic focusing module receives the information sent by the focusing control module at a certain time, and exceeds the set timeout period, and has not received the movement information or manual focusing end information sent by the focusing control module again, It can be regarded as an abnormal situation, causing the focus control module to fail to continue to work normally, for example, other applications pop up abnormally and the focus control module cannot receive buttons, etc.
• the automatic focus module can be configured to send an exception-query broadcast about the movement information to the focus control module; after the focus control module receives the broadcast, this can be regarded as the end of manual focus, and the stored position information is delivered to the automatic focus module. If the automatic focusing module can receive the information sent by the focusing control module, it sends a normal-reception broadcast to the focusing control module so that the focusing control module can execute the end process normally. It then compares and verifies the movement information it stored itself with the movement information issued by the focus control module. If the two are consistent, the movement information is considered valid, so the position memory information can be updated and the reliability reset to the first value.
• if the auto-focus module cannot receive the information sent by the focus control module, the focus control module may have exited abnormally at this time. The auto-focus module then checks whether it has itself crashed and restarted; if not, the data stored in the auto-focus module prevails; otherwise, the position memory information is set as unreliable, that is, the reliability is set to the second value.
• the projection device can set a process-related system property: each time the process is pulled up for initialization, the property value is read, and 1 is added after the initialization is completed. The default property value after AC power-on is 1, so when the power is turned on for the first time, the property value is 1.
• when manual focusing starts, the auto-focus module first reads the property value and stores it as a backup value; when the abnormality-detection process is triggered after a possible crash and restart, it reads the property value again and compares it with the backup value. If the two are consistent, there has been no restart; otherwise it can be detected that a crash and restart occurred in this process.
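The crash-restart check above can be sketched as follows. The dict standing in for the system property store, the property name, and the class shape are all illustrative assumptions of this sketch.

```python
# Hypothetical restart detector: the process increments a persistent property
# on each initialization; a backup taken at manual-focus start is compared
# with the live value to detect a crash-and-restart in between.
class RestartDetector:
    PROP = "focus_boot_count"  # assumed property name

    def __init__(self, props):
        self.props = props     # stand-in for the system property store
        self.backup = None

    def on_init(self):         # called each time the process initializes
        self.props[self.PROP] = self.props.get(self.PROP, 0) + 1

    def snapshot(self):        # back up the value when manual focus starts
        self.backup = self.props.get(self.PROP, 1)

    def restarted(self):       # True if the process re-initialized since
        return self.props.get(self.PROP, 1) != self.backup
```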
• In some embodiments, if the focusing control module receives no user key press for a long time, or receives an abnormal-query broadcast from the automatic focusing module, it sends a notification to the automatic focusing module.
• The module sets the manual-focus end flag; if the normal-reception broadcast is not received, an abnormal situation is determined to have occurred. For example, if the user does not operate for a long time, or misuses the manual focusing interface and exits it abnormally, the focusing control module will receive no key values for a long time.
• The projection device can handle this situation by setting a timeout mechanism: the focusing control module has its own timeout mechanism, and the automatic focusing module has its own timeout mechanism as well; when its timeout expires, the automatic focusing module sends an abnormal-query broadcast to the focusing control module to trigger exception handling.
• When the focusing control module triggers this kind of exception handling, it is equivalent to the user exiting manual focusing. It sends a notification to the automatic focusing module to stop manual focusing, sends the motor movement information, and then waits for the normal-reception broadcast from the automatic focusing module. If that broadcast is received, the temporary variable is cleared and, as in the normal exit flow, the automatic focusing module completes the motor position maintenance and reliability setting. If no normal-reception broadcast is received, the focusing control module maintains the motor position information and sets the reliability itself, and separately sets the manual-focus end flag for the automatic focusing module, so that the automatic focusing module clears its existing temporary motor movement information.
• The focusing control module can also set a special flag: if the current timeout was caused by the user not operating the manual focusing interface for a long time, then when the user presses the adjustment keys again, the operation of entering manual focusing is not triggered again; instead, the focusing control module resends the manual-focus start flag to the automatic focusing module according to this flag bit.
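The timeout mechanism above can be sketched as a small session object that records the time of the last key press and reports expiry; the class name, timeout value, and method names are illustrative assumptions, not the device's actual API.

```python
import time

class ManualFocusSession:
    """Minimal sketch of a per-module manual-focus timeout (names assumed)."""
    def __init__(self, timeout_s):
        self.timeout_s = timeout_s
        self.last_key_time = time.monotonic()

    def on_key(self):
        """Any user key press resets the timeout."""
        self.last_key_time = time.monotonic()

    def timed_out(self, now=None):
        """True once no key has arrived within the timeout window.
        In the scheme above, this is when the module would send its
        abnormal-query broadcast to trigger exception handling."""
        now = time.monotonic() if now is None else now
        return (now - self.last_key_time) > self.timeout_s
```

Both the focusing control module and the automatic focusing module would each hold their own such session, consistent with the dual-timeout design described above.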
  • Fig. 16 is a schematic diagram of a multi-stage focusing process in some embodiments.
• the projection device may use an image acquisition component, such as a built-in or external camera, to perform real-time detection on the image projected by the projection device in order to find the position with the highest clarity. That is, the projection device can control the drive motor to adjust the optical assembly to the target focusing position at a first rate, and then calculate the fine adjustment interval according to the target focusing position.
• In some embodiments, the target focusing position is the starting point of the fine adjustment interval.
• The drive motor is then controlled at a second rate to drive the optical assembly through the fine adjustment interval, where the second rate is smaller than the first rate. That is, the optical assembly 310 is moved to the target focusing position at a relatively fast speed to shorten the coarse focusing time, and is then moved from one end of the fine adjustment interval to the other at a relatively slow speed, so that the camera can capture images of the projected content at more positions. By acquiring the projected content images captured by the camera while the optical assembly moves, and calculating their sharpness, the fine focusing position is obtained. The fine focusing position is the shooting position of the projected content image with the highest definition.
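The two-rate scheme above can be sketched as follows. `sharpness_fn(pos)` is a hypothetical stand-in for capturing a frame at a given motor position and scoring its sharpness; the coarse move is a single fast jump and the fine stage is a slow sweep.

```python
def two_stage_focus(target_pos, halfwidth, sharpness_fn, step=1):
    """Stage 1: jump to target_pos at the fast rate (a single coarse move).
    Stage 2: sweep the fine interval slowly, scoring a captured frame at
    each position via sharpness_fn (camera + sharpness metric assumed)."""
    best_pos, best_score = None, float("-inf")
    for pos in range(target_pos - halfwidth, target_pos + halfwidth + 1, step):
        score = sharpness_fn(pos)
        if score > best_score:
            best_pos, best_score = pos, score
    return best_pos
```

With a target position of 600 steps and a half-width of 100 steps, the sweep covers the 500-700 interval and returns the sharpest position found.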
  • the projection device may perform focusing in stages.
• A complete focusing process can be divided into three stages: a coarse focusing stage, a fine-tuning stage, and a compensation stage.
• the coarse focusing stage quickly finds the fine-tuning range;
• the fine-tuning stage finds the position of best sharpness;
• the compensation stage eliminates the position deviation that may be introduced by parallel control.
• the projection device can query the target focusing position from the distance-focus comparison table according to the distance detected by the distance sensor, and use the queried target focusing position as a reference to determine the fine focusing interval (fine adjustment interval for short).
• The fine adjustment interval takes the target focusing position as its midpoint and extends 100 steps forward and backward, forming a 200-step fine-tuning interval.
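A minimal sketch of the table lookup and interval construction described above; the table values and the closest-entry lookup rule are illustrative assumptions, not the device's actual calibration data.

```python
# Hypothetical distance (mm) -> focus position (motor steps) comparison table.
DISTANCE_TABLE = [(800, 200), (1200, 450), (1600, 600), (2400, 900)]

def lookup_target_position(distance_mm):
    """Pick the table entry whose distance is closest to the measured one
    (lookup rule assumed; a real table might interpolate instead)."""
    return min(DISTANCE_TABLE, key=lambda row: abs(row[0] - distance_mm))[1]

def fine_interval(target_pos, halfwidth=100):
    """Fine-tuning interval centred on the target position (200 steps wide)."""
    return (target_pos - halfwidth, target_pos + halfwidth)
```

For a measured distance of 1500 mm, the closest table row gives 600 steps, yielding the 500-700 step fine-tuning interval used in the examples below.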
• the projection device can also determine the fine-tuning interval based on the sharpness of the images captured by the camera. That is, the projection device can first sort the focusing positions by the sharpness of each projected content image to obtain a sharpness sequence, and extract the fine focusing interval from that sequence, where the focusing position corresponding to the projected content image with the highest definition lies within the fine focusing interval.
• the controller 500 may notify the camera 700 to take pictures and notify the sharpness evaluation unit to begin. The camera then takes photos at a specific frequency to obtain images of the projected content.
• The sharpness evaluation unit starts polling whether the container storing focusing positions is empty. If it is not empty, the unit reads the position information, reads the corresponding picture, calculates the picture sharpness, and stores the sharpness result in a storage container until needed.
• the controller 500 can read the current memorized rotational position of the drive motor 320 again, adjust the moving lens 311 to the adjustment start position or the adjustment end position according to the proximity principle, and then drive the drive motor 320 in a specific direction by a fixed number of steps.
• At each position, the controller 500 can send the current position to the camera and ask the camera to return the photo path, then save the current position to the focusing position storage container.
• When the sharpness evaluation unit detects that the container is not empty, it reads the position information, obtains the corresponding photo, and calculates its sharpness. Meanwhile, the drive motor 320 continues to move without waiting for the sharpness comparison result.
  • the above process is repeated until the driving motor 320 drives the moving lens 311 to the adjustment start position or the adjustment end position.
• When the sharpness evaluation unit detects that the adjustment start point or end point has been reached, it sorts the sharpness of each position and returns the position of best sharpness to the controller 500.
• The controller 500 then controls the drive motor 320 according to the proximity principle, based on the returned optimal position and the corresponding focusing steps, so as to drive the lens to one side of the fine adjustment interval.
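The polling loop of the sharpness evaluation unit can be sketched as a simple producer-consumer pattern. `photo_of` and `sharpness_of` are hypothetical stand-ins for fetching the photo taken at a position and scoring it; the queue plays the role of the focusing position storage container.

```python
from queue import Queue, Empty

def sharpness_worker(position_queue, photo_of, sharpness_of, results):
    """Poll the focusing-position container; whenever it is non-empty,
    read a position, fetch its photo, and record the sharpness score.
    The motor-driving side keeps filling the queue without waiting."""
    while True:
        try:
            pos = position_queue.get(timeout=0.1)
        except Empty:
            break  # container stayed empty: the sweep has finished in this sketch
        results[pos] = sharpness_of(photo_of(pos))
```

After the sweep, sorting `results` by score yields the best-sharpness position that the unit would return to the controller 500.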
• For example, if the optimal position is detected to be 600 steps from the near end, the fine focusing interval can be determined as the position interval between 500 and 700 steps from the near end.
  • the controller 500 may send a second movement command to the drive motor 320 to control the optical assembly 310 to move within the fine focusing interval according to a preset adjustment step.
• During this process, the projection device needs to obtain the fine focusing images captured by the camera 700, calculate their sharpness, and find the focusing position corresponding to the fine focusing image with the highest sharpness as the best focusing position.
• For example, the projection device can drive the lens to the position 700 steps from the near end through the drive motor 320, acquire a fine focusing image from the camera 700 every 10 steps, and calculate the sharpness of each fine focusing image, so as to determine the position in the fine focusing interval corresponding to the image with the highest sharpness; for example, the position 550 steps from the near end is the best focusing position.
• The fine focusing interval needs special handling at the end points of the focusing stroke; that is, a confirmed best sharp point at an end point must be processed separately. If the best clear point is at the end of the interval, only the end point is searched further, going back over an interval of 100 steps; if the best clear position is at the beginning of the interval, the drive motor 320 drives the optical assembly to the position 100 steps before the starting point, and the search continues within that interval.
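One possible reading of the end-point rule above is sketched below: if the best point lands on an interval boundary, a further 100-step interval beyond that boundary is searched; an interior best point needs no extra search. The function name and the exact extension direction are assumptions drawn from this interpretation.

```python
def endpoint_interval(best_pos, start, end, width=100):
    """Extra search interval when the best sharp point falls on an end of
    the fine focusing interval (interpretation assumed)."""
    if best_pos == end:
        return (end, end + width)      # continue past the far end
    if best_pos == start:
        return (start - width, start)  # continue before the starting point
    return None  # interior best point: no separate end-point handling needed
```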
• In some embodiments, a position compensation value can be calculated from the pace of the drive motor 320 and the photographing frequency of the camera 700; the lens is moved back and forth by a specific number of steps around the optimal position according to the compensation value, and the final relative clearest position is then obtained by comparing sharpness.
• That is, the projection device can first obtain the moving speed of the drive motor and the shooting frequency of the camera 700 when searching for the best focusing position according to the sharpness of each projected content image, and then calculate the position compensation value from the moving speed and shooting frequency. The target focusing position is obtained by extracting the focusing position corresponding to the projected content image with the highest definition, and the target focusing position is corrected with the position compensation value to obtain the optimal focusing position.
• For example, if the best position obtained is 560 steps from the near end, and the average position compensation value calculated from the pace of the drive motor 320 and the shooting frequency of the camera 700 is 15 steps, then the projected content images corresponding to the three positions 545, 560 and 575 steps from the near end are obtained, and the final clearest point is determined by comparing their sharpness.
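The compensation step above can be sketched as follows. The halving in `compensation_steps` (average lag of half a frame interval between a motor position and the frame that scored it) is an assumed model, not a formula given by the source.

```python
def compensation_steps(motor_steps_per_s, camera_fps):
    """Average lag (in motor steps) between a reported position and the frame
    that was actually scored at it (half a frame interval; model assumed)."""
    return round(motor_steps_per_s / camera_fps / 2)

def candidate_positions(best_pos, comp):
    """Re-check the best position and one compensation step on either side,
    then pick the sharpest of the three."""
    return [best_pos - comp, best_pos, best_pos + comp]
```

With a motor pace of 300 steps/s and a camera at 10 frames/s, the compensation value is 15 steps, reproducing the 545/560/575 example above.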
• Fig. 17 is a timing flowchart of a focusing method based on position memory in some embodiments. As shown in Figure 17, the method includes:
• Step 1: obtain the focusing instruction input by the user;
• Step 2: in response to the focusing instruction, extract the position memory information, which includes the current position of the optical assembly and the reliability of the current position;
• Step 3: in response to the reliability being a first value, calculate a first focusing amount based on the current position;
• Step 4: send a first focusing instruction to the drive motor, the first focusing instruction being used to control the drive motor to move the optical assembly according to the first focusing amount;
• Step 5: in response to the reliability being a second value, calculate a second focusing amount based on the starting point of the focusing interval; Step 6: send a second focusing instruction to the drive motor, the second value being less than the first value; the second focusing instruction is used to control the drive motor to move the optical assembly according to the second focusing amount.
• That is, after receiving the focusing instruction, the projection device may extract the position memory information from memory and judge the reliability of the current position in it. If the reliability is the first value, the first focusing amount is calculated based on the current position; if the reliability is the second value, the second focusing amount is calculated based on the start position of the focusing interval. A focusing instruction is then sent to the drive motor according to the first or second focusing amount, so that the projection device can adjust the position of the optical assembly based on the recorded current position, improving the definition of the projected picture and shortening the time spent on the focusing process.
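Steps 1-6 above reduce to choosing the baseline for the focusing amount from the reliability flag. The sketch below assumes the first/second reliability values are 1 and 2 and that the focusing amount is a signed step difference; both are illustrative choices, not values given by the source.

```python
RELIABLE, UNRELIABLE = 1, 2  # assumed first/second reliability values

def focus_adjustment(memory, target_pos, interval_start):
    """Pick the baseline from the memorised position when it is trusted,
    otherwise fall back to the start of the focusing interval."""
    current, reliability = memory
    if reliability == RELIABLE:
        return target_pos - current        # first focusing amount
    return target_pos - interval_start     # second focusing amount
```

A trusted memory of 450 steps with a 600-step target needs only a 150-step move, while an untrusted memory forces the full move from the interval start.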
  • the projection device provided by the present application can realize the anti-eye function.
• When the anti-eye function is triggered, the controller can control the user interface to display corresponding prompt information to remind the user to leave the current area, and the controller can also control the user interface to reduce the display brightness, so as to prevent the laser from damaging the user's eyesight.
• In some embodiments, when the projection device is configured in a children's viewing mode, the controller automatically turns on the anti-eye switch.
  • the controller controls the projection device to turn on the anti-eye switch.
• When the data collected by the time-of-flight (TOF) sensor, camera and other devices triggers any preset threshold condition, the controller controls the user interface to reduce the display brightness, display prompt information, and reduce the optical engine's transmission power, brightness and intensity, in order to protect the user's eyesight.
• In some embodiments, the projection device's controller can control the correction service to send signaling to the time-of-flight sensor to query the current status of the projection device, after which the controller receives data fed back by the time-of-flight sensor.
• The correction service can send signaling to the process communication framework (HSP Core) notifying the algorithm service to start the anti-eye process;
• The process communication framework (HSP Core) then calls the corresponding algorithm services from the algorithm library, which may include, for example, a photographing detection algorithm, a screenshot algorithm and a foreign object detection algorithm;
• The process communication framework returns the foreign object detection result to the correction service based on the above algorithm services. If the returned result reaches a preset threshold condition, the controller controls the user interface to display prompt information and reduce the display brightness.
• The signaling sequence is shown in Figure 18.
• In some embodiments, when the anti-eye switch of the projection device is turned on and the user enters a predetermined area, the projection device automatically reduces the intensity of the laser emitted by the optical engine, reduces the display brightness of the user interface, and displays safety prompt information.
• The projection device's control of the above-mentioned anti-eye function can be realized in the following ways:
• Based on the projection image acquired by the camera, the controller uses an edge detection algorithm to identify the projection area of the projection device. When the projection area is displayed as a rectangle or trapezoid, the controller obtains the coordinate values of the four vertices of the projection area through a preset algorithm;
• The perspective transformation method can then be used to correct the projection area into a rectangle, and the difference between that rectangle and the projection screenshot can be calculated to determine whether there are foreign objects in the display area. If the result is that foreign objects exist, the projection device automatically triggers the anti-eye function.
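The difference check above can be sketched as a mean absolute per-pixel difference between the projection screenshot and the rectified camera view. Frames are represented as plain lists of grayscale rows, and the threshold value is an illustrative assumption.

```python
def frame_difference(reference, frame):
    """Mean absolute per-pixel difference between two grayscale frames
    (nested lists of rows), as in the screenshot-vs-rectified-view check."""
    total = count = 0
    for ref_row, cur_row in zip(reference, frame):
        for a, b in zip(ref_row, cur_row):
            total += abs(a - b)
            count += 1
    return total / count

def foreign_object_present(reference, frame, threshold=25):
    """A large mean difference suggests a foreign object in the display area
    (threshold value assumed for illustration)."""
    return frame_difference(reference, frame) > threshold
```

The same comparison applied to consecutive camera frames gives the frame-to-frame variant described next.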
• In some embodiments, the difference between the camera content of the current frame and that of the previous frame can be used to determine whether foreign objects have entered the area outside the projection area; if so, the projection device automatically triggers the anti-eye function.
• In some embodiments, the projection device can also use a time-of-flight (ToF) camera or sensor to detect real-time depth changes in a predetermined area; if the depth value changes beyond a preset threshold, the projection device automatically triggers the anti-eye function.
• In some embodiments, the projection device judges whether to enable the anti-eye function based on analysis of the collected time-of-flight data, screenshot data and camera data.
• Based on the collected time-of-flight data, the controller performs a depth difference analysis. If the depth difference is greater than a preset threshold X (for example, when X is implemented as 0), it can be determined that there is a foreign object in the predetermined area of the projection device. If it is detected that a specific object in the predetermined area is at risk of having its vision damaged by the laser, the projection device automatically activates the anti-eye function to reduce the intensity of the laser emitted by the optical engine, reduce the display brightness of the user interface, and display safety prompt information.
• The projection device performs a color addition mode (RGB) difference analysis based on the captured screenshot data. If the RGB difference is greater than a preset threshold Y, it can be determined that there is a foreign object in a predetermined area of the projection device. If a specific object in the predetermined area, such as a user, is at risk of having its vision damaged by the laser, the projection device automatically activates the anti-eye function, reduces the intensity of the emitted laser, reduces the display brightness of the user interface, and displays the corresponding safety prompt information.
• The projection device obtains the projection coordinates from the collected camera data, determines the projection area of the projection device from those coordinates, and further analyzes the RGB difference within the projection area. If the RGB difference is greater than the preset threshold Y, it can be determined that there is a foreign object in the predetermined area of the projection device. If a specific object in the predetermined area, such as a user, is at risk of having its vision damaged by the laser, the projection device automatically activates the anti-eye function to reduce the emitted laser intensity, reduce the display brightness of the user interface, and display the corresponding safety prompt information.
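The decision logic above combines the two threshold checks. The sketch below assumes the anti-eye function triggers when either the ToF depth change exceeds threshold X or the RGB difference exceeds threshold Y; the default values are illustrative placeholders.

```python
def should_enable_eye_protection(depth_diff, rgb_diff,
                                 depth_threshold=0, rgb_threshold=30):
    """Trigger the anti-eye function when either the ToF depth difference
    exceeds threshold X or the RGB difference exceeds threshold Y
    (either-condition rule and default values assumed)."""
    return depth_diff > depth_threshold or rgb_diff > rgb_threshold
```

When triggered, the device would then reduce the laser intensity, dim the user interface, and display the safety prompt as described above.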
• In some embodiments, the controller can still perform the RGB difference analysis in an extended area; if the RGB difference is greater than the preset threshold Y, it can be determined that there is a foreign object near the projection device. If a user is in the predetermined area, their eyesight is at risk of being damaged by the laser emitted by the projection device.
• The projection device will automatically activate the anti-eye function, reduce the intensity of the emitted laser, reduce the display brightness of the user interface, and display the corresponding safety prompt information, as shown in Figure 23.
  • FIG. 19 is a schematic diagram of a signaling interaction sequence of a projection device implementing a display image correction function according to another embodiment of the present application.
  • the projection device can monitor the movement of the device through a gyroscope or a gyroscope sensor.
• The correction service sends signaling to the gyroscope to query the device status, and receives signaling from the gyroscope to determine whether the device is moving.
• In some embodiments, the display correction strategy of the projection device can be configured as follows: when the gyroscope and the time-of-flight sensor change simultaneously, the projection device triggers keystone correction first; after the gyroscope data has been stable for a preset length of time, the controller starts keystone correction; the controller can also configure the projection device not to respond to commands sent by remote control buttons while keystone correction is in progress; and, to support keystone correction, the projection device displays a pure white image card.
• The keystone correction algorithm can construct, based on the binocular camera, the transformation matrix between the projection surface in the world coordinate system and the optical engine coordinate system; it then combines the optical engine's intrinsic parameters to calculate the homography between the projection image and the played card, and uses that homography to realize arbitrary shape conversion between the projected image and the played card.
• The correction service sends signaling to the process communication framework (HSP CORE) informing the algorithm service to start the keystone correction process, and the process communication framework further sends a service capability call signaling to the algorithm service to obtain the corresponding algorithm capability;
• The algorithm service obtains and executes the photographing and picture algorithm processing service and the obstacle avoidance algorithm service, sending them to the process communication framework in the form of signaling. In some embodiments, the process communication framework executes the above algorithms and feeds the execution results back to the correction service; the results may include successful photographing and successful obstacle avoidance.
• When execution fails, the user interface will be controlled to display an error return prompt, and then to display the keystone correction and autofocus charts again.
• In some embodiments, the projection device can identify the screen, and use projective transformation to correct the projection image so that it is displayed inside the screen, achieving the effect of aligning with the screen edges.
• the projection device can use the time-of-flight (ToF) sensor to obtain the distance between the optical engine and the projection surface, look up the best image distance in a preset mapping table based on that distance, and use an image algorithm to evaluate the clarity of the projection image, on the basis of which the image distance can be fine-tuned.
• In some embodiments, the automatic keystone correction signaling sent by the correction service to the process communication framework may carry other function configuration instructions, for example control instructions such as whether to implement synchronous obstacle avoidance and whether to enter the screen.
• The process communication framework sends the service capability call signaling to the algorithm service, so that the algorithm service acquires and executes the autofocus algorithm to adjust the viewing distance between the device and the screen. In some embodiments, after the autofocus algorithm realizes the corresponding function, the algorithm service may also obtain and execute an automatic screen-entry algorithm, which may include a keystone correction algorithm.
• The projection device automatically enters the screen, and the algorithm service can set the 8-position coordinates between the projection device and the screen; then, through the autofocus algorithm again, the viewing distance between the projection device and the screen is adjusted; finally, the correction result is fed back to the correction service, and the user interface is controlled to display the correction result, as shown in Figure 19.
• In some embodiments, the projection device's autofocus algorithm uses its configured laser ranging to obtain the current object distance and calculate the initial focal length and search range; the projection device then drives the camera to take pictures and uses the corresponding algorithm to evaluate sharpness.
• The projection device searches for the best possible focal length based on the search algorithm, repeats the above photographing and sharpness evaluation steps, and finally finds the optimal focal length through sharpness comparison to complete autofocus.
• In step 2001, the projection device is started; in step 2002, the user moves the device, and the projection device automatically completes calibration and refocuses; in step 2003, the controller detects whether the autofocus function is enabled; when it is not enabled, the controller ends the autofocus service; in step 2004, when the autofocus function is turned on, the projection device obtains the detection distance of the time-of-flight (TOF) sensor through the middleware for calculation;
• In step 2005, the controller queries the preset mapping table according to the obtained distance to get the approximate focal length of the projection device; in step 2006, the middleware sets the obtained focal length into the optical engine of the projection device;
  • the projection device provided by the present application can implement a display correction function through a keystone correction algorithm.
• In some embodiments, two sets of extrinsic parameters, between the two cameras and between the camera and the optical engine, can be obtained, that is, the rotation and translation matrices. A specific checkerboard chart is then played through the optical engine of the projection device, and the depth values of the projected checkerboard corner points are calculated; for example, the xyz coordinate values are solved through the translation relationship between the binocular cameras and the principle of similar triangles. The projection surface is then fitted based on the xyz coordinates, and its rotation and translation relationships with the camera coordinate system are obtained, which may specifically include the pitch relationship (Pitch) and the yaw relationship (Yaw).
• The Roll parameter value can be obtained through the gyroscope configured in the projection device to assemble the complete rotation matrix, and finally the extrinsic parameters from the projection plane in the world coordinate system to the optical engine coordinate system are calculated.
• Step 2101: the projection device controller obtains the depth values of the points corresponding to the pixels of the photo, or the coordinates of the projection points in the camera coordinate system;
• Step 2102: through the depth values, the middleware obtains the relationship between the optical engine coordinate system and the camera coordinate system;
• Step 2103: the controller calculates the coordinate values of the projection points in the optical engine coordinate system;
• Step 2104: a plane is fitted from the coordinate values to obtain the angle between the projection surface and the optical engine;
• Step 2105: the corresponding coordinates of the projection points in the world coordinate system of the projection surface are obtained according to the angle relationship;
• Step 2106: a homography matrix can be calculated from the coordinates of the chart in the optical engine coordinate system and the coordinates of the corresponding points on the projection surface;
• Step 2107: the controller determines whether an obstacle exists based on the data acquired above;
• Step 2108: when an obstacle exists, rectangular coordinates are randomly selected on the projection surface in the world coordinate system, and the area to be projected by the optical engine is calculated according to the homography relationship;
• Step 2109: when no obstacle exists, the controller can obtain, for example, the feature points of the two-dimensional code;
• Step 2110: the coordinates of the two-dimensional code on the prefabricated chart are obtained;
• Step 2111: the homography relationship between the camera photo and the chart is obtained;
• Step 2112: the obtained obstacle coordinates are converted into the chart, thereby obtaining the coordinates of the chart area blocked by obstacles;
• Step 2113: according to the coordinates of the obstacle occlusion area of the chart in the optical engine coordinate system, the coordinates of the occlusion area on the projection surface are obtained through homography matrix transformation;
• Step 2114: rectangular coordinates are randomly selected on the projection surface in the world coordinate system while avoiding obstacles, and the area to be projected by the optical engine is calculated according to the homography relationship.
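Several of the steps above map points through a homography. The standard planar homography mapping can be sketched in a few lines; the 3x3 matrix itself would come from the solving steps (e.g. steps 2106 and 2111), which this sketch does not reproduce.

```python
def apply_homography(h, point):
    """Map a 2D point through a 3x3 homography matrix h (row-major nested
    lists): homogeneous multiply, then divide by the third component."""
    x, y = point
    denom = h[2][0] * x + h[2][1] * y + h[2][2]
    return ((h[0][0] * x + h[0][1] * y + h[0][2]) / denom,
            (h[1][0] * x + h[1][1] * y + h[1][2]) / denom)
```

For example, converting chart coordinates to projection-surface coordinates (step 2113) or screen vertices to optical engine pixel coordinates amounts to applying the relevant homography to each corner point.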
• In some embodiments, the obstacle avoidance algorithm uses the OpenCV library to complete contour extraction of foreign objects during the rectangle selection step of the keystone correction algorithm, and avoids the obstacle when selecting the rectangle to realize the projection obstacle avoidance function.
• Step 2201: the middleware obtains the two-dimensional code chart captured by the camera;
• Step 2202: the feature points of the two-dimensional code are identified, and their coordinates in the camera coordinate system are obtained;
• Step 2203: the controller further acquires the coordinates of the preset chart in the optical engine coordinate system;
• Step 2204: the homography relationship between the camera plane and the optical engine plane is solved;
• Step 2205: the controller identifies the coordinates of the four vertices of the curtain captured by the camera based on the above homography;
• Step 2206: according to the homography matrix, the range of the chart to be projected by the optical engine into the screen is obtained.
• In some embodiments, the screen-entry algorithm is based on the OpenCV algorithm library, which can identify and extract the largest black closed rectangular outline and judge whether it has a 16:9 size; a specific chart is projected and photographed with the camera, and multiple corner points are extracted from the photo. The corner points are used to calculate the homography between the projection surface (curtain) and the optical engine chart; the four vertices of the screen are converted into the optical engine pixel coordinate system through the homography, and the optical engine chart is thereby converted to within the four vertices of the screen.
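The 16:9 size judgment mentioned above can be sketched as a simple aspect-ratio check on the extracted rectangular outline; the tolerance value is an illustrative assumption.

```python
def is_16_9(width, height, tolerance=0.02):
    """Judge whether the extracted screen outline matches a 16:9 aspect
    ratio within a small tolerance (tolerance value assumed)."""
    return abs(width / height - 16 / 9) <= tolerance
```

A 1920x1080 outline passes the check, while a 1280x1024 (5:4) outline does not, so the screen-entry flow would only proceed for the former.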
• The telephoto micro-projection equipment has the characteristic of flexible movement, and the projection image may be distorted after each displacement.
• For the above problems, the projection device provided by this application and its display control method based on geometric correction can complete correction automatically, including the realization of functions such as automatic keystone correction, automatic screen entry, automatic obstacle avoidance, automatic focusing and anti-eye.
• By configuring the automatic focusing function in the projection device, fast focusing is realized through two-stage focusing: the first focusing position is determined through the first-stage coarse focusing, and in the second-stage fine focusing process the second focusing position is calculated according to image definition to complete automatic focusing, avoiding the problem of unclear focusing caused by partial focusing without increasing the time consumed by focusing.
•   the controller compares the distance between the projection surface and the optical machine detected by the distance sensor against the preset focusing-distance comparison table, queries the target focusing position in the table, determines the difference between the target focusing position and the current position as the first focusing amount, and controls the drive motor to move the optical assembly according to the first focusing amount to the first focusing position, completing the first stage of the coarse focusing process.
  • the projection screen is not clear.
  • Some embodiments of the present application provide a method for selecting an ROI feature region.
•   the ROI feature region selection method can be applied to a projection device; to support this method, the projection device can include an optical machine 200, a lens 300, a controller 500, a distance sensor 600, and a camera 700.
  • the controller 500 can be used to implement the ROI feature area selection method, including the following steps:
•   after the controller 500 obtains the focusing instruction, it automatically turns on the automatic focusing function, which includes two-stage focusing. In some embodiments, the first focusing position is determined by the first stage of coarse focusing: according to the first movement instruction sent by the controller 500, the drive motor 320 drives the optical assembly 310 to the first focusing position; the controller 500 then sends a second movement instruction to control the drive motor 320 to drive the optical assembly 310 to the position with the highest definition, completing the second stage of fine focusing.
  • the first movement instruction is that the controller 500 controls the driving motor 320 to drive the optical assembly 310 to move to the first focusing position
  • the second movement instruction is that the controller 500 controls the driving motor 320 to drive the optical assembly 310 to the position with the highest definition.
•   the projection device may acquire the focusing instruction through any input method in the above embodiments; in combination with the above embodiments, the focusing instruction includes the first focusing instruction or the second focusing instruction.
  • the present application does not limit the manner or approach for the controller 500 to obtain the focusing instruction.
  • the separation distance is acquired according to the manner of the above-mentioned embodiment.
•   the controller 500 determines the position the lens 300 needs to reach, that is, the first focusing position, by combining the separation distance with the preset focusing curve, for example the preset focusing-distance comparison table in the above embodiment, which yields the target focusing position; by comparing the first focusing position with the current position, the difference distance between the two is obtained, and the controller 500 queries a preset mapping table with this difference distance to obtain the number of rotation steps of the drive motor 320 corresponding to the difference distance.
•   the preset focusing curve may be established before the projection device leaves the factory: a coordinate system is established with the zoom parameter as the abscissa and the focus parameter as the ordinate; with the object distance fixed while the lens captures the image to be processed, a monotone hill-climbing algorithm is used to determine the preset zoom-parameter key points in the coordinate system, each of which corresponds to a focus-parameter point satisfying the preset clarity condition; a focusing curve is generated from the preset zoom-parameter key points and their corresponding focus-parameter points, and this focusing curve is the preset focusing curve.
•   the preset mapping table is produced by analyzing the focusing curve and the projection distance of the projected picture; the mapping table includes the projected picture, the projection distance, the movement amount of the focus group, and the number of rotation steps of the focus group.
  • the preset focusing curve and the preset mapping table are stored in the controller 500 .
  • the preset mapping table in some embodiments of the present application is shown in Table 1.
•   querying the preset mapping table yields the required movement amount and the number of rotation steps, thereby determining the first focusing position for fine focusing.
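Since Table 1 itself is not reproduced above, the coarse-focus table query can only be sketched with placeholder values; the distances and step counts below are illustrative assumptions, not the device's real calibration:

```python
# A sketch of the coarse-focus lookup: pick the table row nearest the measured
# projection distance, then derive the signed step count for the drive motor.
FOCUS_TABLE = [  # (projection distance in mm, target motor position in steps)
    (1000, 120), (1500, 260), (2000, 380), (3000, 520),
]

def coarse_focus_steps(distance_mm: float, current_steps: int) -> int:
    """Return the signed number of steps the motor must rotate from its
    current position to reach the table's target for this distance."""
    _, target = min(FOCUS_TABLE, key=lambda row: abs(row[0] - distance_mm))
    return target - current_steps

print(coarse_focus_steps(1900, 300))  # 380 - 300 = 80
```

A real device would interpolate between rows rather than snap to the nearest one; nearest-row lookup keeps the sketch minimal.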
•   the projection device also collects the time-of-flight distance multiple times and calculates the average value, then substitutes the average into the preset linear regression model for time-of-flight ranging configured by the system to calculate the theoretical distance between the projection device and the projection surface; the more acquisitions, the closer the average is to the true value, which reduces the impact of device-to-device variation in the time-of-flight ranging process.
•   the controller 500 acquires multiple flight data at preset times and calculates multiple separation distances from them; it averages these separation distances to obtain an average distance, inputs the average distance into the preset linear regression model for time-of-flight ranging to obtain the theoretical distance value between the projection device and the projection surface, and combines this theoretical distance value with the preset focusing curve to determine the position the lens 300 needs to reach, that is, the first focusing position.
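A minimal sketch of the averaging and regression step; the linear-model coefficients A and B are assumed placeholders standing in for the factory-calibrated model:

```python
import statistics

# Hypothetical calibration of the time-of-flight linear-regression model:
# theoretical_distance = A * mean_measurement + B (A, B fitted at the factory).
A, B = 1.02, -3.5  # assumed coefficients, for illustration only

def theoretical_distance(tof_samples_mm):
    """Average several TOF readings to suppress sensor noise, then map the
    mean through the linear model to the theoretical projection distance."""
    mean_mm = statistics.fmean(tof_samples_mm)
    return A * mean_mm + B

print(theoretical_distance([1985, 2003, 1992, 2010, 1998]))
```

Averaging before the regression matters: the model corrects systematic bias, while the mean suppresses per-shot noise.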
•   the controller 500 sends a first movement instruction to the drive motor 320, that is, it controls the drive motor 320 to move the optical assembly 310 to the first focusing position according to the number of rotation steps.
•   to meet the needs of calculating the focusing amount, the projection device can also be configured with multiple functional units, such as a strategy selection unit, a motor control unit, an image acquisition unit (camera), and a sharpness evaluation unit.
  • Each functional unit can work independently of each other, or they can work together to complete predetermined functions. These units may be configured integrally with other components of the projection device.
•   after the strategy selection unit determines the first focusing position calculated from the separation distance, it notifies the motor control unit to control the drive motor 320 to drive the optical assembly 310 to the first focusing position in a single movement, without stopping to wait for photographing and definition calculation.
  • the controller 500 can send the position information to the image acquisition unit and write the position information to the image clarity evaluation unit when the drive motor 320 rotates to a specific state, so as to realize the synchronization of the three.
  • the sharpness evaluation unit is configured with multiple sharpness evaluation functions to perform sharpness evaluation.
•   the sharpness evaluation function can be Brenner, Tenengrad, Laplacian, SMD, Variance, Energy, and so on.
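As one concrete instance of the listed evaluation functions, the Laplacian score is commonly computed as the variance of the discrete Laplacian response; the device's exact implementation is not published, so this is a generic sketch:

```python
import numpy as np

def laplacian_variance(gray: np.ndarray) -> float:
    """Variance-of-Laplacian sharpness score: higher means sharper."""
    g = gray.astype(np.float64)
    # 4-neighbour discrete Laplacian applied to interior pixels
    lap = (-4 * g[1:-1, 1:-1]
           + g[:-2, 1:-1] + g[2:, 1:-1]
           + g[1:-1, :-2] + g[1:-1, 2:])
    return float(lap.var())

flat = np.full((32, 32), 128, dtype=np.uint8)               # defocused-looking flat patch
edges = np.zeros((32, 32), dtype=np.uint8); edges[:, 16:] = 255  # strong edge
print(laplacian_variance(flat) < laplacian_variance(edges))  # True
```

A defocused image blurs edges, flattening the Laplacian response; comparing this score across motor positions is what the sharpness evaluation unit does.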
•   the camera 700 takes pictures at a specific frequency, obtains the projection content images captured during the movement of the optical assembly 310 toward the first focusing position, and stores these images in the focusing-position storage container.
•   after the drive motor 320 has moved the optical assembly 310 to the first focusing position, the controller 500 reads the projection content images in the focusing-position storage container and obtains the current projection content image at the first focusing position.
•   the sharpness evaluation unit polls whether the focusing-position storage container is empty; if it is not empty, the unit reads the position information and the corresponding photo, calculates the picture sharpness, and stores the result in the sharpness storage container.
  • the sharpness evaluation can be realized in multiple ways such as frequency domain function, gray scale function, and information entropy preset in the sharpness evaluation unit.
•   the ROI feature region selection function is automatically enabled, and the projection region is identified through it.
•   the algorithms configured in the ROI feature region selection function include but are not limited to: an adaptive binarization threshold algorithm, a dilation and erosion algorithm, a contour detection algorithm, a feature point matching algorithm, and a scale-invariant feature transform (SIFT) image processing algorithm.
•   the controller 500 may calculate the ROI feature region in the projection content image according to a preset contour detection algorithm; the ROI feature region is the contour region with the largest area or the smallest perimeter ratio in the projection content image, where a contour region is an area delineated in the projection content image according to grayscale values.
  • the controller 500 converts the image into a grayscale image after acquiring the projected content image.
  • S8142 Identify at least one contour region in the grayscale image based on the grayscale value, and calculate an area of the contour region.
  • the controller 500 calculates the ROI feature area in the grayscale image according to the preset adaptive binarization threshold algorithm, dilation and erosion algorithm, and contour detection algorithm.
•   the controller 500 calculates the average value M of the overall gray values of the grayscale image and, from the average M, obtains the gray-value threshold interval [M − s, M + s] for the pixels in the grayscale image, where s is an empirical value (in some embodiments, s is 50); the controller 500 takes each point in the gray-value threshold interval as a candidate segmentation pixel value and calculates the gray-level variance for each candidate in the interval.
•   the controller 500 applies the maximum between-class variance algorithm: it inputs the gray-level variances of the candidate pixel values, obtains the maximum between-class variance, and takes it as the reference for the optimal threshold for binary segmentation of the grayscale image.
•   this threshold T_total serves as the first segmentation pixel value.
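The mean-centred interval search combined with maximum between-class variance is an Otsu-style thresholding restricted to [M − s, M + s]; a sketch under that reading (the production code is not published):

```python
import numpy as np

def otsu_in_interval(gray: np.ndarray, s: int = 50) -> int:
    """Find the threshold maximising between-class variance, but search only
    the interval [M - s, M + s] around the image mean M, as in the text."""
    m = int(gray.mean())
    lo, hi = max(0, m - s), min(255, m + s)
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    total = hist.sum()
    best_t, best_var = lo, -1.0
    for t in range(lo, hi + 1):
        w0 = hist[:t + 1].sum() / total          # background class weight
        w1 = 1.0 - w0                            # foreground class weight
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (hist[:t + 1] * np.arange(t + 1)).sum() / (w0 * total)
        mu1 = (hist[t + 1:] * np.arange(t + 1, 256)).sum() / (w1 * total)
        var = w0 * w1 * (mu0 - mu1) ** 2         # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t

img = np.array([[20] * 8 + [220] * 8] * 4, dtype=np.uint8)  # bimodal test patch
t = otsu_in_interval(img)
print(20 <= t < 220)  # True: the threshold separates the two populations
```

Restricting the search to the interval around M prunes implausible thresholds and keeps the per-frame cost low.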
  • the controller 500 divides the grayscale image into multiple image blocks according to a preset segmentation value, wherein the size of the divided image blocks is r*r.
•   the controller 500 takes the current pixel point (x, y) of the image block as the center of a neighborhood and calculates the mean value m(x, y) and standard deviation σ(x, y) of the pixel set, as shown in Formula 1 and Formula 2:
•   m(x, y) = (1/r²) · Σ I(i, j),  σ(x, y) = √( (1/r²) · Σ (I(i, j) − m(x, y))² ), with the sums taken over the r×r neighborhood of (x, y).
•   the controller 500 uses m(x, y) and σ(x, y) as input parameters to calculate the individual threshold f(x, y)_T of the current pixel point (x, y) as follows: f(x, y)_T = m(x, y) · [1 + k · (σ(x, y)/R − 1)], where k represents a correction parameter with value range (0, 1), and R represents the dynamic-range parameter of the variance.
•   the controller 500 obtains the individual threshold f(x, y)_T of each image block.
•   the controller 500 uses the obtained individual threshold f(x, y)_T as the second segmentation pixel value; the controller 500 then segments the pixel points of the image block according to the first segmentation pixel value and the second segmentation pixel value respectively, obtaining a segmented first image block and second image block.
•   the controller 500 calculates the variance of the pixel gray values in the first image block and the second image block, obtaining a first variance value and a second variance value, and also calculates the average pixel gray value of each block; combining the variance values and average values with a difference algorithm yields a first threshold and a second threshold, and the gray value corresponding to the larger of the two is taken as the optimal threshold, that is, the optimal threshold f(x, y)_TT of the current image block.
•   the controller 500 distinguishes the target part and the background part in the current image block according to the optimal threshold f(x, y)_TT and the color values of the current image block: if the color value of a pixel in the image block is greater than the optimal threshold, the pixel is assigned to the target part; if it is less than or equal to the optimal threshold, the pixel is assigned to the background part. In some embodiments, a pixel whose color value exceeds the optimal threshold is set to 1, and otherwise to 0.
•   in this way, each image block segmented from the grayscale image is thresholded adaptively, that is, each segmented image block is binarized according to the grayscale distribution of its own pixels.
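The per-block threshold described above has the shape of the Sauvola local threshold, T = m · (1 + k · (σ/R − 1)); the sketch below assumes that formula, with illustrative k and R values, since the source's formula is not reproduced here:

```python
import numpy as np

def sauvola_threshold(block: np.ndarray, k: float = 0.5, R: float = 128.0) -> float:
    """Per-block threshold T = m * (1 + k * (sigma / R - 1)).

    k in (0, 1) is the correction parameter and R the dynamic-range parameter
    of the deviation, matching the roles described in the text; the exact
    values used by the device are not published, so these are assumptions."""
    m = block.mean()
    sigma = block.std()
    return float(m * (1.0 + k * (sigma / R - 1.0)))

def binarize_block(block: np.ndarray, k: float = 0.5, R: float = 128.0) -> np.ndarray:
    """1 = target part, 0 = background part, per the segmentation rule above."""
    t = sauvola_threshold(block, k, R)
    return (block > t).astype(np.uint8)

block = np.array([[10, 10, 200, 200]] * 4, dtype=np.float64)
print(binarize_block(block))
```

Because T is scaled by the local standard deviation, low-contrast blocks get a threshold below their mean, which suppresses speckle in flat regions.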
•   the controller 500 uses the dilation and erosion algorithms to denoise the grayscale image in which the target part and the background part have been distinguished.
•   the controller 500 first applies the dilation algorithm to the grayscale image: it reads the pixel points (x, y) in sequence and performs a convolution calculation with a 3×3 structuring element (convolution kernel); when the result exceeds the threshold count, the pixel is set to 1, otherwise to 0.
•   the structuring element can have different sizes and shapes, such as 3×3 or 5×5; in some embodiments a 3×3 structuring element is used, with values of 0 or 1, specifically {[0, 1, 0], [1, 1, 1], [0, 1, 0]}, that is, a 4-connected cross.
•   the controller 500 uses the above convolution kernel to traverse the pixels in the image in sequence; if any value covered by the kernel is 1, the pixel at the kernel's origin in the image is set to 1, otherwise it is set to 0.
  • the controller 500 uses an erosion algorithm to denoise the dilated grayscale image.
•   the controller 500 sequentially reads the pixel points (x, y) in the grayscale image and performs a convolution calculation with the 3×3 structuring element; only when all pixel points covered by the kernel are 1 is the pixel set to 1, otherwise the pixel is set to 0.
•   the controller 500 thereby removes noise stains in the image, setting those pixels as background pixels, and obtains a denoised grayscale image.
•   the thin edge parts of the image can be effectively closed by the dilation algorithm, finally yielding the dilated grayscale image.
•   after the grayscale image is processed by the dilation and erosion algorithms, small objects and noise in the image are removed, objects can be separated at thin connection points, and the boundaries of larger objects are smoothed without significantly changing their area.
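The dilate-then-erode pass with the 4-connected cross kernel can be sketched in plain numpy (in practice cv2.dilate/cv2.erode would be used; this re-implementation only makes the any/all semantics of the two operations explicit):

```python
import numpy as np

CROSS = np.array([[0, 1, 0],
                  [1, 1, 1],
                  [0, 1, 0]], dtype=bool)  # the 4-connected kernel from the text

def _neighborhoods(img: np.ndarray) -> np.ndarray:
    """Return a (H, W, 3, 3) array holding each pixel's 3x3 neighbourhood."""
    p = np.pad(img, 1)
    return np.stack([p[i:i + img.shape[0], j:j + img.shape[1]]
                     for i in range(3) for j in range(3)],
                    axis=-1).reshape(*img.shape, 3, 3)

def dilate(img: np.ndarray) -> np.ndarray:
    """Pixel becomes 1 if ANY kernel position covers a 1 (closes thin gaps)."""
    return _neighborhoods(img)[..., CROSS].any(axis=-1).astype(np.uint8)

def erode(img: np.ndarray) -> np.ndarray:
    """Pixel stays 1 only if ALL kernel positions cover 1 (removes specks)."""
    return _neighborhoods(img)[..., CROSS].all(axis=-1).astype(np.uint8)

img = np.zeros((5, 5), dtype=np.uint8)
img[2, 2] = 1                       # a single isolated foreground pixel
print(erode(dilate(img)).sum())     # dilate-then-erode (closing) keeps it
```

Note the order matters: erosion alone would delete the isolated pixel, while the dilate-then-erode sequence described in the text preserves it and closes thin edges.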
•   the controller 500 applies a contour detection algorithm to the denoised grayscale image to obtain multiple closed contour regions within the projection region at the first focusing position.
•   the controller 500 adopts an eight-neighborhood tracking image processing algorithm: it traverses the grayscale image to find the first pixel point that changes from 0 to 1 and takes this pixel as the starting point or boundary point of an outer contour.
•   taking this pixel as the starting point, the contour's points are found one by one in the counterclockwise direction until the starting point or an isolated point is reached.
•   each time a contour point is obtained before the contour search is completed, the edge marker value is incremented by one.
•   the controller 500 processes each pixel in the grayscale image in turn, and finally calculates all contours and their corresponding hierarchical relationships.
•   the characterization data includes the contour information ndarray and the hierarchical relationship Hierarchy, in which the four parameters [Next, Previous, Child, Parent] represent the index numbers of the next contour, the previous contour, the first child contour, and the parent contour; if a corresponding item does not exist, the value is negative (in some embodiments, expressed as -1).
•   the levels of the contour regions at all layers are thus obtained.
•   the boundary points of an inner contour are found mainly by locating pixel points that change from 1 to 0.
•   the 8-neighborhood search method is likewise used.
•   each time a contour point is obtained, the edge marker value is incremented by one; when the starting point or an isolated point of the contour is found, the contour point search is counted as finished.
•   there are five contours in Figure 25, among which 1a and 1b form an outer-inner contour pair, that is, an outer contour and its inner contour.
•   contour0 and contour1a are outermost contours and therefore at the same level, namely level 0; contour1b is a sub-contour of contour1a and occupies its own level, namely level 1; contour2 and contour3 are sub-contours of contour1b and are at the same level, namely level 2; therefore, for contour0, its Hierarchy parameter information is represented as [1, -1, -1, -1].
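The [Next, Previous, Child, Parent] encoding can be illustrated with the Figure 25 example; the rows below are hand-built to match the relationships described in the text, following the layout OpenCV's cv2.findContours returns in RETR_TREE mode:

```python
# Hierarchy rows are [Next, Previous, First_Child, Parent]; -1 means "none".
# Index layout for the Figure 25 example:
#   0 = contour0, 1 = contour1a, 2 = contour1b, 3 = contour2, 4 = contour3
hierarchy = [
    [1, -1, -1, -1],   # contour0: next sibling contour1a, no child, no parent
    [-1, 0, 2, -1],    # contour1a: previous sibling contour0, child contour1b
    [-1, -1, 3, 1],    # contour1b: child contour2, parent contour1a
    [4, -1, -1, 2],    # contour2: next sibling contour3, parent contour1b
    [-1, 3, -1, 2],    # contour3: previous sibling contour2, parent contour1b
]

def level(idx: int) -> int:
    """Depth of a contour in the tree: outermost contours are level 0."""
    depth = 0
    while hierarchy[idx][3] != -1:  # walk up through the Parent links
        idx = hierarchy[idx][3]
        depth += 1
    return depth

print([level(i) for i in range(5)])  # contour0/1a at 0, 1b at 1, 2/3 at 2
```

Filtering for level-0 entries is exactly how the outermost contour list ContoursList in the next step is built.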
  • S8144 Define the ROI feature region according to the coordinate extremum in the boundary coordinates.
•   the controller 500 obtains the outermost contour list ContoursList with level 0 based on the obtained contour hierarchical relationships, and calculates the area of each contour from its multi-point coordinate values.
•   the controller 500 sorts the contours by area and filters out the largest area and the second largest area Narea, then compares both with a preset area threshold; the preset area threshold is not fixed, and in some embodiments it is set to 1/2 of the area of the grayscale image.
•   the controller 500 calculates the ratio of the largest area to the second largest area; if the ratio is within the preset interval, the controller 500 obtains the centroid coordinates of the largest and second largest areas, calculates the width and height (w, h) of each, and computes each area's w/h ratio; the area with the smaller ratio is selected as the optimal area, that is, the current projection area.
  • the controller 500 calculates the perimeter ratio of the largest area, and if the perimeter ratio is greater than the preset perimeter ratio threshold, the largest area is taken as the optimal area, ie, the current projection area.
•   the controller 500 outputs the acquired multi-point coordinates of the current projection area, namely four-point coordinates or eight-point coordinates (including the boundary midpoints), as the ROI feature region.
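A sketch of the selection step: contour area from the multi-point coordinates (shoelace formula) and the w/h ratio from the bounding box; the candidate contours are made-up examples:

```python
def shoelace_area(pts):
    """Polygon area from its multi-point contour coordinates (shoelace formula)."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2.0

def bbox_ratio(pts):
    """w/h ratio of the contour's axis-aligned bounding box."""
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    return (max(xs) - min(xs)) / (max(ys) - min(ys))

# two made-up candidate contours: a 16:9-shaped region and a thin strip
screen = [(0, 0), (160, 0), (160, 90), (0, 90)]
strip = [(0, 0), (300, 0), (300, 10), (0, 10)]

# sort by area, then pick the candidate with the smaller w/h ratio (per the text)
candidates = sorted([screen, strip], key=shoelace_area, reverse=True)[:2]
best = min(candidates, key=bbox_ratio)
print(shoelace_area(best), round(bbox_ratio(best), 2))  # 14400.0 1.78
```

The w/h tie-break favours compact, screen-shaped regions over elongated artefacts such as reflections or shadows.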
  • the controller 500 may calculate the ROI feature region in the projected content image according to a preset keystone correction algorithm.
•   the controller 500 first obtains two sets of external parameters, that is, the rotation and translation matrices, between the left and right cameras of a binocular camera, or between the camera 700 and the optical machine 200; it then projects a standard chart (a pre-designed chart projected and displayed by the optical machine 200) and calculates the depth values of the chart's corner points, for example through the translation relationship between the binocular cameras and the principle of similar triangles, obtaining their (x, y, z) coordinate values; it then fits the projection surface based on these (x, y, z) points and obtains its rotation and translation relationship with the camera 700 coordinate system, which may specifically include the pitch relationship (Pitch) and the yaw relationship (Yaw).
•   the controller 500 can obtain the Roll parameter value through the gyroscope configured in the projection device to assemble the complete rotation matrix, and finally calculates the external parameters from the projection plane in the global coordinate system to the optical machine 200 coordinate system.
  • a homography matrix from the points on the projection surface to the points on the standard chart can be formed.
•   the controller 500 calculates the coordinate values of the projection points in the optical machine 200 coordinate system and fits a plane to these coordinates to obtain the angle between the projection surface and the optical machine 200, then obtains the corresponding coordinates of the projection points in the world coordinate system according to this angle relationship; from the coordinates of the standard chart in the optical machine 200 coordinate system and the coordinates of the corresponding points on the projection surface, the homography matrix can be calculated.
•   the controller 500 selects a rectangle on the projection surface, uses the homography to inversely calculate the corresponding coordinates on the preset chart, that is, the correction coordinates, and sets them into the optical machine 200 to end the trapezoidal correction operation.
  • the controller 500 converts the projected coordinates after trapezoidal correction into camera coordinates, and uses the camera coordinates as the ROI feature region.
•   the controller 500 converts the projection coordinates in the optical machine 200 coordinate system into the camera coordinate system through the homography matrix, specifically the four corner-point coordinates, or the four corner points plus the four edge midpoints for eight-point coordinates, and returns these coordinates as the ROI feature region.
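Mapping the corrected corner points into camera coordinates is a single projective transform through the homography matrix; a sketch with a made-up optical-machine-to-camera homography H (cv2.perspectiveTransform performs the same computation):

```python
import numpy as np

def apply_homography(H: np.ndarray, pts: np.ndarray) -> np.ndarray:
    """Map (x, y) points through a 3x3 homography (divide by the w coordinate)."""
    homog = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous coords
    mapped = homog @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

# hypothetical optical-machine -> camera homography (pure scale + shift,
# for illustration only; the real H comes from the calibration above)
H = np.array([[0.5, 0.0, 100.0],
              [0.0, 0.5, 50.0],
              [0.0, 0.0, 1.0]])
corners = np.array([[0.0, 0.0], [1920.0, 0.0], [1920.0, 1080.0], [0.0, 1080.0]])
roi = apply_homography(H, corners)  # the four-point ROI in camera coordinates
print(roi.astype(int).tolist())     # [[100, 50], [1060, 50], [1060, 590], [100, 590]]
```

A real homography also has perspective terms in its last row, which is why the division by the w coordinate is required in general.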
•   the controller 500 of the projection device obtains the depth value of the point corresponding to each pixel in the projection content image, that is, the coordinates of the projection point in the camera coordinate system; through these depth values, the middleware obtains the relationship between the optical machine 200 coordinate system and the camera 700 coordinate system.
•   the middleware sits above the operating system, network, and database; it is a kind of software between the application system and the system software, and it uses the basic services (functions) provided by the system software to connect the various parts of an application system, or different applications, across a network.
  • the controller 500 may calculate the ROI feature region in the projection content image according to a preset feature point matching algorithm.
  • the controller 500 converts the image into a grayscale image after acquiring the projected content image.
•   S8342 Obtain a standard chart, project and display it through the optical machine 200, and obtain a projection content image of the standard chart.
•   the middleware obtains the standard chart image captured by the camera 700, identifies the black-and-white graphic feature points in the standard chart, and obtains their coordinates in the camera 700 coordinate system; the controller 500 further obtains the coordinates of the standard chart in the optical machine 200 coordinate system and solves the homography relationship between the camera 700 plane and the optical machine 200 plane; based on this homography relationship, the controller 500 identifies the coordinates of the four vertices of the screen captured by the camera 700 and, according to the homography matrix, obtains the range within which the optical machine 200 should project the standard chart onto the screen.
•   the controller 500 converts the projection content image of the standard chart into a grayscale image, traverses the pixels of the standard chart, and obtains the coordinate information of its pixels.
•   S8343 Compare the projection content image of the standard chart with the grayscale image, and obtain the key-point description information of each.
•   S8344 Determine the feature key points that are the same in the standard chart and the grayscale image according to the above key-point description information.
•   the controller 500 uses the SIFT (Scale-Invariant Feature Transform) image processing algorithm to obtain the descriptors des1 and des2 of the two images respectively, where a descriptor is the description information of the key points among the features in an image.
•   the controller 500 matches the descriptors to obtain a matching result, takes the first two description entries for each descriptor as the candidate matching values, and then outputs the matching result to obtain the feature key points.
•   knn here refers to the fast nearest-neighbor search algorithm (K nearest neighbors) used for descriptor matching.
•   the k parameter of the search is set to 2, so each match returns the two closest candidates {m, n}, that is, the first two matching results; the obtained match result variables contain the related parameters {queryIdx, trainIdx, distance}, where queryIdx represents the description information of the preset image, trainIdx represents the description information of the actually captured photo, and distance represents the distance between the two descriptors.
•   a match m is kept when its distance is smaller than k times the distance of n, where this k is a preset ratio coefficient; to increase the effective matching rate, in some embodiments the value of k is 0.75.
•   the controller 500 obtains the updated matching list according to the fast nearest-neighbor search algorithm, combines the position coordinates of the feature key points, outputs the multi-point coordinates under the key features, and closes the feature key-point coordinates to obtain a region containing the feature key points.
  • the obtained image coordinate points are output as the ROI feature area.
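The {m, n} pairs filtered with the k = 0.75 coefficient correspond to Lowe's ratio test; a sketch using plain dictionaries as stand-ins for cv2.DMatch objects:

```python
def ratio_test(matches, k: float = 0.75):
    """Lowe's ratio test: keep match m only when its distance is clearly
    smaller than the second-best candidate n (distance_m < k * distance_n)."""
    return [m for m, n in matches if m["distance"] < k * n["distance"]]

# each knnMatch entry returns the two closest candidates {m, n}, each carrying
# {queryIdx, trainIdx, distance}, as described in the text
matches = [
    ({"queryIdx": 0, "trainIdx": 5, "distance": 40.0},
     {"queryIdx": 0, "trainIdx": 9, "distance": 120.0}),  # unambiguous -> kept
    ({"queryIdx": 1, "trainIdx": 2, "distance": 90.0},
     {"queryIdx": 1, "trainIdx": 7, "distance": 100.0}),  # ambiguous -> dropped
]
good = ratio_test(matches)
print(len(good), good[0]["trainIdx"])  # 1 5
```

The test discards key points whose best and second-best matches are nearly equidistant, since such matches are likely accidental.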
•   the current projection content image captured by the camera 700 may contain more than one projection object, so in the step of converting the projection content image into a grayscale image, the number of current projection objects must also be determined, as shown in Figure 26.
  • the controller 500 acquires the number N of object targets within the current range of action in the current projection content map.
•   if N is equal to 1, there is only one projection reflecting surface in the projection area.
•   in that case, the quantity judgment threshold is set to 1, and the controller 500 executes any one of steps S8141, S8241, and S8341 (for example, the feature point matching algorithm) to obtain the ROI feature region.
•   when N is greater than 1, the controller 500 obtains multiple feature key points in the grayscale image from the same key points and determines multiple feature areas in the grayscale image according to these feature key points; the step of obtaining the key points is consistent with steps S8341 to S8344 of the feature point matching algorithm and will not be repeated here.
  • S8044 Prioritize the feature regions according to preset rules, and define the feature region with the highest priority as the ROI feature region.
•   the controller 500 decodes the feature key points of the multiple feature areas, outputs a depth image, and obtains the depth values and depth information parameters; based on the number of depth-detected objects N (1 to N) returned by TOF and the corresponding depth information parameters (D1 to DN), the depth map is converted into a layered map Dp to obtain the layering information of each feature area; each feature area is denoised to reduce the abnormal noise points that appear during layering, the layered areas are then closed, and the distance between feature key points and the area of each feature area are calculated according to the depth values.
•   the controller 500 processes each feature region according to the number of target objects and the depth parameters, obtaining the number of pixels occupied by the target-object image in each feature region; it accumulates the pixel counts of each depth-layered Dp image to obtain the number of pixels in each partition.
  • the preset rule is to prioritize the feature areas according to the number of pixels occupied by the target object image, the area size of the feature area, and the perimeter ratio of the feature area, and the feature with the highest priority region as the ROI feature region.
•   the controller 500 then prioritizes the feature areas in the same way as the contour sorting described above: it sorts the areas by size, compares the largest and second largest areas with the preset area threshold, and selects the area with the smaller w/h ratio, or, if the largest area's perimeter ratio exceeds the preset threshold, the largest area itself, as the current projection area.
•   in some embodiments, the controller 500 takes the region containing the largest number of pixels and the smallest perimeter ratio as the optimal region, and outputs the acquired multi-point coordinates of the current projection area, namely four-point coordinates or eight-point coordinates (including the boundary midpoints), as the ROI feature region.
  • S805: Calculate the image sharpness of the ROI feature region, and calculate a second focusing position according to the image sharpness.
  • S806: Control the driving motor to adjust the focal length of the optical assembly according to the second focusing position.
  • In some embodiments, the controller 500 sends a first movement instruction to control the driving motor 320 to move from the current position (the adjustment start point) to the first focusing position (the adjustment end point).
  • The camera 700 takes pictures at a specific frequency to capture the projected content during the movement; when the driving motor 320 has moved the optical assembly 310 to the first focusing position, it stops moving, which is recorded as one fine focusing pass.
  • The controller 500 obtains the ROI feature region, calculates its image sharpness, and compares it with the sharpness threshold. If the image sharpness of the ROI feature region is higher than the sharpness threshold, the first focusing position is recorded as the best-sharpness position and focusing ends.
  • In some embodiments, the controller 500 acquires the multiple projected-content images captured by the camera 700 during the movement of the optical assembly 310 and calculates the sharpness of each. The sharpness values of all projected-content images are sorted to obtain the highest sharpness value. The controller 500 compares the highest sharpness value with the sharpness threshold; if it is higher, the controller determines the shooting position of the projected-content image corresponding to the highest sharpness value and records that shooting position as the best-sharpness position.
  • The controller 500 then sends a second movement instruction to the driving motor 320, controlling it to move the optical assembly 310 to a target position, where the target position is the shooting position of the projected-content image with the highest sharpness value, thereby completing the focusing.
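The "sort sharpness values, compare the best against the threshold" step above reduces to a small selection routine. A minimal sketch, assuming frames are recorded as (motor position, sharpness score) pairs:

```python
def best_focus_from_pass(frames, sharpness_threshold):
    """frames: (motor_position, sharpness) pairs captured during one pass.

    Returns the shooting position of the sharpest frame when it beats the
    threshold, or None when another fine-focusing pass is still needed.
    """
    position, sharpness = max(frames, key=lambda f: f[1])
    return position if sharpness > sharpness_threshold else None
```

A `None` result corresponds to the branch where the controller adjusts the step count or speed and repeats the pass.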
  • Some embodiments of the present application further provide a projection device, including: an optical machine 200, a lens 300, a distance sensor 600, a camera 700, and a controller 500, as shown in FIG. 28.
  • The optical machine 200 is configured to project content onto the projection surface.
  • The lens 300 includes an optical assembly 310 and a driving motor 320; the driving motor 320 is connected to the optical assembly 310 to adjust the focal length of the optical assembly 310.
  • The camera 700 is configured to capture images of the projected content.
  • The distance sensor 600 is configured to detect the distance between the projection device and the projection surface.
  • The controller 500 is configured to perform the focusing steps described above, where the ROI feature region is the contour region with the largest area or the smallest perimeter ratio in the projected-content image, and a contour region is an area delineated in the projected-content image according to gray values.
  • With the above projection device and ROI feature-region selection method, after receiving a focusing instruction, the device can detect the separation distance between the projection device and the projection surface based on the time-of-flight ranging principle; the first focusing position is then calculated from the preset focusing curve and the separation distance.
  • In the coarse focusing stage, the method obtains the projected-content image at the current focusing position and selects its ROI feature region, which reduces blur in the projected picture caused by the projection environment, optimizes the projection effect, and improves the user experience.
  • Some embodiments of the present application further provide an automatic focusing method, which combines the advantages of the above focusing methods for the projection device and realizes fast focusing through multi-stage focusing.
  • The automatic focusing method can be applied to a projection device; to implement the method, the projection device may include an optical machine 200, a lens 300, a distance sensor 600, a camera 700, and a controller 500. As shown in FIG. 30, the controller can execute the program steps of the automatic focusing method, including the following steps:
  • The automatic focusing instruction may be the above-mentioned first focusing instruction.
  • The first distance between the projection surface and the camera 700 or distance sensor 600 is obtained in the manner described above, for example, the separation distance in the foregoing embodiments.
  • After acquiring the first distance, the controller 500 determines the position that the lens 300 needs to reach, that is, the first position (for example, the target focusing position in the foregoing embodiments), by combining the first distance with a preset focusing curve, such as the preset focal-length lookup table in the foregoing embodiments. By comparing the first position with the current position, the distance difference between the two is obtained; the controller 500 then queries a preset mapping table (see Table 1 for details) according to this distance difference to obtain the corresponding number of rotation steps of the driving motor 320, that is, the first number of rotation steps.
  • Querying the preset mapping table yields the required movement amount and number of rotation steps, thereby determining the fine-tuning interval for fine focusing.
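The mapping-table query above can be sketched as a signed lookup. The table contents below are illustrative placeholders (the patent's Table 1 is not reproduced in this text); only the two entries matching the worked examples later in the description are kept consistent with them.

```python
# Hypothetical preset mapping table: required movement of the optical
# assembly in mm -> driving-motor rotation steps.
STEP_TABLE = [(0.050, 90), (0.087, 157), (0.134, 242), (0.200, 360)]

def first_rotation_steps(distance_diff_mm):
    """Look up the signed rotation-step count for a signed distance
    difference between the first position and the current position."""
    magnitude = abs(distance_diff_mm)
    # Nearest-entry lookup; a real table could interpolate instead.
    _, steps = min(STEP_TABLE, key=lambda row: abs(row[0] - magnitude))
    return steps if distance_diff_mm >= 0 else -steps
```

The sign of the result encodes the rotation direction (forward or backward), matching the 0.134 mm / 242-step and -0.087 mm / -157-step examples given below.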
  • After determining the first number of rotation steps, the controller 500 sends a third movement instruction to the driving motor 320, that is, controls the driving motor 320 to move the optical assembly 310 to the first position according to the first number of rotation steps.
  • The camera 700 takes pictures at a specific frequency, obtains the projected-content images captured during the movement of the optical assembly 310 toward the first position, and stores them in a focusing-position storage container.
  • The controller 500 reads the projected-content images in the focusing-position storage container, calculates the sharpness values of all projected-content images through the sharpness evaluation function set in the system's sharpness evaluation unit, compares all calculated sharpness values with a preset first sharpness, filters out the projected-content images whose sharpness is higher than the first sharpness, and determines the shooting positions of those images; such a shooting position is the second position.
  • The first sharpness and the second sharpness are preset in the sharpness evaluation unit before the device leaves the factory, with the second sharpness value higher than the first. The first sharpness is used to evaluate the sharpness of the projected-content images obtained in the first, coarse focusing stage; the second sharpness is used to evaluate the sharpness of the projected-content images obtained in the second, fine focusing stage.
  • The sharpness evaluation unit polls whether the focusing-position storage container is empty; if it is not empty, the unit reads the position information and the corresponding photo, calculates the sharpness of the picture, and stores the result in a sharpness storage container on standby. The sharpness evaluation can be implemented based on multiple methods preset in the sharpness evaluation unit, such as frequency-domain functions, gray-scale functions, and information entropy.
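Of the evaluation methods named above, a gray-scale gradient function is the simplest to illustrate. The sketch below is one possible gray-scale measure (a squared-gradient sum), not necessarily the function preset in the device:

```python
def gray_gradient_sharpness(img):
    """Sum of squared horizontal and vertical gray-level differences.

    img is a 2-D list of gray values; a sharper (higher-contrast) image
    yields a larger score, so scores from different focus positions can
    be compared against the first/second sharpness thresholds.
    """
    score = 0
    for y in range(len(img) - 1):
        for x in range(len(img[0]) - 1):
            gx = img[y][x + 1] - img[y][x]  # horizontal gradient
            gy = img[y + 1][x] - img[y][x]  # vertical gradient
            score += gx * gx + gy * gy
    return score
```

Frequency-domain (e.g. FFT high-frequency energy) and information-entropy measures would slot into the same evaluation unit interface: image in, scalar score out.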
  • The user can modify the first sharpness and second sharpness values, but the second sharpness must remain higher than the first sharpness. The fine-tuning interval is determined by the first sharpness, and fine tuning then searches for the position with the highest sharpness value within that interval. If the second sharpness were lower than the first sharpness, the final highest sharpness value might be lower than the first sharpness, and fine focusing could not be achieved.
  • In some embodiments, the adjustment start point and adjustment end point of the fine-tuning interval may be swapped depending on the current position of the driving motor 320; that is, the first position may be the adjustment end point of the fine-tuning interval and the second position may be its adjustment start point.
  • For example, the first position is 0.134 mm away from the current position of the driving motor 320 (the original position), and the first number of rotation steps obtained by looking up the table is 242 steps. The driving motor 320 drives the optical assembly 310 to move forward according to the first number of rotation steps. If the image sharpness of the projected content at the 150-step position is higher than the first sharpness, the 150-step position is determined as the second position. The range 150-242 steps from the original position is then the fine-tuning interval, with the second position as the adjustment start point and the first position as the adjustment end point.
  • As another example, the first position is -0.087 mm away from the current position of the driving motor 320 (the original position), and the first number of rotation steps obtained by looking up the table is -157 steps. The driving motor 320 drives the optical assembly 310 to move backward according to the first number of rotation steps. If the image sharpness of the projected content at the -50-step position is higher than the first sharpness, the -50-step position is determined as the second position. The range -157 to -50 steps from the original position is then the fine-tuning interval, with the first position as the adjustment start point and the second position as the adjustment end point.
  • In some embodiments, in the step of moving the optical assembly 310 to the first position, the controller 500 is further configured to determine whether the sharpness of the projected image after focusing is better than the sharpness of the previous frame; if so, it sends a same-direction focusing signal to the driving motor 320, otherwise it sends a reverse focusing signal to the driving motor 320.
  • The rotation-direction information in a same-direction focusing signal is the same as the rotation-direction information of the driving motor 320 in the previously sent focusing signal, while the rotation-direction information in a reverse focusing signal is opposite to that in the previously sent focusing signal.
  • Repeated measurements at the same position may yield fluctuating distances. The projection device therefore collects the time-of-flight distance multiple times, calculates the average value, and substitutes the average value into the time-of-flight ranging linear regression model preset by the system, so as to calculate the theoretical distance value between the projection device and the projection surface. The more samples collected, the closer the average value is to the true value, which reduces the impact of device-to-device variation during time-of-flight ranging.
  • In some embodiments, the controller 500 acquires multiple time-of-flight readings at preset time intervals and thereby calculates multiple first distances; takes the average of these first distances to obtain an average distance; and inputs the average distance into the preset time-of-flight ranging linear regression model to obtain the theoretical distance between the projection device and the projection surface, that is, the second distance. The second distance is then combined with the preset focusing curve to determine the position that the lens 300 needs to reach, that is, the first position.
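The averaging-plus-regression step above is numerically straightforward. A minimal sketch, assuming a single-variable linear model with calibrated slope and intercept (the coefficient values below are illustrative, not factory calibration data):

```python
def second_distance(tof_samples_mm, slope, intercept):
    """Average several TOF readings, then map the average through the
    calibrated linear regression model to the theoretical
    projector-to-surface distance (the 'second distance')."""
    average = sum(tof_samples_mm) / len(tof_samples_mm)
    return slope * average + intercept
```

The slope and intercept are what the factory calibration procedure fits and stores; recalibration after moving the TOF sensor simply refits these two coefficients.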
  • The linear regression model for TOF ranging is preset by the system when the projection device leaves the factory and is saved in a disk partition of the controller 500. Whenever the TOF sensor is disassembled, maintained, or repositioned, a new calibration is required, that is, data collection is performed again to obtain a new linear regression model.
  • When the driving motor 320 rotates forward and then reverses, a return (backlash) error occurs, so the position reached by the driving motor 320 after the first number of rotation steps is not the target position, that is, not the first position; this would affect the subsequent fine focusing process.
  • Therefore, in step 3202, when the controller 500 controls the driving motor 320 to move the optical assembly 310 to the first position according to the first number of rotation steps, as shown in FIG. 32:
  • The controller 500 acquires the current rotation direction of the driving motor 320 and compares it with the rotation direction used in the last execution of automatic focusing. If the two directions are inconsistent, the controller determines that a return error will occur in this automatic focusing, and obtains a second number of rotation steps by adding the preset return error to the first number of rotation steps. After obtaining the second number of rotation steps, the controller 500 sends a fourth movement instruction to the driving motor 320, that is, controls the driving motor 320 to move the optical assembly 310 to the first position according to the second number of rotation steps.
  • The return error is measured before the projection device leaves the factory: an average error is obtained through many forward and reverse rotations of the driving motor 320, yielding the preset return error. The preset return error is also recorded in a disk partition of the controller 500.
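The direction-dependent backlash compensation above can be sketched as follows. The preset return error value is a hypothetical placeholder; the real value comes from the factory measurement described above.

```python
PRESET_RETURN_ERROR = 12  # steps; the factory-measured value is hypothetical

def compensated_steps(first_steps, last_direction):
    """Add the preset return (backlash) error when the motor reverses.

    first_steps is signed (sign = rotation direction); last_direction is
    +1/-1 from the previous autofocus run, or None if unknown.
    Returns (second rotation-step count, direction of this run).
    """
    direction = 1 if first_steps >= 0 else -1
    if last_direction is not None and direction != last_direction:
        # Direction changed: travel extra steps to take up the backlash.
        return first_steps + direction * PRESET_RETURN_ERROR, direction
    return first_steps, direction  # same direction: no compensation needed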
  • When the fine-tuning interval has been determined, the second-stage fine focusing is performed, including: step 3301, the controller 500 sends a first movement instruction to the driving motor 320, controlling it to move the optical assembly 310 within the fine-tuning interval according to a preset number of rotation steps or a first fine-tuning speed.
  • While the first movement instruction is executed, the camera 700 takes pictures at a specific frequency to capture the projected content at each position; when the end of the interval is reached, the motor stops moving, which is recorded as one fine focusing pass. After one pass, in step 3302, the controller 500 acquires the multiple projected-content images captured by the camera 700 during the movement of the optical assembly 310 and calculates the sharpness of each. In step 3303, the sharpness values of all projected-content images are sorted to obtain the highest sharpness value; the controller 500 compares the highest sharpness value with the preset second sharpness and judges whether the highest value is higher than the second sharpness.
  • If it is higher, step 3304 is executed; otherwise the number of steps or the speed is adjusted and the process returns to step 3301. In step 3304, if the highest sharpness value is higher than the second sharpness, the controller determines the shooting position of the projected-content image corresponding to the highest sharpness value and records that shooting position as the best-sharpness position.
  • The controller 500 sends a second movement instruction to the driving motor 320 to control it to move the optical assembly 310 to a target position, where the target position is the shooting position of the projected-content image with the highest sharpness.
  • For example, the determined fine-tuning interval is 400-800 rotation steps of the driving motor 320 from the current position; the current position of the driving motor is recorded as the original position, and the controller 500 sends the first movement instruction to control the driving motor 320 to drive the optical assembly 310 to move within the fine-tuning interval.
  • When the driving motor 320 has driven the optical assembly 310 from the first position to the second position, that is, the position 800 steps from the original position, the driving motor 320 stops moving.
  • The system's sharpness evaluation unit acquires the projected-content images captured by the camera, sorts the sharpness of each position, and obtains the highest sharpness value.
  • Suppose the shooting position of the projected content corresponding to the highest sharpness value is 200 steps from the adjustment start point, that is, 600 steps from the original position of the driving motor 320. If the highest sharpness value is higher than the second sharpness, the position 600 steps from the original position is recorded as the target position, and the controller 500 sends a second movement instruction to control the driving motor 320 to drive the optical assembly 310 from the current position (the adjustment end point) to the target position; that is, the driving motor 320 drives the optical assembly 310 to move 200 rotation steps from the adjustment end point toward the adjustment start point, finally completing this automatic focusing.
  • If the highest sharpness value is not higher than the second sharpness, the controller 500 adjusts the number of rotation steps or the fine-tuning speed of the driving motor 320, controls the driving motor 320 to move the optical assembly 310 within the fine-tuning interval, and starts a new round of fine tuning.
  • In the new round, the controller 500 drives the driving motor 320 to move from the adjustment start point or the adjustment end point toward the other side of the fine-tuning interval according to the proximity principle.
  • For example, the fine-tuning interval is 400-800 steps from the original position, and after the last fine-tuning pass the driving motor 320 is at the second position, 800 steps from the original position; the controller 500 therefore controls the driving motor 320 to drive the optical assembly 310 to move from the second position toward the first position.
  • The sharpness evaluation unit calculates the sharpness of the projected images captured by the camera 700 during this fine focusing pass. If the highest sharpness value of this pass is still not higher than the second sharpness, the controller continues to adjust the number of rotation steps or fine-tuning speed of the driving motor 320 and starts a new round of focusing, until the best-sharpness position is found (this position is the shooting position of the projected-content image whose highest sharpness value is higher than the second sharpness).
  • In some embodiments, the controller 500 records the number of fine focusing passes; before starting a new round, the controller 500 compares the recorded number of passes with a preset threshold. If the recorded number exceeds the preset threshold, no new round of fine focusing is started; instead, the controller 500 sends a fifth movement instruction to control the driving motor 320 to drive the optical assembly 310 to the shooting position of the projected-content image corresponding to the highest sharpness value during the last round of fine focusing, and the current automatic focusing ends.
  • For example, the number of fine focusing passes is limited to 20. If after 20 passes the best-sharpness position still cannot be found, that is, the highest sharpness value is not higher than the second sharpness, no new round of fine focusing is started; the controller 500 sends a fifth movement instruction to control the driving motor 320 to drive the optical assembly 310 to the shooting position corresponding to the projected image with the highest sharpness during the 20th pass, and the automatic focusing ends, avoiding excessively time-consuming focusing attempts.
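The capped fine-focusing loop described above can be sketched as follows. The scan itself (motor motion, camera capture, sharpness scoring) is abstracted behind a callback, and the fallback to the best position seen so far is one reasonable reading of "the highest sharpness value during the last round":

```python
def fine_focus(run_pass, second_sharpness, max_passes=20):
    """run_pass(i) scans the fine-tuning interval once and returns a list
    of (position, sharpness) pairs; the caller may vary step count/speed
    per pass. Stops as soon as a pass beats the second sharpness,
    otherwise falls back to the best position seen after max_passes."""
    best_pos, best_sharp = None, float("-inf")
    for i in range(max_passes):
        pos, sharp = max(run_pass(i), key=lambda f: f[1])
        if sharp > best_sharp:
            best_pos, best_sharp = pos, sharp
        if sharp > second_sharpness:
            return best_pos  # best-sharpness position found; stop early
    return best_pos          # cap reached: settle for the best attempt
```

In the device, `run_pass` would also implement the proximity principle (scanning from whichever end of the interval the motor currently sits at).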
  • In some embodiments, multiple specific positions can be set within the fine-tuning interval. The driving motor 320 drives the optical assembly 310 to each specific position, and the camera 700 takes a picture there to obtain the projected-content image at that position. The sharpness values of the images at the specific positions are sorted from high to low, and the fine-tuning interval is gradually narrowed, so as to shorten each fine-tuning pass, save time, and realize fast automatic focusing.
  • For example, the fine-tuning interval is 400-900 steps from the original position, and multiple specific positions are set within it: specific position 1 (500 steps from the original position), specific position 2 (600 steps), specific position 3 (700 steps), and specific position 4 (800 steps). With the position 400 steps from the original position as the adjustment start point, the controller 500 controls the driving motor 320 to drive the optical assembly 310 to move from the adjustment start point to the adjustment end point (900 steps from the original position). At each specific position, the camera 700 takes a photo; when the driving motor 320 has driven the optical assembly 310 to the adjustment end point, the movement stops.
  • The controller 500 compares the sharpness of the projected-content images taken at the specific positions and determines that specific position 3 has the highest sharpness value; however, this highest value is not higher than the second sharpness, so the next round of fine focusing is started. In the new round, the fine-tuning interval is no longer 400-900 steps from the original position but 700-900 steps; the controller 500 controls the driving motor 320 to drive the optical assembly 310 to move 200 steps from the adjustment end point toward the adjustment start point.
  • During this movement, the camera 700 shoots the projected-content images multiple times at a preset frequency; the images are evaluated, and a highest sharpness value higher than the second sharpness is obtained. The shooting position of the corresponding projected-content image is 750 steps from the original position, which is the best-sharpness position; the controller 500 controls the driving motor 320 to drive the optical assembly 310 to move to this position, completing the automatic focusing.
  • In some embodiments, since the camera 700 takes pictures not after the driving motor 320 stops but while the motor is moving, there is a certain deviation between the actual shooting position of each picture and the position set by the controller 500. Because the speed of the driving motor 320 and the shooting frequency of the camera 700 are fixed, this deviation is limited to a certain range; the present application handles it by introducing a compensation value.
  • The best-sharpness position obtained in the second-stage fine focusing (that is, the shooting position of the projected-content image whose highest sharpness value is higher than the second sharpness) is refined as follows: a compensation value is calculated from the step rate of the driving motor 320 and the shooting frequency of the camera 700, and according to the compensation value, positions a specific number of steps before and after the best position are evaluated to calculate theoretical sharpness values, from which the final relative best-sharpness position is obtained.
  • For example, the best-sharpness position is 200 steps from the adjustment start point, and the calculated average position compensation value is 20 steps; the final relative best-sharpness position is then determined from the positions 180 steps, 200 steps, and 220 steps from the adjustment start point.
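The compensation step in the example above amounts to re-evaluating sharpness one compensation interval either side of the nominal best position. A minimal sketch, where `sharpness_at` stands in for whatever theoretical sharpness model the device uses (its exact form is not specified in this text):

```python
def refine_with_compensation(best_pos, compensation, sharpness_at):
    """Evaluate sharpness at the nominal best position and one
    compensation step before and after it; return the final
    relative-best position.

    The compensation value is derived from the fixed motor step rate and
    camera frame rate, bounding the shooting-position deviation."""
    candidates = (best_pos - compensation, best_pos, best_pos + compensation)
    return max(candidates, key=sharpness_at)
```

With a best position of 200 steps and a compensation value of 20 steps, the candidates are exactly the 180/200/220-step positions of the worked example.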
  • In view of this, some embodiments of the present application further provide a projection device, including: an optical machine, a lens, a camera, and a controller, as shown in FIG. 35.
  • The optical machine is configured to project the playback content onto the projection surface; the lens includes an optical assembly and a driving motor, the driving motor being connected to the optical assembly to adjust its focal length; and the camera is configured to capture images of the projected content.
  • The controller is configured to: step 1, obtain an automatic focusing instruction; step 2, in response to the automatic focusing instruction, obtain the first distance between the projection device and the projection surface and calculate the first position in the manner described above; step 3, calculate the fine-tuning interval based on the preset focusing curve and the first distance, and send a first movement instruction to the driving motor to control it to move the optical assembly within the fine-tuning interval; step 4, calculate the sharpness of the projected-content images captured by the camera during the movement of the optical assembly; step 5, send a second movement instruction to the driving motor to control it to move the optical assembly to the shooting position with the highest sharpness.
  • The projection device of the above embodiments obtains the first distance between the projection device and the projection surface after receiving the automatic focusing instruction, then calculates the fine-tuning interval according to the preset focusing curve and the first distance, completing the first-stage coarse focusing. Once the fine-tuning interval is determined, the driving motor is controlled to move the optical assembly within the interval, and the shooting position of the projected content with the highest sharpness is obtained by calculating the sharpness of the images captured by the camera during the movement; the driving motor then moves the optical assembly to this position to complete the automatic focusing.
  • The projection device determines the fine-tuning interval through the first-stage coarse focusing and finds the best-sharpness position within the fine-tuning interval through the second-stage fine focusing to complete automatic focusing. Without increasing the time consumed by focusing, this avoids the problem of unclear focus caused by local (partial) focusing, improves the focusing speed, and improves the user experience.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Projection Apparatus (AREA)

Abstract

Some embodiments of the present application provide a projection device and a focusing method. After receiving a focusing instruction, the focusing method can extract position memory information from a memory and judge the credibility of the current position in the position memory information. If the credibility is a first value, a first focusing amount is calculated based on the current position; if the credibility is a second value, a second focusing amount is calculated based on the start position of the focusing interval. A focusing instruction is then sent to the driving motor according to the first focusing amount or the second focusing amount, so that the projection device can adjust the position of the optical assembly based on the recorded current position, improving the sharpness of the projected picture and shortening the time consumed by the focusing process.

Description

Projection device and focusing method
Cross-reference to related applications
This application claims priority to Chinese patent applications No. 202111355866.0 filed on November 16, 2021; No. 202210345204.3 filed on March 31, 2022; No. 202210343444.X filed on March 31, 2022; and No. 202210625987.0 filed on June 2, 2022, the entire contents of which are incorporated herein by reference.
Technical field
The present application relates to the technical field of display devices, and in particular to a projection device and a focusing method.
Background
A projection device is a display device that can project images or videos onto a screen. A projection device can project laser light of specific colors onto the screen through the refraction of an optical lens assembly to form a specific image. During projection, a certain distance must be maintained between the projection device and the screen so that the image formed on the screen falls within the focal range of the optical lens assembly, yielding a clear image.
To adapt to complex application scenarios and screens of different specifications, the focal length of the optical lens assembly of the projection device needs to be adjusted. For example, by observing the sharpness of the picture projected by the projection device, the user can manually adjust the distance between the lenses in the optical lens assembly, changing the overall focal length of the assembly. As the user adjusts, the sharpness of the projected picture changes, and the adjustment stops once the sharpness meets the user's needs.
Obviously, the manual focusing process is cumbersome and inconvenient for users. Therefore, some projection devices also support an automatic focusing function. Automatic focusing can be realized by providing a driving motor that moves some of the lenses in the optical lens assembly to focus. The projection device also detects the sharpness of the projected picture and, according to the detected sharpness, controls the driving motor to start or stop, realizing automatic focusing. However, because sharpness detection and the start-stop control of the driving motor are cumbersome, this automatic focusing approach is time-consuming and has poor anti-interference capability, degrading the user experience.
Summary
A first aspect of the embodiments of the present application provides a projection device, including: an optical machine configured to project content onto a projection surface; a lens including an optical assembly and a driving motor, the driving motor being connected to the optical assembly to adjust the focal length of the optical assembly; a memory configured to store position memory information; and a controller configured to: obtain a focusing instruction input by a user; in response to the focusing instruction, extract the position memory information, the position memory information including the current position of the optical assembly and the credibility of the current position; in response to the credibility being a first value, calculate a first focusing amount based on the current position and send a first focusing instruction to the driving motor, the first focusing instruction being used to control the driving motor to move the position of the optical assembly according to the first focusing amount; and in response to the credibility being a second value, calculate a second focusing amount based on the start position of the focusing interval and send a second focusing instruction to the driving motor, the second value being smaller than the first value, the second focusing instruction being used to control the driving motor to move the position of the optical assembly according to the second focusing amount.
A second aspect of the embodiments of the present application provides a focusing method for a projection device, the projection device including an optical machine, a lens, and a controller, where the lens includes an optical assembly and a driving motor, the driving motor being connected to the optical assembly to adjust the focal length of the optical assembly. The focusing method includes: obtaining a focusing instruction input by a user; in response to the focusing instruction, extracting position memory information, the position memory information including the current position of the optical assembly and the credibility of the current position; in response to the credibility being a first value, calculating a first focusing amount based on the current position and sending a first focusing instruction to the driving motor, the first focusing instruction being used to control the driving motor to move the position of the optical assembly according to the first focusing amount; and in response to the credibility being a second value, calculating a second focusing amount based on the start position of the focusing interval and sending a second focusing instruction to the driving motor, the second value being smaller than the first value, the second focusing instruction being used to control the driving motor to move the position of the optical assembly according to the second focusing amount.
附图说明
图1为本申请实施例中投影设备投影状态示意图;
图2为本申请实施例中投影设备结构示意图;
图3为本申请实施例中投影设备的光机架构示意图;
图4为本申请实施例中投影设备光路示意图;
图5示出了本申请一实施例投影设备的电路结构示意图;
图6为本申请实施例中投影设备的***框架示意图;
图7为本申请实施例中投影设备的镜头结构示意图;
图8为本申请实施例中镜头投影光路示意图;
图9为本申请实施例中距离传感器和相机结构示意图;
图10为本申请实施例中基于位置记忆的调焦方法流程示意图;
图11为本申请实施例中基于第一调焦量的调焦过程示意图;
图12为本申请实施例中计算第一调焦量的流程示意图;
图13为本申请实施例中基于第二调焦量的调焦过程示意图;
图14为本申请实施例中各功能模块数据交互示意图;
图15为本申请实施例中设置可信度的流程示意图;
图16为本申请实施例中多阶段调焦过程示意图;
图17为本申请实施例中基于位置记忆的调焦方法时序流程图;
图18为本申请另一实施例投影设备实现放射眼功能的信令交互时序示意图;
图19为本申请另一实施例投影设备实现显示画面校正功能的信令交互时序示意图;
图20为本申请另一实施例投影设备实现自动对焦算法的流程示意图;
图21为本申请另一实施例投影设备实现梯形校正、避障算法的流程示意图;
图22为本申请另一实施例投影设备实现入幕算法的流程示意图;
图23为本申请另一实施例投影设备实现防射眼算法的流程示意图;
图24为本申请另一些实施例的ROI特征区域选取方法流程示意图;
图25示例性示出了一些实施例的ROI特征区域选取方法中轮廓区域分级示意图;
图26示例性示出了一些实施例的ROI特征区域选取方法流程图;
图27示例性示出了一些实施例的精细调焦过程示意图;
图28示例性示出了一些实施例的ROI特征区域选取方法过程示意图;
图29示例性示出了本申请实施例中ROI特征区域选取方法时序关系图;
图30示例性示出了一些实施例的自动调焦方法流程示意图;
图31示例性示出了一些实施例的自动对焦方法流程图;
图32示例性示出了一些实施例的自动对焦方法流程图;
图33示例性示出了一些实施例的自动对焦方法流程图;
图34示例性示出了一些实施例的自动对焦方法中精细调焦过程示意图;
图35示例性示出了本申请实施例中自动对焦方法时序关系图。
具体实施方式
To make the objectives and implementations of the present application clearer, exemplary implementations of the present application will be described clearly and completely below with reference to the accompanying drawings of the exemplary embodiments. Obviously, the described exemplary embodiments are only a part, not all, of the embodiments of the present application.
A projection device is a device that can project images or videos onto a screen. Through different interfaces, a projection device can be connected to a computer, a radio and television network, the Internet, a VCD (Video Compact Disc), a DVD (Digital Versatile Disc), a game console, a DV, and the like to play the corresponding video signals. Projection devices are widely used in homes, offices, schools, and entertainment venues.
FIG. 1 is a schematic placement diagram of a projection device according to an embodiment of the present application, and FIG. 2 is a schematic diagram of the optical path of a projection device according to an embodiment of the present application.
Referring to FIG. 1 and FIG. 2, a projection device of the present application includes a projection screen 1 and a projecting apparatus 2. The projection screen 1 is fixed at a first position, and the projecting apparatus 2 is placed at a second position so that the picture it projects coincides with the projection screen 1. The projection device includes a laser light source 100, an optical machine 200, a lens 300, and a projection medium 400. The laser light source 100 provides illumination for the optical machine 200; the optical machine 200 modulates the source beam and outputs it to the lens 300 for imaging, projecting it onto the projection medium 400 to form a projected picture.
In some embodiments, the laser light source 100 of the projection device includes a laser assembly 110 and an optical lens assembly 120. The beam emitted by the laser assembly 110 can pass through the optical lens assembly 120 to provide illumination for the optical machine. The optical lens assembly 120 requires a higher level of environmental cleanliness and hermetic sealing, whereas the chamber housing the laser assembly can be sealed with a lower, dust-proof sealing level to reduce sealing cost.
FIG. 3 is a schematic diagram of the circuit architecture of a projection device according to an embodiment of the present application. The projection device may include a display control circuit 10, a laser light source 20, at least one laser driving assembly 30, and at least one brightness sensor 40; the laser light source 20 may include at least one laser in one-to-one correspondence with the at least one laser driving assembly 30. Here, "at least one" means one or more, and "multiple" means two or more.
Based on this circuit architecture, the projection device can realize adaptive adjustment. For example, by providing a brightness sensor 40 in the light output path of the laser light source 20, the brightness sensor 40 can detect a first brightness value of the laser light source and send the first brightness value to the display control circuit 10.
The display control circuit 10 can obtain a second brightness value corresponding to the driving current of each laser, and when it determines that the difference between the second brightness value of a laser and the first brightness value of that laser is greater than a difference threshold, determine that the laser has suffered a COD (Catastrophic Optical Damage) failure. The display control circuit can then adjust the current control signal of the laser driving assembly corresponding to the laser until the difference is less than or equal to the difference threshold, thereby eliminating the COD failure of the blue laser. The projection device can thus eliminate laser COD failures in time, reduce the laser damage rate, and improve the image display effect of the projection device.
FIG. 4 is a schematic structural diagram of a projection device according to an embodiment of the present application.
In some embodiments, the laser light source 20 in the projection device may include an independently arranged blue laser 201, red laser 202, and green laser 203; such a projection device may also be called a three-color projection device. The blue laser 201, red laser 202, and green laser 203 are all lightweight-module (MCL) packaged lasers, which are small in size and facilitate a compact optical-path arrangement. In other implementations, the light source may also be a single-color laser or a two-color laser.
In some embodiments, the projection device may be configured with a camera that cooperates with the projection device to realize adjustment and control of the projection process. For example, the camera configured for the projection device may be implemented as a 3D camera or a binocular camera; when implemented as a binocular camera, it specifically includes a left camera and a right camera. The binocular camera can acquire the image and playback content presented on the curtain corresponding to the projection device, that is, the projection surface; this image or playback content is projected by the optical machine built into the projection device.
When the projection device is moved, its projection angle and its distance to the projection surface change, causing the projected image to deform; the projected image then appears as a trapezoid or another distorted image. Based on the images captured by the camera, the projection device controller can realize automatic keystone correction by coupling the included angle between the optical machine and the projection surface with the correct display of the projected image.
FIG. 5 is a schematic diagram of the circuit structure of a projection device according to an embodiment of the present application.
In some embodiments, the laser driving assembly 30 may include a driving circuit 301, a switching circuit 302, and an amplifying circuit 303. The driving circuit 301 may be a driver chip. The switching circuit 302 may be a metal-oxide-semiconductor (MOS) transistor.
The driving circuit 301 is connected to the switching circuit 302, the amplifying circuit 303, and the corresponding laser included in the laser light source 20. The driving circuit 301 is configured to output a driving current to the corresponding laser in the laser light source 20 through the VOUT terminal based on the current control signal sent by the display control circuit 10, and to transmit the received enable signal to the switching circuit 302 through the ENOUT terminal.
The display control circuit 10 is further configured to determine the amplified driving voltage as the driving current of the laser, and to obtain the second brightness value corresponding to the driving current.
In some embodiments, the amplifying circuit 303 may include a first operational amplifier A1, a first resistor (also called a sampling power resistor) R1, a second resistor R2, a third resistor R3, and a fourth resistor R4.
In some embodiments, the display control circuit 10 is further configured to restore the current control signal of the laser driving assembly corresponding to the laser to an initial value when the difference between the second brightness value of the laser and the first brightness value of the laser is less than or equal to the difference threshold; the initial value is the magnitude of the PWM (Pulse Width Modulation) current control signal for the laser in the normal state. Thus, when a COD failure occurs in a laser, it can be quickly identified and measures to reduce the driving current can be taken in time, mitigating the continuous damage to the laser itself and helping it self-recover. The whole process requires no disassembly or human intervention, improving the reliability of the laser light source and ensuring the projection display quality of the laser projection device.
In some embodiments, the controller includes at least one of a central processing unit (CPU), a video processor, an audio processor, a graphics processing unit (GPU), RAM (Random Access Memory), ROM (Read-Only Memory), first to n-th interfaces for input/output, and a communication bus (Bus).
In some embodiments, the system is divided into four layers: from top to bottom, the Applications layer ("application layer"), the Application Framework layer ("framework layer"), the Android runtime and system library layer ("system runtime library layer"), and the kernel layer.
In some embodiments, after startup the projection device can directly enter the display interface of the signal source selected last time, or the signal-source selection interface, where the signal source may be a preset video-on-demand program, or at least one of an HDMI (High Definition Multimedia Interface) interface, a live TV interface, and the like. After the user selects a signal source, the projector can display the content obtained from that source.
图6为本申请一实施例投影设备实现显示控制的***框架示意图。
在一些实施例中,投影设备具备长焦微投的特点,其控制器通过预设算法可对投影光图像进行显示控制,以实现显示画面自动梯形校正、自动入幕、自动避障、自动调焦、以及防射眼等功能。
在一些实施例中,投影设备配置有陀螺仪传感器;设备在移动过程中,陀螺仪传感器可感知位置移动、并主动采集移动数据;然后通过***框架层将已采集数据发送至应用程序服务层,支撑用户界面交互、应用程序交互过程中所需应用数据,采集数据还可用于控制器在算法服务实现中的数据调用。
在一些实施例中,投影设备配置有飞行时间传感器,在飞行时间传感器采集到相应数据后,所述数据将被发送至服务层对应的飞行时间服务;上述飞行时间服务获取数据后,将采集数据通过进程通信框架发送至应用程序服务层,数据将用于控制器的数据调用、用户界面、程序应用等交互使用。
在一些实施例中,投影设备配置有用于采集图像的相机,所述相机可实施为双目相机、或深度相机、或3D相机等;相机采集数据将发送至摄像头服务,然后由摄像头服务将采集图像数据发送至进程通信框架、和/或投影设备校正服务;所述投影设备校正服务可接收摄像头服务发送的相机采集数据,控制器针对所需实现的不同功能可在算法库中调用对应的控制算法。
在一些实施例中,通过进程通信框架、与应用程序服务进行数据交互,然后经进程通信框架将计算结果反馈至校正服务;校正服务将获取的计算结果发送至投影设备操作***,以生成控制信令,并将控制信令发送至光机控制驱动以控制光机工况、实现显示图像的自动校正。
在一些实施例中,投影设备通过自动调焦算法,利用其配置的激光测距可获得当前物距,以计算初始焦距、及搜索范围;然后投影设备驱动相机进行拍照,并利用对应算法进行清晰度评价。
投影设备在上述搜索范围内,基于搜索算法查找可能的最佳焦距,然后重复上述拍照、清晰度评价步骤,最终通过清晰度对比找到最优焦距,完成自动调焦。
例如,在投影设备启动后,用户移动设备;投影设备自动完成校正后重新调焦,控制器将检测自动调焦功能是否开启;当自动调焦功能未开启时,控制器将结束自动调焦业务;当自动调焦功能开启时,投影设备将通过中间件获取飞行时间传感器的检测距离进行计算;
控制器根据获取的距离查询预设的映射表,以获取投影设备的焦距;然后中间件将获取的焦距设置到投影设备的光机;光机以上述焦距发出激光后,摄像头将执行拍照指令;控制器根据获取的拍摄图像、评价函数,判断投影设备调焦是否完成;
如果判定结果符合预设完成条件,则控制自动调焦流程结束;如果判定结果不符合预设完成条件,中间件将微调投影设备光机的焦距参数,例如可以预设步长逐渐微调焦距,并将调整的焦距参数再次设置到光机;从而实现反复拍照、清晰度评价步骤,最终通过清晰度对比找到最优焦距完成自动调焦。
图7为在一些实施例中投影设备的镜头结构示意图。为了支持投影设备的自动调焦过程,如图7所示,投影设备的镜头300还可以包括光学组件310和驱动马达320。其中,光学组件310是由一个或多个透镜组成的透镜组,可以对光机200发射的光线进行折射,使光机200发出的光线能够透射到投影面上,形成透射内容影像。
光学组件310可以包括镜筒以及设置在镜筒内的多个透镜。根据透镜位置是否能够移动,光学组件310中的透镜可以划分为移动镜片311和固定镜片312,通过改变移动镜片311的位置,调整移动镜片311和固定镜片312之间的距离,改变光学组件310整体焦距。因此,驱动马达320可以通过连接光学组件310中的移动镜片311,带动移动镜片311进行位置移动,实现自动调焦功能。
需要说明的是,本申请部分实施例中所述的调焦过程是指通过驱动马达320改变移动镜片311的位置,从而调整移动镜片311相对于固定镜片312之间的距离,即调整像面位置。根据光学组件310中镜片组合的成像原理,所述调整焦距实则为调整像距,但就光学组件310的整体结构而言,调整移动镜片311的位置等效于调整光学组件310的整体焦距。因此,为了描述方便,在后续实施例中都使用调整焦距来说明上述过程。
驱动马达320可以通过特定的传动机构连接移动镜片311。传动机构的传动原理可以为任何将转动动作转化为移动动作的传动结构。例如,涡轮蜗杆传动结构、滚珠丝杠传动结构、螺纹螺杆传动结构等。对于螺纹螺杆传动结构,移动镜片311的外侧边缘设有镜框,镜框上可以设有螺纹。驱动马达320的动力输出轴连接螺杆,通过螺杆与镜框上的螺纹配合,使驱动马达320输出的转动动作可以转化为镜框的移动动作,从而带动移动镜片311在镜筒内移动。
由于移动镜片311处于不同的位置上时,对光学组件310整体焦距的影响也不同,因此投影设备可以通过驱动马达320转动特定的角度或圈数,使移动镜片311处在相对应的位置上。为了实现上述功能,驱动马达320可以为转动角度可控制的步进电机、伺服电机等。在调焦过程中,投影设备的控制器500可以向驱动马达320发送移动指令,移动指令中可以包括控制驱动马达320所需要旋转的角度数据。例如,对于步进电机形式的驱动马达320,控制器500所发送的移动指令中可以包括需要转动角度对应的脉冲信号,则在将移动指令发给驱动马达320后,驱动马达320可以从移动指令中解析出脉冲信号,并根据脉冲信号进行转动。在一些实施例中,驱动马达还可以是超声波马达或音圈电机,这里不详细展开说明。
需要说明的是,为了能够将移动镜片311调整至特定的位置,可以预先根据投影设备的内部结构,计算移动镜片311移动距离与驱动马达320转动角度之间的对应关系。移动距离与转动角度之间的对应关系可以为线性关系,受传动机构的传动比影响。则在进行调焦时,投影设备可以先计算移动镜片311的目标位置,再与移动镜片311的当前位置做差计算出调焦过程移动镜片311需要移动距离。再根据移动距离与转动角度之间的对应关系,计算出驱动马达320需要转动的角度,从而生成移动指令发送给驱动马达320。
由于移动镜片311通常只能够沿着镜筒,在镜筒内移动,因此在调焦过程中,移动镜片311具有行程限制。图8为在一些实施例中镜头投影光路示意图,如图8所示,为了便于描述,可以将移动镜片311最靠近光机时的一端称为近端,将移动镜片311最远离光机时的一端称为远端,则移动镜片311的整体移动行程为近端与远端之间的距离。在一些实施例中,为了便于准确调节投影效果,投影设备的实际调焦范围可以在移动镜片311的行程范围内。例如,将移动镜片311移动至近端后,驱动马达320再向前调节300步,可满足投影设备在最近投射距离的调焦需求;而从近端向前再调节900步,则可满足投影设备在最远投射距离的调焦需求,则实际调焦范围可以为驱动马达320在调节300步至900步所对应的位置区间。
由于调焦过程还可以受到驱动马达特性、环境特性、摆放姿态等综合因素的影响,使得投影设备在近端和远端位置还需要设定一定的调节余量,以满足实际投影效果。因此,在一些实施例中,投影设备的调节区间可以在实际调焦范围的基础上,增加部分调节余量。例如,对于调节300步至900步所对应的位置区间,在设置100步调节余量后,可以将距离近端200步所在位置和1000步所在位置设定为调节起点和调节终点,以构成最终的调焦范围。
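按上述示例,在实际调焦范围两端增加调节余量、并限制在行程范围内得到最终调焦区间,可示意如下(函数名与默认值为假设):

```python
def focus_range_with_margin(near_steps, far_steps, margin, travel_min=0, travel_max=None):
    # 在实际调焦范围 [near_steps, far_steps] 两端各外扩 margin 步,
    # 并限制在移动镜片的行程范围内,得到最终调焦区间的起点与终点
    start = max(near_steps - margin, travel_min)
    end = far_steps + margin
    if travel_max is not None:
        end = min(end, travel_max)
    return start, end
```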
当投影设备与投影面之间相距不同距离时,需要投影设备的镜头调整不同的焦距从而在投影面上透射清晰的图像。而在投影过程中,投影设备与投影面的间隔距离会受用户的摆放位置的不同而需要不同的焦距。因此,为适应不同的使用场景,投影设备需要调节光学组件310的焦距。
在一些实施例中,投影设备可以支持手动调焦功能,即在投影设备上可以设有交互按键,或者投影设备配套有遥控器,用户可以通过投影设备上的焦距调节按键或者遥控器上的焦距调节按键,与投影设备进行交互。则在交互过程中,投影设备的控制器500可以根据用户的按键操作,生成移动指令,并将移动指令发送给驱动马达,以控制驱动马达带动移动镜片311进行移动,改变光学组件的焦距。
例如,当用户在使用投影设备时发现投影画面不清晰时,可以按下投影设备上的“前进”按键,则响应于此时用户的按键操作,投影设备可以生成控制移动镜片311向远离光机方向移动的移动指令,并将移动指令发送给驱动马达,驱动马达则在接收到该移动指令后,按照正向(顺时针)方向旋转以输出力矩,驱动移动镜片311向远离光机方向移动。随着移动镜片311的运动过程,投影设备所透射的画面清晰度将发生变化,用户则根据画面的清晰度选择继续调焦或停止调焦操作,直至呈现用户满意的画面效果。
在手动调焦过程中,投影设备的控制器500可以根据预设的交互规则,实现对驱动马达320的对应控制。即在一些实施例中,控制器500可以根据用户的按键时长确定驱动马达的转动角度量。则在调焦量较大时,用户可以长时间按下调焦按键;而在调焦量较小时,用户可以短时间按下调焦按键。
在一些实施例中,投影设备的控制器500还可以根据用户的按键次数,确定驱动马达的转动角度量。例如,投影设备在粗调状态下,设置每次按键操作的调节量为100步,对应驱动马达转动一周,则当用户向远端调整300步时,需要用户连续按下三次“前进”按键。
投影设备还可以根据投影效果自动调整焦距。在本申请的一些实施例中提供一种基于间隔距离与焦距关系的调焦方法。即投影设备可以检测光机与投影面之间的间隔距离,由于不同的间隔距离需要不同的焦距才能呈现清晰的画面,因此在检测到相对于投影面的间隔距离后,投影设备可以得到相适应的焦距。再向驱动马达发送移动指令,从而将光学组件的整体焦距调整至相适应的焦距。
图9为在一些实施例中距离传感器和相机结构示意图。如图9所示,投影设备还可以内置或外接相机700,相机700可以对投影设备投射的画面进行图像拍摄,以获取投影内容图像。投影设备再通过对投影内容图像进行清晰度检测,确定当前镜头焦距是否合适,并在不合适时进行焦距调整。基于相机700拍摄的投影内容图像进行自动调焦时,投影设备可以通过不断调整镜头位置并拍照,并通过对比前后位置图片的清晰度找到调焦位置,从而将光学组件中的移动镜片311调整至合适的位置。例如,控制器500可以先控制驱动马达320将移动镜片311从调焦起点位置逐渐移动至调焦终点位置,并在此期间不断通过相机700获取投影内容图像。再通过对多个投影内容图像进行清晰度检测,确定清晰度最高的位置,最后控制驱动马达320将移动镜片311从调焦终点调整到清晰度最高的位置,完成自动调焦。
为了获得更好的调焦效果,并提高调焦速度,本申请的一些实施例中还提供一种基于位置记忆的调焦方法。图10为在一些实施例中基于位置记忆的调焦方法流程示意图。所述调焦方法可以通过记忆驱动马达320的位置,并基于记忆的位置进行调焦,以减少驱动马达320的运动行程,提高调焦速度。所述调焦方法可以应用于投影设备,并且为了满足该调焦方法的实施,所述投影设备可以包括光机200、镜头300、存储器以及控制器500。其中,如图10所示,控制器500可以用于执行该调焦方法对应的程序步骤,包括以下内容:
获取用户输入的调焦指令。根据调焦指令后调焦方式的不同,用户输入的调焦指令可以包括手动调焦指令和自动调焦指令。其中,手动调焦指令可以通过投影设备上的实体按键,或者投影设备配套遥控器上的实体按键进行输入。自动调焦指令可以由用户主动输入。例如,用户可以在接通投影设备的电源后,按下投影设备或投影设备配套遥控器上的自动调焦按键,使投影设备自动进行调焦,即获取自动调焦指令。
在一些实施例中,自动调焦指令也可以根据投影设备内置的控制程序自动生成。例如,当投影设备检测到开机后的首次视频信号输入时,可以触发自动调焦,则生成自动调焦指令。还例如,当投影设备检测到自身摆放姿态或设置位置发生改变时,为了消除改变过程的影响,在检测到摆放姿态或设置位置发生改变,投影设备可以自动进行焦距调整,即生成自动调焦指令。
在获取调焦指令后,投影设备可以从存储器中提取位置记忆信息。投影设备中可以预先配置位置存储模块,位置存储模块可以对调焦过程中驱动马达320的转动情况进行实时记录。位置记忆信息可以在每次执行调焦后,通过记录调焦后的位置生成。显然,能够触发实时记录的调焦过程可以是手动调焦过程,也可以是自动调焦过程。
在一些实施例中,所述位置记忆信息包括光学组件310的当前位置以及所述当前位置的可信度。其中,当前位置即每次调焦后光学组件310所处的位置,在未进行调焦时,所说当前位置不会发生改变。可信度用于表征当前位置的置信程度,即记录的当前位置是否可以用于后续调焦过程,实现通过设置可信度对记录的当前位置进行评价。
因此,投影设备在每次调焦过程中,可以先初始化临时变量。其中,所述临时变量用于记录调焦过程中驱动马达320带动光学组件310的移动方向和移动步数。即投影设备可以将调焦过程中的调焦量实时更新至临时变量。最后,根据临时变量更新位置记忆信息中的当前位置,以及根据调焦过程中是否出现异常情况设置当前位置的可信度。
例如,用户进入手动调焦界面后,控制器500可以通知位置存储模块将进行调焦,位置存储模块接收到通知后将位置可信度置为0,并等待调焦过程完成。投影设备再接收用户操作按键,如果用户通过方向键进行调焦,则按照用户要求移动镜片311位置,并通过临时变量,实时记忆用户的操作过程,包括移动方向和移动步数。
提取位置记忆信息后,投影设备可以根据位置记忆信息中的可信度,判断当前位置是否可用。其中,所述可信度可以通过特定的数值对当前位置记忆信息的有效性进行表示。即当可信度为第一数值时,表示当前位置记忆信息有效;当可信度为第二数值时,表示当前位置记忆信息无效。为了表示有效性,可以设置第一数值大于第二数值。例如,设置第一数值为1,表示当前记忆信息有效;设置第二数值为0,表示当前记忆信息无效。也可以通过其他数值表示当前位置记忆消息的有效性,如可信度为奇数时,表示位置记忆信息有效;可信度为偶数时,表示位置记忆信息无效。
图11为在一些实施例中基于第一调焦量的调焦过程示意图,如图11所示,如果可信度为第一数值,即当前位置可用,因此投影设备可以基于当前位置计算第一调焦量,以及向驱动马达320发送第一调焦指令。其中,第一调焦指令用于控制驱动马达320按照第一调焦量移动所述光学组件的位置。
例如,当用户按下一次“前”方向键输入手动调焦指令时,投影设备可以响应于手动调焦指令,提取位置记忆信息。其中,提取的位置记忆信息中,包含当前位置为距离近端350步的位置,以及当前位置对应的可信度为1,同时,手动调焦指令对应的一次“前”方向键对应调焦量为100步。因此,投影设备可以基于当前位置计算第一调焦量,即在350步的基础上增加100步。
对于自动调焦过程,在获取自动调焦指令后,投影设备的控制器500可以响应于自动调焦指令,通过距离传感器600获取间隔距离。其中,距离传感器600可以是能够检测目标距离的激光雷达、红外雷达等基于飞行时间(Time of Flight,TOF)原理的传感器设备。距离传感器600可以设置在光机200位置,包括信号的发射端和接收端。在进行间隔距离检测过程中,距离传感器600的发射端可以向投影面方向发射无线信号,无线信号在接触到投影面后,会被反射回距离传感器600接收端,从而根据发射端发出信号的时间和接收端接收信号的时间计算信号飞行时间,再结合飞行速度可以得到无线信号实际飞行距离,进而计算出投影面与光机之间的间隔距离。
图12为在一些实施例中计算第一调焦量的流程示意图,如图12所示,包括:步骤1201,投影设备可以先获取间隔距离;步骤1202,从存储器查询预设调焦距离对照表,其中,调焦距离对照表中包括间隔距离与目标调焦位置的映射关系。投影设备可以从对照关系表中查询与当前间隔距离相适应的焦距。焦距数据可以表现为移动镜片311相对于行程近端或远端的距离,以及对应的驱动马达320需要转动的方向、角度以及圈数;
步骤1203,根据间隔距离,在调焦距离对照表中查询目标调焦位置。步骤1204,获取当前位置;步骤1205,结合目标调焦位置与所述当前位置计算第一调焦量,其中,所述第一调焦量为所述目标调焦位置与所述当前位置的差值。步骤1206,再根据第一调焦量,生成第一调焦指令。即在获取间隔距离后,投影设备可以调用存储的间隔距离与焦距对照关系表,最后根据查询获得的焦距数据,生成第一调焦指令,并向驱动马达320发送第一调焦指令,以控制驱动马达320带动移动镜片311移动至目标位置。
例如,通过激光雷达检测到投影面与投影设备之间的距离为1300mm,此时通过调焦距离对照表可以确定最佳焦距对应的光学组件310所处的目标位置为距近端560步。同时,提取当前位置,即距离近端350步。在确定可信度为1后,再计算第一调焦量,即计算560-350=210为第一调焦量。再按照该第一调焦量向驱动马达320发送第一调焦指令,以控制驱动马达320带动光学组件310向前移动210步,到达目标位置。
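上述“查表得到目标调焦位置、再与当前位置做差得到第一调焦量”的过程可示意如下(对照表数据仅为示例假设,并非专利中的真实映射表):

```python
# 间隔距离(mm) -> 目标调焦位置(距近端步数),表中数值为示例假设
FOCUS_TABLE = {1000: 480, 1300: 560, 1600: 640}

def first_focus_amount(distance_mm, current_pos):
    # 按最接近的间隔距离查表得到目标调焦位置,再与当前位置做差得到第一调焦量
    nearest = min(FOCUS_TABLE, key=lambda d: abs(d - distance_mm))
    target = FOCUS_TABLE[nearest]
    return target, target - current_pos
```

例如间隔距离1300mm、当前位置350步时,返回目标位置560步与第一调焦量210步,与文中示例一致。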
图13为在一些实施例中基于第二调焦量的调焦过程示意图。如图13所示,根据提取的位置记忆信息,如果可信度为第二数值,即位置记忆信息中记录的当前位置不能用于后续调焦过程,则投影设备可以基于调焦区间起点位置计算第二调焦量,并向驱动马达320发送第二调焦指令。其中,所述第二调焦指令用于控制驱动马达320按照第二调焦量移动光学组件310的位置。
例如,通过对提取的位置记忆信息进行读取,确定当前位置的可信度为0时,则无论当前位置为何值,都先向驱动马达320发送移动指令,使驱动马达320带动光学组件310移动至投影设备光学组件行程的近端或远端,即调焦区间起点位置。再根据调焦区间起点位置和基于调焦距离对照表确定目标位置为距近端560步,确定第二调焦量为560-0=560。从而生成控制光学组件310向前移动560步的第二调焦指令。
在一些实施例中,为提高调焦速度,在向驱动马达320发送第二调焦指令的过程中,投影设备可以先按照第一速率控制驱动马达将光学组件移动至调焦区间起点位置,并计算第二调焦量,其中,第二调焦量为起点位置与目标调焦位置的差值。再根据第二调焦量,生成第二调焦指令,从而向驱动马达发送第二调焦指令,以控制驱动马达按照第二速率将光学组件移动至目标调焦位置,其中,第二速率小于第一速率。
即投影设备可以先按照第一速率快速将光学组件移动至调焦区间的起点位置,如光学组件行程近端位置,再按照第二速率相对较缓慢的将光学组件移动至目标调焦位置,从而减轻快速运动过程中惯性和行程间隙对调焦精度的影响。
可见,在上述实施例中,投影设备可以在获取调焦指令后,通过读取位置记忆信息,获取当前位置,并在当前位置的可信度为第一数值时,直接基于当前位置计算第一调焦量,从而使投影设备的驱动马达320可以直接基于当前调焦位置进行焦距调整,缩短调焦过程的调整行程,并减轻累计误差对调焦精度的影响,提高调焦过程的速度和精度。
为了使后续调焦过程也可以应用位置记忆信息,投影设备还可以对本次调焦过程中驱动马达320运转情况以及光学组件310的移动情况进行记录,并在调焦过程结束后,对位置记忆信息进行更新。在一些实施例中,投影设备可以在获取调焦指令后,初始化临时变量,并使用临时变量记录第一调焦量或第二调焦量;再根据临时变量更新位置记忆信息中的当前位置,以及设置当前位置的可信度。
针对位置记忆信息的更新过程,投影设备可以根据调焦方式的不同,采用不同的信息更新方式。对于手动调焦过程,投影设备可以在光机投射手动调焦界面的投影内容过程中,接收用户输入的按键信息,以及解析按键信息中的移动方向和移动步数,再将移动方向和移动步数存储至临时变量,以使用临时变量更新当前位置。而对于自动调焦过程,由于自动调焦过程无需判断用户按键,驱动马达320的移动完全是根据自动调焦策略自动完成的,因此每次执行相应的调焦策略后,实时存储驱动马达320带动光学组件310移动信息即可。
图14为在一些实施例中各功能模块数据交互示意图,如图14所示,在一些实施例中,投影设备的控制器还可以根据不同的功能用途划分为多个功能模块,例如调焦控制模块、自动调焦模块以及手动调焦模块。其中,调焦控制模块可以根据***程序设定或用户操作,触发自动调焦过程或手动调焦过程。
用户可以根据投影设备投射的UI界面的选项,触发相应操作。当触发自动调焦时,调焦控制模块只需要根据场景,如常规调焦、梯形校正触发、校准触发等,向自动调焦模块设置相应参数即可,由自动调焦模块自行完成调焦过程并记忆马达位置。当触发手动调焦时,调焦控制模块除了可以通过手动调焦模块操控调焦马达外,还可以向自动调焦模块设置手动调焦标志,将位置记忆信息对应的位置可信度设为0。同时,通过临时变量实时记忆驱动马达320的实际操作,并将驱动马达320带动光学组件310移动位置的变化情况实时设置到自动调焦模块,形成双重记忆,最大程度保证位置记忆信息的可信度。
自动调焦模块可以接收调焦控制模块下发的指令,执行相应的调焦策略。在接收到调焦控制模块下发的手动调焦标志时,可以获取存储器中的位置可信度,若位置可信度不为0,则将可信度设置为0,并在用户手动调焦时,接收调焦控制模块下发的驱动马达移动信息,进行临时记忆。当接收到手动调焦结束的标志后,自动调焦模块可以将本模块存储的马达移动信息与调焦控制模块存储的信息进行对比校验,以根据校验结果设置位置存储模块的位置信息可信度,选择是否更新位置记忆信息。
手动调焦模块可以接收调焦控制模块下发的操作指令,如驱动马达的转动方向、步数等,并通过驱动集成电路(Integrated Circuit,IC)芯片控制驱动马达移动,并将实际移动步数返回给调焦控制模块。其中,实际移动步数可以通过对光学组件行程路径上设置的限位开关进行检测获得。例如,电机控制单元以步进方式控制驱动马达320移动,如果当前移动的总步数小于用户设定的步数,则在每次移动之前,可以先通过对应通用型输入输出接口(General-purpose input/output,GPIO)的电平高低检测,确定是否到达可调区间的起点或终点。如果未到达起止点,则控制驱动马达320带动光学组件进行移动并将步数加1,否则停止移动,并返回当前实际步数。例如,电机控制单元接收到向前移动20步的指令,当准备移动第16步时,检测到已经达到可调区间终点,则此时不再继续执行,直接返回本次实际移动步数为16。
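电机控制单元“逐步移动、每步前检测限位、越界提前返回实际步数”的逻辑可示意如下。限位检测以回调函数模拟GPIO电平,属假设;端点判定的具体语义也为示意:

```python
def move_motor(requested_steps, at_limit, start_pos=0, direction=1):
    """以步进方式移动:每步移动前先检测下一位置是否越过可调区间端点,
    越界则停止,返回(实际移动步数, 最终位置)。at_limit 模拟GPIO限位检测。"""
    pos, moved = start_pos, 0
    while moved < requested_steps:
        if at_limit(pos + direction):
            break
        pos += direction
        moved += 1
    return moved, pos
```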
图15为在一些实施例中设置可信度的流程示意图,如图15所示,包括:步骤1501,为了设置所记录当前位置的可信度,投影设备可以通过监测调焦进程生成广播消息。广播消息中利用指定字段指示调焦过程是否有效,若指示调焦过程有效,广播消息为第一广播消息;若指示调焦过程无效,广播消息为第二广播消息。对于手动调焦过程,投影设备可以监听按键信息。如果在预设监听周期内未监听到按键信息,生成第二广播消息;如果在预设监听周期内监听到按键信息,生成第一广播消息,以及响应于按键信息中的退出按键操作,使用更新位置替换位置记忆信息中当前位置。
步骤1502,根据生成的广播消息内容,响应于广播消息为第一广播消息,则使用临时变量和当前位置计算更新位置;步骤1503,使用更新位置替换位置记忆信息中当前位置;步骤1504,可以设置位置记忆信息的可信度为第一数值。步骤1505,响应于广播消息为第二广播消息,或者,在预设接收周期内未生成广播消息,清除临时变量;步骤1506,设置位置记忆信息的可信度为第二数值。
例如,用户进入手动调焦界面,调焦控制模块先通知自动调焦模块将要进行手动调焦,并将位置存储模块的位置可信度置为0,再监听来自自动调焦模块的异常信息广播,如超时等。同时接收用户按键信息,并初始化临时变量,临时存储驱动马达320的操作情况,包括移动方向和步数等。
再通过UI提示,使用户可以通过上、下方向按键进行调焦操作。调焦控制模块接收到有效按键,即上、下方向按键后,则可以先将按键信息转换成对应的驱动马达320的转动方向和步数,并设置到手动调焦模块。手动调焦模块接收到调焦控制模块下发的指令后,驱动上述驱动马达320将光学组件310按指定方向移动指定步数。在完成移动后,投影设备再将本次实际移动步数返回给调焦控制模块。
调焦控制模块接收到手动调焦模块本次返回的移动方向和步数后,将其同步下发到自动调焦模块,之后调焦控制模块基于已有的临时位置信息和手动调焦模块本次返回的位置信息,更新临时变量。即在调焦控制模块中,临时变量中正向移动次数为A,实际移动步数为AA,反向移动步数为B,实际移动步数为BB,则临时变量记为[A,AA,B,BB];如果本次手动调焦模块返回的信息为正向移动20步,则调焦控制模块临时变量更新为[A+1,AA+20,B,BB]。
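临时变量[A,AA,B,BB]的更新规则可示意如下:

```python
def update_temp_var(temp, direction, steps):
    # temp = [正向移动次数A, 正向实际步数AA, 反向移动次数B, 反向实际步数BB]
    a, aa, b, bb = temp
    if direction == "forward":
        return [a + 1, aa + steps, b, bb]
    return [a, aa, b + 1, bb + steps]
```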
当调焦控制模块接收到返回按键,则执行退出操作,即通知自动调焦模块手动调焦已结束,并下发当前临时记忆的调焦位置移动情况。再监听自动调焦模块发送的正常接收的广播消息;如果接收到上述广播消息,则正常清除临时变量,结束本次过程;否则触发异常处理,不更新位置记忆信息,并将位置记忆信息中的可信度设置为第二数值。需要指出的是,当手动调焦过程中监听到待机广播时,则等同于用户进行退出操作,处理过程与上述示例相同。
当调焦控制模块在规定时间内持续没有接收到有效按键,即确定用户没有按照UI界面提示进行手动调焦时,投影设备可以在判断输入时间满足自身设定的超时条件或者监听到自动调焦模块发送的异常查询广播时,进入异常处理,不更新位置记忆信息。
在一些实施例中,为避免调焦控制模块调节手动调焦过程中出现异常退出,导致驱动马达320的移动信息丢失,而使存储的位置记忆信息不可用,投影设备可以检测驱动马达响应于第一调焦指令的单次移动信息,并使用单次移动信息更新临时变量。同时获取结束指令,其中,结束指令由用户主动输入,或者根据调焦进程自动生成。响应于结束指令,累加检测周期内的多个单次移动信息,以获得累计移动量。如果累计移动量对应的光学组件实际位置与目标调焦位置一致,设置当前位置的可信度为第一数值,以及使用临时变量更新位置记忆信息。如果累计移动量对应的光学组件实际位置与目标调焦位置不一致,设置当前位置的可信度为第二数值,以及清除临时变量。
例如,在手动调焦的过程中,自动调焦模块需要同步对驱动马达320带动光学组件310的移动信息做备份处理。例如,自动调焦模块接收到调焦控制模块下发的手动调焦通知后,先设置手动调焦开始标志,同时初始化临时变量,用于备份存储驱动马达320驱动的移动信息。再读取存储器中存储的位置记忆信息对应的可信度,若不为0,则将可信度设为0,以确保位置可信度准确。
对应于用户的一次按键操作,自动调焦模块接收到调焦控制模块下发的单次移动信息,则更新临时变量,对移动信息进行本地实时存储,并等待调焦控制模块下一次的信息。自动调焦模块接收到调焦控制模块下发的手动调焦结束通知及累计的单次移动信息后,则发送正常接收广播,并与本次存储的移动信息进行对比。如果对比结果一致,则直接更新位置存储器的位置记忆信息,即更新当前位置及可信度,并清除本地临时变量。如果对比结果为信息不一致,则触发异常处理,不更新位置记忆信息。
同理,自动调焦模块在接收到手动调焦开始通知后,在尚未接到手动调焦结束的情况下,如果再次接收到手动调焦开始通知,则视为出现异常,触发异常处理,直接设置可信度为0。并且,自动调焦模块超过设定的超时时间没有接收到来自调焦控制模块的有效信息,则同样视为出现异常,触发异常处理,不更新位置记忆信息。
需要说明的是,在上述实施例中异常情况可以由实际调焦过程的多种因素引起,投影设备可以通过检测调焦过程中的各项参数,确定是否发生异常情况。例如,调焦控制模块每次实际完成一次用户操作,都会将移动信息同步下发到自动调焦模块。因此,如果自动调焦模块接收到调焦控制模块在某一次下发的信息后,超过设定的超时时间,仍未再次接收到调焦控制模块下发的移动信息或手动调焦结束信息,则可视为出现异常情况,导致调焦控制模块无法继续正常工作,如其他应用异常弹出导致调焦控制模块无法接收按键等。
对此,自动调焦模块可以被配置为向调焦控制模块发送移动信息的异常查询广播,则调焦控制模块接收到该项广播后,可视作手动调焦结束,将存储的位置信息下发到自动调焦模块。如果自动调焦模块能够收到调焦控制模块下发的信息,则向调焦控制模块发送正常接收广播,便于调焦控制模块正常执行结束流程。再将自身存储的移动信息与调焦控制模块下发的移动信息进行比对校验,若两者一致,则视为移动信息有效,因此可以更新位置记忆信息并重新设置可信度为第一数值。若两者不一致,则需要校验自动调焦模块及调焦控制模块是否出现过崩溃重启。如果自动调焦模块及调焦控制模块中有一个重启过,一个没有重启过,则以没有崩溃重启的程序记录的数据为准;如果两者都重启过,则设置位置存储信息不可信,设置可信度为第二数值,使下次自动调焦时需要校准。如果两者都没重启过,则以调焦控制模块的记录为准。
另外,如果自动调焦模块无法收到调焦控制模块下发的信息,则说明此时调焦控制模块可能出现异常退出,此时校验自动调焦模块是否出现过崩溃重启,没有则以自动调焦模块存储的数据为准,否则设置位置记忆信息不可信,即可信度设置为第二数值。
为了确定是否出现崩溃异常,在一些实施例中,投影设备可以通过设置进程相关的***属性,并在每次进程被拉起进行初始化时,在完成初始化后读取该属性值并加1;在交流开机后属性值默认值为1,因此开机首次拉起时,属性值为1。之后每次调焦控制模块通知自动调焦开始手动调焦时,自动调焦模块先读取属性值并设置备份属性值;当触发异常检测进程是否崩溃重启过时,再次读取属性值并与备份值对比,如果一致则说明没有重启过,否则可以检测出本次过程中出现过崩溃重启。
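基于“进程拉起计数属性”的崩溃重启检测可示意如下。以字典模拟系统属性存储,类名与属性名均为假设:

```python
class CrashDetector:
    def __init__(self, props):
        self.props = props  # 模拟系统属性存储
        self.backup = None

    def on_process_start(self):
        # 每次进程被拉起并完成初始化后,属性值加1(交流开机后首次拉起时值为1)
        self.props["boot_count"] = self.props.get("boot_count", 0) + 1

    def snapshot(self):
        # 手动调焦开始时备份当前属性值
        self.backup = self.props.get("boot_count", 0)

    def crashed_since_snapshot(self):
        # 再次读取属性值并与备份对比,不一致说明期间发生过崩溃重启
        return self.props.get("boot_count", 0) != self.backup
```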
与上述自动调焦模块的异常情况判断过程相适应的,在一些实施例中,调焦控制模块可以长时间未收到用户按键或收到自动调焦模块的异常查询广播,以及向自动调焦模块设置手动调焦结束标志后,未收到正常接收广播时,确定出现异常情况。例如,用户长时间未操作或者误操作导致异常退出手动调焦界面,会导致调焦控制模块长时间收不到键值,则投影设备可以通过设置超时机制处理这种情形,即调焦控制模块有自身的超时机制,自动调焦模块也有自身的超时机制,且超过时间时会给调焦控制模块发送异常查询广播,触发异常处理。
调焦控制模块触发此种异常处理,等同于用户退出手动调焦,此时会向自动调焦模块发送手动调焦停止的通知并发送马达移动信息,之后等待自动调焦模块的正常接收广播。如果接收到了来自自动调焦模块的正常接收广播,则清除临时变量,同正常退出过程,由自动调焦模块完成马达位置信息维护及可信度设置;如果没有接收到来自自动调焦模块的正常接收广播,则由调焦控制模块进行马达位置信息维护及可信度设置,并向自动调焦模块单独设置手动对接结束标志位,便于自动调焦模块清除现有的马达移动临时信息。
调焦控制模块还可以设定特殊标志位,即如果当前超时是因为用户在手动调焦界面下长时间未操作导致的,则当用户再次上下调整时,不会再触发进入手动调焦的操作,此时,调焦控制模块需要根据该标志位重新向自动调焦模块下发手动调焦开始标志。
图16为在一些实施例中多阶段调焦过程示意图,如图16所示,为了获得更准确的调焦位置,投影设备可以通过内置或外接相机等图像采集组件,对投影设备投射的画面进行实时图像检测,从而计算出清晰度最高的位置。即投影设备可以按照第一速率控制驱动马达将光学组件调整至目标调焦位置,再根据目标调焦位置,计算精调区间。其中,目标调焦位置为精调区间的区间起点。
再按照第二速率控制驱动马达带动光学组件在精调区间内移动,其中,第二速率小于第一速率。即先通过较快的移动速度,将光学组件310移动至目标调焦位置,以实现缩短模糊调焦时间,再通过相对较慢的速度控制光学组件310从精调区间的一端移动至另一端,以便于相机可以拍摄更多位置上的投影内容图像。通过获取相机在光学组件移动期间拍摄的投影内容图像,并计算投影内容图像的清晰度,以获得精细调焦位置。其中,精细调焦位置为清晰度最高的投影内容图像的拍摄位置。
由于在根据图像清晰度进行调焦的过程中,需要实时进行图像拍摄以获得多个投影内容图像,并且对投影内容图像的清晰度进行计算,这导致根据图像清晰度进行调焦的过程耗时较长。因此,为了减少调焦耗时,并保证调焦过程的精确度,在一些实施例中,投影设备可以分阶段进行调焦。例如,完整的调焦过程可以分为三个阶段,即模糊调焦阶段、细调阶段和补偿阶段。其中,模糊调焦阶段在于快速找到细调范围,细调阶段在于找到最佳清晰度位置,补偿阶段用于消除并行控制可能引入的位置偏差。
对于模糊调焦阶段,投影设备可以根据距离传感器检测的间隔距离数据,从距离-焦距对照表中查询目标位置,并以查询的目标调焦位置为基准,确定精细调焦区间(简称:精调区间)。例如,精细调焦区间以目标调焦位置为中点,分别向前和向后延长100步的距离,形成一个200步长度的精调区间。
在一些实施例中,投影设备还可以基于相机拍摄的图像清晰度确定精调区间,即投影设备可以先根据每个投影内容图像的清晰度对调焦位置进行排序,以获得清晰度序列,并在清晰度序列中提取精细调焦区间,其中,清晰度最高的投影内容图像对应调焦位置在精细调焦区间内。例如,当接收到自动调焦指令时,控制器500可以通知相机700进行拍照,并通知清晰度评价单元进行清晰度评价。之后,相机700以特定频率进行拍照,得到投影内容图像。则在拍照后,如果接收到控制器发送的读取照片的指令,则给出最近一次的照片路径,否则做丢弃处理。清晰度评价单元则开始轮询存储调焦位置的容器是否为空,不为空则读取位置信息,并据此读取相应的图片并计算图片清晰度,再将清晰度计算结果存入清晰度存储容器中待用。
控制器500可以再读取当前记忆的驱动马达320转动位置,根据就近原则将移动镜片311调节至调节起点位置或调节终点位置。再根据固定步数按特定方向驱动驱动马达320。当驱动马达320驱动镜片到达特定位置后,比如调节起点位置和调节终点位置之间每隔100步作为一个特定位置,控制器500可以向相机700发送当前位置并要求相机700返回照片路径,再将当前位置存至调焦位置存储容器。清晰度评价单元检测到上述容器不为空,则读取位置信息,并获得相应的照片后进行清晰度计算。同时驱动马达320继续运动,无需等到清晰度对比结果。
重复上述过程直至驱动马达320将移动镜片311驱动到调节起点位置或调节终点位置。此时,清晰度评价单元检测到当前已经抵达调节起点位置或调节终点位置后,会将各个位置的清晰度进行排序,并向控制器500返回清晰度最佳位置。
控制器500再根据返回的最佳位置和对应的调焦步数,根据就近原则控制驱动马达320运行,以驱动镜片移动到精调区间的某一侧。比如,在上述步骤中从调节起点位置到达调节终点位置后,经检测最佳位置为距离近端600步的位置,则可以确定精细调焦区间为距离近端500步和700步之间位置区间。
确定精细调焦区间后,控制器500可以在向驱动马达320发送第二移动指令,以控制光学组件310按照预设调节步长在精细调焦区间内移动。为了获得精细调整效果,光学组件每次移动后,投影设备都需要从相机700获取拍摄的精细调焦图像,再通过计算精细调焦图像的清晰度,以及按照精细调焦图像的清晰度查找清晰度最高的精细调焦图像对应的调焦位置,以作为最佳调焦位置。
例如,在确定精细调焦区间为距离近端500步和700步之间位置区间后,投影设备可以通过驱动马达320将镜片驱动到距离近端700步的位置,并每隔10步从相机700获取一张精细调焦图像,并计算每张精细调焦图像的清晰度,从而确定精细调焦区间内的清晰度最高的图像对应的位置,如距离近端550步的位置为最佳调焦位置。
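精细调焦区间内“按固定步长采图、取清晰度最高位置”的搜索可示意如下(清晰度评价以回调函数模拟,属假设):

```python
def fine_focus(start, end, step, sharpness):
    # 在精调区间 [start, end] 内按固定步长逐点评价清晰度,返回清晰度最高的位置
    best_pos, best_score = None, float("-inf")
    for pos in range(start, end + 1, step):
        score = sharpness(pos)
        if score > best_score:
            best_pos, best_score = pos, score
    return best_pos
```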
需要说明的是,精细调焦区间在涉及调焦行程的端点位置需要特殊选定,即需要对确认的最佳清晰点分别处理,如最佳清晰点在区间终点,则此时只进一步检索终点到回退100步这一区间即可;如果最佳清晰位置在区间起点,则驱动马达320带动光学组件先运行到起点向前100步的位置,再在此区间内进行搜索即可。
由于相机700的拍照并不是在驱动马达停止后才进行,因此照片实际的位置和控制器500中记录的位置存在一定的偏差,但驱动马达320的移动步速和相机700的拍照频率是固定的,因此上述偏差会限定在一定的区间范围内,并且是可计算的。则在最终调焦完成后,可以根据驱动马达320的步速和相机700的拍照频率计算位置补偿值,并根据补偿值在最佳位置前后移动特定步数,再结合清晰度得到最终的相对最清晰位置。
即为了消除偏差,在一些实施例中,投影设备可以在根据每个投影内容图像的清晰度查找最佳调焦位置时,先获取驱动马达的移动速度,以及获取相机700的拍摄频率,再根据移动速度和拍摄频率计算位置补偿值。再通过提取清晰度最高的投影内容图像对应的调焦位置,以获得目标调焦位置,从而使用位置补偿值修正目标调焦位置,以获得最佳调焦位置。
例如,经过精细调焦过程之后,得到的最佳位置是距离近端560步的位置,而根据驱动马达320的步速和相机700的拍摄频率,计算得到的平均位置补偿值为15步,则最终获取在距离近端545步、560步和575步三个位置对应的投影内容图像,以通过对比清晰度,确定最终的最清晰点。
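位置补偿值的计算及补偿候选位置的生成可示意如下。补偿值此处取一个拍照周期内马达移动步数的一半,该计算方式为示意假设:

```python
def compensation_candidates(best_pos, steps_per_sec, shots_per_sec):
    # 一个拍照周期内马达移动 steps_per_sec / shots_per_sec 步,
    # 取其一半作为平均位置补偿值(计算方式为示意假设),
    # 在最佳位置前后各取一个补偿位置,连同最佳位置构成候选对比位置
    comp = round(steps_per_sec / shots_per_sec / 2)
    return [best_pos - comp, best_pos, best_pos + comp]
```

例如马达步速300步/秒、拍照频率10次/秒时,补偿值为15步,候选位置与文中545、560、575步的示例一致。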
图17为在一些实施例中基于位置记忆的调焦方法时序流程图,如图17所示,包括:
步骤1,获取用户输入的调焦指令;
步骤2,响应于所述调焦指令,提取所述位置记忆信息,所述位置记忆信息包括所述光学组件的当前位置以及所述当前位置的可信度;
步骤3,响应于所述可信度为第一数值,基于所述当前位置计算第一调焦量;步骤4,向所述驱动马达发送第一调焦指令,所述第一调焦指令用于控制所述驱动马达按照所述第一调焦量移动所述光学组件的位置;
步骤5,响应于所述可信度为第二数值,基于调焦区间起点位置计算第二调焦量;步骤6,向所述驱动马达发送第二调焦指令,所述第二数值小于所述第一数值;所述第二调焦指令用于控制所述驱动马达按照所述第二调焦量移动所述光学组件的位置。
上述实施例提供的投影设备可以在接收到调焦指令后,从存储器中提取位置记忆信息,并对位置记忆信息中当前位置的可信度进行判断。如果可信度为第一数值,则基于当前位置计算第一调焦量;如果可信度为第二数值,则基于调焦区间起点位置计算第二调焦量。并按照第一调焦量和第二调焦量向驱动马达发送调焦指令,使投影设备可以基于记录的当前位置调整光学组件的位置,提高投影画面清晰度,并缩短调焦过程所消耗的时间。
在一些实施例中,本申请提供的投影设备可实现防射眼功能。为防止用户偶然进入投影设备射出激光轨迹范围内而导致的视力损害危险,在用户进入投影设备所在的预设特定非安全区域时,控制器可控制用户界面显示对应的提示信息,以提醒用户离开当前区域,控制器还可控制用户界面降低显示亮度,以防止激光对用户视力造成伤害。
在一些实施例中,投影设备被配置为儿童观影模式时,控制器将自动开启防射眼开关。
在一些实施例中,控制器接收到陀螺仪传感器发送的位置移动数据后、或接收到其它传感器所采集的异物入侵数据后,控制器将控制投影设备开启防射眼开关。
在一些实施例中,在飞行时间(TOF)传感器、摄像头设备等设备所采集数据触发预设的任一阈值条件时,控制器将控制用户界面降低显示亮度、显示提示信息、降低光机发射功率、亮度、强度,以实现对用户视力的保护。
在一些实施例中,投影设备控制器可控制校正服务向飞行时间传感器发送信令,以查询投影设备当前设备状态,然后控制器接收来自飞行时间传感器的数据反馈。
校正服务可向进程通信框架(HSP Core)发送通知算法服务启动防射眼流程信令;进程通信框架(HSP Core)将从算法库进行服务能力调用,以调取对应算法服务,例如可包括拍照检测算法、截图画面算法、以及异物检测算法等;
进程通信框架(HSP Core)基于上述算法服务返回异物检测结果至校正服务;针对返回结果,若达到预设阈值条件,控制器将控制用户界面显示提示信息、降低显示亮度,其信令时序如图18所示。
在一些实施例中,投影设备防射眼开关在开启状态下,用户进入预定区域时,投影设备将自动降低光机发出激光强度、降低用户界面显示亮度、显示安全提示信息。投影设备对上述防射眼功能的控制,可通过以下方法实现:
控制器基于相机获取的投影画面,利用边缘检测算法识别投影设备的投影区域;在投影区域显示为矩形、或类矩形时,控制器通过预设算法获取上述矩形投影区域四个顶点的坐标值;
在实现对于投影区域内的异物检测时,可使用透视变换方法校正投影区域为矩形,计算矩形和投影截图的差值,以实现判断显示区域内是否有异物;若判断结果为存在异物,投影设备自动触发防射眼功能启动。
在实现对投影区域外一定区域的异物检测时,可将当前帧的相机内容、和上一帧的相机内容做差值,以判断投影区域外区域是否有异物进入;若判断有异物进入,投影设备自动触发防射眼功能。
与此同时,投影设备还可利用飞行时间(ToF)相机、或飞行时间传感器检测预定区域的实时深度变化;若深度值变化超过预设阈值,投影设备将自动触发防射眼功能。
在一些实施例中,投影设备基于采集的飞行时间数据、截图数据、以及相机数据分析判断是否需要开启防射眼功能。
例如,根据采集的飞行时间数据,控制器做深度差值分析;如果深度差值大于预设阈值X(例如预设阈值X可实施为0),则可判定有异物已处于投影设备的预定区域。若检测到特定对象位于预定区域,其视力存在被激光损害风险,投影设备将自动启动防射眼功能,以降低光机发出激光强度、降低用户界面显示亮度、并显示安全提示信息。
又例如,投影设备根据已采集截图数据做加色模式(RGB)差值分析,如所述加色模式差值大于预设阈值Y,则可判定有异物已处于投影设备的预定区域;所述预定区域内若存在特定对象如用户,其视力存在被激光损害风险,投影设备将自动启动防射眼功能,降低发出激光强度、降低用户界面显示亮度并显示对应的安全提示信息。
又例如,投影设备根据已采集相机数据获取投影坐标,然后根据所述投影坐标确定投影设备的投影区域,进一步在投影区域内进行加色模式(RGB)差值分析,如果加色模式差值大于预设阈值Y,则可判定有异物已处于投影设备的预定区域,所述预定区域内若存在特定对象如用户,其视力存在被激光损害的风险,投影设备将自动启动防射眼功能,降低发出激光强度、降低用户界面显示亮度并显示对应的安全提示信息。
若获取的投影坐标处于扩展区域,控制器仍可在所述扩展区域进行加色模式(RGB)差值分析;如果加色模式差值大于预设阈值Y,则可判定有异物已处于投影设备的预定区域,所述预定区域内若存在用户,其视力存在被投影设备发出激光损害的风险,投影设备将自动启动防射眼功能,降低发出激光强度、降低用户界面显示亮度并显示对应的安全提示信息,如图23所示。
图19为本申请另一实施例投影设备实现显示画面校正功能的信令交互时序示意图。
在一些实施例中,通常情况下,投影设备可通过陀螺仪、或陀螺仪传感器对设备移动进行监测。校正服务向陀螺仪发出用于查询设备状态的信令,并接收陀螺仪反馈用于判定设备是否发生移动的信令。
在一些实施例中,投影设备的显示校正策略可配置为,在陀螺仪、飞行时间传感器同时发生变化时,投影设备优先触发梯形校正;在陀螺仪数据稳定预设时间长度后,控制器启动触发梯形校正;并且控制器还可将投影设备配置为在梯形校正进行时不响应遥控器按键发出的指令;为了配合梯形校正的实现,投影设备将打出纯白图卡。
其中,梯形校正算法可基于双目相机构建世界坐标系下的投影面与光机坐标系转换矩阵;进一步结合光机内参计算投影画面与播放图卡的单应性,并利用该单应性实现投影画面与播放图卡间的任意形状转换。
在一些实施例中,校正服务发送用于通知算法服务启动梯形校正流程的信令至进程通信框架(HSP CORE),所述进程通信框架进一步发送服务能力调用信令至算法服务,以获取能力对应的算法;
算法服务获取执行拍照和画面算法处理服务、避障算法服务,并将其以信令携带的方式发送至进程通信框架;在一些实施例中,进程通信框架执行上述算法,并将执行结果反馈给校正服务,所述执行结果可包括拍照成功、以及避障成功。
在一些实施例中,投影设备执行上述算法、或数据传送过程中,若出现错误,校正服务将控制用户界面显示出错返回提示,并控制用户界面再次打出梯形校正、自动对焦图卡。
通过自动避障算法,投影设备可识别幕布;并利用投影变化,将投影画面校正至幕布内显示,实现与幕布边沿对齐的效果。
通过自动对焦算法,投影设备可利用飞行时间(ToF)传感器获取光机与投影面距离,基于所述距离在预设的映射表中查找最佳像距,并利用图像算法评价投影画面清晰程度,以此为依据实现微调像距。
在一些实施例中,校正服务发送至进程通信框架的自动梯形校正信令可包含其他功能配置指令,例如可包含是否实现同步避障、是否入幕等控制指令。
进程通信框架发送服务能力调用信令至算法服务,使算法服务获取执行自动对焦算法,实现调节设备与幕布之间的视距;在一些实施例中,在应用自动对焦算法实现对应功能后,算法服务还可获取执行自动入幕算法,所述过程中可包含梯形校正算法。
在一些实施例中,投影设备通过执行自动入幕,算法服务可设置投影设备与幕布之间的8个位置坐标;然后再次通过自动对焦算法,实现投影设备与幕布的视距调节;最终,将校正结果反馈至校正服务,并控制用户界面显示校正结果,如图19所示。
在一些实施例中,投影设备通过自动对焦算法,利用其配置的激光测距可获得当前物距,以计算初始焦距、及搜索范围;然后投影设备驱动相机(Camera)进行拍照,并利用对应算法进行清晰度评价。
投影设备在上述搜索范围内,基于搜索算法查找可能的最佳焦距,然后重复上述拍照、清晰度评价步骤,最终通过清晰度对比找到最优焦距,完成自动对焦。
例如,步骤2001,投影设备启动;步骤2002,用户移动设备,投影设备自动完成校正后重新对焦;步骤2003,控制器将检测自动对焦功能是否开启;当自动对焦功能未开启时,控制器将结束自动对焦业务;步骤2004,当自动对焦功能开启时,投影设备将通过中间件获取飞行时间(TOF)传感器的检测距离进行计算;
步骤2005,控制器根据获取的距离查询预设的映射表,以获取投影设备的大致焦距;步骤2006,中间件将获取焦距设置到投影设备的光机;
步骤2007,光机以上述焦距发出激光后,摄像头将执行拍照指令;步骤2008,控制器根据获取的拍照结果、评价函数,判断投影设备对焦是否完成;如果判定结果符合预设完成条件,则控制自动对焦流程结束;步骤2009,如果判定结果不符合预设完成条件,中间件将微调投影设备光机的焦距参数,例如可以预设步长逐渐微调焦距,并将调整的焦距参数再次设置到光机;从而实现反复拍照、清晰度评价步骤,最终通过清晰度对比找到最优焦距完成自动对焦,如图20所示。
在一些实施例中,本申请提供的投影设备可通过梯形校正算法实现显示校正功能。
首先基于标定算法,可获取两相机之间、相机与光机之间的两组外参,即旋转、平移矩阵;然后通过投影设备的光机播放特定棋盘格图卡,并计算投影棋盘格角点深度值,例如通过双目相机之间的平移关系、及相似三角形原理求解xyz坐标值;之后再基于所述xyz拟合出投影面、并求得其与相机坐标系的旋转关系与平移关系,具体可包括俯仰关系(Pitch)和偏航关系(Yaw)。
通过投影设备配置的陀螺仪可得到卷(Roll)参数值,以组合出完整旋转矩阵,最终计算求得世界坐标系下投影面到光机坐标系的外参。
结合上述步骤中计算获取的相机与光机的R、T值,可以得出投影面世界坐标系与光机坐标系的转换关系;结合光机内参,可以组成投影面的点到光机图卡点的单应性矩阵。
最终在投影面选择矩形,利用单应性反求光机图卡对应的坐标,该坐标就是校正坐标,将其设置到光机,即可实现梯形校正。
例如,流程如图21所示:
步骤2101,投影设备控制器获取照片像素点对应点的深度值,或投影点在相机坐标系下的坐标;
步骤2102,通过深度值,中间件获取光机坐标系与相机坐标系关系;
然后步骤2103,控制器计算得到投影点在光机坐标系下的坐标值;步骤2104,基于坐标值拟合平面获取投影面与光机的夹角;
然后步骤2105,根据夹角关系获取投影点在投影面的世界坐标系中的对应坐标;
步骤2106,根据图卡在光机坐标系下的坐标与投影平面投影面对应点的坐标,可计算得到单应性矩阵。
步骤2107,控制器基于上述已获取数据判定障碍物是否存在;
步骤2108,障碍物存在时,在世界坐标系下的投影面上任取矩形坐标,根据单应性关系计算出光机要投射的区域;
步骤2109,障碍物不存在时,控制器例如可获取二维码特征点;
步骤2110,获取二维码在预制图卡的坐标;
步骤2111,获取相机照片与图纸图卡单应性关系;
步骤2112,将获取的障碍物坐标转换到图卡中,即可获取障碍物遮挡图卡坐标。
步骤2113,依据障碍物图卡遮挡区域在光机坐标系下坐标,通过单应性矩阵转换得到投影面的遮挡区域坐标;
步骤2114,在世界坐标系下投影面上任取矩形坐标,同时避开障碍物,根据单应性关系求出光机要投射的区域。
可以理解,避障算法在梯形校正算法流程选择矩形步骤时,利用算法库(OpenCV)完成异物轮廓提取,选择矩形时避开该障碍物,以实现投影避障功能。
在一些实施例中,如图22所示:
步骤2201,中间件获取相机拍到的二维码图卡;
并步骤2202,识别二维码特征点,获取在相机坐标系下的坐标;
步骤2203,控制器进一步获取预置图卡在光机坐标系下的坐标;
步骤2204,求解相机平面与光机平面的单应性关系;
步骤2205,控制器基于上述单应性关系,识别相机拍到的幕布四个顶点坐标;
步骤2206,根据单应性矩阵获取投影到幕布光机要投射图卡的范围。
可以理解,在一些实施例中,入幕算法基于算法库(OpenCV),可识别最大黑色闭合矩形轮廓并提取,判断是否为16:9尺寸;投影特定图卡并使用相机拍摄照片,提取照片中多个角点用于计算投影面(幕布)与光机播放图卡的单应性,将幕布四顶点通过单应性转换至光机像素坐标系,将光机图卡转换至幕布四顶点即可完成计算比对。
长焦微投的投影设备具有灵活移动的特点,每次位移后投影画面可能会出现失真,另外如投影面存在异物遮挡、或投影画面移出幕布等异常时,本申请提供的投影设备、以及基于几何校正的显示控制方法,可针对上述问题自动完成校正,包括实现自动梯形校正、自动入幕、自动避障、自动对焦、防射眼等功能。
在一些实施例中,通过在投影设备中配置自动调焦功能,根据两段式调焦,实现快速调焦,通过第一段粗调焦确定第一调焦位置,第二段精细调焦过程根据图像清晰度计算第二调焦位置,完成自动调焦,在不增加调焦耗时的前提下,避免陷入局部调焦导致调焦不清晰的问题。
在上述实施例中,若调焦指令为第一段调焦指令,控制器根据距离传感器检测的投影面与光机之间的间隔距离以及预设调焦距离对照表,在所述调焦距离对照表中查询目标调焦位置,将目标调焦位置与当前位置的差值确定为第一调焦量,控制所述驱动马达按照所述第一调焦量移动所述光学组件至第一调焦位置,完成第一段的粗调焦过程。
由于在第二段精细调焦过程中,存在因为投影环境复杂,导致自动调焦不精准的问题,在一些实施例中,通过在投影设备中配置ROI(Region of Interest)特征区域选取功能,减少因投影环境复杂,导致投影画面不清晰的现象。
本申请的一些实施例中提供一种ROI特征区域选取方法。所述ROI特征区域选取可以应用于投影设备,并且为了满足该ROI特征区域选取方法的实施,所述投影设备可以包括光机200、镜头300、控制器500、距离传感器600以及相机700。其中,如图24和图29所示,控制器500可以用于执行ROI特征区域选取方法,包括以下步骤:
S801:获取调焦指令。
控制器500获取调焦指令后,自动开启自动调焦功能,其中自动调焦功能包括两段式调焦;在一些实施例中可以通过第一段粗调焦确定第一调焦位置,根据控制器500发送的第一移动指令控制驱动马达320带动光学组件310移动至第一调焦位置;控制器500发送第二移动指令控制驱动马达320带动光学组件310到达清晰度最高的位置,完成第二段精细调焦。其中,第一移动指令是控制器500控制驱动马达320带动光学组件310移动至第一调焦位置,第二移动指令是控制器500控制驱动马达320带动光学组件310到达清晰度最高的位置。
需要说明的是,投影设备可以基于上述实施例中任一种输入方式获取调焦指令,与上述实施例结合时,该调焦指令包括第一调焦指令或第二调焦指令。本申请不对控制器500获取调焦指令的方式或途径进行限定。
S802:响应于所述调焦指令,控制所述驱动马达320将所述光学组件310移动至所述间隔距离关联的第一调焦位置。
在第一段粗调焦过程中,按照上述实施例方式获取间隔距离。
在获取间隔距离之后,控制器500结合预设的调焦曲线,例如,上述实施例中的间隔距离和预设调焦距离对照表,确定镜头300需要到达的位置即第一调焦位置,例如,上述实施例中的目标调焦位置;通过将第一调焦位置与当前位置比对,得到两者之间的差值距离,控制器500根据该差值距离查询预设的映射表,获取该差值距离对应的驱动马达320的旋转步数。
其中,预设调焦曲线可以是在投影设备出厂前建立的。即以变焦参数为横坐标,调焦参数为纵坐标建立坐标系,镜头捕获待处理图像过程中的物距为定值,采用单调爬山算法,在所述坐标系确定预设变焦参数关键点,以及各所述预设变焦参数关键点对应的满足预设清晰条件的调焦参数点;根据各所述预设变焦参数关键点对应的满足预设清晰条件的调焦参数点的生成调焦曲线,该调焦曲线为预设调焦曲线。
预设映射表是通过解析调焦曲线以及投影画面的投影距离制作得到,其中映射表中包含投影画面,投影距离,聚焦组移动量以及聚焦组旋转步数。其中预设调焦曲线和预设映射表存储在控制器500中。
其中,本申请一些实施例中预设映射表如表1所示。
表1
(表1为预设映射表,原文以图示形式给出,包含投影画面、投影距离、聚焦组移动量以及聚焦组旋转步数的对应关系。)
根据投影画面与投影设备的间隔距离,查询预设的映射表可以得到需要的移动量以及旋转步数,从而确定精细调焦的第一调焦位置。
采用TOF原理测量投影面和投影设备之间的距离过程中,会因为每个TOF传感器或者相机的灵敏度范围不同、以及各自误差不同,导致在同一个位置测量得到的间隔距离有波动。
基于此,在上述方法得到第一调焦位置过程中,投影设备还通过采集多次飞行时间距离,求取均值,将均值代入系统配置的飞行时间测距预设线性回归模型计算得到投影设备与投影面之间的理论距离值;因为采集次数越多,均值越接近真实值,可以降低飞行时间测距过程中由于设备的不同所带来的影响。
在一些实施例中,控制器500每隔预设时间获取多个飞行数据,从而计算得到多个间隔距离;并将多个间隔距离取均值,得到平均距离;将所述平均距离输入飞行时间测距预设线性回归模型,得到投影设备与投影面之间的理论距离值,将理论距离值结合预设的调焦曲线,确定镜头300需要到达的位置即第一调焦位置。
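多次采样取均值、再代入线性回归模型求理论距离的过程可示意如下(回归系数a、b为示例假设,实际由系统配置的预设模型给定):

```python
def theoretical_distance(samples, a=1.0, b=0.0):
    # 多次飞行时间测距取均值,再代入预设线性回归模型 y = a*x + b,
    # 得到投影设备与投影面之间的理论距离值
    mean = sum(samples) / len(samples)
    return a * mean + b
```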
当确定驱动马达320的旋转步数,控制器500向所述驱动马达320发送第一移动指令,即控制所述驱动马达320根据所述旋转步数将所述光学组件310移动至所述第一调焦位置。
在一些实施例中,为了满足计算调焦量的需求,投影设备还可以配置多个功能单元,如策略选择单元、电机控制单元、图像采集单元(camera)、清晰度评价单元等,各功能单元可以相互独立工作,也可以共同配合完成预定功能。这些单元可以配置为与投影设备的其他组件为一体。例如,策略选择单元确定基于间隔距离计算得到的第一调焦位置后,通知电机控制单元控制驱动马达320一次性带动光学组件310移动至第一调焦位置,无需停下等待拍照和清晰度计算。并且,对于单次调焦过程,控制器500可以在驱动马达320转动到特定状态时,向图像采集单元发送位置信息,并向图像清晰度评价单元写入位置信息,以实现三者的同步。
在一些实施例中,清晰度评价单元配置有多种清晰度评价函数来进行清晰度评价。清晰度评函数可以为Brenner,Tenengrad,Laplacian,SMD,Variance,Energy等等。
S803:获取所述光学组件310移动至所述第一调焦位置后的投影内容图像。
在控制器500发送第一移动指令的同时,相机700以特定频率进行拍照,得到所述光学组件310移动至所述第一调焦位置过程中拍摄的投影内容图像,并将所述投影内容图像存储至调焦位置存储容器。
当驱动马达320将所述光学组件310移动至所述第一调焦位置后,控制器500读取调焦位置存储容器中的投影内容图像,获取第一调焦位置的当前投影内容图像。
相机700拍照后,如果接收到系统读取照片的指令,则给出最近一次的照片路径,否则做丢弃处理。清晰度评价单元则开始轮询存储调焦位置的存储容器是否为空,不为空则读取位置信息,并据此读取相应的照片计算图片清晰度,之后将结果存入清晰度存储容器中待用;清晰度评价可以基于清晰度评价单元中预设的频域函数、灰度函数、信息熵等多种方式实现。
S804:在所述投影内容图像中识别ROI特征区域。
控制器500获取第一调焦位置的当前投影内容图像后,为了消除非投影区域的干扰,以及从多投影面中选取投影区域,自动开启ROI特征区域选取功能,通过ROI特征区域选取功能识别投影内容图像中的ROI特征区域。其中ROI特征区域选取功能中配置的算法,包括但不限于:自适应二值化阈值算法、膨胀腐蚀算法、轮廓检测算法、特征点匹配算法以及恒定尺度特征变换图像处理算法等。
在一些实施例中,控制器500可以根据预置的轮廓检测算法计算投影内容图像中的ROI特征区域,所述ROI特征区域为所述投影内容图像中面积最大或周长比值最小的轮廓区域,所述轮廓区域为根据灰度值在所述投影内容图像中划定的区域。
S8141:将所述投影内容图像转化为灰度图。
控制器500获取投影内容图像后将该图像转化为灰度图。
S8142:基于灰度值在所述灰度图中识别至少一个轮廓区域,以及计算所述轮廓区域的面积。
控制器500根据预置的自适应二值化阈值算法、膨胀腐蚀算法以及轮廓检测算法计算灰度图中的ROI特征区域。
首先,控制器500求取整幅灰度图整体灰度值的平均值M,根据平均值M得到灰度图中像素点的灰度值阈值区间【M-s,M+s】;其中,s为经验值,在一些实施例中s取值为50;控制器500分别以灰度值阈值区间中的每一个点作为分割像素点,计算所述灰度值阈值区间中每一个像素点的灰度方差。
控制器500运用最大类间方差算法,输入所述像素点的灰度方差,求取最大类间方差,将所述最大类间方差作为该灰度图二值化分割的最优阈值的参考阈值T_total,即第一分割像素点。
控制器500根据预设分割值将所述灰度图分割成多个图像块,其中分割图像块大小为r*r。
控制器500以所述图像块当前的像素点(x,y)为中心领域计算像素点集合的均值m(x,y)和标准差σ(x,y),其中计算如公式1和公式2所示:
m(x,y)=(1/r²)·Σ_{(i,j)∈N(x,y)} f(i,j)      (1);
σ(x,y)=√[(1/r²)·Σ_{(i,j)∈N(x,y)} (f(i,j)-m(x,y))²]      (2);
其中,f(i,j)表示以像素点(x,y)为中心的r×r邻域N(x,y)内像素点(i,j)的灰度值。
控制器500以m(x,y)和σ(x,y)作为入参数据,计算当前像素点(x,y)的个体阈值f(x,y)_T,其中个体阈值f(x,y)_T的计算公式如下式所示:
f(x,y)_T=m(x,y)×[1+k(R×σ(x,y)-1)]      (3);
其中,k表示修正参数,其数值范围为(0,1),R表示方差的动态变化参数值。控制器500得到每一个图像块的个体阈值f(x,y)_T。
控制器500将得到的所述个体阈值f(x,y)_T作为第二分割像素点;控制器500分别根据第一分割像素点以及第二分割像素点对所述图像块的像素点进行分割,得到分割后的第一图像块以及第二图像块。
控制器500分别计算第一图像块以及第二图像块中像素点灰度值的方差值,得到第一方差值以及第二方差值;控制器500分别计算第一图像块以及第二图像块中像素点灰度值的平均值;将所述方差值、平均值结合差异性算法得到所述像素点的第一阈值以及第二阈值,将所述第一阈值以及第二阈值中的最大值对应的灰度值作为最优阈值,即作为当前图像块的最优阈值f(x,y)_TT。
控制器500根据最优阈值f(x,y)_TT以及当前图像块的颜色值识别当前图像块中的目标部分和背景部分。如果所述图像块中像素点的颜色值大于最优阈值,则将所述图像块中像素点划分为目标部分;如果所述图像块中像素点的颜色值小于或等于最优阈值,则将所述图像块中像素点划分为背景部分;在一些实施例中,当前像素点的颜色值大于最优阈值时,置为1,反之,则置为0。
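按式(3)计算图像块内单个像素的个体阈值并据此二值化,可示意如下(直接沿用文中给出的公式形式;k、R取值为示例假设):

```python
def pixel_threshold(m, sigma, k=0.5, R=0.03):
    # 式(3): f(x,y)_T = m(x,y)×[1 + k(R×σ(x,y) − 1)]
    return m * (1 + k * (R * sigma - 1))

def binarize(value, threshold):
    # 颜色值大于阈值置为1(目标部分),否则置为0(背景部分)
    return 1 if value > threshold else 0
```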
通过自适应二值化阈值算法对灰度图进行前景以及背景处理,可以根据所述灰度图分割的每一个图像块进行自适应,即每一个被分割的图像块基于其自身像素点的灰度值使用最符合的分割像素点,避免由于某些图像块中像素点的灰度值,由于普遍低于平均值而被处理成背景部分,可以消除非投影区域的干扰。
控制器500将已经区分目标部分和背景部分的灰度图使用膨胀与腐蚀算法进行去噪。
控制器500对所述灰度图先进行膨胀算法:即依次读取图像中的像素点(x,y),与3×3的结构元素(卷积核)进行卷积计算,当结果中的数值超出阈值个数时,该像素点置为1,反之置为0。
其中,结构元素可以是3×3,5×5等不同尺寸比例的结构图;在一些实施例中使用的是3×3的结构元素,数值用0或1表示,具体数值为{[0,1,0],[1,1,1],[0,1,0]},即半径为2的4连通数值。
控制器500使用上述卷积核依次遍历图像中的像素点,若卷积核中有数值为1时,即将图像中对应的卷积核的原点位置的像素点置为1,否则置为0。
控制器500将膨胀处理后的灰度图使用腐蚀算法进行去噪声。控制器500依次读取所述灰度图中的像素点(x,y)与3×3的结构元素(卷积核)进行卷积计算当结果中的像素点均为1时,则该像素点为1,反之,则该像素点为0。控制器500去除图像中的噪声污点,将所述像素点置为背景像素,得到去噪声之后的灰度图。
通过膨胀算法可以有效将纤细的图像边缘部分完成闭合,最终得到膨胀处理后的灰度图。将所述灰度图通过膨胀与腐蚀算法处理后,去除了图像中的小物体或噪声、在纤细边缘点处可以分离物体,在平滑较大物体的边界的同时并不明显的改变其面积。
控制器500将去噪声之后的灰度图使用轮廓检测算法,得到第一调焦位置的投影区域内的多闭合轮廓区域。
首先,控制器500采用八邻域跟踪的图像处理算法,遍历灰度图中第一个由0变为1的像素点,将该像素点作为外轮廓的起始点或边界点。再以该像素点为起始点,通过逆时针的方法,逐一查找轮廓点,直至找到轮廓的起始点或者孤立点,轮廓查找结束;每得到一个轮廓点,将边缘标记值加一。
其次,控制器500依次处理灰度图中的每一个像素点,最终计算得出所有轮廓及对应层级关系,对于每个轮廓,其表征点包括:轮廓信息ndarray,层级关系Hierarchy;其中采用[Next,Previous,Child,Parent]四个参数表示后一个轮廓、前一个轮廓、子轮廓、父轮廓的索引编号,如果没有对应项,则该值为负数(在一些实施例中采用-1表示)。最终得到各级轮廓区域的等级。
其中,内轮廓边界点主要是查找由1变为0的像素点,具体采用8邻域的查找方式,每当找到一个轮廓边界点,将边缘标记值加一;如果找到轮廓起始点或者孤立点,计为轮廓点查找结束。
下面进行举例说明,图25中有5条轮廓,其中contour1a和contour1b为内外轮廓关系,即外层轮廓和里层轮廓。其中,轮廓contour0、contour1a是最外层轮廓,属于同一等级关系,即为0级;轮廓contour1b是contour1a的子轮廓,即为1级;轮廓contour2和轮廓contour3是轮廓contour1b的子轮廓,即轮廓contour2和轮廓contour3处于同一等级,即为2级;所以,对于轮廓contour0,其Hierarchy参数信息表征为[1,-1,-1,-1]。
S8143:获取所述面积最大或周长比值最小的轮廓区域的边界坐标。
S8144:根据所述边界坐标中的坐标极值划定所述ROI特征区域。
控制器500基于所求得的轮廓等级关系,获取等级为0的最外层轮廓列表ContoursList,通过每个轮廓的多点坐标数值,计算求得各个轮廓的区域面积。
控制器500基于面积大小进行轮廓排序,筛选出最大区域面积Marea以及第二大区域面积Narea;将最大区域面积Marea以及第二大区域面积Narea与预设面积阈值进行比对;该预设面积阈值不作具体限定,在一些实施例中预设面积阈值设为所述灰度图面积的1/2。若最大区域面积大于预设面积阈值,即取最大区域作为最优区域;若最大区域面积Marea小于预设面积阈值,控制器500计算最大区域面积与第二大区域面积的比值,如果所述比值在预设区间内,控制器500分别获取最大区域面积Marea以及第二大区域面积Narea的质心坐标,求取所述两个区域的宽高(w,h),分别求取两个区域的w/h的周长比值,选择比值较小的区域作为最优区域即当前投影区域。如果所述比值不在预设区间内,控制器500计算所述最大区域的周长比值,如果所述周长比值大于预设周长比阈值,则将最大区域作为最优区域即当前投影区域。控制器500将获取的当前投影区域的多点坐标,其中,四点坐标或八点坐标(含边界中点线),作为ROI特征区域进行输出。
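上述最优区域(当前投影区域)的筛选逻辑可示意如下。各阈值与区间取值均为示例假设;原文未明确的分支此处默认取最大区域:

```python
def select_best_region(m_region, n_region, area_threshold,
                       ratio_interval=(0.5, 2.0), perimeter_threshold=1.0):
    """m_region/n_region 为 (区域面积, w/h周长比值),分别对应最大与第二大区域;
    返回 'M' 或 'N',表示被选为最优区域(当前投影区域)的一方。"""
    m_area, m_ratio = m_region
    n_area, n_ratio = n_region
    if m_area > area_threshold:
        return "M"                                 # 最大区域面积超过预设面积阈值,直接选最大区域
    r = m_area / n_area
    if ratio_interval[0] <= r <= ratio_interval[1]:
        return "M" if m_ratio <= n_ratio else "N"  # 取 w/h 周长比值较小者
    if m_ratio > perimeter_threshold:
        return "M"                                 # 比值不在预设区间且周长比值超阈值,选最大区域
    return "M"                                     # 原文未明确该分支,默认仍取最大区域(假设)
```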
在一些实施例中,控制器500可以根据预置的梯形校正算法计算投影内容图像中的ROI特征区域。
S8241:控制器500执行梯形校正操作。
控制器500首先基于标定算法,获取双目相机中左相机和右相机之间或相机700与光机200之间的两组外参,即旋转、平移矩阵;然后通过投影设备的光机200播放标准图卡(标准图卡为预先设计好的图卡,通过光机200进行投影显示),并计算投影图卡格角点深度值,例如通过双目相机之间的平移关系、及相似三角形原理求解(x,y,z)坐标值;之后再基于所述(x,y,z)拟合出投影面、并求得其与相机700坐标系的旋转关系与平移关系,具体可包括俯仰关系(Pitch)和偏航关系(Yaw)。
控制器500通过投影设备配置的陀螺仪可得到卷(Roll)参数值,以组合出完整旋转矩阵,最终计算求得整体坐标系下投影面到光机200坐标系的外参。
结合光机200内参,可以组成投影面的点到标准图卡点的单应性矩阵。
控制器500计算得到投影点在光机200坐标系下的坐标值,并基于坐标值拟合平面获取投影面与光机200的夹角,然后根据夹角关系获取投影点在投影面的世界坐标系中的对应坐标;根据标准图卡在光机200坐标系下的坐标与投影平面投影面对应点的坐标,可计算得到单应性矩阵。
最终,控制器500在投影面选择矩形,利用单应性反求预设图卡对应的坐标即校正坐标,将其设置到光机200,结束梯形校正操作。
S8242:控制器500将梯形校正后的投影坐标转换成相机坐标,将所述相机坐标作为ROI特征区域。
控制器500通过单应性矩阵将光机200坐标系下的投影坐标转换到相机坐标系下,具体为四个角点坐标,或四个角点加上四个边缘中点,得到八个点坐标;将该坐标返回,作为ROI特征区域。
例如,投影设备的控制器500获取投影内容图像中像素点对应点的深度值,或投影点在相机坐标系下的坐标;通过深度值,中间件获取光机200坐标系与相机700坐标系关系;其中中间件在操作***、网络和数据库之上,中间件为应用***和***软件之间的一类软件,它使用***软件所提供的基础服务(功能),中间件用于衔接网络上应用***的各个部分或不同的应用。
在一些实施例中,控制器500可以根据预置的特征点匹配算法计算投影内容图像中的ROI特征区域。
S8341:将所述投影内容图像转化为灰度图。
控制器500获取投影内容图像后将该图像转化为灰度图。
S8342:获取标准图卡,以及通过光机200将其进行投影显示,得到标准图卡的投影内容图像。
中间件获取相机700拍到的标准图卡,并识别标准图卡中黑白图形特征点,获取在相机700坐标系下的坐标;控制器500进一步获取标准图卡在光机200坐标系下的坐标,以求解相机700平面与光机200平面的单应性关系;控制器500基于上述单应性关系,识别相机700拍到的幕布四个顶点坐标,根据单应性矩阵获取投影到幕布光机200要投射标准图卡的范围。控制器500将标准图卡的投影内容图像转化为灰度图,遍历标准图卡中的像素点,得到标准图卡中的像素点的坐标点信息。
S8343:将所述标准图卡的投影内容图像与所述灰度图进行比对,分别得到所述标准图卡的投影内容图像与所述灰度图的关键点描述信息。
S8344:根据上述关键点描述信息确定所述标准图卡与所述灰度图的中相同的特征关键点。
控制器500利用SIFT(Scale Invariant Feature Transform,恒定尺度特征变换)图像处理算法,分别获取两个图像的描述子des1,des2,其中描述子为描述图像中特征中关键点的描述信息。控制器500将所述描述子进行匹配,得到匹配结果,将描述子中前两个描述信息作为最优匹配值,然后输出,得到匹配结果即得到特征关键点。
S8345:根据特征关键点得到ROI特征区域。
在一些实施例中为了提升匹配速度和精确度,结合特征关键点坐标信息,使用knn(K Nearest neighbors,快速最近邻搜索算法)算法,来获取matches匹配列表,在knn算法中的最邻近匹配点数值参数值设为2,即每个匹配可以返回两个最邻近的匹配{m,n},取前两个匹配信息;在求得的匹配结果变量中有{queryIdx,trainIdx,distance}相关参数信息,其中queryIdx表示为预置图像的描述信息,trainIdx表示为实际拍照的图像描述信息,distance表征两个距离,距离越小,两者越匹配;根据特征点距离远近分类,遍历匹配结果列表,对于每个匹配结果项中的两个匹配结果{m,n}进行对比,若m.distance<k*n.distance,即取匹配结果m作为当前最优匹配结果项,否则不计入匹配结果列表。其中,k为预设系数,为增加有效匹配率,在一些实施例中k取值为0.75。
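文中的比值检验(m.distance &lt; k·n.distance,k取0.75)可示意如下:

```python
def ratio_test(matches, k=0.75):
    # matches 中每项为 (m_distance, n_distance),即两个最邻近匹配的距离;
    # 仅当 m.distance < k * n.distance 时保留 m 作为当前最优匹配
    return [m for m, n in matches if m < k * n]
```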
控制器500根据快速最近邻搜索算法求得更新后的匹配列表,结合特征关键点位置坐标,输出关键特征下的多点坐标,将该特征关键点坐标进行闭合,求取含该特征关键点将获取的图像坐标点输出,作为ROI特征区域。
由于投影环境的复杂,相机700拍摄的当前投影内容图像中可能包含至少一个投影物体,所以在将所述投影内容图像转化为灰度图步骤中,还需要判断当前投影物体个数,如图26所示。
S8041:检测所述投影内容图像中的目标物体数量,如果目标物体数量等于数量判断阈值,执行步骤S8141、S8241或S8341,否则执行步骤S8043。
控制器500获取当前投影内容图中当前作用范围内的物体目标数N,当N等于1时,说明该投影区域内仅有1个投影反射物。
S8141:如果目标物体数量等于数量判断阈值,使用轮廓检测算法,获取ROI特征区域。
S8241,如果目标物体数量等于数量判断阈值,使用梯形校正算法获取ROI特征区域。
S8341,如果目标物体数量等于数量判断阈值,使用特征点匹配算法获取ROI特征区域。
在一些实施例中,将数量判断阈值设为1,控制器500执行S8141、S8241以及S8341步骤中的任意一个步骤获取ROI特征区域。
S8043:如果目标物体数量大于数量判断阈值,根据目标物体数量将所述灰度图分为多个特征区域。
控制器500根据相同的关键点得到所述灰度图中的多个特征关键点,以及根据所述多个特征关键点确定所述灰度图中的多个特征区域。其中,获取关键点获取步骤与特征点匹配算法中步骤S8341至S8344一致,此处不再赘述。
S8044:根据预设规则对所述特征区域进行优先级排序,将优先级最高的特征区域划定为ROI特征区域。
控制器500将多个特征区域的特征关键点进行解码,输出深度图像,得到深度值以及深度信息参数。基于TOF返回的深度检测物体数量N(1~N)和对应深度信息参数(D1~DN),将深度图进行分层图Dp计算,得到每个特征区域的分层信息;对于每个特征区域的分层信息,进行去噪处理,减少因分层过程中出现的异常噪声点,再将分层区域闭合,根据深度值计算所述特征关键点的间隔距离以及每个特征区域的面积。控制器500根据所述目标物体数量以及深度参数对每个特征区域进行计算,得到每个特征区域的目标物体图像占据的像素点数量;分别将每个深度分层Dp的图像的像素个数进行累加,得出每个分区的区域像素点数量。其中,预设规则为根据所述目标物体图像占据的像素点数量、所述特征区域的面积大小以及所述特征区域的周长比值对所述特征区域进行优先级排序,将优先级最高的特征区域作为ROI特征区域。
控制器500基于面积大小进行轮廓排序,筛选出最大区域面积Marea以及第二大区域面积Narea;将最大区域面积Marea以及第二大区域面积Narea与预设面积阈值进行比对;该预设面积阈值不作具体限定,在一些实施例中预设面积阈值设为所述灰度图面积的1/2。若最大区域面积大于预设面积阈值,即取最大区域作为最优区域;若最大区域面积Marea小于预设面积阈值,控制器500计算最大区域面积与第二大区域面积的比值,如果所述比值在预设区间内,控制器500分别获取最大区域Marea以及第二大区域Narea的质心坐标,求取所述两个区域的宽高(w,h),分别求取两个区域的w/h的周长比值,选择比值较小的区域作为最优区域即当前投影区域。如果所述比值不在预设区间内,控制器500计算所述最大区域的周长比值,如果所述周长比值大于预设周长比阈值,则将最大区域作为最优区域即当前投影区域。在一些实施例中,控制器500将包含像素点数量最多,且周长比值最小的区域作为最优区域。控制器500将获取的当前投影区域的多点坐标,其中,四点坐标或八点坐标(含边界中点线),作为ROI特征区域进行输出。
S805:计算所述ROI特征区域的图像清晰度,以及根据所述图像清晰度计算第二调焦位置。
S806:控制所述驱动马达按照所述第二调焦位置调整所述光学组件的焦距。
如图27所示,控制器500发送第一移动指令控制驱动马达320从当前位置(调节起点)移动至第一调焦位置(调节终点),控制器500发送第一移动指令的同时,相机700以特定频率进行拍照,拍摄当前位置的投影内容,当驱动马达320控制所述光学组件310到达第一调焦位置后,停止移动,记为一次精细调焦。当控制器500获取ROI特征区域之后,计算所述ROI特征区域的图像清晰度,将ROI特征区域的图像清晰度与清晰度阈值进行比较,如果所述ROI特征区域的图像清晰度高于清晰度阈值,将所述第一调焦位置记为清晰度最佳位置,结束调焦。
如果所述ROI特征区域的图像清晰度低于清晰度阈值,控制器500获取相机700在所述光学组件310移动过程中拍摄的多个投影内容图像,并计算所有投影内容图像的清晰度。将所有投影内容图像的清晰度进行排序,得到清晰度最高值。控制器500将清晰度最高值与清晰度阈值进行比较,如果所述清晰度最高值高于清晰度阈值,确定清晰度最高值对应投影内容图像的拍摄位置,将所述拍摄位置记为清晰度最佳位置。控制器500向所述驱动马达320发送第二移动指令,控制所述驱动马达320将光学组件310移动至目标位置,所述目标位置为所述清晰度最高值的投影内容图像的拍摄位置,完成调焦。
基于上述自动调焦方法,本申请的一些实施例还提供一种投影设备,包括:光机200、镜头300、距离传感器600、相机700以及控制器500,如图28所示。其中,所述光机200被配置为投射播放内容至投影面;所述镜头300包括光学组件310和驱动马达320;所述驱动马达320连接所述光学组件310,以调整所述光学组件310的焦距;所述相机700被配置为拍摄投影内容图像;距离传感器600,被配置为检测投影设备与投影面之间的间隔距离;控制器500,被配置为:
获取调焦指令;
响应于所述调焦指令,控制所述驱动马达将所述光学组件移动至所述间隔距离关联的第一调焦位置;
获取所述光学组件移动至所述第一调焦位置后的投影内容图像;
在所述投影内容图像中识别ROI特征区域,所述ROI特征区域为所述投影内容图像中面积最大或周长比值最小的轮廓区域,所述轮廓区域为根据灰度值在所述投影内容图像中划定的区域;
计算所述ROI特征区域的图像清晰度,以及根据所述图像清晰度计算第二调焦位置;
控制所述驱动马达按照所述第二调焦位置调整所述光学组件的焦距。
本申请一些实施例中提供的投影设备及ROI特征区域选取方法可以在接收到调焦指令后,基于飞行时间测距原理检测投影设备与投影面之间的间隔距离。基于预设调焦曲线与间隔距离计算得到第一调焦位置。通过控制驱动马达将光学组件移动至第一调焦位置,获取光学组件移动至所述第一调焦位置后的投影内容图像;在所述投影内容图像中识别ROI特征区域,计算所述ROI特征区域的图像清晰度,以及根据所述图像清晰度计算第二调焦位置;控制所述驱动马达按照所述第二调焦位置调整所述光学组件的焦距。所述方法通过在粗调焦阶段,获取当前调焦位置的投影内容图像,通过对投影内容图像ROI特征区域的选取,减少因投影环境复杂导致投影画面不清晰的现象,优化投影效果,提升用户体验。
本申请的一些实施例中还提供一种自动对焦方法,所述自动对焦方法可以兼顾上述投影设备对焦方法的优势,根据多段式调焦,实现快速调焦。所述自动对焦方法可以应用于投影设备,并且为了满足该自动对焦方法的实施,所述投影设备可以包括光机200、镜头300、传感器600、相机700以及控制器500。其中,如图30所示,控制器可以用于执行自动对焦方法的程序步骤,包括以下步骤:
获取自动对焦指令。其中,自动对焦指令可以为上述第一对焦指令。
在第一段粗调焦过程中,按照上述实施方式获取投影面与相机700或传感器600之间的距离,即第一距离,例如,上述实施例中的间隔距离。
在获取第一距离之后,控制器500结合预设的对焦曲线,例如上述实施例中的预设焦距对照表,确定镜头300需要到达的位置即第一位置,例如上述实施例中的目标调焦位置;通过将第一位置与当前位置进行比对,得到两者之间的距离差值,控制器500根据该距离差值查询预设的映射表(具体见上述表1),获取该距离差值对应的驱动马达320的旋转步数,即第一旋转步数。
查询预设的映射表可以得到需要的移动量以及旋转步数从而确定精细调焦的精调区间。
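对焦曲线与映射表的查询过程可以用如下示意代码表达;其中FOCUS_CURVE、MOVE_TO_STEPS中的采样点数值均为示例性假设(实际数值以上述表1及出厂标定为准):

```python
import bisect

# 假设的对焦曲线采样点:投影距离(mm) -> 镜头目标位置(mm)
FOCUS_CURVE = [(800, 0.00), (1200, 0.40), (1600, 0.75), (2000, 1.05)]
# 假设的映射表:镜头移动量(mm) -> 马达旋转步数(对应上文表1,数值为示例)
MOVE_TO_STEPS = [(0.05, 90), (0.10, 181), (0.134, 242), (0.20, 361)]

def lookup_first_position(distance_mm):
    """在对焦曲线相邻采样点之间线性插值,得到镜头需到达的第一位置。"""
    xs = [d for d, _ in FOCUS_CURVE]
    i = bisect.bisect_left(xs, distance_mm)
    i = max(1, min(i, len(xs) - 1))
    (x0, y0), (x1, y1) = FOCUS_CURVE[i - 1], FOCUS_CURVE[i]
    return y0 + (y1 - y0) * (distance_mm - x0) / (x1 - x0)

def lookup_steps(move_mm):
    """查映射表,取与移动量绝对值最接近的一档,返回对应旋转步数。"""
    return min(MOVE_TO_STEPS, key=lambda p: abs(p[0] - abs(move_mm)))[1]
```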
当确定驱动马达320的旋转步数即第一旋转步数之后,控制器500向所述驱动马达发送第三移动指令,即控制所述驱动马达320根据所述第一旋转步数将所述光学组件310移动至所述第一位置。
在控制器500发送第三移动指令的同时,相机700以特定频率进行拍照,得到所述光学组件310移动至所述第一位置过程中拍摄的投影内容图像,并将所述投影内容图像存储至对焦位置存储容器。
当驱动马达320将所述光学组件310移动至所述第一位置后,控制器500读取对焦位置存储容器中的投影内容图像。通过系统中清晰度评价单元设置的清晰度评价函数计算所有投影内容图像的清晰度值,并将计算得到的所有清晰度值与预设的第一清晰度进行比较,筛选得到高于第一清晰度的投影内容图像,并确定拍摄该投影内容图像的拍摄位置,该拍摄位置为第二位置。
将第一位置确定为精调区间的调节起点,将第二位置确定为精调区间的调节终点,从而得到精调区间,完成第一段粗调焦,如图31所示。
清晰度评价单元中出厂前预先设置第一清晰度以及第二清晰度,其中第二清晰度值高于第一清晰度值,第一清晰度用来评价第一段粗调焦步骤中获得投影内容图像的清晰度,第二清晰度用来评价第二段精细调焦步骤中获得投影内容图像的清晰度。
相机700拍照后如果接收到系统读取照片的指令,则给出最近一次的照片路径,否则做丢弃处理。清晰度评价单元则开始轮询存储对焦位置的存储容器是否为空,不为空则读取位置信息,并据此读取相应的照片计算图片清晰度,之后将结果存入清晰度存储容器中待用;清晰度评价可以基于清晰度评价单元中预设的频域函数、灰度函数、信息熵等多种方式实现。
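作为上文所述多种清晰度评价方式之一,灰度(方差)函数可以用如下示意代码实现;这仅是一个最简化的示例,实际清晰度评价单元还可以采用频域函数、信息熵等方式:

```python
def sharpness_gray_variance(gray):
    """灰度方差清晰度评价:图像越清晰,灰度分布越分散,方差越大。
    gray为二维灰度值列表(行×列),返回方差值作为清晰度指标。"""
    pixels = [p for row in gray for p in row]
    mean = sum(pixels) / len(pixels)
    return sum((p - mean) ** 2 for p in pixels) / len(pixels)
```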
在一些实施例中,用户可以自行修改第一清晰度以及第二清晰度值,但是第二清晰度必须高于第一清晰度;通过第一清晰度确定精调区间,执行精细调焦,在精调区间中寻找清晰度最高值的位置,如果第二清晰度低于第一清晰度,最终得到的清晰度最高值可能会低于第一清晰度,无法实现精细调焦。
在一些实施例中,精调区间的调节起点和调节终点可能由于驱动马达320当前的位置而适应性地互换,即第一位置可能是精调区间的调节终点,第二位置可能是精调区间的调节起点。
例如,第一位置距离驱动马达320当前位置(即原始位置)0.134mm,查表得到第一旋转步数为242步,驱动马达320根据第一旋转步数带动光学组件310向前方移动过程中,150步位置的投影内容图像清晰度高于第一清晰度,确定150步位置为第二位置,此时距离原始位置150-242步为精调区间,则第二位置为调节起点,第一位置为调节终点。
又例如,第一位置距离驱动马达320当前位置(即原始位置)-0.087mm,查表得到第一旋转步数为-157步,驱动马达320根据第一旋转步数带动光学组件310向后方移动过程中,-50步位置的投影内容图像清晰度高于第一清晰度,确定-50步位置为第二位置,此时距离原始位置-157至-50步为精调区间,则第一位置为调节起点,第二位置为调节终点。
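上述两个示例中调节起点与调节终点的确定,可以归纳为:取带符号步数较小的一端作为调节起点,较大的一端作为调节终点。以下是基于这一归纳给出的假设性示意代码(归纳规则本身即为对示例的一种解读,非本申请限定):

```python
def fine_interval(first_steps, second_steps):
    """first_steps、second_steps分别为第一位置、第二位置相对原始位置的
    带符号旋转步数;取数值较小的一端为调节起点,较大的一端为调节终点。"""
    start, end = sorted((first_steps, second_steps))
    return start, end
```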
在一些实施例中,在驱动马达320将所述光学组件310移动至所述第一位置步骤中,控制器500还被配置为判断调焦后的投影图像的清晰度是否优于前一帧投影图像的清晰度,如果是,发送同向调焦信号给驱动马达320,否则发送反向调焦信号给驱动马达320。
其中,同向调焦信号中的投影设备的驱动马达320转动方向信息与前一次发送的调焦信号中的投影设备驱动马达320转动方向信息相同,反向调焦信号中的投影设备驱动马达320转动方向信息与前一次发送的调焦信号中的投影设备驱动马达320转动方向信息相反。
采用TOF原理测量投影面和投影设备之间的距离过程中,由于每个TOF传感器或者相机的灵敏度范围不同,且每个TOF传感器或者相机的误差也不同,会导致在同一个位置测量得到的第一距离存在波动。
基于此,在上述方法得到第一位置过程中,投影设备还通过采集多次飞行时间距离,求取均值,将均值代入***配置的飞行时间测距预设线性回归模型计算得到投影设备与投影面之间的理论距离值;因为采集次数越多,均值越接近真实值,可以降低飞行时间测距过程中由于设备的不同所带来的影响。
在一些实施例中,控制器500每隔预设时间获取多个飞行数据,从而计算得到多个第一距离;并将多个第一距离取均值,得到平均距离;将所述平均距离输入飞行时间测距预设线性回归模型,得到投影设备与投影面之间的理论距离值,即第二距离;将第二距离结合预设的对焦曲线,确定镜头300需要到达的位置即第一位置。
在一些实施例中,可以人为设置30ms内测量5次飞行数据,从而得到五个第一距离并取均值,将均值代入TOF测距的线性回归模型(y=mx+b,其中x为样本点,y为真实值),得到TOF传感器或者相机在当前位置的理论距离值,从而减小由于设备的不同所带来的影响。其中,TOF测距的线性回归模型是投影设备出厂时,系统预设并保存到控制器500的磁盘分区中,该分区在程序升级以及其他操作中均保留,作为设备的属性进行管理,当用户进行拆机、维修变更TOF传感器的位置时,需要进行一次新的校正,即再执行一遍数据采集,得到新的线性回归模型。
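多次采样取均值并代入线性回归模型的计算可以用如下示意代码表达;其中标定系数m、b为示例性假设,实际系数由设备出厂标定得到:

```python
def tof_corrected_distance(samples, m=0.998, b=3.2):
    """samples为一个测量窗口(例如30ms)内多次TOF测距得到的第一距离列表;
    先取均值以降低单次测量波动,再代入出厂标定的线性回归模型 y = m*x + b,
    得到投影设备与投影面之间的理论距离值(即第二距离)。m、b为假设的标定系数。"""
    x = sum(samples) / len(samples)
    return m * x + b
```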
在一些实施例中,当驱动马达320在正向旋转后,再次进行反转时,会存在一个回程误差,此时驱动马达320根据第一旋转步数到达的位置不是目标位置,即不是第一位置,会影响后续的精细调焦过程。
所以在控制器500控制所述驱动马达320根据所述第一旋转步数将所述光学组件310移动至所述第一位置过程中,如图32所示,包括:步骤3201,投影设备获取驱动马达当前的旋转方向;步骤3202,根据驱动马达320当前的旋转方向,结合系统记录的上次旋转方向,判断旋转方向是否一致,若是,执行步骤3203,否则执行步骤3204;步骤3203,保持第一旋转步数不变;步骤3204,第一旋转步数加上回程误差,得到真实要旋转的步数,以使得驱动马达带动镜头准确地到达第一位置。
控制器500获取所述驱动马达320当前的旋转方向;将驱动马达320当前的旋转方向与驱动马达320上一次执行自动对焦的旋转方向进行比对,如果所述驱动马达320当前的旋转方向与所述驱动马达320上一次执行自动对焦的旋转方向不一致,判断本次自动调焦过程中会存在回程误差,将所述第一旋转步数加上预设回程误差得到第二旋转步数;当获得第二旋转步数之后,控制器500向所述驱动马达320发送第四移动指令,即控制所述驱动马达320根据所述第二旋转步数将所述光学组件310移动至所述第一位置。
其中,在投影设备出厂前进行回程误差的测量,通过驱动马达320的多次正转、多次反转,求得平均误差,最终得到预设回程误差,预设回程误差同样记载在控制器500磁盘分区中。
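回程误差补偿的判断逻辑可以用如下示意代码表达;其中backlash_steps(预设回程误差)的取值为示例假设,实际值在出厂前通过多次正转、反转测量得到:

```python
def apply_backlash(first_steps, current_dir, last_dir, backlash_steps=12):
    """current_dir、last_dir分别为本次与上次自动对焦的马达旋转方向。
    方向一致时无回程误差,保持第一旋转步数不变;
    方向反转时在第一旋转步数上加上预设回程误差,得到实际下发的第二旋转步数。"""
    if current_dir == last_dir:
        return first_steps
    return first_steps + backlash_steps
```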
如图33所示,当控制器500获得精调区间后,进行第二段精细调焦。包括:步骤3301,控制器500向所述驱动马达320发送第一移动指令,根据预设的旋转步数或者第一精调速度控制驱动马达320将光学组件310在精调区间内移动。
如图34所示,当控制器500发送第一移动指令的同时,相机700以特定频率进行拍照,拍摄当前位置的投影内容,当驱动马达320控制所述光学组件310到达调节起点或者调节终点之后,停止移动,记为一次精细调焦。当一次精细调焦结束之后,步骤3302,控制器500获取相机700在所述光学组件310移动过程中拍摄的多个投影内容图像,并计算所有投影内容图像的清晰度。步骤3303,将所有投影内容图像的清晰度进行排序,得到清晰度最高值。控制器500将清晰度最高值与预设的第二清晰度进行比较,判断清晰度最高值是否高于预设的第二清晰度,如果高于,执行步骤3304,否则调整步数或速度后返回执行步骤3301;步骤3304,如果所述清晰度最高值高于第二清晰度,确定清晰度最高值对应投影内容图像的拍摄位置,将所述拍摄位置记为清晰度最佳位置。控制器500向所述驱动马达320发送第二移动指令,控制所述驱动马达320将光学组件310移动至目标位置,所述目标位置为所述清晰度最高值的投影内容图像的拍摄位置。
例如:第一次粗调焦后,确定的精调区间为驱动马达320距离当前位置旋转步数为400-800之间,将驱动马达当前位置记为原始位置,控制器500发送第一移动指令控制驱动马达320带动光学组件310在精调区间内移动,当驱动马达320带动光学组件310从第一位置到达第二位置,即距离驱动马达320原始位置800步的位置,驱动马达320停止移动,此时系统的清晰度评价单元获取相机拍摄的投影内容图像,将各个位置的清晰度进行排序,得到清晰度最高值,该清晰度最高值对应的投影内容的拍摄位置在距离调节起点200步的位置,即距离驱动马达320原始位置600步的位置。此时,清晰度最高值高于第二清晰度,将距离驱动马达320原始位置600步的位置记为目标位置,控制器500发送第二移动指令,控制驱动马达320带动光学组件310从当前位置,即从调节终点移动至目标位置,即驱动马达320带动光学组件310以旋转步数为200步从调节终点至调节起点进行移动,最终完成本次自动对焦。
如果所述清晰度最高值低于或者等于第二清晰度,控制器500调整驱动马达320的旋转步数或者精调速度,控制驱动马达320将光学组件310在精调区间内移动,开启新一轮的精细调焦。控制器500根据就近原则,驱动所述驱动马达320从调节起点或者调节终点移动至精调区间的另一侧。例如:精调区间为驱动马达320距离原始位置旋转步数为400-800之间,上一次精细调焦后驱动马达320在第二位置,即距离驱动马达320原始位置800步的位置,在新一轮的精细调焦中,控制器500控制驱动马达320带动光学组件310从第二位置移动至第一位置,当驱动马达320带动光学组件310移动至第一位置时,再次通过清晰度评价单元对本次精细调焦过程中相机700拍摄的投影图像的清晰度进行计算,如果本次精细调焦的清晰度最高值仍不高于第二清晰度,则继续调整驱动马达320的旋转步数或者精调速度,开启新一轮的调焦,直至找到清晰度最佳的位置(该位置为清晰度最高值高于第二清晰度的投影内容图像的拍摄位置)。
在一些实施例中,控制器500记录精细调焦的次数,每启动新一轮的精细调焦之前,控制器500将记录的精细调焦的次数与预设阈值进行比较,如果记录的精细调焦的次数高于预设阈值,则不启动新一轮的精细调焦,控制器500发送第五移动指令,控制驱动马达320带动光学组件310移动至上一轮精细调焦过程中清晰度最高值对应的投影内容图像的拍摄位置,结束本次自动对焦。
例如:设定精细调焦的次数不超过20次,当启动20次精细调焦后,仍无法找到清晰度最佳的位置,即清晰度最高值不高于第二清晰度,不再启动新一轮的精细调焦,控制器500发送第五移动指令,控制驱动马达320带动光学组件310移动至第20次精细调焦过程中,清晰度最高值的投影画面图像对应的拍摄位置,结束自动调焦,避免自动对焦尝试次数过多而导致耗时过长。
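上述"多轮精细调焦、达到阈值或轮次上限即停止"的控制流程,可以用如下示意代码表达;其中采样数据的组织方式、阈值与轮次上限均为示例假设:

```python
def fine_focus(capture_scores, second_threshold, max_rounds=20):
    """capture_scores为逐轮精细调焦的采样序列,每轮为(拍摄位置, 清晰度)列表;
    每轮取清晰度最高值与第二清晰度比较,高于阈值或轮次达到上限时,
    返回迄今为止清晰度最高值对应的拍摄位置,避免对焦尝试次数过多。"""
    best_pos, best_score = None, float('-inf')
    for round_idx, samples in enumerate(capture_scores, start=1):
        pos, score = max(samples, key=lambda s: s[1])
        if score > best_score:
            best_pos, best_score = pos, score
        if score > second_threshold or round_idx >= max_rounds:
            return best_pos
    return best_pos
```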
在一些实施例中,可以在精调区间内设置多个特定位置,当驱动马达320带动光学组件310到达特定位置后,相机700进行拍照得到该特定位置的投影内容图像,将多个特定位置拍摄的投影内容图像的清晰度进行高低排序,逐步缩小精调区间,从而减少一次精细调焦的时间,节省整体调焦耗时,实现快速自动调焦。
例如:通过第一段粗调焦得到精调区间在距离原始位置400-900步的位置,在精调区间设置多个特定位置:特定位置1(距离原始位置500步的位置),特定位置2(距离原始位置600步的位置),特定位置3(距离原始位置700步的位置)以及特定位置4(距离原始位置800步的位置);以距离原始位置400步的位置作为调节起点,控制器500控制驱动马达320带动光学组件310从调节起点移动至调节终点(距离原始位置900步的位置),当达到特定位置时,相机700进行一次拍照,当驱动马达320带动光学组件310移动至调节终点,停止运动。
控制器500比对多个特定位置拍摄投影内容图像的清晰度,确定特定位置3为清晰度最高值,但是该清晰度最高值不高于第二清晰度,则启动下一轮精细调焦,新一轮精细调焦的精调区间不再是距离原始位置400-900步的位置,而是距离原始位置700-900步;控制器500控制驱动马达320带动光学组件310从调节终点向调节起点移动200步,在移动过程中,相机700以预设频率多次拍摄移动过程中的投影内容图像,计算所述投影内容图像的清晰度,得到高于第二清晰度的清晰度最高值,该清晰度最高值对应的投影内容图像的拍摄位置为距离原始位置750步的位置,即清晰度最佳位置,控制器500控制驱动马达320带动光学组件310移动至所述清晰度最佳位置,完成自动对焦。
在一些实施例中,由于相机700进行拍照过程不是在驱动马达320停止后才进行拍照的,而是在驱动马达320边运动边拍照,因此照片实际的拍摄位置和控制器500设定的位置存在一定的偏差,但是因为驱动马达320的速度和相机700的拍照频率是固定的,因此上述偏差会限定在一定的区间范围内,本申请通过引入补偿值来解决这个问题。当根据第二段精细调焦得到清晰度最佳的位置,即该位置为清晰度最高值高于第二清晰度的投影内容图像的拍摄位置,根据驱动马达320的步速和相机700的拍照频率计算得到补偿值,并根据补偿值在最佳位置前后移动特定步数计算清晰度理论值,即可得到最终的相对清晰度最佳位置。
例如:通过第二段精细调焦得到清晰度最佳位置是距离调节起点200步的位置,计算的平均位置补偿值是20步,则在距离调节起点180步、200步以及220步这三个位置中,确定最终的相对清晰度最佳位置。
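根据马达步速与拍照频率计算补偿值、并在最佳位置前后生成候选位置的过程,可以用如下示意代码表达;参数单位与取值均为示例假设:

```python
def compensated_candidates(best_pos, motor_step_rate, shoot_interval):
    """motor_step_rate为马达步速(步/秒),shoot_interval为相邻两次拍照的间隔(秒)。
    补偿值即一个拍照间隔内马达走过的步数;在最佳位置前后各取一个候选位置,
    再对这几个位置计算清晰度理论值,确定最终的相对清晰度最佳位置。"""
    offset = round(motor_step_rate * shoot_interval)
    return [best_pos - offset, best_pos, best_pos + offset]
```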
基于上述自动对焦方法,本申请的一些实施例还提供一种投影设备,包括:光机、镜头、相机、控制器,如图35所示。其中,所述光机被配置为投射播放内容至投影面;所述镜头包括光学组件和驱动马达;所述驱动马达连接所述光学组件,以调整所述光学组件的焦距;所述相机被配置为拍摄投影内容图像;控制器,被配置为:步骤1,获取自动对焦指令;步骤2,响应于所述自动对焦指令,获取投影设备与投影面之间的第一距离,按照上述方式计算得到第一位置;步骤3,基于预设对焦曲线与所述第一距离计算精调区间;向所述驱动马达发送第一移动指令,以控制所述驱动马达将所述光学组件在所述精调区间内移动;步骤4,计算所述相机在所述光学组件移动过程中拍摄的投影内容图像的清晰度;步骤5,向所述驱动马达发送第二移动指令,以控制所述驱动马达将光学组件移动至目标位置,所述目标位置为所述清晰度最高值的投影内容图像的拍摄位置。
上述实施例提供的投影设备在接收到自动对焦指令后,获取投影设备与投影面之间的第一距离,再根据预设调焦曲线以及第一距离计算得到精调区间,完成第一段粗调焦;当确定精调区间后,控制驱动马达将光学组件在精调区间移动,通过计算相机在光学组件移动过程中拍摄的投影内容图像的清晰度,得到清晰度最高值的投影内容的拍摄位置,控制驱动马达将光学组件移动至该位置,完成自动对焦。所述投影设备根据多段式调焦,通过第一段粗调焦确定精调区间,通过第二段精细调焦在精调区间寻找清晰度最佳的位置,完成自动对焦,在不增加对焦耗时的前提下,避免陷入局部对焦导致对焦不清晰的问题,提高对焦速度,提升用户体验。
应当理解的是,本申请并不局限于上面已经描述并在附图中示出的精确结构,并且可以在不脱离其范围进行各种修改和改变。本申请的范围仅由所附的权利要求书来限制。

Claims (34)

  1. 一种投影设备,包括:
    光机,被配置为投射投影内容至投影面;
    镜头,所述镜头包括光学组件和驱动马达;所述驱动马达连接所述光学组件,以调整所述光学组件的焦距;
    存储器,被配置为存储位置记忆信息;
    控制器,被配置为:
    获取用户输入的调焦指令;
    响应于所述调焦指令,提取所述位置记忆信息,所述位置记忆信息包括所述光学组件的当前位置以及所述当前位置的可信度;
    响应于所述可信度为第一数值,基于所述当前位置计算第一调焦量,以及向所述驱动马达发送第一调焦指令,所述第一调焦指令用于控制所述驱动马达按照所述第一调焦量移动所述光学组件的位置;
    响应于所述可信度为第二数值,基于调焦区间起点位置计算第二调焦量,以及向所述驱动马达发送第二调焦指令,所述第二数值小于所述第一数值;所述第二调焦指令用于控制所述驱动马达按照所述第二调焦量移动所述光学组件的位置。
  2. 根据权利要求1所述的投影设备,所述控制器被配置为:
    初始化临时变量,所述临时变量用于记录调焦过程中,所述驱动马达带动所述光学组件的移动方向和移动步数;
    使用所述临时变量记录所述第一调焦量或所述第二调焦量;
    根据所述临时变量更新所述位置记忆信息中的当前位置,以及设置所述当前位置的可信度。
  3. 根据权利要求2所述的投影设备,所述控制器被配置为:
    通过监测调焦进程生成广播消息,所述广播消息包括第一广播消息和第二广播消息,所述第一广播消息用于指示当前调焦过程有效;所述第二广播消息用于指示当前调焦过程无效;
    响应于所述广播消息为所述第一广播消息,使用所述临时变量和所述当前位置计算更新位置;
    使用所述更新位置替换所述位置记忆信息中当前位置;
    设置所述位置记忆信息的可信度为第一数值;
    响应于所述广播消息为所述第二广播消息,或者,在预设接收周期内未生成所述广播消息,清除所述临时变量;
    设置所述位置记忆信息的可信度为第二数值。
  4. 根据权利要求3所述的投影设备,所述调焦指令包括手动调焦指令,所述控制器被配置为:
    控制所述光机投射包含手动调焦界面的投影内容;
    在所述光机投射所述手动调焦界面的投影内容过程中,接收用户输入的按键信息,以及解析所述按键信息中的移动方向和移动步数;
    将所述移动方向和移动步数存储至所述临时变量;
    使用所述临时变量更新所述当前位置。
  5. 根据权利要求4所述的投影设备,所述控制器被配置为:
    监听所述按键信息;
    如果在预设监听周期内未监听到所述按键信息,生成所述第二广播消息;
    如果在预设监听周期内监听到所述按键信息,生成所述第一广播消息,以及响应于所述按键信息中的退出按键操作,使用所述更新位置替换所述位置记忆信息中当前位置。
  6. 根据权利要求1所述的投影设备,所述投影设备还包括距离传感器,所述距离传感器被配置为检测投影面与光机之间的间隔距离;所述调焦指令包括自动调焦指令,所述控制器被配置为:
    获取所述间隔距离和预设调焦距离对照表,所述调焦距离对照表中包括间隔距离与目标调焦位置的映射关系;
    根据所述间隔距离,在所述调焦距离对照表中查询目标调焦位置;
    计算第一调焦量,所述第一调焦量为所述目标调焦位置与所述当前位置的差值;
    根据所述第一调焦量,生成所述第一调焦指令。
  7. 根据权利要求6所述的投影设备,所述投影设备还包括相机,所述相机被配置为获取所述投影内容图像,所述控制器被配置为:
    按照第一速率控制所述驱动马达将所述光学组件调整至所述目标调焦位置;
    根据所述目标调焦位置,计算精调区间,所述目标调焦位置为所述精调区间的区间起点;
    按照第二速率控制所述驱动马达带动所述光学组件在所述精调区间内移动,所述第二速率小于所述第一速率;
    获取所述相机在所述光学组件移动期间拍摄的投影内容图像;
    计算所述投影内容图像的清晰度,以获得精细调焦位置,所述精细调焦位置为所述清晰度最高的投影内容图像的拍摄位置。
  8. 根据权利要求1所述的投影设备,所述控制器被配置为:
    按照第一速率控制所述驱动马达将所述光学组件移动至调焦区间起点位置;
    计算第二调焦量,所述第二调焦量为所述起点位置与目标调焦位置的差值;
    根据所述第二调焦量,生成第二调焦指令;
    向所述驱动马达发送第二调焦指令,以控制所述驱动马达按照第二速率将所述光学组件移动至所述目标调焦位置,所述第二速率小于所述第一速率。
  9. 根据权利要求1所述的投影设备,所述控制器被配置为:
    向所述驱动马达发送第一调焦指令的步骤后,检测所述驱动马达响应于所述第一调焦指令的单次移动信息;
    使用所述单次移动信息更新临时变量;
    获取结束指令,所述结束指令由用户主动输入,或者根据调焦进程自动生成;
    响应于所述结束指令,累加检测周期内的多个所述单次移动信息,以获得累计移动量;
    如果所述累计移动量对应的光学组件实际位置与所述目标调焦位置一致,设置所述当前位置的可信度为第一数值,以及使用所述临时变量更新所述位置记忆信息;
    如果所述累计移动量对应的光学组件实际位置与所述目标调焦位置不一致,设置所述当前位置的可信度为第二数值,以及清除所述临时变量。
  10. 根据权利要求1所述的投影设备,所述投影设备还包括相机,所述控制器还被配置为:基于相机获取的投影画面,利用边缘检测算法识别投影设备的投影区域;在投影区域显示为矩形、或类矩形时,控制器通过预设算法获取上述矩形投影区域四个顶点的坐标值。
  11. 根据权利要求10所述的投影设备,所述控制器还被配置为:使用透视变换方法校正投影区域为矩形,计算矩形和投影截图的差值,以实现判断显示区域内是否有异物。
  12. 根据权利要求10所述的投影设备,所述控制器还被配置为:在实现对投影区域外一定区域的异物检测时,可将当前帧的相机内容、和上一帧的相机内容做差值,以判断投影区域外区域是否有异物进入;若判断有异物进入,自动触发防射眼功能。
  13. 根据权利要求10所述的投影设备,所述控制器还被配置为:利用飞行时间相机、或飞行时间传感器检测预定区域的实时深度变化;若深度值变化超过预设阈值,自动触发防射眼功能。
  14. 根据权利要求10所述的投影设备,所述控制器还被配置为:基于采集的飞行时间数据、截图数据、以及相机数据分析判断是否需要开启防射眼功能。
  15. 根据权利要求10所述的投影设备,所述控制器还被配置为:若检测到特定对象位于预定区域,将自动启动防射眼功能,以降低光机发出激光强度、降低用户界面显示亮度、并显示安全提示信息。
  16. 根据权利要求10所述的投影设备,所述控制器还被配置为:通过陀螺仪、或陀螺仪传感器对设备移动进行监测;向陀螺仪发出用于查询设备状态的信令,并接收陀螺仪反馈用于判定设备是否发生移动的信令。
  17. 根据权利要求10所述的投影设备,所述控制器还被配置为:
    在陀螺仪数据稳定预设时间长度后,控制启动触发梯形校正,在梯形校正进行时不响应遥控器按键发出的指令。
  18. 根据权利要求10所述的投影设备,所述控制器还被配置为:
    通过自动避障算法识别幕布,并利用投影变化,将投影画面校正至幕布内显示,实现与幕布边沿对齐的效果。
  19. 根据权利要求6或9所述的投影设备,所述控制器被配置为:
    向所述驱动马达发送第一调焦指令或第二调焦指令之后,控制所述驱动马达按照所述第一调焦量或第二调焦量移动所述光学组件至目标调焦位置后,获取所述光学组件移动至所述目标调焦位置后的投影内容图像;
    在所述投影内容图像中识别感兴趣区域ROI特征区域,所述ROI特征区域为所述投影内容图像中面积最大或周长比值最小的轮廓区域,所述轮廓区域为根据灰度值在所述投影内容图像中划定的区域;
    计算所述ROI特征区域的图像清晰度,以及根据图像清晰度计算第二调焦位置;
    控制所述驱动马达按照所述第二调焦位置调整所述光学组件的焦距。
  20. 根据权利要求19所述的投影设备,所述控制器还被配置为:
    将所述投影内容图像转化为灰度图;
    基于灰度值在所述灰度图中识别至少一个轮廓区域,以及计算所述轮廓区域的面积;
    获取所述面积最大或周长比值最小的轮廓区域的边界坐标;
    根据所述边界坐标中的坐标极值划定所述ROI特征区域。
  21. 根据权利要求20所述的投影设备,所述控制器还被配置为:
    检测所述投影内容图像中的目标物体数量;
    如果目标物体数量等于数量判断阈值,基于灰度值在所述灰度图中识别至少一个轮廓区域;
    如果目标物体数量大于数量判断阈值,根据目标物体数量将所述灰度图分为多个特征区域;
    根据预设规则对所述特征区域进行优先级排序,将优先级最高的特征区域划定为ROI特征区域。
  22. 根据权利要求21所述的投影设备,所述控制器还被配置为:
    根据所述灰度图中像素点灰度值的平均值得到灰度值阈值区间,以及计算所述灰度值阈值区间的灰度方差;
    将所述灰度方差结合差异性算法计算得到所述灰度图的第一分割像素点;
    根据预设分割值将所述灰度图分割成多个图像块;
    根据所述图像块中像素点灰度值的平均值以及标准差,计算得到所述图像块的第二分割像素点;
    分别根据所述第一分割像素点以及所述第二分割像素点对所述图像块的像素点进行分割,得到分割后的第一图像块以及第二图像块;
    分别计算所述第一图像块以及所述第二图像块中像素点灰度值的平均值以及方差值;
    将所述方差值,平均值结合差异性算法得到所述像素点的第一阈值以及第二阈值,将所述第一阈值以及第二阈值中的最大值对应的灰度值作为最优阈值;
    根据最优阈值以及所述图像块的颜色值识别所述灰度图中的目标部分和背景部分。
  23. 根据权利要求22所述的投影设备,所述控制器还被配置为:
    如果所述图像块的颜色值大于最优阈值,则将所述图像块中像素点划分为目标部分;
    如果所述图像块的颜色值小于或等于最优阈值,则将所述图像块中像素点划分为背景部分。
  24. 根据权利要求23所述的投影设备,所述控制器还被配置为:
    获取多个所述轮廓区域中的面积在预设面积区间内的轮廓区域;
    从面积在预设面积区间内的轮廓区域中求得所述面积最大或周长比值最小的轮廓区域;
    求取所述面积最大或周长比值最小的轮廓区域的边界坐标。
  25. 根据权利要求22所述的投影设备,所述控制器还被配置为:
    获取标准图卡,以及遍历标准图卡中的像素点,得到标准图卡中的坐标点信息;
    将所述标准图卡与所述灰度图进行比对,分别得到所述标准图卡与所述灰度图的关键点描述信息;
    根据相同的关键点得到所述灰度图中的多个特征关键点,以及根据所述多个特征关键点确定所述灰度图中的多个特征区域。
  26. 根据权利要求25所述的投影设备,所述控制器被配置为:将多个特征区域进行深度处理得到每个特征区域的深度图像信息;所述深度图像信息包括深度值以及深度参数;
    根据深度值计算所述特征关键点的间隔距离以及每个特征区域的面积;
    根据所述目标物体数量以及深度参数对每个特征区域进行计算,得到每个特征区域的目标物体图像占据的像素点数量;
    根据所述目标物体图像占据的像素点数量、所述特征区域的面积大小以及所述特征区域的周长比值对所述特征区域进行优先级排序,将优先级最高的特征区域作为ROI特征区域。
  27. 根据权利要求7所述的投影设备,所述控制器还被配置为:
    根据所述驱动马达的当前位置以及所述目标调焦位置计算得到所述驱动马达的第一旋转步数;
    向所述驱动马达发送第三移动指令,以控制所述驱动马达根据所述第一旋转步数将所述光学组件移动至所述目标调焦位置,所述目标调焦位置为精调区间的调节起点。
  28. 根据权利要求27所述的投影设备,所述控制器进一步被配置为:
    每隔预设时间获取多个所述间隔距离;
    计算多个所述间隔距离的均值,以获得平均距离;
    将所述平均距离输入飞行时间测距预设线性回归模型,得到第二距离;所述第二距离为投影设备与投影面之间的理论距离值;
    根据预设调焦距离对照表以及所述第二距离得到目标调焦位置。
  29. 根据权利要求27所述的投影设备,所述控制器进一步被配置为:
    获取所述驱动马达当前的旋转方向;
    如果所述驱动马达当前的旋转方向与所述驱动马达上一次执行自动对焦的旋转方向不一致,将所述第一旋转步数加上预设回程误差得到第二旋转步数;
    向所述驱动马达发送第四移动指令,以控制所述驱动马达根据所述第二旋转步数将所述光学组件移动至所述目标调焦位置。
  30. 根据权利要求27-29任一项所述的投影设备,所述控制器还被配置为:
    计算所述相机在所述光学组件移动至所述目标调焦位置过程中拍摄的投影内容图像的清晰度;
    筛选得到高于第一清晰度的所述投影内容图像;
    根据所述目标调焦位置以及第二位置确定精调区间,所述第二位置为高于第一清晰度的所述投影内容图像的拍摄位置,所述第二位置为精调区间的调节终点。
  31. 根据权利要求27所述的投影设备,所述控制器还被配置为:
    根据第二速率控制所述驱动马达将所述光学组件在所述精调区间内移动过程中,获取所述相机以预设频率在所述光学组件移动过程中拍摄的多个投影内容图像;
    计算所述多个投影内容图像的清晰度,得到清晰度最高值;
    向所述驱动马达发送第二移动指令,以控制所述驱动马达将光学组件移动至精细调焦位置。
  32. 根据权利要求31所述的投影设备,所述控制器还被配置为:
    将所述清晰度最高值与第二清晰度进行比较;
    如果所述清晰度最高值高于第二清晰度,向所述驱动马达发送第二移动指令。
  33. 根据权利要求32所述的投影设备,所述控制器进一步被配置为:
    如果所述清晰度最高值等于或低于第二清晰度,按照第三速率控制所述驱动马达将所述光学组件在所述精调区间内移动。
  34. 一种用于投影设备的调焦方法,所述投影设备包括光机、镜头以及控制器;其中,所述镜头包括光学组件和驱动马达,所述驱动马达连接所述光学组件,以调整所述光学组件的焦距;所述调焦方法包括:
    获取用户输入的调焦指令;
    响应于所述调焦指令,提取位置记忆信息,所述位置记忆信息包括所述光学组件的当前位置以及所述当前位置的可信度;
    响应于所述可信度为第一数值,基于所述当前位置计算第一调焦量,以及向所述驱动马达发送第一调焦指令,所述第一调焦指令用于控制所述驱动马达按照所述第一调焦量移动所述光学组件的位置;
    响应于所述可信度为第二数值,基于调焦区间起点位置计算第二调焦量,以及向所述驱动马达发送第二调焦指令,所述第二数值小于所述第一数值;所述第二调焦指令用于控制所述驱动马达按照所述第二调焦量移动所述光学组件的位置。
PCT/CN2022/123540 2021-11-16 2022-09-30 投影设备及调焦方法 WO2023087960A1 (zh)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
CN202111355866 2021-11-16
CN202111355866.0 2021-11-16
CN202210345204.3 2022-03-31
CN202210345204.3A CN114727079A (zh) 2021-11-16 2022-03-31 投影设备及基于位置记忆的调焦方法
CN202210343444.X 2022-03-31
CN202210343444.XA CN114885138A (zh) 2021-11-16 2022-03-31 投影设备及自动对焦方法
CN202210625987.0 2022-06-02
CN202210625987.0A CN115002433A (zh) 2022-06-02 2022-06-02 投影设备及roi特征区域选取方法

Publications (1)

Publication Number Publication Date
WO2023087960A1 true WO2023087960A1 (zh) 2023-05-25

Family

ID=86396220




Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008216352A (ja) * 2007-02-28 2008-09-18 Casio Comput Co Ltd 投影装置、異常制御方法及びプログラム
CN102681312A (zh) * 2011-03-16 2012-09-19 宏瞻科技股份有限公司 激光投影***的人眼安全保护***
CN109426060A (zh) * 2017-08-21 2019-03-05 深圳光峰科技股份有限公司 投影仪自动调焦方法及投影仪
CN109856902A (zh) * 2017-11-30 2019-06-07 中强光电股份有限公司 投影装置及自动对焦方法
CN114727079A (zh) * 2021-11-16 2022-07-08 海信视像科技股份有限公司 投影设备及基于位置记忆的调焦方法
CN114885138A (zh) * 2021-11-16 2022-08-09 海信视像科技股份有限公司 投影设备及自动对焦方法
CN115002433A (zh) * 2022-06-02 2022-09-02 海信视像科技股份有限公司 投影设备及roi特征区域选取方法


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22894498

Country of ref document: EP

Kind code of ref document: A1