CN109922251B - Method, device and system for quick snapshot

Info

Publication number
CN109922251B
CN109922251B
Authority
CN
China
Prior art keywords: camera, distance, target, angle, ptz
Legal status: Active
Application number
CN201711322098.2A
Other languages
Chinese (zh)
Other versions
CN109922251A (en)
Inventor
朱力于
杨昆
骆立俊
张德
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority to CN201711322098.2A
Priority to PCT/CN2018/119698 (published as WO2019114617A1)
Publication of CN109922251A
Application granted
Publication of CN109922251B

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules

Abstract

Embodiments of the invention provide a method, device and system for quick snapshot. The method comprises: detecting a target in the current picture monitored by a stereoscopic vision camera; calculating the distance and angle between the target and the stereoscopic vision camera; obtaining the object distance and angle between the target and a PTZ camera from that distance and angle; calculating shooting parameters from the object distance and angle; and adjusting the lens focal length and shooting angle of the PTZ camera according to the shooting parameters. In the embodiments of the invention, the distance and angle of the target are calculated by the stereoscopic vision camera using binocular ranging, the shooting parameters with which the PTZ camera captures the target are obtained and applied quickly, the automatic focusing step is omitted, the focusing time is shortened, and the target is captured rapidly.

Description

Method, device and system for quick snapshot
Technical Field
The present application relates to the field of video surveillance, and in particular, to a method, an apparatus, and a system for implementing fast snapshot.
Background
With the development of video surveillance technology, monitoring has evolved from simple video recording toward intelligent analysis. More and more surveillance scenarios require detection and identification of objects, which in turn requires targets to be captured and extracted quickly. At present, cameras generally focus by contrast detection, using the strength of contrast textures in the image to judge whether the current focus position is the sharpest. Because this focusing technique evaluates the acquired image directly, its precision is high, but the whole focusing sweep has to be analysed, so focusing is slow and cannot meet the requirement for fast focusing.
An existing snapshot scheme combines a fixed-focus pan-focus lens with a wide-angle lens: after a target is found in the wide-angle view, the target within a fixed distance range is captured directly, exploiting the small aperture and large depth of field of the pan-focus lens. However, because the pan-focus lens uses a fixed aperture and a fixed focal length, it can only be designed as a fixed-focus lens, is only suitable for shooting objects within a limited range, and cannot zoom, so its range of application is limited.
Disclosure of Invention
The embodiment of the invention provides a method, a device and a system for quick snapshot, which can realize quick positioning, quick focusing and quick snapshot of an image of a target after the target is detected.
In order to achieve the above purpose, the embodiments of the present invention provide the following technical solutions:
in a first aspect, an embodiment of the present invention provides a method for fast snapshot. The method comprises: first, detecting a target in the current picture monitored by a first camera and calculating the distance and angle between the target and the first camera; then, calculating, from the obtained distance and angle, the shooting parameters the second camera needs to capture the target, the shooting parameters comprising a lens focal length and a shooting angle; and finally, adjusting the angle and lens focal length of the second camera according to the shooting parameters and then capturing the target. Because the first camera detects the target and measures its distance, the shooting parameters of the second camera are obtained directly, the focusing time of the second camera is reduced, and the image of the target can be captured more promptly and clearly.
Wherein the first camera is a stereoscopic vision camera and the second camera is a PTZ (Pan/Tilt/Zoom) camera.
On receiving the distance and angle calculated by the first camera, the PTZ camera may set the angle as its shooting angle and the distance as the object distance between the target and the PTZ camera, obtain the lens focal length of the PTZ camera from the object distance, and then adjust itself according to the shooting angle and lens focal length to capture the target.
Alternatively, the lens focal length of the PTZ camera may be obtained by querying a look-up table of object distances and lens focal lengths of the PTZ camera.
In one possible design, to adjust the PTZ camera more accurately, a distance estimation error corresponding to the distance may be calculated when the distance of the target is calculated, and the depth of field of the PTZ camera at the object distance may be calculated when the object distance is obtained. The distance estimation error is then compared with the depth of field: if the distance estimation error is less than or equal to the depth of field, the PTZ camera is adjusted according to the shooting parameters and the target is captured; if the distance estimation error is greater than the depth of field, the PTZ camera is adjusted according to the shooting parameters, the lens focal length is then adjusted automatically, and the target is captured. When the depth of field is greater than or equal to the distance estimation error, the depth of field covers the error range of the distance measurement, so a clear image can be shot even if the calculated distance has some error, and the PTZ camera can be adjusted directly according to the shooting parameters to capture the target. If the depth of field is smaller than the distance estimation error, the depth of field cannot completely cover the error range and a clear image of the target might not be obtained, so the PTZ camera needs auxiliary automatic focusing after being adjusted according to the shooting parameters. In this way the PTZ camera can be adjusted according to the actual situation to obtain a clear image of the target.
In some cases, a plurality of targets may be detected in the current picture of the stereoscopic vision camera. The distances and angles between each of the targets and the stereoscopic vision camera are calculated, the lens parameters of the PTZ camera are calculated from those distances and angles, a capture order for the targets is determined according to priority, and the targets are captured one by one in that order. The parameters on which the priority is based include the position, angle, distance and trajectory of each target, and whether the target is about to leave the monitored area.
In a second aspect, an embodiment of the present invention provides a fast snapshot system, comprising: a first camera, configured to acquire a current monitoring picture, detect a target in the monitoring picture, calculate the distance and angle between the target and the first camera, and send the distance and angle to a second camera; and the second camera, configured to receive the distance and angle sent by the first camera, calculate its shooting parameters from the distance and angle, and adjust itself according to the shooting parameters to capture the target, where the shooting parameters comprise a lens focal length and a shooting angle.
In a third aspect, an embodiment of the present invention further provides a fast snapshot apparatus, comprising: a first camera, configured to acquire the current monitoring pictures from its two lenses simultaneously; a processor, configured to detect a target in the current picture of the first camera, calculate the distance and angle between the target and the first camera, calculate shooting parameters of a second camera from the distance and angle, and control the second camera to capture the target, where the shooting parameters comprise a shooting angle and a lens focal length; and the second camera, configured to capture the target.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium for storing computer software instructions for the above fast-capture apparatus and/or system, which includes program code designed to execute the method provided in the first aspect.
In the method, the distance and angle of a target are first measured quickly by stereoscopic vision, the corresponding shooting parameters of the snapshot camera are then obtained, and the snapshot camera is adjusted quickly according to those parameters to capture the target. Compared with the prior art, the method of the embodiments of the invention determines the shooting parameters of the snapshot camera more accurately and quickly and reduces its focusing time, so the snapshot is more timely and the image is clearer.
These and other aspects of the invention are apparent from and will be elucidated with reference to the embodiments described hereinafter.
Drawings
Fig. 1 is a schematic diagram of a fast snapshot system according to an embodiment of the present invention.
Fig. 2 is a schematic structural diagram of a fast capture device according to an embodiment of the present invention.
Fig. 3 is a schematic structural diagram of a network camera according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of a fast snapshot method according to an embodiment of the present invention.
Fig. 5 is a schematic diagram of calculating a distance and an angle between a target and a stereoscopic vision camera according to an embodiment of the present invention.
Fig. 6 is a schematic diagram of a target coordinate transformation according to an embodiment of the present invention.
Fig. 7 is a schematic diagram illustrating a method for calculating a depth of field according to an embodiment of the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be described below with reference to the accompanying drawings. The particular methods of operation in the method embodiments may also be applied to apparatus embodiments or system embodiments. In the description of the present invention, the term "plurality" means two or more unless otherwise specified.
As shown in fig. 1, a system 100 for fast snap shooting provided by an embodiment of the present invention includes a first camera and a second camera, where the first camera is a stereo vision camera 110, and the second camera is a PTZ (Pan/Tilt/Zoom) camera 120.
The stereoscopic vision camera 110 in the system has two lenses and can therefore imitate human binocular vision: two images of the current scene are obtained simultaneously from two different angles, and from the pixel correspondence between the two images, together with the difference in their shooting angles, the actual distance and angle between a target in the images and the stereoscopic vision camera can be calculated, and hence the coordinates of the target in the coordinate system of the stereoscopic vision camera 110. In the embodiments of the present invention, "stereoscopic vision camera" refers to any camera or camera group that can acquire images of the current scene simultaneously from different lenses; the name does not limit the device itself, and other names may be used, such as binocular stereo vision camera, binocular ranging camera, or stereo vision ranging camera. In practice, two ordinary video cameras can form a set of stereoscopic cameras, or two ordinary camera lenses can be integrated into one device to form a stereoscopic camera.
The fast snapshot system 100 in the embodiment of the present invention may also include multiple sets of stereoscopic cameras to expand the monitoring range. The technical solution is described below for a single set of stereoscopic cameras; the case of multiple sets can be handled by analogy.
The PTZ camera 120 in the fast snapshot system 100 of the embodiment of the present invention is equipped with a pan-tilt head that provides omnidirectional (left-right/up-down) movement and zoom control of the lens. It is used to obtain, from the distance and angle between the target and the stereoscopic vision camera, the shooting parameters with which the PTZ camera 120 will capture the target, including the shooting focal length and the shooting angle, and to adjust itself according to those parameters before capturing an image of the target.
When the fast snapshot system 100 operates, the stereoscopic vision camera 110 obtains the current monitoring pictures from its two lenses simultaneously; these comprise a first picture and a second picture, shot by the first lens and the second lens of the stereoscopic vision camera 110 at the same instant. The stereoscopic vision camera 110 detects targets in the current picture and, once a target is detected, calculates its distance and angle from the stereoscopic vision camera 110; if multiple targets are detected, the distance and angle are calculated for each of them. The stereoscopic vision camera 110 then sends the distance and angle to the PTZ camera 120. After receiving them, the PTZ camera 120 calculates the shooting parameters with which it will capture the target from the distance and angle between the target and the stereoscopic vision camera 110, then adjusts its shooting angle and lens focal length according to those parameters. Once the adjustment is complete, the PTZ camera 120 captures an image of the target.
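For illustration, the interaction just described can be summarized in the following Python sketch. It is only a schematic of the flow in this embodiment; the object and method names (stereo_cam, ptz_cam, grab_pair and so on) are hypothetical and stand in for whatever control interface a concrete implementation exposes.

    # Schematic capture cycle of the fast snapshot system (all names are illustrative).
    def fast_snapshot_cycle(stereo_cam, ptz_cam):
        frame_left, frame_right = stereo_cam.grab_pair()      # first and second pictures, same instant
        for target in stereo_cam.detect_targets(frame_left):
            # Binocular ranging on the stereo camera side.
            distance, angle = stereo_cam.range_target(target, frame_left, frame_right)
            # The distance and angle are sent to the PTZ camera, which derives its
            # shooting parameters (shooting angle and lens focal length) from them.
            shooting_angle, focal_length = ptz_cam.shooting_parameters(distance, angle)
            ptz_cam.rotate_to(shooting_angle)                 # pan/tilt to the computed angle
            ptz_cam.set_focal_length(focal_length)            # zoom instead of contrast auto-focus
            ptz_cam.capture()                                 # snap the image of the target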
When a more accurate capture of the target image is required, the stereoscopic vision camera 110 may also calculate the distance estimation error corresponding to the target and send it to the PTZ camera 120, and the PTZ camera 120 may calculate the depth of field corresponding to the object distance of the target. Before adjusting itself, the PTZ camera 120 compares the distance estimation error with the depth of field at that object distance: if the depth of field is greater than or equal to the distance estimation error, the PTZ camera 120 is adjusted directly according to the shooting parameters and captures the target; if the depth of field is less than the distance estimation error, the PTZ camera 120 is adjusted according to the shooting parameters and then additionally performs automatic focusing.
In some cases, multiple targets may be detected in the current monitoring picture of the stereoscopic vision camera 110. The stereoscopic vision camera 110 calculates the distance and angle between each target and itself, and the PTZ camera 120 obtains the shooting parameters corresponding to each target from those distances and angles, determines a capture order for the targets according to priority, and is adjusted to capture the images of the targets in that order. The parameters on which the priority is based include each target's position, angle, distance and trajectory, and whether the target is about to leave the monitored area.
In the above embodiment of the fast snapshot system, the stereoscopic vision camera 110 and the PTZ camera 120 are provided as separate devices, each independently performing its own computation and control functions. The stereoscopic vision camera 110 and the PTZ camera 120 may be connected in a wired or wireless manner to exchange information. It should be noted that the computations performed in the stereoscopic vision camera 110, such as detecting the target and calculating its distance and angle from the stereoscopic vision camera 110, can also be performed by the PTZ camera 120; those skilled in the art can make such modifications to the fast snapshot system of this embodiment, and such modifications should be considered to fall within the scope of the claims of the present invention and their equivalents.
Fig. 2 is a schematic diagram illustrating a possible structure of a fast capturing apparatus provided in an embodiment of the present invention, where the apparatus includes: a first camera 210, a processor 220, a memory 230, a second camera 240, and at least one communication bus 250, wherein the first camera 210 may be a stereoscopic vision camera 210 and the second camera 240 may be a PTZ camera 240.
The processor 220 may be a general-purpose central processing unit (CPU), a microprocessor, an ASIC, or one or more integrated circuits for controlling the execution of programs in accordance with aspects of the present invention. The processor 220 may also be implemented using an FPGA or a DSP.
The memory 230 may be a volatile memory, such as a random-access memory (RAM); or a non-volatile memory, such as a hard disk drive (HDD) or a solid-state drive (SSD); or a combination of the above types of memory, and provides instructions and data to the processor.
The bus 250 may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus 250 may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 2, but it is not intended that there be only one bus or one type of bus.
The stereoscopic camera 210 includes two lenses, and is configured to obtain a current monitoring frame from the two lenses at the same time, where the current monitoring frame includes a first frame and a second frame, and the first frame and the second frame are respectively frames captured by the two lenses of the stereoscopic camera 210 at the same time.
The processor 220 detects a target in the current monitoring picture of the stereoscopic vision camera 210, calculates the distance and angle between the target and the stereoscopic vision camera 210, and calculates shooting parameters corresponding to the PTZ camera 240 according to the distance and angle between the target and the stereoscopic vision camera 210, wherein the shooting parameters comprise a shooting angle and a lens focal length. After obtaining the shooting parameters required by PTZ camera 240 to capture the target, processor 220 generates corresponding control signals according to the shooting parameters, and controls PTZ camera 240 to adjust the shooting angle and the focal length of the lens.
The PTZ camera 240 can rotate up and down and left and right, and rotates to the corresponding angle according to the shooting angle in the shooting parameters. The focal length of the PTZ camera 240 is variable, and the processor 220 controls the PTZ camera 240 to adjust the lens focal length and capture an image of the target. The PTZ camera 240 is also capable of automatic focusing, which may be performed in response to an auto-focus command sent by the processor 220.
In a possible implementation, the processor 220 is further configured to calculate the corresponding distance estimation error from the distance between the target and the stereoscopic vision camera 210, calculate the depth of field of the PTZ camera 240 at the object distance from the object distance between the target and the PTZ camera 240 and the lens focal length, and compare the distance estimation error with the depth of field. If the distance estimation error is less than or equal to the depth of field, the processor 220 controls the PTZ camera 240 to adjust itself according to the shooting parameters and capture the image of the target; if the distance estimation error is greater than the depth of field, the processor 220 controls the PTZ camera 240 to adjust itself according to the shooting parameters and then to adjust the lens focal length automatically.
In some possible cases, if the processor 220 detects multiple targets, it determines a capture order for them according to priority and controls the PTZ camera 240 to capture their images in that order. The parameters on which the priority is based include each target's position, angle, distance and trajectory, and whether the target is about to leave the monitored area.
Optionally, the fast-snap apparatus 200 may further include a communication interface 260, and the communication interface 260 is used to transmit the image of the snap-shot target to an external apparatus.
The fast snapshot apparatus 200 in the above embodiment integrates the stereoscopic vision camera 210, the processor 220 and the PTZ camera 240. The three components may instead be provided separately and connected in a wired or wireless manner to communicate with one another, or they may be combined; one device may implement the functions of both the processor 220 and the memory 230. The stereoscopic vision camera 210 and the PTZ camera 240 in the fast snapshot apparatus 200 are mainly used to acquire the target image, and the processor 220 performs the calculation and control operations. It should be noted that, depending on actual needs, part of the calculation or control functions of the processor 220 may be implemented by the stereoscopic vision camera 210 or the PTZ camera 240; those skilled in the art can make such modifications to this embodiment, and such modifications should be considered to fall within the scope of the claims of the present invention and their equivalents.
As shown in fig. 3, the network camera 300 shows the configuration common to the first camera and the second camera in the fast snapshot system 100 and the fast snapshot apparatus 200. The network camera 300 includes the structure shared by the first camera and the second camera in the above embodiments; for ease of understanding, standard features of the network camera 300 not relevant to the present invention are not described again. The network camera 300 includes a lens 310 as its front-end component, the lens 310 having features such as a fixed aperture and automatic zoom; an image sensor 320, such as a complementary metal oxide semiconductor (CMOS) sensor or a charge-coupled device (CCD), for recording incident light; an image processor 330; a processor 340 for performing computation and controlling the camera; a memory 350 for storing programs or data; a communication bus 360 for transferring information between the components; and a communication interface 370 for communicating information over a communication network to other nodes connected to the network.
The image sensor 320 receives information about the recorded light, and this information is processed by an A/D converter and signal processor 331, both of which are well known to the skilled person. In some embodiments, for example when the image sensor 320 is a CMOS sensor, the image sensor 320 includes an A/D converter, so no A/D converter is required in the image processor 330. The result produced by the A/D converter and signal processor 331 is digital image data which, according to one embodiment, is processed in a scaling unit 332 and an image encoder 333 before being sent to the processor 340. The scaling unit 332 is used to process the digital image data into at least one image of a specific size; it may, however, be arranged to generate a plurality of differently sized images, all representing the same image/frame provided by the A/D converter and signal processor 331. According to another embodiment, the function of the scaling unit 332 is performed by the image encoder 333, and in yet another embodiment no scaling or resizing of the image from the image sensor 320 is needed.
The encoder 333 is optional for carrying out the invention and is arranged to encode the digital image data into any of a number of known formats for a continuous video sequence, a limited video sequence, a still image or an image/video stream. For example, the image information may be encoded as MPEG1, MPEG2, MPEG4, JPEG, M-JPEG, bitmap, etc. The processor 340 may use unencoded images as input data; in that case the image data is transferred from the signal processor 331 or from the scaling unit 332 to the processor 340 without passing through the image encoder 333. The unencoded image may be in any unencoded image format, such as BMP, PNG, PPM, PGM, PNM or PBM, although the processor 340 may also use encoded data as input data.
In one embodiment of the present invention, the image data may be directly transmitted from the signal processor 331 to the processor 340 without passing through the scaling unit 332 or the image encoder 333. In yet another embodiment, the image data may be sent from the scaling unit 332 to the processor 340 without passing through the image encoder 333.
The processor 340 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits (ICs) for controlling the execution of programs in accordance with the present invention. The processor 340 may also be implemented using a field-programmable gate array (FPGA) or a DSP. When DSP-based software code compression is employed, some of the functions of the image processor 330 may also be integrated into the processor 340. The processor 340 is used to manage and control the network camera 300.
The memory 350 is used to store the application program code for performing the solutions of the present application, and may be a read-only memory (ROM) or other type of static storage device capable of storing static information and instructions, a random access memory (RAM) or other type of dynamic storage device capable of storing information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact disc, laser disc, optical disc, digital versatile disc, Blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto. The memory 350 may be self-contained and coupled to the processor 340 via the bus 360, or may be integrated with the processor 340.
Communication bus 360 may include a path that transfers information between components.
The communication interface 370 may use any transceiver or similar device for communicating with other devices or communication networks, such as Ethernet, a radio access network (RAN) or a wireless local area network (WLAN).
The stereoscopic vision camera in the above embodiments adds a plurality of lenses to the general-purpose network camera 300 so that the currently monitored picture can be obtained from the plurality of lenses simultaneously; the PTZ camera in the above embodiments adds a pan-tilt head to the general-purpose network camera 300 so as to provide omnidirectional (left-right/up-down) movement and adjust the shooting angle of the PTZ camera.
The fast snapshot system and method provided by the embodiments of the present invention will be further described with reference to the accompanying drawings.
As shown in fig. 4, in the method for fast capturing provided by the embodiment of the present invention, when a target is captured, a stereoscopic vision camera is used to assist the capturing camera in focusing, so that the problem of too long focusing time in the existing capturing technology is solved. The fast capturing method provided by the embodiment of the present invention can be applied to the fast capturing system 100 in fig. 1 and the fast capturing apparatus 200 in fig. 2, and is used for capturing a fast moving target in a large scene, and the following describes a specific implementation of the method provided by the embodiment of the present invention with reference to fig. 1. The quick snapshot method provided by the embodiment of the invention comprises the following steps:
Step 410: detect a target in the current picture of the first camera, and calculate the distance and angle between the target and the first camera.
The first camera may be the stereoscopic vision camera 110, and the monitored current pictures include a first picture and a second picture, which are the pictures shot by the first lens and the second lens of the stereoscopic vision camera 110 at the same instant. A target can be detected in the pictures, and the distance and angle between the target and the stereoscopic vision camera 110 can be calculated from the visual disparity of the target between the two pictures.
Step 420: calculate the shooting parameters of the second camera according to the distance and angle between the target and the first camera.
In the fast snap-shot system, the second camera is PTZ camera 120. After the distance and angle between the target and the stereoscopic vision camera 110 are obtained, the object distance and angle between the target and the PTZ camera 120 can be obtained according to the position relationship between the stereoscopic vision camera 110 and the PTZ camera 120, and then the shooting parameters of the PTZ camera 120 for shooting the target are obtained, wherein the shooting parameters comprise a shooting angle and a lens focal length.
Step 430: adjust the shooting angle and lens focal length of the second camera according to the shooting parameters, and then capture the target.
The shooting angle and the lens focal length are the two most important parameters for the PTZ camera 120 to capture the target. Once they have been obtained in the steps above, the pan-tilt of the PTZ camera 120 can be driven to the corresponding angle and the lens of the PTZ camera 120 can be zoomed so that the focal length reaches the corresponding value, after which the target is captured.
Through the steps, the PTZ camera 120 can complete the snapshot of the target without automatic focusing, so that the long-time automatic focusing process is avoided, the snapshot efficiency is improved, the snapshot is more timely, and the image is clearer.
The specific implementation method of the above steps is further described below with reference to the accompanying drawings.
As shown in fig. 5, an embodiment of the present invention provides a method of calculating the distance between the target and the stereoscopic vision camera 110. Ol and Or are the center points of the image planes of the left and right cameras of the stereoscopic vision camera 110, respectively; the target point P is imaged on the image planes of the left and right cameras at points Pl and Pr, respectively, the light path running from the target point through the center of the lens plane to the imaging point on the image plane. By the principle of similar triangles, the following is obtained:
(B + (xl - xr)) / D = B / (D - f)
After simplification, this gives:
D = f × (B + xl - xr) / (xl - xr)
where B is the baseline distance, i.e. the distance between the left and right cameras; f is the focal length of the stereoscopic vision camera; xl is the horizontal distance between the imaging point of the target in the left camera and the center point of the left image plane; xr is the horizontal distance between the imaging point of the target in the right camera and the center point of the right image plane; and D is the distance of the target from the stereoscopic vision camera.
On the basis of the distance calculation, the angle between the target and the left camera, the angle between the target and the right camera, and the angle between the target and the center line of the stereoscopic vision camera can be calculated using trigonometric formulas.
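As a worked illustration of the distance formula above, the short Python sketch below plugs in invented values for the baseline, focal length and imaging offsets; the numbers, and the simple arctangent used for the angle, are assumptions for the example only and do not come from the patent.

    import math

    def stereo_distance(f, B, xl, xr):
        """D = f * (B + xl - xr) / (xl - xr), as in the formula above."""
        disparity = xl - xr
        return f * (B + disparity) / disparity

    def offset_angle(x, f):
        """Angle between the target and a camera's optical axis, from the image-plane offset."""
        return math.degrees(math.atan2(x, f))

    # Invented example values, all lengths in millimetres.
    f, B = 8.0, 120.0                          # focal length and baseline of the stereo camera
    xl, xr = 0.45, -0.35                       # horizontal offsets of the two imaging points
    D = stereo_distance(f, B, xl, xr)          # ~1208 mm to the target
    theta = offset_angle((xl + xr) / 2, f)     # ~0.36 degrees off the camera center line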
However, the distance measured by stereo ranging is not absolutely accurate: the measurement carries a certain error, called the distance estimation error, whose range is related to the lens parameters of the stereoscopic vision camera, the pixel size of the photoreceptor and the distance between the two lenses. The distance estimation error increases with the distance of the target, in direct proportion to the square of the distance. Thomas Luhmann, Close-Range Photogrammetry and 3D Imaging (2014), gives a method for calculating the distance estimation error. In addition, the coefficient relating the distance estimation error to the square of the distance can be obtained through repeated measurements, which then gives the relationship between the measured distance of the target and its distance estimation error.
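The proportionality described in this paragraph can be written as err = k * D^2, with the coefficient k calibrated from repeated measurements; the sketch below uses an invented coefficient purely to show the shape of the relationship.

    def distance_estimation_error(D, k):
        """Distance estimation error, proportional to the square of the measured distance."""
        return k * D * D

    k = 2.0e-4                                   # invented calibration coefficient (1/metre)
    err = distance_estimation_error(25.0, k)     # ~0.125 m of ranging error at 25 m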
There are many ways to calculate the target distance using stereo ranging; the calculation above is only the one adopted in this embodiment of the present invention and does not limit the protection scope of the present invention.
The object distance of the object from the PTZ camera 120 and the photographing parameters of the PTZ camera 120 in step 420 may be obtained in various ways after the distance and angle of the object from the stereoscopic vision camera 110 are obtained.
In one implementation, the distance between the target and the stereoscopic vision camera 110 may be taken directly as the object distance between the target and the PTZ camera 120, the angle between the target and the stereoscopic vision camera 110 taken as the shooting angle of the PTZ camera 120, and the shooting focal length of the PTZ camera 120 obtained from that object distance. Since the fast snapshot system 100 of this embodiment is generally used to capture distant targets, the distance between the target and the system is much larger than the distance between the lenses of the stereoscopic vision camera 110 and the PTZ camera 120, so when the precision requirement is low the distance and angle between the target and the stereoscopic vision camera 110 can be regarded as equal to the object distance and angle between the target and the PTZ camera 120. Moreover, the stereoscopic vision camera 110 generally has two lenses and its coordinate system is usually set up with the midpoint between the two lenses as the origin; if the PTZ camera 120 is located at that midpoint, the coordinate system of the stereoscopic vision camera 110 coincides with that of the PTZ camera 120, and the distance and angle between the target and the stereoscopic vision camera 110 are exactly the object distance and angle between the target and the PTZ camera 120.
In another implementation, the spatial position difference between the stereoscopic vision camera 110 and the PTZ camera 120 is taken into account: the object distance and angle between the target and the PTZ camera 120 are calculated from that position difference together with the distance and angle between the target and the stereoscopic vision camera 110, the resulting angle is set as the shooting angle of the PTZ camera 120, and the shooting focal length of the PTZ camera 120 is obtained from the object distance.
Given the spatial position difference between the stereoscopic vision camera 110 and the PTZ camera 120, the coordinates of the target in the coordinate system of the PTZ camera 120 can be obtained by a three-dimensional coordinate conversion.
As shown in fig. 6, an embodiment of the present invention provides a way of converting coordinates between the stereoscopic vision camera and the PTZ camera, where O is the origin of the coordinate system of the PTZ camera and O' is the origin of the coordinate system of the stereoscopic vision camera. The relationship between a point expressed in the two coordinate systems is as follows:
(X, Y, Z)^T = (ΔX, ΔY, ΔZ)^T + λ · R · (X', Y', Z')^T
where (X', Y', Z') are the coordinates of the target in the coordinate system of the stereoscopic vision camera, (X, Y, Z) are its coordinates in the coordinate system of the PTZ camera, λ is the scale factor between the two coordinate systems, ΔX, ΔY and ΔZ are the position differences between the coordinate origin of the PTZ camera coordinate system and that of the stereoscopic vision camera coordinate system, and R is the rotation matrix of the coordinate conversion, used to rotate each coordinate axis of the stereoscopic vision camera coordinate system onto the corresponding axis of the PTZ camera coordinate system.
R = R(εX) · R(εY) · R(εZ)
where R(εX), R(εY) and R(εZ) are the elementary rotation matrices about the X, Y and Z axes, each parameterized by the corresponding rotation angle.
from the above equation, the coordinates of the target in the coordinate system of PTZ camera 120, and thus the object distance of the target and the angle to PTZ camera 120, can be obtained.
After the object distance between the target and the PTZ camera 120 is obtained, the lens focal length of the PTZ camera 120 may be obtained in various ways. Specifically, in one implementation, a comparison table of object distance against lens focal length is stored in advance; once the object distance of the target has been calculated, the table is queried to obtain the shooting focal length of the PTZ camera 120.
Further, a lookup table of the distance and angle between the object and the stereoscopic vision camera 110 and the photographing parameters of the PTZ camera 120 may be stored in advance, and the pre-stored lookup table may be queried according to the distance and angle between the object and the stereoscopic vision camera 110 to obtain the corresponding photographing parameters.
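One possible form of such a pre-stored comparison table is sketched below in Python, with linear interpolation between stored rows; the distances and focal lengths in the table are invented and would in practice come from calibrating the lens of the PTZ camera.

    import bisect

    # Invented calibration table: object distance (m) -> lens focal length (mm).
    DISTANCE_TO_FOCAL = [(5.0, 12.0), (10.0, 25.0), (20.0, 50.0), (40.0, 100.0), (80.0, 200.0)]

    def focal_length_for(object_distance):
        """Query the comparison table, interpolating linearly between rows."""
        distances = [d for d, _ in DISTANCE_TO_FOCAL]
        i = bisect.bisect_left(distances, object_distance)
        if i == 0:
            return DISTANCE_TO_FOCAL[0][1]
        if i == len(DISTANCE_TO_FOCAL):
            return DISTANCE_TO_FOCAL[-1][1]
        (d0, f0), (d1, f1) = DISTANCE_TO_FOCAL[i - 1], DISTANCE_TO_FOCAL[i]
        return f0 + (f1 - f0) * (object_distance - d0) / (d1 - d0)

    focal = focal_length_for(25.0)    # ~62.5 mm with the invented table above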
Optionally, a depth of field of PTZ camera 120 at the target may also be calculated prior to adjusting PTZ camera 120 in step 430, and the magnitude of the depth of field versus distance estimation error may be compared to determine how to adjust PTZ camera 120.
Fig. 7 is a schematic diagram illustrating a method for calculating a depth of field according to an embodiment of the present invention.
where δ is the diameter of the permissible circle of confusion, f is the focal length of the lens, F is the shooting aperture value (f-number) of the lens, L is the focusing distance (object distance), ΔL1 is the front depth of field, ΔL2 is the rear depth of field, and ΔL is the depth of field. The depth of field is calculated as follows:
ΔL1 = F · δ · L^2 / (f^2 + F · δ · L)
ΔL2 = F · δ · L^2 / (f^2 - F · δ · L)
ΔL = ΔL1 + ΔL2 = 2 · f^2 · F · δ · L^2 / (f^4 - F^2 · δ^2 · L^2)
In a given application scenario, the aperture value F and the circle-of-confusion diameter δ of the camera are both fixed values. The formula shows that the depth of field becomes infinite once the object distance is below a certain value, and that the depth of field decreases as the object distance increases.
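The depth-of-field formulas above translate directly into the following Python sketch; the lens and sensor values used in the example line are invented, and the infinite branch corresponds to the case where the rear depth of field becomes unbounded.

    def depth_of_field(f, F, delta, L):
        """Depth of field at focusing distance L for focal length f, aperture value F and
        permissible circle of confusion delta (all lengths in the same unit)."""
        front = F * delta * L**2 / (f**2 + F * delta * L)     # front depth of field, dL1
        denom = f**2 - F * delta * L
        if denom <= 0:
            return float("inf")                               # rear depth of field unbounded
        rear = F * delta * L**2 / denom                       # rear depth of field, dL2
        return front + rear                                   # dL = dL1 + dL2

    # Invented example: 200 mm focal length, f/2.8, 0.01 mm circle of confusion, target at 25 m.
    dof = depth_of_field(f=200.0, F=2.8, delta=0.01, L=25_000.0)   # ~875 mm of depth of field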
The depth of field of the PTZ camera 120 at the target is compared with the distance estimation error of the target. If the depth of field is greater than or equal to the distance estimation error, the depth of field covers the error range of the distance measurement, so even if the calculated distance has some error the target still lies within the depth of field and a clear image can be shot; the PTZ camera 120 can therefore be adjusted directly according to the shooting parameters and the target captured. If the depth of field is smaller than the distance estimation error, the depth of field may not completely cover the error range and a clear image of the target might not be obtained, so after being adjusted according to the shooting parameters the PTZ camera 120 may need auxiliary automatic focusing before capturing the target.
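Written out, the comparison reduces to a single branch; the sketch below reuses the hypothetical PTZ-camera interface from the earlier flow sketch and is illustrative only.

    def adjust_and_capture(ptz_cam, shooting_angle, focal_length, depth_of_field, distance_error):
        """Adjust the PTZ camera and decide whether auxiliary auto-focus is needed."""
        ptz_cam.rotate_to(shooting_angle)
        ptz_cam.set_focal_length(focal_length)
        if distance_error > depth_of_field:
            # The depth of field cannot cover the ranging uncertainty: fall back to auto-focus.
            ptz_cam.auto_focus()
        return ptz_cam.capture()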
In a specific implementation, as one embodiment, if multiple targets are detected in the monitoring picture of the stereoscopic vision camera 110, the distance and angle between each target and the stereoscopic vision camera 110 are calculated separately, the shooting parameters of the PTZ camera 120 corresponding to each target are obtained from those distances and angles, a capture order for the targets is determined according to priority, and the PTZ camera 120 captures the images of the targets in that order. The parameters on which the priority is based include each target's position, angle, distance and trajectory, and whether the target is about to leave the monitored area; this list is not exhaustive and other parameters may be included. Not all of these parameters are required: several of them can be selected as priority factors according to the actual application scenario.
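One possible way of turning the priority factors listed above into a capture order is sketched below; the choice of factors (targets about to leave the monitored area first, then nearer targets first) and the attribute names on the target objects are assumptions for the example, since the exact rule is left to the application scenario.

    def capture_order(targets):
        """Sort detected targets into the order in which they will be captured."""
        # Targets about to leave the monitored area come first, then nearer targets.
        return sorted(targets, key=lambda t: (not t.about_to_leave, t.distance))

    # for target in capture_order(detected_targets):
    #     adjust_and_capture(ptz_cam, *shooting_params[target], dof[target], err[target])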
The above method embodiment has been described with reference to the fast snapshot system 100 of fig. 1; the specific steps for implementing it in the fast snapshot apparatus 200 of fig. 2 are similar and, as will be clear to those skilled in the art, are not described again here.
Embodiments of the present invention further provide a computer-readable storage medium for storing computer software instructions for the above fast snapshot apparatus and/or system, including program code designed to execute the above method embodiments. By executing the stored program code, the distance and angle of the target can be measured quickly by stereoscopic vision, the shooting parameters for the snapshot obtained, and the PTZ camera adjusted quickly to complete the capture of the target, shortening the snapshot time and ensuring the real-time performance of the snapshot.
The embodiment of the invention also provides a computer program product. The computer program product comprises computer software instructions which can be loaded by a processor for implementing the method in the above-described method embodiments.
While the invention has been described in connection with various embodiments, other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a review of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the word "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, apparatus (system), or computer program product. Accordingly, this application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "module" or "system. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein. A computer program stored/distributed on a suitable medium supplied together with or as part of other hardware, may also take other distributed forms, such as via the Internet or other wired or wireless telecommunication systems.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (devices) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the invention has been described in conjunction with specific features and embodiments thereof, it will be evident that various modifications and combinations can be made thereto without departing from the spirit and scope of the invention. Accordingly, the specification and figures are merely exemplary of the invention as defined in the appended claims and are intended to cover any and all modifications, variations, combinations, or equivalents within the scope of the invention. It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (15)

1. A method of fast snap shots, the method comprising:
detecting a target in a current picture of a first camera, and calculating the distance and the angle between the target and the first camera;
calculating shooting parameters of a second camera according to the distance and the angle, wherein the shooting parameters comprise a lens focal length and a shooting angle;
adjusting the second camera to shoot the target according to the shooting parameters;
wherein the first camera is a stereo vision camera and the second camera is a PTZ (Pan/Tilt/Zoom) camera; the stereoscopic vision camera is used for sending the distance and the angle between the target and the stereoscopic vision camera to the PTZ camera, and the PTZ camera is used for calculating shooting parameters of the PTZ camera for shooting the target according to the distance and the angle between the target and the stereoscopic vision camera after receiving the distance and the angle sent by the stereoscopic vision camera;
wherein adjusting the target for snapshot by the second camera according to the shooting parameters comprises:
if the distance estimation error corresponding to the distance is smaller than or equal to the depth of field, adjusting the PTZ camera according to the shooting parameters and then capturing the target, wherein the depth of field is the depth of field of the PTZ camera corresponding to the object distance between the target and the PTZ camera;
and if the distance estimation error is larger than the depth of field, automatically adjusting the focal length of the lens and capturing the target after adjusting the PTZ camera according to the shooting parameters.
2. The method according to claim 1, wherein the calculating the photographing parameters of the PTZ camera according to the distance and angle comprises:
setting the distance as an object distance of the target from the PTZ camera;
setting the angle as a capture angle of the PTZ camera;
and obtaining the focal length of the lens of the PTZ camera according to the object distance.
3. The method of claim 2, wherein deriving a focal length of a lens of the PTZ camera from the object distance comprises:
acquiring a comparison table of the object distance and the focal length of a lens of the PTZ camera;
and inquiring the comparison table according to the object distance to obtain the lens focal length of the PTZ camera corresponding to the object distance.
4. A method according to any one of claims 1 to 3, characterized in that:
calculating shooting parameters of the PTZ camera according to the distance and the angle, and further comprising: calculating a distance estimation error corresponding to the distance according to the distance, and calculating a depth of field of the PTZ camera corresponding to the object distance according to the object distance between the target and the PTZ camera;
before the adjusting the PTZ camera, further comprising comparing the distance estimation error to the magnitude of the depth of field.
5. The method according to any one of claims 1 to 3, wherein if a plurality of targets are detected in the current frame of the stereoscopic vision camera, after calculating the lens parameters of the PTZ camera according to the distance and the angle, determining a snapshot order of the plurality of targets according to priority, and respectively snapping the targets according to the determined snapshot order; the parameters according to which the capturing sequence of the targets is determined according to the priority comprise the position, the angle, the distance and the running track of the target or whether the target is about to leave the monitoring area.
6. The method of claim 4, wherein after calculating the lens parameters of the PTZ camera based on the distance and the angle if a plurality of objects are detected in the current frame of the stereoscopic vision camera, determining a snapshot order of the plurality of objects according to a priority, and respectively snapping the objects according to the determined snapshot order;
the parameters according to which the capturing sequence of the targets is determined according to the priority comprise the position, the angle, the distance and the running track of the target or whether the target is about to leave the monitoring area.
7. A fast snap-shot system, comprising:
the system comprises a first camera, a second camera and a third camera, wherein the first camera is used for acquiring a current monitoring picture, detecting a target in the monitoring picture, calculating the distance and the angle between the target and the first camera and sending the distance and the angle to the second camera;
the second camera is used for receiving the distance and the angle sent by the first camera, calculating shooting parameters of the second camera according to the distance and the angle, and adjusting the second camera to shoot the target according to the shooting parameters, wherein the shooting parameters comprise a lens focal length and a shooting angle;
wherein the first camera is a stereo vision camera and the second camera is a PTZ (Pan/Tilt/Zoom) camera;
if the distance estimation error is smaller than or equal to the depth of field, the PTZ camera is adjusted according to the shooting parameters and then captures the target;
and if the distance estimation error is larger than the depth of field, the PTZ camera automatically adjusts the focal length of the lens and captures the target after adjusting the shooting angle and the focal length of the lens according to the shooting parameters.
8. The fast snap-shot system according to claim 7, characterized in that:
the stereoscopic vision camera is further used for calculating a distance estimation error corresponding to the distance according to the distance and sending the distance estimation error to the PTZ camera;
the PTZ camera is further used for receiving the distance estimation error sent by the stereoscopic vision camera and calculating the depth of field of the PTZ camera corresponding to the object distance according to the object distance between the target and the PTZ camera;
the PTZ camera compares the distance estimation error to the magnitude of the depth of field.
9. The fast snap-shot system according to claim 7, wherein said calculating the shooting parameters of the PTZ camera from the distance and angle comprises: setting the distance as an object distance of the target from the PTZ camera; setting the angle as a capture angle of the PTZ camera; and obtaining the focal length of the lens of the PTZ camera according to the object distance.
10. The fast snap-shot system according to claim 7, 8 or 9, characterized in that:
the stereoscopic vision camera is further used for, if a plurality of targets are detected, calculating the distance and the angle between each target and the stereoscopic vision camera;
the PTZ camera is further used for calculating the shooting parameters corresponding to each target, determining a snapshot order of the plurality of targets according to priority, and adjusting itself according to the shooting parameters in the determined order to capture each target;
the priority used to determine the snapshot order is based on the position, the angle, the distance and the motion trajectory of each target, or on whether the target is about to leave the monitored area.
11. A fast snap-shot apparatus, comprising a first camera, a processor and a second camera, wherein:
the first camera comprises two lenses and is used for acquiring current monitoring pictures from the two lenses simultaneously;
the processor is used for detecting a target in the current picture of the first camera, calculating the distance and the angle between the target and the first camera, calculating shooting parameters of a second camera according to the distance and the angle, and controlling the second camera to shoot the target in a snapshot manner, wherein the shooting parameters comprise a shooting angle and a lens focal length;
a second camera for capturing the target;
wherein the first camera is a stereo vision camera and the second camera is a PTZ (Pan/Tilt/Zoom) camera;
if the distance estimation error corresponding to the distance is smaller than or equal to the depth of field, the processor adjusts the PTZ camera according to the shooting parameters and captures the target, wherein the depth of field is the depth of field of the PTZ camera corresponding to the object distance between the target and the PTZ camera;
and if the distance estimation error is larger than the depth of field, the processor adjusts the PTZ camera according to the shooting parameters, and then controls the PTZ camera to automatically adjust the focal length of the lens and capture the target.
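The branch described in claims 7 and 11, where autofocus is skipped when the ranging error fits inside the depth of field and is otherwise run after the coarse adjustment, could be driven as in the sketch below. The `ptz` handle and its method names are placeholders, not a real camera API.

```python
def capture_target(ptz, pan_deg, tilt_deg, focal_mm, dist_err_m, dof_m):
    """Coarse-adjust the PTZ camera from the stereo measurement, then capture.

    `ptz` is a hypothetical camera handle exposing set_angle, set_focal_length,
    autofocus and snapshot; a real integration would go through the vendor SDK.
    """
    ptz.set_angle(pan_deg, tilt_deg)    # shooting angle from the stereo camera
    ptz.set_focal_length(focal_mm)      # focal length derived from the object distance
    if dist_err_m > dof_m:
        # Ranging error exceeds the depth of field: a short autofocus pass
        # is still needed before the snapshot.
        ptz.autofocus()
    return ptz.snapshot()
```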
12. The fast snap-shot apparatus according to claim 11, characterized in that:
the processor is further used for calculating a corresponding distance estimation error according to the distance, and calculating a corresponding depth of field of the PTZ camera at the object distance according to the object distance between the target and the PTZ camera;
the processor compares the distance estimation error with the depth of field.
13. The fast snap-shot apparatus according to claim 11, wherein said calculating the shooting parameters of the PTZ camera according to the distance and angle comprises: setting the distance as an object distance of the target from the PTZ camera; setting the angle as a capture angle of the PTZ camera; and obtaining the focal length of the lens of the PTZ camera according to the object distance.
14. The fast snap-shot apparatus according to claim 11, 12 or 13, characterized in that:
the processor is further used for, if a plurality of targets are detected, calculating the distance and the angle between each target and the stereoscopic vision camera, and calculating the shooting parameters corresponding to the plurality of targets;
the processor determines a snapshot order of the targets according to priority, controls the PTZ camera to adjust according to the shooting parameters of each target in the determined order, and captures an image of each target;
the priority used to determine the snapshot order is based on the position, the angle, the distance and the motion trajectory of each target, or on whether the target is about to leave the monitored area.
15. A computer-usable storage medium, characterized in that the computer-usable storage medium stores a computer program, and the computer program, when executed by hardware, implements the method of any one of claims 1 to 6.
CN201711322098.2A 2017-12-12 2017-12-12 Method, device and system for quick snapshot Active CN109922251B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201711322098.2A CN109922251B (en) 2017-12-12 2017-12-12 Method, device and system for quick snapshot
PCT/CN2018/119698 WO2019114617A1 (en) 2017-12-12 2018-12-07 Method, device, and system for fast capturing of still frame

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711322098.2A CN109922251B (en) 2017-12-12 2017-12-12 Method, device and system for quick snapshot

Publications (2)

Publication Number Publication Date
CN109922251A (en) 2019-06-21
CN109922251B (en) 2021-10-22

Family

ID=66818967

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711322098.2A Active CN109922251B (en) 2017-12-12 2017-12-12 Method, device and system for quick snapshot

Country Status (2)

Country Link
CN (1) CN109922251B (en)
WO (1) WO2019114617A1 (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110278382B (en) * 2019-07-22 2020-12-08 浙江大华技术股份有限公司 Focusing method, device, electronic equipment and storage medium
CN110599739A (en) * 2019-10-11 2019-12-20 宁夏广天夏电子科技有限公司 Personnel safety detection system based on three-dimensional image and intelligent video technology
CN111147840A (en) * 2019-12-23 2020-05-12 南京工业职业技术学院 Automatic control and communication system for video and audio acquisition of 3D camera rocker arm
CN113099163B (en) * 2019-12-23 2023-04-11 中移物联网有限公司 Monitoring adjusting method, monitoring system, electronic device and readable storage medium
CN113038070B (en) * 2019-12-25 2022-10-14 浙江宇视科技有限公司 Equipment focusing method and device and cloud platform
CN111144478B (en) * 2019-12-25 2022-06-14 电子科技大学 Automatic detection method for through lens
CN111158107B (en) * 2020-01-03 2021-07-06 支付宝(杭州)信息技术有限公司 Focusing method, device and equipment of lens module
CN111314602B (en) * 2020-02-17 2021-09-17 浙江大华技术股份有限公司 Target object focusing method, target object focusing device, storage medium and electronic device
CN115299031A (en) * 2020-03-20 2022-11-04 深圳市大疆创新科技有限公司 Automatic focusing method and camera system thereof
CN111595292A (en) * 2020-04-29 2020-08-28 杭州电子科技大学 Binocular vision distance measurement method based on unequal focal lengths
CN111405193B (en) * 2020-04-30 2021-02-09 重庆紫光华山智安科技有限公司 Focusing method and device and camera equipment
CN111652802B (en) * 2020-05-19 2024-03-05 杭州海康威视数字技术股份有限公司 Panorama making method, interaction method and device based on panorama
CN111986248B (en) * 2020-08-18 2024-02-09 东软睿驰汽车技术(沈阳)有限公司 Multi-vision sensing method and device and automatic driving automobile
CN112084925A (en) * 2020-09-03 2020-12-15 厦门利德集团有限公司 Intelligent electric power safety monitoring method and system
CN112581547B (en) * 2020-12-30 2022-11-08 安徽地势坤光电科技有限公司 Rapid method for adjusting installation angle of imaging lens
CN113422901B (en) * 2021-05-29 2023-03-03 华为技术有限公司 Camera focusing method and related equipment
CN113452903B (en) * 2021-06-17 2023-07-11 浙江大华技术股份有限公司 Snapshot equipment, snap method and main control chip
CN113569813A (en) * 2021-09-05 2021-10-29 中国电波传播研究所(中国电子科技集团公司第二十二研究所) Intelligent image recognition system and method based on server side
CN113888651A (en) * 2021-10-21 2022-01-04 天津市计量监督检测科学研究院电子仪表实验所 Dynamic and static vision detection system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101068342A (en) * 2007-06-05 2007-11-07 西安理工大学 Video frequency motion target close-up trace monitoring method based on double-camera head linkage structure
CN101924923A (en) * 2010-08-03 2010-12-22 杭州翰平电子技术有限公司 Embedded intelligent automatic zooming snapping system and method thereof
CN102867304A (en) * 2012-09-04 2013-01-09 南京航空航天大学 Method for establishing relation between scene stereoscopic depth and vision difference in binocular stereoscopic vision system
CN103209298A (en) * 2012-01-13 2013-07-17 索尼公司 Blur-matching Model Fitting For Camera Automatic Focusing Adaptability
CN105407283A (en) * 2015-11-20 2016-03-16 成都因纳伟盛科技股份有限公司 Multi-target active recognition tracking and monitoring method
CN106251334A (en) * 2016-07-18 2016-12-21 华为技术有限公司 A kind of camera parameters method of adjustment, instructor in broadcasting's video camera and system
WO2017099541A1 (en) * 2015-12-09 2017-06-15 공간정보기술 주식회사 Subject spatial movement tracking system using multiple stereo cameras

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7929801B2 (en) * 2005-08-15 2011-04-19 Sony Corporation Depth information for auto focus using two pictures and two-dimensional Gaussian scale space theory
US8310554B2 (en) * 2005-09-20 2012-11-13 Sri International Method and apparatus for performing coordinated multi-PTZ camera tracking
MY158543A (en) * 2009-09-08 2016-10-14 Mimos Berhad Control mechanism for automated surveillance system
CN104363376A (en) * 2014-11-28 2015-02-18 广东欧珀移动通信有限公司 Continuous focusing method, device and terminal
CN109155842B (en) * 2016-05-17 2020-12-08 富士胶片株式会社 Stereo camera and control method thereof

Also Published As

Publication number Publication date
WO2019114617A1 (en) 2019-06-20
CN109922251A (en) 2019-06-21

Similar Documents

Publication Publication Date Title
CN109922251B (en) Method, device and system for quick snapshot
CN107948519B (en) Image processing method, device and equipment
KR102143456B1 (en) Depth information acquisition method and apparatus, and image collection device
US8773509B2 (en) Imaging device, imaging method and recording medium for adjusting imaging conditions of optical systems based on viewpoint images
US8970770B2 (en) Continuous autofocus based on face detection and tracking
TWI432870B (en) Image processing system and automatic focusing method
EP3318054B1 (en) Systems and methods for autofocus trigger
US8233078B2 (en) Auto focus speed enhancement using object recognition and resolution
US9489747B2 (en) Image processing apparatus for performing object recognition focusing on object motion, and image processing method therefor
WO2016049889A1 (en) Autofocus method, device and electronic apparatus
CN107205109B (en) Electronic device with multiple camera modules and control method thereof
WO2014153950A1 (en) Quick automatic focusing method and image acquisition device
US20100214445A1 (en) Image capturing method, image capturing apparatus, and computer program
US7627240B2 (en) Optical device with improved autofocus performance and method related thereto
WO2021000063A1 (en) Automatic focus distance extension
CN111246100B (en) Anti-shake parameter calibration method and device and electronic equipment
CN110784653A (en) Dynamic focusing method based on flight time and camera device thereof
US20160212410A1 (en) Depth triggered event feature
US20110149045A1 (en) Camera and method for controlling a camera
JP2012500506A5 (en)
JP2015106116A (en) Imaging apparatus
US20130093856A1 (en) Stereoscopic imaging digital camera and method of controlling operation of same
JP2016142924A (en) Imaging apparatus, method of controlling the same, program, and storage medium
US9854150B2 (en) Auto-focus control in a camera to prevent oscillation
WO2018235256A1 (en) Stereo measurement device and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant