CN106803880A - Autonomous follow-shooting travel control method for a track camera robot - Google Patents

Autonomous follow-shooting travel control method for a track camera robot

Info

Publication number
CN106803880A
Authority
CN
China
Prior art keywords
target
frame
camera robot
follow-shooting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710077249.6A
Other languages
Chinese (zh)
Inventor
秦华旺
刘光杰
赵玉鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yangzhou Xi Zhong Technology Co Ltd
Original Assignee
Yangzhou Xi Zhong Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yangzhou Xi Zhong Technology Co Ltd filed Critical Yangzhou Xi Zhong Technology Co Ltd
Priority to CN201710077249.6A priority Critical patent/CN106803880A/en
Publication of CN106803880A publication Critical patent/CN106803880A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/66Remote control of cameras or camera parts, e.g. by remote control devices

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)
  • Studio Devices (AREA)

Abstract

The present invention discloses an autonomous follow-shooting travel control method for a track camera robot. The photographer first selects the follow-shooting target and determines a reference frame from the monitoring video. After the track camera robot enters follow-shooting mode, the follow-shooting control module computes, in each speed-adjustment cycle, the relative displacement between the target in the current frame and the target in the reference frame, and uses it as the basis for adjusting the speed of the travel motor. While a shooting task is being performed, the cameraman can force the camera robot to exit follow-shooting mode through a remote brake operation. The invention can be used for the automatic shooting of fast-moving objects at sports events and large-scale activities.

Description

Autonomous follow-shooting travel control method for a track camera robot
Technical field
The invention belongs to the field of radio and television shooting technology, and in particular relates to an autonomous follow-shooting travel control method for a track camera robot.
Background technology
TV and film production requires a large amount of shooting work, and the same shooting scene usually requires the cooperation of multiple camera positions. In the traditional shooting mode, a photographer performs the shooting operation at each camera position, while auxiliary personnel use tracks, jibs and similar equipment to help the photographer change the shooting position and angle. This approach is not only inefficient; coordination problems among the camera positions also easily cause shots to be spoiled and otherwise degrade shooting quality. With a track robot performing the shooting, the photographer operates it remotely from behind the scenes. Track camera robots are already widely used in live sports broadcasts and in broadcast studios, for example in stadiums and as underwater robots that move along the pool at swimming competitions; these robot platforms improve working efficiency and shooting results. However, while the shooting target is moving quickly, it is often difficult for the cameraman to directly operate the human-computer interaction devices on the console and track the target accurately, so the result depends largely on the photographer's operating skill and familiarity with the equipment. If a simple constant-speed travel strategy is adopted, the shooting result often fails to meet the requirements.
At present, no invented technique has been found that can be used directly for follow-shooting control of a track camera robot. Related technologies mainly include:
(1) Video target tracking method, device and automatic video tracking system (CN101290681B). This invention mainly discloses a video target tracking method, including applying gradient vector flow (GVF) deformation to each candidate position in the current frame to obtain deformation curves; computing video features of the deformation curves; and determining one candidate position as the target location according to the computed video features.
(2) Method for controlling a pan-tilt camera to automatically track a target (CN101860732B). The invention relates to a method for controlling a pan-tilt camera to automatically track a target, comprising the following steps. Automatic selection of the tracking mode: the video processing module automatically selects between automatic tracking mode and global search mode according to the tracking state of the target. The processing in automatic tracking mode is: the video processing module judges the state of the pan-tilt camera and, after the tracking target is matched, sends control commands to adjust the camera position and focal length according to the position and size of the target. The processing in global search mode is: the video processing module performs target detection and matching on several search regions of the monitored area in a round-robin manner until the tracking target is found.
However, none of the above proposes a practical solution for follow-shooting control of a track camera robot; although parts of the existing video target tracking methods can serve as reference, they cannot be applied directly.
Content of the invention
Purpose of the invention: The object of the invention is to overcome the deficiencies of the prior art by providing an autonomous follow-shooting travel control method for a track camera robot, so that a remotely operated track camera robot can automatically follow-shoot a shooting target that moves parallel to the track.
Technical scheme: The autonomous follow-shooting travel control method for a track camera robot of the present invention specifically comprises the following steps in sequence:
Step 1: The photographer performing remote control manually determines the tracking target o and determines the reference image frame Fr;
Step 2: With time T as the control cycle, the track camera robot obtains the video frame Ft of the current cycle; obtaining this frame requires no extra equipment, since a dedicated video signal channel is already available for the cameraman's monitoring picture;
Step 3: Judge whether Ft contains the follow-shooting target o of Fr; if not, jump to step 8, otherwise continue;
Step 4: Calculate the displacement dt, in the video frame coordinate system, of the follow-shooting target in the current frame relative to the reference image frame;
Step 5: From the displacement dt and the displacement dt-1 of the previous control cycle, calculate the target speed-adjustment increment Δv = f(dt, dt-1) of the track camera robot, and adjust the platform speed to vt + Δv;
Step 6: Detect whether a brake command from the console has been received; if not, jump to step 2, otherwise continue;
Step 7: Report that the follow-shooting process has ended, and jump to step 9;
Step 8: Report that the follow-shooting target has been lost;
Step 9: Exit follow-shooting mode.
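The control loop formed by steps 2 to 9 can be summarized, for illustration only, by the following minimal Python sketch. The robot interface (grab_frame, platform_speed, set_platform_speed, brake_requested, report) and the three callables standing in for steps 3 to 5 are hypothetical names introduced by the editor, not part of the disclosure; step-specific sketches are given with the corresponding "Further" paragraphs below.

    import time

    CONTROL_PERIOD_S = 1.0  # control cycle T (the embodiment below uses T = 1000 ms)

    def follow_shoot_loop(robot, match_target, displacement_of, speed_increment):
        # match_target(frame)        -> match data, or None if target o is absent (step 3)
        # displacement_of(matches)   -> dt, the x-displacement in frame coordinates (step 4)
        # speed_increment(dt, dprev) -> delta v from the proportional law (step 5)
        d_prev = 0.0                                   # displacement of the previous cycle, dt-1
        while True:
            frame = robot.grab_frame()                 # step 2: video frame Ft of the current cycle
            matches = match_target(frame)
            if matches is None:                        # target not found in Ft -> step 8
                robot.report("follow-shooting target lost")
                break
            d_t = displacement_of(matches)             # step 4
            robot.set_platform_speed(robot.platform_speed + speed_increment(d_t, d_prev))  # step 5
            d_prev = d_t
            if robot.brake_requested():                # step 6: remote brake command from the console
                robot.report("follow-shooting finished")  # step 7
                break
            time.sleep(CONTROL_PERIOD_S)
        # step 9: exiting follow-shooting mode is left to the caller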
Further, in step 1, the principle for determining the reference image frame Fr is: at least A% of the important parts of target o lie within the central region Rc occupying B% of the shot image.
Further, the detailed process of step 3 is:
Step 301: Extract the n key points p1, p2, ..., pn contained in target o in the reference frame Fr, together with their corresponding feature descriptors fp1, fp2, ..., fpn;
Step 302: Extract the m key points q1, q2, ..., qm in the current frame Ft, together with their corresponding feature descriptors fq1, fq2, ..., fqm;
Step 303: Calculate the descriptor distance fd(i, j) between each pair of points pi and qj, and obtain the n-dimensional quantity Fdi = minj fd(i, j), i = 1, 2, ..., n, j = 1, 2, ..., m;
Step 304: If the number of Fdi below the threshold ε satisfies |{Fdi < ε}| > μn (0 < μ < 1), the current frame Ft is judged to contain target o; otherwise the target is judged to be lost.
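A minimal NumPy sketch of steps 301 to 304 follows, for illustration only; it assumes the descriptors have already been stacked into arrays (an assumption of the editor, not the patent), with eps and mu standing for ε and μ.

    import numpy as np

    def target_present(fp, fq, eps, mu):
        # fp: (n, d) array of reference-frame descriptors fp1..fpn
        # fq: (m, d) array of current-frame descriptors  fq1..fqm
        fd = np.linalg.norm(fp[:, None, :] - fq[None, :, :], axis=2)  # fd(i, j) for every pair, shape (n, m)
        nearest = fd.argmin(axis=1)                                   # index j of the closest qj for each pi
        Fd = fd.min(axis=1)                                           # Fdi = minj fd(i, j)
        present = np.count_nonzero(Fd < eps) > mu * len(Fd)           # step 304: |{Fdi < eps}| > mu*n
        return present, Fd, nearest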
Further, the specific method of step 4 is: sort the Fdi obtained in step 3 from small to large and take the leading fraction ρ of them (0 < ρ < 1) to form the set Ω; then, for each element of Ω, calculate the displacement dij = (xij, yij) of the corresponding key-point pair pi and qj in the frame coordinate system, and take the average of the x-direction components, dx = 1/|Ω| × ΣΩ xij, as the relative displacement dt between the follow-shooting target in the current frame and that in the reference frame. Here xij and yij denote the horizontal and vertical coordinate values of a vector in the frame coordinate system.
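Continuing the sketch above under the same assumptions, step 4 could be realized as follows; the key-point coordinate arrays are hypothetical inputs, and the default rho = 0.4 follows the embodiment below.

    import numpy as np

    def relative_displacement(Fd, nearest, kp_ref, kp_cur, rho=0.4):
        # Fd, nearest: outputs of target_present() above
        # kp_ref: (n, 2) coordinates (x, y) of the reference key points p1..pn
        # kp_cur: (m, 2) coordinates of the current-frame key points q1..qm
        k = max(1, int(rho * len(Fd)))
        omega = np.argsort(Fd)[:k]                   # indices with the rho smallest Fdi: the set Omega
        d = kp_cur[nearest[omega]] - kp_ref[omega]   # dij = (xij, yij) for each pair in Omega
        return d[:, 0].mean()                        # dt = dx, the mean of the x-direction components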
Further, in step 5, the calculation formula f(dt, dt-1) for the speed change is a proportional control law, i.e. f(dt, dt-1) = k1 × dt + k2 × dt-1, where the parameters k1 and k2 are obtained through preliminary testing and on-site calibration. Here k1 and k2 are proportional gains tuned for the specific working site; they depend not only on the kinematic parameters of the whole motion platform but also on the focal length set on the camera.
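The proportional law itself is a one-line computation; the sketch below shows it together with an illustrative usage, where the gain values are placeholders chosen by the editor rather than values given by the patent.

    def speed_increment(d_t, d_prev, k1, k2):
        # f(dt, dt-1) = k1*dt + k2*dt-1; k1 and k2 come from preliminary testing and on-site calibration
        return k1 * d_t + k2 * d_prev

    # one control cycle (illustrative gains only):
    # v_next = v_t + speed_increment(d_t, d_prev, k1=0.8, k2=0.2)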
Beneficial effects: The present invention enables a remotely operated camera robot to automatically follow-shoot a shooting target moving parallel to the track, and can be used for the automatic shooting of fast-moving objects at sports events and large-scale activities.
Brief description of the drawings
Fig. 1 is a schematic flow chart of the invention.
Specific embodiment
The technical solution of the present invention is described in detail below, but the protection scope of the present invention is not limited to this embodiment.
As shown in Fig. 1, the autonomous follow-shooting travel control method for a track camera robot of this embodiment specifically comprises the following steps in sequence:
Step 1: The photographer performing remote control manually determines the tracking target o and determines the reference image frame Fr. It is worth pointing out that the principle for determining Fr is that at least 50% of the important parts of target o lie within the central region Rc occupying 25% of the shot image;
Step 2: With time T = 1000 ms as the control cycle, the track camera robot obtains the current video frame Ft;
Step 3: Judge whether the current frame contains the follow-shooting target o of the reference frame; if not, jump to step 8, otherwise continue;
The process of judging whether the frame contains the follow-shooting target o is as follows:
Step 301: Extract the n key points p1, p2, ..., pn contained in target o in the reference frame Fr, together with their feature descriptors fp1, fp2, ..., fpn. In this embodiment, the key points are the SIFT points commonly used in image processing, and 128-dimensional gradient orientation histograms are used as the descriptors;
Step 302: Extract the m key points q1, q2, ..., qm in the current frame Ft, together with their feature descriptors fq1, fq2, ..., fqm, which are likewise 128-dimensional gradient orientation histogram features (a library-based sketch is given after step 304 below).
Step 303: Calculate the descriptor distance fd(i, j) between points pi and qj; the distance is computed as the Euclidean distance. Specifically, for the 128-dimensional descriptors fpi = [ρi1, ρi2, ..., ρi128] and fqj = [ρj1, ρj2, ..., ρj128]:
fd(i, j) = ( Σk=1,...,128 (ρik − ρjk)² )^(1/2)
After fd(i, j) is obtained, the n-dimensional quantity Fdi = minj fd(i, j) is computed;
Step 304: If the number of Fdi below the threshold ε satisfies |{Fdi < ε}| > 0.6 × n, the current frame Ft is judged to contain target o; otherwise the target is judged to be lost.
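The patent does not name an implementation library; as one possible realization of steps 301 and 302 of this embodiment, the sketch below uses OpenCV, whose cv2.SIFT_create and detectAndCompute functions return SIFT key points with 128-dimensional descriptors. The grayscale-input convention and the variable names are assumptions of the editor.

    import cv2

    def sift_keypoints_and_descriptors(gray_image):
        # SIFT key points and their 128-dimensional gradient-orientation-histogram descriptors
        sift = cv2.SIFT_create()
        keypoints, descriptors = sift.detectAndCompute(gray_image, None)
        return keypoints, descriptors

    # reference frame Fr -> p1..pn with fp1..fpn; current frame Ft -> q1..qm with fq1..fqm
    # kp_ref, fp = sift_keypoints_and_descriptors(reference_gray)
    # kp_cur, fq = sift_keypoints_and_descriptors(current_gray)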
Step 4: Calculate the relative displacement dt between the follow-shooting target in the current frame and that in the reference frame;
Steps 401, 402 and 403 are identical to steps 301, 302 and 303 respectively;
Step 404: Sort the Fdi from small to large and take the smallest 40% (ρ = 0.4) of them to form the set Ω; calculate the displacement dij = (xij, yij) of each corresponding point pair pi and qj in the video frame coordinate system, and take the average of the x-components, dx = 1/|Ω| × ΣΩ xij, as the displacement dt of the follow-shooting target in the current frame relative to the reference frame;
Step 5: From the displacement dt and the displacement dt-1 of the previous control cycle, calculate the target speed-adjustment increment Δv = f(dt, dt-1) = k1 × dt + k2 × dt-1 of the track camera robot, and adjust the platform speed to vt + Δv. The core control parameters k1 and k2 are given empirical values through extensive off-site testing and are then selected through on-site calibration after field deployment is completed.
Step 6: Detect whether a brake command from the console has been received; if not, jump to step 2, otherwise continue;
Step 7: Report that the follow-shooting process has ended, and jump to step 9;
Step 8: Report that the follow-shooting target has been lost;
Step 9: Exit follow-shooting mode.

Claims (5)

1. An autonomous follow-shooting travel control method for a track camera robot, characterized in that it specifically comprises the following steps in sequence:
Step 1: The photographer performing remote control manually determines the tracking target o and determines the reference image frame Fr;
Step 2: With time T as the control cycle, the track camera robot obtains the video frame Ft of the current cycle;
Step 3: Judge whether Ft contains the follow-shooting target o of Fr; if not, jump to step 8, otherwise continue;
Step 4: Calculate the displacement dt, in the video frame coordinate system, of the follow-shooting target in the current video frame Ft relative to the reference image frame;
Step 5: From the displacement dt and the displacement dt-1 of the previous control cycle, calculate the target speed-adjustment increment Δv = f(dt, dt-1) of the track camera robot, and adjust the platform running speed to vt + Δv, where vt is the running speed of the whole trolley platform;
Step 6: Detect whether a brake command from the console has been received; if not, jump to step 2, otherwise continue;
Step 7: Report that the follow-shooting process has ended, and jump to step 9;
Step 8: Report that the follow-shooting target has been lost;
Step 9: Exit follow-shooting mode.
2. The autonomous follow-shooting travel control method for a track camera robot according to claim 1, characterized in that: in step 1, the principle for determining the reference image frame Fr is: at least A% of the important parts of target o lie within the central region Rc occupying B% of the shot image.
3. The autonomous follow-shooting travel control method for a track camera robot according to claim 1, characterized in that the detailed process of step 3 is:
Step 301: Extract the n key points p1, p2, ..., pn contained in target o in the reference frame Fr, together with their corresponding feature descriptors fp1, fp2, ..., fpn;
Step 302: Extract the m key points q1, q2, ..., qm in the current frame Ft, together with their corresponding feature descriptors fq1, fq2, ..., fqm;
Step 303: Calculate the descriptor distance fd(i, j) between each pair of points pi and qj, and obtain the n-dimensional quantity Fdi = minj fd(i, j), i = 1, 2, ..., n, j = 1, 2, ..., m;
Step 304: If the number of Fdi below the threshold ε satisfies |{Fdi < ε}| > μn (0 < μ < 1), the current frame Ft is judged to contain target o; otherwise the target is judged to be lost.
4. The autonomous follow-shooting travel control method for a track camera robot according to claim 1, characterized in that the specific method of step 4 is: sort the Fdi obtained in step 3 from small to large and take the leading fraction ρ of them (0 < ρ < 1) to form the set Ω; then, for each element of Ω, calculate the displacement dij = (xij, yij) of the corresponding key-point pair pi and qj in the frame coordinate system, and take the average of the x-direction components, dx = 1/|Ω| × ΣΩ xij, as the relative displacement dt between the follow-shooting target in the current frame and that in the reference frame; xij and yij denote the horizontal and vertical coordinate values of a vector in the frame coordinate system.
5. The autonomous follow-shooting travel control method for a track camera robot according to claim 1, characterized in that: in step 5, the calculation formula f(dt, dt-1) for the speed change Δv is a proportional control law, i.e. f(dt, dt-1) = k1 × dt + k2 × dt-1, where the parameters k1 and k2 are obtained through preliminary testing and on-site calibration.
CN201710077249.6A 2017-02-14 2017-02-14 Autonomous follow-shooting travel control method for a track camera robot Pending CN106803880A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710077249.6A CN106803880A (en) 2017-02-14 2017-02-14 Autonomous follow-shooting travel control method for a track camera robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710077249.6A CN106803880A (en) 2017-02-14 2017-02-14 Autonomous follow-shooting travel control method for a track camera robot

Publications (1)

Publication Number Publication Date
CN106803880A true CN106803880A (en) 2017-06-06

Family

ID=58988515

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710077249.6A Pending CN106803880A (en) 2017-02-14 2017-02-14 Autonomous follow-shooting travel control method for a track camera robot

Country Status (1)

Country Link
CN (1) CN106803880A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102007516A (en) * 2008-04-14 2011-04-06 汤姆森特许公司 Technique for automatically tracking an object
CN101770568A (en) * 2008-12-31 2010-07-07 南京理工大学 Target automatically recognizing and tracking method based on affine invariant point and optical flow calculation
CN101540890A (en) * 2009-04-28 2009-09-23 南京航空航天大学 Method for obtaining a clear face image of a moving human body in a monitored video
CN102438122A (en) * 2010-09-29 2012-05-02 鸿富锦精密工业(深圳)有限公司 Camera device and method for dynamically detecting monitoring object by using same
CN103955950A (en) * 2014-04-21 2014-07-30 中国科学院半导体研究所 Image tracking method utilizing key point feature matching
CN105574894A (en) * 2015-12-21 2016-05-11 零度智控(北京)智能科技有限公司 Method and system for screening moving object feature point tracking results

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108044596A (en) * 2017-12-07 2018-05-18 ***创新科技有限公司 Camera robot
CN108044597A (en) * 2017-12-07 2018-05-18 ***创新科技有限公司 Mechanical actuating structure and camera robot
CN109143899A (en) * 2018-07-13 2019-01-04 南京理工大学 Implementation method for diversified host-computer control of a camera robot
CN110633612A (en) * 2019-11-20 2019-12-31 中通服创立信息科技有限责任公司 Monitoring method and system for inspection robot
CN110633612B (en) * 2019-11-20 2020-09-11 中通服创立信息科技有限责任公司 Monitoring method and system for inspection robot

Similar Documents

Publication Publication Date Title
CN109887040B (en) Moving target active sensing method and system for video monitoring
CN109151439B (en) Automatic tracking shooting system and method based on vision
CN110142785A Visual servo method for an inspection robot based on target detection
Iwase et al. Parallel tracking of all soccer players by integrating detected positions in multiple view images
CN106803880A Autonomous follow-shooting travel control method for a track camera robot
CN105898107B Target object snapshot method and system
CN107357286A Visual positioning and guiding device and method thereof
CN105654512A (en) Target tracking method and device
CN103424959B (en) Image picking system and lens devices
CN110910459B (en) Camera device calibration method and device and calibration equipment
CN106910206B (en) Target tracking method and device
CN106054627B (en) Control method and device based on gesture recognition and air conditioner
CN108020158A Three-dimensional position measurement method and device based on a dome camera
CN116309685A (en) Multi-camera collaborative swimming movement speed measurement method and system based on video stitching
CN106447735A (en) Panoramic camera geometric calibration processing method
KR101469099B1 (en) Auto-Camera Calibration Method Based on Human Object Tracking
CN113506340A Method and equipment for predicting pan-tilt pose and computer readable storage medium
Fahn et al. A high-definition human face tracking system using the fusion of omni-directional and PTZ cameras mounted on a mobile robot
CN106067943A (en) Control device, optical device, picture pick-up device and control method
Karungaru et al. Ground sports strategy formulation and assistance technology development: player data acquisition from drone videos
Perš et al. Tracking people in sport: Making use of partially controlled environment
CN106846284A (en) Active-mode intelligent sensing device and method based on cell
CN114594770A (en) Inspection method for inspection robot without stopping
Matsumoto et al. 3—22 Optimized Camera Viewpoint Determination System for Soccer Game Broadcasting
KR101649181B1 (en) Flight information estimator and estimation method of the flying objects

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20170606)