WO2021232826A1 - Control method and device for a camera to dynamically track road targets based on wireless positioning technology - Google Patents

Control method and device for a camera to dynamically track road targets based on wireless positioning technology

Info

Publication number
WO2021232826A1
WO2021232826A1 PCT/CN2021/070287 CN2021070287W
Authority
WO
WIPO (PCT)
Prior art keywords
camera
target object
trajectory
target
parameters
Prior art date
Application number
PCT/CN2021/070287
Other languages
English (en)
French (fr)
Inventor
刘小华
汤夕根
刘四奎
唐旭栋
Original Assignee
浩鲸云计算科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 浩鲸云计算科技股份有限公司
Publication of WO2021232826A1

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29Geographical information databases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/695Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects

Definitions

  • The invention belongs to the technical field of intelligent transportation, and specifically relates to a control method and device for a camera to dynamically track a road target based on wireless positioning technology.
  • Video control currently relies mainly on human operators and falls short in responsiveness, speed, accuracy, and intelligence.
  • The mainstream competing product designs on the market are based on video recognition technology.
  • This technology has shortcomings that significantly limit its use in urban transportation.
  • The premise of this technical route is that the video must be able to see the target before target identification and location tracking can begin.
  • Consequently, such solutions are limited at night, under occlusion, in poor lighting, and against complex urban backgrounds.
  • One of the objectives of the present invention is to provide a method and device for dynamic camera control based on wireless positioning technology.
  • An embodiment of the invention discloses a control method for a camera to dynamically track a road target based on wireless positioning technology, which includes: determining a mutual transformation model between the camera's visual center of gravity and geodetic coordinates, and the correspondence between camera motion parameters and elements in geodetic coordinates; confirming the state and motion trajectory of the target object, obtaining the camera control parameters corresponding to the trajectory according to the correspondence, and controlling the camera to track the target object according to those parameters.
  • Determining the mutual transformation model between the camera's visual center of gravity and geodetic coordinates includes determining the model from at least the camera's geographic location, mounting height, tilt angle, and performance parameters.
  • The method further includes: cutting the trajectory of the target object in geodetic coordinates into segments, and establishing the correspondence between the trajectory and the camera control parameters.
  • Target object tracking includes: confirming the state of the target object from the mobile positioning data; judging the target's motion trajectory and binding it to the geographic information; and controlling the camera to track the target object along the element trajectory in geodetic coordinates.
  • The method further includes target occlusion analysis: using geographic information data in the geographic information system to establish an occlusion model and determine the range over which the camera's view of the target object is occluded.
  • A control device for a camera to dynamically track road targets based on wireless positioning technology includes: a target object positioning module, used to determine the mutual transformation model between the camera's visual center of gravity and geodetic coordinates, and the correspondence between camera motion parameters and elements in geodetic coordinates;
  • and a target object tracking module, used to confirm the state and motion trajectory of the target object, obtain the camera control parameters corresponding to the trajectory according to the correspondence, and control the camera to track the target object according to those parameters.
  • The mutual transformation model is determined from the camera's geographic location, mounting height, tilt angle, and performance parameters.
  • The device further cuts the trajectory of the target object in geodetic coordinates into segments and establishes the correspondence between the trajectory and the camera control parameters.
  • The target object tracking module is also used to: confirm the state of the target object from the mobile positioning data; determine the target's motion trajectory and bind it to the geographic information; and control the camera to track the target object along the element trajectory in geodetic coordinates.
  • The device further includes a target occlusion analysis module, configured to use geographic information data in the geographic information system to establish an occlusion model and determine the range over which the camera's view of the target object is occluded.
  • Also disclosed is a computer storage medium that stores a computer program; when the computer program is executed, the method described above is implemented.
  • Compared with the prior art, the present invention has the following beneficial effects:
  • The solution of the invention enables a computer to replace manual operation for most camera control scenarios, automating camera control. At the same time, precise control is achieved in a single step, accurately locking onto the target and improving control accuracy and response speed. In addition, the control scheme further raises the intelligence level of camera control.
  • Fig. 1 is a flowchart of a control method for a camera to dynamically track a road target based on wireless positioning technology according to an embodiment of the present invention.
  • Fig. 2 is a schematic diagram of a camera principle according to an embodiment of the present invention.
  • Fig. 3 is a schematic diagram of a camera's focal-length field of view according to an embodiment of the present invention.
  • Fig. 4 is a schematic diagram of a camera's focusing principle according to an embodiment of the present invention.
  • Fig. 5 is a schematic diagram of occlusion analysis according to an embodiment of the present invention.
  • Fig. 6 is a schematic diagram of camera occlusion analysis according to an embodiment of the present invention.
  • Fig. 7 is a schematic structural diagram of a camera control device according to an embodiment of the present invention.
  • The position carried by the mobile positioning data is expressed in a geographic information system (GIS).
  • To achieve dynamic tracking, the target's trajectory in the GIS is mapped back into the video; once the two trajectories are synchronized, camera control can replicate the trajectory and track the target.
  • Referring to Fig. 1, a control method for a camera to dynamically track a road target based on wireless positioning technology is shown, including:
  • S101 Determine the mutual transformation model between the visual center of gravity of the camera and the geodetic coordinates, and the corresponding relationship between the camera motion parameters and the elements on the geodetic coordinates.
  • In traffic management, the main objects of interest are urban roads and the events that occur on them. Therefore, a camera field of view that meets the practical requirements of the traffic management business should be able to lock onto road elements at all times.
  • Fig. 2 illustrates a camera's typical focal-length range; the dots in the figure represent the camera.
  • A focal length that is reasonable within the first 100 meters of road becomes unsuitable at 200 meters: as the object's distance to the entrance pupil shortens the object appears especially large, and conversely it appears especially small, so choosing a reasonable focal length is very important.
  • Based on this camera vision principle, the mutual transformation model between the camera's visual center of gravity and geodetic coordinates is established.
  • Specifically, the model can be determined from the camera's geographic location, mounting height, tilt angle, and performance parameters.
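As a concrete illustration, the transformation from a ground point to camera pointing parameters can be sketched as below. This is a minimal sketch under assumptions the patent does not state (a flat road plane, a local planar coordinate frame in meters, pan measured clockwise from north); the function name and parameters are hypothetical.

```python
import math

def aim_parameters(cam_xy, cam_height_m, target_xy):
    """Pan/tilt needed to center a ground point, assuming a flat road plane
    and a camera mounted cam_height_m above it (hypothetical model)."""
    dx = target_xy[0] - cam_xy[0]
    dy = target_xy[1] - cam_xy[1]
    ground_dist = math.hypot(dx, dy)               # horizontal distance to target
    pan_deg = math.degrees(math.atan2(dx, dy))     # bearing, clockwise from north
    tilt_deg = math.degrees(math.atan2(cam_height_m, ground_dist))  # downward tilt
    return pan_deg, tilt_deg

# Camera 10 m up, target 100 m due north.
pan, tilt = aim_parameters((0.0, 0.0), 10.0, (0.0, 100.0))
```

The inverse mapping, from pan and tilt back to a ground point, follows from the same right triangle, which is what makes such a model a mutual transformation.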
  • To aim the camera at the target, the camera rotation is controlled in the following steps (refer to Fig. 4):
  • Step 1: rotate horizontally left or right to the position of the long dashed line; see 1 in Fig. 4.
  • Step 2: adjust the vertical angle, adjusting the long dashed line by the length of the short dashed line; see 2 in Fig. 4.
  • Step 3: adjust the focal length so that the target object is imaged at an appropriate size; see 3 in Fig. 4.
  • Step 4: make fine adjustments to bring the target to position 4.
  • These four steps can be performed separately, or completed directly in one action.
  • The method above provides the logic by which the four actions are completed directly in one action.
  • The camera control logic is bound directly to elements in the geographic information: for any object appearing in the geographic information, as long as a nearby camera is supported by the "光镜" (light mirror) product, the focusing operation above can be completed in a single pan movement.
  • The object's movement trajectory is reflected on the map.
  • The object's movement on the map must be fully synchronized with the corresponding camera control logic; in this way, whether the object is clicked on the map or moves naturally, the camera can accurately track and capture it.
  • The method of binding the camera control logic directly to elements in the geographic information is: cut the trajectory of the elements in geodetic coordinates into segments, and establish the correspondence between each segment and the camera's pan angle, focal length, zoom, and tilt.
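The trajectory-cutting step can be sketched as follows: consecutive geodetic points are grouped into segments, and each segment is mapped to the camera parameters computed at its midpoint. The segment length, the parameter function, and all names are illustrative assumptions, not the patent's implementation.

```python
def cut_trajectory(points, segment_len):
    """Group consecutive trajectory points into segments of segment_len points."""
    return [points[i:i + segment_len] for i in range(0, len(points), segment_len)]

def build_correspondence(segments, params_for_point):
    """Map each segment to the control parameters computed at its midpoint."""
    table = []
    for seg in segments:
        mid = seg[len(seg) // 2]          # representative point of the segment
        table.append((seg, params_for_point(mid)))
    return table

# Ten points along a straight road, cut into segments of four points each.
segs = cut_trajectory([(float(i), 0.0) for i in range(10)], 4)
table = build_correspondence(segs, lambda p: {"pan": p[0], "zoom": 1.0})
```

At tracking time, such a table lets the controller look up the pan, focal length, zoom, and tilt for whichever segment the target currently occupies.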
  • S102: Confirm the state and motion trajectory of the target object, obtain the camera control parameters corresponding to the trajectory according to the foregoing correspondence, and control the camera to track the target object according to those parameters.
  • Target object tracking includes: confirming the state of the target object from the mobile positioning data; judging the target's motion trajectory and binding it to the geographic information according to the foregoing correspondence; and controlling the camera to track the target object along the element trajectory in geodetic coordinates.
  • The driving state of the target object is judged, the target is bound to the trajectory in the geographic information, and the target's motion trajectory is represented by the geographic information.
  • Target tracking comprises three parts: confirming the target state, confirming the target trajectory, and calculating the camera parameters.
  • Confirming the target state includes judging whether the target is moving at constant speed, accelerating, decelerating, stationary, maneuvering quickly, or moving slowly; these states are used to predict the target's position over the next 5 seconds.
  • A single GPS sample contains the planar position on the map, the altitude, and the heading angle. Since traffic targets mainly travel within the framework of "roads", multiple GPS samples sorted by time express the object's trajectory.
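Predicting the position over the next 5 seconds from a single GPS sample (planar position, heading, speed) could, for example, be simple dead reckoning; this is a hypothetical sketch, not the patent's stated method.

```python
import math

def predict_position(pos, heading_deg, speed_mps, horizon_s=5.0):
    """Dead-reckon the position horizon_s seconds ahead; heading is
    measured clockwise from north (hypothetical convention)."""
    d = speed_mps * horizon_s
    rad = math.radians(heading_deg)
    return (pos[0] + d * math.sin(rad), pos[1] + d * math.cos(rad))

# Target heading due east at 2 m/s: about 10 m east after 5 s.
x, y = predict_position((0.0, 0.0), 90.0, 2.0)
```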
  • Confirming the target trajectory includes predicting the trajectory when the positioning-signal upload interval exceeds 1 second, then converting the movement by angle into its displacement (S), velocity (V), and time (T), which satisfy S = VT.
  • The magnitude of S is compared with the expected displacement to judge the motion state: when the actual displacement exceeds the expected displacement the target is accelerating, when it is smaller the target is decelerating, and when S = 0 the target is stationary.
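A minimal sketch of this displacement check: compare the observed displacement with the expected displacement S = VT derived from the last known speed. The tolerance and names are illustrative assumptions.

```python
def classify_state(actual_disp_m, expected_disp_m, tol_m=0.5):
    """Judge the motion state by comparing actual displacement with S = V*T."""
    if actual_disp_m < tol_m:                     # essentially no movement
        return "stationary"
    if actual_disp_m > expected_disp_m + tol_m:
        return "accelerating"
    if actual_disp_m < expected_disp_m - tol_m:
        return "decelerating"
    return "constant speed"

# Expected S over 5 s at the last known 2 m/s is 10 m; 12 m were observed.
state = classify_state(12.0, 2.0 * 5.0)
```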
  • The state and motion of the target object can be acquired through the global positioning system, so as to accurately determine the target's position.
  • The camera control parameters are then calculated according to the foregoing correspondence.
  • The camera is used to scan along the target's motion trajectory.
  • The whole process proceeds through continuous correction: specifically, the camera's movement is controlled according to the foregoing correspondence and the target's trajectory elements in geodetic coordinates.
  • The component also adds an anti-shake capability, mainly to prevent jittery tracking when the target moves irregularly within a small area.
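One common way to realize such anti-shake behavior is a dead-zone filter that holds the last aim point until the target leaves a small radius; the patent does not specify the mechanism, so this sketch is an assumption.

```python
class AntiShake:
    """Dead-zone filter: ignore target moves smaller than radius_m so the
    camera does not chatter while the target jitters in a small area."""

    def __init__(self, radius_m):
        self.radius = radius_m
        self.anchor = None                # last position the camera was aimed at

    def update(self, pos):
        if self.anchor is None:
            self.anchor = pos
            return pos
        dx = pos[0] - self.anchor[0]
        dy = pos[1] - self.anchor[1]
        if (dx * dx + dy * dy) ** 0.5 > self.radius:
            self.anchor = pos             # real movement: re-aim the camera
        return self.anchor                # otherwise hold the previous aim point
```

Feeding every positioning fix through `update` yields a stable aim point that only moves when the target genuinely relocates.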
  • The foregoing method further includes: using geographic information data in the geographic information system to establish an occlusion model and determine the range over which the camera's view of the target object is occluded.
  • A "two points, one line" occlusion model can be established from the geographic information data to determine the occlusion range of each camera's view for different elements.
  • The main objects to observe are the people, vehicles, and accidents on the road. These objects are easily occluded by buildings and trees along the road, so an object that should in theory be visible is in fact not; the camera may then need to be switched, or the object's position after occlusion may be indicated.
  • Occlusion analysis maps the parameters of roads, occlusions among facilities, and road-surface illumination in the geographic information into GIS data according to the above requirements.
  • The specific principle is to draw a spatial line between the camera and the target object: if this line intersects any spatial element, the target is judged to be occluded.
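In two dimensions, the "two points, one line" principle reduces to a segment-intersection test between the camera-target sight line and obstacle outlines, e.g. building footprints. The following is a hedged sketch under that planar assumption.

```python
def _ccw(a, b, c):
    """Signed orientation of the triangle a, b, c."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_intersect(p1, p2, q1, q2):
    """True if segment p1-p2 properly crosses segment q1-q2."""
    return (_ccw(p1, p2, q1) * _ccw(p1, p2, q2) < 0 and
            _ccw(q1, q2, p1) * _ccw(q1, q2, p2) < 0)

def is_occluded(camera, target, obstacles):
    """Obstacles are polylines (lists of vertices), e.g. building footprints."""
    for poly in obstacles:
        for a, b in zip(poly, poly[1:]):
            if segments_intersect(camera, target, a, b):
                return True
    return False
```

A full implementation would work on 3D geographic data and account for element heights; the 2D test above conveys the principle.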
  • The result of the camera tracking the target object's motion trajectory is displayed through a front-end interface.
  • An embodiment of the present invention also discloses a control device for a camera to dynamically track a road target based on wireless positioning technology, including: a target object positioning module 10, for determining the mutual transformation model between the camera's visual center of gravity and geodetic coordinates and the correspondence between camera motion parameters and elements in geodetic coordinates; and a target object tracking module 20, for confirming the state and motion trajectory of the target object, obtaining the camera control parameters corresponding to the trajectory according to the correspondence, and controlling the camera to track the target object according to those parameters.
  • The device embodiments of the present invention correspond to the foregoing method embodiments; for details, refer to the method embodiments.
  • The disclosed device and method may be implemented in other ways.
  • The above-described embodiments are only illustrative.
  • The division into units is only a logical functional division; in actual implementation there may be other divisions.
  • Multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • The units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the embodiments of the present invention.
  • The functional units in the various embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • The integrated unit may be implemented in the form of hardware or as a software functional unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Remote Sensing (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

A control method for a camera to dynamically track road targets based on wireless positioning technology, including: determining a mutual transformation model between the camera's visual center of gravity and geodetic coordinates, and the correspondence between camera motion parameters and elements in geodetic coordinates; confirming the state and motion trajectory of the target object, obtaining the camera control parameters corresponding to the trajectory according to the correspondence, and controlling the camera to track the target object according to those parameters. The solution of the invention automates camera control. At the same time, precise control is achieved in a single step, accurately locking onto the target and improving control accuracy and response speed.

Description

Control method and device for a camera to dynamically track road targets based on wireless positioning technology
Technical Field
The present invention belongs to the technical field of intelligent transportation, and specifically relates to a control method and device for a camera to dynamically track road targets based on wireless positioning technology.
Background
With the continuous improvement of China's infrastructure, high-definition and 360° panoramic cameras now provide wide-area coverage, greatly raising the demand for intelligent video control. Video control currently relies mainly on human operators and falls short in responsiveness, speed, accuracy, and intelligence.
The mainstream competing product designs on the market are based on video recognition technology, which has shortcomings that significantly limit its use in urban transportation. First, this technical route presupposes that the video can see the target before target identification and location tracking can begin. Second, such solutions are limited at night, under occlusion, in poor lighting, and against complex urban backgrounds. Third, they depend heavily on coordination between devices and place high demands on infrastructure planning.
Summary of the Invention
In view of the above shortcomings of the prior art, one objective of the present invention is to provide a method and device for dynamic camera control based on wireless positioning technology.
An embodiment of the invention discloses a control method for a camera to dynamically track road targets based on wireless positioning technology, including: determining a mutual transformation model between the camera's visual center of gravity and geodetic coordinates, and the correspondence between camera motion parameters and elements in geodetic coordinates; confirming the state and motion trajectory of the target object, obtaining the camera control parameters corresponding to the target's motion trajectory according to the correspondence, and controlling the camera to track the target object according to those parameters.
In a possible embodiment, determining the mutual transformation model between the camera's visual center of gravity and geodetic coordinates includes: determining the model from at least the camera's geographic location, mounting height, tilt angle, and performance parameters.
In a possible embodiment, the method further includes: cutting the trajectory of the target object in geodetic coordinates into segments, and establishing the correspondence between the trajectory and the camera control parameters.
In a possible embodiment, target object tracking includes: confirming the state of the target object from the mobile positioning data; judging the target's motion trajectory and binding it to the geographic information; and controlling the camera to track the target object along the element trajectory in geodetic coordinates.
In a possible embodiment, the method further includes target occlusion analysis: using geographic information data in the geographic information system to establish an occlusion model and determine the range over which the camera's view of the target object is occluded.
A control device for a camera to dynamically track road targets based on wireless positioning technology includes: a target object positioning module, configured to determine the mutual transformation model between the camera's visual center of gravity and geodetic coordinates, and the correspondence between camera motion parameters and elements in geodetic coordinates; and a target object tracking module, configured to confirm the state and motion trajectory of the target object, obtain the camera control parameters corresponding to the trajectory according to the correspondence, and control the camera to track the target object according to those parameters.
In a possible embodiment, the mutual transformation model is determined from the camera's geographic location, mounting height, tilt angle, and performance parameters.
In a possible embodiment, the device further cuts the trajectory of the target object in geodetic coordinates into segments and establishes the correspondence between the trajectory and the camera control parameters.
In a possible embodiment, the target object tracking module is further configured to: confirm the state of the target object from the mobile positioning data; judge the target's motion trajectory and bind it to the geographic information; and control the camera to track the target object along the element trajectory in geodetic coordinates.
In a possible embodiment, the device further includes a target occlusion analysis module, configured to use geographic information data in the geographic information system to establish an occlusion model and determine the range over which the camera's view of the target object is occluded.
Also disclosed is a computer storage medium storing a computer program; when the computer program is executed, the method described above is implemented.
Compared with the prior art, the present invention has the following beneficial effects:
The solution of the invention enables a computer to replace manual operation for most camera control scenarios, automating camera control. At the same time, precise control is achieved in a single step, accurately locking onto the target and improving control accuracy and response speed. In addition, the control scheme further raises the intelligence level of camera control.
Brief Description of the Drawings
Fig. 1 is a flowchart of a control method for a camera to dynamically track road targets based on wireless positioning technology according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of a camera principle according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of a camera's focal-length field of view according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of a camera's focusing principle according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of occlusion analysis according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of camera occlusion analysis according to an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of a camera control device according to an embodiment of the present invention.
Detailed Description
To facilitate understanding by those skilled in the art, the present invention is further described below with reference to the embodiments and drawings; the content of the embodiments does not limit the invention.
This solution is based on target bearing detection: the travel state and position of the target are determined from mobile positioning data, whose position is expressed in a geographic information system (GIS). To achieve dynamic camera tracking of the target object, a technique is needed to map the target's trajectory in the GIS back into the video. Once the two trajectories are synchronized, camera control technology can replicate the trajectory, and the camera can track the target object.
Referring to Fig. 1, a control method for a camera to dynamically track road targets based on wireless positioning technology is shown, including:
S101: Determine the mutual transformation model between the camera's visual center of gravity and geodetic coordinates, and the correspondence between camera motion parameters and elements in geodetic coordinates.
In traffic management, the main objects of interest are urban roads and the events that occur on them. Therefore, a camera field of view that meets the practical requirements of the traffic management business should be able to lock onto road elements at all times.
Fig. 2 illustrates a camera's typical focal-length range; the dot in the figure represents the camera. A focal length that is reasonable within the first 100 meters of road becomes unsuitable at 200 meters: as the object's distance to the entrance pupil shortens the object appears especially large, and conversely it appears especially small, so choosing a reasonable focal length is very important.
The usual method of observing a target with a smart camera is: 1. choose a suitable camera, not the "nearest" one but one that "can see the target"; 2. while the camera rotates, keep adjusting its angle and focal length so that attention stays on the target on the "road". Referring to Fig. 3, black marks the camera's position, and the surrounding grid cells express the field-of-view strength at different positions: the larger the value, the better the view; the smaller the value, the worse.
Based on the above camera vision principle, the mutual transformation model between the camera's visual center of gravity and geodetic coordinates is established. Specifically, the model can be determined from the camera's geographic location, mounting height, tilt angle, and performance parameters.
To aim the camera at the target, the camera rotation is controlled in the following steps (refer to Fig. 4):
Step 1: rotate horizontally left or right to the position of the long dashed line; see 1 in Fig. 4.
Step 2: adjust the vertical angle, adjusting the long dashed line by the length of the short dashed line; see 2 in Fig. 4.
Step 3: adjust the focal length so that the target object is imaged at an appropriate size; see 3 in Fig. 4.
Step 4: make fine adjustments to bring the target to position 4.
These four steps can be performed separately, or completed directly in one action; the method above provides the logic by which the four actions are completed directly in one action.
This camera control logic is bound directly to elements in the geographic information: for any object appearing in the geographic information, as long as a nearby camera is supported by the "光镜" (light mirror) product, the focusing operation above can be completed in a single pan movement. The object's movement trajectory is reflected on the map; to keep the map trajectory and the video trajectory fully synchronized, the object's movement on the map must be fully synchronized with the corresponding camera control logic. In this way, whether the object is clicked on the map or moves naturally, the camera can accurately track and capture it.
Specifically, the method of binding the camera control logic directly to elements in the geographic information is: cut the trajectory of the elements in geodetic coordinates into segments, and establish the correspondence between each segment and the camera's pan angle, focal length, zoom, and tilt.
S102: Confirm the state and motion trajectory of the target object, obtain the camera control parameters corresponding to the target's motion trajectory according to the foregoing correspondence, and control the camera to track the target object according to those parameters.
Target object tracking includes: confirming the state of the target object from the mobile positioning data; judging the target's motion trajectory and binding it to the geographic information according to the foregoing correspondence; and controlling the camera to track the target object along the element trajectory in geodetic coordinates.
Specifically, the driving state of the target object is judged from the mobile positioning data and the shape of the road itself, the target is bound to the trajectory in the geographic information, and the target's motion trajectory is represented by the geographic information.
Target tracking comprises three parts: confirming the target state, confirming the target trajectory, and calculating the camera parameters.
Specifically, confirming the target state includes judging whether the target is moving at constant speed, accelerating, decelerating, stationary, maneuvering quickly, or moving slowly; these states are used to predict the target's position over the next 5 seconds. A single GPS sample contains the planar position on the map, the altitude, and the heading angle. Since traffic targets mainly travel within the framework of "roads", multiple GPS samples sorted by time express the object's movement trajectory.
Confirming the target trajectory includes predicting the target's trajectory when the positioning-signal upload interval exceeds 1 second, then converting the movement by angle into its displacement (S), velocity (V), and time (T), which must satisfy the following formula:
S = VT. The magnitude of S is used to judge the target's motion state: for example, when the target displacement exceeds the expected displacement the target is accelerating, when it is smaller the target is decelerating, and when S = 0 the target is stationary.
In one embodiment, the state and motion of the target object can be acquired through the global positioning system, so as to accurately determine the target's position.
Finally, the camera control parameters are calculated according to the foregoing correspondence. Once the target's future position is confirmed, the camera scans along the target's motion trajectory, and the whole process proceeds through continuous correction. Specifically, the camera's movement is controlled according to the foregoing correspondence and the target's trajectory elements in geodetic coordinates.
To keep the camera's movement simple, the component also adds an anti-shake capability, mainly to prevent jittery tracking when the target moves irregularly within a small area.
In one embodiment, the foregoing method further includes: using geographic information data in the geographic information system to establish an occlusion model and determine the range over which the camera's view of the target object is occluded.
Specifically, a "two points, one line" occlusion model can be established from the geographic information data to determine the occlusion range of each camera's view for different elements.
Referring to Fig. 5, the main targets to observe in traffic management are the people, vehicles, and accidents on the road. These objects are easily occluded by buildings and trees along the road, so an object that should in theory be visible is in fact not; the camera may then need to be switched, or the object's position after occlusion may be indicated.
In Fig. 6, before any special processing, the camera's view targets the dashed part of the figure. Because the road has a slope, the target would be completely lost; an algorithm is therefore needed to convert the slope into angle and focal length so that the view reaches the solid black line.
Occlusion analysis maps the parameters of roads, occlusions among facilities, and road-surface illumination in the geographic information into GIS data according to the above requirements. The specific principle is to draw a spatial line between the camera and the target object: if this line intersects any spatial element, the target is judged to be occluded.
In one embodiment, the result of the camera tracking the target object's motion trajectory is displayed through a front-end interface.
An embodiment of the present invention also discloses a control device for a camera to dynamically track road targets based on wireless positioning technology, including: a target object positioning module 10, configured to determine the mutual transformation model between the camera's visual center of gravity and geodetic coordinates and the correspondence between camera motion parameters and elements in geodetic coordinates; and a target object tracking module 20, configured to confirm the state and motion trajectory of the target object, obtain the camera control parameters corresponding to the trajectory according to the correspondence, and control the camera to track the target object according to those parameters.
The device embodiments of the present invention correspond to the foregoing method embodiments; for details, refer to the method embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed device and method may be implemented in other ways. For example, the embodiments described above are only illustrative: the division into units is only a logical functional division, and in actual implementation there may be other divisions; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
The units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the embodiments of the present invention.
In addition, the functional units in the various embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or as a software functional unit.
The present invention has many specific application paths, and the above is only a preferred embodiment of the invention. It should be noted that those of ordinary skill in the art can make several improvements without departing from the principle of the invention, and such improvements shall also fall within the protection scope of the invention.

Claims (10)

  1. A control method for a camera to dynamically track a road target based on wireless positioning technology, characterized by:
    obtaining a mutual transformation model between the camera's visual center of gravity and geodetic coordinates, and the correspondence between camera motion parameters and elements in geodetic coordinates;
    obtaining the state and motion trajectory of the target object, obtaining the camera control parameters corresponding to the target's motion trajectory according to the correspondence, and controlling the camera to track the target object according to those parameters.
  2. The method of claim 1, characterized in that obtaining the mutual transformation model between the camera's visual center of gravity and geodetic coordinates includes: determining the model from at least the camera's geographic location, mounting height, tilt angle, and performance parameters.
  3. The method of claim 2, characterized by further including: cutting the trajectory of the target object in geodetic coordinates into segments, and establishing the correspondence between the trajectory and the camera control parameters.
  4. The method of claim 1, characterized by further including: confirming the state of the target object from the mobile positioning data; judging the target's motion trajectory and binding it to the geographic information; and controlling the camera to track the target object along the element trajectory in geodetic coordinates.
  5. The method of claim 1, characterized by further including: using geographic information data in the geographic information system to establish an occlusion model and determine the range over which the camera's view of the target object is occluded.
  6. A control device for a camera to dynamically track a road target based on wireless positioning technology, characterized by including:
    a target object positioning module, configured to determine a mutual transformation model between the camera's visual center of gravity and geodetic coordinates, and the correspondence between camera motion parameters and elements in geodetic coordinates;
    a target object tracking module, configured to confirm the state and motion trajectory of the target object, obtain the camera control parameters corresponding to the trajectory according to the correspondence, and control the camera to track the target object according to those parameters.
  7. The control device of claim 6, characterized in that the target object positioning module is further configured to determine the mutual transformation model from the camera's geographic location, mounting height, tilt angle, and performance parameters.
  8. The control device of claim 7, characterized in that the target object positioning module is further configured to: cut the trajectory of the target object in geodetic coordinates into segments, and establish the correspondence between the trajectory and the camera control parameters.
  9. The control device of claim 6, characterized in that the target object tracking module is further configured to: confirm the state of the target object from the mobile positioning data; judge the target's motion trajectory and bind it to the geographic information; and control the camera to track the target object along the element trajectory in geodetic coordinates.
  10. The control device of claim 6, characterized by further including: a target occlusion analysis module, configured to use geographic information data in the geographic information system to establish an occlusion model and determine the range over which the camera's view of the target object is occluded.
PCT/CN2021/070287 2020-05-22 2021-01-05 Control method and device for a camera to dynamically track road targets based on wireless positioning technology WO2021232826A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010439848.XA CN111586303A (zh) 2020-05-22 2020-05-22 Control method and device for a camera to dynamically track road targets based on wireless positioning technology
CN202010439848.X 2020-05-22

Publications (1)

Publication Number Publication Date
WO2021232826A1 true WO2021232826A1 (zh) 2021-11-25

Family

ID=72112309

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/070287 WO2021232826A1 (zh) 2021-01-05 2020-05-22 Control method and device for a camera to dynamically track road targets based on wireless positioning technology

Country Status (2)

Country Link
CN (1) CN111586303A (zh)
WO (1) WO2021232826A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111586303A (zh) * 2020-05-22 2020-08-25 浩鲸云计算科技股份有限公司 Control method and device for a camera to dynamically track road targets based on wireless positioning technology
CN113744299B (zh) * 2021-09-02 2022-07-12 上海安维尔信息科技股份有限公司 Camera control method, apparatus, electronic device and storage medium

Citations (6)

Publication number Priority date Publication date Assignee Title
CN102263933A (zh) * 2010-05-25 2011-11-30 杭州华三通信技术有限公司 Method and apparatus for implementing intelligent monitoring
US20130162838A1 (en) * 2011-12-22 2013-06-27 Pelco, Inc. Transformation between Image and Map Coordinates
CN104184995A (zh) * 2014-08-26 2014-12-03 天津市亚安科技股份有限公司 Method and system for real-time linked monitoring in a networked video surveillance system
CN105353772A (zh) * 2015-11-16 2016-02-24 中国航天时代电子公司 Visual servo control method for locating and tracking maneuvering targets by an unmanned aerial vehicle
CN109788201A (zh) * 2019-02-14 2019-05-21 四川宏图智慧科技有限公司 Positioning method and device
CN111586303A (zh) * 2020-05-22 2020-08-25 浩鲸云计算科技股份有限公司 Control method and device for a camera to dynamically track road targets based on wireless positioning technology

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
US8180107B2 (en) * 2009-02-13 2012-05-15 Sri International Active coordinated tracking for multi-camera systems
US20140132729A1 (en) * 2012-11-15 2014-05-15 Cybernet Systems Corporation Method and apparatus for camera-based 3d flaw tracking system
CN103453875A (zh) * 2013-08-07 2013-12-18 北京理工大学 Real-time calculation method for unmanned aerial vehicle pitch and roll angles
WO2017199347A1 (ja) * 2016-05-17 2017-11-23 三菱電機株式会社 Image display device, image display method and image display program
CN106210643B (zh) * 2016-07-29 2019-02-12 林玉峰 Method for invoking a camera's visible area
CN106468552A (zh) * 2016-08-30 2017-03-01 中国科学院长春光学精密机械与物理研究所 Dual-aircraft intersection positioning method based on an airborne electro-optical platform
CN107729808B (zh) * 2017-09-08 2020-05-01 国网山东省电力公司电力科学研究院 Intelligent image acquisition system and method for unmanned aerial vehicle inspection of power transmission lines
CN110163885B (zh) * 2018-02-12 2022-06-03 杭州海康威视数字技术股份有限公司 Target tracking method and device
CN110764111B (zh) * 2019-11-15 2022-12-09 深圳市镭神智能***有限公司 Method, apparatus, system and medium for converting between radar coordinates and geodetic coordinates


Also Published As

Publication number Publication date
CN111586303A (zh) 2020-08-25

Similar Documents

Publication Publication Date Title
WO2020224375A1 (zh) 定位方法、装置、设备和计算机可读存储介质
US10999511B2 (en) LIDAR and camera synchronization
WO2021232826A1 (zh) 基于无线定位技术的摄像机动态跟踪路面目标的控制方法和装置
US7689347B2 (en) Traffic signal light control system and method
CN102045549A (zh) 一种控制监控设备联动跟踪运动目标的方法及装置
KR20120118478A (ko) 교통 신호 맵핑 및 검출
WO2020182176A1 (zh) 控制球机与枪机联动的方法、装置及介质
CN110235188A (zh) 用于交通监控、事件检测和变化预测的视频数据和gis映射
CN102074016A (zh) 运动目标自动跟踪的装置和方法
CN113409459B (zh) 高精地图的生产方法、装置、设备和计算机存储介质
CN107360394A (zh) 应用于边防视频监控***的多预置点动态智能监测方法
KR100820952B1 (ko) 단일 카메라를 이용한 불법 주정차 무인 자동 단속방법 및그 시스템
US11971536B2 (en) Dynamic matrix filter for vehicle image sensor
CN103414872A (zh) 一种目标位置驱动ptz摄像机的方法
CN105874384A (zh) 基于多种测距方式的跟焦***、方法及拍摄***
EP4136570A1 (en) Artificial intelligence and computer vision powered driving-performance assessment
CN103632538A (zh) 道路3d实景采集***
CN110351528A (zh) 一种基于ar化实景聚合的指挥平台
KR102332179B1 (ko) 영상 및 실공간 매핑장치 및 그 장치에서 각 기능을 실행시키기 위해 매체에 저장된 컴퓨터 프로그램
CN115004273A (zh) 交通道路的数字化重建方法、装置和***
JPWO2020174916A1 (ja) 撮影システム
Notz et al. Extraction and assessment of naturalistic human driving trajectories from infrastructure camera and radar sensors
CN106612414A (zh) 一种重点区域视频可疑目标的跟踪控制方法
TWI805077B (zh) 路徑規劃方法及其系統
CN109443325A (zh) 利用固定式摄像机的空间定位***

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21809301

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21809301

Country of ref document: EP

Kind code of ref document: A1