CN115190237A - Method and equipment for determining rotation angle information of bearing equipment

Publication number: CN115190237A (application CN202210696871.6A; granted as CN115190237B)
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 陈嘉伟, 黄海波, 袁科
Applicant / Assignee: Hiscene Information Technology Co Ltd
Legal status: Granted; currently Active
Classification (Landscapes): Studio Devices
Abstract

The application aims to provide a method and equipment for determining rotation angle information of bearing equipment. The method specifically comprises the following steps: acquiring shooting parameter information of a PTZ camera device; acquiring current image information currently shot by the PTZ camera device and first angle information of the bearing device when the current image information is shot; acquiring an area image position of a frame selection area in the current image information, wherein the frame selection area is used for indicating the direction to which the bearing device is to be adjusted; and determining rotation angle information of the bearing device according to the shooting parameter information, the first angle information and the area image position. The method and the device allow the angle of the bearing device to be adjusted quickly and visually, without continuously comparing successive frames of the scene video image sequence, thereby saving computing resources and improving the efficiency of data processing and on-site command execution.

Description

Method and equipment for determining rotation angle information of bearing equipment
Technical Field
The application relates to the field of communication, in particular to a technology for determining rotation angle information of bearing equipment.
Background
A PTZ camera is a monitoring camera mounted on a PTZ-controlled bearing device (a pan/tilt head), where PTZ is short for Pan/Tilt/Zoom; such a camera is controlled in three dimensions: left-right rotation (pan), up-down pitching (tilt) and zooming (zoom). With the continuous development of cities, the types and number of monitoring cameras keep increasing, which makes them increasingly difficult to control and maintain. At present, when images of an on-site target object need to be acquired, the pan/tilt head is either controlled by manually clicking buttons or by continuously comparing successive frames of the scene video image sequence in order to obtain the required picture of the target object. Manual button control places high demands on the operator's experience, the rotation angle obtained by manual operation is not accurate enough for the target object to be acquired, and the corresponding operation efficiency is low; continuously comparing scene video image frames consumes large amounts of computing resources and is likewise inefficient.
Disclosure of Invention
An object of the present application is to provide a method and apparatus for determining rotation angle information of a carrier apparatus.
According to an aspect of the present application, there is provided a method of determining rotation angle information of a carrier apparatus for carrying a PTZ camera, the method including:
acquiring the camera shooting parameter information of the PTZ camera shooting device;
acquiring current image information currently shot by the PTZ camera device and first angle information of the bearing equipment when the current image information is shot;
acquiring an area image position of a frame selection area in the current image information, wherein the frame selection area is used for indicating the direction to which the bearing device is to be adjusted;
and determining rotation angle information of the bearing device according to the shooting parameter information, the first angle information and the area image position, wherein the rotation angle information indicates the angle by which the PTZ camera device is to be adjusted from its current shooting posture so that it directly faces the spatial position corresponding to the area image position.
According to another aspect of the application, an apparatus for determining rotation angle information of a bearing device for carrying a PTZ camera device is provided, the apparatus comprising:
a first module, used for acquiring the shooting parameter information of the PTZ camera device;
a second module, used for acquiring current image information currently shot by the PTZ camera device and first angle information of the bearing device when the current image information is shot;
a third module, configured to obtain an area image position of a frame selection area in the current image information, where the frame selection area is used to indicate the direction to which the bearing device is to be adjusted;
and a fourth module, used for determining rotation angle information of the bearing device according to the shooting parameter information, the first angle information and the area image position, wherein the rotation angle information indicates the angle by which the PTZ camera device is to be adjusted from its current shooting posture so that it directly faces the spatial position corresponding to the area image position.
According to an aspect of the present application, there is provided a computer apparatus, wherein the apparatus comprises:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to perform the steps of a method as described in any one of the above.
According to an aspect of the application, there is provided a computer readable storage medium having a computer program/instructions stored thereon, characterized in that the computer program/instructions, when executed, cause a system to perform the steps of the method as described in any one of the above.
According to an aspect of the application, there is provided a computer program product comprising computer program/instructions, characterized in that the computer program/instructions, when executed by a processor, implement the steps of the method as described in any of the above.
Compared with the prior art, the present application uses the shooting parameter information, the first angle information of the bearing device and the area image position of the frame selection area to adjust the angle of the bearing device quickly and visually, without clicking corresponding buttons and without continuously comparing successive frames of the scene video image sequence, thereby saving computing resources and improving the efficiency of data processing and on-site command execution.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments made with reference to the following drawings:
fig. 1 shows a flowchart of a method for determining rotation angle information of a carrying device according to an embodiment of the present application;
FIG. 2 illustrates functional modules of a computer device according to another embodiment of the present application;
FIG. 3 illustrates an exemplary system that can be used to implement the various embodiments described in this application.
The same or similar reference numbers in the drawings identify the same or similar elements.
Detailed Description
The present application is described in further detail below with reference to the attached figures.
In a typical configuration of the present application, the terminal, the device serving the network, and the trusted party each include one or more processors (e.g., Central Processing Units (CPUs)), input/output interfaces, network interfaces, and memory.

The memory may include volatile memory, Random Access Memory (RAM) and/or non-volatile memory in a computer-readable medium, such as Read-Only Memory (ROM) or flash memory. Memory is an example of a computer-readable medium.

Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, Phase-Change Memory (PCM), Programmable Random Access Memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technologies, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Disc (DVD) or other optical storage, magnetic cassettes, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.

The device referred to in this application includes, but is not limited to, a user device, a network device, or a device formed by integrating a user device and a network device through a network. The user device includes, but is not limited to, any mobile electronic product capable of human-computer interaction with a user (e.g., through a touch panel), such as a smart phone or a tablet computer; the mobile electronic product may employ any operating system, such as the Android operating system or the iOS operating system. The network device includes an electronic device capable of automatically performing numerical calculation and information processing according to preset or stored instructions, and its hardware includes, but is not limited to, a microprocessor, an Application Specific Integrated Circuit (ASIC), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), an embedded device, and the like. The network device includes, but is not limited to, a computer, a network host, a single network server, a set of multiple network servers, or a cloud of multiple servers; here, the cloud is composed of a large number of computers or network servers based on Cloud Computing, a kind of distributed computing in which a virtual supercomputer consists of a collection of loosely coupled computers. The network includes, but is not limited to, the Internet, a wide area network, a metropolitan area network, a local area network, a VPN network, a wireless Ad Hoc network, and the like. Preferably, the device may also be a program running on the user device, the network device, or a device formed by integrating the user device with the network device, the touch terminal, or the network device with the touch terminal through a network.
Of course, those skilled in the art will appreciate that the foregoing is by way of example only, and that other existing or future devices, which may be suitable for use in the present application, are also encompassed within the scope of the present application and are hereby incorporated by reference.
In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
Fig. 1 illustrates a method for determining rotation angle information of a bearing device for carrying a PTZ camera device according to an aspect of the present application, which specifically includes step S101, step S102, step S103 and step S104. In step S101, imaging parameter information of the PTZ camera device is acquired; in step S102, current image information currently captured by the PTZ camera device and first angle information of the bearing device when the current image information is captured are obtained; in step S103, an area image position of a frame selection area in the current image information is obtained, where the frame selection area is used to indicate the direction to which the bearing device is to be adjusted; in step S104, rotation angle information of the bearing device is determined according to the imaging parameter information, the first angle information and the area image position, where the rotation angle information indicates the angle by which the PTZ camera device is to be adjusted from its current imaging posture so that it directly faces the spatial position corresponding to the area image position. Here, the method is executed by a computer device. The computer device includes, but is not limited to, a user device, a network device, or a device formed by integrating a user device and a network device through a network. The user device includes, but is not limited to, any mobile electronic product capable of human-computer interaction with a user (e.g., through a touch panel); the network device includes, but is not limited to, a computer, a network host, a single network server, a set of multiple network servers, or a cloud of multiple servers.
Specifically, in step S101, imaging parameter information of the PTZ camera device is acquired. For example, the PTZ camera device (such as a PTZ camera) may be a pan/tilt/zoom surveillance camera installed in a city whose PTZ (Pan/Tilt/Zoom) parameters are adjustable; the PTZ parameters may be adjusted based on a control instruction sent by the computer device to which the present solution belongs, or based on a control instruction of another device (such as another control device or a server).

Calibration of the camera device consists in establishing the relationship between pixel positions in the camera image and the target object, i.e., solving the parameters of the camera model from the correspondence between the coordinates of feature points in the image and their world coordinates under the camera imaging model. The model parameters to be calibrated include internal parameters and external parameters. For the same camera, the intrinsic matrix depends only on the internal properties of the camera and does not change regardless of the positional relationship between the calibration board and the camera. The extrinsic matrix, by contrast, reflects the positional relationship between the calibration board and the camera; this relationship differs from picture to picture, so each picture has its own extrinsic matrix. The internal parameters of the PTZ camera device may be pre-calculated, for example by determining in advance, with a calibration algorithm, the imaging parameter information (e.g., the camera intrinsics) corresponding to each zoom ratio of the PTZ camera device, or by computing in real time the imaging parameter information corresponding to the current zoom ratio, thereby obtaining the intrinsics of the PTZ camera device at different zoom ratios.

In some embodiments, the imaging parameter information includes a perspective projection matrix (e.g., the camera intrinsics) mapping the camera coordinate system of the camera device to the corresponding pixel coordinate system. The camera coordinate system is usually regarded as a special "object" coordinate system defined in the visible area of the camera: its origin is the camera center, its x-axis points to the right, its z-axis points forward (out of the screen, along the viewing direction), and its y-axis points downward (not "down" in the world, but down relative to the camera itself). The corresponding pixel coordinate system may be a planar rectangular coordinate system in physical distance units, or a rectangular coordinate system u-v in pixel units, established with the upper-left corner of the image captured by the PTZ camera device as its origin; in the latter case the abscissa u and ordinate v of a pixel are its column index and row index in the image array, respectively.
In other embodiments, the image capturing parameter information includes basic parameters of the PTZ image capturing device, such as a long focal length, a short focal length, a long axis optical center offset, a short axis optical center offset, and an image resolution of the video image sequence, where the image resolution includes a width pixel number W and a height pixel number H included in a corresponding image width, and the computer device may obtain a corresponding perspective projection matrix (such as camera parameters) based on the corresponding image capturing parameter information, for example, calculate and determine the corresponding perspective projection matrix according to the following formula:
[Formula (1): the homogeneous perspective projection matrix M_proj from the camera coordinate system to the pixel/image coordinate system, expressed in terms of the parameters listed below; given as an image in the original.]

wherein M_proj is the homogeneous perspective projection matrix from the camera coordinate system to the pixel coordinate system/image coordinate system, and the individual parameters are distinguished by their identifiers: the long focal length f_x and short focal length f_y corresponding to each zoom ratio, the long-axis optical-center offset c_x, the short-axis optical-center offset c_y, the far plane F, the near plane N, and the width W and height H of the video image sequence. The internal parameters of the PTZ camera device corresponding to different zoom ratios can be calibrated in advance through formula (1).
In some cases, the imaging parameter information further includes the current zoom ratio z_0. The computer device may obtain a corresponding perspective projection matrix (e.g., camera intrinsics) based on the corresponding imaging parameter information, for example by taking the long focal length f_x, short focal length f_y, long-axis optical-center offset c_x and short-axis optical-center offset c_y of the camera device without zooming, and calculating the corresponding perspective projection matrix according to the following formula:

[Formula (2): the perspective projection matrix computed from the un-zoomed f_x, f_y, c_x, c_y and the current zoom ratio z_0; given as an image in the original.]

Through formula (2), the intrinsics corresponding to the current zoom ratio are calculated and determined in real time from the current zoom ratio, so as to obtain the intrinsics of the PTZ camera device at different zoom ratios.
Based on the foregoing formula (1) or (2), we can calculate and determine a perspective projection matrix of the corresponding imaging coordinate system to the pixel coordinate system based on the basic parameters included in the imaging parameter information.
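As an illustration of how such a zoom-dependent projection can be assembled in practice, the following Python sketch builds a pinhole-style camera-to-pixel matrix from the base parameters named above, under the common assumption that the focal lengths scale linearly with the zoom ratio. Formulas (1) and (2) are only available as images in the original, so the exact entries (in particular any handling of the far/near planes F and N and the image size W and H) are illustrative rather than the patented form; all names and numeric values are assumptions.

```python
import numpy as np

def projection_matrix(fx, fy, cx, cy, zoom=1.0):
    """Sketch of a camera-to-pixel perspective projection matrix.

    fx, fy : focal lengths (in pixels) calibrated at zoom ratio 1.0
    cx, cy : optical-centre offsets (in pixels)
    zoom   : current zoom ratio z_0; assumed to scale the focal lengths linearly
    """
    # Assumed pinhole layout; the patent's formulas (1)/(2) may differ in detail.
    return np.array([
        [fx * zoom, 0.0,       cx,  0.0],
        [0.0,       fy * zoom, cy,  0.0],
        [0.0,       0.0,       1.0, 0.0],
    ])

# Example: intrinsics calibrated without zoom, evaluated at the current zoom ratio.
M_proj = projection_matrix(fx=1200.0, fy=1180.0, cx=960.0, cy=540.0, zoom=2.0)
```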
Of course, besides the above method, other camera calibration methods may be used to calculate the perspective projection matrix from the camera coordinate system to the pixel coordinate system, such as the traditional camera calibration method, the active vision camera calibration method and the camera self-calibration method. The traditional camera calibration method uses a calibration reference object and the correspondence between points on the reference object and their pixel positions in different images to form constraint conditions from which the camera model parameters are determined; examples include the Tsai two-step method and the Zhang calibration method. The Zhang calibration method uses a calibration board consisting of a two-dimensional grid: pictures of the board at different poses are collected, the pixel coordinates of the grid corner points are extracted, initial values of the camera's internal and external parameters are computed from the homography matrix, the distortion coefficients are estimated by nonlinear least squares, and finally the parameters are refined by maximum likelihood estimation. The active vision calibration method controls the rotation and translation of the camera with an active system, collects several groups of pictures while the camera moves, and solves for the camera intrinsics from the picture information and the corresponding poses; for example, the camera is made to perform a group of two-dimensional translational motions, images of a circular-hole target are collected, the image coordinates of the circle centers are computed while the camera's displacement is recorded, and the feature points obtained in this way are used to compute the calibration parameters. The camera self-calibration method needs no calibration object: it calibrates directly from the correspondences between points across the multiple images acquired by the camera, e.g., self-calibration based on the Kruppa equations and hierarchical step-by-step calibration. The Kruppa-based self-calibration method establishes constraint equations on the camera intrinsic matrix through a conic and needs at least 3 pairs of images to calibrate the camera; the length of the image sequence affects the stability of the calibration algorithm, and the plane at infinity in projective space cannot be guaranteed. The hierarchical step-by-step calibration method first performs a projective reconstruction of the image sequence, then performs affine calibration and Euclidean calibration on the basis of this reconstruction, and obtains the camera intrinsic parameters through a nonlinear optimization algorithm.
In some embodiments, the computer device may establish or update a mapping relationship based on the current zoom ratio and the solved intrinsics, the mapping relationship covering one or more zoom ratios of the PTZ camera device, with a one-to-one correspondence between each zoom ratio and its set of internal parameters.
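A minimal sketch of such a zoom-ratio-to-intrinsics mapping, assuming a simple in-memory lookup table; the structure and names below are illustrative assumptions, not a data format prescribed by the patent.

```python
# Hypothetical cache of intrinsics per zoom ratio, stored as (fx, fy, cx, cy).
intrinsics_by_zoom: dict[float, tuple[float, float, float, float]] = {}

def update_intrinsics(zoom: float, fx: float, fy: float, cx: float, cy: float) -> None:
    """Establish or update the one-to-one zoom-ratio -> intrinsics mapping."""
    intrinsics_by_zoom[round(zoom, 3)] = (fx, fy, cx, cy)

def lookup_intrinsics(zoom: float):
    """Return cached intrinsics for a zoom ratio, or None if not yet calibrated."""
    return intrinsics_by_zoom.get(round(zoom, 3))
```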
In step S102, current image information currently captured by the PTZ camera device and first angle information of the bearing device when the current image information is captured are acquired. For example, the PTZ camera device is mounted on a corresponding bearing device, which includes, but is not limited to, the pan/tilt head of the PTZ camera. A pan/tilt head is a supporting device for mounting and fixing a camera and comes in two kinds, fixed and motorized. A fixed pan/tilt head is suitable when the monitoring range is small: after the camera is mounted on it, the camera's horizontal and pitch angles can be adjusted, and the adjusting mechanism is locked once the best working posture is reached. A motorized pan/tilt head is suitable for scanning and monitoring a large range and can enlarge the monitoring range of the camera. Because the pan/tilt head rotates, a change of its rotational pose causes a change of the first angle information, which includes the yaw angle in the horizontal direction and the pitch angle in the vertical direction. The coordinate transformation information (e.g., the extrinsics) of the camera coordinate system of the PTZ camera device with respect to the world coordinate system changes with the first angle information of the pan/tilt head. Given the real-time first angle information of the pan/tilt head, such as its pitch angle and yaw angle, the coordinate transformation between the camera coordinate system of the PTZ camera device and the world coordinate system can be obtained, e.g., as an observation matrix (the extrinsics) from the world coordinate system to the camera coordinate system; the first transformation information from the world coordinate system to the pixel coordinate system of the camera device can then be determined based on the observation matrix and the perspective projection matrix. Here the world coordinate system is a spatial rectangular coordinate system O_xyz constructed with the camera at a pitch angle of 0 and a yaw angle of 0: the camera's right direction is the x-axis, its up direction is the y-axis, its forward direction is the z-axis, and the camera's own position is the origin. The pan/tilt head of the PTZ camera can adjust the three Pan/Tilt/Zoom parameters, where Pan denotes the yaw angle and Tilt denotes the pitch angle. When the PTZ camera device acquires an image, the bearing pan/tilt head has a corresponding deflection angle in each direction; the computer device takes the angle information of the bearing pan/tilt head at the moment the PTZ camera device acquires the current image information as the first angle information, which indicates the deflection angles of the bearing pan/tilt head in the different directions when the current image information was shot, such as the yaw angle in the horizontal direction and/or the pitch angle in the vertical direction.
In step S103, an area image position of a frame selection area in the current image information is obtained, where the frame selection area is used to indicate the direction to which the bearing device is to be adjusted. For example, the computer device determines the corresponding frame selection area from operations such as clicking, frame selection, dragging, touching or sliding performed by a user in the current image information acquired by the PTZ camera device, and determines the area image position information corresponding to that frame selection area; alternatively, based on a recognition object identified in the current image information from relevant template features in a database, the image position of the area where the recognition object is located is taken as the area image position of the frame selection area. Specifically, the recognition object may be a certain geographical location, a landmark street, a building, a car, an animal, a person, or the like. The frame selection area is a specific marked area corresponding to some position, determined by the corresponding operation, by target recognition, or the like; it may be a marked area of triangular, rectangular, circular or any other shape. The area image position may also be a set of pixel/image coordinates determined based on user operations or image recognition at another device and sent to the computer device over a communication connection between the two devices; the computer device receives this set of pixel/image coordinates and takes it as the area image position of the frame selection area. The area image position thus indicates the set of pixel/image positions corresponding to the frame selection area, and the frame selection area indicates the direction to which the corresponding camera device is to be adjusted. In some cases, another device (such as a command device) may adjust the acquisition area of the camera device based on user operations at that device, select the corresponding intended acquisition area from the current image, and transmit the area image position of the frame selection area in the current image information to the computer device; in other cases, another device (such as a command device) may upload object template features corresponding to an intended object, or object identification information indicating those template features, and the computer device recognizes the image position where the intended object is located in the current image and takes it as the area image position.
In step S104, rotation angle information of the bearing device is determined according to the imaging parameter information, the first angle information and the area image position, where the rotation angle information indicates the angle by which the PTZ camera device is to be adjusted from its current imaging posture so that it directly faces the spatial position corresponding to the area image position. For example, after acquiring the corresponding imaging parameter information, first angle information and area image position, the computer device can determine the coordinate transformation from the camera coordinate system to the pixel coordinate system and from the world coordinate system to the camera coordinate system, and can therefore calculate the first transformation information from the world coordinate system to the pixel coordinate system. The computer device can then map the area image position into the world coordinate system based on the first transformation information, and from this calculate the rotation angle information of the current PTZ camera device, i.e., the angle by which the PTZ camera device must be adjusted from its current imaging posture so that it directly faces the spatial position corresponding to the area image position. The rotation angle information may be the relative angle change from the current imaging attitude information to the target attitude information, or the absolute angle information of the target attitude information, where the target attitude information indicates the imaging attitude in which the camera device directly faces the spatial position corresponding to the area image position of the frame selection area; when the camera device is in the target attitude, the center of the picture it acquires coincides with the center of the frame selection area.
In some embodiments, the first angle information includes yaw angle information in a horizontal direction and pitch angle information in a vertical direction, and the rotation angle information includes yaw rotation angle information in the horizontal direction and pitch rotation angle information in the vertical direction. For example, the corresponding first angle information includes yaw angle information of the bearing holder in the horizontal direction and pitch angle information in the vertical direction, and the corresponding rotation angle information includes yaw rotation angle information in the horizontal direction and pitch rotation angle information in the vertical direction, where the yaw rotation angle information in the horizontal direction includes yaw angle change information/target yaw angle information in the horizontal direction, and the pitch rotation angle information in the vertical direction includes pitch angle change information/target pitch angle information in the vertical direction, and the like. For example, the computer device may first calculate and determine the corresponding yaw angle change information and pitch angle change information, and determine the corresponding target yaw angle information and target pitch angle information by combining the first angle information. Or, the computer device may calculate and determine the target yaw angle information and the target pitch angle information first, and determine the corresponding yaw angle change information and pitch angle change information by combining the first angle information.
In some embodiments, the method further includes step S105 (not shown). In step S105, a corresponding rotation adjustment instruction is sent to the corresponding bearing device, so that the PTZ camera device is adjusted to directly face the spatial position of the frame selection area and performs image acquisition, where the rotation adjustment instruction includes the rotation angle information. For example, after the computer device determines the corresponding rotation angle information, it may send a rotation adjustment instruction containing that rotation angle information to the corresponding bearing device. The bearing device receives the rotation adjustment instruction, adjusts the corresponding yaw angle and pitch angle based on it, and thereby moves to the corresponding target attitude. For example, when the rotation angle information includes the yaw angle change in the horizontal direction and the pitch angle change in the vertical direction, the bearing device determines the target yaw angle information and target pitch angle information from the first angle information and the rotation angle information, and adjusts the current yaw angle and pitch angle to those target values; for another example, when the rotation angle information includes the target yaw angle information and target pitch angle information, the bearing device directly adjusts the current yaw angle and pitch angle to those target values. In some cases, the bearing device adjusts the two directions in order: first the yaw angle in the horizontal direction, then the pitch angle in the vertical direction.
In some embodiments, step S104 includes a sub-step S1041 (not shown), a sub-step S1042 (not shown) and a sub-step S1043 (not shown). In sub-step S1041, first transformation information from the world coordinate system to the pixel coordinate system of the PTZ camera device is determined according to the imaging parameter information and the first angle information; in sub-step S1042, the area-center image position corresponding to the area center of the frame selection area is determined according to the area image position; in sub-step S1043, rotation angle information of the bearing device is determined according to the area-center image position and the first transformation information, where the rotation angle information indicates the angle by which the PTZ camera device is to be adjusted from its current imaging posture so that it directly faces the spatial position corresponding to the area image position. For example, after acquiring the corresponding imaging parameter information and first angle information, the computer device can determine the perspective projection matrix and the observation matrix (i.e., the internal and external parameters) of the PTZ camera device, convert the area-center image position from the pixel coordinate system into the corresponding world coordinate system, calculate the angle between the ray direction corresponding to the area-center image position and the ray direction corresponding to the image center of the current image information (i.e., the direction of the optical axis), and determine the corresponding rotation angle information from this angle. In general, the calculation for an area is reduced to a single representative point: the area center (for example, the centroid or center of gravity of the area) can be computed statistically from a plurality of selected points, so that the area-center image position represents the frame selection area and the corresponding rotation angle information is calculated from it. For example, when the frame selection area is a rectangle, the area-center image position corresponding to the center of the rectangle is determined from the area image position of the rectangle, e.g., from the two pixel/image coordinates of opposite corners of the rectangle (such as the upper-left and lower-right corners); when the frame selection area is a circle, the area-center image position corresponding to the circle center is determined from the area image position of the circle, e.g., from two pixel/image coordinates on the circle and the radius.
In some embodiments, in sub-step S1041, a perspective projection matrix from the camera coordinate system of the PTZ camera device to the corresponding pixel coordinate system is determined according to the imaging parameter information; an observation matrix from the world coordinate system to the camera coordinate system is determined according to the first angle information; and the first transformation information from the world coordinate system to the pixel coordinate system is determined according to the perspective projection matrix and the observation matrix. Because the origin of the world coordinate system coincides with the origin of the camera coordinate system, the transformation from the world coordinate system to the camera coordinate system involves only a rotation matrix and no translation. The perspective projection matrix from the camera coordinate system to the pixel coordinate system can be determined from formula (1) or formula (2), and the observation matrix from the world coordinate system to the camera coordinate system from the corresponding first angle information, so that the first transformation information from the world coordinate system to the pixel coordinate system is obtained by combining the two. Specifically, from the first angle information of the bearing device (the yaw angle φ_0 in the horizontal direction and the pitch angle θ_0 in the vertical direction), the rotation parameters of the PTZ camera device are determined:

[Formula (3): the rotation matrices R_x and R_y, constructed from the pitch angle θ_0 and the yaw angle φ_0 respectively; given as an image in the original.]

Based on the rotation parameters R_x and R_y, the corresponding observation matrix is computed as

M_view = R_x R_y    (4)

and, combined with the perspective projection matrix determined by formula (1) or (2), the corresponding first transformation information is

M_projview = M_proj M_view    (5)

where M_projview denotes the first transformation information from the world coordinate system to the pixel coordinate system.
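As a sketch of how the observation matrix and the combined world-to-pixel transform of formulas (4) and (5) can be formed, the Python snippet below builds R_x and R_y as elementary rotation matrices from the pitch and yaw angles and multiplies them with the projection matrix from the earlier sketch. The axis and sign conventions are assumptions (formula (3) is only available as an image in the original), so the snippet shows the structure of the computation rather than the exact patented matrices.

```python
import numpy as np

def rotation_x(pitch: float) -> np.ndarray:
    """Elementary rotation about the x-axis (assumed to carry the pitch angle theta_0)."""
    c, s = np.cos(pitch), np.sin(pitch)
    return np.array([[1, 0, 0],
                     [0, c, -s],
                     [0, s,  c]])

def rotation_y(yaw: float) -> np.ndarray:
    """Elementary rotation about the y-axis (assumed to carry the yaw angle phi_0)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[ c, 0, s],
                     [ 0, 1, 0],
                     [-s, 0, c]])

def world_to_pixel(M_proj: np.ndarray, yaw: float, pitch: float) -> np.ndarray:
    """First transformation information M_projview = M_proj * M_view, with M_view = R_x R_y."""
    M_view = rotation_x(pitch) @ rotation_y(yaw)   # formula (4)
    M_view_h = np.eye(4)
    M_view_h[:3, :3] = M_view                      # no translation: shared origin
    return M_proj @ M_view_h                       # formula (5)
```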
In some embodiments, in sub-step S1042, the center position of the minimum bounding rectangle of the area image position is determined as the area-center image position corresponding to the area center of the frame selection area. For example, the frame selection area need not have a particular shape. When its shape is regular, the area-center image position corresponding to its area center can be determined directly from its area image position; when its shape is irregular, the minimum bounding rectangle of the frame selection area is considered instead and the corresponding area-center image position is determined from that rectangle. After the computer device obtains the minimum bounding rectangle of the frame selection area, the center position of the rectangle can be determined from its corner coordinates and taken as the area-center image position corresponding to the area center of the frame selection area; for example, the averages in the u and v directions are computed from two diagonal pixel/image coordinates of the minimum bounding rectangle in the pixel coordinate system (e.g., the upper-left and lower-right corners), and these averages are taken as the pixel coordinates of the area-center image position. Of course, the calculation may also combine several corner points or points on the border of the minimum bounding rectangle, and is not limited here. Likewise, when the shape of the frame selection area is regular, the area-center image position corresponding to its area center may also be determined from the minimum bounding rectangle, which is not limited here.
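A minimal sketch of this sub-step, assuming the frame selection area arrives as a set of pixel coordinates: the axis-aligned minimum bounding rectangle is formed and its center returned as the area-center image position. Function and variable names are illustrative.

```python
from typing import Iterable, Tuple

def region_center(pixels: Iterable[Tuple[float, float]]) -> Tuple[float, float]:
    """Center (u, v) of the minimum bounding rectangle of a framed region."""
    us, vs = zip(*pixels)
    s_x, s_y = min(us), min(vs)      # upper-left corner of the bounding rectangle
    s_x2, s_y2 = max(us), max(vs)    # lower-right corner
    return (s_x + s_x2) / 2.0, (s_y + s_y2) / 2.0

# Example: a roughly rectangular selection drawn by the user.
center_u, center_v = region_center([(310, 220), (480, 220), (480, 355), (310, 355)])
```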
In some embodiments, in sub-step S1043, first vector information of the facing direction of the current image information is constructed according to the first transformation information; second vector information of the spatial direction corresponding to the area center is determined according to the first transformation information and the area-center image position; and rotation angle information of the bearing device is determined according to the first vector information and the second vector information, where the rotation angle information indicates the angle by which the PTZ camera device is to be adjusted from its current imaging posture so that it directly faces the spatial position corresponding to the area image position. For example, after the computer device has obtained the first transformation information from the world coordinate system to the pixel coordinate system, the first vector information of the facing direction of the current image can be determined based on the first transformation information; the first vector information is the unit vector of the direction in which the ray from the camera's optical center points along the optical axis, computed as follows:

[Formula: the unit vector V_c of the camera's current facing direction, obtained from the first transformation information M_projview and normalized by a normalization function norm; given as an image in the original, together with its x, y, z components.]

Here V_c has length 1 and is the unit vector corresponding to the current facing direction of the camera, and norm is a normalization function. Correspondingly, the computer device determines the area-center image position, e.g.

[Formula: the area-center image position in the pixel coordinate system, ((s_x + s'_x)/2, (s_y + s'_y)/2), where (s_x, s_y) is the upper-left corner of the minimum bounding rectangle and (s'_x, s'_y) its lower-right corner; given as an image in the original.]

Based on the coordinates of the area-center image position in the pixel coordinate system and the first transformation information, the computer device can then compute the unit vector corresponding to the center point in the spatial rectangular coordinate system:

[Formula: the unit vector V_s of the ray from the camera's optical center towards the center point; given as an image in the original.]

Here V_s, the unit vector of the ray direction from the camera's optical center to the center point, is the second vector information and has length 1. After the computer device has obtained the first vector information and the second vector information, the corresponding included angle can be solved from the two vectors, and the corresponding rotation angle information is determined from that included angle; for example, the angle between the two vectors is computed and decomposed into the horizontal and vertical directions, thereby determining the yaw rotation angle information in the horizontal direction and the pitch rotation angle information in the vertical direction.
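The following Python sketch shows one way to realize these two vectors, under the assumption that the facing-direction vector is obtained by unprojecting the image center and the second vector by unprojecting the area-center pixel through the left 3x3 block of M_projview; the exact unprojection expressions are only available as images in the original, so this is an illustrative reconstruction. It reuses world_to_pixel(), M_proj, center_u and center_v from the sketches above, and the image size and angles are assumed values.

```python
import numpy as np

def pixel_ray(M_projview: np.ndarray, u: float, v: float) -> np.ndarray:
    """Unit vector (world coordinates) of the viewing ray through pixel (u, v).

    Because the world and camera origins coincide, only the left 3x3 block of the
    3x4 world-to-pixel matrix is needed to recover a ray direction.
    """
    A = M_projview[:3, :3]
    d = np.linalg.solve(A, np.array([u, v, 1.0]))
    return d / np.linalg.norm(d)  # the 'norm' normalization mentioned in the text

# Combined transform from the earlier sketch (formula (5)) and the image size W x H.
W, H = 1920, 1080
M_projview = world_to_pixel(M_proj, yaw=np.deg2rad(30.0), pitch=np.deg2rad(-10.0))

v_facing = pixel_ray(M_projview, W / 2.0, H / 2.0)     # first vector: current facing direction
v_target = pixel_ray(M_projview, center_u, center_v)   # second vector: towards the selection center
```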
In some embodiments, determining the rotation angle information of the bearing device according to the first vector information and the second vector information includes: converting the first vector information and the second vector information into a spherical polar coordinate system and determining the corresponding first spherical polar vector information and second spherical polar vector information; and calculating the corresponding angle differences from the first spherical polar vector information and the second spherical polar vector information, so as to obtain the rotation angle information of the bearing device from those angle differences, where the rotation angle information indicates the angle by which the PTZ camera device is to be adjusted from its current imaging posture so that it directly faces the spatial position corresponding to the area image position. For example, because computing the included angle between two vectors directly in the spatial rectangular coordinate system easily introduces an angle deviation, the two vectors are converted into a spherical polar coordinate system and the corresponding first and second spherical polar vector information is determined; a spherical polar coordinate system locates points, lines, planes and bodies in three-dimensional space by an azimuth angle, an elevation angle and a distance, taking the coordinate origin as the reference point. So that the movement of the pan/tilt head follows the rule of spherical polar coordinates, the first vector V_c and the second vector V_s are converted into spherical polar coordinate vectors on the unit sphere:

[Formula: conversion of a unit vector with components v_x, v_y, v_z along the three axes O_x, O_y, O_z into spherical polar coordinates (azimuth angle, elevation angle) on the unit sphere; given as an image in the original.]

The resulting first spherical polar coordinates of V_c are written (φ_c, θ_c) and the second spherical polar coordinates of V_s are written (φ_s, θ_s) (both expressions are given as images in the original). The differences of the first and second spherical polar coordinates give the angle change information:

Δφ = φ_s - φ_c

Δθ = θ_s - θ_c

where Δφ indicates the yaw angle change information in the horizontal direction, i.e., the amount by which the yaw angle changes, and Δθ indicates the pitch angle change information in the vertical direction, i.e., the amount by which the pitch angle changes. An additional case distinction, given as images in the original, adjusts the computed value when its condition holds. Based on the yaw angle change information and the pitch angle change information together with the first angle information (φ_0, θ_0), the corresponding target yaw angle information φ' and target pitch angle information θ' can be calculated, e.g. φ' = Δφ + φ_0, θ' = Δθ + θ_0.
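A sketch of this conversion and of the angle differences, under assumed axis conventions (y up, azimuth measured in the x-z plane), reusing v_facing and v_target from the previous sketch. The patent's own conversion formulas and case distinction are only available as images, so the atan2/asin form and the plus-or-minus 2π wrap below are standard substitutes rather than the patented expressions.

```python
import numpy as np

def to_spherical(v: np.ndarray) -> tuple[float, float]:
    """Azimuth (yaw-like) and elevation (pitch-like) of a unit vector.

    Assumed convention: y is 'up', azimuth is measured in the x-z plane.
    """
    vx, vy, vz = v
    azimuth = np.arctan2(vx, vz)
    elevation = np.arcsin(np.clip(vy, -1.0, 1.0))
    return azimuth, elevation

def rotation_deltas(v_facing: np.ndarray, v_target: np.ndarray) -> tuple[float, float]:
    """Yaw and pitch changes that bring the facing direction onto the target direction."""
    phi_c, theta_c = to_spherical(v_facing)
    phi_s, theta_s = to_spherical(v_target)
    d_phi = phi_s - phi_c
    d_theta = theta_s - theta_c
    # Wrap the yaw change into (-pi, pi]; a standard substitute for the patent's case distinction.
    if d_phi > np.pi:
        d_phi -= 2.0 * np.pi
    elif d_phi <= -np.pi:
        d_phi += 2.0 * np.pi
    return d_phi, d_theta

# Target absolute angles from the current pan/tilt pose (phi_0, theta_0):
phi_0, theta_0 = np.deg2rad(30.0), np.deg2rad(-10.0)
d_phi, d_theta = rotation_deltas(v_facing, v_target)
phi_target, theta_target = phi_0 + d_phi, theta_0 + d_theta
```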
In some embodiments, the method further includes step S106 (not shown). In step S106, corresponding zoom ratio change information is determined according to the frame selection area and the resolution information of the current image information, where the zoom ratio change information is used for focusing the acquisition area of the PTZ camera device onto the frame selection area. For example, so that the PTZ camera device can acquire the spatial region corresponding to the frame selection area clearly and completely, the frame selection area is taken as the region to be acquired by the PTZ camera device: the rotation angle information aligns the camera to directly face that spatial region, and the zoom ratio adjusts the focal length of the corresponding PTZ camera device so that, after zooming, the content of the frame selection area fills the acquired image as far as possible. In some embodiments, the zoom ratio change information is determined from the ratio between the resolution of the current image information and that of the frame selection area; the zoom ratio change information may be the relative scale change from the current zoom ratio to the target zoom ratio, i.e., the zoom-ratio variable, or the absolute zoom ratio information of the target zoom ratio. The target zoom ratio may be determined from the current zoom ratio and the zoom-ratio variable, and conversely the zoom-ratio variable may be determined from the target zoom ratio and the current zoom ratio. For example, the ratio of the width of the current image information to the width of the frame selection area and the ratio of the height of the current image information to the height of the frame selection area are each computed, and the zoom-ratio variable takes the minimum of the two ratios. For example, if the corresponding frame selection area is a rectangular region, the zoom-ratio variable Δz is the minimum of the ratio of the width of the current image information to the width of the frame selection area and the ratio of its height to the height of the frame selection area:
Δz = min( W / (m'_x - m_x), H / (m'_y - m_y) )

where the upper-left corner of the framed rectangular area is (m_x, m_y), its lower-right corner is (m'_x, m'_y), and W and H are the width and height of the current image information.
For another example, if the corresponding frame selection area is a circular region, the zoom-ratio variable Δz is the minimum of the ratio of the width of the current image information to the radius of the frame selection area and the ratio of its height to the radius of the frame selection area:

Δz = min( W / r, H / r )

where r is the radius of the framed circular area.
In some embodiments, determining the corresponding zoom ratio change information according to the frame selection area and the resolution information of the current image information includes: determining the corresponding zoom ratio change information according to the minimum bounding rectangle of the frame selection area and the ratio between the resolution of the minimum bounding rectangle and that of the current image information, where the zoom ratio change information is used for focusing the acquisition area of the PTZ camera device onto the frame selection area. The zoom ratio change information may be the zoom-ratio variable or the target zoom ratio. For example, the zoom-ratio variable is determined by the ratio between the resolution of the current image information and that of the minimum bounding rectangle of the frame selection area: the computer device determines the minimum bounding rectangle of the frame selection area and, given the current image information, takes as the zoom-ratio variable the minimum of the ratio of the width of the current image information to the width of the minimum bounding rectangle and the ratio of its height to the height of the minimum bounding rectangle. For example, the zoom-ratio variable is as follows:
Δz = min( W / (s'_x - s_x), H / (s'_y - s_y) )

where Δz describes the zoom variable, (s_x, s_y) denotes the upper-left corner of the minimum bounding rectangle and (s'_x, s'_y) its lower-right corner.
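A sketch of the zoom-ratio variable for a rectangular selection, following the min-of-ratios rule above; the helper assumes the frame selection area is given by the corners of its minimum bounding rectangle, and the names and values are illustrative.

```python
def zoom_change(img_w: float, img_h: float,
                s_x: float, s_y: float, s_x2: float, s_y2: float) -> float:
    """Zoom-ratio variable dz = min(image width / box width, image height / box height)."""
    box_w = max(s_x2 - s_x, 1e-6)   # guard against a degenerate selection
    box_h = max(s_y2 - s_y, 1e-6)
    return min(img_w / box_w, img_h / box_h)

# Example: target zoom ratio from the current ratio z0 and the variable dz (z = dz * z0).
z0 = 2.0
dz = zoom_change(1920, 1080, 310, 220, 480, 355)
z_target = dz * z0
```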
In some embodiments, the method further includes step S107 (not shown). In step S107, the zoom ratio change information is sent to the PTZ camera device so that the acquisition area of the PTZ camera device is adjusted to focus on the frame selection area. For example, after the computer device determines the corresponding zoom ratio change information, it may transmit the zoom ratio change information to the PTZ camera device, so that the corresponding zooming is completed before the PTZ camera device acquires the frame selection area and a complete, clear image of the frame selection area can be acquired. For example, the computer device sends the zoom-ratio variable Δz to the PTZ camera device, which determines the target zoom ratio z from the current zoom ratio z_0, e.g. z = Δz·z_0, and adjusts the current zoom ratio to the target zoom ratio; for another example, the computer device sends the target zoom ratio to the PTZ camera device, which directly adjusts the current zoom ratio to the target zoom ratio.
Embodiments of a method for determining rotation angle information of a carrier device according to an aspect of the present application are mainly described above, and further, specific devices capable of implementing the above embodiments are provided in the present application, which is described below with reference to fig. 2.
Fig. 2 shows a computer device 100 for determining rotation angle information of a bearing device for carrying a PTZ camera device according to an aspect of the present application, which specifically comprises a first module 101, a second module 102, a third module 103 and a fourth module 104. The first module 101 is configured to obtain the imaging parameter information of the PTZ camera device; the second module 102 is configured to obtain current image information currently captured by the PTZ camera device and first angle information of the bearing device when the current image information is captured; the third module 103 is configured to obtain an area image position of a frame selection area in the current image information, where the frame selection area is used to indicate the direction to which the bearing device is to be adjusted; the fourth module 104 is configured to determine rotation angle information of the bearing device according to the imaging parameter information, the first angle information and the area image position, where the rotation angle information indicates the angle by which the PTZ camera device is to be adjusted from its current imaging posture so that it directly faces the spatial position corresponding to the area image position.
In some embodiments, the first angle information includes yaw angle information in a horizontal direction and pitch angle information in a vertical direction, and the rotation angle information includes yaw rotation angle information in the horizontal direction and pitch rotation angle information in the vertical direction.
In some embodiments, the fourth module 104 includes a sub-unit 1041 (not shown), a sub-unit 1042 (not shown) and a sub-unit 1043 (not shown). The sub-unit 1041 is configured to determine first transformation information from the world coordinate system to the pixel coordinate system of the PTZ camera device according to the imaging parameter information and the first angle information; the sub-unit 1042 is configured to determine, according to the area image position, the area-center image position corresponding to the area center of the frame selection area; the sub-unit 1043 is configured to determine rotation angle information of the bearing device according to the area-center image position and the first transformation information, where the rotation angle information indicates the angle by which the PTZ camera device is to be adjusted from its current imaging posture so that it directly faces the spatial position corresponding to the area image position. In some embodiments, the sub-unit 1041 is configured to determine a perspective projection matrix from the camera coordinate system of the PTZ camera device to the corresponding pixel coordinate system according to the imaging parameter information; determine an observation matrix from the world coordinate system to the camera coordinate system according to the first angle information; and determine the first transformation information from the world coordinate system to the pixel coordinate system from the perspective projection matrix and the observation matrix. In some embodiments, the sub-unit 1042 is configured to determine, according to the minimum bounding rectangle of the area image position, the center position of the minimum bounding rectangle as the area-center image position corresponding to the area center of the frame selection area. In some embodiments, the sub-unit 1043 is configured to construct first vector information of the facing direction of the current image information according to the first transformation information; determine second vector information of the spatial direction corresponding to the area center according to the first transformation information and the area-center image position; and determine rotation angle information of the bearing device according to the first vector information and the second vector information, where the rotation angle information indicates the angle by which the PTZ camera device is to be adjusted from its current imaging posture so that it directly faces the spatial position corresponding to the area image position. In some embodiments, determining the rotation angle information of the bearing device according to the first vector information and the second vector information includes: converting the first vector information and the second vector information into a spherical polar coordinate system and determining the corresponding first spherical polar vector information and second spherical polar vector information; and calculating the corresponding angle differences from the first spherical polar vector information and the second spherical polar vector information, so as to obtain the rotation angle information of the bearing device from those angle differences, where the rotation angle information indicates the angle by which the PTZ camera device is to be adjusted from its current imaging posture so that it directly faces the spatial position corresponding to the area image position.
Here, the specific implementations corresponding to the one-one module 101, the one-two module 102, the one-three module 103, and the one-four module 104 shown in FIG. 2 are the same as or similar to the embodiments of step S101, step S102, step S103, and step S104 shown in FIG. 1, and are therefore not repeated here but are incorporated herein by reference.
In some embodiments, the apparatus further includes a one-five module (not shown), configured to send a corresponding rotation adjustment instruction to the corresponding bearing device, so that the PTZ imaging apparatus is adjusted to face the spatial position corresponding to the frame selection area and performs image acquisition, where the rotation adjustment instruction includes the rotation angle information.
In some embodiments, the apparatus further includes a one-six module (not shown), configured to determine corresponding zoom ratio change information according to the frame selection area and resolution information of the current image information, where the zoom ratio change information is used to focus the acquisition area of the PTZ imaging apparatus to the frame selection area. In some embodiments, the determining of the corresponding zoom ratio change information according to the frame selection area and the resolution information of the current image information includes: determining the corresponding zoom ratio change information according to resolution information of the minimum circumscribed rectangular frame of the frame selection area and of the current image information, where the zoom ratio change information is used to focus the acquisition area of the PTZ imaging apparatus to the frame selection area. In some embodiments, the apparatus further includes a one-seven module (not shown), configured to send the zoom ratio change information to the PTZ imaging apparatus, so that the acquisition area of the PTZ imaging apparatus is adjusted to focus on the frame selection area.
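As a companion to the sketch above, the following short Python function illustrates, again only as an assumption-laden example rather than the application's own formula, how the zoom ratio change information could be derived from the minimum circumscribed rectangular frame of the frame selection area and the resolution of the current image: the view is narrowed by roughly the factor needed for the rectangle to fill the frame, and in practice the result would be clamped to the zoom range supported by the PTZ imaging apparatus.

def zoom_ratio_change(region_pixels, image_width, image_height):
    # Minimum circumscribed rectangle of the frame selection area, in pixels.
    xs = [p[0] for p in region_pixels]
    ys = [p[1] for p in region_pixels]
    rect_w, rect_h = max(xs) - min(xs), max(ys) - min(ys)
    if rect_w <= 0 or rect_h <= 0:
        return 1.0  # degenerate selection: keep the current zoom
    # The limiting dimension decides how far the view can be narrowed while
    # still containing the whole frame selection area.
    return min(image_width / rect_w, image_height / rect_h)

# Example: a 300x200-pixel selection in a 1920x1080 frame suggests roughly a 5.4x zoom-in.
print(zoom_ratio_change([(800, 400), (1100, 600)], 1920, 1080))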
Here, the specific implementations corresponding to the one-five module through the one-seven module are the same as or similar to the embodiments of steps S105 to S107, and are therefore not repeated here but are incorporated herein by reference.
In addition to the methods and apparatus described in the embodiments above, the present application also provides a computer readable storage medium storing computer code that, when executed, performs the method as described in any of the previous items.
The present application also provides a computer program product, which, when executed by a computer device, performs the method as described in any of the previous items.
The present application further provides a computer device, comprising:
one or more processors;
a memory for storing one or more computer programs;
the one or more computer programs, when executed by the one or more processors, cause the one or more processors to implement the method as described in any of the previous items.
FIG. 3 illustrates an exemplary system that can be used to implement the various embodiments described herein;
in some embodiments, as shown in FIG. 3, the system 300 can be implemented as any of the above-described devices in the various described embodiments. In some embodiments, system 300 may include one or more computer-readable media (e.g., system memory or NVM/storage 320) having instructions and one or more processors (e.g., processor(s) 305) coupled with the one or more computer-readable media and configured to execute the instructions to implement modules to perform the actions described herein.
For one embodiment, system control module 310 may include any suitable interface controllers to provide any suitable interface to at least one of processor(s) 305 and/or to any suitable device or component in communication with system control module 310.
The system control module 310 may include a memory controller module 330 to provide an interface to the system memory 315. Memory controller module 330 may be a hardware module, a software module, and/or a firmware module.
System memory 315 may be used, for example, to load and store data and/or instructions for system 300. For one embodiment, system memory 315 may include any suitable volatile memory, such as suitable DRAM. In some embodiments, the system memory 315 may include a double data rate type four synchronous dynamic random access memory (DDR 4 SDRAM).
For one embodiment, system control module 310 may include one or more input/output (I/O) controllers to provide an interface to NVM/storage 320 and communication interface(s) 325.
For example, NVM/storage 320 may be used to store data and/or instructions. NVM/storage 320 may include any suitable non-volatile memory (e.g., flash memory) and/or may include any suitable non-volatile storage device(s) (e.g., one or more Hard Disk Drives (HDDs), one or more Compact Disc (CD) drives, and/or one or more Digital Versatile Disc (DVD) drives).
NVM/storage 320 may include storage resources that are physically part of the device on which system 300 is installed or may be accessed by the device and not necessarily part of the device. For example, NVM/storage 320 may be accessible over a network via communication interface(s) 325.
Communication interface(s) 325 may provide an interface for system 300 to communicate over one or more networks and/or with any other suitable device. System 300 may wirelessly communicate with one or more components of a wireless network according to any of one or more wireless network standards and/or protocols.
For one embodiment, at least one of the processor(s) 305 may be packaged together with logic for one or more controller(s) (e.g., memory controller module 330) of the system control module 310. For one embodiment, at least one of the processor(s) 305 may be packaged together with logic for one or more controller(s) of the system control module 310 to form a System In Package (SiP). For one embodiment, at least one of the processor(s) 305 may be integrated on the same die with logic for one or more controller(s) of the system control module 310. For one embodiment, at least one of the processor(s) 305 may be integrated on the same die with logic for one or more controller(s) of the system control module 310 to form a system on a chip (SoC).
In various embodiments, system 300 may be, but is not limited to being: a server, a workstation, a desktop computing device, or a mobile computing device (e.g., a laptop computing device, a handheld computing device, a tablet, a netbook, etc.). In various embodiments, system 300 may have more or fewer components and/or different architectures. For example, in some embodiments, system 300 includes one or more cameras, a keyboard, a Liquid Crystal Display (LCD) screen (including a touch screen display), a non-volatile memory port, multiple antennas, a graphics chip, an Application Specific Integrated Circuit (ASIC), and speakers.
It should be noted that the present application may be implemented in software and/or a combination of software and hardware, for example, implemented using Application Specific Integrated Circuits (ASICs), general purpose computers or any other similar hardware devices. In one embodiment, the software programs of the present application may be executed by a processor to implement the steps or functions described above. Likewise, the software programs (including associated data structures) of the present application may be stored in a computer readable recording medium, such as RAM memory, magnetic or optical drive or diskette and the like. Further, some of the steps or functions of the present application may be implemented in hardware, for example, as circuitry that cooperates with the processor to perform various steps or functions.
In addition, some of the present application may be implemented as a computer program product, such as computer program instructions, which when executed by a computer, may invoke or provide methods and/or techniques in accordance with the present application through the operation of the computer. Those skilled in the art will appreciate that the form in which the computer program instructions reside on a computer-readable medium includes, but is not limited to, source files, executable files, installation package files, and the like, and that the manner in which the computer program instructions are executed by a computer includes, but is not limited to: the computer directly executes the instruction, or the computer compiles the instruction and then executes the corresponding compiled program, or the computer reads and executes the instruction, or the computer reads and installs the instruction and then executes the corresponding installed program. Computer-readable media herein can be any available computer-readable storage media or communication media that can be accessed by a computer.
Communication media includes media by which communication signals, including, for example, computer readable instructions, data structures, program modules, or other data, are transmitted from one system to another. Communication media may include conductive transmission media such as cables and wires (e.g., fiber optics, coaxial, etc.) and wireless (non-conductive transmission) media capable of propagating energy waves such as acoustic, electromagnetic, RF, microwave, and infrared. Computer readable instructions, data structures, program modules or other data may be embodied in a modulated data signal, such as a carrier wave or similar mechanism that is embodied in a wireless medium, such as part of spread-spectrum techniques, for example. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. The modulation may be analog, digital or hybrid modulation techniques.
By way of example, and not limitation, computer-readable storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, computer-readable storage media include, but are not limited to, volatile memory such as random access memory (RAM, DRAM, SRAM); and non-volatile memory such as flash memory, various read-only memories (ROM, PROM, EPROM, EEPROM), magnetic and ferromagnetic/ferroelectric memories (MRAM, feRAM); and magnetic and optical storage devices (hard disk, tape, CD, DVD); or other now known media or later developed that can store computer-readable information/data for use by a computer system.
An embodiment according to the present application herein comprises an apparatus comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the apparatus to perform a method and/or solution according to embodiments of the present application as described above.
It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in the apparatus claims may also be implemented by one unit or means in software or hardware. The terms first, second, etc. are used to denote names, but not to denote any particular order.

Claims (15)

1. A method for determining rotation angle information of bearing equipment, wherein the bearing equipment is used for carrying a PTZ camera device, the method comprising:
acquiring shooting parameter information of the PTZ camera device;
acquiring current image information currently shot by the PTZ camera device and first angle information of the bearing equipment when the current image information is shot;
acquiring an area image position of a frame selection area in the current image information, wherein the frame selection area is used for indicating a direction to which the bearing equipment is to be adjusted;
and determining rotation angle information of the bearing equipment according to the shooting parameter information, the first angle information and the area image position, wherein the rotation angle information is used for indicating angle information for adjusting the PTZ camera device from the current shooting posture to face the spatial position corresponding to the area image position.
2. The method of claim 1, wherein the method further comprises:
and sending a corresponding rotation adjustment instruction to the corresponding bearing equipment, so that the PTZ camera device is adjusted to face the spatial position corresponding to the frame selection area and carries out image acquisition, wherein the rotation adjustment instruction comprises the rotation angle information.
3. The method according to claim 1 or 2, wherein the first angle information comprises yaw angle information in a horizontal direction and pitch angle information in a vertical direction, and the rotation angle information comprises yaw rotation angle information in the horizontal direction and pitch rotation angle information in the vertical direction.
4. The method according to claim 1, wherein the determining of the rotation angle information of the bearing equipment according to the shooting parameter information, the first angle information and the area image position comprises:
determining first transformation information from a world coordinate system to a pixel coordinate system of the PTZ camera device according to the shooting parameter information and the first angle information;
determining the area center image position corresponding to the area center of the frame selection area according to the area image position;
and determining rotation angle information of the bearing equipment according to the area center image position and the first transformation information, wherein the rotation angle information is used for indicating angle information for adjusting the PTZ camera device from the current shooting posture to face the spatial position corresponding to the area image position.
5. The method according to claim 4, wherein the determining of the first transformation information from the world coordinate system to the pixel coordinate system of the PTZ camera device according to the shooting parameter information and the first angle information comprises:
determining a perspective projection matrix from a camera coordinate system of the PTZ camera device to the corresponding pixel coordinate system according to the shooting parameter information;
determining an observation matrix from the world coordinate system to the camera coordinate system according to the first angle information;
and determining first transformation information from the world coordinate system to the pixel coordinate system according to the perspective projection matrix and the observation matrix.
6. The method of claim 4, wherein the determining, according to the area image position, an area center image position corresponding to an area center of the frame selection area comprises:
and determining, according to the minimum circumscribed rectangular frame of the area image position, the center position of the minimum circumscribed rectangular frame as the area center image position corresponding to the area center of the frame selection area.
7. The method of claim 4, wherein the determining of the rotation angle information of the bearing equipment according to the area center image position and the first transformation information comprises:
constructing first vector information of the facing direction of the current image information according to the first transformation information;
determining second vector information of the spatial direction corresponding to the area center according to the first transformation information and the area center image position;
and determining rotation angle information of the bearing equipment according to the first vector information and the second vector information, wherein the rotation angle information is used for indicating angle information for adjusting the PTZ camera device from the current shooting posture to face the spatial position corresponding to the area image position.
8. The method of claim 7, wherein the determining of the rotation angle information of the bearing equipment according to the first vector information and the second vector information comprises:
converting the first vector information and the second vector information into a spherical polar coordinate system, and determining corresponding first spherical polar vector information and second spherical polar vector information;
and calculating a corresponding angle difference value according to the first spherical polar vector information and the second spherical polar vector information, so as to obtain the rotation angle information of the bearing equipment according to the angle difference value, wherein the rotation angle information is used for indicating angle information for adjusting the PTZ camera device from the current shooting posture to face the spatial position corresponding to the area image position.
9. The method of claim 1, wherein the method further comprises:
and determining corresponding zoom ratio change information according to the frame selection area and the resolution information of the current image information, wherein the zoom ratio change information is used for focusing the acquisition area of the PTZ camera device to the frame selection area.
10. The method of claim 9, wherein the determining of the corresponding zoom ratio change information according to the frame selection area and the resolution information of the current image information comprises:
and determining corresponding zoom ratio change information according to resolution information of the minimum circumscribed rectangular frame of the frame selection area and of the current image information, wherein the zoom ratio change information is used for focusing the acquisition area of the PTZ camera device to the frame selection area.
11. The method according to claim 9 or 10, wherein the method further comprises:
and sending the zoom ratio change information to the PTZ camera device so as to adjust the acquisition area of the PTZ camera device to focus on the frame selection area.
12. A computer apparatus for determining rotation angle information of bearing equipment, wherein the bearing equipment is used for carrying a PTZ camera device, the computer apparatus comprising:
a one-one module, configured to acquire shooting parameter information of the PTZ camera device;
a one-two module, configured to acquire current image information currently shot by the PTZ camera device and first angle information of the bearing equipment when the current image information is shot;
a one-three module, configured to acquire an area image position of a frame selection area in the current image information, wherein the frame selection area is used for indicating a direction to which the bearing equipment is to be adjusted;
and a one-four module, configured to determine rotation angle information of the bearing equipment according to the shooting parameter information, the first angle information and the area image position, wherein the rotation angle information is used for indicating angle information for adjusting the PTZ camera device from the current shooting posture to face the spatial position corresponding to the area image position.
13. A computer device, wherein the device comprises:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to perform the steps of the method of any one of claims 1 to 12.
14. A computer-readable storage medium having stored thereon a computer program/instructions, characterized in that the computer program/instructions, when executed, cause a system to perform the steps of the method according to any one of claims 1 to 12.
15. A computer program product comprising computer program/instructions, characterized in that the computer program/instructions, when executed by a processor, implement the steps of the method of any of claims 1 to 12.
CN202210696871.6A 2022-06-20 2022-06-20 Method and device for determining rotation angle information of bearing device Active CN115190237B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210696871.6A CN115190237B (en) 2022-06-20 2022-06-20 Method and device for determining rotation angle information of bearing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210696871.6A CN115190237B (en) 2022-06-20 2022-06-20 Method and device for determining rotation angle information of bearing device

Publications (2)

Publication Number Publication Date
CN115190237A true CN115190237A (en) 2022-10-14
CN115190237B CN115190237B (en) 2023-12-15

Family

ID=83513067

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210696871.6A Active CN115190237B (en) 2022-06-20 2022-06-20 Method and device for determining rotation angle information of bearing device

Country Status (1)

Country Link
CN (1) CN115190237B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115866254A (en) * 2022-11-24 2023-03-28 亮风台(上海)信息科技有限公司 Method and equipment for transmitting video frame and camera shooting parameter information
CN115922404A (en) * 2023-01-28 2023-04-07 中冶赛迪技术研究中心有限公司 Disassembling method, disassembling system, electronic equipment and storage medium

Citations (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110122250A1 (en) * 2009-11-23 2011-05-26 Hon Hai Precision Industry Co., Ltd. System and method for motion detection
KR20110094664A (en) * 2010-02-17 2011-08-24 (주)서광시스템 Apparatus for controlling pan/tilt/zoom camera in omnidirectional and method for the same
US20140078263A1 (en) * 2012-09-18 2014-03-20 Samsung Techwin Co., Ltd. Monitoring apparatus and system using 3d information of images and monitoring method using the same
CN103826103A (en) * 2014-02-27 2014-05-28 浙江宇视科技有限公司 Cruise control method for tripod head video camera
CN105676880A (en) * 2016-01-13 2016-06-15 零度智控(北京)智能科技有限公司 Control method and system of holder camera device
CN106971408A (en) * 2017-03-24 2017-07-21 大连理工大学 A kind of camera marking method based on space-time conversion thought
WO2017173734A1 (en) * 2016-04-06 2017-10-12 高鹏 Method and device for adjusting photographic angle and unmanned aerial vehicle
CN107257440A (en) * 2017-07-31 2017-10-17 深圳回收宝科技有限公司 It is a kind of to detect method, equipment and storage medium that video tracking is shot
CN107466385A (en) * 2016-08-03 2017-12-12 深圳市大疆灵眸科技有限公司 A kind of cloud platform control method and system
KR20190013104A (en) * 2017-07-31 2019-02-11 한화테크윈 주식회사 Surveillance system and operation method thereof
CN109635724A (en) * 2018-12-11 2019-04-16 东莞市强艺体育器材有限公司 A kind of intelligent comparison method of movement
CN110083180A (en) * 2019-05-22 2019-08-02 深圳市道通智能航空技术有限公司 Cloud platform control method, device, controlling terminal and aerocraft system
CN110225226A (en) * 2019-05-10 2019-09-10 华中科技大学 A kind of Visual Tracking System and method
WO2020019106A1 (en) * 2018-07-23 2020-01-30 深圳市大疆创新科技有限公司 Gimbal and unmanned aerial vehicle control method, gimbal, and unmanned aerial vehicle
CN111345029A (en) * 2019-05-30 2020-06-26 深圳市大疆创新科技有限公司 Target tracking method and device, movable platform and storage medium
CN111405193A (en) * 2020-04-30 2020-07-10 重庆紫光华山智安科技有限公司 Focusing method and device and camera equipment
CN111611989A (en) * 2020-05-22 2020-09-01 四川智动木牛智能科技有限公司 Multi-target accurate positioning identification method based on autonomous robot
EP3745718A1 (en) * 2019-05-31 2020-12-02 Idis Co., Ltd. Method of controlling pan-tilt-zoom camera by using fisheye camera and monitoring system
CN112040128A (en) * 2020-09-03 2020-12-04 浙江大华技术股份有限公司 Method and device for determining working parameters, storage medium and electronic device
CN112640422A (en) * 2020-04-24 2021-04-09 深圳市大疆创新科技有限公司 Photographing method, movable platform, control device, and storage medium
CN112714287A (en) * 2020-12-23 2021-04-27 广东科凯达智能机器人有限公司 Pan-tilt target conversion control method, device, equipment and storage medium
WO2021127888A1 (en) * 2019-12-23 2021-07-01 深圳市大疆创新科技有限公司 Control method, smart glasses, mobile platform, gimbal, control system, and computer-readable storage medium
CN113315914A (en) * 2021-05-25 2021-08-27 上海哔哩哔哩科技有限公司 Panoramic video data processing method and device
CN113345028A (en) * 2021-06-01 2021-09-03 亮风台(上海)信息科技有限公司 Method and equipment for determining target coordinate transformation information
CN113473024A (en) * 2020-08-21 2021-10-01 海信视像科技股份有限公司 Display device, holder camera and camera control method
US20210374994A1 (en) * 2020-05-26 2021-12-02 Beihang University Gaze point calculation method, apparatus and device
CN113747071A (en) * 2021-09-10 2021-12-03 深圳市道通智能航空技术股份有限公司 Unmanned aerial vehicle shooting method and device, unmanned aerial vehicle and storage medium
WO2022041013A1 (en) * 2020-08-26 2022-03-03 深圳市大疆创新科技有限公司 Control method, handheld gimbal, system, and computer readable storage medium

Patent Citations (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110122250A1 (en) * 2009-11-23 2011-05-26 Hon Hai Precision Industry Co., Ltd. System and method for motion detection
KR20110094664A (en) * 2010-02-17 2011-08-24 (주)서광시스템 Apparatus for controlling pan/tilt/zoom camera in omnidirectional and method for the same
US20140078263A1 (en) * 2012-09-18 2014-03-20 Samsung Techwin Co., Ltd. Monitoring apparatus and system using 3d information of images and monitoring method using the same
CN103826103A (en) * 2014-02-27 2014-05-28 浙江宇视科技有限公司 Cruise control method for tripod head video camera
CN105676880A (en) * 2016-01-13 2016-06-15 零度智控(北京)智能科技有限公司 Control method and system of holder camera device
WO2017173734A1 (en) * 2016-04-06 2017-10-12 高鹏 Method and device for adjusting photographic angle and unmanned aerial vehicle
CN107466385A (en) * 2016-08-03 2017-12-12 深圳市大疆灵眸科技有限公司 A kind of cloud platform control method and system
CN106971408A (en) * 2017-03-24 2017-07-21 大连理工大学 A kind of camera marking method based on space-time conversion thought
CN107257440A (en) * 2017-07-31 2017-10-17 深圳回收宝科技有限公司 It is a kind of to detect method, equipment and storage medium that video tracking is shot
KR20190013104A (en) * 2017-07-31 2019-02-11 한화테크윈 주식회사 Surveillance system and operation method thereof
WO2020019106A1 (en) * 2018-07-23 2020-01-30 深圳市大疆创新科技有限公司 Gimbal and unmanned aerial vehicle control method, gimbal, and unmanned aerial vehicle
CN109635724A (en) * 2018-12-11 2019-04-16 东莞市强艺体育器材有限公司 A kind of intelligent comparison method of movement
CN110225226A (en) * 2019-05-10 2019-09-10 华中科技大学 A kind of Visual Tracking System and method
CN110083180A (en) * 2019-05-22 2019-08-02 深圳市道通智能航空技术有限公司 Cloud platform control method, device, controlling terminal and aerocraft system
CN111345029A (en) * 2019-05-30 2020-06-26 深圳市大疆创新科技有限公司 Target tracking method and device, movable platform and storage medium
EP3745718A1 (en) * 2019-05-31 2020-12-02 Idis Co., Ltd. Method of controlling pan-tilt-zoom camera by using fisheye camera and monitoring system
US20200382712A1 (en) * 2019-05-31 2020-12-03 Idis Co., Ltd. Method of controlling pan-tilt-zoom camera by using fisheye camera and monitoring system
WO2021127888A1 (en) * 2019-12-23 2021-07-01 深圳市大疆创新科技有限公司 Control method, smart glasses, mobile platform, gimbal, control system, and computer-readable storage medium
CN112640422A (en) * 2020-04-24 2021-04-09 深圳市大疆创新科技有限公司 Photographing method, movable platform, control device, and storage medium
CN111405193A (en) * 2020-04-30 2020-07-10 重庆紫光华山智安科技有限公司 Focusing method and device and camera equipment
CN111611989A (en) * 2020-05-22 2020-09-01 四川智动木牛智能科技有限公司 Multi-target accurate positioning identification method based on autonomous robot
US20210374994A1 (en) * 2020-05-26 2021-12-02 Beihang University Gaze point calculation method, apparatus and device
CN113473024A (en) * 2020-08-21 2021-10-01 海信视像科技股份有限公司 Display device, holder camera and camera control method
WO2022041013A1 (en) * 2020-08-26 2022-03-03 深圳市大疆创新科技有限公司 Control method, handheld gimbal, system, and computer readable storage medium
CN112040128A (en) * 2020-09-03 2020-12-04 浙江大华技术股份有限公司 Method and device for determining working parameters, storage medium and electronic device
CN112714287A (en) * 2020-12-23 2021-04-27 广东科凯达智能机器人有限公司 Pan-tilt target conversion control method, device, equipment and storage medium
CN113315914A (en) * 2021-05-25 2021-08-27 上海哔哩哔哩科技有限公司 Panoramic video data processing method and device
CN113345028A (en) * 2021-06-01 2021-09-03 亮风台(上海)信息科技有限公司 Method and equipment for determining target coordinate transformation information
CN113747071A (en) * 2021-09-10 2021-12-03 深圳市道通智能航空技术股份有限公司 Unmanned aerial vehicle shooting method and device, unmanned aerial vehicle and storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
万定锐;周杰;: "Calibration of dual PTZ camera ***" (双PTZ摄像机***的标定), 中国图象图形学报 (Journal of Image and Graphics), no. 04 *
代亮,刘大成,王芹华,郑力: "Research on implementation methods for remote multi-camera cooperative tracking" (远程多摄像机协同跟踪实现方法研究), 制造技术与机床 (Manufacturing Technology & Machine Tool), no. 09 *
张雪波;路晗;方勇纯;李宝全;: "Fully automatic calibration technique for PTZ cameras in outdoor environments and its application" (室外环境下PTZ摄像机全自动标定技术及其应用), 机器人 (Robot), no. 04 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115866254A (en) * 2022-11-24 2023-03-28 亮风台(上海)信息科技有限公司 Method and equipment for transmitting video frame and camera shooting parameter information
CN115922404A (en) * 2023-01-28 2023-04-07 中冶赛迪技术研究中心有限公司 Disassembling method, disassembling system, electronic equipment and storage medium
CN115922404B (en) * 2023-01-28 2024-04-12 中冶赛迪技术研究中心有限公司 Disassembling method, disassembling system, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN115190237B (en) 2023-12-15

Similar Documents

Publication Publication Date Title
WO2021036353A1 (en) Photographing-based 3d modeling system and method, and automatic 3d modeling apparatus and method
US10122997B1 (en) Automated matrix photo framing using range camera input
CN109887003B (en) Method and equipment for carrying out three-dimensional tracking initialization
US11748906B2 (en) Gaze point calculation method, apparatus and device
CN115190237B (en) Method and device for determining rotation angle information of bearing device
US9600859B2 (en) Image processing device, image processing method, and information processing device
EP3438919B1 (en) Image displaying method and head-mounted display apparatus
CN108304075B (en) Method and device for performing man-machine interaction on augmented reality device
US11557083B2 (en) Photography-based 3D modeling system and method, and automatic 3D modeling apparatus and method
CN109032348B (en) Intelligent manufacturing method and equipment based on augmented reality
US20150371385A1 (en) Method and system for calibrating surveillance cameras
CN113869231B (en) Method and equipment for acquiring real-time image information of target object
CN113345028B (en) Method and equipment for determining target coordinate transformation information
CN105989603A (en) Machine vision image sensor calibration
CN110572564B (en) Information processing apparatus, information processing method, and storage medium
CN115330966A (en) Method, system, device and storage medium for generating house type graph
CN111737518A (en) Image display method and device based on three-dimensional scene model and electronic equipment
US10565803B2 (en) Methods and apparatuses for determining positions of multi-directional image capture apparatuses
CN102096938A (en) Construction method capable of measuring panoramic picture
Jiang et al. An accurate and flexible technique for camera calibration
CN110392202A (en) Image processing apparatus, camera chain, image processing method
CN113763478B (en) Unmanned vehicle camera calibration method, device, equipment, storage medium and system
CN116109684B (en) Online video monitoring two-dimensional and three-dimensional data mapping method and device for variable electric field station
CN110807413B (en) Target display method and related device
CN114640833A (en) Projection picture adjusting method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 201210 7th Floor, No. 1, Lane 5005, Shenjiang Road, China (Shanghai) Pilot Free Trade Zone, Pudong New Area, Shanghai

Applicant after: HISCENE INFORMATION TECHNOLOGY Co.,Ltd.

Address before: Room 501 / 503-505, 570 shengxia Road, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai, 201203

Applicant before: HISCENE INFORMATION TECHNOLOGY Co.,Ltd.

CB02 Change of applicant information
GR01 Patent grant