CN113869231B - Method and equipment for acquiring real-time image information of a target object

Info

Publication number
CN113869231B
Authority
CN
China
Prior art date
Legal status
Active
Application number
CN202111154144.9A
Other languages
Chinese (zh)
Other versions
CN113869231A (en)
Inventor
廖春元
李文卿
陈芳
Current Assignee
Hiscene Information Technology Co Ltd
Original Assignee
Hiscene Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hiscene Information Technology Co Ltd
Priority to CN202111154144.9A
Publication of CN113869231A
Priority to PCT/CN2022/110488 (WO2023051027A1)
Application granted
Publication of CN113869231B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D3/00 Control of position or direction
    • G05D3/12 Control of position or direction using feedback
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Studio Devices (AREA)

Abstract

The present application aims to provide a method and a device for acquiring real-time image information of a target object. The method comprises: acquiring target position information of a target object; determining a matched target PTZ camera from a plurality of PTZ cameras according to the target position information, wherein the acquisition area of the target PTZ camera covers the target position; sending an image acquisition instruction about the target object to the target PTZ camera; and receiving real-time image information returned by the target PTZ camera based on the image acquisition instruction. With this method and device, the PTZ cameras around the coordinate position of a specified target object can be intelligently recommended from a massive number of PTZ cameras, one or more best-suited PTZ cameras can be selected, the pan-tilt of each selected camera can be controlled in linkage so that the camera focuses on the target, and a corresponding AR label prompt can be added to each piece of real-time image information, thereby providing a more efficient and accurate intelligent office experience.

Description

Method and equipment for acquiring real-time image information of target object
Technical Field
The present application relates to the field of communications, and in particular, to a technique for acquiring real-time image information of a target object.
Background
A PTZ camera is a surveillance camera mounted on a PTZ-controlled pan-tilt. PTZ is short for Pan/Tilt/Zoom, and the pan-tilt controls the camera in three dimensions: horizontal rotation, vertical pitching and zooming. With the continuous development of cities, the types and number of surveillance cameras keep increasing, which makes them increasingly difficult to control and maintain. At present, when an emergency occurs, the corresponding camera is opened manually to obtain a picture of the scene; this places high experience requirements on the operator and is inefficient, and these drawbacks become more prominent as the number of cameras and the monitored range grow.
Disclosure of Invention
An object of the present application is to provide a method and apparatus for acquiring real-time image information of a target object.
According to an aspect of the present application, there is provided a method for acquiring real-time image information of a target object, the method comprising:
acquiring target position information of a target object;
determining a matched target PTZ camera device from a plurality of PTZ camera devices according to the target position information, wherein the acquisition area of the target PTZ camera device contains the target position information;
and sending an image acquisition instruction about a target object to the target PTZ camera device, and receiving real-time image information returned by the target PTZ camera device based on the image acquisition instruction.
According to another aspect of the present application, there is provided an apparatus for acquiring real-time image information of a target object, the apparatus comprising:
a first module for acquiring target position information of a target object;
a second module for determining a matched target PTZ camera from a plurality of PTZ cameras according to the target position information, wherein the acquisition area of the target PTZ camera covers the target position information;
and a third module for sending an image acquisition instruction about the target object to the target PTZ camera and receiving real-time image information returned by the target PTZ camera based on the image acquisition instruction.
According to an aspect of the present application, there is provided a computer apparatus, wherein the apparatus comprises:
a processor; and
a memory arranged to store computer executable instructions which, when executed, cause the processor to perform the steps of the method as described in any one of the above.
According to an aspect of the application, there is provided a computer-readable storage medium having a computer program/instructions stored thereon, wherein the computer program/instructions, when executed, cause a system to perform the steps of the method as described in any one of the above.
According to an aspect of the application, there is provided a computer program product comprising computer programs/instructions, characterized in that the computer programs/instructions, when executed by a processor, implement the steps of the method as described in any of the above.
Compared with the prior art, the present application acquires target position information of a target object, determines a matched target PTZ camera from a plurality of PTZ cameras according to the target position information, wherein the acquisition area of the target PTZ camera covers the target position, sends an image acquisition instruction about the target object to the target PTZ camera, and receives real-time image information returned by the target PTZ camera based on the image acquisition instruction. In this way, PTZ cameras around the position of a specified target object can be intelligently recommended from a massive number of PTZ cameras, one or more best-suited PTZ cameras can be selected, the pan-tilt can further be controlled in linkage so that the camera focuses on the target, and a corresponding AR label prompt can be added to each piece of real-time image information, thereby providing a more efficient and accurate intelligent office experience.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 illustrates a flow diagram of a method for obtaining real-time image information of a target object according to one embodiment of the present application;
FIG. 2 illustrates an exemplary diagram of determining target location information according to one embodiment of the present application;
FIG. 3 illustrates an example diagram of determining latitude and longitude of a target object according to one embodiment of the present application;
FIG. 4 illustrates an exemplary view of a PTZ camera in accordance with one embodiment of the present application;
FIG. 5 illustrates an exemplary diagram of determining an azimuth angle according to one embodiment of the present application;
FIG. 6 illustrates an exemplary diagram of a target-determining PTZ camera in accordance with one embodiment of the present application;
FIG. 7 illustrates functional modules of a device for acquiring real-time image information of a target object according to an embodiment of the present application;
FIG. 8 illustrates an exemplary system that can be used to implement the various embodiments described in this application.
The same or similar reference numbers in the drawings identify the same or similar elements.
Detailed Description
The present application is described in further detail below with reference to the attached drawing figures.
In a typical configuration of the present application, the terminal, the device serving the network, and the trusted party each include one or more processors (e.g., Central Processing Units (CPUs)), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory, such as Random Access Memory (RAM), and/or non-volatile memory in a computer-readable medium, such as Read-Only Memory (ROM) or flash memory. Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, Phase-Change Memory (PCM), Programmable Random Access Memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Disc (DVD) or other optical storage, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
The devices referred to in this application include, but are not limited to, user equipment, network devices, or devices formed by integrating user equipment and network devices through a network. The user equipment includes, but is not limited to, any mobile electronic product capable of human-computer interaction with a user (e.g., through a touch panel), such as a smart phone or a tablet computer, and the mobile electronic product may employ any operating system, such as an Android operating system or an iOS operating system. The network device includes an electronic device capable of automatically performing numerical calculation and information processing according to preset or stored instructions, and its hardware includes, but is not limited to, a microprocessor, an Application-Specific Integrated Circuit (ASIC), a Programmable Logic Device (PLD), a Field-Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), an embedded device, and the like. The network device includes, but is not limited to, a computer, a network host, a single network server, a set of multiple network servers or a cloud of multiple servers; here, the cloud is composed of a large number of computers or network servers based on Cloud Computing, a kind of distributed computing in which one virtual supercomputer consists of a collection of loosely coupled computers. The network includes, but is not limited to, the Internet, a wide area network, a metropolitan area network, a local area network, a VPN network, a wireless ad hoc network (Ad Hoc network), and the like. Preferably, the device may also be a program running on the user equipment, the network device, or a device formed by integrating the user equipment with the network device, the touch terminal, or the network device with the touch terminal through a network.
Of course, those skilled in the art will appreciate that the foregoing is by way of example only, and that other existing or future devices, which may be suitable for use in the present application, are also encompassed within the scope of the present application and are hereby incorporated by reference.
In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
The present application provides a method for acquiring real-time image information of a target object. The method is mainly applied to a computer device, which establishes a communication connection with corresponding PTZ cameras and, based on this connection, can receive the image information, imaging attitude information and the like captured by the PTZ cameras. A PTZ camera device (such as a PTZ camera) includes a zoom pan-tilt camera installed in a city; its PTZ (Pan/Tilt/Zoom) parameters are adjustable, and they can be adjusted based on control instructions sent by the computer device to which this scheme belongs, or based on control instructions from other devices (such as other control devices or servers).
Specifically, calibration of the camera establishes the relationship between pixel positions in the camera image and the target object: the parameters of the camera model are solved from the correspondence between the coordinates of feature points in the image and their world coordinates under the camera imaging model. The model parameters to be calibrated include internal parameters (intrinsics) and external parameters (extrinsics). For the same camera, the intrinsic matrix depends only on the camera's internal characteristics; whatever the positional relationship between the calibration board and the camera, the intrinsic matrix does not change. The extrinsic matrix, however, reflects the positional relationship between the calibration board and the camera; this relationship differs from picture to picture, so the extrinsic matrix corresponding to each picture is different. The intrinsic and extrinsic parameters of a PTZ camera may be pre-calculated, and when they are known, the position information of the target object in the world coordinate system can be calculated from the PTZ camera, the image position of the target object, and the like. The target object includes, but is not limited to, anything corresponding to a point, line, plane or volume that is expected to be acquired, or the carrier on which such a point, line, plane or volume is located; for example, the target object may be a person, a building, a car, or even a certain location, a certain position or certain coordinate information.
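As a non-patent illustration of the calibrated pinhole model just described, the following Python sketch (the function name, array conventions and the example intrinsics are assumptions of this illustration, not details from the application) projects a world point into pixel coordinates using an intrinsic matrix K and extrinsics (R, t):

```python
import numpy as np

def project_point(K, R, t, world_point):
    """Pinhole projection sketch: map a world point to pixel coordinates
    using the intrinsic matrix K and the extrinsics (R, t) of a calibrated
    camera. K stays fixed per camera; R, t vary with the pan-tilt pose."""
    pw = np.asarray(world_point, dtype=float).reshape(3)
    pc = R @ pw + t                 # world -> camera coordinates
    uvw = K @ pc                    # camera -> homogeneous image coordinates
    return uvw[:2] / uvw[2]         # pixel coordinates (u, v)

# Illustrative intrinsics and a point 5 m in front of the camera
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
print(project_point(K, np.eye(3), np.zeros(3), [1.0, 0.5, 5.0]))
```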
In this embodiment, the computer device may be an independent server, or a server network or server cluster composed of multiple servers; for example, the computer device described in this embodiment includes, but is not limited to, a computer, a network host, a single network server, a set of multiple network servers, or a cloud server composed of multiple servers, where the cloud server is constituted by a large number of computers or network servers based on Cloud Computing.
Fig. 1 shows a method for acquiring real-time image information of a target object according to an aspect of the present application, wherein the method comprises step S101, step S102 and step S103. In step S101, target position information of a target object is acquired; in step S102, a matching target PTZ camera is determined from a plurality of PTZ cameras according to the target position information, wherein an acquisition area of the target PTZ camera includes the target position information; in step S103, an image capture instruction about a target object is sent to the target PTZ camera, and real-time image information returned by the target PTZ camera based on the image capture instruction is received.
Specifically, in step S101, target position information of the target object is acquired. For example, the target position information includes, but is not limited to, position information identifying the spatial location of the target object, which may be coordinate position information in a certain coordinate system, relative position information with respect to a certain object (with known geographical coordinates, etc.), or the like. Specifically, it includes the longitude and latitude of the target object, the position of the target object in an electronic map, the coordinate position of the target object in the world coordinate system corresponding to a PTZ camera, or the relative position of the target object with respect to a certain PTZ camera. The target position information may be uploaded directly by user equipment or calculated from other data; for example, it may be uploaded by mobile equipment (such as an unmanned aerial vehicle, a mobile terminal, a front-line police officer's device or an alarm caller's device) or other server equipment (such as command equipment) that is in communication connection with the computer device, or the computer device may acquire image information about the target object and calculate the target position information of the target object based on that image information.
In some embodiments, in step S101, target position information about a target object uploaded by user equipment is received. For example, the PTZ camera system (including the computer device) may be accessed by another device and establish a corresponding communication connection with it, such as a police officer's mobile terminal or command device, or an alarm caller's mobile terminal. Through this communication connection, the user equipment may upload target position information about the target object to the PTZ camera system. The target position information may be the location of the user equipment itself acquired through its positioning device; it may be calculated from the current location of the user equipment, for example by displaying an electronic map centered on that location and collecting the user's click or selection of the target object in the map to determine the target position information; or it may be target position information of the target object directly input on the user equipment.
In some embodiments, in step S101, PTZ image information about the target object is acquired; the image position of the target object in the PTZ image information is determined based on that PTZ image information; and the target position information of the target object is determined based on the image position information. For example, the PTZ camera system may also obtain the target position information through PTZ image information captured by PTZ cameras: the computer device may receive the PTZ image information captured by one or more PTZ cameras, identify the PTZ camera image containing the target object according to template features of the target object, and determine the image position of the target object in the PTZ image information based on the recognition result; or, based on a user operation, certain PTZ image information is marked as containing the target object and the image position of the target object is identified in it based on the template features of the target object; or, the image position of the target object is determined based on a user's mark at a certain position in the PTZ image information. After the computer device determines the image position of the target object in the PTZ image information, it can determine the relative position of the target object with respect to the initial PTZ camera based on the intrinsic and extrinsic parameters of the initial PTZ camera corresponding to that PTZ image information; further, combined with the imaging position information of the initial PTZ camera, such as its longitude and latitude or its map position, the target position information of the target object, such as longitude and latitude or map position, can be determined.
In some embodiments, the method further includes a step S104 (not shown). In step S104, the imaging attitude information and imaging position information of the initial PTZ camera corresponding to the PTZ image information are acquired, and determining the target position information of the target object based on the image position information comprises: determining the target position information of the target object based on the imaging attitude information, the imaging position information and the image position information. For example, the computer device stores the intrinsic parameters and imaging position information (such as the coordinate position of the camera in the corresponding world coordinate system, or its geographic coordinates) of each PTZ camera, and can calculate the coordinate conversion relationship (the extrinsic parameters) from the camera coordinate system to the world coordinate system of the initial PTZ camera according to the rotation angles of the carrying device of the initial PTZ camera at the moment the PTZ image information was captured; the imaging attitude information includes the intrinsic and extrinsic parameters of the initial PTZ camera. The computer device can look up the intrinsic parameters, imaging position information and the like of the initial PTZ camera through its camera identification information (such as a device number or serial number), and, combining the extrinsic parameters calculated for the moment the PTZ image information was captured with the image position information of the target object, calculate and determine the target position information of the target object.
In some embodiments, the imaging position information includes the longitude and latitude of the initial PTZ camera, and determining the target position information of the target object based on the imaging attitude information, the imaging position information and the image position information includes: determining the relative position information of the target object with respect to the initial PTZ camera based on the imaging attitude information and the image position information; and determining the longitude and latitude of the target object based on the relative position information and the longitude and latitude of the initial PTZ camera. For example, the computer device may determine the relative position of the target object with respect to the initial PTZ camera, such as its coordinates in a coordinate system established with reference to the initial PTZ camera, from the intrinsic and extrinsic parameters of the initial PTZ camera and the image position of the target object. In some embodiments, this coordinate system takes the ground as the X-Y plane, the Z axis pointing upward, and its origin at the point where the vertical line through the optical center of the initial PTZ camera meets the ground; this is only an example and not a limitation, and other coordinate systems may equally represent the relative position of the target object with respect to the initial PTZ camera. Since the longitude and latitude of the initial PTZ camera are known, the longitude and latitude of the target object can be calculated from the coordinates of the target object in this coordinate system and the longitude and latitude of the PTZ camera.
In some embodiments, the imaging position information further includes the height of the initial PTZ camera, and determining the relative position information of the target object with respect to the initial PTZ camera based on the imaging attitude information and the image position information comprises: determining the relative position information of the target object with respect to the initial PTZ camera based on the imaging attitude information, the image position information and the height of the initial PTZ camera. For example, as shown in fig. 2, a coordinate system is established with the ground where the target object is located as the X-Y plane, the Z axis pointing upward, and the origin O at the point where the vertical line through the optical center of the PTZ camera meets the ground; the optical center of the PTZ camera is c(Xc, Yc, Zc), where Xc and Yc are 0 and Zc is the height of the PTZ camera.
The ray L is the straight line that passes through the optical center c(Xc, Yc, Zc) of the PTZ camera with direction vector Vw = (Xw, Yw, Zw), where Vw is the ray direction in the world coordinate system determined from the image position of the target object and the intrinsic and extrinsic parameters of the initial PTZ camera. The parametric equation of the line L is: x = Xc + Xw·t; y = Yc + Yw·t; z = Zc + Zw·t. The plane P is the plane spanned by the X and Y coordinate axes, with plane equation z = 0. From these two equations the parameter t can be solved, giving the intersection point, i.e. the coordinates (Xm, Ym, 0) of the target object m. The longitude and latitude of the target object are then determined from this relative position and the longitude and latitude of the initial PTZ camera.
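A minimal Python sketch of this ray-ground intersection is given below; it is an illustration only, and the function name and the assumption that Vw is already expressed in the world coordinate system are choices of this sketch rather than of the application:

```python
import numpy as np

def ground_intersection(cam_center, vw):
    """Intersect the ray from the camera optical center along world-frame
    direction vw with the ground plane z = 0.

    cam_center: (Xc, Yc, Zc), with Zc the mounting height of the PTZ camera.
    vw: world-frame direction vector of the back-projected ray.
    Returns (Xm, Ym, 0) or None if the ray does not reach the ground.
    """
    cam_center = np.asarray(cam_center, dtype=float)
    vw = np.asarray(vw, dtype=float)
    if abs(vw[2]) < 1e-9:           # ray parallel to the ground plane
        return None
    t = -cam_center[2] / vw[2]      # solve Zc + Zw*t = 0
    if t <= 0:                      # intersection behind the camera
        return None
    return cam_center + t * vw      # (Xm, Ym, 0)

# Example: camera mounted 10 m high, ray pointing forward and downward
print(ground_intersection((0.0, 0.0, 10.0), (0.0, 5.0, -1.0)))
```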
As shown in fig. 3, the coordinates of the intersection point m and the origin O are known, and since the longitude and latitude of the origin O (those of the initial PTZ camera) are known, the longitude and latitude of the intersection point m can be obtained. In some embodiments, regarding the earth as a regular sphere, let the longitude and latitude of the initial PTZ camera point O be (lon, lat), let C1 be the meridian circle through that point and C2 its latitude circle, where the radius R1 of C1 is the earth radius and the radius R2 of C2 is R2. A plane coordinate system is established with the initial PTZ camera as the origin; a point P on the ground has coordinates (x, y), and the arc lengths on the earth corresponding to the plane coordinates x and y are approximately l2 and l1, respectively. Let α and β be the latitude and longitude deflection angles from point O to point P. Then: R1 = earth radius; R2 = R1·cos(lat); α = 2π·(l1 / (2π·R1)) = l1/R1 = y/R1; β = 2π·(l2 / (2π·R2)) = l2/R2 = x/R2; and the longitude and latitude of point P are (lon + β, lat + α). Therefore, when the plane coordinate relationship is known, the deflection angles α and β can be obtained and the longitude and latitude of the target object m can be calculated; conversely, if the longitude and latitude of the target object m are known, the deflection angles α and β can be obtained and the plane coordinate relationship (Xm, Ym) between the target object and the PTZ camera can be recovered.
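The spherical-earth conversion above can be sketched as follows; this is an approximate illustration, and the function names and the east/north axis convention are assumptions of the sketch:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean earth radius, spherical approximation

def offset_to_latlon(cam_lon_deg, cam_lat_deg, x_east_m, y_north_m):
    """Convert a small planar offset (x east, y north, meters) from the PTZ
    camera into approximate longitude/latitude of the target point."""
    lat_rad = math.radians(cam_lat_deg)
    r1 = EARTH_RADIUS_M               # radius of the meridian circle
    r2 = r1 * math.cos(lat_rad)       # radius of the latitude circle at the camera
    d_lat = math.degrees(y_north_m / r1)   # alpha
    d_lon = math.degrees(x_east_m / r2)    # beta
    return cam_lon_deg + d_lon, cam_lat_deg + d_lat

def latlon_to_offset(cam_lon_deg, cam_lat_deg, lon_deg, lat_deg):
    """Inverse mapping: known target lon/lat back to planar offsets (x, y)."""
    lat_rad = math.radians(cam_lat_deg)
    r1 = EARTH_RADIUS_M
    r2 = r1 * math.cos(lat_rad)
    y = math.radians(lat_deg - cam_lat_deg) * r1
    x = math.radians(lon_deg - cam_lon_deg) * r2
    return x, y
```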
In other cases, a known point on the ground is selected and the direction vector from that point through the camera optical center is obtained; after applying the rotation and translation of the pan-tilt, the corresponding direction vector in the image coordinate system is obtained, and by intersecting the extended line with the image plane, the image coordinate corresponding to that physical ground point can be obtained.
In step S102, a matched target PTZ camera is determined from a plurality of PTZ cameras according to the target position information, wherein the acquisition area of the target PTZ camera covers the target position. For example, the corresponding PTZ camera system includes a plurality of PTZ cameras, and a target PTZ camera capable of capturing the target object can be matched according to the target position information. In some cases, a target PTZ camera capable of capturing the target object can be determined from the images captured by several PTZ cameras, by means of user selection, target recognition, and the like. In some cases, a distance threshold may be set in advance, and a PTZ camera whose distance to the target position is smaller than or equal to the threshold is determined as a matched target PTZ camera. In other cases, the computer device determines the acquisition range of each PTZ camera and determines the corresponding target PTZ camera by matching that acquisition range against the target position information. In some embodiments, the acquisition range indicates the view angle and/or the distance over which each PTZ camera can capture images; the view angle includes a horizontal rotation angle and/or a vertical rotation angle, and the capture distance may represent the farthest acquisition distance of the PTZ camera, which may be a fixed radius or a quantity that varies with the rotation angle, the terrain, building heights, and so on. In some embodiments, when the target position falls within the acquisition range of a certain PTZ camera, that PTZ camera is determined to be a matched target PTZ camera. Further, because a PTZ camera system contains a large number of PTZ cameras, the cameras generally need to be screened first to find those that meet certain conditions, and the matched target PTZ camera is then determined among them, which reduces the amount of calculation to a certain extent. For example, each PTZ camera carries tag information identifying its location or type (such as "XX park"), and the PTZ cameras with a particular tag are screened out by tag according to the target position information of the target object; or a distance threshold may be preset and the PTZ cameras whose distance to the target position is smaller than or equal to the threshold are screened out by comparison; or, when the acquisition range is computed with a fixed radius (e.g., a distance threshold), the PTZ cameras whose distance is smaller than or equal to the threshold are screened out first and then further matched to determine the target PTZ camera.
In some embodiments, the method further includes a step S105 (not shown), in which the acquisition area of each of the plurality of PTZ cameras is acquired; in step S102, if the target position information falls within the acquisition area of one of the PTZ cameras, that PTZ camera is determined as the matched target PTZ camera. For example, the acquisition area generally includes the view angle over which the PTZ camera can capture images and/or its capture distance, and the view angle includes a rotation angle, such as a horizontal rotation angle and/or a vertical rotation angle, as shown in fig. 4. In some embodiments, the acquisition area of a PTZ camera may be determined from its rotation angle, or further determined in combination with its imaging position information; in other embodiments, the acquisition area may be determined from the capture distance, or from the capture distance combined with the imaging position information; and in still other embodiments, the acquisition area may be determined from the rotation angle together with the capture distance, optionally combined with the imaging position information, and so on. In some embodiments, the method further includes a step S106 (not shown), in which a plurality of azimuth angles of the target object with respect to the plurality of PTZ cameras are determined according to the target position information and the imaging position information of the plurality of PTZ cameras, each azimuth angle corresponding to one PTZ camera; in step S102, if an azimuth angle falls within the acquisition area of its corresponding PTZ camera, that PTZ camera is determined as the matched target PTZ camera. For example, as shown in fig. 5, the azimuth angle measures the angular difference between objects on a plane: it is the horizontal angle, measured clockwise, between the north direction line through a point and the line towards the target. Let the PTZ camera be at point O, the target position be P, the azimuth angle of the target position relative to the PTZ camera be θ, and the horizontal rotation angle of the PTZ camera pan-tilt be pan. Then θ = pan, and: θ = arctan(x/y) (first quadrant, x > 0, y > 0); θ = arctan(-x/y) + 3π/2 (second quadrant, x < 0, y > 0); θ = arctan(-x/y) + π/2 (fourth quadrant, x > 0, y < 0); θ = arctan(x/y) + π (third quadrant, x < 0, y < 0).
The pan-tilt horizontal rotation angles A and B at the left and right visible boundaries of each PTZ camera are obtained in advance; when the pan-tilt rotates between these two horizontal angles, the field of view of the PTZ camera is not blocked, and the horizontal angle interval AB constitutes the acquisition area (e.g., the visible angle) of the PTZ camera. Knowing the plane coordinate relationship (Xm, Ym) between the PTZ camera and the target point m, the azimuth angle θ of point m relative to the PTZ camera can be obtained; if θ lies between B and A, the target m is within the acquisition range of the PTZ camera, otherwise it is not.
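A possible implementation of this azimuth test is sketched below. It uses atan2 in place of the quadrant-by-quadrant arctan formulas (equivalent under the convention of x pointing east, y pointing north and the azimuth measured clockwise from north), and the handling of pan intervals that wrap past 360 degrees is an assumption of the sketch:

```python
import math

def azimuth_deg(x_east_m, y_north_m):
    """Clockwise angle from north of the target seen from the camera,
    computed from the planar offsets (Xm, Ym), returned in [0, 360)."""
    return math.degrees(math.atan2(x_east_m, y_north_m)) % 360.0

def pan_in_visible_range(theta_deg, pan_b_deg, pan_a_deg):
    """True if azimuth theta lies inside the unobstructed pan interval
    from B to A (clockwise), including intervals that wrap past 360."""
    theta = theta_deg % 360.0
    b, a = pan_b_deg % 360.0, pan_a_deg % 360.0
    if b <= a:
        return b <= theta <= a
    return theta >= b or theta <= a

# Example: target 30 m east, 50 m north; visible pan interval [300, 60] degrees
theta = azimuth_deg(30.0, 50.0)
print(theta, pan_in_visible_range(theta, 300.0, 60.0))
```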
Here, the target PTZ camera is mounted on a corresponding carrying device. The carrying device includes, but is not limited to, the pan-tilt of the PTZ camera; a pan-tilt is a carrying device for mounting and fixing a camera and comes in fixed and motorized variants. A fixed pan-tilt is suitable when the monitored range is small: after the camera is mounted on it, its horizontal and pitch angles can be adjusted, and the adjustment mechanism can be locked once the best working attitude is reached. A motorized pan-tilt is suitable for scanning and monitoring a large range and can enlarge the monitoring range of the camera. According to its rotation characteristics, the pan-tilt changes its carrying state as its rotation pose changes; the carrying state information includes horizontal rotation angle information corresponding to the horizontal rotation angle, vertical rotation angle information corresponding to the vertical rotation angle, and the like. The coordinate conversion information (e.g., the extrinsic parameters) of the camera coordinate system of the PTZ camera relative to the world coordinate system changes with the carrying state information of the pan-tilt. In other words, there is a mapping relationship between the carrying state information of the pan-tilt and the coordinate conversion information of the camera coordinate system relative to the world coordinate system, and this mapping can be solved from several pieces of coordinate conversion information corresponding to different camera poses. For example, the coordinate conversion information includes rotation matrix information and translation matrix information; the translation of the camera coordinate system relative to the world coordinate system is fixed, while the rotation matrix information has a mapping relationship with the carrying state information. Taking the coordinate conversion information corresponding to at least two different camera poses, the mapping parameter information can be calculated from the rotation matrices in those pieces of coordinate conversion information and the carrying states corresponding to them; the mapping parameters are divided into horizontal mapping parameters for the horizontal direction, vertical mapping parameters for the vertical direction, and so on. Once the mapping parameter information is known, the coordinate conversion information of the PTZ camera, such as the rotation matrix of the camera coordinate system relative to the world coordinate system, can be obtained simply by acquiring the real-time carrying state information, such as the vertical and horizontal rotation angles of the pan-tilt.
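The mapping from carrying state to extrinsic rotation could look roughly as follows. This is only a schematic sketch: the real mapping parameters are calibrated as described above, and the simple yaw-then-pitch composition and the r_mount placeholder are assumptions of this illustration:

```python
import numpy as np

def rot_z(deg):
    """Rotation about the vertical axis (pan)."""
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

def rot_x(deg):
    """Rotation about the horizontal axis (tilt)."""
    a = np.radians(deg)
    return np.array([[1.0, 0.0,        0.0],
                     [0.0, np.cos(a), -np.sin(a)],
                     [0.0, np.sin(a),  np.cos(a)]])

def extrinsic_rotation(pan_deg, tilt_deg, r_mount=np.eye(3)):
    """Rotation of the camera frame relative to the world frame for a given
    carrying state (pan, tilt). r_mount stands in for the calibrated mapping
    parameters relating the pan-tilt zero position to the world frame; the
    translation stays fixed, as noted above."""
    return r_mount @ rot_z(pan_deg) @ rot_x(tilt_deg)

print(extrinsic_rotation(30.0, -15.0))
```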
In some embodiments, in step S105, rotation angle information of each PTZ camera is acquired, the rotation angle including a horizontal rotation angle and/or a vertical rotation angle, and the acquisition area of each PTZ camera is determined based on its rotation angle information. Because PTZ cameras are often installed on buildings, obstacles may block the view, video monitoring may have blind spots, and the image may be blocked when the pan-tilt rotates towards the target position; it is therefore necessary to first determine the acquisition area of each PTZ camera, for example based on its rotation angle information. The rotation angle information includes horizontal rotation angle information and/or vertical rotation angle information: for instance, the pan-tilt horizontal rotation angles A and B at the left and right visible boundaries of each PTZ camera, and/or the vertical rotation angles C and D at the upper and lower visible boundaries, are obtained in advance, so that the field of view is not blocked when the PTZ camera rotates between A and B and/or between C and D; the horizontal interval AB and/or the vertical interval CD then constitute the acquisition area (e.g., the visible angle) of the PTZ camera. The foregoing description mainly determines the acquisition area from the horizontal rotation angle and then matches the target PTZ camera by azimuth angle; those skilled in the art will understand that the embodiments also apply to the vertical rotation angle, e.g., determining whether the target position lies within the acquisition area from the vertical angle to the target and the vertical rotation angle limits, as sketched below. In other embodiments, the horizontal and vertical rotation angles may be combined to determine the set of spatial coordinates of points that the PTZ camera can scan in the current space; this set serves as the acquisition area of the PTZ camera, and if the spatial coordinate corresponding to the target position information belongs to this set, the corresponding PTZ camera is determined as the target PTZ camera.
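A corresponding check for the vertical direction might look like the sketch below (the horizontal/azimuth check was sketched earlier); the sign convention for the tilt angle and the function names are assumptions of this illustration:

```python
import math

def vertical_angle_deg(horizontal_dist_m, camera_height_m, target_height_m=0.0):
    """Downward-looking angle from the camera to the target, measured from
    the horizontal plane; positive means the camera must tilt downward."""
    return math.degrees(math.atan2(camera_height_m - target_height_m,
                                   horizontal_dist_m))

def tilt_in_visible_range(angle_deg, tilt_c_deg, tilt_d_deg):
    """True if the required vertical angle lies inside the unobstructed
    tilt interval [C, D] of the pan-tilt."""
    lo, hi = sorted((tilt_c_deg, tilt_d_deg))
    return lo <= angle_deg <= hi

# Example: target 60 m away on the ground, camera 12 m high, tilt limits [0, 45]
angle = vertical_angle_deg(60.0, 12.0)
print(angle, tilt_in_visible_range(angle, 0.0, 45.0))
```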
In some embodiments, in step S102, at least one candidate PTZ camera is determined based on the target position information and the imaging position information of each of the plurality of PTZ cameras; if the target position information falls within the acquisition area of one of the candidate PTZ cameras, that candidate PTZ camera is determined as the matched target PTZ camera. For example, because the PTZ camera system contains a large number of PTZ cameras, matching the acquisition areas one by one to determine a target PTZ camera involves a large amount of calculation and wastes server resources. At least one candidate PTZ camera can therefore first be determined from the plurality of PTZ cameras by a screening condition, and the corresponding target PTZ camera is then determined from the candidates. The screening conditions include, but are not limited to, user selection, distance matching, tag selection of the PTZ cameras, and the like; for example, the user manually selects PTZ cameras near the target position as candidates according to the target position information, or the PTZ cameras carrying the relevant tag are screened out by tag according to the target position information and determined as candidates.
In some embodiments, determining at least one candidate PTZ camera based on the target position information and the imaging position information of each of the plurality of PTZ cameras includes: determining the imaging distance of each PTZ camera according to the target position information and its imaging position information; and, if the imaging distance of a certain PTZ camera is smaller than or equal to the imaging distance threshold, determining that PTZ camera as a candidate PTZ camera. In some embodiments, the distance between the target and a PTZ camera is determined from the target position information and the imaging position information of the PTZ camera, and when the distance is less than the imaging distance threshold, the PTZ camera is determined to be a candidate. For example, as shown in fig. 4, cameras are searched within a circle centered on the target position with the imaging distance threshold as radius (e.g., radius R); since the longitude and latitude of the target position are known, the longitude and latitude range covered by the circle of radius R can be obtained, and the PTZ cameras located within that range are screened out according to their own longitude and latitude and taken as candidate PTZ cameras. Further, for each candidate PTZ camera it is determined whether the target position information lies within its acquisition area; if so, the candidate is selected as a target PTZ camera. As shown in fig. 4, PTZ camera A is outside the imaging distance threshold, PTZ cameras B and C satisfy the imaging distance threshold, and the target position is not within the acquisition area (e.g., the view angle) of PTZ camera C, so only PTZ camera B is the target PTZ camera.
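The distance-based screening can be sketched as follows; the haversine formula and the camera record layout (a dict with 'id', 'lon', 'lat') are assumptions of this illustration rather than details from the application:

```python
import math

def haversine_m(lon1, lat1, lon2, lat2, radius_m=6371000.0):
    """Great-circle distance in meters between two lon/lat points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * radius_m * math.asin(math.sqrt(a))

def candidate_cameras(cameras, target_lon, target_lat, max_dist_m):
    """Keep the cameras whose distance to the target position is within the
    imaging distance threshold R."""
    return [c for c in cameras
            if haversine_m(c['lon'], c['lat'], target_lon, target_lat) <= max_dist_m]

# Example with two hypothetical cameras and a 500 m threshold
cams = [{'id': 'B', 'lon': 121.480, 'lat': 31.235},
        {'id': 'A', 'lon': 121.520, 'lat': 31.260}]
print(candidate_cameras(cams, 121.478, 31.236, 500.0))
```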
In some embodiments, if the target position information is in the acquisition area of one of the candidate PTZ cameras, determining that candidate as the matched target PTZ camera includes: if the target position information is in the acquisition area of one of the candidate PTZ cameras, determining that candidate as a matched initial target PTZ camera, thereby determining a plurality of initial target PTZ cameras; and determining at least one target PTZ camera from the plurality of initial target PTZ cameras. For example, because the PTZ cameras in a PTZ camera system are densely distributed, in some cases the number of target PTZ cameras determined from the acquisition areas is also large. In order to observe the target from multiple angles, with dispersed camera positions and an expanded viewing angle, a plurality of initial target PTZ cameras are first determined through the acquisition areas, and one or more final target PTZ cameras are then determined from them, for example by random selection (such as randomly choosing a predetermined number of initial target PTZ cameras as the target PTZ cameras) or according to one or more of the imaging angle, the imaging position and the camera parameters.
In some embodiments, determining at least one target PTZ camera from the plurality of initial target PTZ cameras comprises: determining one target PTZ camera from the plurality of initial target PTZ cameras, where the imaging distance of that target PTZ camera is the smallest among the imaging distances of the plurality of initial target PTZ cameras; and determining the other target PTZ cameras according to the imaging parameter information of that target PTZ camera and the imaging parameter information of the other initial target PTZ cameras, where the other initial target PTZ cameras are the initial target PTZ cameras other than that target camera. For example, as shown in fig. 6, when multiple initial target PTZ cameras satisfying the condition are found near the target position, camera positions need to be allocated so as to view the target from as many angles as possible, dispersing the positions and expanding the viewing angle. For instance, when an accident occurs at an intersection, besides observing the people and objects involved, the traffic and pedestrian flow at several intersections in the affected section must also be observed so that the scene is fully understood and command and dispatch can be carried out well. The imaging parameter information includes, but is not limited to, the imaging position information, camera parameter information and imaging angle relative to the target object of each initial target PTZ camera, and the initial target PTZ cameras can be screened as needed using one or a combination of these parameters to determine the target PTZ cameras.
In some embodiments, the imaging parameter information includes imaging position information, and determining the other target PTZ cameras according to the imaging parameter information of the one target PTZ camera and the imaging parameter information of the other initial target PTZ cameras comprises: e. determining, for each of the other initial target PTZ cameras, the sum of vector included angles with respect to the current target PTZ cameras according to the imaging position information of the current target PTZ cameras and the imaging position information of the other initial target PTZ cameras, where the current target PTZ cameras include the one target PTZ camera and the other target PTZ cameras already determined; f. determining, among the other initial target PTZ cameras, the one whose sum of vector included angles is largest as another target PTZ camera; and repeatedly executing steps e and f until the number of current target PTZ cameras meets a preset number threshold.
For example, in order to disperse the camera positions and expand the viewing angle, the sum of the vector included angles between an initial target PTZ camera and the already-selected target PTZ cameras is used as the criterion, and the initial target PTZ camera with the largest sum is chosen, since it is the camera position most dispersed from the existing target PTZ cameras. The computer device sets a preset number threshold for target PTZ cameras according to the current scene (such as the specific position and attributes of the target object). Specifically, one initial target PTZ camera is first selected as a target PTZ camera, for example the one closest to the target position. Then, for each of the other initial target PTZ cameras, the angle between the vector formed by its imaging position and the target position and the vector formed by the imaging position of the selected target PTZ camera and the target position is calculated, and the initial target PTZ camera with the largest angle is selected as another target PTZ camera. Next, for each remaining initial target PTZ camera, the sum of the angles between its vector (imaging position to target position) and the vectors of all currently selected target PTZ cameras is calculated, and the remaining camera with the largest sum is selected as a further target PTZ camera; this continues until the predetermined number of target PTZ cameras has been selected. As shown in fig. 6, the initial target PTZ camera closest to the target position is selected as the first target PTZ camera, two further target PTZ cameras are then selected in turn according to the largest sum of vector angles, giving three target PTZ cameras (P0, P1, P2); next, the fourth target PTZ camera is selected from the remaining initial target PTZ cameras (1), (2) and (3) by computing the sum of the vector angles between each of (1), (2), (3) and the three determined target PTZ cameras (P0, P1, P2), and each time the remaining initial target PTZ camera with the largest sum of vector angles is selected as another target PTZ camera. The calculation loops in this way until all target PTZ cameras satisfying the preset number threshold have been selected.
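A rough sketch of this greedy, dispersion-maximizing selection is given below; the data layout (camera id plus planar position) and the tie-breaking behaviour are assumptions of the sketch:

```python
import math

def _angle_between(v1, v2):
    """Angle in radians between two 2D vectors."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

def select_dispersed_cameras(cameras, target_xy, k):
    """Greedy selection: start from the camera closest to the target, then
    repeatedly add the remaining camera whose sum of angles to the vectors
    (camera position -> target) of the already-selected cameras is largest.
    `cameras` is a list of (camera_id, (x, y)) planar positions."""
    if not cameras:
        return []

    def vec(pos):  # vector from a camera position to the target
        return (target_xy[0] - pos[0], target_xy[1] - pos[1])

    remaining = sorted(cameras, key=lambda c: math.hypot(*vec(c[1])))
    selected = [remaining.pop(0)]                 # closest camera first
    while remaining and len(selected) < k:
        best = max(remaining,
                   key=lambda c: sum(_angle_between(vec(c[1]), vec(s[1]))
                                     for s in selected))
        selected.append(best)
        remaining.remove(best)
    return [c[0] for c in selected]

# Example: pick 3 dispersed cameras around a target at the origin
cams = [('P0', (10, 0)), ('P1', (0, 15)), ('P2', (-12, -2)), ('P3', (11, 1))]
print(select_dispersed_cameras(cams, (0, 0), 3))
```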
In step S103, an image acquisition instruction about the target object is sent to the target PTZ camera, and real-time image information returned by the target PTZ camera based on the image acquisition instruction is received. For example, once the PTZ camera has the capability of sensing the world, the target object can be accurately displayed in the corresponding image by controlling the rotation of its carrying device. The computer device sends an image acquisition instruction about the target object to the corresponding target PTZ camera; the instruction includes turning on the camera and acquiring the current picture, and may further include the imaging attitude information for each target PTZ camera, instructing each target PTZ camera to adjust its current attitude to that imaging attitude so as to facilitate image acquisition of the target object. Optionally, when there are multiple target PTZ cameras, the pieces of real-time image information may be displayed together on the current screen for convenient observation, so that the target object can be observed from multiple directions and angles. In particular, in some embodiments, the target PTZ camera includes a corresponding target carrying device, and the method further comprises: determining target carrying state information of the target carrying device according to the target position information and the imaging position information of the target PTZ camera, the target carrying state information including a corresponding carrying horizontal angle and carrying vertical angle; in step S103, the image acquisition instruction sent to the target PTZ camera includes the target carrying state information, which instructs the target PTZ camera to adjust the target carrying device to that state. For example, based on the foregoing, the target position information of the target object has been determined; this position information may be coordinates in a coordinate system or a relative position with respect to some object with known geographic coordinates, such as longitude and latitude, a position in an electronic map, or a position relative to a PTZ camera. If the target object is to appear in the real-time image information, the carrying state of the corresponding pan-tilt when the target object is in the field of view, such as its horizontal and vertical rotation angles, is calculated from the target position information and the imaging position information of the target PTZ camera. Of course, for a better visual effect, the target object may be presented at the center of the image; in some embodiments, the target carrying state information instructs the target PTZ camera to adjust the target carrying device so that the target object is at the center of the real-time image.
For example, given the target position information of the target object, such as coordinates in a coordinate system established with reference to the target PTZ camera (position information expressed in other forms, such as latitude and longitude, can likewise be converted into such coordinates), the coordinates m(Xm, Ym, 0) of the target object can be obtained in a coordinate system whose X and Y axes lie on the ground where the target object is located and whose Z axis points upward, with the origin O at the intersection of the vertical line through the optical centre of the target PTZ camera and the ground; the optical centre of the target PTZ camera is then c(Xc, Yc, Zc), where Xc and Yc are 0 and Zc is the height H of the target PTZ camera. As shown in Fig. 2, if the target object is to be displayed in the real-time image information, the pan-tilt of the target PTZ camera needs to be adjusted so that the central axis of the lens coincides with the straight line cm. From the position information of m(Xm, Ym, 0) and c(Xc, Yc, Zc), the distance S from m to the origin O and the camera height H are obtained; the vertical rotation angle (tilt) of the pan-tilt can then be calculated from S and H using an inverse trigonometric function, and the horizontal rotation angle (pan) of the pan-tilt can be calculated from Xm and Ym using an inverse trigonometric function. This determines the target bearing state, so that the pan-tilt is controlled to rotate to the corresponding posture, the target point is displayed at the centre of the picture, and the tracking and focusing functions of the target PTZ camera are realised.
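A minimal numeric sketch of this geometry, under the assumptions above (origin O at the foot of the camera, target on the ground plane); the zero reference and sign conventions of pan and tilt differ between pan-tilt units, so the mapping shown here is illustrative rather than prescriptive.

    import math

    def bearing_state(xm: float, ym: float, camera_height: float):
        """Pan/tilt angles (degrees) that align the lens axis with the line cm,
        for a target at m(Xm, Ym, 0) and an optical centre at c(0, 0, H)."""
        s = math.hypot(xm, ym)                              # ground distance from O to m
        tilt = math.degrees(math.atan2(camera_height, s))   # downward tilt below the horizontal
        pan = math.degrees(math.atan2(ym, xm))              # horizontal rotation measured from the X axis
        return pan, tilt

    # Example: target 30 m along X and 40 m along Y from the foot point, camera mounted 10 m high.
    print(bearing_state(30.0, 40.0, 10.0))                  # approximately (53.13, 11.31)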
In some embodiments, the method further comprises a step S107 (not shown): in step S107, rendering information about the target object is superimposed and presented in the real-time image information based on the real-time image position information of the target object in the real-time image information. For example, after the corresponding real-time image information is obtained, a user performs an annotation operation on the target object in the real-time image information captured by the target PTZ camera, such as frame selection, clicking, circling, or adding video, audio, a model, or text; the computer device may generate corresponding rendering information from the annotation operation, superimpose the rendering information on the real-time image information, and present it. Alternatively, the computer device may superimpose annotations on the current real-time image information according to object-related information of the target object, such as identification information, attribute information, status information, or annotations about the target object added by other devices.
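For instance, a frame-selection box and a text tag could be superimposed at the target's real-time image position as sketched below; OpenCV is used here purely as an illustrative rendering backend, and the box coordinates and label text are assumed inputs rather than part of the described method.

    import cv2  # pip install opencv-python

    def overlay_label(frame, box, text):
        """Draw a frame-selection rectangle and a text label on one video frame.
        `box` is (x, y, w, h) in pixel coordinates of the real-time image."""
        x, y, w, h = box
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, text, (x, max(y - 8, 12)),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
        return frame

    # e.g. frame = cv2.imread("frame.jpg"); overlay_label(frame, (100, 80, 60, 40), "valve #3")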
Embodiments of the method for acquiring real-time image information of a target object according to the present application are mainly described above. In addition, the present application provides a specific apparatus capable of implementing the above embodiments, which is described below with reference to Fig. 7.
Fig. 7 shows an apparatus for acquiring real-time image information of a target object, also referred to as the real-time image information acquisition apparatus 100, according to an aspect of the present application, wherein the apparatus comprises a module 101, a module 102 and a module 103. The module 101 is configured to acquire target position information of a target object; the module 102 is configured to determine a matched target PTZ camera from a plurality of PTZ cameras according to the target position information, wherein the acquisition area of the target PTZ camera includes the target position information; and the module 103 is configured to send an image acquisition instruction about the target object to the target PTZ camera and to receive real-time image information returned by the target PTZ camera based on the image acquisition instruction.
Here, the specific implementations of the module 101, the module 102, and the module 103 shown in Fig. 7 are the same as or similar to the embodiments of step S101, step S102, and step S103 shown in Fig. 1 described above, and are therefore not repeated here but incorporated by reference.
In some embodiments, the module 101 is configured to receive target position information about a target object uploaded by a user equipment. In some embodiments, the module 101 is configured to: acquire PTZ image information about the target object; determine image position information of the target object in the PTZ image information based on the PTZ image information; and determine target position information of the target object based on the image position information. In some embodiments, the apparatus further includes a fourth module (not shown) configured to acquire camera attitude information and camera position information of the initial PTZ camera corresponding to the PTZ image information; the determining of the target position information of the target object based on the image position information then comprises: determining the target position information of the target object based on the camera attitude information, the camera position information, and the image position information. In some embodiments, the camera position information includes latitude and longitude information of the initial PTZ camera; the determining of the target position information of the target object based on the camera attitude information, the camera position information, and the image position information then includes: determining relative position information of the target object with respect to the initial PTZ camera based on the camera attitude information and the image position information; and determining latitude and longitude information of the target object based on the relative position information and the latitude and longitude information of the initial PTZ camera. In some embodiments, the camera position information further includes height information of the initial PTZ camera; determining the relative position information of the target object with respect to the initial PTZ camera based on the camera attitude information and the image position information then comprises: determining the relative position information of the target object with respect to the initial PTZ camera based on the camera attitude information, the image position information, and the height information of the initial PTZ camera.
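As a sketch of the last step only (relative position plus the camera's latitude and longitude giving the target's latitude and longitude), assuming the relative position has already been resolved into an east/north offset in metres; the flat-earth approximation used here is an assumption that is adequate at typical camera ranges.

    import math

    EARTH_RADIUS_M = 6378137.0  # WGS-84 equatorial radius

    def offset_to_lat_lon(cam_lat: float, cam_lon: float, east_m: float, north_m: float):
        """Approximate latitude/longitude of the target from the camera's latitude/longitude
        and the target's east/north offset (metres) relative to the camera."""
        dlat = north_m / EARTH_RADIUS_M
        dlon = east_m / (EARTH_RADIUS_M * math.cos(math.radians(cam_lat)))
        return cam_lat + math.degrees(dlat), cam_lon + math.degrees(dlon)

    # e.g. a target 120 m east and 80 m north of a camera at (31.2304 N, 121.4737 E)
    print(offset_to_lat_lon(31.2304, 121.4737, 120.0, 80.0))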
In some embodiments, the apparatus further comprises a fifth module (not shown) configured to acquire the acquisition area of each PTZ camera of the plurality of PTZ cameras; the module 102 is configured to determine a certain PTZ camera of the plurality of PTZ cameras as the matched target PTZ camera if the target position information is in the acquisition area of that PTZ camera. In some embodiments, the apparatus further includes a sixth module (not shown) configured to determine a plurality of azimuth angles of the target object relative to the plurality of PTZ cameras according to the target position information and the camera position information of the plurality of PTZ cameras, wherein each azimuth angle corresponds to one PTZ camera; the module 102 is configured to determine the PTZ camera corresponding to an azimuth angle as the matched target PTZ camera if that azimuth angle is in the acquisition area of the corresponding PTZ camera. In some embodiments, the fifth module is configured to acquire rotation angle information of each PTZ camera, where the rotation angle includes a horizontal rotation angle and/or a vertical rotation angle, and to determine the acquisition area of each PTZ camera based on the rotation angle information of that PTZ camera.
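A minimal sketch of the azimuth test, assuming the acquisition area of a PTZ camera is expressed as a horizontal pan range in compass degrees (clockwise from north); this representation of the acquisition area is an assumption of the sketch, not a requirement of the embodiment.

    import math

    def azimuth_deg(cam_xy, target_xy):
        """Azimuth of the target as seen from the camera, degrees clockwise from north."""
        dx = target_xy[0] - cam_xy[0]   # east offset
        dy = target_xy[1] - cam_xy[1]   # north offset
        return math.degrees(math.atan2(dx, dy)) % 360.0

    def in_acquisition_area(azimuth, pan_min, pan_max):
        """True if the azimuth lies inside the camera's horizontal rotation range,
        including ranges that wrap through north (e.g. 300 to 60 degrees)."""
        if pan_min <= pan_max:
            return pan_min <= azimuth <= pan_max
        return azimuth >= pan_min or azimuth <= pan_max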
In some embodiments, the module 102 is configured to determine at least one candidate PTZ camera according to the target position information and the camera position information of each PTZ camera of the plurality of PTZ cameras, and, if the target position information is in the acquisition area of one candidate PTZ camera among the candidate PTZ cameras, to determine that candidate PTZ camera as the matched target PTZ camera. In some embodiments, the determining of at least one candidate PTZ camera based on the target position information and the camera position information of each of the plurality of PTZ cameras includes: determining camera distance information of each PTZ camera according to the target position information and the camera position information of each PTZ camera of the plurality of PTZ cameras; and, if the camera distance information of a certain PTZ camera is smaller than or equal to a camera distance threshold, determining that PTZ camera as a corresponding candidate PTZ camera. In some embodiments, if the target position information is in the acquisition area of one candidate PTZ camera among the candidate PTZ cameras, determining that candidate PTZ camera as the matched target PTZ camera includes: if the target position information is in the acquisition area of one candidate PTZ camera among the candidate PTZ cameras, determining that candidate PTZ camera as a matched initial target PTZ camera, thereby determining a plurality of initial target PTZ cameras; and determining at least one target PTZ camera from the plurality of initial target PTZ cameras. In some embodiments, the determining of at least one target PTZ camera from the plurality of initial target PTZ cameras comprises: determining one target PTZ camera from the plurality of initial target PTZ cameras, wherein the camera distance information of the one target PTZ camera is the smallest among the camera distance information of the plurality of initial target PTZ cameras; and determining other target PTZ cameras according to the camera parameter information of the one target PTZ camera and the camera parameter information of the other initial target PTZ cameras, wherein the other initial target PTZ cameras comprise the initial target PTZ cameras other than the one target PTZ camera among the plurality of initial target PTZ cameras.
In some embodiments, the camera parameter information includes camera position information; the determining of other target PTZ cameras according to the camera parameter information of the one target PTZ camera and the camera parameter information of the other initial target PTZ cameras then comprises: e. determining a vector angle sum corresponding to each of the other initial target PTZ cameras according to the camera position information of the current target PTZ cameras and the camera position information of the other initial target PTZ cameras, wherein the current target PTZ cameras comprise the one target PTZ camera and the other target PTZ cameras already determined; f. determining, according to the vector angle sum of each of the other initial target PTZ cameras, the other initial target PTZ camera with the largest vector angle sum as a further target PTZ camera; and repeatedly executing steps e and f until the number of current target PTZ cameras meets the preset number threshold.
In some embodiments, the target PTZ camera includes a corresponding target bearing device; the apparatus is then further configured to determine target bearing state information of the target bearing device according to the target position information and the camera position information of the target PTZ camera, wherein the target bearing state information includes a corresponding bearing horizontal angle and a corresponding bearing vertical angle; the module 103 is configured to send an image acquisition instruction about the target object to the target PTZ camera, and to receive real-time image information returned by the target PTZ camera based on the image acquisition instruction, where the image acquisition instruction includes the target bearing state information, and the target bearing state information is used to instruct the target PTZ camera to adjust the target bearing device to the target bearing state information. In some embodiments, the apparatus further comprises a seventh module (not shown) configured to superimpose and present rendering information about the target object in the real-time image information based on the real-time image position information of the target object in the real-time image information.
Here, the specific implementations of the fourth to seventh modules are the same as or similar to the embodiments of steps S104 to S107 described above, and are therefore not repeated here but incorporated by reference.
In addition to the methods and apparatus described in the above embodiments, the present application also provides a computer-readable storage medium storing computer code which, when executed, performs the method described in any of the foregoing embodiments.
The present application also provides a computer program product which, when executed by a computer device, performs the method described in any of the foregoing embodiments.
The present application further provides a computer device, comprising:
one or more processors;
a memory for storing one or more computer programs;
the one or more computer programs, when executed by the one or more processors, cause the one or more processors to implement the method described in any of the foregoing embodiments.
Fig. 8 illustrates an exemplary system that can be used to implement the various embodiments described herein.
In some embodiments, as shown in Fig. 8, the system 300 can be implemented as any one of the devices described in the above embodiments. In some embodiments, system 300 may include one or more computer-readable media (e.g., system memory or NVM/storage 320) having instructions and one or more processors (e.g., processor(s) 305) coupled with the one or more computer-readable media and configured to execute the instructions to implement modules to perform the actions described in the present application.
For one embodiment, system control module 310 may include any suitable interface controllers to provide any suitable interface to at least one of processor(s) 305 and/or any suitable device or component in communication with system control module 310.
The system control module 310 may include a memory controller module 330 to provide an interface to the system memory 315. Memory controller module 330 may be a hardware module, a software module, and/or a firmware module.
System memory 315 may be used, for example, to load and store data and/or instructions for system 300. For one embodiment, system memory 315 may include any suitable volatile memory, such as suitable DRAM. In some embodiments, the system memory 315 may include double data rate type four synchronous dynamic random access memory (DDR4 SDRAM).
For one embodiment, system control module 310 may include one or more input/output (I/O) controllers to provide an interface to NVM/storage 320 and communication interface(s) 325.
For example, NVM/storage 320 may be used to store data and/or instructions. NVM/storage 320 may include any suitable non-volatile memory (e.g., flash memory) and/or may include any suitable non-volatile storage device(s) (e.g., one or more Hard Disk Drives (HDDs), one or more Compact Disc (CD) drives, and/or one or more Digital Versatile Disc (DVD) drives).
NVM/storage 320 may include storage resources that are physically part of the device on which system 300 is installed or may be accessed by the device and not necessarily part of the device. For example, NVM/storage 320 may be accessible over a network via communication interface(s) 325.
Communication interface(s) 325 may provide an interface for system 300 to communicate over one or more networks and/or with any other suitable device. System 300 may wirelessly communicate with one or more components of a wireless network according to any of one or more wireless network standards and/or protocols.
For one embodiment, at least one of the processor(s) 305 may be packaged together with logic for one or more controller(s) of the system control module 310, such as memory controller module 330. For one embodiment, at least one of the processor(s) 305 may be packaged together with logic for one or more controller(s) of the system control module 310 to form a System In Package (SiP). For one embodiment, at least one of the processor(s) 305 may be integrated on the same die with logic for one or more controller(s) of the system control module 310. For one embodiment, at least one of the processor(s) 305 may be integrated on the same die with logic for one or more controller(s) of the system control module 310 to form a system on a chip (SoC).
In various embodiments, system 300 may be, but is not limited to being: a server, a workstation, a desktop computing device, or a mobile computing device (e.g., a laptop computing device, a handheld computing device, a tablet, a netbook, etc.). In various embodiments, system 300 may have more or fewer components and/or different architectures. For example, in some embodiments, system 300 includes one or more cameras, a keyboard, a Liquid Crystal Display (LCD) screen (including a touch screen display), a non-volatile memory port, multiple antennas, a graphics chip, an Application Specific Integrated Circuit (ASIC), and speakers.
It should be noted that the present application may be implemented in software and/or a combination of software and hardware, for example, implemented using Application Specific Integrated Circuits (ASICs), general purpose computers or any other similar hardware devices. In one embodiment, the software programs of the present application may be executed by a processor to implement the steps or functions described above. Likewise, the software programs (including associated data structures) of the present application may be stored in a computer readable recording medium, such as RAM memory, magnetic or optical drive or diskette and the like. Additionally, some of the steps or functions of the present application may be implemented in hardware, for example, as circuitry that cooperates with the processor to perform various steps or functions.
In addition, some of the present application may be implemented as a computer program product, such as computer program instructions, which when executed by a computer, may invoke or provide methods and/or techniques in accordance with the present application through the operation of the computer. Those skilled in the art will appreciate that the form in which the computer program instructions reside on a computer-readable medium includes, but is not limited to, source files, executable files, installation package files, and the like, and that the manner in which the computer program instructions are executed by a computer includes, but is not limited to: the computer directly executes the instruction, or the computer compiles the instruction and then executes the corresponding compiled program, or the computer reads and executes the instruction, or the computer reads and installs the instruction and then executes the corresponding installed program. Computer-readable media herein can be any available computer-readable storage media or communication media that can be accessed by a computer.
Communication media includes media whereby communication signals, including, for example, computer readable instructions, data structures, program modules, or other data, are transmitted from one system to another. Communication media may include conductive transmission media such as cables and wires (e.g., fiber optics, coaxial, etc.) and wireless (non-conductive transmission) media capable of propagating energy waves, such as acoustic, electromagnetic, RF, microwave, and infrared. Computer readable instructions, data structures, program modules or other data may be embodied in a modulated data signal, such as a carrier wave or similar mechanism that is embodied in a wireless medium, such as part of spread-spectrum techniques, for example. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. The modulation may be analog, digital or hybrid modulation techniques.
By way of example, and not limitation, computer-readable storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, computer-readable storage media include, but are not limited to, volatile memory such as random access memory (RAM, DRAM, SRAM); non-volatile memory such as flash memory, various read-only memories (ROM, PROM, EPROM, EEPROM), and magnetic and ferromagnetic/ferroelectric memories (MRAM, FeRAM); magnetic and optical storage devices (hard disk, magnetic tape, CD, DVD); and other media now known or later developed that can store computer-readable information/data for use by a computer system.
An embodiment according to the present application comprises an apparatus comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the apparatus to perform a method and/or a solution according to the aforementioned embodiments of the present application.
It will be evident to those skilled in the art that the application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in the apparatus claims may also be implemented by one unit or means in software or hardware. The terms first, second, etc. are used to denote names, but not any particular order.

Claims (15)

1. A method for obtaining real-time image information of a target object, wherein the method comprises:
acquiring target position information of a target object;
acquiring an acquisition area of each PTZ camera in a plurality of PTZ cameras;
determining a matched target PTZ camera device from a plurality of PTZ camera devices according to the target position information, wherein the acquisition area of the target PTZ camera device comprises the target position information;
sending an image acquisition instruction about a target object to the target PTZ camera device, and receiving real-time image information returned by the target PTZ camera device based on the image acquisition instruction;
wherein the determining a matching target PTZ camera from a plurality of PTZ cameras according to the target position information comprises:
determining at least one candidate PTZ camera according to the target position information and the camera position information of each PTZ camera in the plurality of PTZ cameras;
if the target position information is in the acquisition area of one candidate PTZ camera device of the candidate PTZ camera devices, determining the candidate PTZ camera device as a matched initial target PTZ camera device, and determining a plurality of initial target PTZ camera devices;
determining one target PTZ camera from the plurality of initial target PTZ cameras, wherein the camera shooting distance information of the one target PTZ camera is the smallest in the camera shooting distance information of the plurality of initial target PTZ cameras;
determining other target PTZ camera devices according to the camera shooting parameter information of the target PTZ camera device and the camera shooting parameter information of other initial target PTZ camera devices, wherein the other initial target PTZ camera devices comprise other initial target PTZ camera devices except the target camera device in the plurality of initial target PTZ camera devices;
the camera shooting parameter information comprises camera shooting position information; the method for determining other target PTZ camera shooting devices according to the camera shooting parameter information of the target PTZ camera shooting device and the camera shooting parameter information of other initial target PTZ camera shooting devices comprises the following steps:
e, determining one or more vectors corresponding to a current target PTZ camera according to the camera position information of the current target PTZ camera and the target position information, determining a vector corresponding to each other initial target PTZ camera according to the camera position information of other initial target PTZ cameras and the target position information, and determining the vector included angle sum of each other initial target PTZ camera according to the one or more vectors corresponding to the current target PTZ camera and the vector corresponding to each other initial target PTZ camera, wherein the current target PTZ camera comprises the one target PTZ camera and the other determined target PTZ cameras;
f, determining, according to the vector included angle sum of each of the other initial target PTZ cameras, the other initial target PTZ camera with the largest vector included angle sum as a further target PTZ camera;
and repeatedly executing steps e and f until the number of the current target PTZ cameras meets a preset number threshold.
2. The method of claim 1, wherein the obtaining target location information of a target object comprises:
target location information about a target object uploaded by a user equipment is received.
3. The method of claim 1, wherein the obtaining target location information of a target object comprises:
acquiring PTZ image information about the target object;
determining image position information of the target object in the PTZ image information based on the PTZ image information;
target position information of the target object is determined based on the image position information.
4. The method of claim 3, wherein the method further comprises:
acquiring image pickup attitude information and image pickup position information of an initial PTZ image pickup device corresponding to the PTZ image information;
wherein the determining target location information of the target object based on the image location information comprises:
determining target position information of the target object based on the imaging attitude information, the imaging position information, and the image position information.
5. The method of claim 4, wherein the camera location information includes latitude and longitude information of the initial PTZ camera;
wherein the determining target position information of the target object based on the imaging attitude information, the imaging position information, and the image position information includes:
determining relative position information of the target object relative to the initial PTZ camera based on the camera attitude information and the image position information;
and determining the longitude and latitude information of the target object based on the relative position information and the longitude and latitude information of the initial PTZ camera device.
6. The method of claim 5, wherein the camera position information further includes height information of the initial PTZ camera;
wherein determining relative position information of the target object with respect to the initial PTZ camera based on the camera pose information and the image position information comprises:
determining relative position information of the target object with respect to the initial PTZ camera based on the camera attitude information, image position information, and height information of the initial PTZ camera.
7. The method of claim 1, wherein the method further comprises:
determining a plurality of azimuth angles of the target object relative to the plurality of PTZ camera devices according to the target position information and the camera position information of the plurality of PTZ camera devices, wherein each azimuth angle corresponds to one PTZ camera device;
wherein the determining a matching target PTZ camera from a plurality of PTZ cameras according to the target position information comprises:
and if one azimuth angle in the plurality of azimuth angles is in the acquisition area of the PTZ camera device corresponding to the azimuth angle, determining the PTZ camera device corresponding to the azimuth angle as a matched target PTZ camera device.
8. The method of claim 7, wherein the acquiring an acquisition area for each of the plurality of PTZ cameras comprises:
obtaining rotation angle information of each PTZ camera device, wherein the rotation angle comprises a horizontal rotation angle and/or a vertical rotation angle;
determining an acquisition area of each PTZ camera based on the rotation angle information of each PTZ camera.
9. The method of claim 1, wherein the determining at least one candidate PTZ camera from the target location information and camera location information for each of the plurality of PTZ cameras comprises:
determining image pickup distance information of each PTZ image pickup device according to the target position information and the image pickup position information of each PTZ image pickup device in the plurality of PTZ image pickup devices;
and if the image pickup distance information of a certain PTZ image pickup device is smaller than or equal to the image pickup distance threshold value, determining the certain PTZ image pickup device as a corresponding candidate PTZ image pickup device.
10. The method of claim 1, wherein the target PTZ camera includes a corresponding target-bearing device; wherein the method further comprises:
determining target bearing state information of the target bearing equipment according to the target position information and the image pickup position information of the target PTZ image pickup device, wherein the target bearing state information comprises a corresponding bearing horizontal angle and a corresponding bearing vertical angle;
the method for sending an image acquisition instruction about a target object to the target PTZ camera and receiving real-time image information returned by the target PTZ camera based on the image acquisition instruction comprises the following steps:
sending an image acquisition instruction about a target object to the target PTZ camera device, and receiving real-time image information returned by the target PTZ camera device based on the image acquisition instruction, wherein the image acquisition instruction comprises target bearing state information, and the target bearing state information is used for indicating the target PTZ camera device to adjust the target bearing equipment to the target bearing state information.
11. The method of claim 10, wherein the target bearer status information is used to instruct the target PTZ camera to adjust the target bearer device to the target bearer status information such that the target object is in an image center of the real-time image information.
12. The method of claim 1, wherein the method further comprises:
and displaying the rendering information about the target object in the real-time image information in a superposition manner based on the real-time image position information of the target object in the real-time image information.
13. An apparatus for acquiring real-time image information of a target object, wherein the apparatus comprises:
a first module, configured to acquire target position information of a target object;
a second module, configured to acquire an acquisition area of each PTZ camera of a plurality of PTZ cameras;
a third module, configured to determine a matched target PTZ camera from the plurality of PTZ cameras according to the target position information, wherein the acquisition area of the target PTZ camera includes the target position information;
a fourth module, configured to send an image acquisition instruction about the target object to the target PTZ camera and to receive real-time image information returned by the target PTZ camera based on the image acquisition instruction;
wherein the determining a matching target PTZ camera from a plurality of PTZ cameras according to the target position information comprises:
determining at least one candidate PTZ camera according to the target position information and the camera position information of each PTZ camera in the plurality of PTZ cameras;
if the target position information is in the acquisition area of one candidate PTZ camera device of the candidate PTZ camera devices, determining the candidate PTZ camera device as a matched initial target PTZ camera device, and determining a plurality of initial target PTZ camera devices;
determining one target PTZ camera from the plurality of initial target PTZ cameras, wherein the camera shooting distance information of the one target PTZ camera is the smallest in the camera shooting distance information of the plurality of initial target PTZ cameras;
determining other target PTZ camera devices according to the camera shooting parameter information of the target PTZ camera device and the camera shooting parameter information of other initial target PTZ camera devices, wherein the other initial target PTZ camera devices comprise other initial target PTZ camera devices except the target camera device in the plurality of initial target PTZ camera devices;
the camera shooting parameter information comprises camera shooting position information; the method for determining other target PTZ camera shooting devices according to the camera shooting parameter information of the target PTZ camera shooting device and the camera shooting parameter information of other initial target PTZ camera shooting devices comprises the following steps:
e, determining one or more vectors corresponding to a current target PTZ camera according to the camera position information of the current target PTZ camera and the target position information, determining a vector corresponding to each other initial target PTZ camera according to the camera position information of other initial target PTZ cameras and the target position information, and determining the vector included angle sum of each other initial target PTZ camera according to the one or more vectors corresponding to the current target PTZ camera and the vector corresponding to each other initial target PTZ camera, wherein the current target PTZ camera comprises the one target PTZ camera and the other determined target PTZ cameras;
f, determining, according to the vector included angle sum of each of the other initial target PTZ cameras, the other initial target PTZ camera with the largest vector included angle sum as a further target PTZ camera;
and repeatedly executing steps e and f until the number of the current target PTZ cameras meets a preset number threshold.
14. A computer device, wherein the device comprises:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to perform the steps of the method of any one of claims 1 to 12.
15. A computer-readable storage medium having a computer program or instructions stored thereon, characterized in that the computer program or instructions, when executed, cause a system to perform the steps of the method according to any one of claims 1 to 12.