CN116828132A - Virtual photography control method and system - Google Patents

Virtual photography control method and system

Info

Publication number
CN116828132A
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
speed
motion
pid controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310816884.7A
Other languages
Chinese (zh)
Inventor
栗斯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Pandieta Information Technology Co ltd
Original Assignee
Guangzhou Pandieta Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Pandieta Information Technology Co ltd
Priority to CN202310816884.7A
Publication of CN116828132A

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application relates to a virtual photography control method and system, belonging to the technical field of unmanned aerial vehicles and comprising the following steps: controlling the motion precision and the safety limits of a photographic unmanned aerial vehicle respectively, shooting or measuring the actual scene from different angles, and obtaining information and data of the actual scene; processing the information and data of the actual scene with a computer, and reconstructing the actual scene using computer image rendering and three-dimensional modeling techniques to generate a virtual scene and model based on three-dimensional image processing; and rendering and producing the virtual scene and model using virtual reality and computer vision techniques. By controlling the motion precision and the safety limits of the photographic unmanned aerial vehicle, the motion trajectory of the unmanned aerial vehicle is precisely controlled and its shooting quality is improved.

Description

Virtual photography control method and system
Technical Field
The application belongs to the technical field of unmanned aerial vehicles, and particularly relates to a control method and a system for virtual photography.
Background
With the rapid development of computer technology, virtual studio technology has become a hot spot in modern digital television studios. With the assistance of a computer, virtual pictures generated in real time can be combined with live pictures shot by a camera in the studio, so that presenters appear to be on the scene even when they are not at the shooting site. This greatly increases the interest and interactivity of the video and attracts more viewers, thus achieving higher ratings and audience retention.
To synthesize the virtually produced picture and the live picture in real time while ensuring the quality and realism of the composite picture, extremely high requirements are placed on the motion trajectory of the camera, and unmanned aerial vehicle control technology in the prior art cannot make the motion trajectory of the unmanned aerial vehicle coincide with the trajectory of the virtually generated picture.
Disclosure of Invention
In order to solve the technical problems in the background art, the application provides a control method and a system for virtual photography.
The aim of the application can be achieved by the following technical scheme:
the virtual photography control method specifically comprises the following steps:
controlling the motion precision and the safety limits of a photographic unmanned aerial vehicle respectively, shooting or measuring the actual scene from different angles, and obtaining information and data of the actual scene;
processing the information and data of the actual scene with a computer, and reconstructing the actual scene using computer image rendering and three-dimensional modeling techniques to generate a virtual scene and model based on three-dimensional image processing;
and rendering and producing the virtual scene and model using virtual reality and computer vision techniques.
Further, controlling the safety limits of the photographic unmanned aerial vehicle specifically comprises the following steps:
servo-enabling the motion axis to be controlled, and setting the motion mode to the speed mode;
reading the motion parameters of the speed mode, setting the motion parameters to be modified, and reconfiguring the motion parameters of the motion axis;
synchronizing the planned position and the encoder position; if the positive limit is exceeded, judging whether the speed direction is positive, and if the negative limit is exceeded, judging whether the speed direction is negative.
Further, if the positive limit is not exceeded, it is checked whether the negative limit is exceeded, and if the negative limit is not exceeded either, the target speed of the motion axis is set.
Further, if the positive limit is exceeded but the speed direction is negative, the target speed of the motion axis is set.
Further, to control the motion precision and the safety limits of the photographic unmanned aerial vehicle, the servo motor in each track direction of the unmanned aerial vehicle needs to be controlled separately, which specifically comprises the following steps:
establishing an interface class Axis for the servo motors, wherein the interface class Axis provides basic functions for configuring the servo motor, the encoder and the limit switch in each track direction of the photographic unmanned aerial vehicle, acquiring the current position information of the unmanned aerial vehicle and configuring the pulse count;
implementing the interface through the class AxisImpl according to the precision requirements of each axis of the unmanned aerial vehicle and the information fed back by the unmanned aerial vehicle's current sensors.
Further, controlling the motion precision of the photographic unmanned aerial vehicle specifically comprises the following steps:
inputting the desired position of the target photographic unmanned aerial vehicle at the current moment and the measured position of the target photographic unmanned aerial vehicle at the current moment into an outer-loop PID controller, thereby obtaining the desired speed;
inputting the desired speed and the measured speed of the target photographic unmanned aerial vehicle at the current moment into an inner-loop PID controller, thereby obtaining the desired attitude of the target photographic unmanned aerial vehicle;
and controlling the motion precision of the target photographic unmanned aerial vehicle according to the desired attitude.
Further, inputting the desired position of the target photographic unmanned aerial vehicle at the current moment and the measured position of the target photographic unmanned aerial vehicle at the current moment into the outer-loop PID controller to obtain the desired speed specifically comprises the following steps:
inputting the desired position of the target photographic unmanned aerial vehicle at the current moment and the measured position of the target photographic unmanned aerial vehicle at the current moment into the outer-loop PID controller, wherein the outer-loop PID controller comprises a first control condition and a second control condition;
the calculation formula of the first control condition is $e_p = p_d - p$;
the calculation formula of the second control condition is $v_d = K_p e_p + K_i \int e_p \, dt + K_d \dot{e}_p$;
wherein $v_d$ represents the desired speed, $p_d$ represents the desired position, $p$ represents the measured position, $e_p$ represents the difference between the desired position and the measured position, $\dot{e}_p$ represents the derivative of $e_p$, $K_p, K_i, K_d \in \mathbb{R}^{2 \times 2}$, and $K_p$, $K_i$ and $K_d$ represent the proportional, integral and differential term coefficients of the outer-loop PID controller, respectively;
and obtaining the desired speed according to the first control condition and the second control condition.
Further, inputting the desired speed and the measured speed of the target photographic unmanned aerial vehicle at the current moment into the inner-loop PID controller to obtain the desired attitude of the target photographic unmanned aerial vehicle specifically comprises the following steps:
inputting the desired speed and the measured speed of the target photographic unmanned aerial vehicle at the current moment into the inner-loop PID controller, wherein the inner-loop PID controller comprises a third control condition and a fourth control condition;
the calculation formula of the third control condition is $e_v = v_d - v$;
the calculation formula of the fourth control condition is $\theta_d = K_{vp} e_v + K_{vi} \int e_v \, dt + K_{vd} \dot{e}_v$;
wherein $\theta_d$ represents the desired attitude, $v$ represents the measured speed, $e_v$ represents the difference between the desired speed and the measured speed, $\dot{e}_v$ represents the derivative of $e_v$, $K_{vp}, K_{vi}, K_{vd} \in \mathbb{R}^{2 \times 2}$, and $K_{vp}$, $K_{vi}$ and $K_{vd}$ represent the proportional, integral and differential term coefficients of the inner-loop PID controller, respectively.
On the other hand, the application discloses a virtual photography control system applying the above virtual photography control method, comprising: a user presentation layer, a business logic layer, and a device data layer, wherein:
the business logic layer comprises a motion pose module and a motion control module,
the motion pose module calculates the position and pose information of the unmanned aerial vehicle according to the information transmitted by the encoders, and transmits the position and pose information of the unmanned aerial vehicle to the business logic layer;
the motion control module performs limit control on the motion of each axis of the unmanned aerial vehicle;
the device data layer comprises a serial communication module, which transmits the instructions of the business logic layer to the servo driver and sends the data returned by the sensors to the business logic layer.
Further, the user presentation layer is a UI interface that provides the buttons required for operation and the setting of relevant unmanned aerial vehicle parameters for the user, and can display the current state of each axis of the unmanned aerial vehicle, including the current pose of the unmanned aerial vehicle and whether each axis is in a limit state.
The application has the beneficial effects that:
1. according to the virtual photography control method disclosed by the application, the motion precision and the safety limits of the photographic unmanned aerial vehicle are controlled to achieve precise control of the unmanned aerial vehicle's motion trajectory, thereby improving its shooting quality;
2. in the virtual photography control method disclosed by the application, the speed mode is used to control the motion axes of the unmanned aerial vehicle, the motion parameters of the motion axes are modified and reconfigured, and the limit condition of the unmanned aerial vehicle is monitored to verify that the safety limits work normally. This is a protective measure that ensures the unmanned aerial vehicle cannot move beyond the limit distance it is allowed to reach: when the unmanned aerial vehicle passes a configured limit switch, the system enters the limit state and cannot continue moving in the limit direction, preventing the unmanned aerial vehicle from leaving its track or damaging its internal structure;
3. the virtual photography control method disclosed by the application also controls the motion precision of the photographic unmanned aerial vehicle.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed for the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flowchart showing the overall steps of a control method for virtual photography according to the present application;
fig. 2 is a flowchart of specific steps for controlling the safety limit of the unmanned aerial vehicle in step S1 of the present application;
fig. 3 is a flowchart of a specific step of controlling the motion accuracy of the unmanned aerial vehicle in step S1 of the present application;
FIG. 4 is a block flow diagram of controlling the safety limit of the unmanned aerial vehicle in step S1 of the present application;
fig. 5 is a schematic block diagram of a control system for virtual photography according to the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
In unmanned aerial vehicle photography, unmanned aerial vehicle control is the basic function of the motion control system.
As shown in fig. 1, an embodiment of the present application discloses a virtual photography control method, which specifically comprises the following steps:
Step S1, respectively controlling the motion precision and the safety limits of the photographic unmanned aerial vehicle, shooting or measuring the actual scene from different angles, and obtaining information and data of the actual scene;
Step S2, processing the information and data of the actual scene with a computer, and reconstructing the actual scene using computer image rendering and three-dimensional modeling techniques to generate a virtual scene and model based on three-dimensional image processing;
and Step S3, rendering and producing the virtual scene and model using virtual reality and computer vision techniques.
In the virtual photography production process, the actual scene first needs to be photographed or measured to obtain its information and data. The data are then processed by a computer, and the actual scene is reconstructed using computer image rendering and three-dimensional modeling techniques to generate a virtual scene and model based on three-dimensional image processing. The virtual scene and model are then rendered, photographed and produced using virtual reality and computer vision techniques to create a virtual photographic visual work.
Further, in the embodiment of the present application, as shown in fig. 2 and fig. 4, controlling the safety limits of the photographic unmanned aerial vehicle in step S1 specifically comprises the following steps:
Step S101, servo-enabling the motion axis to be controlled and setting the motion mode to the speed mode;
Servo enabling refers to switching the operating state of the servo motor controller from the stopped state to the running state, so that the servo motor starts to rotate or regulate its speed in response to control signals from the controller. In a servo system, servo enabling is a very important step: it ensures that the servo axis operates in the preset mode, and error correction and system self-checks can be performed in advance to guarantee the safety and precision of the axis. Before servo enabling, the motor usually needs to be configured according to its specific model and requirements, including parameter setting and encoder configuration. Once this setup is complete, the servo can be enabled, bringing the motor into the running state so that control signals can drive the corresponding controlled motion.
Step S102, reading the motion parameters of the speed mode, setting the motion parameters to be modified, and reconfiguring the motion parameters of the motion axis;
Step S103, synchronizing the planned position and the encoder position; if the positive limit is exceeded, judging whether the speed direction is positive, and if the negative limit is exceeded, judging whether the speed direction is negative.
Further, in the embodiment of the present application, in step S103, if the positive limit is not exceeded, it is checked whether the negative limit is exceeded, and if the negative limit is not exceeded either, the target speed of the motion axis is set.
Further, in the embodiment of the present application, in step S103, if the positive limit is exceeded but the speed direction is negative, the target speed of the motion axis is set.
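For illustration, the limit-handling flow of steps S101 to S103 and the two branches above can be sketched in Python as a check applied before a requested speed is accepted. This is a minimal sketch under assumed names: the ServoAxis class, its methods and the way parameters are stored are illustrative, not an API defined by the application.

```python
# Minimal sketch of the safety-limit flow of steps S101-S103; the ServoAxis
# class, its method names and the way parameters are stored are illustrative
# assumptions, not an API defined by the application.

class ServoAxis:
    """Hypothetical wrapper around one servo-driven motion axis."""

    def __init__(self, positive_limit: float, negative_limit: float):
        self.positive_limit = positive_limit
        self.negative_limit = negative_limit
        self.encoder_position = 0.0   # fed back by the encoder
        self.planned_position = 0.0   # planned by the motion controller
        self.target_speed = 0.0
        self.in_limit_state = False
        self.motion_params = {}

    def enable_speed_mode(self, params: dict) -> None:
        """Steps S101/S102: servo-enable the axis and reconfigure the speed-mode parameters."""
        self.motion_params = dict(params)

    def update_target_speed(self, requested_speed: float) -> None:
        """Step S103: synchronize positions, then apply the positive/negative limit checks."""
        self.planned_position = self.encoder_position  # synchronize planned and encoder positions

        if self.encoder_position >= self.positive_limit:
            if requested_speed > 0:            # still pushing past the positive limit
                self.in_limit_state = True     # enter the limit state, block this direction
                self.target_speed = 0.0
                return
        elif self.encoder_position <= self.negative_limit:
            if requested_speed < 0:            # still pushing past the negative limit
                self.in_limit_state = True
                self.target_speed = 0.0
                return

        # No limit blocks the requested direction: set the target speed of the axis.
        self.in_limit_state = False
        self.target_speed = requested_speed


if __name__ == "__main__":
    axis = ServoAxis(positive_limit=1.5, negative_limit=-1.5)
    axis.enable_speed_mode({"acceleration": 0.5, "max_speed": 1.0})
    axis.encoder_position = 1.6          # the axis has crossed the positive limit
    axis.update_target_speed(0.3)        # positive speed is blocked
    print(axis.in_limit_state, axis.target_speed)   # True 0.0
    axis.update_target_speed(-0.3)       # moving back (negative) is still allowed
    print(axis.in_limit_state, axis.target_speed)   # False -0.3
```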
Further, in the embodiment of the present application, to control the motion precision and the safety limits of the photographic unmanned aerial vehicle in step S1, the servo motor in each track direction of the unmanned aerial vehicle needs to be controlled separately, which specifically comprises the following steps:
Step S1001, establishing an interface class Axis for the servo motors, wherein the interface class Axis provides basic functions for configuring the servo motor, the encoder and the limit switch in each track direction of the photographic unmanned aerial vehicle, acquiring the current position information of the unmanned aerial vehicle and configuring the pulse count;
Step S1002, implementing the interface through the class AxisImpl according to the precision requirements of each axis of the unmanned aerial vehicle and the information fed back by the unmanned aerial vehicle's current sensors.
Further, in the embodiment of the present application, as shown in fig. 3, controlling the motion precision of the photographic unmanned aerial vehicle in step S1 specifically comprises the following steps:
Step S110, inputting the desired position of the target photographic unmanned aerial vehicle at the current moment and the measured position of the target photographic unmanned aerial vehicle at the current moment into an outer-loop PID controller, thereby obtaining the desired speed;
Step S120, inputting the desired speed and the measured speed of the target photographic unmanned aerial vehicle at the current moment into an inner-loop PID controller, thereby obtaining the desired attitude of the target photographic unmanned aerial vehicle;
and Step S130, controlling the motion precision of the target photographic unmanned aerial vehicle according to the desired attitude.
In the flight control system of the unmanned aerial vehicle, the PID controller mainly controls the attitude (roll, pitch and yaw) and the altitude of the unmanned aerial vehicle.
Attitude control: attitude control is a fundamental task of unmanned aerial vehicle flight control, and the PID controller is its core. The attitude information of the unmanned aerial vehicle is obtained through sensors such as accelerometers, gyroscopes and magnetometers; the PID controller converts the attitude control angle into a control quantity and adjusts the power and speed of the unmanned aerial vehicle's motors, so that attitude control is achieved and the flight remains stable and accurate.
Altitude control: altitude control of the unmanned aerial vehicle is mainly based on measuring barometric pressure changes together with an ultrasonic or laser module. The PID controller adjusts the output signal of the flight control system in real time according to the deviation between the current altitude and the target altitude of the unmanned aerial vehicle, changing control parameters such as motor power and speed so that a stable flight altitude is maintained.
Through continuous error feedback and parameter adjustment, the PID controller responds quickly and accurately to attitude and altitude changes in the flight control system, ensuring flight stability and safety. With the PID controller, the unmanned aerial vehicle can achieve high-precision control and stability during flight and can also carry out more precise flight tasks and operations.
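The error-feedback loop described above can be illustrated with a generic discrete PID update; this is the textbook form rather than the specific controller of the application, and the gains, sampling period and single-axis (scalar) setup are illustrative assumptions.

```python
# Generic discrete PID update used for attitude or altitude error feedback; the
# gains, the sampling period and the scalar single-axis form are illustrative
# assumptions, not values from the application.

class PID:
    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint: float, measurement: float) -> float:
        error = setpoint - measurement            # e.g. target altitude minus barometer reading
        self.integral += error * self.dt          # accumulated integral term
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Example: correct the thrust to hold a 10 m target altitude.
altitude_pid = PID(kp=1.2, ki=0.1, kd=0.4, dt=0.02)
thrust_correction = altitude_pid.update(setpoint=10.0, measurement=9.6)
print(thrust_correction)
```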
Further, in the embodiment of the present application, in step S110, inputting the desired position of the target photographic unmanned aerial vehicle at the current moment and the measured position of the target photographic unmanned aerial vehicle at the current moment into the outer-loop PID controller to obtain the desired speed specifically comprises the following steps:
inputting the desired position of the target photographic unmanned aerial vehicle at the current moment and the measured position of the target photographic unmanned aerial vehicle at the current moment into the outer-loop PID controller, wherein the outer-loop PID controller comprises a first control condition and a second control condition;
the calculation formula of the first control condition is $e_p = p_d - p$;
the calculation formula of the second control condition is $v_d = K_p e_p + K_i \int e_p \, dt + K_d \dot{e}_p$;
wherein $v_d$ represents the desired speed, $p_d$ represents the desired position, $p$ represents the measured position, $e_p$ represents the difference between the desired position and the measured position, $\dot{e}_p$ represents the derivative of $e_p$, $K_p, K_i, K_d \in \mathbb{R}^{2 \times 2}$, and $K_p$, $K_i$ and $K_d$ represent the proportional, integral and differential term coefficients of the outer-loop PID controller, respectively;
and the desired speed is obtained according to the first control condition and the second control condition.
After the desired position of the unmanned aerial vehicle at the current moment is obtained, the desired position and the measured position of the unmanned aerial vehicle at the current moment are input into the outer-loop PID (proportional-integral-derivative) controller for parameter adjustment. The outer-loop PID controller can be a position-type PID controller, which has the advantages of a small static error and a small overshoot effect. The measured position can be positioning information obtained by receiving satellite signals such as GPS or BeiDou signals.
Further, in the embodiment of the present application, in step S120, inputting the desired speed and the measured speed of the target photographic unmanned aerial vehicle at the current moment into the inner-loop PID controller to obtain the desired attitude of the target photographic unmanned aerial vehicle specifically comprises the following steps:
inputting the desired speed and the measured speed of the target photographic unmanned aerial vehicle at the current moment into the inner-loop PID controller, wherein the inner-loop PID controller comprises a third control condition and a fourth control condition;
the calculation formula of the third control condition is $e_v = v_d - v$;
the calculation formula of the fourth control condition is $\theta_d = K_{vp} e_v + K_{vi} \int e_v \, dt + K_{vd} \dot{e}_v$;
wherein $\theta_d$ represents the desired attitude, $v$ represents the measured speed, $e_v$ represents the difference between the desired speed and the measured speed, $\dot{e}_v$ represents the derivative of $e_v$, $K_{vp}, K_{vi}, K_{vd} \in \mathbb{R}^{2 \times 2}$, and $K_{vp}$, $K_{vi}$ and $K_{vd}$ represent the proportional, integral and differential term coefficients of the inner-loop PID controller, respectively.
In a preferred embodiment, in horizontal position control, the desired position is set to $p_d = [x_d, y_d]^T$ and the measured position to $p = [x, y]^T$, where $x_d$ and $y_d$ represent the desired abscissa and ordinate of the unmanned aerial vehicle in the horizontal plane, and $x$ and $y$ represent the abscissa and ordinate measured by the unmanned aerial vehicle at the current moment. The first control condition to be satisfied is then $e_p = p_d - p$, and the desired speed $v_d$ output by the horizontal outer-loop PID controller is given by the second control condition $v_d = K_p e_p + K_i \int e_p \, dt + K_d \dot{e}_p$.
The desired position and the measured position of the unmanned aerial vehicle at the current moment are input into the outer-loop PID controller, so that the desired speed is obtained from the first and second control conditions of the outer-loop PID controller, making the obtained desired speed more accurate.
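For illustration, the cascaded outer (position) and inner (velocity) loops defined by the first through fourth control conditions can be sketched for the horizontal case with 2-D vectors; the diagonal gain matrices, the sampling period and the sample positions and velocities below are illustrative assumptions, not parameters taken from the application.

```python
# Cascaded outer (position) and inner (velocity) PID loops for the horizontal
# case, following the first to fourth control conditions; the gain matrices,
# the 0.02 s sampling period and the sample values are illustrative assumptions.

import numpy as np

DT = 0.02  # sampling period [s]


class VectorPID:
    """2-D PID with matrix gains in R^{2x2}, as used by the outer and inner loops."""

    def __init__(self, kp: np.ndarray, ki: np.ndarray, kd: np.ndarray):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = np.zeros(2)
        self.prev_error = np.zeros(2)

    def update(self, error: np.ndarray) -> np.ndarray:
        self.integral += error * DT                      # integral of the error
        derivative = (error - self.prev_error) / DT      # derivative of the error
        self.prev_error = error
        return self.kp @ error + self.ki @ self.integral + self.kd @ derivative


# Outer loop: first condition e_p = p_d - p, second condition v_d = PID(e_p).
outer = VectorPID(np.diag([1.0, 1.0]), np.diag([0.05, 0.05]), np.diag([0.2, 0.2]))
# Inner loop: third condition e_v = v_d - v, fourth condition theta_d = PID(e_v).
inner = VectorPID(np.diag([0.8, 0.8]), np.diag([0.02, 0.02]), np.diag([0.1, 0.1]))

p_d = np.array([5.0, 3.0])   # desired horizontal position [x_d, y_d]
p = np.array([4.2, 2.5])     # measured horizontal position [x, y]
v = np.array([0.3, 0.1])     # measured horizontal velocity

v_d = outer.update(p_d - p)       # desired velocity from the outer loop
theta_d = inner.update(v_d - v)   # desired attitude command from the inner loop
print(v_d, theta_d)
```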
On the other hand, the present application discloses a virtual photography control system applying the above virtual photography control method, which, as shown in fig. 5, comprises: a user presentation layer, a business logic layer, and a device data layer, wherein:
the business logic layer comprises a motion pose module and a motion control module;
The business logic layer is mainly responsible for implementing the functions related to motion control and provides corresponding interfaces for the user presentation layer. It mainly consists of the motion pose module and the motion control module. The motion pose module is responsible for calculating the position and pose information of the photographic unmanned aerial vehicle from the information transmitted by the encoders and transmitting the related information to the business logic layer;
the motion control module performs limit control on each axial motion of the unmanned aerial vehicle;
the device data layer comprises a serial communication module, which transmits the instructions of the business logic layer to the servo driver and sends the data returned by the sensors to the business logic layer.
The device data layer is responsible for the basic functions of the software and comprises the serial communication module. It transmits the commands of the business logic layer to the servo driver and forwards the data returned by the sensors to the business logic layer, providing data for the motion pose module. Reliable serial communication and the integrity of the shared data must be ensured.
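A minimal sketch of such a serial communication module is shown below using the pyserial package; the port name, the baud rate and the newline-delimited ASCII command framing are assumptions, since the application does not specify a concrete protocol.

```python
# Minimal device-data-layer sketch built on the pyserial package; the port
# name, the baud rate and the newline-delimited ASCII framing are assumptions.

import serial


class SerialLink:
    """Forwards business-logic commands to the servo driver and returns sensor data."""

    def __init__(self, port: str = "/dev/ttyUSB0", baudrate: int = 115200):
        self.conn = serial.Serial(port, baudrate, timeout=0.1)

    def send_command(self, command: str) -> None:
        """Transmit one command from the business logic layer to the servo driver."""
        self.conn.write((command + "\n").encode("ascii"))

    def read_sensor_frame(self) -> str:
        """Return one line of sensor data for the motion pose module (empty on timeout)."""
        return self.conn.readline().decode("ascii", errors="replace").strip()


if __name__ == "__main__":
    # Requires an actual serial device on the configured port.
    link = SerialLink(port="/dev/ttyUSB0", baudrate=115200)
    link.send_command("AXIS1 SPEED 0.5")   # hypothetical command format
    print(link.read_sensor_frame())
```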
Further, in the embodiment of the present application, the user presentation layer is a UI interface that provides the buttons required for operation and the setting of relevant unmanned aerial vehicle parameters for the user, and can display the current state of each axis of the unmanned aerial vehicle, including the current pose of the unmanned aerial vehicle and whether each axis is in a limit state.
The user presentation layer is mainly responsible for interaction with the user. The UI interface provides the buttons required for operation and an interface for setting the relevant parameters of the unmanned aerial vehicle, allowing the user to operate the photographic unmanned aerial vehicle; it can also display the current state of each axis of the photographic unmanned aerial vehicle, including the current pose, whether each axis is in a limit state, and whether it is under open- or closed-loop feedback. In this way the user can conveniently and directly operate the unmanned aerial vehicle and obtain the information about it that they need.
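The per-axis state displayed by the presentation layer can be modelled with a small status record, as sketched below; the field names and the console rendering are assumptions used only for illustration.

```python
# Per-axis status record displayed by the user presentation layer; the field
# names and the console rendering are illustrative assumptions.

from dataclasses import dataclass
from typing import List


@dataclass
class AxisStatus:
    name: str
    position: float        # pose component reported by the motion pose module
    in_limit_state: bool   # whether the axis has reached a safety limit
    closed_loop: bool      # open- or closed-loop feedback


def render_status(axes: List[AxisStatus]) -> None:
    """Print the current state of each axis, as the UI would display it."""
    for axis in axes:
        loop = "closed-loop" if axis.closed_loop else "open-loop"
        limit = "LIMIT" if axis.in_limit_state else "ok"
        print(f"{axis.name}: pos={axis.position:+.3f}  {loop}  {limit}")


render_status([
    AxisStatus("X", 0.512, False, True),
    AxisStatus("Y", -1.498, True, True),
    AxisStatus("pan", 0.104, False, False),
])
```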
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied, in essence or in the part contributing to the prior art, in whole or in part in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
In the description of the present specification, the descriptions of the terms "one embodiment," "example," "specific example," and the like, mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The foregoing is merely illustrative of the structures of the present application, and various modifications, additions and substitutions may be made to the described embodiments by those skilled in the art without departing from the scope of the application as defined in the accompanying claims.

Claims (10)

1. A virtual photography control method, characterized by comprising the following steps:
controlling the motion precision and the safety limits of a photographic unmanned aerial vehicle respectively, shooting or measuring the actual scene from different angles, and obtaining information and data of the actual scene;
processing the information and data of the actual scene with a computer, and reconstructing the actual scene using computer image rendering and three-dimensional modeling techniques to generate a virtual scene and model based on three-dimensional image processing;
and rendering and producing the virtual scene and model using virtual reality and computer vision techniques.
2. The virtual photography control method according to claim 1, wherein controlling the safety limits of the photographic unmanned aerial vehicle comprises the following steps:
servo-enabling the motion axis to be controlled, and setting the motion mode to the speed mode;
reading the motion parameters of the speed mode, setting the motion parameters to be modified, and reconfiguring the motion parameters of the motion axis;
synchronizing the planned position and the encoder position; if the positive limit is exceeded, judging whether the speed direction is positive, and if the negative limit is exceeded, judging whether the speed direction is negative.
3. The virtual photography control method according to claim 2, wherein if the positive limit is not exceeded, it is checked whether the negative limit is exceeded, and if the negative limit is not exceeded either, the target speed of the motion axis is set.
4. The virtual photography control method according to claim 2, wherein if the positive limit is exceeded but the speed direction is negative, the target speed of the motion axis is set.
5. The virtual photography control method according to claim 1, wherein, to control the motion precision and the safety limits of the photographic unmanned aerial vehicle, the servo motor in each track direction of the unmanned aerial vehicle needs to be controlled separately, specifically comprising the following steps:
establishing an interface class Axis for the servo motors, wherein the interface class Axis provides basic functions for configuring the servo motor, the encoder and the limit switch in each track direction of the photographic unmanned aerial vehicle, acquiring the current position information of the unmanned aerial vehicle and configuring the pulse count;
implementing the interface through the class AxisImpl according to the precision requirements of each axis of the unmanned aerial vehicle and the information fed back by the unmanned aerial vehicle's current sensors.
6. The virtual photography control method according to claim 1, wherein controlling the motion precision of the photographic unmanned aerial vehicle comprises the following steps:
inputting the desired position of the target photographic unmanned aerial vehicle at the current moment and the measured position of the target photographic unmanned aerial vehicle at the current moment into an outer-loop PID controller, thereby obtaining the desired speed;
inputting the desired speed and the measured speed of the target photographic unmanned aerial vehicle at the current moment into an inner-loop PID controller, thereby obtaining the desired attitude of the target photographic unmanned aerial vehicle;
and controlling the motion precision of the target photographic unmanned aerial vehicle according to the desired attitude.
7. The virtual photography control method according to claim 6, wherein inputting the desired position of the target photographic unmanned aerial vehicle at the current moment and the measured position of the target photographic unmanned aerial vehicle at the current moment into the outer-loop PID controller to obtain the desired speed specifically comprises the following steps:
inputting the desired position of the target photographic unmanned aerial vehicle at the current moment and the measured position of the target photographic unmanned aerial vehicle at the current moment into the outer-loop PID controller, wherein the outer-loop PID controller comprises a first control condition and a second control condition;
the calculation formula of the first control condition is $e_p = p_d - p$;
the calculation formula of the second control condition is $v_d = K_p e_p + K_i \int e_p \, dt + K_d \dot{e}_p$;
wherein $v_d$ represents the desired speed, $p_d$ represents the desired position, $p$ represents the measured position, $e_p$ represents the difference between the desired position and the measured position, $\dot{e}_p$ represents the derivative of $e_p$, $K_p, K_i, K_d \in \mathbb{R}^{2 \times 2}$, and $K_p$, $K_i$ and $K_d$ represent the proportional, integral and differential term coefficients of the outer-loop PID controller, respectively;
and obtaining the desired speed according to the first control condition and the second control condition.
8. The virtual photography control method according to claim 6, wherein inputting the desired speed and the measured speed of the target photographic unmanned aerial vehicle at the current moment into the inner-loop PID controller to obtain the desired attitude of the target photographic unmanned aerial vehicle specifically comprises the following steps:
inputting the desired speed and the measured speed of the target photographic unmanned aerial vehicle at the current moment into the inner-loop PID controller, wherein the inner-loop PID controller comprises a third control condition and a fourth control condition;
the calculation formula of the third control condition is $e_v = v_d - v$;
the calculation formula of the fourth control condition is $\theta_d = K_{vp} e_v + K_{vi} \int e_v \, dt + K_{vd} \dot{e}_v$;
wherein $\theta_d$ represents the desired attitude, $v$ represents the measured speed, $e_v$ represents the difference between the desired speed and the measured speed, $\dot{e}_v$ represents the derivative of $e_v$, $K_{vp}, K_{vi}, K_{vd} \in \mathbb{R}^{2 \times 2}$, and $K_{vp}$, $K_{vi}$ and $K_{vd}$ represent the proportional, integral and differential term coefficients of the inner-loop PID controller, respectively.
9. A virtual photography control system, characterized by applying the virtual photography control method according to any one of claims 1 to 8, and comprising: a user presentation layer, a business logic layer, and a device data layer, wherein:
the business logic layer comprises a motion pose module and a motion control module;
the motion pose module calculates the position and pose information of the unmanned aerial vehicle according to the information transmitted by the encoders, and transmits the position and pose information of the unmanned aerial vehicle to the business logic layer;
the motion control module performs limit control on the motion of each axis of the unmanned aerial vehicle;
the device data layer comprises a serial communication module, which transmits the instructions of the business logic layer to the servo driver and sends the data returned by the sensors to the business logic layer.
10. The virtual photography control system according to claim 9, wherein the user presentation layer is a UI interface that provides the buttons required for operation and the setting of relevant unmanned aerial vehicle parameters for the user, and can display the current state of each axis of the unmanned aerial vehicle, including the current pose of the unmanned aerial vehicle and whether each axis is in a limit state.
CN202310816884.7A 2023-07-05 2023-07-05 Virtual photography control method and system Pending CN116828132A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310816884.7A CN116828132A (en) 2023-07-05 2023-07-05 Virtual photography control method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310816884.7A CN116828132A (en) 2023-07-05 2023-07-05 Virtual photography control method and system

Publications (1)

Publication Number Publication Date
CN116828132A true CN116828132A (en) 2023-09-29

Family

ID=88120027

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310816884.7A Pending CN116828132A (en) 2023-07-05 2023-07-05 Virtual photography control method and system

Country Status (1)

Country Link
CN (1) CN116828132A (en)


Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104503466A (en) * 2015-01-05 2015-04-08 北京健德乾坤导航***科技有限责任公司 Micro-miniature unmanned plane navigation unit
KR101601127B1 (en) * 2015-06-12 2016-03-08 엘아이지넥스원 주식회사 Apparatus for controlling pose of small-sized flying object
KR20170103451A (en) * 2016-03-04 2017-09-13 (주)랜텍커뮤니케이션즈 Method for providing fight route optimization
WO2018058320A1 (en) * 2016-09-27 2018-04-05 深圳市大疆创新科技有限公司 Method and apparatus for controlling unmanned aerial vehicle
CN111367309A (en) * 2018-12-25 2020-07-03 杭州海康机器人技术有限公司 Unmanned aerial vehicle flight control method and device
CN109799842A (en) * 2019-01-28 2019-05-24 南京航空航天大学 A kind of multiple no-manned plane sequence flight control method
CN110673631A (en) * 2019-09-26 2020-01-10 深圳市道通智能航空技术有限公司 Unmanned aerial vehicle flight method and device and unmanned aerial vehicle
CN113454559A (en) * 2020-09-28 2021-09-28 深圳市大疆创新科技有限公司 Flight control method and device, unmanned aerial vehicle and storage medium
CN112230633A (en) * 2020-10-29 2021-01-15 北京信息科技大学 Safety protection device for unmanned aerial vehicle control training
CN112652065A (en) * 2020-12-18 2021-04-13 湖南赛吉智慧城市建设管理有限公司 Three-dimensional community modeling method and device, computer equipment and storage medium
CN112669469A (en) * 2021-01-08 2021-04-16 国网山东省电力公司枣庄供电公司 Power plant virtual roaming system and method based on unmanned aerial vehicle and panoramic camera
CN114326766A (en) * 2021-12-03 2022-04-12 深圳先进技术研究院 Vehicle-mounted machine cooperative autonomous tracking and landing method
CN115128965A (en) * 2022-02-17 2022-09-30 浙江工业大学 Method for simulating real trajectory and recommending view-angle capturing content in digital twin scene
CN115729094A (en) * 2022-11-22 2023-03-03 令箭科技(广州)有限责任公司 Unmanned aerial vehicle flight control method, unmanned aerial vehicle and unmanned aerial vehicle formation


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination