WO2020233682A1 - Autonomous surround shooting method and apparatus, and unmanned aerial vehicle - Google Patents

Autonomous surround shooting method and apparatus, and unmanned aerial vehicle

Info

Publication number
WO2020233682A1
WO2020233682A1 (PCT/CN2020/091620)
Authority
WO
WIPO (PCT)
Prior art keywords
shooting
target
binocular camera
drone
flying height
Prior art date
Application number
PCT/CN2020/091620
Other languages
English (en)
French (fr)
Inventor
钟自鸣
Original Assignee
深圳市道通智能航空技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市道通智能航空技术有限公司
Publication of WO2020233682A1 publication Critical patent/WO2020233682A1/zh
Priority to US17/455,744 (US11755042B2)
Priority to US18/360,365 (US20230384803A1)

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12Target-seeking control
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0094Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0016Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C39/00Aircraft not otherwise provided for
    • B64C39/02Aircraft not otherwise provided for characterised by special use
    • B64C39/024Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64DEQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00Equipment not otherwise provided for
    • B64D47/08Arrangements of cameras
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/66Remote control of cameras or camera parts, e.g. by remote control devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/10UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]

Definitions

  • This application relates to the technical field of unmanned aerial vehicles (UAVs), and in particular to an autonomous surround shooting method and apparatus, and an unmanned aerial vehicle.
  • A UAV is an unmanned aircraft whose flight attitude is controlled by radio remote control equipment and built-in programs. Owing to its flexibility, rapid response, unmanned operation and low operating requirements, it is now widely used in aerial photography, plant protection, power-line inspection, disaster relief and many other fields.
  • Aerial photography is currently the main use of consumer drones, and surround shooting is one of the favorite "lens languages" of aerial photography enthusiasts.
  • However, surround shooting requires precise trajectory control of the drone; at the same time, the gimbal must be steered synchronously to keep the surrounding object within a suitable shooting range.
  • The final shooting effect and quality therefore depend entirely on the operator's proficiency, which places extremely high demands on operating skills and indirectly degrades the user experience.
  • In view of this, the embodiments of the present invention provide an autonomous surround shooting method, an apparatus and an unmanned aerial vehicle, so as to free autonomous surround shooting from its dependence on GPS signals and allow it to be performed even in areas with poor GPS reception, such as indoors.
  • an embodiment of the present invention provides an autonomous surround shooting method, which is applied to a drone, the drone includes a binocular camera component, and the method includes:
  • the autonomous surround shooting according to the flying height, the spatial distance, and the real-time detected optical axis direction of the binocular camera assembly includes:
  • the method further includes:
  • maintaining the target flying height and the target spatial distance unchanged, and adjusting the flying direction of the drone according to the optical axis direction of the binocular camera assembly detected in real time includes:
  • the method further includes:
  • the autonomous surround shooting based on the flying height, the spatial distance, and the real-time detected optical axis direction of the binocular camera assembly includes:
  • autonomous surround shooting is performed.
  • performing autonomous surround shooting by combining the shooting mode, the flying height, the spatial distance and the real-time detected optical axis direction of the binocular camera assembly includes:
  • determining, according to the shooting mode, the initial flying height and the initial spatial distance, the target flying height of the drone in each shooting time period and the target spatial distance between the drone and the surrounding object;
  • acquiring, through the binocular camera assembly, the target shooting picture and the surrounding object selected by the user from the target shooting picture includes:
  • taking the shooting picture collected by the binocular camera assembly at the target shooting position as the target shooting picture, and obtaining the surrounding object selected by the user from the target shooting picture.
  • the control instruction includes a joystick instruction or an operation instruction for a shooting picture captured in real time.
  • an embodiment of the present invention provides an autonomous surround shooting device applied to an unmanned aerial vehicle.
  • the unmanned aerial vehicle includes a binocular camera assembly, and the device includes:
  • An acquiring unit for acquiring a target shooting picture and a surrounding object selected by the user from the target shooting picture through the binocular camera component;
  • a distance determining unit configured to determine the spatial distance between the binocular camera assembly and the surrounding object based on the target shooting image
  • An optical axis detection unit for detecting the direction of the optical axis of the binocular camera assembly in real time
  • the surround shooting unit is configured to perform autonomous surround shooting according to the flying height, the spatial distance, and the optical axis direction of the binocular camera assembly detected in real time.
  • the surround shooting unit is specifically configured to:
  • the device further includes:
  • the surround shooting unit maintains the target flying height and the target spatial distance unchanged and adjusts the flight direction of the drone according to the real-time detected optical axis direction of the binocular camera assembly, specifically by:
  • the device further includes:
  • a shooting mode determining unit configured to determine a shooting mode for the surrounding object
  • the surround shooting unit is used for:
  • autonomous surround shooting is performed.
  • the surround shooting unit is specifically configured to:
  • determine, according to the shooting mode, the initial flying height and the initial spatial distance, the target flying height of the drone in each shooting time period and the target spatial distance between the drone and the surrounding object;
  • an embodiment of the present invention provides a drone, including:
  • a fuselage;
  • an arm connected to the fuselage;
  • a power assembly arranged on the arm;
  • a binocular camera assembly arranged on the fuselage;
  • a processor arranged inside the fuselage and communicatively connected with the binocular camera assembly; and
  • a memory communicatively connected with the processor; wherein
  • the memory stores instructions executable by the processor, and the instructions are executed by the processor so that the processor can execute the autonomous surround shooting method as described above.
  • the embodiments of the present invention also provide a non-transitory computer-readable storage medium storing computer-executable instructions, where the computer-executable instructions are used to cause a drone to execute the autonomous surround shooting method described above.
  • the embodiments of the present invention also provide a computer program product, which includes a computer program stored on a non-transitory computer-readable storage medium; the computer program includes program instructions which, when executed by a drone, cause the drone to execute the autonomous surround shooting method described above.
  • the beneficial effects of the embodiments of the present invention are as follows: different from the prior art, the autonomous surround shooting method, apparatus and drone provided by the embodiments of the present invention realize autonomous surround shooting based on visual sensing.
  • on the one hand, since consumer drones aimed at aerial photography are usually already equipped with a binocular camera assembly, the drone's surround flight can be freed from its heavy dependence on GPS without adding hardware cost, so that the autonomous surround shooting task can be completed well even in areas with poor GPS signals such as indoors; on the other hand, the flight trajectory of the drone can always be adjusted according to the shooting effect the user expects, achieving truly shooting-effect-oriented autonomous surround shooting.
  • FIG. 1 is a schematic diagram of one application environment of an autonomous surround shooting method provided by an embodiment of the present invention
  • FIG. 2 is a schematic flowchart of an autonomous surround shooting method provided by an embodiment of the present invention.
  • FIG. 3 is a schematic flowchart of another autonomous surround shooting method according to an embodiment of the present invention.
  • FIG. 4 is a schematic structural diagram of an autonomous surround shooting device provided by an embodiment of the present invention.
  • the embodiments of the present invention provide a new type of autonomous surround shooting method, device and drone.
  • the autonomous surround shooting method and device can be applied to any type of unmanned aerial vehicle, such as a tilt-rotor unmanned aerial vehicle or a rotary-wing unmanned aerial vehicle.
  • the unmanned aerial vehicle is equipped with a binocular camera assembly for collecting image data.
  • in the embodiments of the present invention, "surround shooting" means that the drone, above the surrounding object, flies in a circle around the projection point of the surrounding object onto the plane in which the drone is located, and shoots the surrounding object while circling it.
  • specifically, the autonomous surround shooting method provided by the embodiments of the present invention is a vision-based, shooting-effect-oriented autonomous surround shooting method for drones: a target shooting picture and a surrounding object selected by the user from the target shooting picture are acquired through the binocular camera assembly mounted on the drone; when the target shooting picture is acquired, the flying height of the drone is obtained and the spatial distance between the binocular camera assembly and the surrounding object is determined; the optical axis direction of the binocular camera assembly is detected in real time; and autonomous surround shooting is performed according to the flying height, the spatial distance and the real-time detected optical axis direction of the binocular camera assembly.
  • here, the "target shooting picture" embodies the shooting effect that the user wants to obtain for the surrounding object in the pictures to be shot, including the position and size of the surrounding object in those pictures.
  • performing autonomous surround shooting according to the flying height of the drone and the spatial distance between the binocular camera assembly and the surrounding object at the moment the target shooting picture was acquired, together with the real-time detected optical axis direction of the binocular camera assembly, on the one hand realizes autonomous surround flight and shooting without relying on GPS signals, and on the other hand always reproduces the shooting effect corresponding to the "target shooting picture", guaranteeing the quality of the drone's autonomous surround shooting.
  • the autonomous surround shooting device provided by the embodiment of the present invention may be a virtual device composed of a software program that can implement the autonomous surround shooting method provided by the embodiment of the present invention.
  • the autonomous surround shooting device and the autonomous surround shooting method provided in the embodiments of the present invention are based on the same inventive concept, and have the same technical features and beneficial effects.
  • FIG. 1 is an application environment of the autonomous surround shooting method provided by an embodiment of the present invention.
  • the application environment may include: a surrounding object 10, a drone 20, and a remote control device 30, and the drone 20 and the remote control device 30 can be communicated and connected in any manner.
  • the autonomous surround method provided by the embodiment of the present invention may be executed by the drone 20.
  • the surrounding object 10 may be any object that the user wants to photograph while circling it. The surrounding object 10 may be a static object (that is, an object that stays in place and does not move), such as a building, a tree or an island, or a dynamic object (that is, an object in motion), such as a person, an animal, a car or a boat, depending on the actual application.
  • the drone 20 may be any type of drone, and may specifically include a fuselage 21, arms 22 connected to the fuselage 21, power assemblies 23 provided on the arms 22, and a binocular camera assembly 24 provided on the fuselage 21.
  • the fuselage 21 is the main body of the drone 20, and various functional circuit assemblies of the drone 20 may be arranged in it.
  • for example, in this embodiment, the fuselage 21 houses a processor 211 and a memory 212, and the processor 211 and the memory 212 may be communicatively connected through a system bus or in other ways.
  • the processor 211 may specifically be a micro-programmed control unit (MCU), a digital signal processor (DSP), or the like, and provides computing and control capabilities to control the flight of the drone 20 and the execution of related tasks, for example, controlling the drone 20 to perform any autonomous surround shooting method provided by the embodiments of the present invention.
  • the memory 212 may specifically be a non-transitory computer-readable storage medium, which may be used to store non-transitory software programs, non-transitory computer-executable programs and modules, such as the program instructions/modules corresponding to the autonomous surround shooting method in the embodiments of the present invention.
  • the processor 211 can implement the autonomous surround shooting method in any of the following method embodiments by running non-transitory software programs, instructions, and modules stored in the memory 212.
  • the memory 212 may include a high-speed random access memory, and may also include a non-transitory memory, such as at least one magnetic disk storage device, a flash memory device, or other non-transitory solid-state storage devices.
  • the memory 212 may further include a memory remotely provided with respect to the processor 211, and these remote memories may be connected to the processor 211 through a network.
  • networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
  • the number of the arms 22 is at least two, and the arms 22 can be fixedly connected, integrally formed or detachably connected with the body 21, which is not specifically limited in the embodiment of the present invention.
  • the power assembly 23 may generally include an electronic governor, a motor, and a propeller.
  • the motor is connected between the electronic governor and the propeller, and the motor and the propeller are arranged on the corresponding arm 22 .
  • the electronic governor may be communicatively connected with the processor 211; it can receive a drive signal sent by the processor 211 and supply a drive current to the motor according to the drive signal, so as to control the rotation speed of the motor and the propeller and thereby provide the drone 20 with lift or propulsion.
  • the binocular camera assembly 24 may be any shooting device capable of collecting left and right views; for example, it may include a first image acquisition device and a second image acquisition device arranged at an interval, and the first image acquisition device and the second image acquisition device may be mounted under the fuselage 21 either directly or via a gimbal.
  • in some embodiments, the first image acquisition device may be a main camera used for aerial photography and for providing shooting pictures to the user, and the second image acquisition device may be an auxiliary camera used together with the main camera to realize binocular vision sensing.
  • the binocular camera assembly 24 is communicatively connected with the processor 211, can feed the collected image information back to the processor 211, and can shoot under the control of the processor 211.
  • the remote control device 30 is a ground terminal device corresponding to the drone 20, and is used to remotely control the drone 20.
  • the remote control device 30 may specifically be a remote control, a mobile terminal (for example, a smart phone, a tablet computer, a notebook computer, etc.), a wearable device or other devices.
  • the remote control device 30 can receive user input and send corresponding control instructions to the drone 20 according to that input, so as to control the drone 20 to perform corresponding tasks, such as adjusting its flight attitude or performing autonomous surround shooting.
  • the remote control device 30 may also receive information or image data from the drone 20, and present the information or image data to the user through its display screen or other display device.
  • in practical use, when a user wants to shoot a surrounding object 10 while circling it, the user may first control the drone 20 through the remote control device 30 to obtain a target shooting picture that includes the surrounding object 10, where the target shooting picture represents the shooting effect that the user expects the surrounding object 10 to have in the subsequent pictures to be shot.
  • when the user determines that the target shooting picture has been acquired, the user can send an autonomous surround shooting instruction to the processor 211 of the drone 20 through a function control on the remote control device 30, instructing the drone 20 to perform autonomous surround shooting of the surrounding object 10 based on the target shooting picture.
  • after receiving the autonomous surround shooting instruction, the processor 211 of the drone 20 may first obtain, through the binocular camera assembly 24, the target shooting picture and the surrounding object 10 selected by the user from the target shooting picture; at the same time, it acquires the flying height of the drone 20 at the moment the target shooting picture was acquired and determines, based on the target shooting picture, the spatial distance between the binocular camera assembly 24 and the surrounding object 10; it also detects the optical axis direction of the binocular camera assembly 24 in real time; it then performs autonomous surround shooting according to the flying height, the spatial distance and the real-time detected optical axis direction of the binocular camera assembly.
  • in this way, autonomous surround shooting can be realized on the basis of visual sensing: on the one hand, the dependence on GPS signals is removed, so autonomous surround flight and shooting can be carried out in areas with weak GPS signals such as indoors; on the other hand, being guided by the desired shooting effect guarantees the shooting quality of autonomous surround shooting.
  • the above application environment is only for illustrative purposes. In practical applications, the autonomous surround shooting method and related devices provided by the embodiments of the present invention can be further extended to other suitable application environments. It is not limited to the application environment shown in FIG. 1. For example, in some other embodiments, the number of the drone 20 and the remote control device 30 may also be more than one.
  • FIG. 2 is a schematic flowchart of an autonomous surround shooting method provided by an embodiment of the present invention. This method can be executed by any type of drone including binocular camera components, for example, executed by the drone 20 shown in FIG. 1.
  • the method may include but is not limited to the following steps:
  • Step 110 Obtain a target shooting picture and a surrounding object selected by the user from the target shooting picture through the binocular camera component.
  • the "target shooting picture” is the shooting picture that the user expects to obtain, and is used to determine the shooting effect of the surrounding object in the picture to be shot, for example, the position and size of the surrounding object in the picture to be shot.
  • the target shooting picture acquired through the binocular camera assembly usually includes a left camera view and a right camera view of the same shooting scene, and the drone can send one of the views (the left camera view or the right camera view) to the remote control device for the user to view and confirm.
  • in this embodiment, acquiring the target shooting picture through the binocular camera assembly and selecting the surrounding object from the target shooting picture may specifically be implemented as follows: receiving a control instruction sent by a remote control device, where the remote control device is communicatively connected with the drone; flying to a target shooting position according to the control instruction; taking the shooting picture collected by the binocular camera assembly at the target shooting position as the target shooting picture, and obtaining the surrounding object selected by the user from the target shooting picture.
  • the control instruction is an instruction for manipulating the flight of the drone, and may specifically include a joystick instruction or an operation instruction for a shooting picture captured in real time.
  • for example, referring to the shooting picture fed back in real time by the binocular camera assembly, the user can adjust the spatial position of the drone by operating the joystick on the remote control device, thereby adjusting the position and size of the surrounding object in the shooting picture; alternatively, the user can perform operations such as panning, zooming in and zooming out on the real-time shooting picture displayed on the remote control device.
  • the remote control device generates corresponding operation instructions based on these operations and sends them to the drone; after receiving these operation instructions, the drone can generate corresponding spatial position adjustment instructions (for example, fly forward, fly backward, fly left, fly right, fly up, fly down) to adjust its spatial position.
  • the "target shooting position” refers to the position where the drone is located when the user obtains the desired shooting effect.
  • the user can select the surrounding object from the shooting image and send an autonomous surrounding shooting instruction containing the surrounding object information to the drone.
  • the drone, in turn, can determine that it has flown to the target shooting position when it receives the autonomous surround shooting instruction containing the surrounding object information from the remote control device; at this point, it can directly take the shooting picture collected by the binocular camera assembly at the current moment as the target shooting picture, and then obtain, according to the surrounding object information, the surrounding object selected by the user from the target shooting picture (a minimal sketch of such an instruction follows).
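  • The patent does not specify how the autonomous surround shooting instruction or the surrounding object information is encoded; as a rough illustration only, a minimal Python sketch of what such an instruction could carry, and of treating its arrival as the signal that the target shooting position has been reached, might look like this (all field names are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class SurroundShootingCommand:
    """Hypothetical payload of the autonomous surround shooting instruction.

    The patent only says the instruction carries "surrounding object
    information"; the field names and units here are illustrative."""
    roi_x: float          # bounding box of the user-selected object in the
    roi_y: float          # target shooting picture, normalized to [0, 1]
    roi_w: float
    roi_h: float
    shooting_mode: str = "constant"   # optional mode from Embodiment 3

def on_command_received(cmd: SurroundShootingCommand, capture_current_frame):
    """Treat the arrival of the command as having reached the target shooting
    position: grab the current binocular frame as the target shooting picture."""
    target_picture = capture_current_frame()           # left/right view pair
    roi = (cmd.roi_x, cmd.roi_y, cmd.roi_w, cmd.roi_h)
    return target_picture, roi, cmd.shooting_mode
```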
  • Step 120 Obtain the flying height of the drone when acquiring the target shooting picture.
  • the flying height refers to the height of the drone relative to the ground.
  • when acquiring the target shooting picture, the drone can determine its flying height at that moment through any type of height sensor.
  • Step 130 Determine the spatial distance between the binocular camera component and the surrounding object based on the target shooting image.
  • in this embodiment, the spatial distance between the binocular camera assembly and the surrounding object at the moment the target shooting picture was acquired is determined based on the target shooting picture itself.
  • since the target shooting picture includes a left camera view and a right camera view of the same shooting scene, the disparity of the surrounding object can be obtained by stereo-matching the left camera view against the right camera view, and the spatial distance between the binocular camera assembly and the surrounding object can then be derived from that disparity, as sketched below.
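  • The patent does not give the formula it uses; assuming a calibrated, rectified stereo pair, a minimal Python sketch of turning the matched disparity of the selected object into the camera-object distance via the standard relation Z = f·B/d could look as follows (focal length in pixels and baseline in metres are assumed known from calibration):

```python
import numpy as np

def object_disparity(disparity_map: np.ndarray, roi) -> float:
    """Robust disparity of the selected object: median over its bounding box."""
    x, y, w, h = roi
    patch = disparity_map[y:y + h, x:x + w]
    valid = patch[patch > 0]               # discard unmatched pixels
    return float(np.median(valid))

def distance_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Camera-object distance from rectified-stereo disparity: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("object must have a positive disparity")
    return focal_px * baseline_m / disparity_px
```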
  • Step 140 Detect the optical axis direction of the binocular camera assembly in real time.
  • the optical axis direction of the binocular camera assembly refers to the direction of the center line of the binocular camera assembly.
  • in practice, the current flight direction of the drone can be obtained in real time, and the relative angle between the optical axis of the binocular camera assembly and the drone body can be measured by an angle sensor; the optical axis direction of the binocular camera assembly is then determined in real time by combining the drone's current flight direction with this relative angle, as in the sketch below.
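  • As a small illustrative sketch (not the patent's own implementation), combining the drone's heading with the gimbal angle reported by an angle sensor reduces, for the horizontal component, to a single modular addition:

```python
def optical_axis_heading(drone_yaw_deg: float, camera_relative_yaw_deg: float) -> float:
    """Horizontal heading of the binocular camera's optical axis.

    drone_yaw_deg            -- current flight heading of the drone
    camera_relative_yaw_deg  -- angle between the optical axis and the drone
                                body, as reported by the gimbal's angle sensor
    Returns the optical-axis heading in [0, 360) degrees."""
    return (drone_yaw_deg + camera_relative_yaw_deg) % 360.0
```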
  • Step 150 Perform autonomous surround shooting according to the flying height, the spatial distance, and the real-time detected optical axis direction of the binocular camera assembly.
  • according to the imaging principle of a camera, shooting effects such as the position and size of the surrounding object in the picture captured by the binocular camera assembly can be adjusted by adjusting the relative spatial position between the drone's binocular camera assembly and the surrounding object. Therefore, during flight, as long as the flying height and the spatial distance are kept unchanged, a shooting effect consistent with the target shooting picture is obtained.
  • in addition, from the flying height and the spatial distance it is also possible to determine the orbit radius of the drone when it circles the surrounding object (that is, the length of the line connecting the drone to the projection point of the surrounding object onto the plane in which the drone is located).
  • according to the principle of circular motion, the drone flies in a circle as long as it always moves, in the horizontal plane, along the direction tangent to the orbit radius, and this tangent direction is the direction in the horizontal plane that is spatially perpendicular to the optical axis of the binocular camera assembly. Therefore, by detecting the optical axis direction of the binocular camera assembly in real time, the flight direction of the drone can be adjusted in real time to achieve circling flight and shooting.
  • in view of this, when performing autonomous surround shooting of a static surrounding object, the drone can take the flying height as its target flying height and the spatial distance as the target spatial distance between the binocular camera assembly and the surrounding object; during flight it maintains the target flying height and the target spatial distance unchanged and adjusts its flight direction according to the real-time detected optical axis direction of the binocular camera assembly, thereby realizing autonomous surround shooting centered on the surrounding object.
  • adjusting the flight direction of the drone according to the real-time detected optical axis direction of the binocular camera assembly may specifically be: setting the flight direction of the drone to the direction in the horizontal plane that is spatially perpendicular to the optical axis direction. A sketch of this geometry follows.
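  • A minimal Python sketch of this geometry, under the assumptions that the surrounding object sits roughly at ground level (so the vertical offset equals the flying height) and that simple proportional corrections suffice to hold the targets; the gains and function names are illustrative, not from the patent:

```python
import math

K_H = 0.5   # illustrative proportional gain for height hold (assumption)
K_D = 0.5   # illustrative proportional gain for distance hold (assumption)

def orbit_radius(spatial_distance_m: float, height_above_object_m: float) -> float:
    """Horizontal orbit radius: the camera-object distance projected onto the
    plane the drone flies in.  Assumes the vertical offset between drone and
    surrounding object is known (taken here as the flying height, i.e. the
    object is treated as sitting at ground level)."""
    return math.sqrt(max(spatial_distance_m ** 2 - height_above_object_m ** 2, 0.0))

def tangential_heading(optical_axis_heading_deg: float, clockwise: bool = True) -> float:
    """Flight direction for circling: in the horizontal plane, perpendicular to
    the optical axis, which keeps pointing at the surrounding object."""
    offset = 90.0 if clockwise else -90.0
    return (optical_axis_heading_deg + offset) % 360.0

def control_step(target_height_m, target_distance_m, height_m, distance_m, axis_heading_deg):
    """One simplified control step: hold height and distance, fly tangentially."""
    heading_cmd = tangential_heading(axis_heading_deg)
    climb_cmd = K_H * (target_height_m - height_m)        # climb/descend to hold height
    radial_cmd = K_D * (distance_m - target_distance_m)   # move outward/inward to hold distance
    return heading_cmd, climb_cmd, radial_cmd
```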
  • further, in some embodiments, when the surrounding object is in motion, in order to keep performing autonomous surround shooting of it at all times, the drone must, in addition to circling, move synchronously with the surrounding object so that the surrounding object appears relatively static in the pictures to be shot.
  • in such embodiments, before performing the surround shooting step, the method may further include: acquiring the moving direction of the surrounding object in real time.
  • in that case, maintaining the target flying height and the target spatial distance unchanged and adjusting the flight direction of the drone according to the real-time detected optical axis direction of the binocular camera assembly specifically becomes: maintaining the target flying height and the target spatial distance unchanged, and adjusting the flight direction of the drone according to both the real-time acquired moving direction of the surrounding object and the real-time detected optical axis direction of the binocular camera assembly, as in the sketch below.
  • the drone may determine the moving direction of the surrounding object by comparing two consecutive shooting pictures obtained by the binocular camera assembly, or by comparing the shooting picture obtained by the binocular camera assembly at the current moment with the target shooting picture.
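  • The patent does not state how the two directions are combined; one plausible sketch is to superimpose the object's ground velocity on the tangential orbit component, which keeps the relative geometry (and hence the picture) approximately unchanged:

```python
import math
import numpy as np

def heading_to_unit(heading_deg: float) -> np.ndarray:
    """Unit vector in the horizontal plane for a heading given in degrees."""
    rad = math.radians(heading_deg)
    return np.array([math.cos(rad), math.sin(rad)])

def dynamic_orbit_velocity(axis_heading_deg: float,
                           orbit_speed_mps: float,
                           object_velocity_mps: np.ndarray,
                           clockwise: bool = True) -> np.ndarray:
    """Horizontal velocity command for circling a moving object: the tangential
    orbit component plus the object's own ground velocity, so that the target
    stays approximately still in the picture."""
    tangent_deg = (axis_heading_deg + (90.0 if clockwise else -90.0)) % 360.0
    return orbit_speed_mps * heading_to_unit(tangent_deg) + object_velocity_mps
```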
  • it can be seen from the above that the autonomous surround shooting method provided by the embodiments of the present invention, upon receiving an autonomous surround shooting instruction, acquires through the binocular camera assembly a target shooting picture and the surrounding object selected by the user from it; at the same time, when the target shooting picture is acquired, it obtains the flying height of the drone and determines the spatial distance between the binocular camera assembly and the surrounding object; it also detects the optical axis direction of the binocular camera assembly in real time; it then performs autonomous surround shooting according to the flying height, the spatial distance and the real-time detected optical axis direction. On the one hand, autonomous surround flight and shooting are realized without relying on GPS signals; on the other hand, the shooting effect corresponding to the "target shooting picture" is always obtained, guaranteeing the quality of the drone's autonomous surround shooting.
  • FIG. 3 is a schematic flowchart of another autonomous surround shooting method provided by an embodiment of the present invention. This method is similar to the autonomous surround shooting method described in the second embodiment above, with the difference that: in this embodiment, the drone can also perform personalized autonomous surround shooting according to the shooting mode set by the user.
  • the method may include but is not limited to the following steps:
  • Step 210 Obtain a target shooting picture and a surrounding object selected by the user from the target shooting picture through the binocular camera component.
  • Step 220 Obtain the flying height of the drone when acquiring the target shooting image.
  • Step 230 Determine the spatial distance between the binocular camera component and the surrounding object based on the target shooting image.
  • Step 240 Detect the direction of the optical axis of the binocular camera assembly in real time.
  • Step 250 Determine a shooting mode for the surrounding object.
  • the shooting mode refers to the shooting method used when shooting the surrounding object.
  • the user can simultaneously limit the shooting mode when inputting an autonomous surrounding shooting instruction for the surrounding object.
  • the shooting mode may include, but is not limited to: a constant mode, a gradual mode, a custom mode, and the like.
  • the constant mode always performs autonomous surround shooting according to the shooting effect of the target shooting frame.
  • the gradual change mode is to gradually change the position and/or size of the surrounding object in the shooting frame during the surrounding shooting.
  • the custom mode is to adjust the shooting effect of the surrounding object in the shooting frame according to the user's custom setting during the surround shooting.
  • Step 260 Perform autonomous surround shooting in combination with the shooting mode, the flying height, the spatial distance, and the optical axis direction of the binocular camera assembly detected in real time.
  • in this embodiment, autonomous surround shooting is performed in combination with the shooting mode, in addition to the flying height, the spatial distance and the real-time detected optical axis direction of the binocular camera assembly.
  • specifically, if the shooting mode is the constant mode, the autonomous surround shooting method described in the second embodiment can be used directly.
  • if the shooting mode is the gradual mode, the custom mode or another mode, a specific implementation of combining the shooting mode, the flying height, the spatial distance and the real-time detected optical axis direction to perform autonomous surround shooting may be: taking the flying height as the initial flying height; taking the spatial distance as the initial spatial distance between the drone and the surrounding object; determining, according to the shooting mode, the initial flying height and the initial spatial distance, the target flying height of the drone in each shooting time period and the target spatial distance between the drone and the surrounding object; and, within each shooting time period, flying according to that period's target flying height and target spatial distance while adjusting the flight direction of the drone according to the real-time detected optical axis direction of the binocular camera assembly, thereby realizing autonomous surround shooting centered on the surrounding object. A sketch of how per-period targets could be derived follows.
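  • The patent leaves open how the per-period targets are derived from the shooting mode; a simple illustrative assumption for the gradual mode is a linear schedule between the initial values and user-chosen final values:

```python
def segment_targets(initial_height_m: float, final_height_m: float,
                    initial_distance_m: float, final_distance_m: float,
                    num_segments: int):
    """Per-period targets for a 'gradual' surround shot: linearly interpolate the
    flying height and the camera-object distance from the values captured with
    the target shooting picture to user-chosen final values.  Both the linear
    schedule and the final values are assumptions, not specified by the patent."""
    targets = []
    for k in range(1, num_segments + 1):
        t = k / num_segments
        height = initial_height_m + t * (final_height_m - initial_height_m)
        distance = initial_distance_m + t * (final_distance_m - initial_distance_m)
        targets.append((height, distance))
    return targets
```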
  • it should be noted that steps 210 to 240 have the same technical features as steps 110 to 140 of the autonomous surround shooting method shown in FIG. 2; for their specific implementation, reference can be made to the corresponding descriptions of steps 110 to 140 in the above embodiment, which are not repeated here.
  • it can be seen from the above that the autonomous surround shooting method provided by this embodiment, by further taking into account the shooting mode selected by the user, realizes personalized, shooting-effect-oriented autonomous surround shooting and provides a more engaging autonomous surround shooting experience.
  • the autonomous surround shooting device 400 can run in the processor 211 of the drone 20 shown in FIG. 1.
  • the device 400 includes: an acquisition unit 401, a distance determination unit 402, an optical axis detection unit 403, and a surround shooting unit 404.
  • the acquisition unit 401 is configured to acquire, through the binocular camera assembly, the target shooting picture and the surrounding object selected by the user from the target shooting picture, and is also configured to acquire the flying height of the drone at the moment the target shooting picture is acquired;
  • the distance determination unit 402 is configured to determine, based on the target shooting picture, the spatial distance between the binocular camera assembly and the surrounding object;
  • the optical axis detection unit 403 is used to detect the direction of the optical axis of the binocular camera assembly in real time;
  • the surround shooting unit 404 is configured to perform autonomous surround shooting according to the flying height, the spatial distance, and the optical axis direction of the binocular camera assembly detected in real time.
  • in this embodiment, when the drone receives an autonomous surround shooting instruction, the acquisition unit 401 can be used to acquire, through the binocular camera assembly, the target shooting picture, the surrounding object selected by the user from the target shooting picture and the flying height of the drone at the moment the target shooting picture is acquired; meanwhile, the distance determination unit 402 determines, based on the target shooting picture, the spatial distance between the binocular camera assembly and the surrounding object, and the optical axis detection unit 403 detects the optical axis direction of the binocular camera assembly in real time; the surround shooting unit 404 then performs autonomous surround shooting according to the flying height, the spatial distance and the real-time detected optical axis direction of the binocular camera assembly.
  • in some embodiments, the acquisition unit 401 acquires the target shooting picture and the surrounding object selected by the user from it through the binocular camera assembly specifically by: receiving a control instruction sent by a remote control device, where the remote control device is communicatively connected with the drone; flying to a target shooting position according to the control instruction; and taking the shooting picture collected by the binocular camera assembly at the target shooting position as the target shooting picture and obtaining the surrounding object selected by the user from the target shooting picture.
  • the control instruction includes a joystick instruction or an operation instruction for a captured image captured in real time.
  • in some embodiments, the surround shooting unit 404 is specifically configured to: take the flying height as the target flying height of the drone; take the spatial distance as the target spatial distance between the binocular camera assembly and the surrounding object; and maintain the target flying height and the target spatial distance unchanged while adjusting the flight direction of the drone according to the real-time detected optical axis direction of the binocular camera assembly, so as to realize autonomous surround shooting centered on the surrounding object.
  • further, in order to be able to perform autonomous surround shooting of a surrounding object that is moving, the autonomous surround shooting device 400 further includes:
  • a monitoring unit 405, configured to acquire the moving direction of the surrounding object in real time.
  • in this case, the surround shooting unit 404 maintains the target flying height and the target spatial distance unchanged and adjusts the flight direction of the drone according to the real-time detected optical axis direction of the binocular camera assembly specifically by: maintaining the target flying height and the target spatial distance unchanged, and adjusting the flight direction of the drone according to both the real-time acquired moving direction of the surrounding object and the real-time detected optical axis direction of the binocular camera assembly.
  • the autonomous surround shooting device 400 further includes:
  • a shooting mode determination unit 406, configured to determine a shooting mode for the surrounding object. In this embodiment, the surround shooting unit 404 is then configured to perform autonomous surround shooting by combining the shooting mode, the flying height, the spatial distance and the real-time detected optical axis direction of the binocular camera assembly.
  • in some embodiments, the surround shooting unit 404 is specifically configured to: take the flying height as the initial flying height; take the spatial distance as the initial spatial distance between the drone and the surrounding object; determine, according to the shooting mode, the initial flying height and the initial spatial distance, the target flying height of the drone in each shooting time period and the target spatial distance between the drone and the surrounding object; and, within each shooting time period, fly according to that period's target flying height and target spatial distance while adjusting the flight direction of the drone according to the real-time detected optical axis direction of the binocular camera assembly, so as to realize autonomous surround shooting centered on the surrounding object.
  • it should be noted that the device embodiment described above is merely illustrative: the units described as separate components may or may not be physically separate, that is, they may be located in one place or distributed over multiple network units, and some or all of the units/modules may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • moreover, since the autonomous surround shooting device is based on the same inventive concept as the autonomous surround shooting method described in the second or third method embodiment above, the corresponding content of those method embodiments also applies to this device embodiment and is not repeated here.
  • the autonomous surround shooting device provided by the embodiments of the present invention uses the acquisition unit 401 to acquire, through the binocular camera assembly, the target shooting picture, the surrounding object selected by the user from the target shooting picture and the flying height of the drone at the moment the target shooting picture is acquired; the distance determination unit 402 determines, based on the target shooting picture, the spatial distance between the binocular camera assembly and the surrounding object, and the optical axis detection unit 403 detects the optical axis direction of the binocular camera assembly in real time; the surround shooting unit 404 then performs autonomous surround shooting according to the flying height, the spatial distance and the real-time detected optical axis direction. On the one hand, this realizes autonomous surround flight and shooting of the drone without relying on GPS signals; on the other hand, the shooting effect corresponding to the "target shooting picture" is always obtained, guaranteeing the quality of the drone's autonomous surround shooting.
  • the embodiments of the present invention also provide a non-transitory computer-readable storage medium storing computer-executable instructions; when these computer-executable instructions are executed by one or more processors (for example, by the processor 211 in FIG. 1), they cause the one or more processors to execute the autonomous surround shooting method in any of the above method embodiments, for example to execute method steps 110 to 150 in FIG. 2 or method steps 210 to 260 in FIG. 3 described above, thereby implementing the functions of the units 401 to 406 in FIG. 4.
  • each embodiment can be implemented by software plus a general hardware platform, and of course, it can also be implemented by hardware.
  • for example, the function of the acquisition unit 401 may be realized by the binocular camera assembly, a height sensor and other related sensors; the function of the distance determination unit 402 may be realized by an image processor; the function of the optical axis detection unit 403 may be realized by an orientation sensor such as a gyroscope; and the function of the surround shooting unit 404 may be realized by the interaction between a control module and the binocular camera assembly.
  • the computer program can be stored in a non-transitory computer-readable storage medium; the computer program includes program instructions which, when executed by the drone, cause the drone to carry out the procedures of the foregoing method embodiments.
  • the storage medium may be a magnetic disk, an optical disk, a read-only memory (Read-Only Memory, ROM), or a random access memory (Random Access Memory, RAM), etc.
  • the above products can execute the autonomous surround shooting method provided by the embodiments of the present invention, and have corresponding functional modules and beneficial effects for executing the autonomous surround shooting method.
  • for technical details not described in detail in this embodiment, reference can be made to the autonomous surround shooting method provided by the embodiments of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Theoretical Computer Science (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Studio Devices (AREA)

Abstract

The embodiments of the present invention relate to the technical field of unmanned aerial vehicles, and specifically disclose an autonomous surround shooting method, an apparatus and a drone. The drone includes a binocular camera assembly, and the method includes: acquiring, through the binocular camera assembly, a target shooting picture and a surrounding object selected by the user from the target shooting picture; acquiring the flying height of the drone at the moment the target shooting picture is acquired; determining, based on the target shooting picture, the spatial distance between the binocular camera assembly and the surrounding object; detecting the optical axis direction of the binocular camera assembly in real time; and performing autonomous surround shooting according to the flying height, the spatial distance and the real-time detected optical axis direction of the binocular camera assembly. Through the above technical solution, the embodiments of the present invention can, on the one hand, perform autonomous surround shooting in areas with weak GPS signals such as indoors and, on the other hand, guarantee the quality of the drone's autonomous surround shooting.

Description

Autonomous surround shooting method and apparatus, and unmanned aerial vehicle
This application claims priority to Chinese patent application No. 201910430492.0, entitled "Autonomous surround shooting method and apparatus, and unmanned aerial vehicle", filed with the Chinese Patent Office on May 22, 2019, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the technical field of unmanned aerial vehicles, and in particular to an autonomous surround shooting method and apparatus, and an unmanned aerial vehicle.
Background
A drone is an unmanned aircraft whose flight attitude is controlled by radio remote control equipment and built-in programs. Owing to its flexibility, rapid response, unmanned operation and low operating requirements, it is now widely used in aerial photography, plant protection, power-line inspection, disaster relief and many other fields.
Aerial photography is currently the main use of consumer drones, and surround shooting is one of the favorite "lens languages" of aerial photography enthusiasts. However, surround shooting requires precise trajectory control of the drone and, at the same time, the gimbal must be steered synchronously to keep the surrounding object within a suitable shooting range. The final shooting effect and quality therefore depend entirely on the operator's proficiency, which places extremely high demands on operating skills and indirectly degrades the user experience.
For this reason, many consumer drones on the market provide an autonomous surround shooting function for points of interest. However, the mainstream implementations of autonomous drone surround shooting are all based on GPS: the drone uses the received GPS coordinates to plan the orbit trajectory and to perform the corresponding trajectory positioning, tracking and control. The quality of the GPS signal therefore directly determines the accuracy of autonomous surround shooting, and in areas with poor GPS signals (for example, indoors) it is difficult for the drone to achieve autonomous surround shooting.
Therefore, the existing autonomous surround shooting technology still needs to be improved.
Summary
In view of this, the embodiments of the present invention provide an autonomous surround shooting method, an apparatus and a drone, so as to free the existing autonomous surround shooting technology from its dependence on GPS signals and achieve the goal of performing autonomous surround shooting even in areas with poor GPS signals, such as indoors.
To solve the above technical problem, the embodiments of the present invention provide the following technical solutions:
In a first aspect, an embodiment of the present invention provides an autonomous surround shooting method applied to a drone, where the drone includes a binocular camera assembly, and the method includes:
acquiring, through the binocular camera assembly, a target shooting picture and a surrounding object selected by the user from the target shooting picture;
acquiring the flying height of the drone at the moment the target shooting picture is acquired;
determining, based on the target shooting picture, the spatial distance between the binocular camera assembly and the surrounding object;
detecting the optical axis direction of the binocular camera assembly in real time; and
performing autonomous surround shooting according to the flying height, the spatial distance and the real-time detected optical axis direction of the binocular camera assembly.
Optionally, performing autonomous surround shooting according to the flying height, the spatial distance and the real-time detected optical axis direction of the binocular camera assembly includes:
taking the flying height as the target flying height of the drone;
taking the spatial distance as the target spatial distance between the binocular camera assembly and the surrounding object; and
maintaining the target flying height and the target spatial distance unchanged, and adjusting the flight direction of the drone according to the real-time detected optical axis direction of the binocular camera assembly, so as to realize autonomous surround shooting centered on the surrounding object.
Optionally, the method further includes:
acquiring the moving direction of the surrounding object in real time;
in which case maintaining the target flying height and the target spatial distance unchanged and adjusting the flight direction of the drone according to the real-time detected optical axis direction of the binocular camera assembly includes:
maintaining the target flying height and the target spatial distance unchanged, and adjusting the flight direction of the drone according to the real-time acquired moving direction of the surrounding object and the real-time detected optical axis direction of the binocular camera assembly.
Optionally, the method further includes:
determining a shooting mode for the surrounding object;
in which case performing autonomous surround shooting according to the flying height, the spatial distance and the real-time detected optical axis direction of the binocular camera assembly includes:
performing autonomous surround shooting by combining the shooting mode, the flying height, the spatial distance and the real-time detected optical axis direction of the binocular camera assembly.
Optionally, performing autonomous surround shooting by combining the shooting mode, the flying height, the spatial distance and the real-time detected optical axis direction of the binocular camera assembly includes:
taking the flying height as an initial flying height;
taking the spatial distance as an initial spatial distance between the drone and the surrounding object;
determining, according to the shooting mode, the initial flying height and the initial spatial distance, the target flying height of the drone in each shooting time period and the target spatial distance between the drone and the surrounding object; and
within each shooting time period, flying according to the target flying height and the target spatial distance, and adjusting the flight direction of the drone according to the real-time detected optical axis direction of the binocular camera assembly, so as to realize autonomous surround shooting centered on the surrounding object.
Optionally, acquiring, through the binocular camera assembly, the target shooting picture and the surrounding object selected by the user from the target shooting picture includes:
receiving a control instruction sent by a remote control device, where the remote control device is communicatively connected with the drone;
flying to a target shooting position according to the control instruction; and
taking the shooting picture collected by the binocular camera assembly at the target shooting position as the target shooting picture, and obtaining the surrounding object selected by the user from the target shooting picture.
Optionally, the control instruction includes a joystick instruction or an operation instruction for a shooting picture captured in real time.
In a second aspect, an embodiment of the present invention provides an autonomous surround shooting device applied to a drone, where the drone includes a binocular camera assembly, and the device includes:
an acquisition unit, configured to acquire, through the binocular camera assembly, a target shooting picture and a surrounding object selected by the user from the target shooting picture, and
further configured to acquire the flying height of the drone at the moment the target shooting picture is acquired;
a distance determination unit, configured to determine, based on the target shooting picture, the spatial distance between the binocular camera assembly and the surrounding object;
an optical axis detection unit, configured to detect the optical axis direction of the binocular camera assembly in real time; and
a surround shooting unit, configured to perform autonomous surround shooting according to the flying height, the spatial distance and the real-time detected optical axis direction of the binocular camera assembly.
Optionally, the surround shooting unit is specifically configured to:
take the flying height as the target flying height of the drone;
take the spatial distance as the target spatial distance between the binocular camera assembly and the surrounding object; and
maintain the target flying height and the target spatial distance unchanged, and adjust the flight direction of the drone according to the real-time detected optical axis direction of the binocular camera assembly, so as to realize autonomous surround shooting centered on the surrounding object.
Optionally, the device further includes:
a monitoring unit, configured to acquire the moving direction of the surrounding object in real time;
in which case the surround shooting unit maintains the target flying height and the target spatial distance unchanged and adjusts the flight direction of the drone according to the real-time detected optical axis direction of the binocular camera assembly specifically by:
maintaining the target flying height and the target spatial distance unchanged, and adjusting the flight direction of the drone according to the real-time acquired moving direction of the surrounding object and the real-time detected optical axis direction of the binocular camera assembly.
Optionally, the device further includes:
a shooting mode determination unit, configured to determine a shooting mode for the surrounding object;
in which case the surround shooting unit is configured to:
perform autonomous surround shooting by combining the shooting mode, the flying height, the spatial distance and the real-time detected optical axis direction of the binocular camera assembly.
Optionally, the surround shooting unit is specifically configured to:
take the flying height as an initial flying height;
take the spatial distance as an initial spatial distance between the drone and the surrounding object;
determine, according to the shooting mode, the initial flying height and the initial spatial distance, the target flying height of the drone in each shooting time period and the target spatial distance between the drone and the surrounding object; and
within each shooting time period, fly according to the target flying height and the target spatial distance, and adjust the flight direction of the drone according to the real-time detected optical axis direction of the binocular camera assembly, so as to realize autonomous surround shooting centered on the surrounding object.
In a third aspect, an embodiment of the present invention provides a drone, including:
a fuselage;
an arm connected to the fuselage;
a power assembly arranged on the arm;
a binocular camera assembly arranged on the fuselage;
a processor arranged inside the fuselage and communicatively connected with the binocular camera assembly; and
a memory communicatively connected with the processor; where
the memory stores instructions executable by the processor, and the instructions are executed by the processor to enable the processor to execute the autonomous surround shooting method described above.
In a fourth aspect, the embodiments of the present invention further provide a non-transitory computer-readable storage medium storing computer-executable instructions, where the computer-executable instructions are used to cause a drone to execute the autonomous surround shooting method described above.
In a fifth aspect, the embodiments of the present invention further provide a computer program product, which includes a computer program stored on a non-transitory computer-readable storage medium; the computer program includes program instructions which, when executed by a drone, cause the drone to execute the autonomous surround shooting method described above.
The beneficial effects of the embodiments of the present invention are as follows: different from the prior art, the autonomous surround shooting method, apparatus and drone provided by the embodiments of the present invention realize autonomous surround shooting based on visual sensing. On the one hand, since consumer drones aimed at aerial photography are usually already equipped with a binocular camera assembly, the drone's surround flight can be freed from its heavy dependence on GPS without adding extra hardware cost, so that the autonomous surround shooting task can be completed well even in areas with poor GPS signals such as indoors; on the other hand, the flight trajectory of the drone can always be adjusted according to the shooting effect the user expects, achieving truly shooting-effect-oriented autonomous surround shooting.
Brief Description of the Drawings
To describe the technical solutions of the embodiments of the present invention more clearly, the drawings required by the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and a person of ordinary skill in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram of one application environment of an autonomous surround shooting method provided by an embodiment of the present invention;
FIG. 2 is a schematic flowchart of an autonomous surround shooting method provided by an embodiment of the present invention;
FIG. 3 is a schematic flowchart of another autonomous surround shooting method provided by an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of an autonomous surround shooting device provided by an embodiment of the present invention.
Detailed Description
To make the objectives, technical solutions and advantages of the present invention clearer, the present invention is further described in detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it.
It should be noted that, provided there is no conflict, the individual features of the embodiments of the present invention may be combined with one another, all within the scope of protection of the present invention. In addition, although functional modules are divided in the device schematic and a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed with a module division different from that in the device, or in an order different from that in the flowcharts.
To guarantee the shooting effect of autonomous surround shooting while freeing the drone's autonomous surround flight from its dependence on GPS signals, the embodiments of the present invention provide a new autonomous surround shooting method, an apparatus and a drone. The autonomous surround shooting method and device can be applied to any type of drone, such as a tilt-rotor drone or a rotary-wing drone, on which a binocular camera assembly for collecting image data is mounted. In addition, in the embodiments of the present invention, "surround shooting" means that the drone, above the surrounding object, flies in a circle around the projection point of the surrounding object onto the plane in which the drone is located, and shoots the surrounding object while circling it.
Specifically, the autonomous surround shooting method provided by the embodiments of the present invention is a vision-based, shooting-effect-oriented autonomous surround shooting method for drones. Specifically: a target shooting picture and a surrounding object selected by the user from the target shooting picture are acquired through the binocular camera assembly mounted on the drone; when the target shooting picture is acquired, the flying height of the drone is obtained and the spatial distance between the binocular camera assembly and the surrounding object is determined; the optical axis direction of the binocular camera assembly is detected in real time; and autonomous surround shooting is performed according to the flying height, the spatial distance and the real-time detected optical axis direction of the binocular camera assembly. In the embodiments of the present invention, the "target shooting picture" embodies the shooting effect that the user wants to obtain for the surrounding object in the pictures to be shot (including the position and size of the surrounding object in those pictures). Performing autonomous surround shooting according to the flying height of the drone and the spatial distance between the binocular camera assembly and the surrounding object at the moment the target shooting picture was acquired, together with the real-time detected optical axis direction of the binocular camera assembly, on the one hand realizes autonomous surround flight and shooting without relying on GPS signals and on the other hand always reproduces the shooting effect corresponding to the "target shooting picture", guaranteeing the quality of the drone's autonomous surround shooting.
The autonomous surround shooting device provided by the embodiments of the present invention may be a virtual device composed of software programs that implements the autonomous surround shooting method provided by the embodiments of the present invention. The autonomous surround shooting device and the autonomous surround shooting method provided by the embodiments of the present invention are based on the same inventive concept and have the same technical features and beneficial effects.
The embodiments of the present invention are further described below with reference to the drawings.
Embodiment 1
FIG. 1 shows one application environment of the autonomous surround shooting method provided by an embodiment of the present invention. Specifically, referring to FIG. 1, the application environment may include a surrounding object 10, a drone 20 and a remote control device 30, where the drone 20 and the remote control device 30 may be communicatively connected in any manner. The autonomous surround shooting method provided by the embodiments of the present invention may be executed by the drone 20.
The surrounding object 10 may be any object that the user wants to photograph while circling it. The surrounding object 10 may be a static object (that is, an object that stays in place and does not move), such as a building, a tree or an island, or a dynamic object (that is, an object in motion), such as a person, an animal, a car or a boat, depending on the actual application.
The drone 20 may be any type of drone, and may specifically include a fuselage 21, arms 22 connected to the fuselage 21, power assemblies 23 provided on the arms 22, and a binocular camera assembly 24 provided on the fuselage 21.
The fuselage 21 is the main body of the drone 20, and various functional circuit assemblies of the drone 20 may be arranged in it. For example, in this embodiment, the fuselage 21 houses a processor 211 and a memory 212, and the processor 211 and the memory 212 may be communicatively connected through a system bus or in other ways.
The processor 211 may specifically be a micro-programmed control unit (MCU), a digital signal processor (DSP), or the like, and provides computing and control capabilities to control the flight of the drone 20 and the execution of related tasks, for example, controlling the drone 20 to perform any autonomous surround shooting method provided by the embodiments of the present invention.
The memory 212 may specifically be a non-transitory computer-readable storage medium, which may be used to store non-transitory software programs, non-transitory computer-executable programs and modules, such as the program instructions/modules corresponding to the autonomous surround shooting method in the embodiments of the present invention. By running the non-transitory software programs, instructions and modules stored in the memory 212, the processor 211 can implement the autonomous surround shooting method of any of the method embodiments described below. Specifically, the memory 212 may include a high-speed random access memory and may also include a non-transitory memory, such as at least one magnetic disk storage device, a flash memory device or another non-transitory solid-state storage device. In some embodiments, the memory 212 may further include memories remotely located relative to the processor 211, and these remote memories may be connected to the processor 211 through a network. Examples of such networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks and combinations thereof.
There are at least two arms 22, and the arms 22 may be fixedly connected to, integrally formed with or detachably connected to the fuselage 21, which is not specifically limited in the embodiments of the present invention.
The power assembly 23 may generally include an electronic governor, a motor and a propeller. The motor is connected between the electronic governor and the propeller, and the motor and the propeller are arranged on the corresponding arm 22. The electronic governor may be communicatively connected with the processor 211; it can receive a drive signal sent by the processor 211 and supply a drive current to the motor according to the drive signal, so as to control the rotation speed of the motor and the propeller and thereby provide the drone 20 with lift or propulsion.
The binocular camera assembly 24 may be any shooting device capable of collecting left and right views; for example, it may include a first image acquisition device and a second image acquisition device arranged at an interval, and the first image acquisition device and the second image acquisition device may be mounted under the fuselage 21 either directly or via a gimbal. In some embodiments, the first image acquisition device may be a main camera used for aerial photography and for providing shooting pictures to the user, and the second image acquisition device may be an auxiliary camera used together with the main camera to realize binocular vision sensing. The binocular camera assembly 24 is communicatively connected with the processor 211, can feed the collected image information back to the processor 211, and can shoot under the control of the processor 211.
The remote control device 30 is the ground-side device corresponding to the drone 20 and is used to remotely control the drone 20. The remote control device 30 may specifically be a remote controller, a mobile terminal (for example, a smartphone, a tablet computer or a notebook computer), a wearable device or another device. The remote control device 30 can receive user input and send corresponding control instructions to the drone 20 according to that input, so as to control the drone 20 to perform corresponding tasks, such as adjusting its flight attitude or performing autonomous surround shooting. The remote control device 30 can also receive information or image data from the drone 20 and present that information or image data to the user through its display screen or another display device.
In practical use, when a user wants to shoot a surrounding object 10 while circling it, the user may first control the drone 20 through the remote control device 30 to obtain a target shooting picture that includes the surrounding object 10, where the target shooting picture represents the shooting effect that the user expects the surrounding object 10 to have in the subsequent pictures to be shot. When the user determines that the target shooting picture has been acquired, the user can send an autonomous surround shooting instruction to the processor 211 of the drone 20 through a function control on the remote control device 30, instructing the drone 20 to perform autonomous surround shooting of the surrounding object 10 based on the target shooting picture.
After receiving the autonomous surround shooting instruction, the processor 211 of the drone 20 may first obtain, through the binocular camera assembly 24, the target shooting picture and the surrounding object 10 selected by the user from the target shooting picture; at the same time, it acquires the flying height of the drone 20 at the moment the target shooting picture was acquired and determines, based on the target shooting picture, the spatial distance between the binocular camera assembly 24 and the surrounding object 10; it also detects the optical axis direction of the binocular camera assembly 24 in real time; it then performs autonomous surround shooting according to the flying height, the spatial distance and the real-time detected optical axis direction of the binocular camera assembly. In this way, autonomous surround shooting can be realized on the basis of visual sensing: on the one hand, the dependence on GPS signals is removed, so autonomous surround flight and shooting can be carried out in areas with weak GPS signals such as indoors; on the other hand, being guided by the desired shooting effect guarantees the shooting quality of autonomous surround shooting.
It should be noted that the above application environment is only illustrative. In practical applications, the autonomous surround shooting method and the related devices provided by the embodiments of the present invention can be further extended to other suitable application environments and are not limited to the application environment shown in FIG. 1. For example, in some other embodiments, there may be more than one drone 20 and more than one remote control device 30.
Embodiment 2
FIG. 2 is a schematic flowchart of an autonomous surround shooting method provided by an embodiment of the present invention. The method can be executed by any type of drone that includes a binocular camera assembly, for example by the drone 20 shown in FIG. 1.
Specifically, referring to FIG. 2, the method may include, but is not limited to, the following steps:
Step 110: acquire, through the binocular camera assembly, a target shooting picture and a surrounding object selected by the user from the target shooting picture.
In this embodiment, the "target shooting picture" is the shooting picture the user expects to obtain, and it determines the shooting effect of the surrounding object in the pictures to be shot, for example the position and size of the surrounding object in those pictures. Specifically, the target shooting picture acquired through the binocular camera assembly usually includes a left camera view and a right camera view of the same shooting scene, and the drone can send one of the views (the left camera view or the right camera view) to the remote control device for the user to view and confirm.
In this embodiment, acquiring the target shooting picture through the binocular camera assembly and selecting the surrounding object from the target shooting picture may specifically be implemented as follows: receiving a control instruction sent by a remote control device, where the remote control device is communicatively connected with the drone; flying to a target shooting position according to the control instruction; taking the shooting picture collected by the binocular camera assembly at the target shooting position as the target shooting picture, and obtaining the surrounding object selected by the user from the target shooting picture.
The control instruction is an instruction for manipulating the flight of the drone, and may specifically include a joystick instruction or an operation instruction for a shooting picture captured in real time. For example, referring to the shooting picture fed back in real time by the binocular camera assembly, the user can adjust the spatial position of the drone by operating the joystick on the remote control device, thereby adjusting the position and size of the surrounding object in the shooting picture; alternatively, the user can perform operations such as panning, zooming in and zooming out on the real-time shooting picture displayed on the remote control device, in which case the remote control device generates corresponding operation instructions based on these operations and sends them to the drone, and the drone, after receiving these operation instructions, can generate corresponding spatial position adjustment instructions (for example, fly forward, fly backward, fly left, fly right, fly up, fly down) to adjust its spatial position. One plausible mapping from such screen operations to position adjustments is sketched below.
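The patent does not specify how a pan or zoom gesture on the live picture is converted into a spatial position adjustment; under a pinhole-camera assumption, one plausible (purely illustrative) mapping is:

```python
def zoom_to_forward_motion(distance_m: float, zoom_factor: float) -> float:
    """Turn a pinch-zoom on the live picture into a forward/backward distance.

    With a pinhole camera the apparent size of the object scales inversely with
    distance, so enlarging it by `zoom_factor` means closing the distance to
    distance_m / zoom_factor.  A positive result means "fly forward"."""
    return distance_m * (1.0 - 1.0 / zoom_factor)

def pan_to_lateral_motion(distance_m: float, pan_px: float, focal_px: float) -> float:
    """Turn a horizontal drag of `pan_px` pixels into a sideways distance,
    using the small-angle pinhole relation dx ≈ Z * pan_px / f."""
    return distance_m * pan_px / focal_px
```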
其中,所述“目标拍摄位置”即用户获取到其所期望的拍摄效果时,无人机所在的位置。在实际应用中,当获取到预期中的拍摄画面时,用户可以从该拍摄画面中框选出环绕对象并向无人机发送包含环绕对象信息的自主环绕拍摄指令。而无人机则可以在接收到遥控设备发送的包含环绕对象信息的自主环绕拍摄指令时,确定其已经飞行至目标拍摄位置,此时,可以直接获取双目摄像组件在当前时刻采集到的拍摄画面并将该拍摄画面作为目标拍摄画面。然后根据该环绕对象信息,获取用户从所述目标拍摄画面中选定的环绕对象。
步骤120:获取所述无人机在获取所述目标拍摄画面时的飞行高度。
在本实施例中,所述飞行高度即无人机相对地面的高度。
在获取到所述目标拍摄画面时,所述无人机可以通过任意类型的高度传感器,确定其在当前时刻的飞行高度。
步骤130:基于所述目标拍摄画面,确定所述双目摄像组件与所述环绕对象之间的空间距离。
在本实施例中,基于所述目标拍摄画面来确定在获取到所述目标拍摄画面时,所述双目摄像组件与所述环绕对象之间的空间距离。
其中,所述目标拍摄画面包括针对同一拍摄场景的左相机视图和右相机视图,从而,通过对所述左相机视图和所述右相机视图进行立体匹配,即可以获取到所述环绕对象的视差值,进而得到所述双目摄像组件与所述环绕对象之间的空间距离。
Step 140: detecting an optical-axis direction of the binocular camera assembly in real time.
In this embodiment, the optical-axis direction of the binocular camera assembly refers to the direction in which the centerline of the binocular camera assembly lies.
In practical applications, the current flight direction of the drone may be obtained in real time, and the relative angle between the optical axis of the binocular camera assembly and the drone may be measured by an angle sensor. The optical-axis direction of the binocular camera assembly can then be determined in real time by combining the current flight direction of the drone with the relative angle between the optical axis of the binocular camera assembly and the drone.
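A minimal sketch of this composition, under the assumption that both quantities are yaw angles in the horizontal plane: the absolute azimuth of the optical axis is the drone heading plus the camera's relative yaw. The angle names and the 0 to 360 degree wrap-around convention are assumptions for illustration, not taken from this application.

```python
# Combine the drone's heading with the camera's relative yaw to get the absolute
# azimuth of the optical axis (degrees, clockwise from north, assumed convention).

def optical_axis_azimuth(drone_heading_deg: float, camera_relative_yaw_deg: float) -> float:
    return (drone_heading_deg + camera_relative_yaw_deg) % 360.0

# Example: heading 90 deg (east) with the camera yawed 30 deg to the right of the body
# axis gives an optical axis pointing at 120 deg.
print(optical_axis_azimuth(90.0, 30.0))   # 120.0
```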
Step 150: performing autonomous orbit shooting according to the flying height, the spatial distance and the optical-axis direction of the binocular camera assembly detected in real time.
According to the camera imaging principle, adjusting the relative spatial position between the binocular camera assembly of the drone and the orbited object adjusts shooting effects such as the position and size of the orbited object in the shooting frame of the binocular camera assembly. Therefore, as long as the drone keeps the flying height and the spatial distance unchanged during flight, it can obtain a shooting effect consistent with that of the target shooting frame.
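A short illustration of this imaging principle using a simple pinhole model: the projected size of an object is s = f * S / Z (f: focal length in pixels, S: real object size, Z: distance), so holding Z constant holds the object's size in the frame constant. The numbers below are arbitrary example values.

```python
def image_size_px(focal_px: float, object_size_m: float, distance_m: float) -> float:
    # Pinhole model: projected size s = f * S / Z
    return focal_px * object_size_m / distance_m

print(image_size_px(700.0, 1.8, 6.0))   # 210.0 px at 6 m
print(image_size_px(700.0, 1.8, 9.0))   # 140.0 px: increasing the distance shrinks the object
# Holding the spatial distance (and the flying height) constant therefore keeps the framing constant.
```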
In addition, according to the flying height and the spatial distance, the orbit radius of the drone when orbiting the orbited object can also be determined (that is, the length of the line connecting the drone and the projection point of the orbited object onto the plane in which the drone lies). According to the principle of circular motion, orbit flight is achieved as long as the drone always moves, in the horizontal plane, along the direction tangent to the orbit radius, where the direction tangent to the orbit radius is the direction in the horizontal plane that is spatially perpendicular to the optical axis of the binocular camera assembly. Therefore, by detecting the optical-axis direction of the binocular camera assembly in real time, the flight direction of the drone can be adjusted in real time, so as to achieve orbit flight and shooting.
In view of this, when performing autonomous orbit shooting of a static orbited object, the drone may take the flying height as the target flying height of the drone, and take the spatial distance as the target spatial distance between the binocular camera assembly and the orbited object; during flight, the drone keeps the target flying height and the target spatial distance unchanged and adjusts its flight direction according to the optical-axis direction of the binocular camera assembly detected in real time, so as to achieve autonomous orbit shooting centered on the orbited object. Adjusting the flight direction of the drone according to the optical-axis direction of the binocular camera assembly detected in real time may specifically be: adjusting the flight direction of the drone to the direction in the horizontal plane that is spatially perpendicular to the optical-axis direction.
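A hedged sketch of the geometry described above, assuming the orbited object stands on the ground so that the vertical offset between the camera and the object equals the flying height. Under that assumption the orbit radius is r = sqrt(D^2 - H^2) (D: spatial distance, H: flying height), and the commanded heading is one of the two horizontal directions perpendicular to the optical axis; the +90 degree choice below is arbitrary.

```python
import math

def orbit_radius(spatial_distance_m: float, flying_height_m: float) -> float:
    # Assumes the orbited object is at ground level, so the height offset equals the flying height.
    return math.sqrt(spatial_distance_m ** 2 - flying_height_m ** 2)

def tangential_heading(optical_axis_azimuth_deg: float) -> float:
    """Heading perpendicular to the optical axis in the horizontal plane (+90 deg chosen arbitrarily)."""
    return (optical_axis_azimuth_deg + 90.0) % 360.0

# Example: D = 10 m, H = 6 m gives r = 8 m; an optical axis at 120 deg gives a heading of 210 deg.
print(orbit_radius(10.0, 6.0))        # 8.0
print(tangential_heading(120.0))      # 210.0
```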
Further, in some embodiments, when the orbited object is in motion, in order to always perform autonomous orbit shooting of the orbited object, the drone needs, in addition to performing orbit flight, to move synchronously with the orbited object, so that the orbited object appears relatively stationary in the frames to be shot.
Therefore, in this embodiment, before step 150 is performed, the method may further include: obtaining the movement direction of the orbited object in real time. In this case, keeping the target flying height and the target spatial distance unchanged and adjusting the flight direction of the drone according to the optical-axis direction of the binocular camera assembly detected in real time specifically becomes: keeping the target flying height and the target spatial distance unchanged, and adjusting the flight direction of the drone according to the movement direction of the orbited object obtained in real time and the optical-axis direction of the binocular camera assembly detected in real time.
The drone may determine the movement direction of the orbited object by comparing two successive shooting frames obtained by the binocular camera assembly, or by comparing the shooting frame obtained by the binocular camera assembly at the current moment with the target shooting frame.
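One way to picture the combination of synchronous following and orbiting is a velocity composition: the commanded horizontal velocity equals the target's estimated velocity (so the object stays relatively stationary in the frame) plus a tangential component that produces the orbit. The sketch below illustrates that decomposition only; it is an assumption, not the control law disclosed in this application, and the azimuth convention (clockwise from north, east = sin, north = cos) is likewise assumed.

```python
import math

def orbit_velocity(target_vx: float, target_vy: float,
                   optical_axis_azimuth_deg: float, orbit_speed: float):
    """Horizontal velocity (east, north) in m/s: follow the target plus a tangential term."""
    tangent_deg = optical_axis_azimuth_deg + 90.0        # perpendicular to the optical axis
    tangent_rad = math.radians(tangent_deg)
    vx = target_vx + orbit_speed * math.sin(tangent_rad)  # east component
    vy = target_vy + orbit_speed * math.cos(tangent_rad)  # north component
    return vx, vy

# Example: target moves north at 2 m/s, optical axis points east (90 deg), orbit speed 3 m/s.
print(orbit_velocity(0.0, 2.0, 90.0, 3.0))   # approximately (0.0, -1.0) under this convention
```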
From the above technical solution, it can be seen that the beneficial effects of the embodiments of the present invention are as follows. In the autonomous orbit shooting method provided by the embodiments of the present invention, upon receiving an autonomous orbit shooting command, the drone obtains, through the binocular camera assembly, a target shooting frame and an orbited object selected by the user from the target shooting frame; meanwhile, when the target shooting frame is obtained, the drone obtains its flying height and determines the spatial distance between the binocular camera assembly and the orbited object; the drone also detects the optical-axis direction of the binocular camera assembly in real time; and it then performs autonomous orbit shooting according to the flying height, the spatial distance and the optical-axis direction of the binocular camera assembly detected in real time. On the one hand, this enables the drone to perform autonomous orbit flight and shooting without relying on GPS signals; on the other hand, a shooting effect corresponding to the "target shooting frame" can always be obtained, which guarantees the quality of the drone's autonomous orbit shooting.
Embodiment 3
FIG. 3 is a schematic flowchart of another autonomous orbit shooting method provided by an embodiment of the present invention. This method is similar to the autonomous orbit shooting method described in Embodiment 2 above, and differs in that, in this embodiment, the drone may further perform personalized autonomous orbit shooting according to a shooting mode set by the user.
Specifically, referring to FIG. 3, the method may include, but is not limited to, the following steps:
Step 210: obtaining, through a binocular camera assembly, a target shooting frame and an orbited object selected by a user from the target shooting frame.
Step 220: obtaining a flying height of the drone at the time when the target shooting frame is obtained.
Step 230: determining, based on the target shooting frame, a spatial distance between the binocular camera assembly and the orbited object.
Step 240: detecting an optical-axis direction of the binocular camera assembly in real time.
Step 250: determining a shooting mode for the orbited object.
In this embodiment, the shooting mode refers to the shooting technique used when performing orbit shooting of the orbited object. The user may specify the shooting mode at the same time as inputting the autonomous orbit shooting command for the orbited object.
Specifically, the shooting modes may include, but are not limited to, a constant mode, a gradual mode and a custom mode. In the constant mode, autonomous orbit shooting is always performed according to the shooting effect of the target shooting frame. In the gradual mode, the position and/or size of the orbited object in the shooting frame are gradually changed during orbit shooting. In the custom mode, the shooting effect of the orbited object in the shooting frame is adjusted during orbit shooting according to the user's custom settings.
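A small illustrative enumeration of the shooting modes listed above. The mode names are direct translations of the modes described in this embodiment, while the dispatch helper is an assumed sketch rather than a disclosed implementation.

```python
from enum import Enum

class ShootingMode(Enum):
    CONSTANT = "constant"   # keep the shooting effect of the target shooting frame
    GRADUAL = "gradual"     # gradually change the object's position and/or size in frame
    CUSTOM = "custom"       # follow user-defined adjustments during the orbit

def uses_fixed_targets(mode: ShootingMode) -> bool:
    """Constant mode keeps the initial height/distance for the whole orbit (as in Embodiment 2)."""
    return mode is ShootingMode.CONSTANT

print(uses_fixed_targets(ShootingMode.CONSTANT))   # True
print(uses_fixed_targets(ShootingMode.GRADUAL))    # False
```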
Step 260: performing autonomous orbit shooting according to the shooting mode in combination with the flying height, the spatial distance and the optical-axis direction of the binocular camera assembly detected in real time.
In this embodiment, in addition to the flying height, the spatial distance and the optical-axis direction of the binocular camera assembly detected in real time, the shooting mode is also taken into account when performing autonomous orbit shooting.
Specifically, if the shooting mode is the constant mode, the autonomous orbit shooting method described in Embodiment 2 above may be used directly.
If the shooting mode is the gradual mode, the custom mode or another mode, performing autonomous orbit shooting according to the shooting mode in combination with the flying height, the spatial distance and the optical-axis direction of the binocular camera assembly detected in real time may specifically be implemented as follows: taking the flying height as an initial flying height; taking the spatial distance as an initial spatial distance between the drone and the orbited object; determining, according to the shooting mode, the initial flying height and the initial spatial distance, the target flying height of the drone and the target spatial distance between the drone and the orbited object in each shooting time period; and, in each shooting time period, flying according to the target flying height and the target spatial distance, and adjusting the flight direction of the drone according to the optical-axis direction of the binocular camera assembly detected in real time, so as to achieve autonomous orbit shooting centered on the orbited object.
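As a hedged illustration of how a gradual mode could derive per-period targets from the initial values, the sketch below linearly interpolates the flying height and the spatial distance from their initial values toward user-chosen final values over N shooting time periods. The final values and the linear profile are assumptions for this example; this application leaves the exact profile open.

```python
def gradual_mode_schedule(initial_height: float, initial_distance: float,
                          final_height: float, final_distance: float, periods: int):
    """Return a list of (target_flying_height, target_spatial_distance) per shooting time period."""
    schedule = []
    for k in range(1, periods + 1):
        t = k / periods                                   # progress from 0 to 1 across the periods
        height = initial_height + t * (final_height - initial_height)
        distance = initial_distance + t * (final_distance - initial_distance)
        schedule.append((round(height, 2), round(distance, 2)))
    return schedule

# Example: start at 6 m height / 10 m distance and pull back to 12 m / 20 m over 4 periods.
print(gradual_mode_schedule(6.0, 10.0, 12.0, 20.0, 4))
# [(7.5, 12.5), (9.0, 15.0), (10.5, 17.5), (12.0, 20.0)]
```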
It should be noted that steps 210 to 240 above have the same technical features as steps 110 to 140 of the autonomous orbit shooting method shown in FIG. 2, respectively. Therefore, for their specific implementations, reference may be made to the corresponding descriptions of steps 110 to 140 in the above embodiment, and details are not repeated in this embodiment.
From the above technical solution, it can be seen that the beneficial effects of the embodiments of the present invention are as follows: by further taking into account the shooting mode selected by the user, the autonomous orbit shooting method provided by this embodiment achieves personalized, shooting-effect-oriented autonomous orbit shooting and provides a more engaging autonomous orbit shooting experience.
Embodiment 4
FIG. 4 is a schematic structural diagram of an autonomous orbit shooting apparatus provided by an embodiment of the present invention. The autonomous orbit shooting apparatus 400 may run in the processor 211 of the drone 20 shown in FIG. 1.
Specifically, referring to FIG. 4, the apparatus 400 includes an obtaining unit 401, a distance determining unit 402, an optical-axis detection unit 403 and an orbit shooting unit 404.
The obtaining unit 401 is configured to obtain, through the binocular camera assembly, a target shooting frame and an orbited object selected by a user from the target shooting frame, and is further configured to obtain a flying height of the drone at the time when the target shooting frame is obtained.
The distance determining unit 402 is configured to determine, based on the target shooting frame, a spatial distance between the binocular camera assembly and the orbited object.
The optical-axis detection unit 403 is configured to detect an optical-axis direction of the binocular camera assembly in real time.
The orbit shooting unit 404 is configured to perform autonomous orbit shooting according to the flying height, the spatial distance and the optical-axis direction of the binocular camera assembly detected in real time.
In this embodiment, when the drone receives an autonomous orbit shooting command, the obtaining unit 401 may obtain, through the binocular camera assembly, the target shooting frame, the orbited object selected by the user from the target shooting frame, and the flying height of the drone at the time when the target shooting frame is obtained; meanwhile, the distance determining unit 402 determines, based on the target shooting frame, the spatial distance between the binocular camera assembly and the orbited object, and the optical-axis detection unit 403 detects the optical-axis direction of the binocular camera assembly in real time; the orbit shooting unit 404 then performs autonomous orbit shooting according to the flying height, the spatial distance and the optical-axis direction of the binocular camera assembly detected in real time.
In some embodiments, the obtaining unit 401 obtains, through the binocular camera assembly, the target shooting frame and the orbited object selected by the user from the target shooting frame specifically by: receiving a control command sent by a remote control device, where the remote control device is communicatively connected to the drone; flying to a target shooting position according to the control command; and taking the shooting frame captured by the binocular camera assembly at the target shooting position as the target shooting frame and obtaining the orbited object selected by the user from the target shooting frame. Further, in some embodiments, the control command includes a joystick command or an operation command for a shooting frame captured in real time.
In some embodiments, the orbit shooting unit 404 is specifically configured to: take the flying height as the target flying height of the drone; take the spatial distance as the target spatial distance between the binocular camera assembly and the orbited object; and keep the target flying height and the target spatial distance unchanged and adjust the flight direction of the drone according to the optical-axis direction of the binocular camera assembly detected in real time, so as to achieve autonomous orbit shooting centered on the orbited object.
Further, in some embodiments, in order to perform autonomous orbit shooting of an orbited object that is in motion, the autonomous orbit shooting apparatus 400 further includes:
a monitoring unit 405 configured to obtain the movement direction of the orbited object in real time. In this case, the orbit shooting unit 404 keeps the target flying height and the target spatial distance unchanged and adjusts the flight direction of the drone according to the optical-axis direction of the binocular camera assembly detected in real time specifically by: keeping the target flying height and the target spatial distance unchanged, and adjusting the flight direction of the drone according to the movement direction of the orbited object obtained in real time and the optical-axis direction of the binocular camera assembly detected in real time.
In addition, in some other embodiments, the autonomous orbit shooting apparatus 400 further includes:
a shooting mode determining unit 406 configured to determine a shooting mode for the orbited object. In this embodiment, the orbit shooting unit 404 is configured to perform autonomous orbit shooting according to the shooting mode in combination with the flying height, the spatial distance and the optical-axis direction of the binocular camera assembly detected in real time.
Specifically, in some embodiments, the orbit shooting unit 404 is specifically configured to: take the flying height as an initial flying height; take the spatial distance as an initial spatial distance between the drone and the orbited object; determine, according to the shooting mode, the initial flying height and the initial spatial distance, the target flying height of the drone and the target spatial distance between the drone and the orbited object in each shooting time period; and, in each shooting time period, fly according to the target flying height and the target spatial distance and adjust the flight direction of the drone according to the optical-axis direction of the binocular camera assembly detected in real time, so as to achieve autonomous orbit shooting centered on the orbited object.
It should be noted that the apparatus embodiment described above is merely illustrative. The units described as separate components may or may not be physically separated; that is, they may be located in one place or distributed over multiple network units. Some or all of the units/modules may be selected according to actual needs to achieve the objectives of the solution of this embodiment.
Moreover, since the autonomous orbit shooting apparatus is based on the same inventive concept as the autonomous orbit shooting method described in method Embodiment 2 or 3 above, the corresponding content of the above method embodiments also applies to this apparatus embodiment and is not described in detail here.
From the above technical solution, it can be seen that the beneficial effects of the embodiments of the present invention are as follows. In the autonomous orbit shooting apparatus provided by the embodiments of the present invention, the obtaining unit 401 obtains, through the binocular camera assembly, the target shooting frame, the orbited object selected by the user from the target shooting frame, and the flying height of the drone at the time when the target shooting frame is obtained; meanwhile, the distance determining unit 402 determines, based on the target shooting frame, the spatial distance between the binocular camera assembly and the orbited object, and the optical-axis detection unit 403 detects the optical-axis direction of the binocular camera assembly in real time; the orbit shooting unit 404 then performs autonomous orbit shooting according to the flying height, the spatial distance and the optical-axis direction of the binocular camera assembly detected in real time. On the one hand, this enables the drone to perform autonomous orbit flight and shooting without relying on GPS signals; on the other hand, a shooting effect corresponding to the "target shooting frame" can always be obtained, which guarantees the quality of the drone's autonomous orbit shooting.
An embodiment of the present invention further provides a non-transitory computer-readable storage medium storing computer-executable instructions. When the computer-executable instructions are executed by one or more processors, for example, by the processor 211 in FIG. 1, the one or more processors may be caused to perform the autonomous orbit shooting method in any of the above method embodiments, for example, to perform method steps 110 to 150 in FIG. 2 and method steps 210 to 260 in FIG. 3 described above, and to implement the functions of units 401 to 406 in FIG. 4.
In addition, it should be understood from the description of the above implementations that a person of ordinary skill in the art can clearly understand that each implementation can be realized by software plus a general-purpose hardware platform, and certainly also by hardware. For example, the function of the obtaining unit 401 may be realized by the binocular camera assembly, a height sensor and other related sensors; the function of the distance determining unit 402 may be realized by an image processor; the function of the optical-axis detection unit 403 may be realized by an orientation sensor such as a gyroscope; and the function of the orbit shooting unit 404 may be realized through the cooperation between a flight control module and the binocular camera assembly.
A person of ordinary skill in the art can also understand that all or part of the processes of the methods in the above embodiments may be completed by a computer program in a computer program product instructing related hardware. The computer program may be stored in a non-transitory computer-readable storage medium and includes program instructions; when the program instructions are executed by a drone, the drone may be caused to perform the processes of the above method embodiments. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM) or the like.
The above products (including the drone, the non-transitory computer-readable storage medium and the computer program product) can perform the autonomous orbit shooting method provided by the embodiments of the present invention, and have the corresponding functional modules and beneficial effects for performing the autonomous orbit shooting method. For technical details not described in detail in this embodiment, reference may be made to the autonomous orbit shooting method provided by the embodiments of the present invention.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention rather than to limit them. Within the idea of the present invention, the technical features in the above embodiments or in different embodiments may also be combined, the steps may be implemented in any order, and there are many other variations of the different aspects of the present invention as described above, which, for the sake of brevity, are not provided in detail. Although the present invention has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that modifications may still be made to the technical solutions recorded in the foregoing embodiments, or equivalent replacements may be made to some of the technical features therein, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (14)

  1. An autonomous orbit shooting method, applied to a drone, the drone comprising a binocular camera assembly, wherein the method comprises:
    obtaining, through the binocular camera assembly, a target shooting frame and an orbited object selected by a user from the target shooting frame;
    obtaining a flying height of the drone at the time when the target shooting frame is obtained;
    determining, based on the target shooting frame, a spatial distance between the binocular camera assembly and the orbited object;
    detecting an optical-axis direction of the binocular camera assembly in real time; and
    performing autonomous orbit shooting according to the flying height, the spatial distance and the optical-axis direction of the binocular camera assembly detected in real time.
  2. The method according to claim 1, wherein the performing autonomous orbit shooting according to the flying height, the spatial distance and the optical-axis direction of the binocular camera assembly detected in real time comprises:
    taking the flying height as a target flying height of the drone;
    taking the spatial distance as a target spatial distance between the binocular camera assembly and the orbited object; and
    keeping the target flying height and the target spatial distance unchanged, and adjusting a flight direction of the drone according to the optical-axis direction of the binocular camera assembly detected in real time, so as to achieve autonomous orbit shooting centered on the orbited object.
  3. The method according to claim 2, further comprising:
    obtaining a movement direction of the orbited object in real time;
    wherein the keeping the target flying height and the target spatial distance unchanged, and adjusting the flight direction of the drone according to the optical-axis direction of the binocular camera assembly detected in real time, comprises:
    keeping the target flying height and the target spatial distance unchanged, and adjusting the flight direction of the drone according to the movement direction of the orbited object obtained in real time and the optical-axis direction of the binocular camera assembly detected in real time.
  4. The method according to claim 1, further comprising:
    determining a shooting mode for the orbited object;
    wherein the performing autonomous orbit shooting according to the flying height, the spatial distance and the optical-axis direction of the binocular camera assembly detected in real time comprises:
    performing autonomous orbit shooting according to the shooting mode in combination with the flying height, the spatial distance and the optical-axis direction of the binocular camera assembly detected in real time.
  5. The method according to claim 4, wherein the performing autonomous orbit shooting according to the shooting mode in combination with the flying height, the spatial distance and the optical-axis direction of the binocular camera assembly detected in real time comprises:
    taking the flying height as an initial flying height;
    taking the spatial distance as an initial spatial distance between the drone and the orbited object;
    determining, according to the shooting mode, the initial flying height and the initial spatial distance, a target flying height of the drone and a target spatial distance between the drone and the orbited object in each shooting time period; and
    in each shooting time period, flying according to the target flying height and the target spatial distance, and adjusting a flight direction of the drone according to the optical-axis direction of the binocular camera assembly detected in real time, so as to achieve autonomous orbit shooting centered on the orbited object.
  6. The method according to any one of claims 1 to 5, wherein the obtaining, through the binocular camera assembly, the target shooting frame and the orbited object selected by the user from the target shooting frame comprises:
    receiving a control command sent by a remote control device, wherein the remote control device is communicatively connected to the drone;
    flying to a target shooting position according to the control command; and
    taking a shooting frame captured by the binocular camera assembly at the target shooting position as the target shooting frame, and obtaining the orbited object selected by the user from the target shooting frame.
  7. The method according to claim 6, wherein the control command comprises a joystick command or an operation command for a shooting frame captured in real time.
  8. An autonomous orbit shooting apparatus, applied to a drone, the drone comprising a binocular camera assembly, wherein the apparatus comprises:
    an obtaining unit configured to obtain, through the binocular camera assembly, a target shooting frame and an orbited object selected by a user from the target shooting frame, and
    further configured to obtain a flying height of the drone at the time when the target shooting frame is obtained;
    a distance determining unit configured to determine, based on the target shooting frame, a spatial distance between the binocular camera assembly and the orbited object;
    an optical-axis detection unit configured to detect an optical-axis direction of the binocular camera assembly in real time; and
    an orbit shooting unit configured to perform autonomous orbit shooting according to the flying height, the spatial distance and the optical-axis direction of the binocular camera assembly detected in real time.
  9. The apparatus according to claim 8, wherein the orbit shooting unit is specifically configured to:
    take the flying height as a target flying height of the drone;
    take the spatial distance as a target spatial distance between the binocular camera assembly and the orbited object; and
    keep the target flying height and the target spatial distance unchanged, and adjust a flight direction of the drone according to the optical-axis direction of the binocular camera assembly detected in real time, so as to achieve autonomous orbit shooting centered on the orbited object.
  10. The apparatus according to claim 9, further comprising:
    a monitoring unit configured to obtain a movement direction of the orbited object in real time;
    wherein the orbit shooting unit keeping the target flying height and the target spatial distance unchanged, and adjusting the flight direction of the drone according to the optical-axis direction of the binocular camera assembly detected in real time, specifically comprises:
    keeping the target flying height and the target spatial distance unchanged, and adjusting the flight direction of the drone according to the movement direction of the orbited object obtained in real time and the optical-axis direction of the binocular camera assembly detected in real time.
  11. The apparatus according to claim 8, further comprising:
    a shooting mode determining unit configured to determine a shooting mode for the orbited object;
    wherein the orbit shooting unit is configured to:
    perform autonomous orbit shooting according to the shooting mode in combination with the flying height, the spatial distance and the optical-axis direction of the binocular camera assembly detected in real time.
  12. The apparatus according to claim 11, wherein the orbit shooting unit is specifically configured to:
    take the flying height as an initial flying height;
    take the spatial distance as an initial spatial distance between the drone and the orbited object;
    determine, according to the shooting mode, the initial flying height and the initial spatial distance, a target flying height of the drone and a target spatial distance between the drone and the orbited object in each shooting time period; and
    in each shooting time period, fly according to the target flying height and the target spatial distance, and adjust a flight direction of the drone according to the optical-axis direction of the binocular camera assembly detected in real time, so as to achieve autonomous orbit shooting centered on the orbited object.
  13. A drone, comprising:
    a fuselage;
    an arm connected to the fuselage;
    a power assembly disposed on the arm;
    a binocular camera assembly disposed on the fuselage;
    a processor disposed in the fuselage and communicatively connected to the binocular camera assembly; and
    a memory communicatively connected to the processor; wherein
    the memory stores instructions executable by the processor, and the instructions, when executed by the processor, enable the processor to perform the method according to any one of claims 1 to 7.
  14. A non-transitory computer-readable storage medium storing computer-executable instructions, wherein the computer-executable instructions are configured to cause a drone to perform the method according to any one of claims 1 to 7.
PCT/CN2020/091620 2019-05-22 2020-05-21 一种自主环绕拍摄方法、装置以及无人机 WO2020233682A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/455,744 US11755042B2 (en) 2019-05-22 2021-11-19 Autonomous orbiting method and device and UAV
US18/360,365 US20230384803A1 (en) 2019-05-22 2023-07-27 Autonomous orbiting method and device and uav

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910430492.0A CN110139038B (zh) 2019-05-22 2019-05-22 一种自主环绕拍摄方法、装置以及无人机
CN201910430492.0 2019-05-22

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/455,744 Continuation US11755042B2 (en) 2019-05-22 2021-11-19 Autonomous orbiting method and device and UAV

Publications (1)

Publication Number Publication Date
WO2020233682A1 true WO2020233682A1 (zh) 2020-11-26

Family

ID=67572540

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/091620 WO2020233682A1 (zh) 2019-05-22 2020-05-21 一种自主环绕拍摄方法、装置以及无人机

Country Status (3)

Country Link
US (2) US11755042B2 (zh)
CN (1) CN110139038B (zh)
WO (1) WO2020233682A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110139038B (zh) * 2019-05-22 2021-10-22 深圳市道通智能航空技术股份有限公司 一种自主环绕拍摄方法、装置以及无人机
CN111307198A (zh) * 2019-11-01 2020-06-19 宁波纳智微光电科技有限公司 一种动态测量***及其测量方法
CN112585939B (zh) * 2019-12-31 2023-11-17 深圳市大疆创新科技有限公司 一种图像处理方法、控制方法、设备及存储介质
WO2022126529A1 (zh) * 2020-12-17 2022-06-23 深圳市大疆创新科技有限公司 定位的方法、设备、无人机和存储介质

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103149939A (zh) * 2013-02-26 2013-06-12 北京航空航天大学 一种基于视觉的无人机动态目标跟踪与定位方法
US20160173859A1 (en) * 2013-12-27 2016-06-16 National Institute Of Meteorological Research Imaging system mounted in flight vehicle
CN105787447A (zh) * 2016-02-26 2016-07-20 深圳市道通智能航空技术有限公司 一种无人机基于双目视觉的全方位避障的方法及***
CN106909172A (zh) * 2017-03-06 2017-06-30 重庆零度智控智能科技有限公司 环绕跟踪方法、装置和无人机
US20170369167A1 (en) * 2016-06-24 2017-12-28 1St Rescue, Inc. Precise and rapid delivery of an emergency medical kit from an unmanned aerial vehicle
CN108475071A (zh) * 2017-06-29 2018-08-31 深圳市大疆创新科技有限公司 无人机及其控制方法、控制终端及其控制方法
CN108985193A (zh) * 2018-06-28 2018-12-11 电子科技大学 一种基于图像检测的无人机航拍人像对准方法
CN110139038A (zh) * 2019-05-22 2019-08-16 深圳市道通智能航空技术有限公司 一种自主环绕拍摄方法、装置以及无人机

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2468345B (en) * 2009-03-05 2014-01-15 Cranfield Aerospace Ltd Unmanned air vehicle (uav) control system and method
US9025256B2 (en) * 2011-03-10 2015-05-05 Raytheon Company Dual field of view refractive optical system for GEO synchronous earth orbit
RU2531433C1 (ru) * 2013-07-16 2014-10-20 Федеральное государственное унитарное предприятие "Центральный научно-исследовательский институт машиностроения" (ФГУП ЦНИИмаш) Способ определения параметров орбиты космического объекта
DE102013108711B4 (de) * 2013-08-12 2016-07-14 Jena-Optronik Gmbh Verfahren zum Betrieb eines Lage- und Orbit-Steuersystems und Lage- und Orbit-Steuersystem
CN204697171U (zh) * 2015-05-27 2015-10-07 杨珊珊 一种智能多模式飞行拍摄设备
WO2018032457A1 (en) * 2016-08-18 2018-02-22 SZ DJI Technology Co., Ltd. Systems and methods for augmented stereoscopic display
CN206265327U (zh) * 2016-11-18 2017-06-20 捷西迪(广州)光学科技有限公司 一种用于水下拍摄的无人飞机
CN106657779B (zh) * 2016-12-13 2022-01-04 北京远度互联科技有限公司 环绕拍摄方法、装置及无人机
CN207182100U (zh) * 2017-05-22 2018-04-03 埃洛克航空科技(北京)有限公司 一种用于固定翼无人机的双目视觉避障***
CN113163118A (zh) * 2017-05-24 2021-07-23 深圳市大疆创新科技有限公司 拍摄控制方法及装置
US10866597B1 (en) * 2018-05-07 2020-12-15 Securus Technologies, Llc Drone detection and interception
US11222229B1 (en) * 2018-05-31 2022-01-11 The Charles Stark Draper Laboratory, Inc. System and method for multidimensional gradient-based cross-spectral stereo matching

Also Published As

Publication number Publication date
US20230384803A1 (en) 2023-11-30
US20220075394A1 (en) 2022-03-10
CN110139038B (zh) 2021-10-22
CN110139038A (zh) 2019-08-16
US11755042B2 (en) 2023-09-12

Similar Documents

Publication Publication Date Title
US11797009B2 (en) Unmanned aerial image capture platform
WO2020233682A1 (zh) 一种自主环绕拍摄方法、装置以及无人机
US11649052B2 (en) System and method for providing autonomous photography and videography
WO2018214078A1 (zh) 拍摄控制方法及装置
CN105242685B (zh) 一种伴飞无人机航拍***及方法
WO2019113966A1 (zh) 一种避障方法、装置和无人机
WO2018210078A1 (zh) 无人机的距离测量方法以及无人机
WO2018098704A1 (zh) 控制方法、设备、***、无人机和可移动平台
WO2021098453A1 (zh) 目标跟踪方法及无人飞行器
CN103149788A (zh) 空中360°全景照片拍摄装置及方法
WO2020014987A1 (zh) 移动机器人的控制方法、装置、设备及存储介质
WO2019128275A1 (zh) 一种拍摄控制方法、装置及飞行器
CN203204299U (zh) 空中360°全景照片拍摄装置
WO2021212445A1 (zh) 拍摄方法、可移动平台、控制设备和存储介质
CN112650267A (zh) 一种对飞行器的飞行控制方法、装置及飞行器
US20210120171A1 (en) Determination device, movable body, determination method, and program
WO2021031159A1 (zh) 比赛拍摄方法、电子设备、无人机与存储介质
WO2020048365A1 (zh) 飞行器的飞行控制方法、装置、终端设备及飞行控制***
WO2022193081A1 (zh) 无人机的控制方法、装置及无人机
WO2020168519A1 (zh) 拍摄参数的调整方法、拍摄设备以及可移动平台
WO2021093577A1 (zh) 高动态范围图像自动曝光方法及无人飞行器
WO2021135824A1 (zh) 图像曝光方法及装置、无人机
WO2022056683A1 (zh) 视场确定方法、视场确定装置、视场确定***和介质
WO2020150974A1 (zh) 拍摄控制方法、可移动平台与存储介质
WO2022205294A1 (zh) 无人机的控制方法、装置、无人机及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20810712

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20810712

Country of ref document: EP

Kind code of ref document: A1