WO2021115192A1 - Image processing device, image processing method, program and recording medium - Google Patents


Info

Publication number
WO2021115192A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
dynamic
dynamic image
flying body
image processing
Prior art date
Application number
PCT/CN2020/133589
Other languages
French (fr)
Chinese (zh)
Inventor
周杰旻
Original Assignee
深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to CN202080074343.6A priority Critical patent/CN114586335A/en
Publication of WO2021115192A1 publication Critical patent/WO2021115192A1/en

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • the present disclosure relates to an image processing device, an image processing method, a program, and a recording medium.
  • Patent Document 1 discloses an image processing device that performs image synthesis.
  • the image processing device includes a synthesis unit that synthesizes a plurality of images taken at different points in time, and a motion correction unit that corrects for reducing the influence of motion on the image.
  • Patent Document 1 European Patent Application Publication No. 3450310 Specification
  • the image processing device in Patent Document 1 fixes the position of the imaging device, captures a plurality of still images, and then combines them (image stacking). However, it does not consider combining a plurality of moving images captured while the imaging device is moving (video stacking), as with an imaging device mounted on a flying body. By synthesizing a plurality of moving images captured by a flying body that can shoot while flying, the image quality of the moving images is expected to be improved.
  • the dynamic image may have a plurality of image frames in time series order.
  • the processing unit may control the flying body so that image frames of the same relative time in the plurality of dynamic images have the same shooting range.
  • the processing unit acquires the state of the flying body in synchronization with the vertical synchronization signal of the imaging unit during the first lap of the flight path; during the second and subsequent laps, it controls the flight of the flying body and the imaging unit in synchronization with the vertical synchronization signal of the imaging unit so that shooting is performed in the same state as the state of the flying body on the first lap.
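The record-and-replay scheme above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the class and method names (`BodyState`, `LapRecorder`, `on_vsync_lap1`, `target_for_frame`) are hypothetical. On the first lap one state sample is stored per vertical-sync pulse; on later laps the controller steers toward the stored state for the frame at the same relative time.

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class BodyState:
    """State of the flying body sampled at one vertical-sync pulse
    (fields are illustrative stand-ins for position, orientation,
    and gimbal angle)."""
    position: Tuple[float, float, float]  # (latitude, longitude, altitude)
    yaw_deg: float
    gimbal_pitch_deg: float


@dataclass
class LapRecorder:
    """Records one BodyState per vsync on lap 1 and replays it as the
    control target on laps 2+, so every frame of the same relative
    time is shot in the same state (same shooting range)."""
    states: List[BodyState] = field(default_factory=list)

    def on_vsync_lap1(self, state: BodyState) -> None:
        # Called once per vertical-sync pulse during the first lap.
        self.states.append(state)

    def target_for_frame(self, frame_index: int) -> BodyState:
        # On the second and subsequent laps, the flight controller
        # steers toward the state recorded for the same frame index.
        return self.states[frame_index]
```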
  • the status of the flying body may include at least one of the position of the flying body, the orientation of the flying body, and the angle of the gimbal supporting the imaging unit.
  • the processing unit may generate a composite dynamic image based on the first dynamic image obtained in the first circle and the second dynamic image obtained after the second circle.
  • the processing unit may compare the first dynamic image with the second dynamic image for each image frame of the same relative time; according to the comparison result, perform the motion compensation of the second dynamic image on the first dynamic image.
  • Motion compensation may include global motion compensation.
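As a toy illustration of global motion compensation between frames of the same relative time: estimate one translation for the whole frame and warp the second frame onto the first. The exhaustive integer-shift search below (minimizing mean absolute difference) is a deliberately simple stand-in for the block- or feature-based estimators a real implementation would use; frames are plain 2-D lists of intensities.

```python
def estimate_global_shift(ref, cur, max_shift=2):
    """Find the single (dy, dx) translation that best aligns `cur` to
    `ref`, by exhaustive search over small integer shifts using mean
    absolute difference over the overlapping region."""
    h, w = len(ref), len(ref[0])
    best, best_cost = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            cost, n = 0.0, 0
            for y in range(h):
                for x in range(w):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        cost += abs(ref[y][x] - cur[yy][xx])
                        n += 1
            cost /= n
            if cost < best_cost:
                best_cost, best = cost, (dy, dx)
    return best


def apply_shift(img, dy, dx):
    """Warp `img` by (dy, dx), filling uncovered pixels with 0, so it
    lines up with the reference frame before stacking."""
    h, w = len(img), len(img[0])
    return [[img[y + dy][x + dx] if 0 <= y + dy < h and 0 <= x + dx < w else 0
             for x in range(w)] for y in range(h)]
```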
  • the processing unit may generate a composite moving image based on the statistical value of the same pixel of the image frame of the same relative time in the first moving image and the second moving image.
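The per-pixel statistic described above can be sketched as follows, assuming the frames have already been aligned. Taking the mean or median of the same pixel across the stacked frames is the classic stacking step that suppresses sensor noise; the function name and the choice of statistics are illustrative, not specified by the patent.

```python
import statistics


def stack_frames(frames, stat="mean"):
    """Combine co-aligned frames of the same relative time into one
    frame by a per-pixel statistic (the 'statistical value of the same
    pixel'). `frames` is a list of 2-D lists of equal size."""
    h, w = len(frames[0]), len(frames[0][0])
    pick = statistics.mean if stat == "mean" else statistics.median
    return [[pick([f[y][x] for f in frames]) for x in range(w)]
            for y in range(h)]
```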
  • the processing unit compares the first dynamic image with the second dynamic image for each image frame of the same relative time; extracts a characteristic region from the second dynamic image; and replaces the region corresponding to the characteristic region in the first dynamic image with the characteristic region of the second dynamic image.
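A crude sketch of this replacement step: wherever the second frame differs strongly from the first, copy the pixel from the second frame into the first. The per-pixel difference threshold below is only an illustrative stand-in for real characteristic-region extraction (which the patent does not restrict to any particular detector).

```python
def replace_feature_region(first, second, threshold=10):
    """Replace, in `first`, the region corresponding to the
    characteristic region extracted from `second`. Here the
    'characteristic region' is approximated as the set of pixels
    whose difference exceeds `threshold` (illustrative only)."""
    h, w = len(first), len(first[0])
    return [[second[y][x] if abs(second[y][x] - first[y][x]) >= threshold
             else first[y][x]
             for x in range(w)] for y in range(h)]
```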
  • the processing unit can acquire the number of laps of the flying body on the flight path; when the acquired number of laps is less than a threshold, it outputs the dynamic image captured in the latest lap; when the acquired number of laps is greater than or equal to the threshold, it outputs the composite dynamic image.
  • the processing unit can evaluate the output synthetic dynamic image; when the evaluation result of the synthetic dynamic image meets a preset criterion, the flight and shooting of the flying body are ended; when the evaluation result of the synthetic dynamic image does not meet the preset criterion, flight and shooting are carried out along the next lap of the flight path.
  • the processing unit may acquire operation information indicating the evaluation result of the synthesized moving image.
  • the processing unit can perform image recognition for the synthetic dynamic image; evaluate the synthetic dynamic image according to the result of the image recognition.
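The evaluate-and-continue behavior described in the last three bullets can be sketched as a control loop. All three callables are hypothetical stand-ins: `fly_one_lap` flies a lap and returns its video, `synthesize` stacks the videos collected so far, and `evaluate` returns True when the composite meets the preset criterion (whether by user operation information or by image recognition).

```python
def fly_and_stack(fly_one_lap, synthesize, evaluate, max_laps=10):
    """Fly laps until the composite dynamic image passes evaluation
    (or max_laps is reached); returns the final composite and the
    number of laps flown."""
    videos = []
    composite = None
    for lap in range(1, max_laps + 1):
        videos.append(fly_one_lap(lap))   # shoot one lap of video
        composite = synthesize(videos)    # stack all laps so far
        if evaluate(composite):           # preset criterion met?
            return composite, lap
    return composite, max_laps
```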
  • the image processing device may be a flying object.
  • an image processing method for processing a dynamic image captured by an imaging unit included in a flying body includes the following steps: designating a flight path of the flying body; causing the flying body to fly around the flight path multiple times; causing the imaging unit included in the flying body to capture multiple dynamic images with the same shooting range over the multiple circling flights; and synthesizing the multiple dynamic images captured over the multiple circling flights to generate a composite dynamic image.
  • the dynamic image may have a plurality of image frames in time series order.
  • the step of shooting a plurality of dynamic images may include the step of controlling the flying body so that each of the plurality of dynamic images has the same shooting range for image frames of the same relative time.
  • the step of capturing multiple dynamic images may include the following steps: during the flight of the first lap of the flight path, acquiring the state of the flying object in synchronization with the vertical synchronization signal of the camera unit; and during the flight of the flight path after the second lap, The flight of the flying object and the imaging unit are controlled in synchronization with the vertical synchronization signal of the imaging unit so that the shooting is performed in the same state as the state of the flying object in the first circle.
  • the status of the flying body may include at least one of the position of the flying body, the orientation of the flying body, and the angle of the gimbal supporting the imaging unit.
  • the step of generating a synthetic dynamic image may include the following steps: generating a synthetic dynamic image based on the first dynamic image obtained in the first circle and the second dynamic image obtained after the second circle.
  • the step of generating a synthetic dynamic image may include the following steps: comparing the first dynamic image with the second dynamic image for each image frame of the same relative time; and performing motion compensation of the second dynamic image with respect to the first dynamic image according to the comparison result.
  • Motion compensation may include global motion compensation.
  • the step of generating a composite moving image may include the following steps: generating a composite moving image based on the statistical values of the same pixels of the image frames of the same relative time in the first moving image and the second moving image.
  • the step of generating a synthetic dynamic image may include the following steps: comparing the first dynamic image with the second dynamic image for each image frame of the same relative time; extracting a characteristic region from the second dynamic image; and replacing the region corresponding to the characteristic region in the first dynamic image with the characteristic region of the second dynamic image.
  • the step of capturing multiple dynamic images may include the following steps: evaluating the output synthetic dynamic image; ending the flight and shooting of the flying body when the evaluation result of the synthetic dynamic image meets a preset criterion; and carrying out flight and shooting along the next lap of the flight path when the evaluation result does not meet the preset criterion.
  • the step of evaluating the synthetic dynamic image may include the following steps: obtaining operation information representing the evaluation result of the synthetic dynamic image.
  • the step of evaluating the synthetic dynamic image may include the following steps: performing image recognition for the synthetic dynamic image; and evaluating the synthetic dynamic image according to the result of the image recognition.
  • the image processing method can be executed by an image processing device.
  • the image processing device may be a flying object.
  • a recording medium is a computer-readable recording medium on which a program is recorded, the program causing an image processing device that processes a dynamic image captured by an imaging unit included in a flying body to execute the following steps: designating a flight path of the flying body; causing the flying body to fly around the flight path multiple times; causing the imaging unit included in the flying body to capture multiple dynamic images with the same shooting range over the multiple circling flights; and synthesizing the multiple dynamic images captured over the multiple circling flights to generate a composite dynamic image.
  • FIG. 1 is a schematic diagram showing an example of the configuration of the flying body system in the embodiment.
  • Fig. 2 is a diagram showing an example of a specific appearance of an unmanned aircraft.
  • Fig. 3 is a block diagram showing an example of the hardware configuration of the unmanned aircraft.
  • Fig. 4 is a block diagram showing an example of the hardware configuration of the terminal.
  • Fig. 5 is a diagram showing an example of the operation outline of the unmanned aircraft.
  • Fig. 6 is a flowchart showing an example of the operation of the unmanned aircraft.
  • Fig. 7 is a flowchart showing a first example of dynamic image synthesis.
  • Fig. 8 is a flowchart showing a second example of dynamic image synthesis.
  • Fig. 9 is a flowchart showing an output example of a moving image.
  • the flying object is an unmanned aerial vehicle (UAV: Unmanned Aerial Vehicle) as an example.
  • the image processing device is, for example, an unmanned aircraft, but it may also be another device (for example, a terminal, a transmitter, a server, and other image processing devices).
  • the image processing method is used to specify the actions of the image processing device.
  • a program, for example a program that causes the image processing device to execute various kinds of processing, is recorded in the recording medium.
  • the “section” or “device” described in the following embodiments is not limited to a physical structure realized by hardware, but also includes a function that realizes the structure by software such as a program.
  • the function of one structure may be realized by two or more physical structures, or the function of two or more structures may also be realized by, for example, one physical structure.
  • the “acquisition” described in the embodiments is not limited to the action of directly acquiring information or signals; it also includes, for example, acquisition by the processing unit through the communication unit, and acquisition from a storage unit (such as a memory). These terms are understood and interpreted in the same way in the description of the claims.
  • FIG. 1 is a schematic diagram showing a configuration example of a flying body system 10 in the embodiment.
  • the flying body system 10 includes an unmanned aircraft 100 and a terminal 80.
  • the unmanned aircraft 100 and the terminal 80 may communicate with each other through wired communication or wireless communication (for example, a wireless LAN (Local Area Network)).
  • the terminal 80 is exemplified by a portable terminal (such as a smartphone or a tablet terminal), but it may also be another terminal (for example, a PC (Personal Computer) or a transmitter (proportional controller) that can control the unmanned aircraft with a joystick).
  • FIG. 2 is a diagram showing an example of a specific appearance of the unmanned aircraft 100. FIG. 2 shows a perspective view of the unmanned aircraft 100 flying in the moving direction STV0. The unmanned aircraft 100 is an example of a moving body.
  • the roll axis (refer to the x-axis) is set in a direction parallel to the ground and along the moving direction STV0.
  • the pitch axis (refer to the y-axis) is set in a direction parallel to the ground and perpendicular to the roll axis, and the yaw axis (refer to the z-axis) is set in a direction perpendicular to the ground and perpendicular to both the roll axis and the pitch axis.
  • the unmanned aircraft 100 includes a UAV main body 102, a gimbal 200, an imaging unit 220, and a plurality of imaging units 230.
  • the UAV main body 102 includes a plurality of rotors (propellers).
  • the UAV main body 102 makes the unmanned aircraft 100 fly by controlling the rotation of a plurality of rotors.
  • the UAV main body 102 uses, for example, four rotors to fly the unmanned aircraft 100.
  • the number of rotors is not limited to four.
  • the unmanned aircraft 100 may be a fixed-wing aircraft without a rotor.
  • the imaging unit 220 is a photographing camera that photographs a subject included in a desired photographing range (for example, the sky above the subject, the scenery such as mountains and rivers, and the buildings on the ground).
  • the plurality of imaging units 230 are sensor cameras that photograph the surroundings of the unmanned aircraft 100 in order to control the flight of the unmanned aircraft 100.
  • the two camera units 230 may be installed on the nose of the unmanned aircraft 100, that is, on the front side.
  • the other two camera units 230 may be provided on the bottom surface of the unmanned aircraft 100.
  • the two imaging units 230 on the front side may be paired to function as a so-called stereo camera.
  • the two imaging parts 230 on the bottom side may also be paired to function as a stereo camera.
  • the three-dimensional space data around the unmanned aircraft 100 may be generated based on the images captured by the plurality of imaging units 230.
  • the number of imaging units 230 included in unmanned aircraft 100 is not limited to four.
  • the unmanned aircraft 100 only needs to include at least one camera 230.
  • the unmanned aircraft 100 may include at least one camera 230 on the nose, tail, sides, bottom surface, and top surface of the unmanned aircraft 100, respectively.
  • the angle of view that can be set in the imaging unit 230 may be larger than the angle of view that can be set in the imaging unit 220.
  • the imaging part 230 may have a single focus lens or a fisheye lens.
  • FIG. 3 is a block diagram showing an example of the hardware configuration of unmanned aircraft 100.
  • the unmanned aircraft 100 includes a UAV control unit 110, a communication unit 150, a storage unit 160, a gimbal 200, a rotor mechanism 210, an imaging unit 220, imaging units 230, a GPS receiver 240, an inertial measurement unit (IMU) 250, a magnetic compass 260, a barometric altimeter 270, an ultrasonic sensor 280, and a laser measuring device 290.
  • the UAV control unit 110 is composed of, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor).
  • the UAV control unit 110 performs signal processing for overall control of the operations of each part of the unmanned aircraft 100, data input and output processing with other parts, data arithmetic processing, and data storage processing.
  • the UAV control unit 110 can control the flight of the unmanned aircraft 100 according to a program stored in the storage unit 160.
  • the UAV control unit 110 can control the flight in accordance with the flight control instructions from the terminal 80 or the like.
  • the UAV control unit 110 can capture images (for example, moving images, still images) (for example, aerial photography).
  • the UAV control unit 110 acquires position information indicating the position of the unmanned aircraft 100.
  • the UAV control unit 110 can obtain position information indicating the latitude, longitude, and altitude where the unmanned aircraft 100 is located from the GPS receiver 240.
  • the UAV control unit 110 can obtain the latitude and longitude information indicating the latitude and longitude of the unmanned aircraft 100 from the GPS receiver 240, and obtain the altitude information indicating the altitude of the unmanned aircraft 100 from the barometric altimeter 270 as position information.
  • the UAV control unit 110 may acquire, as height information, the distance from the ultrasonic emission point to the ultrasonic reflection point measured by the ultrasonic sensor 280.
  • the UAV control unit 110 can acquire the orientation information indicating the orientation of the unmanned aircraft 100 from the magnetic compass 260.
  • the orientation information may be represented by, for example, an orientation corresponding to the orientation of the nose of the unmanned aircraft 100.
  • the UAV control unit 110 can acquire position information indicating the position where the unmanned aircraft 100 should exist when the imaging unit 220 captures the shooting range to be captured.
  • the UAV control unit 110 may obtain position information indicating the position where the unmanned aircraft 100 should exist from the storage unit 160.
  • the UAV control unit 110 can obtain the position information indicating the position where the unmanned aerial vehicle 100 should exist from other devices through the communication unit 150.
  • the UAV control unit 110 may refer to the three-dimensional map database to determine the possible location of the unmanned aircraft 100, and obtain the location as the location information indicating the location where the unmanned aircraft 100 should exist.
  • the UAV control unit 110 can acquire the respective imaging ranges of the imaging unit 220 and the imaging unit 230.
  • the UAV control unit 110 may acquire the angle of view information representing the angle of view of the imaging unit 220 and the imaging unit 230 from the imaging unit 220 and the imaging unit 230 as a parameter for determining the imaging range.
  • the UAV control unit 110 may acquire information indicating the shooting direction of the camera unit 220 and the camera unit 230 as a parameter for determining the shooting range.
  • the UAV control unit 110 may obtain posture information indicating the posture state of the imaging unit 220 from the gimbal 200 as information indicating the imaging direction of the imaging unit 220, for example.
  • the posture information of the imaging unit 220 may indicate the rotation angle of the gimbal 200 from a reference rotation angle about each of the pitch axis and the yaw axis.
  • the UAV control unit 110 may obtain position information indicating the location of the unmanned aircraft 100 as a parameter for determining the shooting range.
  • the UAV control unit 110 may determine the imaging range representing the geographic range captured by the imaging unit 220, based on the angles of view and imaging directions of the imaging unit 220 and the imaging unit 230 and the position of the unmanned aircraft 100.
  • the UAV control unit 110 may acquire the shooting range information from the storage unit 160.
  • the UAV control unit 110 may obtain the shooting range information through the communication unit 150.
  • the UAV control unit 110 controls the gimbal 200, the rotor mechanism 210, the imaging unit 220, and the imaging unit 230.
  • the UAV control unit 110 can control the imaging range of the imaging unit 220 by changing the imaging direction or angle of view of the imaging unit 220.
  • the UAV control unit 110 can control the imaging range of the imaging unit 220 supported by the gimbal 200 by controlling the rotation mechanism of the gimbal 200.
  • the photographing range refers to the geographic range photographed by the photographing unit 220 or the photographing unit 230.
  • the shooting range is defined by latitude, longitude and altitude.
  • the shooting range may be a range of three-dimensional spatial data defined by latitude, longitude, and altitude.
  • the shooting range may be a range of two-dimensional spatial data defined by latitude and longitude.
  • the shooting range may be determined based on the angle of view and shooting direction of the camera 220 or 230 and the location where the unmanned aircraft 100 is located.
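As a worked illustration of how the shooting range follows from the angle of view and the aircraft's position: for a camera pointing straight down over flat ground, the ground footprint is a rectangle whose half-extent in each direction is altitude times the tangent of half the angle of view. This flat-ground, nadir-pointing approximation is an assumption for illustration; the patent's range may also account for the imaging direction and latitude/longitude.

```python
import math


def ground_footprint(altitude_m, hfov_deg, vfov_deg):
    """Approximate the geographic range captured by a nadir-pointing
    camera as a ground rectangle (width, height) in meters, from
    altitude and the horizontal/vertical angles of view."""
    half_w = altitude_m * math.tan(math.radians(hfov_deg) / 2)
    half_h = altitude_m * math.tan(math.radians(vfov_deg) / 2)
    return 2 * half_w, 2 * half_h
```

For example, at 100 m altitude a 90° horizontal angle of view covers a strip about 200 m wide on the ground.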
  • the imaging directions of the imaging unit 220 and the imaging unit 230 can be defined by the azimuth and the depression angle of the front face of the imaging unit 220 or the imaging unit 230 on which the imaging lens is provided.
  • the imaging direction of the imaging unit 220 may be a direction determined by the orientation of the nose of the unmanned aircraft 100 and the posture state of the imaging unit 220 with respect to the gimbal 200.
  • the imaging direction of the imaging unit 230 may be a direction determined from the orientation of the nose of the unmanned aircraft 100 and the position where the imaging unit 230 is installed.
  • the UAV control unit 110 can determine the surrounding environment of the unmanned aircraft 100 by analyzing multiple images captured by the multiple camera units 230.
  • the UAV control unit 110 may control the flight based on the surrounding environment of the unmanned aircraft 100, such as avoiding obstacles.
  • the UAV control unit 110 can acquire stereo information (three-dimensional information) indicating the three-dimensional shape of objects existing around the unmanned aircraft 100.
  • the object may be a part of a landscape such as buildings, roads, vehicles, trees, etc., for example.
  • the stereo information is, for example, three-dimensional spatial data.
  • the UAV control unit 110 may generate 3D information indicating the 3D shape of an object existing around the unmanned aircraft 100 based on each image acquired by the plurality of camera units 230, thereby acquiring the 3D information.
  • the UAV control unit 110 can obtain the three-dimensional information indicating the three-dimensional shape of objects existing around the unmanned aircraft 100 by referring to the three-dimensional map database stored in the storage unit 160.
  • the UAV control unit 110 can acquire three-dimensional information related to the three-dimensional shape of objects existing around the unmanned aircraft 100 by referring to a three-dimensional map database managed by a server existing on the network.
  • the UAV control unit 110 controls the flight of the unmanned aircraft 100 by controlling the rotor mechanism 210. That is, the UAV control unit 110 controls the position including the latitude, longitude, and altitude of the unmanned aircraft 100 by controlling the rotor mechanism 210.
  • the UAV control unit 110 can control the shooting range of the camera unit 220 by controlling the flight of the unmanned aircraft 100.
  • the UAV control unit 110 can control the angle of view of the imaging unit 220 by controlling the zoom lens included in the imaging unit 220.
  • the UAV control unit 110 can use the digital zoom function of the camera unit 220 to control the angle of view of the camera unit 220 through digital zoom.
  • the UAV control unit 110 can move the camera unit 220 to the desired position by moving the unmanned aircraft 100 to a specific position at a specific date and time.
  • the UAV control unit 110 can move the unmanned aerial vehicle 100 to a specific position on a specific date and time to make the imaging unit 220 work as desired.
  • the communication unit 150 communicates with the terminal 80.
  • the communication unit 150 can perform wireless communication by any wireless communication method.
  • the communication unit 150 can perform wired communication through any wired communication method.
  • the communication unit 150 may send the captured image or additional information (metadata) related to the captured image to the terminal 80.
  • the storage unit 160 can store various types of information, various types of data, various types of programs, and various types of images.
  • the various images may include a photographed image or an image based on the photographed image.
  • the programs may include programs required for the UAV control unit 110 to control the gimbal 200, the rotor mechanism 210, the imaging unit 220, the GPS receiver 240, the inertial measurement device 250, the magnetic compass 260, the barometric altimeter 270, the ultrasonic sensor 280, and the laser measuring device 290.
  • the storage 160 may be a computer-readable recording medium.
  • the storage unit 160 includes memory, and may include ROM (Read Only Memory), RAM (Random Access Memory), and the like.
  • the storage unit 160 may include at least one of HDD (Hard Disk Drive), SSD (Solid State Drive), SD card, USB (Universal Serial bus) memory, and other memories. At least a part of the storage unit 160 can be detached from the unmanned aircraft 100.
  • the gimbal 200 may rotatably support the imaging unit 220 about the yaw axis, the pitch axis, and the roll axis.
  • the gimbal 200 can change the imaging direction of the imaging unit 220 by rotating the imaging unit 220 around at least one of the yaw axis, the pitch axis, and the roll axis.
  • the rotor mechanism 210 includes a plurality of rotor wings and a plurality of drive motors that rotate the plurality of rotor wings.
  • the rotor mechanism 210 is controlled by the UAV control unit 110 to rotate, so that the unmanned aircraft 100 can fly.
  • the imaging unit 220 captures a subject in a desired imaging range and generates captured image data.
  • the data of the captured image captured by the imaging unit 220 may be stored in the memory included in the imaging unit 220 or the storage unit 160.
  • the imaging unit 230 captures the surroundings of the unmanned aircraft 100 and generates captured image data.
  • the image data of the imaging unit 230 may be stored in the storage unit 160.
  • the GPS receiver 240 receives a plurality of signals transmitted from a plurality of navigation satellites (ie, GPS satellites) that indicate time and the position (coordinate) of each GPS satellite.
  • the GPS receiver 240 calculates the position of the GPS receiver 240 (that is, the position of the unmanned aircraft 100) based on the received multiple signals.
  • the GPS receiver 240 outputs the position information of the unmanned aircraft 100 to the UAV control unit 110.
  • the UAV control unit 110 may calculate the position information instead of the GPS receiver 240. In this case, the information indicating the time and the position of each GPS satellite included in the plurality of signals received by the GPS receiver 240 is input to the UAV control unit 110.
  • the inertial measurement device 250 detects the posture of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
  • the inertial measurement device 250 can detect the acceleration in the front and rear, left and right, and up and down directions of the unmanned aircraft 100 and the angular velocities in the three axis directions of the pitch axis, the roll axis, and the yaw axis as the posture of the unmanned aircraft 100.
  • the magnetic compass 260 detects the orientation of the nose of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
  • the barometric altimeter 270 detects the flying altitude of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
  • the ultrasonic sensor 280 emits ultrasonic waves, detects ultrasonic waves reflected by the ground and objects, and outputs the detection result to the UAV control unit 110.
  • the detection result can show the distance from the unmanned aircraft 100 to the ground, that is, the height.
  • the detection result can show the distance from the unmanned aircraft 100 to the object (subject).
  • the laser measuring instrument 290 irradiates the object with laser light, receives the reflected light reflected by the object, and measures the distance between the unmanned aircraft 100 and the object (subject) through the reflected light.
  • a time-of-flight method may be used.
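The time-of-flight principle mentioned above is a one-line computation: the signal travels to the object and back, so the one-way distance is the propagation speed times the round-trip time, divided by two. The function below is an illustrative sketch; the default speed is the speed of light (for the laser measuring device 290), and the speed of sound (about 343 m/s) can be passed instead for the ultrasonic sensor 280.

```python
def tof_distance_m(round_trip_s, speed_m_per_s=299_792_458.0):
    """One-way distance from a time-of-flight measurement: the pulse
    covers the distance twice (out and back), hence the division by 2."""
    return speed_m_per_s * round_trip_s / 2
```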
  • FIG. 4 is a block diagram showing an example of the hardware configuration of the terminal 80.
  • the terminal 80 includes a terminal control unit 81, an operation unit 83, a communication unit 85, a storage unit 87, and a display unit 88.
  • the terminal 80 may be held by a user who wishes to instruct the flight control of the unmanned aircraft 100.
  • the terminal 80 may instruct the flight control of the unmanned aircraft 100.
  • the terminal control unit 81 is configured using, for example, a CPU, MPU, or DSP.
  • the terminal control unit 81 performs signal processing for overall control of the operation of each part of the terminal 80, data input/output processing with other parts, data arithmetic processing, and data storage processing.
  • the terminal control unit 81 can acquire data and information from the unmanned aircraft 100 via the communication unit 85.
  • the terminal control unit 81 can also acquire data and information input via the operation unit 83.
  • the terminal control unit 81 may obtain data or information stored in the storage unit 87.
  • the terminal control unit 81 can transmit data and information to the unmanned aircraft 100 via the communication unit 85.
  • the terminal control unit 81 may send data and information to the display unit 88 and cause the display unit 88 to display display information based on the data and information.
  • the information displayed by the display unit 88 and the information sent to the unmanned aircraft 100 through the communication unit 85 may include the flight path of the unmanned aircraft 100, the shooting position, the captured image, and information based on the captured image (for example, a composite image).
  • the operation unit 83 receives and obtains data and information input by the user of the terminal 80.
  • the operation unit 83 may include input devices such as buttons, keys, a touch panel, and a microphone.
• the touch panel may be configured by integrating the operation unit 83 and the display unit 88. In this case, the operation unit 83 can accept touch operations, tap operations, drag operations, and the like.
  • the communication unit 85 performs wireless communication with the unmanned aircraft 100 through various wireless communication methods.
  • the wireless communication method of the wireless communication may include communication based on a wireless LAN or a public wireless network.
  • the communication unit 85 can perform wired communication by any wired communication method.
  • the storage unit 87 can store various information, various data, various programs, and various images.
  • the various programs may include application programs executed by the terminal 80.
  • the storage section 87 may be a computer-readable recording medium.
  • the storage section 87 may include ROM, RAM, and the like.
  • the storage unit 87 may include at least one of HDD, SSD, SD card, USB memory, and other memories. At least a part of the storage part 87 can be detached from the terminal 80.
  • the storage unit 87 may store a captured image acquired from the unmanned aircraft 100 or an image based on the captured image.
  • the storage unit 87 may store additional information of the captured image or the image based on the captured image.
  • the display unit 88 is configured with an LCD (Liquid Crystal Display), for example, and displays various information and data output from the terminal control unit 81.
  • the display section 88 may display a captured image or an image based on the captured image.
  • the display unit 88 may also display various data and information related to the execution of the application program.
  • FIG. 5 is a diagram showing an example of the outline of the operation of unmanned aircraft 100.
  • the UAV control unit 110 specifies the flight path RT.
• the UAV control unit 110 acquires the shooting range for capturing the moving image during flight along the flight path RT.
  • the shooting range is determined by the state of the unmanned aircraft 100.
• the state of the unmanned aircraft 100 may include information such as: the position of the unmanned aircraft 100 at the time of shooting, the orientation of the unmanned aircraft 100 (for example, the direction of the nose), and the angle (rotation angle) of the gimbal 200 that supports the imaging unit 220.
• the state of the unmanned aircraft 100 may also include other state information of the unmanned aircraft 100 (for example, flight information or shooting information).
• the UAV control unit 110 can obtain the position of the imaging unit 220 through GPS, or can obtain the position information of the unmanned aircraft 100 with high precision through RTK (Real Time Kinematic) GPS technology.
  • the shooting range may be generated by the UAV control unit 110 according to the positional relationship between the flying position along the flight path RT and the shooting object, that is, the subject.
  • the shooting range may be stored in the storage part 160 and obtained from the storage part 160.
  • the shooting range can be obtained from an external server through the communication unit 150.
  • the UAV control unit 110 causes the unmanned aircraft 100 to fly along the acquired flight path RT.
  • the imaging unit 220 captures the acquired imaging range to capture a moving image.
  • the unmanned aircraft 100 flies multiple times on the same flight path RT and shoots dynamic images (videos).
  • a dynamic image consists of an image sequence with multiple image frames.
  • the dynamic image may have, for example, 30 (equivalent to 30 fps) or 60 (equivalent to 60 fps) image frames per second.
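Because the frame rate is fixed, each frame index corresponds to a fixed relative time within a lap, which is what allows frames from different laps to be paired. A trivial illustrative sketch (the function name is invented, not from the patent):

```python
def relative_time(frame_index: int, fps: int) -> float:
    """Relative time, in seconds from the start of a lap, of a frame."""
    return frame_index / fps

# at 30 fps, the fourth frame (index 3) of every lap sits at t = 0.1 s,
# so the fourth frames of all laps can be aligned at that relative time
print(relative_time(3, 30))  # 0.1
```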
  • the UAV control unit 110 causes the unmanned aircraft 100 to fly along the same flight path RT multiple times, and causes the imaging unit 220 to capture moving images of the same shooting range multiple times.
• the UAV control unit 110 acquires the first image frame gf11, the second image frame gf12, the third image frame gf13, the fourth image frame gf14, ... from the imaging unit 220 in the first circle of the flight path RT.
  • the UAV control unit 110 acquires the first image frame gf21, the second image frame gf22, the third image frame gf23, the fourth image frame gf24,... From the imaging unit 220 in the second circle of the flight path RT.
  • the UAV control unit 110 acquires the first image frame gf31, the second image frame gf32, the third image frame gf33, the fourth image frame gf34,... From the imaging unit 220 in the third circle of the flight path RT.
  • the X-th image frame is simply described as the X-th frame.
• in each circle, the same shooting range is photographed.
• the image ranges of the first image frames gf11, gf21, and gf31, captured at the same relative time t1, are the same.
• the image ranges of the second image frames gf12, gf22, and gf32, captured at the same relative time t2, are the same.
• the image ranges of the third image frames gf13, gf23, and gf33, captured at the same relative time t3, are the same.
• the image ranges of the fourth image frames gf14, gf24, and gf34, captured at the same relative time t4, are the same.
• this is because, at each relative time, the state of the unmanned aircraft 100 is the same.
  • unmanned aircraft 100 can acquire multiple image frames taken at the same position.
  • the unmanned aircraft 100 can continuously shoot frame by frame by repeatedly flying and shooting on the flight path RT.
• the UAV control unit 110 synthesizes the image frames captured at the same relative time in each circle, obtaining one composite image frame per relative time. For example, the three first image frames gf11, gf21, and gf31 are synthesized to generate the first composite image frame; the second and subsequent composite image frames are generated in the same way. The UAV control unit 110 then generates a composite moving image by arranging the composite image frames in time series.
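The per-relative-time synthesis described above can be sketched in pure Python as follows (a minimal illustration with invented names; averaging is one possible synthesis, detailed later in the document):

```python
# Synthesize one composite frame per relative time by averaging the pixel
# values of the frames that each lap captured at that relative time.
def composite_video(laps):
    """laps: list of videos; each video is a list of frames;
    each frame is a flat list of pixel values (same length everywhere)."""
    n_frames = len(laps[0])
    composite = []
    for t in range(n_frames):                    # same relative time t in every lap
        frames_at_t = [lap[t] for lap in laps]   # e.g. gf1t, gf2t, gf3t
        avg = [sum(px) / len(laps) for px in zip(*frames_at_t)]
        composite.append(avg)
    return composite

lap1 = [[10, 20], [30, 40]]
lap2 = [[20, 30], [50, 60]]
lap3 = [[30, 40], [40, 50]]
print(composite_video([lap1, lap2, lap3]))  # [[20.0, 30.0], [40.0, 50.0]]
```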
  • the UAV control unit 110 may store the status information of the unmanned aircraft 100 when capturing an image frame.
• the time when an image frame is captured, that is, the time when the state information of the unmanned aircraft 100 is acquired, may be synchronized with the vertical synchronization signal (VSYNC signal) of the imaging unit 220.
• the state of the unmanned aircraft 100 may be saved at least during the first round of shooting.
• in the second and subsequent laps, the unmanned aircraft 100 can reproduce its state during the first flight, and can thus capture moving images whose image frames have the same shooting range as in the first lap.
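The per-frame state logging and replay described above can be sketched as follows (a minimal illustration; the state fields and function names are invented, not from the patent):

```python
# Lap 1: log the aircraft state (position, heading, gimbal angle) on every
# frame tick (the VSYNC analogue). Laps 2..N: replay the logged state as the
# setpoint so every frame index sees the same shooting range.
first_lap_states = {}

def on_vsync_lap1(frame_index, position, heading, gimbal_angle):
    first_lap_states[frame_index] = (position, heading, gimbal_angle)

def setpoint_for(frame_index):
    # later laps fly toward the state recorded at the same relative time
    return first_lap_states[frame_index]

on_vsync_lap1(0, (0.0, 0.0, 30.0), 90.0, -15.0)
on_vsync_lap1(1, (1.0, 0.0, 30.0), 90.0, -15.0)
print(setpoint_for(1))  # ((1.0, 0.0, 30.0), 90.0, -15.0)
```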
  • FIG. 6 is a flowchart showing an example of the operation of unmanned aircraft 100.
  • the UAV control unit 110 specifies the flight path RT (S11).
• the flight path RT may be specified by the user in advance through the operation unit 83 of the terminal 80, and received via the communication unit 85 and the communication unit 150.
• the flight path RT can be generated and specified by the UAV control unit 110 so that one or more desired subjects can be photographed.
  • the flight path RT may be stored in the storage unit 160 in advance, and obtained from the storage unit 160 for designation.
  • the flight path RT can be specified by obtaining it from an external server through the communication unit 150.
  • the flight path RT is a flight path that can capture a desired subject.
  • the UAV control unit 110 can designate the flight path RT in accordance with manual operation (manipulation) by the operation unit 83 of the terminal 80 during the first round of flight.
  • the UAV control unit 110 causes the imaging unit 220 to start imaging along the flight path RT according to a predetermined shooting start trigger signal.
  • the shooting start trigger signal may include: receiving a shooting start instruction from the terminal 80 through the communication unit 150, or detecting that a predetermined time to start shooting has been reached.
• the instruction to start shooting may include, for example, selecting the video synthesis mode as the shooting mode through the operation unit 83 of the terminal 80.
• the UAV control unit 110 stores in the storage unit 160 the state of the unmanned aircraft 100 at the time the capture of the moving image along the flight path RT is started (S12).
  • the UAV control unit 110 may also acquire the state of the unmanned aircraft 100 instructed by the terminal 80 through the communication unit 150, that is, the state of the unmanned aircraft 100 at the start of imaging.
  • the UAV control unit 110 may determine the state of the unmanned aircraft 100 at the start of imaging according to a desired subject.
  • the imaging range captured by the imaging unit 220 is determined according to the state of the unmanned aircraft 100.
  • the UAV control unit 110 captures a moving image along the flight path RT (S13).
  • the UAV control unit 110 controls the flight of the unmanned aircraft 100 so that it flies along the flight path RT in each circle, and acquires each image frame of the dynamic image of each circle.
  • the UAV control unit 110 synthesizes the moving images captured in each circle, and generates a synthetic moving image (S14). The composition of dynamic images will be described in detail later.
• the UAV control unit 110 outputs moving images such as the composite moving image (S15). The output of the moving image will be described in detail later.
  • the UAV control unit 110 may store the information of the number of the circle in the storage unit 160 during the flight and shooting of each circle (for example, when the shooting of each circle starts).
  • the UAV control unit 110 may also save the state of the unmanned aircraft 100 at least when acquiring each image frame of the first circle. Therefore, after the second round of the flight path RT, the unmanned aircraft 100 can also perform flight and shooting in the state of the unmanned aircraft 100 that is the same as the state of the unmanned aircraft 100 in the first round.
  • the UAV control unit 110 evaluates the output moving image (output moving image) (S16).
• the UAV control unit 110 may evaluate the output moving image when the shooting of the moving image in each circle ends. For example, the UAV control unit 110 may determine that the shooting of the moving image is completed when the flight and shooting of the predetermined flight path RT are completed. For example, when the unmanned aircraft 100 is manipulated from the terminal 80 in the first lap, the UAV control unit 110 can determine that the shooting of the moving image has ended when an operation instructing the end of the operation of the unmanned aircraft 100 has been performed on the operation unit 83 and notified to the unmanned aircraft 100 through the communication unit 85.
  • the UAV control unit 110 determines whether the evaluation result of the output moving image satisfies a preset criterion (S17).
  • the preset benchmark can be a user's subjective benchmark or an objective benchmark.
• the UAV control unit 110 can send the output dynamic image to the terminal 80 through the communication unit 150; the terminal control unit 81 of the terminal 80 receives the output dynamic image through the communication unit 85 and displays it on the display unit 88.
  • the user may also confirm the displayed output moving image, and the user may subjectively determine whether the output moving image satisfies a preset criterion.
  • the terminal control unit 81 may obtain the operation information indicating that the preset criterion is satisfied through the operation unit 83, and send it to the unmanned aircraft 100 through the communication unit 85.
  • the terminal control unit 81 may obtain the operation information that the preset criterion is not met through the operation unit 83, and send it to the unmanned aircraft 100 through the communication unit 85. That is, the user can manually input the evaluation result.
  • the UAV control unit 110 may perform image recognition (for example, pattern recognition) on the output moving image, and evaluate the output moving image according to the result of the image recognition.
  • the preset reference may be a reference based on the pixel value of each pixel of each image frame of the output dynamic image.
• when the evaluation result satisfies the preset criterion, the UAV control unit 110 ends the process of FIG. 6, and ends the flight and shooting along the flight path RT.
• when the evaluation result does not satisfy the preset criterion, the UAV control unit 110 proceeds to the next round of flying and shooting (S18).
  • the UAV control unit 110 acquires the status information of the unmanned aircraft 100 at the start of shooting from the storage unit 160, and sets it as the status of the unmanned aircraft 100 at the starting point of the flight path RT (S18 ).
• the UAV control unit 110 moves the unmanned aircraft 100 to the position where imaging of the next lap of the flight path RT starts, and brings the imaging unit 220 into a state in which it can capture the desired imaging range at the start of imaging.
• the moving image to be evaluated may be limited to the synthesized moving image among the output moving images. For example, even if the first-lap reference dynamic image is not evaluated, the quality of the synthesized dynamic image is not affected, and the processing time of FIG. 6 can be shortened.
  • the unmanned aircraft 100 repeatedly performs flight and shooting along the flight path RT at least N times.
• N is an arbitrary number greater than or equal to 2, for example, the number of laps at which the quality of the generated synthetic moving image is assumed to exceed a predetermined quality.
  • the value of N may be designated by the user through the operation unit 83 of the terminal 80, for example, or may be appropriately determined as an arbitrary numerical value.
  • the UAV control unit 110 may also determine the value of N according to the shooting scene or the shooting range.
• in this way, the unmanned aircraft 100 (an example of a flying body) processes the moving images captured by the imaging unit 220 included in the unmanned aircraft 100.
• the UAV control unit 110 (an example of the processing unit) can specify the flight path RT on which the unmanned aircraft 100 flies.
• the UAV control unit 110 can make the unmanned aircraft 100 circle the flight path RT multiple times.
• the UAV control unit 110 can cause the imaging unit 220 to capture a plurality of dynamic images having the same shooting range through multiple circling flights.
  • the UAV control unit 110 may synthesize a plurality of moving images captured through multiple circling flights to generate a synthetic moving image.
• it is difficult for the unmanned aircraft 100 to remain in one place while flying and capturing dynamic images. Therefore, when shooting a moving image, it is difficult to keep shooting the same shooting range continuously, and hence difficult to synthesize images of the same shooting range. In this respect, instead of staying in one place when shooting dynamic images, the unmanned aircraft 100 circles the designated flight path RT multiple times, so that the same shooting range can be photographed repeatedly over time. Therefore, the unmanned aircraft 100 can fix the same shooting range, that is, a larger shooting range, and obtain a plurality of dynamic images having a plurality of image frames corresponding to each shooting range.
  • the unmanned aircraft 100 synthesizes the plurality of dynamic images and generates a composite dynamic image, so as to obtain various beneficial shooting effects (for example, Temporal Denoise, HDR (High Dynamic Range, high dynamic range)). That is, the unmanned aircraft 100 can obtain a long-time exposure shooting effect, increase the SNR (Signal to Noise Ratio), reduce noise, and expand the dynamic range.
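The Temporal Denoise benefit mentioned above can be illustrated numerically: averaging N frames of the same scene reduces zero-mean noise by roughly a factor of sqrt(N), which raises the SNR. A toy demonstration (all values and names invented for illustration):

```python
import random

random.seed(0)
TRUE_PIXEL = 100.0  # the noise-free scene value

def noisy_frame(n_pixels, sigma):
    """One captured frame: true value plus zero-mean Gaussian sensor noise."""
    return [TRUE_PIXEL + random.gauss(0, sigma) for _ in range(n_pixels)]

def mean_abs_error(frame):
    return sum(abs(p - TRUE_PIXEL) for p in frame) / len(frame)

single = noisy_frame(1000, sigma=10.0)
stack = [noisy_frame(1000, sigma=10.0) for _ in range(16)]
averaged = [sum(px) / 16 for px in zip(*stack)]  # temporal average of 16 laps

# averaging 16 frames cuts the noise by roughly a factor of 4 (sqrt(16))
print(mean_abs_error(averaged) < mean_abs_error(single))  # True
```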
  • the dynamic image may have a plurality of image frames.
  • the UAV control unit 110 may control the unmanned aircraft 100 so that each of the plurality of dynamic images has the same image frame at the same relative time.
• the unmanned aircraft 100 obtains images of the same shooting range from the image frames at the same relative time in each moving image, so that, over the moving image as a whole, multiple image frames of the same shooting range can be obtained over a larger range.
  • the UAV control unit 110 may acquire the state of the unmanned aircraft 100 in synchronization with the vertical synchronization signal (VSYNC signal) of the imaging unit 220 during the flight of the first flight path RT.
• the UAV control unit 110 can control the flight of the unmanned aircraft 100 and the imaging unit 220 in synchronization with the vertical synchronization signal of the imaging unit 220 during the flight of the flight path RT in the second and subsequent laps, so that shooting is performed with the unmanned aircraft 100 in the same state as in the first lap.
  • the unmanned aircraft 100 is synchronized with the vertical synchronization signal of the camera unit 220, so that every time an image frame is acquired, the state of the unmanned aircraft 100 can be acquired.
• the unmanned aircraft 100 stores its flight mode and shooting mode of the first lap and uses the same flight mode and shooting mode in subsequent laps, so that the shooting range corresponding to the state of the unmanned aircraft 100 can easily be fixed over a wide range and multiple dynamic images can be obtained.
• the state of the unmanned aircraft 100 may include at least one of: the position of the unmanned aircraft 100, the orientation of the unmanned aircraft 100, and the angle of the gimbal 200 supporting the imaging unit 220.
• the unmanned aircraft 100 can store the state of the unmanned aircraft 100 in the storage unit 160, and at a later point in time acquire and set that state from the storage unit 160 so that, for example, an image frame of the same imaging range that the imaging unit 220 captured in the past can be acquired.
• when the evaluation of the synthesized moving image reaches the preset reference, the UAV control unit 110 may end the control of the flight and shooting of the unmanned aircraft 100.
• when it does not, the UAV control unit 110 may perform flight and shooting control along the flight path RT to be circled next.
  • the unmanned aircraft 100 can continuously shoot on the flight path RT until the evaluation of the synthetic dynamic image reaches the preset reference. Therefore, it is expected that the quality of the synthesized moving image of unmanned aircraft 100 will be improved.
  • the UAV control unit 110 may acquire operation information indicating the evaluation result of the synthesized moving image.
• the operation information can be obtained from the terminal 80. In this way, the user can subjectively evaluate the synthesized dynamic image, and can determine whether to acquire more images as the basis of the synthetic dynamic image.
  • the UAV control unit 110 may perform image recognition for the composite moving image.
  • the UAV control unit 110 may evaluate the synthesized dynamic image based on the result of the image recognition.
  • the unmanned aircraft 100 can objectively evaluate the synthetic dynamic image through image recognition, and can determine whether to fly again on the flight path RT and continue to acquire image frames that are the basis of the synthetic dynamic image.
  • the processing related to the above-mentioned flight control and shooting control, and synthesis of moving images may be mainly performed by the unmanned aircraft 100.
  • various controls and various processes can be performed by one device, which can implement efficient processing and shorten the processing time.
  • the processing related to the above-mentioned shooting control and synthesis of moving images may also be mainly performed by other devices (for example, the terminal 80 and the transmitter).
  • Fig. 7 is a flowchart showing a first example of composition of moving images.
  • the composite processing of the moving image corresponds to S14 in FIG. 6.
• in FIG. 7, it is assumed that the moving image of any one circling in S13 of FIG. 6 has been acquired.
• the UAV control unit 110 determines whether the obtained moving image is the moving image obtained in the first circle of the flight path RT (S21). For example, by referring to the storage unit 160, the UAV control unit 110 can determine which lap of the flight path RT is currently being flown. The UAV control unit 110 may obtain information indicating the lap number of the current flight path RT from the storage unit 160.
• when the obtained moving image is that of the first circle, the UAV control unit 110 stores each image frame of the obtained moving image as each image frame of the reference moving image in the storage unit 160 (S22).
• when each image frame is acquired, that is, in synchronization with the vertical synchronization signal of the imaging unit 220, the state information of the flying body is stored in the storage unit 160. Thereby, the state of the unmanned aircraft 100 at the moment each image is taken can be grasped.
  • the UAV control unit 110 also stores each image frame of the obtained moving image as each image frame of the calculation moving image (S23).
• the UAV control unit 110 compares each image frame of the obtained moving image with the corresponding image frame of the reference moving image, and calculates a global motion vector (S24).
  • Corresponding image frames refer to image frames at the same relative time.
  • the global motion refers to motion information representing changes in the state (posture) of the unmanned aircraft 100 and the flight movement of the unmanned aircraft 100 at multiple time points.
  • the global motion is represented by a motion vector (global motion vector).
• the UAV control unit 110 corrects the global motion based on the calculated global motion vector, that is, performs global motion compensation (S25). For example, in global motion compensation, the motion of the entire image frame can be expressed through an affine transformation, and motion compensation is performed in units of image frames, so the coding efficiency and the compensation efficiency are high. In addition, the UAV control unit 110 may also perform inter-frame prediction and motion compensation other than global motion compensation between image frames at the same relative time in each circle. The processing related to motion compensation in S24 and S25 may also be omitted.
  • the UAV control unit 110 adds each image frame of the obtained moving image to each corresponding image frame of the calculation moving image (S26).
  • the value of each pixel of each frame of the moving image subjected to global motion compensation may be added to the value of each pixel of each corresponding image frame in the moving image for calculation.
• for example, the UAV control unit 110 adds the pixel value of each pixel of the first image frame gf11 of the first-circle moving image, which is the calculation moving image, to the pixel value of each pixel of the first image frame gf21 of the second-circle moving image, to calculate the first image frame of the new calculation moving image.
• next, the UAV control unit 110 adds the pixel value of each pixel of the first image frame of the calculation moving image, to which the first-circle and second-circle moving images have been added, to the pixel value of each pixel of the first image frame gf31 of the third-circle moving image, to generate the first image frame of the new calculation moving image.
• the same addition is performed for the moving images of the fourth and subsequent circles.
  • the UAV control unit 110 calculates the average value of each image frame of the calculated moving image for calculation (S27). In this case, the UAV control unit 110 may calculate the average value of the pixel value of each pixel of each image frame of the moving image for calculation. The UAV control unit 110 generates a composite moving image having each image frame whose average value is calculated (S27). Thus, when the flight of the flight path RT is the flight after the second lap, the unmanned aircraft 100 can output (for example, transmit, display) the synthesized moving image while capturing the moving image.
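The S23/S26/S27 bookkeeping, a running per-pixel sum divided by the lap count, can be sketched as follows (illustrative names, not the patent's implementation):

```python
# Keep a running per-pixel sum (the "calculation moving image") and divide by
# the lap count to get the composite, so a composite is available from lap 2 on.
class CalcVideo:
    def __init__(self, first_lap):
        # lap 1 initialises the calculation moving image (S23)
        self.sums = [list(frame) for frame in first_lap]
        self.laps = 1

    def add_lap(self, lap):
        # S26: add the new lap's frames, pixel by pixel, per relative time
        for frame_sum, frame in zip(self.sums, lap):
            for i, px in enumerate(frame):
                frame_sum[i] += px
        self.laps += 1

    def composite(self):
        # S27: average each pixel over the laps accumulated so far
        return [[px / self.laps for px in frame] for frame in self.sums]

calc = CalcVideo([[100, 200]])   # one frame, two pixels, lap 1
calc.add_lap([[110, 190]])       # lap 2
print(calc.composite())          # [[105.0, 195.0]]
```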
  • the UAV control unit 110 can generate a composite moving image based on the first moving image (for example, the reference moving image) obtained in the first lap and the second moving image obtained after the second lap.
  • unmanned aircraft 100 can generate a composite moving image in which a plurality of surrounding moving images are synthesized using the first-circle moving image as a reference.
  • the UAV control unit 110 may compare the first moving image with the second moving image for each image frame of the same relative time, and perform the motion compensation of the second moving image on the first moving image based on the comparison result.
  • unmanned aircraft 100 can perform motion compensation in image frames of the same relative time after the first lap and the second lap. Therefore, it is possible to improve the uniformity of the image range of each image frame at the same relative time in a plurality of moving images.
  • the image range corresponds to the shooting range. Therefore, for example, even if the flying environment of the unmanned aircraft 100 is not good, it is possible to reduce the positional deviation between a plurality of image frames in each moving image, thereby improving the image quality of the composite moving image.
  • motion compensation may include global motion compensation.
  • unmanned aircraft 100 can improve the coding efficiency of compression coding of moving images and the efficiency of motion compensation.
  • the UAV control unit 110 may generate a composite moving image based on the statistical value of the same pixel of the image frame of the same relative time in the first moving image and the second moving image.
• since the unmanned aircraft 100 captures dynamic images while flying, it is difficult to obtain image frames with the same shooting range.
  • the unmanned aircraft 100 can circle on the same flight path RT and obtain multiple image frames at the same relative time.
  • the unmanned aircraft 100 obtains statistical values (for example, average values) of a plurality of image frames, so that even if some image frames with lower image quality are included, the image quality of the image frames can be improved and a dynamic image can be obtained.
  • Fig. 8 is a flowchart showing a second example of composition of moving images.
• for the same processing as in FIG. 7, the same step numbers are assigned, and the description thereof is omitted or simplified.
  • unmanned aircraft 100 performs the same processing as S21, S22, S24, and S25 in FIG. 7.
  • the UAV control unit 110 extracts the characteristic region in the image frame of the obtained moving image (S26A).
• the characteristic area is extracted based on objective criteria or the user's subjective judgment.
• the characteristic area may be, for example, an area considered to have value in that circling.
  • the UAV control unit 110 may extract the difference area between the obtained moving image and the image frame of the same relative time in the reference moving image as the characteristic area.
  • the UAV control unit 110 may extract an area where a predetermined subject exists in an image frame of the obtained moving image as a characteristic area.
  • the UAV control unit 110 may extract an area designated by the user as a characteristic area through the operation unit 83 of the terminal 80 for the obtained image frame of the moving image. The extraction of the characteristic region is implemented for each image frame in the obtained moving image.
  • the UAV control unit 110 replaces the area (feature corresponding area) of each image frame of the reference moving image corresponding to the characteristic area extracted in each image frame of the obtained moving image with the extracted characteristic area (S27A).
  • the UAV control unit 110 may replace the pixel value of each pixel in the feature corresponding area with the pixel value of each pixel in the extracted feature area.
  • the UAV control unit 110 generates a composite moving image having each image frame in which the characteristic corresponding area in the reference moving image is replaced with the characteristic area in the obtained moving image (S27A).
• in this way, the UAV control unit 110 can compare the first moving image with the second moving image for each image frame of the same relative time, extract the characteristic area from the second moving image, and replace the area of the first moving image corresponding to that characteristic area (the feature corresponding area) with the characteristic area of the second moving image.
• thereby, the unmanned aircraft 100 replaces a part of the first moving image that has lower image quality, or that is not in the state the user expects, with the corresponding part of an image frame of the same relative time in another moving image, thereby improving the quality of the first moving image and obtaining the synthesized dynamic image. For example, when photographing a tower or building as a subject, there may be many tourists around the tower or building in an image frame of the first moving image. Even in this case, when there are no tourists in the image frame of the same relative time in the second dynamic image, the unmanned aircraft 100 extracts that part as a characteristic area and replaces the feature corresponding area of the image frame in the first dynamic image. As a result, the unmanned aircraft 100 can obtain a composite moving image of the tower or building without the tourists.
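The S26A/S27A replacement can be sketched with a simple difference-based characteristic area (the threshold, pixel values, and names are invented for illustration):

```python
# Where a frame from a later lap differs from the reference frame (e.g. the
# tourists left the scene), treat that region as the characteristic area and
# copy it into the reference frame's feature-corresponding area.
def replace_feature_area(reference, candidate, threshold=50):
    out = [row[:] for row in reference]
    for y, (ref_row, cand_row) in enumerate(zip(reference, candidate)):
        for x, (r, c) in enumerate(zip(ref_row, cand_row)):
            if abs(r - c) > threshold:     # difference area = characteristic area
                out[y][x] = c              # replace the feature-corresponding pixel
    return out

# reference frame: a "tourist" (255) in front of a building (value 80)
ref_frame = [[80, 255, 80],
             [80, 255, 80]]
# same relative time in a later lap: tourist gone
later_lap = [[80, 80, 80],
             [80, 80, 80]]
print(replace_feature_area(ref_frame, later_lap))  # [[80, 80, 80], [80, 80, 80]]
```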
  • Fig. 9 is a flowchart showing an output example of a moving image.
  • the output processing of the moving image corresponds to S15 in FIG. 6.
• in FIG. 9, it is assumed that the moving image of any one circling in S13 of FIG. 6 has been acquired.
• the UAV control unit 110 determines whether the obtained moving image is a moving image from the Nth circle or later (S31). When the obtained moving image is from before the Nth circle, the UAV control unit 110 outputs the moving image of the most recent circling (S32). In this case, the UAV control unit 110 may output the moving image captured by the imaging unit 220 in real time, without synthesizing the moving image. When the obtained moving image is from the Nth circle or later, the UAV control unit 110 outputs the generated composite moving image (S33).
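The S31 to S33 branch can be sketched as follows (N and the video objects are placeholders, not from the patent):

```python
def select_output(lap_number, latest_lap_video, composite_video, n=3):
    """Before the n-th lap, pass through the latest raw lap (S32);
    from the n-th lap on, output the composite (S33)."""
    if lap_number < n:          # S31: No -> latest circling, unsynthesized
        return latest_lap_video
    return composite_video      # S31: Yes -> synthesized moving image

print(select_output(1, "raw_lap_1", "composite"))  # raw_lap_1
print(select_output(3, "raw_lap_3", "composite"))  # composite
```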
  • As the output of the moving image, the UAV control unit 110 may transmit the moving image to another device (for example, the terminal 80) through the communication unit 150.
  • As the output of the moving image, the UAV control unit 110 may cause another device (for example, the terminal 80) to display the moving image.
  • The terminal control unit 81 of the terminal 80 can receive the moving image through the communication unit 85 and display it on the display unit 88.
  • As the output of the moving image, the UAV control unit 110 may store the moving image in the storage unit 160 or another recording medium (for example, an external recording medium).
  • The UAV control unit 110 can obtain the number of laps that the unmanned aircraft 100 has flown along the flight path RT.
  • The UAV control unit 110 may output the moving image captured in the last lap.
  • The UAV control unit 110 may output the synthesized moving image.
  • When the number of laps is still small, the image quality of the synthesized moving image may be insufficient and undesired artifacts may appear in it. In this case, by providing the unsynthesized moving image of the last lap, the unmanned aircraft 100 can suppress output of the synthesized moving image and still provide the latest moving image.
  • When the unmanned aircraft 100 flies a threshold number of laps or more while capturing moving images, the flight can take a long time. Even in this case, intermediate moving images can be output so that the user can confirm them.
  • When the number of laps is sufficient, the image quality of the composite moving image can be assumed to be adequate and stable. In this case, by providing the composite moving image, the unmanned aircraft 100 can provide a moving image with improved image quality compared to the moving image of any single lap.
  • The output example of the moving image shown in FIG. 9 is merely an example; other output methods may be used.
  • The UAV control unit 110 may also output the composite moving image regardless of the number of laps of the obtained moving image.
  • The above embodiments describe the shooting and synthesis of a plurality of moving images during the flight of a flying body, but they are not limited to flying bodies and may also be applied to other moving bodies (for example, vehicles and ships). In that case, by replacing the expression "flight" with "movement", the above embodiments also apply to the shooting and synthesis of a plurality of moving images while the moving body is moving.


Abstract

It is expected that multiple dynamic images captured during flight by an aircraft capable of image capture can be synthesized, thereby improving the image quality of the dynamic images. An image processing device processes dynamic images captured by an imaging unit included in an aircraft. A processing unit of the image processing device specifies the flight path of the aircraft and makes the aircraft circle along the flight path multiple times; through the multiple circling flights, the imaging unit included in the aircraft captures multiple dynamic images having the same shooting range; and the multiple dynamic images captured through the multiple circling flights are synthesized to generate a composite dynamic image.

Description

Image processing device, image processing method, program and recording medium
Technical Field
The present disclosure relates to an image processing device, an image processing method, a program, and a recording medium.
Background Art
Conventionally, an image synthesis technique (image stacking) that synthesizes a plurality of images has been known. By synthesizing a plurality of images, the image quality can be improved. Patent Document 1 discloses an image processing device that performs image synthesis. The image processing device includes a synthesis unit that synthesizes a plurality of images taken at different points in time, and a motion correction unit that performs correction to reduce the influence of motion on the images.
Background art documents:
[Patent Literature]
[Patent Document 1] Specification of European Patent Application Publication No. 3450310
Summary of the Invention
Technical problem to be solved by the invention:
The image processing device in Patent Document 1 fixes the position of the imaging device, takes a plurality of still images, and then synthesizes the still images (image stacking). However, it does not consider the case where a plurality of moving images are synthesized while the imaging device is moving (video stacking), as when a flying body carries the imaging device. It is expected that the image quality of moving images can be improved by synthesizing a plurality of moving images captured by a flying body that can shoot while flying.
Means for solving the technical problem:
In one aspect, an image processing device processes a moving image captured by an imaging unit included in a flying body. The image processing device includes a processing unit that specifies a flight path along which the flying body flies; makes the flying body circle along the flight path multiple times; causes the imaging unit included in the flying body to capture, through the multiple circling flights, a plurality of moving images having the same shooting range; and synthesizes the plurality of moving images captured through the multiple circling flights to generate a composite moving image.
The moving image may have a plurality of image frames in time-series order. The processing unit may control the flying body so that image frames at the same relative time in the plurality of moving images have the same shooting range.
During the flight of the first lap of the flight path, the processing unit acquires the state of the flying body in synchronization with the vertical synchronization signal of the imaging unit; during the flight of the second and subsequent laps, the processing unit controls the flight of the flying body and the imaging unit in synchronization with the vertical synchronization signal of the imaging unit so that shooting is performed in the same state as the state of the flying body in the first lap.
The state of the flying body may include at least one of: the position of the flying body, the orientation of the flying body, and the angle of the gimbal supporting the imaging unit.
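As an illustrative sketch (not part of the disclosure; all names are hypothetical), the per-frame flight state acquired in synchronization with the vertical synchronization signal could be recorded as a simple data structure, one entry per vsync pulse of the first lap:

```python
from dataclasses import dataclass

@dataclass
class FlightState:
    position: tuple      # (latitude, longitude, altitude)
    heading_deg: float   # orientation of the flying body
    gimbal_deg: tuple    # (pitch, yaw) angles of the gimbal supporting the camera

def record_states(vsync_frames, read_state):
    """First lap: record one FlightState per vertical-sync pulse, so the
    second and later laps can reproduce the same state per frame."""
    return [read_state(f) for f in vsync_frames]
```

On later laps, the controller would replay these states frame by frame to reproduce the same shooting range.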
The processing unit may generate the composite moving image from a first moving image obtained in the first lap and a second moving image obtained in the second or a later lap.
The processing unit may compare the first moving image with the second moving image for each image frame at the same relative time and, based on the comparison result, apply motion compensation of the second moving image to the first moving image.
The motion compensation may include global motion compensation.
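A minimal sketch of global motion estimation between frames at the same relative time, using integer phase correlation in NumPy (an assumption for illustration; a real implementation would likely use a richer global motion model than pure translation):

```python
import numpy as np

def estimate_global_shift(ref, img):
    """Estimate the integer (dy, dx) translation that aligns img to ref
    via phase correlation -- a minimal stand-in for global motion estimation."""
    f = np.fft.fft2(ref) * np.conj(np.fft.fft2(img))
    corr = np.fft.ifft2(f / (np.abs(f) + 1e-9)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape
    # Wrap large positive indices around to negative shifts.
    return (dy - h if dy > h // 2 else dy, dx - w if dx > w // 2 else dx)

def compensate(img, shift):
    """Apply the estimated shift so img lines up with the reference frame."""
    return np.roll(img, shift, axis=(0, 1))
```

Here the frame of the second moving image would be aligned to the frame of the first moving image before synthesis.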
The processing unit may generate the composite moving image from statistical values of the same pixels in image frames at the same relative time in the first moving image and the second moving image.
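The pixel-statistics synthesis could, for example, average the same pixels of frames at the same relative time across laps (a minimal sketch, assuming the mean as the statistical value; other statistics such as the median are equally possible):

```python
import numpy as np

def stack_frames(videos):
    """videos: list of frame sequences (ndarrays), one per lap, aligned so
    index t is the same relative time in every lap. Averaging the same
    pixel across laps suppresses uncorrelated noise (image stacking)."""
    return [np.mean([v[t] for v in videos], axis=0)
            for t in range(len(videos[0]))]
```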
The processing unit may compare the first moving image with the second moving image for each image frame at the same relative time, extract a feature region from the second moving image, and replace the region in the first moving image corresponding to the feature region with the feature region of the second moving image.
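A minimal sketch of the feature-region replacement, assuming the feature region has already been extracted as a boolean mask (the extraction step itself is not shown and the names are hypothetical):

```python
import numpy as np

def replace_region(first_frame, second_frame, mask):
    """Replace the area of first_frame corresponding to the feature region
    (boolean mask) with the pixels of second_frame at the same relative time."""
    out = first_frame.copy()
    out[mask] = second_frame[mask]
    return out
```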
The processing unit may obtain the number of laps that the flying body has flown along the flight path; when the obtained number of laps is less than a threshold, output the moving image captured in the last lap; and when the obtained number of laps is greater than or equal to the threshold, output the composite moving image.
The processing unit may evaluate the output composite moving image; when the evaluation result of the composite moving image satisfies a preset criterion, end the flight and shooting of the flying body; and when the evaluation result does not satisfy the preset criterion, fly and shoot along the flight path for the next lap.
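The evaluate-and-repeat loop could be sketched as follows (illustrative only; `fly_one_lap`, `synthesize` and `evaluate` are hypothetical callables standing in for the flight, synthesis and evaluation steps):

```python
def capture_until_acceptable(fly_one_lap, synthesize, evaluate, max_laps=10):
    """Repeat circling flights until the composite moving image meets the
    preset criterion (evaluate returns True), then end flight and shooting."""
    laps = []
    composite = None
    for _ in range(max_laps):
        laps.append(fly_one_lap())       # shoot one more lap
        composite = synthesize(laps)     # regenerate the composite video
        if evaluate(composite):          # preset criterion satisfied?
            break                        # end flight and shooting
    return composite
```

`evaluate` could be driven either by user operation information or by image recognition, as described below.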
The processing unit may acquire operation information indicating the evaluation result of the composite moving image.
The processing unit may perform image recognition on the composite moving image and evaluate the composite moving image according to the result of the image recognition.
The image processing device may be the flying body.
In one aspect, an image processing method processes a moving image captured by an imaging unit included in a flying body. The method includes the steps of: specifying a flight path along which the flying body flies; making the flying body circle along the flight path multiple times; causing the imaging unit included in the flying body to capture, through the multiple circling flights, a plurality of moving images having the same shooting range; and synthesizing the plurality of moving images captured through the multiple circling flights to generate a composite moving image.
The moving image may have a plurality of image frames in time-series order. The step of capturing the plurality of moving images may include the step of controlling the flying body so that image frames at the same relative time in the plurality of moving images have the same shooting range.
The step of capturing the plurality of moving images may include the steps of: during the flight of the first lap of the flight path, acquiring the state of the flying body in synchronization with the vertical synchronization signal of the imaging unit; and during the flight of the second and subsequent laps, controlling the flight of the flying body and the imaging unit in synchronization with the vertical synchronization signal of the imaging unit so that shooting is performed in the same state as the state of the flying body in the first lap.
The state of the flying body may include at least one of: the position of the flying body, the orientation of the flying body, and the angle of the gimbal supporting the imaging unit.
The step of generating the composite moving image may include the step of generating the composite moving image from a first moving image obtained in the first lap and a second moving image obtained in the second or a later lap.
The step of generating the composite moving image may include the steps of: comparing the first moving image with the second moving image for each image frame at the same relative time; and, based on the comparison result, applying motion compensation of the second moving image to the first moving image.
The motion compensation may include global motion compensation.
The step of generating the composite moving image may include the step of generating the composite moving image from statistical values of the same pixels in image frames at the same relative time in the first moving image and the second moving image.
The step of generating the composite moving image may include the steps of: comparing the first moving image with the second moving image for each image frame at the same relative time; extracting a feature region from the second moving image; and replacing the region in the first moving image corresponding to the feature region with the feature region of the second moving image.
The method may further include the steps of: obtaining the number of laps that the flying body has flown along the flight path; when the obtained number of laps is less than a threshold, outputting the moving image captured in the last lap; and when the obtained number of laps is greater than or equal to the threshold, outputting the composite moving image.
The step of capturing the plurality of moving images may include the steps of: evaluating the output composite moving image; when the evaluation result of the composite moving image satisfies a preset criterion, ending the flight and shooting of the flying body; and when the evaluation result does not satisfy the preset criterion, flying and shooting along the flight path for the next lap.
The step of evaluating the composite moving image may include the step of acquiring operation information indicating the evaluation result of the composite moving image.
The step of evaluating the composite moving image may include the steps of: performing image recognition on the composite moving image; and evaluating the composite moving image according to the result of the image recognition.
The image processing method may be executed by an image processing device. The image processing device may be the flying body.
In one aspect, a program causes an image processing device that processes a moving image captured by an imaging unit included in a flying body to execute the steps of: specifying a flight path along which the flying body flies; making the flying body circle along the flight path multiple times; causing the imaging unit included in the flying body to capture, through the multiple circling flights, a plurality of moving images having the same shooting range; and synthesizing the plurality of moving images captured through the multiple circling flights to generate a composite moving image.
In one aspect, there is provided a computer-readable recording medium on which a program is recorded, the program causing an image processing device that processes a moving image captured by an imaging unit included in a flying body to execute the steps of: specifying a flight path along which the flying body flies; making the flying body circle along the flight path multiple times; causing the imaging unit included in the flying body to capture, through the multiple circling flights, a plurality of moving images having the same shooting range; and synthesizing the plurality of moving images captured through the multiple circling flights to generate a composite moving image.
In addition, the above summary does not enumerate all the features of the present disclosure. Sub-combinations of these feature groups may also constitute inventions.
Brief Description of the Drawings
FIG. 1 is a schematic diagram showing a configuration example of a flying body system in an embodiment.
FIG. 2 is a diagram showing an example of the specific appearance of an unmanned aircraft.
FIG. 3 is a block diagram showing an example of the hardware configuration of the unmanned aircraft.
FIG. 4 is a block diagram showing an example of the hardware configuration of a terminal.
FIG. 5 is a diagram showing an example of the operation outline of the unmanned aircraft.
FIG. 6 is a flowchart showing an operation example of the unmanned aircraft.
FIG. 7 is a flowchart showing a first example of moving image synthesis.
FIG. 8 is a flowchart showing a second example of moving image synthesis.
FIG. 9 is a flowchart showing an output example of a moving image.
Description of reference numerals:
10 Flying body system
80 Terminal
81 Terminal control unit
83 Operation unit
85 Communication unit
87 Storage unit
88 Display unit
100 Unmanned aircraft
110 UAV control unit
150 Communication unit
160 Storage unit
200 Gimbal
210 Rotor mechanism
220 Imaging unit
240 GPS receiver
250 Inertial measurement unit
260 Magnetic compass
270 Barometric altimeter
280 Ultrasonic sensor
290 Laser rangefinder
Detailed Description
Hereinafter, the present disclosure will be described through embodiments of the present invention, but the following embodiments do not limit the invention recited in the claims. Not all combinations of the features described in the embodiments are essential to the solution of the invention.
The claims, the description, the drawings, and the abstract include matters subject to copyright protection. The copyright owner will not object to the reproduction of these documents by any person, as long as it is done as indicated in the files or records of the Patent Office. In all other cases, all copyrights are reserved.
In the following embodiments, an unmanned aerial vehicle (UAV) is taken as an example of the flying body. The image processing device is, for example, an unmanned aircraft, but it may also be another device (for example, a terminal, a transmitter, a server, or another image processing device). The image processing method specifies the operations of the image processing device. In addition, a program (for example, a program that causes the image processing device to execute various kinds of processing) is recorded on the recording medium.
The "unit" or "device" described in the following embodiments is not limited to a physical structure realized by hardware, and also includes one whose functions are realized by software such as a program. The functions of one component may be realized by two or more physical structures, and the functions of two or more components may be realized by, for example, one physical structure. In addition, "acquire" as used in the embodiments is not limited to the action of directly acquiring information or signals; it also includes, for example, acquisition by the processing unit through the communication unit (that is, reception) and acquisition from the storage unit (for example, a memory). These terms are understood and interpreted in the same way in the claims.
FIG. 1 is a schematic diagram showing a configuration example of the flying body system 10 in the embodiment. The flying body system 10 includes an unmanned aircraft 100 and a terminal 80. The unmanned aircraft 100 and the terminal 80 can communicate with each other through wired or wireless communication (for example, a wireless LAN (Local Area Network)). In FIG. 1, the terminal 80 is exemplified as a portable terminal (for example, a smartphone or a tablet terminal), but it may also be another terminal (for example, a PC (Personal Computer), or a transmitter (proportional controller) that can operate the unmanned aircraft 100 via control sticks).
FIG. 2 is a diagram showing an example of the specific appearance of the unmanned aircraft 100. FIG. 2 shows a perspective view of the unmanned aircraft 100 flying in the moving direction STV0. The unmanned aircraft 100 is an example of a moving body.
As shown in FIG. 2, the roll axis (see the x-axis) is set in a direction parallel to the ground and along the moving direction STV0. In this case, the pitch axis (see the y-axis) is set in a direction parallel to the ground and perpendicular to the roll axis, and the yaw axis (see the z-axis) is set in a direction perpendicular to the ground and perpendicular to the roll axis and the pitch axis.
The unmanned aircraft 100 includes a UAV main body 102, a gimbal 200, an imaging unit 220, and a plurality of imaging units 230.
The UAV main body 102 includes a plurality of rotors (propellers). The UAV main body 102 makes the unmanned aircraft 100 fly by controlling the rotation of the plurality of rotors. The UAV main body 102 uses, for example, four rotors to fly the unmanned aircraft 100. The number of rotors is not limited to four. In addition, the unmanned aircraft 100 may be a fixed-wing aircraft without rotors.
The imaging unit 220 is a photographing camera that photographs a subject included in a desired shooting range (for example, the sky above, scenery such as mountains and rivers, or buildings on the ground).
The plurality of imaging units 230 are sensing cameras that photograph the surroundings of the unmanned aircraft 100 in order to control its flight. Two imaging units 230 may be installed on the nose, that is, the front side, of the unmanned aircraft 100. The other two imaging units 230 may be installed on the bottom surface of the unmanned aircraft 100. The two imaging units 230 on the front side may be paired to function as a so-called stereo camera. The two imaging units 230 on the bottom side may also be paired to function as a stereo camera. Three-dimensional spatial data around the unmanned aircraft 100 may be generated based on the images captured by the plurality of imaging units 230. The number of imaging units 230 included in the unmanned aircraft 100 is not limited to four; the unmanned aircraft 100 only needs to include at least one imaging unit 230. The unmanned aircraft 100 may include at least one imaging unit 230 on each of the nose, tail, side surfaces, bottom surface, and top surface. The angle of view settable in the imaging unit 230 may be larger than the angle of view settable in the imaging unit 220. The imaging unit 230 may have a single-focus lens or a fisheye lens.
图3是示出无人驾驶航空器100的硬件构成的一个示例的框图。无人驾驶航空器100包括UAV控制部110、通信部150、存储部160、万向节200、旋翼机构210、摄像部220、摄像部230、GPS接收器240、惯性测量装置(IMU:Inertial Measurement Unit)250、磁罗盘260、气压高度计270、超声波传感器280、激光测定器290。FIG. 3 is a block diagram showing an example of the hardware configuration of unmanned aircraft 100. The unmanned aircraft 100 includes a UAV control unit 110, a communication unit 150, a storage unit 160, a universal joint 200, a rotor mechanism 210, a camera unit 220, a camera unit 230, a GPS receiver 240, and an inertial measurement unit (IMU: Inertial Measurement Unit). ) 250, magnetic compass 260, barometric altimeter 270, ultrasonic sensor 280, laser measuring device 290.
UAV控制部110例如由CPU(Central Processing Unit:中央处理器)、MPU(Micro Processing Unit:微处理器)或DSP(Digital Signal Processor:数字信号处理器)构成。UAV控制部110执行用于总体控制无人驾驶航空器100的各部分的动作的信号处理、与其它各部分之间的数据的输入输出处理、数据的运算处理以及数据的存储处理。The UAV control unit 110 is composed of, for example, a CPU (Central Processing Unit: Central Processing Unit), MPU (Micro Processing Unit: Microprocessor), or DSP (Digital Signal Processor: Digital Signal Processor). The UAV control unit 110 performs signal processing for overall control of the operations of each part of the unmanned aircraft 100, data input and output processing with other parts, data arithmetic processing, and data storage processing.
UAV控制部110可以根据存储在存储部160中的程序对无人驾驶航空器100的飞行进行控制。UAV控制部110可以按照来自终端80的操纵等对飞行控制的指示对飞行进行控制。UAV控制部110可以拍摄图像(例如动态图像、静止图像)(例如航拍)。The UAV control unit 110 can control the flight of the unmanned aircraft 100 according to a program stored in the storage unit 160. The UAV control unit 110 can control the flight in accordance with the flight control instructions from the terminal 80 or the like. The UAV control unit 110 can capture images (for example, moving images, still images) (for example, aerial photography).
UAV控制部110获取表示无人驾驶航空器100的位置的位置信息。UAV控制部110可以从GPS接收器240获取表示无人驾驶航空器100所在的纬度、经度以及高度的位置信息。UAV控制部110可以分别从GPS接收器240获取表示无人驾驶航空器100所在的纬度以及经度的纬度经度信息,并从气压高度计270获取表示无人驾驶航空器100所在的高度的高度信息,作为位置信息。UAV控制部110可以获取超声波传感器280产生的超声波放射点与超声波反射点之间的距离,作为高度信息。The UAV control unit 110 acquires position information indicating the position of the unmanned aircraft 100. The UAV control unit 110 can obtain position information indicating the latitude, longitude, and altitude where the unmanned aircraft 100 is located from the GPS receiver 240. The UAV control unit 110 can obtain the latitude and longitude information indicating the latitude and longitude of the unmanned aircraft 100 from the GPS receiver 240, and obtain the altitude information indicating the altitude of the unmanned aircraft 100 from the barometric altimeter 270 as position information. . The UAV control unit 110 may obtain the distance between the ultrasonic radiation point and the ultrasonic reflection point generated by the ultrasonic sensor 280 as height information.
UAV控制部110可以从磁罗盘260获取表示无人驾驶航空器100的朝向的朝向信息。朝向信息可以用例如与无人驾驶航空器100的机头的朝向相对应的方位来表示。The UAV control unit 110 can acquire the orientation information indicating the orientation of the unmanned aircraft 100 from the magnetic compass 260. The orientation information may be represented by, for example, an orientation corresponding to the orientation of the nose of the unmanned aircraft 100.
UAV控制部110可以获取表示在摄像部220对要拍摄的拍摄范围进行拍摄时无人驾驶航空器100应存在的位置的位置信息。UAV控制部110可以从存储部160获取表示无人驾驶航空器100应存在的位置的位置信息。UAV控制部110可以通过通信部150 从其他的装置获取表示无人驾驶航空器100应存在的位置的位置信息。UAV控制部110可以参照三维地图数据库来确定无人驾驶航空器100可能存在的位置,并获取该位置作为表示无人驾驶航空器100应存在的位置的位置信息。The UAV control unit 110 can acquire position information indicating the position where the unmanned aircraft 100 should exist when the imaging unit 220 captures the shooting range to be captured. The UAV control unit 110 may obtain position information indicating the position where the unmanned aircraft 100 should exist from the storage unit 160. The UAV control unit 110 can obtain the position information indicating the position where the unmanned aerial vehicle 100 should exist from other devices through the communication unit 150. The UAV control unit 110 may refer to the three-dimensional map database to determine the possible location of the unmanned aircraft 100, and obtain the location as the location information indicating the location where the unmanned aircraft 100 should exist.
UAV控制部110可以获取摄像部220及摄像部230的各自的拍摄范围。UAV控制部110可以从摄像部220和摄像部230获取表示摄像部220和摄像部230的视角的视角信息作为用于确定拍摄范围的参数。UAV控制部110可以获取表示摄像部220和摄像部230的拍摄方向的信息,作为用于确定拍摄范围的参数。UAV控制部110可以从万向节200获取表示摄像部220的姿势状态的姿势信息,作为例如表示摄像部220的拍摄方向的信息。摄像部220的姿势信息可以表示万向节200的从俯仰轴和偏航轴基准旋转角度旋转的角度。The UAV control unit 110 can acquire the respective imaging ranges of the imaging unit 220 and the imaging unit 230. The UAV control unit 110 may acquire the angle of view information representing the angle of view of the imaging unit 220 and the imaging unit 230 from the imaging unit 220 and the imaging unit 230 as a parameter for determining the imaging range. The UAV control unit 110 may acquire information indicating the shooting direction of the camera unit 220 and the camera unit 230 as a parameter for determining the shooting range. The UAV control unit 110 may obtain posture information indicating the posture state of the imaging unit 220 from the gimbal 200 as information indicating the imaging direction of the imaging unit 220, for example. The posture information of the imaging unit 220 may indicate the angle of rotation of the universal joint 200 from the pitch axis and the yaw axis reference rotation angle.
UAV控制部110可以获取表示无人驾驶航空器100所在位置的位置信息,作为用于确定拍摄范围的参数。UAV控制部110可以根据摄像部220及摄像部230的视角及拍摄方向以及无人驾驶航空器100所在位置,来限定表示摄像部220拍摄的地理范围的拍摄范围。The UAV control unit 110 may obtain position information indicating the location of the unmanned aircraft 100 as a parameter for determining the shooting range. The UAV control unit 110 may limit the imaging range representing the geographic range captured by the imaging unit 220 according to the angle of view and the imaging direction of the imaging unit 220 and the imaging unit 230, and the location of the unmanned aircraft 100.
The UAV control unit 110 may acquire the imaging range information from the storage unit 160, or via the communication unit 150.
The UAV control unit 110 controls the gimbal 200, the rotor mechanism 210, the imaging unit 220, and the imaging unit 230. The UAV control unit 110 may control the imaging range of the imaging unit 220 by changing its imaging direction or angle of view, and may control the imaging range of the imaging unit 220 supported by the gimbal 200 by controlling the rotation mechanism of the gimbal 200.
The imaging range refers to the geographic range captured by the imaging unit 220 or the imaging unit 230. The imaging range is defined by latitude, longitude, and altitude; it may be a range of three-dimensional spatial data defined by latitude, longitude, and altitude, or a range of two-dimensional spatial data defined by latitude and longitude. The imaging range may be determined based on the angle of view and imaging direction of the imaging unit 220 or 230 and the position of the unmanned aerial vehicle 100. The imaging directions of the imaging units 220 and 230 may be defined by the azimuth and depression angle of the front face on which the imaging lens is provided. The imaging direction of the imaging unit 220 may be a direction determined from the nose heading of the unmanned aerial vehicle 100 and the attitude state of the imaging unit 220 relative to the gimbal 200. The imaging direction of the imaging unit 230 may be a direction determined from the nose heading of the unmanned aerial vehicle 100 and the position where the imaging unit 230 is installed.
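As a rough illustration of how an imaging range on the ground can follow from altitude, depression angle, and angle of view, the sketch below computes the near and far ground distances of a camera footprint. It assumes flat ground and a pinhole camera; the function and parameter names are ours, not from the patent.

```python
import math

def footprint_distances(altitude_m, depression_deg, vfov_deg):
    """Estimate the near and far ground distances (meters, along the
    imaging direction) seen by a downward-tilted camera.

    `depression_deg` is the angle of the optical axis below horizontal;
    `vfov_deg` is the vertical angle of view. Flat-ground assumption.
    """
    half = vfov_deg / 2.0
    # Angles of the bottom and top edges of the view below horizontal.
    near_angle = math.radians(depression_deg + half)
    far_angle = math.radians(depression_deg - half)
    near = altitude_m / math.tan(near_angle)
    # If the top edge of the view reaches the horizon, the footprint
    # extends without bound; report infinity in that case.
    far = math.inf if far_angle <= 0 else altitude_m / math.tan(far_angle)
    return near, far

# At 100 m altitude, 45° depression, 30° vertical angle of view:
near, far = footprint_distances(altitude_m=100.0, depression_deg=45.0, vfov_deg=30.0)
```

A real implementation would intersect the view frustum with terrain data (for example, the three-dimensional map database mentioned above) rather than assuming flat ground.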
The UAV control unit 110 may determine the surrounding environment of the unmanned aerial vehicle 100 by analyzing multiple images captured by the multiple imaging units 230, and may control flight based on that environment, for example by avoiding obstacles.
The UAV control unit 110 may acquire stereoscopic information (three-dimensional information) indicating the three-dimensional shape of objects existing around the unmanned aerial vehicle 100. An object may be, for example, part of a landscape such as a building, road, vehicle, or tree. The stereoscopic information is, for example, three-dimensional spatial data. The UAV control unit 110 may acquire the stereoscopic information by generating it from the images captured by the multiple imaging units 230, by referring to a three-dimensional map database stored in the storage unit 160, or by referring to a three-dimensional map database managed by a server on a network.
The UAV control unit 110 controls the flight of the unmanned aerial vehicle 100 by controlling the rotor mechanism 210; that is, it controls the position of the unmanned aerial vehicle 100, including its latitude, longitude, and altitude. The UAV control unit 110 may control the imaging range of the imaging unit 220 by controlling the flight of the unmanned aerial vehicle 100. The UAV control unit 110 may control the angle of view of the imaging unit 220 by controlling a zoom lens included in the imaging unit 220, or by using the digital zoom function of the imaging unit 220.
When the imaging unit 220 is fixed to the unmanned aerial vehicle 100 and cannot be moved, the UAV control unit 110 can cause the imaging unit 220 to capture a desired imaging range under a desired environment by moving the unmanned aerial vehicle 100 to a specific position at a specific date and time. Likewise, even when the imaging unit 220 has no zoom function and its angle of view cannot be changed, the UAV control unit 110 can cause the imaging unit 220 to capture a desired imaging range under a desired environment by moving the unmanned aerial vehicle 100 to a specific position at a specific date and time.
The communication unit 150 communicates with the terminal 80. The communication unit 150 may perform wireless communication by any wireless communication method, or wired communication by any wired communication method. The communication unit 150 may transmit captured images, or additional information (metadata) about them, to the terminal 80.
The storage unit 160 may store various information, data, programs, and images. The images may include captured images or images based on captured images. The programs may include those the UAV control unit 110 needs in order to control the gimbal 200, the rotor mechanism 210, the imaging unit 220, the GPS receiver 240, the inertial measurement unit 250, the magnetic compass 260, the barometric altimeter 270, the ultrasonic sensor 280, and the laser rangefinder 290. The storage unit 160 may be a computer-readable recording medium. The storage unit 160 includes memory, and may include ROM (Read Only Memory), RAM (Random Access Memory), and the like. The storage unit 160 may include at least one of an HDD (Hard Disk Drive), an SSD (Solid State Drive), an SD card, a USB (Universal Serial Bus) memory, and other memories. At least part of the storage unit 160 may be detachable from the unmanned aerial vehicle 100.
The gimbal 200 may rotatably support the imaging unit 220 about the yaw, pitch, and roll axes. The gimbal 200 may change the imaging direction of the imaging unit 220 by rotating it about at least one of the yaw, pitch, and roll axes.
The rotor mechanism 210 includes a plurality of rotors and a plurality of drive motors that rotate them. The rotation of the rotor mechanism 210 is controlled by the UAV control unit 110, causing the unmanned aerial vehicle 100 to fly.
The imaging unit 220 captures a subject within the desired imaging range and generates captured image data. The captured image data may be stored in a memory of the imaging unit 220 or in the storage unit 160.
The imaging unit 230 captures the surroundings of the unmanned aerial vehicle 100 and generates captured image data. The image data of the imaging unit 230 may be stored in the storage unit 160.
The GPS receiver 240 receives a plurality of signals transmitted from a plurality of navigation satellites (GPS satellites), each indicating the time and the position (coordinates) of the transmitting satellite. The GPS receiver 240 calculates its own position (that is, the position of the unmanned aerial vehicle 100) from the received signals and outputs this position information to the UAV control unit 110. Alternatively, the UAV control unit 110 may perform this position calculation in place of the GPS receiver 240; in that case, the time and satellite position information contained in the signals received by the GPS receiver 240 is input to the UAV control unit 110.
The inertial measurement unit 250 detects the attitude of the unmanned aerial vehicle 100 and outputs the detection result to the UAV control unit 110. As the attitude, the inertial measurement unit 250 may detect accelerations along the three axes of the unmanned aerial vehicle 100 (front-rear, left-right, and up-down) and angular velocities about the three axes (pitch, roll, and yaw).
The magnetic compass 260 detects the nose heading of the unmanned aerial vehicle 100 and outputs the detection result to the UAV control unit 110.
The barometric altimeter 270 detects the flight altitude of the unmanned aerial vehicle 100 and outputs the detection result to the UAV control unit 110.
The ultrasonic sensor 280 emits ultrasonic waves, detects the waves reflected by the ground or by objects, and outputs the detection result to the UAV control unit 110. The detection result may indicate the distance from the unmanned aerial vehicle 100 to the ground, that is, its altitude, or the distance from the unmanned aerial vehicle 100 to an object (subject).
The laser rangefinder 290 irradiates an object with laser light, receives the light reflected by the object, and measures the distance between the unmanned aerial vehicle 100 and the object (subject) from the reflected light. One example of a laser-based distance measurement method is the time-of-flight method.
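The time-of-flight principle mentioned above reduces to a one-line calculation: the pulse travels to the object and back, so the one-way distance is half the round-trip time multiplied by the speed of light. A minimal sketch (names are ours):

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_s):
    """Distance from a time-of-flight measurement: the laser pulse
    travels to the object and back, so the one-way distance is half
    the round-trip time multiplied by the speed of light."""
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

# A pulse returning after about 667 nanoseconds corresponds to roughly 100 m.
d = tof_distance_m(667e-9)
```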
FIG. 4 is a block diagram showing an example of the hardware configuration of the terminal 80. The terminal 80 includes a terminal control unit 81, an operation unit 83, a communication unit 85, a storage unit 87, and a display unit 88. The terminal 80 may be held by a user who wishes to direct flight control of the unmanned aerial vehicle 100, and may issue flight control instructions to the unmanned aerial vehicle 100.
The terminal control unit 81 is configured using, for example, a CPU, MPU, or DSP. The terminal control unit 81 performs signal processing for overall control of the operation of each part of the terminal 80, input/output of data to and from the other parts, arithmetic processing of data, and storage of data.
The terminal control unit 81 may acquire data and information from the unmanned aerial vehicle 100 via the communication unit 85, data and information input via the operation unit 83, and data and information stored in the storage unit 87. The terminal control unit 81 may transmit data and information to the unmanned aerial vehicle 100 via the communication unit 85, and may send data and information to the display unit 88 to have it display information based on them. The information displayed on the display unit 88 and the information transmitted to the unmanned aerial vehicle 100 via the communication unit 85 may include the flight path of the unmanned aerial vehicle 100, imaging positions, captured images, and images based on captured images (for example, composite images).
The operation unit 83 receives and acquires data and information input by the user of the terminal 80. The operation unit 83 may include input devices such as buttons, keys, a touch panel, and a microphone. A touch panel may be formed by the operation unit 83 and the display unit 88; in this case, the operation unit 83 can accept touch, tap, and drag operations, among others.
The communication unit 85 performs wireless communication with the unmanned aerial vehicle 100 by various wireless communication methods, which may include, for example, communication via a wireless LAN or a public wireless network. The communication unit 85 may also perform wired communication by any wired communication method.
The storage unit 87 may store various information, data, programs, and images. The programs may include application programs executed by the terminal 80. The storage unit 87 may be a computer-readable recording medium, and may include ROM, RAM, and the like. The storage unit 87 may include at least one of an HDD, an SSD, an SD card, a USB memory, and other memories. At least part of the storage unit 87 may be detachable from the terminal 80.
The storage unit 87 may store captured images acquired from the unmanned aerial vehicle 100 or images based on them, and may store the additional information of such images.
The display unit 88 is configured using, for example, an LCD (Liquid Crystal Display), and displays various information and data output from the terminal control unit 81. For example, the display unit 88 may display captured images or images based on captured images, as well as various data and information related to the execution of application programs.
The operation of the unmanned aerial vehicle 100 will be described below.
FIG. 5 is a diagram showing an example of the outline of the operation of the unmanned aerial vehicle 100.
The UAV control unit 110 specifies a flight path RT, and acquires the imaging range for capturing a moving image during flight along the flight path RT.
This imaging range is determined by the state of the unmanned aerial vehicle 100. The state of the unmanned aerial vehicle 100 may include at least one of: the position of the unmanned aerial vehicle 100 at the time of imaging, the orientation of the unmanned aerial vehicle 100 (for example, its nose heading), and the angle (rotation angle) of the gimbal 200 supporting the imaging unit 220. The state may also include other state information of the unmanned aerial vehicle 100 (for example, flight information or imaging information). For example, the UAV control unit 110 may acquire the position of the imaging unit 220 by GPS, or may acquire the position information of the unmanned aerial vehicle 100 with high precision by RTK (Real Time Kinematic GPS). The imaging range may be generated by the UAV control unit 110 from the positional relationship between the flight position along the flight path RT and the subject to be captured. The imaging range may be stored in and obtained from the storage unit 160, or obtained from an external server via the communication unit 150.
The UAV control unit 110 causes the unmanned aerial vehicle 100 to fly along the acquired flight path RT. During this flight, the imaging unit 220 captures a moving image by imaging the acquired imaging range.
The unmanned aerial vehicle 100 flies along the same flight path RT multiple times and captures a moving image (video) on each pass. A moving image consists of an image sequence having multiple image frames, for example 30 image frames per second (30 fps) or 60 image frames per second (60 fps). The UAV control unit 110 causes the unmanned aerial vehicle 100 to fly along the same flight path RT multiple times and causes the imaging unit 220 to capture moving images of the same imaging range multiple times.
As shown in FIG. 5, on the first lap of the flight path RT, the UAV control unit 110 acquires, in time-series order, a first image frame gf11, a second image frame gf12, a third image frame gf13, a fourth image frame gf14, and so on from the imaging unit 220. On the second lap, it acquires a first image frame gf21, a second image frame gf22, a third image frame gf23, a fourth image frame gf24, and so on. On the third lap, it acquires a first image frame gf31, a second image frame gf32, a third image frame gf33, a fourth image frame gf34, and so on. In FIG. 5, the X-th image frame is abbreviated as the X-th frame.
Image frames captured at the same relative time (time-series position) on each lap capture the same imaging range. For example, the first image frames gf11, gf21, and gf31, captured at the same relative time t1 on their respective laps, correspond to the same imaging range; the same holds for the second image frames gf12, gf22, and gf32 at relative time t2, the third image frames gf13, gf23, and gf33 at relative time t3, and the fourth image frames gf14, gf24, and gf34 at relative time t4. When the imaging range is the same, the state of the unmanned aerial vehicle 100 is the same. The unmanned aerial vehicle 100 can thereby acquire multiple image frames captured at the same position and, by repeatedly flying and imaging along the flight path RT, can achieve continuous shooting frame by frame.
The UAV control unit 110 composites the multiple image frames captured at the same relative time on each lap, obtaining one composite image frame per relative time. For example, the three first image frames gf11, gf21, and gf31 are composited to generate a first composite image frame; likewise, a second composite image frame is generated from the second image frames, and so on. The UAV control unit 110 then generates a composite moving image that includes the composite image frames in time-series order.
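The per-relative-time compositing described above can be sketched as follows. The sketch groups frames by time-series position and averages pixel values; averaging is only one possible composition method, which the text does not fix, and the flat single-channel frame representation is a simplification of ours.

```python
def composite_frames(laps):
    """Composite per-lap frame sequences into one sequence by averaging
    the pixels of frames that share the same time-series position.

    `laps` is a list of laps; each lap is a list of frames; each frame
    is a flat list of grayscale pixel values (illustrative only).
    """
    n_laps = len(laps)
    composite = []
    # zip(*laps) groups frames by relative time: (gf11, gf21, gf31), ...
    for frames_at_t in zip(*laps):
        averaged = [sum(px) / n_laps for px in zip(*frames_at_t)]
        composite.append(averaged)
    return composite

# Three laps of two tiny two-pixel frames each.
lap1 = [[10, 20], [30, 40]]
lap2 = [[20, 30], [40, 50]]
lap3 = [[30, 40], [50, 60]]
video = composite_frames([lap1, lap2, lap3])  # two composite frames
```

Averaging frames of the same imaging range is a common way to reduce sensor noise, which is consistent with the motion-correction context cited in the background.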
In addition, when capturing an image frame, the UAV control unit 110 may store the state information of the unmanned aerial vehicle 100. The time at which an image frame is captured, that is, the time at which the state information of the unmanned aerial vehicle 100 is acquired, may be synchronized with the vertical synchronization signal (VSYNC signal) of the imaging unit 220. The state of the unmanned aerial vehicle 100 may be saved at least during the first lap. Thus, on the second and subsequent laps, the unmanned aerial vehicle 100 can follow its state from the first lap and continue to capture moving images whose image frames have the same imaging range.
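One way to picture the per-frame state log described above is a record appended on every VSYNC pulse. All field and function names below are our assumptions for illustration, not identifiers from the patent.

```python
from dataclasses import dataclass

@dataclass
class UavState:
    """Illustrative record of the state saved for each captured frame."""
    lap: int
    frame_index: int
    position: tuple      # (latitude, longitude, altitude)
    heading_deg: float   # nose heading
    gimbal_deg: tuple    # (pitch, yaw, roll) rotation of the gimbal

state_log = []

def on_vsync(state: UavState):
    """Called once per VSYNC pulse: record the state sampled at the
    moment the frame was captured, so later laps can replay it."""
    state_log.append(state)

# First frame of the first lap.
on_vsync(UavState(1, 0, (35.0, 139.0, 100.0), 90.0, (-45.0, 0.0, 0.0)))
```

On later laps, the controller would read back the entry with the same `frame_index` and steer the vehicle and gimbal toward that recorded state.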
FIG. 6 is a flowchart showing an example of the operation of the unmanned aerial vehicle 100.
First, the UAV control unit 110 specifies a flight path RT (S11). For example, the flight path RT may be specified in advance by the user via the operation unit 83 of the terminal 80 and obtained via the communication units 85 and 150. The flight path RT may be generated and specified by the UAV control unit 110 so that one or more desired subjects can be captured. The flight path RT may be stored in advance in the storage unit 160 and obtained from it, or obtained from an external server via the communication unit 150. The flight path RT is, for example, a flight path along which a desired subject can be captured. On the first lap, the UAV control unit 110 may also specify the flight path RT according to manual operation (piloting) via the operation unit 83 of the terminal 80.
The UAV control unit 110 causes the imaging unit 220 to start imaging along the flight path RT in response to a predetermined imaging start trigger. The imaging start trigger may include receiving an imaging start instruction from the terminal 80 via the communication unit 150, or detecting that a predetermined imaging start time has been reached. The imaging start instruction may include, for example, selection of the video composition mode as the imaging mode via the operation unit 83 of the terminal 80.
The UAV control unit 110 stores in the storage unit 160 the state of the unmanned aerial vehicle 100 at the time imaging of the moving image along the flight path RT starts (S12). The UAV control unit 110 may also acquire the state of the unmanned aerial vehicle 100 at the start of imaging as instructed by the terminal 80 via the communication unit 150, or may determine that state according to the desired subject. The imaging range captured by the imaging unit 220 is determined according to the state of the unmanned aerial vehicle 100.
The UAV control unit 110 captures a moving image along the flight path RT (S13). The UAV control unit 110 controls the flight of the unmanned aerial vehicle 100 so that it flies along the flight path RT on each lap, and acquires the image frames of the moving image on each lap. The UAV control unit 110 composites the moving images captured on the laps to generate a composite moving image (S14); moving image composition is described in detail later. The UAV control unit 110 outputs moving images such as the composite moving image (S15); the output of moving images is also described in detail later. During the flight and imaging of each lap (for example, at the start of imaging of each lap), the UAV control unit 110 may store in the storage unit 160 information indicating which lap is in progress. Similarly to S12, in S13 the UAV control unit 110 may also save the state of the unmanned aerial vehicle 100 at least when acquiring each image frame of the first lap. Thus, on the second and subsequent laps of the flight path RT, the unmanned aerial vehicle 100 can fly and capture images in the same state as on the first lap.
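The overall control flow of steps S11 through S18 can be sketched as a loop, with callables standing in for the individual steps (all parameter names are ours, and the lap limit is an assumption):

```python
def fly_and_composite(specify_path, fly_lap, composite, evaluate, max_laps=10):
    """Sketch of the FIG. 6 flow: specify the path (S11), fly laps and
    capture frames (S13), composite (S14), and repeat until the output
    satisfies the preset criterion (S17) or a lap limit is reached."""
    path = specify_path()                 # S11: flight path RT
    laps = []
    output = None
    for lap in range(max_laps):
        laps.append(fly_lap(path, lap))   # S13: one lap of image frames
        output = composite(laps)          # S14/S15: composite moving image
        if evaluate(output):              # S16/S17: preset criterion met?
            return output
    return output                         # best effort after max_laps

# Usage with trivial stand-ins: each lap yields one "frame" and the
# criterion is satisfied once three laps have been gathered.
result = fly_and_composite(
    specify_path=lambda: "RT",
    fly_lap=lambda path, lap: [lap],
    composite=lambda laps: [f for lap in laps for f in lap],
    evaluate=lambda out: len(out) >= 3,
)
```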
The UAV control unit 110 evaluates the output moving image (S16). The UAV control unit 110 may evaluate the output moving image when imaging of the moving image on a lap ends. For example, the UAV control unit 110 may determine that imaging of the moving image has ended when flight and imaging along the predetermined flight path RT are complete. The UAV control unit 110 may also determine that imaging has ended when, on the first lap, the unmanned aerial vehicle 100 is piloted via the terminal 80 and that piloting ends, that is, when an operation indicating the end of piloting is performed on the operation unit 83 and the unmanned aerial vehicle 100 is notified via the communication unit 85.
The UAV control unit 110 determines whether the evaluation result of the output moving image satisfies a preset criterion (S17). The preset criterion may be a subjective criterion of the user or an objective criterion.
When the preset criterion is the user's subjective criterion, the UAV control unit 110 may transmit the output moving image to the terminal 80 via the communication unit 150; the terminal control unit 81 of the terminal 80 receives it via the communication unit 85 and displays it on the display unit 88. The user may then review the displayed output moving image and subjectively determine whether it satisfies the preset criterion. In this case, when the criterion is satisfied, the terminal control unit 81 may acquire operation information indicating that the criterion is satisfied via the operation unit 83 and transmit it to the unmanned aerial vehicle 100 via the communication unit 85; when the criterion is not satisfied, the terminal control unit 81 may likewise acquire and transmit operation information indicating that it is not satisfied. That is, the user may input the evaluation result manually.
When the preset criterion is an objective criterion, the UAV control unit 110 may perform image recognition (for example, pattern recognition) on the output moving image and evaluate it according to the recognition result. In this case, the preset criterion may be, for example, a criterion based on the pixel values of the pixels of each image frame of the output moving image.
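As one concrete example of a pixel-value-based objective criterion, the sketch below checks that every frame's mean brightness stays inside a usable exposure band. This is only one of many possible criteria; the thresholds and names are arbitrary placeholders of ours.

```python
def meets_pixel_criterion(frames, min_mean=40.0, max_mean=215.0):
    """One possible objective criterion based on pixel values: every
    frame's mean brightness must lie inside [min_mean, max_mean]
    (8-bit grayscale assumed; thresholds are illustrative)."""
    for frame in frames:
        mean = sum(frame) / len(frame)
        if not (min_mean <= mean <= max_mean):
            return False
    return True

ok = meets_pixel_criterion([[100, 120], [90, 110]])    # well exposed
bad = meets_pixel_criterion([[0, 5], [90, 110]])       # first frame too dark
```

A production criterion would more likely use noise or sharpness statistics on the composite frames, but the structure, per-frame measurement compared against a threshold, would be the same.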
When the output moving image satisfies the preset criterion (YES in S17), the UAV control unit 110 ends the processing of FIG. 6, and ends flight and imaging along the flight path RT.
另一方面，当输出动态图像不满足预设基准时(S17的“否”)，UAV控制部110进入下一环绕的飞行及拍摄(S18)。在此情况下，UAV控制部110从存储部160获取拍摄开始时的无人驾驶航空器100的状态信息，并且将其设定为飞行路径RT的开始地点处的无人驾驶航空器100的状态(S18)。由此，UAV控制部110移动到下一次环绕的飞行路径RT的拍摄开始的位置，并且将拍摄开始时的摄像部220设为能够对所期望的拍摄范围进行拍摄的状态。On the other hand, when the output moving image does not satisfy the preset criterion (No in S17), the UAV control unit 110 proceeds to the flight and shooting of the next lap (S18). In this case, the UAV control unit 110 acquires the state information of the unmanned aircraft 100 at the start of shooting from the storage unit 160, and sets it as the state of the unmanned aircraft 100 at the starting point of the flight path RT (S18). As a result, the unmanned aircraft 100 moves to the position where shooting of the next lap of the flight path RT starts, and the imaging unit 220 is put into a state capable of capturing the desired shooting range at the start of shooting.
此外,作为评估对象的动态图像也可以仅限于输出动态图像中的合成动态图像。例如,即使第一圈的基准动态图像并未得到评估,也不会影响到合成动态图像的质量,可缩短图6的处理时间。In addition, the moving image to be evaluated may be limited to the synthesized moving image among the output moving images. For example, even if the reference dynamic image of the first circle has not been evaluated, the quality of the synthesized dynamic image will not be affected, and the processing time of FIG. 6 can be shortened.
无人驾驶航空器100至少反复进行N次沿着飞行路径RT的飞行及拍摄。“N”是大于等于2的任意数值，例如假设为生成的合成动态图像的质量高于规定的质量的环绕次数。当输出动态图像的评估结果不满足预设基准时，在N次以后，也能够继续沿着飞行路径RT进行飞行及拍摄。N的值例如可以通过终端80的操作部83由用户进行指定，也可以适当地确定为任意数值。另外，UAV控制部110也可以根据拍摄的场景或拍摄范围来确定N的值。The unmanned aircraft 100 repeats flight and shooting along the flight path RT at least N times. "N" is an arbitrary number greater than or equal to 2; for example, it is assumed to be the number of laps at which the quality of the generated composite moving image exceeds a predetermined quality. When the evaluation result of the output moving image does not satisfy the preset criterion, flight and shooting along the flight path RT can also continue beyond N laps. The value of N may be designated by the user through the operation unit 83 of the terminal 80, for example, or may be appropriately set to an arbitrary value. In addition, the UAV control unit 110 may determine the value of N according to the shooting scene or the shooting range.
这样，无人驾驶航空器100(图像处理装置的一个示例)对由无人驾驶航空器100(飞行体的一个示例)所包括的摄像部220拍摄的动态图像进行处理。UAV控制部110(处理部的一个示例)可以指定无人驾驶航空器100飞行的飞行路径RT。UAV控制部110可以使无人驾驶航空器100沿着飞行路径RT环绕飞行多次。UAV控制部110可以通过多次环绕飞行使摄像部220拍摄具有相同拍摄范围的多个动态图像。UAV控制部110可以对通过多次环绕飞行所拍摄的多个动态图像进行合成，生成合成动态图像。In this way, the unmanned aircraft 100 (an example of an image processing device) processes the moving images captured by the imaging unit 220 included in the unmanned aircraft 100 (an example of a flying body). The UAV control unit 110 (an example of a processing unit) can specify the flight path RT along which the unmanned aircraft 100 flies. The UAV control unit 110 can make the unmanned aircraft 100 fly around the flight path RT multiple times, cause the imaging unit 220 to capture a plurality of moving images having the same shooting range over those multiple circling flights, and synthesize the plurality of moving images captured in the multiple circling flights to generate a composite moving image.
无人驾驶航空器100很难在飞行中一边停留在一处一边拍摄动态图像。因此，在进行动态图像的拍摄时，很难在同一拍摄范围内进行连续拍摄，而且很难对相同的拍摄范围内的图像进行合成。对此，无人驾驶航空器100在进行动态图像的拍摄时，并不会停留在一处，而是通过在指定的飞行路径RT上进行多次环绕飞行，从而能够随着时间的变化对相同的拍摄范围进行拍摄。因此，无人驾驶航空器100在相同的拍摄范围内，即可以对较大范围的拍摄范围进行固定，得到具有与各个拍摄范围相对应的多个图像帧的多个动态图像。从而，无人驾驶航空器100通过对该多个动态图像进行合成并生成合成动态图像，从而能够得到各种有益的拍摄效果(例如Temporal Denoise、HDR(High Dynamic Range，高动态范围))。即，无人驾驶航空器100可以得到长时间曝光的拍摄效果，提高SNR(Signal to Noise Ratio)，减少噪声，并扩大动态范围。It is difficult for the unmanned aircraft 100 to stay in one place while capturing a moving image in flight. Therefore, when shooting a moving image, it is difficult to shoot the same shooting range continuously, and hence difficult to synthesize images of the same shooting range. In this regard, the unmanned aircraft 100 does not stay in one place when shooting moving images; instead, by circling the designated flight path RT multiple times, it can shoot the same shooting range repeatedly over time. The unmanned aircraft 100 can therefore fix a comparatively wide shooting range and obtain a plurality of moving images having a plurality of image frames corresponding to the respective shooting ranges. By synthesizing these moving images to generate a composite moving image, the unmanned aircraft 100 can obtain various beneficial shooting effects (for example, temporal denoising and HDR (High Dynamic Range)). That is, the unmanned aircraft 100 can obtain a long-exposure shooting effect, improve the SNR (Signal to Noise Ratio), reduce noise, and expand the dynamic range.
另外，动态图像可以具有多个图像帧。UAV控制部110可以控制无人驾驶航空器100，以使多个动态图像中的每个相同相对时间的图像帧具有相同的拍摄范围。In addition, the moving image may have a plurality of image frames. The UAV control unit 110 may control the unmanned aircraft 100 so that image frames at the same relative time in each of the plurality of moving images have the same shooting range.
由此，无人驾驶航空器100通过各个动态图像中的每个相同相对时间的图像帧的各帧得到具有相同拍摄范围的图像，从而，作为动态图像整体，能够在较大范围内得到拍摄范围相同的多个图像帧。As a result, the unmanned aircraft 100 obtains, for each image frame at the same relative time in each moving image, an image having the same shooting range, and thus, for the moving images as a whole, can obtain a plurality of image frames with identical shooting ranges over a wide range.
另外，UAV控制部110可以在第一圈飞行路径RT的飞行中，与摄像部220的垂直同步信号(VSYNC信号)同步地获取无人驾驶航空器100的状态。UAV控制部110可以在第二圈以后的飞行路径RT的飞行中，与摄像部220的垂直同步信号同步地控制无人驾驶航空器100的飞行及摄像部220，使其以与第一圈中的无人驾驶航空器100的状态相同的状态进行拍摄。In addition, the UAV control unit 110 may acquire the state of the unmanned aircraft 100 in synchronization with the vertical synchronization signal (VSYNC signal) of the imaging unit 220 during the flight of the first lap of the flight path RT. During the flight of the second and subsequent laps of the flight path RT, the UAV control unit 110 may control the flight of the unmanned aircraft 100 and the imaging unit 220 in synchronization with the vertical synchronization signal of the imaging unit 220, so that shooting is performed in the same state of the unmanned aircraft 100 as in the first lap.
由此，无人驾驶航空器100通过与摄像部220的垂直同步信号同步，从而每获取一个图像帧，即可获取无人驾驶航空器100的状态。无人驾驶航空器100通过对第一圈无人驾驶航空器100的飞行方式及拍摄方式进行存储，而将以后的环绕中的飞行方式及拍摄方式设为与第一圈相同，从而能够较容易地在大范围内固定无人驾驶航空器100的状态所对应的拍摄范围，得到多个动态图像。As a result, by synchronizing with the vertical synchronization signal of the imaging unit 220, the unmanned aircraft 100 can acquire its own state every time an image frame is acquired. By storing the flight and shooting behavior of the unmanned aircraft 100 in the first lap and making the flight and shooting behavior of subsequent laps the same as in the first lap, the unmanned aircraft 100 can comparatively easily fix, over a wide range, the shooting range corresponding to the state of the unmanned aircraft 100, and obtain a plurality of moving images.
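The record-and-replay of the flying body's state described above can be sketched as follows (hypothetical class and field names; a real controller would close the loop on these setpoints rather than simply look them up):

```python
from dataclasses import dataclass

@dataclass
class UavState:
    """State captured at one VSYNC tick (i.e., one image frame)."""
    position: tuple      # (latitude, longitude, altitude) — assumed layout
    heading: float       # UAV yaw, degrees
    gimbal_angle: tuple  # (pitch, roll, yaw) of the gimbal, degrees

class LapRecorder:
    """Records per-frame UAV state on lap 1 and replays it on later laps."""
    def __init__(self):
        self._states = []  # indexed by frame number (relative time)

    def on_vsync_lap1(self, state: UavState):
        # Lap 1: store the state for every frame, in VSYNC order (S22).
        self._states.append(state)

    def setpoint_for_frame(self, frame_index: int) -> UavState:
        # Laps 2+: the stored state becomes the control target so the
        # same frame index yields the same shooting range.
        return self._states[frame_index]
```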
另外,无人驾驶航空器100的状态可以包括无人驾驶航空器100的位置、无人驾驶航空器100的朝向、支撑摄像部220的万向节200的角度等信息中的至少一个。In addition, the state of the unmanned aircraft 100 may include at least one of the position of the unmanned aircraft 100, the orientation of the unmanned aircraft 100, and the angle of the universal joint 200 supporting the camera 220.
由此，无人驾驶航空器100例如通过将无人驾驶航空器100的状态保存在存储部160中，并且在之后的时间点从存储部160获取无人驾驶航空器100的状态并进行设定，从而能够获取由过去的摄像部220拍摄的拍摄范围的图像帧。Thus, for example, by saving the state of the unmanned aircraft 100 in the storage unit 160 and acquiring and setting that state from the storage unit 160 at a later point in time, the unmanned aircraft 100 can acquire image frames of the shooting range previously captured by the imaging unit 220.
另外,当合成动态图像的评估满足预设基准时,UAV控制部110可以结束无人驾驶航空器100的飞行及拍摄的控制。当合成动态图像的评估不满足预设基准时,UAV控制部110可以沿着下一次环绕的飞行路径RT进行飞行及拍摄的控制。In addition, when the evaluation of the synthetic dynamic image satisfies the preset criterion, the UAV control unit 110 may end the control of the flight and shooting of the unmanned aircraft 100. When the evaluation of the synthetic dynamic image does not meet the preset criterion, the UAV control unit 110 may perform flight and shooting control along the next flight path RT to be circled.
由此,在合成动态图像的评估达到预设基准之前,无人驾驶航空器100能够持续地在飞行路径RT上进行拍摄。因此,可以期待提高无人驾驶航空器100的合成动态图像的质量。As a result, the unmanned aircraft 100 can continuously shoot on the flight path RT until the evaluation of the synthetic dynamic image reaches the preset reference. Therefore, it is expected that the quality of the synthesized moving image of unmanned aircraft 100 will be improved.
另外，UAV控制部110可以获取表示合成动态图像的评估结果的操作信息。该操作信息可以从终端80获得。由此，可以由用户主观地对合成动态图像进行评估，并且可以确定是否获取作为合成动态图像基础的更多图像。In addition, the UAV control unit 110 may acquire operation information indicating the evaluation result of the composite moving image. The operation information can be obtained from the terminal 80. In this way, the user can subjectively evaluate the composite moving image, and it can be determined whether to acquire more images on which the composite moving image is based.
另外,UAV控制部110可以针对合成动态图像进行图像识别。UAV控制部110可以根据图像识别的结果对合成动态图像进行评估。由此,无人驾驶航空器100可以通过图像识别客观地对合成动态图像进行评估,并且可以确定是否在飞行路径RT上再次飞行并继续获取作为合成动态图像基础的图像帧。In addition, the UAV control unit 110 may perform image recognition for the composite moving image. The UAV control unit 110 may evaluate the synthesized dynamic image based on the result of the image recognition. As a result, the unmanned aircraft 100 can objectively evaluate the synthetic dynamic image through image recognition, and can determine whether to fly again on the flight path RT and continue to acquire image frames that are the basis of the synthetic dynamic image.
另外,涉及上述飞行控制和拍摄控制以及合成动态图像的处理可以主要由无人驾驶航空器100进行。在此情况下,可以通过一个装置进行各项控制及各项处理,能够实施高效的处理,缩短处理时间。另外,无需与无人驾驶航空器100分开准备用于进行这些处理的装置。此外,涉及上述拍摄控制和合成动态图像的处理也可以主要由其他的装置(例如终端80、发送器)进行。In addition, the processing related to the above-mentioned flight control and shooting control, and synthesis of moving images may be mainly performed by the unmanned aircraft 100. In this case, various controls and various processes can be performed by one device, which can implement efficient processing and shorten the processing time. In addition, there is no need to prepare a device for performing these processes separately from the unmanned aircraft 100. In addition, the processing related to the above-mentioned shooting control and synthesis of moving images may also be mainly performed by other devices (for example, the terminal 80 and the transmitter).
图7是示出动态图像的合成的第一示例的流程图。动态图像的合成处理相当于图6的S14。在图7中,假设获取了在图6的S13中任意一次环绕的动态图像。Fig. 7 is a flowchart showing a first example of composition of moving images. The composite processing of the moving image corresponds to S14 in FIG. 6. In FIG. 7, it is assumed that a moving image that surrounds at any one time in S13 of FIG. 6 is acquired.
UAV控制部110对得到的动态图像是否为飞行路径RT的第一圈得到的动态图像进行判定(S21)。例如,UAV控制部110通过参照存储部160,从而能够判别出当前的飞行路径RT的飞行为第几圈。UAV控制部110可以从存储部160获取表示当前的飞行路径RT的飞行为第几圈的信息。The UAV control unit 110 determines whether the obtained moving image is the moving image obtained in the first circle of the flight path RT (S21). For example, by referring to the storage unit 160, the UAV control unit 110 can discriminate which lap of the current flight path RT is. The UAV control unit 110 may obtain information indicating the number of laps of the current flight path RT from the storage unit 160.
当得到的动态图像是第一圈飞行路径RT的动态图像时，UAV控制部110将得到的动态图像的各个图像帧作为基准动态图像的各个图像帧而存储在存储部160(S22)中。另外，在第一圈中，获取图像帧时，即与摄像部220的垂直同步信号同步地将飞行体的状态信息存储在存储部160中。由此，能够掌握拍摄图像瞬间的无人驾驶航空器100的状态。另外，UAV控制部110也将得到的动态图像的各个图像帧作为计算用动态图像的各个图像帧进行存储(S23)。When the obtained moving image is the moving image of the first lap of the flight path RT, the UAV control unit 110 stores each image frame of the obtained moving image in the storage unit 160 as an image frame of the reference moving image (S22). In addition, in the first lap, each time an image frame is acquired, i.e., in synchronization with the vertical synchronization signal of the imaging unit 220, the state information of the flying body is stored in the storage unit 160. This makes it possible to grasp the state of the unmanned aircraft 100 at the moment each image was captured. The UAV control unit 110 also stores each image frame of the obtained moving image as an image frame of the calculation moving image (S23).
另一方面,当得到的动态图像是第二圈以后的飞行路径RT的动态图像时,UAV控制部110将得到的动态图像的各个图像帧与基准动态图像的对应的各个图像帧进行比较,并且计算全局运动向量(S24)。对应的图像帧是指相同相对时间的图像帧。全局运动是指表示在多个时间点上且基于无人驾驶航空器100的飞行移动及无人驾驶航空器100的状态(姿势)变化的运动信息。全局运动由运动向量(全局运动向量)表示。On the other hand, when the obtained moving image is a moving image of the flight path RT after the second lap, the UAV control unit 110 compares each image frame of the obtained moving image with each corresponding image frame of the reference moving image, and Calculate the global motion vector (S24). Corresponding image frames refer to image frames at the same relative time. The global motion refers to motion information representing changes in the state (posture) of the unmanned aircraft 100 and the flight movement of the unmanned aircraft 100 at multiple time points. The global motion is represented by a motion vector (global motion vector).
UAV控制部110根据算出的全局运动向量对全局运动进行修正,即进行全局运动补偿(S25)。例如,在全局运动补偿中,由于可以通过仿射变换来表现整个图像帧的运动,并且以图像帧为单位进行运动补偿,因此编码效率和补偿效率较高。此外,UAV控制部110也可以在各圈时的相同相对时间的图像帧之间,实施全局运动补偿以外的帧间预测和运动补偿。此外,也可以省略S24、25的运动补偿的相关处理。The UAV control unit 110 corrects the global motion based on the calculated global motion vector, that is, performs global motion compensation (S25). For example, in global motion compensation, since the motion of the entire image frame can be expressed through affine transformation, and the motion compensation is performed in units of the image frame, the coding efficiency and the compensation efficiency are high. In addition, the UAV control unit 110 may also implement inter-frame prediction and motion compensation other than global motion compensation between image frames at the same relative time in each circle. In addition, the processing related to motion compensation in S24 and 25 may be omitted.
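A minimal stand-in for the global motion estimation and compensation of S24–S25 is sketched below. For simplicity it estimates a pure integer translation by brute-force search rather than the full affine model mentioned above; the function names are assumptions:

```python
import numpy as np

def estimate_global_motion(ref: np.ndarray, cur: np.ndarray, max_shift: int = 4):
    """Find the integer shift (dy, dx) that best aligns cur to the reference
    frame ref, by exhaustive search over small shifts (translation-only
    simplification of the global motion vector of S24)."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(cur, (dy, dx), axis=(0, 1))
            err = np.mean((shifted.astype(float) - ref.astype(float)) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

def compensate(cur: np.ndarray, motion) -> np.ndarray:
    """Apply the estimated global motion to cur (S25). A full implementation
    would warp with an affine transform instead of a wrap-around shift."""
    return np.roll(cur, motion, axis=(0, 1))
```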
UAV控制部110将得到的动态图像的各个图像帧与计算用动态图像中对应的各个图像帧相加(S26)。在此情况下,可以将实施了全局运动补偿的动态图像的各个帧的各个像素的值与计算用动态图像中对应的各个图像帧的各个像素的值相加。The UAV control unit 110 adds each image frame of the obtained moving image to each corresponding image frame of the calculation moving image (S26). In this case, the value of each pixel of each frame of the moving image subjected to global motion compensation may be added to the value of each pixel of each corresponding image frame in the moving image for calculation.
例如，在S21中当得到第二圈动态图像时，UAV控制部110将作为计算用动态图像的第一圈的动态图像的第一图像帧gf11的各个像素的像素值与第二圈的动态图像的第一图像帧gf21的各个像素的像素值相加，计算出新的计算用动态图像中的第一图像帧。例如，在S21中当得到第三圈的动态图像时，UAV控制部110将第一圈的动态图像与第二圈的动态图像相加后所得的计算用动态图像的第一图像帧的各个像素的像素值与第三圈动态图像的第一图像帧gf31的各个像素的像素值相加，生成新的计算用动态图像中的第一帧。此外，针对第三圈以后的动态图像也进行同样的相加。此外，针对第二图像帧以后的图像帧也同样。For example, when the moving image of the second lap is obtained in S21, the UAV control unit 110 adds the pixel value of each pixel of the first image frame gf11 of the first-lap moving image, which serves as the calculation moving image, to the pixel value of each pixel of the first image frame gf21 of the second-lap moving image, and calculates the first image frame of the new calculation moving image. When the moving image of the third lap is obtained in S21, the UAV control unit 110 adds the pixel value of each pixel of the first image frame of the calculation moving image, obtained by adding the first-lap and second-lap moving images, to the pixel value of each pixel of the first image frame gf31 of the third-lap moving image, generating the first frame of the new calculation moving image. The same addition is performed for the moving images of the third and subsequent laps, and likewise for the second and subsequent image frames.
UAV控制部110对所计算出的计算用动态图像的各个图像帧的平均值进行计算(S27)。在此情况下,UAV控制部110可以计算计算用动态图像的各个图像帧的各个像素的像素值的平均值。UAV控制部110生成具有计算出了平均值的各个图像帧的合成动态图像(S27)。由此,当飞行路径RT的飞行为第二圈以后的飞行时,无人驾驶航空器100可以在拍摄动态图像的同时输出(例如发送、显示)合成动态图像。The UAV control unit 110 calculates the average value of each image frame of the calculated moving image for calculation (S27). In this case, the UAV control unit 110 may calculate the average value of the pixel value of each pixel of each image frame of the moving image for calculation. The UAV control unit 110 generates a composite moving image having each image frame whose average value is calculated (S27). Thus, when the flight of the flight path RT is the flight after the second lap, the unmanned aircraft 100 can output (for example, transmit, display) the synthesized moving image while capturing the moving image.
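The accumulate-and-average pipeline of S23, S26, and S27 can be sketched as follows (hypothetical class name; per-frame running sums are kept in floating point so that the average can be taken without overflow):

```python
import numpy as np

class RunningComposite:
    """Keeps the 'calculation moving image' as per-frame running sums
    (S23/S26) and yields averaged composite frames on demand (S27)."""
    def __init__(self):
        self._sums = {}  # frame index -> float accumulator
        self._laps = 0

    def add_lap(self, frames):
        # Add each (already motion-compensated) frame of this lap to the
        # accumulator for the frame of the same relative time.
        self._laps += 1
        for i, frame in enumerate(frames):
            acc = self._sums.get(i)
            self._sums[i] = frame.astype(np.float64) if acc is None else acc + frame

    def composite(self):
        # Per-pixel average over laps — the temporal-denoise effect above.
        return [self._sums[i] / self._laps for i in sorted(self._sums)]
```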
这样,UAV控制部110可以根据第一圈得到的第一动态图像(例如基准动态图像)与第二圈以后得到的第二动态图像生成合成动态图像。由此,无人驾驶航空器100能够以第一圈动态图像作为基准,生成对多个环绕的动态图像进行了合成的合成动态图像。In this way, the UAV control unit 110 can generate a composite moving image based on the first moving image (for example, the reference moving image) obtained in the first lap and the second moving image obtained after the second lap. In this way, unmanned aircraft 100 can generate a composite moving image in which a plurality of surrounding moving images are synthesized using the first-circle moving image as a reference.
另外,UAV控制部110可以针对每个相同相对时间的图像帧,将第一动态图像与第二动态图像进行比较,根据比较结果,对第一动态图像进行第二动态图像的运动补偿。In addition, the UAV control unit 110 may compare the first moving image with the second moving image for each image frame of the same relative time, and perform the motion compensation of the second moving image on the first moving image based on the comparison result.
由此,无人驾驶航空器100在第一圈与第二圈以后的相同相对时间的图像帧中,能够进行运动补偿。因此,能够提高在多个动态图像中每个相同相对时间的图像帧的图像范围的一致性。该图像范围与拍摄范围相对应。因此,例如即使无人驾驶航空器100的飞行环境并不良好,也能够减少在各个动态图像中的多个图像帧之间的位置偏移,从而提高合成动态图像的画质。As a result, unmanned aircraft 100 can perform motion compensation in image frames of the same relative time after the first lap and the second lap. Therefore, it is possible to improve the uniformity of the image range of each image frame at the same relative time in a plurality of moving images. The image range corresponds to the shooting range. Therefore, for example, even if the flying environment of the unmanned aircraft 100 is not good, it is possible to reduce the positional deviation between a plurality of image frames in each moving image, thereby improving the image quality of the composite moving image.
另外,运动补偿可以包括全局运动补偿。由此,无人驾驶航空器100能够提高动态图像的压缩编码的编码效率及运动补偿的效率。In addition, motion compensation may include global motion compensation. As a result, unmanned aircraft 100 can improve the coding efficiency of compression coding of moving images and the efficiency of motion compensation.
另外，UAV控制部110可以根据第一动态图像及第二动态图像中的相同相对时间的图像帧的相同像素的统计值来生成合成动态图像。当无人驾驶航空器100在飞行的同时拍摄动态图像时，很难获取拍摄范围相同的图像帧。对此，无人驾驶航空器100能够在相同的飞行路径RT上环绕并得到相同相对时间的多个图像帧。另外，无人驾驶航空器100通过得到多个图像帧的统计值(例如平均值)，使得即使多少包含一些画质较低的图像帧，也可以改善图像帧的画质并得到动态图像。In addition, the UAV control unit 110 may generate the composite moving image based on statistical values of the same pixels in the image frames of the same relative time in the first moving image and the second moving image. When the unmanned aircraft 100 captures moving images while flying, it is difficult to obtain image frames with the same shooting range. In this regard, the unmanned aircraft 100 can circle the same flight path RT and obtain multiple image frames at the same relative time. Furthermore, by obtaining statistical values (for example, average values) of the plurality of image frames, even if some image frames of lower image quality are included, the image quality of the frames can be improved and a moving image obtained.
图8是示出动态图像的合成的第二示例的流程图。在图8中,关于与图7相同的处理,标注相同的步骤编号,并且省略或者简化其说明。Fig. 8 is a flowchart showing a second example of composition of moving images. In FIG. 8, with regard to the same processing as in FIG. 7, the same step numbers are assigned, and the description thereof is omitted or simplified.
首先,无人驾驶航空器100进行与图7的S21、S22、S24、S25相同的处理。First, unmanned aircraft 100 performs the same processing as S21, S22, S24, and S25 in FIG. 7.
然后，UAV控制部110提取所得到的动态图像的图像帧中的特征区域(S26A)。特征区域是基于客观基准或者用户主观判断而进行提取。特征区域例如可以是在该环绕中具有价值的特征的区域。例如，UAV控制部110可以提取所得到的动态图像与基准动态图像中的相同相对时间的图像帧之间的差异区域作为特征区域。例如，UAV控制部110可以提取所得到的动态图像的图像帧中的预定被摄体存在的区域作为特征区域。例如，UAV控制部110可以针对得到的动态图像的图像帧，通过终端80的操作部83提取用户指定的区域作为特征区域。特征区域的提取是针对得到的动态图像中的各个图像帧而实施的。Then, the UAV control unit 110 extracts a feature region in an image frame of the obtained moving image (S26A). The feature region is extracted based on an objective criterion or the user's subjective judgment. The feature region may be, for example, a region having a feature of value in that lap. For example, the UAV control unit 110 may extract, as the feature region, the difference region between image frames of the same relative time in the obtained moving image and the reference moving image. For example, the UAV control unit 110 may extract, as the feature region, a region of an image frame of the obtained moving image in which a predetermined subject exists. For example, the UAV control unit 110 may extract, as the feature region, a region of an image frame of the obtained moving image designated by the user through the operation unit 83 of the terminal 80. The extraction of the feature region is performed for each image frame in the obtained moving image.
UAV控制部110用提取的特征区域替换在得到的动态图像的各个图像帧中提取的特征区域所对应的基准动态图像的各个图像帧的区域(特征对应区域)(S27A)。在此情况下,UAV控制部110可以将特征对应区域中的各个像素的像素值替换为提取的特征区域中的各个像素的像素值。UAV控制部110生成具有基准动态图像中的特征对应区域被所得到的动态图像中的特征区域替换了的各个图像帧的合成动态图像(S27A)。The UAV control unit 110 replaces the area (feature corresponding area) of each image frame of the reference moving image corresponding to the characteristic area extracted in each image frame of the obtained moving image with the extracted characteristic area (S27A). In this case, the UAV control unit 110 may replace the pixel value of each pixel in the feature corresponding area with the pixel value of each pixel in the extracted feature area. The UAV control unit 110 generates a composite moving image having each image frame in which the characteristic corresponding area in the reference moving image is replaced with the characteristic area in the obtained moving image (S27A).
这样，UAV控制部110可以针对每个相同相对时间的图像帧，对第一动态图像与第二动态图像进行比较，针对第二动态图像提取特征区域，用第二动态图像中的特征区域替换第一动态图像中的与特征区域相对应的区域(特征对应区域)。In this way, the UAV control unit 110 can compare the first moving image with the second moving image for each image frame of the same relative time, extract a feature region from the second moving image, and replace the region of the first moving image corresponding to the feature region (the feature corresponding region) with the feature region in the second moving image.
由此，无人驾驶航空器100通过用其他的动态图像中的相同相对时间的图像帧的一部分替换第一动态图像中画质较低的部分或并非用户所期望的状态的部分，从而改善第一动态图像的画质并得到合成动态图像。例如，当对作为被摄体的任意的塔或建筑物进行拍摄时，有时在第一动态图像的图像帧中会在塔或建筑物的周围存在许多游客。即使在此情况下，在第二动态图像中的相同相对时间的图像帧中不存在游客时，无人驾驶航空器100提取该部分作为特征区域，来替换第一动态图像中的图像帧的特征对应区域。由此，无人驾驶航空器100能够得到包含排除了游客的塔或建筑物的合成动态图像。As a result, by replacing a part of the first moving image that has low image quality or is not in the state desired by the user with the corresponding part of an image frame of the same relative time in another moving image, the unmanned aircraft 100 can improve the image quality of the first moving image and obtain a composite moving image. For example, when an arbitrary tower or building is photographed as a subject, many tourists may be present around the tower or building in an image frame of the first moving image. Even in this case, when no tourists are present in the image frame of the same relative time in the second moving image, the unmanned aircraft 100 extracts that part as the feature region and replaces the feature corresponding region of the image frame in the first moving image with it. In this way, the unmanned aircraft 100 can obtain a composite moving image containing the tower or building with the tourists excluded.
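A minimal sketch of the feature-region replacement of S26A–S27A, using the difference-region criterion described above (hypothetical function name and threshold):

```python
import numpy as np

def replace_feature_region(base_frame, other_frame, threshold=30):
    """Replace pixels of the reference frame (base_frame) with those of the
    same-relative-time frame from another lap (other_frame) wherever the two
    differ strongly — a simple difference-based stand-in for the feature
    region extraction of S26A. The threshold is an illustrative assumption."""
    diff = np.abs(base_frame.astype(int) - other_frame.astype(int))
    mask = diff > threshold          # feature region = strong-difference pixels
    out = base_frame.copy()
    out[mask] = other_frame[mask]    # S27A: substitute the feature region
    return out
```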
图9是示出动态图像的输出例的流程图。动态图像的输出处理相当于图6的S15。在图9中,假设获取了在图6的S13中任意一次环绕的动态图像。Fig. 9 is a flowchart showing an output example of a moving image. The output processing of the moving image corresponds to S15 in FIG. 6. In FIG. 9, it is assumed that a moving image that surrounds at any one time in S13 of FIG. 6 is acquired.
UAV控制部110对得到的动态图像是否是第N圈以后的动态图像进行判定(S31)。当得到的动态图像为第N圈之前的环绕的动态图像时，UAV控制部110输出最后一次环绕的动态图像(S32)。在此情况下，UAV控制部110可以输出由摄像部220实时拍摄的动态图像，而不是合成动态图像。当得到的动态图像为第N圈环绕以后的动态图像时，UAV控制部110输出生成的合成动态图像(S33)。The UAV control unit 110 determines whether the obtained moving image is the moving image of the N-th or a later lap (S31). When the obtained moving image is from a lap before the N-th, the UAV control unit 110 outputs the moving image of the last lap (S32). In this case, the UAV control unit 110 may output the moving image captured in real time by the imaging unit 220, instead of the composite moving image. When the obtained moving image is from the N-th or a later lap, the UAV control unit 110 outputs the generated composite moving image (S33).
UAV控制部110可以通过通信部150向其他的装置(例如终端80)发送动态图像来作为动态图像的输出。UAV控制部110也可以在其他的装置(例如终端80)上显示动态图像来作为动态图像的输出。在此情况下，终端80的终端控制部81可以通过通信部85接收动态图像，并且通过显示部88显示动态图像。另外，UAV控制部110可以将动态图像存储在存储部160或其他的记录介质(例如外部记录介质)中来作为动态图像的输出。The UAV control unit 110 may transmit the moving image to another device (for example, the terminal 80) through the communication unit 150 as the output of the moving image. The UAV control unit 110 may also display the moving image on another device (for example, the terminal 80) as the output of the moving image. In this case, the terminal control unit 81 of the terminal 80 can receive the moving image through the communication unit 85 and display it through the display unit 88. In addition, the UAV control unit 110 may store the moving image in the storage unit 160 or another recording medium (for example, an external recording medium) as the output of the moving image.
这样,UAV控制部110可以获取无人驾驶航空器100的飞行路径RT的飞行的环绕次数。当获取的环绕次数小于阈值(例如N次)时,UAV控制部110可以输出在最后一次环绕中拍摄的动态图像。当获取的环绕次数大于等于阈值时,UAV控制部110可以输出合成动态图像。In this way, the UAV control unit 110 can obtain the number of turns of the flight path RT of the unmanned aircraft 100. When the acquired number of surrounds is less than a threshold value (for example, N times), the UAV control section 110 may output the dynamic image captured in the last surround. When the acquired number of surrounds is greater than or equal to the threshold, the UAV control section 110 may output the synthesized dynamic image.
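The lap-count gating of S31–S33 reduces to a simple selection, sketched below with hypothetical names:

```python
def select_output(lap_count, last_lap_video, composite_video, threshold_n):
    """Before N laps, output the latest raw lap's moving image; once the
    lap count reaches the threshold N, output the composite moving image."""
    return composite_video if lap_count >= threshold_n else last_lap_video
```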
由此，无人驾驶航空器100在假设合成动态图像的画质不充分的环绕次数中，合成动态图像中会出现不需要的伪像。因此，在此情况下，无人驾驶航空器100通过提供未合成的最后一次环绕的动态图像，可以抑制合成动态图像的输出，并且提供最新的动态图像。另外，无人驾驶航空器100在大于等于阈值的次数进行飞行并拍摄动态图像时，有时需要较长时间。即使在此情况下也能够输出一些动态图像，并且用户可以进行确认。另一方面，无人驾驶航空器100在假设合成动态图像的画质充分的环绕次数中，合成动态图像的画质稳定。在此情况下，期待无人驾驶航空器100通过提供合成动态图像，从而能够提供与各圈时的动态图像相比画质得到改进的动态图像。At a lap count for which the image quality of the composite moving image is assumed to be insufficient, unwanted artifacts may appear in the composite moving image. Therefore, in this case, by providing the unsynthesized moving image of the last lap, the unmanned aircraft 100 can suppress the output of the composite moving image and provide the latest moving image instead. In addition, flying the threshold number of laps or more while capturing moving images may take a long time; even in this case, some moving image can be output and confirmed by the user. On the other hand, at a lap count for which the image quality of the composite moving image is assumed to be sufficient, the image quality of the composite moving image is stable. In this case, by providing the composite moving image, the unmanned aircraft 100 is expected to provide a moving image whose image quality is improved compared with the moving image of each individual lap.
此外,图9所示的动态图像的输出例是一个示例,也可以是其他的输出方法。例如,UAV控制部110也可以不依赖于环绕次数来输出合成动态图像,而与得到的动态图像是第几圈的动态图像无关。In addition, the output example of the moving image shown in FIG. 9 is an example, and other output methods may be used. For example, the UAV control unit 110 may output the composite moving image independently of the number of laps, regardless of the number of laps of the obtained moving image.
以上使用实施方式对本公开进行了说明,但是本公开的技术范围并不限于上述实施方式所描述的范围。对本领域普通技术人员来说,显然可对上述实施方式加以各种变更或改良。从权利要求书的记载即可明白,加以了这样的变更或改良的方式都可包含在本公开的技术范围之内。The present disclosure has been described above using the embodiment, but the technical scope of the present disclosure is not limited to the scope described in the above embodiment. It is obvious to a person of ordinary skill in the art that various changes or improvements can be made to the above-mentioned embodiments. It can be understood from the description of the claims that all such changes or improvements can be included in the technical scope of the present disclosure.
权利要求书、说明书以及说明书附图中所示的装置、***、程序和方法中的动作、顺序、步骤、以及阶段等各项处理的执行顺序,只要没有特别明示“在…之前”、“事先”等,且只要前面处理的输出并不用在后面的处理中,即可以以任意顺序实现。关于权利要求书、说明书以及附图中的操作流程,为方便起见而使用“首先”、“接着”等进行了说明,但并不意味着必须按照这样的顺序实施。The execution order of the actions, sequences, steps, and stages in the devices, systems, programs, and methods shown in the claims, specifications, and drawings of the specification, as long as there is no special indication that "before" or "in advance" "Etc., and as long as the output of the previous processing is not used in the subsequent processing, it can be implemented in any order. Regarding the operation flow in the claims, the specification and the drawings, the description is made using "first", "next", etc. for convenience, but it does not mean that it must be implemented in this order.
在上述实施方式中,对飞行体飞行时的多个动态图像的拍摄及合成进行了说明,但并不仅限于飞行体,也可以在其他的移动体(例如车辆、船舶)中应用上述实施方式。在此情况下,例如,通过将飞行这一表述替换为移动,也可以将上述实施方式应用到移动体移动时的多个动态图像的拍摄及合成中。In the above-mentioned embodiment, the shooting and synthesis of a plurality of moving images during the flight of the flying body have been described, but it is not limited to the flying body, and the above-mentioned embodiment may be applied to other moving bodies (for example, vehicles and ships). In this case, for example, by replacing the expression of flying with movement, the above-described embodiment can also be applied to the shooting and synthesis of a plurality of moving images when the moving body is moving.

Claims (30)

  1. 一种对由飞行体所包括的摄像部拍摄的动态图像进行处理的图像处理装置,其特征在于,An image processing device for processing moving images captured by an imaging unit included in a flying body, characterized in that:
    包括处理部，comprising a processing unit,
    所述处理部指定所述飞行体飞行的飞行路径;The processing unit designates a flight path of the flying body;
    使所述飞行体沿着所述飞行路径环绕飞行多次;Making the flying body circulate along the flight path for multiple times;
    通过多次环绕飞行使所述飞行体所包括的摄像部拍摄具有相同拍摄范围的多个动态图像;Enabling the camera included in the flying body to take a plurality of dynamic images with the same shooting range through multiple circumnavigation flights;
    对通过多次环绕飞行所拍摄的多个动态图像进行合成,生成合成动态图像。Combine multiple dynamic images taken through multiple circumnavigation flights to generate a composite dynamic image.
  2. The image processing device according to claim 1, wherein the moving image has a plurality of image frames in time-series order, and
    the processing unit controls the flying body so that the image frames at the same relative time in each of the plurality of moving images have the same imaging range.
  3. The image processing device according to claim 2, wherein
    during flight of the first circuit of the flight path, the processing unit acquires the state of the flying body in synchronization with a vertical synchronization signal of the imaging unit; and
    during flight of the second and subsequent circuits of the flight path, the processing unit controls the flight of the flying body and the imaging unit in synchronization with the vertical synchronization signal of the imaging unit so that imaging is performed with the flying body in the same state as in the first circuit.
  4. The image processing device according to claim 3, wherein the state of the flying body includes at least one of the position of the flying body, the orientation of the flying body, and the angle of a gimbal supporting the imaging unit.
  5. The image processing device according to any one of claims 2 to 4, wherein the processing unit generates the composite moving image from a first moving image obtained in the first circuit and a second moving image obtained in the second or a subsequent circuit.
  6. The image processing device according to claim 5, wherein, for each of the image frames at the same relative time, the processing unit
    compares the first moving image with the second moving image; and
    based on the result of the comparison, performs motion compensation of the second moving image on the first moving image.
  7. The image processing device according to claim 6, wherein the motion compensation includes global motion compensation.
  8. The image processing device according to any one of claims 5 to 7, wherein the processing unit generates the composite moving image based on statistical values of identical pixels in the image frames at the same relative time in the first moving image and the second moving image.
  9. The image processing device according to any one of claims 5 to 7, wherein, for each of the image frames at the same relative time, the processing unit
    compares the first moving image with the second moving image;
    extracts a feature region from the second moving image; and
    replaces a region in the first moving image corresponding to the feature region with the feature region in the second moving image.
  10. The image processing device according to any one of claims 5 to 9, wherein
    the processing unit acquires the number of circuits the flying body has flown around the flight path;
    when the acquired number of circuits is less than a threshold value, outputs the moving image captured in the most recent circuit; and
    when the acquired number of circuits is greater than or equal to the threshold value, outputs the composite moving image.
  11. The image processing device according to any one of claims 1 to 10, wherein
    the processing unit evaluates the output composite moving image;
    when the evaluation result of the composite moving image satisfies a preset criterion, ends the flight and imaging of the flying body; and
    when the evaluation result of the composite moving image does not satisfy the preset criterion, performs flight and imaging along the flight path for the next circuit.
  12. The image processing device according to claim 11, wherein the processing unit acquires operation information indicating the evaluation result of the composite moving image.
  13. The image processing device according to claim 11, wherein
    the processing unit performs image recognition on the composite moving image; and
    evaluates the composite moving image based on the result of the image recognition.
  14. The image processing device according to any one of claims 1 to 13, wherein the image processing device is the flying body.
  15. An image processing method for processing moving images captured by an imaging unit included in a flying body, comprising the steps of:
    designating a flight path along which the flying body flies;
    causing the flying body to fly around the flight path a plurality of circuits;
    causing the imaging unit included in the flying body to capture, over the plurality of circuits, a plurality of moving images having the same imaging range; and
    compositing the plurality of moving images captured over the plurality of circuits to generate a composite moving image.
  16. The image processing method according to claim 15, wherein the moving image has a plurality of image frames in time-series order, and
    the step of capturing the plurality of moving images includes the step of controlling the flying body so that the image frames at the same relative time in each of the plurality of moving images have the same imaging range.
  17. The image processing method according to claim 16, wherein the step of capturing the plurality of moving images includes the steps of:
    during flight of the first circuit of the flight path, acquiring the state of the flying body in synchronization with a vertical synchronization signal of the imaging unit; and
    during flight of the second and subsequent circuits of the flight path, controlling the flight of the flying body and the imaging unit in synchronization with the vertical synchronization signal of the imaging unit so that imaging is performed with the flying body in the same state as in the first circuit.
  18. The image processing method according to claim 17, wherein the state of the flying body includes at least one of the position of the flying body, the orientation of the flying body, and the angle of a gimbal supporting the imaging unit.
  19. The image processing method according to any one of claims 16 to 18, wherein the step of generating the composite moving image includes the step of generating the composite moving image from a first moving image obtained in the first circuit and a second moving image obtained in the second or a subsequent circuit.
  20. The image processing method according to claim 19, wherein the step of generating the composite moving image includes the steps of:
    for each of the image frames at the same relative time,
    comparing the first moving image with the second moving image; and
    based on the result of the comparison, performing motion compensation of the second moving image on the first moving image.
  21. The image processing method according to claim 20, wherein the motion compensation includes global motion compensation.
  22. The image processing method according to any one of claims 19 to 21, wherein the step of generating the composite moving image includes the step of generating the composite moving image based on statistical values of identical pixels in the image frames at the same relative time in the first moving image and the second moving image.
  23. The image processing method according to any one of claims 19 to 21, wherein the step of generating the composite moving image includes the steps of:
    for each of the image frames at the same relative time,
    comparing the first moving image with the second moving image;
    extracting a feature region from the second moving image; and
    replacing a region in the first moving image corresponding to the feature region with the feature region in the second moving image.
  24. The image processing method according to any one of claims 19 to 23, further comprising the steps of:
    acquiring the number of circuits the flying body has flown around the flight path;
    when the acquired number of circuits is less than a threshold value, outputting the moving image captured in the most recent circuit; and
    when the acquired number of circuits is greater than or equal to the threshold value, outputting the composite moving image.
  25. The image processing method according to any one of claims 15 to 24, wherein the step of capturing the plurality of moving images includes the steps of:
    evaluating the output composite moving image;
    when the evaluation result of the composite moving image satisfies a preset criterion, ending the flight and imaging of the flying body; and
    when the evaluation result of the composite moving image does not satisfy the preset criterion, performing flight and imaging along the flight path for the next circuit.
  26. The image processing method according to claim 25, wherein the step of evaluating the composite moving image includes the step of acquiring operation information indicating the evaluation result of the composite moving image.
  27. The image processing method according to claim 25, wherein the step of evaluating the composite moving image includes the steps of:
    performing image recognition on the composite moving image; and
    evaluating the composite moving image based on the result of the image recognition.
  28. The image processing method according to any one of claims 15 to 27, wherein the image processing method is executed by an image processing device, and
    the image processing device is the flying body.
  29. A program for causing an image processing device that processes moving images captured by an imaging unit included in a flying body to execute the steps of:
    designating a flight path along which the flying body flies;
    causing the flying body to fly around the flight path a plurality of circuits;
    causing the imaging unit included in the flying body to capture, over the plurality of circuits, a plurality of moving images having the same imaging range; and
    compositing the plurality of moving images captured over the plurality of circuits to generate a composite moving image.
  30. A computer-readable recording medium on which is recorded a program for causing an image processing device that processes moving images captured by an imaging unit included in a flying body to execute the steps of:
    designating a flight path along which the flying body flies;
    causing the flying body to fly around the flight path a plurality of circuits;
    causing the imaging unit included in the flying body to capture, over the plurality of circuits, a plurality of moving images having the same imaging range; and
    compositing the plurality of moving images captured over the plurality of circuits to generate a composite moving image.
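Claims 7 and 21 name global motion compensation without specifying an algorithm. For a purely translational misalignment between corresponding frames of the first and second laps, phase correlation is one common choice; the sketch below is an assumption-laden stand-in, not the claimed method. It estimates the integer-pixel shift from the normalized cross-power spectrum and undoes it with a circular shift.

```python
import numpy as np

def estimate_global_shift(ref, img):
    """Estimate the integer (dy, dx) shift taking `ref` to `img` by phase correlation."""
    R = np.fft.fft2(img) * np.conj(np.fft.fft2(ref))
    R /= np.abs(R) + 1e-12                      # normalized cross-power spectrum
    corr = np.fft.ifft2(R).real                 # delta-like peak at the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape
    if dy > h // 2:                             # wrap to signed shifts
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

def compensate(ref, img):
    """Undo the estimated global shift (circular shift; borders are approximate)."""
    dy, dx = estimate_global_shift(ref, img)
    return np.roll(img, (-dy, -dx), axis=(0, 1))

rng = np.random.default_rng(1)
frame1 = rng.uniform(size=(64, 64))             # frame from the first lap
frame2 = np.roll(frame1, (5, -3), axis=(0, 1))  # same content, globally shifted

dy, dx = estimate_global_shift(frame1, frame2)
aligned = compensate(frame1, frame2)
```

For real footage, sub-pixel refinement and a non-circular warp (e.g., an affine model) would be needed; the circular `np.roll` is only exact for this synthetic wrap-around example.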
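The feature-region replacement of claims 9 and 23 reduces to masked copying once a feature region has been extracted. How the region is extracted is left open by the claims; the sketch below uses a crude stand-in (per-pixel difference thresholding) purely for illustration.

```python
import numpy as np

def replace_feature_region(first, second, thresh=0.1):
    """Replace pixels of `first` with those of `second` inside the feature
    region; here the region is simply where the frames differ by > thresh."""
    mask = np.abs(second - first) > thresh      # crude stand-in for feature extraction
    out = first.copy()
    out[mask] = second[mask]
    return out, mask

first = np.zeros((8, 8))                        # frame from the first lap
second = np.zeros((8, 8))
second[2:5, 3:6] = 1.0                          # a patch only the second lap captured

merged, mask = replace_feature_region(first, second)
```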
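The output selection of claims 10/24 and the stop criterion of claims 11/25 are plain control flow. A hypothetical sketch follows; the function names, the stand-in compositing, and the scoring function are all invented for illustration.

```python
def select_output(lap_count, threshold, last_video, composite_video):
    """Claims 10/24: below the lap threshold, output the video from the most
    recent circuit; at or above it, output the composite video."""
    return last_video if lap_count < threshold else composite_video

def fly_until_acceptable(fly_one_lap, evaluate, criterion, max_laps=10):
    """Claims 11/25: fly and capture circuit after circuit, recomputing the
    composite, until its evaluation score meets the preset criterion."""
    videos = []
    for lap in range(1, max_laps + 1):
        videos.append(fly_one_lap(lap))         # capture one more circuit
        composite = sum(videos) / len(videos)   # stand-in for real compositing
        if evaluate(composite) >= criterion:    # preset criterion satisfied
            return composite, lap
    return composite, max_laps

# Toy run: later laps raise the (fake) quality score until it passes 2.0.
composite, laps = fly_until_acceptable(
    fly_one_lap=float, evaluate=lambda c: c, criterion=2.0)
```

In the claims the evaluation may instead come from operator feedback (claims 12/26) or from image recognition (claims 13/27); only the `evaluate` callback would change.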
PCT/CN2020/133589 2019-12-09 2020-12-03 Image processing device, image processing method, program and recording medium WO2021115192A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202080074343.6A CN114586335A (en) 2019-12-09 2020-12-03 Image processing apparatus, image processing method, program, and recording medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019222092A JP6997164B2 (en) 2019-12-09 2019-12-09 Image processing equipment, image processing methods, programs, and recording media
JP2019-222092 2019-12-09

Publications (1)

Publication Number Publication Date
WO2021115192A1 true WO2021115192A1 (en) 2021-06-17

Family

ID=76311106

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/133589 WO2021115192A1 (en) 2019-12-09 2020-12-03 Image processing device, image processing method, program and recording medium

Country Status (3)

JP (1) JP6997164B2 (en)
CN (1) CN114586335A (en)
WO (1) WO2021115192A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2024006072A (en) * 2022-06-30 2024-01-17 本田技研工業株式会社 Image processing device, image processing method, image processing system, and program

Citations (4)

Publication number Priority date Publication date Assignee Title
US20070122139A1 (en) * 2005-11-29 2007-05-31 Seiko Epson Corporation Controller, photographing equipment, control method of photographing equipment, and control program
CN102210136A (en) * 2009-09-16 2011-10-05 索尼公司 Device, method, and program for processing image
CN109246355A (en) * 2018-09-19 2019-01-18 北京云迹科技有限公司 The method, apparatus and robot of panoramic picture are generated using robot
CN109952755A (en) * 2016-10-17 2019-06-28 深圳市大疆创新科技有限公司 Flight path generation method, flight path generate system, flying body, program and recording medium

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
JP2008186145A (en) 2007-01-29 2008-08-14 Mitsubishi Electric Corp Aerial image processing apparatus and aerial image processing method
JP2011087183A (en) * 2009-10-16 2011-04-28 Olympus Imaging Corp Imaging apparatus, image processing apparatus, and program
JP2014185947A (en) 2013-03-25 2014-10-02 Geo Technical Laboratory Co Ltd Image photographing method for three-dimensional restoration
JP7021900B2 (en) 2017-10-24 2022-02-17 M-Solutions株式会社 Image provision method
CN108419023B (en) * 2018-03-26 2020-09-08 华为技术有限公司 Method for generating high dynamic range image and related equipment


Also Published As

Publication number Publication date
CN114586335A (en) 2022-06-03
JP6997164B2 (en) 2022-01-17
JP2021093592A (en) 2021-06-17


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 20897999
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 20897999
    Country of ref document: EP
    Kind code of ref document: A1