WO2020108290A1 - Image generating device, image generating method, program and recording medium - Google Patents

Image generating device, image generating method, program and recording medium

Info

Publication number: WO2020108290A1
Authority: WO (WIPO (PCT))
Prior art keywords: dimensional model, distance, captured, camera, dimensional
Application number: PCT/CN2019/117466
Other languages: French (fr), Chinese (zh)
Inventors: 陈斌, 沈思杰
Original Assignee: 深圳市大疆创新科技有限公司
Application filed by 深圳市大疆创新科技有限公司
Priority to CN201980009014.0A (published as CN111656760A)
Publication of WO2020108290A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/245 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures using a plurality of fixed, simultaneously operating transducers
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00 - General purpose image data processing
    • G06T 17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/05 - Geographic models
    • G06T 7/00 - Image analysis
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 - Control of cameras or camera modules
    • H04N 5/00 - Details of television systems
    • H04N 5/222 - Studio circuitry; Studio devices; Studio equipment

Definitions

  • the present disclosure relates to an image generation device, an image generation method, a program, and a recording medium that generate a composite image based on a plurality of captured images captured by a flying object.
  • a platform (unmanned aerial vehicle) that performs shooting while passing along a predetermined fixed path is known.
  • this platform receives an imaging instruction from a ground base and shoots the imaging target.
  • this platform shoots the subject while flying along the fixed path, tilting its imaging device based on the positional relationship between the platform and the subject (refer to Patent Document 1).
  • Patent Document 1: Japanese Patent Application Publication No. 2010-61216
  • an image generation device according to one aspect generates a composite image based on a plurality of captured images captured by a flying body, and includes a processing unit that performs processing related to the generation of the composite image. The processing unit acquires a plurality of captured images captured by an imaging device included in the flying body; generates a three-dimensional model based on the plurality of captured images; acquires each posture of the imaging device at the time of capturing the plurality of captured images; calculates, based on each posture of the imaging device and the three-dimensional model, the distance between each position of the imaging device at the time of capturing the plurality of captured images and the three-dimensional model; adjusts the sizes of the plurality of captured images based on the distance between each position of the imaging device and the three-dimensional model; and combines the resized plurality of captured images to generate the composite image.
  • the processing unit may acquire each position and each posture of the imaging device at the time of capturing the plurality of captured images, and calculate the distance between each position of the imaging device and the three-dimensional model based on each position and each posture of the imaging device and the three-dimensional model.
  • the processing unit may calculate the distance between the imaging device and the first part of the three-dimensional model corresponding to the imaging range for each imaging range captured by the imaging device at each position.
  • the processing unit may divide the imaging range captured at each position of the imaging device to generate divided areas of the imaging range, determine the second part of the three-dimensional model corresponding to each divided area, and calculate the distance between the imaging device and the second part of the three-dimensional model corresponding to the divided area.
  • the distance may be the distance in the vertical direction between each position of the imaging device and the three-dimensional model.
  • the distance may be a distance in the imaging direction of the imaging device between each position of the imaging device and the three-dimensional model.
  • the processing unit may generate sparse point group data based on the plurality of captured images, and generate a three-dimensional model based on the sparse point group data.
  • the processing unit may project the plurality of three-dimensional points contained in the sparse point group data onto a two-dimensional plane; designate projected two-dimensional points that are adjacent on the two-dimensional plane as a group, and designate a plurality of such groups; connect, group by group, the three-dimensional points in the sparse point group data corresponding to the designated adjacent two-dimensional points to generate a plurality of face data; and generate a three-dimensional model based on the plurality of face data.
  • an image generation method according to one aspect generates a composite image based on a plurality of captured images captured by a flying body, and includes the following steps: acquiring a plurality of captured images captured by an imaging device included in the flying body; generating a three-dimensional model based on the plurality of captured images; acquiring each posture of the imaging device at the time of capturing the plurality of captured images; calculating, based on each posture of the imaging device and the three-dimensional model, the distance between each position of the imaging device at the time of capturing the plurality of captured images and the three-dimensional model; adjusting the sizes of the plurality of captured images based on the distance between each position of the imaging device and the three-dimensional model; and combining the resized plurality of captured images to generate the composite image.
  • the step of acquiring a posture may include the step of acquiring each position and each posture of the imaging device at the time of capturing the plurality of captured images.
  • the step of calculating the distance may include the step of calculating the distance between each position of the imaging device and the three-dimensional model based on the respective positions and postures of the imaging device and the three-dimensional model.
  • the step of calculating the distance may include the step of calculating, for each imaging range captured by the imaging device at each position, the distance between the imaging device and the first part of the three-dimensional model corresponding to the imaging range.
  • the step of calculating the distance may include the following steps: dividing the imaging ranges captured at each position of the imaging device to generate divided areas of the imaging range; determining the second part of the three-dimensional model corresponding to each divided area; and calculating the distance between the imaging device and the second part of the three-dimensional model corresponding to the divided area.
  • the distance may be the distance in the vertical direction between each position of the imaging device and the three-dimensional model.
  • the distance may be a distance in the imaging direction of the imaging device between each position of the imaging device and the three-dimensional model.
  • the step of generating a three-dimensional model may include the steps of: generating sparse point group data based on the plurality of captured images; and generating a three-dimensional model based on the sparse point group data.
  • the step of generating a three-dimensional model may include the steps of: projecting the plurality of three-dimensional points contained in the sparse point group data onto a two-dimensional plane; designating projected two-dimensional points that are adjacent on the two-dimensional plane as a group, and designating a plurality of such groups; connecting, group by group, the three-dimensional points in the sparse point group data corresponding to the designated adjacent two-dimensional points to generate a plurality of face data; and generating a three-dimensional model based on the plurality of face data.
  • a program according to one aspect causes an image generation device that generates a composite image based on a plurality of captured images captured by a flying body to perform the following steps: acquiring a plurality of captured images captured by an imaging device included in the flying body; generating a three-dimensional model based on the plurality of captured images; acquiring each posture of the imaging device at the time of capturing the plurality of captured images; calculating, based on each posture of the imaging device and the three-dimensional model, the distance between each position of the imaging device at the time of capturing the plurality of captured images and the three-dimensional model; adjusting the sizes of the plurality of captured images based on the distance between each position of the imaging device and the three-dimensional model; and combining the resized plurality of captured images to generate the composite image.
  • a recording medium according to one aspect is a computer-readable recording medium recording a program that causes an image generation device that generates a composite image based on a plurality of captured images captured by a flying body to perform the following steps: acquiring a plurality of captured images captured by an imaging device included in the flying body; generating a three-dimensional model based on the plurality of captured images; acquiring each posture of the imaging device at the time of capturing the plurality of captured images; calculating, based on each posture of the imaging device and the three-dimensional model, the distance between each position of the imaging device at the time of capturing the plurality of captured images and the three-dimensional model; adjusting the sizes of the plurality of captured images based on the distance between each position of the imaging device and the three-dimensional model; and combining the resized plurality of captured images to generate the composite image.
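  • As an illustrative sketch of the claimed flow (every function name below is a hypothetical placeholder, not an API defined in this disclosure):

```python
# Hypothetical outline of the claimed pipeline; each stub names one step.

def generate_sparse_point_group(images):
    """Feature matching and triangulation over the captured images."""
    raise NotImplementedError

def generate_three_dimensional_model(points):
    """Connect adjacent sparse points into faces (a rough 3D model)."""
    raise NotImplementedError

def derive_distance(pose, model):
    """Distance between one camera position/posture and the 3D model."""
    raise NotImplementedError

def resize_by_distance(image, distance):
    """Enlarge or reduce one captured image according to its distance."""
    raise NotImplementedError

def generate_composite_image(images, poses):
    """Produce a composite image from aerial captures and camera poses."""
    points = generate_sparse_point_group(images)
    model = generate_three_dimensional_model(points)
    distances = [derive_distance(pose, model) for pose in poses]
    resized = [resize_by_distance(im, d) for im, d in zip(images, distances)]
    return resized  # compositing of the resized images would follow here
```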
  • FIG. 1 is a schematic diagram showing a first configuration example of the flying system in Embodiment 1.
  • FIG. 2 is a schematic diagram showing a second configuration example of the flying system in Embodiment 1.
  • FIG. 3 is a diagram showing an example of a specific appearance of an unmanned aircraft.
  • FIG. 4 is a block diagram showing an example of the hardware configuration of an unmanned aircraft.
  • FIG. 5 is a block diagram showing an example of the hardware configuration of the terminal.
  • FIG. 6 is a flowchart showing an example of an image synthesis processing procedure.
  • FIG. 7 is a diagram showing an example of the distance between each position of the imaging unit and the corresponding part of the three-dimensional model.
  • FIG. 8 is a diagram showing an example of deriving the distance between the position of the imaging unit and the three-dimensional model.
  • FIG. 9 is a diagram showing an example of deriving the distance of each imaging range.
  • FIG. 10 is a diagram showing an example of deriving the distance of each divided region after the imaging range is divided.
  • FIG. 11 is a diagram showing a correction example of the size adjustment amount.
  • FIG. 12 is a flowchart showing an example of generation processing of a three-dimensional model.
  • FIG. 13 is a diagram showing an example of a three-dimensional point group and a two-dimensional point group.
  • FIG. 14 is a diagram showing a three-dimensional model generated in a comparative example.
  • FIG. 15 is a diagram showing an example of a three-dimensional model generated in Embodiment 1.
  • an unmanned aircraft (UAV: Unmanned Aerial Vehicle) is used as an example of a flying body.
  • unmanned aerial vehicles are also expressed as "UAV".
  • the image generation system is exemplified by a flying system with an unmanned aircraft and a terminal.
  • the image generation device mainly takes an unmanned aircraft as an example, it may be a terminal.
  • the terminal may include a smart phone, a tablet terminal, a PC (Personal Computer) or other devices.
  • the image generation method defines the operations of the image generation device.
  • a program (for example, a program that causes the image generation device to execute various processes) is recorded in the recording medium.
  • FIG. 1 is a schematic diagram showing a first configuration example of the flying system 10 in Embodiment 1.
  • the flight system 10 includes an unmanned aircraft 100 and a terminal 80.
  • the unmanned aircraft 100 and the terminal 80 may communicate with each other through wired communication or wireless communication (for example, wireless LAN (Local Area Network)).
  • the terminal 80 is a portable terminal (for example, a smartphone or a tablet terminal).
  • the configuration of the flight system may include an unmanned aircraft, a transmitter (proportional controller), and a portable terminal.
  • When the transmitter is included, the user can use the left and right joysticks arranged on the front of the transmitter to control the flight of the unmanned aircraft.
  • the unmanned aircraft, the transmitter, and the portable terminal can communicate with each other through wired communication or wireless communication.
  • FIG. 2 is a schematic diagram showing a second configuration example of the flying system 10 in Embodiment 1.
  • In FIG. 2, the terminal 80 is illustrated as a PC.
  • In either case, the functions of the terminal 80 may be the same.
  • FIG. 3 is a diagram showing an example of a specific appearance of the unmanned aerial vehicle 100.
  • a perspective view of the unmanned aircraft 100 when flying in the moving direction STV0 is shown.
  • the unmanned aerial vehicle 100 is an example of a mobile body.
  • the roll axis is set in a direction parallel to the ground and along the moving direction STV0 (refer to the x axis).
  • the pitch axis is set in the direction parallel to the ground and perpendicular to the roll axis (refer to the y axis).
  • the yaw axis is set in the direction perpendicular to the ground and perpendicular to the roll axis and the pitch axis (refer to the z axis).
  • the unmanned aerial vehicle 100 includes a UAV main body 102, a gimbal 200, an imaging unit 220, and a plurality of imaging units 230.
  • the UAV body 102 includes a plurality of rotors (propellers).
  • the UAV main body 102 makes the unmanned aircraft 100 fly by controlling the rotation of a plurality of rotors.
  • the UAV main body 102 uses, for example, four rotors to fly the unmanned aerial vehicle 100.
  • the number of rotors is not limited to four.
  • the unmanned aerial vehicle 100 may be a fixed-wing aircraft without a rotor.
  • the imaging unit 220 is an imaging camera that captures an object included in a desired imaging range (for example, the sky above an aerial photography target, a landscape such as mountains, rivers, and buildings on the ground).
  • the plurality of imaging units 230 are sensing cameras that capture the surroundings of the unmanned aircraft 100 in order to control the flight of the unmanned aircraft 100.
  • the two camera units 230 may be installed on the front of the nose of the unmanned aircraft 100.
  • the other two imaging units 230 may be installed on the bottom surface of the unmanned aerial vehicle 100.
  • the two imaging units 230 on the front side may be paired and function as a so-called stereo camera.
  • the two imaging units 230 on the bottom surface side may also be paired to function as a stereo camera.
  • the three-dimensional space data around the unmanned aircraft 100 may be generated based on the images captured by the plurality of imaging units 230.
  • the number of imaging units 230 included in the unmanned aerial vehicle 100 is not limited to four.
  • the unmanned aerial vehicle 100 only needs to include at least one imaging unit 230.
  • the unmanned aircraft 100 may include at least one imaging unit 230 on each of the nose, tail, sides, bottom, and top.
  • the angle of view that can be set in the imaging unit 230 can be larger than the angle of view that can be set in the imaging unit 220.
  • the imaging unit 230 may have a single focus lens or a fisheye lens.
  • FIG. 4 is a block diagram showing one example of the hardware configuration of the unmanned aerial vehicle 100.
  • the unmanned aerial vehicle 100 includes a UAV control unit 110, a communication interface 150, a memory 160, a memory 170, a gimbal 200, a rotor mechanism 210, an imaging unit 220, imaging units 230, a GPS receiver 240, an inertial measurement device (IMU: Inertial Measurement Unit) 250, a magnetic compass 260, a barometric altimeter 270, an ultrasonic sensor 280, and a laser measuring instrument 290.
  • the UAV control unit 110 is composed of, for example, a CPU (Central Processing Unit), MPU (Micro Processing Unit), or DSP (Digital Signal Processor).
  • the UAV control unit 110 performs signal processing for overall control of the operation of each part of the unmanned aircraft 100, data input/output processing with other parts, data calculation processing, and data storage processing.
  • the UAV control unit 110 controls the flight of the unmanned aircraft 100 according to a program stored in the memory 160.
  • the UAV control unit 110 can control the flight.
  • the UAV control unit 110 can take aerial images.
  • the UAV control unit 110 acquires position information indicating the position of the unmanned aircraft 100.
  • the UAV control unit 110 may obtain position information indicating the latitude, longitude, and altitude where the unmanned aircraft 100 is located from the GPS receiver 240.
  • the UAV control unit 110 may obtain latitude and longitude information indicating the latitude and longitude of the unmanned aircraft 100 from the GPS receiver 240, and obtain altitude information indicating the altitude of the unmanned aircraft 100 from the barometric altimeter 270, as position information.
  • the UAV control unit 110 may acquire the distance between the radiation point of the ultrasonic wave generated by the ultrasonic sensor 280 and the reflection point of the ultrasonic wave as height information.
  • the UAV control unit 110 may obtain orientation information indicating the orientation of the unmanned aircraft 100 from the magnetic compass 260.
  • the orientation information can be expressed by, for example, an orientation corresponding to the orientation of the nose of the unmanned aircraft 100.
  • the UAV control unit 110 may acquire position information indicating the position where the unmanned aircraft 100 should exist when the imaging unit 220 shoots the imaging range to be photographed.
  • the UAV control unit 110 may acquire position information indicating the position where the unmanned aircraft 100 should exist from the memory 160.
  • the UAV control unit 110 may acquire position information indicating the position where the unmanned aircraft 100 should exist from other devices via the communication interface 150.
  • the UAV control unit 110 may refer to the three-dimensional map database to specify the position where the unmanned aircraft 100 can exist, and acquire the position as position information indicating the position where the unmanned aircraft 100 should exist.
  • the UAV control unit 110 can acquire imaging range information indicating the respective imaging ranges of the imaging unit 220 and the imaging unit 230.
  • the UAV control unit 110 may acquire the angle of view information indicating the angles of view of the imaging unit 220 and the imaging unit 230 from the imaging unit 220 and the imaging unit 230 as a parameter for specifying the imaging range.
  • the UAV control unit 110 may acquire information indicating the imaging directions of the imaging unit 220 and the imaging unit 230 as parameters for specifying the imaging range.
  • the UAV control unit 110 may acquire posture information indicating the posture state of the imaging unit 220 from the gimbal 200 as information indicating the imaging direction of the imaging unit 220, for example.
  • the posture information of the imaging unit 220 may indicate the angle at which the gimbal 200 rotates from the reference rotation angle of the pitch axis and the yaw axis.
  • the UAV control unit 110 may acquire position information indicating the position where the unmanned aircraft 100 is located as a parameter for determining the imaging range.
  • the UAV control unit 110 may delineate the imaging range representing the geographical range captured by the imaging unit 220, based on the angles of view and imaging directions of the imaging unit 220 and the imaging unit 230 and the location of the unmanned aircraft 100, and generate imaging range information, thereby acquiring the imaging range information.
  • the UAV control unit 110 can obtain the imaging range information from the memory 160.
  • the UAV control unit 110 can acquire the imaging range information via the communication interface 150.
  • the UAV control unit 110 controls the gimbal 200, the rotor mechanism 210, the imaging unit 220, and the imaging unit 230.
  • the UAV control unit 110 can control the imaging range of the imaging unit 220 by changing the imaging direction or angle of view of the imaging unit 220.
  • the UAV control unit 110 can control the imaging range of the imaging unit 220 supported by the gimbal 200 by controlling the rotation mechanism of the gimbal 200.
  • the imaging range refers to the geographic range captured by the imaging unit 220 or the imaging unit 230.
  • the camera range is defined by latitude, longitude and altitude.
  • the imaging range may be the range of three-dimensional spatial data defined by latitude, longitude, and altitude.
  • the imaging range may be the range of two-dimensional spatial data defined by latitude and longitude.
  • the imaging range may be specified according to the angle of view and imaging direction of the imaging unit 220 or the imaging unit 230 and the location where the unmanned aircraft 100 is located.
  • the imaging direction of the imaging unit 220 and the imaging unit 230 can be defined by the azimuth and depression angle of the front of the imaging lens provided with the imaging unit 220 and the imaging unit 230.
  • the imaging direction of the imaging unit 220 may be a direction specified by the orientation of the nose of the unmanned aircraft 100 and the posture state of the imaging unit 220 on the gimbal 200.
  • the imaging direction of the imaging unit 230 may be a direction specified by the orientation of the nose of the unmanned aircraft 100 and the position where the imaging unit 230 is provided.
  • the UAV control unit 110 may determine the surrounding environment of the unmanned aircraft 100 by analyzing multiple images captured by the multiple imaging units 230.
  • the UAV control unit 110 may control the flight according to the surrounding environment of the unmanned aircraft 100, for example, avoiding obstacles.
  • the UAV control unit 110 may acquire three-dimensional information indicating the three-dimensional shape of an object existing around the unmanned aircraft 100.
  • the object may be part of a landscape such as buildings, roads, vehicles, trees, etc., for example.
  • the three-dimensional information is, for example, three-dimensional spatial data.
  • the UAV control unit 110 may generate three-dimensional information representing the three-dimensional shape of an object existing around the unmanned aircraft 100 from each image obtained from the plurality of imaging units 230, thereby acquiring three-dimensional information.
  • the UAV control unit 110 may acquire the three-dimensional information indicating the three-dimensional shape of the object existing around the unmanned aircraft 100 by referring to the three-dimensional map database stored in the memory 160 or the memory 170.
  • the UAV control unit 110 may acquire three-dimensional information related to the three-dimensional shape of the object existing around the unmanned aircraft 100 by referring to the three-dimensional map database managed by a server existing on the network.
  • the UAV control unit 110 controls the flight of the unmanned aircraft 100 by controlling the rotor mechanism 210. That is, the UAV control unit 110 controls the rotor mechanism 210 to control the position including the latitude, longitude, and altitude of the unmanned aircraft 100.
  • the UAV control unit 110 may control the imaging range of the imaging unit 220 by controlling the flight of the unmanned aircraft 100.
  • the UAV control section 110 may control the angle of view of the imaging section 220 by controlling the zoom lens included in the imaging section 220.
  • the UAV control unit 110 may use the digital zoom function of the imaging unit 220 to control the angle of view of the imaging unit 220 through digital zoom.
  • the UAV control unit 110 can move the unmanned aircraft 100 to a specific position on a specified date and time, so that the imaging unit 220 can shoot the desired shooting range under the desired environment.
  • the communication interface 150 communicates with the terminal 80.
  • the communication interface 150 can perform wireless communication by any wireless communication method.
  • the communication interface 150 can perform wired communication by any wired communication method.
  • the communication interface 150 may send a captured image (for example, an aerial image) and additional information (metadata) related to the captured image to the terminal 80.
  • the memory 160 stores programs and the like required for the UAV control unit 110 to control the gimbal 200, the rotor mechanism 210, the imaging unit 220, the imaging unit 230, the GPS receiver 240, the inertial measurement device 250, the magnetic compass 260, the barometric altimeter 270, the ultrasonic sensor 280, and the laser measuring instrument 290.
  • the memory 160 may be a computer-readable recording medium, and may include at least one of SRAM (Static Random Access Memory), DRAM (Dynamic Random Access Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), and flash memory such as USB (Universal Serial Bus) memory.
  • the memory 160 can be detached from the unmanned aerial vehicle 100.
  • the memory 160 can be used as a working memory.
  • the memory 170 may include at least one of an HDD (Hard Disk Drive), an SSD (Solid State Drive), an SD card, a USB memory, and other storage.
  • the memory 170 can store various information and various data.
  • the memory 170 can be detached from the unmanned aerial vehicle 100.
  • the memory 170 can record a captured image.
  • the memory 160 or the memory 170 may store the information of the imaging position and imaging path generated by the terminal 80 or the unmanned aerial vehicle 100.
  • the UAV control unit 110 may set the information of the imaging position and the imaging path as one of imaging parameters related to the shooting planned by the unmanned aircraft 100 or flight parameters related to the flight scheduled by the unmanned aircraft 100.
  • the setting information may be stored in the memory 160 or the memory 170.
  • the gimbal 200 can rotatably support the imaging unit 220 about the yaw axis, the pitch axis, and the roll axis.
  • the gimbal 200 can rotate the imaging unit 220 around at least one of the yaw axis, the pitch axis, and the roll axis, thereby changing the imaging direction of the imaging unit 220.
  • the rotor mechanism 210 has a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors.
  • the rotor mechanism 210 controls the rotation by the UAV control unit 110, thereby causing the unmanned aircraft 100 to fly.
  • the number of rotors 211 may be four, for example, or other numbers.
  • the unmanned aerial vehicle 100 may be a fixed-wing aircraft without a rotor.
  • the imaging unit 220 captures an object within a desired imaging range and generates data of a captured image.
  • the image data (for example, a captured image) obtained by the imaging unit 220 may be stored in the memory included in the imaging unit 220 or in the memory 170.
  • the imaging unit 230 captures the surroundings of the unmanned aircraft 100 and generates data of captured images.
  • the image data of the imaging unit 230 can be stored in the memory 170.
  • the GPS receiver 240 receives a plurality of signals indicating the time transmitted from a plurality of navigation satellites (that is, GPS satellites) and the position (coordinates) of each GPS satellite.
  • the GPS receiver 240 calculates the position of the GPS receiver 240 (that is, the position of the unmanned aircraft 100) based on the received multiple signals.
  • the GPS receiver 240 outputs the position information of the unmanned aircraft 100 to the UAV control unit 110.
  • the UAV control unit 110 may calculate the position information of the GPS receiver 240 instead of the GPS receiver 240. In this case, information indicating the time and the position of each GPS satellite included in the multiple signals received by the GPS receiver 240 is input to the UAV control unit 110.
  • the inertial measurement device 250 detects the posture of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
  • the inertial measurement device 250 can detect the accelerations of the unmanned aircraft 100 in the three axial directions of front-back, left-right, and up-down, and the angular velocities about the three axes, as the attitude of the unmanned aircraft 100.
  • the magnetic compass 260 detects the orientation of the nose of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
  • the barometric altimeter 270 detects the flying altitude of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
  • the ultrasonic sensor 280 emits ultrasonic waves, detects ultrasonic waves reflected on the ground and objects, and outputs the detection results to the UAV control unit 110.
  • the detection result may show the distance from the unmanned aircraft 100 to the ground, that is, the height.
  • the detection result may show the distance from the unmanned aircraft 100 to the object (subject).
  • the laser measuring instrument 290 irradiates the object with laser light, receives the reflected light reflected by the object, and measures the distance between the unmanned aircraft 100 and the object (subject) by the reflected light.
  • a time-of-flight method may be used.
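  • As a simple illustration of the time-of-flight principle (basic physics, not specific to this disclosure), the distance follows from the round-trip time of the laser pulse:

```python
def tof_distance(round_trip_seconds, c=299_792_458.0):
    # The pulse travels to the object and back, so the one-way
    # distance is c * t / 2, with c the speed of light in m/s.
    return c * round_trip_seconds / 2.0
```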
  • FIG. 5 is a block diagram showing an example of the hardware configuration of the terminal 80.
  • the terminal 80 includes a terminal control unit 81, an operation unit 83, a communication unit 85, a memory 87, a display unit 88, and a memory 89.
  • the terminal 80 may be held by a user who wishes to control the flight of the unmanned aircraft 100.
  • the terminal control unit 81 is composed of, for example, a CPU, MPU, or DSP.
  • the terminal control unit 81 performs signal processing for overall control of the operation of each unit of the terminal 80, data input/output processing with other units, data calculation processing, and data storage processing.
  • the terminal control unit 81 can acquire data and information from the unmanned aircraft 100 via the communication unit 85.
  • the terminal control section 81 can acquire data and information (for example, various parameters) input via the operation section 83.
  • the terminal control unit 81 can acquire data and information stored in the memory 87.
  • the terminal control unit 81 may transmit data and information (eg, position, speed, and flight path information) to the unmanned aircraft 100 via the communication unit 85.
  • the terminal control unit 81 may send data and information to the display unit 88 and cause the display unit 88 to display display information based on the data and information.
  • the operation unit 83 receives and acquires data and information input by the user of the terminal 80.
  • the operation unit 83 may include input devices such as buttons, keys, touch screens, and microphones.
  • the operation section 83 and the display section 88 are composed of a touch panel.
  • the operation unit 83 can receive touch operations, click operations, drag operations, and the like.
  • the operation unit 83 can receive information on various parameters.
  • the information input by the operation unit 83 may be transmitted to the unmanned aerial vehicle 100.
  • the communication unit 85 performs wireless communication with the unmanned aircraft 100 through various wireless communication methods.
  • the wireless communication method of this wireless communication may include, for example, wireless LAN, Bluetooth (registered trademark), or communication via a public wireless network.
  • the communication unit 85 can perform wired communication by any wired communication method.
  • the memory 87 may include, for example, a program that defines the operation of the terminal 80, a ROM that stores data of set values, and a RAM that temporarily stores various information and data used when the terminal control unit 81 performs processing.
  • the memory 87 may include memory other than ROM and RAM.
  • the memory 87 may be provided inside the terminal 80.
  • the memory 87 may be configured to be detachable from the terminal 80.
  • the program may include an application program.
  • the display unit 88 is configured by, for example, an LCD (Liquid Crystal Display), and displays various information and data output from the terminal control unit 81.
  • the display unit 88 can display various data and information related to the execution of the application program.
  • the memory 89 stores various data and information.
  • the memory 89 may be an HDD, SSD, SD card, USB memory, or the like.
  • the memory 89 may be provided inside the terminal 80.
  • the memory 89 may be detachably provided on the terminal 80.
  • the memory 89 can store the captured image acquired from the unmanned aircraft 100 and its additional information.
  • the additional information can be stored in the memory 87.
  • the unmanned aircraft 100 or the terminal 80 of the flying system 10 executes processing related to the generation of a composite image based on a plurality of captured images captured by the unmanned aircraft 100.
  • the UAV control unit 110 of the unmanned aircraft 100 or the terminal control unit 81 of the terminal 80 is an example of a processing unit that executes processing related to the generation of a composite image. In the following, an example in which the terminal 80 mainly performs the processing related to the composite image is shown.
  • In the present embodiment, the generation of the composite image can be performed even by a processor with limited computing power.
  • the composite image can be used as a map image or an orthophoto.
  • a processor with limited computing power may include, for example, a processor for which it is difficult to perform, in real time, composite image generation that includes dense point group generation.
  • FIG. 6 is a flowchart showing an example of an image synthesis processing procedure. As an example, this process may be performed by the terminal control section 81 of the terminal 80 executing a program stored in the memory 87.
  • the unmanned aerial vehicle 100 may perform an action that assists image synthesis processing.
  • the unmanned aircraft 100 may provide the terminal 80 with a camera image captured by the camera unit 220 and its additional information, or may provide various parameters (eg, flight parameters related to the flight of the unmanned aircraft 100, and the camera unit 220 shooting parameters related to shooting).
  • the terminal control section 81 acquires the flight range and various parameters (S1). In this case, the user can input the flight range and parameters to the terminal 80.
  • the terminal control section 81 may receive user input via the operation section 83 and acquire the input flight range and parameters.
  • the terminal control section 81 can acquire map information from an external server via the communication section 85. For example, when the flight range is set to a rectangular range, the information of the flight range can be obtained when the user inputs the positions (latitude, longitude) of the four corners of the rectangle on the map information. When the flight range is set to a circular range, the information of the flight range can be obtained when the user inputs the radius of the circle centered on a flight position. The information of the flight range can also be obtained based on the map information when the user inputs information such as an area or a specific place name (for example, Tokyo). In addition, the terminal control unit 81 may acquire the flight range stored in the memory 87 or the memory 89. The flight range may be a predetermined range in which the unmanned aircraft 100 flies.
  • the parameters may be imaging parameters related to the shooting of the imaging unit 220 and flight parameters related to the flight of the unmanned aircraft 100.
  • the imaging parameters may include the imaging position, imaging date and time, distance from the subject, imaging angle of view, posture of the unmanned aircraft 100, imaging direction, imaging conditions, and camera parameters (shutter speed, exposure value, imaging mode, etc.).
  • Flight parameters may include flight position (three-dimensional position or two-dimensional position), flight altitude, flight speed, flight acceleration, flight path, flight date and time, and so on.
  • the terminal control unit 81 can acquire various parameters stored in the memory 87 and the memory 89 from the memory 87 and the memory 89.
  • the terminal control unit 81 can acquire the flight range and various parameters from the external server and the unmanned aircraft 100 via the communication unit 85.
  • In the unmanned aircraft 100, the flight range and various parameters may be obtained from the memory 160, or may be obtained (for example, calculated) based on various sensors (for example, the GPS receiver 240 and the inertial measurement device 250).
  • the terminal control section 81 may determine the flight path and imaging position of the unmanned aircraft 100 based on the acquired flight range and various parameters.
  • the terminal control unit 81 may notify the unmanned aircraft 100 via the communication unit 85 of the determined flight path and imaging position of the unmanned aircraft 100.
  • the UAV control unit 110 controls the flight according to the determined flight path, and causes the imaging unit 220 to capture (eg, aerial photography) images.
  • the UAV control section 110 may acquire multiple captured images in different positions and postures during flight.
  • the UAV control unit 110 may send the captured image to the terminal 80 via the communication interface 150.
  • the terminal control unit 81 acquires a captured image via the communication unit 85 and stores it in the memory 89 (S2).
  • the terminal control unit 81 may acquire additional information related to the captured image via the communication unit 85 and store it in the memory 89.
  • the additional information may include information similar to the various parameters described above (eg, flight parameters, camera parameters). Therefore, the terminal control unit 81 can acquire information such as the imaging position of the imaging unit 220 (that is, the unmanned aerial vehicle 100) at the time of capturing each captured image, the posture at the time of capturing, and the imaging direction.
  • a plurality of captured images may be captured by the same imaging unit 220 or may be captured by different imaging units 220. That is, multiple images can be captured by multiple different unmanned aircraft 100.
  • the terminal control unit 81 can acquire a plurality of captured images from the communication unit 85 or the memory 89.
  • the terminal control unit 81 may extract a plurality of feature points included in each captured image.
  • the feature point may be a point anywhere on the camera image.
  • the terminal control unit 81 may perform matching processing that associates, among the plurality of feature points included in each captured image, the feature points corresponding to the same point, and set the matched feature points as corresponding points.
  • the terminal control unit 81 may consider the difference (reprojection error) between the actual observed position at which each feature point is projected onto the captured image and the reproduced position at which each feature point is reproduced on the captured image based on parameters such as the position and posture of the imaging unit 220.
  • the terminal control unit 81 can perform bundle adjustment (BA: Bundle Adjustment) that minimizes the reprojection error.
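  • In generic structure-from-motion notation (a standard formulation, not text reproduced from this disclosure), bundle adjustment minimizes the total squared reprojection error over the camera poses {R_j, t_j} and the three-dimensional points {X_i}:

```latex
\min_{\{R_j,\,t_j\},\,\{X_i\}} \sum_{i,j} \bigl\| x_{ij} - \pi\bigl( K ( R_j X_i + t_j ) \bigr) \bigr\|^2
```

where x_{ij} is the observed position of feature point i in captured image j, K is the camera matrix, and π denotes perspective projection.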
  • the terminal control unit 81 may derive the result of the bundle adjustment and the correspondence between the feature points in each captured image and derive the corresponding point.
  • the terminal control unit 81 may generate a sparse point group including a plurality of corresponding points.
  • the terminal control unit 81 may generate the sparse point group based on, for example, SfM (Structure from Motion).
  • the number of points in the sparse point group may be, for example, several hundred per image.
  • the data of the points contained in the sparse point group may include data representing the three-dimensional position. In other words, the sparse point group here is a three-dimensional point group. In this way, the terminal control unit 81 generates a sparse point group based on the acquired multiple captured images (S3).
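  • A rough two-view sketch of this step (assumptions: OpenCV is used, the camera matrix K is known, and only one image pair is handled; the disclosure itself does not prescribe a library):

```python
import cv2
import numpy as np

def two_view_sparse_points(img1, img2, K):
    """Hypothetical helper: ORB feature matching, essential-matrix pose
    recovery, and triangulation into a sparse 3D point group."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    # Cross-checked brute-force matching keeps only mutually best matches.
    bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = bf.match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    # Relative pose of the second view (the "posture" later used for D).
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    # Triangulate the inlier correspondences into 3D points.
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    inl = mask.ravel() > 0
    pts4d = cv2.triangulatePoints(P1, P2, pts1[inl].T, pts2[inl].T)
    return (pts4d[:3] / pts4d[3]).T, R, t  # (N, 3) points and relative pose
```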
  • the terminal control unit 81 generates a three-dimensional model M based on the sparse point group (S4).
  • the terminal control unit 81 may generate a plurality of faces (planes) sf having, as vertices, a plurality of adjacent points included in the sparse point group, and generate a three-dimensional model M represented by the plurality of faces sf.
  • the three-dimensional model M may be, for example, a terrain model representing the shape of the ground. Since the three-dimensional model M is formed based on the sparse point group, it becomes a sparse three-dimensional model (rough three-dimensional model).
  • the generated three-dimensional model M may be stored in the memory 89.
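  • One plausible concrete reading of this step (a sketch only; the disclosure describes projecting points onto a two-dimensional plane and grouping adjacent projected points into faces, for which a 2D Delaunay triangulation is one standard realization):

```python
import numpy as np
from scipy.spatial import Delaunay

def sparse_points_to_model(points3d):
    """points3d: (N, 3) array of sparse 3D points. Returns an (M, 3) array of
    vertex indices, each row being one triangular face of the rough model."""
    pts2d = points3d[:, :2]  # project the 3D points onto a 2D (ground) plane
    tri = Delaunay(pts2d)    # groups of adjacent projected 2D points
    return tri.simplices     # connect the corresponding 3D points into faces
```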
  • the terminal control unit 81 derives (for example, calculates) the distance D between each position of the imaging unit 220 and the three-dimensional model M, based on the three-dimensional model M and at least one of each position (three-dimensional position) and each posture of the imaging unit 220 at the time of shooting (S5).
  • Since the unmanned aerial vehicle 100 moves during flight, a plurality of distances D between the positions of the imaging unit 220 and the three-dimensional model M are derived. Since the three-dimensional model M is generated based on the sparse point group having three-dimensional position information, the position and shape of the three-dimensional model M in the three-dimensional space can be determined.
  • the coordinate space where the three-dimensional model M exists and the coordinate space where the imaging unit 220 of the unmanned aerial vehicle 100 exists are the same coordinate space. Therefore, the distance D between each position of the imaging unit 220 and the predetermined position in the three-dimensional model M can be derived.
  • the terminal control unit 81 may calculate the distance D not only based on the position information of the imaging unit 220, but also based on the posture information of the imaging unit 220 at each time, using time information. In this case, the terminal control section 81 may acquire only the information on the posture of the imaging section 220, and not the information on its position.
  • the terminal control section 81 adjusts the size (scale) of each captured image captured by the imaging section 220 at each position, based on the acquired distance D (S6). In this case, the terminal control section 81 may calculate the size of each captured image based on the acquired distance D, and enlarge or reduce each captured image so that it has the calculated size.
  • the size of each captured image is an index indicating the magnification or reduction ratio applied to each captured image used to generate the composite image. Among the acquired captured images, there may be a captured image serving as a reference (one that is neither enlarged nor reduced).
  • The longer the distance D between the imaging unit 220 and the three-dimensional model M, the smaller the subject contained in the captured image captured by the imaging unit 220; therefore, the terminal control section 81 can increase the magnification ratio of that captured image. Conversely, the shorter the distance D, the larger the subject contained in the captured image; therefore, the terminal control unit 81 can decrease the magnification ratio of that captured image.
  • Likewise, the longer the distance D between the imaging unit 220 and the three-dimensional model M, the smaller the subject contained in the captured image; therefore, the terminal control unit 81 can decrease the reduction ratio of that captured image. The shorter the distance D, the larger the subject contained in the captured image; therefore, the terminal control unit 81 can increase the reduction ratio of that captured image.
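  • As a minimal sketch of this size adjustment (assuming the scale factor is simply proportional to the distance D relative to a reference distance d_ref; the function name is hypothetical):

```python
import cv2

def resize_by_distance(image, distance, d_ref):
    """Enlarge an image captured from far away (distance > d_ref) and reduce
    one captured from close up (distance < d_ref), so subjects end up at a
    uniform apparent size. Assumes scale is proportional to distance D."""
    scale = distance / d_ref
    return cv2.resize(image, None, fx=scale, fy=scale,
                      interpolation=cv2.INTER_LINEAR)
```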
  • the terminal control unit 81 synthesizes the captured images after resizing to generate a synthesized image (S7).
  • for a subject portion that appears repeatedly across a plurality of adjacent captured images in the composite image, the terminal control section 81 may keep that portion in one of the adjacent captured images and delete the duplicates in the others. That is, for the repeated portion, the subject can be drawn with any one of the captured images as a representative.
  • the terminal control unit 81 may generate a synthesized image according to a well-known synthesis method based on the plurality of captured images after resizing.
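  • A minimal compositing sketch (assuming the placement of each resized image on a common canvas is already known; a real implementation would use a well-known synthesis method as mentioned above):

```python
import numpy as np

def composite(resized_images, offsets, canvas_shape):
    """Paste the resized images onto one canvas at the given (row, col)
    offsets. For repeated subject portions, later images overwrite earlier
    ones, i.e. one captured image represents the overlapping part."""
    canvas = np.zeros(canvas_shape, dtype=np.uint8)
    for img, (y, x) in zip(resized_images, offsets):
        h, w = img.shape[:2]
        canvas[y:y + h, x:x + w] = img
    return canvas
```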
  • In this way, the terminal 80 can make the positional relationship between the feature points and corresponding points included in each captured image substantially equal. Therefore, the terminal 80 can suppress the situation in which the size of the subject (object) included in the composite image differs depending on the original captured image, that is, the size of the subject varying from one captured image to another within the composite image.
  • In other words, the terminal 80 can unify the scale of the subject (for example, terrain or buildings) reflected in the multiple captured images, and generate a composite image in which the size deviation of the same or corresponding objects across the multiple captured images is reduced.
  • the terminal 80 generates a sparse point group, generates a sparse three-dimensional model, and adjusts the size of each captured image, so it does not need to generate a dense point group. Therefore, the terminal 80 can reduce the processing load and processing time for generating a composite image without requiring a processor with high processing capacity.
  • Alternatively, the UAV control unit 110 may perform the processing: acquiring the flight range and various parameters from the terminal 80 (S1), acquiring the captured images (S2), generating the sparse point group (S3), generating the three-dimensional model (S4), deriving the distance D (S5), adjusting the sizes (S6), and generating the composite image (S7).
  • The unmanned aircraft 100 and the terminal 80 may also share the image synthesis processing.
  • FIG. 7 is a diagram showing an example of the distance between each position of the imaging unit 220 (that is, each position of the unmanned aerial vehicle 100) and the corresponding portion of the three-dimensional model M (for example, the portion included in the imaging range CR).
  • In FIG. 7, the imaging unit 220 at each position is denoted as 220a, 220b, 220c, and so on.
  • the distance D between the imaging unit 220a and the three-dimensional model M is the distance ha.
  • the distance D between the imaging unit 220c and the three-dimensional model M is the distance hc. That is, in FIG. 7, the distance D differs at each position of the imaging unit 220. Therefore, if the captured images were synthesized without size adjustment, the size of the subject reflected in each captured image in the created composite image would not be uniform.
  • In contrast, the terminal control unit 81 adjusts the size of each captured image by a size adjustment amount (for example, a magnification or reduction ratio) corresponding to the distance D, and synthesizes the captured images after the size adjustment. Therefore, the terminal 80 can generate a composite image in which the size of the subject reflected in each captured image is uniform.
  • FIG. 8 is a diagram showing an example of deriving the distance D.
  • the distance D between the position of the imaging unit 220 and the three-dimensional model M may be the distance h1 along the vertical direction (direction perpendicular to the horizontal direction). That is, the terminal control unit 81 may use the distance h1 connecting the imaging unit 220 and the intersection point C1 as the distance D, where the intersection point C1 is the intersection point of the straight line L1 passing through the imaging unit 220 and parallel to the vertical direction and the three-dimensional model M.
  • the position of the imaging unit 220 may be the imaging surface of the imaging unit 220 or may be the image center of the captured image captured by the imaging unit 220.
  • In this way, the terminal 80 can adjust the size of the captured image in consideration of the flying height of the unmanned aircraft 100 when the captured image was captured.
  • the distance D between the position of the imaging unit 220 and the three-dimensional model M may be the distance h2 along the imaging direction of the imaging unit 220. That is, the terminal control unit 81 may use the distance h2 between the imaging unit 220 and the intersection point C2 as the distance D, where the intersection point C2 is the intersection of the three-dimensional model M and the straight line L2 passing through the imaging unit 220 and parallel to the imaging direction specified by the posture of the imaging unit 220.
  • In this case, the inclination of the imaging direction relative to the vertical direction is also taken into consideration.
  • In this way, the terminal 80 can adjust the size of the captured image in consideration of the distance between the unmanned aircraft 100 and the subject reflected in the captured image when the captured image was captured.
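  • Both variants amount to casting a ray from the camera position and intersecting it with the faces of the three-dimensional model M; a sketch using the standard Moller-Trumbore ray-triangle test (assumptions: the model is a list of triangles; direction (0, 0, -1) yields the vertical distance h1, and the imaging direction yields h2):

```python
import numpy as np

def ray_triangle_distance(origin, direction, tri, eps=1e-9):
    """Moller-Trumbore: distance along `direction` from `origin` to the
    triangle `tri` (3x3 array of vertices), or None if the ray misses."""
    v0, v1, v2 = tri
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:
        return None                      # ray is parallel to the triangle
    s = (origin - v0) / det
    u = np.dot(s, p)
    q = np.cross(s, e1)
    v = np.dot(direction, q)
    t = np.dot(e2, q)
    if u < 0 or v < 0 or u + v > 1 or t <= 0:
        return None                      # intersection outside the triangle
    return t

def distance_to_model(camera_pos, direction, triangles):
    """Distance D: nearest intersection of the ray with the model's faces."""
    direction = np.asarray(direction, dtype=float)
    hits = [d for tri in triangles
            if (d := ray_triangle_distance(camera_pos, direction, tri)) is not None]
    return min(hits) if hits else None
```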
  • At least a part of the three-dimensional model M is included in the imaging range CR of the image captured by the imaging unit 220. Therefore, at least a part of the three-dimensional model M is included in the image range of the captured image captured by the imaging unit 220.
  • One shooting range CR or one image range corresponds to one shooting image.
  • the distance D between the position of the imaging unit 220 and the three-dimensional model M may be derived for each captured image. That is, the terminal control unit 81 may derive the distance D of the portion of the three-dimensional model M included in the imaging range CR and the image range for each imaging range CR or each image range.
  • the terminal control unit 81 may calculate the distance DA between the position PA of the imaging unit 220 and the portion MA of the three-dimensional model M (a first example of the first portion of the three-dimensional model) included in the imaging range CRA of the imaging unit 220 at the position PA.
  • Similarly, the terminal control section 81 may calculate the distance DB between the position PB of the imaging unit 220 where the captured image GMB was captured and the portion MB of the three-dimensional model M (a second example of the first portion of the three-dimensional model) included in the imaging range CRB of the imaging unit 220 at the position PB.
  • the terminal control unit 81 may enlarge the captured image GM2 of the imaging range CR2 by 2 times, and combine the captured image GM1 with the captured image GM2 enlarged by 2 times.
  • the enlargement by 2 times in the case of the distance "2" is only an example; it is sufficient to determine the size adjustment amount, such as the magnification or reduction ratio, according to the distance.
  • In this way, the terminal 80 can roughly derive the distance D between the position of the imaging unit 220 (camera position) and the photographed object (three-dimensional model M), with one distance per captured image. Therefore, the processing load on the terminal 80 for deriving the distance D is relatively small, and the processing time is relatively short.
  • the distance D between the position of the imaging unit 220 and the three-dimensional model M may be derived for each divided region DR of the captured image. That is, the distance D can be derived for each divided region DR of the shooting range CR and the image range GR corresponding to the captured image.
  • the terminal control unit 81 may divide the imaging range CR or the image range GR to generate a plurality of divided regions DR.
  • the terminal control unit 81 may calculate, as the distance D, the distance between the position of the imaging unit 220 and the portion of the three-dimensional model M corresponding to each divided region DR. Since the distance D is derived for each divided region DR, a number of (possibly different) distances D corresponding to the number of divisions can be derived for one captured image.
  • the terminal control unit 81 can calculate the distance DC1 between the position PC of the imaging unit 220 where the captured image GMC was captured and the portion MC1 of the three-dimensional model M (an example of the second portion of the three-dimensional model) included in the divided region DRC1 of the imaging range CRC.
  • Similarly, the terminal control unit 81 can calculate the distance DC2 between the position PC and the portion MC2 of the three-dimensional model M (another example of the second portion of the three-dimensional model) included in the divided region DRC2 of the imaging range CRC. That is, for each divided region DR of the imaging range CRC, the distances DC1, DC2, ... can be calculated.
  • In addition, the distance D of each divided region DR can be derived for each of the plurality of imaging ranges CR corresponding to the plurality of captured images (see the sketch below).
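The per-region variant can be sketched the same way; here one imaging range is split into an n x n grid and one distance is derived per divided region. This is a hypothetical illustration: `point_pixels` (the pixel locations where model points project into this image) is assumed to be precomputed, and regions with no model points are left as NaN.

```python
import numpy as np


def per_region_distances(camera_position: np.ndarray,
                         model_points: np.ndarray,
                         point_pixels: np.ndarray,
                         image_shape: tuple, n: int = 3) -> np.ndarray:
    """Derive one distance D per divided region DR of one captured image."""
    h, w = image_shape[:2]
    distances = np.full((n, n), np.nan)
    rows = np.clip((point_pixels[:, 1] * n // h).astype(int), 0, n - 1)
    cols = np.clip((point_pixels[:, 0] * n // w).astype(int), 0, n - 1)
    for r in range(n):
        for c in range(n):
            pts = model_points[(rows == r) & (cols == c)]
            if len(pts):  # mean vertical distance to the model portion MCi
                distances[r, c] = np.mean(camera_position[2] - pts[:, 2])
    return distances  # e.g. a 3x3 array like D11..D19 in FIG. 10
```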
  • FIG. 10 is a diagram showing an example of deriving the distance D for each divided region DR after the imaging range CR is divided.
  • In FIG. 10, the imaging range CR11, in which the captured image GM11 is captured by the imaging unit 220, includes the divided regions DR11 to DR19.
  • The distance D11 between the position of the imaging unit 220 at which the captured image GM11 was taken and the corresponding portion M11 of the three-dimensional model M is the distance "1".
  • The distance D12 corresponding to the divided region DR12 (portion M12) is the distance "0.5".
  • The distance D13 corresponding to the divided region DR13 (portion M13) is the distance "1".
  • The distance D14 corresponding to the divided region DR14 (portion M14) is the distance "2".
  • The distance D15 corresponding to the divided region DR15 (portion M15) is the distance "2".
  • The distance D16 corresponding to the divided region DR16 (portion M16) is the distance "1.5".
  • The distance D17 corresponding to the divided region DR17 (portion M17) is the distance "1".
  • The distance D18 corresponding to the divided region DR18 (portion M18) is the distance "2.5".
  • The distance D19 corresponding to the divided region DR19 (portion M19) is the distance "1".
  • Similarly, the imaging range CR21, in which the captured image GM21 is captured by the imaging unit 220, includes the divided regions DR21 to DR29.
  • the distance D21 between the position of the imaging unit 220 where the captured image GM21 is captured and the corresponding part M21 of the three-dimensional model M is the distance "2".
  • The distance D22 corresponding to the divided region DR22 is the distance "1.5".
  • The distance D23 corresponding to the divided region DR23 is the distance "2".
  • The distance D24 corresponding to the divided region DR24 is the distance "3".
  • The distance D25 corresponding to the divided region DR25 is the distance "3".
  • The distance D26 corresponding to the divided region DR26 is the distance "2.5".
  • The distance D27 corresponding to the divided region DR27 is the distance "2".
  • The distance D28 corresponding to the divided region DR28 is the distance "3.5".
  • The distance D29 corresponding to the divided region DR29 is the distance "2".
  • The terminal control unit 81 keeps the portion of the captured image GM11 corresponding to the distance "1" (the portion corresponding to the divided region DR11) as it is (1x, with no enlargement or reduction), scales the portion corresponding to the distance "0.5" (the portion corresponding to the divided region DR12) by a factor of 0.5, and keeps the portion corresponding to the distance "1" (the portion corresponding to the divided region DR13) as it is (1x).
  • The terminal control unit 81 enlarges the portion corresponding to the distance "2" (the portion corresponding to the divided region DR14) by a factor of 2, enlarges the portion corresponding to the distance "2" (the portion corresponding to the divided region DR15) by a factor of 2, and enlarges the portion corresponding to the distance "1.5" (the portion corresponding to the divided region DR16) by a factor of 1.5.
  • The terminal control unit 81 keeps the portion corresponding to the distance "1" (the portion corresponding to the divided region DR17) as it is, enlarges the portion corresponding to the distance "2.5" (the portion corresponding to the divided region DR18) by a factor of 2.5, and keeps the portion corresponding to the distance "1" (the portion corresponding to the divided region DR19) as it is. In this way, the terminal control unit 81 adjusts the size of each of the regions corresponding to the divided regions DR11 to DR19 in the captured image GM11.
  • Similarly, for the captured image GM21, the terminal control unit 81 enlarges the portion corresponding to the distance "2" (divided region DR21) by a factor of 2, enlarges the portion corresponding to the distance "1.5" (divided region DR22) by a factor of 1.5, and enlarges the portion corresponding to the distance "2" (divided region DR23) by a factor of 2.
  • The terminal control unit 81 enlarges the portion corresponding to the distance "3" (divided region DR24) by a factor of 3, enlarges the portion corresponding to the distance "3" (divided region DR25) by a factor of 3, and enlarges the portion corresponding to the distance "2.5" (divided region DR26) by a factor of 2.5.
  • The terminal control unit 81 enlarges the portion corresponding to the distance "2" (divided region DR27) by a factor of 2, enlarges the portion corresponding to the distance "3.5" (divided region DR28) by a factor of 3.5, and enlarges the portion corresponding to the distance "2" (divided region DR29) by a factor of 2. In this way, the terminal control unit 81 adjusts the size of each of the regions corresponding to the divided regions DR21 to DR29 in the captured image GM21.
  • In this way, the terminal 80 can derive the distance D between the position of the imaging unit 220 (camera position) and the photographed object (three-dimensional model M) at the finer granularity of each divided region DR obtained by dividing the imaging range CR. Therefore, the terminal 80 can improve the accuracy of the distance D between the camera position and the three-dimensional model M, perform the distance-based size adjustment with high accuracy, and improve the reproducibility of the subject included in the composite image.
  • That is, the terminal 80 can derive the distance D between the imaging unit 220 and the three-dimensional model M for each divided region DR obtained by dividing the imaging range CR. The terminal 80 can then adjust the size of the captured image according to the distance D for each divided region DR, combine the plurality of resized captured images, and generate a composite image (see the sketch below). When the distance D is derived for each divided region, the terminal 80 can perform finer scale adjustment than when the distance D is derived for each imaging range.
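A hypothetical sketch of this per-region size adjustment follows: each divided region of the image is cut out and scaled by its own factor, mirroring the DR11 to DR19 example above. The compositing step that places the scaled patches is not shown; returning the patches is enough to illustrate the idea, and every distance is assumed valid (no NaN).

```python
import numpy as np
import cv2  # OpenCV, assumed available


def resize_regions(image: np.ndarray, distances: np.ndarray,
                   reference_distance: float = 1.0) -> list:
    """Scale each divided region DR of one captured image by its own distance
    D (distance 2 -> 2x, distance 0.5 -> 0.5x, as in the example above)."""
    n = distances.shape[0]
    h, w = image.shape[:2]
    patches = []
    for r in range(n):
        for c in range(n):
            patch = image[r * h // n:(r + 1) * h // n,
                          c * w // n:(c + 1) * w // n]
            scale = distances[r, c] / reference_distance
            patches.append(cv2.resize(patch, None, fx=scale, fy=scale))
    return patches
```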
  • The relationship between the distance and the size adjustment amount described above is an example; it is sufficient that the size adjustment amount is determined according to the distance.
  • By adjusting the scale for each divided region DR, the terminal 80 can adjust the scale more finely than when the scale is adjusted for each imaging range CR.
  • For example, even for terrain whose height difference changes frequently, the terminal 80 can perform scale adjustment that follows the height differences, which reduces the scale deviation of the original captured images within the composite image. Therefore, the terminal 80 can improve the generation accuracy of the composite image.
  • It is also conceivable for the terminal 80 to derive the distance between the imaging unit 220 and the three-dimensional model M for each point included in the sparse point group; in that case, however, many pieces of distance information at different distances appear for one captured image, making it difficult to determine an accurate scale.
  • In contrast, the terminal 80 derives the distance D per captured image, that is, per imaging range CR, or per divided region DR obtained by dividing the imaging range CR. This suppresses generating a composite image from multiple captured images at a uniform size that ignores the distance D.
  • In addition, since the terminal 80 derives the distance D coarsely rather than for every point of the sparse point group, the amount of calculation is reduced and the calculation time is shortened.
  • the terminal control unit 81 generates a composite image SG based on the plurality of captured images GM after resizing.
  • Near the boundaries of the imaging ranges CR of the captured images GM that form the basis of the composite image SG, or near the boundaries of the divided regions DR, the corresponding distances D differ, so the size adjustment amounts SA determined according to those distances also differ, which can produce discontinuous areas.
  • the terminal control unit 81 may correct the size adjustment amount SA near the boundary so that the size adjustment amount SA near the boundary changes smoothly in the adjacent area.
  • the vicinity of the boundary may be, for example, a portion of each shooting range CR within a specific range from the boundary of the adjacent shooting range CR.
  • the specific range may be at least a part of the overlapping area where the adjacent shooting ranges CR overlap.
  • the captured images GM corresponding to the adjacent shooting ranges CR are adjacent to each other in the composite image SG.
  • the vicinity of the boundary may be, for example, a portion of each divided region DR within a specific range from the boundary of the adjacent divided region DR.
  • the parts of the captured image GM corresponding to the adjacent divided regions DR are adjacent to each other in the composite image SG.
  • The terminal control unit 81 may smoothly change the size adjustment amount SA near the boundary of adjacent imaging ranges CR. For example, when the distance D is 1 in the imaging range CR31 and the distance D is 2 in the adjacent imaging range CR32, the terminal control unit 81 may correct the size adjustment amount SA so that, near the boundary, the size adjustment amount (for example, the magnification) changes smoothly from 1 to 2.
  • FIG. 11 is a diagram showing a correction example of the size adjustment amount.
  • In FIG. 11, the distance D is 1 in the imaging range CR31 and 2 in the imaging range CR32. Therefore, if the size adjustment amount SA were not corrected, the size adjustment amount SA (for example, the magnification) would be 1 in the captured image GM31 corresponding to the imaging range CR31 and 2 in the captured image GM32 corresponding to the imaging range CR32.
  • Outside the vicinity of the boundary, the size adjustment amount SA may remain 1 in the captured image GM31 and 2 in the captured image GM32.
  • The terminal control unit 81 may perform correction so that, in the composite image SG, the size adjustment amount SA near the boundary changes smoothly from 1 to 2 from the captured image GM31 toward the adjacent captured image GM32.
  • The terminal control unit 81 may change the size adjustment amount SA near the boundary linearly (see the line g1) or non-linearly (see the curve g2).
  • In this way, the terminal 80 can suppress degradation of the image quality of the composite image SG caused by discontinuous areas near the ends (near the boundaries) of multiple captured images GM that have different size adjustment amounts SA. The same applies when the distance D is derived and the size is adjusted for each divided region DR. A sketch of this correction follows.
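One way to realize this correction, sketched under the assumption that the size adjustment amount SA is blended across a transition band of `band_px` pixels (all names hypothetical):

```python
import numpy as np


def smooth_scale_profile(sa_left: float, sa_right: float,
                         band_px: int, nonlinear: bool = False) -> np.ndarray:
    """SA values across the boundary band, changing smoothly from sa_left
    (e.g. 1 in GM31) to sa_right (e.g. 2 in GM32): linear like g1, or a
    smoothstep curve as one possible non-linear profile like g2."""
    t = np.linspace(0.0, 1.0, band_px)
    if nonlinear:
        t = t * t * (3.0 - 2.0 * t)  # smoothstep: zero slope at both ends
    return sa_left + (sa_right - sa_left) * t
```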
  • the terminal control unit 81 may generate a three-dimensional model using the grid gd corresponding to the shooting range CR.
  • The terrain here may include, for example, the shape of the photographed objects (e.g., the ground, buildings, and other objects) captured by the imaging unit 220 included in the flying unmanned aerial vehicle 100.
  • the grid gd will be described.
  • the grid gd may be formed of a grid pattern.
  • The grid gd virtually represents the terrain of the imaging range of the three-dimensional model M.
  • the grid gd can be set within the same range as the shooting range CR or within the range included in the shooting range CR.
  • the grid gd may be lattice-shaped, triangular, other polygonal, or other shapes, including grid points gp that are vertices of the grid gd.
  • the interval (grid interval) of each grid point gp may be a predetermined value, or may be arbitrarily set by the terminal control unit 81.
  • the grid spacing can be 1m, 2m, etc.
  • the terminal control section 81 may specify the grid interval via the operation section 83.
  • the position in the two-dimensional plane of the sparse point group (position not considering height in three-dimensional space) may not coincide with the position in the two-dimensional plane of grid points gp (position not considering the height of the grid).
  • FIG. 12 is a flowchart showing an example of the generation process of the three-dimensional model M.
  • The generation of the three-dimensional model M shown in FIG. 12 is an example, and the three-dimensional model may be generated by other methods.
  • the terminal control unit 81 projects the sparse point group (three-dimensional point group PG3) in the three-dimensional space (XYZ coordinate system) onto a two-dimensional plane (XY plane), and generates a sparse point group (two-dimensional point group PG2) projected on the two-dimensional plane (S11).
  • the terminal control unit 81 may generate the two-dimensional point group PG2 by setting the height (Z coordinate) of the three-dimensional point group PG3 to the value 0.
  • the sparse point group here may be the sparse point group generated in S3 of FIG. 6.
  • The terminal control unit 81 specifies a plurality of adjacent points included in the two-dimensional point group PG2 (S12). Then, for the specified points, the terminal control unit 81 takes into account the heights of the three-dimensional point group PG3 before projection onto the two-dimensional plane, and connects the points in three-dimensional space corresponding to the specified points on the two-dimensional plane to generate a surface sf (S13).
  • a plurality of points can be specified in the whole or part of the two-dimensional point group PG2.
  • A plurality of groups, each having a plurality of points for generating a surface sf, may be formed, and a surface sf may be generated for each of the groups.
  • For example, the terminal control unit 81 may take three adjacent points included in the two-dimensional point group PG2 and connect the corresponding three points included in the three-dimensional point group PG3 to generate a triangular surface sf, that is, perform a triangulation (Delaunay triangulation).
  • The terminal control unit 81 may also generate the surfaces sf by a method other than Delaunay triangulation; a sketch of the Delaunay-based approach follows.
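Steps S11 to S13 can be sketched with SciPy's Delaunay triangulation, which decides adjacency purely in the 2D projection and then lifts each triangle back to 3D with the original heights. This is an illustration, not the patent's implementation:

```python
import numpy as np
from scipy.spatial import Delaunay


def triangulate_sparse_points(points_3d: np.ndarray):
    """points_3d: (N, 3) sparse point group PG3. Returns the 2D triangulation
    and the lifted 3D triangles (the surfaces sf)."""
    points_2d = points_3d[:, :2]          # S11: drop Z -> 2D point group PG2
    tri = Delaunay(points_2d)             # S12: adjacency decided in 2D only
    faces_sf = points_3d[tri.simplices]   # S13: (M, 3, 3) triangles keep height
    return tri, faces_sf
```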
  • The terminal control unit 81 sets the height (grid height) of each grid point gp to the height of the surface sf at that grid point. That is, the terminal control unit 81 may set, as the grid height, the height of the intersection between the surface sf and a vertical straight line passing through the grid point gp. In this way, the terminal control unit 81 calculates the three-dimensional position of each grid point gp (mesh point) (S14).
  • The terminal control unit 81 generates the three-dimensional model M based on the three-dimensional positions of the grid points gp (S15).
  • the shape of the three-dimensional model M can be defined by the three-dimensional position of each grid point gp.
  • The shape of the three-dimensional model M may be a shape combining the surfaces sf. In this way, the terminal 80 can generate and determine the three-dimensional model M based on the three-dimensional position of each grid point gp in the grid gd (see the sketch below).
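Steps S14 and S15 then reduce to evaluating the surface height above each grid point gp. Linear interpolation inside Delaunay triangles is exactly what `scipy.interpolate.LinearNDInterpolator` computes, so a sketch (grid extents and spacing are assumptions) is:

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator


def grid_heights(points_3d: np.ndarray, grid_spacing: float = 1.0):
    """3D positions of the grid points gp: heights of the surfaces sf above
    each grid point, via linear interpolation over the 2D triangulation."""
    interp = LinearNDInterpolator(points_3d[:, :2], points_3d[:, 2])
    xs = np.arange(points_3d[:, 0].min(), points_3d[:, 0].max(), grid_spacing)
    ys = np.arange(points_3d[:, 1].min(), points_3d[:, 1].max(), grid_spacing)
    gx, gy = np.meshgrid(xs, ys)
    gz = interp(gx, gy)  # NaN outside the triangulated area
    return gx, gy, gz
```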
  • the terminal control unit 81 can project a plurality of three-dimensional points included in the sparse point group (an example of sparse point group data) onto a two-dimensional plane.
  • the terminal control unit 81 may designate a plurality of two-dimensional points projected to be adjacent in a two-dimensional plane as a group.
  • the terminal control unit 81 may specify a plurality of such groups.
  • The terminal control unit 81 may connect, group by group, the plurality of three-dimensional points included in the sparse point group that correspond to the specified adjacent two-dimensional points, to generate a plurality of surfaces sf (an example of surface data).
  • The terminal control unit 81 may generate the three-dimensional model M based on the plurality of surfaces sf.
  • The terminal 80 temporarily projects the three-dimensional point group PG3 onto the two-dimensional plane to generate the two-dimensional point group PG2, and generates the surfaces based on the adjacency relationships of the two-dimensional point group PG2. The terminal 80 can therefore obtain a smooth shape more easily than when the surfaces sf are generated directly from adjacency in the three-dimensional point group PG3. By deriving the shape of the three-dimensional model M from the adjacency relationships of the two-dimensional point group PG2, the terminal 80 can improve the reproduction accuracy of the actual terrain.
  • FIG. 13 is a diagram showing an example of a three-dimensional point group PG3 and a two-dimensional point group PG2.
  • the terminal control unit 81 generates a two-dimensional point group PG2 based on the three-dimensional point group PG3.
  • the terminal control unit 81 projects each point included in the three-dimensional point group PG3 in the three-dimensional space (XYZ space) onto a two-dimensional plane (XY plane), and generates each point included in the two-dimensional point group PG2.
  • The terminal control unit 81 specifies, for example, the adjacent points P21 to P23 (examples of two-dimensional points) included in the two-dimensional point group PG2.
  • The points P31 to P33 (examples of three-dimensional points) in the three-dimensional point group PG3 corresponding to P21 to P23 are the points before the points P21 to P23 were projected onto the two-dimensional plane, and are used to generate one surface sf.
  • The terminal 80 specifies a plurality of points that are adjacent in the two-dimensional plane and connects the corresponding plurality of points in the three-dimensional point group PG3 to generate one surface sf, so it can suppress the surfaces sf from intersecting or becoming discontinuous in three-dimensional space. Therefore, the terminal 80 can generate a three-dimensional model M whose shape better conforms to the actual terrain.
  • Non-Patent Document 1: Michael Kazhdan, Hugues Hoppe, "Screened Poisson Surface Reconstruction," ACM Transactions on Graphics (TOG), Volume 32, Issue 3, June 2013, Article No. 29.
  • FIG. 14 is a diagram showing, as a comparative example, a three-dimensional model generated from a sparse point group according to the Screened Poisson surface reconstruction algorithm of Non-Patent Document 1.
  • In FIG. 14, the terrains G1 and G2 have shapes that protrude in the horizontal direction, which differ from the actual terrain. This is considered to be because, in the comparative example, there are multiple intersection points between a vertical straight line in the three-dimensional space and the surface derived from the sparse point group (three-dimensional point group). In addition, because the comparative example uses a sparse point group, the reproducibility of the three-dimensional model is low, and the accuracy of the connection relationships of the points used to generate the surface is reduced.
  • FIG. 15 is a diagram showing an example of the three-dimensional model M generated in the generation process of the three-dimensional model M of the present embodiment.
  • In the present embodiment, the terminal control unit 81 projects the sparse point group (three-dimensional point group PG3) onto a two-dimensional plane and performs, for example, a triangulation, so continuous triangles are generated in the two-dimensional plane. Therefore, even in three-dimensional space, the terminal 80 generates continuous triangles in directions along the two-dimensional plane, and can suppress multiple intersections between a vertical straight line in the three-dimensional space and the surfaces sf derived from the sparse point group. For example, whereas FIG. 14 contains the horizontally protruding terrains G1 and G2, in FIG. 15 the portions corresponding to G1 and G2 become the smooth terrain G3.
  • In addition, the continuity of the surfaces sf formed in three-dimensional space by connecting the points is ensured.
  • the height of the surface sf in the three-dimensional space may be based on the height before the projection of the three-dimensional point group onto the two-dimensional plane.
  • In this way, the unmanned aerial vehicle 100 (an example of an image generation device) can generate a composite image SG based on a plurality of captured images GM captured by the unmanned aerial vehicle 100 (an example of a flying body).
  • the terminal 80 may include a terminal control unit 81 (an example of a processing unit) that performs processing related to the generation of the synthesized image SG.
  • the terminal control unit 81 can acquire a plurality of captured images GM captured by the imaging unit 220 (an example of an imaging device) included in the UAV 100.
  • The terminal control unit 81 may generate the three-dimensional model M based on the plurality of captured images GM, and may acquire at least one of each position and each posture of the imaging unit 220 at the time of capturing the plurality of captured images GM.
  • the terminal control unit 81 may derive the distance D between each position of the imaging unit 220 and the three-dimensional model M based on at least one of each position and each posture of the imaging unit 220 and the three-dimensional model M.
  • The terminal control unit 81 may adjust the sizes of the plurality of captured images GM based on the distances D between the positions of the imaging unit 220 and the three-dimensional model M.
  • the terminal control unit 81 may synthesize the plurality of captured images GM after the size adjustment to generate a synthesized image SG.
  • the terminal control section 81 may generate sparse point group data based on the plurality of captured images GM, and may generate a three-dimensional model M based on the sparse point group data.
  • The terminal 80 adjusts the size of each captured image GM and combines them so that the sizes of the subject reflected in the captured images GM match, which improves the reproducibility of the subject represented by the composite image.
  • The terminal 80 does not need to sequentially perform all processes such as sparse point generation, dense point generation, mesh generation, and texture generation in order to generate the composite image. Therefore, the terminal 80 can reduce the processing load for generating the composite image and shorten the processing time, and can easily generate a composite image even on a tablet terminal or the like whose calculation processing capability is not very high.
  • In addition to the composite image, the terminal 80 can also generate an orthophoto and the like. The sketch below ties the steps of this embodiment together.
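Finally, an end-to-end sketch that strings the steps together, using the hypothetical helpers from the earlier sketches; `composite` is a placeholder for the placement and blending step, which is not shown here.

```python
def generate_composite_image(images, camera_positions, model_points_per_image):
    """Per-image variant: one distance D per captured image, resize, combine."""
    resized = []
    for img, cam, pts in zip(images, camera_positions, model_points_per_image):
        d = distance_to_model(cam, pts)        # distance D for this image
        resized.append(resize_by_distance(img, d))
    return composite(resized)  # placeholder: place and blend the images
```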

Abstract

Provided is an image generation device that can easily generate a composite image and the like using a device with limited computational processing capability, while suppressing the reduction in reproducibility of the subject represented by the composite image. The image generation device generates the composite image on the basis of a plurality of captured images captured by a flying body, and comprises a processing unit that performs processing related to the generation of the composite image. The processing unit acquires the plurality of captured images captured by an imaging device of the flying body, generates a three-dimensional model on the basis of the plurality of captured images, acquires the postures of the imaging device at the time of capturing the plurality of captured images, calculates, on the basis of the postures of the imaging device and the three-dimensional model, the distances between the positions of the imaging device at the time of capturing the plurality of captured images and the three-dimensional model, adjusts the sizes of the plurality of captured images according to those distances, and combines the resized captured images to generate the composite image.

Description

Image generation device, image generation method, program and recording medium

【Technical Field】

The present disclosure relates to an image generation device, an image generation method, a program, and a recording medium that generate a composite image based on a plurality of captured images captured by a flying body.

【Background Art】

Conventionally, a platform (unmanned aerial vehicle) that performs imaging while moving along a preset fixed path is known. This platform receives an imaging instruction from a ground base and images the imaging target. When imaging a subject, the platform flies along the fixed path while tilting its imaging device according to the positional relationship between the platform and the subject (see Patent Document 1).

【Prior Art Literature】

【Patent Literature】

【Patent Document 1】Japanese Patent Application Publication No. 2010-61216

【Summary of the Invention】

【Problems to Be Solved by the Invention】

Multiple images captured by the drone of Patent Document 1 can be combined to generate a composite image. It is also possible, based on multiple images captured by that drone, to project a dense three-dimensional point group onto a two-dimensional plane using three-dimensional reconstruction techniques, thereby generating an orthophoto. When generating the former composite image, the scale (size) of a reference image needs to be identified in advance so that all images can be fitted to the reference image. If the composite image is generated without considering the size of each image, the reproducibility of the subject represented by the composite image is reduced. When generating the latter orthophoto, a dense three-dimensional point group must be generated, and the calculation for generating it is complicated and computationally expensive. For example, the process of three-dimensional reconstruction for generating an orthophoto requires sequentially performing sparse point generation (for example, SfM: Structure from Motion), dense point generation (for example, MVS: Multi-View Stereo), mesh generation, texture generation, and so on, which makes the processing burden of generating an orthophoto heavy. Therefore, it is difficult to easily generate a composite image or an orthophoto on a tablet terminal or the like whose calculation processing capability is not very high.

【Means for Solving the Problems】
In one aspect, an image generation device generates a composite image based on a plurality of captured images captured by a flying body, and includes a processing unit that performs processing related to the generation of the composite image. The processing unit acquires a plurality of captured images captured by an imaging device included in the flying body; generates a three-dimensional model based on the plurality of captured images; acquires each posture of the imaging device at the time of capturing the plurality of captured images; calculates, based on each posture of the imaging device and the three-dimensional model, the distance between each position of the imaging device at the time of capturing the plurality of captured images and the three-dimensional model; adjusts the sizes of the plurality of captured images based on the distances between the positions of the imaging device and the three-dimensional model; and combines the resized captured images to generate the composite image.

The processing unit may acquire each position and each posture of the imaging device at the time of capturing the plurality of captured images, and calculate the distance between each position of the imaging device and the three-dimensional model based on those positions and postures and the three-dimensional model.

The processing unit may calculate, for each imaging range captured by the imaging device at each position, the distance between the imaging device and a first portion of the three-dimensional model corresponding to that imaging range.

The processing unit may divide the imaging range captured at each position of the imaging device to produce divided regions of the imaging range, determine a second portion of the three-dimensional model corresponding to each divided region, and calculate, for each divided region, the distance between the imaging device and the second portion of the three-dimensional model corresponding to that divided region.

The distance may be the distance in the vertical direction between each position of the imaging device and the three-dimensional model.

The distance may be the distance along the imaging direction of the imaging device between each position of the imaging device and the three-dimensional model.

The processing unit may generate sparse point group data based on the plurality of captured images, and generate the three-dimensional model based on the sparse point group data.

The processing unit may project a plurality of three-dimensional points contained in the sparse point group data onto a two-dimensional plane; designate, as a group, a plurality of projected two-dimensional points that are adjacent in the two-dimensional plane, and designate a plurality of such groups; connect, group by group, the plurality of three-dimensional points contained in the sparse point group data that correspond to the designated adjacent two-dimensional points, to produce a plurality of pieces of surface data; and generate the three-dimensional model based on the plurality of pieces of surface data.
In one aspect, an image generation method generates a composite image based on a plurality of captured images captured by a flying body, and includes the steps of: acquiring a plurality of captured images captured by an imaging device included in the flying body; generating a three-dimensional model based on the plurality of captured images; acquiring each posture of the imaging device at the time of capturing the plurality of captured images; calculating, based on each posture of the imaging device and the three-dimensional model, the distance between each position of the imaging device at the time of capturing the plurality of captured images and the three-dimensional model; adjusting the sizes of the plurality of captured images based on the distances between the positions of the imaging device and the three-dimensional model; and combining the resized captured images to generate the composite image.

The step of acquiring the postures may include acquiring each position and each posture of the imaging device at the time of capturing the plurality of captured images. The step of calculating the distances may include calculating the distance between each position of the imaging device and the three-dimensional model based on those positions and postures and the three-dimensional model.

The step of calculating the distances may include calculating, for each imaging range captured by the imaging device at each position, the distance between the imaging device and a first portion of the three-dimensional model corresponding to that imaging range.

The step of calculating the distances may include: dividing the imaging range captured at each position of the imaging device to produce divided regions of the imaging range; determining a second portion of the three-dimensional model corresponding to each divided region; and calculating, for each divided region, the distance between the imaging device and the second portion of the three-dimensional model corresponding to that divided region.

The distance may be the distance in the vertical direction between each position of the imaging device and the three-dimensional model.

The distance may be the distance along the imaging direction of the imaging device between each position of the imaging device and the three-dimensional model.

The step of generating the three-dimensional model may include: generating sparse point group data based on the plurality of captured images; and generating the three-dimensional model based on the sparse point group data.

The step of generating the three-dimensional model may include: projecting a plurality of three-dimensional points contained in the sparse point group data onto a two-dimensional plane; designating, as a group, a plurality of projected two-dimensional points that are adjacent in the two-dimensional plane, and designating a plurality of such groups; connecting, group by group, the plurality of three-dimensional points contained in the sparse point group data that correspond to the designated adjacent two-dimensional points, to produce a plurality of pieces of surface data; and generating the three-dimensional model based on the plurality of pieces of surface data.
In one aspect, a program causes an image generation device that generates a composite image based on a plurality of captured images captured by a flying body to execute the steps of: acquiring a plurality of captured images captured by an imaging device included in the flying body; generating a three-dimensional model based on the plurality of captured images; acquiring each posture of the imaging device at the time of capturing the plurality of captured images; calculating, based on each posture of the imaging device and the three-dimensional model, the distance between each position of the imaging device at the time of capturing the plurality of captured images and the three-dimensional model; adjusting the sizes of the plurality of captured images based on the distances between the positions of the imaging device and the three-dimensional model; and combining the resized captured images to generate the composite image.

In one aspect, a recording medium is a computer-readable recording medium on which is recorded a program that causes an image generation device that generates a composite image based on a plurality of captured images captured by a flying body to execute the steps of: acquiring a plurality of captured images captured by an imaging device included in the flying body; generating a three-dimensional model based on the plurality of captured images; acquiring each posture of the imaging device at the time of capturing the plurality of captured images; calculating, based on each posture of the imaging device and the three-dimensional model, the distance between each position of the imaging device at the time of capturing the plurality of captured images and the three-dimensional model; adjusting the sizes of the plurality of captured images based on the distances between the positions of the imaging device and the three-dimensional model; and combining the resized captured images to generate the composite image.

In addition, the above summary of the invention does not enumerate all the features of the present disclosure. Sub-combinations of these feature groups may also constitute an invention.
【Brief Description of the Drawings】

FIG. 1 is a schematic diagram showing a first configuration example of the flight system in Embodiment 1.

FIG. 2 is a schematic diagram showing a second configuration example of the flight system in Embodiment 1.

FIG. 3 is a diagram showing an example of the specific appearance of an unmanned aerial vehicle.

FIG. 4 is a block diagram showing an example of the hardware configuration of the unmanned aerial vehicle.

FIG. 5 is a block diagram showing an example of the hardware configuration of the terminal.

FIG. 6 is a flowchart showing an example of an image composition processing procedure.

FIG. 7 is a diagram showing an example of the distances between positions of the imaging unit and corresponding portions of the three-dimensional model.

FIG. 8 is a diagram showing an example of deriving the distance between the position of the imaging unit and the three-dimensional model.

FIG. 9 is a diagram showing an example of deriving the distance for each imaging range.

FIG. 10 is a diagram showing an example of deriving the distance for each divided region after the imaging range is divided.

FIG. 11 is a diagram showing a correction example of the size adjustment amount.

FIG. 12 is a flowchart showing an example of the generation process of the three-dimensional model.

FIG. 13 is a diagram showing an example of a three-dimensional point group and a two-dimensional point group.

FIG. 14 is a diagram showing a three-dimensional model generated in a comparative example.

FIG. 15 is a diagram showing an example of the three-dimensional model generated in Embodiment 1.
【Detailed Description】

Hereinafter, the present disclosure will be described through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. Not all combinations of the features described in the embodiments are necessarily essential to the solution of the invention.

The claims, the specification, the drawings, and the abstract contain matters subject to copyright protection. The copyright owner will not object to the reproduction of these documents by anyone as they appear in the files or records of the Patent Office. In all other cases, however, all copyrights are reserved.

In the following embodiments, an unmanned aerial vehicle (UAV: Unmanned Aerial Vehicle) is taken as an example of a flying body. In the drawings of this specification, the unmanned aerial vehicle is also written as "UAV". An image generation system is exemplified by a flight system having an unmanned aerial vehicle and a terminal. Although the image generation device is mainly described taking an unmanned aerial vehicle as an example, it may also be a terminal. The terminal may be a smartphone, a tablet terminal, a PC (Personal Computer), or another device. The image generation method defines the operations in the image generation device. In addition, a program (for example, a program that causes the image generation device to execute various processes) is recorded in the recording medium.
(Embodiment 1)

FIG. 1 is a schematic diagram showing a first configuration example of the flight system 10 in Embodiment 1. The flight system 10 includes an unmanned aerial vehicle 100 and a terminal 80. The unmanned aerial vehicle 100 and the terminal 80 can communicate with each other through wired or wireless communication (for example, a wireless LAN (Local Area Network)). FIG. 1 illustrates the case where the terminal 80 is a portable terminal (for example, a smartphone or a tablet terminal).

In addition, the flight system may be configured to include an unmanned aerial vehicle, a transmitter (proportional controller), and a portable terminal. When a transmitter is included, the user can control the flight of the unmanned aerial vehicle using the left and right control sticks arranged on the front of the transmitter. In this case, the unmanned aerial vehicle, the transmitter, and the portable terminal can communicate with each other through wired or wireless communication.

FIG. 2 is a schematic diagram showing a second configuration example of the flight system 10 in Embodiment 1. FIG. 2 illustrates the case where the terminal 80 is a PC. In either FIG. 1 or FIG. 2, the terminal 80 may have the same functions.
FIG. 3 is a diagram showing an example of the specific appearance of the unmanned aerial vehicle 100. FIG. 3 shows a perspective view of the unmanned aerial vehicle 100 when it flies in the moving direction STV0. The unmanned aerial vehicle 100 is an example of a mobile body.

As shown in FIG. 3, the roll axis (see the x-axis) is set in a direction parallel to the ground and along the moving direction STV0. In this case, the pitch axis (see the y-axis) is set in a direction parallel to the ground and perpendicular to the roll axis, and the yaw axis (see the z-axis) is set in a direction perpendicular to the ground and perpendicular to both the roll axis and the pitch axis.

The unmanned aerial vehicle 100 is configured to include a UAV main body 102, a gimbal 200, an imaging unit 220, and a plurality of imaging units 230.

The UAV main body 102 includes a plurality of rotors (propellers). The UAV main body 102 makes the unmanned aerial vehicle 100 fly by controlling the rotation of the rotors, for example using four rotors. The number of rotors is not limited to four. In addition, the unmanned aerial vehicle 100 may be a fixed-wing aircraft without rotors.

The imaging unit 220 is an imaging camera that captures a subject included in a desired imaging range (for example, the sky above as an aerial photography target, scenery such as mountains and rivers, or buildings on the ground).

The plurality of imaging units 230 are sensing cameras that capture the surroundings of the unmanned aerial vehicle 100 in order to control its flight. Two imaging units 230 may be installed on the nose, that is, the front, of the unmanned aerial vehicle 100, and the other two imaging units 230 may be installed on its bottom surface. The two imaging units 230 on the front side may be paired and function as a so-called stereo camera, and the two imaging units 230 on the bottom side may likewise be paired and function as a stereo camera. Three-dimensional spatial data around the unmanned aerial vehicle 100 may be generated based on the images captured by the plurality of imaging units 230. The number of imaging units 230 included in the unmanned aerial vehicle 100 is not limited to four; the unmanned aerial vehicle 100 only needs to include at least one imaging unit 230, and may include at least one imaging unit 230 on each of its nose, tail, sides, bottom surface, and top surface. The angle of view settable in the imaging unit 230 may be larger than the angle of view settable in the imaging unit 220. The imaging unit 230 may have a single-focus lens or a fisheye lens.
FIG. 4 is a block diagram showing an example of the hardware configuration of the unmanned aerial vehicle 100. The unmanned aerial vehicle 100 is configured to include a UAV control unit 110, a communication interface 150, a memory 160, a storage 170, a gimbal 200, a rotor mechanism 210, an imaging unit 220, imaging units 230, a GPS receiver 240, an inertial measurement unit (IMU: Inertial Measurement Unit) 250, a magnetic compass 260, a barometric altimeter 270, an ultrasonic sensor 280, and a laser measuring instrument 290.

The UAV control unit 110 is composed of, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor). The UAV control unit 110 performs signal processing for overall control of the operations of each part of the unmanned aerial vehicle 100, input/output processing of data with the other parts, data calculation processing, and data storage processing.

The UAV control unit 110 controls the flight of the unmanned aerial vehicle 100 according to a program stored in the memory 160. The UAV control unit 110 can control the flight and can capture aerial images.

The UAV control unit 110 acquires position information indicating the position of the unmanned aerial vehicle 100. The UAV control unit 110 may acquire, from the GPS receiver 240, position information indicating the latitude, longitude, and altitude at which the unmanned aerial vehicle 100 is located. The UAV control unit 110 may acquire, as position information, latitude and longitude information indicating the latitude and longitude of the unmanned aerial vehicle 100 from the GPS receiver 240 and altitude information indicating its altitude from the barometric altimeter 270. The UAV control unit 110 may acquire, as altitude information, the distance between the emission point of ultrasonic waves generated by the ultrasonic sensor 280 and their reflection point.

The UAV control unit 110 may acquire orientation information indicating the orientation of the unmanned aerial vehicle 100 from the magnetic compass 260. The orientation information may be expressed, for example, as an azimuth corresponding to the orientation of the nose of the unmanned aerial vehicle 100.

The UAV control unit 110 may acquire position information indicating the position where the unmanned aerial vehicle 100 should be when the imaging unit 220 captures the imaging range to be captured. The UAV control unit 110 may acquire this position information from the memory 160, or from another device via the communication interface 150. The UAV control unit 110 may refer to a three-dimensional map database to specify a position where the unmanned aerial vehicle 100 can be, and acquire that position as the position information indicating where the unmanned aerial vehicle 100 should be.

The UAV control unit 110 may acquire imaging range information indicating the respective imaging ranges of the imaging unit 220 and the imaging units 230. The UAV control unit 110 may acquire, as a parameter for specifying an imaging range, angle-of-view information indicating the angles of view of the imaging unit 220 and the imaging units 230 from those units. The UAV control unit 110 may acquire, as a parameter for specifying an imaging range, information indicating the imaging directions of the imaging unit 220 and the imaging units 230. The UAV control unit 110 may acquire, for example, posture information indicating the posture state of the imaging unit 220 from the gimbal 200 as information indicating the imaging direction of the imaging unit 220. The posture information of the imaging unit 220 may indicate the angle by which the gimbal 200 has rotated from reference rotation angles of the pitch axis and the yaw axis.

The UAV control unit 110 may acquire, as a parameter for specifying an imaging range, position information indicating the position of the unmanned aerial vehicle 100. The UAV control unit 110 may demarcate the imaging range representing the geographic range captured by the imaging unit 220, based on the angles of view and imaging directions of the imaging unit 220 and the imaging units 230 and on the position of the unmanned aerial vehicle 100, and generate the imaging range information, thereby acquiring it.

The UAV control unit 110 may acquire the imaging range information from the memory 160, or via the communication interface 150.
UAV控制部110控制万向节200、旋翼机构210、摄像部220以及摄像部230。UAV控制部110可以通过变更摄像部220的摄像方向或视角来控制摄像部220的摄像范围。UAV控制部110可以通过控制万向节200的旋转机构来控制万向节200所支持的摄像部220的摄像范围。The UAV control unit 110 controls the universal joint 200, the rotor mechanism 210, the imaging unit 220, and the imaging unit 230. The UAV control unit 110 can control the imaging range of the imaging unit 220 by changing the imaging direction or angle of view of the imaging unit 220. The UAV control unit 110 can control the imaging range of the imaging unit 220 supported by the universal joint 200 by controlling the rotation mechanism of the universal joint 200.
摄像范围是指由摄像部220或摄像部230拍摄的地理范围。摄像范围由纬度、经度和高度定义。摄像范围可以是由纬度、经度和高度定义的三维空间数据的范围。摄像范围可以是由纬度和经度定义的二维空间数据的范围。摄像范围可以根据摄像部220或摄像部230的视角和摄像方向以及无人驾驶航空器100所在的位置而特别指定。摄像部220和摄像部230的摄像方向可以由设置有摄像部220和摄像部230的摄像镜头的正面所朝的方位和俯角来定义。摄像部220的摄像方向可以是由无人驾驶航空器100的机头的方位和相对于万向节200的摄像部220的姿势状态而特别指定的方向。摄像部230的摄像方向可以是由无人驾驶航空器100的机头的方位和设置有摄像部230的位置而特别指定的方向。The imaging range refers to the geographic range captured by the imaging unit 220 or the imaging unit 230. The camera range is defined by latitude, longitude and altitude. The imaging range may be the range of three-dimensional spatial data defined by latitude, longitude, and altitude. The imaging range may be the range of two-dimensional spatial data defined by latitude and longitude. The imaging range may be specified according to the angle of view and imaging direction of the imaging unit 220 or the imaging unit 230 and the location where the unmanned aircraft 100 is located. The imaging direction of the imaging unit 220 and the imaging unit 230 can be defined by the azimuth and depression angle of the front of the imaging lens provided with the imaging unit 220 and the imaging unit 230. The imaging direction of the imaging unit 220 may be a direction specified by the orientation of the nose of the unmanned aircraft 100 and the posture state of the imaging unit 220 of the universal joint 200. The imaging direction of the imaging unit 230 may be a direction specified by the orientation of the nose of the unmanned aircraft 100 and the position where the imaging unit 230 is provided.
The UAV control unit 110 may determine the environment around the unmanned aerial vehicle 100 by analyzing a plurality of images captured by the plurality of imaging units 230. The UAV control unit 110 may control the flight according to the environment around the unmanned aerial vehicle 100, for example, by avoiding obstacles.
The UAV control unit 110 may acquire stereoscopic information (three-dimensional information) indicating the stereoscopic shape (three-dimensional shape) of objects existing around the unmanned aerial vehicle 100. An object may be, for example, part of a landscape, such as a building, a road, a vehicle, or a tree. The stereoscopic information is, for example, three-dimensional spatial data. The UAV control unit 110 may acquire stereoscopic information by generating, from the images obtained from the plurality of imaging units 230, stereoscopic information indicating the stereoscopic shape of the objects existing around the unmanned aerial vehicle 100. The UAV control unit 110 may acquire stereoscopic information indicating the stereoscopic shape of the objects existing around the unmanned aerial vehicle 100 by referring to a three-dimensional map database stored in the memory 160 or the storage 170. The UAV control unit 110 may acquire stereoscopic information relating to the stereoscopic shape of the objects existing around the unmanned aerial vehicle 100 by referring to a three-dimensional map database managed by a server on a network.
The UAV control unit 110 controls the flight of the unmanned aerial vehicle 100 by controlling the rotor mechanism 210. That is, the UAV control unit 110 controls the position of the unmanned aerial vehicle 100, including its latitude, longitude, and altitude, by controlling the rotor mechanism 210. The UAV control unit 110 may control the imaging range of the imaging unit 220 by controlling the flight of the unmanned aerial vehicle 100. The UAV control unit 110 may control the angle of view of the imaging unit 220 by controlling a zoom lens included in the imaging unit 220. The UAV control unit 110 may control the angle of view of the imaging unit 220 through digital zoom, using the digital zoom function of the imaging unit 220.
When the imaging unit 220 is fixed to the unmanned aerial vehicle 100 and cannot be moved, the UAV control unit 110 can cause the imaging unit 220 to capture a desired imaging range under a desired environment by moving the unmanned aerial vehicle 100 to a specific position at a specified date and time. Alternatively, even when the imaging unit 220 has no zoom function and its angle of view cannot be changed, the UAV control unit 110 can cause the imaging unit 220 to capture a desired imaging range under a desired environment by moving the unmanned aerial vehicle 100 to a specific position at a specific date and time.
The communication interface 150 communicates with the terminal 80. The communication interface 150 may perform wireless communication by any wireless communication method, or wired communication by any wired communication method. The communication interface 150 may transmit captured images (for example, aerial images) and additional information (metadata) related to the captured images to the terminal 80.
The memory 160 stores programs and the like that the UAV control unit 110 needs in order to control the gimbal 200, the rotor mechanism 210, the imaging unit 220, the imaging unit 230, the GPS receiver 240, the inertial measurement device 250, the magnetic compass 260, the barometric altimeter 270, the ultrasonic sensor 280, and the laser rangefinder 290. The memory 160 may be a computer-readable recording medium and may include at least one of SRAM (Static Random Access Memory), DRAM (Dynamic Random Access Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), and flash memory such as USB (Universal Serial Bus) memory. The memory 160 may be detachable from the unmanned aerial vehicle 100. The memory 160 may operate as a working memory.
The storage 170 may include at least one of an HDD (Hard Disk Drive), an SSD (Solid State Drive), an SD card, a USB memory, and other storage. The storage 170 may hold various kinds of information and data. The storage 170 may be detachable from the unmanned aerial vehicle 100. The storage 170 may record captured images.
The memory 160 or the storage 170 may hold information on imaging positions and imaging paths generated by the terminal 80 or the unmanned aerial vehicle 100. The information on the imaging positions and imaging paths may be set by the UAV control unit 110 as one of the imaging parameters related to imaging planned by the unmanned aerial vehicle 100 or the flight parameters related to flight scheduled for the unmanned aerial vehicle 100. This setting information may be held in the memory 160 or the storage 170.
The gimbal 200 may rotatably support the imaging unit 220 about the yaw axis, the pitch axis, and the roll axis. The gimbal 200 may change the imaging direction of the imaging unit 220 by rotating the imaging unit 220 about at least one of the yaw axis, the pitch axis, and the roll axis.
The rotor mechanism 210 has a plurality of rotors and a plurality of drive motors that rotate the rotors. The rotation of the rotor mechanism 210 is controlled by the UAV control unit 110, thereby causing the unmanned aerial vehicle 100 to fly. The number of rotors 211 may be four, for example, or any other number. The unmanned aerial vehicle 100 may also be a fixed-wing aircraft without rotors.
The imaging unit 220 captures a subject within a desired imaging range and generates data of a captured image. The image data (for example, a captured image) obtained by the imaging unit 220 may be stored in a memory included in the imaging unit 220 or in the storage 170.
The imaging unit 230 captures the surroundings of the unmanned aerial vehicle 100 and generates data of captured images. The image data of the imaging unit 230 may be stored in the storage 170.
The GPS receiver 240 receives a plurality of signals indicating the times transmitted from a plurality of navigation satellites (that is, GPS satellites) and the position (coordinates) of each GPS satellite. The GPS receiver 240 calculates its own position (that is, the position of the unmanned aerial vehicle 100) based on the plurality of received signals. The GPS receiver 240 outputs the position information of the unmanned aerial vehicle 100 to the UAV control unit 110. The calculation of the position information of the GPS receiver 240 may be performed by the UAV control unit 110 instead of the GPS receiver 240. In that case, the information indicating the times and the positions of the GPS satellites contained in the plurality of signals received by the GPS receiver 240 is input to the UAV control unit 110.
The inertial measurement device 250 detects the posture of the unmanned aerial vehicle 100 and outputs the detection result to the UAV control unit 110. The inertial measurement device 250 may detect, as the posture of the unmanned aerial vehicle 100, the accelerations in the three axial directions of front-rear, left-right, and up-down, and the angular velocities about the three axes of pitch, roll, and yaw.
The magnetic compass 260 detects the azimuth of the nose of the unmanned aerial vehicle 100 and outputs the detection result to the UAV control unit 110.
The barometric altimeter 270 detects the flight altitude of the unmanned aerial vehicle 100 and outputs the detection result to the UAV control unit 110.
The ultrasonic sensor 280 emits ultrasonic waves, detects the ultrasonic waves reflected by the ground or objects, and outputs the detection result to the UAV control unit 110. The detection result may indicate the distance from the unmanned aerial vehicle 100 to the ground, that is, the altitude, or the distance from the unmanned aerial vehicle 100 to an object (subject).
The laser rangefinder 290 irradiates an object with laser light, receives the light reflected by the object, and measures the distance between the unmanned aerial vehicle 100 and the object (subject) from the reflected light. One example of a laser-based distance measurement method is the time-of-flight method.
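As an illustrative aside (not part of the disclosure itself), the time-of-flight principle reduces to halving the product of the speed of light and the measured round-trip time of the pulse. A minimal sketch with assumed variable names:

```python
# Minimal time-of-flight ranging sketch; names are illustrative.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the target, assuming the laser pulse travels to
    the object and back (hence the division by two)."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# Example: a pulse returning after 200 ns corresponds to about 30 m.
print(tof_distance(200e-9))  # ~29.98 m
```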
FIG. 5 is a block diagram showing an example of the hardware configuration of the terminal 80. The terminal 80 includes a terminal control unit 81, an operation unit 83, a communication unit 85, a memory 87, a display unit 88, and a storage 89. The terminal 80 may be held by a user who wishes to control the flight of the unmanned aerial vehicle 100.
The terminal control unit 81 is configured using, for example, a CPU, an MPU, or a DSP. The terminal control unit 81 performs signal processing for overall control of the operations of the units of the terminal 80, input/output processing of data with the other units, arithmetic processing of data, and storage processing of data.
The terminal control unit 81 may acquire data and information from the unmanned aerial vehicle 100 via the communication unit 85. The terminal control unit 81 may acquire data and information (for example, various parameters) input via the operation unit 83. The terminal control unit 81 may acquire data and information held in the memory 87. The terminal control unit 81 may transmit data and information (for example, information on position, speed, and flight path) to the unmanned aerial vehicle 100 via the communication unit 85. The terminal control unit 81 may send data and information to the display unit 88 and cause the display unit 88 to display information based on them.
The operation unit 83 receives and acquires data and information input by the user of the terminal 80. The operation unit 83 may include input devices such as buttons, keys, a touch display, and a microphone. Here, the case where the operation unit 83 and the display unit 88 are configured as a touch panel is mainly described. In this case, the operation unit 83 can accept touch operations, tap operations, drag operations, and the like. The operation unit 83 may accept information on various parameters. The information input through the operation unit 83 may be transmitted to the unmanned aerial vehicle 100.
The communication unit 85 performs wireless communication with the unmanned aerial vehicle 100 by various wireless communication methods. The wireless communication methods may include, for example, wireless LAN, Bluetooth (registered trademark), or communication via a public wireless network. The communication unit 85 may also perform wired communication by any wired communication method.
The memory 87 may include, for example, a ROM that stores a program defining the operation of the terminal 80 and data of set values, and a RAM that temporarily holds various kinds of information and data used when the terminal control unit 81 performs processing. The memory 87 may include memories other than the ROM and the RAM. The memory 87 may be provided inside the terminal 80 and may be configured to be detachable from the terminal 80. The program may include an application program.
The display unit 88 is configured using, for example, an LCD (Liquid Crystal Display), and displays various kinds of information and data output from the terminal control unit 81. The display unit 88 may display various data and information related to the execution of the application program.
The storage 89 stores and holds various kinds of data and information. The storage 89 may be an HDD, an SSD, an SD card, a USB memory, or the like. The storage 89 may be provided inside the terminal 80, or may be detachably provided on the terminal 80. The storage 89 may hold captured images acquired from the unmanned aerial vehicle 100 and their additional information. The additional information may also be held in the memory 87.
The operation of the flight system 10 will now be described. The unmanned aerial vehicle 100 or the terminal 80 of the flight system 10 executes processing related to the generation of a composite image based on a plurality of captured images captured by the unmanned aerial vehicle 100. The UAV control unit 110 of the unmanned aerial vehicle 100 and the terminal control unit 81 of the terminal 80 are examples of a processing unit that executes the processing related to the generation of a composite image. Here, an example is shown in which the unmanned aerial vehicle 100 takes the lead in executing the processing related to the composite image.
In the present embodiment, it can be assumed that the generation of the composite image is executed by a processor with insufficient computing power. The composite image may be used as a map image or an orthophoto. A processor with insufficient computing power may include, for example, a processor for which it is difficult to perform composite image generation involving dense point cloud generation in real time.
FIG. 6 is a flowchart showing an example of an image synthesis processing procedure. As one example, this processing may be executed by the terminal control unit 81 of the terminal 80 executing a program stored in the memory 87. In addition, the unmanned aerial vehicle 100 may perform operations that assist the image synthesis processing. For example, the unmanned aerial vehicle 100 may provide the terminal 80 with captured images captured by the imaging unit 220 and their additional information, or may provide various parameters (for example, flight parameters related to the flight of the unmanned aerial vehicle 100 and imaging parameters related to imaging by the imaging unit 220).
The terminal control unit 81 acquires a flight range and various parameters (S1). In this case, the user may input the flight range and the parameters to the terminal 80. The terminal control unit 81 may receive the user input via the operation unit 83 and acquire the input flight range and parameters.
The terminal control unit 81 may acquire map information from an external server via the communication unit 85. For example, when the flight range is set as a rectangular range, the user can obtain the flight range information by inputting the positions (latitude, longitude) of the four corners of the rectangle on the map information. When the flight range is set as a circular range, the user can obtain the flight range information by inputting the radius of a circle centered on the flight position. The user may also obtain the flight range information by inputting information such as an area or a specific place name (for example, Tokyo), based on the map information. The terminal control unit 81 may also acquire a flight range held in the memory 87 or the storage 89. The flight range may be a predetermined range in which the unmanned aerial vehicle 100 is to fly.
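For illustration only, the two ways of specifying a flight range described above could be represented with structures like the following; the type names and the coordinate values are assumptions, not part of the disclosure:

```python
from dataclasses import dataclass
from typing import List, Tuple

LatLon = Tuple[float, float]  # (latitude, longitude) in degrees

@dataclass
class RectFlightRange:
    corners: List[LatLon]  # the four rectangle corners entered on the map

@dataclass
class CircleFlightRange:
    center: LatLon   # e.g., the flight position used as the circle center
    radius_m: float  # radius entered by the user, in meters

# Example inputs corresponding to the two cases described above.
rect = RectFlightRange(corners=[(35.00, 139.00), (35.00, 139.10),
                                (34.90, 139.10), (34.90, 139.00)])
circle = CircleFlightRange(center=(35.00, 139.00), radius_m=500.0)
```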
The parameters may be imaging parameters related to imaging by the imaging unit 220 and flight parameters related to the flight of the unmanned aerial vehicle 100. The imaging parameters may include the imaging position, the imaging date and time, the distance to the subject, the imaging angle of view, the posture of the unmanned aerial vehicle 100 at the time of imaging, the imaging direction, the imaging conditions, and camera parameters (shutter speed, exposure value, imaging mode, and the like). The flight parameters may include the flight position (three-dimensional or two-dimensional position), flight altitude, flight speed, flight acceleration, flight path, flight date and time, and the like. The terminal control unit 81 may acquire the various parameters held in the memory 87 or the storage 89.
The terminal control unit 81 may acquire the flight range and the various parameters from an external server or the unmanned aerial vehicle 100 via the communication unit 85. In the unmanned aerial vehicle 100, the flight range and the various parameters may be acquired from the memory 160, or may be acquired or derived (for example, calculated) from the sensors in the unmanned aerial vehicle 100 (for example, the GPS receiver 240 and the inertial measurement device 250). The terminal control unit 81 may determine the flight path and imaging positions of the unmanned aerial vehicle 100 based on the acquired flight range and parameters. The terminal control unit 81 may notify the unmanned aerial vehicle 100 of the determined flight path and imaging positions via the communication unit 85.
In the unmanned aerial vehicle 100, the UAV control unit 110 controls the flight according to the determined flight path and causes the imaging unit 220 to capture (for example, aerially photograph) images. In this case, the UAV control unit 110 may acquire, during flight, a plurality of captured images taken at different positions and postures. The UAV control unit 110 may transmit the captured images to the terminal 80 via the communication interface 150. In the terminal 80, the terminal control unit 81 acquires the captured images via the communication unit 85 and stores them in the storage 89 (S2).
Images may be captured at each imaging position along the flight path, and the plurality of captured images may be stored in the storage 89. In addition, the terminal control unit 81 may acquire additional information related to the captured images via the communication unit 85 and store it in the storage 89. This additional information may include information similar to the various parameters described above (for example, flight parameters and imaging parameters). The terminal control unit 81 can thus acquire information such as the imaging position of the imaging unit 220 (that is, of the unmanned aerial vehicle 100) at the time each captured image was taken, the posture at that time, and the imaging direction.
The plurality of captured images may be captured by the same imaging unit 220 or by different imaging units 220. That is, the plurality of images may be captured by a plurality of different unmanned aerial vehicles 100.
The terminal control unit 81 may acquire the plurality of captured images from the communication unit 85 or the storage 89. The terminal control unit 81 may extract a plurality of feature points contained in each captured image. A feature point may be a point at any position on a captured image. The terminal control unit 81 may perform matching processing that associates identical feature points among the feature points contained in the captured images, thereby generating corresponding points, that is, associated feature points.
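As a hedged illustration of such matching processing (the disclosure does not prescribe a particular detector or matcher), one common choice is ORB features with brute-force Hamming matching, for example using OpenCV:

```python
import cv2

def match_features(img_a, img_b, max_matches=500):
    """Detect and match feature points between two captured images.
    ORB + brute-force Hamming matching is one plausible choice; the
    patent text does not mandate a specific detector."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp_a, desc_a = orb.detectAndCompute(img_a, None)
    kp_b, desc_b = orb.detectAndCompute(img_b, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(desc_a, desc_b),
                     key=lambda m: m.distance)
    # Each match pairs a feature point in img_a with the associated
    # feature point in img_b (a "corresponding point" in the text).
    return kp_a, kp_b, matches[:max_matches]
```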
In the matching processing, the terminal control unit 81 may consider the difference (reprojection error) between the actual observed position at which each feature point is projected onto a captured image and the reproduced position at which the feature point is reproduced on the captured image based on parameters such as the position and posture of the imaging unit 220. The terminal control unit 81 may perform bundle adjustment (BA), which minimizes the reprojection error. The terminal control unit 81 may derive the result of the bundle adjustment and the correspondence of the feature points among the captured images, and thereby derive the corresponding points.
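The reprojection error that bundle adjustment minimizes can be sketched as follows, assuming a simple pinhole camera model; the function and variable names are illustrative, not from the disclosure:

```python
import numpy as np

def reprojection_error(X, R, t, K, observed_uv):
    """Reprojection error for one 3D point X (shape (3,)): project X
    into the image using the camera rotation R (3x3), translation t
    (3,), and intrinsics K (3x3), then compare with the actually
    observed pixel position observed_uv (2,). Bundle adjustment
    minimizes the sum of these errors over all points and cameras."""
    x_cam = R @ X + t        # world -> camera coordinates
    uv_h = K @ x_cam         # camera -> homogeneous pixel coordinates
    uv = uv_h[:2] / uv_h[2]  # perspective division
    return np.linalg.norm(uv - observed_uv)
```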
The terminal control unit 81 may generate a sparse point cloud including a plurality of corresponding points. The terminal control unit 81 may generate the sparse point cloud based on, for example, SfM (Structure from Motion). The size of the sparse point cloud may be, for example, several hundred points per image. The data of each point contained in the sparse point cloud may include data indicating a three-dimensional position. That is, the sparse point cloud here is a three-dimensional point cloud. In this way, the terminal control unit 81 generates a sparse point cloud based on the acquired plurality of captured images (S3).
The terminal control unit 81 generates a three-dimensional model M based on the sparse point cloud (S4). In this case, the terminal control unit 81 may generate a plurality of faces sf whose vertices are adjacent points contained in the sparse point cloud, and generate a three-dimensional model M represented by the plurality of faces sf. The three-dimensional model M may be, for example, a terrain model representing the shape of the ground. Since the three-dimensional model M is formed based on the sparse point cloud, it is a sparse three-dimensional model (a rough three-dimensional model). The generated three-dimensional model M may be stored in the storage 89.
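One plausible way to build such a face-based terrain model from the sparse point cloud, assuming a ground-like surface (the disclosure does not fix the triangulation method), is a Delaunay triangulation over the horizontal coordinates:

```python
import numpy as np
from scipy.spatial import Delaunay

def terrain_model_from_sparse_points(points_3d: np.ndarray):
    """Build a rough terrain model M from a sparse 3D point cloud
    (N x 3 array of x, y, z). Triangulating in the horizontal (x, y)
    plane yields faces sf whose vertices are neighboring sparse
    points; this is one plausible construction, not the patent's
    mandated one."""
    tri = Delaunay(points_3d[:, :2])  # 2D triangulation over (x, y)
    faces = tri.simplices             # indices of the 3 vertices of each face
    return points_3d, faces

# Each row of `faces` indexes three sparse points that together form
# one face sf of the three-dimensional model M.
```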
The terminal control unit 81 derives (for example, calculates) the distances D between the positions of the imaging unit 220 and the three-dimensional model M, based on at least one of the positions (three-dimensional positions) and postures of the imaging unit 220 at the time of imaging and on the three-dimensional model M (S5). That is, since the unmanned aerial vehicle 100 moves during flight, a plurality of distances D between positions of the imaging unit 220 and the three-dimensional model M are derived. Since the three-dimensional model M is generated based on the sparse point cloud having three-dimensional position information, the position and shape of the three-dimensional model M in the three-dimensional space can be determined. Moreover, the coordinate space in which the three-dimensional model M exists and the coordinate space in which the imaging unit 220 of the unmanned aerial vehicle 100 exists are the same coordinate space. Therefore, the distance D between each position of the imaging unit 220 and a predetermined position in the three-dimensional model M can be derived. In addition, the terminal control unit 81 may calculate the distance D not only based on the position information of the imaging unit 220, but also based on the posture information of the imaging unit 220 at each time, using time information. In that case, the terminal control unit 81 may acquire only the information on the posture, without acquiring information on the position of the imaging unit 220.
The terminal control unit 81 adjusts the size (scale) of each captured image taken by the imaging unit 220 at each position, based on the acquired distance D (S6). In this case, the terminal control unit 81 may calculate the size of each captured image taken by the imaging unit 220 at each position based on the acquired distance D. The terminal control unit 81 may enlarge or reduce each captured image so that it has the calculated size. The size of each captured image is an index indicating the enlargement or reduction ratio of each captured image used to generate the composite image. Among the captured images (the acquired plurality of captured images), there may be a captured image serving as a reference (a captured image that is neither enlarged nor reduced).
For example, the larger the distance D between the imaging unit 220 and the three-dimensional model M, the smaller the imaged objects (for example, the ground, buildings, or other objects) contained in the image captured by that imaging unit 220; the terminal control unit 81 may therefore increase the enlargement ratio of the captured image. Conversely, the shorter the distance D between the imaging unit 220 and the three-dimensional model M, the larger the imaged objects contained in the image captured by that imaging unit 220; the terminal control unit 81 may therefore decrease the enlargement ratio of the captured image.
Likewise, the longer the distance D between the imaging unit 220 and the three-dimensional model M, the smaller the imaged objects contained in the captured image; the terminal control unit 81 may therefore decrease the reduction ratio of the captured image. The shorter the distance D between the imaging unit 220 and the three-dimensional model M, the larger the imaged objects contained in the captured image; the terminal control unit 81 may therefore increase the reduction ratio of the captured image.
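Consistent with these proportionality considerations and with the numeric examples given later (a distance of "2" leading to a twofold enlargement), the size adjustment can be sketched as a distance ratio relative to a reference image; this is an assumed formula for illustration, not one stated in the disclosure:

```python
def scale_factor(distance: float, reference_distance: float) -> float:
    """Magnification applied to a captured image so that objects
    appear at the size they would have at reference_distance.
    A larger camera-to-model distance D makes objects smaller in the
    image, so the factor grows with D; this assumes simple pinhole
    proportionality, which the text's examples are consistent with."""
    return distance / reference_distance

# Example: with the reference image taken at distance 1, an image
# taken at distance 2 is enlarged 2x before compositing.
assert scale_factor(2.0, 1.0) == 2.0
```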
The terminal control unit 81 combines the size-adjusted captured images to generate a composite image (S7). In this case, among the captured images adjacent to each other in the composite image, the terminal control unit 81 may delete the duplicated subject portions depicted in the captured images, except in one of the adjacent captured images. That is, for a duplicated portion, the subject may be depicted using any one of the captured images as a representative. The terminal control unit 81 may also generate the composite image from the plurality of size-adjusted captured images according to a well-known compositing method.
In this way, by adjusting the size of each captured image, the terminal 80 can make the positional relationships of the feature points and corresponding points contained in the captured images approximately equal. The terminal 80 can therefore suppress situations in which the size of a subject (object) contained in the composite image differs depending on the original captured image, or in which the size of the subject varies among the captured images contained in the composite image. For example, the terminal 80 can make the scale of the subjects reflected in the composite image (for example, terrain, buildings, and other objects) consistent across the plurality of captured images, and can generate the composite image while reducing the variation in the size of identical or corresponding objects among the plurality of captured images.
In addition, since the terminal 80 generates a sparse point cloud, generates a sparse three-dimensional model, and adjusts the size of each captured image, it does not need to generate a dense point cloud. The terminal 80 can therefore reduce the processing load and processing time for generating the composite image without being equipped with a processor of high processing capability.
The distance derivation, size adjustment, composite image generation, and the like may also be performed based not on images captured by the imaging unit 220 but on images captured by the imaging unit 230.
Furthermore, at least part of the image generation processing in FIG. 6 may be executed by the unmanned aerial vehicle 100. For example, the UAV control unit 110 may acquire the flight range and various parameters from the terminal 80 in S1, acquire the captured images (S2), generate a sparse point cloud (S3), generate a three-dimensional model (S4), derive the distances D (S5), adjust the sizes (S6), and generate the composite image (S7). The unmanned aerial vehicle 100 and the terminal 80 may also share the image synthesis processing between them.
Next, examples of deriving the distance D will be described.
FIG. 7 is a diagram showing an example of the distances between the positions of the imaging unit 220 (that is, the positions of the unmanned aerial vehicle 100) and the corresponding parts of the three-dimensional model M (for example, the parts contained in the imaging ranges CR). In FIG. 7, the imaging unit 220 at the respective positions is denoted by 220a, 220b, 220c, and so on.
The distance D for the imaging unit 220a is the distance ha, and the distance D between the imaging unit 220c and the three-dimensional model M is the distance hc. That is, in FIG. 7, the distance D differs at each position of the imaging unit 220. Therefore, if the captured images were combined without size adjustment, the sizes of the subjects reflected in the captured images would not be uniform in the resulting composite image. In contrast, in the present embodiment, the terminal control unit 81 adjusts the size of each captured image by a size adjustment amount (for example, an enlargement or reduction ratio) corresponding to the distance D, and combines the size-adjusted captured images. The terminal 80 can therefore generate a composite image in which the sizes of the subjects reflected in the captured images are uniform.
FIG. 8 is a diagram showing examples of deriving the distance D.
As shown in FIG. 8, the distance D between the position of the imaging unit 220 and the three-dimensional model M may be the distance h1 along the vertical direction (the direction perpendicular to the horizontal direction). That is, the terminal control unit 81 may take as the distance D the distance h1 connecting the imaging unit 220 and the intersection point C1, where C1 is the intersection of the three-dimensional model M with the straight line L1 that passes through the imaging unit 220 and is parallel to the vertical direction. The position of the imaging unit 220 may be the imaging surface of the imaging unit 220, or the image center of the image captured by the imaging unit 220.
In this way, the terminal 80 can adjust the size of a captured image in consideration of the flight altitude of the unmanned aerial vehicle at the time the image was captured.
As shown in FIG. 8, the distance D between the position of the imaging unit 220 and the three-dimensional model M may instead be the distance h2 along the imaging direction of the imaging unit 220. That is, the terminal control unit 81 may take as the distance D the distance h2 between the imaging unit 220 and the intersection point C2, where C2 is the intersection of the three-dimensional model M with the straight line L2 that passes through the imaging unit 220 and is parallel to the imaging direction defined by the posture of the imaging unit 220. The inclination of the imaging direction relative to the vertical direction is thereby also taken into account.
In this way, the terminal 80 can adjust the size of a captured image in consideration of the distance between the unmanned aerial vehicle 100 and the subject reflected in the captured image at the time the image was captured.
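Both h1 and h2 amount to intersecting a ray cast from the camera position with the three-dimensional model. A minimal sketch, assuming the local model surface is approximated by a single plane (a full implementation would instead intersect the ray with the triangulated faces sf):

```python
import numpy as np

def distance_along_direction(cam_pos, direction, plane_point, plane_normal):
    """Distance from the camera position to the three-dimensional
    model along a given direction: h1 uses the vertical direction
    (0, 0, -1); h2 uses the imaging direction derived from the
    camera posture. The local model surface is approximated here by
    one plane, an assumption made for brevity."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    denom = d @ plane_normal
    if abs(denom) < 1e-9:
        raise ValueError("ray is parallel to the model surface")
    s = ((plane_point - cam_pos) @ plane_normal) / denom
    return s  # ray parameter equals the distance, since d is unit length

cam = np.array([0.0, 0.0, 100.0])
ground = (np.array([0.0, 0.0, 20.0]), np.array([0.0, 0.0, 1.0]))
h1 = distance_along_direction(cam, [0, 0, -1], *ground)  # 80.0
```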
The imaging range CR of an image captured by the imaging unit 220 contains at least part of the three-dimensional model M. Therefore, the image range of the image captured by the imaging unit 220 contains at least part of the three-dimensional model M. One imaging range CR or one image range corresponds to one captured image.
The distance D between the position of the imaging unit 220 and the three-dimensional model M may be derived for each captured image. That is, the terminal control unit 81 may derive, for each imaging range CR or each image range, the distance D to the part of the three-dimensional model M contained in that imaging range CR or image range.
Specifically, the terminal control unit 81 may calculate the distance DA between the position PA of the imaging unit 220 that captured the captured image GMA and the part MA of the three-dimensional model M (a first example of a first part of the three-dimensional model) contained in the imaging range CRA of the imaging unit 220 at the position PA. The terminal control unit 81 may calculate the distance DB between the position PB of the imaging unit 220 that captured the captured image GMB and the part MB of the three-dimensional model M (a second example of the first part of the three-dimensional model) contained in the imaging range CRB of the imaging unit 220 at the position PB.
FIG. 9 is a diagram showing an example of deriving the distance D for each imaging range CR. In FIG. 9, in the imaging range CR1 in which the captured image GM1 was taken, the distance D between the imaging unit 220 and the part MA of the three-dimensional model M is the distance "1". In the imaging range CR2 in which the captured image GM2 was taken, the distance D between the imaging unit 220 and the part MB of the three-dimensional model M is the distance "2". In this case, the terminal control unit 81 may, for example, enlarge the captured image GM2 for the imaging range CR2 by a factor of 2, and combine the captured image GM1 with the twofold-enlarged captured image GM2. Enlarging by a factor of 2 for the distance "2" is merely an example; it suffices to determine the size adjustment amount, such as the enlargement or reduction ratio, according to the distance.
In this way, the terminal 80 can roughly derive the distance D between the position of the imaging unit 220 (the camera position) and the imaging object (the three-dimensional model M), with one distance corresponding to each captured image. The processing load on the terminal 80 for deriving the distance D is therefore relatively small, and the processing time is relatively short.
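A minimal sketch of this per-image size adjustment, assuming the distance-ratio scaling sketched earlier and OpenCV's resize (the interpolation choice is an implementation detail, not specified by the text):

```python
import cv2

def resize_by_distance(image, distance, reference_distance=1.0):
    """Resize one captured image by the scale factor implied by the
    distance D of its imaging range, as in the example above
    (distance 1 -> kept as is, distance 2 -> enlarged 2x)."""
    factor = distance / reference_distance
    h, w = image.shape[:2]
    return cv2.resize(image, (int(w * factor), int(h * factor)),
                      interpolation=cv2.INTER_LINEAR)

# gm1 = resize_by_distance(gm1_raw, distance=1.0)  # unchanged
# gm2 = resize_by_distance(gm2_raw, distance=2.0)  # enlarged 2x
```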
Alternatively, the distance D between the position of the imaging unit 220 and the three-dimensional model M may be derived for each divided region DR of a captured image. That is, the distance D may be derived for each divided region DR of the imaging range CR or image range GR corresponding to the captured image. In this case, the terminal control unit 81 may divide the imaging range CR or the image range GR to generate a plurality of divided regions DR. The terminal control unit 81 may calculate, as the distances D, the distances between each divided region DR and the part of the three-dimensional model M corresponding to that divided region DR. Since the distance D is derived for each divided region DR, a number of different distances D corresponding to the number of divisions can be derived for one captured image.
Specifically, the terminal control unit 81 may calculate the distance DC1 between the position PC of the imaging unit 220 that captured the captured image GMC and the part MC1 of the three-dimensional model M (an example of a second part of the three-dimensional model) contained in the divided region DRC1 of the imaging range CRC of the imaging unit 220 at the position PC. The terminal control unit 81 may calculate the distance DC2 between the position PC and the part MC2 of the three-dimensional model M (another example of the second part of the three-dimensional model) contained in the divided region DRC2 of the imaging range CRC. That is, the distances DC1, DC2, and so on can be calculated for the respective divided regions DR of the imaging range CRC. The derivation of the distance D for each divided region DR may be performed for the plurality of imaging ranges CR corresponding to the plurality of captured images.
FIG. 10 is a diagram showing an example of deriving the distance D for each divided region DR into which the imaging range CR is divided.
In FIG. 10, the imaging range CR11 in which the captured image GM11 was taken by the imaging unit 220a contains the divided regions DR11 to DR19. In the divided region DR11, the distance D11 between the position of the imaging unit 220 that captured the captured image GM11 and the corresponding part M11 of the three-dimensional model M is the distance "1". In the divided region DR12, the distance D12 to the corresponding part M12 of the three-dimensional model M is the distance "0.5". In the divided region DR13, the distance D13 to the corresponding part M13 is the distance "1". In the divided region DR14, the distance D14 to the corresponding part M14 is the distance "2". In the divided region DR15, the distance D15 to the corresponding part M15 is the distance "2". In the divided region DR16, the distance D16 to the corresponding part M16 is the distance "1.5". In the divided region DR17, the distance D17 to the corresponding part M17 is the distance "1". In the divided region DR18, the distance D18 to the corresponding part M18 is the distance "2.5". In the divided region DR19, the distance D19 to the corresponding part M19 is the distance "1".
In FIG. 10, the imaging range CR21 in which the captured image GM21 was taken by the imaging unit 220b contains the divided regions DR21 to DR29. In the divided region DR21, the distance D21 between the position of the imaging unit 220 that captured the captured image GM21 and the corresponding part M21 of the three-dimensional model M is the distance "2". Similarly, the distance D22 corresponding to the divided region DR22 is the distance "1.5", the distance D23 corresponding to DR23 is the distance "2", the distance D24 corresponding to DR24 is the distance "3", the distance D25 corresponding to DR25 is the distance "3", the distance D26 corresponding to DR26 is the distance "2.5", the distance D27 corresponding to DR27 is the distance "2", the distance D28 corresponding to DR28 is the distance "3.5", and the distance D29 corresponding to DR29 is the distance "2".
In this case, the terminal control unit 81 may, for example, keep the portion of the captured image GM11 corresponding to the distance "1" (the portion corresponding to the divided region DR11) as it is (a factor of 1, neither enlarged nor reduced), scale the portion of the captured image GM12 corresponding to the distance "0.5" (the portion corresponding to the divided region DR12) by a factor of 0.5, and keep the portion of the captured image GM13 corresponding to the distance "1" (the portion corresponding to the divided region DR13) as it is (a factor of 1). The terminal control unit 81 enlarges the portion of the captured image GM14 corresponding to the distance "2" (the portion corresponding to the divided region DR14) by a factor of 2, enlarges the portion of the captured image GM15 corresponding to the distance "2" (the portion corresponding to the divided region DR15) by a factor of 2, and enlarges the portion of the captured image GM16 corresponding to the distance "1.5" (the portion corresponding to the divided region DR16) by a factor of 1.5. The terminal control unit 81 keeps the portion of the captured image GM17 corresponding to the distance "1" (the portion corresponding to the divided region DR17) as it is, enlarges the portion of the captured image GM18 corresponding to the distance "2.5" (the portion corresponding to the divided region DR18) by a factor of 2.5, and keeps the portion of the captured image GM19 corresponding to the distance "1" (the portion corresponding to the divided region DR19) as it is. In this way, the terminal control unit 81 adjusts the sizes of the regions corresponding to the divided regions DR11 to DR19 in the captured image GM11.
Similarly, the terminal control unit 81 enlarges the portion of the captured image GM21 corresponding to the distance "2" by a factor of 2, enlarges the portion of the captured image GM22 corresponding to the distance "1.5" by a factor of 1.5, and enlarges the portion of the captured image GM23 corresponding to the distance "2" by a factor of 2. The terminal control unit 81 enlarges the portion of the captured image GM24 corresponding to the distance "3" by a factor of 3, enlarges the portion of the captured image GM25 corresponding to the distance "3" by a factor of 3, and enlarges the portion of the captured image GM26 corresponding to the distance "2.5" by a factor of 2.5. The terminal control unit 81 enlarges the portion of the captured image GM27 corresponding to the distance "2" by a factor of 2, enlarges the portion of the captured image GM28 corresponding to the distance "3.5" by a factor of 3.5, and enlarges the portion of the captured image GM29 corresponding to the distance "2" by a factor of 2. In this way, the terminal control unit 81 adjusts the sizes of the regions corresponding to the divided regions DR21 to DR29 in the captured image GM21.
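A sketch of this per-region adjustment, assuming the nine divided regions DR form a 3x3 grid over the image (a layout the example suggests but the text does not state) and reusing the distance-ratio scaling:

```python
import cv2
import numpy as np

def resize_regions(image, distances, grid=(3, 3), ref=1.0):
    """Scale each divided region DR of one captured image by its own
    distance D, as in the 3x3 example above (DR11..DR19). `distances`
    is a grid-shaped array of the per-region distances. The scaled
    tiles are returned individually; stitching them back into one
    image belongs to the later compositing step."""
    rows, cols = grid
    h, w = image.shape[:2]
    tiles = []
    for r in range(rows):
        for c in range(cols):
            tile = image[r * h // rows:(r + 1) * h // rows,
                         c * w // cols:(c + 1) * w // cols]
            f = distances[r][c] / ref
            th, tw = tile.shape[:2]
            tiles.append(cv2.resize(tile, (int(tw * f), int(th * f))))
    return tiles

# Distances for GM11's range CR11, row by row, as listed above:
d_cr11 = np.array([[1.0, 0.5, 1.0], [2.0, 2.0, 1.5], [1.0, 2.5, 1.0]])
```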
In this way, the terminal 80 can derive the distance D between the position of the imaging unit 220 (the camera position) and the imaging object (the three-dimensional model M) at a fine granularity, with one distance corresponding to each divided region DR into which the imaging range CR is divided. The terminal 80 can therefore improve the accuracy of the distance D between the camera position and the three-dimensional model M. The terminal 80 can accordingly perform the distance-based size adjustment with high accuracy and improve the reproducibility of the subjects contained in the composite image.
In addition, even when, for example, the shape of the terrain approximated by the three-dimensional model M is complex and has large differences in elevation, the terminal 80 can derive the distance D to the three-dimensional model M for each divided region DR into which the imaging range CR is divided. The terminal 80 can therefore adjust the size of the captured image according to the distance D for each divided region DR, combine the plurality of size-adjusted captured images, and generate a composite image. When deriving the distance D for each divided region, the terminal 80 can perform a finer scale adjustment than when deriving the distance D for each imaging range.
The relationship between the distance and the size adjustment amount (here, the enlargement ratio) is merely an example; it suffices to determine the size adjustment amount according to the distance.
As shown in FIG. 9, performing the scale adjustment for each imaging range CR requires less computation than performing the scale adjustment for each divided region DR within the imaging range CR. On the other hand, as shown in FIG. 10, by performing the scale adjustment for each divided region DR, the terminal 80 can perform a more detailed scale adjustment than when adjusting the scale for each imaging range CR. In this case, even when, for example, the elevation difference of the terrain within one imaging range CR is so large that it is difficult to represent with a single distance, the terminal 80 can realize a scale adjustment that follows the frequently changing elevation and can reduce the scale variation of the original captured images in the composite image. The terminal 80 can therefore improve the generation accuracy of the composite image.
It is also conceivable to derive the distance between the imaging unit 220 and the three-dimensional model M for each point contained in the sparse point cloud, but in that case a large number of different distances appear and it becomes difficult to know the correct scale. By deriving the distance D for each captured image range, that is, for each imaging range CR, or for each divided region DR into which the imaging range CR is divided, the terminal 80 can avoid generating a composite image from a plurality of captured images at a uniform size that does not take the distance D into account. Moreover, by deriving the distance somewhat coarsely rather than for each point of the sparse point cloud, the terminal 80 can reduce the amount of computation and shorten the computation time.
Next, the correction of the size adjustment amount SA near boundaries where the distance D differs will be described.
The terminal control unit 81 generates the composite image SG based on the plurality of size-adjusted captured images GM. In the composite image SG, near a boundary where the distance D differs between the imaging ranges CR of the captured images GM on which the composite image SG is based, or between the corresponding divided regions DR, the size adjustment amount SA determined from the distance D differs, which may produce discontinuous regions. In this case, the terminal control unit 81 may correct the size adjustment amount SA near the boundary so that the size adjustment amount SA changes smoothly between the adjacent regions.
边界附近例如也可以是距离相邻的拍摄范围CR的边界特定范围内的各个拍摄范围CR的部分。相邻的拍摄范围CR重复的情况下,该特定范围在也可以是相邻的拍摄范围CR重复的重复区域的至少一部分。与相邻的拍摄范围CR对应的拍摄图像GM彼此在合成图像SG中相邻。同样地,边界附近例如可以是距离相邻的分割区域DR的边界特定范围内的各个分割区域DR的部分。与相邻的分割区域DR对应的拍摄图像GM的部分彼此在合成图像SG中相邻。The vicinity of the boundary may be, for example, a portion of each shooting range CR within a specific range from the boundary of the adjacent shooting range CR. When the adjacent shooting ranges CR overlap, the specific range may be at least a part of the overlapping area where the adjacent shooting ranges CR overlap. The captured images GM corresponding to the adjacent shooting ranges CR are adjacent to each other in the composite image SG. Similarly, the vicinity of the boundary may be, for example, a portion of each divided region DR within a specific range from the boundary of the adjacent divided region DR. The parts of the captured image GM corresponding to the adjacent divided regions DR are adjacent to each other in the composite image SG.
例如,在按照每个拍摄范围CR导出距离D而进行尺寸调整的情况下,终端控制部81可以在相邻的拍摄范围CR的边界附近使尺寸调整量SA平滑地变化。例如,在拍摄范围CR31中距离D为1,在与拍摄范围CR31相邻的拍摄范围CR32中距离D为2的情况下,终端控制部81可以对尺寸调整量SA进行校正,以使在边界附近尺寸调整量(例如放大率)从1平顺地变化为2。For example, when deriving the distance D for each shooting range CR and adjusting the size, the terminal control unit 81 may smoothly change the size adjustment amount SA near the boundary of the adjacent shooting range CR. For example, when the distance D is 1 in the shooting range CR31 and the distance D is 2 in the shooting range CR32 adjacent to the shooting range CR31, the terminal control section 81 may correct the size adjustment amount SA so as to be near the boundary The size adjustment amount (for example, magnification) changes smoothly from 1 to 2.
图11是示出尺寸调整量的校正示例的图。在图11中,在拍摄范围CR31中距离D为1,在拍摄范围CR32中距离D为2。因此,在不校正尺寸调整量SA的情况下,在与拍摄范围CR31对应的拍摄图像GM31中,尺寸调整量SA(例如放大率)为1,在与拍摄范围CR32对应的拍摄图像GM32中,尺寸调整量SA为2即可。另外,在校正尺寸调整量SA的情况下,在拍摄图像GM31中,在边界附近以外尺寸调整量SA为1,在拍摄图像GM31中,在边界附近以外尺寸调整量SA为2即可。然后,终端控制部81可以进行校正,以使在合成图像SG中从相邻的拍摄图像GM31朝向拍摄图像GM32,在边界附近尺寸调整量SA平滑地从1变为2。在此情况下,终端控制部81可以在边界附近使尺寸调整量SA线性变化(参照曲线g1),也可以非线性地变化(参照图表g2)。FIG. 11 is a diagram showing a correction example of the size adjustment amount. In FIG. 11, the distance D is 1 in the shooting range CR31, and the distance D is 2 in the shooting range CR32. Therefore, without correcting the size adjustment amount SA, in the captured image GM31 corresponding to the shooting range CR31, the size adjustment amount SA (for example, magnification) is 1, and in the captured image GM32 corresponding to the shooting range CR32, the size The adjustment amount SA may be 2. In the case of correcting the size adjustment amount SA, the size adjustment amount SA outside the vicinity of the boundary is 1 in the captured image GM31, and the size adjustment amount SA outside the vicinity of the boundary may be 2 in the captured image GM31. Then, the terminal control unit 81 may perform correction so that the size adjustment amount SA near the boundary changes smoothly from 1 to 2 in the composite image SG from the adjacent captured image GM31 toward the captured image GM32. In this case, the terminal control unit 81 may linearly change the size adjustment amount SA near the boundary (refer to the curve g1), or may change it non-linearly (refer to the graph g2).
由此,终端80能够抑制在尺寸调整量SA不同的多个拍摄图像GM的端部附近(边界附近)因尺寸调整量SA不同而产生不连续区域造成合成图像SG的画质较差的问题。这在按每个分割区域DR导出距离D而进行尺寸调整的情况下也同样适用。As a result, the terminal 80 can suppress the problem that the image quality of the composite image SG is deteriorated due to the discontinuous areas caused by the different resizing amounts SA near the ends (near the boundary) of the plurality of captured images GM having different resizing amounts SA. This also applies to the case where the distance D is derived for each divided region DR and the size is adjusted.
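The ramp described above can be written down compactly. The sketch below is a minimal illustration, not the embodiment itself: the boundary position, the half-width of the "vicinity of the boundary", and the function names are assumptions of the example, and the smoothstep variant stands in for one possible non-linear curve g2.

```python
import numpy as np

def blended_sa(x, boundary_x, half_width, sa_left, sa_right, smooth=False):
    """Size adjustment amount SA at position x across a boundary between two regions.

    Outside [boundary_x - half_width, boundary_x + half_width] the per-region
    values apply unchanged; inside, SA ramps from sa_left to sa_right, either
    linearly (like curve g1) or with a smoothstep (one non-linear option, like g2).
    """
    t = np.clip((x - (boundary_x - half_width)) / (2.0 * half_width), 0.0, 1.0)
    if smooth:
        t = t * t * (3.0 - 2.0 * t)  # smoothstep: zero slope at both ends of the ramp
    return (1.0 - t) * sa_left + t * sa_right

# Example from the text: D=1 in CR31 (SA=1), D=2 in CR32 (SA=2),
# with the boundary assumed at x=500 px and a 50 px vicinity on each side.
sa = blended_sa(480.0, boundary_x=500.0, half_width=50.0,
                sa_left=1.0, sa_right=2.0, smooth=True)
```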
Next, an example of generating the three-dimensional model M will be described, taking the generation of a terrain model as the three-dimensional model M as an example. The terminal control unit 81 may generate the three-dimensional model using a grid gd corresponding to the shooting range CR. "Terrain" here may broadly include, for example, the shapes of the subjects (e.g., the ground, buildings, objects) captured by the imaging unit 220 of the unmanned aerial vehicle 100 in flight.
The grid gd will now be described. The grid gd may be formed as a mesh pattern, and the terrain within the shooting range represented by the three-dimensional model M is virtually expressed on it. The grid gd may be set over the same range as the shooting range CR, or over a range contained within the shooting range CR. The grid gd may be lattice-shaped, triangular, another polygon, or another shape, and includes grid points gp that are the vertices of the grid gd.
The interval between grid points gp (grid interval) may be a predetermined value, or may be set arbitrarily by the terminal control unit 81. The grid interval may be, for example, 1 m or 2 m; for example, the terminal control unit 81 may specify the grid interval via the operation unit 83. Note that the positions of the sparse point group in the two-dimensional plane (positions disregarding height in three-dimensional space) need not coincide with the positions of the grid points gp in the two-dimensional plane (positions disregarding the grid height).
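As one way to picture the grid gd, the sketch below generates grid points gp at a configurable grid interval over a rectangular range; the rectangular bounds, the default spacing, and the function name are assumptions for illustration only.

```python
import numpy as np

def make_grid_points(x_min, x_max, y_min, y_max, spacing=1.0):
    """Generate grid points gp covering a rectangular range at a given grid interval.

    spacing corresponds to the grid interval (e.g. 1 m or 2 m). Heights are
    initialised to 0 here and filled in later from the faces sf (step S14).
    """
    xs = np.arange(x_min, x_max + spacing, spacing)
    ys = np.arange(y_min, y_max + spacing, spacing)
    gx, gy = np.meshgrid(xs, ys)
    return np.column_stack([gx.ravel(), gy.ravel(), np.zeros(gx.size)])
```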
FIG. 12 is a flowchart showing an example of the generation process of the three-dimensional model M. The generation of the three-dimensional model M shown in FIG. 12 is one example; the three-dimensional model may also be generated by other methods.
The terminal control unit 81 projects the sparse point group in three-dimensional space (the XYZ coordinate system), i.e., the three-dimensional point group PG3, onto a two-dimensional plane (the XY plane) to generate a sparse point group projected onto the two-dimensional plane, i.e., the two-dimensional point group PG2 (S11). The terminal control unit 81 may set the height (Z coordinate) of the three-dimensional point group PG3 to 0 to generate the two-dimensional point group PG2. The sparse point group here may be the sparse point group generated in S3 of FIG. 6.
The terminal control unit 81 designates a plurality of adjacent points included in the two-dimensional point group PG2 (S12). Then, for the designated points, taking into account the heights of the three-dimensional point group PG3 before the projection onto the two-dimensional plane, the terminal control unit 81 connects the points in three-dimensional space corresponding to the extracted points on the two-dimensional plane to generate a surface (face) sf (S13). Points may be designated over the whole or a part of the two-dimensional point group PG2. Multiple groups, each containing the points used to generate one face sf, may be formed, and a face sf generated for each group. In this case, for three adjacent points included in the two-dimensional point group PG2, the terminal control unit 81 may triangulate using the corresponding three points included in the three-dimensional point group PG3 to generate a triangular face sf, i.e., perform a Delaunay triangulation. The terminal control unit 81 may also generate the faces sf by a method other than Delaunay triangulation.
Within the range where each generated face sf is projected onto the two-dimensional plane, i.e., within the range enclosed by the designated points of the two-dimensional point group PG2, one or more grid points gp may exist. The terminal control unit 81 sets the height of each grid point gp (grid height) to the position of the face sf at that grid point. That is, the terminal control unit 81 may take as the grid height the height of the intersection between the face sf and a vertical line passing through the grid point gp. In this way, the terminal control unit 81 calculates the three-dimensional position of each grid point gp (mesh point) (S14).
The terminal control unit 81 generates the three-dimensional model M based on the three-dimensional positions of the grid points gp (S15). The shape of the three-dimensional model M may be defined by the three-dimensional positions of the grid points gp. Moreover, since the three-dimensional position of each grid point gp lies on one of the faces sf, the shape of the three-dimensional model M may be the combined shape of the faces sf. In this way, the terminal 80 can generate and determine the three-dimensional model M from the three-dimensional positions of the grid points gp of the grid gd.
In summary, the terminal control unit 81 may project the plurality of three-dimensional points included in the sparse point group (an example of sparse point group data) onto a two-dimensional plane; designate, as one group, a plurality of projected two-dimensional points that are adjacent in the two-dimensional plane, and designate a plurality of such groups; connect, per group, the plurality of three-dimensional points included in the sparse point group that correspond to the designated adjacent two-dimensional points, to generate a plurality of faces sf (an example of face data); and generate the three-dimensional model M based on the plurality of faces sf.
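Steps S11 to S15 can be condensed into a short sketch using SciPy, assuming the sparse point group PG3 is available as an (N, 3) array; Delaunay triangulation of the XY projection plays the role of S12 and S13, and linear interpolation over those triangles recovers each grid height as in S14. The names and the use of SciPy are assumptions of this illustration, not the embodiment's prescribed implementation.

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.interpolate import LinearNDInterpolator

def build_terrain_model(pg3, grid_xy):
    """Sketch of S11-S15: project, triangulate in 2D, lift to faces sf, sample grid heights.

    pg3:     (N, 3) sparse point group PG3 obtained from the captured images.
    grid_xy: (M, 2) XY coordinates of the grid points gp.
    Returns the triangle vertex indices (the faces sf) and (M, 3) grid points
    whose heights are where the vertical line through each gp meets its face sf.
    """
    pg2 = pg3[:, :2]                       # S11: project PG3 onto the XY plane (drop Z)
    tri = Delaunay(pg2)                    # S12-S13: each simplex groups 3 adjacent 2D
                                           # points; the original Z lifts it to a face sf
    interp = LinearNDInterpolator(tri, pg3[:, 2])
    grid_z = interp(grid_xy)               # S14: grid height at each grid point gp
                                           # (NaN where a gp lies outside all faces)
    grid_points = np.column_stack([grid_xy, grid_z])
    return tri.simplices, grid_points      # S15: faces and grid points define model M
```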
In this way, the terminal 80 temporarily projects the three-dimensional point group PG3 onto the two-dimensional plane to generate the two-dimensional point group PG2, and generates the faces based on the adjacency relationships of the two-dimensional point group PG2. The terminal 80 therefore obtains a smooth shape more easily than if the faces sf were generated based on the adjacency relationships of the three-dimensional point group PG3. By deriving the shape of the three-dimensional model M from the adjacency relationships of the two-dimensional point group PG2, the terminal 80 can thus improve the accuracy with which the actual terrain is reproduced.
FIG. 13 shows an example of the three-dimensional point group PG3 and the two-dimensional point group PG2.
The terminal control unit 81 generates the two-dimensional point group PG2 based on the three-dimensional point group PG3: it projects each point included in the three-dimensional point group PG3 in three-dimensional space (XYZ space) onto the two-dimensional plane (the XY plane) to generate the points included in the two-dimensional point group PG2. The terminal control unit 81, for example, designates the adjacent points P21 to P23 (an example of two-dimensional points) included in the two-dimensional point group PG2. The points P31 to P33 (an example of three-dimensional points) in the three-dimensional point group PG3 corresponding to P21 to P23 are the points before P21 to P23 were projected onto the two-dimensional plane, and are the adjacent points used to generate one face sf1.
Accordingly, since the terminal 80 designates multiple points that are adjacent in the two-dimensional plane and connects the corresponding points in the three-dimensional point group PG3 to generate one face sf, faces sf that intersect one another or are discontinuous in three-dimensional space can be suppressed. The terminal 80 can therefore generate a three-dimensional model M whose shape better conforms to the shape of the terrain.
Next, the generation results of the three-dimensional model M in a comparative example and in the present embodiment will be described.
In the comparative example, it is assumed that a sparse three-dimensional model is generated from sparse point group data by the Screened Poisson surface reconstruction algorithm, which is described in reference non-patent document 1 below. (Reference non-patent document 1: Michael Kazhdan, Hugues Hoppe, "Screened Poisson surface reconstruction," ACM Transactions on Graphics (TOG), Volume 32, Issue 3, June 2013, Article No. 29.)
FIG. 14 shows a sparse three-dimensional model generated by the Screened Poisson surface reconstruction algorithm as the comparative example.
In FIG. 14, the terrains G1 and G2 have shapes protruding in the horizontal direction, which differ from the actual terrain. This is considered to be because, in the comparative example, a vertical line along the plumb direction in three-dimensional space intersects the surface derived from the sparse point group (three-dimensional point group) at multiple points. It is also because, since the comparative example uses a sparse point group, the reproducibility of the three-dimensional model is low, and the accuracy of the connection relationships among the points of the three-dimensional point group used to generate the surface decreases.
FIG. 15 shows an example of the three-dimensional model M generated by the generation process of the three-dimensional model M of the present embodiment.
In the present embodiment, the terminal control unit 81 projects the sparse point group (three-dimensional point group PG3) onto a two-dimensional plane and, for example, performs triangulation. That is, contiguous triangles can be generated in the two-dimensional plane. Therefore, even in three-dimensional space, the terminal 80 can generate triangles that are contiguous in the directions along the two-dimensional plane, and can prevent a vertical line along the plumb direction in three-dimensional space from intersecting the faces sf derived from the sparse point group at multiple points. For example, while FIG. 14 contains the terrains G1 and G2 protruding in the horizontal direction, in FIG. 15 the corresponding portions become the smooth terrain G3. Thus, by temporarily projecting the three-dimensional point group PG3 onto the two-dimensional plane, the continuity of the faces sf formed in three-dimensional space by connecting the points is guaranteed. The height of a face sf in three-dimensional space may be based on the heights of the three-dimensional point group before its projection onto the two-dimensional plane.
As described above, the terminal 80 (an example of an image generation device) can generate a composite image SG based on a plurality of captured images GM captured by the unmanned aerial vehicle 100 (an example of a flying body). The terminal 80 may include the terminal control unit 81 (an example of a processing unit) that performs processing related to the generation of the composite image SG. The terminal control unit 81 may acquire the plurality of captured images GM captured by the imaging unit 220 (an example of an imaging device) of the unmanned aerial vehicle 100; generate a three-dimensional model M based on the plurality of captured images GM; acquire at least one of the respective positions and respective attitudes of the imaging unit 220 at the time of capturing the plurality of captured images GM; derive the distances D between the respective positions of the imaging unit 220 and the three-dimensional model M based on the three-dimensional model M and at least one of the respective positions and respective attitudes of the imaging unit 220; adjust the sizes of the plurality of captured images GM based on the distances D between the respective positions of the imaging unit 220 and the three-dimensional model M; and synthesize the resized captured images GM to generate the composite image SG. The terminal control unit 81 may also generate sparse point group data based on the plurality of captured images GM and generate the three-dimensional model M based on that sparse point group data.
Thus, the terminal 80 resizes the captured images GM and synthesizes them so that the sizes of the subjects appearing in the captured images GM match, and can therefore improve the reproducibility of the subjects represented in the composite image. Furthermore, to generate the composite image, the terminal 80 does not need to perform in sequence the full pipeline of sparse point generation, dense point generation, mesh generation, texture generation, and so on. The terminal 80 can therefore reduce the processing load and shorten the processing time for generating the composite image. As a result, the terminal 80 can easily generate a composite image even on a tablet terminal or similar device whose computing capability is not very high. In addition to the composite image, the terminal 80 can also generate an orthoimage or the like.
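Tying the steps together, the following is a high-level sketch of the flow summarised above; the distance callback, the mosaicking callback, and the choice of cv2.resize for the size adjustment are all assumptions of this illustration rather than the embodiment's prescribed implementation.

```python
import cv2

def generate_composite(images, camera_positions, distance_to_model, mosaic):
    """End-to-end sketch: derive D per image, resize by SA = D / D_ref, then composite.

    images:            list of captured images GM (NumPy arrays).
    camera_positions:  one camera position per image (e.g. from SfM or flight logs).
    distance_to_model: callable returning the distance D between a camera position
                       and the three-dimensional model M (hypothetical helper).
    mosaic:            callable stitching the resized images into the composite
                       image SG (hypothetical placeholder for the final step).
    """
    distances = [distance_to_model(p) for p in camera_positions]
    d_ref = min(distances)  # bring every image to the scale of the closest shot
    resized = [cv2.resize(img, None, fx=d / d_ref, fy=d / d_ref,
                          interpolation=cv2.INTER_LINEAR)
               for img, d in zip(images, distances)]
    return mosaic(resized)
```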
The present disclosure has been described above using embodiments, but the technical scope of the present disclosure is not limited to the scope described in the above embodiments. It is obvious to those skilled in the art that various changes and improvements can be made to the above embodiments, and it is also apparent from the claims that embodiments with such changes or improvements are included in the technical scope of the present disclosure.
The execution order of the operations, procedures, steps, stages, and other processes in the devices, systems, programs, and methods shown in the claims, the specification, and the drawings may be realized in any order, unless "before", "prior to", or the like is explicitly indicated and as long as the output of a preceding process is not used in a subsequent process. Even where the operation flows in the claims, the specification, and the drawings are described using "first", "next", and the like for convenience, this does not mean that they must be implemented in that order.
[Description of symbols]
10 Flight system
80 Terminal
81 Terminal control unit
83 Operation unit
85 Communication unit
87 Memory
88 Display unit
89 Storage
100 Unmanned aerial vehicle
110 UAV control unit
150 Communication interface
160 Memory
170 Storage
200 Gimbal
210 Rotating blade mechanism
220, 230 Imaging unit
240 GPS receiver
250 Inertial measurement device
260 Magnetic sensor
270 Barometer
280 Ultrasonic sensor
290 Laser measuring instrument
CR Shooting range
DR Divided region
GM Captured image
SG Composite image

Claims (18)

  1. An image generation device that generates a composite image based on a plurality of captured images captured by a flying body, characterized by comprising a processing unit that performs processing related to the generation of the composite image,
    wherein the processing unit acquires a plurality of captured images captured by an imaging device included in the flying body; generates a three-dimensional model based on the plurality of captured images; acquires respective attitudes of the imaging device at the time of capturing the plurality of captured images; calculates, based on the respective attitudes of the imaging device and the three-dimensional model, the distances between the three-dimensional model and the respective positions of the imaging device at the time of capturing the plurality of captured images; adjusts the sizes of the plurality of captured images based on the distances between the respective positions of the imaging device and the three-dimensional model; and synthesizes the size-adjusted plurality of captured images to generate a composite image.
  2. The image generation device according to claim 1, characterized in that the processing unit acquires the respective positions and the respective attitudes of the imaging device at the time of capturing the plurality of captured images, and calculates the distances between the respective positions of the imaging device and the three-dimensional model based on the respective positions and respective attitudes of the imaging device and the three-dimensional model.
  3. The image generation device according to claim 1 or 2, characterized in that the processing unit calculates, for each imaging range captured by the imaging device at each position, the distance between the imaging device and a first portion of the three-dimensional model corresponding to that imaging range.
  4. The image generation device according to claim 1 or 2, characterized in that the processing unit divides the imaging range captured at each position of the imaging device to produce divided regions of the imaging range; calculates a second portion of the three-dimensional model corresponding to each divided region; and calculates, for each divided region, the distance between the imaging device and the second portion of the three-dimensional model corresponding to that divided region.
  5. The image generation device according to any one of claims 1 to 4, characterized in that the distance is the distance in the vertical direction between each position of the imaging device and the three-dimensional model.
  6. The image generation device according to any one of claims 1 to 4, characterized in that the distance is the distance in the imaging direction of the imaging device between each position of the imaging device and the three-dimensional model.
  7. The image generation device according to any one of claims 1 to 6, characterized in that the processing unit generates sparse point group data based on the plurality of captured images, and generates the three-dimensional model based on the sparse point group data.
  8. The image generation device according to claim 7, characterized in that the processing unit projects a plurality of three-dimensional points included in the sparse point group data onto a two-dimensional plane; designates, as one group, a plurality of projected two-dimensional points that are adjacent in the two-dimensional plane, and designates a plurality of such groups; connects, for each group, the plurality of three-dimensional points included in the sparse point group data corresponding to the designated adjacent two-dimensional points, to produce a plurality of pieces of face data; and generates the three-dimensional model based on the plurality of pieces of face data.
  9. An image generation method for generating a composite image based on a plurality of captured images captured by a flying body, characterized by comprising the following steps:
    acquiring a plurality of captured images captured by an imaging device included in the flying body;
    generating a three-dimensional model based on the plurality of captured images;
    acquiring respective attitudes of the imaging device at the time of capturing the plurality of captured images;
    calculating the distances between the respective positions of the imaging device and the three-dimensional model based on the respective attitudes of the imaging device and the three-dimensional model;
    adjusting the sizes of the plurality of captured images based on the distances between the respective positions of the imaging device and the three-dimensional model; and
    synthesizing the size-adjusted plurality of captured images to generate a composite image.
  10. The image generation method according to claim 9, characterized in that the step of acquiring the attitudes includes the step of acquiring the respective positions and the respective attitudes of the imaging device at the time of capturing the plurality of captured images; and
    the step of calculating the distances includes the step of calculating the distances between the respective positions of the imaging device and the three-dimensional model based on the respective positions and respective attitudes of the imaging device and the three-dimensional model.
  11. The image generation method according to claim 9 or 10, characterized in that the step of calculating the distances includes the step of calculating, for each imaging range captured by the imaging device at each position, the distance between the imaging device and a first portion of the three-dimensional model corresponding to that imaging range.
  12. The image generation method according to claim 9 or 10, characterized in that the step of calculating the distances includes the following steps:
    dividing the imaging range captured at each position of the imaging device to produce divided regions of the imaging range;
    calculating a second portion of the three-dimensional model corresponding to each divided region; and
    calculating, for each divided region, the distance between the imaging device and the second portion of the three-dimensional model corresponding to that divided region.
  13. The image generation method according to any one of claims 9 to 12, characterized in that the distance is the distance in the vertical direction between each position of the imaging device and the three-dimensional model.
  14. The image generation method according to any one of claims 9 to 12, characterized in that the distance is the distance in the imaging direction of the imaging device between each position of the imaging device and the three-dimensional model.
  15. The image generation method according to any one of claims 9 to 14, characterized in that the step of generating the three-dimensional model includes the following steps:
    generating sparse point group data based on the plurality of captured images; and
    generating the three-dimensional model based on the sparse point group data.
  16. The image generation method according to claim 15, characterized in that the step of generating the three-dimensional model includes the following steps:
    projecting a plurality of three-dimensional points included in the sparse point group data onto a two-dimensional plane;
    designating, as one group, a plurality of projected two-dimensional points that are adjacent in the two-dimensional plane, and designating a plurality of such groups;
    connecting, for each group, the plurality of three-dimensional points included in the sparse point group data corresponding to the designated adjacent two-dimensional points, to produce a plurality of pieces of face data; and
    generating the three-dimensional model based on the plurality of pieces of face data.
  17. A program for causing an image generation device, which generates a composite image based on a plurality of captured images captured by a flying body, to execute the following steps:
    acquiring a plurality of captured images captured by an imaging device included in the flying body;
    generating a three-dimensional model based on the plurality of captured images;
    acquiring respective attitudes of the imaging device at the time of capturing the plurality of captured images;
    calculating, based on the respective attitudes of the imaging device and the three-dimensional model, the distances between the respective positions of the imaging device at the time of capturing the plurality of captured images and the three-dimensional model;
    adjusting the sizes of the plurality of captured images based on the distances between the respective positions of the imaging device and the three-dimensional model; and
    synthesizing the size-adjusted plurality of captured images to generate a composite image.
  18. A computer-readable recording medium recording a program for causing an image generation device, which generates a composite image based on a plurality of captured images captured by a flying body, to execute the following steps:
    acquiring a plurality of captured images captured by an imaging device included in the flying body;
    generating a three-dimensional model based on the plurality of captured images;
    acquiring respective attitudes of the imaging device at the time of capturing the plurality of captured images;
    calculating, based on the respective attitudes of the imaging device and the three-dimensional model, the distances between the respective positions of the imaging device at the time of capturing the plurality of captured images and the three-dimensional model;
    adjusting the sizes of the plurality of captured images based on the distances between the respective positions of the imaging device and the three-dimensional model; and
    synthesizing the size-adjusted plurality of captured images to generate a composite image.
PCT/CN2019/117466 2018-11-30 2019-11-12 Image generating device, image generating method, program and recording medium WO2020108290A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201980009014.0A CN111656760A (en) 2018-11-30 2019-11-12 Image generation device, image generation method, program, and recording medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-225741 2018-11-30
JP2018225741A JP2020088821A (en) 2018-11-30 2018-11-30 Image generation device, image generation method, program, and recording medium

Publications (1)

Publication Number Publication Date
WO2020108290A1 2020-06-04

Family ID: 70852385

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/117466 WO2020108290A1 (en) 2018-11-30 2019-11-12 Image generating device, image generating method, program and recording medium

Country Status (3)

Country Link
JP (1) JP2020088821A (en)
CN (1) CN111656760A (en)
WO (1) WO2020108290A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1788188A (en) * 2003-06-20 2006-06-14 三菱电机株式会社 Picked-up image display method
CN101919235A (en) * 2008-01-21 2010-12-15 株式会社博思科 Orthophotographic image creating method and imaging device
CN103329518A (en) * 2011-01-11 2013-09-25 松下电器产业株式会社 Image capturing system, camera control device for use therein, image capturing method, camera control method, and computer program
CN104980651A (en) * 2014-04-04 2015-10-14 佳能株式会社 Image processing apparatus and control method
CN106464843A (en) * 2014-09-05 2017-02-22 堺显示器制品株式会社 Image generation apparatus, image generation method, and computer program
WO2018168405A1 (en) * 2017-03-16 2018-09-20 富士フイルム株式会社 Image compositing device, image compositing method, and program
WO2018209898A1 (en) * 2017-05-19 2018-11-22 深圳市大疆创新科技有限公司 Information processing device, aerial photographing path generation method, aerial photographing path generation system, program and recording medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001014492A (en) * 1999-06-29 2001-01-19 Sony Corp Method and device for generating triangle meshes
US9185289B2 (en) * 2013-06-10 2015-11-10 International Business Machines Corporation Generating a composite field of view using a plurality of oblique panoramic images of a geographic area
CN108628337A (en) * 2017-03-21 2018-10-09 株式会社东芝 Coordinates measurement device, contouring system and path generating method
WO2018198634A1 (en) * 2017-04-28 2018-11-01 ソニー株式会社 Information processing device, information processing method, information processing program, image processing device, and image processing system

Also Published As

Publication number Publication date
CN111656760A (en) 2020-09-11
JP2020088821A (en) 2020-06-04

Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19888873; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 19888873; Country of ref document: EP; Kind code of ref document: A1)