WO2020052549A1 - Information processing device, flight path generation method, program, and recording medium

Information processing device, flight path generation method, program, and recording medium

Info

Publication number: WO2020052549A1
Authority: WO (WIPO (PCT))
Application number: PCT/CN2019/105125
Prior art keywords: imaging, angle, flight path, flight, cost
Other languages: English (en), French (fr)
Inventors: 顾磊 (GU, Lei), 沈思杰 (SHEN, Sijie)
Original assignee: 深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Application filed by: 深圳市大疆创新科技有限公司
Priority application: CN201980005546.7A (published as CN111344650B)
Publication: WO2020052549A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C: AEROPLANES; HELICOPTERS
    • B64C13/00: Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
    • B64C13/02: Initiating means
    • B64C13/16: Initiating means actuated automatically, e.g. responsive to gust detectors
    • B64C13/18: Initiating means actuated automatically, e.g. responsive to gust detectors, using automatic pilot
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00: Constructional aspects of UAVs
    • B64U20/80: Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/87: Mounting of imaging devices, e.g. mounting of gimbals
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10: Simultaneous control of position or course in three dimensions
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00: Type of UAV
    • B64U10/10: Rotorcrafts
    • B64U10/13: Flying platforms
    • B64U10/14: Flying platforms with four distinct rotor axes, e.g. quadcopters
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00: UAVs specially adapted for particular uses or applications
    • B64U2101/30: UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B64U2101/32: UAVs specially adapted for particular uses or applications for imaging, photography or videography for cartography or topography
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00: UAVs characterised by their flight controls
    • B64U2201/20: Remote controls
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U50/00: Propulsion; Power supply
    • B64U50/10: Propulsion
    • B64U50/19: Propulsion using electrically powered motors

Description

  • the present disclosure relates to an information processing device, a flight path generation method, a program, and a recording medium that generate a flight path for a flying body.
  • a platform (unmanned aerial vehicle) that performs shooting while passing through a preset fixed path is known (see Patent Document 1).
  • this platform receives imaging instructions from a ground base and photographs the imaging target.
  • while flying along the fixed path, this platform tilts its imaging device to shoot according to the positional relationship between the platform and the imaging target.
  • Patent Document 1: Japanese Patent Application Laid-Open No. 2010-61216
  • in such a platform, the imaging angle for capturing an imaging target is determined so that the imaging target enters the imaging range.
  • that is, the imaging angle is not determined so as to photograph the terrain from the front. Since there may be places where the front of the terrain is difficult to photograph, the amount of information obtained for each point of the terrain may be reduced.
  • on the other hand, if the terrain is photographed at various imaging angles so that the amount of information for each point of the terrain becomes sufficient, images may be captured at unnecessary imaging angles and the imaging efficiency may decrease. Therefore, it is desirable that the flying body can obtain a large amount of frontal terrain information while suppressing a decrease in its imaging efficiency.
  • the processing unit may obtain candidate angles, which are candidates for the imaging angle for photographing the terrain of the flight range; calculate, for each candidate angle, a first imaging cost, which is the imaging cost incurred when imaging is performed at the imaging position at that candidate angle; and determine a candidate angle whose first imaging cost at the imaging position is greater than or equal to a first threshold as the imaging angle at that imaging position.
  • the processing unit may sample the terrain of the flight range to obtain a plurality of sampling positions to be photographed by the flying body; calculate, for each sampling position, the imaging cost incurred when that sampling position is photographed from the imaging position at the candidate angle, that is, a second imaging cost; and calculate the first imaging cost by adding up the second imaging costs at the respective sampling positions.
  • the processing unit may exclude any second imaging cost whose inner product value is negative from the calculation of the first imaging cost.
  • the processing unit may acquire a region of interest that is included in the flight range and includes the position of the imaging target, and derive the imaging angle for each imaging position in the flight path based on the terrain information of the region of interest and the flight path.
  • the processing unit may obtain candidate angles, which are candidates for the imaging angle for photographing the terrain of the flight range; calculate, for each candidate angle, a third imaging cost, which is the imaging cost incurred when the region of interest is photographed from the imaging position at that candidate angle; and determine a candidate angle whose third imaging cost at the imaging position is greater than or equal to a second threshold as the imaging angle at that imaging position.
  • when the third imaging cost of imaging at the imaging angle at a first imaging position among the plurality of imaging positions is less than or equal to a third threshold, the processing unit may exclude the first imaging position from the plurality of imaging positions to generate the flight path.
  • the processing unit may classify the plurality of imaging positions by the imaging target photographed from each imaging position to generate a plurality of imaging position groups, and connect the plurality of imaging position groups to generate the flight path.
  • when the information processing device is a terminal including a communication unit, the processing unit may transmit the information of the imaging position, the flight path, and the imaging angle to the flying body via the communication unit.
  • when the information processing device is a flying body including an imaging section, the processing section may control flight according to the flight path and capture images at the imaging angle at each imaging position of the flight path via the imaging section.
  • a flight path generation method of an information processing device that generates a flight path for flying a flying body includes the following steps: acquiring terrain information of a flight range in which the flying body flies; generating, based on the terrain information of the flight range, a flight path including imaging positions in a three-dimensional space for photographing the terrain of the flight range; and deriving, based on the terrain information of the flight range and the flight path, an imaging angle for photographing the terrain of the flight range for each imaging position in the flight path.
  • the step of deriving the imaging angle may include the following steps: obtaining candidate angles, which are candidates for the imaging angle for photographing the terrain of the flight range; calculating, for each candidate angle, the imaging cost incurred when imaging is performed at the imaging position at that candidate angle, that is, a first imaging cost; and determining a candidate angle whose first imaging cost at the imaging position is greater than or equal to a first threshold as the imaging angle at that imaging position.
  • the step of calculating the first imaging cost may include the following steps: sampling the terrain of the flight range to obtain a plurality of sampling positions to be photographed by the flying body; calculating, for each sampling position, the imaging cost incurred when that sampling position is photographed from the imaging position at the candidate angle, that is, a second imaging cost; and calculating the first imaging cost by adding up the second imaging costs at the respective sampling positions.
  • the step of calculating the first imaging cost may include the following step: excluding any second imaging cost whose inner product value is negative from the calculation of the first imaging cost.
  • the step of deriving the imaging angle may include the following steps: obtaining a region of interest that is included in the flight range and includes the position of the imaging target; and deriving the imaging angle for each imaging position in the flight path based on the terrain information of the region of interest and the flight path.
  • the step of deriving the imaging angle may include the following steps: obtaining candidate angles, which are candidates for the imaging angle for photographing the terrain of the flight range; calculating, for each candidate angle, the imaging cost incurred when the region of interest is photographed from the imaging position at that candidate angle, that is, a third imaging cost; and determining a candidate angle whose third imaging cost at the imaging position is greater than or equal to a second threshold as the imaging angle at that imaging position.
  • the step of generating the flight path may include the following step: when the third imaging cost of imaging at the imaging angle at a first imaging position among the plurality of imaging positions is less than or equal to a third threshold, excluding the first imaging position from the plurality of imaging positions to generate the flight path.
  • the step of generating the flight path may include the following steps: classifying the plurality of imaging positions by the imaging target photographed from each imaging position to generate a plurality of imaging position groups; and connecting the plurality of imaging position groups to generate the flight path.
  • when the information processing device is a terminal, the method may further include the following step: transmitting information of the imaging position, the flight path, and the imaging angle to the flying body.
  • when the information processing device is a flying body, the method may further include the following steps: controlling flight according to the flight path; and capturing an image at the imaging angle at each imaging position of the flight path.
  • a program causes an information processing device that generates a flight path for flying a flying body to execute the following steps: acquiring terrain information of a flight range in which the flying body flies; generating, based on the terrain information of the flight range, a flight path including imaging positions in a three-dimensional space for photographing the terrain of the flight range; and deriving, based on the terrain information of the flight range and the flight path, an imaging angle for photographing the terrain of the flight range for each imaging position in the flight path.
  • a recording medium is a computer-readable recording medium on which is recorded a program that causes an information processing device that generates a flight path for flying a flying body to execute the following steps: acquiring terrain information of a flight range in which the flying body flies; generating, based on the terrain information of the flight range, a flight path including imaging positions in a three-dimensional space for photographing the terrain of the flight range; and deriving, based on the terrain information of the flight range and the flight path, an imaging angle for photographing the terrain of the flight range for each imaging position in the flight path.
  • FIG. 1 is a schematic diagram showing a first configuration example of a flying body system in a first embodiment.
  • FIG. 2 is a schematic diagram showing a second configuration example of the flying body system in the first embodiment.
  • FIG. 3 is a diagram showing an example of a specific appearance of an unmanned aircraft.
  • FIG. 4 is a block diagram showing an example of a hardware configuration of an unmanned aircraft.
  • FIG. 5 is a block diagram showing an example of a hardware configuration of a terminal.
  • FIG. 6 is a diagram illustrating an example of a flight range, a flight path, and an imaging position.
  • FIG. 7 is a diagram showing an example of a table representing candidates for imaging angles.
  • FIG. 8 is a diagram showing an example of a sampling position provided on an uneven ground.
  • FIG. 9 is a diagram explaining an example of calculating an imaging cost corresponding to a candidate angle.
  • FIG. 10 is a sequence diagram showing an example of a flight path generation process in the flying body system in the first embodiment.
  • FIG. 11 is a sequence diagram showing an example of a flight path generation process in a flying body system in the second embodiment.
  • FIG. 12 is a diagram showing an example of specifying a region of interest.
  • FIG. 13 is a diagram illustrating a process of regenerating a flight path for capturing a region of interest.
  • FIG. 14 is a sequence diagram showing an example of a flight path generation process in a flying body system in the third embodiment.
  • FIG. 15 is a sequence diagram illustrating an example of a flight path generation process in a flying body system in the fourth embodiment.
  • FIG. 16 is a diagram illustrating a situation in which the unmanned aerial vehicle photographs the terrain immediately below while flying along the terrain.
  • FIG. 17 is a diagram showing a situation where the unmanned aerial vehicle photographs the terrain at a certain angle while flying along the terrain.
  • FIG. 18 is a diagram showing a situation in which the unmanned aerial vehicle photographs the terrain at various angles while flying along the terrain.
  • FIG. 16 is a diagram showing a situation in which the unmanned aircraft 100R as a flying body photographs the terrain ms immediately below while flying along the terrain ms.
  • when the terrain ms immediately below is a hillside ms1, the frontal information of the terrain cannot be captured sufficiently.
  • FIG. 17 is a diagram illustrating a situation where the unmanned aerial vehicle 100R photographs the terrain ms at a certain angle while flying along the terrain ms.
  • when the unmanned aerial vehicle 100R shoots at a certain fixed angle, it may be difficult to photograph the far side ms2 of the mountain; that is, frontal terrain information cannot be obtained there.
  • FIG. 18 is a diagram illustrating a situation where the unmanned aerial vehicle 100R photographs the terrain ms at various angles while flying along the terrain ms.
  • in this case, unnecessary images may be captured; that is, the imaging efficiency of photographing the terrain shape decreases.
  • in the platform described above, the imaging angle for capturing an imaging target is determined so that the imaging target enters the imaging range. That is, the imaging angle is not determined so as to photograph the terrain from the front. Since there may be places where the front of the terrain is difficult to photograph, the amount of information obtained for each point of the terrain may be reduced.
  • the flying object is an unmanned aircraft (UAV: Unmanned Aerial Vehicle) as an example.
  • unmanned aircraft include aircraft that move through the air.
  • an unmanned aircraft is also expressed as "UAV".
  • as the information processing device, a terminal is taken as an example, but other devices (such as a transmitter, a PC (Personal Computer), an unmanned aircraft, or another information processing device) may also be used.
  • the flight path generation method specifies operations in the information processing apparatus.
  • the program is, for example, a program that causes the information processing device to execute various processes, and is recorded in the recording medium.
  • FIG. 1 is a diagram showing a first configuration example of a flying body system 10 in the first embodiment.
  • the flying body system 10 includes an unmanned aircraft 100 and a terminal 80.
  • the unmanned aircraft 100 and the terminal 80 may communicate with each other through wired communication or wireless communication (for example, wireless LAN (Local Area Network)).
  • the terminal 80 is exemplified as a portable terminal (for example, a smart phone or a tablet terminal).
  • the terminal 80 is an example of an information processing apparatus.
  • the configuration of the flying body system 10 may include an unmanned aircraft 100, a transmitter (proportional controller), and a terminal 80.
  • when the transmitter is included, the user can use the left and right joysticks on the front of the transmitter to direct the flight of the unmanned aircraft 100.
  • the unmanned aircraft 100, the transmitter, and the terminal 80 can communicate with each other through wired communication or wireless communication.
  • FIG. 2 is a schematic diagram showing a second configuration example of the flying body system 10 in the first embodiment.
  • in FIG. 2, the terminal 80 is a PC.
  • in this case as well, the terminal 80 may have the same functions.
  • FIG. 3 is a diagram showing an example of a specific appearance of the unmanned aircraft 100.
  • a perspective view of the unmanned aircraft 100 when flying in the moving direction STV0 is shown.
  • the unmanned aircraft 100 is an example of a moving body.
  • a roll axis (see the x-axis) may be provided in a direction parallel to the ground and along the moving direction STV0.
  • a pitch axis (see the y-axis) may be provided in a direction parallel to the ground and perpendicular to the roll axis.
  • a yaw axis (see the z-axis) may be provided in a direction perpendicular to the ground and perpendicular to both the roll axis and the pitch axis.
  • the unmanned aerial vehicle 100 is configured to include a UAV body 102, a gimbal 200, an imaging unit 220, and a plurality of imaging units 230.
  • the UAV body 102 includes a plurality of rotors (propellers).
  • the UAV body 102 controls the rotation of a plurality of rotors to fly the unmanned aircraft 100.
  • the UAV body 102 uses, for example, four rotors to fly the unmanned aircraft 100.
  • the number of rotors is not limited to four.
  • the unmanned aircraft 100 may be a fixed-wing aircraft without a rotor.
  • the imaging unit 220 is a photographing camera that photographs a subject included in the desired imaging range (for example, the area above the imaging target, scenery such as mountains and rivers, or buildings on the ground).
  • the plurality of imaging units 230 are sensing cameras that capture the surroundings of the drone 100 in order to control the flight of the drone 100.
  • the two camera units 230 may be provided on the nose of the unmanned aircraft 100, that is, on the front side.
  • the other two imaging units 230 may be provided on the bottom surface of the drone 100.
  • the two image pickup units 230 on the front side may be paired and function as a so-called stereo camera.
  • the two imaging units 230 on the bottom surface side may be paired to function as a stereo camera.
  • three-dimensional spatial data around the unmanned aircraft 100 may be generated based on the images captured by the plurality of imaging units 230.
  • the number of imaging units 230 included in the unmanned aerial vehicle 100 is not limited to four.
  • the unmanned aerial vehicle 100 only needs to include at least one imaging unit 230.
  • the unmanned aircraft 100 may include at least one camera 230 on the nose, tail, side, bottom, and top surfaces of the unmanned aircraft 100, respectively.
  • the angle of view settable in the imaging section 230 may be larger than the angle of view settable in the imaging section 220.
  • the imaging unit 230 may include a single focus lens or a fisheye lens.
  • FIG. 4 is a block diagram showing an example of a hardware configuration of the unmanned aircraft 100.
  • the unmanned aerial vehicle 100 includes a UAV control unit 110, a communication interface 150, a memory 160, a memory 170, a gimbal 200, a rotor mechanism 210, an imaging unit 220, imaging units 230, a GPS receiver 240, an inertial measurement unit (IMU: Inertial Measurement Unit) 250, a magnetic compass 260, a barometric altimeter 270, an ultrasonic sensor 280, and a laser measuring instrument 290.
  • the UAV control unit 110 is composed of, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor).
  • the UAV control section 110 performs signal processing for overall control of the operations of each section of the unmanned aerial vehicle 100, data input / output processing with other sections, data operation processing, and data storage processing.
  • the UAV control unit 110 controls the flight of the unmanned aircraft 100 according to a program stored in the memory 160.
  • the UAV control unit 110 may control flight.
  • the UAV control unit 110 can capture an image (for example, aerial photography).
  • the UAV control unit 110 acquires position information indicating the position of the UAV 100.
  • the UAV control unit 110 may obtain position information indicating the latitude, longitude, and altitude of the unmanned aircraft 100 from the GPS receiver 240.
  • the UAV control unit 110 may obtain latitude and longitude information indicating the latitude and longitude of the unmanned aircraft 100 from the GPS receiver 240, and obtain altitude information indicating the altitude of the unmanned aircraft 100 from the barometric altimeter 270 as position information.
  • the UAV control unit 110 may obtain the distance between the radiation point of the ultrasonic wave generated by the ultrasonic sensor 280 and the reflection point of the ultrasonic wave as height information.
  • the UAV control unit 110 may obtain orientation information indicating the orientation of the unmanned aircraft 100 from the magnetic compass 260.
  • the orientation information may be expressed in an orientation corresponding to the orientation of the nose of the unmanned aircraft 100, for example.
  • the UAV control unit 110 may acquire position information indicating a position where the unmanned aerial vehicle 100 should exist when the imaging unit 220 photographs an imaging range to be photographed.
  • the UAV control unit 110 may acquire position information indicating a position where the unmanned aircraft 100 should exist from the memory 160.
  • the UAV control unit 110 may obtain position information indicating a position where the unmanned aircraft 100 should exist from another device via the communication interface 150.
  • the UAV control unit 110 may refer to the three-dimensional map database to specify a position where the unmanned aircraft 100 can exist, and obtain the position as position information indicating the position where the unmanned aircraft 100 should exist.
  • the UAV control unit 110 can acquire imaging range information indicating the respective imaging ranges of the imaging unit 220 and the imaging unit 230.
  • the UAV control unit 110 may acquire angle information indicating the angles of view of the imaging unit 220 and the imaging unit 230 from the imaging unit 220 and the imaging unit 230 as parameters for specifying the imaging range.
  • the UAV control unit 110 may acquire information indicating the imaging directions of the imaging unit 220 and the imaging unit 230 as parameters for specifying the imaging range.
  • the UAV control unit 110 may acquire, for example, posture information indicating the posture state of the imaging unit 220 from the gimbal 200 as the information indicating the imaging direction of the imaging unit 220.
  • the posture information of the imaging unit 220 may indicate an angle at which the pitch axis and the yaw axis of the gimbal 200 are rotated from the reference rotation angle.
  • the UAV control unit 110 may acquire position information indicating the position of the unmanned aircraft 100 as a parameter for specifying the imaging range.
  • the UAV control unit 110 may define the imaging range representing the geographic range to be captured by the imaging unit 220 based on the angles of view and imaging directions of the imaging unit 220 and the imaging units 230 and on the position of the unmanned aircraft 100, and generate imaging range information, thereby acquiring the imaging range information.
  • the UAV control unit 110 may acquire imaging range information from the memory 160.
  • the UAV control unit 110 can acquire imaging range information via the communication interface 150.
  • the UAV control unit 110 controls the gimbal 200, the rotor mechanism 210, the imaging unit 220, and the imaging unit 230.
  • the UAV control unit 110 may control the imaging range of the imaging unit 220 by changing the imaging direction or viewing angle of the imaging unit 220.
  • the UAV control unit 110 may control the imaging range of the imaging unit 220 supported by the gimbal 200 by controlling the rotation mechanism of the gimbal 200.
  • the imaging range refers to a geographic range captured by the imaging section 220 or the imaging section 230.
  • the imaging range is defined by latitude, longitude, and altitude.
  • the imaging range may be a range of three-dimensional spatial data defined by latitude, longitude, and height.
  • the imaging range may be a range of two-dimensional spatial data defined by latitude and longitude.
  • the imaging range can be specifically specified according to the viewing angle and imaging direction of the imaging unit 220 or the imaging unit 230 and the position where the unmanned aircraft 100 is located.
  • the imaging directions of the imaging unit 220 and the imaging units 230 can be defined by the azimuth and the depression angle of the direction in which the front of each imaging lens faces.
  • the imaging direction of the imaging unit 220 may be a direction specified by the azimuth of the nose of the unmanned aircraft 100 and the posture state of the imaging unit 220 of the gimbal 200.
  • the imaging direction of the imaging unit 230 may be a direction specified by the orientation of the nose of the unmanned aircraft 100 and the position where the imaging unit 230 is provided.
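  • as an illustration of how the angle of view, the imaging direction, and the position combine to determine an imaging range, the following minimal sketch computes the width of the ground area captured by a camera aimed straight down; the function name and the flat-ground, pinhole-camera assumptions are ours, not part of this disclosure.

```python
import math

def ground_footprint_width(distance_m: float, fov_deg: float) -> float:
    """Width of the ground strip captured by a camera aimed straight down,
    assuming flat ground and a pinhole camera model (an illustrative
    simplification, not the method of this disclosure)."""
    return 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)

# Example: from 100 m above flat ground with an 84-degree angle of view,
# the captured strip is about 180 m wide.
print(round(ground_footprint_width(100.0, 84.0), 1))  # 180.1
```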
  • the UAV control unit 110 may specify the environment around the unmanned aerial vehicle 100 by analyzing a plurality of images captured by the plurality of imaging units 230.
  • the UAV control unit 110 may control the flight according to the surrounding environment of the unmanned aircraft 100, for example, avoiding obstacles.
  • the UAV control unit 110 can acquire stereo information (three-dimensional information) indicating a stereo shape (three-dimensional shape) of an object existing around the unmanned aircraft 100.
  • the object may be, for example, a part of a landscape such as a building, a road, a vehicle, or a tree.
  • the stereo information is, for example, three-dimensional spatial data.
  • the UAV control unit 110 may generate stereoscopic information representing a stereoscopic shape of an object existing around the unmanned aerial vehicle 100 based on each image obtained from the plurality of imaging units 230, thereby acquiring stereoscopic information.
  • the UAV control unit 110 may acquire stereoscopic information indicating a stereoscopic shape of an object existing around the unmanned aircraft 100 by referring to a three-dimensional map database stored in the memory 160 or the memory 170.
  • the UAV control unit 110 may acquire stereoscopic information related to the stereoscopic shape of an object existing around the drone 100 by referring to a three-dimensional map database managed by a server existing on the network.
  • the UAV control unit 110 controls the flight of the unmanned aircraft 100 by controlling the rotor mechanism 210. That is, the UAV control unit 110 controls the rotor mechanism 210 to control the position including the latitude, longitude, and altitude of the unmanned aircraft 100.
  • the UAV control unit 110 may control the imaging range of the imaging unit 220 by controlling the flight of the unmanned aircraft 100.
  • the UAV control unit 110 may control a viewing angle of the imaging unit 220 by controlling a zoom lens included in the imaging unit 220.
  • the UAV control unit 110 may use the digital zoom function of the imaging unit 220 to control the angle of view of the imaging unit 220 through digital zoom.
  • the UAV control unit 110 can move the unmanned aircraft 100 to a specified position on a specified date and time, so that the imaging unit 220 can capture the desired imaging range under the desired environment.
  • the communication interface 150 communicates with the terminal 80.
  • the communication interface 150 can perform wireless communication through any wireless communication method.
  • the communication interface 150 can perform wired communication by using any wired communication method.
  • the communication interface 150 may transmit the captured image and additional information (metadata) related to the captured image to the terminal 80.
  • the memory 160 stores programs and the like required for the UAV control unit 110 to control the gimbal 200, the rotor mechanism 210, the imaging unit 220, the imaging units 230, the GPS receiver 240, the inertial measurement device 250, the magnetic compass 260, the barometric altimeter 270, the ultrasonic sensor 280, and the laser measuring instrument 290.
  • the memory 160 may be a computer-readable recording medium, and may include SRAM (Static Random Access Memory), DRAM (Dynamic Random Access Memory), or EPROM (Erasable Programmable Read-Only Memory).
  • the memory 160 can be removed from the unmanned aircraft 100.
  • the memory 160 can serve as a working memory.
  • the memory 170 may include at least one of an HDD (Hard Disk Drive), an SSD (Solid State Drive), an SD card, a USB memory, and other memories.
  • the memory 170 can store various information and various data.
  • the memory 170 can be detached from the unmanned aircraft 100.
  • the memory 170 can record a captured image.
  • the memory 160 or the memory 170 may store information of an imaging position and an imaging path generated by the terminal 80 or the unmanned aerial vehicle 100.
  • the UAV control unit 110 may set the information of the imaging position and the imaging path as one of the imaging parameters related to the imaging scheduled by the unmanned aircraft 100 or the flight parameters related to the flight scheduled by the unmanned aircraft 100.
  • the setting information may be stored in the memory 160 or the memory 170.
  • the imaging parameters may include information on the imaging angle of the imaging section 220.
  • the gimbal 200 may rotatably support the imaging unit 220 around a yaw axis, a pitch axis, and a roll axis.
  • the gimbal 200 can rotate the imaging unit 220 around at least one of a yaw axis, a pitch axis, and a roll axis, thereby changing the imaging direction of the imaging unit 220.
  • the rotor mechanism 210 includes a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors.
  • the rotor mechanism 210 is controlled to rotate by the UAV control unit 110 to fly the unmanned aircraft 100.
  • the number of rotors 211 may be, for example, four, or may be another number.
  • the unmanned aircraft 100 may be a fixed-wing aircraft without a rotor.
  • the imaging unit 220 captures a subject in a desired imaging range and generates data of a captured image.
  • the image data (for example, an aerial image) obtained by imaging of the imaging unit 220 may be stored in a memory included in the imaging unit 220 or in the memory 170.
  • the imaging unit 230 captures the surroundings of the drone 100 and generates data of a captured image.
  • the image data of the imaging unit 230 may be stored in the memory 170.
  • the GPS receiver 240 receives a plurality of signals, transmitted from a plurality of navigation satellites (i.e., GPS satellites), indicating the time and the position (coordinates) of each GPS satellite.
  • the GPS receiver 240 calculates the position of the GPS receiver 240 (that is, the position of the unmanned aircraft 100) based on the received multiple signals.
  • the GPS receiver 240 outputs the position information of the unmanned aircraft 100 to the UAV control section 110.
  • the UAV control unit 110 may calculate the position instead of the GPS receiver 240. In this case, the information indicating the time and the position of each GPS satellite included in the plurality of signals received by the GPS receiver 240 is input to the UAV control unit 110.
  • the inertial measurement device 250 detects the attitude of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
  • the inertial measurement device 250 can detect the accelerations of the three-axis directions of the front-rear, left-right, and up-down directions of the unmanned aircraft 100 and the angular velocities of the three-axis directions of the pitch axis, the roll axis, and the yaw axis as the attitude of the unmanned aircraft 100.
  • the magnetic compass 260 detects the heading of the drone 100 and outputs the detection result to the UAV control unit 110.
  • the barometric altimeter 270 detects the flying height of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
  • the ultrasonic sensor 280 transmits ultrasonic waves, detects ultrasonic waves reflected from the ground and objects, and outputs the detection results to the UAV control unit 110.
  • the detection result may show the distance from the unmanned aircraft 100 to the ground, that is, the altitude.
  • the detection result can show the distance from the drone 100 to the object (subject).
  • the laser measuring instrument 290 irradiates an object with laser light, receives reflected light reflected from the object, and measures the distance between the unmanned aerial vehicle 100 and the object (subject) by the reflected light.
  • a time-of-flight method may be used.
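  • for reference, a time-of-flight measurement derives the distance from the round-trip delay of the emitted light as d = c × Δt / 2, where c is the speed of light and Δt is the time between emission of the laser pulse and reception of its reflection.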
  • FIG. 5 is a block diagram showing an example of a hardware configuration of the terminal 80.
  • the terminal 80 includes a terminal control section 81, an operation section 83, a communication section 85, a memory 87, a display section 88, and a memory 89.
  • the terminal 80 may be held by a user who wishes to control the flight of the unmanned aircraft 100.
  • the terminal control unit 81 is configured using, for example, a CPU, an MPU, or a DSP.
  • the terminal control unit 81 performs signal processing for overall control of operations of each unit of the terminal 80, input / output processing of data with other units, data calculation processing, and data storage processing.
  • the terminal control unit 81 can acquire data and information (various measurement data, captured images, additional information, etc.) from the unmanned aircraft 100 via the communication unit 85.
  • the terminal control section 81 can acquire data and information (for example, various parameters) input via the operation section 83.
  • the terminal control unit 81 can acquire data and information stored in the memory 87.
  • the terminal control unit 81 may transmit data and information (for example, information on an imaging position, an imaging angle, and a flight path) to the unmanned aircraft 100 via the communication unit 85.
  • the terminal control unit 81 may send data and information to the display unit 88 and display the display information based on the data and information on the display unit 88.
  • the terminal control unit 81 may execute an application program for flight control for the unmanned aircraft 100.
  • the terminal control unit 81 can generate various data used in the application.
  • the operation unit 83 receives and acquires data and information input by a user of the terminal 80.
  • the operation unit 83 may include input devices such as buttons, keys, a touch display screen, and a microphone.
  • the operation section 83 and the display section 88 are constituted by a touch panel.
  • the operation section 83 may receive a touch operation, a click operation, a drag operation, and the like.
  • the operation unit 83 can receive information of various parameters.
  • the information input by the operation unit 83 may be transmitted to the unmanned aircraft 100.
  • Various parameters may include parameters related to flight control.
  • the communication unit 85 performs wireless communication with the unmanned aircraft 100 through various wireless communication methods.
  • the wireless communication method of this wireless communication may include, for example, wireless LAN, Bluetooth (registered trademark), or communication via a public wireless network.
  • the communication unit 85 can perform wired communication using any wired communication method.
  • the memory 87 may include, for example, a ROM that stores a program defining the operation of the terminal 80 and data of set values, and a RAM that temporarily stores various information and data used in the processing of the terminal control unit 81.
  • the memory 87 may include a memory other than a ROM and a RAM.
  • the memory 87 may be provided inside the terminal 80.
  • the memory 87 may be configured to be detachable from the terminal 80.
  • the program may include an application program.
  • the display section 88 is configured by, for example, an LCD (Liquid Crystal Display), and displays various information and data output from the terminal control section 81.
  • the display unit 88 can display various data and information related to the execution of the application.
  • the memory 89 stores and retains various data and information.
  • the memory 89 may be an HDD, an SSD, an SD card, a USB memory, or the like.
  • the memory 89 may be provided inside the terminal 80.
  • the memory 89 may be detachably provided on the terminal 80.
  • the memory 89 may store a captured image and additional information acquired from the unmanned aircraft 100. Additional information may be stored in the memory 87.
  • the processing performed by the terminal 80 may be performed by the transmitter. Since the transmitter has the same constituent parts as the terminal 80, it will not be described in detail.
  • the transmitter includes a control section, an operation section, a communication section, a display section, a memory, and the like. When the flying body system 10 has a transmitter, the terminal 80 may not be provided.
  • FIG. 6 is a diagram showing a flight range AR, a flight path rt, and an imaging position wp.
  • the flight range AR indicates the range in which the unmanned aircraft 100 flies.
  • the flight range AR may be consistent with the imaging range captured by the imaging section 220 of the unmanned aircraft 100.
  • the flight path rt indicates a path when the unmanned aircraft 100 flies.
  • the imaging position wp is a position at which an image is captured by the imaging unit 220 of the unmanned aircraft 100.
  • the flight path rt is generated through the imaging position wp.
  • the terminal control unit 81 acquires the flight range AR, generates a flight path rt, and determines the imaging position wp.
  • the unmanned aerial vehicle 100 performs aerial photography at the camera position wp while flying along the flight path rt within the flight range AR.
  • in FIG. 6, the flight path rt is set as a route that enters from the lower left corner, moves in a square-wave shape, and exits from the upper right corner.
  • the flight path rt in this case is a flight path according to a scanning method for uniformly capturing the flight range AR.
  • the flight path rt may be a route set in a zigzag or spiral shape, or a flight path of other shapes.
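  • the square-wave scan of FIG. 6 can be illustrated with the short sketch below; the flat rectangular range, the fixed line spacing, and the waypoint step are illustrative assumptions, not values given in this disclosure.

```python
def square_wave_path(x_min, x_max, y_min, y_max, line_spacing, step):
    """Waypoints sweeping a rectangle in a square-wave (boustrophedon)
    pattern, entering at the lower-left corner and exiting at the top."""
    waypoints = []
    heading_right = True
    y = y_min
    while y <= y_max:
        xs = [x_min + i * step for i in range(int((x_max - x_min) / step) + 1)]
        if not heading_right:
            xs.reverse()  # reverse direction on every other scan line
        waypoints.extend((x, y) for x in xs)
        heading_right = not heading_right
        y += line_spacing
    return waypoints

# Example: a 100 m x 100 m flight range AR, scan lines every 25 m,
# waypoints (candidate imaging positions wp) every 20 m along each line.
print(square_wave_path(0, 100, 0, 100, 25, 20)[:6])
```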
  • FIG. 7 is a diagram showing a table representing candidates (also referred to as candidate angles) for imaging angles.
  • the imaging angle is an imaging angle used by the imaging unit 220 of the unmanned aerial vehicle 100 to photograph the terrain of the flight range AR.
  • the candidate angles are, among various possible imaging angles, the candidates for the imaging angle to be used in actual shooting.
  • a table representing candidate angles may be registered in the memory 87 or the memory 89 of the terminal 80.
  • a table representing candidate angles can also be saved on an external server.
  • the imaging angle can be specified by the pitch angle and the yaw angle of the gimbal 200 that supports the imaging unit 220; the candidate angles may therefore be specified by the same pitch angle and yaw angle.
  • in FIG. 7, nine combinations (points) of the pitch angle and the yaw angle are shown as candidate angles.
  • the nine points include, for example, a point with a pitch angle of 0° and a yaw angle of 0°, a point with a pitch angle of 0° and a yaw angle of 270°, and a point with a pitch angle of -45° and a yaw angle of 270°.
  • FIG. 7 is only an example, and a combination of a pitch angle and a yaw angle may be defined in more detail.
  • although the pitch and yaw angles of the candidate angles in FIG. 7 are defined at uniform intervals, they may also be defined at non-uniform intervals. For example, many candidate angles can be defined within an angle range that is likely (easy to assume) to be selected as the imaging angle of the imaging unit 220, and fewer candidate angles can be defined within an angle range that is unlikely (hard to assume) to be selected.
  • in FIG. 7, the pitch angle is set to a negative angle, that is, the imaging direction points downward from the horizontal plane. The pitch angle may also be set to a positive angle (an elevation angle). This makes it possible to perform shooting suited to the situation of the subject.
  • the terminal control unit 81 may acquire a candidate angle for photographing the terrain of the flight range AR.
  • the terminal control section 81 may acquire the candidate angle from the memory 87 or the memory 89.
  • the terminal control section 81 can acquire a candidate angle from an external server via the communication section 85.
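  • a candidate-angle table like that of FIG. 7 can be held as (pitch, yaw) pairs and converted into unit imaging vectors for the cost calculation described below; the nine specific combinations in this sketch are an assumption patterned on the table, not the exact values of FIG. 7.

```python
import itertools
import math

# Candidate angles as (pitch, yaw) in degrees; pitch 0 is horizontal and
# -90 points straight down. These nine combinations are illustrative only.
CANDIDATE_ANGLES = list(itertools.product((0.0, -45.0, -90.0),
                                          (0.0, 135.0, 270.0)))

def imaging_vector(pitch_deg: float, yaw_deg: float):
    """Unit vector of the imaging direction for a gimbal pitch/yaw pair,
    in a frame with x pointing east, y north, and z up."""
    p, y = math.radians(pitch_deg), math.radians(yaw_deg)
    return (math.cos(p) * math.sin(y), math.cos(p) * math.cos(y), math.sin(p))
```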
  • FIG. 8 is a diagram showing an example of the sampling position k provided on the uneven ground hm.
  • the sampling position k is a position obtained by sampling (extracting points from) the terrain of the flight range AR as a target to be photographed by the unmanned aircraft 100.
  • the sampling position can be specified by a three-dimensional position (latitude, longitude, altitude), and the terrain information can be defined based on a plurality of sampling positions.
  • although the sampling positions k in FIG. 8 are arranged in one direction within the flight range AR, they may be set in two dimensions; that is, the sampling positions k can be set in a grid at predetermined intervals within the flight range AR. The sampling positions k may also be arranged at unequal intervals instead of equal intervals.
  • the arrow pointing from the sampling position k on the ground hm to the imaging unit 220 of the unmanned aerial vehicle 100 indicates a normal vector (normal direction) to the ground hm.
  • when the imaging unit 220 lies on the normal vector at a sampling position k, the imaging unit 220 faces that sampling position k head-on. However, the imaging range of the imaging unit 220 may include not only the sampling position k photographed from the front but also other sampling positions k that are not photographed from the front. Therefore, it is desirable to photograph from a direction as close as possible to the front of all the sampling positions k included in the imaging range. How appropriately the imaging unit 220 photographs each sampling position k can be quantified as the imaging cost.
  • the vertical axis of the graph represents the height in the three-dimensional space (for example, the height of the imaging unit 220 of the unmanned aerial vehicle 100 and the ground hm), and the horizontal axis represents the position (latitude, longitude) in the three-dimensional space.
  • FIG. 8 shows the sampling positions k and the position of the unmanned aircraft 100 that includes the imaging unit 220.
  • the information of the sampling position can be stored in the memory 87 or the memory 89 or an external server.
  • the terminal control section 81 can acquire the information of the sampling position k from the memory 87 or the memory 89.
  • the terminal control section 81 can acquire the information of the sampling position k via the communication section 85.
  • the terminal control section 81 itself can sample the terrain information and determine the sampling position k.
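  • one minimal way to obtain sampling positions k and their normal vectors from gridded terrain data is sketched below; the elevation-grid representation and the finite-difference normal estimate are our assumptions, not a method stated in this disclosure.

```python
import numpy as np

def sample_terrain(elevation: np.ndarray, spacing: float):
    """Sampling positions k and unit normal vectors for an elevation grid.

    elevation[i, j] is the ground height at grid cell (i, j), and `spacing`
    is the horizontal distance between adjacent cells. The (unnormalized)
    normal of the surface z = f(x, y) is (-df/dx, -df/dy, 1).
    """
    dz_dx, dz_dy = np.gradient(elevation, spacing)
    ii, jj = np.meshgrid(np.arange(elevation.shape[0]),
                         np.arange(elevation.shape[1]), indexing="ij")
    positions = np.stack([ii * spacing, jj * spacing, elevation], axis=-1)
    normals = np.stack([-dz_dx, -dz_dy, np.ones_like(elevation)], axis=-1)
    normals /= np.linalg.norm(normals, axis=-1, keepdims=True)
    return positions.reshape(-1, 3), normals.reshape(-1, 3)
```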
  • FIG. 9 is a diagram explaining an example of calculating an imaging cost corresponding to a candidate angle.
  • the imaging cost quantifies how suitable imaging by the imaging unit 220 of the unmanned aircraft 100 is. For example, the imaging cost is calculated for each candidate angle.
  • the terminal control unit 81 calculates the imaging cost corresponding to each candidate angle by, for example, the specific processing (calculation by the expressions) described below.
  • the terminal control unit 81 determines the imaging angle θ based on the imaging cost. Since the imaging angle θ is calculated for each imaging position i, it is also written as the imaging angle θi.
  • the terminal control unit 81 may calculate the imaging cost Cij of the candidate angle j at the imaging position i according to Equation (1), Cij = Σ_k Cijk, the sum over the sampling positions k described below.
  • the distance from the imaging unit 220 to the ground hm, that is, the distance between the imaging position i and the sampling position k, is denoted by d.
  • the imaging direction corresponding to the candidate angle j of the imaging unit 220 is represented by the imaging vector n.
  • the normal direction of the ground hm at the sampling position k is represented by the normal vector l.
  • the terminal control unit 81 can calculate the imaging cost Cijk for the sampling position k of the imaging target according to Equation (2).
  • n·(-l) denotes the inner product (inner product value) of the imaging vector n and the reversed normal vector -l.
  • max(n·(-l), 0) denotes the larger of the inner product n·(-l) and the value 0. This means that the terminal control unit 81 excludes the imaging cost Cijk at any sampling position where the inner product n·(-l) is negative from the calculation of the imaging cost Cij at the imaging position i.
  • the terminal control unit 81 obtains the imaging cost Cij by adding up the imaging costs Cijk at the respective sampling positions k.
  • the candidate angle j at which the imaging cost Cij at the imaging position i is maximal is the optimal imaging angle θim.
  • the terminal control unit 81 can calculate the optimal imaging angle θim according to Equation (3), θim = argmax_j(Cij), where argmax_j(Cij) is the candidate angle j that maximizes the imaging cost Cij.
  • the optimal imaging angle θim is one example of the imaging angle θi. That is, the imaging angle θi is not limited to the angle at which the imaging cost Cij is maximal; it may be any angle that satisfies a predetermined criterion.
  • for example, the imaging angle θi determined (selected) from the candidate angles may be the angle at which the imaging cost Cij is the second or third largest among the angles whose cost is greater than or equal to the threshold th1.
  • the imaging angle θi may also be an imaging angle whose imaging cost Cij is greater than or equal to the average of the imaging costs Cij.
  • when no candidate angle j satisfies the criterion, the imaging angle θi may not be determined and imaging at that imaging position may be omitted.
  • by not using an imaging position i at which the image quality of the captured image is less than or equal to a predetermined reference regardless of the imaging angle θi, the terminal 80 can omit unnecessary photography and improve imaging efficiency.
  • in this case, the terminal 80 can also cause the unmanned aircraft 100 to fly along a flight path rt that does not pass through that imaging position i.
  • the terminal control unit 81 can sample the terrain of the flight range AR and acquire a plurality of sampling positions k to be photographed by the unmanned aircraft 100. For each sampling position k, the terminal control unit 81 may calculate the imaging cost Cijk (an example of a second imaging cost) incurred when the sampling position k is photographed from the imaging position i at the candidate angle j. The terminal control unit 81 may add up the imaging costs Cijk at the respective sampling positions k to calculate the imaging cost Cij at the imaging position i (an example of a first imaging cost).
  • thus, the terminal 80 can determine the imaging angle θi at each imaging position i by considering the imaging cost Cijk at every sampling position k. For example, even if the imaging cost Cijk at one sampling position k is small, when the imaging cost Cijk at another sampling position k is large, the overall imaging cost Cij over the sampling positions k becomes large, and the terminal 80 can adopt the candidate angle j in that case as the imaging angle θi. Therefore, the terminal 80 can determine the imaging angle θi in consideration of how well all of the sampling positions k are photographed.
  • according to Equation (2), the shorter the distance d, the larger the imaging cost Cijk at the sampling position k can be.
  • thus, the terminal 80 can increase the influence of the imaging cost Cijk at sampling positions k near the unmanned aircraft 100, that is, sampling positions likely to be included in the imaging range (the range included in the captured image).
  • therefore, the terminal 80 can improve the accuracy of orthophoto generation and three-dimensional restoration.
  • according to Equation (2), the larger the inner product n·(-l), the larger the imaging cost Cijk at the sampling position k can be.
  • thus, the terminal 80 can increase the influence of sampling positions whose normal vector l forms a small angle with the imaging vector n, that is, positions photographed nearly head-on.
  • therefore, the terminal 80 can improve the accuracy of orthophoto generation and three-dimensional restoration.
  • the terminal control unit 81 may exclude the imaging cost Cijk at any sampling position k where the inner product n·(-l) is negative from the calculation of the imaging cost Cij at the imaging position i.
  • by treating the imaging cost Cijk at sampling positions with a negative inner product n·(-l) in this way, the terminal 80 can prevent extreme inner product values from distorting the imaging cost at the imaging position.
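  • putting Equations (1) to (3) together, the cost computation can be sketched as follows. Since the exact form of Equation (2) is not reproduced in this text, the 1/d distance weighting below is only an assumption consistent with the two properties stated above (a shorter distance d and a larger inner product n·(-l) both increase Cijk); the max(·, 0) term drops sampling positions whose inner product is negative.

```python
import numpy as np

def imaging_cost(camera_pos, imaging_vec, positions, normals):
    """Cij of one candidate angle j at one imaging position i: the sum of
    per-sample costs Cijk over all sampling positions k (Equation (1))."""
    d = np.linalg.norm(positions - camera_pos, axis=1)  # distances d
    inner = (imaging_vec * -normals).sum(axis=1)        # inner products n.(-l)
    c_ijk = np.maximum(inner, 0.0) / d                  # assumed form of Eq. (2)
    return c_ijk.sum()

def optimal_imaging_angle(camera_pos, candidate_vectors, positions, normals):
    """Equation (3): index of the candidate angle that maximizes Cij,
    returned together with that maximal cost."""
    costs = [imaging_cost(camera_pos, np.asarray(v), positions, normals)
             for v in candidate_vectors]
    j = int(np.argmax(costs))
    return j, costs[j]
```

  • combined with the earlier sketches, j, cij = optimal_imaging_angle(np.array([x, y, z]), [imaging_vector(p, yw) for p, yw in CANDIDATE_ANGLES], positions, normals) returns the best candidate angle index and its cost Cij, which can then be compared against the threshold th1.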
  • FIG. 10 is a sequence diagram showing an example of the flight path generation process in the flying body system 10. FIG. 10 illustrates the case where the flight path generation process is mainly performed by the terminal 80.
  • the terminal control unit 81 acquires information of the flight range AR (T1).
  • the terminal control section 81 may receive a user input via the operation section 83 and acquire the flight range AR.
  • the terminal control section 81 can acquire map information from an external server via the communication section 85.
  • when the flight range AR is set to a rectangular range, the user can input the positions (latitude, longitude) of the four corners of the rectangle on the map information to obtain the information of the flight range AR.
  • when the flight range AR is set to a circular range, the user can input the radius of a circle centered on the flight position to obtain the information of the flight range AR.
  • the user can also obtain the information of the flight range AR based on the map information by inputting information such as a region or a specific place name (for example, Tokyo).
  • the terminal control unit 81 can acquire the flight range AR stored in the memory 87 and the memory 89 from the memory 87 and the memory 89.
  • the terminal control section 81 can acquire the flight range AR from an external server via the communication section 85.
  • the terminal control section 81 acquires various parameters (T2).
  • the parameters may be parameters related to imaging by the imaging unit 220 and to the flight of the unmanned aircraft 100.
  • the parameters may include, for example, the imaging position, the imaging date and time, the distance to the subject, the imaging angle, the imaging conditions, and camera parameters (shutter speed, exposure value, imaging mode, etc.).
  • the terminal control section 81 can acquire parameters input by the user via the operation section 83.
  • the terminal control unit 81 can acquire various parameters stored in the memory 87 and the memory 89 from the memory 87 and the memory 89.
  • the terminal control unit 81 can acquire various parameters from the unmanned aircraft 100 and an external server via the communication unit 85.
  • the terminal control unit 81 acquires terrain information based on the information of the flight range AR (T3).
  • the terminal control unit 81 may acquire the terrain information of the flight range AR in cooperation with a map server on a network connected via the communication unit 85.
  • the terrain information may include position information (latitude, longitude, altitude) at each position of the flight range AR. By aggregating the position information at each position, the three-dimensional shape of the flight range AR can be represented.
  • the terrain information may include information on ground shapes and objects such as buildings, mountains, forests, and steel towers.
  • the terminal control unit 81 calculates the flying height based on the terrain information of the flight range AR and on information, such as the distance to the subject, included in the acquired parameters (T4). For example, the terminal control unit 81 may calculate the flying height of the unmanned aircraft 100 so that it follows the undulation of the ground hm indicated by the terrain information, thereby maintaining the distance to the subject.
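  • step T4 can be reduced to a simple terrain-following rule: at each position, fly at the ground elevation plus the desired distance to the subject. The function below is a minimal sketch of that idea under our own naming, not the patent's computation.

```python
def flying_height(ground_elevation_m: float, subject_distance_m: float) -> float:
    """Altitude that keeps the desired distance to the ground directly
    below, so the standoff is preserved over the undulating ground hm."""
    return ground_elevation_m + subject_distance_m

# Example: over a 350 m hillside with an 80 m desired subject distance,
# the unmanned aircraft flies at 430 m.
print(flying_height(350.0, 80.0))  # 430.0
```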
  • the terminal control unit 81 generates a flight path rt (T5).
  • the terminal control section 81 may generate the flight path rt based on the flight range AR, the terrain information, and the flight altitude.
  • the generated flight path rt maintains the derived flying height at each position within the flight range AR and passes through the imaging positions wp in the three-dimensional space for photographing the terrain of the flight range AR.
  • the terminal control unit 81 may determine, according to a known method, the positions on the two-dimensional plane (latitude, longitude) within the flight range AR through which the flight path passes and the sampling positions k (imaging positions wp).
  • the terminal control unit 81 derives (for example, calculates) the imaging angle θi for each imaging position i along the flight path rt based on the terrain information and the flying height (T6).
  • the terminal control unit 81 calculates the imaging cost Cij at the imaging position i for each candidate angle j.
  • the terminal control section 81 determines a candidate angle (for example, the optimal imaging angle ⁇ im) at which the imaging cost Cij at the imaging position i is greater than or equal to the threshold th1 (for example, the largest) as the imaging angle ⁇ i at the imaging position i.
  • the optimal imaging angle ⁇ im can be calculated based on the terrain information and the information of the flight path rt.
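The description's formulas (1)-(3) define this cost: Cij sums, over the sampling positions k, the terms d^(-0.5)·max(n·(-l), 0), where d is the distance from imaging position i to sampling position k, n is the unit vector of the candidate imaging direction, and l is the ground normal at k; the candidate with the largest cost is the optimal imaging angle θim. A minimal sketch, assuming candidate angles are gimbal (pitch, yaw) pairs in degrees and the normals are unit vectors:

```python
import numpy as np

def imaging_vector(pitch_deg, yaw_deg):
    """Unit vector of the imaging direction for a candidate angle j given
    as gimbal (pitch, yaw); negative pitch points below the horizon."""
    p, y = np.radians([pitch_deg, yaw_deg])
    return np.array([np.cos(p) * np.cos(y), np.cos(p) * np.sin(y), np.sin(p)])

def imaging_cost(p_i, angle_j, samples, normals):
    """Cij of formulas (1)-(2): sum over sampling positions k of
    d**-0.5 * max(n.(-l), 0); negative inner products are excluded."""
    n = imaging_vector(*angle_j)
    cost = 0.0
    for p_k, l in zip(samples, normals):
        d = np.linalg.norm(np.asarray(p_i) - np.asarray(p_k))  # assumed > 0
        cost += d ** -0.5 * max(float(np.dot(n, -np.asarray(l))), 0.0)
    return cost
```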
  • the terminal control unit 81 transmits notification parameters including the imaging position wp, the flight path rt, and the imaging angle θi to the unmanned aircraft 100 via the communication unit 85 (T7).
  • the notification parameters may include imaging parameters related to the camera (imaging section 220) at the time of shooting, and flight parameters related to the flight at the time of shooting.
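As an illustration only, the notification parameters of T7 could be bundled as below; the field names are hypothetical, not the patent's actual data layout.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class NotificationParams:
    imaging_positions: List[Tuple[float, float, float]]  # wp: lat, lon, alt
    flight_path: List[Tuple[float, float, float]]        # rt waypoints
    imaging_angles: List[Tuple[float, float]]            # θi as (pitch, yaw)
    camera: dict = field(default_factory=dict)   # shutter speed, EV, mode...
    flight: dict = field(default_factory=dict)   # speed and other settings
```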
  • the UAV control unit 110 receives a notification parameter from the terminal 80 via the communication interface 150 (T8).
  • the UAV control unit 110 sets various parameters used by the unmanned aircraft 100 by storing the received notification parameters in the memory 160 (T9).
  • the UAV control unit 110 drives the imaging unit 220 while flying along the flight path rt based on the set parameters, and performs aerial photography at the imaging angle θi (T10).
  • the terminal 80 can generate a flight path rt for the unmanned aircraft 100 to fly.
  • the terminal control unit 81 can acquire the terrain information of the flight range AR in which the unmanned aircraft 100 flies.
  • the terminal control unit 81 may generate a flight path rt including an imaging position wp in a three-dimensional space for capturing the terrain of the flight range AR based on the terrain information of the flight range AR.
  • the terminal control unit 81 may derive (for example, calculate) an imaging angle θi for each imaging position of the flight path rt based on the terrain information of the flight range AR and the flight path rt.
  • since the terminal 80 determines the imaging angle θi in consideration of the undulation of the terrain, the number of places on the ground hm that are difficult to capture because of that undulation is reduced. Therefore, at each imaging position i, the unmanned aerial vehicle 100 can capture each point on the ground hm from as close to the front as possible. The terminal 80 can therefore improve the generation accuracy of the orthophoto and the three-dimensional reconstruction accuracy (three-dimensional shape estimation accuracy) by using captured images taken at the determined imaging angle θi. In addition, the terminal 80 does not need to have images captured at various angles at each imaging position i in order to achieve these accuracies, which improves the imaging efficiency of the unmanned aircraft 100.
  • the terminal 80 can suppress the decrease in the imaging efficiency of the unmanned aerial vehicle 100 and enable the unmanned aerial vehicle 100 to obtain as much information as possible at each point of the undulating terrain.
  • the process of generating the flight path performed mainly by the terminal 80 may be performed during the flight of the unmanned aircraft 100 or before the start of the flight.
  • the terminal control unit 81 may acquire candidate angles j, which are candidates for the imaging angle θi for capturing the terrain of the flight range AR.
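A candidate-angle table in the spirit of FIG. 7 can be enumerated as a grid of gimbal pitch/yaw combinations. The exact grid below is an assumption; the description only fixes examples such as (0°, 0°), (0°, 270°), and (-45°, 270°) among nine combinations.

```python
import itertools

PITCHES = [0.0, -45.0, -90.0]   # negative pitch tilts the camera downward
YAWS = [0.0, 90.0, 270.0]       # illustrative spacing, not prescribed
CANDIDATE_ANGLES = list(itertools.product(PITCHES, YAWS))  # nine candidates
```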
  • the terminal control unit 81 may calculate the imaging cost Cij at the imaging position i for each candidate angle j (the imaging cost when imaging is performed at the imaging position with the candidate angle j; an example of a first imaging cost).
  • the terminal control unit 81 may determine the imaging angle θi at the imaging position based on the imaging cost Cij at the imaging position i. In this case, the terminal control section 81 may determine a candidate angle j at which the imaging cost Cij at the imaging position i is greater than or equal to the threshold th1 as the imaging angle θi at the imaging position.
  • the terminal 80 can quantify suitability for imaging as an imaging cost, and can thus easily judge how appropriate shooting at the candidate angle j from the imaging position i is.
  • among the imaging costs Cij at the imaging position i that are greater than or equal to the threshold th1, the candidate angle j with the n-th largest (for example, the largest) imaging cost Cij may be determined as the imaging angle θi, as in the selection sketch below.
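A sketch of this selection rule over the costs from the earlier sketch; None marks an imaging position at which no candidate angle reaches th1, so shooting there can be omitted.

```python
def select_angle(costs, candidates, th1, n=1):
    """Return the candidate angle with the n-th largest imaging cost among
    those whose cost Cij is at least th1 (n=1 gives the optimal angle θim);
    return None when no candidate qualifies, i.e. skip this position."""
    eligible = sorted((c, j) for c, j in zip(costs, candidates) if c >= th1)
    return eligible[-n][1] if len(eligible) >= n else None
```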
  • the terminal control section 81 may transmit the information of the imaging position i, the flight path rt, and the imaging angle θi to the unmanned aircraft 100 via the communication section 85.
  • the flying body system in the second embodiment has substantially the same structure as that in the first embodiment.
  • the same components as those in the first embodiment are denoted by the same reference numerals, and descriptions thereof will be omitted or simplified.
  • FIG. 11 is a sequence diagram showing a flight path generation process in the flying body system 10 in the second embodiment.
  • FIG. 11 illustrates a case where the process of generating the flight path is performed mainly by the unmanned aircraft 100.
  • processes T21 to T23 are the same as processes T1 to T3 of the first embodiment.
  • the terminal control section 81 transmits the flight range AR, parameters, and terrain information acquired in the processes T21 to T23 to the unmanned aircraft 100 via the communication section 85 (T24).
  • the UAV control unit 110 receives the flight range AR, parameters, and terrain information via the communication interface 150 (T25).
  • the UAV control unit 110 stores the received flight range AR, parameters, and terrain information in the memory 160.
  • the UAV control unit 110 calculates the flying height (T26).
  • the calculation method of flying height can be the same as T4.
  • the UAV control unit 110 generates a flight path rt (T27).
  • the method of generating the flight path rt may be the same as that of T5.
  • the UAV control unit 110 derives an imaging angle θi for each imaging position along the flight path rt (T28).
  • the method of deriving the imaging angle θi may be the same as that of T6.
  • the UAV control unit 110 holds and sets parameters including the imaging position, the flight path rt, and the imaging angle θi in the memory 160 (T29).
  • the UAV control unit 110 drives the imaging unit 220 at the imaging position while flying along the flight path rt based on the set parameters, and performs aerial photography at the imaging angle θi (T30).
  • the processing of T23 may be performed in the unmanned aircraft 100.
  • the terminal control section 81 may send the flight range AR and parameters acquired in the processes T21 and T22 to the unmanned aircraft 100 via the communication section 85.
  • the UAV control unit 110 may receive the flight range AR and parameters, and calculate terrain information and flight altitude.
  • the unmanned aircraft 100 may generate a flight path rt for the unmanned aircraft 100 to fly.
  • the UAV control unit 110 may acquire the terrain information of the flight range AR in which the unmanned aircraft 100 flies.
  • the UAV control unit 110 may generate a flight path rt including imaging positions wp in three-dimensional space for capturing the terrain of the flight range AR based on the terrain information of the flight range AR.
  • the UAV control unit 110 may derive (for example, calculate) an imaging angle θi for each imaging position of the flight path rt based on the terrain information of the flight range AR and the flight path rt.
  • since the unmanned aircraft 100 determines the imaging angle θi in consideration of the undulation of the terrain, the number of places on the ground hm that are difficult to photograph because of that undulation is reduced. Therefore, the unmanned aerial vehicle 100 can capture each point on the ground hm from as close to the front as possible at each imaging position. It can therefore improve the generation accuracy of the orthophoto and the three-dimensional reconstruction accuracy (three-dimensional shape estimation accuracy) by using captured images taken at the determined imaging angle θi. In addition, the unmanned aerial vehicle 100 does not need to capture images at various angles at each imaging position in order to achieve these accuracies, which improves imaging efficiency.
  • the unmanned aerial vehicle 100 can suppress a decrease in imaging efficiency and obtain as much information as possible at each point of the undulating terrain.
  • the process of generating a flight path performed mainly by the unmanned aircraft 100 may be performed during or before the flight of the unmanned aircraft 100.
  • the UAV control unit 110 may control the flight in accordance with the flight path rt and, via the imaging unit 220, perform aerial photography of the terrain surface (an example of capturing an image) at the imaging angle θi at each imaging position i of the flight path rt.
  • by performing much of the processing for generating the flight path rt on the aircraft side, the unmanned aircraft 100 can reduce the processing load of the terminal 80 while suppressing a reduction in the imaging efficiency of the imaging unit 220 and obtaining more frontal terrain information.
  • the unmanned aircraft 100 can thus carry out the whole sequence on its own, from generating a flight path rt that acquires more frontal terrain information while suppressing the reduction in imaging efficiency, to photographing along the generated flight path rt.
  • the area of interest RI may be an area in which the user is interested or an area including a position where an object in which the user is interested exists.
  • the region of interest RI may be set by performing an input operation on an area or an object of interest of the user via the operation section 83.
  • the flying body system 10 in the third embodiment has substantially the same configuration as the first and second embodiments.
  • the same reference numerals are used for the same constituent elements as those of the first and second embodiments, and descriptions thereof are omitted or simplified.
  • FIG. 12 is a diagram showing an example of specifying a region of interest RI. Here, a case where the region of interest RI is an area including a building is shown.
  • the terminal control unit 81 displays the acquired flight range AR on the display unit 88.
  • the display section 88 and the operation section 83 may be configured by a control panel.
  • when the user touches, on the display section 88, the buildings 501 and 502 displayed within the flight range AR, the terminal control section 81 receives the positions of the buildings 501 and 502 via the operation section 83.
  • the terminal control section 81 sets an area including the positions of the two buildings 501 and 502 as a region of interest RI.
  • the terminal control section 81 sets a plurality of initial imaging points gp in the region of interest RI.
  • the initial imaging point gp refers to an imaging position wp set as an initial setting.
  • the terminal control section 81 may obtain information of the initial imaging point gp from the memory 87 or the memory 89, or may obtain the information through a user operation via the operation section 83, or may obtain the information from an external processor via the communication section 85, for example.
  • the initial imaging points gp are arranged in a lattice shape in a two-dimensional plane.
  • adjacent initial imaging points gp are arranged at equal intervals.
  • the initial imaging points gp may be arranged in a shape other than a grid, or adjacent initial imaging points gp may not be arranged at equal intervals.
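A minimal sketch of laying out the initial imaging points gp on an equally spaced lattice; the `bounds` representation of the region of interest RI is hypothetical.

```python
import numpy as np

def initial_imaging_points(bounds, spacing):
    """Lay out initial imaging points gp on an equally spaced lattice over
    the region of interest RI; bounds = (lat_min, lat_max, lon_min, lon_max)
    is an assumed representation of the RI extent."""
    lat_min, lat_max, lon_min, lon_max = bounds
    lats = np.arange(lat_min, lat_max + spacing, spacing)
    lons = np.arange(lon_min, lon_max + spacing, spacing)
    return [(la, lo) for la in lats for lo in lons]
```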
  • FIG. 13 is a diagram illustrating a process of regenerating a flight path rt for the unmanned aircraft 100 to photograph the region of interest RI.
  • the terminal control unit 81 derives the imaging cost Cij at each candidate angle j for the plurality of initial imaging points gp.
  • the terminal control unit 81 removes from the imaging positions wp any initial imaging point gpl whose imaging cost Cij at the candidate angle j is less than or equal to the threshold th3 (a point with low imaging cost). That is, the terminal control section 81 does not capture an image at an imaging position (imaging point) whose imaging cost Cij is less than or equal to the threshold th3.
  • the terminal control section 81 retains, as imaging positions wp, the imaging points gph whose imaging cost Cij at the candidate angle j exceeds the threshold th3 (points with high imaging cost).
  • the terminal control unit 81 may cluster (classify) the imaging points gph having a high imaging cost Cij at the imaging angle θi, one cluster per region of interest RI (here, per building as an object of interest).
  • for the clustering, known methods such as k-means and DBSCAN (density-based spatial clustering of applications with noise) can be used.
  • in FIG. 13, the terminal control unit 81 obtains, by clustering, the imaging position groups sg1 and sg2, each including four imaging points gph whose imaging cost Cij is greater than or equal to the threshold th3.
  • the terminal control unit 81 links the imaging points gph included in the imaging position groups sg1 and sg2 as imaging positions wp, and regenerates a flight path rtn.
  • in this way, the terminal control unit 81 classifies the imaging points gph (imaging positions) having a high imaging cost Cij at the imaging angle θi by the region of interest RI captured from each point, and generates a plurality of imaging position groups sg1, sg2.
  • the terminal control unit 81 may connect the plurality of imaging position groups sg1 and sg2 to regenerate a flight path rtn (an example of generating a flight path), as in the sketch below.
  • the unmanned aerial vehicle 100 can thereby photograph the imaging points gph (imaging positions) included in each of the imaging position groups sg1 and sg2 in a concentrated manner. Since this shortens the flight distance needed to capture each region of interest RI, the unmanned aircraft 100 can improve imaging efficiency.
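A sketch of the T48-T50 chain under stated assumptions (scikit-learn for the clustering, one cluster per region of interest): drop the low-cost points gpl, cluster the remaining points gph, and stack the groups into a regenerated path rtn.

```python
import numpy as np
from sklearn.cluster import KMeans   # DBSCAN is equally applicable

def regenerate_path(points, costs, th3, n_regions):
    """Drop points gpl whose cost is <= th3, cluster the remaining
    high-cost points gph (one cluster per region of interest), and chain
    the groups sg1, sg2, ... into a regenerated flight path rtn."""
    gph = np.array([p for p, c in zip(points, costs) if c > th3])
    labels = KMeans(n_clusters=n_regions, n_init=10).fit_predict(gph)
    groups = [gph[labels == g] for g in range(n_regions)]
    return np.vstack(groups)   # visiting order within groups is free
```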
  • FIG. 14 is a sequence diagram showing an example of a flight path generation process in the flying body system 10 according to the third embodiment.
  • FIG. 14 illustrates a case where the terminal 80 mainly performs the process of generating a flight path.
  • the processes T41 to T43 are the same as the processes T1 to T3 of the first embodiment.
  • the terminal control section 81 acquires information of the region of interest RI.
  • the terminal control section 81 may receive a user input via the operation section 83 and acquire the region of interest RI (T44).
  • the user may directly specify the region of interest RI by inputting a place name via the operation section 83, or may specify it by enclosing a partial area of the map information.
  • the terminal control section 81 can acquire the map information via the communication section 85.
  • the terminal control section 81 may receive a user input via the operation section 83 and acquire type information of the imaging target.
  • the terminal control unit 81 may also use an image processing technology to detect a region of interest RI (for example, an area including a building) corresponding to the type of the imaging object included in the flight range AR based on the type of the imaging object.
  • the processes T45 and T46 are the same as the processes T4 and T5 in the first embodiment.
  • the terminal control unit 81 derives (for example, calculates) an imaging angle θi for photographing the region of interest RI for each initial imaging point gp (imaging position) along the flight path rt, based on the terrain information of the region of interest RI and the flight altitude (T47).
  • the terminal control unit 81 may calculate the imaging cost Cij at the candidate angle j according to equations (4), (5), and (6).
  • the imaging cost Cij at the candidate angle j is calculated for each candidate angle j.
  • θim=argmax(Cij)……(6)
  • (Pk in ROI) in formula (5) means that the sampling positions k over which the imaging cost Cijk is calculated are limited to positions included in the region of interest RI. That is, according to formula (5), the terminal control unit 81 can calculate the imaging cost Cijk of capturing a sampling position k inside the region of interest RI from the initial imaging point gp (imaging position) at the candidate angle j.
  • the formulas (4), (5), and (6) may be the same as the formulas (1), (2), and (3).
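A sketch of formula (5), which differs from formula (2) only by the (Pk in ROI) indicator; `in_roi` is a hypothetical boolean mask over the sampling positions, and `n` is the imaging vector of the candidate angle from the earlier sketch.

```python
import numpy as np

def roi_imaging_cost(p_i, n, samples, normals, in_roi):
    """Formula (5) in outline: identical to formula (2) except that the
    (Pk in ROI) indicator keeps only sampling positions inside the RI."""
    cost = 0.0
    for p_k, l, inside in zip(samples, normals, in_roi):
        if inside:
            d = np.linalg.norm(np.asarray(p_i) - np.asarray(p_k))
            cost += d ** -0.5 * max(float(np.dot(n, -np.asarray(l))), 0.0)
    return cost
```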
  • the terminal control unit 81 calculates an imaging angle θi for imaging the region of interest based on the imaging cost Cij calculated according to formulas (4), (5), and (6).
  • this imaging angle θi may be the imaging angle θim for capturing the region of interest.
  • when the imaging cost Cij at the imaging angle θi at an initial imaging point gp is less than or equal to the threshold th3, the terminal control section 81 deletes the corresponding imaging point gpl (having a low imaging cost Cij) (T48).
  • when the imaging cost Cij exceeds the threshold th3, the terminal control unit 81 does not delete the corresponding imaging point gph (having a high imaging cost Cij), but retains it as an imaging position wp.
  • the terminal control unit 81 clusters (groups) the imaging points gph having a high imaging cost Cij, and obtains the imaging position groups sg1 and sg2 (T49).
  • the terminal control unit 81 excludes the imaging positions wp at which the imaging cost Cij of the imaging angle is less than or equal to the threshold th3, and connects the imaging points gph included in the imaging position groups sg1 and sg2 as imaging positions to regenerate the flight path rtn (T50).
  • generating the imaging position groups is not essential, and it is not required to pass through all the imaging positions wp included in the imaging position group sg2 only after passing through all the imaging positions wp included in the imaging position group sg1. The aircraft may fly, for example, in the order of an imaging position included in group sg1, an imaging position included in group sg2, and then an imaging position included in group sg1.
  • in other words, the flight paths rtn may be regenerated so that the imaging positions wp used to photograph different regions of interest RI are visited in an arbitrary order, as in the ordering sketch below.
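Where the visiting order is left free, one reasonable way to sequence the retained imaging positions is a greedy nearest-neighbour chain, which keeps the flight distance short; this heuristic is an assumption, not a rule stated in the patent.

```python
import numpy as np

def order_waypoints(points):
    """Greedy nearest-neighbour ordering of the retained imaging positions
    wp; assumes a non-empty list of coordinate tuples."""
    pts = [np.asarray(p, dtype=float) for p in points]
    path = [pts.pop(0)]
    while pts:
        k = int(np.argmin([np.linalg.norm(path[-1] - q) for q in pts]))
        path.append(pts.pop(k))
    return path
```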
  • the terminal control unit 81 sends notification parameters including the imaging positions wp (corresponding to the imaging points gph with a high imaging cost Cij), the regenerated flight path rtn, and the imaging angle θi to the unmanned aircraft 100 via the communication unit 85 (T51).
  • the UAV control unit 110 receives the notification parameters via the communication interface 150 (T52).
  • the UAV control unit 110 sets the various parameters used by the unmanned aircraft 100 by storing the received notification parameters in the memory 160 (T53).
  • the UAV control unit 110 drives the imaging unit 220 while flying along the flight path rtn based on the set parameters, and performs aerial photography of the region of interest RI at the imaging angle θi (T54).
  • the terminal control unit 81 can acquire the region of interest RI included in the flight range AR, for example, including the positions of the two buildings 501 and 502.
  • the terminal control unit 81 may derive an imaging angle θi for each imaging position based on the terrain information of the region of interest RI and the flight path rt.
  • when the unmanned aircraft 100 photographs the region of interest RI, the terminal 80 can thus suppress a decrease in imaging efficiency while obtaining as much information as possible about each point of the undulating terrain.
  • the terminal 80 can derive an imaging angle θi that improves the generation accuracy of the orthophoto of the region of interest RI and the three-dimensional reconstruction accuracy.
  • the terminal control unit 81 may acquire a candidate angle j, which is a candidate for the imaging angle ⁇ i for capturing the terrain of the flight range AR.
  • the terminal control unit 81 may calculate, for each candidate angle j, an imaging cost Cij (an example of a third imaging cost) when imaging the region of interest RI at the imaging position i with the candidate angle j.
  • the terminal control unit 81 may determine the imaging angle ⁇ i at the imaging position i based on the imaging cost Cij at the imaging position i.
  • the terminal control section 81 may determine the candidate angle j at which the imaging cost Cij at the imaging position i is greater than or equal to a threshold value th2 (an example of a second threshold value) as the imaging angle ⁇ i at the imaging position i.
  • the terminal 80 can quantify the suitability of photographing the region of interest RI as an imaging cost, and can easily judge to what extent a candidate angle is suitable for the shooting.
  • among the imaging costs Cij at the imaging position i that are greater than or equal to the threshold th2, the candidate angle with the n-th largest (for example, the largest) imaging cost Cij may be determined as the optimal imaging angle θim.
  • the terminal control unit 81 may exclude from the imaging positions wp an imaging point gpl (an example of a first imaging position) at which the imaging cost Cij of imaging at the imaging angle θi is less than or equal to the threshold th3, and regenerate the flight path rtn. That is, the terminal control unit 81 may exclude the imaging point gpl from the generated flight path rtn and generate the flight path rtn again.
  • since imaging positions that contribute little to the imaging cost Cij at the imaging angle θi are not included when the flight path rtn is generated (regenerated), the terminal 80 can suppress a reduction in the generation accuracy of the orthophoto of the region of interest RI and in the three-dimensional reconstruction accuracy, and can improve the imaging efficiency when the unmanned aircraft 100 photographs the region of interest RI.
  • the flying body system of the fourth embodiment has substantially the same structure as that of the first to third embodiments.
  • the same reference numerals are used for the same constituent elements as in the first to third embodiments, and descriptions thereof are omitted or simplified.
  • FIG. 15 is a sequence diagram showing an example of a flight path generation process of the flying body system 10 in the fourth embodiment.
  • FIG. 15 illustrates a case where the process of generating the flight path is performed mainly by the unmanned aircraft 100.
  • processes T61 to T64 are the same as processes T41 to T44 of the third embodiment.
  • the terminal control section 81 sends the notification parameters including the flight range AR, the parameters, the terrain information, and the region of interest RI acquired in the processes T61 to T64 to the unmanned aircraft 100 via the communication section 85 (T65).
  • the UAV control unit 110 receives a notification parameter via the communication interface 150 (T66).
  • the UAV control unit 110 calculates the flying height (T67).
  • the calculation method of the flying height can be the same as the procedures T4 and T45.
  • the UAV control unit 110 generates a flight path (T68).
  • the method of generating the flight path rt may be the same as the processes T5 and T46.
  • the UAV control unit 110 derives an imaging angle θi for imaging the region of interest RI (T69).
  • the method of deriving the imaging angle θi may be the same as that of process T47.
  • the UAV control unit 110 deletes unnecessary imaging points gpl (T70).
  • the method of deleting the unnecessary imaging points gpl may be the same as that of process T48.
  • the UAV control unit 110 performs clustering and calculates the imaging position groups sg1 and sg2 (T71).
  • the clustering method and the calculation method of the imaging position groups sg1 and sg2 may be the same as the process T49.
  • the UAV control unit 110 regenerates the flight path rtn (T72).
  • the method of regenerating the flight path rtn may be the same as the process T50.
  • the UAV control unit 110 stores parameters including the imaging positions wp (corresponding to the imaging points gph with a high imaging cost Cij), the regenerated flight path rtn, and the imaging angle θi in the memory 160, thereby setting the various parameters used by the unmanned aircraft 100 (T73).
  • the UAV control unit 110 drives the imaging unit 220 while flying along the flight path rtn based on the set parameters, and performs aerial photography of the region of interest RI at the imaging angle θi (T74).
  • the unmanned aircraft 100 can thus perform, on its own side, much of the processing for generating the flight path rtn in consideration of the region of interest RI, reducing the processing load on the terminal 80.
  • UAV: unmanned aerial vehicle


Abstract

It is desirable for a flying body to acquire as much frontal terrain information as possible while suppressing a reduction in its imaging efficiency. An information processing device that generates a flight path for a flying body includes a processing section. The processing section acquires terrain information of a flight range in which the flying body flies; based on the terrain information of the flight range, it generates a flight path including imaging positions in three-dimensional space for imaging the terrain of the flight range; and based on the terrain information of the flight range and the flight path, it derives an imaging angle for each imaging position on the flight path.

Description

信息处理装置、飞行路径生成方法、程序以及记录介质 【技术领域】
本公开涉及一种生成用于飞行体飞行的飞行路径的信息处理装置、飞行路径生成方法、程序以及记录介质。
【背景技术】
已知一种一边通过预设的固定路径一边进行拍摄的平台(无人机)(参见专利文献1)。此平台从地面基地接收摄像指示,对摄像对象进行拍摄。此平台在对摄像对象进行拍摄时,一边沿固定路径飞行,一边根据平台与摄像对象的位置关系,倾斜平台的摄像装置来进行拍摄。
【现有技术文献】
专利文献
专利文献1:日本特开2010-61216号公报
【发明内容】
【发明所要解决的技术问题】
在专利文献1中,确定用于从平台对摄像对象进行拍摄的摄像角度,以使摄像对象进入摄像范围。然而,并不是为了拍摄地形正面确定摄像角度,因为可能存在地形正面拍摄困难的地方,所以可能造成地形中各点信息量的减少。另一方面,如果为了使地形中的各个点的信息量足够而以各种摄像角度对地形进行拍摄,则可能包括不必要的摄像角度下的图像拍摄,摄像效率可能降低。因此,期望能够在抑制飞行体的摄像效率降低的同时由飞行体获取较多的地形正面信息。
【用于解决问题的技术手段】
在一个方面中,一种生成用于飞行体飞行的飞行路径的信息处理装置,包含处理部,处理部获取飞行体飞行的飞行范围的地形信息,基于飞行范围的地形信息,生成包括用于对飞行范围的地形进行拍摄的三维空间中的摄像位置的飞行路径,并基于飞行范围的地形信息与飞行路径,针对飞行路径中每个摄像位置导出摄像角度。
处理部可以获取用于对飞行范围的地形进行拍摄的摄像角度的候选即候选角度,针对每个候选角度来计算出在摄像位置处以候选角度进行拍摄时的摄像成本即第一摄像成本,并将摄像位置处的第一摄像成本大于或等于第一阈值的候选角度确定为摄像位置处的摄像角度。
处理部可以对飞行范围的地形进行采样,来获取由飞行体进行拍摄的多个采样位置,针对每个采样位置计算出在摄像位置处以候选角度对采样位置进行拍摄时的摄像成本即第二摄像成本,并通过将各个采样位置处的第二摄像成本相加来计算第一摄像成本。
可以是摄像位置和采样位置之间的距离越短,第二摄像成本越大。
可以是采样位置处相对于地面的法向矢量与沿候选角度所示的摄像方向的矢量即摄像矢量间的内积值越大,第二摄像成本越大。
处理部可以从第一摄像成本的计算对象中排除内积值为负值的第二摄像成本。
处理部可以获取包括在飞行范围中且包括摄像对象位置的感兴趣区域,并基于感兴趣区域的地形信息和飞行路径,针对飞行路径中每个摄像位置导出摄像角度。
处理部可以获取用于对飞行范围的地形进行拍摄的摄像角度的候选即候选角度,针对每个候选角度计算出在摄像位置处以候选角度对感兴趣区域进行拍摄时的摄像成本即第三摄像成本,并将摄像位置处的第三摄像成本大于或等于第二阈值的候选角度确定为摄像位置处的摄像角度。
当在多个摄像位置中的第一摄像位置处以摄像角度拍摄时的第三摄像成本等于或小于第三阈值时,处理部可以将第一摄像位置从多个摄像位置中排除来生成飞行路径。
处理部可以按从摄像位置拍摄的每个摄像对象来将多个摄像位置分类以生成多个摄像位置组,并连接多个摄像位置组以生成飞行路径。
信息处理装置是包含通信部的终端,处理部可以经由通信部将摄像位置、飞行路径和摄像角度的信息发送到飞行体。
信息处理装置是包含摄像部的飞行体,处理部可以按照飞行路径控制飞行,并经由摄像部在飞行路径的摄像位置处以摄像角度拍摄图像。
在一个方面中,一种生成用于飞行体飞行的飞行路径的信息处理装置的飞行路径生成方法,其包括以下步骤:获取飞行体飞行的飞行范围的地形信息;基于飞行范围的地形信息,生成包括用于对飞行范围的地形进行拍摄的三维空间中的摄像位置的飞行路径;基于飞行范围的地形信息与飞行路径,针对飞行路径中每个摄像位置导出用于对飞行范围的地形进行拍摄的摄像角度。
导出摄像角度的步骤可以包括以下步骤:获取用于对飞行范围的地形进行拍摄的摄像角度的候选即候选角度;针对每个候选角度来计算出在摄像位置处以候选角度进行拍摄时的摄像成本即第一摄像成本;将摄像位置处的第一摄像成本大于或等于第一阈值的候选角度确定为摄像位置处的摄像角度。
计算出第一摄像成本的步骤可以包括以下步骤:对飞行范围的地形进行采样,来获取由飞行体进行拍摄的多个采样位置;针对每个采样位置来计算出在摄像位置处以候选角度对采样位置进行拍摄时的摄像成本即第二摄像成本;通过将各个采样位置处的第二摄像成本相加来计算出第一摄像成本。
可以是摄像位置和采样位置之间的距离越短,第二摄像成本越大。
可以是采样位置处相对于地面的法向矢量与沿候选角度所示的摄像方向的矢量即摄像矢量间的内积值越大,第二摄像成本越大。
计算出第一摄像成本的步骤可以包括以下步骤:从第一摄像成本的计算对象中排除内积值为负值的第二摄像成本。
导出摄像角度的步骤可以包括以下步骤:获取包括在飞行范围中且包括摄像对象位置的感兴趣区域;基于感兴趣区域的地形信息和飞行路径,针对飞行路径中每个摄像位置导出摄像角度。
导出摄像角度的步骤可以包括以下步骤:获取用于对飞行范围的地形进行拍摄的摄像角度的候选即候选角度;针对每个候选角度来计算出在摄像位置处以候选角度对感兴趣区域进行拍摄时的摄像成本即第三拍摄成本;将摄像位置处的第三摄像成本大于或等于第二阈值的候选角度确定为摄像位置处的摄像角度。
生成飞行路径的步骤可以包括以下步骤:当在多个摄像位置中的第一摄像位置处以摄像角度拍摄时的第三摄像成本等于或小于第三阈值时,将第一摄像位置从多个摄像位置中排除来生成飞行路径。
生成飞行路径的步骤可以包括以下步骤:按从摄像位置拍摄的每个摄像对象来将多个摄像位置分类以生成多个摄像位置组,并连接多个摄像位置组以生成飞行路径。
信息处理装置是终端,还可以包括以下步骤:将摄像位置、飞行路径和摄像角度的信息发送到飞行体。
信息处理装置是飞行体,还可以包括以下步骤:按照飞行路径控制飞行;在飞行路径的摄像位置处以摄像角度拍摄图像。
在一个方面中,一种程序,其使生成用于飞行体飞行的飞行路径的信息处理装置执行以下步骤:获取飞行体飞行的飞行范围的地形信息;基于飞行范围的地形信息,生成包括用于对飞行范围的地形进行拍摄的三维空间中的摄像位置的飞行路径;基于飞行范围的地形信息与飞行路径,针对飞行路径中每个摄像位置导出用于对飞行范围的地形进行拍摄的摄像角度。
在一个方面中,一种记录介质,其为计算机可读介质并记录有使生成用于飞行体飞行的飞行路径的信息处理装置执行以下步骤的程序:获取飞行体飞行的飞行范围的地形信息;基于飞行范围的地形信息,生成包括用于对飞行范围的地形进行拍摄的三维空间中的摄像位置的飞行路径;基于飞行范围的地形信息与飞行路径,针对飞行路径中每个摄像位置导出用于对飞行范围的地形进行拍摄的摄像角度。
此外,上述的发明内容中并未穷举本公开的所有特征。此外,这些特征组的子组合也可以构成发明。
【附图说明】
图1是示出第一实施方式中的飞行体***的第一构成示例的示意图。
图2是示出第一实施方式中的飞行体***的第二构成示例的示意图。
图3是示出无人驾驶航空器的具体的外观的一个示例的图。
图4是示出无人驾驶航空器的硬件构成的一个示例的框图。
图5是示出终端的硬件构成的一个示例的框图。
图6是示出飞行范围、飞行路径和摄像位置的一个示例的图。
图7是示出表示摄像角度的候选的表格的一个示例的图。
图8是示出设于起伏不平的地面上的采样位置的一个示例的图。
图9是对计算出候选角度对应的摄像成本的示例进行说明的图。
图10是示出第一实施方式中的飞行体***中的飞行路径生成过程的一个示例的序列图。
图11是示出第二实施方式中的飞行体***中的飞行路径生成过程的一个示例的序列图。
图12是示出指定感兴趣区域的一个示例的图。
图13是说明再生成用于对感兴趣区域进行拍摄的飞行路径的过程的图。
图14是示出第三实施方式中的飞行体***中的飞行路径生成过程的一个示例的序列图。
图15是示出第四实施方式中的飞行体***中的飞行路径生成过程的一个示例的序列图。
图16是示出无人驾驶航空器一边沿地形飞行一边对正下方的地形进行拍摄的情况的图。
图17是示出无人驾驶航空器一边沿地形飞行一边以一定的角度对地形进行拍摄的情况的图。
图18是示出无人驾驶航空器一边沿地形飞行一边以各种角度对地形进行拍摄的情况的图。
【具体实施方式】
以下,通过发明的实施方式来说明本公开,但是以下的实施方式并不限定权利要求书所涉及的发明。实施方式中说明的特征的所有组合未必是发明的解决方案所必须的。
权利要求书、说明书、说明书附图以及说明书摘要中包含作为著作权所保护对象的事项。任何人只要如专利局的文档或者记录所表示的那样进行这些文件的复制,著作权人则不会提出异议。但是,在除此以外的情况下,保留一切的著作权。
(实现本发明的一个方式的背景)拍摄地形(地面的情况)时,可以想到以无人驾驶航空器的正下方方向、相对于正下方的一定的角度来拍摄图像。在此情况下,起伏不平的地形中可能存在难以对地形正面进行拍摄的地方,因此地形中的各个点处的信息量可能减少(参见图16,图17)。图16是示出作为飞行体的无人驾驶航空器100R一边沿地形ms飞行一边对正下方的地形ms进行拍摄的情况的图。当正下方的地形ms为山坡ms1时,存在不能充分地对地形正面信息进行拍摄的情况。图17是示出无人驾驶航空器100R一边沿地形ms飞行一边以一定的角度对地形ms进行拍摄的情况的图。当无人驾驶航空器100R以一定的角度拍摄时,存在难以对山的背面ms2进行拍摄的情况。即难以获取地形正面信息。
此外,可以考虑在各个摄像位置(Waypoint)处以多个角度对地形进行拍摄(参见图18)。图18是示出无人驾驶航空器100R一边沿地形ms飞行一边以各种角度对地形ms进行拍摄的情况的图。在此情况下,除了复原地形形状(三维复原)和正射图像生成所需的图像之外,还可能拍摄到不必要的图像。即对地形形状进行拍摄时的摄像效率降低。
此外,在专利文献1中,确定用于从平台对摄像对象进行拍摄的摄像角度,以使摄像对象进入摄像范围。即,并不是为了拍摄地形正面而确定摄像角度,可能存在地形正面拍摄困难的地方,因此可能造成地形中各点信息量的减少。
因此,期望能够在抑制无人驾驶航空器的摄像效率降低的同时由无人驾驶航空器获取更多的地形正面信息。
在以下实施方式中,飞行体以无人驾驶航空器(UAV:Unmanned Aerial Vehicle)为例。无人驾驶航空器包括在空中移动的航空器。在本说明书的附图中,无人驾驶航空器也表述为“UAV”。作为信息处理装置,例如可以以终端为例,但也可以是其他装置(例如发送器、PC(Personal Computer)、无人驾驶航空器、以及其他信息处理装置)。飞行路径生成方法规定了信息处理装置中的操作。此外,记录介质中记录有程序(例如使信息处理装置执行各种处理的程序)。
(第1实施方式)
图1是示出第一实施方式中的飞行体***10的第一构成示例的图。飞行体***10包含无人驾驶航空器100以及终端80。无人驾驶航空器100和终端80之间可以通过有线通信或无线通信(例如,无线LAN(Local Area Network))互相通信。在图1中,例示了终端80是便携式终端(例如智能手机、平板电脑终端)。终端80是信息处理装置的一个示例。
另外,飞行体***10的构成可以为包含无人驾驶航空器100、发送器(比例控制器)以及终端80。当包含发送器时,用户能够使用配置在发送器前面的左、右控制杆来指示无人驾驶航空器的飞行的控制。另外,在此情况下,无人驾驶航空器100、发送器以及终端80之间能够通过有线通信或无线通信相互通信。
图2是示出第一实施方式中的飞行体***10的第二构成示例的示意图。在图2中,例示了终端80是PC。在图1和图2的任意一个中,终端80具有的功能可以相同。
图3是示出无人驾驶航空器100的具体的外观的一个示例的图。在图3中,示出了无人驾驶航空器100在移动方向STV0飞行时的立体图。无人驾驶航空器100为移动体的一个示例。
如图3所示,可以在与地面平行且沿着移动方向STV0的方向上设有滚转轴(参照x轴)。在此情况下,可以在与地面平行且与滚转轴垂直的方向上设有俯仰轴(参照y轴),另外,在与地面垂直且与滚转轴以及俯仰轴垂直的方向上可以设有偏航轴(参照z轴)。
无人驾驶航空器100的构成为包括UAV主体102、万向节200、摄像部220、以及多个摄像部230。
UAV主体102包含多个旋翼(螺旋浆)。UAV主体102通过控制多个旋翼的旋转而使无人驾驶航空器100飞行。UAV主体102使用例如四个旋翼使无人驾驶航空器100飞行。旋翼的数量并不限于四个。此外,无人驾驶航空器100可以是没有旋翼的固定翼飞机。
摄像部220是对包括在预期摄像范围内的被摄体(例如,作为摄像对象的上空的情况、山川河流等景色、地上的建筑物)进行拍摄的拍摄用相机。
多个摄像部230是为了控制无人驾驶航空器100的飞行而对无人驾驶航空器100的周围进行拍摄的传感用相机。两个摄像部230可以设置于无人驾驶航空器100的机头、即正面。并且,其他两个摄像部230可以设置于无人驾驶航空器100的底面。正面侧的两个摄像部230可以成对,起到所谓立体相机的作用。底面侧的两个摄像部230也可以成对,起到立体相机的作用。可以基于由多个摄像部230拍摄到的图像来生成无人驾驶航空器100周围的三维空间数据。另外,无人驾驶航空器100所包含的摄像部230的数量不限于四个。无人驾驶航空器100只要包含至少一个摄像部230即可。无人驾驶航空器100可以在无人驾驶航空器100的机头、机尾、侧面、底面及顶面分别包含至少一个摄像部230。摄像部230中可设定的视角可大于摄像部220中可设定的视角。摄像部230可以具有单焦点镜头或鱼眼镜头。
图4是示出无人驾驶航空器100的硬件构成的一个示例的框图。无人驾驶航空器100的构成为包括UAV控制部110、通信接口150、内存160、存储器170、万向节200、旋翼机构210、摄像部220、摄像部230、GPS接收器240、惯性测量装置(IMU:Inertial Measurement Unit)250、磁罗盘260、气压高度计270、超声波传感器280、激光测量仪290。
UAV控制部110例如由CPU(Central Processing Unit:中央处理器)、MPU(Micro Processing Unit:微处理器)或DSP(Digital Signal Processor:数字信号处理器)构成。UAV控制部110执行用于总体控制无人飞行器100的各部的操作的信号处理,与其他各部之间的数据的输入/输出处理,数据运算处理和数据存储处理。
UAV控制部110按照存储于内存160中的程序来控制无人驾驶航空器100的飞行。UAV控制部110可以控制飞行。UAV控制部110可以拍摄图像(例如航拍)。
UAV控制部110获取表示无人飞行器100的位置的位置信息。UAV控制部110可以从GPS接收器240获取表示无人驾驶航空器100所在的纬度、经度以及高度的位置信息。UAV控制部110可以分别从GPS接收器240获取表示无人驾驶航空器100所在的纬度以及经度的纬度经度信息,并从气压高度计270获取表示无人驾驶航空器100所在的高度的高度信息,作为位置信息。UAV控制部110可以获取超声波传感器280产生的超声波的放射点与超声波的反射点之间的距离,作为高度信息。
UAV控制部110可以从磁罗盘260获取表示无人驾驶航空器100的朝向的朝向信息。朝向信息可以用例如与无人驾驶航空器100的机头的朝向相对应的方位来表示。
UAV控制部110可以获取表示在摄像部220对应该拍摄的摄像范围进行拍摄时无人驾驶航空器100所应该存在的位置的位置信息。UAV控制部110可以从内存160获取表示无人驾驶航空器100所应该存在的位置的位置信息。UAV控制部110可以经由通信接口150从其他装置获取表示无人驾驶航空器100所应该存在的位置的位置信息。UAV控制部110可以参照三维地图数据库,来特别指定无人驾驶航空器100所能够存在的位置,并获取该位置作为表示无人驾驶航空器100所应该存在的位置的位置信息。
UAV控制部110可以获取表示摄像部220以及摄像部230的各自的摄像范围的摄像范围信息。UAV控制部110可以从摄像部220以及摄像部230获取表示摄像部220以及摄像部230的视角的视角信息,作为用于特别指定摄像范围的参数。UAV控制部110可以获取表示摄像部220以及摄像部230的摄像方向的信息,作为用于特别指定摄像范围的参数。UAV控制部110例如可以从万向节200获取表示摄像部220的姿势状态的姿势信息,作为表示摄像部220的摄像方向的信息。摄像部220的姿势信息可以表示万向节200的俯仰轴和偏航轴从基准旋转角度旋转的角度。
UAV控制部110可以获取表示无人驾驶航空器100所在的位置的位置信息,作为用于特别指定摄像范围的参数。UAV控制部110可以基于摄像部220和摄像部230的视角和摄像方向、以及无人驾驶航空器100所在的位置,来划定表示摄像部220拍摄的地理范围的摄像范围并生成摄像范围信息,从而获取摄像范围信息。
UAV控制部110可以从内存160获取摄像范围信息。UAV控制部110可以经由通信接口150获取摄像范围信息。
UAV控制部110控制万向节200、旋翼机构210、摄像部220以及摄像部230。UAV控制部110可以通过变更摄像部220的摄像方向或视角来控制摄像部220的摄像范围。UAV控制部110可以通过控制万向节200的旋转机构来控制万向节200所支持的摄像部220的摄像范围。
摄像范围是指由摄像部220或摄像部230拍摄的地理范围。摄像范围由纬度、经度和高度定义。摄像范围可以是由纬度、经度和高度定义的三维空间数据的范围。摄像范围可以是由纬度和经度定义的二维空间数据的范围。摄像范围可以根据摄像部220或摄像部230的视角和摄像方向、以及无人驾驶航空器100所在的位置而特别指定。摄像部220和摄像部230的摄像方向可以由设置有摄像部220和摄像部230的摄像镜头的正面所朝的方位和俯角来定义。摄像部220的摄像方向可以是由无人驾驶航空器100的机头的方位和相对于万向节200的摄像部220姿势状态而特别指定的方向。摄像部230的摄像方向可以是由无人驾驶航空器100的机头的方位和设置有摄像部230的位置而特别指定的方向。
UAV控制部110可以通过对分析由多个摄像部230拍摄到的多个图像,来特别指定无人驾驶航空器100的周围的环境。UAV控制部110可以根据无人驾驶航空器100的周围的环境,例如避开障碍物来控制飞行。
UAV控制部110可以获取表示存在于无人驾驶航空器100周围的对象的立体形状(三维形状)的立体信息(三维信息)。对象例如可以是建筑物、道路、车辆、树木等风景的一部分。立体信息例如是三维空间数据。UAV控制部110可以根据从多个摄像部230得到的各个图像,生成表示存在于无人驾驶航空器100的周围的对象的立体形状的立体信息,由此获取立体信息。UAV控制部110可以通过参照存储在内存160或存储器170中的三维地图数据库,来获取表示存在于无人驾驶航空器100的周围的对象的立体形状的立体信息。UAV控制部110可以通过参照由网络上存在的服务器所管理的三维地图数据库,来获取与存在于无人驾驶航空器100的周围的对象的立体形状相关的立体信息。
UAV控制部110通过控制旋翼机构210来控制无人驾驶航空器100的飞行。即,UAV控制部110通过控制旋翼机构210来对包括无人驾驶航空器100的纬度、经度以及高度的位置进行控制。UAV控制部110可以通过控制无人驾驶航空器100的飞行来控制摄像部220的摄像范围。UAV控制部110可以通过控制摄像部220所包含的变焦镜头来控制摄像部220的视角。UAV控制部110可以利用摄像部220的数字变焦功能,通过数字变焦来控制摄像部220的视角。
当摄像部220固定于无人驾驶航空器100,不能移动摄像部220时,UAV控制部110可以通过使无人驾驶航空器100在特别指定的日期向特别指定的位置移动,使摄像部220在所希望的环境下对所希望的摄像范围进行拍摄。或者,即使当摄像部220没有变焦功能,无法变更摄像部220视角时,UAV控制部110也可以通过使无人驾驶航空器100在特别指定的日期向特别指定的位置移动,使摄像部220在所希望的环境下对所希望的摄像范围进行拍摄。
通信接口150与终端80进行通信。通信接口150可以通过任意的无线通信方式进行无线通信。通信接口150可以通过任意的有线通信方式进行有线通信。通信接口150可以将拍摄图像、与拍摄图像相关的附加信息(元数据)发送到终端80。
内存160存储UAV控制部110对万向节200、旋翼机构210、摄像部220、摄像部230、GPS接收器240、惯性测量装置250、磁罗盘260、气压高度计270、超声波传感器280以及激光测量仪290进行控制所需的程序等。内存160可以是计算机可读记录介质,可以包括SRAM(Static Random Access Memory:静态随机存取存储器)、DRAM(Dynamic Random Access Memory:动态随机存取存储器)、EPROM(Erasable Programmable Read Only Memory:可擦除可编程只读存储器)、EEPROM(Electrically  Erasable Programmable Read-Only Memory:电可擦除可编程只读存储器)、以及USB(Universal Serial Bus:通用串行总线)存储器等闪存中的至少一个。内存160可以从无人驾驶航空器100上拆卸下来。内存160可以作为作业用内存进行工作。
存储器170可以包括HDD(Hard Disk Drive:硬盘驱动器)、SSD(Solid State Drive:固态硬盘)、SD卡、USB存储器、其他的存储器中的至少一个。存储器170可以保存各种信息、各种数据。存储器170可以从无人驾驶航空器100上拆卸下来。存储器170可以记录摄像图像。
内存160或存储器170可以保存由终端80或无人驾驶航空器100生成的摄像位置、摄像路径的信息。可以通过UAV控制部110设置摄像位置、摄像路径的信息,作为由无人驾驶航空器100预定的摄像所涉及的摄像参数或由无人驾驶航空器100预定的飞行所涉及的飞行参数中的一个。该设定信息可以保存在内存160或存储器170中。此外,摄像参数可以包括摄像部220的摄像角度的信息。
万向节200可以以偏航轴、俯仰轴以及滚转轴为中心可旋转地支持摄像部220。万向节200可以使摄像部220以偏航轴、俯仰轴以及滚转轴中的至少一个为中心旋转,从而改变摄像部220的摄像方向。
旋翼机构210具有多个旋翼和使多个旋翼旋转的多个驱动电机。旋翼机构210通过UAV控制部110控制旋转,从而使无人驾驶航空器100飞行。旋翼211的数量例如可以是4个,也可以是其他数量。此外,无人驾驶航空器100可以是没有旋翼的固定翼飞机。
摄像部220对所希望的摄像范围内的被摄体进行拍摄并生成摄像图像的数据。通过摄像部220的摄像而得到的图像数据(例如航拍图像)可以存储于摄像部220具有的内存、或存储器170中。
摄像部230对无人驾驶航空器100的周围进行拍摄并生成摄像图像的数据。摄像部230的图像数据可以存储于存储器170中。
GPS接收器240接收表示从多个导航卫星(即GPS卫星)发送的时间以及各GPS卫星的位置(坐标)的多个信号。GPS接收器240根据接收到的多个信号,计算出GPS接收器240的位置(即无人驾驶航空器100的位置)。GPS接收器240将无人驾驶航空器100的位置信息输出到UAV控制部110。另外,可以由UAV控制部110代替GPS接收器240来进行GPS接收器240的位置信息的计算出。在此情况下,在UAV控制部110中输入有GPS接收器240所接收到的多个信号中包含的表示时间以及各GPS卫星的位置的信息。
惯性测量装置250检测无人驾驶航空器100的姿势,并将检测结果输出到UAV控制部110。惯性测量装置250可以检测无人驾驶航空器100的前后、左右以及上下的三轴方向的加速度和俯仰轴、滚转轴以及偏航轴的三轴方向的角速度,作为无人驾驶航空器100的姿势。
磁罗盘260检测无人驾驶航空器100的机头的方位,并将检测结果输出到UAV控制部110。
气压高度计270检测无人驾驶航空器100的飞行高度,并将检测结果输出到UAV控制部110。
超声波传感器280发射超声波,检测地面、物体反射的超声波,并将检测结果输出到UAV控制部110。检测结果可以示出从无人驾驶航空器100到地面的距离,即高度。检测结果可以示出从无人驾驶航空器100到物体(被摄体)的距离。
激光测量仪290对物体照射激光,接收物体反射的反射光,并通过反射光来测量无人驾驶航空器100与物体(被摄体)之间的距离。作为基于激光的距离测量方法的一个示例,可以为飞行时间法。
图5是示出终端80的硬件构成的一个示例的框图。终端80包含终端控制部81、操作部83、通信部85、内存87、显示部88以及存储器89。终端80可以由希望控制无人驾驶航空器100的飞行的用户所持有。
终端控制部81例如采用CPU、MPU或DSP构成。终端控制部81进行用于整体控制终端80的各部的动作的信号处理、与其它各部之间的数据的输入输出处理、数据的运算处理以及数据的存储处理。
终端控制部81可以经由通信部85获取来自无人驾驶航空器100的数据、信息(各种测定数据、摄像图像、其附加信息等)。终端控制部81可以获取经由操作部83输入的数据、信息(例如,各种参数)。终端控制部81可以获取保存在内存87中的数据、信息。终端控制部81可以经由通信部85向无人驾驶航空器100发送数据、信息(例如,摄像位置、摄像角度、飞行路径的信息)。终端控制部81可以将数据、信息发送至显示部88,将基于此数据、信息的显示信息显示在显示部88上。
终端控制部81可以执行针对无人驾驶航空器100的、进行飞行控制的应用程序。终端控制部81可以生成应用程序中使用的各种数据。
操作部83接受并获取由终端80的用户输入的数据、信息。操作部83也可以包括按钮、按键、触控显示屏、话筒等输入装置。这里主要示出了操作部83和显示部 88由触控面板构成。在此情况下,操作部83可以担任触控操作、点击操作、拖动操作等。操作部83可以接受各种参数的信息。操作部83输入的信息可以发送到无人驾驶航空器100。各种参数可以包括与飞行控制有关的参数。
通信部85通过各种无线通信方式与无人驾驶航空器100之间进行无线通信。此无线通信的无线通信方式可以包括,例如,无线LAN、Bluetooth(注册商标),或经由公共无线网络进行的通信。通信部85可以通过任意的有线通信方式进行有线通信。
内存87例如可以具有规定终端80的动作的程序、存储设定值的数据的ROM、暂时保存终端控制部81进行处理时所使用的各种信息、数据的RAM。内存87可以包括ROM和RAM以外的内存。内存87可以设置在终端80的内部。内存87可以设置成可从终端80上拆卸下来。程序可以包括应用程序。
显示部88例如采用LCD(Liquid Crystal Display,液晶显示器)构成,显示从终端控制部81输出的各种信息、数据。显示部88可以显示应用程序的执行涉及的各种数据、信息。
存储器89存储并保存各种数据、信息。存储器89可以是HDD、SSD、SD卡、USB存储器等。存储器89可以设置在终端80的内部。存储器89可以可拆卸地设置在终端80上。存储器89可以保存从无人驾驶航空器100获取的摄像图像、附加信息。附加信息可以保存在内存87中。
另外,当飞行体***10包含发送器(比例控制器)时,终端80执行的处理也可以由发送器执行。由于发送器具有与终端80相同的构成部,故不再详细说明。发送器具有控制部、操作部、通信部、显示部、内存等。当飞行体***10具有发送器时,也可以不设置终端80。
图6是示出飞行范围AR、飞行路径rt和摄像位置wp的图。飞行范围AR表示无人驾驶航空器100飞行的范围。飞行范围AR可以与由无人驾驶航空器100的摄像部220拍摄的摄像范围一致。飞行路径rt表示无人驾驶航空器100飞行时的路径。摄像位置wp是无人驾驶航空器100的摄像部220拍摄图像时的位置。飞行路径rt是经过摄像位置wp而生成的。终端控制部81获取飞行范围AR,生成飞行路径rt,并确定摄像位置wp。
无人驾驶航空器100一边沿飞行范围AR内的飞行路径rt飞行一边在摄像位置wp处进行航拍。在图6中,飞行路径rt是被设置为从左下角进入,以方波形状移动并从右上角出去的路线。此情况下的飞行路径rt是按照用于均匀地拍摄飞行范围AR的扫描方式的飞行路径。除方波形状的路线之外,飞行路径rt可以是被设置为Z字形或螺旋形的路线,也可以是其他形状的飞行路径。
图7是示出表示摄像角度的候选(也称为候选角度)的表格的图。摄像角度是用于无人驾驶航空器100的摄像部220对飞行范围AR的地形进行拍摄的摄像角度。摄像角度的候选是各种摄像角度中的在实际拍摄时所采用的摄像角度的候选。表示候选角度的表格可以登记在终端80的内存87或存储器89中。表示候选角度的表格也可以保存在外部服务器中。
摄像角度可以由支撑摄像部220的万向节200的俯仰角和偏航角规定。因此,候选角度也可以由支撑摄像部220的万向节200的俯仰角和偏航角规定。在图7所示的表格中,示出了九个俯仰角和偏航角的组合(点),作为候选角度。例如,在九个点中包括有俯仰角为0°,偏航角为0°的点、俯仰角为0°,偏航角为270°的点以及俯仰角为-45°,偏航角为270°的点。
另外,图7仅是一个示例,也可以更详细地定义俯仰角和偏航角的组合。此外,在图7中候选角度的俯仰角和偏航角是以均匀的间隔定义的,也可以以不均匀的间隔进行定义。例如,在易被选作(容易假设)为摄像部220的摄像角度的角度范围内,可以定义较多的候选角度,而在难以被选作(难以假设)为摄像部220的摄像角度的角度范围内,可以定义较少的候选角度。
在图7中,假设无人驾驶航空器100从上空进行航拍,俯仰角设为负角,即摄像方向相对于水平方向由水平面朝下。朝向沿着水平方向时,为俯仰角0°,朝向正下方时,为俯仰角90°。另外,在无人驾驶航空器100的高度较低且摄像部220从下朝上仰着进行航拍的情况下,也可以将俯仰角设为正角。由此,可以进行适合于被摄体情况的拍摄。
终端控制部81可以获取用于对飞行范围AR的地形进行拍摄的候选角度。终端控制部81可以从内存87或存储器89获取候选角度。终端控制部81可以经由通信部85从外部服务器获取候选角度。
图8是示出设于起伏不平的地面hm上的采样位置k的一个示例的图。采样位置k可以是由无人驾驶航空器100所拍摄的、对飞行范围AR的地形进行抽取而来的采样位置。采样位置可以由三维位置(纬度、经度、高度)来规定。可以基于多个采样位置来定义地形信息。
在图8中在飞行范围AR内的一个方向上设置有采样位置k,但也可以在飞行范围AR内的二维方向上设置采样位置k。即可以在飞行范围AR内的格子(网格)中即以预定间隔设置采样位置k。此外,也可以以不等间隔而不是等间隔来布置采样位置k。
此外,从地面hm上的采样位置k指向无人驾驶航空器100的摄像部220的箭头表示对于地面hm的法向矢量(法线方向)。在采样位置k处的法向矢量上存在摄像部220的情况下,当摄像部220对采样位置进行拍摄时,可以从采样位置k的正面、通过摄像图像获得采样位置k周围的大量信息。另一方面,摄像部220的摄像范围不仅可以包括存在于正面的采样位置k,还可以包括不在正面的其他采样位置k。因此,期望从尽可能接近包括在摄像范围中的整个采样位置k的正面的方向来对各个采样位置k进行拍摄。可以将由摄像部220对各个采样位置k进行拍摄的适宜程度数值化,作为摄像成本。
另外,在图8中,曲线图的纵轴表示三维空间中的高度(例如,无人驾驶航空器100的摄像部220、地面hm的高度),横轴表示三维空间中的位置(纬度,经度)(例如,采样位置k、包含摄像部220的无人驾驶航空器100的位置)。
采样位置的信息可以保存在内存87或存储器89、外部服务器中。终端控制部81可以从内存87或存储器89获取采样位置k的信息。终端控制部81可以经由通信部85获取采样位置k的信息。终端控制部81自身可以对地形信息进行采样,并确定采样位置k。
图9是对计算出候选角度对应的摄像成本的示例的进行说明的图。摄像成本是对无人驾驶航空器100的摄像部220是否适于拍摄进行数值化而来的。例如,针对每个候选角度计算出摄像成本。
终端控制部81进行以下具体示例的处理(例如,数式的计算出),并计算出候选角度对应的摄像成本。终端控制部81基于摄像成本确定摄像角度θ。由于摄像角度θ是针对每个摄像位置i计算出的,因此也称为摄像角度θi。
例如,将作为摄像对象的地面hm的位置用采样位置k(k=1,2,...,K)表示。将无人驾驶航空器100的摄像部220所拍摄的摄像位置wp用摄像位置i(i=1,2,...,I)表示。将摄像部220的候选角度用候选角度j(j=1,2,...,J)表示。
在此情况下,终端控制部81可以按照式(1)计算出摄像位置i处的候选角度j的摄像成本Cij。
【数学式1】
Cij=∑k Cijk……(1)
此外,例如,将从摄像部220到地面hm的距离,即摄像位置i和采样位置k之间的距离用距离d表示。将摄像部220的候选角度j对应的摄像方向用摄像矢量n表示。将采样位置k处的地面hm的法线方向用法向矢量1表示。终端控制部81可以按照式(2)计算出对于摄像对象的采样位置k的摄像成本Cijk。
【数学式2】
Cijk=d^(-0.5)·max(n*(-1),0)……(2)
根据式(2),距离d越短,采样位置处的摄像成本Cijk的值越大。此外,n*(-1)表示摄像矢量n与法向矢量1的内积(内积值)。根据式(2),内积n*(-1)越大,采样位置的摄像成本Cijk的值越大。
此外,max(n*(-1),0)表示内积n*(-1)和值0中的较大者。这意味着,终端控制部81从摄像位置i处的摄像成本Cij的计算对象中,排除内积n*(-1)为负的采样位置处的摄像成本Cijk。
如式(1)和(2)所示,终端控制部81通过将各个采样位置k处的摄像成本Cijk相加来获得摄像成本Cij。摄像位置i处的摄像成本Cij为最大时的候选角度j是最佳摄像角度θim。终端控制部81可以按照式(3)计算出最佳摄像角度θim。
【数学式3】
θim=argmax(Cij)……(3)
另外,argmax(Cij)是摄像成本Cij为最大(max)时的候选角度j,并且该角度是最佳摄像角度θim。
另外,最佳摄像角度θim是摄像角度θi的一个示例。即,摄像角度θi不限于摄像成本Cij为最大时的角度,是满足预定标准的角度即可。例如,从候选角度中确定(选择)的摄像角度θi可以是大于或等于阈值th1的角度中的、摄像成本Cij为第二大、第三大时的角度。此外,摄像角度θi也可以是与大于或等于对摄像成本Cij求平均而得的平均值的摄像成本Cij相对应的摄像角度。
此外,当摄像位置i处的摄像成本Cij均小于阈值th1时,也可以不从候选角度j确定摄像角度θi,并省略在该摄像位置处的拍摄。在此情况下,例如,对于无论摄像角度θi如何摄像图像的图像质量都小于或等于预定基准的摄像位置i不作为摄像位置,以此,终端80能够省略不必要的拍摄并提高摄像效率。此外,终端80还能够使无人驾驶航空器100按照不通过摄像位置i的飞行路径rt飞行。
这样,终端控制部81可以对飞行范围AR的地形进行采样,并获取由无人驾驶航空器100所拍摄的多个采样位置k。终端控制部81可以针对每个采样位置k,计算出在摄像位置i处以候选角度j对采样位置k进行拍摄时的摄像成本Cijk(第二摄像成本的一个示例)。终端控制部81可以将各个采样位置k处的摄像成本Cijk相加,来计算出摄像位置i处的摄像成本Cij(第一摄像成本的一个示例)。
由此,终端80能够通过考虑各个采样位置k处的摄像成本Cijk来确定各个摄像位置i处的摄像角度θi。例如,即使在一个采样位置k处的摄像成本Cijk较小,当另一个采样位置k处的摄像成本Cijk较大时,多个采样位置k处的整体的摄像成本Cij变大,终端80能够将此情况下的候选角度j用作摄像角度θi。因此,终端80能够综合考虑多个采样位置k处的拍摄的良好程度来确定摄像角度θi。
此外,如式(2)所示,距离d越短,采样位置处的摄像成本Cijk的值可以越大。
由此,由于摄像位置i与采样位置k之间的距离d越短采样位置k处的摄像成本Cijk就越大,采样位置k越靠近无人驾驶航空器100,采样位置k处的摄像成本Cijk越大,摄像位置i处的摄像成本Cij容易变大。因此,当确定摄像角度θi时,终端80能够将靠近无人驾驶航空器100的采样位置k处的摄像成本Cijk的影响程度增大。此外,采样位置k靠近无人驾驶航空器100时,无人驾驶航空器100所拍摄的摄像范围(包括于摄像图像中的范围)变窄,摄像范围中的采样位置k处的图像信息相对增大。因此,由于用于正射图像、三维复原的地面信息增加,终端80能够提高正射图像的生成精度、三维复原精度。
此外,如式(2)所示,内积n*(-1)越大,采样位置k处的摄像成本Cijk的值可以越大。
由此,采样位置k处对于地面hm的法向矢量1与沿候选角度所示的摄像方向的矢量即摄像矢量n之间的内积值越大,采样位置处的摄像成本Cijk越大。因此,法向矢量1与摄像矢量n所成的角度越小,采样位置k处的摄像成本Cijk越大,摄像位置i处的摄像成本Cij容易变大。因此,当确定摄像角度θi时,终端80能够将与法向矢量1所成的角度较小的摄像矢量n的影响程度增大。此外,当法向矢量1与摄像矢量n所成的角度较小时,能够在从正面靠近采样位置k的位置处进行拍摄,增加采样位置k处的图像信息。因此,由于用于正射图像、三维复原的地面信息增加,终端80能够提高正射图像的生成精度、三维复原精度。
此外,终端控制部81可以从摄像位置i处的摄像成本Cij的计算对象中,排除内积n*(-1)的值(内积值)为负值的采样位置k处的摄像成本Cijk。
由此,例如,终端80能够通过将内积n*(-1)的值为负值的采样位置处的摄像成本Cijk设置为0,从而抑制内积值为负值的一个极值对摄像位置i处的摄像成本Cij造成的巨大影响。
以下,对飞行体***10的操作示例进行说明。
图10是示出飞行体***10中的飞行路径生成过程的一个示例的序列图。在图10中,例示了生成飞行路径的处理主要由终端80进行。
终端控制部81获取飞行范围AR的信息(T1)。终端控制部81可以经由操作部83接收用户输入,并获取飞行范围AR。在此情况下,终端控制部81可以经由通信部85从外部服务器获取地图信息。例如当飞行范围AR被设置为矩形范围时,用户可以通过输入矩形的四个角在地图信息中的位置(纬度,经度)来获得飞行范围AR的信息。此外,当飞行范围AR被设置为圆形范围时,用户可以通过输入以飞行位置为中心的圆的半径来获得飞行范围AR的信息。此外,用户可以通过输入区域、特定地名(例如东京)等信息,基于地图信息获得飞行范围AR的信息。此外,终端控制部81可以从内存87、存储器89获取保存在内存87、存储器89中的飞行范围AR。终端控制部81可以经由通信部85从外部服务器获取飞行范围AR。
终端控制部81获取各种参数(T2)。参数可以是与摄像部220所拍摄无人驾驶航空器100的飞行相关的参数。该参数可以包括例如摄像位置、摄像日期和时间、到被摄体的距离、摄像视角、摄像条件、相机参数(快门速度、曝光值、摄像模式等)。终端控制部81可以经由操作部83获取用户输入的参数。终端控制部81可以从内存87、存储器89获取保存在内存87、存储器89中的各种参数。终端控制部81可以经由通信部85从无人驾驶航空器100、外部服务器获取各种参数。
终端控制部81基于飞行范围AR的信息获取地形信息(T3)。例如,终端控制部81可以与经由通信部85连接的网络上的地图服务器联动,获取飞行范围AR的地形信息。地形信息可以包括飞行范围AR的各个位置处的位置信息(纬度、经度、高度)。通过汇集各个位置处的位置信息,可以表示飞行范围AR的三维形状。此外,地形信息可以包括建筑物、山、森林、铁塔等地面形状的信息、物体的信息。
终端控制部81基于飞行范围AR的地形信息、和包括在已获取的参数中的到被摄体的距离等信息来计算出飞行高度(T4)。例如,终端控制部81可以结合地形信息表示的地面hm的起伏来计算出无人驾驶航空器100的飞行高度,以确保到被摄体的距离。
终端控制部81生成飞行路径rt(T5)。在此情况下,终端控制部81可以基于飞行范围AR、地形信息和飞行高度生成飞行路径rt。生成的飞行路径rt维持在飞行范 围AR内的各个位置处的导出的飞行高度,并且经过用于对飞行范围AR内的地形进行拍摄的三维空间中的摄像位置wp。另外,终端控制部81可以按照已知方法来确定使飞行路径经过飞行范围AR内的二维平面(纬度、经度)上的哪个位置,以及使飞行路径经过哪个采样位置k(摄像位置wp)。
终端控制部81基于地形信息和飞行高度,沿飞行路径rt针对每个摄像位置i导出(例如,计算出)摄像角度θi(T6)。导出该摄像角度θi时,终端控制部81针对每个候选角度j计算出摄像位置i处的摄像成本Cij。终端控制部81将摄像位置i处的摄像成本Cij大于或等于阈值th1(例如最大)的候选角度(例如最佳摄像角度θim)确定为摄像位置i处的摄像角度θi。导出该摄像角度θi时,可以基于地形信息和飞行路径rt的信息计算出最佳摄像角度θim。
终端控制部81经由通信部85将包括摄像位置wp、飞行路径rt和摄像角度θi的通知参数发送到无人驾驶航空器100(T7)。通知参数可以包括与拍摄时的相机(摄像部220)相关的摄像参数、与拍摄时的飞行相关的飞行参数。
在无人驾驶航空器100中,UAV控制部110经由通信接口150接收来自终端80的通知参数(T8)。UAV控制部110通过将接收到的通知参数保存在内存160中来设置无人驾驶航空器100使用的各个参数(T9)。UAV控制部110基于设置的参数,一边沿飞行路径rt飞行一边驱动摄像部220,并以摄像角度θi进行航拍(T10)。
这样,终端80可以生成用于无人驾驶航空器100飞行的飞行路径rt。终端控制部81可以获取无人驾驶航空器100飞行的飞行范围AR的地形信息。终端控制部81可以基于飞行范围AR的地形信息,生成包括用于对飞行范围AR的地形进行拍摄的三维空间中的摄像位置wp的飞行路径rt。终端控制部81可以基于飞行范围AR的地形信息和飞行路径rt,针对飞行路径rt的每个摄像位置导出(例如计算出)摄像角度θi。
由此,由于终端80考虑了地形的起伏来确定摄像角度θi,因此减少了由于地形的起伏而难以进行拍摄的地面hm的地方。因此,在每个摄像位置i,无人驾驶航空器100能够在对地面hm的各个点进行拍摄时尽可能地从正面进行拍摄。因此,终端80能够通过利用以确定的摄像角度θi拍摄的摄像图像,提高正射图像的生成精度、三维复原精度(三维形状推测精度)。此外,为了提高正射图像的生成精度、三维复原精度,终端80能够不在各个摄像位置i处以各种角度拍摄图像,并提高无人驾驶航空器100的摄像效率。因此,终端80能够抑制无人驾驶航空器100的摄像效率的降低,使无人驾驶航空器100在起伏不平的地形的各个点上获取尽可能多的信息。另外,主要由终端80所执行的生成飞行路径的处理,可以在无人驾驶航空器100的飞行中或飞行开始前进行。
此外,终端控制部81可以获取用于对飞行范围AR的地形进行拍摄的摄像角度θi的候选即候选角度j。终端控制部81可以针对每个候选角度j计算出摄像位置i处的摄像成本Cij(在摄像位置处以候选角度j进行拍摄时的摄像成本即第一摄像成本的一个示例)。终端控制部81可以基于摄像位置i处的摄像成本Cij来确定摄像位置处的摄像角度θi。在此情况下,终端控制部81可以将摄像位置i处的摄像成本Cij大于或等于阈值th1的候选角度j确定为摄像位置处的摄像角度θi。
由此,终端80能够将是否适合于拍摄数值化为摄像成本,从而容易地判断出摄像位置i处的候选角度j下的拍摄的适宜程度如何。这里,可以将大于或等于阈值th1的摄像位置i处的摄像成本Cij中的第n大(例如最大)的摄像成本Cij的候选角度j确定为摄像角度θi。
此外,终端控制部81可以经由通信部85将摄像位置i、飞行路径rt和摄像角度θi的信息发送到无人驾驶航空器100。
由此,能够在终端80侧进行用于生成飞行路径rt的诸多处理,从而终端80能够在减轻无人驾驶航空器100的处理负荷的同时,抑制无人驾驶航空器100的摄像效率的降低,并由无人驾驶航空器100获取更多的地形正面的信息。
(第二实施方式)
在第二实施方式中,例示了主要由无人驾驶航空器100进行生成飞行路径的处理的情形。第二实施方式中的飞行体***具有与第一实施方式大致相同的结构。对于与第一实施方式相同的构成要素,通过使用相同的符号,省略或简化其说明。
图11是示出第二实施方式中的飞行体***10中的飞行路径生成过程的序列图。在图11中,例示了主要由无人驾驶航空器100进行该生成飞行路径的处理的情形。
过程T21至T23的处理与第一实施方式的过程T1至T3相同。终端控制部81经由通信部85将在过程T21至T23中获取的飞行范围AR、参数和地形信息发送到无人驾驶航空器100(T24)。
在无人驾驶航空器100中,UAV控制部110经由通信接口150接收飞行范围AR、参数和地形信息(T25)。UAV控制部110将接收的飞行范围AR、参数和地形信息存储在内存160中。
UAV控制部110计算出飞行高度(T26)。飞行高度的计算方法可以与T4相同。UAV控制部110生成飞行路径rt(T27)。飞行路径rt的生成方法可以与T5相同。UAV控制部110,沿飞行路径rt,针对每个摄像位置导出摄像角度θi(T28)。摄像角度θi的导出方法可以与T6相同。UAV控制部110将包括摄像位置、飞行路径rt和摄像角 度θi的参数保持并设置在内存160中(T29)。UAV控制部110基于所设置的参数,一边沿飞行路径rt飞行一边在摄像位置处驱动摄像部220,以摄像角度θi进行航拍(T30)。
另外,T23的处理可以在无人驾驶航空器100进行。在此情况下,终端控制部81可以经由通信部85将在过程T21、T22中获取的飞行范围AR和参数发送到无人驾驶航空器100。UAV控制部110可以接收飞行范围AR和参数,并计算出地形信息和飞行高度。
这样,无人驾驶航空器100可以生成用于无人驾驶航空器100飞行的飞行路径rt。UAV控制部110可以获取无人驾驶航空器100飞行的飞行范围AR的地形信息。UAV控制部110可以基于飞行范围AR的地形信息,生成包括用于对飞行范围AR的地形进行拍摄的三维空间中的摄像位置wp的飞行路径rt。UAV控制部110可以基于飞行范围AR的地形信息和飞行路径rt,针对飞行路径rt的每个摄像位置导出(例如,计算出)摄像角度θi。
由此,由于无人驾驶航空器100考虑了地形的起伏来确定摄像角度θ,因此能够减少由于地形的起伏而难以进行拍摄的地面hm的地方。因此,无人驾驶航空器100能够在每个摄像位置,在对地面hm的各个点进行拍摄时尽可能地从正面进行拍摄。因此,无人驾驶航空器100能够通过采用以确定的摄像角度θi拍摄到的摄像图像,来提高正射图像的生成精度、三维复原精度(三维形状推测精度)。此外,为了提高正射图像的生成精度、三维复原精度,无人驾驶航空器100无需在各个摄像位置处以各种角度拍摄图像,能够提高摄像效率。因此,无人驾驶航空器100能够抑制摄像效率的降低,在起伏不平的地形的各个点上获取尽可能多的信息。另外,主要由以无人驾驶航空器100执行的生成飞行路径的处理,可以在无人驾驶航空器100的飞行中或飞行开始前进行。
此外,UAV控制部110可以按照飞行路径rt控制飞行,并经由摄像部220,在飞行路径rt的摄像位置i处以摄像角度θi对地形表面进行航拍(拍摄图像的一个示例)。由此,能够在无人驾驶航空器100侧进行用于生成飞行路径rt的诸多处理,从而无人驾驶航空器100能够在减轻终端80的处理负荷的同时,抑制摄像部220的摄像效率的降低,并获取更多的地形正面的信息。此外,从生成通过抑制摄像部220的摄像效率的降低来更多地获取地形正面的信息的飞行路径rt,到沿生成的飞行路径rt进行拍摄,无人驾驶航空器100能够集中地进行实施。
(第三实施方式)
在第一和第二实施方式中,示出了对整个飞行范围AR进行航拍的情况。在第三实施方式中,示出了在飞行范围AR中,主要对感兴趣区域RI进行航拍的情况。感 兴趣区域RI可以是用户感兴趣的区域或包括用户感兴趣的对象存在的位置的区域。例如,可以经由操作部83,通过对用户感兴趣的区域、对象进行输入操作来设置感兴趣区域RI。
第三实施方式中的飞行体***10具有与第一、第二实施方式大致相同的结构。对于与第一、第二实施方式相同的构成要素,通过使用相同的符号,省略或简化其说明。
图12是示出指定感兴趣区域RI的一个示例的图。在此,示出了感兴趣区域RI是包括建筑物的区域的情况。
例如,终端控制部81在显示部88上显示获取的飞行范围AR。例如,显示部88和操作部83可以由控制面板构成。当用户用手指触摸显示在显示部88上的飞行范围AR内的建筑物501、502时,终端控制部81经由操作部83接收建筑物501、502的位置。终端控制部81将包括两个建筑物501、502的位置的区域设置为感兴趣区域RI。
此外,终端控制部81在感兴趣区域RI中设置多个初始摄像点gp。初始摄像点gp是指作为初始设置而设置的摄像位置wp。终端控制部81可以例如从内存87或存储器89获取初始摄像点gp的信息,或可以经由操作部83通过用户操作获取该信息,或经由通信部85从外部处理器获取该信息。在图12中,初始摄像点gp以格子状布置在二维平面中。此外,相邻的初始摄像点gp以等间隔布置。另外,初始摄像点gp也可以以格子状以外的形状布置,或者相邻的初始摄像点gp之间也可以不以等间隔布置。
图13是说明再生成用于无人驾驶航空器100对感兴趣区域RI进行拍摄的飞行路径rt的过程的图。在此情况下,与第一实施方式相同,终端控制部81导出多个初始摄像点gp处的候选角度j下的摄像成本Cij。终端控制部81从摄像位置wp删除多个初始摄像点gp中的具有较低摄像成本Cij的摄像点gpl,其中该摄像点gpl在候选角度j下的摄像成本Cij小于或等于阈值th3。即,在摄像成本Cij小于或等于阈值th3的摄像位置(摄像点)处,终端控制部81不拍摄图像。而且,终端控制部81将多个初始摄像点gp中的具有较高摄像成本Cij的摄像点gph留作摄像位置wp,其中该摄像点gpl在候选角度j下的摄像成本Cij超过阈值th3。
终端控制部81可以对具有摄像角度θi下的高摄像成本Cij的多个摄像点gph,针对每个感兴趣区域RI(这里指每个作为感兴趣对象的建筑物)进行聚类(分类)。对于聚类,可以使用K-means(k-平均法)、DBSCAN(Density-based spatial clustering of applications with noise,具有噪声应用的基于密度的空间聚类)等已知的方法。
在图13中,终端控制部81通过聚类来计算出包括摄像成本Cij大于或等于阈值th3的四个摄像点gph的摄像位置组sg1、sg2。终端控制部81将包括在摄像位置组sg1、sg2中的各个摄像点gph作为摄像位置wp连结,再生成飞行路径rtn。
这样,终端控制部81按从摄像点gph进行拍摄的每个感兴趣区域RI,来将摄像角度θi下的具有高摄像成本Cij的多个摄像点gph(摄像位置)分类,并生成多个摄像位置组sg1、sg2。终端控制部81可以连结(连接)多个摄像位置组sg1、sg2,以再生成飞行路径rtn(生成飞行路径的一个示例)。
由此,无人驾驶航空器100能够将包括在摄像位置组sg1、sg2中的多个摄像点gph(摄像位置)的每一个进行汇总并拍摄。因此,由于用于对各个感兴趣区域RI进行拍摄的无人驾驶航空器100的飞行距离缩短,因此无人驾驶航空器100能够提高摄像效率。
以下,对飞行体***10的操作示例进行说明。
图14是示出第三实施方式的飞行体***10中的飞行路径生成过程的一个示例的序列图。在图14中,例示了主要由终端80进行该生成飞行路径的处理的情形。
过程T41至T43与第一实施方式的过程T1至T3相同。
终端控制部81获取感兴趣区域RI的信息。终端控制部81可以经由操作部83接收用户输入,并获取感兴趣区域RI(T44)。例如,用户可以经由操作部83,通过输入地名直接指定感兴趣区域RI,或者通过圈住地图信息中的一部分区域来指定感兴趣区域RI。在此情况下,终端控制部81可以经由通信部85获取地图信息。此外,终端控制部81可以经由操作部83接收用户输入,并获取摄像对象的类型信息。终端控制部81也可以基于摄像对象的类型,使用图像处理技术来检测与包括在飞行范围AR内的摄像对象类型相对应的感兴趣区域RI(例如,包括建筑物的区域)。
过程T45、T46与所述第一实施方式中的过程T4、T5相同。终端控制部81基于感兴趣区域RI的地形信息和飞行高度,沿飞行路径rt,按初始摄像点gp(摄像位置)导出(例如,计算出)用于对感兴趣区域RI进行拍摄的摄像角度θi(T47)。
在导出用于对感兴趣区域RI进行拍摄的摄像角度θi时,终端控制部81可以按照式(4)、(5)、(6)来计算出候选角度j下的摄像成本Cij。针对每个候选角度j计算出候选角度j下的摄像成本Cij。
【数学式4】
Cij=∑k Cijk……(4)
【数学式5】
Cijk=(Pk in ROI)·d^(-0.5)·max(n*(-1),0)……(5)
【数学式6】
θim=argmax(Cij)……(6)
式(5)中的(Pk In ROI)的意思是将作为摄像成本Cijk的计算对象的采样位置k限制在包括于感兴趣区域RI中的位置上。即,终端控制部81可以按照式(5),来计算出从初始摄像点gp(摄像位置)以候选角度j对感兴趣区域RI内的采样位置k进行拍摄时的摄像成本Cijk。对于其他方面,式(4)、(5)、(6)可以与式(1)、(2)、(3)相同。
例如,终端控制部81基于按照式(4)、(5)、(6)计算出的摄像成本Cij,来计算出用于对感兴趣区域进行拍摄的摄像角度θi。该摄像角度θi可以是用于对感兴趣区域进行拍摄的摄像角度θim。
当多个初始摄像点gp处的摄像角度θi下的摄像成本Cij小于或等于阈值th3时,终端控制部81删除相应(具有低摄像成本Cij)的摄像点gpl(T48)。另一方面,当多个初始摄像点gp处的摄像角度θi下的摄像成本Cij超过阈值th3时,终端控制部81不删除对应(具有高摄像成本Cij)的摄像点gph,而保留为摄像位置wp。
终端控制部81针对具有高摄像成本Cij的多个摄像点gph进行聚类(摄像点的分组),计算出摄像位置组sg1、sg2(T49)。终端控制部81排除摄像角度的摄像成本Cij小于或等于阈值th3的摄像位置wp,以将包括在摄像位置组sg1、sg2中的各个摄像点gph作为摄像位置而连结,以再生成飞行路径rtn(T50)。
另外,摄像位置组的生成不是必不可少的,并且在经过包括在摄像位置组sg1中的整个摄像位置wp之后,经过包括在摄像位置组sg2中的整个摄像位置wp也不是必须的。也可以是例如按照包括在摄像位置组sg1中的摄像位置、包括在摄像位置组sg2中的摄像位置、包括在摄像位置组sg1中的摄像位置的顺序飞行这样,用于对不 同感兴趣区域RI进行拍摄的摄像位置wp可以是无序的方式来再生成各个飞行路径rtn。
终端控制部81将包括摄像位置wp(相当于具有高摄像成本Cij的多个摄像点gph)、再生成的飞行路径rtn和摄像角度θi的通知参数经由通信部85发送到无人驾驶航空器100(T51)。
在无人驾驶航空器100中,UAV控制部110经由通信接口50接收通知参数(T52)。UAV控制部110通过将接收到的通知参数保存在内存160中来设置无人驾驶航空器100使用的各个参数(T53)。UAV控制部110基于设置的参数,一边沿飞行路径rtn飞行一边驱动摄像部220,并以摄像角度θi对感兴趣区域RI进行航拍(T54)。
这样,终端控制部81可以获取包括在飞行范围AR中的、例如包括两个建筑物501、502的位置的感兴趣区域RI。终端控制部81可以基于感兴趣区域RI的地形信息和飞行路径rt,针对每个摄像位置导出摄像角度θi。
由此,当无人驾驶航空器100对感兴趣区域RI进行航拍时,终端80能够抑制摄像效率的降低并使其对起伏不平的地形的各个点获取尽可能多的信息。此外,终端80能够导出摄像角度θi,该摄像角度θi能够提高感兴趣区域RI的正射图像的生成精度和三维复原精度。
此外,终端控制部81可以获取用于对飞行范围AR的地形进行拍摄的摄像角度θi的候选即候选角度j。终端控制部81可以针对每个候选角度j计算出在摄像位置i处以候选角度j对感兴趣区域RI进行拍摄时的摄像成本Cij(第三摄像成本的一个示例)。终端控制部81可以基于摄像位置i处的摄像成本Cij来确定摄像位置i处的摄像角度θi。在此情况下,终端控制部81可以将摄像位置i处的摄像成本Cij大于或等于阈值th2(第二阈值的一个示例)的候选角度j确定为摄像位置i处的摄像角度θi。
由此,终端80能够将是否适合于对感兴趣区域RI的拍摄数值化为摄像成本,并能够容易地判断在多大程度上适合于拍摄。这里,可以将大于或等于阈值th2的摄像位置i处的摄像成本Cij中的第n大(例如,最大)摄像成本Cij的候选角度确定为最佳摄像角度θim。
此外,终端控制部81可以从多个摄像位置wp中排除多个摄像位置i中的、在摄像位置处以摄像角度θi进行拍摄时的摄像成本Cij小于或等于阈值th3的摄像点gpl(第一摄像位置的一个示例),以再生成飞行路径rtn。即,终端控制部81可以从已生成的飞行路径rtn中排除摄像点gpl,来再生成飞行路径rtn。
由此,由于对摄像角度θi下的摄像成本Cij的影响较小的摄像位置不包括在飞行路径rtn的生成(再生成)中,因此终端80能够抑制感兴趣区域RI的正射图像的生成精度、三维复原精度的降低,并提高无人驾驶航空器100对感兴趣区域RI进行拍摄时的摄像效率。
(第四实施方式)
在第四实施方式中,例示了主要由无人驾驶航空器100来进行考虑了感兴趣区域的生成飞行路径的处理的情形。第四实施方式的飞行体***具有与第一至第三实施方式大致相同的结构。对于与第一至第三实施方式相同的构成要素使用相同的符号,省略或简化其说明。
图15是示出第四实施方式中的飞行体***10的飞行路径生成过程的一个示例的序列图。在图15中,例示了主要由无人驾驶航空器100来进行该生成飞行路径的处理的情形。
过程T61至T64的处理与第三实施方式的过程T41至T44相同。终端控制部81经由通信部85将包括在过程T61至T64中获取的飞行范围AR、参数、地形信息和感兴趣区域RI的通知参数发送到无人驾驶航空器100(T65)。
在无人驾驶航空器100中,UAV控制部110经由通信接口150接收通知参数(T66)。UAV控制部110计算出飞行高度(T67)。飞行高度的计算方法可以与过程T4、T45相同。UAV控制部110生成飞行路径(T68)。飞行路径rt的生成方法可以与过程T5、T46相同。
UAV控制部110导出用于对感兴趣区域RI进行拍摄的摄像角度θi(T69)。摄像角度θi的导出方法可以与过程T47相同。UAV控制部110删除不必要的摄像点gpl(T70)。不必要的摄像点gpl的删除方法可以与过程T48相同。UAV控制部110进行聚类,并计算出摄像位置组sg1、sg2(T71)。聚类方法和摄像位置组sg1、sg2的计算方法可以与过程T49相同。UAV控制部110再生成飞行路径rtn(T72)。飞行路径rtn的再生成方法可以与过程T50相同。
UAV控制部110通过将包括摄像位置wp(相当于具有高摄像成本Cij的摄像点gph)、再生成的飞行路径rtn和摄像角度θi的参数保存在内存160中,来设置无人驾驶航空器100使用的各个参数(T73)。UAV控制部110基于设置的参数,一边沿飞行路径rtn飞行一边驱动摄像部220,并以摄像角度θi对感兴趣区域RI进行航拍(T74)。
由此,无人驾驶航空器100能够在无人驾驶航空器100侧实施用于生成考虑了感兴趣区域RI的飞行路径rtn的诸多处理,并减轻终端80的处理负荷。
以上使用实施方式对本公开进行了说明,但是本公开的技术范围并不限于上述实施方式所记载的范围。对本领域普通技术人员来说,显然可对上述实施方式加以各种变更或改良。从权利要求书的记载即可明白,加以了这样的变更或改良的方式都可包含在本公开的技术范围之内。
权利要求书、说明书以及说明书附图中所示的装置、***、程序和方法中的动作、顺序、步骤、以及阶段等各项处理的执行顺序,只要没有特别明示“在...之前”、“事先”等,且只要前面处理的输出并不用在后面的处理中,即可以以任意顺序实现。关于权利要求书、说明书以及说明书附图中的操作流程,为了方便起见而使用“首先”、“接着”等,但并不意味着必须按照这样的顺序实施。
【符号说明】
10 飞行体***
80 终端
81 终端控制部
83 操作部
85 通信部
87 内存
88 显示部
89 存储器
100、100R 无人驾驶航空器(UAV)
110 UAV控制部
150 通信接口
160 内存
170 存储器
200 万向节
210 旋翼机构
220,230 摄像部
240 GPS接收器
250 惯性测量装置
260 磁罗盘
270 气压高度计
280 超声波传感器
290 激光测量仪
501、502 建筑物
AR 飞行范围
gp 初始摄像点
gph、gpl 摄像点
hm 地面
k 采样位置
ms 地形
ms1 山坡
ms2 背面
rt 飞行路径
sg1、sg2 摄像位置组
wp 摄像位置

Claims (26)

  1. 一种生成用于飞行体飞行的飞行路径的信息处理装置,其特征在于,
    其包含处理部;
    所述处理部获取所述飞行体飞行的飞行范围的地形信息;
    基于所述飞行范围的地形信息,生成包括用于对所述飞行范围的地形进行拍摄的三维空间中的摄像位置的飞行路径;
    并基于飞行范围的地形信息和所述飞行路径,针对所述飞行路径的每个摄像位置导出摄像角度。
  2. 根据权利要求1所述的信息处理装置,其特征在于,
    所述处理部获取用于对所述飞行范围的地形进行拍摄的所述摄像角度的候选即候选角度,
    针对每个所述候选角度来计算出在所述摄像位置处以所述候选角度进行拍摄时的摄像成本即第一摄像成本,
    并将所述摄像位置处的所述第一摄像成本大于或等于第一阈值的候选角度确定为所述摄像位置处的所述摄像角度。
  3. 根据权利要求2所述的信息处理装置,其特征在于,
    所述处理部对所述飞行范围的地形进行采样,来获取由所述飞行体进行拍摄的多个采样位置,
    针对每个所述采样位置来计算出在所述摄像位置处以所述候选角度对所述采样位置进行拍摄时的摄像成本即第二摄像成本,
    并通过将各个采样位置处的所述第二摄像成本相加来计算出所述第一摄像成本。
  4. 根据权利要求3所述的信息处理装置,其特征在于,所述摄像位置和所述采样位置之间的距离越短,所述第二摄像成本越大。
  5. 根据权利要求3所述的信息处理装置,其特征在于,所述采样位置处相对于地面的法向矢量与沿所述候选角度所示的摄像方向的矢量即摄像矢量间的内积值越大,所述第二摄像成本越大。
  6. 根据权利要求5所述的信息处理装置,其特征在于,所述处理部从所述第一摄像成本的计算对象中排除所述内积值为负值的所述第二摄像成本。
  7. 根据权利要求1至6中任一项所述的信息处理装置,其特征在于,
    所述处理部获取包括在所述飞行范围中且包括摄像对象位置的感兴趣区域,
    并基于所述感兴趣区域的地形信息和所述飞行路径,针对所述飞行路径中每个所述摄像位置导出摄像角度。
  8. 根据权利要求7所述的信息处理装置,其特征在于,
    所述处理部获取用于对所述飞行范围的地形进行拍摄的所述摄像角度的候选即候选角度,
    针对每个所述候选角度来计算出在所述摄像位置处以所述候选角度对所述感兴趣区域进行拍摄时的摄像成本即第三摄像成本,
    并将所述摄像位置处的所述第三摄像成本大于或等于第二阈值的候选角度确定为所述摄像位置处的所述摄像角度。
  9. 根据权利要求8所述的信息处理装置,其特征在于,当在多个所述摄像位置中的第一摄像位置处以所述摄像角度拍摄时的所述第三摄像成本小于或等于第三阈值时,所述处理部将所述第一摄像位置从多个摄像位置中排除来生成所述飞行路径。
  10. 根据权利要求7至9的任一项所述的信息处理装置,其特征在于,
    所述处理部按从摄像位置拍摄的每个摄像对象将多个所述摄像位置分类以生成多个摄像位置组,
    并连接多个所述摄像位置组以生成所述飞行路径。
  11. 根据权利要求1至10的任一项所述的信息处理装置,其特征在于,所述信息处理装置是包含通信部的终端,
    所述处理部经由所述通信部将所述摄像位置、所述飞行路径和所述摄像角度的信息发送到所述飞行体。
  12. 根据权利要求1至10的任一项所述的信息处理装置,其特征在于,所述信息处理装置是包含摄像部的飞行体,
    所述处理部按照所述飞行路径控制飞行,
    并经由所述摄像部在所述飞行路径的所述摄像位置处以所述摄像角度拍摄图像。
  13. 一种生成用于飞行体飞行的飞行路径的信息处理装置中的飞行路径生成方法,其特征在于,包括以下步骤:
    获取所述飞行体飞行的飞行范围的地形信息;
    基于所述飞行范围的地形信息,生成包括用于对所述飞行范围的地形进行拍摄的三维空间中的摄像位置的飞行路径;
    基于所述飞行范围的地形信息与所述飞行路径,针对所述飞行路径中的每个所述摄像位置导出用于对所述飞行范围的地形进行拍摄的摄像角度。
  14. 根据权利要求13所述的飞行路径生成方法,其特征在于,所述导出所述摄像角度的步骤包括以下步骤:
    获取用于对所述飞行范围的地形进行拍摄的所述摄像角度的候选即候选角度;
    针对每个所述候选角度来计算出在所述摄像位置处以所述候选角度进行拍摄时的摄像成本即第一摄像成本;
    将所述摄像位置处的所述第一摄像成本大于或等于第一阈值的候选角度确定为所述摄像位置处的所述摄像角度。
  15. 根据权利要求14所述的飞行路径生成方法,其特征在于,所述计算出所述第一摄像成本的步骤包括以下步骤:
    对所述飞行范围的地形进行采样,来获取由所述飞行体进行拍摄的多个采样位置;
    针对每个所述采样位置来计算出在所述摄像位置处以所述候选角度对所述采样位置进行拍摄时的摄像成本即第二摄像成本;
    通过将各个采样位置处的所述第二摄像成本相加来计算出所述第一摄像成本。
  16. 根据权利要求15所述的飞行路径生成方法,其特征在于,所述摄像位置和所述采样位置之间的距离越短,所述第二摄像成本越大。
  17. 根据权利要求15所述的飞行路径生成方法,其特征在于,所述采样位置处相对于地面的法向矢量与沿所述候选角度所示的摄像方向的矢量即摄像矢量间的内积值越大,所述第二摄像成本越大。
  18. 根据权利要求17所述的飞行路径生成方法,其特征在于,计算出所述第一摄像成本的步骤包括以下步骤:从所述第一摄像成本的计算对象中排除所述内积值为负值的所述第二摄像成本。
  19. 根据权利要求13至18的任一项所述的飞行路径生成方法,其特征在于,导出所述摄像角度的步骤包括以下步骤:
    获取包括在所述飞行范围中且包括摄像对象位置的感兴趣区域;
    基于所述感兴趣区域的地形信息和所述飞行路径,针对所述飞行路径中每个所述摄像位置导出所述摄像角度。
  20. 根据权利要求19所述的飞行路径生成方法,其特征在于,所述导出所述摄像角度的步骤包括以下步骤:
    获取用于对所述飞行范围的地形进行拍摄的所述摄像角度的候选即候选角度;
    针对每个所述候选角度来计算出在所述摄像位置处以所述候选角度对所述感兴趣区域进行拍摄时的摄像成本即第三摄像成本;
    将所述摄像位置处的所述第三摄像成本大于或等于第二阈值的候选角度确定为所述摄像位置处的所述摄像角度。
  21. 根据权利要求20所述的飞行路径生成方法,其特征在于,所述生成所述飞行路径的步骤包括以下步骤:当在多个所述摄像位置中的第一摄像位置处以所述摄像角度拍摄时的所述第三摄像成本小于或等于第三阈值时,
    将所述第一摄像位置从多个摄像位置中排除来生成所述飞行路径。
  22. 根据权利要求19至21的任一项所述的飞行路径生成方法,其特征在于,所述生成所述飞行路径的步骤包括:
    按从所述摄像位置拍摄的每个摄像对象将多个所述摄像位置分类以生成多个摄像位置组;
    连接多个所述摄像位置组以生成所述飞行路径。
  23. 根据权利要求13至22的任一项所述的飞行路径生成方法,其特征在于,所述信息处理装置是终端,还包括以下步骤:
    将所述摄像位置、所述飞行路径和所述摄像角度的信息发送到所述飞行体。
  24. 根据权利要求13至22的任一项所述的飞行路径生成方法,其特征在于,所述信息处理装置是飞行体,还包括以下步骤:
    按照所述飞行路径控制飞行;
    在所述飞行路径的所述摄像位置处以所述摄像角度拍摄图像。
  25. 一种程序,其特征在于,其使生成用于飞行体飞行的飞行路径的信息处理装置执行以下步骤:
    获取所述飞行体飞行的飞行范围的地形信息;
    基于所述飞行范围的地形信息,生成包括用于对所述飞行范围的地形进行拍摄的三维空间中的摄像位置的飞行路径;
    基于所述飞行范围的地形信息与所述飞行路径,针对所述飞行路径中的每个所述摄像位置来导出用于对所述飞行范围的地形进行拍摄的摄像角度。
  26. 一种记录介质,其特征在于,其为计算机可读记录介质,并记录有使生成用于飞行体飞行的飞行路径的信息处理装置执行以下步骤的程序:
    获取所述飞行体飞行的飞行范围的地形信息;
    基于所述飞行范围的地形信息,生成包括用于对所述飞行范围的地形进行拍摄的三维空间中的摄像位置的飞行路径;
    基于所述飞行范围的地形信息与所述飞行路径,针对所述飞行路径中的每个所述摄像位置导出用于对所述飞行范围的地形进行拍摄的摄像角度。
PCT/CN2019/105125 2018-09-13 2019-09-10 信息处理装置、飞行路径生成方法、程序以及记录介质 WO2020052549A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201980005546.7A CN111344650B (zh) 2018-09-13 2019-09-10 信息处理装置、飞行路径生成方法、程序以及记录介质

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-171672 2018-09-13
JP2018171672A JP7017998B2 (ja) 2018-09-13 2018-09-13 情報処理装置、飛行経路生成方法、プログラム、及び記録媒体

Publications (1)

Publication Number Publication Date
WO2020052549A1 true WO2020052549A1 (zh) 2020-03-19

Family ID: 69777310

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/105125 WO2020052549A1 (zh) 2018-09-13 2019-09-10 信息处理装置、飞行路径生成方法、程序以及记录介质

Country Status (3)

Country Link
JP (1) JP7017998B2 (zh)
CN (1) CN111344650B (zh)
WO (1) WO2020052549A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114185360A (zh) * 2021-11-24 2022-03-15 北京思湃德信息技术有限公司 基于无人机的房屋普查方法及装置

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6902763B1 (ja) * 2020-06-16 2021-07-14 九州電力株式会社 ドローン飛行計画作成システム及びプログラム
WO2023182089A1 (ja) * 2022-03-24 2023-09-28 ソニーセミコンダクタソリューションズ株式会社 制御装置、制御方法

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110056089A (ko) * 2009-11-20 2011-05-26 삼성전자주식회사 Method of receiving position information in a digital photographing apparatus
CN111399488B (zh) * 2014-04-25 2023-08-01 索尼公司 Information processing device, information processing method, program, and imaging system
US10156855B2 (en) 2014-05-30 2018-12-18 SZ DJI Technology Co., Ltd. Heading generation method and system of unmanned aerial vehicle
JP6621063B2 (ja) 2015-04-29 2019-12-18 パナソニックIpマネジメント株式会社 Camera selection method and video distribution system
WO2018073878A1 (ja) * 2016-10-17 2018-04-26 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド Three-dimensional shape estimation method, three-dimensional shape estimation system, flying object, program, and recording medium
WO2018073879A1 (ja) * 2016-10-17 2018-04-26 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド Flight path generation method, flight path generation system, flying object, program, and recording medium
CN110366670B (zh) 2017-03-02 2021-10-26 深圳市大疆创新科技有限公司 Three-dimensional shape inference method, flying object, mobile platform, program, and recording medium
CN108417041A (zh) * 2018-05-15 2018-08-17 江苏大学 Rural road monitoring system and method based on a quadrotor and a cloud server

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102591353A (zh) * 2011-01-04 2012-07-18 株式会社拓普康 Flight control system for a flying object
CN106200693A (zh) * 2016-08-12 2016-12-07 东南大学 Real-time gimbal control system and control method for a small land-survey unmanned aerial vehicle
CN107796384A (zh) * 2016-08-30 2018-03-13 波音公司 2D vehicle localization using geoarcs
CN106547276A (zh) * 2016-10-19 2017-03-29 上海圣尧智能科技有限公司 Automatic spraying concentric-rectangle path planning method and plant protection drone spraying operation method
EP3336644A1 (en) * 2016-11-09 2018-06-20 Samsung Electronics Co., Ltd. Unmanned aerial vehicle and method for photographing operator using same
CN107450573A (zh) * 2016-11-17 2017-12-08 广州亿航智能技术有限公司 Flight photographing control system and method, intelligent mobile communication terminal, and aircraft
CN107504957A (zh) * 2017-07-12 2017-12-22 天津大学 Method for rapid three-dimensional terrain model construction using multi-view UAV imaging
CN109871027A (zh) * 2017-12-05 2019-06-11 深圳市九天创新科技有限责任公司 Oblique photography method and system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114185360A (zh) * 2021-11-24 2022-03-15 北京思湃德信息技术有限公司 UAV-based house census method and device
CN114185360B (zh) * 2021-11-24 2024-04-26 北京思湃德信息技术有限公司 UAV-based house census method and device

Also Published As

Publication number Publication date
CN111344650A (zh) 2020-06-26
JP7017998B2 (ja) 2022-02-09
JP2020043543A (ja) 2020-03-19
CN111344650B (zh) 2024-04-16

Similar Documents

Publication Publication Date Title
JP6803800B2 (ja) Information processing device, aerial photography route generation method, aerial photography route generation system, program, and recording medium
JP6962775B2 (ja) Information processing device, aerial photography route generation method, program, and recording medium
JP6962812B2 (ja) Information processing device, flight control instruction method, program, and recording medium
JP6765512B2 (ja) Flight path generation method, information processing device, flight path generation system, program, and recording medium
WO2020024185A1 (en) Techniques for motion-based automatic image capture
WO2020052549A1 (zh) Information processing device, flight path generation method, program, and recording medium
JP6675537B1 (ja) Flight path generation device, flight path generation method and program therefor, and structure inspection method
US20200064133A1 (en) Information processing device, aerial photography route generation method, aerial photography route generation system, program, and storage medium
US20210185235A1 (en) Information processing device, imaging control method, program and recording medium
JP2019028560A (ja) Mobile platform, image composition method, program, and recording medium
CN109891188B (zh) Mobile platform, imaging path generation method, program, and recording medium
WO2019105231A1 (zh) Information processing device, flight control instruction method, and recording medium
JP6875269B2 (ja) Information processing device, flight control instruction method, program, and recording medium
JP7067897B2 (ja) Information processing device, flight control instruction method, program, and recording medium
WO2020119572A1 (zh) Shape inference device, shape inference method, program, and recording medium
WO2021052217A1 (zh) Control device for performing image processing and frame body control
WO2020001629A1 (zh) Information processing device, flight path generation method, program, and recording medium
WO2020108290A1 (zh) Image generation device, image generation method, program, and recording medium
WO2020088397A1 (zh) Position estimation device, position estimation method, program, and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 19860852
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 19860852
    Country of ref document: EP
    Kind code of ref document: A1