WO2018146803A1 - Position processing device, flying object, position processing system, flight system, position processing method, flight control method, program, and recording medium - Google Patents
Position processing device, flying object, position processing system, flight system, position processing method, flight control method, program, and recording medium
- Publication number
- WO2018146803A1 (PCT/JP2017/005016)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- flight
- position information
- relative position
- information
- flying
- Prior art date
Links
- 238000012545 processing Methods 0.000 title claims description 131
- 238000000034 method Methods 0.000 title claims description 50
- 238000003672 processing method Methods 0.000 title claims description 26
- 238000003384 imaging method Methods 0.000 claims description 313
- 230000008569 process Effects 0.000 claims description 14
- 238000004364 calculation method Methods 0.000 claims description 12
- 230000005484 gravity Effects 0.000 claims description 9
- 238000010586 diagram Methods 0.000 description 52
- 230000015654 memory Effects 0.000 description 51
- 238000004891 communication Methods 0.000 description 41
- 230000010365 information processing Effects 0.000 description 14
- 230000006870 function Effects 0.000 description 11
- 230000008859 change Effects 0.000 description 10
- 238000005259 measurement Methods 0.000 description 10
- 230000007246 mechanism Effects 0.000 description 8
- 238000001514 detection method Methods 0.000 description 6
- 238000013500 data storage Methods 0.000 description 3
- 230000001174 ascending effect Effects 0.000 description 2
- 230000007423 decrease Effects 0.000 description 2
- 230000011664 signaling Effects 0.000 description 2
- 230000001133 acceleration Effects 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 230000006866 deterioration Effects 0.000 description 1
- 230000001788 irregular Effects 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 238000000691 measurement method Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 239000011347 resin Substances 0.000 description 1
- 229920005989 resin Polymers 0.000 description 1
- 230000004044 response Effects 0.000 description 1
- 230000003068 static effect Effects 0.000 description 1
- 238000002366 time-of-flight method Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0027—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement involving a plurality of vehicles, e.g. fleet or convoy travelling
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C39/00—Aircraft not otherwise provided for
- B64C39/02—Aircraft not otherwise provided for characterised by special use
- B64C39/024—Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
- B64D47/08—Arrangements of cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0016—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0038—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0044—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with a computer generated representation of the environment of the vehicle, e.g. virtual reality, maps
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U10/00—Type of UAV
- B64U10/10—Rotorcrafts
- B64U10/13—Flying platforms
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/10—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/10—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
- B64U2201/102—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS] adapted for flying in formations
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/20—Remote controls
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/57—Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/28—Mobile studios
Definitions
- the present disclosure relates to a position processing device, a position processing system, a position processing method, a program, and a recording medium that process position information of a plurality of flying objects.
- the present disclosure relates to an aircraft that flies based on processed position information, a flight system, a flight control method, a program, and a recording medium.
- a plurality of unmanned aircraft can fly together in one area.
- a plurality of unmanned aircraft can fly in a coordinated manner by executing a preset flight program (see Patent Document 1).
- a plurality of flying bodies, as a plurality of unmanned aircraft, move to designated positions in the air according to a command from a ground station, stop there, and emit light.
- an observer can thereby view a simulated constellation or similar display.
- the flying object described in Patent Document 1 can fly according to a flight route and flight position set in advance, but it is difficult for it to fly along a route or to positions that were not set in advance. Therefore, the system described in Patent Document 1 cannot specify a flight route or the like in real time, and offers a low degree of freedom while the unmanned aircraft are flying.
- if an operating device (a transmitter, or "propo") is used, the flight route and flight position can be instructed to an unmanned aircraft in real time, reflecting the operator's intent.
- however, when each unmanned aircraft is operated with its own operating device, a plurality of operating devices are required, and it is difficult to operate a plurality of unmanned aerial vehicles in a coordinated manner.
- in one aspect, the position processing device is a position processing device that processes position information of a plurality of flying objects, and includes a unit that selects a plurality of flying objects and forms a flight group to which the selected flying objects belong, and a determining unit that determines first relative position information, i.e., relative position information of the plurality of flying objects belonging to the flight group, during operation by an operating device that instructs control of the flying objects.
- the determination unit may determine relative position information of each of the plurality of flying objects with respect to a reference position of the plurality of flying objects belonging to the flight group as the first relative position information.
- the determining unit may determine the identification information of each of the plurality of flying objects in association with the relative position information of each of the flying objects identified by the identification information.
- the first relative position information may include relative position information of a plurality of flying objects in a three-dimensional space.
- the first relative position information may include horizontal distance information between the plurality of flying objects.
- the first relative position information may include distance information in the gravity (vertical) direction between the plurality of flying objects.
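As a concrete illustration of these aspects, the reference position can be taken as the centroid of the flight group, and the first relative position information as each aircraft's 3-D offset from it, split into a horizontal component and a gravity-direction component. The following sketch is not from the patent; it assumes a simple local east/north/up frame, and names such as `relative_positions` are illustrative.

```python
import math

def relative_positions(positions):
    """Compute each UAV's offset from the group reference position.

    positions: dict mapping UAV id -> (east, north, up) in metres.
    Returns (reference, offsets): reference is the centroid, and offsets
    maps each id to its 3-D offset, horizontal distance, and
    gravity-direction (vertical) distance.
    """
    n = len(positions)
    ref = tuple(sum(p[i] for p in positions.values()) / n for i in range(3))
    offsets = {}
    for uav_id, (e, nn, u) in positions.items():
        de, dn, du = e - ref[0], nn - ref[1], u - ref[2]
        offsets[uav_id] = {
            "offset": (de, dn, du),            # 3-D relative position
            "horizontal": math.hypot(de, dn),  # horizontal distance
            "vertical": abs(du),               # gravity-direction distance
        }
    return ref, offsets

ref, offs = relative_positions({"uav1": (0.0, 0.0, 10.0),
                                "uav2": (4.0, 0.0, 10.0),
                                "uav3": (2.0, 3.0, 12.0)})
```

Computing offsets this way also yields the relative position information "based on differences between acquired position information" mentioned below: each offset is just a difference of measured positions.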
- the position processing device may further include a display unit that displays a plurality of flying object images indicating a plurality of flying objects, and an operation unit that receives an input.
- the determination unit may change the first relative position information by changing the positions of the plurality of flying object images displayed on the display unit by input to the operation unit.
- the operation unit may accept an input by a drag operation.
- the display unit may display distance information between the plurality of flying objects based on the positions of the plurality of flying object images changed by the drag operation.
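The drag-based editing described above can be sketched as follows: icon positions on the screen map to relative positions, and the displayed inter-aircraft distances are recomputed after each drag. The screen-to-metre scale and the function names are assumptions for illustration only.

```python
import itertools, math

SCALE = 0.1  # assumed: 1 screen pixel corresponds to 0.1 m

def drag(icons, uav_id, new_xy):
    """Move one UAV icon; icons maps id -> (x, y) in pixels."""
    icons[uav_id] = new_xy
    return icons

def pairwise_distances(icons):
    """Distances (in metres) displayed between every pair of UAV icons."""
    dists = {}
    for (a, pa), (b, pb) in itertools.combinations(sorted(icons.items()), 2):
        dists[(a, b)] = SCALE * math.hypot(pa[0] - pb[0], pa[1] - pb[1])
    return dists

icons = {"uav1": (0, 0), "uav2": (30, 40)}
drag(icons, "uav2", (60, 80))  # user drags uav2's icon further away
```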
- the position processing device may further include an operation unit that receives an input.
- the determination unit may determine the first relative position information based on distance information between the plurality of flying objects input to the operation unit.
- the position processing device may further include an acquisition unit that acquires position information of each of the plurality of flying objects.
- the determination unit may determine the first relative position information based on second relative position information that is relative position information based on a difference between the plurality of acquired position information.
- the position processing device may further include an output unit that outputs the first relative position information.
- in one aspect, the flying object forms a flight group together with other flying objects, and includes a first acquisition unit that acquires, from an operating device that instructs control of the plurality of flying objects belonging to the flight group, an instruction signal instructing control of the flight, as well as first relative position information, i.e., relative position information of the flying object with respect to a reference position of the plurality of flying objects belonging to the flight group; and a control unit that controls the flight of the flying object by fixing the relative positional relationship between the reference position and the flying object based on the instruction signal and the first relative position information.
- the instruction signal may include first turning instruction information for instructing turning of a plurality of flying objects.
- based on the first turning instruction information, the control unit may control the flight of the flying object so that the distance between the flying object and the reference position of the plurality of flying objects belonging to the flight group is fixed and the flying object turns around the reference position.
- the instruction signal may include second turning instruction information for instructing turning of a plurality of flying objects.
- based on the second turning instruction information, the control unit may control the flight of the flying object so that the position of the flying object is fixed and the flying object turns around its own position (i.e., rotates in place).
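The two turning modes can be illustrated geometrically: in the first mode each aircraft orbits the reference position at a fixed radius, while in the second mode each aircraft holds its position and only changes its own yaw. A minimal kinematic sketch (function names are assumptions, not from the patent):

```python
import math

def first_turning_mode(pos, ref, dtheta):
    """Rotate `pos` around reference point `ref` by dtheta radians,
    keeping the distance to the reference fixed (first turning mode)."""
    dx, dy = pos[0] - ref[0], pos[1] - ref[1]
    c, s = math.cos(dtheta), math.sin(dtheta)
    return (ref[0] + c * dx - s * dy, ref[1] + s * dx + c * dy)

def second_turning_mode(pos, yaw, dtheta):
    """Hold position and rotate about the aircraft's own axis
    (second turning mode): only the yaw angle changes."""
    return pos, (yaw + dtheta) % (2 * math.pi)

# A UAV 2 m east of the reference, turned 90 degrees in the first mode,
# ends up 2 m north of the reference: same radius, new bearing.
p = first_turning_mode((2.0, 0.0), (0.0, 0.0), math.pi / 2)
```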
- the flying object may further include a first imaging unit.
- the control unit may control the angle of view of the first imaging unit based on the number of flying objects flying in cooperation, and may control the imaging direction of the first imaging unit based on the first relative position information.
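One way to read this aspect: with N cooperating aircraft covering the full horizon, each camera needs roughly 360°/N of horizontal view (plus some overlap), and each aircraft can point its camera outward along its offset from the reference position. The overlap margin and names below are assumptions for illustration.

```python
import math

OVERLAP_DEG = 10.0  # assumed overlap between adjacent views

def angle_of_view(num_uavs):
    """Horizontal angle of view per camera so N cameras cover 360 deg."""
    return 360.0 / num_uavs + OVERLAP_DEG

def imaging_direction(offset):
    """Point the camera outward along the UAV's horizontal offset
    (east, north) from the reference position; returns azimuth in deg,
    measured clockwise from north."""
    east, north = offset
    return math.degrees(math.atan2(east, north)) % 360.0

# Three cooperating UAVs: each needs ~130 deg of view; a UAV due east
# of the reference images toward azimuth 90 deg.
```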
- the flying object may further include a second acquisition unit that acquires first flight position information indicating the flight position of the flying object, and a calculation unit that calculates second flight position information indicating the flight position of the flying object based on the reference position and the first relative position information.
- the control unit may control the flight of the flying body so that the first flight position information and the second flight position information match.
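The control loop implied here compares the measured position (first flight position information) against the target derived from the reference position plus the relative offset (second flight position information), and commands motion to close the gap. A minimal proportional sketch, with the gain and all names assumed:

```python
def target_position(reference, relative_offset):
    """Second flight position information: reference position plus the
    first relative position information."""
    return tuple(r + d for r, d in zip(reference, relative_offset))

def velocity_command(measured, target, gain=0.5):
    """Proportional command driving the measured flight position toward
    the target so that the two flight positions match."""
    return tuple(gain * (t - m) for m, t in zip(measured, target))

tgt = target_position((10.0, 5.0, 20.0), (-2.0, 1.0, 0.0))
cmd = velocity_command((9.0, 6.0, 20.0), tgt)
```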
- the first imaging unit may acquire information on a first angle of view indicating the angle of view of the first imaging unit.
- the first acquisition unit may acquire information on a second angle of view indicating the angle of view of a second imaging unit provided on another flying object.
- the control unit may control the flight of the flying object so that the difference between the first angle of view and the second angle of view is substantially constant.
- the first acquisition unit may acquire a second captured image captured by a second imaging unit included in another flying object. The calculation unit may calculate second relative position information, i.e., relative position information of the flying object with respect to the other flying object, based on the first captured image captured by the first imaging unit and the second captured image.
- the first relative position information may include third relative position information that is relative position information of the flying object with respect to the other flying objects.
- the control unit may control the flight of the flying object so that the second relative position information matches the third relative position information.
- the flying object may further include a distance measuring sensor that measures a distance between the flying object and another flying object to obtain first distance information.
- the first relative position information may include second distance information indicating a distance between the flying object and another flying object.
- the control unit may control the flight of the flying body so that the first distance information matches the second distance information.
- in one aspect, the position processing system is a position processing system that processes position information of a plurality of flying objects, and includes a unit that selects a plurality of flying objects and forms a flight group to which the selected flying objects belong; a determination unit that determines first relative position information, i.e., relative position information of the plurality of flying objects belonging to the flight group, during operation by an operating device that instructs control of the flying objects; and a setting unit that sets the relative position information for the plurality of flying objects.
- in one aspect, the flight system includes a plurality of flying objects that fly in a flight group and an operating device that instructs control of the plurality of flying objects. The operating device transmits an instruction signal instructing control of the flight of the flying objects; each of the plurality of flying objects receives the instruction signal and, based on the relative position information of the plurality of flying objects belonging to the flight group determined during operation by the operating device, controls its flight so that the relative positional relationship of the plurality of flying objects is fixed.
- the flight system may further include an image processing device.
- each of the plurality of flying objects may capture an image in a different imaging direction and transmit the captured image to the image processing apparatus.
- the image processing apparatus may receive a plurality of captured images from each of the plurality of flying objects and generate at least one of a panoramic image and a stereo image based on the plurality of captured images.
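The panorama generation mentioned here would normally use feature-based stitching; as a minimal stand-in, if each aircraft's imaging direction is known, the received images can be ordered by azimuth and concatenated horizontally. This sketch uses plain nested lists rather than a real image library, and everything in it is illustrative.

```python
def naive_panorama(images_by_azimuth):
    """Concatenate same-height images left-to-right in azimuth order.

    images_by_azimuth: dict mapping azimuth (deg) -> image, where an
    image is a list of rows (each row a list of pixel values). Assumes
    equal heights and already-aligned, non-overlapping views - a crude
    stand-in for real feature-based stitching.
    """
    ordered = [images_by_azimuth[a] for a in sorted(images_by_azimuth)]
    height = len(ordered[0])
    return [sum((img[r] for img in ordered), []) for r in range(height)]

pano = naive_panorama({
    90.0:  [[1, 1], [1, 1]],   # view toward the east
    210.0: [[2, 2], [2, 2]],
    330.0: [[3, 3], [3, 3]],
})
```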
- in one aspect, the position processing method is a method in a position processing apparatus that processes position information of flying objects, and includes: selecting a plurality of flying objects and forming a flight group to which the selected flying objects belong; and determining first relative position information, i.e., relative position information of the plurality of flying objects belonging to the flight group, during operation by an operating device that instructs control of the flying objects.
- the step of determining the position information may include determining, as the first relative position information, relative position information of each of the plurality of flying objects with respect to a reference position of the plurality of flying objects belonging to the flight group.
- the step of determining the position information may include a step of determining the identification information of each of the plurality of flying objects in association with the relative position information of each of the flying objects identified by the identification information.
- the first relative position information may include relative position information of a plurality of flying objects in a three-dimensional space.
- the first relative position information may include distance information in the horizontal direction of a plurality of flying objects.
- the first relative position information may include gravity direction distance information of a plurality of flying objects.
- the position processing method may further include a step of displaying a plurality of flying object images indicating a plurality of flying objects and a step of receiving an input to the operation unit.
- the step of determining the position information may include a step of changing the first relative position information by changing the positions of the displayed plurality of flying object images by input.
- the step of receiving an input may include a step of receiving an input by a drag operation.
- the step of displaying the flying object image may include a step of displaying distance information between the plurality of flying objects based on the positions of the plurality of flying object images changed by the drag operation.
- the position processing method may further include a step of receiving an input to the operation unit.
- the step of determining the position information may include a step of determining the first relative position information based on distance information between the plurality of flying objects input to the operation unit.
- the position processing method may further include a step of acquiring position information of each of the plurality of flying objects.
- the step of determining the position information may include determining the first relative position information based on second relative position information, i.e., relative position information based on differences between the plurality of acquired position information.
- the position processing method may further include a step of outputting the first relative position information.
- in one aspect, the flight control method is for a flying object that forms a flight group together with other flying objects, and includes: acquiring, from an operating device that instructs control of the plurality of flying objects belonging to the flight group, an instruction signal instructing control of the flight; acquiring first relative position information, i.e., relative position information of the flying object with respect to a reference position of the plurality of flying objects belonging to the flight group; and controlling the flight of the flying object by fixing the relative positional relationship between the reference position and the flying object based on the instruction signal and the first relative position information.
- the instruction signal may include first turning instruction information for instructing turning of the plurality of flying objects.
- the step of controlling the flight may include controlling the flying object so that, based on the first turning instruction information, the distance between the flying object and the reference position of the plurality of flying objects belonging to the flight group is fixed and the flying object turns around the reference position.
- the instruction signal may include second turning instruction information for instructing turning of a plurality of flying objects.
- the step of controlling the flight of the flying object may include controlling the flight so that, based on the second turning instruction information, the position of the flying object is fixed and the flying object turns around its own position.
- the flight control method may further include controlling the angle of view of the first imaging unit included in the flying object based on the number of flying objects belonging to the flight group, and controlling the imaging direction of the first imaging unit based on the first relative position information.
- the flight control method may further include acquiring first flight position information indicating the flight position of the flying object, and calculating second flight position information indicating the flight position of the flying object based on the reference position and the first relative position information. Controlling the flight of the flying object may include controlling the flight so that the first flight position information and the second flight position information match.
- the flight control method may further include acquiring information on a first angle of view indicating the angle of view of the first imaging unit, and acquiring information on a second angle of view indicating the angle of view of a second imaging unit included in another flying object.
- the step of controlling the flight of the flying object may include the step of controlling the flight of the flying object such that a difference between the first angle of view and the second angle of view is substantially constant.
- the flight control method may further include capturing a first captured image with the first imaging unit, acquiring a second captured image captured by a second imaging unit included in another flying object, and calculating second relative position information, i.e., relative position information of the flying object with respect to the other flying object, based on the first captured image and the second captured image.
- the first relative position information may include third relative position information that is relative position information of the flying object with respect to the other flying objects.
- Controlling the flight of the air vehicle may include controlling the flight of the air vehicle so that the second relative position information and the third relative position information match.
- the flight control method may further include a step of measuring a distance between the flying object and another flying object to obtain first distance information.
- the first relative position information may include second distance information indicating a distance between the flying object and another flying object.
- Controlling the flight of the flying object may include controlling the flight of the flying object such that the first distance information and the second distance information match.
- in one aspect, the position processing method is a method in a position processing system that processes position information of flying objects, and includes: selecting a plurality of flying objects and forming a flight group to which the selected flying objects belong; determining first relative position information, i.e., relative position information of the plurality of flying objects belonging to the flight group, during operation by an operating device that instructs control of the flying objects; and setting the relative position information for the plurality of flying objects.
- in one aspect, the flight control method is a method in a flight system including a plurality of flying objects that fly in a flight group and an operating device that instructs control of the plurality of flying objects, and includes controlling the flight of each flying object by fixing the relative positional relationship between the plurality of flying objects based on the relative position information.
- the flight control method may further include capturing images in a different imaging direction for each of the plurality of flying objects, acquiring the plurality of captured images, and generating at least one of a panoramic image and a stereo image based on the plurality of captured images.
- in one aspect, the program causes a computer, serving as a position processing device that processes position information of a plurality of flying objects, to execute: selecting a plurality of flying objects and forming a flight group to which the selected flying objects belong; and determining first relative position information, i.e., relative position information of the plurality of flying objects belonging to the flight group, during operation by an operating device that instructs control of the flying objects.
- in one aspect, the program causes a flying object that forms a flight group with other flying objects to execute: acquiring, from an operating device that instructs control of the plurality of flying objects belonging to the flight group, an instruction signal instructing control of the flight; acquiring relative position information of the flying object with respect to a reference position of the plurality of flying objects belonging to the flight group during operation by the operating device; and controlling the flight of the flying object by fixing the relative positional relationship between the reference position and the flying object based on the instruction signal and the relative position information.
- in one aspect, the recording medium is a computer-readable recording medium that records a program causing a computer, serving as a position processing device that processes position information of a plurality of flying objects, to execute: selecting a plurality of flying objects and forming a flight group to which the selected flying objects belong; and determining first relative position information, i.e., relative position information of the plurality of flying objects belonging to the flight group, during operation by an operating device that instructs control of the flying objects.
- in one aspect, the recording medium is a computer-readable recording medium that records a program causing a computer on a flying object that forms a flight group together with other flying objects to execute: acquiring, from an operating device that instructs control of the plurality of flying objects belonging to the flight group, an instruction signal instructing control of the flight; acquiring relative position information of the flying object with respect to a reference position of the plurality of flying objects belonging to the flight group during operation by the operating device; and controlling the flight of the flying object by fixing the relative positional relationship between the reference position and the flying object based on the instruction signal and the relative position information.
- A figure showing an example of the appearance of an unmanned aerial vehicle
- A figure showing an example of the specific appearance of an unmanned aerial vehicle
- A block diagram showing an example of the hardware configuration of the unmanned aircraft in the first embodiment
- A perspective view showing an example of the external appearance of a transmitter and a portable terminal (smartphone)
- A block diagram showing an example of the hardware configuration of the transmitter
- A block diagram showing an example of the hardware configuration of a portable terminal
- A figure showing an example of designating unmanned aerial vehicles belonging to the same flight group on the positional relationship processing screen
- A figure showing an example of adjusting the horizontal positions of unmanned aerial vehicles on the positional relationship processing screen
- A block diagram showing an example of the hardware configuration of the unmanned aircraft in the second embodiment
- A block diagram showing an example of the functional configuration of an unmanned aerial vehicle
- A block diagram showing an example of the functional configuration of a mobile terminal
- A figure for explaining the rotation method of the rotor blades according to the kind of instruction signal from the transmitter
- A schematic diagram showing an example of a plurality of unmanned aircraft forming a flight group and a virtual aircraft located at the reference position
- Schematic showing an example of turning in the first turning mode of an unmanned aerial vehicle
- Schematic showing an example of turning in the second turning mode of an unmanned aerial vehicle
- Schematic diagram showing a first arrangement example during flight of three unmanned aerial vehicles forming a flight group
- Schematic diagram showing an example of the array of five unmanned aerial vehicles that form a flight group during flight Schematic diagram of a second arrangement example in the horizontal direction during flight of three unmanned aerial vehicles forming a flight group
- Schematic diagram of second arrangement example in height direction during flight of three unmanned aerial vehicles forming a flight group Flow chart showing an example of operation of an unmanned aerial vehicle
- an unmanned aerial vehicle (UAV: Unmanned Aerial Vehicle) is exemplified as a flying object.
- Unmanned aerial vehicles include aircraft that travel in the air.
- the unmanned aerial vehicle is represented as “UAV”.
- the flight control method defines the operations in the flying object and the flight system.
- the recording medium is a recording medium of a program (for example, a program that causes an unmanned aircraft to execute various processes).
- a flight system is exemplified as the position processing system.
- a portable terminal is illustrated as a position processing apparatus.
- the mobile terminal may include a smartphone or a tablet terminal.
- the position processing method defines operations in the position processing apparatus and the position processing system.
- the recording medium is a recording medium of a program (for example, a program that causes a mobile terminal to execute various processes).
- FIG. 1 is a schematic diagram illustrating a configuration example of a flight system 10 according to the first embodiment.
- the flight system 10 includes an unmanned aircraft 100, a transmitter 50, and a portable terminal 80.
- the unmanned aircraft 100, the transmitter 50, and the portable terminal 80 can communicate with each other by wired communication or wireless communication (for example, a wireless local area network (LAN)).
- FIG. 2 is a diagram illustrating an example of the appearance of the unmanned aerial vehicle 100.
- FIG. 3 is a diagram illustrating an example of the specific appearance of the unmanned aerial vehicle 100. FIG. 2 shows a side view of the unmanned aircraft 100 flying in the moving direction STV0, and FIG. 3 shows a perspective view of the unmanned aircraft 100 flying in the moving direction STV0.
- a roll axis (see x-axis) is defined in a direction parallel to the ground and along the moving direction STV0.
- a pitch axis (see y-axis) is defined in a direction parallel to the ground and perpendicular to the roll axis, and a yaw axis (see z-axis) is defined in a direction perpendicular to the ground and perpendicular to both the roll axis and the pitch axis.
- the unmanned aerial vehicle 100 includes a UAV main body 102, a gimbal 200, an imaging device 220, and a plurality of imaging devices 230.
- the imaging devices 220 and 230 are an example of an imaging unit.
- the UAV main body 102 includes a plurality of rotor blades (propellers).
- the UAV main body 102 causes the unmanned aircraft 100 to fly by controlling the rotation of a plurality of rotor blades.
- the UAV main body 102 causes the unmanned aircraft 100 to fly using, for example, four rotary wings.
- the number of rotor blades is not limited to four.
- Unmanned aerial vehicle 100 may also be a fixed wing aircraft that does not have rotating wings.
- the imaging device 220 is an imaging camera that captures a subject included in a desired imaging range (for example, an aerial subject, a landscape such as a mountain or a river, a building on the ground).
- the plurality of imaging devices 230 are sensing cameras that image the surroundings of the unmanned aircraft 100 in order to control the flight of the unmanned aircraft 100.
- the two imaging devices 230 may be provided on the front surface that is the nose of the unmanned aircraft 100.
- the other two imaging devices 230 may be provided on the bottom surface of the unmanned aircraft 100.
- the two imaging devices 230 on the front side may be paired and function as a so-called stereo camera.
- the two imaging devices 230 on the bottom side may also be paired and function as a stereo camera.
- Three-dimensional spatial data around the unmanned aerial vehicle 100 may be generated based on images captured by the plurality of imaging devices 230. Note that the number of imaging devices 230 included in the unmanned aerial vehicle 100 is not limited to four.
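As a rough sketch of how depth can be recovered from such a stereo pair, the standard pinhole-stereo triangulation relation is shown below (the values and function name are illustrative, not taken from this patent):

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo relation: depth Z = f * B / d, where f is the focal
    length in pixels, B the distance between the paired cameras, and d the
    horizontal pixel shift of the same point between the two images."""
    if disparity_px <= 0:
        raise ValueError("zero or negative disparity: point at infinity or bad match")
    return focal_px * baseline_m / disparity_px

# e.g. f = 700 px, baseline = 0.1 m, disparity = 35 px
depth_m = stereo_depth(700.0, 0.1, 35.0)
```

Applying this relation per matched pixel yields a depth map, from which three-dimensional spatial data around the aircraft can be assembled.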
- the unmanned aircraft 100 only needs to include at least one imaging device 230.
- the unmanned aerial vehicle 100 may include at least one imaging device 230 on each of the nose, tail, side, bottom, and ceiling of the unmanned aircraft 100.
- the angle of view that can be set by the imaging device 230 may be wider than the angle of view that can be set by the imaging device 220.
- the imaging device 230 may have a single focus lens or a fisheye lens.
- FIG. 4 is a block diagram showing an example of the hardware configuration of the unmanned aerial vehicle 100.
- the unmanned aircraft 100 includes a UAV control unit 110, a communication interface 150, a memory 160, a gimbal 200, a rotary wing mechanism 210, an imaging device 220, an imaging device 230, a GPS receiver 240, an inertial measurement unit (IMU) 250, a magnetic compass 260, a barometric altimeter 270, an ultrasonic sensor 280, and a laser measuring instrument 290.
- the communication interface 150 is an example of a communication unit.
- the ultrasonic sensor 280 and the laser measuring device 290 are examples of distance measuring sensors.
- the UAV control unit 110 is configured using, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor).
- the UAV control unit 110 performs signal processing for overall control of operations of each unit of the unmanned aircraft 100, data input / output processing with respect to other units, data calculation processing, and data storage processing.
- the UAV control unit 110 controls the flight of the unmanned aircraft 100 according to a program stored in the memory 160.
- UAV control unit 110 controls the flight of unmanned aerial vehicle 100 in accordance with instructions received from remote transmitter 50 via communication interface 150.
- Memory 160 may be removable from unmanned aerial vehicle 100.
- the UAV control unit 110 may specify the environment around the unmanned aircraft 100 by analyzing a plurality of images captured by the plurality of imaging devices 230.
- the UAV control unit 110 controls the flight based on the environment around the unmanned aircraft 100 while avoiding obstacles, for example.
- the UAV control unit 110 acquires date / time information indicating the current date / time.
- the UAV control unit 110 may acquire date / time information indicating the current date / time from the GPS receiver 240.
- the UAV control unit 110 may acquire date / time information indicating the current date / time from a timer (not shown) mounted on the unmanned aircraft 100.
- the UAV control unit 110 acquires position information indicating the position of the unmanned aircraft 100.
- the UAV control unit 110 may acquire position information indicating the latitude, longitude, and altitude at which the unmanned aircraft 100 exists from the GPS receiver 240.
- the UAV control unit 110 may acquire, as position information, latitude and longitude information indicating the latitude and longitude where the unmanned aircraft 100 exists from the GPS receiver 240, and altitude information indicating the altitude where the unmanned aircraft 100 exists from the barometric altimeter 270.
- the UAV control unit 110 may acquire the distance between the ultrasonic emission point and the ultrasonic reflection point by the ultrasonic sensor 280 as altitude information.
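The conversion from measured air pressure to the altitude information mentioned above is not specified in this document; one common choice is the international barometric formula, sketched here:

```python
def pressure_altitude_m(pressure_hpa: float, sea_level_hpa: float = 1013.25) -> float:
    """International barometric formula (troposphere approximation):
    h = 44330 * (1 - (P / P0) ** (1 / 5.255)), with P0 the sea-level
    reference pressure in hPa."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

alt0 = pressure_altitude_m(1013.25)  # sea-level pressure gives 0 m
alt1 = pressure_altitude_m(900.0)    # lower pressure gives a higher altitude
```

In practice the reference pressure P0 drifts with weather, which is one reason a drone may fuse barometric altitude with GPS or ultrasonic height.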
- the UAV control unit 110 acquires orientation information indicating the orientation of the unmanned aircraft 100 from the magnetic compass 260.
- the orientation information indicates, for example, a direction corresponding to the nose direction of the unmanned aircraft 100.
- the UAV control unit 110 may acquire position information indicating a position where the unmanned aircraft 100 should be present when the imaging device 220 captures an imaging range to be imaged.
- the UAV control unit 110 may acquire position information indicating the position where the unmanned aircraft 100 should be present from the memory 160.
- the UAV control unit 110 may acquire position information indicating the position where the unmanned aircraft 100 should exist from another device such as the transmitter 50 via the communication interface 150.
- the UAV control unit 110 may refer to a 3D map database, specify a position where the unmanned aircraft 100 can exist in order to capture the imaging range to be imaged, and acquire that position as position information indicating where the unmanned aircraft 100 should exist.
- the UAV control unit 110 acquires imaging information indicating the imaging ranges of the imaging device 220 and the imaging device 230.
- the UAV control unit 110 acquires angle-of-view information indicating the angle of view of the imaging device 220 and the imaging device 230 from the imaging device 220 and the imaging device 230 as parameters for specifying the imaging range.
- the UAV control unit 110 acquires information indicating the imaging direction of the imaging device 220 and the imaging device 230 as a parameter for specifying the imaging range.
- the UAV control unit 110 acquires posture information indicating the posture state of the imaging device 220 from the gimbal 200 as information indicating the imaging direction of the imaging device 220, for example.
- the UAV control unit 110 acquires information indicating the direction of the unmanned aircraft 100.
- information indicating the posture state of the imaging device 220 indicates the rotation angles of the pitch axis and yaw axis of the gimbal 200 from their reference rotation angles.
- the UAV control unit 110 acquires position information indicating a position where the unmanned aircraft 100 exists as a parameter for specifying the imaging range.
- the UAV control unit 110 may acquire imaging information by generating imaging information indicating the imaging range, that is, the geographical range captured by the imaging device 220, based on the angles of view and imaging directions of the imaging device 220 and the imaging device 230 and the position where the unmanned aircraft 100 exists.
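For a nadir-pointing camera, the geographical extent of the imaging range can be derived from altitude and angle of view alone; a simplified sketch (ignoring gimbal tilt and terrain, with hypothetical names):

```python
import math

def ground_footprint_width(altitude_m: float, fov_deg: float) -> float:
    """Width of the ground strip imaged straight down:
    w = 2 * h * tan(fov / 2)."""
    return 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)

width_m = ground_footprint_width(100.0, 90.0)  # 90-degree field of view at 100 m
```

Combined with the aircraft's latitude, longitude, and heading, such a footprint defines the geographical range the imaging information describes.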
- the UAV control unit 110 may acquire imaging information indicating an imaging range to be imaged by the imaging device 220.
- the UAV control unit 110 may acquire imaging information to be imaged by the imaging device 220 from the memory 160.
- the UAV control unit 110 may acquire imaging information to be imaged by the imaging device 220 from another device such as the transmitter 50 via the communication interface 150.
- the UAV control unit 110 may acquire three-dimensional information indicating the three-dimensional shape of objects existing around the unmanned aircraft 100.
- objects are, for example, parts of a landscape such as buildings, roads, cars, and trees.
- the three-dimensional information is, for example, three-dimensional space data.
- the UAV control unit 110 may acquire the three-dimensional information by generating the three-dimensional information indicating the three-dimensional shape of the object existing around the unmanned aircraft 100 from each image obtained from the plurality of imaging devices 230.
- the UAV control unit 110 may acquire the three-dimensional information indicating the three-dimensional shape of the object existing around the unmanned aircraft 100 by referring to the three-dimensional map database stored in the memory 160.
- the UAV control unit 110 may acquire three-dimensional information related to the three-dimensional shape of an object existing around the unmanned aircraft 100 by referring to a three-dimensional map database managed by a server existing on the network.
- the UAV control unit 110 acquires image data captured by the imaging device 220 and the imaging device 230.
- the UAV control unit 110 controls the gimbal 200, the rotary blade mechanism 210, the imaging device 220, and the imaging device 230.
- the UAV control unit 110 controls the imaging range of the imaging device 220 by changing the imaging direction or angle of view of the imaging device 220.
- the UAV control unit 110 controls the imaging range of the imaging device 220 supported by the gimbal 200 by controlling the rotation mechanism of the gimbal 200.
- the imaging range refers to a geographical range captured by the imaging device 220 or the imaging device 230.
- the imaging range is defined by latitude, longitude, and altitude.
- the imaging range may be a range in three-dimensional spatial data defined by latitude, longitude, and altitude.
- the imaging range is specified based on the angle of view and imaging direction of the imaging device 220 or the imaging device 230, and the position where the unmanned aircraft 100 is present.
- the imaging directions of the imaging device 220 and the imaging device 230 are defined by the azimuth and the depression angle in which the front surface provided with the imaging lenses of the imaging device 220 and the imaging device 230 faces.
- the imaging direction of the imaging device 220 is a direction specified from the heading direction of the unmanned aerial vehicle 100 and the posture state of the imaging device 220 with respect to the gimbal 200.
- the imaging direction of the imaging device 230 is a direction specified from the heading of the unmanned aerial vehicle 100 and the position where the imaging device 230 is provided.
- the UAV control unit 110 controls the flight of the unmanned aircraft 100 by controlling the rotary wing mechanism 210. That is, the UAV control unit 110 controls the position including the latitude, longitude, and altitude of the unmanned aircraft 100 by controlling the rotary wing mechanism 210.
- the UAV control unit 110 may control the imaging ranges of the imaging device 220 and the imaging device 230 by controlling the flight of the unmanned aircraft 100.
- the UAV control unit 110 may control the angle of view of the imaging device 220 by controlling a zoom lens included in the imaging device 220.
- the UAV control unit 110 may control the angle of view of the imaging device 220 by digital zoom using the digital zoom function of the imaging device 220.
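The relation between focal length and angle of view that underlies both the zoom-lens control and the digital zoom mentioned above can be sketched as follows (sensor and focal-length values are illustrative):

```python
import math

def angle_of_view_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal angle of view of a rectilinear lens:
    2 * atan(sensor_width / (2 * f)). Increasing f (optical zoom) narrows
    the angle; digital zoom crops the image, which acts like a smaller
    effective sensor width."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

wide = angle_of_view_deg(36.0, 18.0)   # short focal length: wide angle
tele = angle_of_view_deg(36.0, 200.0)  # long focal length: narrow angle
```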
- the UAV control unit 110 can move the unmanned aircraft 100 to a specific position at a specific date and time so that the imaging device 220 can capture a desired imaging range under a desired environment.
- the UAV control unit 110 may acquire the relative position information of the plurality of unmanned aircraft 100 belonging to the flight group flying in cooperation with each other via the communication interface 150, for example.
- the UAV control unit 110 may set the relative position information by causing the memory 160 to store the relative position information. Therefore, the UAV control unit 110 is an example of a setting unit. By setting the relative position information, flight control can be performed by taking the relative position information into account (for example, maintaining the relative positional relationship).
- the communication interface 150 communicates with the transmitter 50.
- the communication interface 150 receives various commands and information for the UAV control unit 110 from the remote transmitter 50.
- the memory 160 stores programs and the like necessary for controlling the gimbal 200, the rotary wing mechanism 210, the imaging device 220, the imaging device 230, the GPS receiver 240, the inertial measurement device 250, the magnetic compass 260, the barometric altimeter 270, the ultrasonic sensor 280, and the laser measuring device 290.
- the memory 160 may be a computer-readable recording medium, and may include at least one of SRAM (Static Random Access Memory), DRAM (Dynamic Random Access Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), and flash memory such as USB memory.
- the memory 160 may be provided inside the UAV main body 102, or may be provided so as to be removable from the UAV main body 102.
- the gimbal 200 supports the imaging device 220 to be rotatable about at least one axis.
- the gimbal 200 may support the imaging device 220 rotatably about the yaw axis, pitch axis, and roll axis.
- the gimbal 200 may change the imaging direction of the imaging device 220 by rotating the imaging device 220 about at least one of the yaw axis, the pitch axis, and the roll axis.
- the rotary blade mechanism 210 includes a plurality of rotary blades 211, a plurality of drive motors 212 that rotate the rotary blades 211, and a current sensor 213 that measures the current value (actual measurement value) of the drive current for driving the drive motors 212.
- the drive current is supplied to the drive motor 212.
- the imaging device 220 captures a subject within a desired imaging range and generates captured image data.
- Image data obtained by imaging by the imaging device 220 is stored in a memory included in the imaging device 220 or the memory 160.
- the imaging device 230 captures the surroundings of the unmanned aircraft 100 and generates captured image data. Image data of the imaging device 230 is stored in the memory 160.
- the GPS receiver 240 receives a plurality of signals indicating times and positions (coordinates) of each GPS satellite transmitted from a plurality of navigation satellites (that is, GPS satellites).
- the GPS receiver 240 calculates the position of the GPS receiver 240 (that is, the position of the unmanned aircraft 100) based on the plurality of received signals.
- the GPS receiver 240 outputs the position information of the unmanned aircraft 100 to the UAV control unit 110.
- the calculation of the position information of the GPS receiver 240 may be performed by the UAV control unit 110 instead of the GPS receiver 240. In this case, the UAV control unit 110 receives information indicating the time and the position of each GPS satellite included in a plurality of signals received by the GPS receiver 240.
- the inertial measurement device 250 detects the attitude of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
- the inertial measurement device 250 detects the accelerations of the unmanned aircraft 100 in the three axial directions of front-rear, left-right, and up-down, and the angular velocities about the three axes of pitch, roll, and yaw.
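The patent does not describe how these IMU outputs are fused into an attitude estimate; a common technique is a complementary filter that blends the drift-free accelerometer angle with the smooth integrated gyro rate (a sketch, with hypothetical names):

```python
def complementary_pitch(prev_pitch_deg: float, gyro_rate_dps: float,
                        accel_pitch_deg: float, dt_s: float,
                        alpha: float = 0.98) -> float:
    """Blend the integrated gyro rate (smooth but drifting) with the
    accelerometer-derived pitch (noisy but drift-free)."""
    gyro_estimate = prev_pitch_deg + gyro_rate_dps * dt_s
    return alpha * gyro_estimate + (1.0 - alpha) * accel_pitch_deg

# With no rotation and an agreeing accelerometer, the estimate is unchanged.
pitch = complementary_pitch(10.0, 0.0, 10.0, 0.01)
```

The weight `alpha` close to 1 trusts the gyro on short timescales while letting the accelerometer slowly correct long-term drift.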
- the magnetic compass 260 detects the heading of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
- the barometric altimeter 270 detects the altitude at which the unmanned aircraft 100 flies and outputs the detection result to the UAV control unit 110.
- Ultrasonic sensor 280 emits ultrasonic waves, detects ultrasonic waves reflected by the ground and objects, and outputs detection results to UAV control unit 110.
- the detection result may indicate a distance from the unmanned aircraft 100 to the ground, that is, an altitude.
- the detection result may indicate the distance from the unmanned aerial vehicle 100 to the object.
- Laser measuring device 290 irradiates an object with laser light, receives reflected light reflected by the object, and measures the distance between unmanned aircraft 100 and the object using the reflected light.
- the distance measurement method using laser light may be a time-of-flight method.
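Both the ultrasonic sensor 280 and the laser measuring device 290 can use the same round-trip relation; a sketch (the speeds are standard physical constants, not values from this patent):

```python
SPEED_OF_LIGHT_MPS = 299_792_458.0  # for the laser measuring device
SPEED_OF_SOUND_MPS = 343.0          # for the ultrasonic sensor, ~20 C air

def tof_distance_m(round_trip_s: float, speed_mps: float) -> float:
    """Time-of-flight ranging: the pulse travels to the object and back,
    so the one-way distance is speed * t / 2."""
    return speed_mps * round_trip_s / 2.0

sonic_m = tof_distance_m(0.01, SPEED_OF_SOUND_MPS)    # 10 ms ultrasonic echo
laser_m = tof_distance_m(667e-9, SPEED_OF_LIGHT_MPS)  # ~667 ns laser pulse
```

The same function serves both sensors; only the propagation speed and the achievable timing resolution differ.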
- FIG. 5 is a perspective view showing an example of the appearance of the portable terminal 80 to which the transmitter 50 is attached.
- a smartphone 80S is shown as an example of the mobile terminal 80.
- the up / down / front / rear / left / right directions with respect to the transmitter 50 are assumed to follow the directions of arrows shown in FIG.
- the transmitter 50 is used in a state of being held by both hands of a person using the transmitter 50 (hereinafter referred to as “operator”), for example.
- the transmitter 50 includes, for example, a resin casing 50B having a substantially rectangular parallelepiped shape (in other words, a substantially box shape) having a substantially square bottom surface and a height shorter than one side of the bottom surface.
- a left control rod 53L and a right control rod 53R are provided in a projecting manner at approximately the center of the housing surface of the transmitter 50.
- the left control rod 53L and the right control rod 53R are used in operations for remotely controlling the movement of the unmanned aircraft 100 by the operator (for example, moving the unmanned aircraft 100 back and forth, moving left and right, moving up and down, and changing the direction).
- in FIG. 5, the left control rod 53L and the right control rod 53R are shown at their positions in the initial state, where no external force is applied from the operator's hands.
- the left control rod 53L and the right control rod 53R automatically return to predetermined positions (for example, the initial position shown in FIG. 5) after the external force applied by the operator is released.
- the power button B1 of the transmitter 50 is disposed on the front side (in other words, the operator side) of the left control rod 53L.
- when the power button B1 is pressed once by the operator, for example, the remaining capacity of the battery (not shown) built into the transmitter 50 is displayed on the remaining battery capacity display portion L2.
- when the power button B1 is pressed again by the operator, for example, the power of the transmitter 50 is turned on, and power is supplied to each part of the transmitter 50 (see FIG. 7) so that it can be used.
- RTH (Return To Home) button B2 is arranged on the front side (in other words, the operator side) of the right control rod 53R.
- when the RTH button B2 is pressed, the transmitter 50 transmits a signal for automatically returning the unmanned aircraft 100 to a predetermined position.
- the transmitter 50 can thereby automatically return the unmanned aircraft 100 to a predetermined position (for example, a takeoff position stored in the unmanned aircraft 100).
- the RTH button B2 can be used, for example, when the operator loses sight of the fuselage of the unmanned aircraft 100 during outdoor aerial shooting with the unmanned aircraft 100, or when operation becomes impossible due to radio interference or unexpected trouble.
- the remote status display part L1 and the remaining battery capacity display part L2 are arranged on the front side (in other words, the operator side) of the power button B1 and the RTH button B2.
- the remote status display unit L1 is configured using, for example, an LED (Light Emitting Diode), and displays the wireless connection state between the transmitter 50 and the unmanned aircraft 100.
- the battery remaining amount display unit L2 is configured using, for example, an LED, and displays the remaining amount of the capacity of a battery (not shown) built in the transmitter 50.
- Two antennas AN1 and AN2 project from the rear side of the housing 50B of the transmitter 50 and rearward from the left control rod 53L and the right control rod 53R.
- the antennas AN1 and AN2 transmit to the unmanned aircraft 100 the signals generated by the transmitter control unit 61 based on the operator's operation of the left control rod 53L and the right control rod 53R (that is, signals for controlling the movement of the unmanned aircraft 100). These signals are among the operation input signals input by the transmitter 50.
- the antennas AN1 and AN2 can cover a transmission / reception range of 2 km, for example.
- the antennas AN1 and AN2 can also receive images captured by the imaging devices 220 and 230 of the unmanned aircraft 100 wirelessly connected to the transmitter 50, and various data acquired by the unmanned aircraft 100, when these images or data are transmitted from the unmanned aircraft 100.
- the transmitter 50 does not include a display unit, but may include a display unit.
- the portable terminal 80 may be mounted on the holder HLD.
- the holder HLD may be bonded and attached to the transmitter 50. Thereby, the portable terminal 80 is attached to the transmitter 50 via the holder HLD.
- the portable terminal 80 and the transmitter 50 may be connected via a wired cable (for example, a USB cable).
- the portable terminal 80 may not be attached to the transmitter 50, and the portable terminal 80 and the transmitter 50 may be provided independently.
- FIG. 6 is a perspective view showing an example of the external appearance of the transmitter 50 and the portable terminal 80.
- a tablet 80T is shown as an example of the mobile terminal 80.
- FIG. 7 is a block diagram illustrating an example of a hardware configuration of the transmitter 50.
- the transmitter 50 includes a left control rod 53L, a right control rod 53R, a transmitter control unit 61, a wireless communication unit 63, an interface unit 65, a power button B1, an RTH button B2, an operation unit set OPS, a remote status display unit L1, a remaining battery level display unit L2, and a display unit DP.
- the transmitter 50 is an example of an operating device that instructs control of the unmanned aircraft 100.
- the left control rod 53L is used for an operation for remotely controlling the movement of the unmanned aircraft 100 by, for example, the left hand of the operator.
- the right control rod 53R is used for an operation for remotely controlling the movement of the unmanned aircraft 100 by, for example, the operator's right hand.
- operations using the left control rod 53L and the right control rod 53R include moving the unmanned aircraft 100 forward, backward, left, right, up, and down, rotating it left or right, and combinations of these.
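A simple illustrative mapping from control-rod deflection to a movement command is sketched below (not the actual signaling used by the transmitter 50; the names and deadband value are assumptions):

```python
def stick_to_velocity_mps(deflection: float, max_speed_mps: float,
                          deadband: float = 0.05) -> float:
    """Map a normalized control-rod deflection in [-1, 1] to a velocity
    command. The deadband keeps the aircraft hovering once the rod has
    returned near its initial (neutral) position."""
    if abs(deflection) < deadband:
        return 0.0
    clamped = max(-1.0, min(1.0, deflection))
    return clamped * max_speed_mps

forward_v = stick_to_velocity_mps(0.5, 10.0)   # half deflection forward
neutral_v = stick_to_velocity_mps(0.01, 10.0)  # within deadband: hover
```

One such channel per axis (forward/backward, left/right, up/down, rotation) covers the stick operations listed above; combined deflections simply drive several channels at once.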
- the transmitter control unit 61 displays the remaining capacity of the battery (not shown) built in the transmitter 50 on the remaining battery amount display unit L2. Thus, the operator can easily check the remaining capacity of the battery capacity built in the transmitter 50.
- when the power button B1 is pressed twice, a signal indicating that the power button B1 has been pressed twice is passed to the transmitter control unit 61.
- the transmitter control unit 61 instructs a battery (not shown) built in the transmitter 50 to supply power to each unit in the transmitter 50. As a result, the operator turns on the power of the transmitter 50 and can easily start using the transmitter 50.
- when the RTH button B2 is pressed, a signal indicating that the RTH button B2 has been pressed is input to the transmitter control unit 61.
- the transmitter control unit 61 generates a signal for automatically returning the unmanned aircraft 100 to a predetermined position (for example, the takeoff position of the unmanned aircraft 100), via the wireless communication unit 63 and the antennas AN1 and AN2. Transmit to unmanned aerial vehicle 100.
- the operator can automatically return (return) the unmanned aircraft 100 to a predetermined position by a simple operation on the transmitter 50.
- the operation unit set OPS is configured using a plurality of operation units OP (for example, operation units OP1,..., Operation unit OPn) (n: an integer of 2 or more).
- the operation unit set OPS includes operation units other than the left control rod 53L, the right control rod 53R, the power button B1, and the RTH button B2 shown in FIG. 5 (for example, various operation units that support remote control of the unmanned aircraft 100 by the transmitter 50).
- the various operation units referred to here include, for example, a button for instructing the capture of a still image using the imaging device 220 of the unmanned aircraft 100, and a button for instructing the start and end of video recording using the imaging device 220 of the unmanned aircraft 100.
- the remote status display unit L1 and the remaining battery level display unit L2 have been described with reference to FIG.
- the transmitter controller 61 is configured using a processor (for example, CPU, MPU or DSP).
- the transmitter control unit 61 performs signal processing for overall control of operations of the respective units of the transmitter 50, data input / output processing with other units, data calculation processing, and data storage processing.
- the transmitter control unit 61 may acquire data of captured images taken by the imaging device 220 of the unmanned aircraft 100 via the wireless communication unit 63, store it in a memory (not shown), and output it to the portable terminal 80. In other words, the transmitter control unit 61 may cause the portable terminal 80 to display data of aerial images captured by the imaging device 220 of the unmanned aircraft 100. Thereby, an aerial image captured by the imaging device 220 of the unmanned aerial vehicle 100 can be displayed on the portable terminal 80.
- the transmitter control unit 61 may generate an instruction signal for controlling the flight of the unmanned aerial vehicle 100 specified by the operation of the left control rod 53L and the right control rod 53R of the operator.
- the transmitter control unit 61 may remotely control the unmanned aircraft 100 by transmitting this instruction signal to the unmanned aircraft 100 via the wireless communication unit 63 and the antennas AN1 and AN2. Thereby, the transmitter 50 can control the movement of the unmanned aircraft 100 remotely.
- the transmitter control unit 61 may generate an operation input signal based on an operation on any button or operation unit included in the transmitter 50, and may transmit the operation input signal to the unmanned aircraft 100 via the wireless communication unit 63. In this case, the unmanned aircraft 100 can recognize that it is under the control of the operator of the transmitter 50 by receiving the operation input signal from the transmitter 50.
- the wireless communication unit 63 is connected to two antennas AN1 and AN2.
- the wireless communication unit 63 transmits / receives information and data to / from the unmanned aircraft 100 via the two antennas AN1 and AN2 using a predetermined wireless communication method (for example, WiFi (registered trademark)).
- the interface unit 65 inputs and outputs information and data between the transmitter 50 and the portable terminal 80.
- the interface unit 65 may be a USB port (not shown) provided in the transmitter 50, for example.
- the interface unit 65 may be an interface other than the USB port.
- FIG. 8 is a block diagram illustrating an example of a hardware configuration of the mobile terminal 80.
- the portable terminal 80 may include a processor 81, an interface unit 82, an operation unit 83, a wireless communication unit 85, a memory 87, and a display 88.
- the portable terminal 80 is an example of a position processing device.
- the processor 81 is configured using, for example, a CPU, MPU, or DSP.
- the processor 81 performs signal processing for overall control of operations of each unit of the mobile terminal 80, data input / output processing with other units, data calculation processing, and data storage processing.
- the processor 81 may acquire data and information from the unmanned aerial vehicle 100 via the wireless communication unit 85.
- the processor 81 may acquire data and information from the transmitter 50 via the interface unit 82.
- the processor 81 may acquire data and information input via the operation unit 83.
- the processor 81 may acquire data and information held in the memory 87.
- the processor 81 may send data and information to the display 88 and cause the display 88 to display display information based on the data and information.
- the processor 81 executes an application for instructing control of the unmanned aircraft 100.
- the application may include a relative position processing application for processing relative position information for flying a plurality of unmanned aircraft 100 in a coordinated manner.
- the processor 81 may generate various data used in the application.
- the interface unit 82 inputs and outputs information and data between the transmitter 50 and the portable terminal 80.
- the interface unit 82 may be a USB connector (not shown) provided in the mobile terminal 80, for example.
- the interface unit 82 may be an interface other than the USB connector.
- the operation unit 83 receives data and information input by the operator of the mobile terminal 80.
- the operation unit 83 may include buttons, keys, a touch panel, a microphone, and the like.
- the operation unit 83 and the display 88 are mainly configured by a touch panel.
- the operation unit 83 can accept a touch operation, a tap operation, a drag operation, and the like.
- the wireless communication unit 85 communicates with the unmanned aircraft 100 by various wireless communication methods.
- the wireless communication methods of the wireless communication unit 85 may include communication via a wireless LAN, Bluetooth (registered trademark), short-range wireless communication, or a public wireless line.
- the wireless communication unit 85 is an example of an output unit.
- the memory 87 may include, for example, a ROM that stores a program defining the operation of the mobile terminal 80 and data of setting values, and a RAM that temporarily stores various information and data used during processing by the processor 81.
- the memory 87 may include memories other than ROM and RAM.
- the memory 87 may be provided inside the mobile terminal 80.
- the memory 87 may be provided so as to be removable from the portable terminal 80.
- the program may include an application program.
- the display 88 is configured using, for example, an LCD (Liquid Crystal Display), and displays various information and data output from the processor 81.
- the display 88 may display aerial image data captured by the imaging device 220 of the unmanned aerial vehicle 100.
- the display 88 may display a relative position processing screen used in the relative position processing application.
- FIG. 9 is a block diagram illustrating an example of a functional configuration of the mobile terminal 80.
- the processor 81 has functions of a UAV designation unit 811, a position information acquisition unit 812, a relative position processing unit 813, and an imaging information processing unit 814 by executing a program held in the memory 87.
- the UAV designation unit 811 is an example of a selection unit.
- the position information acquisition unit 812 is an example of an acquisition unit.
- the relative position processing unit 813 is an example of a determination unit.
- the UAV designation unit 811 designates (selects) a plurality (for example, two) of unmanned aircraft 100 forming one flight group from a plurality (for example, three) of unmanned aircraft 100. That is, the UAV designation unit 811 designates a plurality of unmanned aircraft 100 and forms one or more flight groups.
- the UAV designation unit 811 may designate the unmanned aircraft 100 based on the designation information input to the operation unit 83.
- the designation information input to the operation unit 83 may be touch information on the touch panel, or input of identification information for identifying the unmanned aircraft 100 (for example, key input, button input, voice input).
- the UAV designation unit 811 may designate the unmanned aircraft 100 forming the flight group through the operation unit 83 among the plurality of unmanned aircraft 100 displayed on various processing screens (for example, relative position processing screen).
- the positions of the plurality of unmanned aircraft 100 displayed on the processing screen may be determined based on the position information of each unmanned aircraft 100 acquired by the position information acquisition unit 812.
- the position information acquisition unit 812 acquires position information (for example, information on the current position) of the unmanned aircraft 100.
- the position information acquisition unit 812 may acquire the position information of the unmanned aircraft 100 via the wireless communication unit 85, for example.
- the position information acquisition unit 812 may acquire position information of the unmanned aircraft 100 via the transmitter 50 and the interface unit 82, for example.
- the position information of the unmanned aircraft 100 may be information on the absolute position of the unmanned aircraft 100.
- the position information of the unmanned aerial vehicle 100 may include the position information received by the GPS receiver 240 of the unmanned aircraft 100.
- the position information of the unmanned aircraft 100 may include position information obtained by referring to the three-dimensional map database.
- the position information of the unmanned aircraft 100 may include altitude information obtained by the barometric altimeter 270, the ultrasonic sensor 280, or the laser measuring device 290.
- the absolute position information is position information (for example, latitude, longitude, altitude information) defined by the position of an object such as one unmanned aircraft 100.
- the relative position information may be position information defined by the positional relationship between objects such as a plurality of unmanned aircraft (for example, information on distance and direction with respect to some reference position).
- the relative position processing unit 813 determines relative position information of a plurality of unmanned aircraft 100 included in the same flight group. It can be said that the relative position information of the plurality of unmanned aircraft 100 is information on the relative positional relationship of each of the plurality of unmanned aircraft 100.
- the relative position processing unit 813 may determine the relative positional relationship to be maintained during the flight of the plurality of unmanned aircraft 100 and during the flight operation by the transmitter 50.
- the relative position processing unit 813 may take one specific unmanned aircraft 100 among the plurality of unmanned aircraft 100 included in the same flight group as a reference, and may determine information on the positions of the other unmanned aircraft 100 relative to that reference aircraft as the relative position information.
- the relative position processing unit 813 may determine a reference position in the flight group based on the positions (absolute positions) of the plurality of unmanned aircraft 100 included in the same flight group.
- the relative position processing unit 813 may determine, as relative position information, information on the position of each of the plurality of unmanned aircraft 100 relative to that reference position.
- the relative position processing unit 813 may change the relative position information in response to a drag operation, performed via the operation unit 83, on the position of the unmanned aircraft 100 for which the relative position is to be set, among the plurality of unmanned aircraft 100 displayed on various processing screens (for example, the relative position processing screen). That is, the relative position processing unit 813 may adjust the relative position information by a drag operation.
- the relative position processing unit 813 may acquire the value of the distance between the plurality of unmanned aircraft 100 via the operation unit 83 and may determine relative position information based on this distance.
- the plurality of unmanned aircraft 100 included in the same flight group may fly in cooperation with each other.
- the flight group may be formed by the mobile terminal 80.
- the relative position information may be determined by the portable terminal 80.
- FIG. 10 is a diagram showing an example of designation of unmanned aerial vehicles 100 belonging to the same flight group on the positional relationship processing screen SG.
- the positional relationship processing screen SG may be displayed on at least a part of the display 88. The same applies to the subsequent positional relationship processing screen SG.
- the UAV images G11 to G13 are displayed on the positional relationship processing screen SG of FIG. 10.
- UAV images G11, G12, and G13 are displayed corresponding to the absolute position of each unmanned aircraft 100, and indicate the positions of three unmanned aircraft 100G11, 100G12, and 100G13 (all not shown).
- the area indicated by the positional relationship processing screen SG corresponds to the area where the unmanned aircraft 100 is placed in the real space, and may be shown at a predetermined scale with respect to the real space area.
- the display position of the UAV images G11 to G13 on the positional relationship processing screen SG may be a position corresponding to the absolute position acquired by the position information acquisition unit 812.
- a UAV image is an example of a flying object image.
- the UAV image G11 may be displayed at the center of the positional relationship processing screen SG so that it is easily visible. Note that a different UAV image may instead be displayed at the center.
- the operation unit 83 receives a touch operation on the UAV images G11 and G12.
- the relative position processing unit 813 acquires information of this touch operation on the operation unit 83, and designates the unmanned aircraft 100G11 and 100G12 corresponding to the UAV images G11 and G12 as the unmanned aircraft 100 for forming a flight group.
- since the relative position processing unit 813 does not acquire information on a touch operation on the UAV image G13 via the operation unit 83, the unmanned aircraft 100G13 corresponding to the UAV image G13 is not designated as an unmanned aircraft 100 for forming the flight group.
- FIG. 11 is a diagram illustrating an example of adjusting the horizontal position of the unmanned aircraft 100 on the positional relationship processing screen SG. FIG. 11 may show the positional relationship of the plurality of selected unmanned aircraft 100 as viewed from above.
- the UAV images G11 and G12 selected on the positional relationship processing screen SG of FIG. 10 are displayed, and the unselected UAV image G13 is not displayed.
- the operation unit 83 may accept a drag operation on the UAV image G12.
- the relative position processing unit 813 acquires information of the drag operation on the operation unit 83, and changes the display position of the UAV image G12 according to the drag operation.
- the distance of the unmanned aircraft 100G12 with respect to the unmanned aircraft 100G11 is a distance L1 (for example, 50 cm) along the x direction, and is a distance 0 along the y direction.
- the relative position processing unit 813 adjusts the position of the UAV image G12 on the positional relationship processing screen SG via the operation unit 83.
- a drag operation is performed from the UAV image G12a to the position of the UAV image G12b.
- the display 88 may display relative position information (for example, distance information) in response to the drag operation. Thereby, the operator of the portable terminal 80 can specifically understand the distance changed by the drag operation.
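The distance display described above implies a conversion from an on-screen drag displacement to a real-space distance. A minimal sketch, assuming the predetermined scale between the positional relationship processing screen and real space mentioned earlier; the function name and scale value are illustrative, not taken from this disclosure:

```python
# Illustrative sketch only: convert an on-screen drag displacement into a
# real-space offset, assuming a fixed scale (meters per pixel) between the
# positional relationship processing screen and real space.

def drag_to_distance(dx_px, dy_px, meters_per_pixel):
    """Convert a drag displacement in pixels to a real-space offset in meters."""
    return (dx_px * meters_per_pixel, dy_px * meters_per_pixel)

# Example: dragging the UAV image G12 by 100 px to the right at an assumed
# 0.005 m/px corresponds to a horizontal distance L1 of 0.5 m (50 cm).
dx_m, dy_m = drag_to_distance(100, 0, 0.005)
```

The resulting distance would then be shown next to the dragged UAV image, as the text describes.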
- an arrow ar drawn in each UAV image G12 indicates the direction of the imaging device 220 or 230, that is, the imaging direction. This is the same in the following.
- the imaging device 220 may function as a main camera.
- the imaging device 230 may function as a sub camera.
- FIG. 12 is a diagram illustrating an example of adjusting the position in the height direction (gravity direction) of the unmanned aircraft 100 on the positional relationship processing screen SG. FIG. 12 may show the positional relationship of the plurality of selected unmanned aircraft 100 as viewed from the horizontal direction.
- the UAV images G11 and G12 selected on the positional relationship processing screen SG1 of FIG. 10 are displayed, and the unselected UAV image G13 is not displayed.
- the operation unit 83 may accept a drag operation on the UAV image G12.
- the relative position processing unit 813 acquires information of the drag operation on the operation unit 83, and changes the display position of the UAV image G12 according to the drag operation.
- the distance of the unmanned aircraft 100G12 with respect to the unmanned aircraft 100G11 is the distance L1 along the x direction and the distance L2 (for example, 80 cm) along the z direction.
- the relative position processing unit 813 adjusts the position of the UAV image G12 on the positional relationship processing screen SG3 via the operation unit 83.
- a drag operation may be performed from the UAV image G12a to the position of the UAV image G12b.
- the positional relationship processing screens SG1, SG2, and SG3 are displayed in this order, and the relative position information may be determined and set in the order of the horizontal direction and the height direction.
- the positional relationship processing screens SG1, SG3, and SG2 are displayed in this order, and relative position information may be determined and set in the order of the height direction and the horizontal direction.
- the operator of the portable terminal 80 can easily adjust the positions of the UAV images G11 and G12 while checking, on the display 88, the display positions of the UAV images G11 and G12 corresponding to the actual unmanned aircraft 100G11 and 100G12.
- the portable terminal 80 can determine the relative positional relationship between the UAV image G11 and the UAV image G12 by this simple operation. Further, the operator of the mobile terminal 80 can recognize the position adjustment in the three-dimensional space (for example, the horizontal direction and the height direction) and adjust the relative positional relationship. Further, the operator of the portable terminal 80 can determine the distance between the unmanned aircraft 100G11 and 100G12 by an intuitive operation (for example, a drag operation) on the displayed screen.
- the operation unit 83 may input a specific distance value between the unmanned aircraft 100G11 and 100G12.
- the relative position processing unit 813 may determine this distance information (for example, 50 cm in the horizontal direction and 80 cm in the height direction) as relative position information. Thereby, the operator of the portable terminal 80 can determine relative position information without using the display 88.
- by determining relative position information in the three-dimensional space including the horizontal direction and the height direction, the portable terminal 80 can define the relative positional relationship of the plurality of unmanned aircraft 100 in the three-dimensional space that is their flight range. Therefore, the portable terminal 80 can determine relative position information that matches the actual flight of the unmanned aircraft 100.
- the determination is not limited to relative position information in the three-dimensional space; relative position information in a two-dimensional space may be determined instead. In this case, the mobile terminal 80 may determine at which position on the plane each unmanned aircraft 100 is arranged.
- FIG. 13 is a schematic diagram illustrating an example of a horizontal reference position of a plurality of unmanned aircraft 100 forming a flight group.
- the flight group includes two unmanned aerial vehicles 100G11 and 100G12 corresponding to the UAV images G11 and G12.
- a reference position for determining a relative positional relationship in the flight group is indicated by a reference position RP.
- the horizontal reference position RP may be a horizontal intermediate position, center position, center of gravity position, or other reference position of the plurality of unmanned aircraft 100G11, 100G12 included in the same flight group.
- FIG. 13 shows the horizontal center positions of the unmanned aerial vehicles 100G11 and 100G12 as an example of the reference position RP.
- the relative position information may include information on the horizontal positions of the unmanned aircraft 100G11 and 100G12 with respect to the reference position RP.
- the relative position information of the unmanned aircraft 100G11 with respect to the reference position RP may include information that the distance is (1/2) ⁇ L1 in the ⁇ x direction.
- the relative position information of the unmanned aircraft 100G12 with respect to the reference position RP may include information that the distance is (1/2) ⁇ L1 in the + x direction.
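The horizontal reference position and the resulting relative position information can be sketched as follows, assuming the center position is chosen as the reference position RP (one of the options named above); all function names are illustrative:

```python
# Illustrative sketch: compute a flight group's reference position RP and
# each aircraft's relative position information, assuming the horizontal
# center position is used as RP.

def reference_position(positions):
    """Center position of the group: midpoint of min/max along each axis."""
    xs = [p[0] for p in positions]
    ys = [p[1] for p in positions]
    return ((min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2)

def relative_positions(positions):
    """Offsets of each aircraft from the reference position RP."""
    rx, ry = reference_position(positions)
    return [(x - rx, y - ry) for x, y in positions]

# Two aircraft separated by L1 = 50 cm along x, as in FIG. 13: the offsets
# are (1/2) * L1 in the -x direction and (1/2) * L1 in the +x direction.
L1 = 0.5
offsets = relative_positions([(0.0, 0.0), (L1, 0.0)])
```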
- FIG. 14 is a schematic diagram illustrating an example of a reference position in the height direction of a plurality of unmanned aircraft 100 forming a flight group.
- the flight group includes two unmanned aerial vehicles 100G11 and 100G12 corresponding to the UAV images G11 and G12.
- a reference position for determining a relative positional relationship in the flight group is indicated by a reference position RP.
- the reference position RP in the height direction may be an intermediate position in the height direction, a center position, a center of gravity position, or another reference position of the plurality of unmanned aircraft 100G11 and 100G12 included in the same flight group.
- FIG. 14 shows the center positions in the height direction of the unmanned aerial vehicles 100G11 and 100G12 as an example of the reference position RP.
- the relative position information may include information on the position in the height direction of each of the unmanned aircraft 100G11 and 100G12 with respect to the reference position RP.
- the relative position information of the unmanned aircraft 100G11 with respect to the reference position RP may include information that the distance is (1/2) ⁇ L2 in the ⁇ z direction.
- the relative position information of the unmanned aircraft 100G12 with respect to the reference position RP may include information that the distance is (1/2) ⁇ L2 in the + z direction.
- the relative position processing unit 813 may determine the relative position information of each unmanned aircraft 100 with respect to the reference position RP.
- the portable terminal 80 can easily generate relative position information based on the difference between the position of each unmanned aircraft 100 and the reference position RP.
- the mobile terminal 80 can realize a flight form based on the reference position of the flight group. Therefore, the portable terminal 80 can provide a flight method in which the flight mode of a single unmanned aircraft 100 is simply extended to a plurality of unmanned aircraft 100, which makes operation easier for the operator of the transmitter 50.
- the relative position processing unit 813 may include identification information of the unmanned aircraft 100G11 and 100G12 in the relative position information of the plurality of unmanned aircraft 100G11 and 100G12 together with information on at least one distance in the horizontal direction and the height direction.
- the relative position information may include identification information of unmanned aircraft 100G11 and 100G12 and distance information in association with each other.
- the identification information of the unmanned aircraft 100 may be, for example, an individual identification number given at the time of manufacture, a user identification number set for the operator, or other identification information.
- the relative position information may include the identification information of the unmanned aircraft 100G11 and 100G12 and the relative position information of the unmanned aircraft 100G11 and 100G12 in association with each other.
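One possible structure that associates the identification information of the unmanned aircraft with their relative position information, as described above, might look like the following sketch; the field names are assumptions for illustration only, not taken from this disclosure:

```python
# Illustrative sketch only: a record that associates aircraft identification
# information with per-aircraft offsets from the reference position RP.
# All field names are hypothetical.
from dataclasses import dataclass

@dataclass
class RelativePosition:
    uav_id: str  # individual identification number or user identification number
    dx: float    # offset from RP along the x direction (m)
    dy: float    # offset from RP along the y direction (m)
    dz: float    # offset from RP in the height direction (m)

L1, L2 = 0.5, 0.8  # example distances (50 cm, 80 cm) as in FIGS. 11 and 12
group = [
    RelativePosition("100G11", -L1 / 2, 0.0, -L2 / 2),
    RelativePosition("100G12", +L1 / 2, 0.0, +L2 / 2),
]
```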
- the flight system 10 can prescribe at which position each unmanned aircraft 100 should fly with respect to the reference position RP or the like during the flight of the unmanned aircraft 100 or the flight operation by the transmitter 50.
- FIG. 15 is a schematic diagram showing a first arrangement determination example of a plurality of unmanned aircraft 100 forming a flight group.
- the UAV images G11, G12, G13, and G14, that is, the unmanned aircraft 100G11, 100G12, 100G13, and 100G14, are arranged symmetrically around the reference position RP.
- each of the unmanned aerial vehicles 100G11, 100G12, 100G13, and 100G14 is arranged at a vertex of a square having a side length of L3.
- the UAV image G11 corresponding to the unmanned aircraft 100G11 is at the center position of the xy coordinates.
- Unmanned aerial vehicle 100G11 may be unmanned aerial vehicle 100 in which flight control is instructed by transmitter 50 to which portable terminal 80 that processes relative position information is attached.
- the + y direction may be the traveling direction when the flight group moves forward.
- FIG. 16 is a schematic diagram showing a second arrangement determination example of a plurality of unmanned aircraft 100 forming a flight group.
- the second arrangement determination example is substantially the same as the first arrangement determination example, but in FIG. 16 the reference position RP is at the center position of the xy coordinates.
- information indicating that the position of the unmanned aircraft 100G11 with respect to the reference position RP is at a distance of (1/√2) × L3 in a direction inclined by 45° from the +y direction toward the −x direction may be included in the relative position information.
- the respective positions of the unmanned aircraft 100G12, 100G13, 100G14 with respect to the reference position RP may also be included in the relative position information.
- the + y direction may be the traveling direction when the flight group moves forward.
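The vertex distance stated above can be checked with a short sketch: for four aircraft at the vertices of a square of side L3 centered on the reference position RP, each vertex lies at a distance of (1/√2) × L3 from RP. The vertex ordering is illustrative only:

```python
# Sketch under stated assumptions: vertex coordinates of a square of side
# L3 centered on the reference position RP (second arrangement example).
import math

def square_vertices(l3):
    """Vertex coordinates (x, y) of a square of side l3 centered on RP."""
    h = l3 / 2
    # illustrative ordering: -x/+y, +x/+y, +x/-y, -x/-y
    return [(-h, h), (h, h), (h, -h), (-h, -h)]

L3 = 1.0
distance_to_rp = math.hypot(*square_vertices(L3)[0])  # equals L3 / sqrt(2)
```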
- FIG. 17 is a schematic diagram illustrating a third arrangement determination example of a plurality of unmanned aircraft 100 forming a flight group.
- in the third arrangement determination example, the orientation of the imaging device 220 or 230 included in each unmanned aircraft 100 is different, so the imaging direction captured by each imaging device 220 or 230 is different.
- the relative position processing unit 813 determines relative position information of the plurality of unmanned aircraft 100.
- the imaging information processing unit 814 determines information on the imaging direction of each of the plurality of unmanned aircraft 100.
- the imaging information processing unit 814 may determine information on the imaging direction of each unmanned aircraft 100 based on the number of unmanned aircraft 100 forming the flight group.
- the imaging information processing unit 814 may store and acquire the number of unmanned aircraft 100 designated as the same flight group in the memory 87.
- when the number of unmanned aircraft 100 forming the flight group is four, the imaging information processing unit 814 may calculate and determine information on imaging directions that differ by 90 degrees, obtained by equally dividing 360 degrees (one full revolution) into four. In addition, the imaging information processing unit 814 may determine imaging direction information in which the imaging direction of each unmanned aircraft 100 is the direction in which that unmanned aircraft 100 is viewed from the reference position RP.
- for example, imaging direction information may be determined such that the unmanned aircraft 100G11 has the upward direction (for example, the traveling direction when the flight group moves forward) as its imaging direction, the unmanned aircraft 100G12 has the right direction, the unmanned aircraft 100G13 has the downward direction, and the unmanned aircraft 100G14 has the left direction.
- the imaging information processing unit 814 may determine information on the angle of view of each imaging device 220 or 230 included in each unmanned aircraft 100.
- the imaging range is determined according to the angle of view.
- the imaging information processing unit 814 may determine information on the angle of view of each imaging device 220 or 230 included in each unmanned aircraft 100 based on the number of unmanned aircraft 100 forming the flight group. When the number of unmanned aircraft 100 forming the flight group is four, the imaging information processing unit 814 may calculate and determine the angle of view information as 90 degrees or more, obtained by equally dividing 360 degrees (one full revolution) into four.
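The equal division of 360 degrees into per-aircraft imaging directions and a minimum angle of view, as described for this arrangement example, can be sketched as follows; the function names are illustrative:

```python
# Hedged sketch of the equal-division calculation: assign each of n aircraft
# an imaging direction 360/n degrees apart, and choose an angle of view of
# at least 360/n degrees so the combined views cover the full surroundings.

def imaging_directions(n, start=0.0):
    """Imaging directions in degrees, measured clockwise from `start`."""
    step = 360.0 / n
    return [(start + i * step) % 360.0 for i in range(n)]

def minimum_angle_of_view(n):
    """Smallest per-aircraft angle of view that still covers 360 degrees."""
    return 360.0 / n

# Four aircraft (third example): directions 90 degrees apart, view >= 90 deg.
# Three aircraft (fourth example): directions 120 degrees apart, view >= 120 deg.
```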
- thereby, when each unmanned aircraft 100 captures an image, a 360-degree captured image around the flight group is obtained, and an image processing apparatus can acquire these captured images and perform predetermined image processing to obtain a panoramic image or a stereoscopic image with the surroundings of the flight group as a subject.
- FIG. 18 is a schematic diagram illustrating a fourth arrangement determination example of a plurality of unmanned aircraft 100 forming a flight group.
- the fourth arrangement determination example is substantially the same as the third arrangement determination example, but assumes that the number of unmanned aircraft 100 forming the flight group is three.
- the unmanned aerial vehicles 100G11, 100G12, and 100G13 corresponding to the UAV images G11, G12, and G13 are each arranged at a vertex of an equilateral triangle. Therefore, the imaging information processing unit 814 may calculate and determine information on imaging directions that differ by 120 degrees, obtained by equally dividing 360 degrees (one full revolution) into three.
- for example, imaging direction information may be determined such that the unmanned aircraft 100G11 has a reference direction (for example, the traveling direction) as its imaging direction, the unmanned aircraft 100G12 has as its imaging direction the direction obtained by rotating the imaging direction of the unmanned aircraft 100G11 clockwise by 120 degrees, and the unmanned aircraft 100G13 has as its imaging direction the direction obtained by rotating the imaging direction of the unmanned aircraft 100G12 clockwise by 120 degrees.
- the imaging information processing unit 814 may calculate and determine the angle of view information as 120 degrees or more, obtained by equally dividing 360 degrees (one full revolution) into three. Thereby, when each unmanned aerial vehicle 100 captures an image, a 360-degree captured image around the flight group is obtained. Therefore, an image processing apparatus (for example, the portable terminal 80) can acquire these captured images and perform predetermined image processing to obtain a panoramic image or a stereoscopic image with the surroundings of the flight group as a subject.
- FIG. 19 is a schematic diagram illustrating a fifth arrangement determination example of a plurality of unmanned aircraft 100 forming a flight group.
- a plurality of unmanned aircraft 100G11 to 100G14 may be arranged asymmetrically with respect to the reference position RP, and relative position information may be determined accordingly. Further, the imaging direction information may be determined so that the imaging directions of the unmanned aerial vehicles 100G11 to 100G14 are nonuniform and irregular.
- the imaging information processing unit 814 may acquire imaging parameters, including the imaging direction and angle of view of each unmanned aircraft 100, as input information via the operation unit 83, instead of determining the imaging parameters based on the number of unmanned aircraft 100. Thereby, even when it is difficult to determine the imaging direction and the angle of view uniformly by calculation or the like, such as when there is no symmetry, the mobile terminal 80 can individually determine the imaging parameters for each of the plurality of unmanned aircraft 100.
- in this way, not only parameters such as relative position information and imaging direction having symmetry, uniformity, or regularity, but also parameters having asymmetry, nonuniformity, or irregularity may be determined and set.
- parameters such as relative position information and imaging direction may be set and held in each of the unmanned aircraft 100G11 to 100G14 before flight. Thereby, each of the unmanned aircraft 100G11 to 100G14 forming the same flight group can fly in a coordinated manner while maintaining the relative positions indicated by the set relative position information and the imaging direction and angle of view indicated by the imaging parameters.
- FIG. 20 is a flowchart illustrating an operation example of the flight system 10.
- the UAV designation unit 811 designates a plurality of unmanned aircraft 100 that form the same flight group from among the plurality of unmanned aircraft 100 (S11).
- the relative position processing unit 813 determines relative position information of the plurality of unmanned aircraft 100 belonging to the same flight group (S12).
- the imaging information processing unit 814 may determine imaging parameters (for example, imaging direction and angle of view information) by each of the plurality of unmanned aircraft 100 (S13).
- the wireless communication unit 85 or the interface unit 82 may transmit determination information (for example, relative position information and imaging parameters) to each of the plurality of unmanned aircraft 100 (S14).
- the communication interface 150 receives the determination information from the portable terminal 80 (S15).
- the UAV control unit 110 sets the determination information by holding the received determination information in the memory 160 (S16). Thereby, the relative position information and the imaging parameters are set.
- the relative position information may be set in each unmanned aerial vehicle 100 (stored in the memory 160) before the plurality of unmanned aircraft 100 fly in cooperation.
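The flow of steps S11 to S16 above can be sketched end to end as follows; every function and data structure here is a hypothetical stand-in (for example, a plain dictionary represents the memory 160 of each aircraft), and the example placement and directions are illustrative only:

```python
# Hypothetical end-to-end sketch of S11-S16: designate a flight group,
# determine relative positions and imaging parameters, then "transmit"
# the determination information and hold it per aircraft.

def form_flight_group(all_uav_ids, selected_ids):        # S11
    return [u for u in all_uav_ids if u in selected_ids]

def determine_relative_positions(group):                 # S12
    # place the group evenly along x, 0.5 m apart (illustrative only)
    return {u: (i * 0.5, 0.0, 0.0) for i, u in enumerate(group)}

def determine_imaging_directions(group):                 # S13
    step = 360.0 / len(group)
    return {u: i * step for i, u in enumerate(group)}

def transmit_and_set(group, rel, imaging):               # S14-S16
    memories = {}                                        # stands in for memory 160
    for u in group:
        memories[u] = {"relative_position": rel[u],
                       "imaging_direction": imaging[u]}
    return memories

group = form_flight_group(["G11", "G12", "G13"], {"G11", "G12"})
mem = transmit_and_set(group,
                       determine_relative_positions(group),
                       determine_imaging_directions(group))
```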
- in the above, the determination information is transmitted by the portable terminal 80, but the determination information may be output by other methods.
- the processor 81 may record the determination information on an arbitrary recording medium. In this case, even when communication between the unmanned aircraft 100 and the portable terminal 80 is impossible, determination information can be set for each unmanned aircraft 100 via the recording medium.
- according to the portable terminal 80, when flight control is instructed by the transmitter 50 during the flight of the unmanned aircraft 100, the relative position information of the plurality of unmanned aircraft 100 belonging to the flight group can be determined. This determined information may be set in each unmanned aircraft 100. Therefore, the portable terminal 80 enables the unmanned aircraft 100 to be operated by the transmitter 50 so that a plurality of unmanned aircraft 100 can fly in cooperation even if a flight route or flight position is not set in advance. Accordingly, even when the unmanned aerial vehicles 100 fly in cooperation, the portable terminal 80 allows a flight route and the like to be specified in real time by the transmitter 50, and can improve the degree of freedom of the unmanned aircraft during the cooperative flight. Moreover, by providing relative position information to each unmanned aircraft 100 of the flight group, the portable terminal 80 enables a plurality of unmanned aircraft to fly in cooperation under one transmitter 50.
- In the flight system 10, when the transmitter 50 instructs flight control during the flight of the unmanned aircraft 100, the relative position information of the plurality of unmanned aircraft 100 belonging to the flight group can be determined. This determined information may be set in each unmanned aircraft 100. Therefore, the flight system 10 enables a plurality of unmanned aircraft 100 to fly in cooperation when operated by the transmitter 50, even if the flight route or the flight position is not set in advance. Accordingly, even when the unmanned aircraft 100 fly in cooperation, the flight system 10 can specify a flight route or the like in real time by the transmitter 50, which improves the degree of freedom of the unmanned aircraft during the coordinated flight. In addition, by providing the relative position information to each unmanned aircraft 100 of the flight group, the flight system 10 allows a plurality of unmanned aircraft to fly in cooperation under one transmitter 50.
- The above description explained how the portable terminal 80 detects position information of a plurality of unmanned aircraft 100 placed in a predetermined area by GPS or the like, displays the position information on the display 88, and adjusts the relative positional relationship of the UAV images G11 to G13 by a touch operation or a drag operation. Instead, the relative position processing unit 813 may calculate the differences between the plurality of pieces of position information (absolute position information) detected by GPS or the like in each of the plurality of unmanned aircraft 100, and determine the differences as the relative position information of the plurality of unmanned aircraft 100.
- the mobile terminal 80 may determine the relative position information of the plurality of unmanned aircraft 100 without using the operation unit 83 based on the acquired position information such as GPS. This eliminates the need for a special operation for determining relative position information, and improves user convenience.
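The difference-based determination above can be sketched as follows. This is a minimal illustration, assuming the GPS fixes have already been projected into a local metric (x, y) frame; function and variable names are hypothetical, not from the patent.

```python
def relative_positions(absolute_positions, reference_index=0):
    """Differences between absolute positions (e.g. GPS fixes projected
    to a local metric frame); the reference UAV maps to (0.0, 0.0)."""
    rx, ry = absolute_positions[reference_index]
    return [(x - rx, y - ry) for x, y in absolute_positions]

# Three UAVs placed in a predetermined area; offsets are their
# relative position information with respect to UAV 0.
offsets = relative_positions([(10.0, 20.0), (13.0, 20.0), (10.0, 24.0)])
```

No touch or drag operation is involved: the relative position information falls out of the detected absolute positions alone.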
- the transmitter 50 may have a function that the portable terminal 80 has. In this case, the portable terminal 80 may be omitted. Further, the portable terminal 80 may have a function that the transmitter 50 has. In this case, the transmitter 50 may be omitted.
- the relative position information may be the relative position information described in the first embodiment.
- FIG. 21 is a schematic diagram illustrating a configuration example of a flight system 10A according to the second embodiment.
- the flight system 10A includes an unmanned aerial vehicle 100A, a transmitter 50A, and a portable terminal 80A.
- Unmanned aerial vehicle 100A, transmitter 50A, and portable terminal 80A can communicate with each other by wired communication or wireless communication (for example, wireless LAN (Local Area Network)).
- FIG. 22 is a block diagram illustrating an example of a hardware configuration of the unmanned aerial vehicle 100A.
- the unmanned aerial vehicle 100A includes a memory 160A instead of the memory 160, as compared with the unmanned aerial vehicle 100 in the first embodiment.
- the same reference numerals are given to the same configurations as those of the unmanned aircraft 100 of FIG. 4, and the description thereof is omitted or simplified.
- the memory 160A has the function of the memory 160 and holds cooperative control information CC.
- the cooperative control information CC includes control information for a plurality of unmanned aircraft 100A belonging to the same flight group to fly in cooperation.
- the cooperative control information CC includes relative position information of a plurality of unmanned aircraft 100A belonging to the same flight group. This relative position information may include distance information indicating the distance between the reference position RP and each unmanned aircraft 100A. This relative position information may include direction information indicating the direction in which each unmanned aerial vehicle 100A is viewed from the reference position RP.
- the cooperative control information CC may include imaging parameters (for example, imaging direction information and angle of view information).
- the cooperative control information CC is held in the memory 160A before the plurality of unmanned aircraft 100A performs cooperative flight by the flight operation of the transmitter 50A.
- the memory 160A may hold a plurality of different cooperative control information CC for the same flight group. That is, the memory 160A may hold a plurality of different relative position information for the same flight group.
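One way to picture the cooperative control information CC held in the memory 160A, including several different relative-position sets for the same flight group and per-UAV imaging parameters, is the following sketch. All field names, UAV identifiers, and values are hypothetical illustrations, not taken from the patent.

```python
# Hypothetical layout of cooperative control information CC.
cooperative_control_info = {
    "flight_group": "group-1",
    # Multiple different relative-position sets for the same group,
    # each giving every UAV's (x, y) offset from the reference position RP.
    "relative_positions": {
        "line":     {"uav-1": (-5.0, 0.0), "uav-2": (0.0, 0.0), "uav-3": (5.0, 0.0)},
        "triangle": {"uav-1": (-4.0, -3.0), "uav-2": (0.0, 5.0), "uav-3": (4.0, -3.0)},
    },
    # Imaging parameters: imaging direction and angle of view per UAV.
    "imaging_params": {
        "uav-1": {"direction_deg": 0.0,   "fov_deg": 120.0},
        "uav-2": {"direction_deg": 120.0, "fov_deg": 120.0},
        "uav-3": {"direction_deg": 240.0, "fov_deg": 120.0},
    },
}
```

Switching the formation then amounts to selecting a different entry under `relative_positions` before the cooperative flight begins.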
- FIG. 23 is a block diagram illustrating an example of a functional configuration of the UAV control unit 110A.
- the UAV control unit 110A includes a signal acquisition unit 111, a first relative position acquisition unit 112, a first absolute position acquisition unit 113, a second absolute position acquisition unit 114, a captured image acquisition unit 115, a second relative position acquisition unit 116, an angle-of-view information acquisition unit 117, an operation mode setting unit 118, a flight control unit 119, and an imaging control unit 120.
- the signal acquisition unit 111 is an example of a first acquisition unit.
- the first relative position acquisition unit 112 is an example of a first acquisition unit.
- the first absolute position acquisition unit 113 is an example of a calculation unit.
- the second absolute position acquisition unit 114 is an example of a second acquisition unit.
- the second relative position acquisition unit 116 is an example of a calculation unit.
- the captured image acquisition unit 115 is an example of a first acquisition unit.
- the angle-of-view information acquisition unit 117 is an example of a first acquisition unit.
- the flight control unit 119 is an example of a control unit.
- the imaging control unit 120 is an example of a control unit.
- the signal acquisition unit 111 acquires various signals.
- the signal acquisition unit 111 may acquire an instruction signal from the transmitter 50A via the communication interface 150.
- the instruction signal may be a signal that instructs flight control of the unmanned aircraft 100A.
- the instruction signal may include throttle instruction information for raising or lowering the unmanned aircraft 100A.
- the instruction signal may include pitch instruction information for moving the unmanned aircraft 100A forward or backward.
- the instruction signal may include roll instruction information for causing the unmanned aircraft 100A to travel rightward (also referred to as “rightward”) or leftward (also referred to as “leftward”).
- the instruction signal may include rudder instruction information (an example of turning instruction information) for making the unmanned aircraft 100A turn right or left.
- the first relative position acquisition unit 112 acquires relative position information of a plurality of unmanned aircraft 100A belonging to the same flight group.
- the first relative position acquisition unit 112 may acquire relative position information from the memory 160.
- the first relative position acquisition unit 112 may acquire relative position information from an external device (for example, the transmitter 50A) via the communication interface 150.
- the relative position information may include relative position information of the unmanned aircraft 100A (own aircraft) with respect to the reference position RP with reference to the reference position RP in the flight group.
- the relative position information may include relative position information of the other unmanned aircraft 100A (other aircraft) with respect to the reference position RP.
- the reference position RP may be an intermediate position, a center position, a center of gravity position, or other reference position of a plurality of unmanned aircraft 100A included in the same flight group.
- the relative position information may include relative position information of the unmanned aircraft 100A (own aircraft) with respect to an arbitrary unmanned aircraft 100A in the flight group. That is, the position where any one unmanned aircraft 100A exists may be set as the reference position RP.
- the relative position information may include relative position information of another unmanned aircraft 100A (another aircraft) with respect to an arbitrary unmanned aircraft 100A in the flight group.
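When the reference position RP is the center-of-gravity position of the flight group, it can be computed from the members' positions. A minimal sketch in 2-D local coordinates, with the function name chosen for illustration:

```python
def reference_position(positions):
    """Center-of-gravity reference position RP of a flight group,
    given each member's (x, y) position."""
    n = len(positions)
    return (sum(x for x, _ in positions) / n,
            sum(y for _, y in positions) / n)

rp = reference_position([(0.0, 0.0), (6.0, 0.0), (3.0, 6.0)])
```

For the intermediate-position or any-one-UAV variants mentioned above, the same relative offsets would simply be measured from that point instead of the centroid.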
- When the acquired relative position information includes both the relative position information of the own aircraft and that of other aircraft, the first relative position acquisition unit 112 may refer to the identification information of the unmanned aircraft 100A associated with the relative position information to identify and acquire the relative position information of the own aircraft.
- the first absolute position acquisition unit 113 may generate (for example, calculate) position information (absolute position information, an example of second flight position information) of the unmanned aircraft 100A based on the position information of the reference position RP and the relative position information of the unmanned aircraft 100A with respect to the reference position.
- position information of the reference position may be included in the instruction information from the transmitter 50A, or may be held in the memory 160 as a past calculation result.
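The calculation performed by the first absolute position acquisition unit 113 can be sketched as RP plus a (distance, direction) offset. This is an illustration only; the angle convention (counterclockwise from the x-axis) is an assumption, not stated in the patent.

```python
import math

def absolute_from_relative(reference, distance, direction_deg):
    """Absolute position of a UAV from the reference position RP and the
    UAV's distance/direction relative position information.
    direction_deg is measured counterclockwise from the x-axis
    (an assumed convention)."""
    rad = math.radians(direction_deg)
    return (reference[0] + distance * math.cos(rad),
            reference[1] + distance * math.sin(rad))
```

If the relative position information is instead stored directly as an (x, y) offset, the calculation reduces to component-wise addition.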
- the second absolute position acquisition unit 114 may acquire the position information (an example of first flight position information) of the unmanned aircraft 100A acquired by the GPS receiver 240.
- the second absolute position acquisition unit 114 may acquire position information of the unmanned aerial vehicle 100A obtained by a device other than the GPS receiver 240.
- the captured image acquisition unit 115 may acquire a captured image captured by the imaging device 220 or 230 of the unmanned aircraft 100A.
- a captured image captured by the imaging device 220 or 230 of another unmanned aircraft 100A may be acquired via the communication interface 150.
- the captured image acquisition unit 115 may acquire a captured image held in the memory 160.
- the captured image held in the memory 160 may be a captured image captured by the unmanned aircraft 100A or a captured image captured by another unmanned aircraft 100A.
- the second relative position acquisition unit 116 acquires relative position information with respect to an arbitrary object (for example, another unmanned aircraft 100A).
- the second relative position acquisition unit 116 may acquire distance information indicating a distance to an arbitrary object.
- the second relative position acquisition unit 116 may acquire distance information obtained by the ultrasonic sensor 280.
- the second relative position acquisition unit 116 may acquire distance information obtained by the laser measuring device 290.
- the second relative position acquisition unit 116 may acquire a captured image from the captured image acquisition unit 115.
- the second relative position acquisition unit 116 may calculate and acquire relative position information of the unmanned aircraft 100A with respect to a specific object (for example, another unmanned aircraft 100A) included in the captured image based on the captured image.
- the second relative position acquisition unit 116 may calculate the distance to the specific object by extracting the apparent size of the specific object in the captured image, and acquire the distance information. If information on the actual size of the specific object is stored in the memory 160 or the like in advance, the distance information can be acquired.
- the second relative position acquisition unit 116 may calculate and acquire the direction in which the specific object is present with respect to the unmanned aircraft 100A according to the position of the region where the specific object is reflected in the captured image.
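Under a simple pinhole-camera assumption, both quantities above follow from the image geometry: distance from the known real size versus the apparent size, and direction from the object's pixel position relative to the principal point. This is a hedged sketch, not the patent's stated method; all names are illustrative.

```python
import math

def distance_from_apparent_size(real_size_m, apparent_size_px, focal_length_px):
    """Pinhole estimate: distance = f * real size / apparent size.
    Requires the object's actual size to be known in advance
    (e.g. stored in the memory 160)."""
    return focal_length_px * real_size_m / apparent_size_px

def bearing_from_pixel(u_px, cx_px, focal_length_px):
    """Horizontal angle (degrees) of the object relative to the optical
    axis, from its pixel column u_px and the principal point cx_px."""
    return math.degrees(math.atan2(u_px - cx_px, focal_length_px))
```

An object of real size 0.5 m that spans 50 px with a 1000 px focal length is estimated at 10 m; an object imaged on the optical axis has a bearing of 0 degrees.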
- the second relative position acquisition unit 116 may calculate distance information to a specific object reflected in a plurality of acquired captured images by using them as a stereo image. In this case, distance information can be acquired even if the actual size of the specific object is unknown. For example, assume that, among three unmanned aircraft 100A belonging to the same flight group, one unmanned aircraft flies in front and two unmanned aircraft fly behind. In this case, the two rear unmanned aircraft 100A may each image the one front unmanned aircraft 100A, and the second relative position acquisition unit 116 may acquire, for example by triangulation, the relative position information (for example, information on distance and direction) of the front unmanned aircraft 100A with respect to the two rear unmanned aircraft 100A.
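The stereo case above can be illustrated with the standard disparity relation Z = f · B / d; here B would be the separation of the two rear UAVs' cameras and d the shift of the front UAV between their images. This is a textbook sketch under an assumed rectified-stereo setup, not the patent's specific algorithm.

```python
def stereo_distance(baseline_m, focal_length_px, disparity_px):
    """Stereo-pair distance estimate: Z = f * B / d.
    baseline_m: separation of the two cameras (here, the two rear UAVs);
    disparity_px: horizontal shift of the target between the two images."""
    return focal_length_px * baseline_m / disparity_px
```

With a 2 m baseline, a 1000 px focal length, and 40 px of disparity, the front UAV is estimated 50 m away, with no need to know its actual size.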
- the angle-of-view information acquisition unit 117 may acquire the angle-of-view information of the imaging device 220 or 230 from the imaging device 220 or 230 included in the unmanned aircraft 100A.
- the angle-of-view information acquisition unit 117 may acquire angle-of-view information of the imaging device 220 or 230 included in the other unmanned aircraft 100A from the other unmanned aircraft 100A via the communication interface 150.
- Operation mode setting unit 118 sets an operation mode during flight of unmanned aircraft 100A.
- the operation mode may include a single operation mode for the unmanned aircraft 100A to fly alone. In the single operation mode, one unmanned aerial vehicle 100A flies individually based on the instruction signal acquired from the transmitter 50A.
- the operation mode may include a cooperative operation mode for a plurality of unmanned aircraft 100A included in the same flight group to fly in cooperation. In the cooperative operation mode, the plurality of unmanned aircraft 100A in the same flight group fly in cooperation based on the instruction signal acquired from one transmitter 50A.
- unmanned aerial vehicle 100A can determine whether or not to fly in cooperation when a plurality of unmanned aircraft 100A fly based on whether the operation mode is set to the cooperative operation mode or the single operation mode.
- the operation mode may be set, for example, via an operation unit (not shown) of the unmanned aircraft 100, or may be set based on instruction information from the transmitter 50A.
- the cooperative operation mode may include a turning mode for turning (rotating) a plurality of unmanned aircraft 100A belonging to the same flight group.
- the turning mode may include a plurality of turning modes, for example a first turning mode and a second turning mode.
- the first turning mode may be a turning mode in which each unmanned aircraft 100A turns around the reference position RP with the distance between each unmanned aircraft 100A and the reference position RP fixed. That is, in the first turning mode, the absolute position of each unmanned aircraft 100A may change while turning is performed.
- the second turning mode may be a turning mode in which the position of each unmanned aircraft 100A is fixed and each unmanned aircraft 100A turns around each unmanned aircraft 100A. That is, in the second turning mode, the absolute position of each unmanned aircraft 100A does not change, and turning may be performed.
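The two turning modes differ only in what is rotated: in the first, each UAV's offset from RP is rotated (its absolute position changes, its distance to RP does not); in the second, only the UAV's heading rotates in place. A minimal 2-D sketch, with hypothetical names:

```python
import math

def first_turning_mode(reference, offset, turn_deg):
    """Rotate a UAV's (x, y) offset from RP by turn_deg: the absolute
    position changes, but the distance to RP stays fixed."""
    rad = math.radians(turn_deg)
    dx, dy = offset
    return (reference[0] + dx * math.cos(rad) - dy * math.sin(rad),
            reference[1] + dx * math.sin(rad) + dy * math.cos(rad))

def second_turning_mode(position, heading_deg, turn_deg):
    """Turn in place: the absolute position is unchanged and only the
    heading rotates."""
    return position, (heading_deg + turn_deg) % 360.0
```

Rotating an offset of (10, 0) about RP by 90 degrees moves the UAV to (0, 10) while its 10 m distance to RP is preserved; the second mode leaves the position untouched.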
- the flight control unit 119 controls the flight of the unmanned aircraft 100A (own aircraft) by fixing the relative positional relationship between the plurality of unmanned aircraft 100A belonging to the same flight group.
- the flight control unit 119 may control the flight of the unmanned aircraft 100A (own aircraft) by fixing the relative positional relationship of the unmanned aircraft 100A with respect to the reference position RP.
- when the relative positional relationship of each unmanned aircraft 100A belonging to the flight group with respect to the reference position RP is fixed, the relative positional relationship of the plurality of unmanned aircraft 100A belonging to the flight group as a whole is also fixed.
- the fixing of the relative positional relationship may include maintaining the distance of the unmanned aircraft 100A with respect to the reference position RP without changing it.
- Fixing the relative positional relationship may include maintaining the imaging direction of the imaging device 220 or 230 with respect to a reference direction (for example, the traveling direction when the flight group moves forward) without being changed.
- the flight control unit 119 controls the flight of the unmanned aircraft 100A while maintaining the relative positional relationship of the unmanned aircraft 100A based on the instruction signal from the transmitter 50A. Therefore, the flight control unit 119 may perform ascent and descent flight control while maintaining the relative positional relationship of each unmanned aircraft 100A based on the throttle instruction information from the transmitter 50A. The flight control unit 119 may perform forward and backward flight control while maintaining the relative positional relationship of each unmanned aircraft 100A based on the pitch instruction information from the transmitter 50A. The flight control unit 119 may perform rightward or leftward flight control while maintaining the relative positional relationship of each unmanned aircraft 100A based on the roll instruction information from the transmitter 50A. The flight control unit 119 may perform right-turn or left-turn flight control while maintaining the relative positional relationship of each unmanned aircraft 100A based on the rudder instruction information from the transmitter 50A.
- the control amount of the flight of the unmanned aircraft 100A based on the instruction signal from the transmitter 50A may be the same control amount for each of the plurality of unmanned aircraft 100A.
- the turning amount and turning angle that each unmanned aircraft 100A turns may be the same.
- when the operation mode is set to the cooperative operation mode, the flight control unit 119 may control the flight of the unmanned aircraft 100A by fixing the relative positional relationship between the plurality of unmanned aircraft 100A belonging to the same flight group.
- the flight control unit 119 may control the flight of the unmanned aircraft 100A without fixing the relative positional relationship when the operation mode is not set to the cooperative operation mode.
- the flight control unit 119 may change the turning method of each unmanned aircraft 100A belonging to the same flight group based on which turning mode is set.
- In the first turning mode, the flight control unit 119 may control each unmanned aircraft 100A to turn around the reference position RP while fixing the distance between each unmanned aircraft 100A and the reference position RP. That is, in the first turning mode, the flight control unit 119 may turn each unmanned aircraft 100A while changing its position.
- In the second turning mode, the flight control unit 119 may fix the position of each unmanned aircraft 100A and control each unmanned aircraft 100A to turn around itself. That is, in the second turning mode, the flight control unit 119 may turn each unmanned aircraft 100A without changing its absolute position.
- the flight control unit 119 may perform flight control on a plurality of unmanned aircraft 100A as different relative positional relationships based on a plurality of different relative position information on the same flight group. Therefore, the unmanned aircraft 100A may change the relative positional relationship between the plurality of unmanned aircraft 100A by changing the relative position information to be used.
- the imaging control unit 120 may control the angle of view of the imaging device 220 or 230 included in the unmanned aircraft 100A based on the number of unmanned aircraft 100A belonging to the same flight group. Information on the number of unmanned aircraft 100A belonging to the same flight group may be stored in the memory 160. The imaging control unit 120 may acquire the information on the number of unmanned aircraft 100A from the memory 160.
- the imaging control unit 120 may acquire the angle of view information included in the imaging parameters held in the memory 160, and may control the angle of view of the imaging device 220 or 230 based on the angle of view information.
- the imaging control unit 120 may calculate and acquire the angle of view of the imaging device 220 or 230. For example, when the number of unmanned aircraft 100A forming the same flight group is four, the imaging control unit 120 may divide 360 degrees, which is one full circle, into four equal parts and calculate 90 degrees or more as the angle of view of each of the four unmanned aircraft 100A. In this case, the imaging control unit 120 may control the angle of view of each of the four unmanned aircraft 100A to be 90 degrees or more. For example, when the number of unmanned aircraft 100A forming the same flight group is three, the imaging control unit 120 may divide 360 degrees into three equal parts and calculate 120 degrees or more as the angle of view of each of the three unmanned aircraft 100A. In this case, the imaging control unit 120 may control the angle of view of each of the three unmanned aircraft 100A to be 120 degrees or more.
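The angle-of-view calculation above is simply one full circle divided by the group size, giving the minimum view each UAV must cover; a one-line sketch (the function name is illustrative):

```python
def minimum_angle_of_view(num_uavs):
    """Each UAV must cover at least 360/n degrees so that the group's
    images together cover one full circle."""
    return 360.0 / num_uavs
```

Four UAVs need at least 90 degrees each; three UAVs need at least 120 degrees each, matching the examples in the text.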
- the imaging control unit 120 may control the imaging direction of the imaging device 220 or 230 included in the unmanned aircraft 100A based on the number of unmanned aircraft 100A belonging to the same flight group.
- the imaging control unit 120 may acquire information on the imaging direction included in the imaging parameters held in the memory 160 and may control the imaging direction of the imaging device 220 or 230 based on the information on the imaging direction.
- the imaging control unit 120 may calculate and acquire the imaging direction of the imaging device 220 or 230. For example, when the number of unmanned aircraft 100A forming the same flight group is four, the imaging control unit 120 may calculate and acquire imaging directions that differ by 90 degrees, obtained by equally dividing 360 degrees of one revolution into four. When the number of unmanned aircraft 100A forming the same flight group is three, the imaging control unit 120 may calculate and acquire imaging directions that differ by 120 degrees, obtained by equally dividing 360 degrees into three. The imaging control unit 120 may control the imaging direction of the imaging device 220 or 230 so that the calculated imaging direction is obtained.
- the imaging control unit 120 may control the imaging direction so that the direction in which each position of the plurality of unmanned aircraft 100A is viewed from the reference position RP of the flight group is the imaging direction of each unmanned aircraft 100A.
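The evenly spaced imaging directions can be generated in one step; the starting bearing is an assumption for illustration (the text itself ties each direction to the bearing from RP toward that UAV).

```python
def imaging_directions(num_uavs, start_deg=0.0):
    """Evenly spaced imaging directions for the flight group, in degrees;
    start_deg is an assumed bearing for the first UAV."""
    step = 360.0 / num_uavs
    return [(start_deg + i * step) % 360.0 for i in range(num_uavs)]
```

Four UAVs yield directions 90 degrees apart and three UAVs yield directions 120 degrees apart, as in the examples above.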
- FIG. 24 is a block diagram illustrating an example of a functional configuration of the mobile terminal 80A.
- the processor 81A has functions of a captured image acquisition unit 816 and an image processing unit 817 by executing a program stored in the memory 87.
- the captured image acquisition unit 816 acquires each captured image captured by each of the plurality of unmanned aircraft 100A forming the same flight group.
- the captured image acquisition unit 816 may acquire each captured image via the interface unit 82 or the wireless communication unit 85.
- the image processing unit 817 performs arbitrary image processing on the plurality of captured images acquired by the captured image acquisition unit 816.
- the image processing unit 817 may generate a panoramic image or a stereo image based on a plurality of captured images.
- the image processing unit 817 may generate a panoramic image by combining a plurality of captured images having different imaging directions. When the full 360 degrees in all directions are covered by the plurality of images, the image processing unit 817 can generate an omnidirectional panoramic image.
- the image processing unit 817 may generate a stereo image when the imaging ranges of two captured images adjacent to each other in the imaging direction partially overlap. When the imaging ranges of adjacent captured images partially overlap and the full 360 degrees in all directions are covered by the plurality of images, the image processing unit 817 can generate an omnidirectional stereo image.
- image processing such as generation of a panoramic image or a stereo image based on the captured image may be performed by a device other than the mobile terminal 80A.
- This image processing may be performed by the transmitter 50A, any one or more of the unmanned aircraft 100A, or a PC (Personal Computer) (not shown).
- a captured image captured by each unmanned aircraft 100A may be stored in an SD card as the memory 160 provided in each unmanned aircraft 100A.
- a plurality of captured images stored in an SD card may be taken into a PC or the like and processed.
- the flight control unit 119 may control the flight of the unmanned aircraft 100A based on the instruction signal from the transmitter 50A. In this case, the flight control unit 119 may control the number of rotations (the number of rotations per unit time) of each rotor blade 211.
- FIG. 25 is a diagram for explaining a method of rotating the rotor blade 211 in accordance with the type of instruction signal from the transmitter 50A.
- the direction opposite to the direction in which the battery 103 is attached to the UAV main body 102 (the direction of the arrow) is the forward direction of the unmanned aircraft 100A.
- This forward direction is the traveling direction when the unmanned aerial vehicle 100A moves forward.
- the four rotary blades 211 may include rotary blades 211a, 211b, 211c, and 211d.
- the rotor blades 211b and 211c may rotate counterclockwise, and the rotor blades 211a and 211d may rotate clockwise.
- the instruction signal from the transmitter 50A may include at least one of throttle instruction information, pitch instruction information, roll instruction information, and rudder instruction information.
- the flight control unit 119 controls the rotational speeds of the four rotor blades 211a, 211b, 211c, and 211d.
- the throttle instruction information may include ascending instruction information and descending instruction information.
- When the flight control unit 119 receives the ascending instruction information, the flight control unit 119 increases the rotational speeds of the four rotor blades 211a to 211d, and the unmanned aircraft 100A ascends.
- When the flight control unit 119 receives the descending instruction information, the flight control unit 119 decreases the rotational speeds of the four rotor blades 211a to 211d, and the unmanned aircraft 100A descends.
- the flight control unit 119 may control the rotational speeds of the rotor blades 211a and 211b positioned behind the unmanned aircraft 100A or the rotor blades 211c and 211d positioned in front of the unmanned aircraft 100A.
- the pitch instruction information may include forward instruction information and backward instruction information.
- When the flight control unit 119 receives the forward instruction information, the flight control unit 119 increases the rotational speeds of the two rotor blades 211a and 211b. When the rotational speeds of the two rotor blades 211a and 211b increase, the unmanned aircraft 100A moves forward.
- When the flight control unit 119 receives the backward instruction information, the flight control unit 119 increases the rotational speeds of the two rotor blades 211c and 211d. When the rotational speeds of the two rotor blades 211c and 211d increase, the unmanned aircraft 100A moves backward.
- the flight control unit 119 may control the rotational speeds of the rotor blades 211a and 211c located on the left side of the unmanned aircraft 100A or the rotor blades 211b and 211d located on the right side of the unmanned aircraft 100A.
- the roll instruction information may include rightward instruction information and leftward instruction information.
- When the flight control unit 119 receives the rightward instruction information, the flight control unit 119 increases the rotational speeds of the two rotor blades 211a and 211c. When the rotational speeds of the two rotor blades 211a and 211c increase, the unmanned aircraft 100A moves to the right.
- When the flight control unit 119 receives the leftward instruction information, the flight control unit 119 increases the rotational speeds of the two rotor blades 211b and 211d. When the rotational speeds of the two rotor blades 211b and 211d increase, the unmanned aircraft 100A moves to the left.
- the flight control unit 119 may control the rotational speeds of the rotary blades 211a and 211d or the rotary blades 211b and 211c located on the diagonal line of the unmanned aircraft 100A.
- the rudder instruction information may include right turn instruction information and left turn instruction information.
- When the flight control unit 119 receives the right turn instruction information, the flight control unit 119 increases the rotational speeds of the two rotor blades 211b and 211c. When the rotational speeds of the two rotor blades 211b and 211c increase, the unmanned aircraft 100A turns right.
- When the flight control unit 119 receives the left turn instruction information, the flight control unit 119 increases the rotational speeds of the two rotor blades 211a and 211d. When the rotational speeds of the two rotor blades 211a and 211d increase, the unmanned aircraft 100A turns left.
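The rotor pairings described for throttle, pitch, roll, and rudder instructions can be summarized as a simple mixing table. This is an illustrative sketch only; the rotor identifiers follow the text (211a/211b rear, 211c/211d front; 211a/211c left, 211b/211d right; diagonals 211b-211c and 211a-211d), but the delta magnitude is hypothetical.

```python
# Which rotors get a speed increase for each instruction type.
MIX = {
    "ascend":     ("211a", "211b", "211c", "211d"),  # throttle: all four
    "forward":    ("211a", "211b"),                  # pitch: rear pair
    "rightward":  ("211a", "211c"),                  # roll: left pair
    "turn_right": ("211b", "211c"),                  # rudder: one diagonal
}

def apply_instruction(speeds, instruction, delta=100):
    """Return new rotor speeds with the instruction's rotors sped up."""
    return {rotor: rpm + (delta if rotor in MIX[instruction] else 0)
            for rotor, rpm in speeds.items()}
```

The opposite instructions (descend, backward, leftward, turn left) act on the complementary rotors, per the descriptions above.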
- the flight control unit 119 may control the flight of the unmanned aircraft 100A so that the relative position information included in the cooperative control information CC held (set) in the memory 160 matches the relative position information acquired by the second relative position acquisition unit 116.
- the relative position information included in the cooperative control information CC is information obtained before the flight, and can be said to be a predicted value of the relative position information when the cooperative flight is performed.
- the relative position information acquired by the second relative position acquisition unit 116 is information obtained based on some information during the flight, and can be said to be an actual measurement value of the relative position information when actually performing the cooperative flight.
- the relative position information acquired by the second relative position acquisition unit 116 may be information acquired based on a captured image obtained by capturing the direction of the other unmanned aircraft 100A from the unmanned aircraft 100A.
- the flight control unit 119 may perform the flight control by feedback so that the distance information and the direction information between the unmanned aircraft 100A and the other unmanned aircraft 100A obtained from the captured image are constant. Thereby, distance information and direction information are maintained constant, and the relative positional relationship between the unmanned aircraft 100A and the other unmanned aircraft 100A can be maintained.
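The feedback described above, holding the measured separation at the value set in the cooperative control information CC, can be sketched as a proportional controller. This is a minimal illustration of the idea, not the patent's control law; the gain and sign convention are assumptions.

```python
def distance_feedback(measured_m, target_m, gain=0.5):
    """Proportional correction on the separation between two UAVs:
    positive output = close the gap (speed up), negative = fall back.
    measured_m is the actual measurement value (e.g. from a captured
    image); target_m is the predicted value held in CC."""
    return gain * (measured_m - target_m)
```

Run each control cycle, the correction drives the measured distance toward the target, so the relative positional relationship is maintained; in practice integral/derivative terms would typically be added.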
- When the captured image is used, it is only necessary to provide the imaging device 220 or 230; special sensors (for example, the GPS receiver 240, the ultrasonic sensor 280, and the laser measuring device 290) need not be provided to maintain the relative positional relationship.
- the relative position information included in the cooperative control information CC may be distance information between the unmanned aircraft 100A and another unmanned aircraft 100A.
- the relative position information acquired by the second relative position acquisition unit 116 may be distance information between the unmanned aircraft 100A and another unmanned aircraft 100A. That is, the flight control unit 119 may control the flight of the unmanned aircraft 100A so that the distance information included in the cooperative control information CC matches the distance information acquired by the second relative position acquisition unit 116.
- the distance information included in the cooperative control information CC is information obtained before the flight, and can be said to be a predicted value of the distance information when the cooperative flight is performed.
- the distance information acquired by the second relative position acquisition unit 116 is information obtained based on some information during the flight, and can be said to be an actual measurement value of the distance information when actually performing the coordinated flight.
- the flight control unit 119 may perform flight control by feedback so that distance information between the unmanned aircraft 100A and the other unmanned aircraft 100A obtained by the ultrasonic sensor 280 or the laser measuring device 290 is constant. Thereby, the distance information is maintained constant, and the relative positional relationship between the unmanned aircraft 100A and the other unmanned aircraft 100A can be maintained. Further, the unmanned aircraft 100A can acquire more accurate distance information by using the ultrasonic sensor 280 and the laser measuring device 290.
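The feedback loop just described can be illustrated with a minimal proportional controller that holds the measured distance at the value specified in the cooperative control information CC. This is a sketch only; the function name and the gain value are illustrative assumptions, not part of the patent.

```python
def distance_hold_step(measured_distance, target_distance, gain=0.5):
    """One iteration of the distance-keeping feedback loop.

    measured_distance: distance to the other aircraft, e.g. from the
        ultrasonic sensor 280 or the laser measuring device 290 (metres).
    target_distance: distance information included in the cooperative
        control information CC (metres).
    Returns a velocity correction along the line joining the two
    aircraft: positive moves away, negative moves closer.
    """
    error = measured_distance - target_distance
    # Proportional control: command a velocity that reduces the error.
    return -gain * error

# If the aircraft has drifted 2 m too far away, command it to close in.
print(distance_hold_step(12.0, 10.0))  # -> -1.0
```

Repeating this step each control cycle keeps the distance information constant, which is the condition the flight control unit 119 enforces.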
- the flight control unit 119 may control the flight of the unmanned aircraft 100A so that the position information of the unmanned aircraft 100A acquired by the first absolute position acquisition unit 113 matches the position information of the unmanned aircraft 100A acquired by the second absolute position acquisition unit 114.
- the position information of the unmanned aerial vehicle 100A acquired by the first absolute position acquisition unit 113 is information obtained before the flight, and can be said to be a predicted value of the information on the absolute position of the unmanned aircraft 100A in the case of coordinated flight.
- the position information of the unmanned aerial vehicle 100A acquired by the second relative position acquisition unit 116 is information obtained based on some information during the flight, and can be said to be an actual measurement value of the absolute position of the unmanned aircraft 100A at the time of actual cooperative flight.
- the flight control unit 119 may perform flight control by feedback so that the position information of the unmanned aircraft 100A obtained by the GPS receiver 240 or the like matches the position information of the unmanned aircraft 100A based on the relative position information obtained before the flight. Thereby, the relative positional relationship between the unmanned aircraft 100A and the other unmanned aircraft 100A can be maintained. Further, since the GPS receiver 240 and the like are relatively easy to mount, the relative positional relationship can be easily maintained.
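The comparison between the pre-flight prediction and the GPS measurement can be sketched as follows. Coordinates are in an arbitrary local frame, and the function names and tuple representation are illustrative assumptions, not the patent's implementation.

```python
def predicted_position(reference_position, relative_offset):
    """Absolute position predicted before flight: the reference
    position RP plus this aircraft's relative position from the
    cooperative control information CC (local frame, metres)."""
    return tuple(r + d for r, d in zip(reference_position, relative_offset))

def position_error(gps_position, reference_position, relative_offset):
    """Difference the flight control unit 119 feeds back so that the
    GPS-measured position matches the predicted position."""
    pred = predicted_position(reference_position, relative_offset)
    return tuple(g - p for g, p in zip(gps_position, pred))

rp = (0.0, 0.0, 30.0)       # reference position RP
offset = (5.0, 0.0, 0.0)    # this aircraft's relative position
gps = (5.5, -0.2, 30.0)     # position reported by the GPS receiver 240
print(position_error(gps, rp, offset))  # -> (0.5, -0.2, 0.0)
```

Driving this error vector to zero keeps the aircraft on its assigned offset from the reference position RP.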
- the flight control unit 119 may control the flight of the unmanned aircraft 100A so that the difference between the angle of view information of the unmanned aircraft 100A (own aircraft) acquired by the angle of view information acquisition unit 117 and the angle of view information of the other unmanned aircraft 100A (other aircraft) is maintained.
- When the relative positional relationship changes, for example, the angle of view at which the same subject is imaged by each imaging device 220 or 230 provided in each of the plurality of unmanned aircraft 100A changes, and therefore the difference between the angles of view of the imaging devices 220 or 230 changes.
- the flight control unit 119 can maintain the relative positional relationship of the plurality of unmanned aircraft 100A by performing feedback control of the flight of the unmanned aircraft 100A so that the difference in the angle of view becomes constant. Further, since information on the angle of view of the imaging device 220 or 230 is used, there is no need to provide a special sensor (for example, the GPS receiver 240, the ultrasonic sensor 280, or the laser measuring device 290) in order to maintain the relative positional relationship.
- FIG. 26 is a schematic diagram showing an example of a plurality of unmanned aircraft 100A forming a flight group and a virtual machine 100v positioned at a reference position RP.
- two unmanned aircraft 100r1 and 100r2 are shown as the plurality of unmanned aircraft 100A.
- the virtual machine 100v represents a virtual unmanned aerial vehicle located at the reference position RP.
- the virtual straight lines VL1 and VL2 are straight lines that virtually connect the reference position RP, that is, the virtual machine 100v and each of the unmanned aircraft 100r1 and 100r2.
- two unmanned aircraft 100r1 and 100r2 are arranged symmetrically with respect to the reference position RP. Therefore, the virtual straight line VL1 connecting the reference position RP and the unmanned aircraft 100r1 and the virtual straight line VL2 connecting the reference position RP and the unmanned aircraft 100r2 lie on a single straight line.
- the plurality of virtual straight lines VL1 and VL2 do not have to lie on a single straight line.
- The traveling direction of each unmanned aircraft 100r1, 100r2 in the flight group is the upward direction indicated by the arrow α1.
- the traveling direction of the flight group can be changed according to an instruction signal from the transmitter 50A.
- an arrow ar drawn in each unmanned aircraft 100A indicates the direction of the imaging device 220 or 230, that is, the imaging direction. The same applies to the subsequent figures.
- Each unmanned aircraft 100r1, 100r2 is instructed to perform flight control from the transmitter 50A.
- the transmitter 50A gives an instruction signal to each unmanned aircraft 100r1, 100r2 based on the absolute position of the virtual machine 100v.
- Each unmanned aircraft 100A may control its flight according to the instruction signal from the transmitter 50A while fixing the relative positional relationship with respect to the reference position RP, that is, the virtual machine 100v. Since the relative positional relationship is fixed, the flight of each unmanned aircraft 100r1, 100r2 is controlled without changing the lengths of the virtual straight lines VL1 and VL2 or the positional relationship of each unmanned aircraft 100r1, 100r2 with respect to the reference position RP.
- In each unmanned aircraft 100r1, 100r2, even when the instruction signal from the transmitter 50A includes instruction information regarding any flight operation (for example, throttle instruction information, pitch instruction information, roll instruction information, or rudder instruction information), the flight control unit 119 maintains and fixes the relative positional relationship between the plurality of unmanned aircraft 100r1 and 100r2 flying in cooperation while controlling the flight in accordance with the instruction information from the transmitter 50A.
- each unmanned aircraft 100r1 and 100r2 can fly in a coordinated manner based on an instruction signal from one transmitter 50A.
- the transmitter 50A can easily instruct flight control of a plurality of unmanned aircraft 100r1 and 100r2 as if it were operating one virtual machine 100v.
- FIG. 27 is a schematic diagram showing a turning example of each unmanned aircraft 100A when the operation mode is set to the cooperative mode and the first turning mode.
- In the first turning mode, the distance between each unmanned aircraft 100r1, 100r2 and the reference position RP is fixed, and each unmanned aircraft 100r1, 100r2 turns around the reference position RP. That is, it can be said that the virtual straight lines VL1 and VL2 turn around the reference position RP, and accordingly, the unmanned aircraft 100r1 and 100r2 located at the ends ep1 and ep2 of the virtual straight lines VL1 and VL2 also turn.
- When viewed from the transmitter 50A side, the transmitter 50A transmits the rudder instruction information to the virtual machine 100v; each unmanned aircraft 100r1, 100r2 receives the rudder instruction information and, based on it, turns with the relative positional relationship fixed.
- the traveling directions of the unmanned aircraft 100r1 and 100r2 in the flight group are both the upper-left direction indicated by the arrow α2. That is, the traveling direction of each unmanned aircraft 100r1, 100r2 in the flight group may be changed from the arrow α1 to the arrow α2 according to the rudder instruction information from the transmitter 50A.
- the plurality of unmanned aircraft 100A that perform cooperative flight can turn so that the virtual straight line VL rotates.
- the angle formed by the traveling direction α2 and the virtual straight lines VL1 and VL2 does not change. Therefore, even if the unmanned aircraft 100r1, 100r2 turn, the positional relationship between the imaging ranges captured by the unmanned aircraft 100r1, 100r2 does not change. Therefore, when a panoramic image is generated from a plurality of captured images captured by the unmanned aircraft 100r1 and 100r2, the positional relationship between the captured images contributing to the panoramic image does not change. Accordingly, the operator of the transmitter 50A can operate the plurality of unmanned aircraft 100A with the same feeling as turning a single virtual machine 100v equipped with one imaging device having a wide imaging range.
- FIG. 28 is a schematic diagram showing a turning example of each unmanned aircraft 100A when the operation mode is set to the cooperative mode and the second turning mode.
- In the second turning mode, the positions of the unmanned aircraft 100r1 and 100r2 are fixed, and each of the unmanned aircraft 100r1 and 100r2 turns around its own position. That is, it can be said that the virtual straight line VL does not turn, and the unmanned aircraft 100r1 and 100r2 located at the ends ep1 and ep2 of the virtual straight line VL turn in place at the ends ep1 and ep2. Since the virtual straight line VL does not turn, the absolute positions of the unmanned aircraft 100r1 and 100r2 do not change with the rudder instruction information alone.
- When viewed from the transmitter 50A side, the transmitter 50A transmits the rudder instruction information to the virtual machine 100v; each unmanned aircraft 100r1, 100r2 receives the rudder instruction information and, based on it, turns with the relative positional relationship fixed.
- the traveling directions of the unmanned aerial vehicles 100r1 and 100r2 in the flight group are both the upper-left direction indicated by the arrow α3. That is, the traveling direction of each unmanned aircraft 100r1, 100r2 in the flight group may be changed from the arrow α1 to the arrow α3 according to the rudder instruction information from the transmitter 50A.
- the flight system 10A can turn the plurality of unmanned aircraft 100A performing cooperative flight so that the virtual straight line VL does not rotate. Therefore, in the second turning mode, the flight range in which the plurality of unmanned aircraft 100A fly based on the rudder instruction information is narrower than in the first turning mode.
- In the first turning mode, the virtual straight line VL rotates around the reference position RP, so that the flight trajectory of each unmanned aircraft 100A in the flight group becomes, for example, a circle, and the inside of that circle becomes the flight range required for turning.
- In the second turning mode, the linear range indicated by the virtual straight line VL is the flight range necessary for turning. Therefore, for example, even when the plurality of unmanned aircraft 100A travel in a relatively narrow space, each unmanned aircraft 100A can fly according to the rudder instruction information.
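Geometrically, the two turning modes differ only in whether the offset from the reference position RP rotates together with the heading. The 2-D sketch below illustrates this; the function name, the mode flag, and the use of radians are illustrative assumptions, not the patent's implementation.

```python
import math

def turn(offset, heading, yaw, mode):
    """Apply a yaw rotation to one aircraft of the flight group.

    offset:  (x, y) of the aircraft relative to the reference position RP
    heading: current heading / imaging direction in radians
    mode 1 (first turning mode):  the virtual straight line VL rotates
        about RP, so the offset is rotated together with the heading.
    mode 2 (second turning mode): the aircraft position is fixed and
        only the heading rotates in place.
    Returns (new_offset, new_heading).
    """
    new_heading = heading + yaw
    if mode == 1:
        c, s = math.cos(yaw), math.sin(yaw)
        x, y = offset
        return (c * x - s * y, s * x + c * y), new_heading
    return offset, new_heading  # mode 2: position unchanged

# First mode: a 90-degree turn carries the aircraft around RP.
print(turn((10.0, 0.0), 0.0, math.pi / 2, mode=1)[0])
# Second mode: the same command leaves the position untouched.
print(turn((10.0, 0.0), 0.0, math.pi / 2, mode=2)[0])
```

In the first mode the flight range is the circle swept by the offset; in the second mode the aircraft stay where they are, which is why the second mode suits narrow spaces.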
- FIG. 29 is a schematic diagram showing a first arrangement example during the flight of three unmanned aerial vehicles 100r1, 100r2, and 100r3 forming a flight group.
- the three unmanned aircraft 100r1 to 100r3 are arranged at equal distances from the reference position RP, at positions corresponding to the vertices of an equilateral triangle. The virtual straight lines connecting the reference position RP and each of the three unmanned aircraft 100r1 to 100r3 are shown as virtual straight lines VL1, VL2, and VL3.
- the three unmanned aerial vehicles 100r1 to 100r3 are located at the ends ep1, ep2, and ep3 of the virtual straight lines VL1 to VL3.
- the traveling directions of the unmanned aerial vehicles 100r1 to 100r3 in the flight group are all the upper-left direction indicated by the arrow α4.
- the imaging directions of the unmanned aircraft 100r1 to 100r3 coincide with the directions in which the virtual straight lines VL1 to VL3 extend from the reference position RP. Accordingly, the imaging directions differ equally by 120 degrees.
- the imaging direction of each unmanned aircraft 100r1 to 100r3 may be set by the imaging control unit 120 of each unmanned aircraft 100r1 to 100r3.
- the imaging device 220 or 230 included in each unmanned aircraft 100r1 to 100r3 may have an angle of view set to 120 degrees or more.
- Each of the unmanned aircraft 100r1 to 100r3 may transmit the captured image captured by the imaging device 220 or 230 of each of the unmanned aircraft 100r1 to 100r3 to the mobile terminal 80A.
- the portable terminal 80A may receive captured images from each of the unmanned aircraft 100r1 to 100r3.
- the mobile terminal 80A may acquire a plurality of captured images captured at an angle of view of 120 degrees or more, in imaging directions that differ by 120 degrees.
- a panoramic image may be generated based on captured images captured by at least two of the unmanned aircraft 100r1 to 100r3.
- the mobile terminal 80A may generate an omnidirectional panoramic image based on captured images captured by the unmanned aircraft 100r1 to 100r3.
- the arrangement of the plurality of unmanned aircraft 100A belonging to this flight group may be determined based on the number of unmanned aircraft 100A belonging to the same flight group so that a panoramic image can be generated. That is, the first relative position acquisition unit 112 may automatically arrange each unmanned aircraft 100A based on the number of unmanned aircraft 100A. In this case, the first relative position acquisition unit 112 of each unmanned aircraft 100A may determine the arrangement position of each unmanned aircraft 100A with respect to the reference position RP. For example, the first relative position acquisition unit 112 of each unmanned aircraft 100A may arrange the unmanned aircraft 100A at the same distance from the reference position RP and at equal angular intervals, in the order of the identification numbers of the unmanned aircraft 100A. In this case, the unmanned aircraft 100r1 to 100r3 may be arranged at the positions of the vertices of an equilateral triangle with the reference position RP as the center of gravity.
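The automatic arrangement described above can be sketched as placing the aircraft, in identification-number order, at equal distances and equal angular spacing around the reference position RP. The function below is an illustrative assumption (RP taken as the origin of a local 2-D frame), not the patent's exact procedure.

```python
import math

def arrange(num_aircraft, radius):
    """Place num_aircraft at equal distance and equal angular spacing
    around the reference position RP (the origin), in identification-
    number order.  Each entry is ((x, y), imaging_deg): the imaging
    direction points outward along the virtual straight line, so the
    directions differ by 360/N degrees."""
    positions = []
    for i in range(num_aircraft):
        angle = 2.0 * math.pi * i / num_aircraft
        positions.append(((radius * math.cos(angle), radius * math.sin(angle)),
                          math.degrees(angle)))
    return positions

# Three aircraft: vertices of an equilateral triangle with RP as the
# center of gravity, imaging directions 120 degrees apart.
for pos, direction in arrange(3, 10.0):
    print(round(pos[0], 2), round(pos[1], 2), round(direction))
```

With an angle of view of at least 360/N degrees per aircraft, the imaging ranges cover the full horizon, which is the condition for generating the panoramic image.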
- FIG. 30 is a schematic diagram showing an example of turning in the first turning mode by the three unmanned aerial vehicles 100r1, 100r2, and 100r3 shown in FIG.
- In the first turning mode, the distance between each unmanned aircraft 100r1, 100r2, 100r3 and the reference position RP is fixed, and each unmanned aircraft 100r1 to 100r3 turns around the reference position RP. That is, it can be said that the virtual straight lines VL1, VL2, and VL3 turn around the reference position RP, and accordingly, the unmanned aircraft 100r1 to 100r3 positioned at the ends ep1, ep2, and ep3 of the virtual straight lines VL1 to VL3 also turn. That is, even when three or more unmanned aircraft 100A form a flight group, each unmanned aircraft 100A can turn in accordance with the first turning mode. In FIG. 30, the traveling direction of each unmanned aircraft 100r1 to 100r3 in the flight group due to the turn is the upper-left direction indicated by the arrow α5.
- As in the first turning mode, each unmanned aircraft 100A can also turn in accordance with the second turning mode even when three or more unmanned aircraft 100A form a flight group.
- FIG. 31 is a schematic diagram showing an arrangement example of five unmanned aircrafts 100r1, 100r2, 100r3, 100r4, and 100r5 forming a flight group at the time of flight.
- each of the unmanned aircraft 100r1 to 100r5 and the reference position RP are connected by virtual straight lines VL1, VL2, VL3, VL4, and VL5.
- the imaging direction of the two unmanned aerial vehicles 100r1 and 100r2 positioned in front of the same flight group is the forward direction (upward).
- the three unmanned aerial vehicles 100r3, 100r4, and 100r5 located at positions other than the front in the same flight group have different imaging directions by 90 degrees.
- the imaging direction of the unmanned aircraft 100r3 is the right direction
- the imaging direction of the unmanned aircraft 100r4 is the backward direction (downward)
- the imaging direction of the unmanned aircraft 100r5 is the left direction.
- each unmanned aircraft 100r1 to 100r5 may set the angle of view of the imaging device 220 or 230 included in each aircraft to 90 degrees or more.
- Each imaging direction and each angle of view may be set by the imaging control unit 120 included in each of the unmanned aircraft 100r1 to 100r5.
- Each of the unmanned aircraft 100r1 to 100r5 may transmit the captured image captured by the imaging device 220 or 230 of each of the unmanned aircraft 100r1 to 100r5 to the mobile terminal 80A.
- the portable terminal 80A may receive captured images from each of the unmanned aircraft 100r1 to 100r5.
- the imaging ranges of the imaging devices 220 or 230 of the unmanned aerial vehicles 100r1 and 100r2 may partially overlap.
- the mobile terminal 80A may generate a stereo image based on captured images captured by each of the unmanned aircraft 100r1 and 100r2.
- the mobile terminal 80A may acquire captured images captured at an angle of view of 90 degrees or more, in imaging directions that differ by 90 degrees.
- a panoramic image (for example, an omnidirectional panoramic image) may be generated based on a captured image captured by at least one of the unmanned aircraft 100r1 and 100r2 and the captured images captured by the unmanned aircraft 100r3 to 100r5.
- By setting different angles of view (for example, 90 degrees or more) and imaging directions (for example, directions differing by 90 degrees) based on the number of unmanned aircraft 100A forming the flight group (for example, four), the unmanned aircraft 100A make it possible to acquire a plurality of captured images suitable for generating a panoramic image or a stereo image without requiring a precise flight operation on the transmitter 50A. In particular, if even one of the captured images captured by the plurality of unmanned aircraft 100A deteriorates in image quality, or if the position of the subject shifts within the captured image, the image quality of the panoramic image or the stereo image is affected.
- the mobile terminal 80A can acquire a plurality of captured images in which image quality deterioration and positional deviation of the subject with respect to the captured image are suppressed from a plurality of unmanned aircraft 100A belonging to one flight group. Therefore, the mobile terminal 80A can acquire a desired panoramic image or stereo image.
- each of the unmanned aircraft 100r1 to 100r5 may obtain the rudder instruction information from the transmitter 50A and turn with the relative positional relationship fixed, using the turning method of the first turning mode or the second turning mode.
- FIG. 32A is a schematic diagram of a second arrangement example in the horizontal direction during flight of three unmanned aerial vehicles 100r1, 100r2, and 100r3 forming a flight group.
- FIG. 32B is a schematic diagram of a second arrangement example in the height direction during flight of three unmanned aerial vehicles 100r1, 100r2, and 100r3 forming a flight group.
- two unmanned aircraft 100r1 and 100r2 fly at the front of the flight group (in the traveling direction α7 when moving forward).
- One unmanned aerial vehicle 100r3 is arranged rearward in the flight group and flies at a higher altitude than the two unmanned aircraft 100r1 and 100r2.
- the imaging directions of the unmanned aerial vehicles 100r1 to 100r3 are all traveling directions during forward movement.
- By flying at a higher altitude than the unmanned aircraft 100r1 and 100r2, the unmanned aerial vehicle 100r3 makes it easier to monitor the flight of the unmanned aircraft 100r1 and 100r2.
- Each of the unmanned aircraft 100r1 to 100r3 may transmit the captured image captured by the imaging device 220 or 230 of each of the unmanned aircraft 100r1 to 100r3 to the mobile terminal 80A.
- the portable terminal 80A may receive captured images from each of the unmanned aircraft 100r1 to 100r3.
- the imaging range of the imaging device 220 or 230 of the unmanned aerial vehicle 100r3 may include the unmanned aircraft 100r1 and 100r2 flying in front.
- the unmanned aircraft 100r1 and 100r2 appear in the captured image captured by the imaging device 220 or 230 of the unmanned aircraft 100r3.
- the operator of the transmitter 50A can instruct the coordinated flight control of the plurality of unmanned aircraft 100r1 to 100r3 while confirming the captured image (operation image) from the unmanned aircraft 100r3 displayed on the portable terminal 80A.
- the imaging ranges of the unmanned aircraft 100r1 and 100r2 may partially overlap.
- the mobile terminal 80A may generate a stereo image based on the captured image captured by each of the unmanned aircraft 100r1 and 100r2.
- the imaging device 230 of one of the two unmanned aircraft 100r1 and 100r2 flying side by side may image the other of the two unmanned aircraft 100r1 and 100r2.
- the imaging device 230 may be fixedly disposed on the UAV main body 102 or the like with the horizontal direction as the imaging direction so as to be capable of imaging each other.
- the unmanned aircraft 100A can acquire relative position information (for example, distance information) of the other unmanned aircraft 100A flying in cooperation, in the left-right direction of the horizontal plane rather than in the front-back direction.
- the flight system 10A can provide an image in which at least some of the unmanned aircraft 100A in the flight group flying in cooperation appear, so that the operator of the transmitter 50A can operate them easily. Therefore, the transmitter 50A can operate the plurality of unmanned aircraft 100A with a simple operation while at least some of the unmanned aircraft 100A in the flight group remain visible, and can advance the plurality of unmanned aircraft 100A to an area where stereo image acquisition is desired.
- FIG. 33 is a flowchart showing an operation example of the unmanned aerial vehicle 100A.
- a plurality of unmanned aircraft 100A belonging to the same flight group each operate in the same manner.
- the flight control unit 119 acquires the cooperative control information CC including the relative position information of the plurality of unmanned aircraft 100A belonging to the flight group held in the memory 160 (S21).
- the signal acquisition unit 111 receives the instruction signal from the transmitter 50A (S22).
- the flight control unit 119 controls the flight of the unmanned aircraft 100A (own aircraft) by fixing the relative positional relationship of the unmanned aircraft 100A belonging to the same flight group based on the instruction signal from the transmitter 50A ( S23).
- Thereby, the unmanned aerial vehicle 100A can acquire the relative position information of the plurality of unmanned aircraft 100A belonging to the flight group when receiving a flight control instruction from the transmitter 50A during flight.
- By being operated by the single transmitter 50A, the unmanned aircraft 100A can fly in cooperation with another unmanned aircraft 100A while maintaining the relative positional relationship, even on a flight route or at a flight position that is not set in advance. Therefore, even when the unmanned aircraft 100A performs coordinated flight with another unmanned aircraft 100A, the flight route and the like can be designated in real time by the one transmitter 50A. In other words, the unmanned aircraft 100A can improve the degree of freedom of flight during coordinated flight.
- In the flight system 10A, since each unmanned aircraft 100A is operated by one transmitter 50A, the plurality of unmanned aircraft 100A can fly in a coordinated manner while maintaining their relative positional relationship, even on a flight route or at a flight position that is not set in advance. Therefore, even when the plurality of unmanned aircraft 100A in the flight system 10A perform coordinated flight, the flight route and the like can be designated in real time by one transmitter 50A. That is, the flight system 10A can improve the degree of freedom of flight during the coordinated flight of the plurality of unmanned aircraft 100A.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Aviation & Aerospace Engineering (AREA)
- Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- General Physics & Mathematics (AREA)
- Computing Systems (AREA)
- Mathematical Physics (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Traffic Control Systems (AREA)
- Toys (AREA)
Abstract
Description
FIG. 1 is a schematic diagram showing a configuration example of the flight system 10 in the first embodiment. The flight system 10 includes the unmanned aerial vehicle 100, the transmitter 50, and the mobile terminal 80. The unmanned aerial vehicle 100, the transmitter 50, and the mobile terminal 80 can communicate with one another by wired or wireless communication (for example, a wireless LAN (Local Area Network)).
FIG. 20 is a flowchart showing an operation example of the flight system 10.
In the second embodiment, it is assumed that a plurality of unmanned aerial vehicles forming a flight group fly in accordance with operation signals from a transmitter while taking set relative position information into account. The relative position information may be the relative position information described in the first embodiment.
FIG. 33 is a flowchart showing an operation example of the unmanned aerial vehicle 100A. A plurality of unmanned aerial vehicles 100A belonging to the same flight group each operate in the same manner.
50, 50A Transmitter
50B Housing
53L Left control stick
53R Right control stick
61 Transmitter control unit
63 Wireless communication unit
65 Interface unit
80, 80A Mobile terminal
81, 81A Processor
82 Interface unit
83 Operation unit
85 Wireless communication unit
87 Memory
88 Display
100, 100A, 100r1, 100r2, 100r3, 100r4, 100r5 Unmanned aerial vehicle
102 UAV main body
103 Battery
110, 110A UAV control unit
111 Signal acquisition unit
112 First relative position acquisition unit
113 First absolute position acquisition unit
114 Second absolute position acquisition unit
115 Captured image acquisition unit
116 Second relative position acquisition unit
117 Angle of view information acquisition unit
118 Operation mode setting unit
119 Flight control unit
120 Imaging control unit
150 Communication interface
160 Memory
200 Gimbal
210 Rotary wing mechanism
211, 211a, 211b, 211c, 211d Rotary wing
212 Drive motor
213 Current sensor
220, 230 Imaging device
240 GPS receiver
250 Inertial measurement unit
260 Magnetic compass
270 Barometric altimeter
280 Ultrasonic sensor
290 Laser measuring device
811 UAV designation unit
812 Position information acquisition unit
813 Relative position processing unit
814 Imaging information processing unit
816 Captured image acquisition unit
817 Image processing unit
AN1, AN2 Antenna
B1 Power button
B2 RTH button
L1 Remote status display section
L2 Remaining battery level display section
OPS Operation unit set
G11, G12, G13, G14 UAV image
Claims (50)
- 複数の飛行体の位置情報を処理する位置処理装置であって、
前記複数の飛行体を選択し、選択された前記複数の飛行体が飛行グループを形成する選択部と、
前記飛行体の制御を指示する操作装置による操作中における、前記飛行グループに属する前記複数の飛行体の相対的な位置情報である第1の相対的な位置情報を決定する決定部と、
を備える位置処理装置。 - 前記決定部は、前記第1の相対的な位置情報として、前記飛行グループに属する複数の飛行体の基準位置に対する前記複数の飛行体の各々の相対的な位置情報を決定する、
請求項1に記載の位置処理装置。 - 前記決定部は、前記複数の飛行体の各々の識別情報と、前記識別情報により識別される飛行体の各々の相対的な位置情報と、を関連付けて決定する、
請求項2に記載の位置処理装置。 - 前記第1の相対的な位置情報は、前記複数の飛行体の3次元空間における相対的な位置情報を含む、
請求項1~3のいずれか1項に記載の位置処理装置。 - 前記第1の相対的な位置情報は、前記複数の飛行体の水平方向の距離情報を含む、
請求項4に記載の位置処理装置。 - 前記第1の相対的な位置情報は、前記複数の飛行体の重力方向の距離情報を含む、
請求項4に記載の位置処理装置。 - 前記複数の飛行体を示す複数の飛行体画像を表示する表示部と、
入力を受け付ける操作部と、を更に備え、
前記決定部は、前記表示部に表示された前記複数の飛行体画像の位置を、前記操作部への入力により変更することで、前記第1の相対的な位置情報を変更する、
請求項1~6のいずれか1項に記載の位置処理装置。 - 前記操作部は、ドラッグ操作による入力を受け付ける、
請求項7に記載の位置処理装置。 - 前記表示部は、前記ドラッグ操作により変更された複数の前記飛行体画像の位置に基づいて、前記複数の飛行体の間の距離情報を表示する、
請求項8に記載の位置処理装置。 - 入力を受け付ける操作部、を更に備え、
前記決定部は、前記操作部へ入力された前記複数の飛行体の間の距離情報に基づいて、前記第1の相対的な位置情報を決定する、
請求項1~6のいずれか1項に記載の位置処理装置。 - 前記複数の飛行体の各々の位置情報を取得する取得部、を更に備え、
前記決定部は、取得された複数の前記位置情報の差分に基づく相対的な位置情報である第2の相対的な位置情報に基づいて、前記第1の相対的な位置情報を決定する、
請求項1~6のいずれか1項に記載の位置処理装置。 - 前記第1の相対的な位置情報を出力する出力部、を更に備える、
請求項1~11のいずれか1項に記載の位置処理装置。 - 他の飛行体とともに飛行グループを形成して飛行する飛行体であって、
前記飛行グループに属する複数の飛行体の制御を指示する操作装置から飛行の制御を指示する指示信号と、前記飛行グループに属する前記複数の飛行体の基準位置に対する前記飛行体の相対的な位置情報である第1の相対的な位置情報と、を取得する第1の取得部と、
前記指示信号と前記第1の相対的な位置情報とに基づいて、前記基準位置と前記飛行体との間の相対的な位置関係を固定して、前記飛行体の飛行を制御する制御部と、
を備える飛行体。 - 前記指示信号は、前記複数の飛行体の旋回を指示するための第1旋回指示情報を含み、
前記制御部は、前記第1旋回指示情報に基づいて、前記飛行体と前記飛行グループに属する前記複数の飛行体の基準位置との距離を固定して、前記基準位置を中心として前記飛行体が旋回するように、前記飛行体を制御する、
請求項13に記載の飛行体。 - 前記指示信号は、前記複数の飛行体の旋回を指示するための第2旋回指示情報を含み、
前記制御部は、前記第2旋回指示情報に基づいて、前記飛行体の位置を固定して、前記飛行体の位置を中心として前記飛行体が旋回するように、前記飛行体の飛行を制御する、
請求項13に記載の飛行体。 - 第1の撮像部、を更に備え、
前記制御部は、
協調して飛行する前記飛行体の台数に基づいて、前記第1の撮像部の画角を制御し、
前記第1の相対的な位置情報に基づいて、前記第1の撮像部の撮像方向を制御する、
請求項13~15のいずれか1項に記載の飛行体。 - 前記飛行体の飛行位置を示す第1の飛行位置情報を取得する第2の取得部と、
前記基準位置と前記第1の相対的な位置情報とを基に、前記飛行体の飛行位置を示す第2の飛行位置情報を計算する計算部と、
を更に備え、
前記制御部は、前記第1の飛行位置情報と前記第2の飛行位置情報とが一致するように、前記飛行体の飛行を制御する、
請求項16のいずれか1項に記載の飛行体。 - 前記第1の撮像部は、前記第1の撮像部の画角を示す第1の画角の情報を取得し、
前記第1の取得部は、前記他の飛行体が備える第2の撮像部の画角を示す第2の画角の情報を取得し、
前記制御部は、前記第1の画角と前記第2の画角との差が略一定となるよう、前記飛行体の飛行を制御する、
請求項16に記載の飛行体。 - 前記第1の取得部は、前記他の飛行体が備える第2の撮像部により撮像された第2の撮像画像を取得し、
前記計算部は、前記第1の撮像部により撮像された第1の撮像画像及び前記第2の撮像画像に基づいて、前記他の飛行体に対する前記飛行体の相対的な位置情報である第2の相対的な位置情報を計算し、
前記第1の相対的な位置情報は、前記他の飛行体に対する前記飛行体の相対的な位置情報である第3の相対的な位置情報を含み、
前記制御部は、前記第2の相対的な位置情報と前記第3の相対的な位置情報とが一致するよう、前記飛行体の飛行を制御する、
請求項17に記載の飛行体。 - 前記飛行体と前記他の飛行体との間の距離を測定し、第1の距離情報を得る測距センサ、を更に備え、
前記第1の相対的な位置情報は、前記飛行体と前記他の飛行体との間の距離を示す第2の距離情報を含み、
前記制御部は、前記第1の距離情報と前記第2の距離情報とが一致するよう、前記飛行体の飛行を制御する、
請求項13~16のいずれか1項に記載の飛行体。 - 複数の飛行体の位置情報を処理する位置処理システムであって、
前記複数の飛行体を選択し、選択された前記複数の飛行体が属する飛行グループを形成する選択部と、
前記飛行体の制御を指示する操作装置による操作中における、前記飛行グループに属する前記複数の飛行体の相対的な位置情報を決定する決定部と、
前記相対的な位置情報を前記複数の飛行体に設定する設定部と、
を備える位置処理システム。 - 飛行グループを形成して飛行する複数の飛行体と、前記複数の飛行体の制御を指示する操作装置と、を備える飛行システムであって、
前記操作装置は、
前記複数の飛行体の飛行の制御を指示する指示信号を送信し、
前記複数の飛行体の各々は、
前記指示信号を受信し、
前記飛行グループに属する前記複数の飛行体の相対的な位置情報を取得し、
前記指示信号と前記相対的な位置情報とに基づいて、前記複数の飛行体の相対的な位置関係を固定して、前記飛行体の各々の飛行を制御する、
飛行システム。 - 画像処理装置、を更に備え、
前記複数の飛行体の各々は、
異なる撮像方向を撮像して撮像画像を取得し、
前記撮像画像を前記画像処理装置へ送信し、
前記画像処理装置は、
前記複数の飛行体の各々からの複数の撮像画像を受信し、
前記複数の撮像画像に基づいて、パノラマ画像及びステレオ画像の少なくとも一方を生成する、
請求項22に記載の飛行システム。 - 複数の飛行体の位置情報を処理する位置処理装置における位置処理方法であって、
前記複数の飛行体を選択し、選択された前記複数の飛行体が属する飛行グループを形成するステップと、
前記飛行体の制御を指示する操作装置による操作中における、前記飛行グループに属する複数の飛行体の相対的な位置情報である第1の相対的な位置情報を決定するステップと、
を有する位置処理方法。 - 前記位置情報を決定するステップは、前記第1の相対的な位置情報として、前記飛行グループに属する複数の飛行体の基準位置に対する前記複数の飛行体の各々の相対的な位置情報を決定するステップを含む、
請求項24に記載の位置処理方法。 - 前記位置情報を決定するステップは、前記複数の飛行体の各々の識別情報と、前記識別情報により識別される飛行体の各々の相対的な位置情報と、を関連付けて設定するステップを含む、
請求項25に記載の位置処理方法。 - 前記第1の相対的な位置情報は、前記複数の飛行体の3次元空間における相対的な位置情報を含む、
請求項24~26のいずれか1項に記載の位置処理方法。 - 前記第1の相対的な位置情報は、前記複数の飛行体の水平方向の距離情報を含む、
請求項27のいずれか1項に記載の位置処理方法。 - 前記第1の相対的な位置情報は、前記複数の飛行体の重力方向の距離情報を含む、
請求項27に記載の位置処理方法。 - 前記複数の飛行体を示す複数の飛行体画像を表示するステップと、
操作部への入力を受け付けるステップと、を更に含み、
前記位置情報を決定するステップは、表示された前記複数の飛行体画像の位置を、前記入力により変更することで、前記第1の相対的な位置情報を変更するステップを含む、
請求項24~29のいずれか1項に記載の位置処理方法。 - 前記入力を受け付けるステップは、ドラッグ操作による入力を受け付けるステップを含む、
請求項30に記載の位置処理方法。 - 前記飛行体画像を表示するステップは、前記ドラッグ操作により変更された複数の前記飛行体画像の位置に基づいて、前記複数の飛行体の間の距離情報を表示するステップを含む、
請求項31に記載の位置処理方法。 - 操作部への入力を受け付けるステップ、を更に含み、
前記位置情報を決定するステップは、前記操作部へ入力された前記複数の飛行体の間の距離情報に基づいて、前記第1の相対的な位置情報を決定するステップを含む、
請求項24~29のいずれか1項に記載の位置処理方法。 - 前記複数の飛行体の各々の位置情報を取得するステップ、を更に含み、
前記位置情報を決定するステップは、取得された複数の前記位置情報の差分に基づく相対的な位置情報である第2の相対的な位置情報に基づいて、前記第1の相対的な位置情報を決定するステップを含む、
請求項24~29のいずれか1項に記載の位置処理方法。 - 前記第1の相対的な位置情報を出力するステップ、を更に含む、
The position processing method according to any one of claims 24 to 34. - A flight control method for a flying object that flies in a flight group formed together with other flying objects, comprising:
a step of acquiring, from an operation device that instructs control of a plurality of flying objects belonging to the flight group, an instruction signal instructing flight control;
a step of acquiring first relative position information, which is relative position information of the flying object with respect to a reference position of the plurality of flying objects belonging to the flight group; and
a step of controlling the flight of the flying object based on the instruction signal and the first relative position information while fixing the relative positional relationship between the reference position and the flying object,
A flight control method comprising the above steps. - The instruction signal includes first turn instruction information for instructing a turn of the plurality of flying objects, and
the step of controlling the flight of the flying object includes a step of controlling the flying object based on the first turn instruction information so that the flying object turns around the reference position of the plurality of flying objects belonging to the flight group while the distance between the flying object and the reference position is kept fixed,
The flight control method according to claim 36. - The instruction signal includes second turn instruction information for instructing a turn of the plurality of flying objects, and
the step of controlling the flight of the flying object includes a step of controlling the flight of the flying object based on the second turn instruction information so that the flying object turns around its own position while the position of the flying object is kept fixed,
The flight control method according to claim 36. - Further comprising a step of controlling the angle of view of a first imaging unit provided in the flying object based on the number of flying objects belonging to the flight group, and
a step of controlling the imaging direction of the first imaging unit based on the first relative position information,
The flight control method according to any one of claims 36 to 38. - Further comprising a step of acquiring first flight position information indicating the flight position of the flying object, and
a step of calculating second flight position information indicating the flight position of the flying object based on the reference position and the first relative position information, wherein
the step of controlling the flight of the flying object includes a step of controlling the flight of the flying object so that the first flight position information and the second flight position information coincide,
The flight control method according to claim 39. - Further comprising a step of acquiring first angle-of-view information indicating the angle of view of the first imaging unit, and
a step of acquiring second angle-of-view information indicating the angle of view of a second imaging unit provided in the other flying object, wherein
the step of controlling the flight of the flying object includes a step of controlling the flight of the flying object so that the difference between the first angle of view and the second angle of view remains substantially constant,
The flight control method according to claim 39. - Further comprising a step of capturing an image with the first imaging unit to obtain a first captured image,
a step of acquiring a second captured image captured by a second imaging unit provided in the other flying object, and
a step of calculating, based on the first captured image and the second captured image, second relative position information that is relative position information of the flying object with respect to the other flying object, wherein
the first relative position information includes third relative position information that is relative position information of the flying object with respect to the other flying object, and
the step of controlling the flight of the flying object includes a step of controlling the flight of the flying object so that the second relative position information and the third relative position information coincide,
The flight control method according to claim 40. - Further comprising a step of measuring the distance between the flying object and the other flying object to obtain first distance information, wherein
the first relative position information includes second distance information indicating the distance between the flying object and the other flying object, and
the step of controlling the flight of the flying object includes a step of controlling the flight of the flying object so that the first distance information and the second distance information coincide,
The flight control method according to any one of claims 36 to 39. - A position processing method in a position processing system that processes position information of a plurality of flying objects, comprising:
a step of selecting the plurality of flying objects and forming a flight group to which the selected flying objects belong;
a step of determining relative position information of the plurality of flying objects belonging to the flight group during operation by an operation device that instructs control of the flying objects; and
a step of setting the relative position information in the plurality of flying objects,
A position processing method comprising the above steps. - A flight control method in a flight system including a plurality of flying objects that form and fly in a flight group and an operation device that instructs control of the plurality of flying objects, comprising:
a step of acquiring an instruction signal instructing flight control of the plurality of flying objects;
a step of acquiring relative position information of the plurality of flying objects belonging to the flight group; and
a step of controlling the flight of each of the flying objects based on the instruction signal and the relative position information while fixing the relative positional relationship among the plurality of flying objects,
A flight control method comprising the above steps. - Further comprising a step of capturing images in different imaging directions with each of the plurality of flying objects,
a step of acquiring the plurality of captured images, and
a step of generating at least one of a panoramic image and a stereo image based on the plurality of captured images,
The flight control method according to claim 45. - A program for causing a computer, which is a position processing device that processes position information of a plurality of flying objects, to execute:
a step of selecting the plurality of flying objects and forming a flight group to which the selected flying objects belong; and
a step of determining first relative position information, which is relative position information of the plurality of flying objects belonging to the flight group, during operation by an operation device that instructs control of the flying objects,
the program causing the computer to execute the above steps. - A program for causing a flying object that flies in a flight group formed together with other flying objects to execute:
a step of acquiring, from an operation device that instructs control of a plurality of flying objects belonging to the flight group, an instruction signal instructing flight control;
a step of acquiring relative position information of the flying object with respect to a reference position of the plurality of flying objects belonging to the flight group; and
a step of controlling the flight of the flying object based on the instruction signal and the relative position information while fixing the relative positional relationship between the reference position and the flying object,
the program causing the flying object to execute the above steps. - A computer-readable recording medium storing a program for causing a computer, which is a position processing device that processes position information of a plurality of flying objects, to execute:
a step of selecting the plurality of flying objects and forming a flight group to which the selected flying objects belong; and
a step of determining first relative position information, which is relative position information of the plurality of flying objects belonging to the flight group, during operation by an operation device that instructs control of the flying objects,
the recording medium storing the program for causing the computer to execute the above steps. - A computer-readable recording medium storing a program for causing a computer, which is a flying object that flies in a flight group formed together with other flying objects, to execute:
a step of acquiring, from an operation device that instructs control of a plurality of flying objects belonging to the flight group, an instruction signal instructing flight control;
a step of acquiring relative position information of the flying object with respect to a reference position of the plurality of flying objects belonging to the flight group; and
a step of controlling the flight of the flying object based on the instruction signal and the relative position information while fixing the relative positional relationship between the reference position and the flying object,
the recording medium storing the program for causing the flying object to execute the above steps.
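The group-flight claims above fix each vehicle's offset ("first relative position information") from a shared reference position while a single instruction signal moves the whole group. A minimal sketch of that behaviour, assuming a simple 2-D kinematic model; the names `Formation`, `move`, and `targets` are illustrative, not from the patent.

```python
# Sketch of formation flight with fixed relative positions: one
# instruction signal translates the reference position, and every
# member's target is reference + its fixed offset.

class Formation:
    def __init__(self, reference, offsets):
        self.reference = list(reference)                          # shared reference position (x, y)
        self.offsets = {k: tuple(v) for k, v in offsets.items()}  # per-vehicle fixed offsets

    def move(self, dx, dy):
        """Apply one instruction signal: translate the whole group."""
        self.reference[0] += dx
        self.reference[1] += dy

    def targets(self):
        """Target position of each vehicle, offsets kept fixed."""
        rx, ry = self.reference
        return {k: (rx + ox, ry + oy) for k, (ox, oy) in self.offsets.items()}

f = Formation(reference=(0.0, 0.0),
              offsets={"uav1": (5.0, 0.0), "uav2": (-5.0, 0.0)})
f.move(10.0, 2.0)
print(f.targets())  # offsets unchanged, whole group shifted by (10, 2)
```

Because only the reference position is commanded, the relative positional relationship among the members is preserved by construction, which is the point of the fixed-offset claims.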
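The first turn instruction (each vehicle orbits the reference position at a fixed distance) amounts to rotating each vehicle's offset vector about the reference. A sketch under the same 2-D assumption; `orbit_step` is an illustrative name.

```python
import math

def orbit_step(position, reference, angle_rad):
    """Rotate `position` about `reference` by `angle_rad`,
    preserving the distance to the reference."""
    ox = position[0] - reference[0]
    oy = position[1] - reference[1]
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (reference[0] + c * ox - s * oy,
            reference[1] + s * ox + c * oy)

# Quarter turn around the origin: distance to the reference stays 10.
p = orbit_step((10.0, 0.0), (0.0, 0.0), math.pi / 2)
```

The second turn instruction (turning in place) is the degenerate case where the rotation centre is the vehicle's own position, so only its heading changes.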
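Claim 34 derives a "second relative position information" from differences among the vehicles' measured absolute positions. Taking the group's centre of gravity as the reference position (one natural choice, not mandated by the claims), this reduces to a centroid subtraction:

```python
def relative_positions(absolute):
    """Per-vehicle offsets relative to the group centroid, computed
    from differences of the measured absolute positions."""
    n = len(absolute)
    cx = sum(p[0] for p in absolute.values()) / n
    cy = sum(p[1] for p in absolute.values()) / n
    return {k: (p[0] - cx, p[1] - cy) for k, p in absolute.items()}

rel = relative_positions({"uav1": (2.0, 0.0), "uav2": (4.0, 0.0)})
# centroid is (3, 0), so uav1 -> (-1, 0) and uav2 -> (1, 0)
```

These derived offsets can then seed or correct the "first relative position information" that the flight controller holds fixed.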
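Claim 39 sets each vehicle's angle of view from the number of vehicles in the group and its imaging direction from its relative position, which together let the group cover a full panorama. A hedged sketch of one plausible policy (equal angular share plus an assumed overlap margin, camera pointed outward from the reference); neither the formula nor the names are from the patent.

```python
import math

def imaging_setup(n_vehicles, offset, overlap_deg=10.0):
    """Angle of view as an equal share of 360 degrees plus an overlap
    margin, and imaging direction pointing outward along the vehicle's
    offset from the reference position."""
    fov_deg = 360.0 / n_vehicles + overlap_deg
    heading_deg = math.degrees(math.atan2(offset[1], offset[0]))
    return fov_deg, heading_deg

# Four vehicles: each covers a quarter of the panorama plus margin;
# a vehicle offset straight "north" of the reference faces outward at 90 deg.
fov, heading = imaging_setup(4, (0.0, 1.0))
```

Fewer vehicles in the group yield a wider per-vehicle angle of view, matching the claim's dependence on the vehicle count.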
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201780085560.3A CN110249281B (zh) | 2017-02-10 | 2017-02-10 | Position processing device, flying object, and flight system |
PCT/JP2017/005016 WO2018146803A1 (ja) | 2017-02-10 | 2017-02-10 | Location processing device, flight vehicle, location processing system, flight system, location processing method, flight control method, program and recording medium |
JP2018566725A JP6862477B2 (ja) | 2017-02-10 | 2017-02-10 | Location processing device, flight vehicle, location processing system, flight system, location processing method, flight control method, program and recording medium |
US16/532,864 US11513514B2 (en) | 2017-02-10 | 2019-08-06 | Location processing device, flight vehicle, location processing system, flight system, location processing method, flight control method, program and recording medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2017/005016 WO2018146803A1 (ja) | 2017-02-10 | 2017-02-10 | Location processing device, flight vehicle, location processing system, flight system, location processing method, flight control method, program and recording medium |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/532,864 Continuation US11513514B2 (en) | 2017-02-10 | 2019-08-06 | Location processing device, flight vehicle, location processing system, flight system, location processing method, flight control method, program and recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018146803A1 true WO2018146803A1 (ja) | 2018-08-16 |
Family
ID=63108283
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/005016 WO2018146803A1 (ja) | 2017-02-10 | 2017-02-10 | Location processing device, flight vehicle, location processing system, flight system, location processing method, flight control method, program and recording medium |
Country Status (4)
Country | Link |
---|---|
US (1) | US11513514B2 (ja) |
JP (1) | JP6862477B2 (ja) |
CN (1) | CN110249281B (ja) |
WO (1) | WO2018146803A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020088397A1 (zh) * | 2018-10-31 | 2020-05-07 | 深圳市大疆创新科技有限公司 | Position estimation device, position estimation method, program, and recording medium |
JP2021189002A (ja) * | 2020-05-28 | 2021-12-13 | 株式会社日立製作所 | Distance measurement system and distance measurement method |
WO2022018790A1 (ja) * | 2020-07-20 | 2022-01-27 | 株式会社ナイルワークス | Unmanned aerial vehicle control system |
JP7063966B1 (ja) | 2020-10-21 | 2022-05-09 | 西日本電信電話株式会社 | Mobile body control system |
WO2023203673A1 (ja) * | 2022-04-20 | 2023-10-26 | 株式会社クボタ | Flying object control system and flying object system |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102029906B1 (ko) * | 2017-11-10 | 2019-11-08 | 전자부품연구원 | Apparatus and method for providing virtual reality content for vehicles |
JP6896962B2 (ja) * | 2019-12-13 | 2021-06-30 | SZ DJI Technology Co., Ltd. | Determination device, flying object, determination method, and program |
JP2021094890A (ja) * | 2019-12-13 | 2021-06-24 | SZ DJI Technology Co., Ltd. | Flying object and control method |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050230563A1 (en) * | 2004-02-21 | 2005-10-20 | Corcoran James J Iii | Automatic formation flight control system |
JP2010188893A (ja) * | 2009-02-19 | 2010-09-02 | Japan Aerospace Exploration Agency | Three-dimensional group control method for moving bodies |
JP2015191254A (ja) * | 2014-03-27 | 2015-11-02 | 日本電気株式会社 | Unmanned aerial vehicle, control method for unmanned aerial vehicle, and air-traffic control system |
Family Cites Families (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6926233B1 (en) * | 2004-02-21 | 2005-08-09 | Corcoran, Iii James John | Automatic formation flight control system (AFFCS)—a system for automatic formation flight control of vehicles not limited to aircraft, helicopters, or space platforms |
IL162767A (en) * | 2004-06-29 | 2008-04-13 | Israel Aerospace Ind Ltd | Collision avoidance system and a method thereof |
US20060167596A1 (en) * | 2005-01-24 | 2006-07-27 | Bodin William K | Depicting the flight of a formation of UAVs |
DE102005038017B3 (de) * | 2005-08-09 | 2007-05-10 | Eads Deutschland Gmbh | Method for flight guidance of several aircraft flying in formation |
US7894948B2 (en) * | 2007-11-01 | 2011-02-22 | L-3 Communications Integrated Systems L.P. | Systems and methods for coordination of entities and/or communicating location information |
US8538673B2 (en) * | 2008-10-31 | 2013-09-17 | Czech Technical University In Prague | System and method for planning/replanning collision free flight plans in real or accelerated time |
US9762795B2 (en) * | 2013-09-04 | 2017-09-12 | Gyeongil Kweon | Method and apparatus for obtaining rectilinear images using rotationally symmetric wide-angle lens |
WO2012002976A1 (en) * | 2010-07-01 | 2012-01-05 | Mearthane Products Corporation | High performance resilient skate wheel with compression modulus gradient |
JP5618840B2 (ja) * | 2011-01-04 | 2014-11-05 | 株式会社トプコン | Flight control system for flying objects |
FR2977333B1 (fr) * | 2011-06-28 | 2014-01-31 | Parrot | Method for dynamically controlling the attitude of a drone in order to automatically execute a spin- or flip-type figure |
JP5979916B2 (ja) * | 2012-03-07 | 2016-08-31 | キヤノン株式会社 | Information processing apparatus and control method therefor |
US8949090B2 (en) * | 2013-01-28 | 2015-02-03 | The Boeing Company | Formation flight control |
CN104062980A (zh) * | 2014-06-10 | 2014-09-24 | 北京空间机电研究所 | Onboard panoramic monitoring system for unmanned aerial vehicles |
SG10201406357QA (en) * | 2014-10-03 | 2016-05-30 | Infinium Robotics Pte Ltd | System for performing tasks in an operating region and method of controlling autonomous agents for performing tasks in the operating region |
CN105518415A (zh) * | 2014-10-22 | 2016-04-20 | 深圳市大疆创新科技有限公司 | Flight route setting method and device |
CN105518487B (zh) * | 2014-10-27 | 2017-09-12 | 深圳市大疆创新科技有限公司 | Aircraft position prompting method and device |
US9997079B2 (en) * | 2014-12-12 | 2018-06-12 | Amazon Technologies, Inc. | Commercial and general aircraft avoidance using multi-spectral wave detection |
RU2585204C1 (ru) * | 2015-01-29 | 2016-05-27 | Открытое акционерное общество "Раменское приборостроительное конструкторское бюро" (ОАО "РПКБ") | Method of controlling an aircraft when approaching a navigation point from a given direction |
JP6594008B2 (ja) * | 2015-03-23 | 2019-10-23 | 株式会社メガチップス | Mobile body control device, landmark, and program |
JP6525145B2 (ja) * | 2015-04-23 | 2019-06-05 | 有限会社大平技研 | Luminous-point figure pattern display system using flying objects, luminous-point figure pattern display method, and flying objects used in the system and method |
US11034443B2 (en) * | 2015-06-12 | 2021-06-15 | Sunlight Aerospace Inc. | Modular aircraft assembly for airborne and ground transport |
JP6942909B2 (ja) * | 2015-07-27 | 2021-09-29 | Genghiscomm Holdings, LLC | Airborne relays in cooperative MIMO systems |
US9928748B2 (en) * | 2015-11-25 | 2018-03-27 | International Business Machines Corporation | Dynamic geo-fence for drone |
CN105425817B (zh) * | 2015-12-09 | 2018-06-22 | 深圳市峰创科技有限公司 | Multi-UAV formation flight control system |
CN105511494B (zh) * | 2016-01-20 | 2018-06-19 | 浙江大学 | Distributed formation control method for multiple unmanned aerial vehicles |
CN105717933A (zh) * | 2016-03-31 | 2016-06-29 | 深圳奥比中光科技有限公司 | Unmanned aerial vehicle and collision avoidance method therefor |
CN107368084A (zh) * | 2016-05-11 | 2017-11-21 | 松下电器(美国)知识产权公司 | Flight control method and unmanned aerial vehicle |
CN105955308B (zh) * | 2016-05-20 | 2018-06-29 | 腾讯科技(深圳)有限公司 | Aircraft control method and device |
CN106227224A (zh) * | 2016-07-28 | 2016-12-14 | 零度智控(北京)智能科技有限公司 | Flight control method, device, and unmanned aerial vehicle |
US10118292B2 (en) * | 2016-08-18 | 2018-11-06 | Saudi Arabian Oil Company | Systems and methods for configuring field devices using a configuration device |
US10163357B2 (en) * | 2016-08-24 | 2018-12-25 | Qualcomm Incorporated | Navigation assistance data and route planning for drones |
US10114384B2 (en) * | 2016-09-13 | 2018-10-30 | Arrowonics Technologies Ltd. | Formation flight path coordination of unmanned aerial vehicles |
CN106292294B (zh) * | 2016-10-20 | 2018-11-20 | 南京航空航天大学 | Automatic carrier-landing control device for a shipborne unmanned aerial vehicle based on model reference adaptive control |
KR20210084768A (ko) * | 2019-12-27 | 2021-07-08 | 엘지전자 주식회사 | Unmanned aerial vehicle |
2017
- 2017-02-10 WO PCT/JP2017/005016 patent/WO2018146803A1/ja active Application Filing
- 2017-02-10 JP JP2018566725A patent/JP6862477B2/ja active Active
- 2017-02-10 CN CN201780085560.3A patent/CN110249281B/zh active Active
2019
- 2019-08-06 US US16/532,864 patent/US11513514B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050230563A1 (en) * | 2004-02-21 | 2005-10-20 | Corcoran James J Iii | Automatic formation flight control system |
JP2010188893A (ja) * | 2009-02-19 | 2010-09-02 | Japan Aerospace Exploration Agency | Three-dimensional group control method for moving bodies |
JP2015191254A (ja) * | 2014-03-27 | 2015-11-02 | 日本電気株式会社 | Unmanned aerial vehicle, control method for unmanned aerial vehicle, and air-traffic control system |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020088397A1 (zh) * | 2018-10-31 | 2020-05-07 | 深圳市大疆创新科技有限公司 | Position estimation device, position estimation method, program, and recording medium |
CN111615616A (zh) * | 2018-10-31 | 2020-09-01 | 深圳市大疆创新科技有限公司 | Position estimation device, position estimation method, program, and recording medium |
JP2021189002A (ja) * | 2020-05-28 | 2021-12-13 | 株式会社日立製作所 | Distance measurement system and distance measurement method |
JP7369093B2 (ja) | 2020-05-28 | 2023-10-25 | 株式会社日立製作所 | Distance measurement system and distance measurement method |
WO2022018790A1 (ja) * | 2020-07-20 | 2022-01-27 | 株式会社ナイルワークス | Unmanned aerial vehicle control system |
JP7412041B2 (ja) | 2020-07-20 | 2024-01-12 | 株式会社ナイルワークス | Unmanned aerial vehicle control system |
JP7063966B1 (ja) | 2020-10-21 | 2022-05-09 | 西日本電信電話株式会社 | Mobile body control system |
JP2022068016A (ja) * | 2020-10-21 | 2022-05-09 | 西日本電信電話株式会社 | Mobile body control system |
WO2023203673A1 (ja) * | 2022-04-20 | 2023-10-26 | 株式会社クボタ | Flying object control system and flying object system |
Also Published As
Publication number | Publication date |
---|---|
JP6862477B2 (ja) | 2021-04-21 |
US20190361435A1 (en) | 2019-11-28 |
CN110249281B (zh) | 2022-11-22 |
CN110249281A (zh) | 2019-09-17 |
JPWO2018146803A1 (ja) | 2019-11-21 |
US11513514B2 (en) | 2022-11-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11377211B2 (en) | Flight path generation method, flight path generation system, flight vehicle, program, and storage medium | |
JP6862477B2 (ja) | Location processing device, flight vehicle, location processing system, flight system, location processing method, flight control method, program, and recording medium | |
JP6878567B2 (ja) | Three-dimensional shape estimation method, flying object, mobile platform, program, and recording medium | |
JP6765512B2 (ja) | Flight path generation method, information processing device, flight path generation system, program, and recording medium | |
US20190318636A1 (en) | Flight route display method, mobile platform, flight system, recording medium and program | |
US11122209B2 (en) | Three-dimensional shape estimation method, three-dimensional shape estimation system, flying object, program and recording medium | |
US11082639B2 (en) | Image display method, image display system, flying object, program, and recording medium | |
US20210034052A1 (en) | Information processing device, instruction method for prompting information, program, and recording medium | |
CN111213107B (zh) | Information processing device, imaging control method, program, and recording medium | |
JP2019101587A (ja) | Information processing device, flight control instruction method, program, and recording medium | |
JP7067897B2 (ja) | Information processing device, flight control instruction method, program, and recording medium | |
JP6921026B2 (ja) | Transmitter, flying object, flight control instruction method, flight control method, program, and storage medium | |
JP6684012B1 (ja) | Information processing device and information processing method | |
WO2018138882A1 (ja) | Flying object, operation control method, operation control system, program, and recording medium | |
KR102542181B1 (ko) | Method and device for controlling an unmanned aerial vehicle for 360-degree VR video production
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17895611 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2018566725 Country of ref document: JP Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 25/06/2020) |
122 | Ep: pct application non-entry in european phase |
Ref document number: 17895611 Country of ref document: EP Kind code of ref document: A1 |