WO2024069788A1 - Mobile body system, aerial photography system, aerial photography method, and aerial photography program - Google Patents

Mobile body system, aerial photography system, aerial photography method, and aerial photography program

Info

Publication number
WO2024069788A1
Authority
WO
WIPO (PCT)
Prior art keywords
flight
mode
court
flight mode
drone
Prior art date
Application number
PCT/JP2022/036119
Other languages
French (fr)
Japanese (ja)
Inventor
望 三浦
Original Assignee
株式会社RedDotDroneJapan
株式会社DRONE iPLAB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社RedDotDroneJapan and 株式会社DRONE iPLAB
Priority to PCT/JP2022/036119
Publication of WO2024069788A1


Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C - AEROPLANES; HELICOPTERS
    • B64C13/00 - Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
    • B64C13/02 - Initiating means
    • B64C13/16 - Initiating means actuated automatically, e.g. responsive to gust detectors
    • B64C13/18 - Initiating means actuated automatically, e.g. responsive to gust detectors, using automatic pilot
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C - AEROPLANES; HELICOPTERS
    • B64C39/00 - Aircraft not otherwise provided for
    • B64C39/02 - Aircraft not otherwise provided for characterised by special use
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D - EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00 - Equipment not otherwise provided for
    • B64D47/08 - Arrangements of cameras

Definitions

  • The present invention relates to a mobile body system, an aerial photography system, an aerial photography method, and an aerial photography program.
  • Patent Document 1 discloses a camera viewpoint display system that detects the aircraft's position and nose direction, as well as the pan and tilt angles of a camera device mounted on the aircraft, calculates the camera viewpoint from each of these pieces of data, and displays the viewpoint on a map on a monitor screen. With this system, an operator controls the aircraft's position and attitude, as well as the camera's shooting direction, while grasping the aircraft's position and heading from a ground station.
  • The system described in Patent Document 1 was complicated because it required simultaneous control of the camera and the aircraft. Furthermore, in that system, the operation of switching between shooting modes is cumbersome and not easy. In particular, when geofences must be switched in response to changes in the flight area of the aircraft, control interference will occur and the aircraft may not move as intended if the switching is not performed at the appropriate time.
  • The present invention was made in consideration of the above-mentioned problems, and aims to provide a mobile body system that can ensure safety during aerial flight.
  • a mobile body system includes a mobile body that flies in a target area, and a flight mode switching unit that switches flight modes that correspond to a flyable area of the mobile body and a geofence of the mobile body that defines an area that includes the flyable area, and when switching from a first flight mode to a second flight mode, the flight mode switching unit transitions to the second flight mode via a third flight mode, and the third geofence in the third flight mode is different from the first geofence in the first flight mode and the second geofence in the second flight mode.
  • the third geofence may define an area covering an integrated area that combines a first area defined by the first geofence and a second area defined by the second geofence.
  • the third geofence may define an area covering an integrated area that combines a first area defined by the first geofence, a second area defined by the second geofence, and the gap between the first area and the second area.
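  • As an illustration of the transitional (third) geofence described above, the following minimal Python sketch assumes axis-aligned rectangular geofences and derives a third geofence as the bounding region that covers the first area, the second area, and the gap between them; the class and function names are illustrative only and not part of the disclosed system.

```python
from dataclasses import dataclass


@dataclass
class RectGeofence:
    """Axis-aligned rectangular geofence (a simplifying assumption;
    the disclosure does not restrict geofence shape)."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float


def transitional_geofence(g1: RectGeofence, g2: RectGeofence) -> RectGeofence:
    """Third geofence used while switching flight modes: it covers the
    first area, the second area, and any gap between them (here, simply
    their common bounding rectangle)."""
    return RectGeofence(
        x_min=min(g1.x_min, g2.x_min),
        y_min=min(g1.y_min, g2.y_min),
        x_max=max(g1.x_max, g2.x_max),
        y_max=max(g1.y_max, g2.y_max),
    )
```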
  • One of the first flight mode and the second flight mode may be an edge flight mode in which the aircraft flies along the outer edge of the target area, and the other of the first flight mode and the second flight mode may be an intra-target area flight mode in which the aircraft flies above the target area.
  • the device may further include a flight path generation unit that generates a flight path for the moving body in the third flight mode, and that changes the flight path when at least a portion of the flight path is generated outside the third geofence.
  • In that case, the moving body may be caused to perform at least one of the following actions: landing, hovering, or moving toward the inside of the first area, the second area, or the integrated area.
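  • A sketch of the flight path check described above, reusing the RectGeofence type from the previous example: waypoints generated outside the third geofence are clamped back onto its boundary, which is one possible way of "changing the flight path"; landing, hovering, or moving inward are the alternative reactions listed above.

```python
def inside(g: RectGeofence, x: float, y: float) -> bool:
    """True when the point (x, y) lies within the geofence."""
    return g.x_min <= x <= g.x_max and g.y_min <= y <= g.y_max


def validate_path(path: list[tuple[float, float]],
                  fence: RectGeofence) -> list[tuple[float, float]]:
    """Clamp any waypoint generated outside the third geofence back onto
    its boundary (illustrative strategy only)."""
    fixed = []
    for x, y in path:
        if not inside(fence, x, y):
            x = min(max(x, fence.x_min), fence.x_max)
            y = min(max(y, fence.y_min), fence.y_max)
        fixed.append((x, y))
    return fixed
```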
  • an aerial photography system includes a moving object flying in a target area, an event detection unit that detects an event based on an image acquired by a camera photographing the target area or an input from an external system, and a mode switching determination unit that determines whether or not to switch flight modes that define at least a flyable area of the moving object, the flight modes including at least an outer edge flight mode in which the flying object flies above the outer edge of a court configured within the target area along part or all of the outer edge of the court, and an inside court flight mode in which the flying object flies above the court, and the mode switching determination unit determines whether or not to switch flight modes depending on the event detected by the event detection unit.
  • the mode switching determination unit may switch the flight mode in response to an event related to a match taking place in the target area, which is detected from an image captured by the camera.
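  • The event-driven switching decision could be organized as a simple lookup from detected match events to a target flight mode, as in the Python sketch below; the event names and the mapping are hypothetical examples chosen for illustration, not part of the disclosure.

```python
from enum import Enum, auto
from typing import Optional


class FlightMode(Enum):
    OUTER_EDGE = auto()  # flight above the outer edge of the court
    IN_COURT = auto()    # flight above the court


# Hypothetical event-to-mode policy; real events would come from the
# event detection unit (image analysis or an external system).
EVENT_TO_MODE = {
    "kickoff": FlightMode.IN_COURT,
    "ball_out_of_play": FlightMode.OUTER_EDGE,
    "half_time": FlightMode.OUTER_EDGE,
}


def decide_mode_switch(current: FlightMode, event: str) -> Optional[FlightMode]:
    """Return the flight mode to switch to, or None if no switch is needed."""
    target = EVENT_TO_MODE.get(event)
    return target if target is not None and target != current else None
```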
  • the system may further include an in-court flight control unit that generates, in the in-court flight mode, a flight path within the court for the moving body to automatically follow a specific player or ball in the target area, or a flight path for moving the moving body to a target position specified by a user.
  • the device may further include a display control unit that displays information about the moving object on an operation screen, and the display control unit may display the specific player or the ball that is the subject of tracking on the operation screen when the moving object is flying with the automatic tracking.
  • the on-court flight control unit may generate the flight path in the on-court flight mode by connecting a plurality of pre-set shooting positions.
  • the device may further include a display control unit that displays information about the moving object on an operation screen, and the display control unit may display information about the destination shooting position on the operation screen while flying between the shooting positions in the on-court flight mode.
  • the target area is made up of the court and an area outside the court, which are partitioned by connecting a pair of goal lines facing each other and a pair of touch lines facing each other, and further includes an outer edge flight control unit that controls the flight of the moving body in the outer edge flight mode, and the outer edge flight control unit may move the moving body along the touch lines in the outer edge flight mode and automatically photograph the moving body by following a specific player or the ball.
  • The outer edge flight control unit may be configured to move the moving object to the outside of the court area, beyond the touch line along which the moving object is flying, and take a picture directly below the moving object or toward the court when the ball crosses that touch line and enters the outside of the court area.
  • the flight modes further include a fixed position flight mode in which the moving body is caused to fly in a fixed position, and further include a fixed position flight control unit that controls the operation of the moving body in the fixed position flight mode, and the fixed position flight control unit may hover at a predetermined position in the fixed position flight mode, and control the nose direction or the direction of the camera to follow a specific player or the ball and automatically take pictures.
  • When the on-court flight control unit detects an obstacle on the flight path or near the moving object in the on-court flight mode, it may perform at least one of the following actions: changing the connection of the shooting positions to regenerate a flight path that bypasses the obstacle, deciding to fly at a higher altitude than the flight path, starting to move the moving object on the flight path after hovering for a predetermined time, switching to manual control after hovering for a predetermined time, and displaying a message prompting the user to re-input the target position after hovering for a predetermined time.
  • Similarly, when the outer edge flight control unit generates a flight path for the moving body in the outer edge flight mode and detects an obstacle on the flight path or near the moving body, it may perform at least one of the following operations: regenerating a flight path that bypasses the obstacle toward the inside of the court, deciding to fly at a higher altitude than the flight path, starting the movement of the moving body on the flight path after hovering for a predetermined time, switching to manual control after hovering for a predetermined time, and displaying a message prompting the user to re-input the target position after hovering for a predetermined time.
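  • The alternative reactions to a detected obstacle listed in the two preceding paragraphs can be expressed as a small decision function; the priority order below is an assumption made only for illustration.

```python
from enum import Enum, auto


class ObstacleReaction(Enum):
    REROUTE = auto()             # regenerate a path that bypasses the obstacle
    CLIMB = auto()               # fly at a higher altitude than the planned path
    HOVER_THEN_RESUME = auto()   # hover, then resume movement on the path
    HOVER_THEN_MANUAL = auto()   # hover, then hand over to manual control
    HOVER_THEN_REINPUT = auto()  # hover, then ask the user to re-input a target


def choose_reaction(can_reroute: bool, can_climb: bool) -> ObstacleReaction:
    """Pick one of the enumerated reactions (illustrative priority order)."""
    if can_reroute:
        return ObstacleReaction.REROUTE
    if can_climb:
        return ObstacleReaction.CLIMB
    return ObstacleReaction.HOVER_THEN_RESUME
```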
  • an aerial photography method includes an event detection step for detecting an event based on an image acquired by a camera photographing a target area or an input from an external system, and a mode switching determination step for determining whether or not to switch flight modes that define a flight area within which a mobile object can fly, the flight modes including at least an outer edge flight mode in which the mobile object flies above the outer edge of a court configured within the target area along part or all of the outer edge of the court, and an inside court flight mode in which the mobile object flies above the court, and the mode switching determination step determines whether or not to switch flight modes depending on the event detected by the event detection step.
  • an aerial photography program causes a computer to execute an event detection command for detecting an event based on an image acquired by a camera photographing a target area or an input from an external system, and a mode switching determination command for determining whether or not to switch flight modes that define at least a flight area in which a moving object flying in the target area can fly, the flight modes including at least an outer edge flight mode for flying above the outer edge along part or all of the outer edge of a court configured within the target area, and an intra-court flight mode for flying above the interior of the court, and the mode switching determination command determines whether or not to switch flight modes depending on the event detected by the event detection command.
  • the computer program may be provided by being stored on various data-readable recording media, or may be provided so as to be downloadable via a network such as the Internet.
  • the present invention makes it possible to ensure safety during aerial flight.
  • FIG. 1 is an overall configuration diagram of a mobile body system according to an embodiment of the present invention.
  • FIG. 2 is a simplified external perspective view of the drone according to the embodiment.
  • FIG. 3 is a functional configuration diagram of the drone according to the embodiment.
  • FIG. 4(a) is a simplified front view of the exterior of the control device of the embodiment; FIG. 4(b) is a schematic diagram showing the direction in which the drone moves or turns in response to input from the control device.
  • FIG. 5 is a functional configuration diagram of the control device according to the embodiment.
  • FIG. 6 is a functional configuration diagram of a server according to the embodiment.
  • FIG. 7 is a schematic diagram showing an example of shooting positions of the drone that are set in advance in the shooting target field where the drone flies.
  • FIGS. 8(a) to 8(d) are schematic diagrams showing examples of setting geofences in the field to be photographed, in which (a) is a first example, (b) is a second example, (c) is a third example, and (d) is a fourth example.
  • FIG. 9 is a schematic state transition diagram showing the transition of flight modes of the drone.
  • FIG. 10 is a schematic state transition diagram showing the state transition of the drone depending on the aircraft state of the drone.
  • FIG. 11 is a schematic state transition diagram showing the state transition of the drone according to the aircraft behavior state of the drone.
  • FIG. 12 is a schematic state transition diagram showing state transitions of a game in a stadium as an example of a field to be photographed.
  • FIG. 13 is a schematic state transition diagram showing the state transition of the offensive and defensive states in the stadium.
  • FIG. 14 is a table showing an example of a correspondence relationship between a game state in the stadium and a flight mode of the drone.
  • FIG. 15 is a schematic diagram showing possible photographing positions and the flight paths over which the photographing position can be transitioned.
  • FIG. 16 is a flowchart of control executed during flight of the drone.
  • FIG. 17 is a flowchart showing control of flight restrictions in the drone (details of S1002 in FIG. 16).
  • FIG. 18 is a flowchart showing flight mode switching control of the drone (details of S1010 in FIG. 16).
  • FIG. 19 is a diagram showing a first example of a screen displayed on a terminal of the aerial photography system.
  • FIG. 20 is a diagram showing a second example of a screen displayed on the terminal of the aerial photography system.
  • FIG. 1 is an overall configuration diagram of an aerial photography system 1 (hereinafter also referred to as "system 1") according to one embodiment of the present invention.
  • the system 1 uses a drone 100 (an example of a moving body) to take aerial photographs of a competition held at a stadium F (Fig. 7) or an event held at an event venue.
  • the stadium F is an example of a target area.
  • the system 1 mainly includes a controller 200 that allows the pilot to operate the drone 100, a server 300 that manages the flight and photography of the drone 100, an external input device 600, and an external system 700.
  • the drone 100 and the controller 200 are connected to each other via wireless communication (which may include communication via a base station 800).
  • the controller 200 and the server 300 are connected to each other via a communication network 400 such as an internet line.
  • the drone 100 acquires satellite signals from an artificial satellite 500 to determine its own position, etc.
  • the external input device 600 is a device capable of transmitting and receiving information to and from the system 1, separate from the controller 200, and is composed of a mobile terminal such as a smartphone or tablet terminal.
  • the external input device 600 can be operated, for example, by the manager, coach, bench player, referee, or court equipment personnel of the competition taking place at the stadium F.
  • the external input device 600 has, for example, a function for receiving an emergency command to stop filming, and the drone 100 performs emergency evacuation based on the command.
  • the external input device 600 may also receive an input to switch the flight mode of the drone 100.
  • the external input device 600 may be equipped with a display device, and may display information similar to that of the display unit 201 of the controller 200.
  • the external input device 600 may acquire event information that occurs during the competition. The event information is referred to when the user of the external input device 600 makes an input to switch the flight mode of the drone 100.
  • the external system 700 may be any system configured separately from the system 1. For example, systems such as a court facility system, a match management system, and a referee support system may be applied as systems deployed in relation to the competition held at the stadium F, and systems such as a weather observation system or an earthquake observation system deployed independently of the competition may also be applied. Multiple external systems 700 may be connected to the system 1. The system 1 may receive an emergency command to stop filming or a command to switch the flight mode of the drone 100 from the various external systems 700. In addition, the various external systems 700 may acquire event information that occurs during the competition.
  • The court facilities system, which is an example of the external system 700, may, for example, obtain the brightness of the captured image from the system 1 and control the illuminance adjustment or blinking of the lighting in the stadium F.
  • the court facilities system may also receive a request for lighting illuminance from the system 1 and control the illuminance adjustment or blinking.
  • the configuration of system 1 is not limited to that shown in FIG. 1, and the drone 100, the controller 200, the server 300, and the base station 800 may each be connected to each other so that they can communicate with each other via a communication network 400 such as an Internet line.
  • The drone 100 may perform wireless communication directly with the communication network 400 using a communication method such as LTE without going through the controller 200. In that case, the drone 100, the controller 200, and the base station 800 do not need to perform direct wireless communication with each other; it is sufficient if each can be connected to the communication network 400 from its own location. This system configuration is therefore suitable for cases where the drone 100 and the controller 200 are in remote locations (for example, when a pilot operates the drone remotely).
  • the drone 100, the controller 200, the base station 800, and the server 300 are each connected to each other so that they can communicate with each other via a communication network 400 such as an Internet line, and the drone 100 and the base station 800 may be communicatively connected to the communication network 400 by satellite communication via an artificial satellite 500.
  • multiple servers 300 may be connected to one drone 100 via multiple communication networks 400, i.e., the system may be made redundant.
  • With this configuration, the drone 100 and the controller 200 can be controlled even when they are located far apart, making the system suitable for remote operation; however, the system is not limited to this and can also be applied to visual flight in which the pilot manually controls the drone 100 while watching it.
  • the device described in the above embodiment may be realized as a single device, or may be realized by multiple devices (e.g., drone 100, controller 200, cloud server 300) that are partially or completely connected by a communication network 400.
  • each functional unit and memory unit of server 300 may be realized by being implemented in different servers 300, drones 100, and controllers 200 that are connected to each other by the communication network 400.
  • Fig. 2 is a simplified external perspective view of the drone 100 of this embodiment.
  • Fig. 3 is a functional configuration diagram of the drone 100 of this embodiment. As described above, the drone 100 takes aerial photographs of competitions held in the stadium F (Fig. 7) and events held in the event venue.
  • The term "drone" refers to any flying object that has the ability to autonomously control its attitude, regardless of the power source (electricity, prime mover, etc.), the control method (wireless or wired, fully autonomous or partially manual, etc.), and whether it is manned or unmanned.
  • Drones are also sometimes referred to as Unmanned Aerial Vehicles (UAVs), flying objects, multicopters, RPAS (Remote Piloted Aircraft Systems), or UAS (Unmanned Aircraft Systems), etc.
  • the exterior of the drone 100 is mainly composed of a housing 101 and multiple propellers 122.
  • the housing 101 is, for example, a roughly rectangular parallelepiped, but may have any shape.
  • Rod-shaped connecting parts 102 extending laterally are connected to the left and right sides of the housing 101.
  • the other ends of the connecting parts 102 are respectively connected to propellers 122 and motors 121 that rotate the propellers 122.
  • the motors 121 are, for example, electric motors.
  • the propellers 122 may be composed of a single propeller, or may be composed of multiple propellers arranged coaxially.
  • the number and shape of the blades of each propeller are not particularly limited.
  • a propeller guard (not shown) may be provided on the outside of the propeller 122 to prevent the propeller from interfering with obstacles.
  • a photographing camera 141 is held by a camera holder 142 below the housing 101.
  • an obstacle detection camera 131 is disposed on the front surface of the housing 101.
  • the obstacle detection camera 131 is a so-called dual camera consisting of two cameras that form a pair.
  • the obstacle detection camera 131 is disposed so as to capture an image in front of the drone 100.
  • the obstacle detection camera 131 may be disposed not only on the front surface but also on all surfaces of the housing 101, for example, on six surfaces in the case of a housing 101 that is a substantially rectangular parallelepiped.
  • the drone 100 is equipped with an alarm device 250 that alerts people around the drone 100 to the presence of the drone 100.
  • the alarm device 250 has, for example, a warning light 251 and a speaker 252.
  • the warning light 251 is provided for each propeller 122 or motor 121, and is disposed, for example, on each side of multiple motors 121.
  • the warning light 251 may be disposed along the cylindrical side of the motor 121 so that it can be seen from all directions in addition to the front.
  • the speaker 252 outputs an alarm sound and is provided in the housing 101 of the drone 100.
  • the speaker 252 is provided, for example, on the underside of the housing 101, and transmits the alarm sound downwards of the drone 100.
  • the drone 100 is equipped with an arithmetic device such as a CPU (Central Processing Unit) for executing information processing, and storage devices such as a RAM (Random Access Memory) and a ROM (Read Only Memory), and thereby has the following functional blocks: a measurement unit 110, a flight function unit 120, an obstacle detection unit 130, an imaging unit 140, and a communication unit 150.
  • the measurement unit 110 is a functional unit that measures information related to the drone 100 or its surroundings.
  • the measurement unit 110 has, for example, a position measurement unit 111, a direction measurement unit 112, an altitude measurement unit 113, and a speed measurement unit 114.
  • the measurement unit 110 may also include various sensors that acquire information such as temperature, air pressure, wind speed, and acceleration.
  • the position measurement unit 111 receives signals from the artificial satellites 500 and measures the position (absolute position) of the aircraft based on the signals.
  • the position measurement unit 111 measures its current position using, for example, GNSS (Global Navigation Satellite System), GPS (Global Positioning System), etc., but is not limited to this.
  • the position information includes at least two-dimensional coordinate information in a planar view (e.g., latitude, longitude), and preferably includes three-dimensional coordinate information including altitude information.
  • The base station 800, which provides information on the reference points of fixed stations used for relative positioning such as RTK (Real Time Kinematic), is connected to the drone 100 and the controller 200 so as to be able to communicate wirelessly with them, making it possible to measure the position of the drone 100 with greater accuracy.
  • the base station 800 can be omitted, or the accuracy of the position coordinate estimation of the base station 800 or the drone 100 can be further improved.
  • the orientation measurement unit 112 measures the orientation of the aircraft (nose direction, heading direction).
  • the orientation measurement unit 112 is composed of a geomagnetic sensor that measures the nose direction (heading direction) of the drone 100 aircraft by measuring geomagnetism, a compass, etc.
  • The altitude measurement unit 113 measures the altitude above the ground (hereinafter also referred to as "flight altitude") as the distance from the ground below the drone 100 (vertically downward).
  • the speed measurement unit 114 detects the flight speed of the drone 100.
  • the speed measurement unit 114 may measure the speed using a known sensor such as a gyro sensor.
  • Flight function unit 120 is a mechanism and function unit that causes the drone 100 to fly, and generates thrust in the airframe for lifting the drone 100 and moving it in a desired direction. As shown in Figures 2 and 3, the flight function unit 120 has a plurality of motors 121, a plurality of propellers 122, and a flight control unit 123.
  • the flight control unit 123 independently controls the multiple motors 121 to rotate each propeller 122, causing the drone 100 to perform various operations such as taking off, moving forward, turning, and landing, and controls the attitude angle control and flight operations of the drone 100 from takeoff, during flight, and until landing.
  • the flight control unit 123 has a processing unit, also called a flight controller.
  • the processing unit may have one or more processors, such as a programmable processor (e.g., a central processing unit (CPU), MPU, or DSP).
  • the processing unit has access to a memory (storage unit).
  • the memory stores logic, code, and/or program instructions that the processing unit can execute to perform one or more steps.
  • the memory may include, for example, a separable medium such as an SD card or RAM, or an external storage device.
  • Various data acquired by the measurement unit 110, and video or still image data captured by the imaging camera 141, may be transmitted directly to and stored in the memory. Such data may also be recorded in an external memory.
  • the processing unit includes a control module configured to control the state of the drone 100.
  • The control module controls the flight function section 120 (thrust generating section) of the drone 100 to adjust the spatial arrangement, attitude angle, angular velocity, angular acceleration, angular jerk, and/or acceleration of the drone 100, which has six degrees of freedom (translational motion x, y, and z, and rotational motion θx, θy, and θz).
  • The flight control unit 123 can control the flight of the drone 100 based on control signals from the controller 200 or based on a preset autonomous flight program.
  • the flight control unit 123 can also control the flight of the drone 100 by controlling the motor 121 based on various information such as the field to be photographed, flight permitted/prohibited areas, information on the corresponding flight geofences, map information including two-dimensional or three-dimensional map data, the current position information of the drone 100, attitude information (heading information), speed information, and acceleration information, and any combination of these.
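  • As a rough, self-contained Python sketch of how the flight control might combine current position information with the active geofence (one of the inputs listed above): a commanded velocity is accepted only when the predicted next position stays inside the flight-permitted area. The policy, the time step, and the data layout are illustrative assumptions, not the disclosed control method.

```python
def control_tick(position, velocity_cmd, fence, dt=0.1):
    """One control step. `fence` is (x_min, y_min, x_max, y_max) of the
    active geofence; the command is replaced by 'hold position' when the
    predicted position would leave the flight-permitted area."""
    x, y = position
    vx, vy = velocity_cmd
    nx, ny = x + vx * dt, y + vy * dt
    x_min, y_min, x_max, y_max = fence
    if x_min <= nx <= x_max and y_min <= ny <= y_max:
        return velocity_cmd       # command keeps the drone inside the fence
    return (0.0, 0.0)             # otherwise hold position
```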
  • The "field to be photographed" (or "shooting target area") refers to a two-dimensional location (e.g., stadium F) that is the subject of photography.
  • FIG. 7 is a schematic diagram showing an example of the stadium F, which is an example of a field to be photographed by the drone, viewed from above.
  • The stadium F is composed of a court F100 that is roughly rectangular and is defined by, for example, a straight outer edge, and an outer court area F200 that is a predetermined area covering the outer edge of the court F100.
  • the outer edge of the court F100 is composed of mutually opposing goal lines F110a, F110b and mutually opposing touch lines F111a, F111b that are connected at roughly right angles.
  • the connection points of the goal lines F110a, F110b and the touch lines F111a, F111b are the corners F112a, F113a, F112b, F113b.
  • Goals F120a, F120b are provided approximately in the center of the pair of goal lines F110a, F110b.
  • Penalty areas F130a, F130b are defined in specific areas inside the court F100 adjacent to the goals F120a, F120b, and penalty lines F140a, F140b are drawn on the outer edges of the penalty areas.
  • a halfway line F150 is drawn in the center of the court F100, connecting the midpoints of a pair of touchlines F111a, F111b and dividing the court F100 into approximately equal parts.
  • the halfway line F150 is approximately parallel to the goal lines F110a, F110b.
  • goal lines F110a, F110b, touchlines F111a, F111b, penalty lines F140a, F140b, and halfway line F150 are required by the rules for players to play the game, and therefore all of these lines are generally drawn in a manner that allows them to be seen, but the technical scope of the present invention is not limited to this.
  • a soccer stadium is used as an example, but the sports that are photographed by the system of the present invention are not limited to soccer, and include any type of sports, such as tennis.
  • the subject of the photography is not limited to sports, and the system can also be applied to other events (concerts, ceremonies, etc.).
  • the shooting positions L101-L105, L206-L215 may be two-dimensional coordinates on a plane, or may be three-dimensional coordinate information that also defines the height at the corresponding positions.
  • the flight height of the drone 100 may be manually controllable based on input from the controller 200.
  • Photographing positions L101 to L105 are defined, for example, on the touchline F111b, at approximately equal intervals along the touchline F111b.
  • photographing position L101 is a point located in a range including the intersection of the halfway line F150 and the touchline F111b and slightly outside the court F100.
  • Photographing positions L103 and L105 are points near the corners F112a or F112b on both sides of the touchline F111b.
  • Photographing positions L102 and L104 are points between photographing positions L103 and L105 and photographing position L101. Note that the above positions are merely examples, and are not limited to these and may be any appropriate positions.
  • Photographing positions L206 to L215 are points defined within the court F100.
  • photographing positions L206 and L211 are points near the center of the penalty lines F140a and F140b on a line parallel to the goal lines F110a and F110b, and are so-called goal-front photographing positions.
  • Photographing positions L207 and L212 are photographing positions closer to the touch lines F111a and F111b and closer to the halfway line F150 than photographing positions L206 and L215. More specifically, for example, photographing positions L207 and L212 are points on an imaginary line segment connecting photographing position L101 and goals F120a and F120b, and are, for example, points approximately in the center of the imaginary line segment.
  • Photographing positions L209 and L215 are points that are linearly symmetrical to photographing positions L207 and L212.
  • Shooting position L208 is a point between shooting position L207 and the halfway line F150; shooting position L210 is a point between shooting position L209 and the halfway line F150; shooting position L213 is a point between shooting position L212 and the halfway line F150; and shooting position L214 is a point between shooting position L215 and the halfway line F150.
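  • The preset shooting positions could be held as a simple table of court-local coordinates, as sketched below; every coordinate value here is hypothetical and chosen only to illustrate the data structure (x and y in metres from the court centre, plus a flight height).

```python
# Hypothetical coordinates for illustration only (x, y, height in metres,
# court-local frame with the origin at the centre of court F100).
SHOOTING_POSITIONS = {
    "L101": (0.0, -36.0, 10.0),    # over touchline F111b, near the halfway line
    "L103": (-52.0, -36.0, 10.0),  # near corner F112a
    "L105": (52.0, -36.0, 10.0),   # near corner F112b
    "L206": (-36.0, 0.0, 15.0),    # in front of goal F120a
    "L211": (36.0, 0.0, 15.0),     # in front of goal F120b
}


def position_of(name: str) -> tuple[float, float, float]:
    """Look up a preset shooting position by its identifier."""
    return SHOOTING_POSITIONS[name]
```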
  • an evacuation point H200 is set to which the drone 100 is to be evacuated if an abnormality or malfunction of the drone 100 or the system 1 is detected.
  • the abnormality referred to here is an abnormality related to the stability of the aerial movement of the drone 100.
  • the abnormality includes, for example, a case where the calculation load associated with the operation control (behavior control, shooting control, etc.) of the drone 100 exceeds a load threshold.
  • the abnormality may include a transient abnormality related to the environment, such as a case where the measured value of the behavior control value (e.g. speed) of the drone 100 exceeds an allowable value due to the influence of a strong wind or the like.
  • the evacuation point H200 is set at a point different from the shooting positions L101 to L105 and L206 to L215, and in this embodiment, it is set outside the touchline F111a and along the touchline F111a. There may be multiple evacuation points H200, and in this embodiment, there are three.
  • the evacuation point H220 is set near the extension of the halfway line F150.
  • the evacuation points H210 and H230 are set closer to the goals F120a and F120b than the shooting positions L206 and L211.
  • the evacuation points H210 and H230 are set at the ends of an area partitioned by a geofence G200, which will be described later, for example.
  • At the evacuation point, the drone 100 is replaced or the battery installed in the drone 100 is replaced.
  • a geofence indicates a virtual boundary line that divides an area, and in particular, the geofence in this embodiment indicates a fence that is the boundary line between a flight-permitted area in which the drone 100 is permitted to fly or move and a flight-prohibited area.
  • a geofence is a boundary line that divides an area that spreads three-dimensionally, including a plane and height.
  • When the drone 100 is about to cross a geofence, for example, the flight control unit 340, 350, or 360 described later lands the drone 100.
  • the flight control unit 340, 350, or 360 may also hover the drone 100. Additionally, the flight control unit 340, 350, or 360 may move the drone 100 toward the inside of the geofence G100, the geofence G200, or the third geofence. Detailed control of the geofence will be described later.
  • the geofence's boundary line in the height direction may include an upper limit and a lower limit.
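  • Because a geofence here bounds a three-dimensional volume, including upper and lower altitude limits, a containment test needs to check the planar extent and the height band together; the sketch below assumes an axis-aligned box for simplicity and is illustrative only.

```python
from dataclasses import dataclass


@dataclass
class Geofence3D:
    """Flight-permitted volume: planar extent plus a height band."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    alt_min: float  # lower altitude limit
    alt_max: float  # upper altitude limit

    def contains(self, x: float, y: float, alt: float) -> bool:
        return (self.x_min <= x <= self.x_max
                and self.y_min <= y <= self.y_max
                and self.alt_min <= alt <= self.alt_max)
```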
  • the geofences G100, G200 that are applied to whether or not flight is permitted are switched according to the control of the system 1 while the drone 100 is flying.
  • the number of geofences G100, G200 depicted in the figure is two, but the number is arbitrary, and specifically may be three or more.
  • the geofence G100 is an area that includes the shooting positions L101 to L105, and defines an area that includes the touchline F111b and the area nearby. In other words, the geofence G100 is defined near the outer edge of the court F100, and a portion of it extends into the outer court area F200.
  • This geofence G100 is a geofence that is primarily applied in the outer edge flight mode M102, which will be described later.
  • the geofence G200 is an area that includes the shooting positions L206 to L215, and is set at least inside the court F100. This geofence G200 is a geofence that is primarily applied in the on-court flight mode M105, which will be described later.
  • the areas defined by the multiple geofences G100, G200 at least partially contact or overlap each other.
  • the areas defined by the multiple geofences G100, G200 also overlap in the height direction.
  • the heights of the multiple geofences G100, G200 may differ from each other. Specifically, the lower limit of the altitude of the geofence G200 set inside the stadium F is set higher than the lower limit of the altitude of the geofence G100 set on the outer edge of the stadium F.
  • FIG. 8 illustrates an embodiment of a geofence defined near the perimeter of court F100.
  • FIG. 8(a) is a diagram showing the geofences G100 and G200 described above. FIG. 8(b) is a diagram showing another example of a geofence defined near the outer edge of the court F100.
  • In the example of FIG. 8(b), a geofence G100a that defines an area covering the entire perimeter of the court F100 is arranged instead of the geofence G100.
  • FIG. 8C is a diagram showing an example of a geofence G100b that defines an area covering two adjacent sides of the outer edge of the court F100, instead of the geofence G100.
  • the geofence G100b defines an L-shaped area connecting an area along the goal line F110a and an area along the touch line F111a.
  • FIG. 8(d) is a diagram showing an example of a geofence G100c that defines an area along the goal line F110b instead of the geofence G100.
  • the geofence may be set in advance, or may be set by the user. For example, if there is a fixed obstacle near the court F100, the user can set the geofence so as not to interfere with the obstacle. In addition to the geofence, the user can also set an area in which the drone 100 may fly when the geofence is set, and store the area in association with the geofence.
  • the obstacle detection unit 130 is a functional unit that detects obstacles around the drone 100.
  • The obstacles may include, for example, people such as players, objects, animals such as birds, fixed equipment, and a ball.
  • the obstacle detection unit 130 measures the position, speed vector, and the like of an obstacle located, for example, below the drone 100 based on the acquired image.
  • the obstacle detection unit 130 includes, for example, an obstacle detection camera 131, a ToF (Time of Flight) sensor 132, and a laser sensor 133.
  • the ToF sensor 132 measures the time it takes for a laser pulse emitted from the sensor to return to the light receiving element in the sensor, and measures the distance to an object by converting this time into distance.
  • the laser sensor 133 uses, for example, the LiDAR (Light Detection And Ranging) method to shine light such as near-infrared light, visible light, or ultraviolet light on the target object and measure the distance by capturing the reflected light with an optical sensor.
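  • The distance measurement principle of the ToF sensor follows directly from the measured round-trip time: distance = (speed of light × measured time) / 2. A short Python illustration of this conversion:

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second


def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the object from the measured round-trip time of the
    laser pulse; the pulse travels out and back, hence the factor 1/2."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0


# Example: a round trip of about 66.7 nanoseconds corresponds to roughly 10 m.
```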
  • FIG. 2 shows that the obstacle detection camera 131 is positioned facing forward, but the type, position and number of the camera 131, ToF sensor 132 and laser sensor 133 are arbitrary, and the ToF sensor 132 or laser sensor 133 may be positioned instead of the camera 131, or the ToF sensor 132 or laser sensor 133 may be provided on all six surfaces of the housing 101, i.e., the front, back, top, bottom and both sides.
  • the photographing unit 140 is a functional unit that photographs images of a competition in the stadium F (FIG. 7) or an event at an event venue, and has a photographing camera 141, a camera holding unit 142, and a photographing control unit 143.
  • The photographing camera 141 (imaging device) is a video camera (color camera) that photographs moving images.
  • the moving images may include audio data acquired by a microphone (not shown).
  • the photographing camera 141 may also be configured to photograph still images.
  • the orientation of the photographic camera 141 (the attitude of the photographic camera 141 relative to the housing 101 of the drone 100) can be adjusted by a camera actuator (not shown) built into the camera holding unit 142.
  • the photographic camera 141 may have an automatic control function for parameters such as exposure, contrast, or ISO.
  • the camera holding unit 142 may have a so-called gimbal control mechanism that suppresses the transmission of shaking or vibration of the aircraft to the photographic camera 141.
  • the photographic control unit 143 controls the photographic camera 141 and the camera holding unit 142 to adjust the orientation of the photographic camera 141, the photographic magnification (zoom amount), the camera's photographic conditions, etc.
  • Image data acquired by the photographic camera 141 can be transmitted to the memory unit of the drone 100 itself, the controller 200, the server 300, etc.
  • the communication unit 150 is capable of radio wave communication via the communication network 400 and includes, for example, a radio wave communication module.
  • the communication unit 150 is capable of communication with the controller 200 and the like via the communication network 400 (including the wireless base station 800).
  • FIG. 4 is a front view of the appearance of the controller 200 of this embodiment in a simplified manner.
  • FIG. 5 is a functional configuration diagram of the controller 200 of this embodiment.
  • the controller 200 is a mobile information terminal that controls the drone 100 by the operation of the pilot and displays information received from the drone 100 (e.g., position, altitude, remaining battery level, camera image, etc.).
  • The flight state (altitude, attitude, etc.) of the drone 100 may be remotely controlled by the pilot via the controller 200, or may be autonomously controlled by the drone 100.
  • In this embodiment, the drone 100 basically performs autonomous flight; however, manual operation may be possible during basic operations such as takeoff and return, and in an emergency.
  • the controller 200 includes a display unit 201 and an input unit 202 as a hardware configuration.
  • the display unit 201 and the input unit 202 are connected to each other so that they can communicate with each other wired or wirelessly.
  • the display unit 201 may be configured as a touch panel or liquid crystal monitor that is integrated into the controller 200, or may be configured as a display device such as a liquid crystal monitor, tablet terminal, or smartphone that is connected to the controller 200 wired or wirelessly.
  • the display unit 201 as a hardware configuration may be configured as a touch panel display by integrally incorporating an element that accepts input such as touch.
  • the input unit 202 is a mechanism through which the pilot inputs operational commands such as flight direction and takeoff/landing when piloting the drone 100. As shown in FIG. 4A, the input unit 202 has a left slider 326L, a right slider 326R, a left input stick 327L, a right input stick 327R, a power button 328, and a return button 329.
  • The left slider 326L and the right slider 326R are operators that accept, for example, a 0/1 input or an input of one-dimensional stepless or stepwise information; the pilot slides them with the left and right index fingers, for example, while holding the controller 200 in hand.
  • the left input stick 327L and the right input stick 327R are operators that accept an input of multi-dimensional stepless or stepwise information, and are, for example, so-called joysticks.
  • the left input stick 327L and the right input stick 327R may also accept an input of 0/1 by pressing them.
  • the power button 328 and the return button 329 are operators that accept pressing them, and are constituted by mechanical switches or the like.
  • the left input stick 327L and the right input stick 327R accept input operations that instruct the three-dimensional flight operations of the drone 100, including, for example, takeoff, landing, ascent, descent, right turn, left turn, forward movement, backward movement, left movement, and right movement.
  • Figure 4(b) is a schematic diagram showing the movement direction or rotation direction of the drone 100 corresponding to each input of the left input stick 327L and right input stick 327R shown in Figure 4(a). Note that this correspondence is an example.
  • the controller 200 includes a processor such as a CPU for executing information processing, and storage devices such as RAM and ROM, which constitute the software configuration of the main functional blocks of the display control unit 210, input control unit 220, and communication unit 240.
  • The display control unit 210 displays, to the pilot, status information of the drone 100 acquired from the drone 100 or the server 300.
  • the display control unit 210 can display images relating to various information such as the shooting target field, flight permitted/prohibited areas, flight geofence, map information, current position information of the drone 100, attitude information (directional information), speed information, acceleration information, and remaining battery power.
  • the "current position information” referred to here is sufficient to include information on the horizontal position of the current position of the drone 100 (i.e., latitude and longitude), and does not need to include altitude information (absolute altitude or relative altitude).
  • the display control unit 210 has a mode display unit 211 and a shooting status display unit 212.
  • the mode display unit 211 is a functional unit that displays at least the state, i.e., the mode, to which the drone 100 belongs on the display unit 201.
  • the mode to which the drone 100 belongs is, for example, the flight mode shown in FIG. 9, but instead of or in addition to this, the aircraft state shown in FIG. 10, the aircraft action state shown in FIG. 11, the game state shown in FIG. 12, or the offensive and defensive state shown in FIG. 13 may be displayed on the display unit 201.
  • the screen G1 displayed on the display unit 201 displays, for example, a display field G21 for the flight mode to which the drone 100 belongs, as well as a status display field G22 showing the aircraft status, aircraft behavior status, match status, and offensive/defensive status.
  • The shooting status display unit 212 is a functional unit that displays, on the display unit 201, an image captured by the imaging camera 141 mounted on the drone 100.
  • the screen G1 displayed on the display unit 201 displays, for example, an image field G40 in which an image being captured by the drone 100 is displayed.
  • the screen G1 and each mode will be described in detail later.
  • the input control unit 220 shown in FIG. 5 receives various inputs from a user such as a pilot.
  • the input control unit 220 of this embodiment mainly has the following functional units: an aircraft position operation unit 221, an aircraft attitude operation unit 222, a camera attitude operation unit 223, a camera zoom operation unit 224, a flight mode switching unit 225, a target position receiving unit 226, a power supply input unit 227, and a return input unit 228.
  • the aircraft position operation unit 221 includes an up/down movement input unit 221a and a left/right movement input unit 221b.
  • the aircraft attitude operation unit 222 includes a forward/backward movement input unit 222a and a yaw rotation input unit 222b.
  • the up-down movement input unit 221a is an input unit for allowing the pilot to move the drone 100 up and down, and acquires input to the right input stick 327R. That is, when the right input stick 327R is moved upward (toward the back when held in the hand), the drone 100 rises, and when the right input stick 327R is moved downward (toward the front when held in the hand), the drone 100 descends.
  • the left-right movement input unit 221b is an input unit for allowing the pilot to move the drone 100 left and right, and acquires input to the right input stick 327R. That is, when the right input stick 327R is moved to the right, the drone 100 moves to the right, and when the right input stick 327R is moved to the left, the drone 100 moves to the left.
  • the forward/backward movement input unit 222a is an input unit for allowing the pilot to move the drone 100 forward/backward, and acquires input to the left input stick 327L. That is, when the left input stick 327L is moved upward (toward the rear when held in the hand), the drone 100 moves forward, and when the left input stick 327L is moved downward (toward the front when held in the hand), the drone 100 moves backward.
  • the yaw rotation input unit 222b is an input unit for allowing the pilot to yaw rotate the drone 100, and acquires input to the left input stick 327L. That is, when the left input stick 327L is moved to the right, the drone 100 turns right, and when the left input stick 327L is moved to the left, the drone 100 turns left.
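  • The stick-to-motion correspondence described in the last four paragraphs can be summarised as a mapping from normalised stick deflections to a motion command, as in the sketch below; the scaling factors and the dictionary layout are illustrative assumptions.

```python
def sticks_to_command(left_x: float, left_y: float,
                      right_x: float, right_y: float,
                      max_speed: float = 2.0,
                      max_yaw_rate: float = 0.5) -> dict:
    """Map stick deflections in the range -1..1 to a motion command:
    right stick up/down -> climb/descend, right stick left/right ->
    lateral motion, left stick up/down -> forward/backward,
    left stick left/right -> yaw rotation."""
    return {
        "climb": right_y * max_speed,       # up-down movement input unit 221a
        "lateral": right_x * max_speed,     # left-right movement input unit 221b
        "forward": left_y * max_speed,      # forward/backward input unit 222a
        "yaw_rate": left_x * max_yaw_rate,  # yaw rotation input unit 222b
    }
```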
  • the camera attitude operation unit 223 is an input unit for operating the camera holding unit 142 via the shooting control unit 143 and for controlling the orientation of the shooting camera 141 relative to the housing 101 of the drone 100.
  • the camera attitude operation unit 223 obtains input to the right slider 326R.
  • the camera attitude operation unit 223 accepts operation of either or both of the pitch angle and yaw angle of the shooting camera 141 relative to the housing 101.
  • the camera zoom operation unit 224 is an input unit for operating the shooting magnification of the shooting camera 141, and acquires input to the left slider 326L.
  • the flight mode switching unit 225 is an input unit for switching flight modes. Flight modes selectable by the flight mode switching unit 225 include at least, for example, the outer edge flight mode M102 (see FIG. 9), the inside court flight mode M105 (see FIG. 9), and the fixed position flight mode M103 or M107 (see FIG. 9).
  • the flight mode switching unit 225 accepts switching of flight modes via, for example, a touch panel display integrated with the display unit 201.
  • the target position receiving unit 226 is a functional unit that receives input of a target shooting position to which the drone 100 should head.
  • the target position receiving unit 226 receives input of a point on the stadium F. For example, when at least a portion of an image or schematic diagram of the stadium F is displayed on the display unit 201, the target position receiving unit 226 may receive input of the target shooting position via a touch panel display that is configured integrally with the display unit 201.
  • the target position receiving unit 226 may receive a selection input of a target shooting position when a point that can be selected as the target shooting position, i.e., a shooting position, is specified in advance.
  • the flight modes of the drone 100 mainly include a pre-preparation mode M100, an off-court takeoff and landing mode M101, an outer edge flight mode M102, an off-court fixed position flight mode M103, an on-court entry mode M104, an on-court flight mode M105, an off-court exit mode M106, an on-court fixed position flight mode M107, and an on-court takeoff and landing mode M108.
  • the advance preparation mode M100 is a mode in which advance settings such as geofence settings are made.
  • the advance preparation mode M100 transitions to an off-court takeoff and landing mode M101.
  • In this off-court takeoff and landing mode M101, the drone 100 takes off from point L101g (see FIG. 15). Note that in the off-court takeoff and landing mode M101, the drone 100 may take off from a point outside the court F100 other than point L101g.
  • the off-court takeoff and landing mode M101 is the mode to which the drone 100 belongs when control starts or ends.
  • The drone 100 transitions from the off-court takeoff and landing mode M101 to the outer edge flight mode M102.
  • The outer edge flight mode M102 is a mode in which the drone flies above the outer edge, along part or all of the outer edge of the court F100, to photograph the stadium F; more specifically, the drone flies at one of the photographing positions L101 to L105 (FIG. 7) to photograph the stadium F.
  • the outer edge flight mode M102 is a mode in which the drone flies above the touchline F111b.
  • the "outer edge" on which the outer edge flight mode M102 flies is a concept that includes not only directly above the touchline F111b but also slightly outside the court F100.
  • The drone 100 receives user instructions via the target position receiving unit 226 of the controller 200, and flies at one of the specified shooting positions L101 to L105.
  • the shooting angle may be manually controlled according to the user's instructions, or may be fixed at a predetermined angle.
  • The drone 100 may change its shooting position while keeping the shooting angle fixed; a so-called dolly shot may be used to follow and photograph a specific player.
  • The outer edge flight mode M102 can transition to the off-court takeoff and landing mode M101, the off-court fixed position flight mode M103, or the on-court entry mode M104.
  • The off-court fixed position flight mode M103 is a mode in which the drone 100 flies in a fixed position outside the area of the court F100.
  • The off-court fixed position flight mode M103 can transition to the outer edge flight mode M102.
  • the on-court entry mode M104 is a mode in which a series of processes required for the drone 100 to enter the area of the court F100 is performed.
  • the drone 100 transitions to the on-court flight mode M105 via the on-court entry mode M104.
  • Intra-court flight mode M105 is a mode in which the drone flies above the court F100 to photograph the stadium F, and more specifically, flies at one of the photographing positions L206 to L215 ( Figure 7) to photograph.
  • the drone accepts a user command to select a photographing position via the target position receiving unit 226 of the controller 200, and flies at one of the specified photographing positions L206 to L215.
  • the photographing direction may be manually controlled according to the user's instructions, or may be fixed at a predetermined angle.
  • the on-court flight mode M105 can transition to an off-court exit mode M106, an on-court fixed position flight mode M107, or an on-court takeoff and landing mode M108.
  • The off-court exit mode M106 is a mode in which a series of processes required for the drone 100 to exit the area of the court F100 is performed.
  • The drone 100 transitions from the off-court exit mode M106 to the outer edge flight mode M102. Note that the off-court exit mode M106 and the on-court entry mode M104 can transition back and forth.
  • the on-court fixed position flight mode M107 is a mode in which the drone flies in a fixed position within the area of the court F100.
  • the on-court fixed position flight mode M107 can transition to the on-court flight mode M105.
  • the on-court takeoff and landing mode M108 is a mode in which the drone takes off and lands within the area of the court F100, and is a mode to which the drone transitions mainly when a command to land on the spot is issued by manual intervention.
  • a drone that takes off in the on-court takeoff and landing mode M108 transitions to the on-court flight mode M105.
  • the power input unit 227 is a functional unit that accepts the power on/off command for the controller 200 via the power button 328.
  • the return input unit 228 is a functional unit that accepts a command to return the drone 100 located in the stadium F ( Figure 7) to the target landing point L101g (see Figure 15) via the return button 329.
  • the input control unit 220 may be capable of receiving touch input to the display unit 201 and transmitting control commands to the drone 100 in response to the input. More specifically, for example, when the user selects appropriate information such as a map or schematic diagram displayed on the display unit 201, a route to the selected point may be automatically generated, causing the drone 100 to fly autonomously.
  • the communication unit 240 is a functional unit that transmits and receives signals between the controller 200 and an appropriate configuration included in the system 1.
  • The controller 200 has a communication function that performs wireless communication with the drone 100 using, for example, Wi-Fi in the 2.4 GHz and 5.6 to 5.8 GHz frequency bands.
  • the controller 200 also has a wireless communication function that can communicate with the server 300 via the communication network 400 using a communication standard such as LTE (Long Term Evolution).
  • the communication unit 240 transmits various input signals by a user such as a pilot to the drone 100 or the server 300.
  • the communication unit 240 also receives signals from the drone 100 or the server 300.
  • (A-1-4. Server 300) (A-1-4-1. Overview of Server 300) FIG. 6 is a functional configuration diagram of the server 300 according to the present embodiment.
  • the server 300 manages or controls the flight and photography of the drone 100.
  • the server 300 includes an input/output unit (not shown) for inputting or outputting various types of information (image output, audio output).
  • the server 300 may be a general-purpose computer such as a workstation or personal computer, or may be logically realized by cloud computing.
  • the server 300 is equipped with a calculation device such as a CPU for executing information processing, and storage devices such as RAM and ROM, and as a software configuration, it mainly configures the following functional blocks: a presetting unit 310, an event detection unit 320, a flight mode switching unit 330, an outer edge flight control unit 340, an in-court flight control unit 350, a fixed position flight control unit 360, a communication unit 370, and a memory unit 380.
  • the pre-setting unit 310 is a functional unit that performs the settings necessary for the flight of the drone 100 before the drone 100 flies over the field to be photographed.
  • the presetting unit 310 mainly includes a geofence setting unit 311 .
  • the geofence setting unit 311 is a functional unit that sets the geofence of the drone 100.
  • the geofence includes information on the planar direction and the height direction.
  • the geofence setting unit 311 sets a geofence according to the flight mode. That is, for example, the geofence setting unit 311 activates the geofence G100 in the outer edge flight mode M102 (see FIG. 9). The geofence setting unit 311 also activates the geofence G200 in the inner court flight mode M105 (see FIG. 9). The geofence setting unit 311 also sets a geofence different from the geofence G100 or the geofence G200, that is, a so-called third geofence, in the intermediate modes that are intermediate in the transition between the outer edge flight mode M102 and the inner court flight mode M105, that is, the outer court exit mode M106 and the inner court entry mode M104.
  • the outer edge flight mode M102 and the inner court flight mode M105 are examples of the first flight mode or the second flight mode in the scope of the claims.
  • the outer court exit mode M106 and the inner court entry mode M104 are examples of the third flight mode in the scope of the claims.
  • the third geofence is a geofence that covers a combined area that combines the first area defined by the geofence in the first flight mode and the second area defined by the geofence in the second flight mode.
  • The geofence G100 of the outer edge flight mode M102 and the geofence G200 of the inside court flight mode M105 overlap. Therefore, the third geofence is a geofence that defines the area that combines the geofences G100 and G200.
  • the third geofence may be a geofence that covers an area that combines a first area defined by the first geofence corresponding to the first flight mode, a second area defined by the second geofence corresponding to the second flight mode, and a gap between the first area and the second area.
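  • For illustration only (this sketch is an editorial aid, not part of the disclosed embodiment), the combined third geofence described above can be modelled by treating G100 and G200 as axis-aligned rectangles; the dimensions below are hypothetical, and a real geofence would typically be a polygon with an altitude limit.
```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RectFence:
    """Axis-aligned rectangular geofence (planar part only, for illustration)."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def combined_fence(first: RectFence, second: RectFence) -> RectFence:
    """Third geofence: the smallest rectangle covering both areas and any gap between them."""
    return RectFence(
        min(first.x_min, second.x_min),
        min(first.y_min, second.y_min),
        max(first.x_max, second.x_max),
        max(first.y_max, second.y_max),
    )

# Hypothetical dimensions: G100 as a band over the touchline, G200 over the court.
G100 = RectFence(-5.0, -5.0, 110.0, 5.0)
G200 = RectFence(0.0, 0.0, 105.0, 68.0)
G_THIRD = combined_fence(G100, G200)
assert G_THIRD.contains(50.0, 30.0) and G_THIRD.contains(-2.0, -2.0)
```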
  • the event detection unit 320 is a functional unit that detects the state of the subject to be photographed or the drone 100.
  • the event detection unit 320 detects an event based on the camera image of the photographing camera 141 or an input from the external system 700.
  • the detection criteria for each event are stored in, for example, the storage unit 380, and the event detection unit 320 detects an event by referring to the storage unit 380.
  • the event detection unit 320 may also detect an event by analysis using a neural network.
  • the detection process by the event detection unit 320 can be performed using any known appropriate image analysis technology.
  • the event detection unit 320 detects events that trigger a change in the flight mode or shooting conditions of the drone 100.
  • the event detection unit 320 mainly has an aircraft state acquisition unit 321 , an aircraft action state acquisition unit 322 , a game state acquisition unit 323 , and an offensive/defensive state acquisition unit 324 .
  • the aircraft status acquisition unit 321 is a functional unit that acquires the aircraft status of the drone 100.
  • FIG. 10 is a diagram showing the state transition of the aircraft state of the drone 100.
  • the aircraft state is broadly divided into, for example, a normal operation flight mode M200, a detection and judgment mode M210, and an action mode M220.
  • When the drone 100 starts flying, it transitions to the normal operation flight mode M200.
  • the detection and judgment mode M210 includes an abnormality detection mode M211, a failure detection mode M212, a manual intervention mode M213, and a low battery mode M214.
  • If an abnormality is detected in the normal operation flight mode M200, the mode transitions to the abnormality detection mode M211.
  • This abnormality is a transient, in other words reversible, disturbance such as a drop in radio wave strength or strong winds. If the abnormality is resolved in the abnormality detection mode M211, the mode transitions back to the normal operation flight mode M200.
  • If a failure is detected, the drone 100 transitions to the failure detection mode M212. If a manual control command is received, the drone transitions to the manual intervention mode M213, and if it is detected that the remaining battery charge is less than a predetermined value, the drone transitions to the low battery mode M214. In addition, if a manual control command is received in the abnormality detection mode M211, the failure detection mode M212, or the low battery mode M214, the drone transitions to the manual intervention mode M213. The drone 100 then transitions to an action mode M220 that corresponds to the detection and judgment mode M210.
  • the action mode M220 is a state in which the drone 100 performs a series of actions that are preset for each state.
  • the action mode M220 includes a landing mode M221 at an evacuation point, an emergency stop mode M222, a landing on the spot mode M223, a return mode M224, and a fixed position flight mode M225.
  • the landing at evacuation point mode M221 is set to fly the drone 100 to the evacuation point H200 and land it.
  • the landing at evacuation point mode M221 is entered when the abnormality is not resolved in the abnormality detection mode M211.
  • the emergency stop mode M222 is set to stop the propellers 122 on the spot. In the emergency stop mode M222, the drone 100 falls freely.
  • the emergency stop mode M222 can be selected in the manual intervention mode M213 when the propellers 122 are about to come into contact with a person or object.
  • the on-site landing mode M223 is set to perform a soft landing on the spot.
  • the return mode M224 is set to return to the takeoff and landing point.
  • the fixed position flight mode M225 is a state in which the drone flies at a fixed position, and can transition to the normal operation flight mode M200 based on a user operation.
  • the user operation is input, for example, by selecting a button displayed on the display unit 201.
  • the fixed position flight mode M225 if an event that can cause a transition from the normal operation flight mode M200 to the detection and judgment mode M210, i.e., an abnormality, a malfunction, manual intervention, or a low battery, is detected, the drone 100 transitions from the fixed position flight mode M225 to each state of the detection and judgment mode M210 via the normal operation flight mode M200.
  • the drone 100 in the fixed position flight mode M225 can transition to the return mode M224 based on a user operation.
  • the drone 100 in the abnormality detection mode M211 and the failure detection mode M212 transitions to a landing mode M221 at an evacuation point.
  • the drone 100 in the manual intervention mode M213 transitions to one of the following states depending on the input command: landing mode M221 at an evacuation point, emergency stop mode M222, landing on the spot mode M223, return mode M224, and fixed position flight mode M225.
  • the drone 100 in the low battery mode M214 transitions to the return mode M224.
  • the drone 100 in the normal operation flight mode M200 can also transition to the return mode M224 based on a user operation.
  • the user operation is input, for example, by selecting a button displayed on the display unit 201.
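  • One possible, purely illustrative way to encode the aircraft-state transitions of FIG. 10 is a transition table with a guard function; the Python sketch below reflects only the transitions described above, and the enum names are assumptions introduced for the sketch.
```python
from enum import Enum, auto

class AircraftState(Enum):
    NORMAL_FLIGHT_M200 = auto()
    ABNORMALITY_M211 = auto()
    FAILURE_M212 = auto()
    MANUAL_M213 = auto()
    LOW_BATTERY_M214 = auto()
    LAND_AT_EVACUATION_M221 = auto()
    EMERGENCY_STOP_M222 = auto()
    LAND_ON_SPOT_M223 = auto()
    RETURN_M224 = auto()
    FIXED_POSITION_M225 = auto()

S = AircraftState
# Allowed transitions, following the description of FIG. 10.
TRANSITIONS = {
    S.NORMAL_FLIGHT_M200:  {S.ABNORMALITY_M211, S.FAILURE_M212, S.MANUAL_M213,
                            S.LOW_BATTERY_M214, S.RETURN_M224},
    S.ABNORMALITY_M211:    {S.NORMAL_FLIGHT_M200, S.MANUAL_M213, S.LAND_AT_EVACUATION_M221},
    S.FAILURE_M212:        {S.MANUAL_M213, S.LAND_AT_EVACUATION_M221},
    S.MANUAL_M213:         {S.LAND_AT_EVACUATION_M221, S.EMERGENCY_STOP_M222,
                            S.LAND_ON_SPOT_M223, S.RETURN_M224, S.FIXED_POSITION_M225},
    S.LOW_BATTERY_M214:    {S.MANUAL_M213, S.RETURN_M224},
    S.FIXED_POSITION_M225: {S.NORMAL_FLIGHT_M200, S.RETURN_M224},
}

def transition(current: AircraftState, requested: AircraftState) -> AircraftState:
    """Return the new state if the transition is allowed; otherwise keep the current state."""
    return requested if requested in TRANSITIONS.get(current, set()) else current

assert transition(S.NORMAL_FLIGHT_M200, S.ABNORMALITY_M211) is S.ABNORMALITY_M211
assert transition(S.LOW_BATTERY_M214, S.EMERGENCY_STOP_M222) is S.LOW_BATTERY_M214
```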
  • the aircraft behavior state acquisition unit 322 is a functional unit that acquires the aircraft behavior state of the drone 100.
  • Each mode of the aircraft behavior state is a sub-mode of the aircraft state that is performed to realize a transition of the aircraft state.
  • FIG. 11 is a diagram showing state transitions of the aircraft's behavioral states.
  • the aircraft's behavioral states are broadly divided into a takeoff mode M300, an evacuation mode M310, a normal mode M320, and a landing mode M340, for example.
  • Takeoff mode M300 is a mode in which drone 100 takes off.
  • the state transition of the aircraft's behavior state starts from takeoff mode M300.
  • the aircraft's behavior state transitions from takeoff mode M300 to evacuation mode M310 or normal mode M320.
  • Evacuation mode M310 mainly includes evacuation point arrival stationary mode M311 and evacuation moving mode M312.
  • Normal mode M320 also includes point arrival stationary mode M321 and moving mode M322.
  • Evacuation mode M310 and normal mode M320 can transition to each other via temporary suspension mode M330. This is just one example.
  • the evacuation point arrival stationary mode M311 is a mode in which the drone moves to the evacuation point H200 and remains stationary there, i.e., hovers.
  • When it moves, the drone 100 in the evacuation point arrival stationary mode M311 transitions to the evacuation moving mode M312.
  • the aircraft behavior state transitions from the evacuation point arrival stationary mode M311 or the evacuation in-motion mode M312 to the temporary suspension mode M330.
  • Point arrival stationary mode M321 is a mode in which the drone moves to a specified destination and remains stationary on the spot, i.e., hovers.
  • When moving to another location under normal use conditions, the drone 100 in the point arrival stationary mode M321 transitions to the moving mode M322.
  • the aircraft behavior state transitions from point arrival stationary mode M321 or moving mode M322 to temporary suspension mode M330.
  • The drone 100 in the evacuation point arrival stationary mode M311, the point arrival stationary mode M321, the moving mode M322, or the temporary suspension mode M330 can transition to the landing mode M340.
  • the aircraft's operating state ends processing in the landing mode M340.
  • the game status acquisition unit 323 shown in FIG. 6 is a functional unit that acquires the game status of the competition held at the stadium F.
  • the game status acquisition unit 323 detects the game status by performing image processing on the captured image.
  • the game status acquisition unit 323 may also acquire the game status based on decision-related information input by the umpire to the external input device 600 or the umpire support system, which is an example of the external system 700.
  • the game status acquisition unit 323 may acquire the game status based on information input from the external input device 600 held by a team member, for example, the manager or coach.
  • FIG. 12 is a diagram showing an example of state transitions in a match state.
  • the match state includes a pre-match state M400, a normal play state M410, and an end-of-match state M460.
  • the state transition starts from the pre-match state M400, and transitions from the pre-match state M400 to the normal play state M410.
  • The normal play state M410 is a state in which the game is progressing. When the match ends, the state transitions from the normal play state M410 to the end-of-match state M460. Note that a transition from the normal play state M410 to the end-of-match state M460 may occur not only at the end of the match, but also during a break in the match, such as halftime.
  • the game state also includes a play suspended without foul play state M420 and a play suspended with foul play state M440.
  • When play is suspended for a reason other than foul play, a transition to the play suspended without foul play state M420 occurs.
  • the play suspended without foul play state M420 occurs, for example, when the ball crosses the goal line F110a, F110b or the touch line F111a, F111b and goes out of the court.
  • a transition to a throw-in state M421, a goal kick state M422, or a corner kick state M423 occurs in accordance with events that occur according to the rules of the game, such as the type of line the ball crossed or the affiliation of the player who kicked the ball out of the court.
  • the throw-in state M421, the goal kick state M422, or the corner kick state M423 transitions to the normal play state M410.
  • If a foul occurs, a transition to the foul state M431 occurs.
  • If an offside occurs or is recognized by the referee, a transition to the offside state M432 occurs.
  • A transition then occurs from the foul state M431 or the offside state M432 to the play suspended with foul play state M440.
  • From the play suspended with foul play state M440, a transition to the free kick state M441 or the penalty kick state M442 occurs depending on the location where the foul occurred and the event that occurred.
  • a so-called indirect free kick may be performed instead of a free kick.
  • the free kick state M441 may be subdivided into a free kick state for the attacking side and a free kick state for the defending side.
  • In the free kick state M441 and the penalty kick state M442, when the event in each state ends, the match is resumed and the match state transitions to the normal play state M410.
  • When the match ends, the normal play state M410 transitions to the end-of-match state M460, and the state transition for the match state ends.
  • the normal play state M410 may also transition to a penalty shootout state M443. Although not shown in the figure, the penalty shootout state M443 may transition to an end-of-match state M460, thereby terminating the state transition.
  • Some of the game states shown in FIG. 12 may trigger a change in flight mode, while other game states may not.
  • the flight mode may be changed based on a transition to the shaded states in the figure, i.e., the pre-game state M400, goal kick state M422, corner kick state M423, free kick state M441, penalty kick state M442, player substitution state M450, and end-of-game state M460.
  • the flight mode may be changed to one that corresponds to the offensive or defensive state.
  • the offensive and defensive state acquisition unit 324 is a functional unit that acquires the offensive and defensive states of the teams in the match held at the stadium F.
  • the offensive and defensive state acquisition unit 324 detects the offensive and defensive states by performing image processing on the captured images.
  • the offensive and defensive state acquisition unit 324 may also acquire the offensive and defensive states based on judgment-related information input by the umpire to the external input device 600 or the umpire support system, which is an example of the external system 700.
  • the offensive and defensive state acquisition unit 324 may acquire the offensive and defensive states based on information input from the external input device 600 held by a team member, for example, the manager or coach.
  • FIG. 13 is a diagram showing an example of a state transition between offensive and defensive states.
  • the figure shows an example of an offensive and defensive state in soccer.
  • the offensive and defensive state transitions to an offensive state M510 or a defensive state M520.
  • the offensive state M510 and the defensive state M520 transition to each other via an offensive/defensive change state M530 or an offensive/defensive uncertainty state M540.
  • the offensive state M510 is a state in which one of the teams (hereinafter also referred to as "Team A") designated in advance is on the offensive.
  • An offensive state is, for example, a state in which Team A is in possession of the ball and is advancing toward the other team (hereinafter also referred to as “Team B”), but is not limited to this and may be a predetermined state determined by any determination criterion stored in advance in the memory unit 380.
  • the attack state M510 includes an A team offensive (own side) state M511, an A team offensive (enemy side) state M512, and an A team quick attack state M513.
  • a transition is possible between the A team offensive (own side) state M511 and the A team offensive (enemy side) state M512, and between the A team offensive (own side) state M511 and the A team quick attack state M513.
  • a transition is also possible from the A team quick attack state M513 to the A team offensive (enemy side) state M512.
  • The defensive state M520 includes a team A defensive (opponent's half) state M521, a team A defensive (own half) state M522, and a team B fast attack state M523.
  • The team A defensive (opponent's half) state M521 and the team A defensive (own half) state M522 can transition to each other, as can the team A defensive (opponent's half) state M521 and the team B fast attack state M523. A transition can also be made from the team B fast attack state M523 to the team A defensive (own half) state M522.
  • the offense/defense switching state M530 and the offense/defense uncertain state M540 can be transitioned to from any of the following: Team A offensive (own side) state M511, Team A offensive (opponent's side) state M512, Team A quick attack state M513, Team A defensive (opponent's side) state M521, Team A defensive (own side) state M522, and Team B quick attack state M523.
  • the event detection unit 320 may determine an event based on input information from the external system 700, instead of or in addition to the above-mentioned acquisition units 321 to 324.
  • the event detection unit 320 may determine an external disturbance such as a strong wind as an event based on input information from a weather information system, which is an example of the external system 700.
  • the event detection unit 320 may also determine an event based on input information from a court facility system, which is another example of the external system 700, or facility information entered by a person involved with the court facility.
  • the flight mode switching unit 330 is a functional unit that switches the flight mode depending on the detection result by the event detection unit 320.
  • the flight mode switching unit 330 mainly has a mode switching determination unit 331, a flight permitted area switching unit 332, a geofence switching unit 333, and a flight path generation unit 334.
  • the mode switching determination unit 331 is a functional unit that determines whether or not a flight mode needs to be switched.
  • the mode switching determination unit 331 determines whether or not a flight mode needs to be switched depending on an event detected by the event detection unit 320.
  • The mode switching determination unit 331 refers to the table T1 (see FIG. 14), which associates the state detected by the event detection unit 320 with the flight mode for that state, and determines that the flight mode needs to be switched if the associated flight mode differs from the current flight mode.
  • the event-flight mode table T1 is a table in which events detected as game states are associated with flight modes to be selected for those events and stored. More specifically, for example, in the goal kick state M422 and the defending free kick state M441, the outer edge flight mode M102 is associated to avoid the risk of the ball hitting the drone 100. Also, in the penalty kick state M442 and the attacking free kick state M441, the inside court flight mode M105 is associated to capture the goal scene up close. In the pre-match state M400, penalty shootout state M443, and player substitution state M450, the movement of the target to be watched is small or nonexistent, so the fixed position flight mode M103 or M107 is associated.
  • Whether the off-court fixed position flight mode M103 or the on-court fixed position flight mode M107 is used may be determined according to the flight mode at the time of transition to the pre-match state M400, the penalty shootout state M443, or the player substitution state M450.
  • the on-court flight mode M105 is associated in order to take close-up shots of the players' expressions.
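  • A hypothetical, abbreviated encoding of the event-flight mode table T1 and of the decision made by the mode switching determination unit 331 might look as follows; the exact and complete contents of T1 are given by FIG. 14 and are rewritable, and the string labels below are assumptions introduced for the sketch.
```python
# Hypothetical fragment of the event-flight mode table T1 (see FIG. 14).
EVENT_FLIGHT_MODE_T1 = {
    "goal_kick_M422": "outer_edge_M102",           # avoid the risk of the ball hitting the drone
    "free_kick_defending_M441": "outer_edge_M102",
    "penalty_kick_M442": "on_court_M105",          # capture the goal scene up close
    "free_kick_attacking_M441": "on_court_M105",
    "pre_match_M400": "fixed_position_M103_or_M107",
    "penalty_shootout_M443": "fixed_position_M103_or_M107",
    "player_substitution_M450": "fixed_position_M103_or_M107",
}

def mode_switch_needed(current_mode: str, detected_event: str) -> tuple[bool, str]:
    """Return (switch_required, target_mode) for a detected game-state event."""
    target = EVENT_FLIGHT_MODE_T1.get(detected_event, current_mode)
    return target != current_mode, target

print(mode_switch_needed("outer_edge_M102", "penalty_kick_M442"))
# -> (True, 'on_court_M105')
```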
  • the permitted flight area switching unit 332 is a functional unit that switches the permitted flight area in response to switching of the flight mode.
  • the geofence switching unit 333 is a functional unit that switches geofences in response to switching of flight modes. For example, when the flight mode is the outer edge flight mode M102, the geofence switching unit 333 sets a geofence G100. When the flight mode is the inner court flight mode M105, the geofence switching unit 333 sets a geofence G200. In addition, in the intermediate modes that are intermediate in the transition between the outer edge flight mode M102 and the inner court flight mode M105, i.e., the inner court entry mode M104 and the outer court exit mode M106, a third geofence different from the geofences G100 and G200 in the outer edge flight mode M102 and the inner court flight mode M105 is set.
  • the flight path generating unit 334 is a functional unit that generates a flight path of the drone 100 during movement involving switching of flight modes.
  • the flight path generating unit 334 determines, for example, the shooting position at which the mode transitions to the inside-court entry mode M104 or the outside-court exit mode M106.
  • the flight path generating unit 334 also determines the shooting position at which the mode transitions from the inside-court entry mode M104 to the inside-court flight mode M105, or the shooting position at which the mode transitions from the outside-court exit mode M106 to the outer edge flight mode M102.
  • the flight path generating unit 334 generates a specific flight path in the inside-court entry mode M104 or the outside-court exit mode M106. This flight path is generated, in principle, within the area of the third geofence.
  • the outer edge flight control unit 340 is a functional unit that controls the flight of the drone 100 in the outer edge flight mode.
  • the outer edge flight control unit 340 has a target position acquisition unit 341 and a flight path generation unit 342.
  • the target position acquisition unit 341 is a functional unit that acquires the target position to which the drone 100 should head. This target position acquisition unit 341 acquires a target position that is located within the range of the flight area in the outer edge flight mode M102. The target position acquisition unit 341 may acquire a target position input by the user, for example, received via the target position reception unit 226 of the controller 200.
  • The flight path generating unit 342 is a functional unit that generates a flight path along which the drone 100 moves in the outer edge flight mode M102. In other words, the flight path generating unit 342 generates a flight path in the flyable area of the outer edge flight mode M102. When the drone 100 belongs to the flyable area of the outer edge flight mode M102 and the acquired target position also belongs to that flyable area, the flight path generating unit 342 generates a flight path from the current position to the target position. In addition, when the drone 100 moves across the outer edge flight mode M102 and the on-court flight mode M105, the flight path generating unit 342 may generate a flight path within the flight range of the outer edge flight mode M102. The flight path generating unit 342 moves the drone 100, for example, along the touch line F111b.
  • For example, the flight path generation unit 342 may move the drone 100 into the area outside the court F200, beyond the touchline F111b.
  • In that case, the flight path generation unit 342 may photograph the area directly below the drone 100, or photograph toward the court F100 from that point. With this configuration, it is possible to follow and shoot the ball even if the ball rolls into the area outside the court F200.
  • the geofence G100 of the outer edge flight mode M102 may be set in advance to extend beyond the touchline F111b to the outside of the court F100. With this configuration, even if the drone 100 follows the ball and flies slightly outside the touchline F111b as described above, the drone 100 can be reliably maintained within the geofence G100 without resetting the geofence G100.
  • When the flight path generating unit 342 detects an obstacle on the flight path or near the drone 100, it regenerates a flight path that bypasses the obstacle on the inside of the court F100. Alternatively, when it detects an obstacle, the flight path generating unit 342 may hover for a predetermined time and then move along the originally generated flight path. This is because, while the safety of the drone 100 is not ensured in the area outside the court F200 within the stadium F, the safety of the drone 100 inside the court F100 is highly likely to be ensured.
  • the obstacle may be detected, for example, by the obstacle detecting unit 130 of the drone 100, or may be detected by information from an external system 700 or the like.
  • When an obstacle is detected, the outer edge flight control unit 340 may switch to manual control after causing the flight path generation unit 342 to hover for a predetermined time. Furthermore, when an obstacle is detected, the outer edge flight control unit 340 may cause the flight path generation unit 342 to hover for a predetermined time, and then display a message prompting the user to re-input the target position via the display control unit 210.
  • the on-court flight control unit 350 is a functional unit that controls the flight of the drone 100 in the on-court flight mode M105.
  • the on-court flight control unit 350 has a target position acquisition unit 351 and a flight path generation unit 352.
  • the target position acquisition unit 351 is a functional unit that acquires a target position located within the range of the flight area in the on-court flight mode M105.
  • the flight path generating unit 352 generates a flight path along which the drone 100 moves in the on-court flight mode M105. That is, the flight path generating unit 352 generates a flight path in a flyable area in the on-court flight mode M105. More specifically, the flight path generating unit 352 generates a flight path by connecting multiple preset shooting positions in the on-court flight mode M105. Like the flight path generating unit 342 of the outer edge flight control unit 340, the flight path generating unit 352 may generate a flight path in a flight range in the on-court flight mode M105 when the current position and the target position belong to flyable areas of different flight modes.
  • When the flight path generating unit 352 detects an obstacle on the flight path or near the drone 100, it regenerates a flight path that bypasses the obstacle.
  • the obstacle is detected by, for example, the obstacle detection unit 130.
  • the flight path generating unit 352 may regenerate the flight path by changing the connection between multiple shooting positions that have been set in advance, or may change the flight path to a higher altitude while maintaining the flight path on a plane.
  • the flight path generating unit 352 may hover for a predetermined time when it detects an obstacle, and then move along the flight path that was initially generated.
  • When an obstacle is detected, the in-court flight control unit 350 may switch to manual control after causing the flight path generating unit 352 to hover for a predetermined time.
  • Alternatively, when an obstacle is detected, the in-court flight control unit 350 may cause the flight path generating unit 352 to hover for a predetermined time and then display a message prompting the user to re-input the target position via the display control unit 210.
  • the obstacle may be, for example, a bird, a fixed facility, or a player.
  • the obstacles also include balls.
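  • As a concrete illustration of one of the avoidance options mentioned above (raising the altitude of the flight path while keeping its plan view), a minimal sketch is shown below; the climb margin and waypoint format are assumptions, and bypassing toward the inside of the court or hovering are alternative responses described above.
```python
def bypass_by_climbing(route, obstacle_index, climb_m=3.0):
    """Regenerate the route by raising the altitude of the waypoints around an obstacle,
    one of the avoidance options mentioned for the flight path generating unit 352.
    route: list of (x, y, z) waypoints; obstacle_index: index of the waypoint nearest
    the obstacle. Values are illustrative only."""
    new_route = []
    for i, (x, y, z) in enumerate(route):
        if abs(i - obstacle_index) <= 1:          # lift the segment around the obstacle
            new_route.append((x, y, z + climb_m))
        else:
            new_route.append((x, y, z))
    return new_route

original = [(0.0, 0.0, 10.0), (20.0, 0.0, 10.0), (40.0, 0.0, 10.0), (60.0, 0.0, 10.0)]
print(bypass_by_climbing(original, obstacle_index=1))
# -> [(0.0, 0.0, 13.0), (20.0, 0.0, 13.0), (40.0, 0.0, 13.0), (60.0, 0.0, 10.0)]
```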
  • flight control in the outer edge flight mode M102 is performed by the outer edge flight control unit 340
  • flight control in the inner court flight mode M105 is performed by the inner court flight control unit 350.
  • the shooting positions defined in each of the outer edge flight mode M102 and the inner court flight mode M105 are presented as options, and a flight path to the selected target shooting position is generated.
  • However, the technical scope of the present invention is not limited to this, and the shooting position and orientation of the drone 100 may be controlled via the controller 200 so that the drone flies freely at any position in the area within the geofences G100 and G200 set corresponding to each flight mode.
  • flight path generation units 334, 342, and 352 are an example, and for example, a single flight path generation unit may generate the flight path without subdivision.
  • FIG. 15 is a schematic diagram showing the routes that the drone 100 can follow, defined by the shooting positions L101-L105, L206-L215, and the evacuation point H200.
  • Point L101g on the ground at the shooting position L101 is the takeoff and landing point for the drone 100.
  • the position transition of the drone 100 begins with the step of taking off from point L101g and arriving at the shooting position L101.
  • the drone 100 also descends at the shooting position L101, and lands at point L101g to end shooting.
  • the drone 100 may only be able to transition to adjacent shooting positions.
  • the point to which the drone 100 at shooting position L105 can transition while maintaining the outer edge flight mode is shooting position L104.
  • The points to which the drone 100 at shooting position L105 can transition after switching to the inside court flight mode M105 are shooting positions L206 and L207.
  • the flight path generation unit 334, 342, or 352 (see FIG. 6) generates a flight path for the drone 100 by referring to the possible transition paths.
  • the drone 100 transitions to the selected shooting position via the available shooting positions. For example, when the drone 100 is at shooting position L105 and shooting position L215 is selected, the flight path generating unit 334, 342, or 352 (see FIG. 6) generates a flight path that transitions through shooting positions L105, L207, L208, L213, L212, and L215 in that order, and the drone 100 flies along this flight path.
  • The drone 100 may transition to an adjacent shooting position when the transition of the shooting position involves a flight mode switch, and may transition directly to a non-adjacent shooting position when the transition does not involve a flight mode switch. That is, for example, when the drone moves from shooting position L105 to shooting position L215, it may transition from shooting position L105 to shooting position L207 with a mode switch, and then move linearly through the area within the geofence G200 from shooting position L207 to shooting position L215.
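  • The movement between shooting positions described above can be viewed as a shortest-path search over the adjacency of FIG. 15; the sketch below runs a breadth-first search over a hypothetical fragment of that adjacency containing only the edges implied by the example above.
```python
from collections import deque

# Hypothetical fragment of the adjacency implied by FIG. 15 and the example above;
# the full graph is defined by the routes connecting L101-L105 and L206-L215.
ADJACENCY = {
    "L104": {"L105"},
    "L105": {"L104", "L206", "L207"},
    "L206": {"L105", "L207"},
    "L207": {"L105", "L206", "L208"},
    "L208": {"L207", "L213"},
    "L213": {"L208", "L212"},
    "L212": {"L213", "L215"},
    "L215": {"L212"},
}

def shortest_transition_path(start: str, goal: str) -> list[str]:
    """Breadth-first search returning one shortest sequence of shooting positions."""
    queue, came_from = deque([start]), {start: None}
    while queue:
        node = queue.popleft()
        if node == goal:
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        for neighbour in ADJACENCY.get(node, ()):
            if neighbour not in came_from:
                came_from[neighbour] = node
                queue.append(neighbour)
    return []

print(shortest_transition_path("L105", "L215"))
# -> ['L105', 'L207', 'L208', 'L213', 'L212', 'L215']
```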
  • the outer edge flight control unit 340 and the inner court flight control unit 350 autonomously fly the drone 100 in each flight area according to the flight mode.
  • the outer edge flight control unit 340 and the inner court flight control unit 350 may perform dolly shooting within each flight area, that is, the drone 100 may automatically follow and shoot a specific object such as a ball or a designated player.
  • the outer edge flight control unit 340 and the inner court flight control unit 350 may also automatically control the direction of the nose of the drone 100 or the direction of the shooting camera 141, that is, the so-called shooting direction.
  • the outer edge flight control unit 340 and the inner court flight control unit 350 may automatically control the flight height of the drone 100.
  • the autonomous flight mode may differ depending on the flight mode.
  • For example, dolly shooting may be performed when the drone is controlled by the outer edge flight control unit 340, while automatic tracking shooting of only the shooting direction with a fixed shooting position, or automatic tracking shooting of both the position and the shooting direction, may be performed when it is controlled by the inner court flight control unit 350.
  • the outer edge flight control unit 340 and the in-court flight control unit 350 may generate a flight path within the court (within the stadium F) for moving the drone 100 to a target position specified by the user in each flight area.
  • the fixed position flight control unit 360 is a functional unit that controls the flight of the drone 100 in the off-court fixed position flight mode M103 and the on-court fixed position flight mode M107. In the fixed position flight mode, the fixed position flight control unit 360 hovers at a predetermined position and controls the nose direction or the direction of the shooting camera 141 to follow a specific player or the ball and perform automatic shooting. Note that the above-mentioned "control of direction” is a concept that includes control not only in the left-right direction (so-called “pan direction”) but also in the up-down direction (so-called "tilt direction").
  • the fixed position flight control unit 360 includes a mode switching permission determination unit 361.
  • the mode switching permission determination unit 361 is a functional unit that determines whether or not the flight mode can be switched in the fixed position flight mode M103 or M107.
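  • In the fixed position flight modes, following a player or the ball with the nose or the shooting camera 141 reduces to computing pan and tilt angles from the hover position toward the target; the following sketch assumes simple Cartesian coordinates and angle conventions that are not specified in the embodiment.
```python
import math

def pan_tilt_to_target(drone_pos, target_pos):
    """Return (pan_deg, tilt_deg) pointing from a hovering drone toward a target.
    Pan is measured counter-clockwise from the +x axis in the horizontal plane;
    tilt is negative when the target is below the drone. Coordinate and angle
    conventions are illustrative assumptions."""
    dx = target_pos[0] - drone_pos[0]
    dy = target_pos[1] - drone_pos[1]
    dz = target_pos[2] - drone_pos[2]
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt

# Drone hovering 20 m above a corner, ball at mid-field on the ground.
print(pan_tilt_to_target((0.0, 0.0, 20.0), (52.5, 34.0, 0.0)))
```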
  • the communication unit 370 has a modem or the like (not shown) and is capable of communicating with the drone 100, the controller 200, and the like via the communication network 400.
  • the communication unit 370 may, for example, monitor the state of the drone 100 and its surroundings and notify the controller 200.
  • the memory unit 380 is a functional unit, such as a database, that stores information related to flight control of the drone 100.
  • The memory unit 380 stores, for example, the coordinates of the multiple shooting positions L101-L105 and L206-L215 in the stadium F. These coordinates may be two-dimensional coordinates on a plane or three-dimensional coordinates including information in the height direction.
  • the memory unit 380 also stores an event-flight mode table T1 shown in FIG. 14. Furthermore, the memory unit 380 may store a strong wind state detected as an event in association with a fixed position flight mode M103 or M107. This is because in the case of strong winds, the drone 100 may not be able to fly in the intended direction, so it is safer not to move the drone 100.
  • the event-flight mode table T1 is recorded in a rewritable manner. Furthermore, multiple event-flight mode tables T1 may be stored.
  • (Flowcharts) FIG. 16 is a flowchart showing the overall flow of aerial photography control in this embodiment.
  • Fig. 17 is a subroutine of the flight restriction process S1002 in Fig. 16.
  • Fig. 18 is a subroutine of the flight mode switching process S1010 in Fig. 16.
  • the control described in the flowchart shown in FIG. 16 is executed in a regular loop. As shown in FIG. 16, if it is detected that the drone 100 is approaching the vicinity of geofences G100, G200 while flying (YES in step S1001), the process proceeds to flight restriction processing in step S1002. The subroutine of the flight restriction processing S1002 is explained in FIG. 17.
  • If it is not detected in step S1001 that the drone is approaching the vicinity of the geofences G100, G200 (NO in step S1001), the presence or absence of an obstacle on the path or near the drone 100 is detected (step S1003). If an obstacle is detected in step S1003 (YES in step S1003), the drone 100 is caused to hover or a detour route is generated, and the flight route of the drone 100 is changed to the detour route (step S1004).
  • If no obstacle is detected in step S1003 (NO in step S1003), the presence or absence of an action determination of the aircraft state is detected (step S1005). If an action determination of the aircraft state is detected in step S1005 (YES in step S1005), the process proceeds to step S1006, where an event type determination process is executed (step S1006).
  • If no action determination is detected in step S1005 (NO in step S1005), it is determined whether or not there is an input from the controller 200 by the user (step S1007). If an input from the controller 200 is detected in step S1007 (YES in step S1007), a command based on the input is executed (step S1008).
  • If no input from the controller 200 is detected in step S1007 (NO in step S1007), the presence or absence of an event is determined (step S1009). If an event is detected (YES in step S1009), the process proceeds to the flight mode switching processing of step S1010 (step S1010). If no event is detected in step S1009 (NO in step S1009), the process returns to step S1001 and steps S1001 to S1009 are repeated.
  • the overall processing of aerial photography control is performed in the following order: geofence restriction, obstacle detection, control based on aircraft state, control based on user input, and control based on in-game events such as the game state or offensive and defensive states.
  • In other words, each of these controls is executed before the control based on in-game events.
  • This order is based on the priority of performing safe control processing. With this configuration, the safety of flying the drone 100 can be more reliably guaranteed.
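  • The priority ordering described above (geofence restriction first, in-game events last) can be sketched as a prioritized handler loop; every function name below is a placeholder introduced for the sketch and is not an API of the embodiment.
```python
def control_cycle(checks):
    """Run one regular control cycle over (condition, handler) pairs, highest priority
    first, mirroring the order of steps S1001 to S1010 in FIG. 16; only the first
    matching handler is executed per cycle."""
    for condition, handler in checks:
        if condition():
            return handler()
    return None  # nothing to do this cycle

# Stub predicates and handlers (placeholders only, not part of the embodiment).
def near_geofence(): return False
def obstacle_detected(): return False
def aircraft_action_due(): return False
def controller_input(): return False
def game_event_detected(): return True

def flight_restriction_process(): return "S1002"
def hover_or_detour(): return "S1004"
def run_event_type_determination(): return "S1006"
def execute_user_command(): return "S1008"
def flight_mode_switching_process(): return "S1010"

CHECKS = [
    (near_geofence, flight_restriction_process),         # S1001 -> S1002
    (obstacle_detected, hover_or_detour),                 # S1003 -> S1004
    (aircraft_action_due, run_event_type_determination),  # S1005 -> S1006
    (controller_input, execute_user_command),             # S1007 -> S1008
    (game_event_detected, flight_mode_switching_process), # S1009 -> S1010
]
print(control_cycle(CHECKS))  # -> 'S1010' with the stub values above
```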
  • In step S1101, the flight control unit 123 of the drone 100 issues an operation command to restrict the drone 100 from advancing outside the geofence.
  • That is, in step S1101, a restriction is set on the flight target position so that the drone 100 does not advance outside the geofence even when the drone 100 is manually operated.
  • If, in step S1102, the drone 100 has not advanced outside the geofence (NO in step S1102), the process ends.
  • If the drone 100 has advanced outside the geofence (YES in step S1102), the process proceeds to step S1103. Possible causes of this situation include, for example, the wind being too strong and sweeping the aircraft away, or a malfunction of the drone 100 preventing it from flying in the intended direction.
  • In step S1103, the flight control unit 123 issues an operation command to return to the inside of the geofence. More specifically, in step S1103, a flight target position command to return to the geofence, that is, an operation command to set the flight target position to a specified point inside the geofence, is given to the drone 100.
  • In step S1104, by referring to information measured by the measurement unit 110 of the drone 100, such as information on position, direction, altitude, or speed, it is determined whether the drone 100 is approaching the inside of the geofence.
  • Step S1104 is executed a predetermined time after step S1103. Note that in step S1104 it is sufficient if the drone 100 is closer to the inside of the geofence than it was at the time of step S1103; it is not necessary to determine whether the drone 100 is located inside the geofence.
  • If it is determined in step S1104 that the drone 100 is not approaching the geofence (NO in step S1104), it is determined that the above operation command is ineffective, and the flight control unit 123 forces the drone 100 to land (step S1105).
  • In step S1106, information measured by the measurement unit 110 of the drone 100, such as the position and altitude, is referenced to determine whether the drone 100 is located within the geofence. If the drone 100 is located within the geofence (YES in step S1106), the process ends. If the drone 100 is not located within the geofence (NO in step S1106), the process returns to step S1104 and operation based on the command to return to the geofence continues until the drone 100 returns to within the geofence.
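  • A simplified, non-authoritative reading of the flight restriction subroutine of FIG. 17 is sketched below; the callables stand in for drone commands, and the test for "approaching the inside of the geofence" is approximated by the distance to the fence centre decreasing.
```python
import math
import time

def inside(fence, pos):
    """fence = (x_min, y_min, x_max, y_max); pos = (x, y)."""
    x_min, y_min, x_max, y_max = fence
    return x_min <= pos[0] <= x_max and y_min <= pos[1] <= y_max

def distance_to_center(fence, pos):
    x_min, y_min, x_max, y_max = fence
    cx, cy = (x_min + x_max) / 2.0, (y_min + y_max) / 2.0
    return math.hypot(pos[0] - cx, pos[1] - cy)

def flight_restriction(get_position, command_return, force_land, fence,
                       check_interval_s=1.0, max_checks=10):
    """Simplified reading of steps S1102 to S1106. The callables are placeholders for
    drone commands; the policy details are assumptions made for this sketch."""
    if inside(fence, get_position()):            # S1102: the drone never left the fence
        return "inside"
    command_return()                              # S1103: target a point inside the fence
    previous = get_position()
    for _ in range(max_checks):
        time.sleep(check_interval_s)              # S1104 is evaluated after a predetermined time
        current = get_position()
        if distance_to_center(fence, current) >= distance_to_center(fence, previous):
            force_land()                          # S1105: return command judged ineffective
            return "forced_landing"
        if inside(fence, current):                # S1106: back inside the fence
            return "returned"
        previous = current
    return "still_returning"
```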
  • FIG. 18 is a diagram showing an example of a switching process flow for switching flight modes.
  • FIG. 18 shows a process flow involving switching from the outer edge flight mode M102 to the inside court flight mode M105, and in particular shows a process flow including switching flight modes and generating a flight path when moving from shooting position L101 to shooting position L215. Note that the following explanation assumes that automatic switching of flight modes has occurred based on event detection, but the flight mode switching process itself is the same when the flight mode switch is input by manual intervention.
  • the flight mode switching unit 330 switches the flight mode from the outer edge flight mode M102 to the inside court entry mode M104, and the geofence switching unit 333 activates the geofence corresponding to the inside court entry mode M104 (step S1201).
  • the geofence corresponding to the inside court entry mode M104 is an integrated geofence defined by the area that combines the geofence G100 of the outer edge flight mode M102 and the geofence G200 of the inside court flight mode M105.
  • the flight path generating unit 334 generates a flight path from the photographing position L101 to the photographing position L215 (step S1202).
  • the flight path may be a flight path that transitions between adjacent photographing positions along the path shown in FIG. 15, or may be a flight path that linearly connects the photographing position L101 to the photographing position L215.
  • the flight path may be generated by the flight path generating unit 342 or 352 instead of or in addition to the flight path generating unit 334.
  • Next, it is determined whether the generated flight path is generated inside the integrated geofence (step S1203). If at least a part of the flight path is not generated inside the integrated geofence (NO in step S1203), the flight path is corrected by the flight path generation unit 334 (step S1204), and the process returns to step S1203.
  • This configuration makes it possible to prevent the drone 100 from going outside the geofence while moving, thereby ensuring high safety. If the flight route is generated within the integrated geofence (YES in step S1203), the process proceeds to step S1205.
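  • Steps S1203 and S1204 amount to checking every part of the generated route against the integrated geofence and correcting the offending portions; the waypoint-level sketch below uses a rectangular fence with hypothetical dimensions, and a fuller implementation would also check the segments between waypoints and the altitude limits.
```python
def route_inside_fence(route, fence):
    """route: list of (x, y) waypoints; fence: (x_min, y_min, x_max, y_max)."""
    x_min, y_min, x_max, y_max = fence
    return all(x_min <= x <= x_max and y_min <= y <= y_max for x, y in route)

def clamp_route_to_fence(route, fence):
    """Waypoint-level correction only: pull each outlying waypoint back to the fence edge."""
    x_min, y_min, x_max, y_max = fence
    return [(min(max(x, x_min), x_max), min(max(y, y_min), y_max)) for x, y in route]

integrated = (-5.0, -5.0, 110.0, 68.0)             # hypothetical integrated geofence
route = [(0.0, 0.0), (55.0, 70.0), (105.0, 60.0)]  # second waypoint lies outside
if not route_inside_fence(route, integrated):       # S1203
    route = clamp_route_to_fence(route, integrated) # S1204: correct the route
assert route_inside_fence(route, integrated)
```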
  • Next, a command to move to the image capture position L215 is sent (step S1205), and the drone 100 starts moving.
  • Next, it is determined whether or not there is a command from the controller 200 to exit the court F100 (step S1206). If no command to exit is received (NO in step S1206), step S1206 is repeated until the movement into the court F100 is completed (NO in step S1207).
  • When the movement into the court F100 is completed (YES in step S1207), the flight mode switching unit 330 changes the flight mode to the in-court flight mode M105, and the geofence switching unit 333 switches the geofence to the geofence G200 corresponding to the in-court flight mode M105 (step S1208).
  • the drone 100 continues flying, and when it reaches the destination, the shooting position L215 (YES in step S1220), the process ends.
  • If, in step S1206, a command to exit from the court F100 is received from the controller 200 (YES in step S1206), the process proceeds to step S1209.
  • the flight path generation unit 334 generates a flight path to the input destination outside court F100 (step S1209).
  • the flight control unit 123 starts moving the drone 100 along that flight path. Movement continues until movement to the destination outside court F100 is complete (NO in step S1210), and when movement to the destination is complete (YES in step S1210), the process proceeds to step S1211.
  • In step S1211, the flight mode switching unit 330 changes the flight mode to the outer edge flight mode M102, and the geofence switching unit 333 switches the geofence to the geofence G100 corresponding to the outer edge flight mode M102.
  • The drone 100 then continues moving within the geofence G100, and the process ends when the movement to the destination is completed (step S1220).
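  • Putting the steps of FIG. 18 together, the transition from the outer edge flight mode M102 to the in-court flight mode M105 can be condensed into the following illustrative sequence; all callables are placeholders, and the exit branch (steps S1209 to S1211) is only detected, not executed, in this sketch.
```python
def entry_sequence(set_geofence, generate_route, route_inside, correct_route,
                   start_moving, arrived_in_court, exit_requested, max_steps=1000):
    """Condensed reading of FIG. 18, steps S1201 to S1208. Every callable is a
    placeholder standing in for the corresponding functional unit."""
    set_geofence("integrated(G100+G200)")          # S1201: on-court entry mode M104
    route = generate_route()                        # S1202
    while not route_inside(route):                  # S1203
        route = correct_route(route)                # S1204
    start_moving(route)                             # S1205
    for _ in range(max_steps):
        if exit_requested():                        # S1206: would branch to S1209
            return "exit_branch"
        if arrived_in_court():                      # S1207
            set_geofence("G200")                    # S1208: on-court flight mode M105
            return "on_court_M105"
    return "timeout"
```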
  • FIGS. 19 and 20 are examples of screens G1 and G2 displayed on the display unit 201 of the controller 200 by the display control unit 210.
  • FIG. 19 is an example of the screen G1 displayed on the display unit 201 of the controller 200 by the display control unit 210.
  • The screen G1 shown in FIG. 19 displays: a field map G10 that shows a schematic bird's-eye view of the stadium F and the shooting positions L101 to L215; an icon G11 that shows the position information of the drone 100; a shooting range G12 that is being shot by the shooting camera 141; a display field G21 of the flight mode to which the drone 100 belongs; a status display field G22 that shows the statuses detected by the event detection unit, such as the aircraft status, aircraft behavior status, game status, and offensive and defensive status; a mode switching button G31 that accepts switching operations of the flight mode; a control switching button G32 that accepts switching operations between automatic control and manual control; and a video field G40 in which images captured by the drone 100 are displayed.
  • the mode switching button G31 displays the switchable modes, and in FIG. 19, it is shown that it is possible to switch to the in-court flight mode.
  • the control switching button G32 displays whether it is possible to switch to automatic control or manual control.
  • Figure 19 shows that switching to manual intervention is possible.
  • the position and shooting direction of the drone 100 may be controlled manually, or automatic tracking control of the ball or a specific player may be performed.
  • During automatic tracking control, information about the ball or specific player being tracked may be displayed on the screen G1.
  • FIG. 20 shows an example of the display of screen G2 displayed by the display control unit 210 when the drone 100 is moving.
  • FIG. 20 shows that the drone 100 is flying from the shooting position L208 to the shooting position L206 in the on-court flight mode M105.
  • the screen G2 also displays information on the destination, i.e., the target position, shooting position L206. More specifically, the drone 100 is flying between the shooting positions L206 and L208, and an arrow extending from the drone 100 to the destination shooting position L206 is displayed.
  • The shooting position L206, which is the target position, is displayed in a different manner from the other shooting positions, for example highlighted. More specifically, the shooting position L206 is displayed in bold or in a large font.
  • the status display field G22 also displays that the aircraft behavior status is "moving".
  • the nose direction of the drone 100 does not necessarily have to be the direction of travel of the drone 100, but may be in any direction. Furthermore, the nose direction of the drone 100 does not have to be constant while moving, and for example, the drone 100 may move while photographing a player or the ball by rotating in a yaw motion.
  • the present invention is not limited to the above embodiment, and various configurations can be adopted based on the contents of this specification.
  • the series of processes described in relation to the above embodiment may be implemented using software, hardware, or a combination of software and hardware.
  • a computer program for implementing each function of the server 300 according to this embodiment may be created and implemented in a PC or the like.
  • a computer-readable recording medium on which such a computer program is stored may also be provided. Examples of the recording medium include a magnetic disk, an optical disk, a magneto-optical disk, and a flash memory.
  • the above computer program may also be distributed, for example, via the communication network 400, without using a recording medium.
  • Reference numerals: 1 Aerial photography system; 100 Drone (mobile body); 141 Shooting camera; 200 Pilot (controller); 220 Input control unit; 300 Server; 320 Event detection unit; 330 Flight mode switching unit; 331 Mode switching determination unit; 380 Memory unit; F Stadium; F100 Court; F200 Area outside the court; M102 Outer edge flight mode; M105 Inner court flight mode

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

[Problem] To ensure safety during aerial photography. [Solution] A mobile body system 1 comprising a mobile body 100 that flies in a subject area F and a flight mode switching unit 330 for switching flight modes M102 and M105 in which flyable areas for the mobile body and mobile-body geofences G100 and G200 obtained by defining a region encompassing flyable areas are associated with each other, wherein the flight mode switching unit transitions to a second flight mode M105 via third flight modes M104 and M106 when switching from the first flight mode M102 to the second flight mode, and a third geofence of the third flight mode differs from a first geofence G100 of the first flight mode and a second geofence G200 of the second flight mode.

Description

MOBILE SYSTEM, AERIAL PHOTOGRAPHY SYSTEM, AERIAL PHOTOGRAPHY METHOD, AND AERIAL PHOTOGRAPHY PROGRAM
The present invention relates to a mobile system, an aerial photography system, an aerial photography method, and an aerial photography program.
Patent Document 1 discloses a camera viewpoint display system that detects the aircraft's position and nose direction, as well as the pan and tilt angles of a camera device mounted on the aircraft, calculates the camera viewpoint from each of these pieces of data, and displays the viewpoint on a map on a monitor screen. With this system, an operator controls the aircraft's position and attitude, as well as the camera's shooting direction, while grasping the aircraft's position and heading from a ground station.
JP 2006-281830 A
In order to effectively photograph sports and other events, it is necessary to shoot in a way that is appropriate to the detailed content of the subject, such as the game situation. More specifically, it is preferable to shoot by switching between shooting modes that define the flying area of the aircraft, shooting direction, etc., depending on the content. In addition, safety can be ensured by defining a geofence for the aircraft for each shooting mode.
The system described in Patent Document 1 was complicated because it required simultaneous control of the camera and the aircraft. Furthermore, in the system described in Patent Document 1, the operation of switching between shooting modes is cumbersome and not easy. In particular, when it is necessary to switch geofences in response to changes in the flight area of the aircraft, if the switching is not done at the appropriate time, control interference will occur and the aircraft may not move as intended.
The present invention was made in consideration of the above-mentioned problems, and aims to provide a mobile system that can ensure safety during aerial flight.
 上記目的を達成するため、本発明の一の観点に係る移動体システムは、対象エリアを飛行する移動体と、前記移動体の飛行可能エリアと、前記飛行可能エリアを包含する領域が規定されてなる前記移動体のジオフェンスと、が対応付けられてなる飛行モードを切り替える飛行モード切替部と、を備え、前記飛行モード切替部は、第1飛行モードから第2飛行モードに切替を行う場合に、第3飛行モードを介して前記第2飛行モードへ遷移させ、前記第3飛行モードにおける第3ジオフェンスは、前記第1飛行モードにおける第1ジオフェンスおよび前記第2飛行モードにおける第2ジオフェンスとは異なる。 In order to achieve the above object, a mobile body system according to one aspect of the present invention includes a mobile body that flies in a target area, and a flight mode switching unit that switches flight modes that correspond to a flyable area of the mobile body and a geofence of the mobile body that defines an area that includes the flyable area, and when switching from a first flight mode to a second flight mode, the flight mode switching unit transitions to the second flight mode via a third flight mode, and the third geofence in the third flight mode is different from the first geofence in the first flight mode and the second geofence in the second flight mode.
 前記第3ジオフェンスは、前記第1ジオフェンスで定義される第1エリアと、前記第2ジオフェンスで定義される第2エリアと、を統合した統合エリアを覆う領域が規定されるものとしてもよい。 The third geofence may define an area covering an integrated area that combines a first area defined by the first geofence and a second area defined by the second geofence.
 前記第3ジオフェンスは、前記第1ジオフェンスで定義される第1エリアと、前記第2ジオフェンスで定義される第2エリアと、前記第1エリアと前記第2エリアとの空隙と、を統合した統合エリアを覆う領域が規定されるものとしてもよい。 The third geofence may define an area covering an integrated area that combines a first area defined by the first geofence, a second area defined by the second geofence, and the gap between the first area and the second area.
 前記第1飛行モードおよび前記第2飛行モードの一方は、前記対象エリアの外縁上を飛行する外縁飛行モードであり、前記第1飛行モードおよび前記第2飛行モードの他方は、前記対象エリアの上空を飛行する対象エリア内飛行モードであるものとしてもよい。 One of the first flight mode and the second flight mode may be an edge flight mode in which the aircraft flies along the outer edge of the target area, and the other of the first flight mode and the second flight mode may be an intra-target area flight mode in which the aircraft flies above the target area.
 前記第3飛行モードにおいて前記移動体の飛行経路を生成する飛行経路生成部をさらに備え、前記飛行経路の少なくとも一部が前記第3ジオフェンスの外部に生成されている場合に、前記飛行経路を変更するものとしてもよい。 The device may further include a flight path generation unit that generates a flight path for the moving body in the third flight mode, and that changes the flight path when at least a portion of the flight path is generated outside the third geofence.
 前記移動体が、設定されている前記第1ジオフェンス、前記第2ジオフェンス又は前記第3ジオフェンスのいずれかの外部に逸脱したことを検知した場合、前記移動体を着陸、ホバリング、前記第1エリア又は前記第2エリア又は前記統合エリアの内側方向への移動、の少なくともいずれかの動作を前記移動体に実行させるものとしてもよい。 If it is detected that the moving body has deviated outside the first geofence, the second geofence, or the third geofence that has been set, the moving body may be caused to perform at least one of the following actions: landing, hovering, or moving inwardly of the first area, the second area, or the integrated area.
 また、上記目的を達成するため、本発明の一の観点に係る空中撮影システムは、対象エリアを飛行する移動体と、前記対象エリアを撮影するカメラにより取得される画像又は外部システムからの入力に基づいてイベントを検出するイベント検出部と、少なくとも前記移動体の飛行可能エリアが定められてなる飛行モードの切替要否を判定するモード切替判定部と、を備え、前記飛行モードは、前記対象エリア内に構成されるコートの外縁の一部又は全部に沿って、前記外縁の上空を飛行する外縁飛行モードと、前記コート内の上空を飛行するコート内飛行モードと、を少なくとも含み、前記モード切替判定部は、前記イベント検出部により検出されるイベントに応じて、前記飛行モードの切替要否を判定する。 In order to achieve the above object, an aerial photography system according to one aspect of the present invention includes a moving body that flies in a target area, an event detection unit that detects an event based on an image acquired by a camera photographing the target area or on an input from an external system, and a mode switching determination unit that determines whether or not to switch flight modes in which at least a flyable area of the moving body is defined. The flight modes include at least an outer edge flight mode in which the moving body flies above part or all of the outer edge of a court formed within the target area, along that outer edge, and an in-court flight mode in which the moving body flies above the inside of the court. The mode switching determination unit determines whether or not to switch the flight mode depending on the event detected by the event detection unit.
 前記モード切替判定部は、前記カメラにより撮影される撮影画像により検出した、前記対象エリアで行われている試合に関するイベントに応じて前記飛行モードを切り替えるものとしてもよい。 The mode switching determination unit may switch the flight mode in response to an event related to a match taking place in the target area, which is detected from an image captured by the camera.
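 As a rough, non-authoritative sketch of the mode switching determination described above, the following Python fragment maps detected match events to target flight modes; the event names, mode names, and the mapping itself are hypothetical examples and are not values specified in the embodiment.

```python
# Illustrative sketch only: the event-to-mode table below is a hypothetical example.
EVENT_TO_FLIGHT_MODE = {
    "kickoff":          "in_court_flight",
    "goal":             "outer_edge_flight",
    "half_time":        "outer_edge_flight",
    "ball_out_of_play": "fixed_position_flight",
}

def decide_mode_switch(current_mode: str, event: str) -> tuple[bool, str]:
    """Return (switch_required, target_mode) for a detected match event."""
    target = EVENT_TO_FLIGHT_MODE.get(event, current_mode)
    return (target != current_mode, target)
```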
 前記コート内飛行モードにおいて、前記移動体が前記対象エリアの特定の選手又はボールに自動追従するための飛行経路、又はユーザにより指定される目標位置に前記移動体を移動させるための飛行経路を前記コート内において生成するコート内飛行制御部をさらに備えるものとしてもよい。 The system may further include an in-court flight control unit that generates, in the in-court flight mode, a flight path within the court for the moving body to automatically follow a specific player or ball in the target area, or a flight path for moving the moving body to a target position specified by a user.
 前記移動体の情報を操作画面に表示する表示制御部をさらに備え、前記表示制御部は、前記自動追従をして飛行している場合に、追従対象となる前記特定の選手又は前記ボールを前記操作画面に表示するものとしてもよい。 The device may further include a display control unit that displays information about the moving object on an operation screen, and the display control unit may display the specific player or the ball that is the subject of tracking on the operation screen when the moving object is flying with the automatic tracking.
 前記コート内飛行制御部は、前記コート内飛行モードにおいて、あらかじめ設定された複数の撮影位置を接続して前記飛行経路を生成するものとしてもよい。 The in-court flight control unit may generate the flight path in the in-court flight mode by connecting a plurality of preset shooting positions.
 前記移動体の情報を操作画面に表示する表示制御部をさらに備え、前記表示制御部は、前記コート内飛行モードにより前記撮影位置間を飛行中において、移動先の前記撮影位置の情報を前記操作画面に表示するものとしてもよい。 The system may further include a display control unit that displays information about the moving body on an operation screen, and the display control unit may display information about the destination shooting position on the operation screen while the moving body is flying between the shooting positions in the in-court flight mode.
 前記対象エリアは、互いに向かい合う1対のゴールラインと、互いに向かい合う1対のタッチラインと、がそれぞれ接続されることにより区画される前記コートとコート外領域とにより構成され、前記外縁飛行モードにおいて前記移動体の飛行を制御する外縁飛行制御部をさらに備え、前記外縁飛行制御部は、前記外縁飛行モードにおいては、前記タッチラインに沿って前記移動体を移動させ、かつ、前記移動体を特定の選手又はボールに追従させて自動撮影を行うものとしてもよい。 The target area is made up of the court, which is demarcated by connecting a pair of goal lines facing each other and a pair of touch lines facing each other, and an out-of-court area. The system further includes an outer edge flight control unit that controls the flight of the moving body in the outer edge flight mode, and in the outer edge flight mode the outer edge flight control unit may move the moving body along a touch line and cause the moving body to follow a specific player or the ball while performing automatic shooting.
 前記外縁飛行制御部は、前記ボールが、前記移動体が飛行している前記タッチラインを超えて前記コート外領域に出た場合に、前記コート外領域であって前記タッチラインよりも外側に前記移動体を移動させ、前記移動体の真下又は前記コートに向かって撮影するものとしてもよい。 When the ball crosses the touch line along which the moving body is flying and goes into the out-of-court area, the outer edge flight control unit may move the moving body into the out-of-court area, outside that touch line, and shoot directly below the moving body or toward the court.
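 The outer edge following behaviour described above could, for example, be approximated as sketched below; the local coordinate convention (the followed touch line lying on y = 0, the court on the positive-y side) and the numeric offsets are assumptions made only for illustration.

```python
# Illustrative sketch only: coordinates are in a local court frame (metres),
# with the followed touch line on y = 0 and the court on the positive-y side.
def outer_edge_target(ball_x: float, ball_y: float,
                      half_line_length: float = 52.5,
                      out_of_play_offset: float = 3.0) -> tuple[float, float]:
    """Return the (x, y) position the drone should move to in outer edge flight."""
    # Slide along the touch line, clamped between the two corners.
    x = max(-half_line_length, min(half_line_length, ball_x))
    if ball_y < 0.0:
        # Ball has crossed the touch line: move into the out-of-court area,
        # outside the touch line, and shoot downward / toward the court.
        return (x, -out_of_play_offset)
    # Ball in play: stay on (or just outside) the touch line.
    return (x, 0.0)
```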
 前記飛行モードは、前記移動体を定位置で飛行させる定位置飛行モードをさらに含み、前記定位置飛行モードにおける前記移動体の動作を制御する定位置飛行制御部をさらに備え、前記定位置飛行制御部は、前記定位置飛行モードでは、所定位置においてホバリングを行うとともに、機首方向又は前記カメラの方向を制御し、特定の選手又はボールに追従させて自動撮影を行うものとしてもよい。 The flight modes further include a fixed position flight mode in which the moving body is caused to fly in a fixed position, and further include a fixed position flight control unit that controls the operation of the moving body in the fixed position flight mode, and the fixed position flight control unit may hover at a predetermined position in the fixed position flight mode, and control the nose direction or the direction of the camera to follow a specific player or the ball and automatically take pictures.
 前記コート内飛行制御部は、前記コート内飛行モードにおいて、前記飛行経路上又は前記移動体の近傍に障害物を検知した場合には、前記撮影位置の接続を変更して前記障害物を迂回する飛行経路を再生成、前記飛行経路よりも高い高度による飛行を決定、所定時間ホバリングした後に前記飛行経路で前記移動体の移動を開始、所定時間ホバリングした後に手動操縦に切替、および所定時間ホバリングした後にユーザに前記目標位置の再入力を促す表示を行う、の少なくともいずれかの動作を実行するものとしてもよい。 When the in-court flight control unit detects an obstacle on the flight path or near the moving body in the in-court flight mode, it may perform at least one of the following operations: changing the connection of the shooting positions to regenerate a flight path that detours around the obstacle, deciding to fly at a higher altitude than the flight path, starting the movement of the moving body on the flight path after hovering for a predetermined time, switching to manual control after hovering for a predetermined time, and displaying a message prompting the user to re-input the target position after hovering for a predetermined time.
 前記外縁飛行制御部は、前記外縁飛行モードにおいて前記移動体の飛行経路を生成し、前記飛行経路上又は前記移動体の近傍に障害物を検知した場合には、前記障害物を前記コートの内側に迂回する飛行経路を再生成、前記飛行経路よりも高い高度による飛行を決定、所定時間ホバリングした後に前記飛行経路で前記移動体の移動を開始、所定時間ホバリングした後に手動操縦に切替、および所定時間ホバリングした後にユーザに目標位置の再入力を促す表示を行う、の少なくともいずれかの動作を実行するものとしてもよい。 The outer edge flight control unit generates a flight path for the moving body in the outer edge flight mode and, when it detects an obstacle on the flight path or near the moving body, may perform at least one of the following operations: regenerating a flight path that detours around the obstacle toward the inside of the court, deciding to fly at a higher altitude than the flight path, starting the movement of the moving body on the flight path after hovering for a predetermined time, switching to manual control after hovering for a predetermined time, and displaying a message prompting the user to re-input the target position after hovering for a predetermined time.
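 A minimal sketch of how one of the obstacle responses listed above might be selected is given below; the response names and the priority order are hypothetical, since the text only requires that at least one of the listed actions be performed.

```python
# Illustrative sketch only: the response policy and its ordering are hypothetical.
from enum import Enum, auto

class ObstacleResponse(Enum):
    REROUTE_AROUND = auto()      # regenerate a detour flight path
    CLIMB_ABOVE = auto()         # fly at a higher altitude than the planned path
    HOVER_THEN_RESUME = auto()   # hover for a set time, then resume the path
    HOVER_THEN_MANUAL = auto()   # hover for a set time, then hand over to manual control
    HOVER_THEN_REPROMPT = auto() # hover, then ask the user to re-enter the target position

def choose_response(can_reroute: bool, can_climb: bool) -> ObstacleResponse:
    if can_reroute:
        return ObstacleResponse.REROUTE_AROUND
    if can_climb:
        return ObstacleResponse.CLIMB_ABOVE
    return ObstacleResponse.HOVER_THEN_RESUME
```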
 上記目的を達成するため、本発明の別の観点に係る空中撮影方法は、対象エリアを撮影するカメラにより取得される画像又は外部システムからの入力に基づいてイベントを検出するイベント検出ステップと、少なくとも前記対象エリアを飛行する移動体の飛行可能エリアが定められてなる飛行モードの切替要否を判定するモード切替判定ステップと、を備え、前記飛行モードは、前記対象エリア内に構成されるコートの外縁の一部又は全部に沿って、前記外縁の上空を飛行する外縁飛行モードと、前記コート内の上空を飛行するコート内飛行モードと、を少なくとも含み、前記モード切替判定ステップでは、前記イベント検出ステップにより検出されるイベントに応じて、前記飛行モードの切替要否を判定する。 In order to achieve the above object, an aerial photography method according to another aspect of the present invention includes an event detection step of detecting an event based on an image acquired by a camera photographing a target area or on an input from an external system, and a mode switching determination step of determining whether or not to switch flight modes in which at least a flyable area of a moving body flying in the target area is defined. The flight modes include at least an outer edge flight mode in which the moving body flies above part or all of the outer edge of a court formed within the target area, along that outer edge, and an in-court flight mode in which the moving body flies above the inside of the court. In the mode switching determination step, whether or not to switch the flight mode is determined depending on the event detected in the event detection step.
 上記目的を達成するため、本発明の別の観点に係る空中撮影プログラムは、対象エリアを撮影するカメラにより取得される画像又は外部システムからの入力に基づいてイベントを検出するイベント検出命令と、少なくとも前記対象エリアを飛行する移動体の飛行可能エリアが定められてなる飛行モードの切替要否を判定するモード切替判定命令と、をコンピュータに実行させ、前記飛行モードは、前記対象エリア内に構成されるコートの外縁の一部又は全部に沿って、前記外縁の上空を飛行する外縁飛行モードと、前記コート内の上空を飛行するコート内飛行モードと、を少なくとも含み、前記モード切替判定命令では、前記イベント検出命令により検出されるイベントに応じて、前記飛行モードの切替要否を判定する。 In order to achieve the above object, an aerial photography program according to another aspect of the present invention causes a computer to execute an event detection command for detecting an event based on an image acquired by a camera photographing a target area or on an input from an external system, and a mode switching determination command for determining whether or not to switch flight modes in which at least a flyable area of a moving body flying in the target area is defined. The flight modes include at least an outer edge flight mode in which the moving body flies above part or all of the outer edge of a court formed within the target area, along that outer edge, and an in-court flight mode in which the moving body flies above the inside of the court. The mode switching determination command determines whether or not to switch the flight mode depending on the event detected by the event detection command.
 なお、コンピュータプログラムは、各種のデータ読取可能な記録媒体に格納して提供したり、インターネット等のネットワークを介してダウンロード可能に提供したりすることができる。 The computer program may be provided by being stored on various data-readable recording media, or may be provided so as to be downloadable via a network such as the Internet.
 本発明によれば、空中飛行における安全性を確保することが可能となる。 The present invention makes it possible to ensure safety during aerial flight.
本発明の一実施形態に係る移動体システムの全体構成図である。 FIG. 1 is an overall configuration diagram of a mobile body system according to an embodiment of the present invention.
前記実施形態のドローンを簡略的に示す外観斜視図である。 FIG. 2 is a simplified external perspective view of the drone according to the embodiment.
前記実施形態のドローンの機能構成図である。 FIG. 3 is a functional configuration diagram of the drone according to the embodiment.
(a)前記実施形態の操縦装置を簡略的に示す外観正面図、(b)前記操縦装置の入力に応じてドローンが移動又は旋回する方向を示す模式図、である。 FIG. 4 is (a) a simplified front view of the exterior of the control device of the embodiment, and (b) a schematic diagram showing the directions in which the drone moves or turns in response to inputs from the control device.
前記実施形態の操縦装置の機能構成図である。 FIG. 5 is a functional configuration diagram of the control device according to the embodiment.
前記実施形態のサーバの機能構成図である。 FIG. 6 is a functional configuration diagram of the server according to the embodiment.
ドローンが飛行する撮影対象フィールドにおいてあらかじめ設定される前記ドローンの撮影位置の例を示す模式図である。 FIG. 7 is a schematic diagram showing examples of shooting positions of the drone that are set in advance in the field to be photographed over which the drone flies.
前記撮影対象フィールドにおけるジオフェンスの設定の例を示す模式図であって、(a)第1例、(b)第2例、(c)第3例、(d)第4例を示す図である。 FIG. 8 is a set of schematic diagrams showing examples of geofence settings in the field to be photographed, in which (a) shows a first example, (b) a second example, (c) a third example, and (d) a fourth example.
前記ドローンの飛行モードの遷移の様子を示す概略状態遷移図である。 FIG. 9 is a schematic state transition diagram showing transitions between flight modes of the drone.
前記ドローンの機体状態に応じたドローンの状態遷移の様子を示す概略状態遷移図である。 FIG. 10 is a schematic state transition diagram showing state transitions of the drone according to the aircraft state of the drone.
前記ドローンの機体行動状態に応じたドローンの状態遷移の様子を示す概略状態遷移図である。 FIG. 11 is a schematic state transition diagram showing state transitions of the drone according to the aircraft behavior state of the drone.
撮影対象フィールドの例としての競技場における試合状態の状態遷移の様子を示す概略状態遷移図である。 FIG. 12 is a schematic state transition diagram showing state transitions of the game state in a stadium as an example of the field to be photographed.
前記競技場における攻守状態の状態遷移の様子を示す概略状態遷移図である。 FIG. 13 is a schematic state transition diagram showing state transitions of the offensive and defensive states in the stadium.
前記競技場における試合状態と、前記ドローンの飛行モードとの対応関係の1例を示す表である。 FIG. 14 is a table showing an example of the correspondence between the game state in the stadium and the flight mode of the drone.
前記撮影位置から遷移可能な撮影位置および飛行経路を示す模式図である。 FIG. 15 is a schematic diagram showing shooting positions to which a transition can be made from each shooting position, and the corresponding flight paths.
前記ドローンの飛行中に実施される制御のフローチャートである。 FIG. 16 is a flowchart of control executed during flight of the drone.
前記ドローンにおける飛行制限の制御にかかるフローチャート(図16のS1002の詳細)である。 FIG. 17 is a flowchart of flight restriction control in the drone (details of S1002 in FIG. 16).
前記ドローンにおける飛行モードの切替制御に係るフローチャート(図16のS1010の詳細)である。 FIG. 18 is a flowchart of flight mode switching control in the drone (details of S1010 in FIG. 16).
前記空中撮影システムの端末に表示される画面の第1例を示す図である。 FIG. 19 is a diagram showing a first example of a screen displayed on a terminal of the aerial photography system.
前記空中撮影システムの端末に表示される画面の第2例を示す図である。 FIG. 20 is a diagram showing a second example of a screen displayed on a terminal of the aerial photography system.
 以下では、添付図面を参照しながら、本発明の好適な実施形態について詳細に説明する。本明細書及び図面において、実質的に同一の機能構成を有する構成要素については、同一の符号を付することにより重複説明を省略する。また、以下に示す実施形態は、例を表すに過ぎず、その用途、目的又は規模等に応じて、他の既知の要素や代替手段を採用可能である。 Below, a preferred embodiment of the present invention will be described in detail with reference to the attached drawings. In this specification and drawings, components having substantially the same functional configuration will be denoted with the same reference numerals to avoid repetitive explanation. Furthermore, the embodiments shown below are merely examples, and other known elements or alternative means may be adopted depending on the application, purpose, scale, etc.
<A.一実施形態> <A. One embodiment>
[A-1.構成] [A-1. Configuration]
(A-1-1.全体構成) (A-1-1. Overall configuration)
 図1は、本発明の一実施形態に係る空中撮影システム1(以下「システム1」ともいう。)の全体構成図である。システム1は、競技場F(図7)で行われている競技、催物会場で行われている催物等をドローン100(移動体の例である。)で空中撮影するものである。競技場Fは、対象エリアの一例である。 FIG. 1 is an overall configuration diagram of an aerial photography system 1 (hereinafter also referred to as the "system 1") according to one embodiment of the present invention. The system 1 uses a drone 100 (an example of a moving body) to take aerial photographs of a competition held at a stadium F (FIG. 7), an event held at an event venue, or the like. The stadium F is an example of a target area.
 以降の説明においては、必要に応じてサッカーを撮影するシステム1を例に説明するが、本システム1はサッカー以外の競技や催物にも適用可能である。 In the following description, the system 1 will be described, where appropriate, using the shooting of a soccer match as an example, but the system 1 is also applicable to sports and events other than soccer.
 図1に示すように、システム1は、ドローン100に加えて、主として、操縦者がドローン100を操作するための操縦器200と、ドローン100の飛行及び撮影を管理するサーバ300と、外部入力装置600と、外部システム700と、を有する。 As shown in FIG. 1, in addition to the drone 100, the system 1 mainly includes a controller 200 that allows the pilot to operate the drone 100, a server 300 that manages the flight and photography of the drone 100, an external input device 600, and an external system 700.
 ドローン100と操縦器200は、無線通信(基地局800を介するものを含み得る。)を介して互いに接続される。操縦器200とサーバ300は、インターネット回線等の通信ネットワーク400を介して互いに接続される。ドローン100は、自己位置の特定等のため、人工衛星500から衛星信号を取得する。 The drone 100 and the controller 200 are connected to each other via wireless communication (which may include communication via a base station 800). The controller 200 and the server 300 are connected to each other via a communication network 400 such as an internet line. The drone 100 acquires satellite signals from an artificial satellite 500 to determine its own position, etc.
 外部入力装置600は、操縦器200とは別に本システム1との間で情報を送受信できる装置であり、例えばスマートホン又はタブレット端末等のモバイル端末で構成される。外部入力装置600は、例えば、競技場Fで行われている競技の監督、コーチ、ベンチの選手、審判、又はコート設備関係者等により操作可能である。外部入力装置600は、例えば、緊急の撮影中断指令を受け付ける機能を有し、当該撮影中断指令に基づいてドローン100は緊急避難を行う。また、外部入力装置600は、ドローン100の飛行モードの切替入力を受け付けてもよい。さらに、外部入力装置600は表示装置を備え、操縦器200の表示部201と同様の情報が表示されてもよい。特に、外部入力装置600は、競技で発生するイベント情報を取得してもよい。当該イベント情報は、外部入力装置600のユーザにより、ドローン100の飛行モードを切り替える入力を行う際に参照される。 The external input device 600 is a device capable of transmitting and receiving information to and from the system 1, separate from the controller 200, and is composed of a mobile terminal such as a smartphone or tablet terminal. The external input device 600 can be operated, for example, by the manager, coach, bench player, referee, or court equipment personnel of the competition taking place at the stadium F. The external input device 600 has, for example, a function for receiving an emergency command to stop filming, and the drone 100 performs emergency evacuation based on the command. The external input device 600 may also receive an input to switch the flight mode of the drone 100. Furthermore, the external input device 600 may be equipped with a display device, and may display information similar to that of the display unit 201 of the controller 200. In particular, the external input device 600 may acquire event information that occurs during the competition. The event information is referred to when the user of the external input device 600 makes an input to switch the flight mode of the drone 100.
 外部システム700は、システム1とは別途に構成される任意のシステムであってよく、例えば、競技場Fで行われる競技に関して配備されるシステムとして、コート設備システム、試合運営システム、審判支援システム、といったシステムが適用可能である他、競技とは独立して配備されている気象観測システム又は地震観測システムといったシステムが適用可能である。複数の外部システム700がシステム1に接続されていてもよい。システム1は、種々の外部システム700から、緊急の撮影中断指令やドローン100の飛行モードの切替指令を受け付けてもよい。また、種々の外部システム700は、競技で発生するイベント情報を取得してもよい。 The external system 700 may be any system configured separately from the system 1. For example, systems such as a court facility system, a match management system, and a referee support system may be applied as systems deployed in relation to the competition held at the stadium F, and systems such as a weather observation system or an earthquake observation system deployed independently of the competition may also be applied. Multiple external systems 700 may be connected to the system 1. The system 1 may receive an emergency command to stop filming or a command to switch the flight mode of the drone 100 from the various external systems 700. In addition, the various external systems 700 may acquire event information that occurs during the competition.
 外部システム700の1例としてのコート設備システムは、例えばシステム1から撮影画像の輝度を取得し、競技場Fの照明の照度調整又は明滅を制御してもよい。また、コート設備システムは、システム1から照明照度の要求を受信して照度調整又は明滅を制御してもよい。 The court facilities system, which is an example of the external system 700, may obtain the brightness of the captured image from the system 1, for example, and control the illuminance adjustment or blinking of the lighting in the stadium F. The court facilities system may also receive a request for lighting illuminance from the system 1 and control the illuminance adjustment or blinking.
 システム1の構成は、図1に示すものに限らず、例えばインターネット回線等の通信ネットワーク400を介して、ドローン100と操縦器200とサーバ300と基地局800とがそれぞれ相互に通信可能に接続されていてもよい。この場合、ドローン100は操縦器200を介さずにLTE等の通信方法によって直接通信ネットワーク400と無線通信を行ってよい。そのため、ドローン100と操縦器200及び基地局800は、直接無線通信を行う必要がなく、遠隔地においてそれぞれ通信ネットワーク400に接続できればよい。そのため、ドローン100と操縦器200が遠隔地に存在する場合(例えば、操縦者が遠隔操作を行う場合等)に適したシステム構成である。 The configuration of system 1 is not limited to that shown in FIG. 1, and the drone 100, the controller 200, the server 300, and the base station 800 may each be connected to each other so that they can communicate with each other via a communication network 400 such as an Internet line. In this case, the drone 100 may perform wireless communication directly with the communication network 400 using a communication method such as LTE without going through the controller 200. Therefore, the drone 100, the controller 200, and the base station 800 do not need to perform direct wireless communication, and it is sufficient if they can each be connected to the communication network 400 in a remote location. Therefore, this system configuration is suitable for cases where the drone 100 and the controller 200 are in a remote location (for example, when a pilot operates them remotely).
 また、システム1は、インターネット回線等の通信ネットワーク400を介して、ドローン100と操縦器200と基地局800とサーバ300とがそれぞれ相互に通信可能に接続され、且つドローン100及び基地局800は人工衛星500を介した衛星通信により通信ネットワーク400と通信接続されてもよい。 In addition, in the system 1, the drone 100, the controller 200, the base station 800, and the server 300 are each connected to each other so that they can communicate with each other via a communication network 400 such as an Internet line, and the drone 100 and the base station 800 may be communicatively connected to the communication network 400 by satellite communication via an artificial satellite 500.
 さらに、システム1は、1台のドローン100に対して複数のサーバ300が複数の通信ネットワーク400を介して接続され、すなわちシステムが冗長化されていてもよい。この場合、サーバ300、又は通信ネットワーク400に異常が生じた場合であっても、冗長化された他のサーバ300や通信ネットワーク400によりシステム1の動作、ひいてはドローン100による撮影を継続することができるため、システム1の信頼性を向上させることができる。なお、上記の2形態においても、ドローン100と操縦器200が遠隔にあっても操縦可能であるため、遠隔操作に適した構成ではあるが、これに限られず、操縦者がドローン100を見ながら手動制御する有視界飛行にも適用可能である。 Furthermore, in the system 1, multiple servers 300 may be connected to one drone 100 via multiple communication networks 400, i.e., the system may be made redundant. In this case, even if an abnormality occurs in the server 300 or communication network 400, the operation of the system 1, and therefore shooting by the drone 100, can be continued by the other redundant servers 300 and communication networks 400, thereby improving the reliability of the system 1. Note that in both of the above forms, the drone 100 and the controller 200 can be controlled even when they are remotely located, making them suitable for remote operation, but this is not limited to this, and they can also be applied to visual flight in which the pilot manually controls the drone 100 while watching it.
 上記実施形態において説明した装置は、単独の装置として実現されてもよく、一部又は全部が通信ネットワーク400で接続された複数の装置(例えばドローン100、操縦器200、クラウドサーバ300、)等により実現されてもよい。例えば、サーバ300の各機能部及び記憶部は、互いに通信ネットワーク400で接続された異なるサーバ300、ドローン100、操縦器200に実装されることにより実現されてもよい。 The device described in the above embodiment may be realized as a single device, or may be realized by multiple devices (e.g., drone 100, controller 200, cloud server 300) that are partially or completely connected by a communication network 400. For example, each functional unit and memory unit of server 300 may be realized by being implemented in different servers 300, drones 100, and controllers 200 that are connected to each other by the communication network 400.
(A-1-2.ドローン100) (A-1-2. Drone 100)
(A-1-2-1.ドローン100の概要) (A-1-2-1. Overview of the drone 100)
 図2は、本実施形態のドローン100を簡略的に示す外観斜視図である。図3は、本実施形態のドローン100の機能構成図である。上記の通り、ドローン100は、競技場F(図7)で行われている競技、催物会場で行われている催物等を空中撮影する。 FIG. 2 is a simplified external perspective view of the drone 100 of this embodiment. FIG. 3 is a functional configuration diagram of the drone 100 of this embodiment. As described above, the drone 100 takes aerial photographs of competitions held in the stadium F (FIG. 7), events held at event venues, and the like.
 本明細書において、「ドローン」とは、動力手段(電力、原動機等)、操縦方式(無線であるか有線であるか、及び、完全自律飛行型であるか部分手動操縦型であるか等)を問わず、また、有人か無人かを問わず、自律的に姿勢制御を行う機能を有する飛行体全般を指すこととする。また、ドローンは、無人航空機(Unmanned Aerial Vehicle:UAV)、飛行体、マルチコプター(Multi Copter)、RPAS(Remote Piloted Aircraft Systems)、又はUAS(Unmanned Aircraft Systems)等と称呼されることがある。 In this specification, "drone" refers to any flying object that has the ability to autonomously control its attitude, regardless of the power source (electricity, prime mover, etc.), control method (wireless or wired, and fully autonomous or partially manual, etc.), and whether manned or unmanned. Drones are also sometimes referred to as Unmanned Aerial Vehicles (UAVs), flying objects, multicopters, RPAS (Remote Piloted Aircraft Systems), or UAS (Unmanned Aircraft Systems), etc.
 図2に示すように、ドローン100の外観は主として、筐体101と、複数のプロペラ122と、により構成される。筐体101は例えば略直方体であるが、形状は任意である。筐体101の左右側面には、側方に伸び出る棒状の連結部102が連結されている。連結部102の他端には、それぞれプロペラ122と、各プロペラ122を回転させるモータ121が連結される。モータ121は、例えば電動モータである。なお、本実施形態においては、連結部102、プロペラ122およびモータ121は4個ずつ備えられているが、個数はこれに限られない。プロペラ122は単独のプロペラで構成されていてもよいし、同軸配置された複数のプロペラで構成されていてもよい。各プロペラの羽根(ブレード)の枚数及び形状は特に限定されない。 As shown in FIG. 2, the exterior of the drone 100 is mainly composed of a housing 101 and multiple propellers 122. The housing 101 is, for example, a roughly rectangular parallelepiped, but may have any shape. Rod-shaped connecting parts 102 extending laterally are connected to the left and right sides of the housing 101. The other ends of the connecting parts 102 are respectively connected to propellers 122 and motors 121 that rotate the propellers 122. The motors 121 are, for example, electric motors. Note that in this embodiment, there are four connecting parts 102, propellers 122, and motors 121, but the number is not limited to this. The propellers 122 may be composed of a single propeller, or may be composed of multiple propellers arranged coaxially. The number and shape of the blades of each propeller are not particularly limited.
 また、プロペラ122の外側には、障害物に対するプロペラの干渉を防ぐためのプロペラガード(図示せず)を設けてもよい。 In addition, a propeller guard (not shown) may be provided on the outside of the propeller 122 to prevent the propeller from interfering with obstacles.
 筐体101には、例えば撮影用カメラ141が、筐体101下方にカメラ保持部142により保持されている。また、筐体101の前方面には、障害物検知カメラ131が配設されている。障害物検知カメラ131は、本実施形態においては対をなす2個のカメラにより構成される、いわゆるデュアルカメラである。障害物検知カメラ131は、ドローン100の前方を撮像するように配設されている。なお、障害物検知カメラ131は、前方面だけではなく筐体101のすべての面、例えば略直方体の筐体101においては6面に設けられていてもよい。 In the housing 101, for example, a photographing camera 141 is held by a camera holder 142 below the housing 101. In addition, an obstacle detection camera 131 is disposed on the front surface of the housing 101. In this embodiment, the obstacle detection camera 131 is a so-called dual camera consisting of two cameras that form a pair. The obstacle detection camera 131 is disposed so as to capture an image in front of the drone 100. Note that the obstacle detection camera 131 may be disposed not only on the front surface but also on all surfaces of the housing 101, for example, on six surfaces in the case of a housing 101 that is a substantially rectangular parallelepiped.
 ドローン100は、ドローン100の周囲にいる人々に対して、ドローン100の存在について注意喚起を行う警報装置250を備える。警報装置250は、例えば警告灯251及びスピーカ252を有する。警告灯251は、プロペラ122又はモータ121毎に設けられ、例えば複数のモータ121の各側面に配設される。警告灯251は正面の他、あらゆる方向から視認できるようモータ121の円筒状の側面に沿って配設されてよい。スピーカ252は、警告音を出力するものであり、ドローン100の筐体101に設けられる。スピーカ252は、例えば筐体101下面に設けられ、警告音をドローン100の下方に向かって伝達させる。 The drone 100 is equipped with an alarm device 250 that alerts people around the drone 100 to the presence of the drone 100. The alarm device 250 has, for example, a warning light 251 and a speaker 252. The warning light 251 is provided for each propeller 122 or motor 121, and is disposed, for example, on each side of multiple motors 121. The warning light 251 may be disposed along the cylindrical side of the motor 121 so that it can be seen from all directions in addition to the front. The speaker 252 outputs an alarm sound and is provided in the housing 101 of the drone 100. The speaker 252 is provided, for example, on the underside of the housing 101, and transmits the alarm sound downwards of the drone 100.
(A-1-2-2.ドローン100の機能ブロック) (A-1-2-2. Functional blocks of the drone 100)
 図3に示すように、ドローン100は、情報処理を実行するためのCPU(Central Processing Unit)等の演算装置、RAM(Random Access Memory)及びROM(Read Only Memory)等の記憶装置を備え、これにより、主として、測定部110、飛行機能部120、障害物検知部130、撮影部140および通信部150の各機能ブロックを有する。 As shown in FIG. 3, the drone 100 includes an arithmetic device such as a CPU (Central Processing Unit) for executing information processing and storage devices such as a RAM (Random Access Memory) and a ROM (Read Only Memory), and thereby mainly has the following functional blocks: a measurement unit 110, a flight function unit 120, an obstacle detection unit 130, an imaging unit 140, and a communication unit 150.
 測定部110は、ドローン100又はその周辺に関する情報を測定する機能部である。測定部110は、例えば位置測定部111、方位測定部112、高度測定部113および速度測定部114等を有する。測定部110はこれらに加えて、温度、気圧、風速、加速度等の情報を取得する種々のセンサ等を含んでもよい。 The measurement unit 110 is a functional unit that measures information related to the drone 100 or its surroundings. The measurement unit 110 has, for example, a position measurement unit 111, a direction measurement unit 112, an altitude measurement unit 113, and a speed measurement unit 114. In addition to these, the measurement unit 110 may also include various sensors that acquire information such as temperature, air pressure, wind speed, and acceleration.
 位置測定部111は、人工衛星500からの信号を受信し、それに基づいて機体の位置(絶対位置)を測定する。位置測定部111は、特に限定されないが、例えば、GNSS(Global Navigation Satellite System)、GPS(Global Positioning System)等を用いて、現時点での自己位置を測定する。自己位置の測定方法として、例えば、RTK-GNSS(Real Time Kinematic - Global Navigation Satellite System)を用いることもできる。位置情報は、少なくとも平面視での2次元での座標情報(例えば緯度、経度)を含み、好ましくは高度情報を含む3次元での座標情報を含む。 The position measurement unit 111 receives signals from the artificial satellites 500 and measures the position (absolute position) of the aircraft based on the signals. The position measurement unit 111 measures its current position using, for example, GNSS (Global Navigation Satellite System), GPS (Global Positioning System), etc., but is not limited to this. As a method for measuring the position, for example, RTK-GNSS (Real Time Kinematic - Global Navigation Satellite System) can also be used. The position information includes at least two-dimensional coordinate information in a planar view (e.g., latitude, longitude), and preferably includes three-dimensional coordinate information including altitude information.
 また、RTK等の相対測位に用いる固定局の基準点の情報を提供する基地局800がドローン100及び操縦器200と無線通信可能に接続されることで、ドローン100の位置をより高い精度で計測することが可能となる。ここで、VRS(Virtual Reference Station)による仮想基準点方式を用いたRTK計測を行う場合には、基地局800を省略すること、又は、基地局800又はドローン100の位置座標推定の精度をさらに向上することができる。 In addition, the base station 800, which provides information on the reference points of fixed stations used for relative positioning such as RTK, is connected to the drone 100 and the controller 200 so as to be able to communicate wirelessly with them, making it possible to measure the position of the drone 100 with greater accuracy. Here, when performing RTK measurement using a virtual reference point method using a VRS (Virtual Reference Station), the base station 800 can be omitted, or the accuracy of the position coordinate estimation of the base station 800 or the drone 100 can be further improved.
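 For geofence and flight path computations, the GNSS position would typically be projected into a local metric frame; the following sketch uses a simple equirectangular approximation around a reference point on the court, which is an assumption made for illustration and not a method specified in the embodiment.

```python
# Illustrative sketch only: an equirectangular approximation around a reference
# point, adequate over a few hundred metres around the court.
import math

EARTH_RADIUS_M = 6_371_000.0

def to_local_xy(lat_deg: float, lon_deg: float,
                ref_lat_deg: float, ref_lon_deg: float) -> tuple[float, float]:
    """Convert GNSS latitude/longitude to east/north metres from a reference point."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    ref_lat, ref_lon = math.radians(ref_lat_deg), math.radians(ref_lon_deg)
    east = EARTH_RADIUS_M * (lon - ref_lon) * math.cos(ref_lat)
    north = EARTH_RADIUS_M * (lat - ref_lat)
    return east, north
```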
 方位測定部112は、機体の向き(機首方向、ヘディング方向)を測定する。方位測定部112は、例えば地磁気の測定によりドローン100の機体の機首方位(ヘディング方向)を測定する地磁気センサ、コンパス等で構成される。 The orientation measurement unit 112 measures the orientation of the aircraft (nose direction, heading direction). The orientation measurement unit 112 is composed of a geomagnetic sensor that measures the nose direction (heading direction) of the drone 100 aircraft by measuring geomagnetism, a compass, etc.
 高度測定部113は、ドローン100下方(鉛直下向き)の地面に対する距離としての対地高度(以下「飛行高度」ともいう。)を測定する。 The altitude measurement unit 113 measures the altitude above the ground (hereinafter also referred to as "flight altitude") as the distance from the ground below the drone 100 (vertically downward).
 速度測定部114は、ドローン100の飛行速度を検出する。速度測定部114は、例えばジャイロセンサ等公知のセンサにより速度を測定してよい。 The speed measurement unit 114 detects the flight speed of the drone 100. The speed measurement unit 114 may measure the speed using a known sensor such as a gyro sensor.
(A-1-2-3.飛行機能部120) (A-1-2-3. Flight function unit 120)
 飛行機能部120は、ドローン100を飛行させる機構および機能部であり、ドローン100を浮上させて、所望の方向に移動するための推力を機体に発生させる。図2及び図3に示すように、飛行機能部120は、複数のモータ121と、複数のプロペラ122と、飛行制御部123と、を有する。 The flight function unit 120 is a mechanism and functional unit that causes the drone 100 to fly, and generates thrust in the airframe for lifting the drone 100 and moving it in a desired direction. As shown in FIGS. 2 and 3, the flight function unit 120 has a plurality of motors 121, a plurality of propellers 122, and a flight control unit 123.
 飛行制御部123は、複数のモータ121を独立して制御することにより各プロペラ122を回転させ、ドローン100に浮上、前進、旋回、着陸等の各動作を行わせ、離陸から飛行中、着陸までのドローン100の姿勢角制御及び飛行動作を制御する。 The flight control unit 123 independently controls the multiple motors 121 to rotate each propeller 122, causing the drone 100 to perform various operations such as taking off, moving forward, turning, and landing, and controls the attitude angle control and flight operations of the drone 100 from takeoff, during flight, and until landing.
 飛行制御部123は、フライトコントローラとも呼ばれる処理ユニットを有する。処理ユニットは、プログラマブルプロセッサ(例えば、中央処理ユニット(CPU)、MPU又はDSP)等の1つ以上のプロセッサを有することができる。処理ユニットは、メモリ(記憶部)にアクセス可能である。メモリは、1つ以上のステップを行うために処理ユニットが実行可能であるロジック、コード、及び/又はプログラム命令を記憶している。メモリは、例えば、SDカードやRAM等の分離可能な媒体又は外部の記憶装置を含んでいてもよい。測定部110により取得される各種データ、又は撮影用カメラ141で撮影した動画もしくは静止画のデータは、当該メモリに直接に伝達され且つ記憶されてもよい。なお、各データは外部メモリに記録することもできる。 The flight control unit 123 has a processing unit, also called a flight controller. The processing unit may have one or more processors, such as a programmable processor (e.g., a central processing unit (CPU), MPU, or DSP). The processing unit has access to a memory (storage unit). The memory stores logic, code, and/or program instructions that the processing unit can execute to perform one or more steps. The memory may include, for example, a separable medium such as an SD card or RAM, or an external storage device. Various data acquired by the measurement unit 110, or video or still image data captured by the imaging camera 141, may be directly transmitted to and stored in the memory. Each data may also be recorded in an external memory.
 処理ユニットは、ドローン100の機体の状態を制御するように構成された制御モジュールを含んでいる。例えば、制御モジュールは、6自由度(並進運動x、y及びz、並びに回転運動θx、θy及びθz)を有するドローン100の空間的配置、姿勢角角度、角速度、角加速度、角躍度速度及び/又は加速度を調整するためにドローン100の飛行機能部120(推力発生部)を制御する。 The processing unit includes a control module configured to control the state of the drone 100. For example, the control module controls the flight function unit 120 (thrust generating unit) of the drone 100 to adjust the spatial arrangement, attitude angle, angular velocity, angular acceleration, angular jerk, velocity, and/or acceleration of the drone 100, which has six degrees of freedom (translational motion x, y, and z, and rotational motion θx, θy, and θz).
 飛行制御部123は、操縦器200からの操縦信号に基づいて、又は予め設定された自律飛行プログラムに基づいて、ドローン100の飛行を制御することができる。また飛行制御部123は、撮影対象フィールド、飛行許可/禁止エリア、これに対応する飛行ジオフェンスの情報、2次元又は3次元の地図データを含む地図情報、ドローン100の現在の位置情報、姿勢情報(機首方位情報)、速度情報、及び加速度情報等の各種情報及びこれらの任意の組み合わせに基づいてモータ121を制御することにより、ドローン100の飛行を制御することができる。 The flight control unit 123 can control the flight of the drone 100 based on control signals from the pilot 200 or based on a preset autonomous flight program. The flight control unit 123 can also control the flight of the drone 100 by controlling the motor 121 based on various information such as the field to be photographed, flight permitted/prohibited areas, information on the corresponding flight geofences, map information including two-dimensional or three-dimensional map data, the current position information of the drone 100, attitude information (heading information), speed information, and acceleration information, and any combination of these.
●撮影対象フィールドと撮影位置の例 ●Examples of the shooting target field and shooting positions
 本明細書において、「撮影対象フィールド」又は「対象エリア」は、撮影対象となる2次元の場所(例えば、競技場F)を意味する。 In this specification, the term "shooting target field" or "target area" means a two-dimensional location (for example, the stadium F) that is the subject of photography.
 図7は、ドローンが飛行する撮影対象フィールドの例である競技場Fの1例を示す模式図であり、同図は競技場Fを上から見た図である。競技場Fは、例えば直線状の外縁により区画される略矩形のコートF100と、コートF100の外縁を覆う所定の領域であるコート外領域F200により構成される。コートF100の外縁は、互いに向かい合うゴールラインF110a、F110bと、互いに向かい合うタッチラインF111a、F111bと、が略直角にそれぞれ接続されることにより構成される。ゴールラインF110a、F110bとタッチラインF111a、F111bの接続点は、コーナーF112a、F113a、F112b、F113bとなっている。 FIG. 7 is a schematic diagram showing an example of a playing field F, which is an example of a field to be photographed by a drone, viewed from above. The playing field F is composed of a court F100 that is roughly rectangular and is defined by, for example, a straight outer edge, and an outer court area F200 that is a predetermined area that covers the outer edge of the court F100. The outer edge of the court F100 is composed of mutually opposing goal lines F110a, F110b and mutually opposing touch lines F111a, F111b that are connected at roughly right angles. The connection points of the goal lines F110a, F110b and the touch lines F111a, F111b are the corners F112a, F113a, F112b, F113b.
 1対のゴールラインF110a、F110bの略中央にはそれぞれゴールF120a、F120bが設けられている。コートF100の内部であってゴールF120a、F120bに連続する所定領域にはそれぞれペナルティエリアF130a、F130bが規定され、当該ペナルティエリアの外縁にはペナルティラインF140a、F140bが描画されている。 Goals F120a, F120b are provided approximately in the center of the pair of goal lines F110a, F110b. Penalty areas F130a, F130b are defined in specific areas inside the court F100 adjacent to the goals F120a, F120b, and penalty lines F140a, F140b are drawn on the outer edges of the penalty areas.
 コートF100の中央には、1対のタッチラインF111a、F111bの中点間を接続することでコートF100を略等分するハーフウェーラインF150が描画されている。ハーフウェーラインF150は、ゴールラインF110a、F110bと略平行である。 A halfway line F150 is drawn in the center of the court F100, connecting the midpoints of a pair of touchlines F111a, F111b and dividing the court F100 into approximately equal parts. The halfway line F150 is approximately parallel to the goal lines F110a, F110b.
 なお、ゴールラインF110a、F110b、タッチラインF111a、F111b、ペナルティラインF140a、F140bおよびハーフウェーラインF150は、競技者が競技を行うためにルール上必要な線であるため、いずれの線も視認できる態様で描画されることが一般的であるが、本発明の技術的範囲はこれに限られない。また、本説明においてはサッカーの競技場を例に説明するが、本発明にかかるシステムにより撮影される競技はサッカーに限られず、テニス等任意のあらゆる競技を含む。さらに、撮影対象はスポーツに限られず、その他の催物(コンサート、式典等)にも適用することが可能である。 Note that the goal lines F110a, F110b, touchlines F111a, F111b, penalty lines F140a, F140b, and halfway line F150 are required by the rules for players to play the game, and therefore all of these lines are generally drawn in a manner that allows them to be seen, but the technical scope of the present invention is not limited to this. Also, in this explanation, a soccer stadium is used as an example, but the sports that are photographed by the system of the present invention are not limited to soccer, and include any type of sports, such as tennis. Furthermore, the subject of the photography is not limited to sports, and the system can also be applied to other events (concerts, ceremonies, etc.).
 競技場Fには、あらかじめ定められた複数の撮影位置L101~L105、L206~L215が定義されている。撮影位置L101~L105、L206~L215は、平面上の2次元座標であってもよいし、当該位置における高さも合わせて定義された3次元座標の情報でもよい。ドローン100の飛行高さは、操縦器200からの入力に基づいて手動で制御可能になっていてもよい。 In the stadium F, multiple predetermined shooting positions L101-L105, L206-L215 are defined. The shooting positions L101-L105, L206-L215 may be two-dimensional coordinates on a plane, or may be three-dimensional coordinate information that also defines the height at the corresponding positions. The flight height of the drone 100 may be manually controllable based on input from the controller 200.
 撮影位置L101~L105は、例えばタッチラインF111b上に、タッチラインF111bに沿って略等間隔に定義されている。例えば、撮影位置L101は、ハーフウェーラインF150とタッチラインF111bの交点およびコートF100からやや外側を含む範囲に位置する地点である。撮影位置L103、L105は、タッチラインF111b両側のコーナーF112a又はF112b付近の地点である。撮影位置L102、L104は、撮影位置L103、L105と撮影位置L101の間の地点である。なお、上述の位置は例示であり、これに限られず適宜の位置であってよい。 Photographing positions L101 to L105 are defined, for example, on the touchline F111b, at approximately equal intervals along the touchline F111b. For example, photographing position L101 is a point located in a range including the intersection of the halfway line F150 and the touchline F111b and slightly outside the court F100. Photographing positions L103 and L105 are points near the corners F112a or F112b on both sides of the touchline F111b. Photographing positions L102 and L104 are points between photographing positions L103 and L105 and photographing position L101. Note that the above positions are merely examples, and are not limited to these and may be any appropriate positions.
 撮影位置L206~L215は、コートF100内部に定義される地点である。例えば、撮影位置L206、L211は、ペナルティラインF140a、F140bのうちゴールラインF110a、F110bと平行な線上の中央付近の地点であり、いわゆるゴール前撮影位置である。撮影位置L207、L212は、撮影位置L206、L215よりもタッチラインF111a又はF111b寄りかつハーフウェーラインF150寄りの撮影位置である。より具体的には例えば、撮影位置L207、L212は、撮影位置L101とゴールF120a、F120bとを結んだ仮想線分上の地点であり、例えば当該仮想線分の略中央の地点である。撮影位置L209、L215は、撮影位置L207、L212の線対称の地点である。撮影位置L208は、撮影位置L207とハーフウェーラインF150との間の地点、撮影位置L210は、撮影位置L209とハーフウェーラインF150との間の地点、撮影位置L213は、撮影位置L212とハーフウェーラインF150との間の地点、撮影位置L214は、撮影位置L215とハーフウェーラインF150との間の地点である。 Photographing positions L206 to L215 are points defined within the court F100. For example, photographing positions L206 and L211 are points near the center of the penalty lines F140a and F140b on a line parallel to the goal lines F110a and F110b, and are so-called goal-front photographing positions. Photographing positions L207 and L212 are photographing positions closer to the touch lines F111a and F111b and closer to the halfway line F150 than photographing positions L206 and L215. More specifically, for example, photographing positions L207 and L212 are points on an imaginary line segment connecting photographing position L101 and goals F120a and F120b, and are, for example, points approximately in the center of the imaginary line segment. Photographing positions L209 and L215 are points that are linearly symmetrical to photographing positions L207 and L212. Shooting position L208 is a point between shooting position L207 and the halfway line F150, shooting position L210 is a point between shooting position L209 and the halfway line F150, shooting position L213 is a point between shooting position L212 and the halfway line F150, and shooting position L214 is a point between shooting position L215 and the halfway line F150.
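 The preset shooting positions could, for example, be held as a simple table of coordinates in a local court frame, as sketched below; the coordinate values and the frame convention are hypothetical placeholders and do not correspond to the actual positions L101 to L215.

```python
# Illustrative sketch only: hypothetical coordinates in a local court frame
# (metres; x along the touch lines, y along the goal lines, z = flight altitude).
SHOOTING_POSITIONS = {
    "L101": (0.0,   -2.0, 10.0),  # near the halfway line, just outside touch line F111b
    "L103": (-50.0, -2.0, 10.0),  # near a corner on touch line F111b
    "L105": (50.0,  -2.0, 10.0),  # near the opposite corner on touch line F111b
    "L206": (-36.0,  0.0, 20.0),  # in front of one goal, on the penalty line
    "L211": (36.0,   0.0, 20.0),  # in front of the other goal
}
```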
 コート外領域F200には、ドローン100又はシステム1の異常や故障を検知した場合に、ドローン100を退避させる退避地点H200が設定されている。ここにいう異常とは、ドローン100の空中移動の安定性に関する異常である。当該異常は、例えば、ドローン100の動作制御(挙動制御、撮影制御等)に伴う演算負荷が負荷閾値を上回る場合を含む。或いは、当該異常は、環境に関する一過性の異常、例えば強風等の影響によりドローン100の挙動制御値(例えば速度)の測定値が許容値を超えている場合を含んでもよい。 In the outside court area F200, an evacuation point H200 is set to which the drone 100 is to be evacuated if an abnormality or malfunction of the drone 100 or the system 1 is detected. The abnormality referred to here is an abnormality related to the stability of the aerial movement of the drone 100. The abnormality includes, for example, a case where the calculation load associated with the operation control (behavior control, shooting control, etc.) of the drone 100 exceeds a load threshold. Alternatively, the abnormality may include a transient abnormality related to the environment, such as a case where the measured value of the behavior control value (e.g. speed) of the drone 100 exceeds an allowable value due to the influence of a strong wind or the like.
 退避地点H200は、撮影位置L101~L105、L206~L215とは異なる地点に設定され、本実施形態においてはタッチラインF111aの外側に、タッチラインF111aに沿って設定されている。退避地点H200は複数あってよく、本実施例においては、3個である。退避地点H220は、ハーフウェーラインF150の延長線上付近に設定されている。退避地点H210、H230は、撮影位置L206、L211よりもゴールF120a、F120b寄りに設定されている。退避地点H210、H230は、例えば後述するジオフェンスG200に区画される領域内の端部に設定される。退避地点H200では、例えばドローン100の機体の交代やドローン100に搭載されているバッテリの交換が行われる。 The evacuation point H200 is set at a point different from the shooting positions L101 to L105 and L206 to L215, and in this embodiment, it is set outside the touchline F111a and along the touchline F111a. There may be multiple evacuation points H200, and in this embodiment, there are three. The evacuation point H220 is set near the extension of the halfway line F150. The evacuation points H210 and H230 are set closer to the goals F120a and F120b than the shooting positions L206 and L211. The evacuation points H210 and H230 are set at the ends of an area partitioned by a geofence G200, which will be described later, for example. At the evacuation point H200, for example, the drone 100 is replaced or the battery installed in the drone 100 is replaced.
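 A minimal sketch of the anomaly check and the selection of an evacuation point is shown below, assuming a hypothetical load threshold, speed tolerance, and evacuation point coordinates; the embodiment itself only specifies that an anomaly or failure triggers a retreat to one of the evacuation points H210, H220, or H230.

```python
# Illustrative sketch only: thresholds and coordinates are hypothetical.
import math

EVACUATION_POINTS = {"H210": (-40.0, 70.0), "H220": (0.0, 70.0), "H230": (40.0, 70.0)}

def needs_evacuation(cpu_load: float, speed_error: float,
                     load_threshold: float = 0.9, speed_tolerance: float = 2.0) -> bool:
    """Detect an anomaly in aerial-movement stability (load or behaviour deviation)."""
    return cpu_load > load_threshold or abs(speed_error) > speed_tolerance

def nearest_evacuation_point(x: float, y: float) -> str:
    """Pick the closest predefined evacuation point to the current position."""
    return min(EVACUATION_POINTS,
               key=lambda k: math.hypot(EVACUATION_POINTS[k][0] - x,
                                        EVACUATION_POINTS[k][1] - y))
```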
 競技場Fにおいては、少なくとも各撮影位置L101~L105、L206~L215を包含する複数のジオフェンスG100、G200が設定されている。ジオフェンスは、領域を区画する仮想的な境界線を示すものであり、特に、本実施形態におけるジオフェンスは、ドローン100が飛行又は移動が許可される飛行許可エリアと飛行禁止エリアの境界線のフェンスを示す。ジオフェンスは平面および高さを含む3次元的に広がる領域を区画する境界線である。ドローン100等の移動体がジオフェンスに接触又はジオフェンスの外部に逸脱したことが検知された場合には、飛行許可エリアの外側に機体が飛び出さないように飛行又は移動が制限される。すなわち例えば、後述する飛行制御部340、350又は360は、ドローン100を着陸させる。また、飛行制御部340、350又は360は、ドローン100をホバリングさせてもよい。また、飛行制御部340、350又は360は、ドローン100をジオフェンスG100、ジオフェンスG200、又は第3ジオフェンスの内側方向へ移動させてもよい。ジオフェンスにおける詳細な制御については、後述する。 In the stadium F, multiple geofences G100, G200 are set, each of which includes at least the shooting positions L101 to L105, L206 to L215. A geofence indicates a virtual boundary line that divides an area, and in particular, the geofence in this embodiment indicates a fence that is the boundary line between a flight-permitted area in which the drone 100 is permitted to fly or move and a flight-prohibited area. A geofence is a boundary line that divides an area that spreads three-dimensionally, including a plane and height. When it is detected that a moving object such as the drone 100 has come into contact with a geofence or deviated outside the geofence, the flight or movement is restricted so that the aircraft does not fly out outside the flight-permitted area. That is, for example, the flight control unit 340, 350, or 360 described later lands the drone 100. The flight control unit 340, 350, or 360 may also hover the drone 100. Additionally, the flight control unit 340, 350, or 360 may move the drone 100 toward the inside of the geofence G100, the geofence G200, or the third geofence. Detailed control of the geofence will be described later.
 ジオフェンスの高さ方向の境界線は、上限および下限を含んでいてよい。本実施形態では、飛行可否に適用されるジオフェンスG100、G200は、ドローン100の飛行中に、システム1の制御に応じて切り替えられる。同図に描画されるジオフェンスG100、G200の数は2個であるが、個数は任意であり、具体的には3個以上であってもよい。 The geofence's boundary line in the height direction may include an upper limit and a lower limit. In this embodiment, the geofences G100, G200 that are applied to whether or not flight is permitted are switched according to the control of the system 1 while the drone 100 is flying. The number of geofences G100, G200 depicted in the figure is two, but the number is arbitrary, and specifically may be three or more.
 ジオフェンスG100は、撮影位置L101~L105を包含する領域であり、タッチラインF111b上およびその近傍の領域を包含する領域を区画している。言い換えれば、ジオフェンスG100は、コートF100の外縁付近に規定され、一部はコート外領域F200に伸び出ている。ジオフェンスG100は、後述する外縁飛行モードM102において主に適用されるジオフェンスである。ジオフェンスG200は、撮影位置L206~L215を包含する領域であり、少なくともコートF100の内部に設定されている。このジオフェンスG200は、後述するコート内飛行モードM105において主に適用されるジオフェンスである。 The geofence G100 is an area that includes the shooting positions L101 to L105, and defines an area that includes the touch line F111b and its vicinity. In other words, the geofence G100 is defined near the outer edge of the court F100, and a portion of it extends into the out-of-court area F200. The geofence G100 is the geofence that is primarily applied in the outer edge flight mode M102, which will be described later. The geofence G200 is an area that includes the shooting positions L206 to L215, and is set at least inside the court F100. The geofence G200 is the geofence that is primarily applied in the in-court flight mode M105, which will be described later.
 複数のジオフェンスG100、G200に区画される領域は、少なくとも一部が互いに接触又は重複する。複数のジオフェンスG100、G200に区画される領域は、高さ方向にも重複する。なお、複数のジオフェンスG100、G200の高さは互いに異なっていてもよい。具体的には、競技場F内部に設定されるジオフェンスG200の高度の下限は、競技場Fの外縁に設定されるジオフェンスG100の高度の下限よりも高く設定される。競技場Fの内部には競技のプレイヤー等の撮影対象者が存在している蓋然性が高いため、ジオフェンスの高度の下限を高くし、ドローン100を十分な高さで飛行させることで、撮影対象者の動作の妨害や撮影対象者又はボールとの衝突を防ぐことができる。 The areas defined by the multiple geofences G100, G200 at least partially contact or overlap each other. The areas defined by the multiple geofences G100, G200 also overlap in the height direction. The heights of the multiple geofences G100, G200 may differ from each other. Specifically, the lower limit of the altitude of the geofence G200 set inside the stadium F is set higher than the lower limit of the altitude of the geofence G100 set on the outer edge of the stadium F. Since there is a high probability that subjects to be photographed, such as players, are present inside the stadium F, by setting the lower limit of the altitude of the geofence high and flying the drone 100 at a sufficient height, it is possible to prevent interference with the movements of the subjects to be photographed and collisions with the subjects to be photographed or the ball.
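 Combining the above, a geofence containment check with a simple response selection could look like the sketch below; it reuses the rectangular simplification from the earlier sketch, and the specific response strings are hypothetical stand-ins for the landing, hovering, or inward-movement actions described in the text.

```python
# Illustrative sketch only: `fence` is any object exposing
# x_min/x_max/y_min/y_max/alt_min/alt_max (e.g. the RectFence sketch above).
def check_geofence(x: float, y: float, alt: float, fence) -> str:
    """Return the action to take for the current position against the active fence."""
    inside_plane = fence.x_min <= x <= fence.x_max and fence.y_min <= y <= fence.y_max
    inside_alt = fence.alt_min <= alt <= fence.alt_max
    if inside_plane and inside_alt:
        return "continue"
    if inside_plane and alt < fence.alt_min:
        # e.g. the in-court fence G200 has a higher altitude floor than G100.
        return "climb"
    # Outside the fence: land, hover, or move back toward the inside of the area.
    return "hover_or_move_inward"
```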
 図8は、コートF100の外縁付近に規定されるジオフェンスの実施形態を示す図である。 FIG. 8 illustrates embodiments of geofences defined near the outer edge of the court F100.
 図8(a)は、図7で説明したジオフェンスG100、G200を示す図である。 FIG. 8(a) is a diagram showing the geofences G100 and G200 described with reference to FIG. 7.
 図8(b)は、コートF100の外縁付近に規定されるジオフェンスの別の例である。この例では、ジオフェンスG100に代えて、コートF100の全周を覆う領域を規定するジオフェンスG100aが配置されている。 FIG. 8(b) is another example of a geofence defined near the outer edge of the court F100. In this example, instead of the geofence G100, a geofence G100a that defines an area covering the entire perimeter of the court F100 is arranged.
 図8(c)は、ジオフェンスG100に代えて、コートF100の外縁のうち隣り合う2辺を覆う領域を規定するジオフェンスG100bの例を示す図である。ジオフェンスG100bは、ゴールラインF110aに沿う領域と、タッチラインF111aに沿う領域と、を接続したL字状の領域を規定している。 FIG. 8(c) is a diagram showing an example of a geofence G100b that, instead of the geofence G100, defines an area covering two adjacent sides of the outer edge of the court F100. The geofence G100b defines an L-shaped area connecting an area along the goal line F110a and an area along the touch line F111a.
 図8(d)は、ジオフェンスG100に代えて、ゴールラインF110bに沿う領域を規定するジオフェンスG100cの例を示す図である。 FIG. 8(d) is a diagram showing an example of a geofence G100c that defines an area along the goal line F110b instead of the geofence G100.
 ジオフェンスは、予め設定されていてもよいし、ユーザにより設定できるようになっていてもよい。例えば、コートF100付近に固定の障害物がある場合に、ユーザは、障害物と干渉しないようにジオフェンスを設定することができる。また、ユーザは、ジオフェンスと合わせて、当該ジオフェンスが設定される際にドローン100が飛行しうるエリアを合わせて設定し、対応付けて格納させることもできる。 The geofence may be set in advance, or may be set by the user. For example, if there is a fixed obstacle near the court F100, the user can set the geofence so as not to interfere with the obstacle. In addition to the geofence, the user can also set an area in which the drone 100 may fly when the geofence is set, and store the area in association with the geofence.
(A-1-2-4.障害物検知部130) (A-1-2-4. Obstacle detection unit 130)
 図3の説明に戻る。障害物検知部130は、ドローン100の周辺の障害物を検知する機能部である。障害物は、例えば人、例えば選手、物、鳥等の動物、固定設備およびボールを含んでよい。障害物検知部130は、取得画像に基づいてドローン100の下方等に位置する障害物の位置、速度ベクトル等を測定する。 Returning to the description of FIG. 3, the obstacle detection unit 130 is a functional unit that detects obstacles around the drone 100. The obstacles may include, for example, people such as players, objects, animals such as birds, fixed equipment, and the ball. The obstacle detection unit 130 measures the position, velocity vector, and the like of an obstacle located, for example, below the drone 100 based on acquired images.
 障害物検知部130は、例えば障害物検知カメラ131、ToF(Time of Flight)センサ132およびレーザーセンサ133を有する。ToFセンサ132は、センサからパルス投光されたレーザがセンサ内の受光素子に戻ってくるまでの時間を計測し、その時間を距離に換算することで物体までの距離を測定する。レーザーセンサ133は、例えばLiDAR(Light Detection And Ranging)方式により、近赤外光や可視光、紫外線等の光線を対象物に光を照射し、その反射光を光センサでとらえ距離を測定する。 The obstacle detection unit 130 includes, for example, an obstacle detection camera 131, a ToF (Time of Flight) sensor 132, and a laser sensor 133. The ToF sensor 132 measures the time it takes for a laser pulse emitted from the sensor to return to the light receiving element in the sensor, and measures the distance to an object by converting this time into distance. The laser sensor 133 uses, for example, the LiDAR (Light Detection And Ranging) method to shine light such as near-infrared light, visible light, or ultraviolet light on the target object and measure the distance by capturing the reflected light with an optical sensor.
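 The ToF relation mentioned above is simply distance = (speed of light × round-trip time) / 2, as in the following sketch; the 66.7 ns example value is an illustrative figure, not one taken from the embodiment.

```python
# Illustrative sketch only: the standard time-of-flight relation, d = c * t / 2.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to the object from the round-trip time of the emitted pulse."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# Example: a round trip of about 66.7 ns corresponds to roughly 10 m.
# tof_distance_m(66.7e-9) ≈ 10.0
```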
 図2には、本実施形態では障害物検知カメラ131が前方を向いて配置されていることが図示されているが、このカメラ131、ToFセンサ132、およびレーザーセンサ133の種類、位置および数は任意であり、カメラ131に代えてToFセンサ132又はレーザーセンサ133が配置されていてもよいし、ToFセンサ132又はレーザーセンサ133が筐体101の6面、すなわち前面、背面、上面、底面、および両側面のすべてに設けられていてもよい。 In this embodiment, FIG. 2 shows that the obstacle detection camera 131 is positioned facing forward, but the type, position and number of the camera 131, ToF sensor 132 and laser sensor 133 are arbitrary, and the ToF sensor 132 or laser sensor 133 may be positioned instead of the camera 131, or the ToF sensor 132 or laser sensor 133 may be provided on all six surfaces of the housing 101, i.e., the front, back, top, bottom and both sides.
(A-1-2-5.撮影部140) (A-1-2-5. Photographing unit 140)
 撮影部140は、競技場F(図7)における競技、又は催物会場における催物等の映像を撮影する機能部であり、撮影用カメラ141、カメラ保持部142及び撮影制御部143を有する。図2に示すように、撮影用カメラ141(撮像装置)は、ドローン100の本体の下部に配置され、ドローン100の周辺を撮影した周辺画像に関する画像データを出力する。撮影用カメラ141は、動画を撮影するビデオカメラ(カラーカメラ)である。動画には、図示しないマイクロホンで取得した音声データを含めてもよい。これに加えて又はこれに代えて、撮影用カメラ141は、静止画の撮影を行うものとすることも可能である。 The photographing unit 140 is a functional unit that photographs images of a competition in the stadium F (FIG. 7) or an event at an event venue, and has a photographing camera 141, a camera holding unit 142, and a photographing control unit 143. As shown in FIG. 2, the photographing camera 141 (imaging device) is disposed at the bottom of the main body of the drone 100, and outputs image data of a peripheral image photographed around the drone 100. The photographing camera 141 is a video camera (color camera) that shoots moving images. The moving images may include audio data acquired by a microphone (not shown). In addition to or instead of this, the photographing camera 141 may also shoot still images.
 撮影用カメラ141は、カメラ保持部142に組み込まれた図示しないカメラアクチュエータにより向き(ドローン100の筐体101に対する撮影用カメラ141の姿勢)を調整可能である。撮影用カメラ141は、露出、コントラスト又はISO等のパラメータの自動制御機能を有していてもよい。カメラ保持部142は、機体の揺れ又は振動が撮影用カメラ141に伝わるのを抑制する、いわゆるジンバル制御の機構を有していてもよい。撮影制御部143は、撮影用カメラ141およびカメラ保持部142を制御し撮影用カメラ141の向き、撮影倍率(ズーム量)およびカメラの撮影条件等を調整する。撮影用カメラ141が取得した画像データは、ドローン100自体の記憶部、操縦器200、サーバ300等にデータを送信することができる。 The orientation of the photographic camera 141 (the attitude of the photographic camera 141 relative to the housing 101 of the drone 100) can be adjusted by a camera actuator (not shown) built into the camera holding unit 142. The photographic camera 141 may have an automatic control function for parameters such as exposure, contrast, or ISO. The camera holding unit 142 may have a so-called gimbal control mechanism that suppresses the transmission of shaking or vibration of the aircraft to the photographic camera 141. The photographic control unit 143 controls the photographic camera 141 and the camera holding unit 142 to adjust the orientation of the photographic camera 141, the photographic magnification (zoom amount), the camera's photographic conditions, etc. Image data acquired by the photographic camera 141 can be transmitted to the memory unit of the drone 100 itself, the pilot 200, the server 300, etc.
(A-1-2-6.通信部150) (A-1-2-6. Communication unit 150)
 通信部150は、通信ネットワーク400を介しての電波通信が可能であり、例えば、電波通信モジュールを含む。通信部150は、通信ネットワーク400(無線基地局800を含む。)を介することで、操縦器200等との通信が可能である。 The communication unit 150 is capable of radio communication via the communication network 400 and includes, for example, a radio communication module. The communication unit 150 can communicate with the controller 200 and the like via the communication network 400 (including the wireless base station 800).
(A-1-3.操縦器200) (A-1-3. Controller 200)
(A-1-3-1.操縦器200の概要) (A-1-3-1. Overview of the controller 200)
 図4は、本実施形態の操縦器200を簡略的に示す外観正面図である。図5は、本実施形態の操縦器200の機能構成図である。操縦器200は、操縦者の操作によりドローン100を制御すると共に、ドローン100から受信した情報(例えば、位置、高度、電池残量、カメラ映像等)を表示する携帯情報端末である。なお、本実施形態では、ドローン100の飛行状態(高度、姿勢等)は、操縦器200が遠隔制御可能であってもよいし、ドローン100が自律的に制御してもよい。例えば、操縦器200を介して操縦者からドローン100に飛行指令が送信されると、ドローン100は自律飛行を行う。また、離陸や帰還等の基本操作時、及び緊急時にはマニュアル操作が行なえるようになっていてもよい。 FIG. 4 is a simplified front view of the exterior of the controller 200 of this embodiment. FIG. 5 is a functional configuration diagram of the controller 200 of this embodiment. The controller 200 is a mobile information terminal that controls the drone 100 in accordance with the pilot's operations and displays information received from the drone 100 (for example, position, altitude, remaining battery level, camera image, etc.). In this embodiment, the flight state (altitude, attitude, etc.) of the drone 100 may be remotely controlled by the controller 200, or may be controlled autonomously by the drone 100. For example, when a flight command is transmitted from the pilot to the drone 100 via the controller 200, the drone 100 performs autonomous flight. Manual operation may also be enabled during basic operations such as takeoff and return, and in an emergency.
 図4に示すように、操縦器200は、ハードウェア構成として、表示部201および入力部202を備える。表示部201および入力部202は、互いに有線又は無線で通信可能に接続されている。表示部201は、操縦器200に一体に組み込まれたタッチパネル又は液晶モニタ等で構成されていてもよいし、操縦器200に有線接続又は無線接続された液晶モニタ、タブレット端末、スマートホン等の表示装置で構成されていてもよい。ハードウェア構成としての表示部201には、タッチ等の入力を受け付ける素子が一体的に組み込まれ、タッチパネルディスプレイとなっていてもよい。 As shown in FIG. 4, the controller 200 includes a display unit 201 and an input unit 202 as a hardware configuration. The display unit 201 and the input unit 202 are connected to each other so that they can communicate with each other wired or wirelessly. The display unit 201 may be configured as a touch panel or liquid crystal monitor that is integrated into the controller 200, or may be configured as a display device such as a liquid crystal monitor, tablet terminal, or smartphone that is connected to the controller 200 wired or wirelessly. The display unit 201 as a hardware configuration may be configured as a touch panel display by integrally incorporating an element that accepts input such as touch.
 入力部202は、操縦者がドローン100を操縦する際に飛行方向や離陸/着陸等の動作指令を入力する機構である。 The input unit 202 is a mechanism through which the pilot inputs operation commands such as flight direction and takeoff/landing when piloting the drone 100.
 図4(a)に示すように、入力部202は、左スライダ326L、右スライダ326R、左入力スティック327L、右入力スティック327R、電源ボタン328及び帰還ボタン329を有する。左スライダ326Lおよび右スライダ326Rは、例えば0/1の入力、又は1次元の無段階もしくは段階的な情報の入力を受け付ける操作子であり、操縦者は例えば操縦器200を手で保持した状態で、左右の人差し指によりスライドさせて入力を行う。左入力スティック327Lおよび右入力スティック327Rは、複数次元の無段階又は段階的な情報の入力を受け付ける操作子であり、例えばいわゆるジョイスティックである。また、左入力スティック327Lおよび右入力スティック327Rは、押下による0/1の入力を受け付けてもよい。電源ボタン328及び帰還ボタン329は、押下を受け付ける操作子であり、機械式スイッチ等により構成される。 As shown in FIG. 4(a), the input unit 202 has a left slider 326L, a right slider 326R, a left input stick 327L, a right input stick 327R, a power button 328, and a return button 329. The left slider 326L and the right slider 326R are controls that accept, for example, a 0/1 input or a one-dimensional stepless or stepwise input, and the pilot slides them with the left and right index fingers, for example while holding the controller 200 in the hands. The left input stick 327L and the right input stick 327R are controls that accept multi-dimensional stepless or stepwise inputs, and are, for example, so-called joysticks. The left input stick 327L and the right input stick 327R may also accept a 0/1 input when pressed. The power button 328 and the return button 329 are controls that accept presses, and are constituted by mechanical switches or the like.
The left input stick 327L and the right input stick 327R accept input operations that instruct the three-dimensional flight operations of the drone 100, including, for example, takeoff, landing, ascent, descent, right turn, left turn, forward movement, backward movement, leftward movement, and rightward movement. FIG. 4(b) is a schematic diagram showing the movement direction or turning direction of the drone 100 corresponding to each input of the left input stick 327L and the right input stick 327R shown in FIG. 4(a). Note that this correspondence is an example.
As shown in FIG. 5, the controller 200 includes an arithmetic device such as a CPU for executing information processing and storage devices such as a RAM and a ROM, which together implement, as a software configuration, the main functional blocks of the display control unit 210, the input control unit 220, and the communication unit 240.
(A-1-3-2. Display control unit 210)
The display control unit 210 displays to the pilot status information of the drone 100 acquired from the drone 100 or the server 300. The display control unit 210 can display images relating to various information such as the shooting target field, flight permitted/prohibited areas, the flight geofence, map information, and the current position information, attitude information (direction information), speed information, acceleration information, and remaining battery level of the drone 100. The "current position information" referred to here only needs to include information on the horizontal position of the current position of the drone 100 (i.e., latitude and longitude), and does not need to include altitude information (absolute altitude or relative altitude).
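For illustration only, the following is a minimal sketch of what such a status payload could look like; the field names are assumptions introduced here and are not part of the embodiment. It reflects the point above that the current position information only needs latitude and longitude, with altitude optional.

```python
# Minimal sketch (hypothetical field names) of a status payload that the
# display control unit 210 could render. Altitude is optional because the
# current position information only needs the horizontal component.

from dataclasses import dataclass
from typing import Optional

@dataclass
class DroneStatus:
    latitude: float
    longitude: float
    heading_deg: float
    speed_mps: float
    battery_pct: float
    altitude_m: Optional[float] = None  # absolute or relative altitude, may be omitted

status = DroneStatus(latitude=35.68, longitude=139.69,
                     heading_deg=90.0, speed_mps=3.2, battery_pct=78.0)
print(f"pos=({status.latitude}, {status.longitude}) battery={status.battery_pct}%")
```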
The display control unit 210 has a mode display unit 211 and a shooting status display unit 212.
The mode display unit 211 is a functional unit that displays on the display unit 201 at least the state, i.e., the mode, to which the drone 100 belongs. The mode to which the drone 100 belongs is, for example, the flight mode shown in FIG. 9, but instead of or in addition to this, the aircraft state shown in FIG. 10, the aircraft behavior state shown in FIG. 11, the game state shown in FIG. 12, or the offensive/defensive state shown in FIG. 13 may be displayed on the display unit 201.
As shown in FIG. 19, the screen G1 displayed on the display unit 201 displays, for example, a display field G21 for the flight mode to which the drone 100 belongs, as well as a status display field G22 showing the aircraft state, aircraft behavior state, game state, and offensive/defensive state.
The shooting status display unit 212 shown in FIG. 5 is a functional unit that displays, on the display unit 201, the video captured by the shooting camera 141 mounted on the drone 100. As shown in FIG. 19, the screen G1 displayed on the display unit 201 includes, for example, a video field G40 in which the image being captured by the drone 100 is displayed.
The screen G1 and each mode will be described in detail later.
(A-1-3-3. Input control unit 220)
The input control unit 220 shown in FIG. 5 receives various inputs from a user such as a pilot.
The input control unit 220 of this embodiment mainly has the following functional units: an aircraft position operation unit 221, an aircraft attitude operation unit 222, a camera attitude operation unit 223, a camera zoom operation unit 224, a flight mode switching unit 225, a target position receiving unit 226, a power supply input unit 227, and a return input unit 228.
The aircraft position operation unit 221 includes an up/down movement input unit 221a and a left/right movement input unit 221b. The aircraft attitude operation unit 222 includes a forward/backward movement input unit 222a and a yaw rotation input unit 222b.
The up/down movement input unit 221a is an input unit for allowing the pilot to move the drone 100 up and down, and acquires input to the right input stick 327R. That is, when the right input stick 327R is moved upward (toward the far side when held in the hand), the drone 100 ascends, and when the right input stick 327R is moved downward (toward the near side when held in the hand), the drone 100 descends. The left/right movement input unit 221b is an input unit for allowing the pilot to move the drone 100 left and right, and acquires input to the right input stick 327R. That is, when the right input stick 327R is moved to the right, the drone 100 moves to the right, and when the right input stick 327R is moved to the left, the drone 100 moves to the left.
The forward/backward movement input unit 222a is an input unit for allowing the pilot to move the drone 100 forward and backward, and acquires input to the left input stick 327L. That is, when the left input stick 327L is moved upward (toward the far side when held in the hand), the drone 100 moves forward, and when the left input stick 327L is moved downward (toward the near side when held in the hand), the drone 100 moves backward. The yaw rotation input unit 222b is an input unit for allowing the pilot to yaw the drone 100, and acquires input to the left input stick 327L. That is, when the left input stick 327L is moved to the right, the drone 100 turns right, and when the left input stick 327L is moved to the left, the drone 100 turns left.
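For illustration, the stick-to-command correspondence described above can be sketched as follows; the function and parameter names, the deadzone, and the velocity-style command format are assumptions introduced here and not part of the embodiment.

```python
# Minimal sketch (hypothetical names): mapping the two input sticks of the
# controller 200 to drone motion commands, following the correspondence
# described above (right stick: up/down and left/right, left stick:
# forward/backward and yaw).

from dataclasses import dataclass

@dataclass
class StickInput:
    x: float  # -1.0 (left) .. +1.0 (right)
    y: float  # -1.0 (toward the pilot) .. +1.0 (away from the pilot)

def sticks_to_command(left: StickInput, right: StickInput, deadzone: float = 0.05) -> dict:
    """Translate stick deflections into a simple velocity-style command."""
    def dz(v: float) -> float:
        return 0.0 if abs(v) < deadzone else v

    return {
        "climb":    dz(right.y),   # right stick up/down    -> ascend/descend
        "lateral":  dz(right.x),   # right stick left/right -> move left/right
        "forward":  dz(left.y),    # left stick up/down     -> forward/backward
        "yaw_rate": dz(left.x),    # left stick left/right  -> turn left/right
    }

# Example: push the left stick forward and the right stick slightly right.
print(sticks_to_command(StickInput(0.0, 0.8), StickInput(0.3, 0.0)))
```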
The camera attitude operation unit 223 is an input unit for operating the camera holding unit 142 via the shooting control unit 143 and thereby controlling the orientation of the shooting camera 141 relative to the housing 101 of the drone 100. The camera attitude operation unit 223 acquires input to the right slider 326R. The camera attitude operation unit 223 accepts operation of either or both of the pitch angle and the yaw angle of the shooting camera 141 relative to the housing 101.
The camera zoom operation unit 224 is an input unit for operating the shooting magnification of the shooting camera 141, and acquires input to the left slider 326L.
The flight mode switching unit 225 is an input unit for switching flight modes. The flight modes selectable by the flight mode switching unit 225 include at least, for example, the outer edge flight mode M102 (see FIG. 9), the on-court flight mode M105 (see FIG. 9), and the fixed position flight mode M103 or M107 (see FIG. 9). The flight mode switching unit 225 accepts switching of flight modes via, for example, a touch panel display integrated with the display unit 201.
The target position receiving unit 226 is a functional unit that receives input of a target shooting position to which the drone 100 should head. The target position receiving unit 226 receives input of a point on the stadium F. For example, when at least a portion of an image or schematic diagram of the stadium F is displayed on the display unit 201, the target position receiving unit 226 may receive input of the target shooting position via a touch panel display that is configured integrally with the display unit 201. When the points selectable as the target shooting position, i.e., the shooting positions, are specified in advance, the target position receiving unit 226 may receive a selection input of the target shooting position.
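As one hedged illustration of selecting among predefined shooting positions, the sketch below maps a tapped point to the nearest registered position. The coordinates and helper names are hypothetical; only the identifiers L101 to L105 follow the description above.

```python
# Minimal sketch (hypothetical data): when the selectable shooting positions
# are predefined, the target position receiving unit 226 only needs to map a
# tap on the displayed field to the nearest registered position.

import math

SHOOTING_POSITIONS = {
    "L101": (0.0, 0.0),
    "L102": (20.0, 0.0),
    "L103": (40.0, 0.0),
    "L104": (60.0, 0.0),
    "L105": (80.0, 0.0),
}

def nearest_shooting_position(tap_xy: tuple) -> str:
    """Return the identifier of the predefined shooting position closest to a tap."""
    tx, ty = tap_xy
    return min(
        SHOOTING_POSITIONS,
        key=lambda name: math.hypot(SHOOTING_POSITIONS[name][0] - tx,
                                    SHOOTING_POSITIONS[name][1] - ty),
    )

print(nearest_shooting_position((37.0, 2.0)))  # -> "L103"
```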
●Flight Modes
Here, we will explain the types of flight modes that can be set for the drone 100 and examples of their state transitions.
As shown in FIG. 9, the flight modes of the drone 100 mainly include a pre-preparation mode M100, an off-court takeoff and landing mode M101, an outer edge flight mode M102, an off-court fixed position flight mode M103, an on-court entry mode M104, an on-court flight mode M105, an off-court exit mode M106, an on-court fixed position flight mode M107, and an on-court takeoff and landing mode M108.
The pre-preparation mode M100 is a mode in which advance settings such as geofence settings are made. The pre-preparation mode M100 transitions to the off-court takeoff and landing mode M101. In the off-court takeoff and landing mode M101, the drone 100 takes off from the point L101g (see FIG. 15). Note that, in the off-court takeoff and landing mode M101, the drone 100 may take off from a point outside the court F100 other than the point L101g.
The off-court takeoff and landing mode M101 is the mode to which the drone 100 belongs when control starts or ends. The drone 100 transitions from the off-court takeoff and landing mode M101 to the outer edge flight mode M102.
The outer edge flight mode M102 is a mode in which the drone flies above the outer edge, along part or all of the outer edge of the court F100, to photograph the playing field F; more specifically, it is a mode in which the drone flies at one of the shooting positions L101 to L105 (FIG. 7) to perform shooting. In the main embodiment of this description, the outer edge flight mode M102 is a mode in which the drone flies above the touchline F111b. However, the "outer edge" flown in the outer edge flight mode M102 is a concept that includes not only the area directly above the touchline F111b but also the area slightly outside the court F100.
In the outer edge flight mode M102, user instructions are received via the target position receiving unit 226 of the controller 200, and the drone flies at one of the designated shooting positions L101 to L105. The shooting angle may be manually operable according to the user's instructions, or may be fixed at a predetermined angle. In the outer edge flight mode M102, a specific player may also be followed and photographed by so-called dolly shooting, in which the shooting position of the drone 100 is changed while the shooting angle is kept fixed.
The outer edge flight mode M102 can transition to the off-court takeoff and landing mode M101, the off-court fixed position flight mode M103, or the on-court entry mode M104.
The off-court fixed position flight mode M103 is a mode in which the drone 100 flies at a fixed position outside the area of the court F100. The off-court fixed position flight mode M103 can transition to the outer edge flight mode M102. The on-court entry mode M104 is a mode in which a series of processes required for the drone 100 to enter the area of the court F100 is performed. The drone 100 transitions to the on-court flight mode M105 via the on-court entry mode M104.
The on-court flight mode M105 is a mode in which the drone flies above the court F100 to photograph the stadium F; more specifically, it is a mode in which the drone flies at one of the shooting positions L206 to L215 (FIG. 7) to perform shooting. In the on-court flight mode M105, as in the outer edge flight mode M102, the user's selection command for a shooting position is received via the target position receiving unit 226 of the controller 200, and the drone flies at one of the designated shooting positions L206 to L215. The shooting direction (shooting angle) may be manually operable according to the user's instructions, or may be fixed at a predetermined angle.
The on-court flight mode M105 can transition to the off-court exit mode M106, the on-court fixed position flight mode M107, or the on-court takeoff and landing mode M108.
The off-court exit mode M106 is a mode in which a series of processes required for the drone 100 to exit the area of the court F100 is performed. The drone 100 transitions from the off-court exit mode M106 to the outer edge flight mode M102. Note that the off-court exit mode M106 and the on-court entry mode M104 can transition to each other.
The on-court fixed position flight mode M107 is a mode in which the drone flies at a fixed position within the area of the court F100. The on-court fixed position flight mode M107 can transition to the on-court flight mode M105. The on-court takeoff and landing mode M108 is a mode in which the drone takes off and lands within the area of the court F100, and is a mode to which the drone transitions mainly when a command to land on the spot is issued by manual intervention. A drone that has taken off in the on-court takeoff and landing mode M108 transitions to the on-court flight mode M105.
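The transitions described above can be summarized, for illustration, as an allowed-transition table. The following sketch is a simplified reading of FIG. 9 as described in the text; the identifiers and the validation function are assumptions introduced here, not a definitive implementation.

```python
# Minimal sketch of the flight-mode transitions described above, expressed as
# an allowed-transition table. Mode identifiers follow FIG. 9; the table is a
# simplified reading of the text, not an exhaustive specification.

ALLOWED_TRANSITIONS = {
    "M100_PREPARATION":              {"M101_OFF_COURT_TAKEOFF_LANDING"},
    "M101_OFF_COURT_TAKEOFF_LANDING": {"M102_OUTER_EDGE"},
    "M102_OUTER_EDGE":               {"M101_OFF_COURT_TAKEOFF_LANDING",
                                      "M103_OFF_COURT_FIXED_POSITION",
                                      "M104_COURT_ENTRY"},
    "M103_OFF_COURT_FIXED_POSITION": {"M102_OUTER_EDGE"},
    "M104_COURT_ENTRY":              {"M105_ON_COURT", "M106_COURT_EXIT"},
    "M105_ON_COURT":                 {"M106_COURT_EXIT",
                                      "M107_ON_COURT_FIXED_POSITION",
                                      "M108_ON_COURT_TAKEOFF_LANDING"},
    "M106_COURT_EXIT":               {"M102_OUTER_EDGE", "M104_COURT_ENTRY"},
    "M107_ON_COURT_FIXED_POSITION":  {"M105_ON_COURT"},
    "M108_ON_COURT_TAKEOFF_LANDING": {"M105_ON_COURT"},
}

def can_transition(current: str, requested: str) -> bool:
    """Reject a mode switch that the state machine does not allow."""
    return requested in ALLOWED_TRANSITIONS.get(current, set())

assert can_transition("M102_OUTER_EDGE", "M104_COURT_ENTRY")
assert not can_transition("M102_OUTER_EDGE", "M105_ON_COURT")  # must pass through M104
```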
Returning to FIG. 5, the power input unit 227 is a functional unit that accepts power on/off operations for the controller 200 via the power button 328.
The return input unit 228 is a functional unit that accepts, via the return button 329, a command to return the drone 100 located in the stadium F (FIG. 7) to the target landing point L101g (see FIG. 15).
In addition to or instead of the above configuration, the input control unit 220 may be capable of receiving touch input to the display unit 201 and transmitting control commands to the drone 100 in response to the input. More specifically, for example, when the user performs a selection operation on appropriate information such as a map or schematic diagram displayed on the display unit 201, a route to the selected point may be automatically generated and the drone 100 may fly autonomously.
(A-1-3-4. Communication unit 240)
The communication unit 240 is a functional unit that transmits and receives signals between the controller 200 and the appropriate components included in the system 1. The controller 200 has a communication function for wireless communication with the drone 100 using, for example, Wi-Fi and the 2.4 GHz and 5.6 to 5.8 GHz frequency bands. The controller 200 also has a wireless communication function that can communicate with the server 300 via the communication network 400 using a communication standard such as LTE (Long Term Evolution). The communication unit 240 transmits, for example, various input signals from a user such as the pilot to the drone 100, the server 300, or the like. The communication unit 240 also receives signals from the drone 100, the server 300, or the like.
(A-1-4. Server 300)
(A-1-4-1. Overview of Server 300)
FIG. 6 is a functional configuration diagram of the server 300 of this embodiment. The server 300 manages or controls the flight and photography of the drone 100. The server 300 includes an input/output unit (not shown) for inputting and outputting various types of information (image output, audio output).
The server 300 may be a general-purpose computer such as a workstation or a personal computer, or may be logically realized by cloud computing.
The server 300 includes an arithmetic device such as a CPU for executing information processing and storage devices such as a RAM and a ROM, which together implement, as a software configuration, the main functional blocks of a presetting unit 310, an event detection unit 320, a flight mode switching unit 330, an outer edge flight control unit 340, an on-court flight control unit 350, a fixed position flight control unit 360, a communication unit 370, and a storage unit 380.
(A-1-4-2. Presetting unit 310)
The pre-setting unit 310 is a functional unit that performs the settings necessary for the flight of the drone 100 before the drone 100 flies over the field to be photographed.
The presetting unit 310 mainly includes a geofence setting unit 311 .
The geofence setting unit 311 is a functional unit that sets the geofence of the drone 100. The geofence includes information in the horizontal direction and in the height direction.
The geofence setting unit 311 sets a geofence according to the flight mode. For example, the geofence setting unit 311 activates the geofence G100 in the outer edge flight mode M102 (see FIG. 9). The geofence setting unit 311 also activates the geofence G200 in the on-court flight mode M105 (see FIG. 9). Furthermore, in the intermediate modes involved in the transition between the outer edge flight mode M102 and the on-court flight mode M105, namely the off-court exit mode M106 and the on-court entry mode M104, the geofence setting unit 311 sets a geofence different from the geofence G100 and the geofence G200, a so-called third geofence. The outer edge flight mode M102 and the on-court flight mode M105 are examples of the first flight mode or the second flight mode in the claims. The off-court exit mode M106 and the on-court entry mode M104 are examples of the third flight mode in the claims.
When the geofence in the first flight mode and the geofence in the second flight mode, which is the flight mode at the destination, overlap at least in part, the third geofence is a geofence that covers an integrated area obtained by combining the first area defined by the geofence in the first flight mode and the second area defined by the geofence in the second flight mode. In this embodiment, the geofence G100 of the outer edge flight mode M102 and the geofence G200 of the on-court flight mode M105 overlap. Therefore, the third geofence is a geofence that demarcates the area obtained by combining the geofence G100 and the geofence G200.
In addition, when the geofence in the first flight mode and the geofence in the second flight mode do not overlap, the third geofence may be a geofence that covers an area obtained by combining the first area defined by the first geofence corresponding to the first flight mode, the second area defined by the second geofence corresponding to the second flight mode, and the gap between the first area and the second area. With this configuration, even when transitioning between flight modes whose geofences do not overlap, the drone does not deviate from the geofence during the mode transition, and safety can be ensured.
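For illustration, the relationship between the first, second, and third geofences can be sketched as follows, assuming simple box-shaped geofences. A bounding box is one simple over-approximation: it covers the union of the two areas and, when they do not overlap, also the gap between them. All numeric values and names are invented.

```python
# Minimal sketch (assumed rectangular geofences): the third geofence used in
# the intermediate modes M104/M106 is modelled as a box containing every
# point of the first geofence (G100) and the second geofence (G200), plus any
# gap between them. Real geofences need not be axis-aligned boxes.

from dataclasses import dataclass

@dataclass(frozen=True)
class BoxFence:
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    z_max: float  # altitude ceiling

    def contains(self, x: float, y: float, z: float) -> bool:
        return (self.x_min <= x <= self.x_max and
                self.y_min <= y <= self.y_max and
                0.0 <= z <= self.z_max)

def third_geofence(g1: BoxFence, g2: BoxFence) -> BoxFence:
    """Bounding box covering both fences (their union plus any gap between them)."""
    return BoxFence(min(g1.x_min, g2.x_min), max(g1.x_max, g2.x_max),
                    min(g1.y_min, g2.y_min), max(g1.y_max, g2.y_max),
                    max(g1.z_max, g2.z_max))

G100 = BoxFence(-5, 105, -10, 5, 30)   # outer edge corridor (invented numbers)
G200 = BoxFence(0, 100, 0, 64, 25)     # on-court volume (invented numbers)
G3 = third_geofence(G100, G200)
assert G3.contains(50, -2, 10) and G3.contains(50, 30, 10)
```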
The geofence setting unit 311 stores information about the set geofence in the memory unit 380.
(A-1-4-3. Event detection unit 320)
The event detection unit 320 is a functional unit that detects the state of the subject to be photographed or the drone 100. The event detection unit 320 detects an event based on the camera image of the photographing camera 141 or an input from the external system 700. The detection criteria for each event are stored in, for example, the storage unit 380, and the event detection unit 320 detects an event by referring to the storage unit 380. The event detection unit 320 may also detect an event by analysis using a neural network. The detection process by the event detection unit 320 can be performed using any known appropriate image analysis technology.
The event detection unit 320 detects events that trigger a change in the flight mode or shooting conditions of the drone 100.
The event detection unit 320 mainly has an aircraft state acquisition unit 321 , an aircraft action state acquisition unit 322 , a game state acquisition unit 323 , and an offensive/defensive state acquisition unit 324 .
The aircraft status acquisition unit 321 is a functional unit that acquires the aircraft status of the drone 100.
FIG. 10 is a diagram showing the state transitions of the aircraft state of the drone 100. The aircraft state is broadly divided into, for example, a normal operation flight mode M200, a detection and judgment mode M210, and an action mode M220. When the drone 100 starts flying, the drone 100 transitions to the normal operation flight mode M200.
If any detection or reception occurs in the normal operation flight mode M200, the aircraft state transitions to the detection and judgment mode M210. The detection and judgment mode M210 includes an abnormality detection mode M211, a failure detection mode M212, a manual intervention mode M213, and a low battery mode M214.
Specifically, if an abnormality is detected in the normal operation flight mode M200, the mode transitions to the abnormality detection mode M211. This abnormality is a transient, in other words reversible, disturbance such as a drop in radio wave strength or strong wind. If the abnormality is resolved in the abnormality detection mode M211, the mode transitions to the normal operation flight mode M200.
If an aircraft or system failure is detected in the normal operation flight mode M200, the mode transitions to the failure detection mode M212. If a manual control command is received, the mode transitions to the manual intervention mode M213, and if it is detected that the remaining battery charge is less than a predetermined value, the mode transitions to the low battery mode M214. In addition, if a manual control command is received in the abnormality detection mode M211, the failure detection mode M212, or the low battery mode M214, the mode transitions to the manual intervention mode M213. The drone 100 transitions to the action mode M220 corresponding to the detection and judgment mode M210.
The action mode M220 is a state in which the drone 100 performs a series of actions that are preset for each state. The action mode M220 includes a landing at evacuation point mode M221, an emergency stop mode M222, a landing on the spot mode M223, a return mode M224, and a fixed position flight mode M225.
The landing at evacuation point mode M221 is set to fly the drone 100 to the evacuation point H200 and land it. The mode transitions to the landing at evacuation point mode M221 when the abnormality is not resolved in the abnormality detection mode M211.
The emergency stop mode M222 is set to stop the propellers 122 on the spot. In the emergency stop mode M222, the drone 100 falls freely. The emergency stop mode M222 can be selected in the manual intervention mode M213 when the propellers 122 are about to come into contact with a person or object.
The landing on the spot mode M223 is set to perform a soft landing on the spot. The return mode M224 is set to return to the takeoff and landing point.
The fixed position flight mode M225 is a state in which the drone flies at a fixed position, and can transition to the normal operation flight mode M200 based on a user operation. The user operation is input, for example, by selecting a button displayed on the display unit 201. In addition, if an event that can cause a transition from the normal operation flight mode M200 to the detection and judgment mode M210, i.e., an abnormality, a failure, manual intervention, or a low battery, is detected in the fixed position flight mode M225, the drone transitions from the fixed position flight mode M225 to the corresponding state of the detection and judgment mode M210 via the normal operation flight mode M200. The drone 100 in the fixed position flight mode M225 can also transition to the return mode M224 based on a user operation.
The drone 100 in the abnormality detection mode M211 or the failure detection mode M212 transitions to the landing at evacuation point mode M221. The drone 100 in the manual intervention mode M213 transitions, depending on the input command, to one of the landing at evacuation point mode M221, the emergency stop mode M222, the landing on the spot mode M223, the return mode M224, and the fixed position flight mode M225. The drone 100 in the low battery mode M214 transitions to the return mode M224.
Note that the drone 100 in the normal operation flight mode M200 can also transition to the return mode M224 based on a user operation. The user operation is input, for example, by selecting a button displayed on the display unit 201.
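For illustration, the correspondence between the detection/judgment modes and the action modes described above can be sketched as a lookup, with manual intervention resolved by the pilot's command. The identifiers and function names are assumptions introduced here, not a definitive implementation.

```python
# Minimal sketch of the aircraft-state handling described above: each
# detection/judgment mode (FIG. 10) is mapped to the action mode the text
# says the drone transitions to. Manual intervention carries a pilot command,
# so it is resolved separately.

from typing import Optional

DEFAULT_ACTION = {
    "M211_ABNORMALITY_DETECTED": "M221_LAND_AT_EVACUATION_POINT",
    "M212_FAILURE_DETECTED":     "M221_LAND_AT_EVACUATION_POINT",
    "M214_LOW_BATTERY":          "M224_RETURN",
}

MANUAL_ACTIONS = {
    "M221_LAND_AT_EVACUATION_POINT", "M222_EMERGENCY_STOP",
    "M223_LAND_ON_THE_SPOT", "M224_RETURN", "M225_FIXED_POSITION_FLIGHT",
}

def choose_action(detection_mode: str, pilot_command: Optional[str] = None) -> str:
    """Select the action mode for a given detection/judgment mode."""
    if detection_mode == "M213_MANUAL_INTERVENTION":
        if pilot_command not in MANUAL_ACTIONS:
            raise ValueError("manual intervention requires a valid pilot command")
        return pilot_command
    return DEFAULT_ACTION[detection_mode]

print(choose_action("M214_LOW_BATTERY"))                                 # M224_RETURN
print(choose_action("M213_MANUAL_INTERVENTION", "M222_EMERGENCY_STOP"))  # M222_EMERGENCY_STOP
```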
The aircraft behavior state acquisition unit 322 is a functional unit that acquires the aircraft behavior state of the drone 100. Each mode of the aircraft behavior state is a sub-mode of the aircraft state that is performed to realize a transition of the aircraft state.
FIG. 11 is a diagram showing the state transitions of the aircraft's behavioral states. The aircraft's behavioral states are broadly divided into, for example, a takeoff mode M300, an evacuation mode M310, a normal mode M320, and a landing mode M340.
The takeoff mode M300 is a mode in which the drone 100 takes off. The state transitions of the aircraft's behavioral state start from the takeoff mode M300. When the drone 100 takes off, the aircraft's behavioral state transitions from the takeoff mode M300 to the evacuation mode M310 or the normal mode M320. The evacuation mode M310 mainly includes an evacuation point arrival stationary mode M311 and an evacuation moving mode M312. The normal mode M320 includes a point arrival stationary mode M321 and a moving mode M322. The evacuation mode M310 and the normal mode M320 can transition to each other via a pause mode M330. Note that this arrangement is one example.
When transitioning from the takeoff mode M300 to the evacuation mode M310, the drone 100 in the takeoff mode M300 transitions to the evacuation point arrival stationary mode M311. The evacuation point arrival stationary mode M311 is a mode in which the drone moves to the evacuation point H200 and remains stationary there, i.e., hovers. When moving from the evacuation point H200 to another point for the purpose of evacuation, the drone 100 in the evacuation point arrival stationary mode M311 transitions to the evacuation moving mode M312. When the drone 100 reaches the specified destination or there is no movement command by manual intervention, the aircraft's behavioral state transitions from the evacuation point arrival stationary mode M311 or the evacuation moving mode M312 to the pause mode M330.
When transitioning from the takeoff mode M300 to the normal mode M320, the drone 100 in the takeoff mode M300 transitions to the point arrival stationary mode M321. The point arrival stationary mode M321 is a mode in which the drone moves to a specified destination and remains stationary there, i.e., hovers. When moving to another point during normal use, the drone 100 in the point arrival stationary mode M321 transitions to the moving mode M322. When the drone 100 reaches the specified destination or there is no movement command by manual intervention, the aircraft's behavioral state transitions from the point arrival stationary mode M321 or the moving mode M322 to the pause mode M330.
The drone 100 in the evacuation point arrival stationary mode M311, the point arrival stationary mode M321, the moving mode M322, or the pause mode M330 can transition to the landing mode M340. The state transitions of the aircraft's behavioral state end in the landing mode M340.
The game state acquisition unit 323 shown in FIG. 6 is a functional unit that acquires the game state of the competition held at the stadium F. The game state acquisition unit 323 detects the game state by performing image processing on the captured images. The game state acquisition unit 323 may also acquire the game state based on decision-related information input by the referee to the external input device 600 or to a referee support system, which is an example of the external system 700. Furthermore, the game state acquisition unit 323 may acquire the game state based on information input from an external input device 600 held by a team member, for example the manager or a coach.
FIG. 12 is a diagram showing an example of the state transitions of the game state. The figure shows an example of game states in soccer. The game states include a pre-match state M400, a normal play state M410, and a post-match state M460. The state transitions start from the pre-match state M400, and the pre-match state M400 transitions to the normal play state M410. The normal play state M410 is a state in which the game is in progress. When the match ends, the state transitions from the normal play state M410 to the post-match state M460. Note that the transition from the normal play state M410 to the post-match state M460 may also occur during a break in the match, such as halftime, in addition to the end of the match.
The game states also include a play-suspended-without-foul state M420 and a play-suspended-with-foul state M440. When play is suspended in the normal play state M410 without a foul having occurred, the state transitions to the play-suspended-without-foul state M420. The transition to the play-suspended-without-foul state M420 occurs, for example, when the ball crosses the goal line F110a or F110b or the touchline F111a or F111b and goes out of the court. In the play-suspended-without-foul state M420, the state transitions to a throw-in state M421, a goal kick state M422, or a corner kick state M423 in accordance with the event that arises under the rules of the game, such as the type of line the ball crossed or the team of the player who put the ball out of the court. When the event in the throw-in state M421, the goal kick state M422, or the corner kick state M423 ends, the state transitions to the normal play state M410.
If a foul occurs in the normal play state M410, or more precisely, if the referee recognizes that a foul has occurred, the state transitions to a foul state M431. If offside occurs or is recognized by the referee, the state transitions to an offside state M432. From the foul state M431 or the offside state M432, the state then transitions to the play-suspended-with-foul state M440. In the play-suspended-with-foul state M440, the state transitions to a free kick state M441 or a PK state M442 depending on the location where the foul occurred and the event that occurred. Note that, in the free kick state M441, a so-called indirect free kick may be taken instead of a free kick. The free kick state M441 may also be subdivided into a free kick state for the attacking side and a free kick state for the defending side. When the event in the free kick state M441 or the PK state M442 ends, the match is resumed and the game state transitions to the normal play state M410.
When the match ends, the normal play state M410 transitions to the post-match state M460, and the state transitions of the game state end.
The normal play state M410 may also transition to a penalty shootout state M443. Although not shown in the figure, the state may transition from the penalty shootout state M443 to the post-match state M460, ending the state transitions.
Among the multiple game states shown in FIG. 12, some game states may trigger switching of the flight mode, while other game states may not. For example, the flight mode may be switched based on a transition to one of the shaded states in the figure, that is, the pre-match state M400, the goal kick state M422, the corner kick state M423, the free kick state M441, the PK state M442, the player substitution state M450, or the post-match state M460. When the state transitions from another game state to the normal play state M410, the flight mode may be switched to one corresponding to the offensive/defensive state.
The offensive/defensive state acquisition unit 324 is a functional unit that acquires the offensive and defensive states of the teams in the match held at the stadium F. The offensive/defensive state acquisition unit 324 detects the offensive and defensive states by performing image processing on the captured images. The offensive/defensive state acquisition unit 324 may also acquire the offensive and defensive states based on decision-related information input by the referee to the external input device 600 or to a referee support system, which is an example of the external system 700. Furthermore, the offensive/defensive state acquisition unit 324 may acquire the offensive and defensive states based on information input from an external input device 600 held by a team member, for example the manager or a coach.
FIG. 13 is a diagram showing an example of the state transitions of the offensive/defensive state. The figure shows an example of offensive and defensive states in soccer. When a match starts, the offensive/defensive state transitions to an offensive state M510 or a defensive state M520. The offensive state M510 and the defensive state M520 transition to each other via an offense/defense switch state M530 or an offense/defense uncertain state M540.
The offensive state M510 is a state in which one team designated in advance (hereinafter also referred to as "Team A") is on the offensive. An offensive state is, for example, a state in which Team A is in possession of the ball and is advancing toward the other team (hereinafter also referred to as "Team B"), but it is not limited to this and may be any predetermined state determined by an arbitrary criterion stored in advance in the storage unit 380.
The offensive state M510 includes a Team A offensive (own half) state M511, a Team A offensive (opponent's half) state M512, and a Team A quick attack state M513. The Team A offensive (own half) state M511 and the Team A offensive (opponent's half) state M512 can transition to each other, as can the Team A offensive (own half) state M511 and the Team A quick attack state M513. A transition is also possible from the Team A quick attack state M513 to the Team A offensive (opponent's half) state M512.
The defensive state M520 includes a Team A defensive (opponent's half) state M521, a Team A defensive (own half) state M522, and a Team B quick attack state M523. The Team A defensive (opponent's half) state M521 and the Team A defensive (own half) state M522 can transition to each other, as can the Team A defensive (opponent's half) state M521 and the Team B quick attack state M523. A transition is also possible from the Team B quick attack state M523 to the Team A defensive (own half) state M522.
The offense/defense switch state M530 and the offense/defense uncertain state M540 can be reached from any of the Team A offensive (own half) state M511, the Team A offensive (opponent's half) state M512, the Team A quick attack state M513, the Team A defensive (opponent's half) state M521, the Team A defensive (own half) state M522, and the Team B quick attack state M523.
Note that the event detection unit 320 may determine an event based on input information from the external system 700 instead of, or in addition to, the acquisition units 321 to 324 described above. For example, the event detection unit 320 may determine a disturbance such as strong wind as an event based on input information from a weather information system, which is an example of the external system 700. The event detection unit 320 may also determine an event based on input information from a court facility system, which is another example of the external system 700, or based on facility information entered by a person involved with the court facility.
(A-1-4-4. Flight mode switching unit 330)
The flight mode switching unit 330 is a functional unit that switches the flight mode depending on the detection result by the event detection unit 320.
The flight mode switching unit 330 mainly has a mode switching determination unit 331, a flight permitted area switching unit 332, a geofence switching unit 333, and a flight path generation unit 334.
The mode switching determination unit 331 is a functional unit that determines whether or not the flight mode needs to be switched. The mode switching determination unit 331 determines whether the flight mode needs to be switched according to the event detected by the event detection unit 320. The mode switching determination unit 331 refers to a table T1 (see FIG. 14) in which the states detected by the event detection unit 320 are associated with the flight modes for those states, and determines that the flight mode needs to be switched when the associated flight mode differs from the current flight mode.
As shown in FIG. 14, the event-flight mode table T1 is a table in which events detected as game states are stored in association with the flight mode to be selected for each event. More specifically, for example, the goal kick state M422 and the defending side's free kick state M441 are associated with the outer edge flight mode M102 in order to avoid the risk of the ball hitting the drone 100. The PK state M442 and the attacking side's free kick state M441 are associated with the on-court flight mode M105 in order to capture the goal scene up close. The pre-match state M400, the penalty shootout state M443, and the player substitution state M450 are associated with the fixed position flight mode M103 or M107, because the target to be watched moves little or not at all. Whether the off-court fixed position flight mode M103 or the on-court fixed position flight mode M107 is used may be determined according to the flight mode at the time of the transition to the pre-match state M400, the penalty shootout state M443, or the player substitution state M450. The post-match state M460 is associated with the on-court flight mode M105 in order to take close-up shots of the players' expressions.
With the configuration described above, the competition or event taking place in the target area F can be appropriately photographed using a flight mode suited to the event.
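For illustration, the event-flight mode table T1 and the switching determination can be sketched as follows. The identifiers are assumptions; the entries follow the associations described above, and the choice between M103 and M107 follows the rule that it depends on the flight mode at the time of the transition.

```python
# Minimal sketch of the mode switching determination: the event-to-flight-mode
# table T1 (FIG. 14) is modelled as a dictionary, and a switch is requested
# only when the mode associated with the detected game state differs from the
# current flight mode. "FIXED_POSITION" stands for M103 or M107.

from typing import Optional

EVENT_FLIGHT_MODE_T1 = {
    "M422_GOAL_KICK":           "M102_OUTER_EDGE",
    "M441_FREE_KICK_DEFENDING": "M102_OUTER_EDGE",
    "M441_FREE_KICK_ATTACKING": "M105_ON_COURT",
    "M442_PENALTY_KICK":        "M105_ON_COURT",
    "M400_PRE_MATCH":           "FIXED_POSITION",   # M103 or M107
    "M443_PENALTY_SHOOTOUT":    "FIXED_POSITION",
    "M450_PLAYER_SUBSTITUTION": "FIXED_POSITION",
    "M460_POST_MATCH":          "M105_ON_COURT",
}

def required_switch(current_mode: str, detected_event: str) -> Optional[str]:
    """Return the target flight mode, or None if no switch is needed."""
    target = EVENT_FLIGHT_MODE_T1.get(detected_event)
    if target == "FIXED_POSITION":
        # keep the drone on its current side of the touchline
        target = ("M103_OFF_COURT_FIXED_POSITION"
                  if current_mode == "M102_OUTER_EDGE"
                  else "M107_ON_COURT_FIXED_POSITION")
    return target if target is not None and target != current_mode else None

print(required_switch("M105_ON_COURT", "M422_GOAL_KICK"))  # -> M102_OUTER_EDGE
```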
The flight permitted area switching unit 332 is a functional unit that switches the flight permitted area in response to switching of the flight mode.
The geofence switching unit 333 is a functional unit that switches the geofence in response to switching of the flight mode. For example, the geofence switching unit 333 sets the geofence G100 when the flight mode is the outer edge flight mode M102, and sets the geofence G200 when the flight mode is the on-court flight mode M105. In the intermediate modes involved in the transition between the outer edge flight mode M102 and the on-court flight mode M105, that is, the on-court entry mode M104 and the off-court exit mode M106, a third geofence different from the geofences G100 and G200 of the outer edge flight mode M102 and the on-court flight mode M105 is set.
The flight path generation unit 334 is a functional unit that generates the flight path of the drone 100 for movement involving switching of the flight mode. The flight path generation unit 334 determines, for example, the shooting position at which the mode transitions to the on-court entry mode M104 or the off-court exit mode M106. The flight path generation unit 334 also determines the shooting position at which the mode transitions from the on-court entry mode M104 to the on-court flight mode M105, or the shooting position at which the mode transitions from the off-court exit mode M106 to the outer edge flight mode M102. Furthermore, the flight path generation unit 334 generates the specific flight path in the on-court entry mode M104 or the off-court exit mode M106. In principle, this flight path is generated within the area of the third geofence.
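For illustration, one way to check that a generated transition path stays within the third geofence is to sample the path and test each sample against a geofence membership function, as in the sketch below; the waypoints, the box-shaped fence, and the helper names are assumptions introduced here.

```python
# Minimal sketch: sample a straight-line transition path and verify that every
# sample lies inside a geofence membership function (here a simple box that
# stands in for the third geofence). Numeric values are invented.

def straight_path(start, goal, steps: int = 20):
    """Yield evenly spaced (x, y, z) samples on the segment start -> goal."""
    for i in range(steps + 1):
        t = i / steps
        yield tuple(s + t * (g - s) for s, g in zip(start, goal))

def path_stays_inside(contains, start, goal) -> bool:
    """contains is any callable (x, y, z) -> bool, e.g. a geofence membership test."""
    return all(contains(x, y, z) for x, y, z in straight_path(start, goal))

# Third geofence modelled as a box covering both flight areas (invented numbers).
third_fence = lambda x, y, z: -5 <= x <= 105 and -10 <= y <= 64 and 0 <= z <= 30

# Example: moving from an outer-edge shooting position to an on-court one.
print(path_stays_inside(third_fence, (50.0, -2.0, 15.0), (50.0, 30.0, 15.0)))  # True
```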
(A-1-4-5. Outer edge flight control unit 340)
The outer edge flight control unit 340 is a functional unit that controls the flight of the drone 100 in the outer edge flight mode. The outer edge flight control unit 340 has a target position acquisition unit 341 and a flight path generation unit 342.
The target position acquisition unit 341 is a functional unit that acquires the target position to which the drone 100 should head. The target position acquisition unit 341 acquires a target position located within the flyable area of the outer edge flight mode M102. The target position acquisition unit 341 may acquire, for example, a target position input by the user and received via the target position receiving unit 226 of the controller 200.
The flight path generation unit 342 is a functional unit that generates the flight path along which the drone 100 moves in the outer edge flight mode M102. In other words, the flight path generation unit 342 generates a flight path within the flyable area of the outer edge flight mode M102. When the drone 100 is in the flyable area of the outer edge flight mode M102 and the acquired target position also belongs to the flyable area of the outer edge flight mode M102, the flight path generation unit 342 generates a flight path from the current position to the target position. When the drone 100 moves across the outer edge flight mode M102 and the on-court flight mode M105, the flight path generation unit 342 may generate the portion of the flight path that lies within the flight range of the outer edge flight mode M102. The flight path generation unit 342 moves the mobile body, for example, along the touchline F111b.
When the ball crosses the touchline F111b above which the drone 100 is flying and enters the off-court area F200, the flight path generation unit 342 moves the drone 100 into the off-court area F200, outside the touchline F111b. The flight path generation unit 342 may also cause shooting to be performed directly below the drone 100, or from that point toward the court F100. With this configuration, the ball can be followed and photographed even when it rolls out into the off-court area F200. Note that the geofence G100 of the outer edge flight mode M102 is preferably set in advance so as to extend beyond the area above the touchline F111b to the outside of the court F100. With this configuration, even when the drone 100 follows the ball and flies slightly outside the touchline F111b as described above, the drone 100 can be reliably kept within the geofence G100 without resetting the geofence G100.
When the flight path generation unit 342 detects an obstacle on the flight path or near the drone 100, it regenerates a flight path that detours around the obstacle toward the inside of the court F100. Alternatively, when an obstacle is detected, the flight path generation unit 342 may cause the drone to hover for a predetermined time and then move along the originally generated flight path. This is because, in the stadium F, the safety of the drone 100 is not ensured in the off-court area F200, whereas the safety of the drone 100 within the court F100 is highly likely to be ensured. Obstacles may be detected, for example, by the obstacle detection unit 130 of the drone 100, or may be detected based on information from the external system 700 or the like.
Furthermore, when an obstacle is detected, the outer edge flight control unit 340 may switch to manual control after causing the flight path generation unit 342 to hover for a predetermined time. Furthermore, when an obstacle is detected, the outer edge flight control unit 340 may cause the flight path generation unit 342 to hover for a predetermined time, and then display a message prompting the user to re-input the target position via the display control unit 210.
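As a rough illustration of the outer-edge behaviour described above, the following Python sketch generates waypoints along the touchline and shifts the ones near a reported obstacle toward the inside of the court. It is only a sketch under simplified assumptions: the class, the parameter names (touchline_y, detour_depth, clearance), and the coordinate convention are inventions for this example, not identifiers from the disclosure.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

Point = Tuple[float, float]  # (x, y) in court coordinates, metres


@dataclass
class OuterEdgePathPlanner:
    """Hypothetical planner for outer-edge-mode-style flight along a touchline."""
    touchline_y: float          # y coordinate of the touchline the drone follows
    detour_depth: float = 3.0   # how far to shift inward (toward the court) around an obstacle
    step: float = 2.0           # waypoint spacing along the touchline

    def path_along_touchline(self, x_from: float, x_to: float) -> List[Point]:
        """Generate evenly spaced waypoints on the touchline from x_from to x_to."""
        direction = 1.0 if x_to >= x_from else -1.0
        xs, x = [], x_from
        while (x_to - x) * direction > 0:
            xs.append(x)
            x += direction * self.step
        xs.append(x_to)
        return [(x, self.touchline_y) for x in xs]

    def detour_inward(self, path: List[Point], obstacle: Optional[Point],
                      clearance: float = 4.0) -> List[Point]:
        """Shift waypoints near the obstacle toward the inside of the court."""
        if obstacle is None:
            return path
        ox, _ = obstacle
        detoured = []
        for (x, y) in path:
            if abs(x - ox) < clearance:
                # The court interior is assumed to lie at smaller y than the touchline.
                detoured.append((x, y - self.detour_depth))
            else:
                detoured.append((x, y))
        return detoured


if __name__ == "__main__":
    planner = OuterEdgePathPlanner(touchline_y=34.0)
    path = planner.path_along_touchline(x_from=0.0, x_to=20.0)
    print(planner.detour_inward(path, obstacle=(10.0, 34.0)))
```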
(A-1-4-6. In-court flight control unit 350)

The in-court flight control unit 350 is a functional unit that controls the flight of the drone 100 in the in-court flight mode M105. The in-court flight control unit 350 has a target position acquisition unit 351 and a flight path generation unit 352. The target position acquisition unit 351 is a functional unit that acquires a target position located within the flyable area of the in-court flight mode M105.
The flight path generation unit 352 generates the flight path along which the drone 100 moves in the in-court flight mode M105. That is, the flight path generation unit 352 generates a flight path within the flyable area of the in-court flight mode M105. More specifically, in the in-court flight mode M105, the flight path generation unit 352 generates the flight path by connecting a plurality of preset shooting positions. Like the flight path generation unit 342 of the outer edge flight control unit 340, the flight path generation unit 352 may generate the portion of the flight path that lies within the flight range of the in-court flight mode M105 when the current position and the target position belong to the flyable areas of different flight modes.

When the flight path generation unit 352 detects an obstacle on the flight path or near the drone 100, it regenerates a flight path that detours around the obstacle. The obstacle is detected by, for example, the obstacle detection unit 130. The flight path generation unit 352 may regenerate the flight path by changing the connections between the preset shooting positions, or may change to a flight path at a higher altitude while keeping the same path in plan view. When an obstacle is detected, the flight path generation unit 352 may instead hover for a predetermined time and then resume movement along the originally generated flight path. Furthermore, when an obstacle is detected, the in-court flight control unit 350 may cause the flight path generation unit 352 to hover for a predetermined time and then switch to manual control, or may hover for a predetermined time and then display, via the display control unit 210, a message prompting the user to re-input the target position. The obstacle is, for example, a bird, a piece of fixed equipment, or a player; obstacles also include the ball.
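The in-court obstacle responses listed above (reroute, climb, hover then resume, manual fallback, re-input prompt) can be thought of as a small decision routine. The sketch below only illustrates that set of options; the enum values, callback names, and default timings are assumptions, not part of the disclosed system.

```python
import enum
import time
from typing import Callable, List, Optional, Tuple

Point3D = Tuple[float, float, float]


class ObstacleResponse(enum.Enum):
    REROUTE = "reroute"            # reconnect shooting positions around the obstacle
    CLIMB = "climb"                # keep the plan-view path, fly at a higher altitude
    HOVER_THEN_RESUME = "hover"    # wait, then continue on the original path
    MANUAL = "manual"              # hand control back to the operator
    REINPUT = "reinput"            # ask the user for a new target position


def handle_obstacle(path: List[Point3D],
                    reroute: Callable[[List[Point3D]], Optional[List[Point3D]]],
                    policy: ObstacleResponse,
                    hover_seconds: float = 5.0,
                    climb_margin: float = 3.0) -> Optional[List[Point3D]]:
    """Return a new path, or None when control is handed back to the user."""
    if policy is ObstacleResponse.REROUTE:
        return reroute(path)  # e.g. reconnect the preset shooting positions
    if policy is ObstacleResponse.CLIMB:
        return [(x, y, z + climb_margin) for (x, y, z) in path]
    if policy is ObstacleResponse.HOVER_THEN_RESUME:
        time.sleep(hover_seconds)  # stand-in for an actual hover command
        return path
    # MANUAL and REINPUT both stop autonomous path following.
    return None
```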
In this embodiment, flight control in the outer edge flight mode M102 is performed by the outer edge flight control unit 340, and flight control in the in-court flight mode M105 is performed by the in-court flight control unit 350. The shooting positions defined for each of the outer edge flight mode M102 and the in-court flight mode M105 are presented as options, and a flight path to the selected target shooting position is generated. However, the technical scope of the present invention is not limited to this; the shooting position and orientation of the drone 100 may instead be controlled from the controller 200 so that the drone flies freely to any position within the areas of the geofences G100 and G200 set for the respective flight modes.

The functional configuration of the flight path generation units 334, 342, and 352 is merely an example; for instance, a single flight path generation unit may generate the flight paths without this subdivision.
FIG. 15 is a schematic diagram showing the transition paths available to the drone 100, defined over the shooting positions L101 to L105 and L206 to L215 and the evacuation point H200. The ground point L101g below the shooting position L101 serves as the takeoff and landing point of the drone 100. The position transitions of the drone 100 begin with the step of taking off from the point L101g and reaching the shooting position L101. The drone 100 likewise descends at the shooting position L101 and lands at the point L101g to end shooting.

At each of the shooting positions L101 to L105 and L206 to L215, the drone 100 may be allowed to transition only to adjacent shooting positions. For example, the point to which the drone 100 at the shooting position L105 can transition while maintaining the outer edge flight mode is the shooting position L104, and the points to which the drone 100 at the shooting position L105 can transition after switching to the in-court flight mode M105 are the shooting positions L206 and L207. The flight path generation unit 334, 342, or 352 (see FIG. 6) generates the flight path of the drone 100 by referring to these available transition paths.

When the target position reception unit 226 receives the selection of a shooting position, the drone 100 transitions to the selected shooting position by way of the transitable shooting positions. For example, when the drone 100 is at the shooting position L105 and the shooting position L215 is selected, the flight path generation unit 334, 342, or 352 (see FIG. 6) generates a flight path that passes through the shooting positions L105, L207, L208, L213, L212, and L215 in this order, and the drone 100 flies along this flight path.

Note that the drone 100 may transition only to an adjacent shooting position when the transition involves a flight mode switch, and may transition directly to a non-adjacent shooting position when it does not. For example, when the drone moves from the shooting position L105 to the shooting position L215, it may first transition from the shooting position L105 to the shooting position L207 with a mode switch, and then move in a straight line through the area within the geofence G200 from the shooting position L207 to the shooting position L215.
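One straightforward way to realize the adjacency-constrained transitions described above is to hold the shooting positions as a graph and search it for a route. A minimal sketch follows, assuming an adjacency table in the spirit of FIG. 15; the concrete edges listed here are illustrative and not an exact copy of the figure.

```python
from collections import deque
from typing import Dict, List, Optional

# Hypothetical adjacency table over shooting positions; the real edges
# would be taken from the transition graph of FIG. 15.
ADJACENCY: Dict[str, List[str]] = {
    "L105": ["L104", "L206", "L207"],
    "L207": ["L105", "L206", "L208"],
    "L208": ["L207", "L213"],
    "L213": ["L208", "L212"],
    "L212": ["L213", "L215"],
    "L215": ["L212"],
    "L104": ["L105", "L103"],
    "L206": ["L105", "L207"],
    "L103": ["L104"],
}


def shortest_transition_path(start: str, goal: str) -> Optional[List[str]]:
    """Breadth-first search over adjacent shooting positions."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbour in ADJACENCY.get(node, []):
            if neighbour not in visited:
                visited.add(neighbour)
                queue.append(path + [neighbour])
    return None


if __name__ == "__main__":
    # e.g. ['L105', 'L207', 'L208', 'L213', 'L212', 'L215']
    print(shortest_transition_path("L105", "L215"))
```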
The outer edge flight control unit 340 and the in-court flight control unit 350 fly the drone 100 autonomously in each flyable area according to the flight mode. For example, the outer edge flight control unit 340 and the in-court flight control unit 350 may cause the drone 100 to perform dolly shooting within each flyable area, that is, automatic follow-and-shoot of a specific object such as the ball or a designated player. The outer edge flight control unit 340 and the in-court flight control unit 350 may also automatically control the heading of the drone 100 or the direction of the shooting camera 141, that is, the shooting direction, and may automatically control the flight altitude of the drone 100. The manner of autonomous flight may differ depending on the flight mode. For example, dolly shooting may be performed under the control of the outer edge flight control unit 340, while under the control of the in-court flight control unit 350 only the shooting direction may be automatically tracked from a fixed shooting position, or both the position and the shooting direction may be automatically tracked. The outer edge flight control unit 340 and the in-court flight control unit 350 may also generate, within the court (within the stadium F), a flight path for moving the drone 100 to a target position specified by the user in each flyable area.
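As an illustration of the dolly-style automatic following mentioned above, the following sketch computes the next position setpoint that keeps the drone abreast of the ball along the touchline while clamping it to the outer-edge flyable range. The proportional gain, time step, and coordinate bounds are assumptions made for the example.

```python
from typing import Tuple

Point = Tuple[float, float]


def dolly_follow_setpoint(drone_x: float,
                          ball: Point,
                          touchline_y: float,
                          x_min: float,
                          x_max: float,
                          gain: float = 0.8,
                          dt: float = 0.1) -> Point:
    """Next setpoint: track the ball's x coordinate on the touchline, clamped to the fence."""
    ball_x, _ = ball
    # Proportional tracking of the ball's position along the line.
    next_x = drone_x + gain * (ball_x - drone_x) * dt
    # Clamp so the setpoint never leaves the outer-edge flyable range.
    next_x = max(x_min, min(x_max, next_x))
    return (next_x, touchline_y)


if __name__ == "__main__":
    print(dolly_follow_setpoint(drone_x=10.0, ball=(30.0, 20.0),
                                touchline_y=34.0, x_min=0.0, x_max=52.5))
```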
(A-1-4-7. Fixed-position flight control unit 360)

The fixed-position flight control unit 360 is a functional unit that controls the flight of the drone 100 in the outside-court fixed-position flight mode M103 and the in-court fixed-position flight mode M107. In a fixed-position flight mode, the fixed-position flight control unit 360 hovers at a predetermined position, controls the heading or the direction of the shooting camera 141, and performs automatic shooting while following a specific player or the ball. Note that the "control of direction" mentioned above is a concept that includes control not only in the left-right direction (the so-called "pan direction") but also in the up-down direction (the so-called "tilt direction").
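The direction control described here amounts to pointing the camera from a fixed hover point at a moving target in both the pan and tilt directions. Below is a minimal geometric sketch, assuming a flat local coordinate frame and angles in degrees; none of the names come from the disclosure.

```python
import math
from typing import Tuple

Vec3 = Tuple[float, float, float]  # (x, y, z) in a local court frame, metres


def pan_tilt_to_target(drone: Vec3, target: Vec3) -> Tuple[float, float]:
    """Pan (rotation about the vertical axis) and tilt (downward positive) toward the target, in degrees."""
    dx = target[0] - drone[0]
    dy = target[1] - drone[1]
    dz = target[2] - drone[2]
    pan = math.degrees(math.atan2(dy, dx))
    horizontal = math.hypot(dx, dy)
    tilt = math.degrees(math.atan2(-dz, horizontal))  # positive when looking down
    return pan, tilt


if __name__ == "__main__":
    # Drone hovering 15 m above a corner, ball near the centre circle on the ground.
    print(pan_tilt_to_target(drone=(0.0, 0.0, 15.0), target=(30.0, 20.0, 0.0)))
```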
The fixed-position flight control unit 360 includes a mode switching permission determination unit 361. The mode switching permission determination unit 361 is a functional unit that determines whether switching of the flight mode is permitted in the fixed-position flight mode M103 or M107.
(A-1-4-8. Communication unit 370)

The communication unit 370 has a modem or the like (not shown) and is capable of communicating with the drone 100, the controller 200, and the like via the communication network 400. The communication unit 370 may, for example, monitor the state of the drone 100 and its surroundings and notify the controller 200.
(A-1-4-9. Storage unit 380)

The storage unit 380 is a functional unit that stores information related to the flight control of the drone 100, and is, for example, a database. The storage unit 380 stores, for example, the coordinates of the plurality of shooting positions L101 to L105 and L206 to L215 in the stadium F. These coordinates may be two-dimensional coordinates in a plane, or three-dimensional coordinates including height information.
The storage unit 380 also stores the event-flight mode table T1 shown in FIG. 14. Furthermore, the storage unit 380 may store a strong-wind state, detected as an event, in association with the fixed-position flight mode M103 or M107. This is because in strong winds the drone 100 may be unable to fly in the intended direction, so it is safer not to move the drone 100. The event-flight mode table T1 is recorded so as to be rewritable, and a plurality of event-flight mode tables T1 may be stored.
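The event-flight mode table T1 is, in essence, a rewritable mapping from detected events to target flight modes. A minimal sketch of such a table follows; the event names and the associations shown are placeholders chosen for the example, not the actual contents of FIG. 14.

```python
from typing import Dict, Optional

# Placeholder event-to-flight-mode mapping in the spirit of table T1.
event_flight_mode_table: Dict[str, str] = {
    "kickoff": "M105",           # in-court flight mode
    "ball_out_of_play": "M102",  # outer edge flight mode
    "strong_wind": "M103",       # outside-court fixed-position flight mode
}


def target_mode_for_event(event: str,
                          table: Dict[str, str] = event_flight_mode_table) -> Optional[str]:
    """Return the flight mode associated with the event, or None if no switch is required."""
    return table.get(event)


def update_table(event: str, mode: str,
                 table: Dict[str, str] = event_flight_mode_table) -> None:
    """The table is rewritable: associate a new flight mode with an event."""
    table[event] = mode


if __name__ == "__main__":
    print(target_mode_for_event("strong_wind"))  # -> "M103"
```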
● Flowcharts

FIG. 16 is a flowchart showing the overall flow of aerial photography control in this embodiment. FIG. 17 is a subroutine of the flight restriction process S1002 in FIG. 16. FIG. 18 is a subroutine of the flight mode switching process S1010 in FIG. 16.
The control described in the flowchart of FIG. 16 is executed in a periodic loop. As shown in FIG. 16, when it is detected during flight that the drone 100 has come close to the geofence G100 or G200 (YES in step S1001), the process proceeds to the flight restriction process of step S1002. The subroutine of the flight restriction process S1002 is described with reference to FIG. 17.

When approach to the vicinity of the geofence G100 or G200 is not detected in step S1001 (NO in step S1001), the presence or absence of an obstacle on the course or near the airframe of the drone 100 is detected (step S1003). When an obstacle is detected in step S1003 (YES in step S1003), the drone 100 is made to hover, or a detour route is generated and the flight path of the drone 100 is changed to that detour route (step S1004).

When no obstacle is detected in step S1003 (NO in step S1003), the presence or absence of an action determination based on the airframe state is detected (step S1005). When an action determination based on the airframe state is detected in step S1005 (YES in step S1005), the process proceeds to step S1006 and an event type determination process is executed (step S1006).

When no action determination is detected in step S1005 (NO in step S1005), the presence or absence of an input from the controller 200 by the user is determined (step S1007). When an input from the controller 200 is detected in step S1007 (YES in step S1007), a command based on the input is executed (step S1008).

When no input from the controller 200 is detected in step S1007 (NO in step S1007), the presence or absence of an event is determined (step S1009). When an event is detected (YES in step S1009), the process proceeds to the flight mode switching process of step S1010 (step S1010). When no event is detected in step S1009 (NO in step S1009), the process returns to step S1001 and steps S1001 to S1009 are repeated.

As shown in FIG. 16, the overall processing of aerial photography control is performed in the following order: geofence restriction, obstacle detection, control based on the airframe state, control based on user input, and control based on in-game events such as the game state or the offensive-defensive state. That is, each of these controls is executed before control based on in-game events. This order corresponds to the descending priority of performing safe control processing. With this configuration, the safety of the flight of the drone 100 can be ensured more reliably.
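The priority ordering of FIG. 16 can be expressed as a single loop whose checks run from the highest safety priority to the lowest, with only the first firing check acting in each cycle. The sketch below illustrates that ordering only; the predicate and handler names are stand-ins for the units described above, not APIs of this system.

```python
import time
from dataclasses import dataclass
from typing import Callable, List, Tuple

# Each entry pairs a condition check with the handler that runs when it fires.
Check = Tuple[Callable[[], bool], Callable[[], None]]


@dataclass
class ControlLoop:
    """Priority-ordered control loop in the spirit of FIG. 16 (S1001-S1010)."""
    checks: List[Check]          # ordered from highest to lowest safety priority
    period_s: float = 0.1

    def spin_once(self) -> None:
        for condition, handler in self.checks:
            if condition():
                handler()        # only the highest-priority firing check acts
                return

    def run(self, cycles: int) -> None:
        for _ in range(cycles):
            self.spin_once()
            time.sleep(self.period_s)


if __name__ == "__main__":
    loop = ControlLoop(checks=[
        (lambda: False, lambda: print("flight restriction (geofence)")),
        (lambda: False, lambda: print("obstacle avoidance")),
        (lambda: False, lambda: print("airframe-state action")),
        (lambda: False, lambda: print("execute user command")),
        (lambda: True,  lambda: print("flight mode switching on game event")),
    ])
    loop.run(cycles=1)
```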
As shown in FIG. 17, first, the flight control unit 123 of the drone 100 issues an operation command that restricts the drone 100 from going outside the geofence (step S1101). In step S1101, a restriction is placed on the flight target position so that the drone 100 does not go outside the geofence even when the drone 100 is operated manually.

Next, if the drone 100 has not gone outside the geofence (NO in step S1102), the process ends.

If the drone 100 has gone outside the geofence (YES in step S1102), the process proceeds to step S1103. Possible causes of this situation include, for example, the airframe being carried away by an excessively strong wind, or a malfunction of the drone 100 preventing it from flying in the intended direction. In step S1103, the flight control unit 123 issues an operation command to return to the inside of the geofence. More specifically, in step S1103, the drone 100 is given a flight target position command that brings it back inside the geofence, that is, an operation command whose flight target position is a predetermined point inside the geofence.

Next, information measured by the measurement unit 110 of the drone 100, for example position, heading, altitude, or speed, is referred to in order to determine whether the drone 100 is approaching the inside of the geofence (step S1104). In this case, step S1104 is executed a predetermined time after step S1103. Note that in step S1104 it is sufficient that the drone 100 is closer to the inside of the geofence than it was at the time of step S1103; it is not necessary to determine whether the drone 100 is actually located inside the geofence.

If it is determined in step S1104 that the drone 100 is not approaching the inside of the geofence (NO in step S1104), it is judged that the above operation command has had no effect, and the flight control unit 123 forces the drone 100 to land (step S1105).

If the drone 100 is approaching the inside of the geofence in step S1104 (YES in step S1104), the process proceeds to step S1106. In step S1106, information measured by the measurement unit 110 of the drone 100, for example position and altitude, is referred to in order to determine whether the drone 100 is located inside the geofence. If the drone 100 is located inside the geofence (YES in S1106), the process ends. If the drone 100 is not located inside the geofence (NO in S1106), the process returns to step S1104, and the operation based on the command to return to the inside of the geofence is continued until the drone 100 has returned inside the geofence.
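Read as pseudocode, the FIG. 17 subroutine is a breach-recovery loop: command a return point inside the fence, verify that the drone is converging, and force a landing otherwise. The sketch below is one possible rendering under the assumption that a distance to the fence boundary can be sampled; the function names and the added retry limit are illustrative.

```python
import time
from typing import Callable


def geofence_recovery(distance_outside: Callable[[], float],
                      command_return_point: Callable[[], None],
                      force_land: Callable[[], None],
                      check_interval_s: float = 1.0,
                      max_checks: int = 30) -> bool:
    """Return True if the drone is back inside the geofence, False if it was force-landed.

    distance_outside() is assumed to return 0.0 when the drone is inside the fence,
    and the distance (in metres) to the fence boundary when it is outside.
    """
    if distance_outside() <= 0.0:
        return True                      # S1102: never left the fence

    command_return_point()               # S1103: target a point inside the fence
    previous = distance_outside()

    for _ in range(max_checks):
        time.sleep(check_interval_s)
        current = distance_outside()
        if current <= 0.0:
            return True                  # S1106: back inside, done
        if current >= previous:
            force_land()                 # S1105: not converging, force landing
            return False
        previous = current               # S1104: approaching, keep waiting

    force_land()                         # safety net if convergence stalls
    return False
```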
FIG. 18 is a diagram showing one example of the switching process flow for switching flight modes. FIG. 18 shows a process flow that involves switching from the outer edge flight mode M102 to the in-court flight mode M105 and, in particular, a process flow including the flight mode switching and the flight path generation for the case of moving from the shooting position L101 to the shooting position L215. The following description assumes that automatic switching of the flight mode based on event detection has occurred, but the flight mode switching process itself is the same when a flight mode switch is input by manual intervention.

As shown in FIG. 18, first, the flight mode switching unit 330 switches the flight mode from the outer edge flight mode M102 to the court entry mode M104, and the geofence switching unit 333 activates the geofence corresponding to the court entry mode M104 (step S1201). The geofence corresponding to the court entry mode M104 is an integrated geofence defined by the region obtained by integrating the geofence G100 of the outer edge flight mode M102 and the geofence G200 of the in-court flight mode M105.

Next, the flight path generation unit 334 generates a flight path from the shooting position L101 to the shooting position L215 (step S1202). The flight path may be a flight path that transitions between adjacent shooting positions along the routes shown in FIG. 15, or a flight path that connects the shooting position L101 to the shooting position L215 in a straight line. The flight path may also be generated by the flight path generation unit 342 or 352 instead of, or in addition to, the flight path generation unit 334.

Next, it is determined whether the generated flight path lies inside the integrated geofence (step S1203). If at least a part of the flight path has not been generated inside the integrated geofence (NO in step S1203), the flight path generation unit 334 corrects the flight path (step S1204) and the process returns to step S1203. This configuration prevents the drone 100 from going outside the geofence while moving and thus ensures a high level of safety. If the flight path has been generated inside the integrated geofence (YES in step S1203), the process proceeds to step S1205.

Next, a command to move to the shooting position L215 is issued (step S1205), and the drone 100 starts moving.

During the movement, the system 1 determines whether a command to exit to the outside of the court F100 has been issued from the controller 200 (step S1206). If no exit command has been received (NO in step S1206), step S1206 is repeated until the movement into the court F100 is completed (NO in step S1207). When the movement of the drone 100 into the court F100 is completed (YES in step S1207), the flight mode switching unit 330 changes the flight mode to the in-court flight mode M105, and the geofence switching unit 333 switches the geofence to the geofence G200 corresponding to the in-court flight mode M105 (step S1208). The drone 100 continues flying, and when it reaches the destination, the shooting position L215 (YES in step S1220), the process ends.

If a command to exit to the outside of the court F100 is received from the controller 200 in step S1206 (YES in step S1206), the process proceeds to step S1209. The flight path generation unit 334 then generates a flight path to the input destination outside the court F100 (step S1209). The flight control unit 123 then starts moving the drone 100 along that flight path. The movement continues until the movement to the destination outside the court F100 is completed (NO in step S1210); when the movement to that destination is completed (YES in step S1210), the process proceeds to step S1211.

In step S1211, the flight mode switching unit 330 changes the flight mode to the outer edge flight mode M102, and the geofence switching unit 333 switches the geofence to the geofence G100 corresponding to the outer edge flight mode M102 (step S1211). The drone 100 then continues moving within the geofence G100, and when the movement to the destination is completed, the process ends (step S1220).
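The key safety check in steps S1201 to S1204 is that the whole transfer path must lie inside the integrated geofence obtained by merging G100 and G200. The sketch below shows that check with axis-aligned rectangular fences as a simplification; real geofences could be arbitrary polygons, in which case a geometry library such as shapely would be a natural substitute.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]


@dataclass(frozen=True)
class RectFence:
    """Axis-aligned rectangular geofence (a simplification for this sketch)."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def contains(self, p: Point) -> bool:
        return self.x_min <= p[0] <= self.x_max and self.y_min <= p[1] <= self.y_max


def path_inside_integrated_fence(path: List[Point],
                                 fences: List[RectFence],
                                 samples_per_segment: int = 20) -> bool:
    """Check sampled points of the path against the union of the fences (cf. step S1203)."""
    def inside_union(p: Point) -> bool:
        return any(f.contains(p) for f in fences)

    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        for i in range(samples_per_segment + 1):
            t = i / samples_per_segment
            if not inside_union((x0 + t * (x1 - x0), y0 + t * (y1 - y0))):
                return False
    return True


if __name__ == "__main__":
    g100 = RectFence(-5.0, 110.0, 66.0, 76.0)   # strip along a touchline (illustrative)
    g200 = RectFence(0.0, 105.0, 0.0, 68.0)     # over the court itself (illustrative)
    path = [(10.0, 70.0), (10.0, 40.0), (60.0, 40.0)]
    print(path_inside_integrated_fence(path, [g100, g200]))
```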
● Display examples of the display unit 201

FIGS. 19 and 20 show examples of screens G1 and G2 displayed on the display unit 201 of the controller 200 by the display control unit 210.

The screen G1 shown in FIG. 19 displays a field map G10 that schematically shows the stadium F and the shooting positions L101 to L215 from above, an icon G11 indicating the position of the drone 100, the shooting range G12 currently captured by the shooting camera 141, a display field G21 for the flight mode to which the drone 100 belongs, a status display field G22 showing the states detected by the event detection unit, such as the airframe state, the airframe action state, the game state, and the offensive-defensive state, a mode switching button G31 that accepts an operation for switching the flight mode, a control switching button G32 that accepts an operation for switching between automatic and manual control, a video field G40 in which the image being captured by the drone 100 is displayed, and the like. In the example of FIG. 19, the drone is flying in the outer edge flight mode at the shooting position L101. The mode switching button G31 displays the modes to which switching is possible; FIG. 19 shows that switching to the in-court flight mode is possible. The control switching button G32 displays whether switching to automatic or manual control is possible; FIG. 19 shows that switching to manual intervention is possible.

The position and shooting direction of the drone 100 may be controlled manually, or automatic tracking control of the ball or a specific player may be performed. When automatic tracking control is being performed, information on the ball or the specific player being tracked may be displayed on the screen G1.

FIG. 20 shows an example of the screen G2 displayed by the display control unit 210 while the drone 100 is moving. FIG. 20 shows that the drone 100 is flying from the shooting position L208 toward the shooting position L206 in the in-court flight mode M105. The screen G2 also displays information on the destination, that is, the target shooting position L206. More specifically, the drone 100 is flying between the shooting positions L206 and L208, and an arrow extending from the drone 100 toward the destination shooting position L206 is displayed. The target shooting position L206 is displayed in a manner different from the other shooting positions, for example highlighted; more specifically, the shooting position L206 is displayed in bold or in a larger font. The status display field G22 indicates that the airframe action state is "moving".

Note that the heading of the drone 100 is not limited to its direction of travel and may point in any direction. The heading of the drone 100 also need not be constant while moving; for example, the drone may move while shooting a player or the ball by yaw rotation.
[A-2. Effects of this embodiment]

According to this embodiment, safety in aerial photography can be ensured. More specifically, when switching between flight modes for which different geofences are set, a third geofence different from the geofences before and after the switch is set at the time of switching. This ensures that the flight during the switch also remains reliably within a geofence, prevents interference between the geofence and the flight path, and guarantees a safe switching of the flight mode.
The present invention is not limited to the above embodiment, and various configurations may of course be adopted on the basis of the content of this specification.

The series of processes described in connection with the above embodiment may be realized using software, hardware, or a combination of software and hardware. A computer program for realizing each function of the server 300 according to this embodiment can be created and installed on a PC or the like. A computer-readable recording medium storing such a computer program can also be provided. The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, or a flash memory. The above computer program may also be distributed, for example, via the communication network 400 without using a recording medium.

The flowcharts used in the above embodiment need not necessarily be executed in the order shown. Some processing steps may be executed in parallel, additional processing steps may be employed, and some processing steps may be omitted.
1 Aerial photography system
100 Drone (mobile body)
141 Shooting camera
200 Controller
220 Input control unit
300 Server
320 Event detection unit
330 Flight mode switching unit
331 Mode switching determination unit
380 Storage unit
F Stadium
F100 Court
F200 Outside-court area
M102 Outer edge flight mode
M105 In-court flight mode

Claims (19)

1. A mobile body system comprising:
a mobile body that flies in a target area; and
a flight mode switching unit that switches a flight mode in which a flyable area of the mobile body is associated with a geofence of the mobile body defining a region encompassing the flyable area,
wherein, when switching from a first flight mode to a second flight mode, the flight mode switching unit makes the transition to the second flight mode via a third flight mode, and
a third geofence in the third flight mode is different from a first geofence in the first flight mode and a second geofence in the second flight mode.

2. The mobile body system according to claim 1, wherein the third geofence defines a region covering an integrated area obtained by integrating a first area defined by the first geofence and a second area defined by the second geofence.

3. The mobile body system according to claim 1, wherein the third geofence defines a region covering an integrated area obtained by integrating a first area defined by the first geofence, a second area defined by the second geofence, and a gap between the first area and the second area.

4. The mobile body system according to claim 2 or 3, wherein one of the first flight mode and the second flight mode is an outer edge flight mode in which the mobile body flies over the outer edge of the target area, and the other of the first flight mode and the second flight mode is an in-target-area flight mode in which the mobile body flies above the target area.

5. The mobile body system according to claim 2 or 3, further comprising a flight path generation unit that generates a flight path of the mobile body in the third flight mode, wherein the flight path generation unit changes the flight path when at least a part of the flight path is generated outside the third geofence.

6. The mobile body system according to claim 2 or 3, wherein, when it is detected that the mobile body has deviated outside whichever of the first geofence, the second geofence, or the third geofence is currently set, the mobile body is caused to perform at least one of the following operations: landing, hovering, or moving toward the inside of the first area, the second area, or the integrated area.

7. An aerial photography system comprising:
a mobile body that flies in a target area;
an event detection unit that detects an event based on an image acquired by a camera that captures the target area or on an input from an external system; and
a mode switching determination unit that determines whether switching of a flight mode, in which at least a flyable area of the mobile body is defined, is required,
wherein the flight mode includes at least
an outer edge flight mode in which the mobile body flies above the outer edge of a court formed within the target area, along a part or all of the outer edge, and
an in-court flight mode in which the mobile body flies above the court, and
the mode switching determination unit determines whether switching of the flight mode is required in accordance with the event detected by the event detection unit.

8. The aerial photography system according to claim 7, wherein the mode switching determination unit switches the flight mode in accordance with an event, detected from an image captured by the camera, relating to a match being held in the target area.

9. The aerial photography system according to claim 7, further comprising an in-court flight control unit that generates, within the court, a flight path for the mobile body to automatically follow a specific player or a ball in the target area in the in-court flight mode, or a flight path for moving the mobile body to a target position designated by a user.

10. The aerial photography system according to claim 9, further comprising a display control unit that displays information on the mobile body on an operation screen, wherein the display control unit displays, on the operation screen, the specific player or the ball being followed while the mobile body is flying under the automatic following.

11. The aerial photography system according to claim 9, wherein the in-court flight control unit generates the flight path by connecting a plurality of preset shooting positions in the in-court flight mode.

12. The aerial photography system according to claim 11, further comprising a display control unit that displays information on the mobile body on an operation screen, wherein the display control unit displays information on the destination shooting position on the operation screen while the mobile body is flying between the shooting positions in the in-court flight mode.

13. The aerial photography system according to claim 7, wherein the target area is composed of the court, which is demarcated by connecting a pair of goal lines facing each other and a pair of touchlines facing each other, and an outside-court area, the aerial photography system further comprising an outer edge flight control unit that controls the flight of the mobile body in the outer edge flight mode, wherein, in the outer edge flight mode, the outer edge flight control unit moves the mobile body along the touchline and performs automatic shooting by causing the mobile body to follow a specific player or a ball.

14. The aerial photography system according to claim 13, wherein, when the ball crosses the touchline along which the mobile body is flying and enters the outside-court area, the outer edge flight control unit moves the mobile body into the outside-court area, outside the touchline, and performs shooting directly below the mobile body or toward the court.

15. The aerial photography system according to claim 7, wherein the flight mode further includes a fixed-position flight mode in which the mobile body is flown at a fixed position, the aerial photography system further comprising a fixed-position flight control unit that controls the operation of the mobile body in the fixed-position flight mode, wherein, in the fixed-position flight mode, the fixed-position flight control unit hovers at a predetermined position, controls the heading or the direction of the camera, and performs automatic shooting by following a specific player or a ball.

16. The aerial photography system according to claim 11, wherein, when an obstacle is detected on the flight path or near the mobile body in the in-court flight mode, the in-court flight control unit performs at least one of the following operations: regenerating a flight path that avoids the obstacle by changing the connections between the shooting positions; deciding to fly at an altitude higher than the flight path; starting the movement of the mobile body along the flight path after hovering for a predetermined time; switching to manual control after hovering for a predetermined time; and displaying, after hovering for a predetermined time, a message prompting the user to re-input the target position.

17. The aerial photography system according to claim 13, wherein the outer edge flight control unit generates a flight path of the mobile body in the outer edge flight mode and, when an obstacle is detected on the flight path or near the mobile body, performs at least one of the following operations: regenerating a flight path that detours around the obstacle toward the inside of the court; deciding to fly at an altitude higher than the flight path; starting the movement of the mobile body along the flight path after hovering for a predetermined time; switching to manual control after hovering for a predetermined time; and displaying, after hovering for a predetermined time, a message prompting the user to re-input the target position.

18. An aerial photography method comprising:
an event detection step of detecting an event based on an image acquired by a camera that captures a target area or on an input from an external system; and
a mode switching determination step of determining whether switching of a flight mode, in which at least a flyable area of a mobile body flying in the target area is defined, is required,
wherein the flight mode includes at least
an outer edge flight mode in which the mobile body flies above the outer edge of a court formed within the target area, along a part or all of the outer edge, and
an in-court flight mode in which the mobile body flies above the court, and
in the mode switching determination step, whether switching of the flight mode is required is determined in accordance with the event detected in the event detection step.

19. An aerial photography program causing a computer to execute:
an event detection command for detecting an event based on an image acquired by a camera that captures a target area or on an input from an external system; and
a mode switching determination command for determining whether switching of a flight mode, in which at least a flyable area of a mobile body flying in the target area is defined, is required,
wherein the flight mode includes at least
an outer edge flight mode in which the mobile body flies above the outer edge of a court formed within the target area, along a part or all of the outer edge, and
an in-court flight mode in which the mobile body flies above the court, and
the mode switching determination command determines whether switching of the flight mode is required in accordance with the event detected by the event detection command.


PCT/JP2022/036119 2022-09-28 2022-09-28 Mobile body system, aerial photography system, aerial photography method, and aerial photography program WO2024069788A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/036119 WO2024069788A1 (en) 2022-09-28 2022-09-28 Mobile body system, aerial photography system, aerial photography method, and aerial photography program


Publications (1)

Publication Number Publication Date
WO2024069788A1 true WO2024069788A1 (en) 2024-04-04

Family

ID=90476758

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/036119 WO2024069788A1 (en) 2022-09-28 2022-09-28 Mobile body system, aerial photography system, aerial photography method, and aerial photography program

Country Status (1)

Country Link
WO (1) WO2024069788A1 (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013503504A (en) * 2009-08-31 2013-01-31 Trace Optics Pty Ltd Method and apparatus for relative control of multiple cameras
JP2016538607A (en) * 2014-04-17 2016-12-08 SZ DJI Technology Co., Ltd. Aviation control in flight restricted areas
JP2018505089A (en) * 2014-12-19 2018-02-22 AeroVironment, Inc. Supervisory safety system for control and restriction of unmanned aerial vehicle (UAS) maneuvers
JP2018501560A (en) * 2015-03-31 2018-01-18 SZ DJI Technology Co., Ltd. Geofencing device identification method and geofencing device identification system
JP2019193209A (en) * 2018-04-27 2019-10-31 SZ DJI Technology Co., Ltd. Control apparatus and photographing method
US20210086896A1 (en) * 2019-05-08 2021-03-25 Ford Global Technologies, Llc Zone-based unmanned aerial vehicle landing systems and methods

Similar Documents

Publication Publication Date Title
US20210072745A1 (en) Systems and methods for uav flight control
US11797009B2 (en) Unmanned aerial image capture platform
JP7465615B2 (en) Smart aircraft landing
US11188101B2 (en) Method for controlling aircraft, device, and aircraft
US11740630B2 (en) Fitness and sports applications for an autonomous unmanned aerial vehicle
US20210116944A1 (en) Systems and methods for uav path planning and control
US11008098B2 (en) Systems and methods for adjusting UAV trajectory
US10377484B2 (en) UAV positional anchors
US10357709B2 (en) Unmanned aerial vehicle movement via environmental airflow
US20180246507A1 (en) Magic wand interface and other user interaction paradigms for a flying digital assistant
US20170173451A1 (en) Method and system for integrated real and virtual game play for multiple remotely-controlled aircraft
CN114594792A (en) Device and method for controlling a movable object
US20220137647A1 (en) System and method for operating a movable object based on human body indications
WO2024069788A1 (en) Mobile body system, aerial photography system, aerial photography method, and aerial photography program
WO2024069789A1 (en) Aerial imaging system, aerial imaging method, and aerial imaging program
WO2024069790A1 (en) Aerial photography system, aerial photography method, and aerial photography program
WO2023238208A1 (en) Aerial photography system, aerial photography method, and aerial mobile body management device
CN114641744A (en) Control method, apparatus, system, and computer-readable storage medium
WO2024018643A1 (en) Imaging system, imaging method, imaging control device and program
WO2024009447A1 (en) Flight control system and flight control method
KR102502021B1 (en) Virtual reality drone simulator
KR102502020B1 (en) Virtual reality drone simulator
WO2023139628A1 (en) Area setting system and area setting method
JP2021036452A (en) System and method for adjusting uav locus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22960851

Country of ref document: EP

Kind code of ref document: A1