WO2021166175A1 - Drone system, controller, and work area definition method

Drone system, controller, and work area definition method

Info

Publication number
WO2021166175A1
Authority
WO
WIPO (PCT)
Prior art keywords
area, survey, drone, points, unit
Prior art date
Application number
PCT/JP2020/006849
Other languages
English (en)
Japanese (ja)
Inventor
俊一郎 渡辺
了 宮城
千大 和氣
宏記 加藤
Original Assignee
株式会社ナイルワークス
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ナイルワークス
Priority to JP2022501520A (patent JP7412037B2)
Priority to CN202080096913.1A (patent CN115136090A)
Priority to PCT/JP2020/006849 (WO2021166175A1)
Publication of WO2021166175A1

Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05D — SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 — Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10 — Simultaneous control of position or course in three dimensions
    • G05D 1/02 — Control of position or course in two dimensions

Definitions

  • The invention of the present application relates to a drone system, a controller, and a method of defining a work area.
  • Patent Document 2 discloses an autonomous traveling system for a tractor that acquires the position and shape of a field by recording the transition of its position while the tractor is driven around the outer circumference of the field.
  • Patent Document 3 discloses a driving support system that acquires information on the topography and ground of a field and stores the acquired topography and ground information in field map data in association with the position information of a work vehicle.
  • Patent Document 4 discloses a field shape determining device that determines the shape of a field based on GPS position data of a work point in a field in which a work vehicle travels.
  • The drone system according to one aspect of the present invention is a system that defines a work area of a drone based on survey point information, and includes a display unit that displays information on a plurality of surveyed survey points, a survey point selection unit that accepts selection of the survey points displayed on the display unit, and an area definition unit that defines an area by connecting the plurality of survey points accepted by the survey point selection unit to each other.
  • The system may further include an area type selection unit that determines, for the area defined by the area definition unit, an area type that includes a work area of the drone and an obstacle area in which flight of the drone is prohibited.
  • The survey point information may include type information of the area to which each survey point belongs, and the area definition unit may include an area type determination function that determines the area type based on the type information of the areas to which the plurality of survey points used for defining the area belong.
  • The survey point information may include type information of the area to which each survey point belongs, and the area definition unit may permit connection between survey points carrying the same type information and prohibit connection between survey points carrying different type information.
  • the area definition unit may connect each survey point in the order in which selection is received by the survey point selection unit, and define the area with each connection line as the outer edge of the area.
  • The area definition unit may determine whether the survey points were selected in an order that causes the connection lines to intersect, and may issue an error notification when at least some of the connection lines intersect.
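The text above does not specify how intersecting connection lines are detected; a minimal sketch of such a check, using a standard cross-product segment-intersection test over all pairs of non-adjacent polygon edges (function names are illustrative, not from the patent), is:

```python
def _ccw(a, b, c):
    # Positive if a -> b -> c turns counter-clockwise (2D cross product sign).
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_intersect(p1, p2, p3, p4):
    # Proper (interior) crossing test for segments p1p2 and p3p4.
    d1 = _ccw(p3, p4, p1)
    d2 = _ccw(p3, p4, p2)
    d3 = _ccw(p1, p2, p3)
    d4 = _ccw(p1, p2, p4)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

def has_self_intersection(points):
    """True if the closed polygon through `points` (in selection order)
    has any two non-adjacent connection lines that cross."""
    n = len(points)
    edges = [(points[i], points[(i + 1) % n]) for i in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            # Skip edges that share a vertex (adjacent, incl. wrap-around).
            if j == i + 1 or (i == 0 and j == n - 1):
                continue
            if segments_intersect(*edges[i], *edges[j]):
                return True
    return False
```

Selecting the corners of a square in a "bowtie" order makes two connection lines cross, which this check reports as an error condition.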
  • The area definition unit may connect the plurality of survey points accepted by the survey point selection unit to each other so that the area of the defined region is maximized.
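The patent does not state how an ordering that maximizes the enclosed area is found. One common heuristic, sketched here purely as an assumption, orders the survey points by polar angle around their centroid, which yields a simple (non-self-crossing) polygon for roughly convex field boundaries; the shoelace formula then gives the enclosed area:

```python
import math

def shoelace_area(points):
    # Absolute polygon area via the shoelace formula.
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def order_for_max_area(points):
    """Reorder survey points by polar angle around their centroid.
    For roughly convex boundaries this produces a simple polygon,
    which encloses more area than a self-crossing ordering of the
    same points."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    return sorted(points, key=lambda p: math.atan2(p[1] - cy, p[0] - cx))
```

For the four corners of a unit square selected in a crossing order, the shoelace area is 0; after reordering, the full area of 1 is enclosed.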
  • the area defined by the area definition unit may be output to the display unit.
  • A controller according to another aspect of the present invention is a drone controller that defines a work area of a drone that flies autonomously based on survey point information, and includes a display unit that displays information on a plurality of surveyed survey points, a survey point selection unit that accepts selection of the survey points displayed on the display unit, and an area definition unit that defines an area by connecting the plurality of survey points accepted by the survey point selection unit to each other.
  • A work area definition method according to another aspect of the present invention is a method of defining a work area of a drone that flies autonomously based on survey point information, and includes a display step of displaying information on a plurality of surveyed survey points, a survey point selection step of accepting selection of the survey points displayed in the display step, and an area definition step of defining an area by connecting the plurality of survey points accepted in the survey point selection step to each other.
  • the field surveying work can be made more efficient.
  • The drawings include a flowchart showing the flow of defining an area in the drone system, a conceptual diagram showing an outline of the pointing work for defining the areas of a field and obstacles in the drone system, and a conceptual diagram showing an outline of the pointing work in the related art.
  • In this specification, "drone" refers to any air vehicle having multiple rotor blades, regardless of its power means (electric power, prime mover, etc.) or maneuvering method (wireless or wired, autonomous flight type, manual maneuvering type, etc.).
  • The rotor blades 101-1a, 101-1b, 101-2a, 101-2b, 101-3a, 101-3b, 101-4a, 101-4b are the means for flying the drone 100; eight rotors (four sets of two-stage rotor blades) are provided in consideration of the balance between flight stability, airframe size, and power consumption.
  • Each rotor 101 is arranged on all sides of the housing 110 by an arm protruding from the housing 110 of the drone 100.
  • The rotor blades 101-1a and 101-1b are arranged on the left rear side in the direction of travel, 101-2a and 101-2b on the left front side, 101-3a and 101-3b on the right rear side, and 101-4a and 101-4b on the right front side.
  • In the figure, the drone 100 is shown with its traveling direction facing downward on the page.
  • Grid-shaped propeller guards 115-1, 115-2, 115-3, 115-4 forming a substantially cylindrical shape are provided on the outer circumference of each set of rotor blades 101 to prevent the rotor blades 101 from interfering with foreign matter.
  • The radial members supporting the propeller guards 115-1, 115-2, 115-3, 115-4 are inclined rather than horizontal. This encourages the members to buckle outward of the rotor in the event of a collision, preventing them from interfering with the rotor.
  • Rod-shaped legs 107-1, 107-2, 107-3, 107-4 extend downward from the rotation axis of the rotor 101, respectively.
  • The motors 102-1a, 102-1b, 102-2a, 102-2b, 102-3a, 102-3b, 102-4a, 102-4b are the means for rotating the rotor blades 101-1a, 101-1b, 101-2a, 101-2b, 101-3a, 101-3b, 101-4a, 101-4b (typically electric motors, but engines or the like may also be used), and one motor is provided for each rotor.
  • Motor 102 is an example of a propulsion device.
  • For flight stability of the drone and the like, the upper and lower rotor blades in each set (e.g., 101-1a and 101-1b) and their corresponding motors (e.g., 102-1a and 102-1b) have axes on the same straight line and rotate in opposite directions.
  • Four nozzles 103-1, 103-2, 103-3, 103-4 are provided as means for spraying the sprayed material downward.
  • the sprayed material generally refers to a liquid or powder sprayed on a field such as a pesticide, a herbicide, a liquid fertilizer, an insecticide, a seed, and water.
  • the tank 104 is a tank for storing the sprayed material, and is provided at a position close to the center of gravity of the drone 100 and at a position lower than the center of gravity from the viewpoint of weight balance.
  • The hoses 105-1, 105-2, 105-3, 105-4 are means for connecting the tank 104 and the nozzles 103-1, 103-2, 103-3, 103-4 and, being made of a hard material, may also serve as supports for the nozzles.
  • the pump 106 is a means for discharging the sprayed material from the nozzle.
  • FIG. 6 shows an overall conceptual diagram of the flight control system of the drone 100 according to the present invention.
  • This figure is a schematic view, and the scale is not accurate.
  • The drone 100, the controller 401, the base station 404, and the server 405 are connected to each other via the mobile communication network 400.
  • These connections may be wireless communication by Wi-Fi instead of the mobile communication network 400, or may be partially or wholly connected by wire.
  • the components may have a configuration in which they are directly connected to each other in place of or in addition to the mobile communication network 400.
  • Drone 100 and base station 404 communicate with GNSS positioning satellite 410 such as GPS to acquire drone 100 and base station 404 coordinates. There may be a plurality of positioning satellites 410 with which the drone 100 and the base station 404 communicate.
  • The controller 401 is a means for transmitting commands to the drone 100 through user operation and for displaying information received from the drone 100 (for example, position, amount of sprayed material, battery level, camera images, etc.); it may be realized by a portable information device, such as a general-purpose tablet terminal, that runs a computer program.
  • The controller 401 includes an input unit and a display unit as user interface devices.
  • the drone 100 according to the present invention is controlled to perform autonomous flight, but may be capable of manual operation during basic operations such as takeoff and return, and in an emergency.
  • an emergency operation device (not shown) having a function dedicated to emergency stop may be used.
  • the emergency operation device may be a dedicated device provided with a large emergency stop button or the like so that an emergency response can be taken quickly.
  • the system may include a small mobile terminal capable of displaying a part or all of the information displayed on the operating device 401, for example, a smart phone.
  • the small mobile terminal is connected to, for example, the base station 404, and can receive information and the like from the server 405 via the base station 404.
  • Field 403 is a rice field, field, etc. that is the target of spraying with the drone 100. In reality, the terrain of the field 403 is complicated, and the topographic map may not be available in advance, or the topographic map and the situation at the site may be inconsistent. Field 403 is usually adjacent to houses, hospitals, schools, other crop fields, roads, railroads, etc. In addition, there may be intruders such as buildings and electric wires in the field 403.
  • The base station 404 functions as an RTK-GNSS base station and can provide the exact location of the drone 100. It may also provide an access point (parent station) function for Wi-Fi communication. The Wi-Fi access point function and the RTK-GNSS base station may be independent devices. Further, the base station 404 may be able to communicate with the server 405 using a mobile communication system such as 3G, 4G, or LTE. The base station 404 and the server 405 constitute a farming cloud.
  • The server 405 is typically a group of computers operated on a cloud service and related software, and may be wirelessly connected to the controller 401 by a mobile phone line or the like.
  • the server 405 may be configured by a hardware device.
  • the server 405 may analyze the image of the field 403 taken by the drone 100, grasp the growing condition of the crop, and perform a process for determining the flight route.
  • the topographical information of the stored field 403 may be provided to the drone 100.
  • the history of the flight and captured images of the drone 100 may be accumulated and various analysis processes may be performed.
  • the small mobile terminal is, for example, a smart phone.
  • The small mobile terminal displays, as appropriate, information on the expected operation of the drone 100, more specifically the scheduled time at which the drone 100 will return to the departure/arrival point 406, the work the user should perform on its return, and so on. Further, the operation of the drone 100 may be changed based on input from the small mobile terminal.
  • the drone 100 takes off from the departure / arrival point outside the field 403 and returns to the departure / arrival point after spraying the sprayed material on the field 403 or when it becomes necessary to replenish or charge the sprayed material.
  • The flight route (approach route) from the departure/arrival point to the target field 403 may be stored in advance on the server 405 or the like, or may be input by the user before takeoff.
  • the departure / arrival point may be a virtual point defined by the coordinates stored in the drone 100, or may have a physical departure / arrival point.
  • FIG. 7 shows a block diagram showing a control function of an embodiment of the spraying drone according to the present invention.
  • the flight controller 501 is a component that controls the entire drone, and may be an embedded computer including a CPU, memory, related software, and the like.
  • Based on input information received from the controller 401 and input information obtained from the various sensors described later, the flight controller 501 controls the flight of the drone 100 by driving the motors 102-1a, 102-1b, 102-2a, 102-2b, 102-3a, 102-3b, 102-4a, 102-4b via control means such as ESCs (Electronic Speed Controllers).
  • The actual rotation speeds of the motors 102-1a, 102-1b, 102-2a, 102-2b, 102-3a, 102-3b, 102-4a, 102-4b are fed back to the flight controller 501 so that normal rotation can be monitored.
  • the rotary blade 101 may be provided with an optical sensor or the like so that the rotation of the rotary blade 101 is fed back to the flight controller 501.
  • The software used by the flight controller 501 can be rewritten through a storage medium, or through communication means such as Wi-Fi or USB, for function expansion or change, problem correction, and the like. In this case, the software is protected by encryption, checksums, electronic signatures, virus-checking software, and the like, so that it cannot be rewritten by malicious software.
  • a part of the calculation process used by the flight controller 501 for control may be executed by another computer located on the controller 401, the server 405, or somewhere else. Due to the high importance of the flight controller 501, some or all of its components may be duplicated.
  • The flight controller 501 communicates with the controller 401 via the communication device 530 and further via the mobile communication network 400, receiving necessary commands from the controller 401 and transmitting necessary information to the controller 401. In this case, the communication may be encrypted to prevent fraudulent acts such as interception, spoofing, and device hijacking.
  • The base station 404 has an RTK-GPS base station function in addition to a communication function via the mobile communication network 400. By combining the signal of the RTK base station 404 with the signals from positioning satellites 410 such as GPS, the flight controller 501 can measure the absolute position of the drone 100 with an accuracy of about several centimeters. Because the flight controller 501 is so important, it may be duplicated and multiplexed, and each redundant flight controller 501 may be controlled to use a different satellite so that the failure of a particular GPS satellite can be handled.
  • the 6-axis gyro sensor 505 is a means for measuring the acceleration of the drone body in three directions orthogonal to each other, and further, a means for calculating the velocity by integrating the acceleration.
  • the 6-axis gyro sensor 505 is a means for measuring the change in the attitude angle of the drone aircraft in the above-mentioned three directions, that is, the angular velocity.
  • the geomagnetic sensor 506 is a means for measuring the direction of the drone body by measuring the geomagnetism.
  • the barometric pressure sensor 507 is a means for measuring barometric pressure, and can also indirectly measure the altitude of the drone.
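The conversion from barometric pressure to an indirect altitude estimate is not given in the text; the international barometric formula for the standard atmosphere is one common way to sketch it (the function name and the assumption of a known sea-level reference pressure are illustrative, not from the patent):

```python
def pressure_to_altitude(p_hpa, p0_hpa=1013.25):
    """Approximate altitude in metres from static pressure in hPa,
    using the international barometric formula for the standard
    atmosphere. p0_hpa is the assumed sea-level reference pressure."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))
```

At the reference pressure the estimate is 0 m; a reading of 900 hPa corresponds to roughly 1 km of altitude under these assumptions.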
  • the laser sensor 508 is a means for measuring the distance between the drone body and the ground surface by utilizing the reflection of the laser light, and may be an IR (infrared) laser.
  • the sonar 509 is a means for measuring the distance between the drone aircraft and the ground surface by utilizing the reflection of sound waves such as ultrasonic waves. These sensors may be selected according to the cost target and performance requirements of the drone. In addition, a gyro sensor (angular velocity sensor) for measuring the inclination of the aircraft, a wind power sensor for measuring wind power, and the like may be added. Further, these sensors may be duplicated or multiplexed.
  • For sensors that are duplicated, the flight controller 501 may use only one of them and switch to an alternative sensor if it fails. Alternatively, a plurality of sensors may be used at the same time, and a mismatch between their measurement results may be treated as an indication that a failure has occurred.
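The redundancy strategy described above (fall back past failed sensors, or treat disagreeing simultaneous readings as a failure) can be sketched as follows; the function name and tolerance parameter are illustrative assumptions, not from the patent:

```python
def select_reading(readings, tolerance):
    """Combine redundant sensor readings.
    `readings` holds one value per redundant sensor, with None for a
    sensor known to have failed. Returns (value, fault_detected):
    the mean of the live readings, or (None, True) when all sensors
    have failed or the live readings disagree by more than `tolerance`."""
    live = [r for r in readings if r is not None]
    if not live:
        return None, True            # every redundant sensor has failed
    if max(live) - min(live) > tolerance:
        return None, True            # mismatch: treat as a failure
    return sum(live) / len(live), False
```

With one failed barometer and two agreeing ones, the mean of the two live readings is used; if the live readings diverge beyond the tolerance, a fault is flagged instead.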
  • the flow rate sensor 510 is a means for measuring the flow rate of the sprayed material, and is provided at a plurality of locations on the path from the tank 104 to the nozzle 103.
  • the liquid drainage sensor 511 is a sensor that detects that the amount of sprayed material has fallen below a predetermined amount.
  • the growth diagnosis camera 512a is a means for photographing the field 403 and acquiring data for the growth diagnosis.
  • the growth diagnostic camera 512a is, for example, a multispectral camera and receives a plurality of light rays having different wavelengths from each other.
  • the plurality of light rays are, for example, red light (wavelength of about 650 nm) and near-infrared light (wavelength of about 774 nm).
  • the growth diagnosis camera 512a may be a camera that receives visible light.
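The text names red (about 650 nm) and near-infrared (about 774 nm) bands for growth diagnosis but does not name an index. A standard vegetation index computed from exactly these two bands is NDVI; the following is an assumed illustration, not something the patent specifies:

```python
def ndvi(red, nir, eps=1e-9):
    """Normalized Difference Vegetation Index from per-pixel red
    (~650 nm) and near-infrared (~774 nm) reflectance values.
    Healthy vegetation absorbs red light and reflects NIR strongly,
    so NDVI approaches +1; bare soil sits near 0. `eps` guards
    against division by zero on dark pixels."""
    return (nir - red) / (nir + red + eps)
```

For example, a vegetated pixel with red reflectance 0.05 and NIR reflectance 0.45 yields an NDVI of about 0.8.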
  • the pathological diagnosis camera 512b is a means for photographing the crops growing in the field 403 and acquiring the data for the pathological diagnosis.
  • the pathological diagnosis camera 512b is, for example, a red light camera.
  • the red light camera is a camera that detects the amount of light in the frequency band corresponding to the absorption spectrum of chlorophyll contained in the plant, and detects, for example, the amount of light in the band around 650 nm.
  • the pathological diagnosis camera 512b may detect the amount of light in the frequency bands of red light and near infrared light.
  • the pathological diagnosis camera 512b may include both a red light camera and a visible light camera such as an RGB camera that detects light amounts of at least three wavelengths in the visible light band.
  • the pathological diagnosis camera 512b may be a multispectral camera, and may detect the amount of light in the band having a wavelength of 650 nm to 680 nm.
  • the growth diagnosis camera 512a and the pathology diagnosis camera 512b may be realized by one hardware configuration.
  • The obstacle detection camera 513 is a camera for detecting intruding objects around the drone; because its required image characteristics and lens orientation differ from those of the growth diagnosis camera 512a and the pathological diagnosis camera 512b, it is a separate device from them.
  • the switch 514 is a means for the user 402 of the drone 100 to make various settings.
  • The obstacle contact sensor 515 is a sensor for detecting that the drone 100, in particular its rotor or propeller guard, has come into contact with an intruding object such as an electric wire, a building, a human body, a standing tree, a bird, or another drone.
  • the obstacle contact sensor 515 may be replaced by a 6-axis gyro sensor 505.
  • the cover sensor 516 is a sensor that detects that the operation panel of the drone 100 and the cover for internal maintenance are in the open state.
  • The inlet sensor 517 is a sensor that detects that the inlet of the tank 104 is in the open state.
  • sensors may be selected according to the cost target and performance requirements of the drone, and may be duplicated / multiplexed.
  • A sensor may be provided at the base station 404, the controller 401, or some other place outside the drone 100, and the read information may be transmitted to the drone.
  • the base station 404 may be provided with a wind sensor to transmit information on wind power and wind direction to the drone 100 via the mobile communication network 400 or Wi-Fi communication.
  • the flight controller 501 sends a control signal to the pump 106 to adjust the discharge amount and stop the discharge.
  • the current status of the pump 106 (for example, the number of revolutions) is fed back to the flight controller 501.
  • The LED 107 is a display means for notifying the drone operator of the drone's status.
  • Display means such as a liquid crystal display may be used in place of or in addition to the LED.
  • the buzzer is an output means for notifying the state of the drone (particularly the error state) by an audio signal.
  • The communication device 530 is connected to a mobile communication network 400 such as 3G, 4G, or LTE, and can communicate via the mobile communication network 400 with the farming cloud composed of the base station and the server, and with the controller.
  • Instead, other wireless communication means such as Wi-Fi, infrared communication, Bluetooth (registered trademark), ZigBee (registered trademark), or NFC, or wired communication means such as a USB connection, may be used.
  • the speaker 520 is an output means for notifying the state of the drone (particularly the error state) by means of recorded human voice, synthetic voice, or the like. Depending on the weather conditions, it may be difficult to see the visual display of the drone 100 in flight. In such cases, voice communication is effective.
  • the warning light 521 is a display means such as a strobe light for notifying the state of the drone (particularly the error state). These input / output means may be selected according to the cost target and performance requirements of the drone, and may be duplicated or multiplexed.
  • The field management device 1 shown in FIG. 8 is a device that defines the areas of the field in which the drone 100 is to operate, based on the coordinates acquired by the coordinate surveying device 2. For each defined area, the field management device 1, the drone 100, the controller 401, or an external device connectable to the network NW generates a flight route along which the drone 100 flies autonomously. In addition, the field management device 1 defines the areas of obstacles that the drone 100 cannot enter, and flight routes are generated so as to avoid the obstacle areas.
  • The field management device 1 constitutes the drone system 500 together with the drone 100, the controller 401, the base station 404, and the coordinate surveying device 2 connected via the network NW.
  • the field management device 1 may have its function on the server 405, or may be a separate device. Further, the field management device 1 may have a configuration included in the drone 100.
  • a field is an example of a work area.
  • the coordinate surveying device 2 is a device that has the function of a mobile station of RTK-GNSS, and can survey the coordinate information of the field.
  • the coordinate surveying device 2 is a small device that can be held and walked by the user, for example, a rod-shaped device.
  • the coordinate surveying device 2 may be a wand-like device having a length sufficient for the user to stand upright and hold the upper end portion with the lower end touching the ground.
  • The number of coordinate surveying devices 2 used to read the coordinate information of a field may be one or more. With a configuration in which the coordinate information of one field can be measured by a plurality of coordinate surveying devices 2, a plurality of users can each carry a coordinate surveying device 2 and walk the field, so that the surveying work can be completed in a short time.
  • the coordinate surveying device 2 can measure information on obstacles in the field. Obstacles include walls, slopes, utility poles, power lines, etc. where the drone 100 may collide, and various objects that do not require chemical spraying or monitoring.
  • the coordinate surveying device 2 includes an input unit 201, a coordinate detection unit 202, and a transmission unit 203.
  • The input unit 201 is provided at the upper end of the coordinate surveying device 2 and is, for example, a button that accepts a user's press. The user presses the button of the input unit 201 when measuring the coordinates of the lower end of the coordinate surveying device 2. Further, the input unit 201 may accept an input that deletes the data of a survey point whose coordinates have been measured.
  • the input unit 201 is configured to be able to distinguish whether the information to be input is the outer edge coordinates of the field or the outer edge coordinates of the obstacle.
  • the input unit 201 may have at least two buttons, one button being a button for acquiring the outer edge coordinates of the field and the other button being a button for acquiring the outer edge coordinates of the obstacle. Further, the input unit 201 can input the outer edge coordinates of the obstacle in association with the type of the obstacle.
  • the coordinate detection unit 202 is a functional unit capable of detecting the three-dimensional coordinates of the lower end of the coordinate surveying device 2 by appropriately communicating with the base station 404.
  • The transmission unit 203 is a functional unit that, based on an input to the input unit 201, transmits the three-dimensional coordinates of the lower end of the coordinate surveying device 2 at the time of the input to the controller 401 or the field management device 1 via the network NW.
  • the transmission unit 203 transmits the three-dimensional coordinates together with the pointing order.
  • The user moves through the field with the coordinate surveying device 2 and points with the input unit 201 at the end points or edges of the field and of obstacles.
  • The pointed and transmitted three-dimensional coordinates of the end points or edges are received by the field management device 1, which distinguishes the three-dimensional coordinates of the outer circumference of the field from those of obstacles. The pointed three-dimensional coordinates may also be received by the receiving unit 4011 of the controller 401 and displayed by the display unit 4012. Further, the controller 401 may determine whether the received three-dimensional coordinates are suitable as the three-dimensional coordinates of the outer periphery of the field or of an obstacle and, if it determines that re-surveying is necessary, prompt the user to re-survey through the display unit 4012.
  • The field management device 1 includes an arithmetic unit such as a CPU (Central Processing Unit) for executing information processing and storage devices such as RAM (Random Access Memory) and ROM (Read Only Memory), whereby it has, at least as software resources, a coordinate acquisition unit 11, a survey point selection unit 12, an area definition unit 13, and an area output unit 14.
  • the coordinate acquisition unit 11 is a functional unit that acquires the coordinates measured by the coordinate surveying device 2.
  • the coordinate acquisition unit 11 acquires the coordinates of the survey points together with the order acquired by the coordinate surveying device 2.
  • The coordinate acquisition unit 11 may acquire the coordinates of each survey point together with the time at which they were acquired by the coordinate surveying device 2. Further, the coordinate acquisition unit 11 acquires, together with the coordinate information, the type of each survey point, that is, whether the point indicates the outer edge coordinates of the field or the outer edge coordinates of an obstacle: the area type to which the survey point belongs.
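The survey point record described above (coordinates, pointing order, optional survey time, and the area type the point belongs to) can be sketched as a data structure; all names and field choices here are illustrative assumptions, not from the patent:

```python
from dataclasses import dataclass
from enum import Enum

class AreaType(Enum):
    FIELD = "field"          # point on the outer edge of the field
    OBSTACLE = "obstacle"    # point on the outer edge of an obstacle

@dataclass
class SurveyPoint:
    x: float                 # e.g. easting, metres
    y: float                 # e.g. northing, metres
    z: float                 # altitude, metres
    order: int               # pointing order from the surveying device
    timestamp: float         # survey time (optional in the text)
    area_type: AreaType      # area type the point belongs to

def points_in_survey_order(points):
    # The coordinate acquisition unit keeps points in pointing order.
    return sorted(points, key=lambda p: p.order)
```

Keeping the pointing order with each record lets later steps (such as connecting points in selection order, or listing them by survey date and time) recover the sequence without re-sorting by timestamp.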
  • a first example of the area definition screen will be described with reference to FIG.
  • the survey points P1 to P6 acquired by the coordinate acquisition unit 11 are displayed superimposed on the map or photograph of the field on the area definition screen G1 displayed on the display unit 4012.
  • the surveying point list window G11 is displayed on the right side of the area definition screen G1.
  • the surveying dates and times of the surveying points are displayed in a list in the order acquired by the coordinate surveying device 2.
  • the surveying point list window G11 can be expanded by tapping the icon G110 in the upper right corner, and closed by tapping it again.
  • A trash can icon G112 is displayed in each survey point column G111, and the data of a survey point can be deleted by tapping the icon G112. For a deleted survey point, the label "deleted" is displayed in its column.
  • The survey point selection unit 12 is a functional unit that accepts the user's selection of survey points on the display unit 4012 of the controller 401.
  • The user selects a survey point by tapping it on the map or photograph of the field displayed on the area definition screen G1, or by tapping it in the list on the survey point list window G11.
  • The survey points can be selected one by one.
  • the information of the selected survey points is displayed in the selected point list window G12 arranged on the left side of the area definition screen G1.
  • the order selected on the display unit 4012 may be displayed together.
  • the selected survey points are displayed in the order of selection from the upper part to the lower part in the figure.
  • deselection may be accepted by a predetermined input, for example, by tapping the "x" portion.
  • The survey point selection unit 12 may accept only selections of survey points with the same area type. That is, the survey point selection unit 12 allows connections between survey points having the same area type information and prohibits connections between survey points having different area type information.
  • A warning may be displayed when a survey point with a different area type is selected. For example, when the first selected survey point is associated with information indicating that it belongs to the field, only survey points indicating the outer edge coordinates of the field may be selectable from the second selection onward; that is, selection of survey points indicating the outer edge coordinates of obstacles may be invalidated. Alternatively, an area type may be input and defined before the survey point selection operation, and the selectable survey points may be displayed based on the input area type. When defining the area of a field or an obstacle, reliably selecting survey points of the same area type makes it possible to define the area accurately.
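The same-area-type restriction described above can be sketched as follows. This is an illustrative sketch rather than the patent's implementation; the SurveyPoint record and the select_point function are assumed names.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SurveyPoint:
    point_id: str
    x: float
    y: float
    area_type: str  # e.g. "field" or "obstacle"

def select_point(selected, candidate):
    """Append candidate to the selection only if its area type matches
    the first selected point's; otherwise raise so the UI layer can
    show a warning and invalidate the selection."""
    if selected and selected[0].area_type != candidate.area_type:
        raise ValueError("survey point belongs to a different area type")
    return selected + [candidate]
```

In the pre-selected-area-type variant, the same check would compare against the input area type instead of the first selected point.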
  • The survey point selection unit 12 may also have a function of changing the area type associated with each survey point.
  • The area type of a survey point may be changed, and selection may be accepted for each area type. With this configuration, even if an incorrect area type was input at the time of surveying by the coordinate surveying device 2, the area can be defined without re-surveying.
  • The survey point selection unit 12 may also allow survey points to be selected regardless of the area type associated at the time of surveying by the coordinate surveying device 2. In this case, the user can select the area type with the area type selection unit 132 described later.
  • Survey points indicating the outer edge coordinates of the field may be displayed differently from survey points indicating the outer edge coordinates of obstacles, or only the survey points indicating the outer edge coordinates of the field may be displayed.
  • The display of survey points indicating the outer edge coordinates of obstacles may be grayed out.
  • The area definition unit 13 is a functional unit that defines the area of a field or an obstacle by connecting the plurality of survey points accepted by the survey point selection unit 12.
  • The area definition unit 13 includes an outer edge defining unit 131 and an area type selection unit 132.
  • The outer edge defining unit 131 connects the plurality of survey points accepted by the survey point selection unit 12 to demarcate and define the area.
  • The outer edge defining unit 131 may connect the survey points in the order in which the survey point selection unit 12 accepted their selection, this connecting line serving as the line indicating the outer edge of the area. With this configuration, the user can intuitively define an area by tapping survey points so as to surround the area to be defined on the area definition screen G1. If a single area is not defined by the above connecting procedure, an error may be notified via a user interface device such as the controller 401.
  • The area definition unit 13 determines whether the survey points were selected in an order that causes the connecting lines to intersect, and notifies an error when at least some of the connecting lines intersect.
  • A case where a single area is not defined is, for example, a case where the connecting lines intersect one another.
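One way to check the condition above: the closed polyline through the selected points encloses a single area only if it is a simple polygon, i.e. no two non-adjacent connecting lines cross. The sketch below uses the standard orientation test; it is an illustration under that reading, not the patent's actual implementation.

```python
def _orient(p, q, r):
    """Twice the signed area of triangle pqr (>0 if counter-clockwise)."""
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

def _cross(a, b, c, d):
    """True if segments ab and cd properly intersect (not at endpoints)."""
    return (_orient(a, b, c) * _orient(a, b, d) < 0
            and _orient(c, d, a) * _orient(c, d, b) < 0)

def is_simple_polygon(points):
    n = len(points)
    edges = [(points[i], points[(i + 1) % n]) for i in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            # skip adjacent edges, which legitimately share an endpoint
            if j == (i + 1) % n or i == (j + 1) % n:
                continue
            a, b = edges[i]
            c, d = edges[j]
            if _cross(a, b, c, d):
                return False
    return True
```

Selecting the four corners of a square in the order P1, P2, P4, P3 (as in FIG. 13) makes the two diagonals cross, so the check fails and the "Not a simple polygon" error would be raised.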
  • The outer edge defining unit 131 may define an area by connecting the plurality of survey points selected via the survey point selection unit 12 so that all of them lie on the end points or edges of the outer edge of a single area.
  • The outer edge defining unit 131 may, for example, connect survey points whose coordinates are adjacent to one another. With this configuration, the defined area can be generated automatically. If a plurality of areas can be generated from the selected survey points, the outer edge defining unit 131 may adopt the area generated so that its surface area is maximized.
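One plausible reading of "connecting survey points whose coordinates are adjacent" (an assumption, since the patent does not spell out the algorithm) is to reorder the selected points by angle around their centroid, which yields a non-self-intersecting outline for convex and mildly concave field shapes.

```python
# Hypothetical automatic-ordering rule: sort the selected points by
# polar angle around their centroid so the connecting lines do not
# cross. This is a sketch of one possible "automatic" connection,
# not the patent's stated method.
import math

def auto_order(points):
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    return sorted(points, key=lambda p: math.atan2(p[1] - cy, p[0] - cx))
```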
  • The area type selection unit 132 is a functional unit that selects the area type of the area defined by the outer edge defining unit 131.
  • The area type selection unit 132 may determine the type of the area based on the type information associated at the time of surveying by the coordinate surveying device 2. Alternatively, the area type selection unit 132 may accept a selection of whether the area defined by the outer edge defining unit 131 is a field or an obstacle. When the area defined by the outer edge defining unit 131 is selected as an obstacle area, the area type selection unit 132 may be configured to further accept a detailed obstacle type and incidental information. For example, "guardrail", "utility pole", "electric wire", "tree", and the like may be registered as detailed obstacle types, and information on the vertical coordinates (positions) of the obstacle may be registered as incidental information.
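A defined area as described above might be modeled as follows. The field names are illustrative, not taken from the patent; only the distinction between field and obstacle areas, the detailed obstacle kind, and the vertical coordinate as incidental information come from the text.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DefinedArea:
    vertices: list                        # outer-edge survey points, in order
    area_type: str                        # "field" or "obstacle"
    obstacle_kind: Optional[str] = None   # e.g. "guardrail", "utility pole"
    height_m: Optional[float] = None      # vertical coordinate, if registered
```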
  • The area output unit 14 superimposes the defined area A1 on the field displayed on the area definition screen G1. In addition to or instead of this, the area output unit 14 outputs information on the area to a device that generates the flight route of the drone 100.
  • The area output unit 14 may display an indication to that effect on the display unit 4012.
  • A plurality of areas may be displayed in a switchable or superimposed manner to prompt the user to select the area to be adopted.
  • The area output unit 14 superimposes the area A2, defined by selecting the survey points P11, P12, P13, and P14, on the field on the area definition screen G1.
  • Area A2 has a different area type from area A1; for example, area A1 is a work area and area A2 is an obstacle area.
  • The obstacle area is displayed differently from the work area. For example, the shading color and pattern of the area may differ between the obstacle area and the work area.
  • In this example as well, the user selects a survey point by tapping it on the map or photograph of the field displayed on the area definition screen G1, or by tapping it in the list in the survey point list window G11.
  • In the survey point list window G11, a survey point can be tapped with reference to its identification number shown on the map. Therefore, even when a plurality of survey points are close together and difficult to tap individually on the map, the survey points can be selected appropriately.
  • The information of the selected survey point is displayed in the selected point list window G12 arranged on the left side of the area definition screen G1.
  • The selected survey points are displayed in the selected point list window G12 in the order in which they were selected.
  • The identification number of each survey point is also displayed in the selected point list window G12. With this configuration, even when deselecting a point in the selected point list window G12, the deselection can be input appropriately by referring to the identification number.
  • FIG. 13 is a screen showing an example in which the selected survey points are connected in the order of selection.
  • The survey points P1, P2, P4, and P3 are selected in this order in the survey point list window G11.
  • The connecting line between survey points P2 and P4 and the connecting line between survey points P3 and P1 intersect.
  • An error notification "Not a simple polygon" is displayed at the bottom of the selected point list window G12.
  • "Automatic solution" is displayed to the right of the error notification.
  • The plurality of survey points are then automatically connected so as to lie on the end points or edges of the outer edge of a single area, and the area is redefined.
  • The survey type selection window G13 is displayed on the left side of the area definition screen G2 instead of the selected point list window G12.
  • In the survey type selection window G13, whether the area is a field area or an obstacle area can be selected.
  • Flow chart for defining an area: As shown in FIG. 15, first, the coordinates of the survey points are acquired (S1) and displayed on the area definition screen G1 of the display unit 4012, and the selection of a plurality of survey points used to define the area is accepted (S2). When the survey points are selected, they are connected so as to lie on the end points or edges of the outer edge of a single area, and the outer edge of the area is defined (S3). Whether the area could be defined in step S3 is then determined (S4); if the area could not be defined, an error is notified and the outer edge is automatically redefined (S5).
  • When the area can be defined in step S4, or after the area has been redefined in step S5, the selection of the area type is accepted (S6). Next, the information of the area is displayed on the display unit 4012 or output to a device that generates the flight route of the drone 100 (S7). Thereafter, by repeating steps S2 to S7, a plurality of field areas and obstacle areas can be defined.
  • When the outer edge is automatically redefined in step S5, the area generated by the outer edge defining unit 131 so as to maximize the surface area may be adopted.
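The S1 to S7 flow above can be condensed into a driver sketch. The helper callables stand in for the units described in the text (coordinate acquisition, survey point selection, outer edge defining, area type selection, and area output); their names and signatures are assumptions for illustration only.

```python
def define_area(acquire, accept_selection, connect, is_valid, auto_fix,
                select_type, output):
    points = acquire()                    # S1: acquire survey coordinates
    chosen = accept_selection(points)     # S2: accept point selections
    outline = connect(chosen)             # S3: connect into an outer edge
    if not is_valid(outline):             # S4: could the area be defined?
        outline = auto_fix(chosen)        # S5: notify error, redefine edge
    area_type = select_type(outline)      # S6: accept the area type
    output(outline, area_type)            # S7: display / output the area
    return outline, area_type
```

Repeating the call corresponds to repeating S2 to S7 to define multiple field and obstacle areas.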
  • FIG. 17 is a conceptual diagram showing an outline of the pointing work in the related art.
  • Since the survey points of the outer edge are acquired for each defined area, it is necessary to go around the outer edge of each field and each obstacle.
  • Obstacle 1 and obstacle 2 are utility poles, and obstacle 3 is a guardrail.
  • The survey points P201 to P204 on the outer edge of field 1, the survey points P205 to P208 on the outer edge of field 2, the survey points P209 to P212 on the outer edge of field 3, and the survey points P213 to P216 on the outer edge of field 4 are pointed in this order along the direction of the arrow.
  • The survey points P221 to P224 on the outer edge of obstacle 1, the survey points P231 to P234 on the outer edge of obstacle 2, and the survey points P241 to P244 on the outer edge of obstacle 3 are then pointed, going around each in this order along the direction of the arrow.
  • FIG. 16 is a conceptual diagram showing an outline of the pointing work in the present invention.
  • The user can point the survey points P101 to P128 necessary for defining the areas in this order. That is, the user can go around the survey points in order of proximity, even when neighboring survey points constitute the outer edges of different areas, regardless of the area type to which each point belongs. This is because, in the present invention, survey points can be selected after the fact on the area definition screen G1, and the areas can be defined regardless of the order in which the survey points were surveyed.
  • Pointing of the survey points P104 to P113 can be performed in this order while proceeding in one direction, without going back and forth on the farm road beside the field. That is, there is no need to make round trips on the farm road between the survey points P105 and P128, between the survey points P107 and P122, or between the survey points P109 and P120. Further, since the survey points P103 and P104 of the adjacent obstacle can be surveyed immediately before and after surveying the survey point P102 on the outer edge of field 1, there is no need to go back and forth along the long left-right side of obstacle 3.
  • The surveying of obstacle 1 and obstacle 2 can likewise be performed along the way while proceeding straight from survey point P114 to P128 on the outer edge of the field, so there is no need to go back and forth between the survey points of the field and those of the obstacles. That is, the surveying work can be performed efficiently.
  • The drone is not limited to one that flies autonomously in the work area; for example, it may be a drone that flies partly or entirely under the user's control within the work area or on the movement route between the departure/arrival point and the work area.
  • The drone system according to the present invention may be a system that prevents the drone 100 from leaving the work area defined in the system. Specifically, when the drone 100 is within a predetermined range of the outer edge of the work area or inside the work area, a warning may be notified to the user via the controller. In particular, the warning may be notified when the drone 100 is moving in a direction that would take it out of the work area while within a predetermined range inside the work area, or when the drone 100 has an acceleration in that direction.
  • Alternatively, the maneuvering command from the user may be invalidated and the drone 100 made to hover at the outer edge or within a predetermined range inside the work area. Further, instead of hovering, the drone may be made to land on the spot.
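The geofence safeguard described above can be sketched as follows, assuming the work area is a simple polygon in plane coordinates. point_in_polygon uses the even-odd ray-casting rule, and should_warn flags the drone when its position, projected dt seconds ahead at the current velocity, would leave the work area. All names are illustrative; the patent does not specify this algorithm.

```python
def point_in_polygon(pt, poly):
    """Even-odd ray-casting test: is pt inside the polygon poly?"""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # edge crosses the horizontal line through pt;
            # toggle if the crossing is to the right of pt
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def should_warn(pos, vel, poly, dt=1.0):
    """Warn while the drone is inside the area but heading out of it."""
    ahead = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    return point_in_polygon(pos, poly) and not point_in_polygon(ahead, poly)
```

The same predicate could trigger the stronger responses mentioned above (invalidating maneuvering commands, hovering, or landing) instead of a warning.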


Abstract

The object of the present invention is to increase the efficiency of surveying agricultural land. The invention relates to a system (500) that defines a work area (A1) for a drone (100) based on information on survey points (P1-P6), the system comprising: a display unit (4012) for displaying information on the plurality of surveyed points; a survey point selection unit (12) for accepting selections of points displayed on the display unit; and an area definition unit (13) for demarcating and defining the area by connecting together a plurality of survey points accepted by the survey point selection unit.
PCT/JP2020/006849 2020-02-20 2020-02-20 Système de drone, contrôleur et procédé de définition de zone de travail WO2021166175A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2022501520A JP7412037B2 (ja) 2020-02-20 2020-02-20 ドローンシステム、操作器および作業エリアの定義方法
CN202080096913.1A CN115136090A (zh) 2020-02-20 2020-02-20 无人机***、操作器以及作业区域的定义方法
PCT/JP2020/006849 WO2021166175A1 (fr) 2020-02-20 2020-02-20 Système de drone, contrôleur et procédé de définition de zone de travail

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/006849 WO2021166175A1 (fr) 2020-02-20 2020-02-20 Système de drone, contrôleur et procédé de définition de zone de travail

Publications (1)

Publication Number Publication Date
WO2021166175A1 true WO2021166175A1 (fr) 2021-08-26

Family

ID=77390758

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/006849 WO2021166175A1 (fr) 2020-02-20 2020-02-20 Système de drone, contrôleur et procédé de définition de zone de travail

Country Status (3)

Country Link
JP (1) JP7412037B2 (fr)
CN (1) CN115136090A (fr)
WO (1) WO2021166175A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023139628A1 (fr) * 2022-01-18 2023-07-27 株式会社RedDotDroneJapan Système de réglage de zone et procédé de réglage de zone

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017206066A (ja) * 2016-05-16 2017-11-24 株式会社プロドローン 薬液散布用無人航空機
JP2017211734A (ja) * 2016-05-24 2017-11-30 ヤンマー株式会社 自律走行経路生成システム
JP2019082846A (ja) * 2017-10-30 2019-05-30 国立大学法人北海道大学 協調作業システム

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3580765B2 (ja) * 2000-08-01 2004-10-27 株式会社ニコン・トリンブル 複数点種観測システム及び方法
CN106054917A (zh) 2016-05-27 2016-10-26 广州极飞电子科技有限公司 一种无人飞行器的飞行控制方法、装置和遥控器
WO2018020659A1 (fr) 2016-07-29 2018-02-01 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド Corps mobile, procédé de commande de corps mobile, système de commande de corps mobile et programme de commande de corps mobile
JP6289560B2 (ja) 2016-07-29 2018-03-07 株式会社 ミックウェア 地図情報処理装置、地図情報処理システム、地図情報処理方法、およびプログラム
JP7133298B2 (ja) * 2017-08-10 2022-09-08 株式会社小松製作所 運搬車両の管制システム及び運搬車両の管理方法
CN109708636B (zh) 2017-10-26 2021-05-14 广州极飞科技股份有限公司 导航图配置方法、避障方法以及装置、终端、无人飞行器
JP7270265B2 (ja) 2018-10-03 2023-05-10 株式会社ナイルワークス 運転経路生成装置、運転経路生成方法、運転経路生成プログラム、およびドローン


Also Published As

Publication number Publication date
JPWO2021166175A1 (fr) 2021-08-26
CN115136090A (zh) 2022-09-30
JP7412037B2 (ja) 2024-01-12


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20920200

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022501520

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20920200

Country of ref document: EP

Kind code of ref document: A1