EP3391163A1 - Vorrichtung und Verfahren für ein unbemanntes Flugobjekt (Device and method for an unmanned flying object) - Google Patents
Device and method for an unmanned flying object
- Publication number
- EP3391163A1 (application EP16828716.7A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- sensor
- objects
- predefined
- reference data
- signal processing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C39/00—Aircraft not otherwise provided for
- B64C39/02—Aircraft not otherwise provided for characterised by special use
- B64C39/024—Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0094—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C39/00—Aircraft not otherwise provided for
- B64C39/02—Aircraft not otherwise provided for characterised by special use
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
- B64D47/08—Arrangements of cameras
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/12—Target-seeking control
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/17—Terrestrial scenes taken from planes or by drones
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U10/00—Type of UAV
- B64U10/10—Rotorcrafts
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U10/00—Type of UAV
- B64U10/10—Rotorcrafts
- B64U10/13—Flying platforms
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
- B64U2101/31—UAVs specially adapted for particular uses or applications for imaging, photography or videography for surveillance
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
Definitions
- The invention relates to the field of unmanned flying objects, in particular drones such as copters or multicopters.
- Flying objects are known which have a camera and thus allow recordings of objects or persons from the air; these are transmitted in parallel to an operator, so that the operator can inspect the surroundings.
- The operator steers the flying object to positions from which suitable recordings can be made.
- Flying objects are also known that automatically follow objects, such as people or vehicles, e.g. to film an athlete during a sporting activity so that the footage can be evaluated later. Athletes can use the evaluation to optimize their movement sequences. For this purpose, the flying object tracks the athlete by following a radio transmitter that the athlete carries on the body.
- If a flying object is to automatically track an object, such as a person, but the object to be tracked does not carry a radio transmitter on its body, the known automatic tracking is not possible.
- The freedom of action of the person controlling the flying object is also strongly restricted by the control task, so that an operator cannot actively pursue persons in parallel while controlling the flying object.
- It is therefore an object of the present invention to solve one of the aforementioned problems of the prior art.
- In particular, an apparatus and a method are to be found that enable the tracking of an object, such as a vehicle or a person, by an unmanned flying object without a radio link to a transmitter on the object.
- At least an alternative solution to the prior art should be proposed.
- A device for an unmanned flying object, in particular for a drone or a copter, such as a multicopter or a quadrocopter, or a fixed-wing aircraft, according to claim 1 is proposed.
- The device comprises a sensor interface for receiving sensor data of at least one imaging sensor and at least one distance-measuring sensor. Furthermore, the device comprises a signal processing unit which is set up to recognize at least one predefined object by comparing reference data with the received sensor data and/or to distinguish a predefined object from other objects. In addition, the signal processing unit is set up to determine parameters of the predefined object, such as its position, distance and/or movement. In particular, the parameters can be determined relative to the flying object.
- the device comprises an output interface for outputting the position, distance and / or movement of the predefined object.
- Reference data correspond, for example, to predetermined dimensions, such as predetermined sizes, widths and/or depths, and to predetermined values of temperatures, speeds and/or accelerations. Reference data can also correspond to ranges of predetermined dimensions, such as ranges of sizes, widths and/or depths, or to ranges of predetermined values of temperatures, speeds and/or accelerations. The reference data thus serve to predefine the object to be recognized and/or distinguished.
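- To make the role of the reference data concrete, the following is a minimal sketch (not from the patent; all names, units and ranges are illustrative assumptions) of how predefined objects could be described as value ranges and matched against measured parameters:

```python
from dataclasses import dataclass

@dataclass
class ReferenceData:
    """Value ranges predefining an object class; all bounds are inclusive."""
    name: str
    height_m: tuple[float, float]       # (min, max) height in metres
    temperature_c: tuple[float, float]  # (min, max) surface temperature in deg C
    speed_ms: tuple[float, float]       # (min, max) speed in m/s

def matches(ref: ReferenceData, height_m: float,
            temperature_c: float, speed_ms: float) -> bool:
    """True if all measured parameters fall within the reference ranges."""
    return (ref.height_m[0] <= height_m <= ref.height_m[1]
            and ref.temperature_c[0] <= temperature_c <= ref.temperature_c[1]
            and ref.speed_ms[0] <= speed_ms <= ref.speed_ms[1])

# A person and a car, predefined purely by value ranges:
PERSON = ReferenceData("person", (1.0, 2.2), (28.0, 40.0), (0.0, 12.0))
CAR = ReferenceData("car", (1.2, 2.0), (-20.0, 90.0), (0.0, 70.0))

print(matches(PERSON, height_m=1.8, temperature_c=33.0, speed_ms=3.0))  # True
```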
- For object recognition, the sensor data of an imaging sensor and the sensor data of a distance-measuring sensor are considered together in order to recognize a predefined object.
- Object recognition with only one imaging sensor is possible, but it provides only two-dimensional images of an environment. Therefore, some parameters of an object, such as its size, cannot be determined with imaging sensors alone.
- With a distance-measuring sensor alone, parameters of an object such as its size, position and distance are detectable, but recognizing and distinguishing different objects requires a very compute-intensive and expensive evaluation.
- With the imaging sensor, an object can be differentiated from other objects, e.g. on the basis of its contours. By considering both sensors together, this object can now be distinguished from other objects, or an object corresponding to the reference data can be recognized.
- Objects predefined by the reference data are thus recognizable in a particularly simple manner. Furthermore, the position, distance and/or movement of the predefined object are output at the output interface and can then be used, for example, to control the unmanned flying object, e.g. to follow the predefined object at a predefined distance and/or height.
- the signal processing unit is set up to determine movement patterns of the object to be recognized or of the object to be distinguished.
- The operator can thus control the flying object by executing a movement pattern which is then recognized by the signal processing unit. It is conceivable, for example, that a person's waving is recognizable by image processing of the sensor data of the imaging sensor. This waving then corresponds to a preset movement pattern that acts as a command causing the flying object to abort the pursuit and fly back to a starting location, such as a parking area.
- For example, different distances between the flying object and the predefined object (here, for example, the athlete just mentioned) can be set by various gestures that are recognized as movement patterns.
- The movement patterns represent a command or a value, which may also be stepless, i.e. continuous; according to a further alternative, a movement pattern can also represent a direction vector.
- Three-dimensional gestures of a person can thus be evaluated: analog control commands for controlling the unmanned flying object are obtained in a simple manner by measuring the distance between two different body parts of the gesture-performing person, such as between hand and head, in all three spatial dimensions.
- Three-dimensional gestures can also be used, for example, to select a flying object, such as a drone, or to point to an object to be tracked by means of a direction vector.
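- As a sketch of such a stepless gesture command (illustrative only; the patent does not fix a formula), the measured 3D distance between two body parts could be mapped linearly onto a control value between zero point and maximum deflection:

```python
import math

def control_value(hand: tuple[float, float, float],
                  head: tuple[float, float, float],
                  d_min: float = 0.1, d_max: float = 0.9) -> float:
    """Map the hand-to-head distance in 3D (metres) to a stepless value in [0, 1];
    d_min/d_max are assumed calibration bounds (zero point, maximum deflection)."""
    d = math.dist(hand, head)
    return min(max((d - d_min) / (d_max - d_min), 0.0), 1.0)

# Hand raised about half an arm's length from the head -> mid-range command:
print(round(control_value(hand=(0.0, 0.5, 0.3), head=(0.0, 0.0, 0.0)), 2))  # 0.6
```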
- The device comprises a configuration interface designed to receive reference data, so that an object to be recognized or distinguished can be predefined on the basis of these reference data.
- a memory for storing the reference data is present as part of the device.
- Reference data can thus be predetermined by a user of the flying object, e.g. by means of programming via the configuration interface.
- The device, and thus the flying object, is therefore adaptable to the recognition or differentiation of different objects to be tracked. If, in a first application, the flying object is to track a person, such as an athlete surfing on a surfboard, different reference data are required for recognizing the person as a predefined object than when pursuing a vehicle, such as a car.
- The device preferably comprises a controller, which is e.g. a remote control or a radio remote control.
- The controller is used to select reference data from the memory for predefining an object and/or to assign movement patterns to commands and/or values.
- The controller includes a selection menu, which is e.g. displayed on a touch-sensitive screen of the controller.
- This selection menu preferably serves, in conjunction with the signal processing unit, to determine which movement or which physical measured value of an object initiates which action. For example, it can be determined in such a menu with which gestures a flying object can be controlled in which way. In another case it can be determined, for example, that the signal processing unit accepts gesture control from any person as soon as that person points at the drone. Alternatively, the marking of objects by means of gestures and the resulting actions can be defined, such as tracking the selected object. New commands can preferably be learned via a learning mode of the signal processing unit; for this, objects can be placed in front of the sensors and/or numerical inputs can be made within the menu.
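- Such a configuration could, for instance, be a simple mapping from recognized movement patterns to actions; this is a hypothetical sketch, not the patent's data model, and all gesture and action names are invented:

```python
# Hypothetical gesture-to-action configuration, as might be edited via the menu:
GESTURE_ACTIONS = {
    "wave": "abort_pursuit_and_return",
    "point_at_drone": "enable_gesture_control_for_pointing_person",
    "point_at_object": "track_pointed_object",
}

def action_for(gesture: str) -> str | None:
    """Look up which action a recognized movement pattern initiates."""
    return GESTURE_ACTIONS.get(gesture)

def learn_gesture(name: str, action: str) -> None:
    """Learning mode: bind a newly recorded movement pattern to an action."""
    GESTURE_ACTIONS[name] = action

print(action_for("wave"))  # abort_pursuit_and_return
```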
- size, speed or heat ranges of the respective object can be stored as reference data.
- For example, the gesture can be recorded like a short film via a start/stop mechanism, and movements of individual body parts can be stored as speed ranges and position data relative to the person's torso.
- Detected movement patterns may be used to refer to other objects, e.g. to initiate their tracking or closer inspection.
- Detected movement patterns can also be used to select individual drones from a plurality of drones and to give them further commands via gestures that are recognized as movement patterns by the respective drone, e.g. to assign a target or to steer it by gesture control.
- A measurement condition and the acquisition can thus be transferred to another object by a movement pattern. For example, pointing at a person causes the flying object to begin tracking the marked person.
- The measurement conditions are then adapted to the tracking of the person in order to track the object with high agility and reliability.
- A command may also include searching for an object within an already detected object. Thus, for example, the elaborate detection of a license plate can be coupled to the condition that the license plate is only searched for within an already detected object, such as a car. In this way, the computation-intensive image processing of high-resolution data is limited to image areas in which a car is present.
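- The hierarchical search could look like the following sketch, in which the expensive license-plate detector (here a dummy stand-in) only ever sees the image region of an already detected car:

```python
import numpy as np

Box = tuple[int, int, int, int]  # (x, y, width, height) in pixels

def find_plates(image: np.ndarray, cars: list[Box], detect_plate) -> list[Box]:
    """Run the expensive plate detector only inside detected car regions."""
    plates: list[Box] = []
    for (x, y, w, h) in cars:
        roi = image[y:y + h, x:x + w]                # crop to the detected car
        for (px, py, pw, ph) in detect_plate(roi):   # detector never sees the rest
            plates.append((x + px, y + py, pw, ph))  # map back to image coordinates
    return plates

def fake_detector(roi: np.ndarray) -> list[Box]:
    """Dummy stand-in for the real, compute-intensive detector."""
    return [(2, 3, 10, 4)]

print(find_plates(np.zeros((100, 200)), [(50, 40, 60, 30)], fake_detector))
# -> [(52, 43, 10, 4)]
```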
- The signal processing unit is set up to output sensor settings for configuring the sensor, in particular its measurement conditions, as a function of the reference data via the sensor interface to the sensor.
- The reference data may include measured-value acquisition rules that e.g. specify a minimum resolution for a particular object; in this case the minimum resolution is transmitted to the sensor as a measurement condition in the form of a sensor setting.
- The reference data thus comprise rules for measured-value acquisition, e.g. a minimum resolution and/or a minimum update rate of one or more sensors.
- These measured-value acquisition rules may additionally predetermine, for example, a sensor measuring range, e.g. an exact temperature detection range for an infrared camera.
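- Such measured-value acquisition rules could be attached to the reference data roughly as follows; this is a sketch, and the field names and numbers are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class AcquisitionRules:
    min_resolution_px_per_cm: float  # minimum spatial resolution on the object
    min_update_rate_hz: float        # minimum refresh rate for this object
    temperature_range_c: tuple[float, float] | None = None  # e.g. IR measuring range

# Example: a tracked person needs finer and faster sampling than a parked car.
RULES = {
    "person_tracked": AcquisitionRules(2.0, 15.0, (25.0, 42.0)),
    "car_detected": AcquisitionRules(0.5, 5.0),
}
```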
- the sensor interface is used for receiving sensor data of at least one imaging sensor, which is an infrared sensor for capturing images in the infrared range.
- the sensor interface is used to receive sensor data from at least one imaging sensor, which is a photographic and / or video camera for recording images in the visible light range.
- With the sensor data of an infrared sensor, object detection is thus also possible in nighttime darkness or adverse weather conditions.
- With the sensor data of the photo and/or video camera, the detection of objects remains reliably possible at high ambient temperatures, e.g. when the ambient temperature is similar to the body temperature of a person to be tracked and an infrared sensor therefore cannot be used.
- the device comprises at least one imaging sensor, which is an infrared sensor for taking images in the infrared range.
- The infrared sensor is an infrared camera and/or a microbolometer.
- The device comprises at least one imaging sensor which is a photo and/or video camera for taking images in the visible light range. The device can thus be integrated as a complete component with sensors in a simple manner into an unmanned flying object.
- The infrared sensor as an imaging sensor allows the recording of temperatures and/or heat signatures, i.e. an areal distribution of temperatures that is characteristic of an object. Therefore, reference data of temperatures or heat signatures for predefining objects can also be stored in the memory of the device, in order to recognize objects predefined by the stored temperatures and/or heat signatures and/or to distinguish them from other objects having different temperatures and/or heat signatures.
- The sensor interface is set up to receive sensor data from at least one distance-measuring sensor, which is a TOF camera, a lidar or radar sensor operating according to the time-of-flight principle, in particular by means of electromagnetic waves, or a stereoscopic camera.
- These sensors are available in small sizes and, at least in the case of a TOF camera or a lidar or radar sensor, can reliably determine a distance to the object even in nighttime darkness and/or poor visibility.
- With a radar sensor it is additionally possible to distinguish between organic and non-organic objects. If, for example, the amplitude of the reflected electromagnetic radar waves is considered, non-organic objects, e.g. those with a metal surface, reflect the radar waves to a greater extent than a person. By specifying amplitudes of the radar waves as reference data, an organic object can thus additionally be distinguished from non-organic objects, or vice versa.
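- A minimal sketch of this amplitude-based distinction (the normalisation and the threshold are placeholders, not values from the patent):

```python
def is_non_organic(reflected_amplitude: float, threshold: float = 0.6) -> bool:
    """Classify a radar return: strong reflectors such as metal surfaces return
    more energy than organic objects like a person. Amplitudes are assumed to be
    normalised to [0, 1]; in practice the threshold would come from the reference data."""
    return reflected_amplitude >= threshold

print(is_non_organic(0.9))  # metal car body -> True
print(is_non_organic(0.2))  # person -> False
```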
- According to a preferred embodiment, the objects are therefore first determined or identified on the basis of the data of the imaging sensor, and then the distance of each of the determined or identified objects is determined simultaneously or successively with the distance-measuring sensor.
- The device comprises at least one distance-measuring sensor, which is a radar sensor, a TOF camera, a lidar sensor and/or a stereoscopic camera, e.g. a stereoscopic infrared camera. Accordingly, the device can be integrated as an integral component in a simple manner into an unmanned flying object.
- The signal processing unit is set up to determine and output, for the at least one object to be recognized and/or distinguished, at least one probability value which indicates the reliability of the recognition and/or of the distinction and/or of the parameters, such as the position and/or the distance.
- A probability value is, for example, a value between 0 and 1 that is output to indicate the probability that the data output at the output interface are accurate.
- If the probability value falls below a predefined threshold, which is e.g. determined by tests, a flying object can therefore assume that the predefined object can no longer be reliably detected and that a tracking must be aborted.
- The threshold value is preferably determined as a function of the reference data; probability values at or above the threshold correspond to correct recognitions, distinctions and/or parameters, and probability values below the threshold correspond to incorrect recognitions, distinctions and/or parameters.
- The probability value depends on the sensor settings, e.g. the measurement conditions of the sensor, and/or on parameters of the predefined object.
- The required probability value is therefore adaptive and depends on the context of the detected object. Because, for example, the mere erroneous detection of a person is less critical than the misinterpretation of the gestures of a person who currently wants to control the drone or who is about to be followed by it, a higher probability value is required in both of these cases than for passive detection. Thus, the same object may be subject to different probability requirements for its detection.
- The context to be considered is, for example, whether the object is merely to be detected or also to be tracked.
- The probability values thus adapt to possible interaction commands, such as gesture control of the drone.
- The probability values change through interaction of the object with the drone.
- In doing so, the signal processing unit takes into account the observance of measured-value acquisition rules, e.g. minimum resolutions and/or minimum update rates, which are object- and context-dependent.
- The invention further relates to a method for an unmanned flying object, in particular for a drone or a copter, such as a multicopter or a quadrocopter, or a fixed-wing aircraft, with a device, in particular according to one of the aforementioned embodiments.
- The method comprises receiving, with a sensor interface, sensor data of at least one imaging sensor and at least one distance-measuring sensor. Furthermore, the method comprises recognizing at least one predefined object and/or distinguishing a predefined object from other objects by comparing reference data with the received sensor data. The objects are thus predefined in particular by the reference data themselves.
- In addition, parameters of the predefined object, such as the position, distance and/or movement, are determined, in particular relative to the flying object.
- the determination is carried out by means of a signal processing unit.
- the method includes outputting the parameters with an output interface.
- reference data, in particular parameters, for predefining the at least one object to be recognized or distinguished are received with a configuration interface and / or stored in a memory.
- At least one probability value is determined and output by the signal processing unit for the at least one object to be recognized or distinguished, in order to indicate the reliability of the recognition and/or the distinction and/or of the parameters, such as the position and/or the distance of the predefined object.
- movement patterns of the object to be recognized or to be distinguished are determined and output by the signal processing unit.
- The signal processing unit uses the sensor data of the imaging sensor to determine contours of the various recorded objects in the currently acquired sensor data, and records the distances of all objects with the sensor data of the distance-measuring sensor. Based on the sensor data of the imaging sensor and the distances, the parameters of the objects, such as their dimensions, are then determined and compared with the reference data. Objects whose parameters, e.g. dimensions, match the reference data are recognized as predefined objects or distinguished from the other objects.
- In addition, the temperatures and/or heat signatures of the objects are determined with the infrared sensor, and the dimensions and the temperatures are compared with the reference data.
- In this case, one or more objects are recognized as predefined objects, or distinguished from the other objects, when their dimensions and temperatures are substantially identical to the dimensions and temperature data stored as reference data.
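- Taken together, the recognition step could look like this sketch, which assumes a simple pinhole camera model with a known focal length; all numeric ranges again stand in for the stored reference data:

```python
def object_height_m(pixel_height: float, distance_m: float, focal_px: float) -> float:
    """Pinhole model: real height = pixel height * distance / focal length (in pixels)."""
    return pixel_height * distance_m / focal_px

def is_person(pixel_height: float, distance_m: float, temperature_c: float) -> bool:
    """Fuse contour size (imaging sensor), distance (distance-measuring sensor)
    and temperature (infrared sensor), then compare with reference ranges."""
    h = object_height_m(pixel_height, distance_m, focal_px=1000.0)
    return 1.0 <= h <= 2.2 and 28.0 <= temperature_c <= 40.0

# A 180 px tall contour at 10 m with a 1000 px focal length is 1.8 m tall:
print(is_person(180.0, 10.0, temperature_c=33.0))  # True
```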
- the invention relates to an unmanned flying object with a device according to one of the aforementioned embodiments, which is set up in particular for carrying out the method according to one of the aforementioned embodiments.
- Further embodiments will become apparent from the exemplary embodiments illustrated in the drawings. In the drawings:
- FIG. 6 shows an image area of a sensor of the flying object.
- FIG. 1 shows an unmanned flying object 10, which here is a drone, which is also called a copter.
- The unmanned flying object 10 has a plurality of distance-measuring sensors 12A to 12D, which here are radar sensors.
- a pivotable camera 14 is connected to the flying object.
- The distance-measuring sensors 12A to 12C are immovable, while the sensor 12D is fixedly connected to the camera 14, which is pivotable with respect to the flying object 10.
- the camera 14 is connected to the flying object 10 via a gimbal 15.
- Furthermore, imaging sensors 16A to 16D are shown; the sensors 16A to 16C are immovably connected to the flying object 10, while the imaging sensor 16D is rotatably and pivotally connected to the camera 14.
- The imaging sensors 16A to 16D here are infrared cameras.
- The imaging sensors 16A to 16C and the distance-measuring sensors 12A to 12C are also arranged on the side of the flying object 10 not visible in the figure, so that an all-round view with the sensors 12A to 12C and 16A to 16C is available at all times.
- The sensors 12D and 16D are additionally pivotable, in particular in order to be able to scan the blind spot directly underneath the flying object and to simplify object tracking.
- FIG. 2 shows an alternative embodiment of the arrangement of the distance-measuring sensors 12 and the imaging sensors 16, in which no moving parts are needed, since adjacent sensor pairs, each consisting of a distance-measuring sensor 12 and an imaging sensor 16, are arranged at an angle of 60° to each other.
- FIG. 3 now shows an exemplary embodiment of the device 18, which here comprises a distance-measuring sensor 12 and an imaging sensor 16. These sensors are connected by means of a sensor interface to a signal processing unit 20 in order to supply sensor data 21 to the signal processing unit 20.
- The signal processing unit 20 comprises a memory 22 in which reference data 25 are stored. These reference data can be programmed and/or reprogrammed via a configuration interface 24.
- The signal processing unit comprises a processor 26, to which the sensor data 21 of the sensors 12, 16 and the reference data 25 are supplied. The processor 26 is, for example, a DSP (digital signal processor) or an FPGA (field-programmable gate array).
- The processor 26 then determines, on the basis of the sensor data 21, dimensions and temperatures of the objects located in the sensing range of the sensors 12, 16.
- By means of a comparator 28, the dimensions and temperatures of the objects are compared with the reference data 25 stored in the memory 22 and, in the case of a match, parameters of the matching object, such as its dimensions, position and distance, are transmitted via an output interface 29 to a controller 30 of the flying object 10.
- The flying object 10 is now held, for example, at a fixed distance from the recognized and/or distinguished predefined object, so that a movement of the object can be followed. Persons can thus be tracked automatically by a flying object 10 equipped with the device 18, without a user controlling the trajectory.
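- In its simplest form, such a following behaviour could be a proportional controller on the measured distance; a sketch with invented gain and setpoint:

```python
def follow_speed(distance_m: float, setpoint_m: float = 8.0,
                 gain: float = 0.5, v_max: float = 10.0) -> float:
    """P-controller: positive output means 'fly toward the object' (m/s);
    the drone speeds up when too far away and backs off when too close."""
    v = gain * (distance_m - setpoint_m)
    return max(-v_max, min(v_max, v))

print(follow_speed(12.0))  # 2.0 m/s toward the object
print(follow_speed(8.0))   # 0.0 -> hold position
```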
- The use of sensors based on steerable electromagnetic waves poses the problem that, due to the speed of light, either a high resolution and a low update rate or a low resolution and a high update rate must be selected.
- Therefore, the image area is first captured passively with a camera, and possible objects within the image area are segmented for acquisition. The objects are then assigned a minimum resolution and a minimum update rate.
- the minimum resolution and the minimum update rate are defined by the reference data.
- The signal processing unit distinguishes at least three contexts: the mere detection and representation of an object; the detection of an object and a resulting tracking of the object; and the detection of an object and a resulting control command or signal for use within the signal processing unit.
- The contexts are arbitrarily expandable and always result from an external or internal signal, triggered by the object or by a command within the signal processing unit.
- the measurement conditions for the same object may change frequently during the acquisition.
- The method has the advantage that details can be searched for specifically within detected objects.
- details could be highlighted within objects, such as the face of a person or the license plate of a car.
- The signal processing unit is designed to track objects and/or to process control commands and signals, for example gestures of a user that are recognized as movement patterns. In the example of said gesture control, stepless control commands are conceivable, resulting for example from the distance between different body parts. It must then be ensured that the sensors which detect the gesture as a movement pattern have a sufficiently high resolution to resolve, for example, measured head-to-hand distances with sufficient gradations between zero point and maximum deflection. Furthermore, the signal processing unit preferably ensures that the gestures are detected at a sufficiently high update rate to guarantee smooth control.
- Otherwise, the processing of the gestures is aborted, even if the signal processing unit has recognized the gestures as such.
- It is also preferably ensured that the signal processing unit always has enough computing power available for certain calculations and acquisitions. Which minimum conditions have to be met depends on the reference data of the respective object.
- a change in the measurement conditions can take place as soon as the object executes certain movement patterns or changes its physical state.
- A command, such as an instruction by the operator of the drone to track an object, may cause the object to be tracked to be acquired under different measurement conditions.
- Due to the ability of the signal processing unit to recognize, track or interact with objects, it is conceivable that erroneous recognition or interpretation of measurement data could lead to critical drone control errors. To avoid this problem, different contexts of object detection have different thresholds for accepting a detection. It is usually less critical if the signal processing unit erroneously detects a human than if a human controls the flying object by gesture control and interpretation errors of the gestures occur in the process.
- The threshold value is therefore kept adaptive and adapts to the context in which the object is detected. If a human is merely detected, this happens with a relatively low threshold; here the measurement data of the sensors are processed with a large tolerance. If the captured human now wants to control the flying object, he or she makes a gesture and the context of the acquisition changes. The threshold is raised for the recognition of control commands, and the operator can only control the flying object if the detection of the gestures by the signal processing unit meets the new, higher threshold.
- This measure guarantees that control commands are detected with high reliability, while the signal processing unit retains the ability to continue assigning noisy measurement data to an object.
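- In code, the context-dependent threshold could look like this sketch; the numeric thresholds are placeholders, not values from the patent:

```python
# Hypothetical per-context thresholds on the detection probability value:
THRESHOLDS = {
    "passive_detection": 0.5,  # tolerant: noisy data may still be assigned
    "tracking": 0.7,
    "gesture_control": 0.9,   # strict: misread commands are critical
}

def accept(probability: float, context: str) -> bool:
    """Accept a detection only if it meets the threshold of its context."""
    return probability >= THRESHOLDS[context]

p = 0.8  # one and the same detection...
print(accept(p, "passive_detection"))  # True: good enough to keep as an object
print(accept(p, "gesture_control"))    # False: too uncertain to execute a command
```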
- Due to the ability of the signal processing unit to recognize, track or interact with objects, it is conceivable that the signal processing unit also has, in addition to the reference data, a configuration which assigns control commands, physical measured values or objects to an action. For example, in such a menu it can be determined with which gestures the flying object can be controlled in which way. Complex commands are also conceivable here, such as addressing a flying object by simply pointing at it.
- FIGS. 4 and 5 show the detection of gestures by the sensors of the drone.
- The detection takes place both two-dimensionally and three-dimensionally via the use of at least one video sensor and one depth sensor.
- the illustrated grid 40 corresponds to the resolution of one of the involved sensors 12A-12D upon detection of the operator.
- In FIG. 4, the distance to the operator is five times greater than in FIG. 5.
- Although the drone recognizes the gestures as such without error, it will not process them, since in FIG. 4 the minimum resolution cannot be met. In FIG. 5, this minimum resolution is maintained.
- The achieved resolution results here from the distance to the object, the resolution of the sensor and its aperture angle. The minimum resolution is specified in pixels/cm for each object within the reference data.
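- The achieved resolution in pixels/cm follows from the sensor geometry; a sketch assuming a simple pinhole model with a given horizontal aperture angle:

```python
import math

def pixels_per_cm(sensor_px: int, aperture_deg: float, distance_m: float) -> float:
    """Pixels per centimetre on an object at the given distance, for a sensor
    with sensor_px pixels across an aperture angle of aperture_deg degrees."""
    scene_width_m = 2.0 * distance_m * math.tan(math.radians(aperture_deg) / 2.0)
    return sensor_px / (scene_width_m * 100.0)

# Five times the distance yields one fifth of the resolution (cf. FIGS. 4 and 5):
print(round(pixels_per_cm(1000, 60.0, 2.0), 2))   # 4.33 px/cm
print(round(pixels_per_cm(1000, 60.0, 10.0), 2))  # 0.87 px/cm
```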
- FIG. 6 shows the image area of a 1000 × 1000 pixel solid-state lidar.
- The sensor can address 1 million pixels per second and determine a distance for each.
- Object A is a resting person.
- Object B is a human who makes a gesture.
- Objects C and D are vehicles, which are also detected.
- Object E is a human who controls the drone by gesture control.
- The problem with this scenario is that the sensor can only refresh its full image area at 1 Hz. Letting the sensor measure only said objects still results in a uniform update rate of about 12.5 Hz for all objects.
- The resting person A should be captured at 2 Hz and the moving vehicles C, D at 5 Hz.
- The human E, who actively controls the drone, should be captured at a rate of at least 20 Hz, and the human B, who performs a signaling gesture, at 15 Hz. Due to the speed of light and the number of objects in the image area, a sufficiently high refresh rate for all objects simultaneously is not possible with uniform scanning.
- Therefore, a context-dependent measurement method is used.
- The lidar scans each object per second only as often as the minimum requirements from the reference data demand. This allows measurements to be redirected to objects with a higher required update rate, such as the drone operator. If, during acquisition, objects signal that they are making a gesture, or that the drone should actively track a detected object, the measurement condition for the respective context is dynamically adjusted.
- Object A occupies 2,173 pixels, object B 2,438 pixels, objects C and D 28,600 pixels each, and object E 17,712 pixels. This yields 4,346 measurements/s for object A, 36,570 for object B, 143,000 each for objects C and D, and 354,240 for object E, so the sensor only has to perform 681,156 measurements per second in total and complies with the measurement conditions for each object.
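- The budget calculation of this example can be reproduced directly; the pixel counts and required rates are those of the scenario above:

```python
# (pixels occupied, required update rate in Hz) per object:
objects = {
    "A resting person": (2_173, 2),
    "B signaling person": (2_438, 15),
    "C vehicle": (28_600, 5),
    "D vehicle": (28_600, 5),
    "E drone operator": (17_712, 20),
}
BUDGET = 1_000_000  # lidar measurements per second

uniform = BUDGET / sum(px for px, _ in objects.values())
print(f"uniform scan of all objects: {uniform:.1f} Hz")  # ~12.6 Hz (rounded to 12.5 above)

needed = sum(px * hz for px, hz in objects.values())
print(f"context-dependent scan: {needed:,} measurements/s")  # 681,156 <= budget
```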
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102015122183.7A DE102015122183B4 (de) | 2015-12-18 | 2015-12-18 | Vorrichtung und Verfahren für ein unbemanntes Flugobjekt |
PCT/EP2016/081700 WO2017103255A1 (de) | 2015-12-18 | 2016-12-19 | Vorrichtung und verfahren für ein unbemanntes flugobjekt |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3391163A1 true EP3391163A1 (de) | 2018-10-24 |
Family
ID=57838314
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP16828716.7A Withdrawn EP3391163A1 (de) | 2015-12-18 | 2016-12-19 | Vorrichtung und verfahren für ein unbemanntes flugobjekt |
Country Status (4)
Country | Link |
---|---|
US (1) | US10351241B2 (de) |
EP (1) | EP3391163A1 (de) |
DE (1) | DE102015122183B4 (de) |
WO (1) | WO2017103255A1 (de) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107239728B (zh) * | 2017-01-04 | 2021-02-02 | 赛灵思电子科技(北京)有限公司 | 基于深度学习姿态估计的无人机交互装置与方法 |
US10496107B2 (en) * | 2017-01-17 | 2019-12-03 | Valeo North America, Inc. | Autonomous security drone system and method |
USD928072S1 (en) * | 2018-12-03 | 2021-08-17 | SZ DJI Technology Co., Ltd. | Aerial vehicle |
CN110132049A (zh) * | 2019-06-11 | 2019-08-16 | 南京森林警察学院 | 一种基于无人机平台的自动瞄准式狙击步枪 |
DE102020112362A1 (de) | 2020-05-07 | 2021-11-11 | Spleenlab GmbH | Zur räumlichen Distanzierung ("Social Distancing") geeignete Personenerfassung und Distanzmessung |
US11817000B2 (en) * | 2020-12-10 | 2023-11-14 | Rockwell Collins, Inc. | System and method to reduce runway occupancy time using pseudo threshold |
US11656723B2 (en) * | 2021-02-12 | 2023-05-23 | Vizio, Inc. | Systems and methods for providing on-screen virtual keyboards |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8718838B2 (en) * | 2007-12-14 | 2014-05-06 | The Boeing Company | System and methods for autonomous tracking and surveillance |
US9965850B2 (en) * | 2012-07-05 | 2018-05-08 | Bernard Fryshman | Object image recognition and instant active response with enhanced application and utility |
US9110168B2 (en) * | 2011-11-18 | 2015-08-18 | Farrokh Mohamadi | Software-defined multi-mode ultra-wideband radar for autonomous vertical take-off and landing of small unmanned aerial systems |
US20130289858A1 (en) * | 2012-04-25 | 2013-10-31 | Alain Anthony Mangiat | Method for controlling and communicating with a swarm of autonomous vehicles using one-touch or one-click gestures from a mobile platform |
US20140008496A1 (en) * | 2012-07-05 | 2014-01-09 | Zhou Ye | Using handheld device to control flying object |
US20140316614A1 (en) * | 2012-12-17 | 2014-10-23 | David L. Newman | Drone for collecting images and system for categorizing image data |
WO2014115139A1 (en) * | 2013-01-23 | 2014-07-31 | Iatas (Automatic Air Traffic Control) Ltd | System and methods for automated airport air traffic control services |
US9367067B2 (en) * | 2013-03-15 | 2016-06-14 | Ashley A Gilmore | Digital tethering for tracking with autonomous aerial robot |
US10240930B2 (en) * | 2013-12-10 | 2019-03-26 | SZ DJI Technology Co., Ltd. | Sensor fusion |
EP3081902B1 (de) * | 2014-03-24 | 2019-04-17 | SZ DJI Technology Co., Ltd. | Verfahren und vorrichtung zur korrektur des flugzeugstatus in echtzeit |
US20160101856A1 (en) * | 2014-06-23 | 2016-04-14 | Nixie Labs, Inc. | Wearable unmanned aerial vehicles, and associated systems and methods |
CN107168352B (zh) * | 2014-07-30 | 2020-07-14 | 深圳市大疆创新科技有限公司 | 目标追踪***及方法 |
CN112097789B (zh) * | 2014-10-27 | 2023-02-28 | 深圳市大疆创新科技有限公司 | 无人飞行器飞行显示 |
WO2016065623A1 (en) * | 2014-10-31 | 2016-05-06 | SZ DJI Technology Co., Ltd. | Systems and methods for surveillance with visual marker |
US9927809B1 (en) * | 2014-10-31 | 2018-03-27 | State Farm Mutual Automobile Insurance Company | User interface to facilitate control of unmanned aerial vehicles (UAVs) |
US20170102699A1 (en) * | 2014-12-22 | 2017-04-13 | Intel Corporation | Drone control through imagery |
CN111762136A (zh) * | 2015-05-12 | 2020-10-13 | 深圳市大疆创新科技有限公司 | 识别或检测障碍物的设备和方法 |
KR20160138806A (ko) * | 2015-05-26 | 2016-12-06 | 엘지전자 주식회사 | 글래스타입 단말기 및 그 제어방법 |
US10310617B2 (en) * | 2015-06-11 | 2019-06-04 | Intel Corporation | Drone controlling device and method |
EP3225026A4 (de) * | 2015-07-31 | 2017-12-13 | SZ DJI Technology Co., Ltd. | Verfahren zur sensorunterstützten ratensteuerung |
US9663227B1 (en) * | 2015-12-22 | 2017-05-30 | Gopro, Inc. | Systems and methods for controlling an unmanned aerial vehicle |
US20170371410A1 (en) * | 2016-06-28 | 2017-12-28 | International Business Machines Corporation | Dynamic virtual object interactions by variable strength ties |
2015
- 2015-12-18 DE DE102015122183.7A patent/DE102015122183B4/de not_active Expired - Fee Related

2016
- 2016-12-19 EP EP16828716.7A patent/EP3391163A1/de not_active Withdrawn
- 2016-12-19 WO PCT/EP2016/081700 patent/WO2017103255A1/de active Application Filing

2018
- 2018-06-08 US US16/004,064 patent/US10351241B2/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
DE102015122183A1 (de) | 2017-06-22 |
US20180290750A1 (en) | 2018-10-11 |
WO2017103255A1 (de) | 2017-06-22 |
US10351241B2 (en) | 2019-07-16 |
DE102015122183B4 (de) | 2018-12-06 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE
| 17P | Request for examination filed | Effective date: 20180718
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
| AX | Request for extension of the european patent | Extension state: BA ME
| DAV | Request for validation of the european patent (deleted) |
| DAX | Request for extension of the european patent (deleted) |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS
| 17Q | First examination report despatched | Effective date: 20190417
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN
| 18D | Application deemed to be withdrawn | Effective date: 20210227