CN111344649A - Management method and equipment of unmanned control platform - Google Patents

Management method and equipment of unmanned control platform

Info

Publication number
CN111344649A
Authority
CN
China
Prior art keywords
control platform
environment
unmanned control
passage
display device
Prior art date
Legal status
Granted
Application number
CN201880071487.9A
Other languages
Chinese (zh)
Other versions
CN111344649B (en)
Inventor
唐克坦
葛宏斌
朱成伟
Current Assignee
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Publication of CN111344649A
Application granted
Publication of CN111344649B

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10 Simultaneous control of position or course in three dimensions
    • G05D 1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 specially adapted for navigation in a road network
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10 Simultaneous control of position or course in three dimensions

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

A management method and device for an unmanned control platform are disclosed. The method includes: acquiring sensing data of an environment collected by an environment sensor, and detecting an unmanned control platform in the environment according to the sensing data (101); when the unmanned control platform is detected, controlling an identifier display device to display a passage indicator, so that the unmanned control platform recognizes the passage indicator to determine its passage mode (102). Traffic management of the unmanned control platform is achieved by displaying a passage indicator: no complex communication system needs to be established, the reliability of control is ensured, the demand on the computing capacity of the management device is reduced, and cost is effectively lowered.

Description

Management method and equipment of unmanned control platform
Technical Field
The embodiment of the invention relates to the field of control, in particular to a management method and equipment of an unmanned control platform.
Background
In the prior art, an unmanned control platform (e.g., an unmanned ground robot, an unmanned aerial vehicle, etc.) is often scheduled by a central processing module (e.g., a central server, etc.), so as to implement path planning, automatic control, etc. of the unmanned control platform.
However, such a centralized management and control method places high demands on the computing power of the central processing module, which increases the control cost of the unmanned control platform. It also places very high demands on the reliability of communication between the unmanned control platform and the central processing module, making highly reliable communication costly; in some scenarios such communication cannot be realized at all, resulting in low reliability of the management of the unmanned control platform.
Disclosure of Invention
The embodiment of the invention provides a management method and equipment of an unmanned control platform, which are used for improving the flexibility and reliability of the management of the unmanned control platform.
A first aspect of an embodiment of the present invention provides a method for managing an unmanned control platform, including:
acquiring sensing data of an environment acquired by an environment sensor, and detecting an unmanned control platform in the environment according to the sensing data;
when the unmanned control platform is detected, controlling an identifier display device to display a passage indicator, so that the unmanned control platform recognizes the passage indicator to determine its passage mode.
A second aspect of an embodiment of the present invention provides a management device for an unmanned control platform, including: an environment sensor, an identifier display device, and a processor,
the processor is configured to perform the following operations:
acquiring sensing data of an environment acquired by an environment sensor, and detecting an unmanned control platform in the environment according to the sensing data;
when the unmanned control platform is detected, controlling the identifier display device to display a passage indicator, so that the unmanned control platform recognizes the passage indicator to determine its passage mode.
In the management method and device for an unmanned control platform provided by this embodiment, the unmanned control platform in the environment is detected according to the sensing data, and when it is detected, the identifier display device is controlled to display a passage indicator, so that the unmanned control platform recognizes the passage indicator to determine its passage mode. Traffic management of the unmanned control platform is thus achieved by displaying a passage indicator: no complex communication system needs to be established, the reliability of control is ensured, the demand on the computing capacity of the management device is reduced, and cost is effectively lowered.
Drawings
In order to illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show some embodiments of the present invention; those skilled in the art can obtain other drawings based on them without inventive effort.
FIG. 1 is a flowchart of a management method provided by an embodiment of the present invention;
FIG. 2 is a scene diagram of management of an unmanned control platform according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a first unmanned control platform and a second unmanned control platform according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of detecting an unmanned control platform in a target area according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of detecting an unmanned control platform in multiple target areas according to an embodiment of the present invention;
FIG. 6 is a block diagram of a management device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It will be understood that when an element is referred to as being "secured to" another element, it can be directly on the other element or intervening elements may also be present. When a component is referred to as being "connected" to another component, it can be directly connected to the other component or intervening components may also be present.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
Some embodiments of the invention are described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
The embodiment of the invention provides a management method of an unmanned control platform. Fig. 1 is a flowchart of a management method of an unmanned control platform according to an embodiment of the present invention. As shown in fig. 1, the method in this embodiment may include:
s101, acquiring sensing data of an environment acquired by an environment sensor, and detecting an unmanned control platform in the environment according to the sensing data.
Specifically, as shown in FIG. 2, the unmanned control platform 201 may be any unmanned device, for example an unmanned vehicle, an unmanned aerial vehicle, an unmanned ship, etc.; an unmanned aerial vehicle is illustrated here. The management device of the unmanned control platform may include an environment sensor 202 and a processor (not shown), where the processor may include one or more processors operating alone or in cooperation. The environment sensor 202 may sense the environment around it and output sensing data, and the processor may acquire the sensing data output by the sensor 202, where the sensing data may include one or more of an image, depth data, and point cloud data. The processor may detect an unmanned control platform within the measurement range of the sensor 202 based on the sensing data, i.e., detect whether an unmanned control platform exists within the measurement range.
Further, the detecting an unmanned control platform in the environment according to the sensing data includes: detecting, according to the sensing data, an unmanned control platform within a preset distance of the sensor, that is, detecting whether an unmanned control platform exists within the preset distance of the sensor in the environment. The preset distance may be fixed in the program code of the processor; in some cases, the management device of the unmanned control platform may receive an input from a user and set the preset distance according to that input.
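As an illustrative sketch (not part of the patent), the preset-distance check can be expressed as a simple filter over sensed points, assuming the point cloud is given in the sensor's own coordinate frame with the sensor at the origin; the function names are hypothetical:

```python
import math

def points_within_range(points, preset_distance):
    """Keep only sensed 3-D points that lie within `preset_distance`
    of the sensor, which is assumed to sit at the origin of the cloud."""
    return [p for p in points
            if math.dist(p, (0.0, 0.0, 0.0)) <= preset_distance]

def platform_detected(points, preset_distance):
    """Report a detection only if at least one candidate point
    falls inside the preset range."""
    return len(points_within_range(points, preset_distance)) > 0
```

Here `points` would be the candidate points attributed to an unmanned control platform by an upstream detector; the origin-centred frame is an assumption for illustration.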
It is understood that a person skilled in the art may choose the installation position of the environment sensor 202 as required. For example, the environment sensor 202 may be mounted in the air by a carrying device (e.g., a mounting rod); in some cases, it may be placed on the ground and output sensing data of the space above the ground. This is not limited here.
S102, when the unmanned control platform is detected, controlling an identifier display device to display a passage indicator, so that the unmanned control platform recognizes the passage indicator to determine its passage mode.
Specifically, with continued reference to FIG. 2, the management device further includes an identifier display device 203, which may be any apparatus that implements a display function; the identifier display device 203 may be an electronic display (a liquid crystal display, an LED display, etc.) or a non-electronic display (an identifier panel, etc.). When the processor determines from the sensing data that an unmanned control platform has been detected, it may control the identifier display device 203 to display a passage indicator. After the passage indicator is displayed, the unmanned control platform may recognize it using a sensor configured on the platform (e.g., a shooting device). Different passage indicators indicate different passage modes; the unmanned control platform determines the passage mode corresponding to the indicator displayed by the identifier display device 203 and moves according to the determined mode. The passage indicator may be any indicator the unmanned control platform can recognize, for example one or more of a random-dot indicator, a bar code, a two-dimensional code, and a road traffic sign.
Further, the passage indicator includes a passage permission indicator, and controlling the identifier display device to display the passage indicator so that the unmanned control platform recognizes it to determine the passage mode includes: controlling the identifier display device to display the passage permission indicator, so that the unmanned control platform recognizes the passage permission indicator to determine the passage permission mode. Specifically, when the unmanned control platform recognizes the passage permission indicator, it determines the passage permission mode and may pass directly.
Further, the passage indicator includes a passage restriction indicator, and controlling the identifier display device to display the passage indicator so that the unmanned control platform recognizes it to determine the passage mode includes: controlling the identifier display device to display the passage restriction indicator, so that the unmanned control platform recognizes the passage restriction indicator to determine the passage restriction mode and controls its movement accordingly. The passage restriction indicator may include a deceleration indicator or a passage prohibition indicator: when the unmanned control platform recognizes the deceleration indicator or the passage prohibition indicator, it determines the deceleration passage mode or the passage prohibition mode, and accordingly decelerates while passing, or stops moving and waits for the display device to display the passage permission indicator.
Further, the passage indicator includes a left-turn indicator or a right-turn indicator, and controlling the identifier display device to display the passage indicator so that the unmanned control platform recognizes it to determine the passage mode includes: controlling the identifier display device to display a left-turn or right-turn indicator, so that the unmanned control platform recognizes the left-turn or right-turn indicator to determine a left-turn or right-turn passage mode.
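A minimal sketch of the indicator-to-mode mapping described above; the string codes are hypothetical, since the patent does not prescribe a concrete encoding for the indicators:

```python
# Hypothetical indicator codes; the patent does not prescribe an encoding.
PASSAGE_MODES = {
    "ALLOW": "pass",          # passage permission indicator
    "SLOW": "decelerate",     # deceleration indicator
    "STOP": "wait",           # passage prohibition indicator
    "LEFT": "turn_left",      # left-turn indicator
    "RIGHT": "turn_right",    # right-turn indicator
}

def passage_mode(indicator):
    """Map a recognized passage indicator to a passage mode; an
    unrecognized indicator defaults to the conservative 'wait' mode."""
    return PASSAGE_MODES.get(indicator, "wait")
```

Defaulting unknown indicators to "wait" mirrors the safe behaviour of stopping and waiting for a permission indicator.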
In the management method and device for an unmanned control platform provided by this embodiment, the unmanned control platform in the environment is detected according to the sensing data, and when it is detected, the identifier display device is controlled to display a passage indicator, so that the unmanned control platform recognizes the passage indicator to determine its passage mode. Traffic management of the unmanned control platform is thus achieved by displaying a passage indicator: no complex communication system needs to be established, the reliability of control is ensured, the demand on the computing capacity of the management device is reduced, and cost is effectively lowered.
In some embodiments, the passage indicator is associated with location information, and the processor may control the identifier display device to display the passage indicator so that the unmanned control platform can recognize it to determine its own location. Specifically, a local storage device in the unmanned control platform may store correspondences between different passage indicators and location information. When the unmanned control platform recognizes a passage indicator, it may query the local storage device for the location information corresponding to that indicator and determine its own location from the queried information, for example by taking the queried location as the location of the unmanned control platform.
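The indicator-to-location lookup can be sketched as a table held in the platform's local storage; the marker names and coordinates below are purely illustrative:

```python
# Illustrative correspondence between passage indicators and positions;
# in practice this table would live in the platform's local storage.
MARKER_POSITIONS = {
    "marker_A": (12.5, 4.0),   # (x, y) in metres, hypothetical values
    "marker_B": (30.0, 4.0),
}

def locate(marker_id):
    """Return the location associated with a recognized passage
    indicator, or None when the indicator is not in the table."""
    return MARKER_POSITIONS.get(marker_id)
```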
In some embodiments, the environment includes a first channel and a second channel intersecting the first channel, and the unmanned control platform includes a first unmanned control platform moving along the first channel toward a target position and a second unmanned control platform moving along the second channel toward the target position, where the target position is the intersection of the first channel and the second channel. The identifier display device includes a first identifier display device and a second identifier display device, and controlling the identifier display device to display the passage indicator so that the unmanned control platform recognizes it to determine the passage mode includes: controlling the first identifier display device to display the passage permission indicator, so that the first unmanned control platform recognizes it to determine the passage permission mode, and controlling the second identifier display device to display the passage restriction indicator, so that the second unmanned control platform recognizes it to determine the passage restriction mode.
Specifically, as shown in FIG. 3, the environment may include a first channel 301 and a second channel 302 that intersect; their intersection may be referred to as the target position. When the first unmanned control platform 303 moves toward the target position on the first channel 301 and the second unmanned control platform 304 moves toward the target position on the second channel 302, the two platforms may collide. To prevent a collision, the processor may control the first identifier display device to display the passage permission indicator; the first unmanned control platform 303 recognizes it, determines the passage permission mode, and continues moving toward the target position. The processor controls the second identifier display device to display the passage restriction indicator; by recognizing it and determining the passage restriction mode, the second unmanned control platform 304 can slow down or stop moving.
Optionally, with continued reference to FIG. 3, the environment sensor may be located on the ground at the target position, such as at 305 as shown, and may detect the unmanned control platforms in the first and second channels from the ground. In some embodiments, the environment sensor may also be disposed at 306 as shown; this is not specifically limited here.
Further, the controlling the first identifier display device to display the passage permission indicator and the second identifier display device to display the passage restriction indicator includes: controlling, according to preset priorities of the first channel and the second channel, the first identifier display device to display the passage permission indicator and the second identifier display device to display the passage restriction indicator.
Specifically, the first channel and the second channel may be assigned preset priorities, and which passage indicator each of the first and second identifier display devices shows may be determined according to those priorities. For example, if the priority of the first channel is higher than that of the second channel, the processor may let the first unmanned control platform in the first channel pass first: it controls the first identifier display device to display the passage permission indicator and the second identifier display device to display the passage restriction indicator.
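The priority rule can be sketched as follows, assuming each channel carries a numeric priority (higher means it passes first); the indicator codes are hypothetical:

```python
def arbitrate(first_priority, second_priority):
    """Return the indicators for (first display, second display):
    the higher-priority channel is shown the permission indicator,
    the other the restriction indicator. Ties favour the first channel."""
    if first_priority >= second_priority:
        return ("ALLOW", "RESTRICT")
    return ("RESTRICT", "ALLOW")
```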
In some embodiments, the acquiring sensing data of an environment collected by an environment sensor and the detecting an unmanned control platform in the environment according to the sensing data include: acquiring sensing data of a target area in the environment collected by the environment sensor, and detecting an unmanned control platform in the target area according to that sensing data.
Specifically, in some cases the processor only cares whether an unmanned control platform is present in a target area of the environment, and it can acquire the sensing data of that target area collected by the environment sensor. Referring to FIG. 4, the target area may be the rectangular area in the figure: the sensing data of the target area is acquired, and the unmanned control platform in the target area is detected from that data. Further, referring to FIG. 5, there may be a plurality of target areas, for example 2, 3, 4, or 5, which is not specifically limited here. In this way the amount of data to process is effectively reduced and computing resources are saved.
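Restricting detection to target areas can be sketched as a rectangle test over sensed points; representing each area as an axis-aligned (xmin, ymin, xmax, ymax) tuple is an assumption for illustration:

```python
def in_any_target_area(point, areas):
    """True if a 2-D point lies inside at least one rectangular
    target area given as (xmin, ymin, xmax, ymax)."""
    x, y = point
    return any(xmin <= x <= xmax and ymin <= y <= ymax
               for (xmin, ymin, xmax, ymax) in areas)

def crop_to_targets(points, areas):
    """Drop sensed points outside every target area, so that later
    detection stages only process the regions of interest."""
    return [p for p in points if in_any_target_area(p, areas)]
```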
In some embodiments, the sensing data includes image and depth data, and the detecting an unmanned control platform in the environment from the sensing data includes: running a color-based foreground extraction algorithm on the image to obtain a detection result of a first foreground in the environment; running a depth-based foreground extraction algorithm on the depth data to determine a detection result of a second foreground in the environment; and fusing the detection result of the first foreground and the detection result of the second foreground to detect the unmanned control platform in the environment.
In particular, the sensing data may include image and depth data, where the depth data may include a depth image or a point cloud. For example, the sensor may comprise an RGB camera, which outputs an image, and a TOF camera, which outputs a depth image; alternatively, the depth image may be computed from disparity derived from the images. The processor may run a color-based foreground extraction algorithm on the image to obtain a detection result of a first foreground in the environment, which indicates whether the first foreground exists there; the first foreground includes the unmanned control platform, i.e., the processor determines from the color dimension whether an unmanned control platform exists in the environment. Further, the processor may run a depth-based foreground extraction algorithm on the depth data to obtain a detection result of a second foreground in the environment, which indicates whether the second foreground exists there; the second foreground likewise includes the unmanned control platform, i.e., the processor determines from the depth dimension whether an unmanned control platform exists. The processor may then perform a fusion calculation on the two detection results and detect the unmanned control platform from the fused result. For example, when both the detection result of the first foreground and the detection result of the second foreground indicate that an unmanned control platform exists in the environment, it is determined that the unmanned control platform is detected.
The fusion calculation may also adopt other manners, such as weighted calculation, which is not limited here. By detecting the unmanned control platform from the two dimensions of color and depth, detection accuracy is effectively improved.
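The fusion step can be sketched per pixel; the "and" rule is the example the text gives (both detectors must agree), while the weighted variant uses illustrative weights and threshold:

```python
def fuse_masks(color_fg, depth_fg, mode="and", weights=(0.5, 0.5), thresh=0.5):
    """Fuse per-pixel foreground flags from the color-based and
    depth-based detectors. 'and' marks foreground only where both
    agree; 'weighted' marks foreground where the weighted vote
    exceeds `thresh` (weights and threshold are illustrative)."""
    fused = []
    for c, d in zip(color_fg, depth_fg):
        if mode == "and":
            fused.append(c and d)
        else:
            fused.append(weights[0] * c + weights[1] * d > thresh)
    return fused
```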
Further, the obtaining a detection result of the first foreground by running a color-based foreground extraction algorithm on the image includes: acquiring a background color model corresponding to the image; and matching the image against the background color model to obtain the detection result of the first foreground in the environment.
Specifically, the processor may obtain a background color model corresponding to the image, where the model indicates the color features of the background in the image. The processor may match the image against the background color model, i.e., determine whether any pixel in the image fails to match it. When a pixel that does not match the background color model exists in the image, that pixel may be determined to be a pixel of the first foreground, and the first foreground may be determined to exist in the environment; when all pixels in the image match the background color model, the first foreground may be determined not to exist. Further, the background color model may include a color model for each pixel of the background in the image, and each pixel of the image is matched against the color model for that pixel: when a pixel does not match its color model, it is a pixel of the first foreground and the first foreground exists in the environment; when every pixel matches its color model, the first foreground does not exist.
In some embodiments, the background color model is a gaussian mixture model, where the gaussian mixture model includes a gaussian mixture distribution corresponding to each pixel in the image, and the matching the image with the background color model to obtain the detection result of the first foreground in the environment includes: and matching each pixel of the image with the Gaussian mixture corresponding to each pixel in the background color model to obtain a detection result of the first foreground in the environment.
Specifically, the background color model is a Gaussian mixture model; that is, the background is modeled by a Gaussian mixture model that includes a mixture-of-Gaussians distribution for each pixel in the image. The basic idea of modeling the background with a mixture of Gaussians is to represent the color information of each background pixel by a superposition of K Gaussian distributions, where K is usually between 3 and 5, i.e., the mixture distribution of each pixel is:

P(x_j^t) = \sum_{i=1}^{K} \hat{\omega}_{i,t} \, \eta\left(x_j^t; \hat{\mu}_{i,t}, \hat{\Sigma}_{i,t}\right)

where x_j^t denotes the value of pixel j at time t (for an RGB three-channel image, x_j is a vector, x_j = [x_{jR}, x_{jG}, x_{jB}]); \hat{\omega}_{i,t} is the estimated weight coefficient of the i-th Gaussian distribution in the mixture at time t; \hat{\mu}_{i,t} and \hat{\Sigma}_{i,t} are the mean vector and covariance matrix of the i-th Gaussian distribution at time t (assuming the red, green, and blue components are mutually independent, \hat{\Sigma}_{i,t} = \hat{\sigma}_{i,t}^2 I); and \eta is the Gaussian probability density function:

\eta(x; \mu, \Sigma) = \frac{1}{(2\pi)^{3/2} |\Sigma|^{1/2}} \exp\left(-\frac{1}{2}(x-\mu)^{T} \Sigma^{-1} (x-\mu)\right)
If the pixel value matches at least one of the K Gaussian distributions corresponding to the pixel, the pixel is a background pixel; otherwise it is classified as foreground. Further, if the distance between the pixel value and the mean of at least one of the K Gaussian distributions is less than 2.5 times the standard deviation of that distribution, the pixel value can be determined to match the color model corresponding to the pixel.
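The per-pixel matching rule above (match when the value lies within 2.5 standard deviations of some component's mean) can be sketched as follows, representing each of the K Gaussians by per-channel (mean, standard deviation) under the channel-independence assumption:

```python
def matches_background(pixel, gaussians, k_sigma=2.5):
    """Return True (background) if the pixel value lies within
    k_sigma standard deviations of the mean of at least one of the
    K Gaussian components modelling that pixel; otherwise the pixel
    is classified as foreground. Each component is given as a
    (mean, std) pair per RGB channel, channels assumed independent."""
    for mean, std in gaussians:
        if all(abs(p - m) <= k_sigma * s
               for p, m, s in zip(pixel, mean, std)):
            return True   # matched one mixture component: background
    return False          # matched no component: foreground
```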
In some embodiments, said determining a detection result of a second foreground in the environment by running a depth-based foreground extraction algorithm on the depth data comprises: obtaining a background depth model corresponding to the depth data, wherein the background depth model is used for indicating a depth range of a background in the environment; and matching the depth data with the background depth model to obtain a detection result of a second foreground in the environment.
In particular, the processor may obtain a background depth model corresponding to the depth data, where the background depth model indicates the depth range of the background in the environment, and the depth data may include a point cloud or a depth image. For example, if the depth data includes a depth image, the background depth model includes the depth range of the background at each pixel in the depth image, that is, the range of depths a pixel takes when it is a pixel of the background. The depth data is matched against the background depth model to obtain the detection result of the second foreground in the environment: when depth data that does not match the background depth model exists in the depth data, the second foreground exists in the environment; when all of the depth data matches the background depth model, the second foreground does not exist in the environment.
Further, the depth data includes a depth image, the background depth model is used to indicate a depth range corresponding to each pixel of the background in the depth image, and the matching the depth data and the background depth model to obtain the detection result of the second foreground in the environment includes: and matching the depth of each pixel in the depth image with the depth range corresponding to each pixel in the background depth model to obtain a detection result of a second foreground in the environment.
Specifically, the processor matches the depth corresponding to each pixel in the depth image against the depth range for that pixel to determine the detection result in the environment. Further, when the depth of a pixel in the depth image falls outside the depth range corresponding to that pixel, it may be determined that the pixel is a pixel of the second foreground and that the second foreground exists in the environment; when the depths of all pixels in the depth image fall within their corresponding depth ranges, no pixel corresponding to the second foreground exists in the depth image, and the second foreground does not exist in the environment.
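A minimal sketch of this per-pixel depth matching, assuming the background depth model is stored as per-pixel minimum and maximum depth arrays (the names are illustrative):

```python
import numpy as np

def detect_second_foreground(depth_image, bg_min, bg_max):
    """Match each pixel's depth against its background depth range.

    depth_image:    (H, W) depth value per pixel
    bg_min, bg_max: (H, W) per-pixel background depth range
    Returns (foreground_mask, found): pixels whose depth falls outside
    the background range are candidate second-foreground pixels.
    """
    in_background = (depth_image >= bg_min) & (depth_image <= bg_max)
    foreground_mask = ~in_background
    return foreground_mask, bool(foreground_mask.any())
```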
The depth data of the environment may be output directly by an environment sensor; for example, the environment sensor may include a TOF camera or a radar, either of which can output depth data directly. In some embodiments, the depth data is calculated from the sensing data; for example, the sensing data includes an image from which the depth data may be calculated.
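The patent does not fix how depth is computed from images; one common assumption is a calibrated stereo pair, where depth follows from disparity as Z = f * B / d. A sketch under that assumption (names and calibration values are illustrative):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Stereo triangulation: depth Z = f * B / d, where f is the focal
    length in pixels, B the camera baseline in meters, and d the
    disparity in pixels between the left and right images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

For example, with a 500 px focal length and 0.1 m baseline, a 50 px disparity corresponds to a depth of about 1 m.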
An embodiment of the present invention provides a management device for an unmanned control platform. Fig. 6 is a block diagram of the management device of the unmanned control platform according to an embodiment of the present invention. The management device of the unmanned control platform includes an environment sensor 601, an identifier display device 602, and a processor 603, where,
the processor 603 is configured to perform the following operations:
acquiring sensing data of an environment acquired by an environment sensor 601, and detecting an unmanned control platform in the environment according to the sensing data;
when the unmanned control platform is detected, controlling the identifier display device 602 to display a passage indication identifier, so that the unmanned control platform identifies the passage indication identifier to determine the passage mode of the unmanned control platform.
In some embodiments, the pass identifier includes a pass-allowed identifier, and the processor controls the identifier display device to display the pass indicator, so that when the unmanned control platform identifies the pass indicator to determine the pass mode, the processor is specifically configured to:
the control mark display device displays the passage allowing mark, so that the unmanned control platform identifies the passage allowing mark to determine the passage allowing mode.
In some embodiments, the pass identifier includes a restricted pass identifier, and the processor controls the identifier display device to display the pass indicator, so that when the unmanned control platform identifies the pass indicator to determine the pass mode, the processor is specifically configured to:
the control mark display device displays the passage limiting mark, so that the unmanned control platform identifies the passage limiting mark to determine the passage limiting mode.
In some embodiments, the environment includes a first channel and a second channel intersecting the first channel, and the unmanned control platform includes: a first unmanned control platform moving in the first channel in a direction approaching a target position and a second unmanned control platform moving in the second channel in a direction approaching the target position, where the target position is the intersection of the first channel and the second channel, and the identification display device includes a first identification display device and a second identification display device,
the processor controls the identification display device to display a passing indication identification, so that when the unmanned control platform identifies the passing indication identification to determine the passing mode of the unmanned control platform, the processor is specifically configured to:
the first identification display device is controlled to display the passage allowing identification, so that the first unmanned control platform identifies the passage allowing identification to determine the passage allowing mode, and the second identification display device is controlled to display the passage limiting identification, so that the second unmanned control platform identifies the passage limiting identification to determine the passage limiting mode.
In some embodiments, when the processor controls the first identifier display device to display the permitted passage identifier and controls the second identifier display device to display the restricted passage identifier, the processor is specifically configured to:
and controlling the first identification display device to display the passage permission identification and controlling the second identification display device to display the passage restriction identification according to the preset priorities of the first channel and the second channel.
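The priority rule above can be sketched as follows; the identifier values, tie-breaking rule, and function name are illustrative, not defined by the patent:

```python
def set_intersection_identifiers(first_priority, second_priority):
    """Choose which identifier each display shows at the intersection.

    The channel with the higher preset priority gets the
    permitted-passage identifier; the other channel gets the
    restricted-passage identifier (ties favor the first channel here,
    an arbitrary choice).
    """
    if first_priority >= second_priority:
        return {"first_display": "PERMIT", "second_display": "RESTRICT"}
    return {"first_display": "RESTRICT", "second_display": "PERMIT"}
```

The platform approaching on the restricted channel would then slow down or stop, while the other platform passes through the intersection first.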
In some embodiments, the transit indication identifier is associated with location information, and the processor is further configured to:
the control mark display device displays the passing indication mark so that the unmanned control platform can identify the passing indication mark to determine the position information of the unmanned control platform.
In certain embodiments, the restricted-traffic mode comprises a slowed-traffic mode or a prohibited-traffic mode.
In some embodiments, the processor acquires sensing data of an environment acquired by an environment sensor, and when detecting an unmanned control platform in the environment according to the sensing data, the processor is specifically configured to:
the method comprises the steps of acquiring sensing data in a target area in the environment acquired by an environment sensor, and detecting an unmanned control platform of the target area according to the sensing data in the target area;
the controlling, when the unmanned control platform is detected, the mark display device to display the passing indication mark includes:
and when the unmanned control platform is detected in the target area, controlling the mark display device to display the passing indication mark.
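Restricting detection to a target area amounts to cropping the sensing data to a region of interest before running the detector; a sketch with illustrative names:

```python
import numpy as np

def crop_target_area(sensing_frame, roi):
    """Keep only the sensing data inside the target area.

    sensing_frame: (H, W) or (H, W, C) array from the environment sensor
    roi:           (top, left, bottom, right) bounds of the target area
    """
    top, left, bottom, right = roi
    return sensing_frame[top:bottom, left:right]
```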
In some embodiments, the sensory data includes image and depth data,
when the processor detects the unmanned control platform in the environment according to the sensing data, the processor is specifically configured to:
running a color-based foreground extraction algorithm according to the image to obtain a detection result of a first foreground in the environment;
determining a detection result of a second foreground in the environment by operating a depth-based foreground extraction algorithm according to the depth data;
and fusing the detection result of the first foreground and the detection result of the second foreground to detect the unmanned control platform in the environment.
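The patent does not specify the fusion rule; one simple possibility is a logical AND of the two foreground masks, keeping only pixels flagged by both the color-based and depth-based detectors:

```python
import numpy as np

def fuse_foreground_masks(color_mask, depth_mask):
    """Fuse the first (color-based) and second (depth-based) foreground
    detection results. Requiring agreement of both cues (logical AND)
    suppresses false positives from either single detector; this rule
    is an assumption, not mandated by the patent."""
    fused = color_mask & depth_mask
    return fused, bool(fused.any())
```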
In some embodiments, when the processor runs a color-based foreground extraction algorithm according to the image to obtain a detection result of the first foreground in the image, the processor is specifically configured to:
acquiring a background color model corresponding to the image;
and matching the image with the background color model to obtain a detection result of a first foreground in the environment.
In some embodiments, the background color model is a gaussian mixture model;
when the processor matches the image with the background color model to obtain a detection result of the first foreground in the environment, the processor is specifically configured to:
and matching each pixel of the image with the Gaussian mixture corresponding to each pixel in the background color model to obtain a detection result of the first foreground in the environment.
In some embodiments, when the processor determines the detection result of the second foreground in the environment by running a depth-based foreground extraction algorithm according to the depth data, the processor is specifically configured to:
obtaining a background depth model corresponding to the depth data, wherein the background depth model is used for indicating a depth range of a background in the environment;
and matching the depth data with the background depth model to obtain a detection result of a second foreground in the environment.
It is to be understood that reference may be made to the foregoing related portions for specific principles and explanation of the method, which are not repeated herein.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device) or a processor (processor) to execute some steps of the methods according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It is obvious to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be performed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to perform all or part of the above described functions. For the specific working process of the device described above, reference may be made to the corresponding process in the foregoing method embodiment, which is not described herein again.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (25)

1. A management method of an unmanned control platform is characterized by comprising the following steps:
acquiring sensing data of an environment acquired by an environment sensor, and detecting an unmanned control platform in the environment according to the sensing data;
when the unmanned control platform is detected, the control mark display device displays the passing indication mark, so that the unmanned control platform identifies the passing indication mark to determine the passing mode of the unmanned control platform.
2. The method of claim 1, wherein the pass indicator comprises a permit pass indicator, and wherein the control indicator display device displays a pass indicator such that the unmanned control platform identifies the pass indicator to determine the pass mode comprises:
the control mark display device displays the passage allowing mark, so that the unmanned control platform identifies the passage allowing mark to determine the passage allowing mode.
3. The method of claim 1 or 2, wherein the pass indicators comprise restricted pass indicators, and the control indicator display device displays the pass indicators such that the unmanned control platform identifies the pass indicators to determine the pass mode comprises:
the control mark display device displays the passage limiting mark, so that the unmanned control platform identifies the passage limiting mark to determine the passage limiting mode.
4. The method of any of claims 1-3, wherein the environment includes a first channel and a second channel intersecting the first channel, the unmanned control platform including: a first unmanned control platform moving in the first channel in a direction approaching a target position and a second unmanned control platform moving in the second channel in a direction approaching the target position, wherein the target position is the intersection of the first channel and the second channel, the mark display device comprises a first mark display device and a second mark display device,
the control mark display device displays a passing indication mark, so that the unmanned control platform identifies the passing indication mark to determine the passing mode of the unmanned control platform comprises the following steps:
the first identification display device is controlled to display the passage allowing identification, so that the first unmanned control platform identifies the passage allowing identification to determine the passage allowing mode, and the second identification display device is controlled to display the passage limiting identification, so that the second unmanned control platform identifies the passage limiting identification to determine the passage limiting mode.
5. The method of claim 4, wherein controlling the first sign display device to display the passage permission sign and controlling the second sign display device to display the restricted passage sign comprises:
and controlling the first identification display device to display the passage permission identification and controlling the second identification display device to display the passage restriction identification according to the preset priorities of the first channel and the second channel.
6. The method of claim 1, wherein the traffic indication identifier is associated with location information, the method further comprising:
the control mark display device displays the passing indication mark so that the unmanned control platform can identify the passing indication mark to determine the position information of the unmanned control platform.
7. The method of claim 4 or 5, wherein the restricted-traffic mode comprises a slowed-traffic mode or a no-traffic mode.
8. The method of any one of claims 1-7, wherein the acquiring sensing data of an environment acquired by an environment sensor, and detecting an unmanned control platform in the environment according to the sensing data comprises:
the method comprises the steps of acquiring sensing data in a target area in the environment acquired by an environment sensor, and detecting an unmanned control platform of the target area according to the sensing data in the target area;
the controlling, when the unmanned control platform is detected, the mark display device to display the passing indication mark includes:
and when the unmanned control platform is detected in the target area, controlling the mark display device to display the passing indication mark.
9. The method of any of claims 1-8, wherein the sensory data comprises image and depth data,
the detecting an unmanned control platform in the environment from the sensory data comprises:
running a color-based foreground extraction algorithm according to the image to obtain a detection result of a first foreground in the environment;
determining a detection result of a second foreground in the environment by operating a depth-based foreground extraction algorithm according to the depth data;
and fusing the detection result of the first foreground and the detection result of the second foreground to detect the unmanned control platform in the environment.
10. The method of claim 9, wherein the performing a color-based foreground extraction algorithm on the image to obtain a detection result of the first foreground in the image comprises:
acquiring a background color model corresponding to the image;
and matching the image with the background color model to obtain a detection result of a first foreground in the environment.
11. The method of claim 10, wherein the background color model is a Gaussian mixture model, wherein the Gaussian mixture model includes a Gaussian mixture distribution corresponding to each pixel in the image,
the matching the image with the background color model to obtain a detection result of a first foreground in the environment comprises:
and matching each pixel of the image with the Gaussian mixture corresponding to each pixel in the background color model to obtain a detection result of the first foreground in the environment.
12. The method of any of claims 9-11, wherein the determining a detection result of the second foreground in the environment by running a depth-based foreground extraction algorithm on the depth data comprises:
obtaining a background depth model corresponding to the depth data, wherein the background depth model is used for indicating a depth range of a background in the environment;
and matching the depth data with the background depth model to obtain a detection result of a second foreground in the environment.
13. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 12.
14. A management apparatus of an unmanned control platform, comprising: an environment sensor, an identifier display device, and a processor,
the processor is configured to perform the following operations:
acquiring sensing data of an environment acquired by an environment sensor, and detecting an unmanned control platform in the environment according to the sensing data;
when the unmanned control platform is detected, the control mark display device displays the passing indication mark, so that the unmanned control platform identifies the passing indication mark to determine the passing mode of the unmanned control platform.
15. The apparatus according to claim 14, wherein the passage identifier comprises a passage-allowed identifier, and the processor controls the identifier display device to display a passage indicator, so that the unmanned control platform is specifically configured to, when identifying the passage indicator to determine the passage mode:
the control mark display device displays the passage allowing mark, so that the unmanned control platform identifies the passage allowing mark to determine the passage allowing mode.
16. The apparatus according to claim 14 or 15, wherein the passage identifier comprises a restricted passage identifier, and the processor controls the identifier display device to display the passage indicator, so that the unmanned control platform is specifically configured to, when identifying the passage indicator to determine the passage mode:
the control mark display device displays the passage limiting mark, so that the unmanned control platform identifies the passage limiting mark to determine the passage limiting mode.
17. The apparatus of any of claims 14-16, wherein the environment includes a first channel and a second channel intersecting the first channel, the unmanned control platform comprising: a first unmanned control platform moving to the direction close to the target position in the first channel and a second unmanned control platform moving to the direction close to the target position in the second channel, wherein the target position is the intersection of the first channel and the second channel, the mark display device comprises a first mark display device and a second mark display device,
the processor controls the identification display device to display a passing indication identification, so that when the unmanned control platform identifies the passing indication identification to determine the passing mode of the unmanned control platform, the processor is specifically configured to:
the first identification display device is controlled to display the passage allowing identification, so that the first unmanned control platform identifies the passage allowing identification to determine the passage allowing mode, and the second identification display device is controlled to display the passage limiting identification, so that the second unmanned control platform identifies the passage limiting identification to determine the passage limiting mode.
18. The device according to claim 17, wherein the processor is configured to, when controlling the first sign displaying unit to display the passage permission sign and controlling the second sign displaying unit to display the passage restriction sign:
and controlling the first identification display device to display the passage permission identification and controlling the second identification display device to display the passage restriction identification according to the preset priorities of the first channel and the second channel.
19. The device of claim 14, wherein the transit indication identifier is associated with location information, and wherein the processor is further configured to:
the control mark display device displays the passing indication mark so that the unmanned control platform can identify the passing indication mark to determine the position information of the unmanned control platform.
20. The apparatus of claim 17 or 18, wherein the restricted-traffic mode comprises a slowed-traffic mode or a no-traffic mode.
21. The apparatus according to any of claims 14-20, wherein when the processor acquires sensing data of an environment acquired by an environment sensor and detects an unmanned control platform in the environment according to the sensing data, the processor is specifically configured to:
the method comprises the steps of acquiring sensing data in a target area in the environment acquired by an environment sensor, and detecting an unmanned control platform of the target area according to the sensing data in the target area;
the controlling, when the unmanned control platform is detected, the mark display device to display the passing indication mark includes:
and when the unmanned control platform is detected in the target area, controlling the mark display device to display the passing indication mark.
22. The apparatus of any of claims 14-21, wherein the sensory data comprises image and depth data,
when the processor detects the unmanned control platform in the environment according to the sensing data, the processor is specifically configured to:
running a color-based foreground extraction algorithm according to the image to obtain a detection result of a first foreground in the environment;
determining a detection result of a second foreground in the environment by operating a depth-based foreground extraction algorithm according to the depth data;
and fusing the detection result of the first foreground and the detection result of the second foreground to detect the unmanned control platform in the environment.
23. The device according to claim 22, wherein when the processor runs a color-based foreground extraction algorithm on the image to obtain the detection result of the first foreground in the image, the processor is specifically configured to:
acquiring a background color model corresponding to the image;
and matching the image with the background color model to obtain a detection result of a first foreground in the environment.
24. The apparatus of claim 23, wherein the background color model is a Gaussian mixture model, wherein the Gaussian mixture model includes a Gaussian mixture distribution corresponding to each pixel in the image,
when the processor matches the image with the background color model to obtain a detection result of the first foreground in the environment, the processor is specifically configured to:
and matching each pixel of the image with the Gaussian mixture corresponding to each pixel in the background color model to obtain a detection result of the first foreground in the environment.
25. The device according to any of claims 22-24, wherein the processor, when running a depth-based foreground extraction algorithm on the depth data to determine the detection result of the second foreground in the environment, is specifically configured to:
obtaining a background depth model corresponding to the depth data, wherein the background depth model is used for indicating a depth range of a background in the environment;
and matching the depth data with the background depth model to obtain a detection result of a second foreground in the environment.
CN201880071487.9A 2018-12-29 2018-12-29 Management method and equipment of unmanned control platform Active CN111344649B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/125482 WO2020133387A1 (en) 2018-12-29 2018-12-29 Unmanned control platform management method and device

Publications (2)

Publication Number Publication Date
CN111344649A true CN111344649A (en) 2020-06-26
CN111344649B CN111344649B (en) 2024-06-14

Family

ID=71128505

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880071487.9A Active CN111344649B (en) 2018-12-29 2018-12-29 Management method and equipment of unmanned control platform

Country Status (2)

Country Link
CN (1) CN111344649B (en)
WO (1) WO2020133387A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1218355A (en) * 1998-11-24 1999-06-02 杨更新 Automatic driving system of vehicle
US20010029428A1 (en) * 2000-03-15 2001-10-11 Bernd Huertgen Navigation system
CN101739841A (en) * 2009-12-22 2010-06-16 上海久银车库工程有限公司 Intelligent comprehensive monitoring and security integrated management system for parking garage
CN103000035A (en) * 2012-11-22 2013-03-27 北京交通大学 Information acquisition release system and method for guiding left-hand turning vehicle to pass through intersection
CN104345734A (en) * 2013-08-07 2015-02-11 苏州宝时得电动工具有限公司 Automatic working system, automatic walking equipment and control method thereof
CN105652889A (en) * 2016-04-12 2016-06-08 谭圆圆 Remote control device of unmanned aerial vehicle and control method of remote control device
CN105841694A (en) * 2016-06-14 2016-08-10 杨珊珊 Beacon navigation device of unmanned vehicle, beacons and navigation method of beacon navigation device of unmanned vehicle

Also Published As

Publication number Publication date
CN111344649B (en) 2024-06-14
WO2020133387A1 (en) 2020-07-02

Similar Documents

Publication Publication Date Title
EP3581890B1 (en) Method and device for positioning
CN110287276A (en) High-precision map updating method, device and storage medium
US20170193430A1 (en) Restocking shelves based on image data
JP2021530394A (en) Smart driving control methods and devices, vehicles, electronic devices, and storage media
US20220343758A1 (en) Data Transmission Method and Apparatus
CN111856963A (en) Parking simulation method and device based on vehicle-mounted looking-around system
US20210272314A1 (en) Queuing recommendation method and device, terminal and computer readable storage medium
CN112735253B (en) Traffic light automatic labeling method and computer equipment
CN113887418A (en) Method and device for detecting illegal driving of vehicle, electronic equipment and storage medium
CN115719436A (en) Model training method, target detection method, device, equipment and storage medium
CN113859264A (en) Vehicle control method, device, electronic device and storage medium
CN111160132B (en) Method and device for determining lane where obstacle is located, electronic equipment and storage medium
CN114167404A (en) Target tracking method and device
CN111401190A (en) Vehicle detection method, device, computer equipment and storage medium
CN111553339A (en) Image unit determination method, small target detection method and computer equipment
CN112863195A (en) Vehicle state determination method and device
CN113091737A (en) Vehicle-road cooperative positioning method and device, automatic driving vehicle and road side equipment
CN112189176B (en) Multi-machine operation route planning method, control terminal and computer readable storage medium
CN111344649A (en) Management method and equipment of unmanned control platform
CN112639655A (en) Control method and device for return flight of unmanned aerial vehicle, movable platform and storage medium
US20220397676A1 (en) Aircraft identification
CN115762230A (en) Parking lot intelligent guiding method and device based on remaining parking space amount prediction
CN115019546A (en) Parking prompting method and device, electronic equipment and storage medium
US11721111B2 (en) Systems and methods for detecting objects in an image of an environment
CN113762030A (en) Data processing method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant