CN111487984A - Equipment control method and device and electronic equipment - Google Patents


Info

Publication number
CN111487984A
CN111487984A (application CN202010539728.7A; granted as CN111487984B)
Authority
CN
China
Prior art keywords
data
current position
passing
blind area
area
Prior art date
Legal status
Granted
Application number
CN202010539728.7A
Other languages
Chinese (zh)
Other versions
CN111487984B (en)
Inventor
张浩
支涛
Current Assignee
Beijing Yunji Technology Co Ltd
Original Assignee
Beijing Yunji Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Yunji Technology Co Ltd
Priority to CN202010539728.7A
Publication of CN111487984A
Application granted
Publication of CN111487984B
Legal status: Active
Anticipated expiration


Classifications

    • G05D 1/0236 — Control of position or course in two dimensions, specially adapted to land vehicles, using optical markers or beacons in combination with a laser
    • G01C 21/32 — Structuring or formatting of map data
    • G05D 1/0214 — Desired-trajectory control in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D 1/0223 — Desired-trajectory control involving speed control of the vehicle
    • G05D 1/0242 — Optical position detection using non-visible light signals, e.g. IR or UV signals
    • G05D 1/0255 — Position or course control using acoustic signals, e.g. ultrasonic signals
    • G05D 1/0276 — Control using signals provided by a source external to the vehicle
    • G05D 1/0278 — Control using satellite positioning signals, e.g. GPS

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Acoustics & Sound (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application provides a device control method, a device control apparatus, and an electronic device, addressing the prior-art problems that manually marking deceleration areas on the map a robot runs on increases labor cost and makes it hard to update the map data promptly after the map environment changes. The method comprises the following steps: acquiring passing area data according to pre-stored map data; acquiring current position data, and generating an effective detection range corresponding to the current position according to the current position data and the passing area data; acquiring detection data, and screening passing blind area data from the passing area data according to the effective detection range and the detection data; and generating a control instruction according to a preset path, the current position data, and the passing blind area data.

Description

Equipment control method and device and electronic equipment
Technical Field
The application relates to the field of robot control, in particular to a device control method and device and an electronic device.
Background
In the robotics field, sensors that can provide obstacle-avoidance information include laser range finders, ultrasonic detectors, depth vision sensors, infrared sensors, and the like. These sensors have a limited range of action and are only effective within certain heights, angles, and distances. They are easily occluded by obstacles such as walls, so the space behind those obstacles cannot be detected, forming a detection blind area for the robot. If an object in the blind area approaches the robot quickly, the robot's sensors cannot detect it in time, no obstacle-avoidance action can be taken in advance, and the object and the robot may collide. The existing scheme is to manually mark deceleration areas on the map the robot runs on so that the robot decelerates and crawls when passing a marked blind area. Manually marking the map increases the labor cost of robot deployment and makes it hard to update the map data promptly after the map environment changes.
Disclosure of Invention
An object of the embodiments of the present application is to provide a device control method, a device control apparatus, and an electronic device, so as to solve the prior-art problems that manually marking deceleration areas on the map a robot runs on increases labor cost and makes it hard to update the map data promptly after the map environment changes.
In a first aspect, an embodiment of the present invention provides a device control method, including: acquiring passing area data according to pre-stored map data; acquiring current position data, and generating an effective detection range corresponding to the current position according to the current position data and the passing area data; acquiring detection data, and screening passing blind area data from the passing area data according to the effective detection range and the detection data; and generating a control instruction according to a preset path, the current position data, and the passing blind area data.
In an embodiment, the obtaining the passing area information according to the pre-stored map data includes: acquiring pre-stored map data, wherein the pre-stored map data comprises gray values of all coordinate points; and extracting the passing area according to the gray value and the gray threshold value to generate passing area information.
In one embodiment, acquiring current position data and generating an effective detection range corresponding to the current position according to the current position data and the passing area data includes: determining a coordinate point corresponding to the current position in the passing area according to the current position data and the passing area; and generating an effective detection range around the coordinate point according to the coordinate point and a preset detection length.
In one embodiment, the acquiring the detection data, and screening the passing blind area data from the passing area data according to the effective detection range and the detection data includes: mapping the detection data to a valid detection range; and according to the detection data, removing the detected area from the effective detection range to generate passing blind area data.
In an embodiment, generating a control instruction according to the preset path, the current position data, and the passing blind area data includes: acquiring path information of the preset path, and judging whether the preset path passes through the blind area according to the path information and the passing blind area data; if so, calculating first distance data between the current position and the blind area according to the current position data and the passing blind area data; judging whether the first distance data is smaller than a first threshold; and if so, generating a deceleration instruction.
In an embodiment, after the obtaining the path information of the preset path and determining whether the preset path passes through the blind area according to the path information and the blind area data, the method further includes: if not, calculating second distance data between the preset path and the blind area; judging whether the second distance data is smaller than a second threshold value; if yes, a deceleration command is generated.
In a second aspect, an embodiment of the present invention provides a device control apparatus, including: a first acquisition module for acquiring passing area data according to pre-stored map data; a first generation module for acquiring current position data and generating an effective detection range corresponding to the current position according to the current position data and the passing area data; a first screening module for acquiring detection data and screening passing blind area data from the passing area data according to the effective detection range and the detection data; and a second generation module for generating a control instruction according to the preset path, the current position data, and the passing blind area data.
In an embodiment, the first obtaining module is further configured to: acquiring pre-stored map data, wherein the pre-stored map data comprises gray values of all coordinate points; and extracting the passing area according to the gray value and the gray threshold value to generate passing area information.
In an embodiment, the first generating module further includes: determining a coordinate point corresponding to the current position in the passing area according to the current position data and the passing area; and generating an effective detection range related to the coordinate point according to the coordinate point and a preset detection length.
In an embodiment, the first filtering module is further configured to: mapping the detection data to a valid detection range; and according to the detection data, removing the detected area from the effective detection range to generate passing blind area data.
In an embodiment, the second generating module is further configured to: acquiring path information of a preset path, and judging whether the preset path passes through a blind area or not according to the path information and passing blind area data; if so, calculating first distance data between the current position and the blind area according to the current position data and the passing blind area data; judging whether the first distance data is smaller than a first threshold value; if yes, a deceleration command is generated.
In an embodiment, the second generating module is further configured to: if the preset path does not pass through the blind area, calculate second distance data between the preset path and the blind area; judge whether the second distance data is smaller than the second threshold; and if so, generate a deceleration instruction.
In a third aspect, an embodiment of the present invention provides an electronic device, including: a memory for storing a computer program; a processor for performing the method of any of the preceding embodiments.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings required in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting the scope; those skilled in the art can obtain other related drawings from them without inventive effort.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a device application scenario diagram provided in the embodiment of the present application;
fig. 3 is a schematic flowchart of an apparatus control method according to an embodiment of the present application;
FIG. 4 is a schematic flowchart of another device control method according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an apparatus control device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. In the description of the present application, the terms "first," "second," and the like are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
As shown in fig. 1, the present embodiment provides an electronic apparatus 1 including: at least one processor 11 and a memory 12, one processor being exemplified in fig. 1. The processor 11 and the memory 12 are connected by a bus 10, and the memory 12 stores instructions executable by the processor 11 and the instructions are executed by the processor 11.
In an embodiment, the electronic device 1 is configured to determine, from a map of the robot's operating environment and the current detection data, whether a blind area exists on the current motion path, and to control the robot to decelerate according to the calculated distance to the blind area. The electronic device 1 may be a delivery robot, a tracking robot, or an automatic path-finding robot.
Referring to fig. 2, which is an application scene diagram of the device according to this embodiment, the electronic device 1 may divide the map into the following areas C1, C2, C3, C4, C5, and C6 (excluding physical obstacles such as walls) according to the static map and the data scanned by the laser. These areas together are the areas that the laser sensor is able to detect. Wherein C1, C5, C6 are unknown areas, C3 is the area that the laser can detect under the current environment, C2, C4 are because the blind area that barrier such as wall caused, and the robot can not detect but other objects can pass.
Please refer to fig. 3, which is a method for controlling a device according to this embodiment, and the method can be executed by the electronic device 1 shown in fig. 1 and applied to the application scenario shown in fig. 2 to determine whether a blind area exists on a current movement path according to a map of a robot operating environment and current detection data, and control the robot to decelerate according to a calculated distance from the blind area. The method comprises the following steps:
Step 301: and acquiring passing area data according to the pre-stored map data.
In this step, the pre-stored map data may be a static map for the robot to recognize, created using SLAM (Simultaneous Localization And Mapping).
Step 302: and acquiring current position data, and generating an effective detection range corresponding to the current position according to the current position data and the passing area data.
In this step, the effective detection range is generated according to a certain radius length or diagonal length by taking a point mapped on a map at the current position of the robot as a center, and the range can be adjusted according to the moving speed per hour and the braking efficiency of the robot, so that when the blind area appears at the edge of the effective detection range, the robot is timely decelerated to a safe speed or the position after emergency braking does not enter the blind area.
In one embodiment, the positioning sensors may further include a Global Navigation Satellite System (GNSS) receiver, an odometer, an Inertial Measurement Unit (IMU), a Global Positioning System (GPS) receiver, and the like.
Step 303: and acquiring detection data, and screening passing blind area data from the passing area data according to the effective detection range and the detection data.
In this step, according to the detection data detected by the sensor, the effective detection range and the passing area are combined, and whether the passing area covered by the effective detection range is actually detected by the sensor or not can be screened out. If not, the passing area which is not detected and enters the effective detection area is determined to be the blind area by combining the static map. In one embodiment, the sensors that may be used to provide the detection data are laser range finders, ultrasonic detectors, depth vision sensors, infrared sensors, and the like.
Step 304: and generating a control instruction according to the preset path, the current position data and the passing blind area data.
In this step, the preset path is a movement route set by the robot when executing the current movement task, the movement route enables the robot to move in the passing area, and when the robot approaches the blind area, a deceleration command is generated to control the robot to decelerate and crawl.
Please refer to fig. 4, which is another device control method provided in this embodiment, and the method can be executed by the electronic device 1 shown in fig. 1 and applied to the application scenario shown in fig. 2 to determine whether a blind area exists on a current action path according to a map of a robot operating environment and current detection data, and control the robot to decelerate according to a calculated distance from the blind area. The method comprises the following steps:
step 401: and acquiring pre-stored map data, wherein the pre-stored map data comprises the gray value of each coordinate point.
In this step, the passing area and the non-passing area can be distinguished by the pixel value of each point in the pixel map. The non-passing area may include an obstacle area and an unknown area: when the robot scans the current scene, the sensor determines the positions of obstacles and marks them when the map is generated, and unknown areas that cannot be scanned arise because the sensor's scanning rays are blocked by obstacles.
In one embodiment, when the sensor of the robot scans the passing area, once the sensor scans the obstacle, an outline representing the obstacle is generated, and due to the blocking of the obstacle, the sensor cannot scan the area behind the obstacle, namely, an unknown area is generated, so that the outline is an isolation line between the passing area and the unknown area.
Step 402: and extracting the passing area according to the gray value and the gray threshold value to generate passing area information.
In this step, I(x, y) denotes the gray value of the pixel at coordinate (x, y). In one embodiment, a pixel represents an initial non-passing area when I(x, y) is less than 255 and an initial passing area when I(x, y) equals 255. In one embodiment, the distance represented by one pixel may be 0.02 m or 0.05 m.
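As a minimal sketch of this thresholding step (the function name, map values, and threshold default are illustrative assumptions, not specified by the patent), the passing area can be extracted from a grayscale map as a boolean mask:

```python
import numpy as np

def extract_passing_area(gray_map: np.ndarray, gray_threshold: int = 255) -> np.ndarray:
    # Pixels whose gray value reaches the threshold are treated as passable;
    # darker pixels are obstacle or unknown areas.
    return gray_map >= gray_threshold

# Illustrative map: 255 = free, 205 = unknown, 0 = obstacle
grid = np.array([[255, 255, 205, 0],
                 [255, 255, 205, 0],
                 [255, 255, 255, 255]], dtype=np.uint8)
mask = extract_passing_area(grid)
print(int(mask.sum()))  # 8 passable pixels
```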
Step 403: and determining a coordinate point corresponding to the current position in the passing area according to the current position data and the passing area.
In the step, the current position data is obtained according to the positioning sensor, and the current position of the robot is mapped to the static map by combining the static map, so that the coordinate point of the robot at the current position is determined.
Step 404: and generating an effective detection range related to the coordinate point according to the coordinate point and a preset detection length.
In this step, the effective detection range is generated according to a certain radius length or diagonal length by taking a point mapped on a map at the current position of the robot as a center, and the range can be adjusted according to the moving speed per hour and the braking efficiency of the robot, so that when the blind area appears at the edge of the effective detection range, the robot is timely decelerated to a safe speed or the position after emergency braking does not enter the blind area.
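The step above can be sketched as follows. The stopping-distance formula (v²/2a plus a safety margin) and all parameter names are assumptions for illustration, since the patent only states that the radius should be adjustable by the robot's speed and braking efficiency:

```python
import numpy as np

def effective_detection_range(passing_mask: np.ndarray,
                              center: tuple,
                              speed_mps: float,
                              brake_decel: float = 1.0,
                              margin_m: float = 0.5,
                              resolution: float = 0.05) -> np.ndarray:
    # Radius = stopping distance v^2 / (2a) plus a safety margin, in meters,
    # converted to pixels at `resolution` meters per pixel.
    radius_px = (speed_mps ** 2 / (2.0 * brake_decel) + margin_m) / resolution
    h, w = passing_mask.shape
    yy, xx = np.ogrid[:h, :w]
    cy, cx = center
    disk = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius_px ** 2
    # Only passable cells inside the disk belong to the effective range.
    return disk & passing_mask

rng = effective_detection_range(np.ones((5, 5), dtype=bool), (2, 2), speed_mps=0.5)
print(int(rng.sum()))  # the radius exceeds this small map, so all 25 cells
```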
Step 405: the detection data is mapped to the effective detection range.
In this step, the detection data is what the sensor detects of the area to be traveled within a certain range while the robot operates in the passing area. The detection data is mapped to the effective detection range to determine whether all passing areas within the effective detection range have been detected; any passing area within the effective detection range that has not been detected is a blind area.
Step 406: and according to the detection data, removing the detected area from the effective detection range to generate passing blind area data.
In this step, if the traffic zone in the effective detection range is not detected, it is determined that the traffic zone is a blind zone.
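A sketch of this screening step (the mask names are illustrative): the passing blind area is simply the effective detection range with the actually detected cells removed:

```python
import numpy as np

def passing_blind_area(effective_range: np.ndarray, detected: np.ndarray) -> np.ndarray:
    # Cells the sensor should cover (inside the effective range) but did not
    # actually detect are classified as passing blind area.
    return effective_range & ~detected

effective = np.array([[True, True, True],
                      [True, True, True]])
detected = np.array([[True, True, False],
                     [True, False, False]])
blind = passing_blind_area(effective, detected)
print(int(blind.sum()))  # 3 blind cells
```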
Step 407: and acquiring path information of the preset path, and judging whether the preset path passes through the blind area or not according to the path information and the passing blind area data. If so, go to step 408, otherwise go to step 410.
In this step, the path information of the preset path may be a reference line, the reference line is mapped in the static map, and the robot determines that the moving route of the robot follows the reference line according to the positioning sensor.
Step 408: and calculating first distance data between the current position and the blind area according to the current position data and the passing blind area data.
In this step, if the preset path passes through the blind area, it indicates that the robot needs to decelerate or stop traveling in time when approaching the blind area. The remaining distance value on the path between the current position and the blind area is calculated, namely the remaining length value of the path reference line between the current position and the blind area is calculated.
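The remaining length of the path reference line between the current position and the blind area can be computed as a polyline arc length. The waypoint representation and index convention here are assumptions for illustration:

```python
import math

def remaining_path_length(path_points, blind_index):
    # Sum segment lengths from the current position (path_points[0]) up to the
    # first waypoint that lies inside the blind area (path_points[blind_index]).
    total = 0.0
    for (x0, y0), (x1, y1) in zip(path_points[:blind_index],
                                  path_points[1:blind_index + 1]):
        total += math.hypot(x1 - x0, y1 - y0)
    return total

# Example: current position (0, 0); the blind area starts at waypoint (3, 4).
print(remaining_path_length([(0, 0), (3, 0), (3, 4)], blind_index=2))  # 7.0
```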
Step 409: and judging whether the first distance data is smaller than a first threshold value. If yes, go to step 412, otherwise return to step 408.
In this step, the first threshold may be a distance value, and according to the first distance data and the first threshold, it is determined whether the current position of the robot is close to the blind area, and if the first distance data is smaller than the first threshold, it indicates that the robot is close to the blind area.
In this step, if the first distance data is smaller than the first threshold, the robot has approached the blind area, so a deceleration instruction needs to be generated to make the robot decelerate.
Step 410: and calculating second distance data between the preset path and the blind area.
In this step, if the preset path does not pass through the blind area, it is still necessary to determine whether the preset path runs close to the blind area. For example, while moving along the path, the robot may go straight through a T-shaped intersection, and such an intersection can also produce a blind area that affects the robot's passage.
Step 411: and judging whether the second distance data is smaller than a second threshold value. If yes, go to step 412, otherwise return to step 410.
In this step, the second threshold may be a distance value, and according to the second distance data and the second threshold, it is determined whether the current position of the robot is close to the blind area, and if the second distance data is smaller than the second threshold, it indicates that the robot is close to the blind area.
Step 412: a deceleration instruction is generated.
In this step, if the first distance is smaller than the first threshold or the second distance is smaller than the second threshold, a deceleration command is generated to decelerate the robot.
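Steps 407-412 can be condensed into one decision function; the threshold defaults and return values below are illustrative assumptions:

```python
def control_command(path_crosses_blind: bool,
                    first_distance_m: float,
                    second_distance_m: float,
                    first_threshold_m: float = 2.0,
                    second_threshold_m: float = 1.0) -> str:
    # Step 407: does the preset path pass through the blind area?
    if path_crosses_blind:
        # Steps 408-409: compare remaining path length with the first threshold.
        return "decelerate" if first_distance_m < first_threshold_m else "keep_speed"
    # Steps 410-411: compare path-to-blind-area distance with the second threshold.
    return "decelerate" if second_distance_m < second_threshold_m else "keep_speed"

print(control_command(True, 1.5, 99.0))   # decelerate
print(control_command(False, 0.0, 3.0))   # keep_speed
```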
Referring to fig. 5, it is a device control apparatus 500 according to the present embodiment, where the device control apparatus 500 may be applied to the electronic device 1 shown in fig. 1 and applied to the application scenario shown in fig. 2, so as to determine whether a blind area exists on a current action path according to a map of a robot running environment and current detection data, and control the robot to decelerate according to a calculated distance from the blind area. The device control apparatus 500 includes: a first obtaining module 501, a first generating module 502, a first screening module 503, and a second generating module 504. The specific principle relationship of each module is as follows:
the first obtaining module 501 is configured to obtain data of a passing area according to pre-stored map data. Please refer to the description of step 301 in the above embodiments.
In an embodiment, the first obtaining module 501 is further configured to obtain pre-stored map data, where the pre-stored map data includes gray values of each coordinate point. And extracting the passing area according to the gray value and the gray threshold value to generate passing area information. Please refer to the description of steps 401-402 in the above embodiment.
The first generating module 502 is configured to obtain current position data, and generate an effective detection range corresponding to the current position according to the current position data and the traffic area data. Please refer to the description of step 302 in the above embodiment.
In an embodiment, the first generating module 502 is further configured to determine a coordinate point corresponding to the current location in the passing area according to the current location data and the passing area. And generating an effective detection range related to the coordinate point according to the coordinate point and a preset detection length. Please refer to the description of steps 403-404 in the above embodiment.
The first screening module 503 is configured to acquire the detection data, and screen the passage blind area data from the passage area data according to the effective detection range and the detection data. Please refer to the description of step 303 in the above embodiments.
In an embodiment, the first filtering module 503 is further configured to map the detection data to a valid detection range. And according to the detection data, removing the detected area from the effective detection range to generate passing blind area data. Please refer to the description of steps 405-406 in the above embodiments.
And a second generating module 504, configured to generate a control instruction according to the preset path, the current position data, and the blind passage area data. Please refer to the description of step 304 in the above embodiment.
In an embodiment, the second generating module 504 is further configured to: acquire path information of the preset path and judge whether the preset path passes through the blind area according to the path information and the passing blind area data; if the preset path passes through the blind area, calculate first distance data between the current position and the blind area according to the current position data and the passing blind area data; judge whether the first distance data is smaller than the first threshold; and if so, generate a deceleration instruction.
If the preset path does not pass through the blind area, the module calculates second distance data between the preset path and the blind area and judges whether the second distance data is smaller than a second threshold; if it is, the module generates a deceleration instruction. Please refer to the description of steps 407- in the above embodiments.
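The two-branch deceleration decision described above can be sketched as follows. This is a minimal illustration of the decision logic, not the patented implementation; the waypoint/coordinate representation, the return strings, and the function name `control_instruction` are assumptions of this example.

```python
import math

def control_instruction(path, position, blind_area, first_threshold, second_threshold):
    """Decide whether to decelerate, given the preset path, the current
    position, and the passing blind area.

    path        : list of (x, y) waypoints of the preset path
    position    : (x, y) current position
    blind_area  : set of (x, y) passing-blind-area coordinates
    """
    if not blind_area:
        return "keep_speed"
    if any(p in blind_area for p in path):
        # The preset path passes through the blind area: compare the
        # distance from the current position to the nearest blind cell
        # (the first distance data) against the first threshold.
        first_distance = min(math.dist(position, c) for c in blind_area)
        return "decelerate" if first_distance < first_threshold else "keep_speed"
    # The preset path does not pass through the blind area: compare the
    # distance from the path to the blind area (the second distance data)
    # against the second threshold.
    second_distance = min(math.dist(p, c) for p in path for c in blind_area)
    return "decelerate" if second_distance < second_threshold else "keep_speed"
```

The two thresholds let the device brake earlier when it is about to enter an unobserved region than when it merely travels near one.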
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the described division into units is only a division by logical function; other divisions are possible in an actual implementation, a plurality of units or components may be combined or integrated into another system, and some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection of devices or units through communication interfaces, and may be electrical, mechanical, or in another form.
In addition, units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
Furthermore, the functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
It should be noted that the functions, if implemented in the form of software functional modules and sold or used as independent products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, or the portion thereof that substantially contributes over the prior art, may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The above embodiments are merely examples of the present application and are not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. An apparatus control method characterized by comprising:
acquiring passing area data according to prestored map data;
acquiring current position data, and generating an effective detection range corresponding to the current position according to the current position data and the passing area data;
acquiring detection data, and screening passing blind area data from the passing area data according to the effective detection range and the detection data;
and generating a control instruction according to a preset path, the current position data and the passing blind area data.
2. The method of claim 1, wherein the acquiring passing area data according to prestored map data comprises:
acquiring the prestored map data, wherein the prestored map data comprises a gray value of each coordinate point;
and extracting the passing area according to the gray values and a gray threshold value to generate the passing area data.
3. The method of claim 1, wherein the acquiring current position data and generating an effective detection range corresponding to the current position according to the current position data and the passing area data comprises:
according to the current position data and the passing area, determining a coordinate point corresponding to the current position in the passing area;
and generating an effective detection range related to the coordinate point according to the coordinate point and a preset detection length.
4. The method of claim 1, wherein the obtaining detection data and screening traffic blind zone data from the traffic zone data according to the effective detection range and the detection data comprises:
mapping the probe data to the valid probe range;
and according to the detection data, removing the detected area from the effective detection range to generate the passing blind area data.
5. The method of claim 1, wherein generating control instructions based on the preset path, the current location data, and the blind transit zone data comprises:
acquiring path information of a preset path, and judging whether the preset path passes through a blind area or not according to the path information and the passing blind area data;
if yes, calculating first distance data between the current position and the blind area according to the current position data and the passing blind area data;
judging whether the first distance data is smaller than a first threshold value;
if yes, a deceleration command is generated.
6. The method according to claim 5, wherein after the acquiring path information of the preset path and judging whether the preset path passes through a blind area according to the path information and the passing blind area data, the method further comprises:
if not, calculating second distance data between the preset path and the blind area;
judging whether the second distance data is smaller than a second threshold value;
if yes, a deceleration command is generated.
7. An apparatus control device, characterized by comprising:
the first acquisition module is used for acquiring passing area data according to prestored map data;
the first generation module is used for acquiring current position data and generating an effective detection range corresponding to the current position according to the current position data and the passing area data;
the first screening module is used for acquiring detection data and screening passing blind area data from the passing area data according to the effective detection range and the detection data;
and the second generation module is used for generating a control instruction according to a preset path, the current position data and the passing blind area data.
8. The apparatus of claim 7, wherein the second generating module is further configured to:
acquiring path information of a preset path, and judging whether the preset path passes through a blind area or not according to the path information and the passing blind area data;
if yes, calculating first distance data between the current position and the blind area according to the current position data and the passing blind area data;
judging whether the first distance data is smaller than a first threshold value;
if yes, a deceleration command is generated.
9. The apparatus of claim 7, wherein the second generating module is further configured to:
if not, calculating second distance data between the preset path and the blind area;
judging whether the second distance data is smaller than a second threshold value;
if yes, a deceleration command is generated.
10. An electronic device, comprising:
a memory for storing a computer program;
a processor for performing the method of any one of claims 1-6.
CN202010539728.7A 2020-06-15 2020-06-15 Equipment control method and device and electronic equipment Active CN111487984B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010539728.7A CN111487984B (en) 2020-06-15 2020-06-15 Equipment control method and device and electronic equipment


Publications (2)

Publication Number Publication Date
CN111487984A true CN111487984A (en) 2020-08-04
CN111487984B CN111487984B (en) 2021-01-08

Family

ID=71795788

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010539728.7A Active CN111487984B (en) 2020-06-15 2020-06-15 Equipment control method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN111487984B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112706766A (en) * 2021-01-25 2021-04-27 广州小鹏自动驾驶科技有限公司 Automatic driving method and device
CN113778109A (en) * 2021-11-05 2021-12-10 深圳市普渡科技有限公司 Forbidden path setting method and device for robot, robot and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106476697A (en) * 2016-10-27 2017-03-08 深圳市元征科技股份有限公司 A kind of driving indicating means and device
US20190011913A1 (en) * 2017-07-05 2019-01-10 GM Global Technology Operations LLC Methods and systems for blind spot detection in an autonomous vehicle
CN109375622A (en) * 2018-11-01 2019-02-22 北京云迹科技有限公司 Bidirectional walking method and robot
CN109471429A (en) * 2018-09-29 2019-03-15 北京奇虎科技有限公司 A kind of robot cleaning method, device and electronic equipment
CN110516621A (en) * 2019-08-29 2019-11-29 北京行易道科技有限公司 Detection method, device, vehicle and the storage medium of accessible running region




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: Room 201, building 4, courtyard 8, Dongbeiwang West Road, Haidian District, Beijing

Patentee after: Beijing Yunji Technology Co.,Ltd.

Address before: Room 201, building 4, courtyard 8, Dongbeiwang West Road, Haidian District, Beijing

Patentee before: BEIJING YUNJI TECHNOLOGY Co.,Ltd.
