CN115947276A - Anti-collision control system, control method thereof, processor and aerial work platform - Google Patents

Anti-collision control system, control method thereof, processor and aerial work platform

Info

Publication number
CN115947276A
CN115947276A (application CN202211740590.2A)
Authority
CN
China
Prior art keywords
obstacle
radar sensor
obstacle information
collision
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211740590.2A
Other languages
Chinese (zh)
Inventor
侯力玮
马昌训
喻畅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Zoomlion Intelligent Aerial Work Machinery Co Ltd
Original Assignee
Hunan Zoomlion Intelligent Aerial Work Machinery Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Zoomlion Intelligent Aerial Work Machinery Co Ltd
Priority to CN202211740590.2A
Publication of CN115947276A
Legal status: Pending

Landscapes

  • Forklifts And Lifting Vehicles (AREA)

Abstract

The embodiment of the application provides an anti-collision control method for an aerial work platform, a processor, an anti-collision control system, an aerial work platform and a machine-readable storage medium. The control method comprises the following steps: acquiring obstacle information detected by a radar sensor; acquiring an image of a monitoring area captured by a visual sensor; identifying, from the image, whether an operator inside the working platform is performing work; when it is recognized that the operator is working, determining whether the obstacle information contains non-obstacle information caused by the operator's work; removing the non-obstacle information when the obstacle information is determined to contain it; and executing an anti-collision strategy according to the real obstacle information that remains after the non-obstacle information is removed. The visual sensor is used to observe the working state of the operator on the working platform and to identify radar-sensor information that may be falsely reported during the operator's work, so that false alarms of the anti-collision control system can be avoided.

Description

Anti-collision control system, control method thereof, processor and aerial work platform
Technical Field
The application relates to the field of aerial work platform safety control, in particular to an anti-collision control method, a processor, an anti-collision control system, an aerial work platform and a machine readable storage medium for the aerial work platform.
Background
An aerial work platform is a movable product serving aerial work, equipment installation, maintenance and the like in various industries. For example, common aerial-work-platform products include scissor aerial work platforms, vehicle-mounted aerial work platforms, crank arm aerial work platforms, self-propelled aerial work platforms, aluminum alloy aerial work platforms, telescopic aerial work platforms, and the like.
At present, during operation of an aerial work platform, improper operation or blind spots in the operator's line of sight can cause the work basket to collide with the external environment. Once a collision happens, it can cause huge economic loss and even casualties. Adding an anti-collision control system to the aerial work platform can prevent collisions through early warning, motion limiting and other means. Most existing anti-collision systems for aerial work platforms are based on radar-only schemes: distance information between obstacles and the working platform is obtained by an ultrasonic radar or a millimeter-wave radar, the collision risk of the working platform is judged from this distance information, and corresponding anti-collision measures, such as alarming or forced braking, are taken according to the collision risk. However, existing anti-collision schemes applied to aerial work platforms have the problem that anti-collision measures can be falsely triggered while a worker is performing normal work.
Disclosure of Invention
An object of the embodiments of the present application is to provide an anti-collision control method for an aerial work platform, a processor, an anti-collision control system, an aerial work platform and a machine readable storage medium.
In order to achieve the above object, an embodiment of the present application provides an anti-collision control method, which is applied to an aerial work platform, where the aerial work platform includes a work platform and an anti-collision control system, the anti-collision control system includes a visual sensor for acquiring an image of a monitoring area and a radar sensor for detecting an obstacle, and the control method includes:
acquiring obstacle information of an obstacle detected by a radar sensor;
acquiring an image of a monitoring area captured by the visual sensor, wherein the monitoring area comprises an activity area when an operator works on the working platform;
identifying, from the image, whether an operator inside the working platform is performing work;
when it is recognized that the operator is working, determining whether the obstacle information contains non-obstacle information caused by the operator's work;
removing the non-obstacle information when it is determined that the obstacle information contains the non-obstacle information;
and executing an anti-collision strategy according to the real obstacle information that remains after the non-obstacle information is removed from the obstacle information.
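For illustration only, the following is a minimal Python sketch of the control flow described above; the function names, data structures and the injected helper callables are assumptions made for the sketch and are not part of the claimed method.

from dataclasses import dataclass
from typing import Callable, List, Sequence, Tuple

@dataclass
class Obstacle:
    position: Tuple[float, float]   # (x, y) in the platform frame, metres (assumed format)
    relative_speed: float           # m/s relative to the working platform

@dataclass
class WorkerActivity:
    working: bool
    regions: List[Tuple[float, float, float, float]]  # hand/tool bounding regions (assumed)

def anti_collision_step(radar_obstacles: Sequence[Obstacle],
                        image,
                        recognize_worker: Callable,
                        caused_by_worker: Callable,
                        execute_strategy: Callable) -> None:
    # Steps 1-2: obstacle information and the monitoring-area image are assumed
    # to have been acquired already and are passed in as arguments.
    activity: WorkerActivity = recognize_worker(image)           # step 3: vision-based recognition
    if activity.working:                                         # step 4
        real = [o for o in radar_obstacles
                if not caused_by_worker(o, activity.regions)]    # step 5: remove non-obstacle info
    else:
        real = list(radar_obstacles)
    execute_strategy(real)                                       # step 6: anti-collision strategy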
In the embodiment of the present application, identifying whether the worker performs the job based on the image includes:
inputting the image to a target detection model;
the target detection model outputs a recognition result, wherein the recognition result comprises the category of the detected hand or work tool of the operator and the positional relationship of the bounding box of the hand or work tool relative to the working platform;
and identifying whether the operator is performing work according to the recognition result.
In the embodiment of the present application, when it is recognized that the operator is performing the work, determining whether or not non-obstacle information caused by the work of the operator is included in the obstacle information includes:
determining an obstacle position of the detected obstacle according to the obstacle information;
determining whether there is an obstacle position, among the obstacle positions, that matches the positional relationship;
and when an obstacle position matching the positional relationship is found, determining that the obstacle information contains non-obstacle information.
In an embodiment of the present application, the radar sensor includes a first radar sensor and a second radar sensor, a first detection area of the first radar sensor and a second detection area of the second radar sensor have overlapping detection areas, and the anti-collision control method further includes:
acquiring first object information and second object information detected, respectively, by the first radar sensor and the second radar sensor for a target object moving in the overlapping detection area;
respectively generating a first motion track under a first coordinate system of a first radar sensor and a second motion track under a second coordinate system of a second radar sensor according to the first object information and the second object information;
and determining a rotation angle and a translation matrix between the first motion track and the second motion track to realize the spatial alignment of the first motion track and the second motion track, thereby completing the calibration of the first radar sensor and the second radar sensor.
In the embodiment of the present application, executing the anti-collision strategy according to the real obstacle information after removing the non-obstacle information from the obstacle information includes:
determining the obstacle distance of the detected obstacle and the relative speed between the obstacle and the working platform according to the real obstacle information;
calculating a risk level according to the distance of the obstacle and the relative speed; and
and executing the anti-collision strategy according to the risk level.
In an embodiment of the present application, calculating the risk level according to the obstacle distance and the relative speed includes calculating the risk level according to the following formula:
R = min(m, f(D) + v_flag)
f(D) = m - 2*(D - 1)
[The piecewise definition of v_flag in terms of the relative speed and a speed threshold is given only as an image in the original document.]
where R is the risk level, m is the highest risk level, D is the obstacle distance, and v_flag is a speed flag obtained by comparing the relative speed with the speed threshold.
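As a hedged illustration, the two recoverable formulas can be written as a small Python function; the definition of v_flag used below (1 when the closing speed exceeds the threshold, otherwise 0) and the default parameter values are assumptions, since the original gives the piecewise formula only as an image.

def risk_level(distance_m: float, relative_speed: float,
               max_level: int = 4, speed_threshold: float = 0.5) -> float:
    # Assumed piecewise definition of v_flag (not recoverable from the original image):
    v_flag = 1 if relative_speed > speed_threshold else 0
    f_d = max_level - 2 * (distance_m - 1)       # f(D) = m - 2*(D - 1)
    return min(max_level, f_d + v_flag)          # R = min(m, f(D) + v_flag)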
In an embodiment of the present application, acquiring obstacle information of an obstacle detected by a radar sensor includes:
obstacle information of an obstacle in a target direction detected by a radar sensor is acquired.
In the embodiment of the present application, the target direction is the movement direction of the working platform.
A second aspect of the present application provides a processor configured to execute the above-mentioned collision avoidance control method.
A third aspect of the present application provides an anti-collision control system, which is applied to an aerial work platform, wherein the aerial work platform comprises a working platform, and the anti-collision control system comprises:
a vision sensor configured to acquire an image of a monitoring area, the monitoring area including an active area when an operator performs work on the work platform;
a radar sensor configured to detect an obstacle; and
the processor described above.
In the embodiment of the application, the vision sensor is installed on the working platform through the supporting rod.
In the embodiment of the present application, the support rod is an electric retractable support rod.
In the embodiment of the present application, the radar sensors include a left-side radar sensor, a rear-side radar sensor and a right-side radar sensor located on the left side, rear side and right side of the working platform, respectively, wherein the rear-side radar sensor has a first overlapping detection area with the left-side radar sensor, and the rear-side radar sensor has a second overlapping detection area with the right-side radar sensor.
A fourth aspect of the present application provides an aerial work platform, comprising:
a working platform; and
the anti-collision control system.
A fifth aspect of the present application provides a machine-readable storage medium having stored thereon instructions, which when executed by a processor, cause the processor to implement the above-mentioned collision avoidance control method.
Through the above technical solution, the visual sensor is used to observe the working state of the operator on the working platform and to identify radar-sensor information that may produce false alarms during the operator's work, so that false alarms of the anti-collision control system can be avoided and the operator's experience with the anti-collision control system is greatly improved. In addition, radar sensors can be arranged in multiple directions to achieve higher detection coverage of the environment around the working platform, which improves the safety of the working platform.
Additional features and advantages of embodiments of the present application will be described in detail in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the embodiments of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the embodiments of the disclosure, but are not intended to limit the embodiments of the disclosure. In the drawings:
fig. 1A and 1B schematically illustrate the arrangement of various sensors in an anti-collision control system for aerial work platforms according to an embodiment of the present application;
fig. 2 schematically illustrates a flow chart of a collision avoidance control method for an aerial work platform according to an embodiment of the present application; and
fig. 3 schematically illustrates a functional block diagram of an anti-collision control system for an aerial work platform according to an embodiment of the present application.
Description of the reference numerals
1 processor; 2 visual sensor; 3 radar sensor; 3-1 rear-side radar sensor; 3-2 left-side radar sensor; 3-3 right-side radar sensor; 4 operator identification module; 5 obstacle detection module; 6 multi-sensing data fusion module; 7 anti-collision control module; 8 support rod
Detailed Description
The following describes in detail specific embodiments of the present application with reference to the drawings. It should be understood that the detailed description and specific examples, while indicating the embodiments of the application, are given by way of illustration and explanation only, not limitation.
It should be noted that, if directional indications (such as up, down, left, right, front and back) are referred to in the embodiments of the present application, the directional indications are only used to explain the relative positional relationship and movement of the components in a specific posture (as shown in the drawings); if the specific posture changes, the directional indications change accordingly.
If there is a description in the embodiments of the present application referring to "first", "second", etc., the description of "first", "second", etc. is for descriptive purposes only and is not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In addition, technical solutions between the respective embodiments may be combined with each other, but it is necessary that the technical solutions are capable of being implemented by a person having ordinary skill in the art, and when the technical solutions are contradictory to each other or cannot be implemented, such a combination should not be considered to exist, and is not within the protection scope claimed in the present application.
References to "left", "right" and "rear" sides of a work platform in embodiments of the present application are defined relative to a "front" side of the work platform, which refers to the side that a worker normally faces when standing on the work platform.
Fig. 1A and 1B schematically illustrate the arrangement of various sensors in an anti-collision control system for aerial work platforms according to an embodiment of the present application. Referring to fig. 1A and 1B, an embodiment of the present application provides an anti-collision control system for an aerial work platform. The aerial work platform may include a working platform and other components, which may include, but are not limited to, a main body (e.g., a walking mechanism), an arm support, a lifting mechanism and the like, depending on the type of aerial work platform. The anti-collision control system may include:
a vision sensor 2 configured to acquire an image of a monitoring area including an activity area when an operator performs work on the work platform;
a radar sensor 3 configured to detect an obstacle; and
a processor 1.
In particular, the vision sensor 2 may comprise at least one camera, for example a CCD camera. In one example, the vision sensor 2 may be mounted on the work platform by a support bar 8. Specifically, the support bar 8 may be a motorized retractable support bar 8; the height of the vision sensor 2 may be adjusted by extending or retracting the support bar 8, and when not in use the support bar 8 may be retracted to the same height as the work platform or lower. The vision sensor 2 may be disposed on top of the support bar 8, and the support bar 8 may be mounted on a guardrail of the work platform.
Examples of the radar sensor 3 may include, but are not limited to, an ultrasonic sensor, a millimeter wave sensor, and a laser sensor. Preferably, the radar sensor 3 may be a millimeter wave sensor. In one example, the radar sensors 3 may include a left side radar sensor 3-2, a rear side radar sensor 3-1, and a right side radar sensor 3-3 located on the left, rear, and right sides of the work platform, respectively. For example, these side radar sensors 3 may be arranged at the sides of the bottom of the work platform.
In addition, in order to reduce the detection blind areas of the radar sensors 3 as much as possible, in the embodiment of the present application, adjacent radar sensors 3 may have overlapping detection regions. For example, the rear side radar sensor 3-1 and the left side radar sensor 3-2 may have a first overlapping detection region. For another example, the rear side radar sensor 3-1 and the right side radar sensor 3-3 may have a second overlapping detection area.
In an alternative embodiment of the present application, a top radar sensor 3 may be provided on the top of the support bar 8 for detecting the upper area of the work platform to further increase the detection range of the radar sensor 3.
In an alternative embodiment of the application, a vision sensor 2 (e.g. a camera) may be provided on the side of the work platform to further monitor the environment around the work platform. For example, the vision sensors 2 may be provided on the left side, right side, and rear side of the work platform, respectively.
Examples of processor 1 may include, but are not limited to, a single chip microcomputer, a microprocessor, a Field Programmable Gate Array (FPGA), a Programmable Logic Controller (PLC), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a state machine, and the like.
The processor 1 may be configured to acquire the detection signal of the radar sensor 3 and the environment image captured by the camera, determine whether a target obstacle exists according to the detection signal and the environment image, and take corresponding anti-collision measures when it is determined that a target obstacle exists. In one example, the anti-collision measures may comprise alarms/reminders, for example by sound or light; the anti-collision control system may further comprise an alarm device, and the processor 1 may, upon determining that an alarm is required, send a command instructing the alarm device to sound or light an alarm. In another example, the processor 1 may control the movement of the work platform. For example, the aerial work platform may include a driving mechanism for driving the work platform in motion, such as a boom slewing mechanism, a boom luffing mechanism or a platform lifting mechanism. The processor 1 may control the driving mechanism to perform a corresponding action according to the obstacle information (e.g., position information, speed information) of the detected target obstacle, so as to prevent the working platform from colliding with the target obstacle; alternatively, the processor 1 may send an instruction to a controller of the driving mechanism, and the controller controls the driving mechanism to perform the corresponding action after receiving the instruction.
Fig. 2 schematically shows a flow chart of an anti-collision control method for an aerial work platform according to an embodiment of the application. As shown in fig. 2, in the embodiment of the present application, the collision avoidance control method may be applied to the collision avoidance control system of any of the embodiments described above. The collision avoidance control method may include the following steps.
In step S210, obstacle information of an obstacle detected by the radar sensor is acquired.
Specifically, the processor may acquire a detection signal from the radar sensor, and if the radar sensor detects an obstacle, the processor may obtain obstacle information of the detected obstacle from the detection signal. For example, taking a millimeter-wave radar sensor as an example, the sensor sends point cloud data to the processor, and the processor may determine from the point cloud data the position of the detected obstacle and its speed relative to the work platform (the relative speed).
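As a minimal sketch, turning such point cloud data into an obstacle distance and relative speed could look as follows; the point format of (x, y, radial_speed) and the nearest-point reduction are assumptions, not details taken from the original.

import math
from typing import List, Optional, Tuple

def nearest_obstacle(points: List[Tuple[float, float, float]]) -> Optional[Tuple[float, float]]:
    """Reduce a point cloud of (x, y, radial_speed) samples to the distance
    and relative speed of the nearest return (illustrative only)."""
    if not points:
        return None
    x, y, v = min(points, key=lambda p: math.hypot(p[0], p[1]))
    return math.hypot(x, y), v        # (obstacle distance, relative speed)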
In the embodiment of the present application, the radar sensors may adopt the arrangement shown in fig. 1A and 1B, that is, the radar sensors 3 may include a left-side radar sensor 3-2, a right-side radar sensor 3-3 and a rear-side radar sensor 3-1, where the rear-side radar sensor 3-1 has an overlapping detection area with each of the left-side radar sensor 3-2 and the right-side radar sensor 3-3. For either pair (the rear-side radar sensor 3-1 with the left-side radar sensor 3-2, or the rear-side radar sensor 3-1 with the right-side radar sensor 3-3), the two sensors are referred to below as the first radar sensor and the second radar sensor for convenience of description, and the first detection area of the first radar sensor overlaps the second detection area of the second radar sensor. To avoid the two radar sensors reporting the same obstacle twice when it is detected in the overlapping detection area, the first radar sensor and the second radar sensor can be calibrated.
Specifically, the calibration method may include:
respectively acquiring first object information and second object information which are detected by a first radar sensor and a second radar sensor aiming at the movement of a target object in an overlapping detection area;
respectively generating a first motion track under a first coordinate system of a first radar sensor and a second motion track under a second coordinate system of a second radar sensor according to the first object information and the second object information;
and determining a rotation angle and a translation matrix between the first motion track and the second motion track to realize the spatial alignment of the first motion track and the second motion track, thereby completing the calibration of the first radar sensor and the second radar sensor.
More specifically, the target object moves within the overlapping detection area of the first radar sensor and the second radar sensor (denoted radar A and radar B, respectively), so that it can be detected by both radars simultaneously. A motion track is generated in each radar's own coordinate system; the two tracks are denoted τ_A and τ_B. The spatial alignment of the two tracks, and hence the calibration of the two radars, is then achieved by solving for the rotation angle and the translation matrix between them.
Each track is represented as a sequence of discrete points: A_i denotes the i-th point of the track detected by radar A, B_i denotes the i-th point of the track detected by radar B, and α_AB denotes the rotation angle from the radar A coordinate system to the radar B coordinate system. When α_AB meets the requirement, the variance of the distances between corresponding points of τ_A and τ_B should be minimal, so the solution of α_AB can be cast as an optimization problem of the following form (the original renders these formulas only as images):
minimize over α_AB the variance of d_i = ||R_AB A_i - B_i||, i = 1, ..., N,
where N is the number of track points and R_AB is the rotation matrix
R_AB = [cos α_AB, -sin α_AB; sin α_AB, cos α_AB].
The optimization problem is solved by differentiating the objective with respect to α_AB, where R'_AB is the derivative of R_AB with respect to α_AB, and setting the derivative to zero, which gives a closed-form expression for α_AB (the intermediate expressions are likewise given only as images in the original).
Once α_AB is obtained, the poses of the two radars are aligned, and the corresponding translation matrix can then be computed by combining the track distances.
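A numerical sketch of this pairwise calibration is given below. It assumes 2D, index-matched track points and replaces the closed-form solution with a simple grid search over the rotation angle; the objective (variance of point-to-point distances) follows the description above, while everything else (function names, search resolution, the mean-offset translation) is an assumption for illustration.

import math
from typing import List, Tuple

Point = Tuple[float, float]

def _rotate(p: Point, angle: float) -> Point:
    c, s = math.cos(angle), math.sin(angle)
    return (c * p[0] - s * p[1], s * p[0] + c * p[1])

def calibrate_pair(track_a: List[Point], track_b: List[Point],
                   steps: int = 3600) -> Tuple[float, Point]:
    """Find the rotation angle that minimises the variance of the distances
    between rotated track-A points and track-B points, then take the mean
    offset as the translation (illustrative grid search, not the patent's
    closed-form derivation)."""
    assert track_a and len(track_a) == len(track_b)
    best_angle, best_var = 0.0, float("inf")
    for k in range(steps):
        angle = 2.0 * math.pi * k / steps
        d = [math.dist(_rotate(a, angle), b) for a, b in zip(track_a, track_b)]
        mean = sum(d) / len(d)
        var = sum((x - mean) ** 2 for x in d) / len(d)
        if var < best_var:
            best_angle, best_var = angle, var
    # Translation: mean displacement between the rotated A points and the B points.
    tx = sum(b[0] - _rotate(a, best_angle)[0] for a, b in zip(track_a, track_b)) / len(track_a)
    ty = sum(b[1] - _rotate(a, best_angle)[1] for a, b in zip(track_a, track_b)) / len(track_a)
    return best_angle, (tx, ty)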
In step S220, an image of a monitoring area acquired by the vision sensor is acquired, where the monitoring area includes an activity area when the worker performs work on the work platform.
Specifically, the visual sensor (the top visual sensor) shoots the working platform from above and can capture an image containing the working platform and the monitoring area within a certain range around it.
In step S230, it is recognized whether or not the worker inside the work platform performs the work based on the image.
Specifically, the image may be input to the target detection model, and the target detection model outputs a recognition result; the recognition result may include the category of the detected hand or work tool of the operator and the positional relationship of the bounding box of the hand or work tool relative to the work platform, and whether the operator is performing work is identified according to the recognition result.
More specifically, when the target detection model identifies the operator's hand or a work tool in the image, the identified hand or work tool may be marked with a bounding box (detection frame), and whether the operator is performing work at that moment may be determined based on the positional relationship of the bounding box relative to the work platform. For example, while working, the operator may need to hold a work tool and reach over the outer side of the guardrail of the work platform; the target detection model then recognizes the hand and the work tool, marks them with bounding boxes, and the operator can be determined to be working from the positional relationship of those bounding boxes.
In the embodiment of the present application, the target detection model may be a deep learning model; examples may include, but are not limited to, one-stage detection models such as the YOLO-series models and SSD models, and two-stage detection models such as the RCNN-series models.
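The decision logic on top of such a detector could, for example, look like the sketch below; the detector output format, the class labels and the rule that a hand/tool box reaching outside the platform region means "working" are assumptions for illustration, not details fixed by the patent.

from dataclasses import dataclass
from typing import List, Tuple

Box = Tuple[float, float, float, float]   # (x1, y1, x2, y2) in image pixels

@dataclass
class Detection:
    label: str    # e.g. "hand" or "work_tool" (assumed label names)
    box: Box

def worker_is_working(detections: List[Detection], platform_box: Box) -> bool:
    px1, py1, px2, py2 = platform_box
    for det in detections:
        if det.label not in ("hand", "work_tool"):
            continue
        x1, y1, x2, y2 = det.box
        # A hand/tool bounding box that extends beyond the platform region
        # suggests the operator is reaching over the guardrail to work.
        if x1 < px1 or y1 < py1 or x2 > px2 or y2 > py2:
            return True
    return False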
In step S240, when it is recognized that the operator is performing the work, it is determined whether non-obstacle information due to the work of the operator is included in the obstacle information.
In step S250, when it is determined that the obstacle information includes the non-obstacle information, the non-obstacle information is removed.
Specifically, if it is recognized that a worker inside the work platform is performing work, the processor may, for example, determine the obstacle positions from the detected obstacle information uploaded by the radar sensor and compare these positions with the positional relationship of the bounding box. If one of the determined obstacle positions matches the positional relationship (for example, the positional relationship of the bounding box is substantially the same as the obstacle position), it can be concluded that the obstacle detected at that position by the radar sensor is due to a work action of the worker; the corresponding obstacle information can therefore be regarded as non-obstacle information contained in the obstacle information uploaded by the radar sensor, and the processor may remove it from the obstacle information. In one example, radar sensors are arranged on three sides of the work platform, namely the left side, the right side and the rear side; if the determined non-obstacle information comes from one of these three side radar sensors, for example the left-side radar sensor, the processor may directly mask (disable) the detection signal uploaded by that left-side radar sensor.
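A minimal sketch of this fusion step is shown below; how the bounding boxes are mapped into the radar coordinate frame and the matching tolerance are assumptions, since the original only requires the positions to be substantially the same.

from typing import List, Tuple

Region = Tuple[float, float, float, float]   # (x_min, y_min, x_max, y_max) in the platform frame

def remove_non_obstacles(obstacle_positions: List[Tuple[float, float]],
                         worker_regions: List[Region],
                         tolerance: float = 0.2) -> List[Tuple[float, float]]:
    """Drop radar positions that fall inside (or within `tolerance` of) a
    region attributed to the worker's hand or work tool."""
    def matches(p: Tuple[float, float], r: Region) -> bool:
        x_min, y_min, x_max, y_max = r
        return (x_min - tolerance <= p[0] <= x_max + tolerance and
                y_min - tolerance <= p[1] <= y_max + tolerance)
    return [p for p in obstacle_positions
            if not any(matches(p, r) for r in worker_regions)]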
In the present embodiment, only the risk of collision in certain specified directions may be of interest. For example, when the work platform moves in a first direction (e.g., toward the left side of the work platform), only the risk of collision ahead of the left side of the work platform may matter. The processor may therefore obtain motion information of the work platform (e.g., the motion direction) and acquire obstacle information only for obstacles detected in that target direction; for example, it may acquire the detection signal uploaded by the left-side radar sensor and determine the obstacle information from that signal. If the operator is at that moment working in a second direction (e.g., toward the rear side of the work platform), the subsequent anti-collision strategy can ignore this, regardless of whether the processor (via the target detection model) identifies that the operator is working. That is, of the detection signals uploaded by the radar sensors, the processor only needs to consider the one from the radar sensor in the target direction (e.g., the moving direction of the work platform): if the work platform is moving to the left, the processor need only acquire detections from the left-side radar sensor and may mask (disable or discard) detections uploaded by the other radar sensors (e.g., the right-side and rear-side radar sensors). In this case, a match with an obstacle position detected by the radar sensor is possible only if the positional relationship of the bounding box output by the target detection model is associated with the target direction (for example, the worker's hand and work tool are moving on the left side of the work platform).
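The direction gating described above could be sketched as follows; the sensor names and the mapping from motion direction to sensor are assumptions made for the sketch.

from typing import Dict, List

def gate_by_motion_direction(detections_by_radar: Dict[str, List],
                             motion_direction: str) -> List:
    """Keep only the detections from the radar facing the platform's motion
    direction; detections from the other radars are masked (discarded)."""
    direction_to_radar = {"left": "left_radar",
                          "right": "right_radar",
                          "backward": "rear_radar"}
    active = direction_to_radar.get(motion_direction)
    return detections_by_radar.get(active, []) if active else []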
In step S260, a collision avoidance strategy is performed according to the real obstacle information from which the non-obstacle information is removed from the obstacle information.
Specifically, after the non-obstacle information is removed, the remaining information is the real obstacle information. The obstacle distance of the detected obstacle and the relative speed between the obstacle and the working platform can be determined according to the real obstacle information;
calculating a risk level according to the distance of the obstacle and the relative speed; and
and executing the anti-collision strategy according to the risk level.
Calculating the risk level based on the obstacle distance and the relative velocity includes calculating the risk level based on the following equation:
R = min(m, f(D) + v_flag)
f(D) = m - 2*(D - 1)
[The piecewise definition of v_flag in terms of the relative speed and a speed threshold is given only as an image in the original document.]
where R is the risk level, m is the highest risk level, D is the obstacle distance, and v_flag is a speed flag obtained by comparing the relative speed with the speed threshold.
After the risk level is determined, corresponding anti-collision strategies may be formulated for different risk levels. For example, anti-collision strategies may include alarms (e.g., acoustic/optical alarms), decelerating the motion, and emergency braking; a more severe measure (e.g., deceleration or even emergency braking) may be adopted when the risk level is higher.
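For example, the mapping from risk level to anti-collision measure might be sketched as below; the level boundaries and the concrete actions are assumptions rather than values taken from the patent.

def select_measure(risk_level: int) -> str:
    """Map a risk level to an anti-collision measure (illustrative only)."""
    if risk_level <= 1:
        return "no_action"
    if risk_level == 2:
        return "sound_and_light_alarm"
    if risk_level == 3:
        return "decelerate"
    return "emergency_brake"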
The embodiment of the present application provides a processor 1 configured to execute the anti-collision control method according to any of the above embodiments.
In particular, the processor 1 may be configured to:
acquiring obstacle information of an obstacle detected by a radar sensor 3;
acquiring an image of a monitoring area acquired by the vision sensor 2, wherein the monitoring area comprises an activity area when an operator works on the working platform;
identifying whether an operator works according to the image;
when it is recognized that the operator is working, determining whether the obstacle information contains non-obstacle information caused by the operator's work;
removing the non-obstacle information when it is determined that the obstacle information contains the non-obstacle information;
and executing an anti-collision strategy according to the real obstacle information that remains after the non-obstacle information is removed from the obstacle information.
In the embodiment of the present application, identifying whether the worker performs the job based on the image includes:
inputting the image to a target detection model;
the target detection model outputs a recognition result, wherein the recognition result comprises the category of the detected hand or work tool of the operator and the positional relationship of the bounding box of the hand or work tool relative to the working platform;
and identifying whether the operator is performing work according to the recognition result.
In the embodiment of the present application, when it is recognized that an operator is performing an operation, determining whether or not non-obstacle information caused by the operation of the operator is included in obstacle information includes:
determining an obstacle position of the detected obstacle according to the obstacle information;
determining whether an obstacle position matching the positional relationship exists among the obstacle positions;
and when an obstacle position matching the positional relationship is found, determining that the obstacle information contains non-obstacle information.
In the embodiment of the present application, the processor 1 may be further configured to:
acquiring first object information and second object information detected by a first radar sensor and a second radar sensor for the movement of a target object in an overlapping detection area respectively;
respectively generating a first motion track under a first coordinate system of a first radar sensor and a second motion track under a second coordinate system of a second radar sensor according to the first object information and the second object information;
and determining a rotation angle and a translation matrix between the first motion track and the second motion track to realize the spatial alignment of the first motion track and the second motion track, thereby completing the calibration of the first radar sensor and the second radar sensor.
In the embodiment of the present application, executing the anti-collision policy according to the real obstacle information after removing the non-obstacle information from the obstacle information includes:
determining the barrier distance of the detected barrier and the relative speed of the barrier and the working platform according to the real barrier information;
calculating a risk grade according to the obstacle distance and the relative speed; and
and executing the anti-collision strategy according to the risk level.
In an embodiment of the present application, calculating the risk level according to the obstacle distance and the relative speed includes calculating the risk level according to the following formula:
R = min(m, f(D) + v_flag)
f(D) = m - 2*(D - 1)
[The piecewise definition of v_flag in terms of the relative speed and a speed threshold is given only as an image in the original document.]
where R is the risk level, m is the highest risk level, D is the obstacle distance, and v_flag is a speed flag obtained by comparing the relative speed with the speed threshold.
In an embodiment of the present application, acquiring obstacle information of an obstacle detected by a radar sensor includes:
obstacle information of an obstacle in a target direction detected by a radar sensor is acquired.
In the embodiment of the present application, the target direction is the moving direction of the working platform.
Fig. 3 schematically illustrates a functional block diagram of an anti-collision control system for an aerial work platform according to an embodiment of the present application. As shown in fig. 3, the functional division of the processor 1 of the collision avoidance control system may include:
an operator identification module 4, including a target detection model, configured to: acquiring an image of a monitoring area from the visual sensor 2, wherein the monitoring area comprises an activity area when an operator works on the working platform; identifying whether the operator works according to the image and outputting a recognition result;
an obstacle detection module 5 configured to: acquiring a detection signal of the detected obstacle from the radar sensor 3, and obtaining obstacle information according to the detection signal, wherein the obstacle information may include position information and speed information of the obstacle;
a multi-sensing data fusion module 6 configured to: receiving the recognition result and the obstacle information from the operator identification module 4 and the obstacle detection module 5, respectively; when the recognition result indicates that the operator is working, determining, from the recognition result and the obstacle information, whether the obstacle information contains non-obstacle information caused by the operator's work; and removing the non-obstacle information when it is determined that the obstacle information contains the non-obstacle information;
a collision avoidance control module 7 configured to: and acquiring real obstacle information without non-obstacle information from the multi-sensing data fusion module 6, and making and executing an anti-collision strategy according to the real obstacle information.
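The interaction of the four modules can be summarised by the following sketch; the class name and method signatures mirror the module names above but are otherwise assumptions made for illustration.

class AntiCollisionPipeline:
    """Wires the operator identification module (4), obstacle detection
    module (5), multi-sensing data fusion module (6) and anti-collision
    control module (7) into one processing step."""

    def __init__(self, operator_identification, obstacle_detection,
                 data_fusion, collision_control):
        self.operator_identification = operator_identification
        self.obstacle_detection = obstacle_detection
        self.data_fusion = data_fusion
        self.collision_control = collision_control

    def step(self, image, radar_signal):
        recognition = self.operator_identification(image)
        obstacles = self.obstacle_detection(radar_signal)
        real_obstacles = self.data_fusion(recognition, obstacles)
        return self.collision_control(real_obstacles)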
The embodiment of the application provides an aerial working platform, includes:
a working platform; and
the collision avoidance control system of any of the embodiments described above.
The embodiment of the present application provides a machine-readable storage medium, which stores instructions thereon, and when executed by a processor, the processor is enabled to implement the anti-collision control method according to any of the above embodiments.
The scheme provided by the embodiment of the application has at least one of the following beneficial effects:
(1) The visual sensor is used to acquire semantic information from the image and remove non-obstacle information, which effectively reduces the false-alarm rate of the anti-collision control system and ensures that the anti-collision control actions are correct.
(2) The multi-directional arrangement of the radar sensors widens the detection range around the working platform. At the same time, the proposed arrangement of the visual sensor allows the camera to conveniently observe the working conditions on the working platform, so that the visual sensor can be used to identify the operator's hands or work tools and false alarms can be avoided.
(3) The pairwise calibration method for the plurality of radar sensors supports joint calibration of the radar sensors in the arrangement scheme of the embodiments of the application, effectively improves the ranging accuracy of the radar sensors, and prevents the same obstacle from being detected by two radar sensors as two different obstacles.
(4) The anti-collision control method includes a risk-level calculation that considers not only the absolute spatial position relationship between the obstacle and the working platform but also their relative motion, so the collision risk can be described more accurately.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor 1 of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor 1 of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors 1 (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a … …" does not exclude the presence of another identical element in a process, method, article, or apparatus that comprises the element.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (15)

1. An anti-collision control method is applied to an aerial work platform, the aerial work platform comprises a work platform and an anti-collision control system, the anti-collision control system comprises a visual sensor for acquiring images of a monitoring area and a radar sensor for detecting obstacles, and the control method comprises the following steps:
acquiring obstacle information of an obstacle detected by the radar sensor;
acquiring an image of a monitoring area acquired by the vision sensor, wherein the monitoring area comprises an activity area when an operator works on the working platform;
identifying whether the operator in the working platform works or not according to the image;
when the operator is identified to be working, determining whether non-obstacle information caused by the operation of the operator is contained in the obstacle information;
removing the non-obstacle information if it is determined that the non-obstacle information is included in the obstacle information;
and executing an anti-collision strategy according to the real obstacle information from which the non-obstacle information is removed.
2. The collision avoidance control method according to claim 1, wherein the identifying whether the operator is performing the task based on the image comprises:
inputting the image to a target detection model;
the target detection model outputs a recognition result, and the output result comprises the detected hand or work tool category of the operator and the position relation of the bounding box of the hand or work tool relative to the working platform;
and identifying whether the operator works or not according to the identification result.
3. The collision avoidance control method according to claim 2, wherein the determining whether or not non-obstacle information resulting from an operation by an operator is included in the obstacle information when it is recognized that the operator is performing the operation includes:
determining an obstacle position of the detected obstacle according to the obstacle information;
determining whether there is an obstacle position, among the obstacle positions, that matches the positional relationship;
and when the position of the obstacle matching the position relation is determined to exist, determining that the non-obstacle information is contained in the obstacle information.
4. The collision avoidance control method according to claim 1, wherein the radar sensor includes a first radar sensor and a second radar sensor, a first detection area of the first radar sensor and a second detection area of the second radar sensor have overlapping detection areas, the collision avoidance control method further comprising:
acquiring first object information and second object information detected by the first radar sensor and the second radar sensor respectively for a target object moving in the overlapping detection area;
respectively generating a first motion track under a first coordinate system of the first radar sensor and a second motion track under a second coordinate system of the second radar sensor according to the first object information and the second object information;
and determining a rotation angle and a translation matrix between the first motion track and the second motion track to realize the spatial alignment of the first motion track and the second motion track, thereby completing the calibration of the first radar sensor and the second radar sensor.
5. The method according to claim 1, wherein the performing the collision avoidance strategy according to the real obstacle information from which the non-obstacle information is removed comprises:
determining the obstacle distance of the detected obstacle and the relative speed of the obstacle and the working platform according to the real obstacle information;
calculating a risk level according to the obstacle distance and the relative speed; and
and executing the anti-collision strategy according to the risk level.
6. The collision avoidance control method of claim 5, wherein said calculating a risk level as a function of the obstacle distance and the relative speed comprises calculating a risk level as a function of the following equation:
R = min(m, f(D) + v_flag)
f(D) = m - 2*(D - 1)
[The piecewise definition of v_flag in terms of the relative speed and a speed threshold is given only as an image in the original document.]
wherein R is the risk level, m is the highest risk level, D is the obstacle distance, and v_flag is a speed flag obtained by comparing the relative speed with the speed threshold.
7. The collision avoidance control method according to any one of claims 1 to 6, wherein the acquiring obstacle information of the obstacle detected by the radar sensor includes:
and acquiring obstacle information of an obstacle in the target direction detected by the radar sensor.
8. The collision avoidance control method of claim 7, wherein the target direction is a direction of motion of the working platform.
9. A processor configured to perform the collision avoidance control method of any one of claims 1 to 8.
10. An anti-collision control system, applied to an aerial work platform comprising a work platform, the anti-collision control system comprising:
a vision sensor configured to acquire an image of a monitoring area, the monitoring area including an active area of a worker while performing work on the work platform;
a radar sensor configured to detect an obstacle; and
the processor of claim 9.
11. A collision avoidance control system according to claim 10, wherein the vision sensors are mounted on the work platform by support rods.
12. The collision avoidance control system of claim 11, wherein the support rods are electrically retractable support rods.
13. The collision avoidance control system of claim 10, wherein the radar sensors comprise a left side radar sensor, a rear side radar sensor, and a right side radar sensor located on a left side, a rear side, and a right side of the work platform, respectively, wherein the rear side radar sensor has a first overlapping detection zone with the left side radar sensor and the rear side radar sensor has a second overlapping detection zone with the right side radar sensor.
14. An aerial work platform, comprising:
a working platform; and a collision avoidance control system according to any one of claims 10 to 13.
15. A machine-readable storage medium having instructions stored thereon, which when executed by a processor causes the processor to implement the collision avoidance control method according to any one of claims 1 to 8.
CN202211740590.2A 2022-12-30 2022-12-30 Anti-collision control system, control method thereof, processor and aerial work platform Pending CN115947276A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211740590.2A CN115947276A (en) 2022-12-30 2022-12-30 Anti-collision control system, control method thereof, processor and aerial work platform

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211740590.2A CN115947276A (en) 2022-12-30 2022-12-30 Anti-collision control system, control method thereof, processor and aerial work platform

Publications (1)

Publication Number Publication Date
CN115947276A (en) 2023-04-11

Family

ID=87287451

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211740590.2A Pending CN115947276A (en) 2022-12-30 2022-12-30 Anti-collision control system, control method thereof, processor and aerial work platform

Country Status (1)

Country Link
CN (1) CN115947276A (en)

Similar Documents

Publication Publication Date Title
FI115678B (en) Arrangement for Mining Vehicle Collision Prevention
JP5283622B2 (en) Monitoring method and apparatus using camera for preventing collision of machine
US10741079B2 (en) Route prediction system
CN111409630A (en) Vehicle obstacle avoidance method, system and device
JP6393123B2 (en) Obstacle detection system and transport vehicle
US20180273030A1 (en) Autonomous Vehicle having Pedestrian Protection Subsystem
CN114523963B (en) System and method for predicting road collisions with host vehicles
CN108028017A (en) Method for distinguishing real obstruction and false barrier in the driver assistance system of motor vehicle
JP4530996B2 (en) Mobile robot
JP6747665B2 (en) robot
CN111516777A (en) Robot trolley and obstacle identification method thereof
CN112373467A (en) Intelligent obstacle avoidance system of unmanned automobile
KR20210073204A (en) Method, apparatus and computer program for preventing collision of automatic driving vehicle
JP2017044530A (en) Obstacle detection system
US9440651B2 (en) Method and device for monitoring a setpoint trajectory of a vehicle
CN114368693B (en) Anti-collision method and device for arm support, processor and crane
CN115565058A (en) Robot, obstacle avoidance method, device and storage medium
US11198985B2 (en) Method for monitoring movement of a cantilever structure of an offshore platform, monitoring system, offshore platform
KR102544505B1 (en) Crash preventing system of crane and crash preventing method thereof
CN114089733B (en) Guidance control method, guidance control device, security inspection vehicle, medium, and program product
KR20220146617A (en) Method and apparatus for detecting blooming in lidar measurements
CN115947276A (en) Anti-collision control system, control method thereof, processor and aerial work platform
JP2006092253A (en) Autonomous movement device
JP7486095B2 (en) VEHICLE MONITORING METHOD, VEHICLE MONITORING DEVICE, VEHICLE, AND VEHICLE MONITORING SYSTEM
JP2019089636A (en) Safety apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination