CN114049767A - Edge computing method and device, and readable storage medium - Google Patents

Edge computing method and device, and readable storage medium

Info

Publication number
CN114049767A
Authority
CN
China
Prior art keywords
target
data
field
targets
precision map
Prior art date
Legal status
Granted
Application number
CN202111328580.3A
Other languages
Chinese (zh)
Other versions
CN114049767B (en)
Inventor
刘鹏 (Liu Peng)
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to CN202111328580.3A
Publication of CN114049767A
Application granted
Publication of CN114049767B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0125 Traffic data processing
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096708 Systems involving transmission of highway information, e.g. weather, speed limits, where the received information might be used to generate an automatic action on the vehicle control
    • G08G1/096725 Systems involving transmission of highway information, e.g. weather, speed limits, where the received information generates an automatic action on the vehicle control
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968 Systems involving transmission of navigation instructions to the vehicle
    • G08G1/0969 Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30 Services specially adapted for particular environments, situations or purposes
    • H04W4/40 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application provides an edge computing method, an edge computing device, and a readable storage medium. Roadside perception and an edge computing unit acquire accurate and reliable real-time dynamic data of traffic elements and issue cooperative control instructions for strong cooperation with, and strong control of, those elements, thereby realizing a cooperative control function in which the road controls the vehicles and ensuring the reliability and safety of multi-vehicle cooperative control in complex traffic scenes.

Description

Edge computing method and device, and readable storage medium
Technical Field
The present application relates to the field of edge computing technologies, and in particular, to an edge computing method and apparatus, and a readable storage medium.
Background
In recent years, autonomous driving technology has developed rapidly and is well suited to deployment in closed traffic environments such as airports, campuses, and ports. Replacing human drivers with autonomous driving saves substantial labor cost, makes effective use of the automatic perception, real-time communication, and precise positioning capabilities of autonomous vehicles, and improves the safety and operational efficiency of traffic services. However, autonomous driving currently adopts a single-vehicle intelligence mode: it depends on the perception capability and computing power of the autonomous vehicle itself, both of which are relatively limited. Because the on-board perception devices are constrained by sight distance and viewing angle while the vehicle is moving, perception blind spots exist, so the single-vehicle intelligence mode suffers from low reliability and poor safety.
Vehicle-road cooperation builds on single-vehicle autonomous driving: perception and detection devices deployed along the road (such as cameras and radars) perform real-time sensing and high-precision positioning of the road traffic environment, and data are exchanged between the roadside RSU and the on-board OBU. This realizes information sharing of varying degrees between vehicle and vehicle, vehicle and road, vehicle and network, and vehicle and pedestrian (network interconnection); issues real-time safety warnings to vehicles; expands each vehicle's field of view; and improves safety. However, in multi-vehicle interaction scenarios such as intersections, merges, and the entries and exits of heavy traffic flows, the safety-warning mode of vehicle-road cooperation can hardly achieve efficient cooperative passage of multiple vehicles.
Disclosure of Invention
The embodiments of the present application provide an edge computing method and device, and a readable storage medium, which can at least solve the problems in the related art of poor reliability and poor safety of multi-vehicle cooperative control in multi-vehicle interaction scenarios.
A first aspect of the embodiments of the present application provides an edge computing method, including:
receiving vehicle perception data sent by an on-board perception device, and receiving road traffic situation perception data sent by the road perception devices of the zone to which the vehicle belongs;
calculating target feature data based on the vehicle perception data and the road traffic situation perception data, wherein the target feature data include the in-field targets of the zone and their corresponding feature values;
fusing the target feature data to obtain fused data, and generating high-precision map dynamic data corresponding to the zone based on the fused data;
generating a cooperative control instruction for each in-field target according to the high-precision map dynamic data and the in-field targets' job tasks;
and sending the corresponding cooperative control instruction to each in-field target.
A second aspect of the embodiments of the present application provides an edge computing device, including:
a receiving module, configured to receive vehicle perception data sent by the on-board perception devices and receive road traffic situation perception data sent by the road perception devices of the zone to which the vehicle belongs;
a calculation module, configured to calculate target feature data based on the vehicle perception data and the road traffic situation perception data, wherein the target feature data include the in-field targets of the zone and their corresponding feature values;
a fusion module, configured to fuse the target feature data to obtain fused data and generate high-precision map dynamic data corresponding to the zone based on the fused data;
a generation module, configured to generate a cooperative control instruction for each in-field target according to the high-precision map dynamic data and the in-field targets' job tasks;
and a sending module, configured to send the corresponding cooperative control instruction to each in-field target.
A third aspect of the embodiments of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the edge computing method provided in the first aspect of the embodiments of the present application.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the edge computing method provided in the first aspect of the embodiments of the present application.
As can be seen from the above, in the edge computing method, device, and readable storage medium provided by the present application, the perception data of traffic elements are layered and subjected to target perception computation; the targets are then fused to form a high-precision dynamic map of the zone; and finally, cooperative control instructions for the traffic elements are generated according to scene triggering. Roadside perception and the edge computing unit acquire accurate and reliable real-time dynamic data of traffic elements and issue cooperative control instructions for strong cooperation and strong control of traffic elements, thereby realizing the road-controls-vehicle cooperative control function and ensuring the reliability and safety of multi-vehicle cooperative control in complex traffic scenes.
Drawings
FIG. 1 is a schematic diagram of the overall system in which an edge computing device according to the first embodiment of the present application is located;
FIG. 2 is a schematic flowchart of an edge computing method according to the first embodiment of the present application;
FIG. 3 is a schematic diagram of the sensing period of the on-site perception devices for in-field targets according to the first embodiment of the present application;
FIG. 4 is a schematic diagram of the spatio-temporal analysis of perception sampling of in-field target features by the on-site perception devices according to the first embodiment of the present application;
FIG. 5 is a block diagram of an edge computing device according to the second embodiment of the present application;
FIG. 6 is a schematic structural diagram of an electronic device according to the third embodiment of the present application.
Detailed Description
To make the objects, features, and advantages of the present application clearer and easier to understand, the technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art based on the embodiments given herein without creative effort shall fall within the protection scope of the present application.
In a complex traffic scene with heavy traffic flow and multiple intersections, a roadside intelligent system must take the leading role in a strong-cooperation, strong-control mode, constructing a cooperative traffic system dominated by the roadside intelligent system. Such a system covers vehicles of different degrees of automation (vehicle automation) and realizes an optimized cooperative control function between vehicles and roads (traffic cooperation). Roadside perception and the edge computing unit acquire accurate and reliable real-time dynamic data of traffic elements and issue cooperative control instructions for strong cooperation and strong control of traffic elements, thereby realizing the road-controls-vehicle cooperative control function.
Edge computing is the core unit of the roadside intelligent system: it carries a series of complex and fast roadside computing functions, including perception computation, data fusion, scene triggering, cooperative control, and instruction issuing, and is the key equipment of the roadside intelligent system.
To solve the problems in the related art of poor reliability and poor safety of multi-vehicle cooperative control in multi-vehicle interaction scenarios, the first embodiment of the present application provides an edge computing method applied to an edge computing device. FIG. 1 is a schematic diagram of the overall system in which the edge computing device of this embodiment is located.
The road perception devices are intelligent perception devices fixed on the road side. They may include various intelligent sensing devices such as lidar, millimeter-wave radar, microwave radar, cameras, and in-ground vehicle detectors, as well as V2X roadside communication equipment (RSU) and on-board communication equipment (OBU); they may also be combinations of perception devices, such as a microwave-radar/video all-in-one unit or a millimeter-wave-radar/video all-in-one unit. The perception devices sense the surrounding traffic situation: a camera, for example, generates visual images or a video stream using optical or thermal imaging, while devices such as lidar and millimeter-wave radar sense surrounding objects using the signal-reflection principle, generating point-cloud signals to perceive the surrounding environment. Other data acquisition modes also exist, such as sensing vehicle-passing information through geomagnetism, or the RSU receiving carrier information broadcast by OBUs.
The in-field targets of this embodiment are traffic participants such as vehicles, pedestrians, bicycles, and obstacles; in practical applications they are mainly vehicles, including autonomous vehicles and human-driven vehicles. A vehicle is equipped with on-board perception devices, an on-board communication unit (OBU), a controller, and the like; the OBU and the RSU communicate with each other in the V2X vehicle-road cooperation mode.
The central platform system is the global control system and is connected to multiple edge computing devices. The real-time high-precision map dynamic data formed by each edge computing device can be aggregated upward to the central platform system, which computes and ultimately forms high-precision map dynamic data over a larger range, such as an entire campus, airport, or port, to serve as the basis for traffic scheduling and command over that larger range.
FIG. 2 is a schematic flowchart of the edge computing method provided in this embodiment. The method includes the following steps:
step 201, receiving vehicle perception data sent by a vehicle-mounted perception device, and receiving road traffic situation perception data sent by a road perception device of a district to which the vehicle belongs.
Specifically, in practical applications, the vehicle transmits its own state (position, attitude, speed, angular velocity, acceleration, battery level, etc.) and its surrounding perception information to the RSU through the OBU, and the RSU then forwards this information to the edge computing device. The on-site perception devices sense the road traffic situation at the roadside and output it to the edge computing device in digital form.
FIG. 3 is a schematic diagram of the sensing cycle of the on-site perception devices for an in-field target according to this embodiment. It illustrates the time difference between a target's actual appearance and its detection as the on-site perception devices (five kinds of sensors: microwave radar, millimeter-wave radar, lidar, camera, and RSU module) monitor the traffic elements on site and apply a series of processes to the monitoring signals, such as target feature extraction, recognition, and tracking.
FIG. 4 is a schematic diagram of the spatio-temporal analysis of feature perception sampling of in-field targets by the on-site perception devices according to this embodiment; it describes the periodicity of monitoring, recognizing, and tracking in-field targets by the on-site perception sensors. The detection period of the millimeter-wave radar and the microwave radar is about 20 ms; the detection period of the lidar is about 60 ms; the monitoring period of the camera is about 70 ms; and the period of the RSU is about 100 ms.
Step 202: calculate target feature data based on the vehicle perception data and the road traffic situation perception data.
Specifically, the target feature data of this embodiment include the in-field targets of the zone and their corresponding feature values. Based on the on-site perception data, the edge computing device computes the in-field targets and their feature attribute vectors, denoted F(i,j), where i denotes the i-th edge perception device and j denotes the j-th in-field target detected by the i-th edge perception device. The traffic elements and feature attributes computed by different edge perception devices carry different errors.
In some implementations of this embodiment, the step of calculating target feature data based on the vehicle perception data and the road traffic situation perception data includes: identifying in-field targets based on the vehicle perception data and the road traffic situation perception data to obtain a target set; performing coordinate transformation on all in-field targets in the target set into the actual coordinates of the site; performing time synchronization processing on the target set based on the data sensing duration and the target recognition duration, where the data sensing duration is the time the sensor needs to sense the data and the target recognition duration is the time needed to recognize an in-field target; performing motion trajectory simulation for each in-field target based on Kalman filtering and constructing the corresponding first target trajectory function; and performing timed feature sampling of each in-field target according to the first target trajectory function to obtain the target feature data.
Specifically, for target recognition, the format and content of the sensed information vary with the diversity of perception devices, and so does the method used to recognize targets from the collected data. For image and video data, algorithms such as YOLO-tiny can be used for target recognition; for the point-cloud data output by lidar and similar sensors, clustering algorithms are used; and the vehicle time-series information received by the RSU can be classified and identified in combination with Kalman filtering. Different edge computing devices may employ different target recognition algorithms. After target recognition, the recognized in-field targets form a target set {O(i,t)}, where i denotes the i-th recognized in-field target at time t. In addition, in this embodiment, through continuous detection by the perception devices, the edge computing device obtains continuous monitoring values of an in-field target, and then performs target classification, ranging, positioning, and similar processing on them to obtain the observation set {(O(i,1), O(i,2), …, O(i,t))}. The observations of recognized target i are then denoised and filtered, after which trajectory simulation, tracking, and prediction for target i are performed using a Kalman filter function.
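By way of illustration of the point-cloud branch above, the following is a minimal sketch of clustering lidar returns into candidate in-field targets. The choice of DBSCAN and the parameter values are assumptions for illustration; the text specifies only "clustering algorithms".

```python
# Sketch: grouping lidar point-cloud returns into candidate in-field targets
# with DBSCAN, then taking each cluster's centroid as a crude position feature.
# eps / min_samples are illustrative assumptions, not values from the patent.
import numpy as np
from sklearn.cluster import DBSCAN

def detect_targets(points: np.ndarray, eps: float = 0.8, min_samples: int = 5) -> list[dict]:
    """points: (N, 3) array of x, y, z returns in the sensor frame."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points[:, :2])
    targets = []
    for label in set(labels) - {-1}:               # label -1 marks noise points
        cluster = points[labels == label]
        targets.append({
            "centroid": cluster.mean(axis=0),                     # position estimate
            "extent": cluster.max(axis=0) - cluster.min(axis=0),  # size proxy
            "n_points": len(cluster),
        })
    return targets
```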
In this embodiment, because features of the recognized targets such as distance and position are computed relative to the sensor, the targets must undergo coordinate transformation into the actual coordinates of the site.
In addition, the time from when an in-field target is sensed by a sensor (some sensors form an image) until it is recognized is denoted Δt; after this Δt has elapsed, the features (position, speed, etc.) of the in-field target O(i,t) may have changed, so time synchronization is required to eliminate the error, where

Δt = Δt1 + Δt2,

Δt1 is the time the sensor needs from sampling the data to outputting it, and Δt2 is the time needed to recognize the target from the output data and compute the target features.

Further, the trajectory of target O_i is simulated based on Kalman filtering to construct the target trajectory function T_i(t) (the first target trajectory function), and T_i(t + dt) is taken as the predicted value of target O_i; the motion trajectories of all recognized targets are simulated from their current feature values.

Finally, all recognized targets are sampled at fixed times: sampling the target trajectory function T_i(t) at fixed points yields a series of sampled values for target O_i.
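As an illustration of the above, the following is a minimal sketch of a constant-velocity Kalman filter for one in-field target that compensates the recognition latency Δt = Δt1 + Δt2 by propagating the state forward after each update, then samples the fitted trajectory T_i(t) on a fixed clock. The constant-velocity model, noise levels, and numeric values are assumptions for illustration; the text specifies only Kalman filtering and fixed-point sampling.

```python
# Sketch: constant-velocity Kalman filter for one in-field target, with
# latency compensation (Δt = Δt1 + Δt2) and timed trajectory sampling.
import numpy as np

class TargetTracker:
    """Constant-velocity Kalman filter; state x = [px, py, vx, vy]."""
    def __init__(self, latency: float):
        self.x = np.zeros(4)
        self.P = np.eye(4) * 10.0
        self.H = np.array([[1., 0., 0., 0.],
                           [0., 1., 0., 0.]])   # position-only measurements
        self.R = np.eye(2) * 0.25               # measurement noise (assumed)
        self.Q = np.eye(4) * 0.01               # process noise (assumed)
        self.latency = latency                  # Δt = Δt1 + Δt2

    @staticmethod
    def _F(dt: float) -> np.ndarray:
        F = np.eye(4)
        F[0, 2] = F[1, 3] = dt
        return F

    def predict(self, dt: float) -> None:
        F = self._F(dt)
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z: np.ndarray) -> None:
        """z: measured position, already Δt old when recognition finishes."""
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        self.predict(self.latency)  # time synchronization: advance past Δt

    def T(self, dt_ahead: float) -> np.ndarray:
        """Trajectory function T_i(t): predicted position dt_ahead from now."""
        return (self._F(dt_ahead) @ self.x)[:2]

# Timed feature sampling on a 0.1 s grid (the 10 Hz refresh rate in the text):
tracker = TargetTracker(latency=0.08)   # e.g. Δt1 = 60 ms, Δt2 = 20 ms
tracker.predict(0.1)                    # propagate to the measurement epoch
tracker.update(np.array([12.0, 3.5]))
samples = [tracker.T(k * 0.1) for k in range(5)]   # fixed-point samples of T_i
```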
Step 203: fuse the target feature data to obtain fused data, and generate high-precision map dynamic data corresponding to the zone based on the fused data.
In some implementations of this embodiment, the step of fusing the target feature data to obtain fused data includes: grouping and associating the target feature data from the different sensor types to obtain new target feature data; calibrating the identity information of the grouped targets in the new target feature data; performing motion trajectory simulation for the grouped targets based on Kalman filtering and constructing the corresponding second target trajectory function; predicting the motion trajectories of the grouped targets based on the second target trajectory function; and performing timed feature sampling on the grouped simulated trajectories according to the second target trajectory function to obtain the fused data.
Specifically, in this embodiment, the sensing information of the different sensors (such as the RSU, camera, lidar, and millimeter-wave radar) is processed separately: targets are recognized and target features are extracted at fixed times, which solves the time-synchronization problem of target recognition and feature sampling between different sensors. A feature sample is denoted F(j,t), where j denotes the j-th recognized target and t denotes the time at which the target feature is sampled.

In practical applications, this embodiment performs cluster analysis on the target feature data F(j,t) from the different sensors, with an objective function that associates, across sensors, the feature samples belonging to the same physical target by minimizing the feature distance within each group. This finally forms a new target set, denoted {O'(j,t)}.
Then, according to the newly merged target feature data O'(j,t), identity calibration is performed for each in-field target in the data: the identity of each in-field target is determined and an ID is assigned.

Further, trajectory simulation is performed for the grouped targets: based on an in-field target's time-series feature set {O'(j,1), O'(j,2), …, O'(j,t)}, Kalman filtering and fitting are applied again to construct the in-field target's trajectory function T'_j(t) (the second target trajectory function).

Still further, trajectory prediction is performed for the grouped targets, i.e., the target O'(j) is predicted according to the trajectory function T'_j(t) constructed in the previous step.

Finally, timed sampling is performed on the grouped targets' simulated trajectories: fixed-point prediction sampling according to the grouped target trajectory function yields a series of sampled values {F'(j,t)} for the target O'(j).
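A minimal sketch of the grouping-and-association step is given below: per-sensor feature samples, already time-aligned by the fixed sampling clock, are associated greedily by nearest position, and each resulting group is given a persistent ID. The greedy nearest-neighbor gating and the gate distance are illustrative assumptions; the text calls only for cluster analysis with an association objective.

```python
# Sketch: associating time-aligned per-sensor samples into fused targets.
# Greedy nearest-neighbor gating and the 2.0 m gate are illustrative choices.
import numpy as np
from itertools import count

_next_id = count(1)

def associate(samples_by_sensor: dict[str, list[np.ndarray]], gate: float = 2.0):
    """samples_by_sensor: sensor name -> list of position samples at time t."""
    fused: list[dict] = []                      # each group: {"id", "positions"}
    for sensor, samples in samples_by_sensor.items():
        for z in samples:
            best, best_d = None, gate
            for group in fused:
                d = np.linalg.norm(group["positions"][-1] - z)
                if d < best_d:
                    best, best_d = group, d
            if best is None:                    # nothing close enough: new target
                fused.append({"id": next(_next_id), "positions": [z]})
            else:
                best["positions"].append(z)     # same physical target, new sensor
    # Fused estimate: average the per-sensor positions of each group
    return [{"id": g["id"], "pos": np.mean(g["positions"], axis=0)} for g in fused]
```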
In another implementation of this embodiment, the step of generating the high-precision map dynamic data corresponding to the zone based on the fused data includes: obtaining the target type corresponding to each in-field target based on its identity information; determining the corresponding target attribute values according to the target type; constructing the geometry of the in-field target in the current planar space based on the target attribute values; spatially transforming the geometry into a planar-space layer referenced to the site digital map; and fusing the planar-space layer with the site base layer and labeling the layer elements to generate the high-precision map dynamic data corresponding to the zone.
Specifically, in this embodiment, given the type of a target O'(j), the target attribute values A_j of that type can be determined from the traffic knowledge base; for example, a traffic element such as a car has a series of attribute values, including its size. Then coordinate transformation of the target region is performed: from the traffic target computed in the previous step and attribute values such as its size and position, its geometry G_t in the current planar space is constructed. Further, the target region is projected: the geometry G_t is spatially transformed into the planar-space layer L_t referenced to the site digital map. Finally, the planar-space layer L_t generated in the previous step is fused with the site base layer L_0 and the layer elements are labeled, constructing the vector high-precision map M_t.
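As a sketch of this layer pipeline, the code below builds a rectangular footprint from a target's size attribute, places it in site map coordinates with an SE(2) pose transform (the geometry G_t), and collects the placed shapes into a dynamic layer L_t fused with a base layer L_0 to form M_t. The rectangular footprint and the layer-as-list representation are illustrative assumptions, not a prescribed map format.

```python
# Sketch: projecting a typed target's geometry into the map-referenced layer L_t
# and fusing it with the base layer L_0 to form M_t.
import numpy as np

def footprint(length: float, width: float) -> np.ndarray:
    """Rectangle corners in the target's own frame, shape (4, 2)."""
    l, w = length / 2.0, width / 2.0
    return np.array([[-l, -w], [l, -w], [l, w], [-l, w]])

def to_map_frame(corners: np.ndarray, x: float, y: float, yaw: float) -> np.ndarray:
    """SE(2) transform of the geometry G_t into site map coordinates."""
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s], [s, c]])
    return corners @ R.T + np.array([x, y])

def build_dynamic_layer(targets: list[dict]) -> list[dict]:
    """L_t: one labeled polygon per fused target at time t."""
    return [{"id": t["id"],
             "label": t["type"],                                  # e.g. "car"
             "polygon": to_map_frame(footprint(*t["size"]), *t["pose"])}
            for t in targets]

# Fusing with the static base layer L_0 to form the map M_t:
L0 = [{"label": "lane_boundary", "polyline": np.array([[0, 0], [100, 0]])}]
Lt = build_dynamic_layer([{"id": 7, "type": "car",
                           "size": (4.6, 1.8), "pose": (12.0, 3.5, 0.1)}])
Mt = {"base": L0, "dynamic": Lt}
```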
Step 204: generate a cooperative control instruction for each in-field target according to the high-precision map dynamic data and the in-field targets' job tasks.
Specifically, the edge computing device of this embodiment generates the cooperative control instruction for each in-field target on the scene-triggering principle, based on the high-precision map dynamic data and the job tasks of the targets in the local zone; the instruction cycle may be 0.1 second, cooperatively controlling the actions of each traffic element. The job task list distributed by the central platform system to the edge computing device is denoted {Task_i}, where i denotes the i-th in-field target. The set of cooperative control instructions generated by the edge computing device is denoted {C(i,j)}, where i denotes the i-th in-field target and j denotes the j-th time slice, each time slice being 0.1 second.
In practical applications, the edge computing devices of the different zones send their high-precision map dynamic data to the central platform system and then receive the job tasks planned by the central platform system according to the global high-precision map dynamic data, which is obtained by fusing the high-precision map dynamic data corresponding to the different zones.
In some implementations of this embodiment, the step of generating the cooperative control instruction for each in-field target according to the high-precision map dynamic data and the in-field targets' job tasks includes: performing motion trajectory simulation for the in-field targets according to the fused data and constructing the corresponding third target trajectory function; generating target motion planning data by combining the third target trajectory function, the high-precision map dynamic data, and the in-field targets' job tasks; judging, based on the target motion planning data, whether trajectory conflicts exist between different in-field targets; if a trajectory conflict exists, returning to the step of generating target motion planning data by combining the third target trajectory function, the high-precision map dynamic data, and the in-field targets' job tasks; and if no trajectory conflict exists, generating the cooperative control instruction for each in-field target according to the target motion planning data.
Specifically, this embodiment first performs motion trajectory simulation for the in-field targets based on the target feature set {F'(i,t)}, generating the target trajectory function T''_i(t) (the third target trajectory function) as the predicted trajectory of each in-field target. Then, combining the target trajectory function T''_i(t), the scene high-precision map M_t, and the target's planned task path P_i(t), a new target motion plan P'_i(t) is generated; the motion plans of all in-field targets at time t form the set {P'(i,t)}.

Further, it is detected on a spatio-temporal basis whether any two plans P'_i(t) and P'_j(t) conflict in space-time, i.e., whether the planning is reasonable and whether readjustment is needed. The judgment function checks, for every pair of different in-field targets i and j and every time t within the interval [0, T0], whether the planned positions given by P'_i(t) and P'_j(t) collide, where T0 denotes the maximum time horizon over which trajectory plans are checked.
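The following is a minimal sketch of such a space-time conflict check under stated assumptions: each plan P'_i is represented as a callable returning a planned position, the horizon T0 is discretized at the 0.1-second instruction cycle, and a conflict is taken to be a minimum-separation violation; the separation threshold is an illustrative value.

```python
# Sketch: pairwise space-time conflict check over the horizon T0.
# The 0.1 s discretization step and the 1.5 m minimum separation are
# illustrative assumptions.
import numpy as np
from typing import Callable

Plan = Callable[[float], np.ndarray]   # P'_i: time t -> planned position

def has_conflict(p_i: Plan, p_j: Plan, T0: float,
                 step: float = 0.1, min_sep: float = 1.5) -> bool:
    for k in range(int(T0 / step) + 1):
        t = k * step
        if np.linalg.norm(p_i(t) - p_j(t)) < min_sep:
            return True                # space-time conflict within the horizon
    return False

def first_conflict(plans: dict[int, Plan], T0: float):
    """Return the first conflicting target pair, or None if plans are clear."""
    ids = sorted(plans)
    for a in range(len(ids)):
        for b in range(a + 1, len(ids)):
            if has_conflict(plans[ids[a]], plans[ids[b]], T0):
                return ids[a], ids[b]
    return None
```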
It should be noted that if P'_i(t) and P'_j(t) have a space-time conflict, the motion plans of the in-field targets O_i and O_j are readjusted according to their task weights or priorities, until the trajectories of all in-field targets satisfy the conflict-free condition; this finally yields the path plan P'_i(t) of each in-field target.

Finally, according to the P'_i(t) obtained in the previous step, the plan is output on the planned time scale, forming the target control command C(i,t); all target commands constitute the target control instruction set {C(i,t)}.
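Under the same assumptions as the sketch above, the conflict-resolution loop could look as follows: the lower-priority member of each conflicting pair is replanned (here by a hypothetical delay_start replanner standing in for the unspecified readjustment) until first_conflict finds no pair, after which the plans are sampled into per-time-slice commands C(i,t).

```python
# Sketch: priority-based replanning until conflict-free, then emitting the
# instruction set {C(i,t)} on 0.1 s time slices. Continues the conflict-check
# sketch above (reuses Plan and first_conflict); delay_start is a hypothetical
# replanner, not the patent's actual readjustment procedure.
def delay_start(plan: Plan, delay: float = 0.5) -> Plan:
    return lambda t: plan(max(0.0, t - delay))    # hold position, start later

def resolve_and_emit(plans: dict[int, Plan], priority: dict[int, int],
                     T0: float, step: float = 0.1):
    while (pair := first_conflict(plans, T0)) is not None:
        i, j = pair
        loser = i if priority[i] < priority[j] else j
        plans[loser] = delay_start(plans[loser])  # readjust lower-priority target
    # Output on the planned time scale: one command per target per time slice
    return {(i, k): plans[i](k * step)            # C(i,t)
            for i in plans for k in range(int(T0 / step) + 1)}
```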
Step 205: send the corresponding cooperative control instruction to each in-field target.
Specifically, in this embodiment, the edge computing device issues the cooperative control instruction to the in-field target through the RSU. The cooperative control instruction may be encapsulated in an information body according to the RSU interface protocol, and the information body may further include the high-precision map dynamic data, job tasks, and the like.
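The RSU interface protocol is not specified in the text; purely as an illustration, an information body of the kind described could be serialized as below, where all field names are hypothetical.

```python
# Sketch: a hypothetical information body wrapping a cooperative control
# instruction for delivery via the RSU. Field names are illustrative only;
# the actual RSU interface protocol is not specified in the text.
import json, time

def build_info_body(target_id: int, command: dict,
                    map_patch: dict | None = None,
                    task: dict | None = None) -> bytes:
    body = {
        "target_id": target_id,
        "timestamp_ms": int(time.time() * 1000),
        "time_slice_s": 0.1,            # instruction cycle from the embodiment
        "command": command,             # e.g. {"speed": 5.0, "lane": 2}
        "hd_map_dynamic": map_patch,    # optional high-precision map dynamic data
        "job_task": task,               # optional job task
    }
    return json.dumps(body).encode("utf-8")
```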
In addition, it should be noted that in this embodiment the road perception devices can also monitor the instruction execution effect of the in-field targets in real time; the edge computing device performs perception computation again according to the execution effect, updates the high-precision map dynamic data, and so on, forming closed-loop precise control. After the job tasks of all traffic elements in the field are completed, the system enters a servo waiting state until a new traffic element enters the field and activates it. This embodiment employs many kinds of perception devices to achieve omnidirectional, blind-spot-free monitoring of the site; positioning accuracy reaches the sub-meter level with a data refresh rate of 10 Hz (10 times per second), and the positioning accuracy of some vehicles can reach the centimeter level.
Based on the technical solution of this embodiment of the application, the perception data of the traffic elements are first layered and subjected to target perception computation; the targets are then fused to form a high-precision dynamic map of the zone; and finally, cooperative control instructions for the traffic elements are generated according to scene triggering. Roadside perception and the edge computing unit acquire accurate and reliable real-time dynamic data of traffic elements and issue cooperative control instructions for strong cooperation and strong control of traffic elements, thereby realizing the road-controls-vehicle cooperative control function and ensuring the reliability and safety of multi-vehicle cooperative control in complex traffic scenes.
FIG. 5 shows an edge computing device according to the second embodiment of the present application. The edge computing device can be used to implement the edge computing method of the foregoing embodiments. As shown in FIG. 5, the edge computing device mainly includes:
a receiving module 501, configured to receive vehicle perception data sent by the on-board perception devices and receive road traffic situation perception data sent by the road perception devices of the zone to which the vehicle belongs;
a calculation module 502, configured to calculate target feature data based on the vehicle perception data and the road traffic situation perception data, where the target feature data include the in-field targets of the zone and their corresponding feature values;
a fusion module 503, configured to fuse the target feature data to obtain fused data and generate high-precision map dynamic data corresponding to the zone based on the fused data;
a generation module 504, configured to generate a cooperative control instruction for each in-field target according to the high-precision map dynamic data and the in-field targets' job tasks;
and a sending module 505, configured to send the corresponding cooperative control instruction to each in-field target.
In some implementations of this embodiment, the calculation module is specifically configured to: identify in-field targets based on the vehicle perception data and the road traffic situation perception data to obtain a target set; perform coordinate transformation on all in-field targets in the target set into the actual coordinates of the site; perform time synchronization processing on the target set based on the data sensing duration and the target recognition duration, where the data sensing duration is the time the sensor needs to sense the data and the target recognition duration is the time needed to recognize an in-field target; perform motion trajectory simulation for each in-field target based on Kalman filtering and construct the corresponding first target trajectory function; and perform timed feature sampling of each in-field target according to the first target trajectory function to obtain the target feature data.
In some implementations of this embodiment, when fusing the target feature data to obtain fused data, the fusion module is specifically configured to: group and associate the target feature data from the different sensor types to obtain new target feature data; calibrate the identity information of the grouped targets in the new target feature data; perform motion trajectory simulation for the grouped targets based on Kalman filtering and construct the corresponding second target trajectory function; predict the motion trajectories of the grouped targets based on the second target trajectory function; and perform timed feature sampling on the grouped simulated trajectories according to the second target trajectory function to obtain the fused data.
Further, in some implementations of this embodiment, when generating the high-precision map dynamic data corresponding to the zone based on the fused data, the fusion module is specifically configured to: obtain the target type corresponding to each in-field target based on its identity information; determine the corresponding target attribute values according to the target type; construct the geometry of the in-field target in the current planar space based on the target attribute values; spatially transform the geometry into a planar-space layer referenced to the site digital map; and fuse the planar-space layer with the site base layer and label the layer elements to generate the high-precision map dynamic data corresponding to the zone.
Further, in some implementations of this embodiment, the generation module is specifically configured to: perform motion trajectory simulation for the in-field targets according to the fused data and construct the corresponding third target trajectory function; generate target motion planning data by combining the third target trajectory function, the high-precision map dynamic data, and the in-field targets' job tasks; judge, based on the target motion planning data, whether a trajectory conflict exists between different in-field targets; if a trajectory conflict exists, return to generating target motion planning data; and if no trajectory conflict exists, generate the cooperative control instruction for each in-field target according to the target motion planning data.
In some implementations of this embodiment, the sending module is further configured to send the high-precision map dynamic data to a central platform system, and the receiving module is further configured to receive the job tasks planned by the central platform system according to the global high-precision map dynamic data, where the global high-precision map dynamic data is obtained by fusing the high-precision map dynamic data corresponding to the different zones.
It should be noted that the edge computing method of the first embodiment can be implemented based on the edge computing device provided in this embodiment. A person skilled in the art can clearly understand that, for convenience and brevity of description, the specific working process of the edge computing device described in this embodiment may refer to the corresponding process in the foregoing method embodiment and is not repeated here.
In the edge computing device provided by this embodiment, the perception data of the traffic elements are first layered and subjected to target perception computation; the targets are then fused to form a high-precision dynamic map of the zone; and finally, cooperative control instructions for the traffic elements are generated according to scene triggering. Roadside perception and the edge computing unit acquire accurate and reliable real-time dynamic data of traffic elements and issue cooperative control instructions for strong cooperation and strong control of traffic elements, thereby realizing the road-controls-vehicle cooperative control function and ensuring the reliability and safety of multi-vehicle cooperative control in complex traffic scenes.
Referring to FIG. 6, FIG. 6 shows an electronic device according to the third embodiment of the present application. The electronic device can be used to implement the edge computing method of the foregoing embodiments. As shown in FIG. 6, the electronic device mainly includes:
a memory 601, a processor 602, a bus 603, and a computer program stored in the memory 601 and executable on the processor 602, with the memory 601 and the processor 602 connected by the bus 603. The processor 602, when executing the computer program, implements the edge computing method of the foregoing embodiments. There may be one or more processors.
The memory 601 may be a high-speed random access memory (RAM) or a non-volatile memory, such as disk storage. The memory 601 is used to store executable program code, and the processor 602 is coupled to the memory 601.
Further, an embodiment of the present application also provides a computer-readable storage medium, which may be provided in the electronic device of the foregoing embodiments and may be the memory of the embodiment shown in FIG. 6.
The computer-readable storage medium stores a computer program which, when executed by a processor, implements the edge computing method of the foregoing embodiments. Further, the computer-readable storage medium may be any of various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a RAM, a magnetic disk, or an optical disk.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of modules is merely a division of logical functions, and an actual implementation may have another division, for example, a plurality of modules or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be in an electrical, mechanical or other form.
Modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present application may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or in whole or in part, may be embodied in the form of a software product stored in a readable storage medium, including instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned readable storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
It should be noted that, for simplicity, the foregoing method embodiments are described as a series of combined actions, but a person skilled in the art should understand that the present application is not limited by the described order of actions, as some steps may be performed in other orders or simultaneously. Further, a person skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the actions and modules involved are not necessarily all required by the present application.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The edge computing method, device, and readable storage medium provided by the present application have been described above; a person skilled in the art will recognize that the specific implementations and scope of application may vary in accordance with the ideas of the embodiments of the present application.

Claims (10)

1. An edge computing method, comprising:
receiving vehicle perception data sent by an on-board perception device, and receiving road traffic situation perception data sent by the road perception devices of the zone to which the vehicle belongs;
calculating target feature data based on the vehicle perception data and the road traffic situation perception data, wherein the target feature data comprise the in-field targets of the zone and their corresponding feature values;
fusing the target feature data to obtain fused data, and generating high-precision map dynamic data corresponding to the zone based on the fused data;
generating a cooperative control instruction for each in-field target according to the high-precision map dynamic data and the in-field targets' job tasks;
and sending the corresponding cooperative control instruction to each in-field target.
2. The edge computing method according to claim 1, wherein the step of calculating target feature data based on the vehicle perception data and the road traffic situation perception data comprises:
identifying in-field targets based on the vehicle perception data and the road traffic situation perception data to obtain a target set;
performing coordinate transformation on all in-field targets in the target set into the actual coordinates of the site;
performing time synchronization processing on the target set based on a data sensing duration and a target recognition duration, wherein the data sensing duration is the time the sensor needs to sense the data and the target recognition duration is the time needed to recognize an in-field target;
performing motion trajectory simulation for each in-field target based on Kalman filtering to construct a corresponding first target trajectory function;
and performing timed feature sampling of each in-field target according to the first target trajectory function to obtain the target feature data.
3. The edge computing method according to claim 1, wherein the step of fusing the target feature data to obtain fused data comprises:
grouping and associating the target feature data from the different sensor types to obtain new target feature data;
calibrating the identity information of the grouped targets in the new target feature data;
performing motion trajectory simulation for the grouped targets based on Kalman filtering to construct corresponding second target trajectory functions;
predicting the motion trajectories of the grouped targets based on the second target trajectory function;
and performing timed feature sampling on the grouped simulated trajectories according to the second target trajectory function to obtain the fused data.
4. The edge computing method according to claim 3, wherein the step of generating high-precision map dynamic data corresponding to the zone based on the fused data comprises:
obtaining the target type corresponding to each in-field target based on its identity information;
determining corresponding target attribute values according to the target type;
constructing the geometry of the in-field target in the current planar space based on the target attribute values;
spatially transforming the geometry into a planar-space layer referenced to the site digital map;
and fusing the planar-space layer with the site base layer and labeling the layer elements to generate the high-precision map dynamic data corresponding to the zone.
5. The edge computing method according to claim 4, wherein the step of generating a cooperative control instruction for each in-field target according to the high-precision map dynamic data and the in-field targets' job tasks comprises:
performing motion trajectory simulation for the in-field targets according to the fused data to construct a corresponding third target trajectory function;
generating target motion planning data by combining the third target trajectory function, the high-precision map dynamic data, and the in-field targets' job tasks;
judging, based on the target motion planning data, whether a trajectory conflict exists between different in-field targets;
if a trajectory conflict exists, returning to the step of generating target motion planning data by combining the third target trajectory function, the high-precision map dynamic data, and the in-field targets' job tasks;
and if no trajectory conflict exists, generating the cooperative control instruction for each in-field target according to the target motion planning data.
6. The edge computing method according to any one of claims 1 to 5, wherein before the step of generating a cooperative control instruction according to the high-precision map dynamic data and the in-field targets' job tasks, the method further comprises:
sending the high-precision map dynamic data to a central platform system;
and receiving the job tasks planned by the central platform system according to global high-precision map dynamic data, wherein the global high-precision map dynamic data is obtained by fusing the high-precision map dynamic data corresponding to the different zones.
7. An edge computing device, comprising:
a receiving module, configured to receive vehicle perception data sent by the on-board perception devices and receive road traffic situation perception data sent by the road perception devices of the zone to which the vehicle belongs;
a calculation module, configured to calculate target feature data based on the vehicle perception data and the road traffic situation perception data, wherein the target feature data comprise the in-field targets of the zone and their corresponding feature values;
a fusion module, configured to fuse the target feature data to obtain fused data and generate high-precision map dynamic data corresponding to the zone based on the fused data;
a generation module, configured to generate a cooperative control instruction for each in-field target according to the high-precision map dynamic data and the in-field targets' job tasks;
and a sending module, configured to send the corresponding cooperative control instruction to each in-field target.
8. The edge computing device according to claim 7, wherein the sending module is further configured to send the high-precision map dynamic data to a central platform system, and the receiving module is further configured to receive the job tasks planned by the central platform system according to global high-precision map dynamic data, wherein the global high-precision map dynamic data is obtained by fusing the high-precision map dynamic data corresponding to the different zones.
9. An electronic device, comprising a processor, a memory, and a bus, wherein:
the bus is used to realize connection and communication between the processor and the memory;
and the processor is configured to execute one or more programs stored in the memory to implement the steps of the edge computing method according to any one of claims 1 to 6.
10. A computer-readable storage medium, wherein the computer-readable storage medium stores one or more programs which are executable by one or more processors to implement the steps of the edge computing method according to any one of claims 1 to 6.

Priority Applications (1)

CN202111328580.3A (priority and filing date 2021-11-10): Edge computing method and device, and readable storage medium


Publications (2)

CN114049767A, published 2022-02-15
CN114049767B, published 2023-05-12

Family

ID=80208144

Family Applications (1)

CN202111328580.3A (Active): Edge computing method and device, and readable storage medium

Country Status (1)

Country Link
CN (1) CN114049767B (en)


Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200258389A1 (en) * 2017-10-31 2020-08-13 Huawei Technologies Co., Ltd. Cellular network-based assisted driving method and traffic control unit
CN108919803A (en) * 2018-07-04 2018-11-30 北京踏歌智行科技有限公司 A kind of cooperative control method and device of mining automatic driving vehicle
US20200257310A1 (en) * 2019-02-13 2020-08-13 GM Global Technology Operations LLC Method and system for determining autonomous vehicle (av) action based on vehicle and edge sensor data
CN113326719A (en) * 2020-02-28 2021-08-31 华为技术有限公司 Method, equipment and system for target tracking
CN111601266A (en) * 2020-03-31 2020-08-28 浙江吉利汽车研究院有限公司 Cooperative control method and system
CN111554088A (en) * 2020-04-13 2020-08-18 重庆邮电大学 Multifunctional V2X intelligent roadside base station system
CN111818189A (en) * 2020-09-09 2020-10-23 浙江吉利控股集团有限公司 Vehicle road cooperative control system, method and medium
CN112289059A (en) * 2020-10-22 2021-01-29 中电智能技术南京有限公司 Vehicle-road cooperative road traffic system
CN112562314A (en) * 2020-11-02 2021-03-26 福瑞泰克智能***有限公司 Road end sensing method and device based on deep fusion, road end equipment and system
CN112435504A (en) * 2020-11-11 2021-03-02 清华大学 Centralized collaborative trajectory planning method and device under vehicle-road collaborative environment
CN112906777A (en) * 2021-02-05 2021-06-04 北京邮电大学 Target detection method and device, electronic equipment and storage medium
CN113485319A (en) * 2021-06-08 2021-10-08 中兴智能汽车有限公司 Automatic driving system based on 5G vehicle-road cooperation
CN113581211A (en) * 2021-08-30 2021-11-02 深圳清航智行科技有限公司 Vehicle driving control method, system and device and readable storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WANG Ling et al.: "Survey of resource allocation and task scheduling optimization in edge computing", Journal of System Simulation (系统仿真学报) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114615241A (en) * 2022-03-03 2022-06-10 智道网联科技(北京)有限公司 Dynamic road network display method based on high-precision map and related equipment
CN117275232A (en) * 2023-09-28 2023-12-22 广东省电信规划设计院有限公司 Dynamic sensing method and device based on vehicle-road cooperation
CN117275232B (en) * 2023-09-28 2024-05-31 广东省电信规划设计院有限公司 Dynamic sensing method and device based on vehicle-road cooperation

Also Published As

Publication number Publication date
CN114049767B (en) 2023-05-12


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant