CN113779174B - Method, system, equipment and medium for improving perception precision of roadside sensor - Google Patents


Info

Publication number
CN113779174B
CN113779174B (application number CN202111305830.1A)
Authority
CN
China
Prior art keywords
target
data
bsm
intelligent networked
compensation value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111305830.1A
Other languages
Chinese (zh)
Other versions
CN113779174A (en
Inventor
杨唐涛
张龙洋
何书贤
陈琳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ismartways Wuhan Technology Co ltd
Original Assignee
Ismartways Wuhan Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ismartways Wuhan Technology Co ltd filed Critical Ismartways Wuhan Technology Co ltd
Priority to CN202111305830.1A priority Critical patent/CN113779174B/en
Publication of CN113779174A publication Critical patent/CN113779174A/en
Application granted granted Critical
Publication of CN113779174B publication Critical patent/CN113779174B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
      • G06 - COMPUTING; CALCULATING OR COUNTING
        • G06F - ELECTRIC DIGITAL DATA PROCESSING
          • G06F 16/00 - Information retrieval; Database structures therefor; File system structures therefor
            • G06F 16/20 - Information retrieval of structured data, e.g. relational data
              • G06F 16/29 - Geographical information databases
            • G06F 16/90 - Details of database functions independent of the retrieved data types
              • G06F 16/95 - Retrieval from the web
                • G06F 16/953 - Querying, e.g. by the use of web search engines
                  • G06F 16/9537 - Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
        • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T 7/00 - Image analysis
            • G06T 7/70 - Determining position or orientation of objects or cameras
    • H - ELECTRICITY
      • H04 - ELECTRIC COMMUNICATION TECHNIQUE
        • H04W - WIRELESS COMMUNICATION NETWORKS
          • H04W 4/00 - Services specially adapted for wireless communication networks; Facilities therefor
            • H04W 4/02 - Services making use of location information
              • H04W 4/029 - Location-based management or tracking services
            • H04W 4/30 - Services specially adapted for particular environments, situations or purposes
              • H04W 4/40 - Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
                • H04W 4/44 - Services specially adapted for vehicles, for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)

Abstract

The application relates to a method, a system, a device and a storage medium for improving the perception accuracy of a roadside sensor. The method comprises: obtaining target intersection map information sent by an RSU and BSM data broadcast by an intelligent networked automobile; screening out, according to the map information of the target intersection, the target BSM data of the intelligent networked automobile within the range of the target intersection; obtaining the perception data of the roadside sensor within the time range during which the intelligent networked automobile enters and exits the target intersection, matching the target BSM data with the perception data, and calculating the matched target BSM data and perception data based on a preset dead reckoning rule to obtain a target position compensation value; and sending the target position compensation value to the roadside sensor so that the roadside sensor perceives the target position based on the target position compensation value. The target perception accuracy of the roadside sensor can thus be improved simply and effectively.

Description

Method, system, equipment and medium for improving perception precision of roadside sensor
Technical Field
The application relates to the technical field of vehicle networking, in particular to a method, a system, equipment and a storage medium for improving the perception accuracy of a roadside sensor.
Background
With the rapid development of intelligent transportation systems and vehicle-to-everything (V2X) technology, the concept of the connected intersection has gradually been proposed. A connected intersection is an infrastructure system that broadcasts signal phase and timing information (SPaT), MAP information and positioning correction data to on-board units (OBUs) or mobile units (MUs). The connected intersection includes roadside sensors such as cameras, lidar and millimeter-wave radar; these roadside sensors are used to perceive various kinds of information about targets, i.e. vehicles, at the intersection, and certain accuracy requirements are placed on them.
At present, the target perception accuracy of roadside sensors is mainly improved by raising the sensors' own perception accuracy through higher-order, complex algorithms, but this approach requires continuous algorithm iteration and is very cumbersome.
Disclosure of Invention
In view of this, the present application provides a method, a system, a device and a storage medium for improving the sensing accuracy of a roadside sensor, so as to simply and effectively improve the target sensing accuracy of the roadside sensor.
In order to solve the above problem, in a first aspect, the present application provides a method for improving the sensing accuracy of a roadside sensor, where the method includes:
acquiring target intersection map information sent by an RSU (roadside unit) and BSM (Basic Safety Message) data broadcast by an intelligent networked automobile;
screening target BSM data of the intelligent networked automobile in the range of the target intersection according to the map information of the target intersection;
acquiring sensing data of a road side sensor within a time range of the intelligent networked automobile entering and exiting a target intersection, matching the target BSM data with the sensing data, and calculating the matched target BSM data and the sensing data based on a preset dead reckoning rule to obtain a target position compensation value;
and sending the target position compensation value to the roadside sensor so that the roadside sensor perceives the target position based on the target position compensation value.
Optionally, the screening out of the target BSM data of the intelligent networked automobile within the range of the target intersection according to the map information of the target intersection includes:
extracting the longitude and latitude position Lon_c, Lat_c of the center point of the target intersection from the map information of the target intersection, and calculating, according to the longitude and latitude position Lon_v, Lat_v in each piece of BSM data, the distance d between the intelligent networked automobile and the center point of the target intersection, the formula being as follows:
a = sin²((Lat_v - Lat_c)/2) + cos(Lat_c)·cos(Lat_v)·sin²((Lon_v - Lon_c)/2)
d = 2R·arcsin(√a)
wherein R is the radius of the earth, and Lon_c, Lat_c, Lon_v, Lat_v are converted to radians before being substituted into the calculation;
if the distance d between the intelligent networked automobile and the center point of the target intersection is not greater than a preset threshold, determining that the intelligent networked automobile is within the range of the target intersection, and screening out the target BSM data of the intelligent networked automobile.
Optionally, the matching the target BSM data with the sensing data, and calculating the matched target BSM data and the sensing data based on a preset dead reckoning rule to obtain a target position compensation value includes:
matching the target BSM data with corresponding parameter values in the perception data, and determining target point data corresponding to the intelligent networked automobile in the perception data;
acquiring a corresponding timestamp in the target point data, carrying out dead reckoning on the target BSM data based on a preset dead reckoning rule, and calculating to obtain a dead reckoning position of the intelligent networked automobile at the corresponding timestamp;
and comparing the calculated position with a target point position corresponding to the timestamp in the target point data to obtain a target position compensation value.
Optionally, the matching the target BSM data with the corresponding parameter values in the sensing data to determine target point data corresponding to the intelligent networked automobile in the sensing data includes:
if a preset parameter difference value corresponding to a target in the perception data and an intelligent networked automobile in the target BSM data meets a preset condition, matching the perception data corresponding to the target into target point data corresponding to the intelligent networked automobile; the preset parameter difference comprises a timestamp difference, a course angle difference, a speed difference and/or a distance difference, and the preset conditions are that the absolute value of the timestamp difference is smaller than a preset timestamp difference threshold, the absolute value of the course angle difference is smaller than a preset course angle difference threshold, the absolute value of the speed difference is smaller than a preset speed difference threshold and/or the absolute value of the distance difference is smaller than a preset distance difference threshold.
Optionally, the obtaining of the corresponding timestamp in the target point data, performing dead reckoning on the target BSM data based on a preset dead reckoning rule, and calculating to obtain the dead reckoning position of the intelligent networked automobile at the corresponding timestamp includes:
the target BSM data is addedt t,m Latitude and longitude position of timeLon m AndLat m conversion to UTM coordinatesy m Andx m and carrying out dead reckoning calculation on the corresponding timestamp of the intelligent networked automobile according to the following formulat t,n The estimated position of (2):
Figure 730989DEST_PATH_IMAGE003
wherein,s m is as followsmThe speed parameter in the bar target BSM data,h m is as followsmThe course angle parameter in the bar target BSM data,y m,p x m,p are respectively asy m x m Position calculated by dead reckoning, and calculatingy m,p x m,p Transforming the coordinate system into longitude and latitude coordinates of the calculated positionLon m,p Lat m,p
Optionally, the comparing and calculating the calculated position with a target point position corresponding to a timestamp in the target point data to obtain a target position compensation value includes:
according to the longitude and latitude coordinates Lon_m,p, Lat_m,p of the estimated position and the target point position Lon_n,o, Lat_n,o at the corresponding timestamp t_t,n in the target point data, calculating the difference to obtain the target position compensation value, the formula being as follows:
Δlon = Lon_m,p - Lon_n,o
Δlat = Lat_m,p - Lat_n,o
wherein Δlon is the longitude compensation value in the target position compensation values, and Δlat is the latitude compensation value in the target position compensation values.
Optionally, the method further includes:
calculating an average value of target position compensation values corresponding to all timestamps in the target point data of the intelligent networked automobile to obtain a first compensation value;
and calculating an average value of the first compensation values of all the intelligent networked automobiles in the range of the target intersection to obtain a second compensation value, and sending the second compensation value serving as a target position compensation value to the roadside sensor.
In a second aspect, the present application provides a system for improving the target sensing accuracy of a roadside sensor, the system comprising:
the acquisition module is used for acquiring the map information of the target intersection sent by the RSU and the BSM data broadcasted by the intelligent networked automobile;
the screening module is used for screening out target BSM data of the intelligent networked automobile in the range of the target intersection according to the map information of the target intersection;
the calculation module is used for acquiring sensing data of a road side sensor within a time range when the intelligent networked automobile enters and exits a target intersection, matching the target BSM data with the sensing data, and calculating the matched target BSM data and the sensing data based on a preset dead reckoning rule to obtain a target position compensation value;
and the transmitting module is used for transmitting the target position compensation value to the roadside sensor so that the roadside sensor can sense the target position based on the target position compensation value.
In a third aspect, the present application provides a computer device, which adopts the following technical solution:
a computer device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method of improving the perception accuracy of a roadside sensor when executing the computer program.
In a fourth aspect, the present application provides a computer-readable storage medium, which adopts the following technical solutions:
a computer-readable storage medium, storing a computer program which, when executed by a processor, implements the steps of the method of improving the perception accuracy of a roadside sensor.
The beneficial effects of adopting the above embodiments are as follows: the BSM data of the intelligent networked automobile is matched with the perception data generated by the roadside sensor, a target position compensation value between the intelligent networked automobile data and the roadside sensor data is calculated according to the dead reckoning rule, and the target position compensation value is sent to the roadside sensor for application. Because the high-precision BSM data of the intelligent networked automobile can be matched against the perception data of the roadside sensor, the target perception accuracy of the roadside sensor is improved by means of this high-precision BSM data without any algorithm iteration, so the target perception accuracy of the roadside sensor is improved simply and effectively.
Drawings
FIG. 1 is a flowchart of a method of an embodiment of a method for improving the perception accuracy of a roadside sensor provided by the present application;
FIG. 2 is a schematic block diagram of an embodiment of a system for improving the target sensing accuracy of a roadside sensor provided by the present application;
FIG. 3 is a functional block diagram of an embodiment of a computer device provided herein.
Detailed Description
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate preferred embodiments of the application and together with the description, serve to explain the principles of the application and not to limit the scope of the application.
In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The application provides a method, a system, a device and a storage medium for improving the perception accuracy of a roadside sensor. First, the terms used in the application are explained as follows. RSU is the abbreviation of Road Side Unit, a device installed at the roadside in a vehicle-road cooperative system that communicates with the On-Board Unit (OBU) using DSRC (Dedicated Short-Range Communication) or C-V2X (Cellular Vehicle-to-Everything) technology. BSM, Basic Safety Message, includes speed, turn, brake, double-flash, position and other information. The UTM (Universal Transverse Mercator) grid coordinate is a planar rectangular coordinate; this grid system and the projection on which it is based are widely used in topographic maps, as a reference grid for satellite images and natural-resource databases, and in other applications requiring precise positioning. In the UTM system, the earth surface between 84 degrees north latitude and 80 degrees south latitude is divided into north and south longitudinal (projection) zones of 6 degrees of longitude each. MEC, Multi-access Edge Computing, is the multi-access edge computing technology.
Referring to fig. 1, a flowchart of a method according to an embodiment of the method for improving the sensing accuracy of a roadside sensor provided by the present application is shown, where the method for improving the sensing accuracy of the roadside sensor includes the following steps:
s101, acquiring target intersection map information sent by an RSU and BSM data broadcasted by an intelligent networking automobile;
s102, screening out target BSM data of the intelligent networked automobile in the range of the target intersection according to the map information of the target intersection;
s103, acquiring sensing data of a road side sensor within a time range of the intelligent networked automobile entering and exiting the target intersection, matching the target BSM data with the sensing data, and calculating the matched target BSM data and the sensing data based on a preset dead reckoning rule to obtain a target position compensation value;
and S104, sending the target position compensation value to the roadside sensor so that the roadside sensor can sense the target position based on the target position compensation value.
In this embodiment, the BSM data of the intelligent networked automobile is matched with the perception data generated by the roadside sensor, a target position compensation value between the intelligent networked automobile data and the roadside sensor data is calculated according to the dead reckoning rule, and the target position compensation value is sent to the roadside sensor for application. Because the high-precision BSM data of the intelligent networked automobile can be matched against the perception data of the roadside sensor, the target perception accuracy of the roadside sensor is improved by means of this high-precision BSM data without any algorithm iteration, so the target perception accuracy of the roadside sensor is improved simply and effectively.
Specifically, in one embodiment, step S102 includes:
extracting the longitude and latitude position Lon_c, Lat_c of the center point of the target intersection from the map information of the target intersection, and calculating, according to the longitude and latitude position Lon_v, Lat_v in each piece of BSM data, the distance d between the intelligent networked automobile and the center point of the target intersection, the formula being as follows:
a = sin²((Lat_v - Lat_c)/2) + cos(Lat_c)·cos(Lat_v)·sin²((Lon_v - Lon_c)/2)
d = 2R·arcsin(√a)
wherein R is the radius of the earth, and Lon_c, Lat_c, Lon_v, Lat_v are converted to radians before being substituted into the calculation;
if the distance d between the intelligent networked automobile and the center point of the target intersection is not greater than a preset threshold, determining that the intelligent networked automobile is within the range of the target intersection, and screening out the target BSM data of the intelligent networked automobile.
Based on intelligent networking data, the application combines the target-perception structured data output by roadside sensors (cameras, lidar, millimeter-wave radar and other equipment) with the centimeter-level RTK positioning data of the vehicle obtained by the OBU. Within the detection range of the roadside sensors installed at the target intersection there are both intelligent networked vehicles and non-networked vehicles. The roadside sensor can perceive and obtain information such as the position, speed and heading angle of these vehicles through its own perception algorithm. The intelligent networked vehicle, whose installed OBU equipment is provided with a GNSS module, can obtain a centimeter-level high-precision position through an RTCM differential server.
The intelligent networked vehicle is equipped with OBU equipment, and the external high-precision GNSS positioning module of the OBU equipment is also connected to the vehicle CAN bus. The positioning data comes from the high-precision GNSS module, which obtains differential data in real time through an RTCM differential server to correct itself and achieve centimeter-level positioning accuracy; the state data comes from the CAN bus. The OBU end obtains the high-precision longitude and latitude position, speed, heading angle and other data of the intelligent networked vehicle, generates a BSM message set and broadcasts it outwards by means of C-V2X communication. Roadside equipment (RSU, MEC) and various sensors are installed at the road end. The RSU end sends the high-precision intersection MAP stored in the equipment and the OBU messages sent by the intelligent networked vehicles to the MEC end. After the MEC end obtains the data forwarded by the RSU equipment and the target structured data of the roadside sensors, it calculates the sensor perception target position compensation value. Finally, the perception target position compensation value is output to the sensor and applied.
Specifically, firstly, the MEC reads the longitude and latitude position of the central point of the target intersection from the MAP message uploaded by the RSU. In this embodiment, for a smart intersection with intelligent roadside devices RSU, target sensing sensors (such as cameras, laser radar, and millimeter wave radar), and a multi-access edge computing MEC on the roadside, BSM data provided by an intelligent internet vehicle with an intelligent on-board device OBU in the intersection range is used to correct the target sensing structured data of the sensors.
When the BSM data reported by the intelligent networked vehicles within the intersection range is to be screened out, the longitude and latitude position of the center point of the target intersection is first read from the MAP message uploaded by the RSU. A high-precision lane-level map of the target intersection is pre-stored in the RSU equipment and broadcast outwards in the form of a MAP message. The MEC end is connected to the RSU equipment by wire through a convergence switch; it extracts the longitude and latitude position of the center point of the target intersection from the MAP standard message set and records it as Lon_c and Lat_c.
and reading the BSM message, and extracting the position and state information of the intelligent networked vehicle. In the embodiment, the intelligent networked vehicle is provided with the OBU device and has the C-V2X and 4G/5G communication functions. The OBU equipment acquires an RTK signal of a positioning service provider through a 4G/5G network, and corrects the position acquired by a GPS receiver of the OBU equipment so as to acquire a centimeter-level positioning result; meanwhile, the OBU equipment CAN be connected with a vehicle CAN bus to directly read parameters such as vehicle speed and the like. The OBU fills these precise parameters into the BSM message and broadcasts it out. The OBU device broadcasts outwards through C-V2X communication, and the RSU end receives the BSM message sent by the OBU device and then transmits the BSM message to the MEC end through a wire. And the MEC end extracts the acquired BSM message, and acquires information such as a timestamp, an equipment ID, a centimeter-level longitude and latitude position, a speed, a course angle, a vehicle type and the like contained in the BSM message.
The BSM messages within the range of the target intersection are then screened and recorded. In this embodiment, the MEC filters the acquired BSM messages, screens out the BSM messages reported by the intelligent networked automobiles within the intersection range (i.e. within the range that the other roadside sensors can perceive and measure), and records them. For each BSM message, the longitude and latitude position is recorded as Lon_v and Lat_v, and the distance d between the intelligent networked vehicle and the center point of the target intersection is calculated in real time as follows:
a = sin²((Lat_v - Lat_c)/2) + cos(Lat_c)·cos(Lat_v)·sin²((Lon_v - Lon_c)/2)
d = 2R·arcsin(√a)
wherein R = 6371.393 km is the radius of the earth, and Lon_c, Lat_c, Lon_v, Lat_v are the longitude and latitude of the center point of the target intersection and of the intelligent networked vehicle respectively, which need to be converted into radians (multiplied by π/180) before being substituted into the calculation.
If the distance d between the intelligent networked vehicle and the center point of the target intersection satisfies the following formula, the currently calculated intelligent networked vehicle position is considered to be within the intersection range:
d ≤ D_t
wherein D_t is the distance threshold used to judge whether the vehicle is within the intersection range, which may be determined according to the detection range of the roadside sensor. Through the BSM messages determined in this way, the MEC end stores the data generated and uploaded by the intelligent networked vehicles within the range of the target intersection in memory for subsequent calculation and calling.
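This screening step can be illustrated with a minimal sketch. It assumes the distance formula is the standard haversine form given above; the field names (lon, lat) and the threshold argument d_t_km are illustrative, not values fixed by this description:

```python
import math

EARTH_RADIUS_KM = 6371.393  # earth radius used in the distance formula above

def haversine_km(lon_c, lat_c, lon_v, lat_v):
    """Great-circle distance (km) between the intersection center and a vehicle position."""
    lon_c, lat_c, lon_v, lat_v = map(math.radians, (lon_c, lat_c, lon_v, lat_v))
    a = (math.sin((lat_v - lat_c) / 2) ** 2
         + math.cos(lat_c) * math.cos(lat_v) * math.sin((lon_v - lon_c) / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def screen_bsm(bsm_messages, lon_c, lat_c, d_t_km):
    """Keep only BSM messages whose distance d to the intersection center satisfies d <= D_t."""
    return [m for m in bsm_messages
            if haversine_km(lon_c, lat_c, m["lon"], m["lat"]) <= d_t_km]
```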
Specifically, in an embodiment, the matching the target BSM data with the sensing data, and calculating the matched target BSM data and the sensing data based on a preset dead reckoning rule to obtain a target position compensation value includes:
matching the target BSM data with corresponding parameter values in the perception data, and determining target point data corresponding to the intelligent networked automobile in the perception data;
acquiring a corresponding timestamp in the target point data, carrying out dead reckoning on the target BSM data based on a preset dead reckoning rule, and calculating to obtain a dead reckoning position of the intelligent networked automobile at the corresponding timestamp;
and comparing the calculated position with a target point position corresponding to the timestamp in the target point data to obtain a target position compensation value.
In this embodiment, the sensor target perception data within the time during which the intelligent networked vehicle enters and exits the target intersection is extracted. The basic idea of improving the perception accuracy of the roadside sensor based on intelligent networking data is to use the high-precision, centimeter-level positioning position provided by the intelligent networked vehicle as a reference for the perceived position data detected by the roadside sensor while the vehicle passes through the intersection range, so that a compensation value for the target perception data output by the perception equipment can be calculated and the perception accuracy of the sensor improved. Therefore, after the BSM messages within the range of the target intersection have been obtained and recorded, the times at which the intelligent networked vehicle enters and exits the target intersection should be extracted, as should the target structured data output by the sensor within that time range.
In this embodiment, for the BSM data reported by the intelligent networked automobiles within the range of the target intersection that has been filtered and recorded, the times at which each automobile enters and exits the intersection range can be identified according to the device ID and the timestamp of the BSM messages; these times are recorded as t_s and t_e respectively.
the structured data of the target perception output by the roadside sensor should have the following data, as shown in table 1:
TABLE 1
Figure 635141DEST_PATH_IMAGE011
The roadside sensor is connected to the MEC end in a wired manner; the MEC end receives the target-perception structured data uploaded by the roadside sensor and stores it locally at the MEC. According to the stored BSM data, the MEC extracts the times t_s and t_e at which the vehicle enters and exits the target intersection range, and then screens the locally stored target structured data, i.e. the timestamp t_t of the target structured data must lie between the times at which the V2X vehicle enters and exits the intersection, as follows:
t_s ≤ t_t ≤ t_e
In this way, the BSM data of the intelligent networked vehicle within the time range of entering and exiting the target intersection and the target structured data detected by the sensor within the same range are obtained.
The BSM message is matched to the target structured data. In this embodiment, the BSM information reported by the intelligent networked automobile in the range of the target intersection and the perception target structured data OBJ reported by the road side sensor corresponding to the time period are obtained. Each piece of OBJ data contains data of a plurality of detection targets, and the OBJ data also contains the intelligent networked automobile. Therefore, the BSM data and the OBJ data need to be matched to find the target point data corresponding to the intelligent networked automobile in the OBJ data.
Assume that the intelligent networked vehicle reports M BSM messages in total within the intersection range, and that the roadside sensors report N pieces of target structured data within the same time period, where each piece of target structured data contains a variable number of targets, which can be obtained from the object_num field.
Traverse each target in the BSM messages and in the target structured data and calculate the difference of each parameter; if the following conditions are satisfied, the information provided by the BSM and the detected target are considered to be the same target:
|Δt_m,n| < D_T, |Δh_m,n,o| < D_h, |Δs_m,n,o| < D_s, |Δd_m,n,o| < D
wherein Δt_m,n is the timestamp difference between the m-th (m = 1, 2, 3, …, M) BSM record and the n-th (n = 1, 2, 3, …, N) piece of target structured data, and D_T is the timestamp difference threshold used to judge whether two targets are the same target, generally taken as 100 ms (the BSM message upload period is 100 ms); Δh_m,n,o is the heading angle difference between the m-th BSM record and the o-th target (o = 1, 2, 3, …, object_num) in the n-th piece of target structured data, and D_h is the heading angle difference threshold used to judge whether two targets are the same target, which may be adjusted according to the actual perception accuracy of the sensor; Δs_m,n,o is the speed difference between the m-th BSM record and the o-th target in the n-th piece of target structured data, and D_s is the speed difference threshold used to judge whether two targets are the same target, which may be adjusted according to the actual perception accuracy of the sensor; Δd_m,n,o is the distance between the m-th BSM record and the o-th target in the n-th piece of target structured data, and D is the distance difference threshold used to judge whether two targets are the same target, which may be adjusted according to the actual perception accuracy of the sensor.
The calculation formula of each parameter is as follows:
Δt_m,n = t_t,m - t_t,n
Δh_m,n,o = h_m - h_n,o
Δs_m,n,o = s_m - s_n,o
Δd_m,n,o = the great-circle distance between (Lon_m, Lat_m) and (Lon_n,o, Lat_n,o), computed with the same distance formula used above
wherein t_t,m is the timestamp of the m-th BSM record; t_t,n is the timestamp of the n-th piece of target structured data; h_m is the heading angle of the m-th BSM record; h_n,o is the heading angle of the o-th target in the n-th piece of target structured data; s_m is the speed of the m-th BSM record; s_n,o is the speed of the o-th target in the n-th piece of target structured data; Lon_m and Lat_m are the longitude and latitude of the m-th BSM record; Lon_n,o and Lat_n,o are the longitude and latitude of the o-th target in the n-th piece of target structured data.
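The matching test can be sketched as follows. It reuses the haversine_km helper from the earlier sketch, and the default threshold values and dictionary field names are illustrative assumptions rather than values fixed by this description (only the 100 ms timestamp threshold is stated above):

```python
def is_same_target(bsm, target,
                   d_time_s=0.1, d_heading_deg=10.0, d_speed_mps=2.0, d_dist_m=5.0):
    """True when every parameter difference between a BSM record and one detected target
    falls below its threshold (timestamp, heading angle, speed, distance)."""
    dt = abs(bsm["timestamp"] - target["timestamp"])
    dh = abs(bsm["heading"] - target["heading"])
    ds = abs(bsm["speed"] - target["speed"])
    dd = haversine_km(bsm["lon"], bsm["lat"], target["lon"], target["lat"]) * 1000.0
    return dt < d_time_s and dh < d_heading_deg and ds < d_speed_mps and dd < d_dist_m
```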
For the m-th BSM record and the o-th target of the n-th piece of target structured data that satisfy the above conditions, the timestamps are not identical and a time deviation still exists, so dead reckoning must be performed on the BSM data to estimate the position of the target at the timestamp of the target structured data, from which the longitude and latitude compensation values are then calculated. For the position estimation, the BSM longitude and latitude position Lon_m and Lat_m at time t_t,m are first converted into UTM coordinates y_m and x_m. The position estimation formula is as follows:
y_m,p = y_m + s_m·(t_t,n - t_t,m)·cos(h_m)
x_m,p = x_m + s_m·(t_t,n - t_t,m)·sin(h_m)
wherein y_m,p and x_m,p are the positions estimated by dead reckoning from y_m and x_m respectively; after y_m,p and x_m,p have been calculated, the coordinate system is transformed back to obtain the longitude and latitude coordinates Lon_m,p and Lat_m,p of the estimated position.
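The dead reckoning step can be sketched as below. The WGS-84/UTM conversion is done here with pyproj (an assumed dependency; any geodetic library works), the UTM zone EPSG:32650 is only an example, and the heading angle is assumed to be given in degrees measured clockwise from north:

```python
import math
from pyproj import Transformer  # assumed dependency for the lon/lat <-> UTM conversion

TO_UTM = Transformer.from_crs("EPSG:4326", "EPSG:32650", always_xy=True)   # example UTM zone
TO_WGS = Transformer.from_crs("EPSG:32650", "EPSG:4326", always_xy=True)

def dead_reckon(bsm, t_target):
    """Propagate the BSM position at t_t,m forward to the sensor timestamp t_t,n (seconds)."""
    x_m, y_m = TO_UTM.transform(bsm["lon"], bsm["lat"])   # easting x_m, northing y_m in meters
    dt = t_target - bsm["timestamp"]
    h = math.radians(bsm["heading"])                      # heading clockwise from north
    x_p = x_m + bsm["speed"] * dt * math.sin(h)           # easting component
    y_p = y_m + bsm["speed"] * dt * math.cos(h)           # northing component
    lon_p, lat_p = TO_WGS.transform(x_p, y_p)
    return lon_p, lat_p
```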
The sensor perception target position compensation value is then calculated. In this embodiment, the obtained longitude and latitude coordinates Lon_m,p, Lat_m,p are the estimated position of the intelligent networked vehicle at time t_t,n, so the position compensation can be obtained by subtracting the target point position of the corresponding sensor target structured data at t_t,n, as follows:
Δlon = Lon_m,p - Lon_n,o
Δlat = Lat_m,p - Lat_n,o
wherein Δlon is the longitude compensation value, i.e. the difference between the longitude position predicted from the matched BSM position and the corresponding longitude position of the target detection result for the vehicle; Δlat is the latitude compensation value, i.e. the difference between the latitude position predicted from the matched BSM position and the corresponding latitude position of the target detection result for the vehicle.
The compensation values of all the position points at the different timestamps matched by the BSM messages are averaged to obtain the longitude and latitude compensation values derived from one intelligent networked vehicle, as follows:
Δlon_i = (1/K)·Σ_k Δlon_k,  Δlat_i = (1/K)·Σ_k Δlat_k
wherein K is the number of matched position points of the i-th vehicle, and Δlon_k, Δlat_k are the compensation values obtained at the k-th matched position point.
and if a plurality of intelligent networked vehicles enter the intersection range, calculating the longitude and latitude compensation values through all the V2X vehicles.
If multiple intelligent networked vehicles equipped with OBU devices drive into the intersection, different vehicles can be distinguished through the device ID field, and the sensor target longitude and latitude position compensation values Δlon_i and Δlat_i are calculated from the BSM data of each vehicle; their average is then taken as the compensation value of the sensor target longitude and latitude position, as follows:
Δlon = (1/n)·Σ_{i=1..n} Δlon_i,  Δlat = (1/n)·Σ_{i=1..n} Δlat_i
wherein n is the number of intelligent networked vehicles entering the intersection range, and Δlon_i and Δlat_i respectively denote the sensor target longitude and latitude position compensation values calculated from the BSM data of the i-th vehicle.
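A sketch of this two-level averaging (first per timestamp within a vehicle, then across vehicles); the input structure shown is an assumption made for illustration:

```python
def _mean(values):
    return sum(values) / len(values)

def average_compensation(offsets_by_vehicle):
    """offsets_by_vehicle: {device_id: [(d_lon, d_lat), ...]} of per-timestamp offsets.
    Returns the final (d_lon, d_lat) compensation sent to the roadside sensor."""
    per_vehicle = [(_mean([o[0] for o in offs]), _mean([o[1] for o in offs]))
                   for offs in offsets_by_vehicle.values() if offs]
    return _mean([v[0] for v in per_vehicle]), _mean([v[1] for v in per_vehicle])
```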
and outputting the perception target position compensation value to the sensor and applying the perception target position compensation value. In this embodiment, the MEC terminal reversely outputs the sensing target position compensation value obtained through calculation to the roadside sensor. After receiving the position compensation information, the roadside sensor adds a compensation value to the longitude and latitude position of each target of each piece of target structured data output by the roadside sensor, as follows:
Figure 442566DEST_PATH_IMAGE031
wherein,Lon f,o for the first time after compensation value correctionoThe longitude of the individual target;Lon c,o for the current to be correctedoThe longitude of the individual target;Lat f,o for the first time after compensation value correctionoThe latitude of the individual target;Lat c,o for the current to be correctedoLatitude of the individual target.
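On the sensor side, applying the broadcast compensation is then a simple additive correction; the record layout below is illustrative:

```python
def apply_compensation(obj_record, d_lon, d_lat):
    """Add the compensation value to every target in one piece of target structured data."""
    for tgt in obj_record["targets"]:
        tgt["lon"] += d_lon   # Lon_f,o = Lon_c,o + d_lon
        tgt["lat"] += d_lat   # Lat_f,o = Lat_c,o + d_lat
    return obj_record
```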
In this embodiment, the high-precision RTK centimeter-level positioning position and heading angle information provided by the intelligent networked vehicle, together with the accurate speed value read from the CAN bus, are combined with the target-perception structured data output by the roadside sensor to calculate the compensation value for the longitude and latitude positions perceived by the roadside sensor, thereby improving the target position perception accuracy of the roadside sensor.
In this method for improving the target perception accuracy of the sensor, a roadside sensor with poor perception accuracy is corrected using the high-accuracy data of intelligent networked vehicles passing through the target intersection. Without requiring any improvement of the sensor's hardware configuration, the method improves perception accuracy by means of additional intelligent networking data, greatly saving the cost of upgrading the roadside sensor hardware while still achieving a good target position perception and detection effect.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
The embodiment also provides a system for improving the target perception precision of the road side sensor, and the system for improving the target perception precision of the road side sensor is in one-to-one correspondence with the method for improving the perception precision of the road side sensor in the embodiment. As shown in fig. 2, the system for improving the target perception accuracy of the roadside sensor includes an obtaining module 201, a screening module 202, a calculating module 203, and a sending module 204. The functional modules are explained in detail as follows:
an obtaining module 201, configured to obtain target intersection map information sent by an RSU and BSM data broadcast by an intelligent internet vehicle;
the screening module 202 is used for screening out target BSM data of the intelligent networked automobile in the range of the target intersection according to the map information of the target intersection;
the calculation module 203 is used for acquiring sensing data of a roadside sensor within a time range when the intelligent networked automobile enters and exits a target intersection, matching the target BSM data with the sensing data, and calculating the matched target BSM data and the sensing data based on a preset dead reckoning rule to obtain a target position compensation value;
a sending module 204, configured to send the target position compensation value to the roadside sensor, so that the roadside sensor perceives the target position based on the target position compensation value.
For specific limitations of each module of the system for improving the target sensing accuracy of the roadside sensor, reference may be made to the above limitations on the method for improving the target sensing accuracy of the roadside sensor, and details are not repeated here. All or part of the modules in the system for improving the target perception accuracy of the roadside sensor can be realized by software, hardware and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
Referring to fig. 3, the present embodiment further provides a computer device, which may be a computing device such as a mobile terminal, a desktop computer, a notebook, a palmtop computer, and a server. The computer device comprises a processor 10, a memory 20 and a display 30. FIG. 3 shows only some of the components of the computer device, but it is to be understood that not all of the shown components are required to be implemented, and that more or fewer components may be implemented instead.
The storage 20 may in some embodiments be an internal storage unit of the computer device, such as a hard disk or a memory of the computer device. The memory 20 may also be an external storage device of the computer device in other embodiments, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), etc. provided on the computer device. Further, the memory 20 may also include both an internal storage unit and an external storage device of the computer device. The memory 20 is used for storing application software installed in the computer device and various data, such as program codes installed in the computer device. The memory 20 may also be used to temporarily store data that has been output or is to be output. In one embodiment, the memory 20 has stored thereon a computer program 40 for improving the accuracy of roadside sensor target perception.
The processor 10 may be a Central Processing Unit (CPU), microprocessor or other data Processing chip in some embodiments, and is used for executing program codes stored in the memory 20 or Processing data, such as executing a method for improving the sensing accuracy of the roadside sensor.
The display 30 may be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an OLED (Organic Light-Emitting Diode) touch panel, or the like in some embodiments. The display 30 is used for displaying information at the computer device and for displaying a visual user interface. The components 10-30 of the computer device communicate with each other via a system bus.
In one embodiment, the following steps are implemented when the processor 10 executes the computer program 40 in the memory 20 for improving the perception accuracy of the roadside sensor target:
acquiring target intersection map information sent by an RSU (roadside unit) and BSM (Basic Safety Message) data broadcast by an intelligent networked automobile;
screening target BSM data of the intelligent networked automobile in the range of the target intersection according to the map information of the target intersection;
acquiring sensing data of a road side sensor within a time range of the intelligent networked automobile entering and exiting a target intersection, matching the target BSM data with the sensing data, and calculating the matched target BSM data and the sensing data based on a preset dead reckoning rule to obtain a target position compensation value;
and sending the target position compensation value to the roadside sensor so that the roadside sensor perceives the target position based on the target position compensation value.
The present embodiment also provides a computer-readable storage medium having stored thereon a computer program for improving the accuracy of roadside sensor target perception, the computer program, when executed by a processor, implementing the steps of:
acquiring target intersection map information sent by an RSU (roadside unit) and BSM (Basic Safety Message) data broadcast by an intelligent networked automobile;
screening target BSM data of the intelligent networked automobile in the range of the target intersection according to the map information of the target intersection;
acquiring sensing data of a road side sensor within a time range of the intelligent networked automobile entering and exiting a target intersection, matching the target BSM data with the sensing data, and calculating the matched target BSM data and the sensing data based on a preset dead reckoning rule to obtain a target position compensation value;
and sending the target position compensation value to the roadside sensor so that the roadside sensor perceives the target position based on the target position compensation value.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above.
Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDRSDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), Rambus Direct RAM (RDRAM), direct bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM).
The above description is only for the preferred embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present application should be covered within the scope of the present application.

Claims (8)

1. A method for improving the perception accuracy of a roadside sensor is characterized by comprising the following steps:
acquiring target intersection map information sent by an RSU (roadside unit) and BSM (Basic Safety Message) data broadcast by an intelligent networked automobile;
screening target BSM data of the intelligent networked automobile in the range of the target intersection according to the map information of the target intersection;
acquiring sensing data of a road side sensor within a time range of the intelligent networked automobile entering and exiting a target intersection, matching the target BSM data with the sensing data, and calculating the matched target BSM data and the sensing data based on a preset dead reckoning rule to obtain a target position compensation value;
sending the target position compensation value to the roadside sensor so that the roadside sensor perceives a target position based on the target position compensation value;
the matching the target BSM data with the sensing data and calculating the matched target BSM data and the sensing data based on a preset dead reckoning rule to obtain a target position compensation value includes:
matching the target BSM data with corresponding parameter values in the perception data, and determining target point data corresponding to the intelligent networked automobile in the perception data;
acquiring a corresponding timestamp in the target point data, carrying out dead reckoning on the target BSM data based on a preset dead reckoning rule, and calculating to obtain a dead reckoning position of the intelligent networked automobile at the corresponding timestamp;
comparing the calculated position with a target point position of a corresponding timestamp in the target point data to obtain a target position compensation value;
the acquiring of the corresponding timestamp in the target point data, performing dead reckoning on the target BSM data based on a preset dead reckoning rule, and calculating to obtain the dead reckoning position of the intelligent networked automobile at the corresponding timestamp includes:
converting the longitude and latitude position Lon_m and Lat_m of the target BSM data at time t_t,m into UTM coordinates y_m and x_m, and performing dead reckoning according to the following formula to obtain the estimated position of the intelligent networked automobile at the corresponding timestamp t_t,n:
y_m,p = y_m + s_m·(t_t,n - t_t,m)·cos(h_m)
x_m,p = x_m + s_m·(t_t,n - t_t,m)·sin(h_m)
wherein s_m is the speed parameter in the m-th piece of target BSM data, h_m is the heading angle parameter in the m-th piece of target BSM data, and y_m,p, x_m,p are the positions obtained by dead reckoning from y_m, x_m respectively; after y_m,p, x_m,p are calculated, the coordinate system is transformed to obtain the longitude and latitude coordinates Lon_m,p, Lat_m,p of the estimated position.
2. The method for improving the perception accuracy of the roadside sensor according to claim 1, wherein the screening out target BSM data of the intelligent networked automobile within the range of the target intersection according to the map information of the target intersection comprises:
extracting the longitude and latitude position Lon_c, Lat_c of the center point of the target intersection from the map information of the target intersection, and calculating, according to the longitude and latitude position Lon_v, Lat_v in each piece of BSM data, the distance d between the intelligent networked automobile and the center point of the target intersection, the formula being as follows:
a = sin²((Lat_v - Lat_c)/2) + cos(Lat_c)·cos(Lat_v)·sin²((Lon_v - Lon_c)/2)
d = 2R·arcsin(√a)
wherein R is the radius of the earth, and Lon_c, Lat_c, Lon_v, Lat_v are converted to radians before being substituted into the calculation;
if the distance d between the intelligent networked automobile and the center point of the target intersection is not greater than a preset threshold, determining that the intelligent networked automobile is within the range of the target intersection, and screening out the target BSM data of the intelligent networked automobile.
3. The method for improving the perception accuracy of the roadside sensor according to claim 1, wherein the matching the target BSM data with corresponding parameter values in the perception data to determine target point data corresponding to an intelligent networked automobile in the perception data comprises:
if a preset parameter difference value corresponding to a target in the perception data and an intelligent networked automobile in the target BSM data meets a preset condition, matching the perception data corresponding to the target into target point data corresponding to the intelligent networked automobile; the preset parameter difference comprises a timestamp difference, a course angle difference, a speed difference and/or a distance difference, and the preset conditions are that the absolute value of the timestamp difference is smaller than a preset timestamp difference threshold, the absolute value of the course angle difference is smaller than a preset course angle difference threshold, the absolute value of the speed difference is smaller than a preset speed difference threshold and/or the absolute value of the distance difference is smaller than a preset distance difference threshold.
4. The method for improving the perception accuracy of the roadside sensor according to claim 1, wherein the comparing the calculated position with a target point position corresponding to a time stamp in the target point data to obtain a target position compensation value comprises:
according to the longitude and latitude coordinates Lon_m,p, Lat_m,p of the estimated position and the target point position Lon_n,o, Lat_n,o at the corresponding timestamp t_t,n in the target point data, calculating the difference to obtain the target position compensation value, the formula being as follows:
Δlon = Lon_m,p - Lon_n,o
Δlat = Lat_m,p - Lat_n,o
wherein Δlon is the longitude compensation value in the target position compensation values, and Δlat is the latitude compensation value in the target position compensation values.
5. The method for improving the perception accuracy of the roadside sensor according to claim 4, further comprising:
calculating an average value of target position compensation values corresponding to all timestamps in the target point data of the intelligent networked automobile to obtain a first compensation value;
and calculating an average value of the first compensation values of all the intelligent networked automobiles in the range of the target intersection to obtain a second compensation value, and sending the second compensation value serving as a target position compensation value to the roadside sensor.
6. A system for improving the target perception accuracy of a roadside sensor, the system comprising:
the acquisition module is used for acquiring the map information of the target intersection sent by the RSU and the BSM data broadcasted by the intelligent networked automobile;
the screening module is used for screening out target BSM data of the intelligent networked automobile in the range of the target intersection according to the map information of the target intersection;
the calculation module is used for acquiring sensing data of a road side sensor within a time range when the intelligent networked automobile enters and exits a target intersection, matching the target BSM data with the sensing data, and calculating the matched target BSM data and the sensing data based on a preset dead reckoning rule to obtain a target position compensation value;
a sending module, configured to send the target position compensation value to the roadside sensor, so that the roadside sensor senses a target position based on the target position compensation value;
the matching the target BSM data with the sensing data and calculating the matched target BSM data and the sensing data based on a preset dead reckoning rule to obtain a target position compensation value includes:
matching the target BSM data with corresponding parameter values in the perception data, and determining target point data corresponding to the intelligent networked automobile in the perception data;
acquiring a corresponding timestamp in the target point data, carrying out dead reckoning on the target BSM data based on a preset dead reckoning rule, and calculating to obtain a dead reckoning position of the intelligent networked automobile at the corresponding timestamp;
comparing the calculated position with a target point position of a corresponding timestamp in the target point data to obtain a target position compensation value;
the acquiring of the corresponding timestamp in the target point data, performing dead reckoning on the target BSM data based on a preset dead reckoning rule, and calculating to obtain the dead reckoning position of the intelligent networked automobile at the corresponding timestamp includes:
converting the longitude and latitude position Lon_m and Lat_m at time t_{t,m} in the target BSM data into UTM coordinates y_m and x_m, and performing dead reckoning for the intelligent networked automobile at the corresponding timestamp t_{t,n} according to the following formula:

x_{m,p} = x_m + s_m · sin(h_m) · (t_{t,n} - t_{t,m})
y_{m,p} = y_m + s_m · cos(h_m) · (t_{t,n} - t_{t,m})

wherein s_m is the speed parameter in the m-th piece of target BSM data, h_m is the course angle parameter in the m-th piece of target BSM data, and y_{m,p} and x_{m,p} are the positions obtained by dead reckoning from y_m and x_m; y_{m,p} and x_{m,p} are then converted back to the longitude and latitude coordinates Lon_{m,p} and Lat_{m,p} of the calculated position.
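By way of illustration (not part of the claims), the UTM conversion and dead-reckoning step can be sketched as follows; the third-party utm package, a course angle measured clockwise from true north in degrees, and a speed in metres per second are assumptions:

import utm  # third-party package for WGS84 <-> UTM conversion (assumed here)
from math import radians, sin, cos

def dead_reckon(lon_m, lat_m, s_m, h_m, t_m, t_n):
    # convert the BSM longitude/latitude at time t_m to UTM easting/northing
    x_m, y_m, zone, letter = utm.from_latlon(lat_m, lon_m)
    dt = t_n - t_m
    # project forward along the course angle for the elapsed time
    x_mp = x_m + s_m * sin(radians(h_m)) * dt  # easting
    y_mp = y_m + s_m * cos(radians(h_m)) * dt  # northing
    # convert the dead-reckoned UTM position back to longitude/latitude
    lat_mp, lon_mp = utm.to_latlon(x_mp, y_mp, zone, letter)
    return lon_mp, lat_mp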
7. A computer device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method for improving the perception accuracy of a roadside sensor according to any one of claims 1 to 5 when executing the computer program.
8. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, implements the steps of the method for improving the perception accuracy of a roadside sensor according to any one of claims 1 to 5.
CN202111305830.1A 2021-11-05 2021-11-05 Method, system, equipment and medium for improving perception precision of roadside sensor Active CN113779174B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111305830.1A CN113779174B (en) 2021-11-05 2021-11-05 Method, system, equipment and medium for improving perception precision of roadside sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111305830.1A CN113779174B (en) 2021-11-05 2021-11-05 Method, system, equipment and medium for improving perception precision of roadside sensor

Publications (2)

Publication Number Publication Date
CN113779174A CN113779174A (en) 2021-12-10
CN113779174B (en) 2022-04-01

Family

ID=78956643

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111305830.1A Active CN113779174B (en) 2021-11-05 2021-11-05 Method, system, equipment and medium for improving perception precision of roadside sensor

Country Status (1)

Country Link
CN (1) CN113779174B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114065876B (en) * 2022-01-11 2022-04-12 华砺智行(武汉)科技有限公司 Data fusion method, device, system and medium based on roadside multi-sensor
CN115188187A (en) * 2022-07-05 2022-10-14 浙江嘉兴数字城市实验室有限公司 Roadside perception data quality monitoring system and method based on vehicle-road cooperation
CN116561534B (en) * 2023-07-10 2023-10-13 苏州映赛智能科技有限公司 Method and system for improving accuracy of road side sensor based on self-supervision learning

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9835729B2 (en) * 2012-12-28 2017-12-05 Trimble Inc. Global navigation satellite system receiver system with radio frequency hardware component
CN105741546B (en) * 2016-03-18 2018-06-29 重庆邮电大学 The intelligent vehicle Target Tracking System and method that roadside device is merged with vehicle sensor
BR112020004099A2 (en) * 2017-08-30 2020-09-24 Nissan Motor Co., Ltd. position correction method and position error correction device for driving-aided vehicles
CN111006680B (en) * 2019-12-04 2020-12-08 无锡物联网创新中心有限公司 Automatic driving vehicle path planning system and method based on V2I technology
CN111447556B (en) * 2020-04-01 2022-02-15 联陆智能交通科技(上海)有限公司 Method and system for screening vehicles under cooperative vehicle and road environment
CN111785019B (en) * 2020-06-22 2022-08-16 北京千方科技股份有限公司 Vehicle traffic data generation method and system based on V2X and storage medium
CN111913200B (en) * 2020-06-28 2023-07-14 深圳市金溢科技股份有限公司 Vehicle group differential positioning method, RSU equipment, fusion sensing equipment and system

Also Published As

Publication number Publication date
CN113779174A (en) 2021-12-10

Similar Documents

Publication Publication Date Title
CN113779174B (en) Method, system, equipment and medium for improving perception precision of roadside sensor
Qin et al. Vehicles on RFID: Error-cognitive vehicle localization in GPS-less environments
CN105793669B (en) Vehicle position estimation system, device, method, and camera device
US20230288208A1 (en) Sensor plausibility using gps road information
KR102221321B1 (en) Method for providing information about a anticipated driving intention of a vehicle
US11530931B2 (en) System for creating a vehicle surroundings model
US8892331B2 (en) Drive assist system and wireless communication device for vehicle
CN102445702B (en) Use the relative positioning enhancement method based on GPS of neighboring entity information
CN110906939A (en) Automatic driving positioning method and device, electronic equipment, storage medium and automobile
US20220157168A1 (en) V2X with 5G/6G Image Exchange and AI-Based Viewpoint Fusion
US9307369B2 (en) Wireless position detection apparatus and storage medium
CN102334147A (en) Vehicle-mounted information processing apparatus and information processing method
JP2014109795A (en) Vehicle position estimation device
CN112009484A (en) Method and system for determining driving assistance data
US20220057230A1 (en) Method For Checking Detected Changes To An Environmental Model Of A Digital Area Map
Williams et al. A qualitative analysis of vehicle positioning requirements for connected vehicle applications
EP4020111B1 (en) Vehicle localisation
CN114503176B (en) Method for acquiring self position and electronic device
CN114174137A (en) Source lateral offset of ADAS or AD features
CN110869864B (en) Method for locating a vehicle with a high degree of automation, and corresponding driver assistance system and computer program
KR102273506B1 (en) Method, device and computer-readable storage medium with instructions for determinig the position of data detected by a motor vehicle
US11908206B2 (en) Compensation for vertical road curvature in road geometry estimation
CN112585425A (en) Method for locating a vehicle
CN113179303A (en) Method, device and program carrier for reporting traffic events
US20110037617A1 (en) System and method for providing vehicular safety service

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Methods, systems, equipment, and media for improving the perception accuracy of roadside sensors

Effective date of registration: 20231010

Granted publication date: 20220401

Pledgee: Bank of China Limited Wuhan Economic and Technological Development Zone sub branch

Pledgor: ISMARTWAYS (WUHAN) TECHNOLOGY Co.,Ltd.

Registration number: Y2023980060478