CN113639764A - ADAS synchronous testing device and method based on multi-vision sensor - Google Patents


Info

Publication number
CN113639764A
CN113639764A
Authority
CN
China
Prior art keywords
visual, sensors, vision, sensor, ADAS
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110776975.3A
Other languages
Chinese (zh)
Inventor
李森林
肖蕾
郝江波
周风明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Kotei Informatics Co Ltd
Original Assignee
Wuhan Kotei Informatics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Kotei Informatics Co Ltd filed Critical Wuhan Kotei Informatics Co Ltd
Priority to CN202110776975.3A priority Critical patent/CN113639764A/en
Publication of CN113639764A publication Critical patent/CN113639764A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 — Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass

Landscapes

  • Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to an ADAS synchronous testing device and method based on multiple vision sensors. The testing device comprises a camera bellows, a scene simulation device, a plurality of vision sensors, a display device and an upper computer, wherein the scene simulation device, the plurality of vision sensors and the display device are all arranged inside the camera bellows: the scene simulation device provides a simulated-scene motion environment and field of view for the plurality of vision sensors; the upper computer is in communication connection with the plurality of vision sensors and the display device respectively, and acquires and parses the message data of the plurality of vision sensors; the display device provides synchronized video information for the plurality of vision sensors. The invention realizes real-time display of the results of synchronous multi-vision-sensor detection and final output of a visual comparison report, effectively improving the efficiency and reducing the cost of testing the ADAS algorithms of different vision sensors.

Description

ADAS synchronous testing device and method based on multi-vision sensor
Technical Field
The invention belongs to the field of automatic driving test, and relates to an ADAS synchronous test device and method based on a multi-vision sensor.
Background
ADAS (Advanced Driver Assistance Systems) is an important early stage of automated driving. ADAS simulation testing can cover test mileage that is difficult to reach in real road testing and can quickly simulate arbitrary scenes, thereby shortening real-vehicle road-test time and reducing its cost and risk. At present there are three main technical routes for driver assistance systems: vision-dominant, lidar-dominant, and vehicle-networking (V2X)-dominant, and all three require a vision solution. It is therefore very important for an OEM to function-test the vision sensors provided by various suppliers before actual road tests. However, a traditional video camera bellows tests only a single vision sensor, so synchronous testing of multiple vision sensors is not possible; moreover, the traditional video camera bellows offers only a single test scene and cannot effectively test the ADAS algorithm performance of a vision sensor.
Disclosure of Invention
In order to detect a plurality of visual sensors and provide a plurality of test scenes, in a first aspect of the present invention, a multi-visual sensor based ADAS synchronous testing device is provided, which includes a camera bellows, a scene simulation device, a plurality of visual sensors, a display device and an upper computer, wherein the scene simulation device, the plurality of visual sensors and the display device are all disposed inside the camera bellows, and: the scene simulation device is used for providing a motion environment and a field of view for simulating a scene for the plurality of visual sensors; the upper computer is respectively in communication connection with the plurality of visual sensors and the display device and is used for acquiring and analyzing message data of the plurality of visual sensors; the display device is used for providing synchronous video information for the plurality of visual sensors.
In some embodiments of the invention, the scene simulator comprises a plurality of sliding rails for providing a simulated motion environment for the plurality of visual sensors and a plurality of adjustment mechanisms; the plurality of adjustment mechanisms are used for adjusting the visual fields of the video information at the plurality of vision sensors.
Furthermore, each adjusting mechanism at least comprises a visual sensor base and a visual sensor support, and the visual sensor base is used for fixing one end of the visual sensor support on the camera bellows; the vision sensor support is used for providing support for the vision sensor under the motion of multiple degrees of freedom.
In some embodiments of the present invention, the acquiring and parsing of the message data of the plurality of visual sensors includes the following steps: identifying the message data generated by each vision sensor according to the transmission protocol of that vision sensor; and parsing the message data generated by each visual sensor according to the DBC file of that visual sensor. Further, parsing the message data generated by each visual sensor according to its DBC file includes the following steps: parsing the message data generated by each visual sensor to obtain target object information or traffic sign information; acquiring the lateral distance, longitudinal distance, width and height of each target object and traffic sign from the target object information or the traffic sign information; and framing each target object and traffic sign.
In the above embodiment, the upper computer is further configured to determine a distance between the scene simulation apparatus and each of the vision sensors according to the parameter of each of the vision sensors, and control the scene simulation apparatus to fill the field of view of each of the vision sensors with the synchronized video information provided by the display apparatus.
In a second aspect of the present invention, a testing method for the ADAS synchronous testing device based on the multi-vision sensor provided in the first aspect is provided, including the following steps: determining that video information in the display device occupies an entire field of view of each of the vision sensors; acquiring and analyzing message data of the plurality of vision sensors; and performing synchronous function verification on the multi-vision sensor in multiple scenes.
And further, outputting a test report according to the analyzed data and the verification result.
In a third aspect of the present invention, there is provided an electronic device comprising: one or more processors; a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement a method for testing a multi-vision sensor-based ADAS synchronization test apparatus as described in the second aspect of the present invention.
In a fourth aspect of the present invention, a computer readable medium is provided, on which a computer program is stored, wherein the computer program, when being executed by a processor, implements the testing method of the ADAS synchronization testing apparatus based on multiple visual sensors according to the second aspect of the present invention.
The invention has the beneficial effects that:
1. The invention realizes synchronous detection of multiple vision sensors by building a video camera bellows hardware environment for synchronous multi-vision-sensor performance testing;
2. the upper computer analyzes the message formats of different visual sensors, so that the invention CAN be suitable for the synchronous performance test of the visual sensors with different CAN and CANFD formats;
3. performing synchronous function verification on the multi-vision sensor through a preset automatic driving scene library;
4. and a visual comparison report of the multi-vision sensor synchronous performance test in different automatic driving scenes can be output through the upper computer.
Drawings
FIG. 1 is a schematic diagram of a multi-vision sensor based ADAS synchronization test apparatus according to some embodiments of the present invention;
fig. 2 is a schematic diagram of a specific structure of the ADAS synchronization testing apparatus based on multi-vision sensor in some embodiments of the present invention;
fig. 3 is a schematic basic flow diagram of a testing method of the multi-vision sensor based ADAS synchronous testing device in some embodiments of the present invention;
fig. 4 is a schematic flow chart illustrating a testing method of the ADAS synchronization testing apparatus based on multi-vision sensor according to some embodiments of the present invention;
fig. 5 is a schematic structural diagram of an electronic device in some embodiments of the invention.
Reference numerals
1. Camera bellows; 11. Scene simulation device; 111. Slide rail; 112. Adjustment mechanism; 1121. Vision sensor base; 1122. Vision sensor support; 12. Vision sensor; 13. Display device; 2. Upper computer.
Detailed Description
The principles and features of this invention are described below in conjunction with the following drawings, which are set forth by way of illustration only and are not intended to limit the scope of the invention.
Example 1
Referring to fig. 1, in a first aspect of the present invention, there is provided an ADAS synchronous testing device based on multiple vision sensors, including a camera bellows 1, a scene simulation device 11, multiple vision sensors 12, a display device 13, and an upper computer 2, where the scene simulation device 11, the multiple vision sensors 12, and the display device 13 are all disposed inside the camera bellows 1, and: the scene simulation device 11 is configured to provide a scene simulation motion environment and a field of view for the plurality of vision sensors 12; the upper computer 2 is in communication connection with the plurality of vision sensors 12 and the display device 13 respectively, and is used for acquiring and parsing message data of the plurality of vision sensors 12; the display device 13 is configured to provide the plurality of vision sensors 12 with synchronized video information.
Referring to fig. 2, in some embodiments of the present invention, the scene simulator 11 comprises a plurality of sliding rails 111 and a plurality of adjustment mechanisms 112, the plurality of sliding rails 111 being used for providing a simulated motion environment for the plurality of visual sensors 12; the plurality of adjustment mechanisms 112 are configured to adjust the field of view of the video information at the plurality of vision sensors 12. Alternatively, the sliding rail 111 may be replaced with other sliding mechanisms that can provide a moving environment for the vision sensor 12, such as a simulated track, a simulated road, or a city micro model.
It is understood that the above-mentioned adjusting mechanisms 112 may correspond to the vision sensors 12 one by one, and one vision sensor 12 may correspond to a plurality of adjusting mechanisms 112, so as to provide a scene simulation motion environment and a field of view for each vision sensor 12.
Further, each adjusting mechanism 112 at least comprises a vision sensor base 1121 and a vision sensor bracket 1122, wherein the vision sensor base 1121 is used for fixing one end of the vision sensor bracket 1122 on the camera bellows 1; the vision sensor mount 1122 is used to provide support for the vision sensor in multiple degrees of freedom of movement. The vision sensor holder 1122 connects the vision sensor base 1121 with the vision sensor 12, and has 6 degrees of freedom (translation along the x-axis, translation along the y-axis, translation along the z-axis, rotation around the x-axis, rotation around the y-axis, and rotation around the z-axis), so that the video information (usually, a video or an image displayed in a preset simulation scene) in the display device 13 just occupies the entire field of view of each vision sensor 12 through the 6 degrees of freedom of the vision sensor holder 1122, thereby achieving the detection effect of a realistic simulation camera on real road test.
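The 6 degrees of freedom of the bracket can be sketched as a pose transform. The following is an illustrative sketch only, not the patent's implementation; the z-y-x rotation order and parameter naming are assumptions chosen for the example.

```python
import math

def apply_pose(point, tx, ty, tz, rx, ry, rz):
    """Apply a 6-DOF pose (translations, then rotations about z, y, x in that
    order; angles in radians) to a 3-D point -- the six degrees of freedom
    the vision sensor bracket exposes. The convention here is illustrative."""
    x, y, z = point
    # rotation about the z-axis
    x, y = x * math.cos(rz) - y * math.sin(rz), x * math.sin(rz) + y * math.cos(rz)
    # rotation about the y-axis
    x, z = x * math.cos(ry) + z * math.sin(ry), -x * math.sin(ry) + z * math.cos(ry)
    # rotation about the x-axis
    y, z = y * math.cos(rx) - z * math.sin(rx), y * math.sin(rx) + z * math.cos(rx)
    # translation along the three axes
    return (x + tx, y + ty, z + tz)
```

Adjusting the bracket amounts to choosing (tx, ty, tz, rx, ry, rz) so that the display exactly fills the sensor's field of view.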
Specifically, the display device 13 is a display. Alternatively, the display device 13 may be replaced by a wearable smart device with a display, a portable computer, or another electronic device with a display function module or component, such as a VR (virtual reality) or AR (augmented reality) device.
In some embodiments of the present invention, the acquiring and parsing of the message data of the plurality of vision sensors 12 includes the following steps: identifying the message data generated by each vision sensor 12 according to its transmission protocol; and parsing the message data generated by each vision sensor 12 according to its DBC file. Further, parsing the message data generated by each vision sensor 12 according to its DBC file includes the following steps: parsing the message data generated by each vision sensor 12 to obtain target object information or traffic sign information; acquiring the lateral distance, longitudinal distance, width and height of each target object and traffic sign from the target object information or the traffic sign information; and framing each target object and traffic sign.
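What a DBC-driven parser does per signal can be sketched as a bit-field extraction plus a factor/offset scaling (in practice a library such as cantools would do this from the DBC file). The signal layout below (start bit, length, factor) is hypothetical, chosen only to illustrate the step.

```python
# Illustrative sketch, not the patent's implementation: extract one physical
# value from a raw CAN payload the way a DBC-defined signal is decoded.

def extract_signal(payload: bytes, start_bit: int, length: int,
                   factor: float, offset: float) -> float:
    """Extract a little-endian unsigned signal and scale it to a physical value."""
    raw = int.from_bytes(payload, byteorder="little")
    mask = (1 << length) - 1
    value = (raw >> start_bit) & mask
    return value * factor + offset

# Hypothetical "longitudinal distance" signal: 12 bits starting at bit 8,
# 0.05 m per count, no offset.
payload = bytes([0x00, 0x90, 0x01, 0x00, 0x00, 0x00, 0x00, 0x00])
distance_m = extract_signal(payload, start_bit=8, length=12, factor=0.05, offset=0.0)  # approx. 20.0 m
```

Each vision sensor's DBC file supplies exactly these layout parameters for its target object and traffic sign signals.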
In the above embodiment, the upper computer 2 is further configured to determine a distance between the scene simulator 11 and each of the vision sensors 12 according to the parameter of each of the vision sensors 12, and control the scene simulator 11 to fill the field of view of each of the vision sensors with the synchronized video information provided by the display device 13.
Specifically, from the parameters of the vision sensor 12, the FOV (pitch angle, yaw angle, tilt angle), the focal length f and the effective width of the display screen are known, and the distance from the vision sensor 12 to the display device 13 (display screen) is calculated as distance = width × 0.5 / tan(fieldOfView × 0.5), where fieldOfView denotes the horizontal viewing angle and width the effective screen width. After the distance from the vision sensor 12 to the display is determined, the vision sensor base 1121 and the vision sensor support 1122 are adjusted until the distance from the display to the vision sensor 12 equals the calculated distance; meanwhile, the FOV of the camera is fine-tuned so that the video image played on the display screen fills the entire field of view of each vision sensor 12.
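The distance calculation above reduces to one line of trigonometry, sketched here for clarity (function name and units are ours, not the patent's):

```python
import math

def screen_distance(screen_width_m: float, fov_deg: float) -> float:
    """Distance at which a screen of the given width exactly fills a camera's
    horizontal field of view: d = width * 0.5 / tan(fieldOfView * 0.5)."""
    return screen_width_m * 0.5 / math.tan(math.radians(fov_deg) * 0.5)

# A 1.0 m wide display and a 90-degree horizontal FOV:
d = screen_distance(1.0, 90.0)  # approx. 0.5 m
```

A wider FOV yields a shorter mounting distance, which is why each sensor's bracket position is calculated individually from that sensor's parameters.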
Example 2
Referring to fig. 3, in a second aspect of the present invention, there is provided a testing method based on the ADAS synchronization testing apparatus for multi-vision sensor provided in the first aspect, including the following steps: s100, determining that video information in a display device occupies the whole visual field of each visual sensor; s200, acquiring message data of the plurality of visual sensors and analyzing the message data; s300, carrying out synchronous function verification on the multi-vision sensor in multiple scenes.
Specifically, in an embodiment of the present invention, the testing method of the ADAS synchronization testing apparatus based on the multi-vision sensor provided in the first aspect includes:
step 1, preparing video camera bellows hardware: the device comprises a camera bellows, a plurality of vision sensors, a display, a sliding rail, a vision sensor adjustable support, a vision sensor base, a power supply, PCAN-USB Pro equipment supporting CAN and CANFD, an upper computer 2 and other hardware.
Step 2, installation of the dark box environment: the display is horizontally fixed at the bottom of the dark box; the first sliding rail and the second sliding rail are mounted on the camera bellows frame; the vision sensor bases are mounted on the sliding rails and can move along each sliding rail; the adjustable vision sensor support connects the vision sensor base with the vision sensor and has 6 degrees of freedom (translation along the x-axis, translation along the y-axis, translation along the z-axis, rotation around the x-axis, rotation around the y-axis, rotation around the z-axis), so that by adjusting the 6 degrees of freedom of the support, the video information on the display exactly fills the entire field of view of the vision sensor, achieving a detection effect that realistically simulates a camera in a real road test.
Step 3, calibration of the vision sensor: from the parameters of the vision sensor, the FOV (pitch angle, yaw angle, tilt angle), the focal length f and the effective width of the display screen are known, and the distance between the vision sensor and the display screen is calculated as distance = width × 0.5f / Math.tan(fieldOfView × 0.5f), where fieldOfView denotes the viewing angle and width the effective screen width. After the distance between the vision sensor and the display is determined, the base and support of the vision sensor are adjusted until the distance between the display and the vision sensor equals the calculated distance; meanwhile, the FOV of the camera is fine-tuned so that the video image (or other video scene information) played on the display screen fills the entire field of view of each vision sensor.
And 4, analyzing the CAN/CANFD format message: and connecting the PCAN-USB pro equipment with the visual sensor to carry out CAN/CANFD message transmission.
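When reading both CAN and CAN FD traffic from the same PCAN channel (with python-can this would typically be `can.Bus(interface="pcan", fd=True)`), the upper computer must apply the CAN FD DLC-to-payload-length coding from ISO 11898-1, since DLC codes above 8 no longer equal the byte count. A minimal sketch of that mapping (our own helper, not a library API):

```python
# CAN FD extended DLC codes (ISO 11898-1): DLC 9-15 map to these payload sizes.
_FD_DLC_LENGTHS = {9: 12, 10: 16, 11: 20, 12: 24, 13: 32, 14: 48, 15: 64}

def payload_length(dlc: int, is_fd: bool) -> int:
    """Payload byte count for a given DLC code."""
    if dlc <= 8:
        return dlc                       # same meaning in classical CAN and CAN FD
    if is_fd:
        return _FD_DLC_LENGTHS[dlc]      # FD-only extended lengths
    return 8                             # classical CAN caps the payload at 8 bytes
```

This is the distinction the bus channel parameters in step 6 configure so that CAN and CANFD messages are read automatically.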
Step 5, preparing an automatic driving scene library: and performing synchronous function verification on the multi-vision sensor in multiple scenes.
Step 6, real-time visual comparison of results at the upper computer 2: (1) the upper computer 2 is connected to the camera bellows equipment through the PCAN-USB Pro and, by setting the bus channel parameters for CAN/CANFD protocol transmission, automatically distinguishes and reads message data in CAN and CANFD formats; (2) the multiple vision sensors acquire the same scene video stream: one display screen in the video dark box plays scene videos from the natural driving scene library, and the multiple vision sensors synchronously acquire and identify the scene data in the video stream; (3) the acquired CAN data are parsed according to the DBC files of the different vision sensors, mainly extracting target object information and traffic sign information; (4) framing of target objects and traffic signs: information such as the lateral distance, longitudinal distance, width and height of each target object and traffic sign identified by a vision sensor is acquired through the DBC file; (5) the picture is divided into regions by calling an OpenCV image processing algorithm, the center point coordinates of each target object and traffic sign in the picture are determined, the rectangular frame positions of the target objects and traffic signs are determined from the center point coordinates, and the frames are drawn on the picture, thereby visualizing the detection results of the ADAS algorithms of the different vision sensors.
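The rectangle-from-center-point computation in sub-step (5) can be sketched as follows; the clamping behaviour and function name are our assumptions, and the resulting corner pair is what would be handed to OpenCV's `cv2.rectangle` for drawing.

```python
def bounding_box(cx, cy, w, h, img_w, img_h):
    """Corner points of a w-by-h rectangle centered at (cx, cy), clamped to
    the image bounds -- the (top_left, bottom_right) pair a drawing call
    such as cv2.rectangle(frame, top_left, bottom_right, color) expects."""
    x1 = max(0, int(cx - w / 2))
    y1 = max(0, int(cy - h / 2))
    x2 = min(img_w - 1, int(cx + w / 2))
    y2 = min(img_h - 1, int(cy + h / 2))
    return (x1, y1), (x2, y2)

# A target centered at (640, 360) in a 1280x720 frame, 100x80 pixels:
tl, br = bounding_box(640, 360, 100, 80, 1280, 720)  # ((590, 320), (690, 400))
```

One such box is drawn per recognized target object or traffic sign, per vision sensor, enabling the side-by-side visual comparison.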
Step 7, result visualization report: after the scene video stream finishes playing, the program on the upper computer 2 automatically executes the visual report generation function. The data collected by the plurality of vision sensors are compared and verified against the ground-truth data; the error count, missed-detection count and success count of each attribute in each frame are tallied and fed back in chart form. The charting function calls the chart library of the pyecharts module in Python (an open-source visualization library based on ECharts, which is implemented in JavaScript) to render the graphs, and finally a report module in Python is called to combine the comparison data and the charts into a PDF file, which is saved as the final visual result report. ECharts can be replaced by any existing visualization framework or function library built with Python, Java or other programming languages to realize the visual report function.
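The per-frame tallying that feeds the charts can be sketched in plain Python (the data layout and tolerance below are our assumptions; the patent only specifies that error, missed-detection and success counts are produced per attribute per frame):

```python
# Hedged sketch of the comparison step: count successes, errors and misses
# per attribute against ground-truth data, frame by frame.

def score_frames(detections, truths, tol=0.5):
    """detections/truths: lists of per-frame dicts mapping attribute -> value
    (None in detections marks a missed attribute). Returns total counts."""
    success = error = miss = 0
    for det, truth in zip(detections, truths):
        for attr, true_val in truth.items():
            got = det.get(attr)
            if got is None:
                miss += 1                       # attribute not detected at all
            elif abs(got - true_val) <= tol:
                success += 1                    # within tolerance of ground truth
            else:
                error += 1                      # detected but wrong
    return {"success": success, "error": error, "miss": miss}

frames_det = [{"lat": 1.2, "lon": 20.1}, {"lat": None, "lon": 25.0}]
frames_gt  = [{"lat": 1.0, "lon": 20.0}, {"lat": 0.8, "lon": 22.0}]
stats = score_frames(frames_det, frames_gt)  # {'success': 2, 'error': 1, 'miss': 1}
```

These counts are then passed to the charting library per vision sensor, giving the comparative bar charts in the final PDF report.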
Example 3
In a third aspect of the present invention, there is provided an electronic device comprising: one or more processors; a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the testing method of the multi-vision sensor based ADAS synchronization testing apparatus provided by the second aspect of the present invention.
Referring to fig. 5, an electronic device 500 may include a processing means (e.g., central processing unit, graphics processor, etc.) 501 that may perform various appropriate actions and processes in accordance with a program stored in a read-only memory (ROM) 502 or a program loaded from a storage means 508 into a random access memory (RAM) 503. In the RAM 503, various programs and data necessary for the operation of the electronic device 500 are also stored. The processing device 501, the ROM 502, and the RAM 503 are connected to each other through a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
The following devices may be connected to the I/O interface 505 in general: input devices 506 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 507 including, for example, a Liquid Crystal Display (LCD), speakers, vibrators, and the like; a storage device 508 including, for example, a hard disk; and a communication device 509. The communication means 509 may allow the electronic device 500 to communicate with other devices wirelessly or by wire to exchange data. While fig. 5 illustrates an electronic device 500 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 5 may represent one device or may represent multiple devices as desired.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 509, or installed from the storage means 508, or installed from the ROM 502. The computer program, when executed by the processing device 501, performs the above-described functions defined in the methods of embodiments of the present disclosure. It should be noted that the computer readable medium described in the embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. 
In embodiments of the present disclosure, however, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more computer programs which, when executed by the electronic device, cause the electronic device to perform the testing method described in the second aspect.
computer program code for carrying out operations for embodiments of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C + +, Python, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. An ADAS synchronous testing device based on multiple vision sensors, characterized by comprising a camera bellows, a scene simulation device, a plurality of vision sensors, a display device and an upper computer, wherein the scene simulation device, the plurality of vision sensors and the display device are all disposed inside the camera bellows, and:
the scene simulation device is used for providing a motion environment and a field of view for simulating a scene for the plurality of visual sensors;
the upper computer is respectively in communication connection with the plurality of visual sensors and the display device and is used for acquiring and analyzing message data of the plurality of visual sensors;
the display device is used for providing synchronous video information for the plurality of visual sensors.
2. The multi-vision sensor-based ADAS synchronization testing device of claim 1, wherein said scene simulator comprises a plurality of slide rails and a plurality of adjustment mechanisms,
the plurality of sliding rails are used for providing a simulated motion environment for the plurality of visual sensors;
the plurality of adjustment mechanisms are used for adjusting the visual fields of the video information at the plurality of vision sensors.
3. The device for simultaneous testing of ADAS based on multiple vision sensors of claim 2, wherein each adjustment mechanism comprises at least one vision sensor base and one vision sensor holder,
the visual sensor base is used for fixing one end of the visual sensor bracket on the camera bellows;
the vision sensor support is used for providing support for the vision sensor under the motion of multiple degrees of freedom.
4. The device for synchronously testing the ADAS based on multiple visual sensors according to claim 1, wherein the step of obtaining and analyzing the message data of the multiple visual sensors comprises the steps of:
identifying message data generated by each vision sensor according to the transmission protocol of the vision sensor;
and analyzing the message data generated by each visual sensor according to the DBC file of the visual sensor.
5. The multi-vision-sensor-based ADAS synchronization testing device of claim 4, wherein parsing the message data generated by each vision sensor according to its DBC file comprises the following steps:
parsing the message data generated by each vision sensor to obtain target object information or traffic sign information;
acquiring the lateral distance, the longitudinal distance, the width and the height of each target object and traffic sign from the target object information or traffic sign information;
and drawing a bounding box around each target object and traffic sign.
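The framing step of claim 5 can be sketched with a pinhole camera model: the target's lateral/longitudinal distances and physical size are projected into an image-plane bounding box. The focal lengths and principal point below are assumed values, and the sketch assumes the target is vertically centered; the patent does not specify the projection.

```python
def target_to_bbox(lat_m: float, lon_m: float, width_m: float, height_m: float,
                   fx: float = 1000.0, fy: float = 1000.0,
                   cx: float = 640.0, cy: float = 360.0):
    """Return (left, top, right, bottom) pixel box for a target ahead of the camera."""
    u = cx + fx * lat_m / lon_m          # horizontal center of the target in pixels
    v = cy                               # assume the target is centered vertically
    half_w = fx * (width_m / 2) / lon_m  # pinhole projection of half the width
    half_h = fy * (height_m / 2) / lon_m
    return (u - half_w, v - half_h, u + half_w, v + half_h)
```

For example, a 2 m x 2 m target 20 m ahead and laterally centered projects to a 100-pixel-wide box around the principal point under these assumed intrinsics.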
6. The multi-vision-sensor-based ADAS synchronization testing device according to any one of claims 1-5, wherein said upper computer is further configured to determine the distance between the scene simulator and the vision sensors from each vision sensor's parameters, and to control said scene simulator so that the synchronized video information provided by said display device fills the field of view of each vision sensor.
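The distance determination in claim 6 reduces to simple geometry: a display of width W exactly fills a horizontal field of view of angle θ at distance d = (W/2) / tan(θ/2). A minimal sketch, assuming the sensor's horizontal FOV is the governing parameter (the patent does not state which parameters are used):

```python
import math

def fill_distance(screen_width_m: float, hfov_deg: float) -> float:
    """Distance at which a screen of the given width exactly fills
    a sensor's horizontal field of view."""
    return (screen_width_m / 2) / math.tan(math.radians(hfov_deg) / 2)
```

A wider FOV requires the sensor to sit closer to the display; e.g. a 1 m wide screen fills a 90° FOV at 0.5 m. The slide rails of claim 2 would then position each sensor at its computed distance.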
7. A testing method for the multi-vision-sensor-based ADAS synchronization testing device of any one of claims 1-5, comprising the following steps:
ensuring that the video information on the display device occupies the entire field of view of each vision sensor;
acquiring and parsing the message data of the plurality of vision sensors;
and verifying the synchronization function of the multiple vision sensors across multiple scenes.
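The synchronization verification step of claim 7 can be sketched as a timestamp comparison: since every sensor views the same synchronized video, corresponding detections should carry timestamps that agree within a tolerance. The data layout and the 50 ms tolerance below are assumptions for illustration; the patent does not define a pass criterion.

```python
def verify_sync(timestamps_by_sensor: dict, tolerance_s: float = 0.05) -> bool:
    """Pass if, for every corresponding frame, all sensors' detection
    timestamps agree within the tolerance."""
    for frame_times in zip(*timestamps_by_sensor.values()):
        if max(frame_times) - min(frame_times) > tolerance_s:
            return False
    return True
```

Per claim 8, the per-frame spreads computed here (together with the parsed target data) would feed into the output test report.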
8. The method of claim 7, further comprising outputting a test report based on the parsed data and the verification results.
9. An electronic device, comprising: one or more processors; and storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the multi-vision-sensor-based ADAS synchronization testing method of claim 7 or 8.
10. A computer-readable medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the multi-vision-sensor-based ADAS synchronization testing method of claim 7 or 8.
CN202110776975.3A 2021-07-08 2021-07-08 ADAS synchronous testing device and method based on multi-vision sensor Pending CN113639764A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110776975.3A CN113639764A (en) 2021-07-08 2021-07-08 ADAS synchronous testing device and method based on multi-vision sensor

Publications (1)

Publication Number Publication Date
CN113639764A true CN113639764A (en) 2021-11-12

Family

ID=78417007

Country Status (1)

Country Link
CN (1) CN113639764A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109163889A * 2018-08-09 2019-01-08 Huayu Automotive Systems Co., Ltd. Test device and method for a forward-view camera ADAS algorithm
CN109407547A (en) * 2018-09-28 2019-03-01 合肥学院 Multi-camera in-loop simulation test method and system for panoramic visual perception
CN112925223A (en) * 2021-02-03 2021-06-08 北京航空航天大学 Unmanned aerial vehicle three-dimensional tracking virtual test simulation system based on visual sensing network
CN112987593A (en) * 2021-02-19 2021-06-18 中国第一汽车股份有限公司 Visual positioning hardware-in-the-loop simulation platform and simulation method


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination