CN110091875A - Deep-learning intelligent driving environment perception system based on the Internet of Things - Google Patents
- Publication number
- CN110091875A CN110091875A CN201910396591.1A CN201910396591A CN110091875A CN 110091875 A CN110091875 A CN 110091875A CN 201910396591 A CN201910396591 A CN 201910396591A CN 110091875 A CN110091875 A CN 110091875A
- Authority
- CN
- China
- Prior art keywords
- intelligent driving
- internet
- things
- deep learning
- context aware
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096708—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
- G08G1/096725—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information generates an automatic action on the vehicle control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W2050/0062—Adapting control system settings
- B60W2050/0075—Automatic parameter input, automatic initialising or calibrating means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/45—External transmission of data to or from the vehicle
- B60W2556/55—External transmission of data to or from the vehicle using telemetry
Landscapes
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Human Computer Interaction (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Life Sciences & Earth Sciences (AREA)
- Atmospheric Sciences (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention discloses a deep-learning intelligent driving environment perception system based on the Internet of Things, belonging to the field of intelligent perception systems. The system comprises a perception system and intelligent driving vehicles. The perception system includes a sensing layer and a cognition layer, while a decision layer and a control layer are provided on each intelligent driving vehicle. The sensing layer performs data acquisition; the decision layer processes the information transmitted by the cognition layer together with the route plan using algorithms and outputs speed- and direction-adjustment instructions to the control layer; the control layer receives the decision layer's instructions and controls the vehicle's brake, throttle and gears. This scheme improves the way unmanned driving is implemented: the perception system collects all moving and static obstacles on the road and sends the obstacle data to all intelligent driving vehicles travelling on that road, reducing technical difficulty and production cost while improving the safety and stability of unmanned driving.
Description
Technical field
The present invention relates to the field of intelligent perception systems, and more specifically to a deep-learning intelligent driving environment perception system based on the Internet of Things.
Background technique
As the future research direction of the automobile, unmanned driving will have a far-reaching influence on the automotive industry and even on transportation as a whole. The arrival of driverless cars will free people's hands, reduce the frequency of traffic accidents, and protect people's safety. Meanwhile, with continuous breakthroughs in core technologies such as artificial intelligence and sensor detection, unmanned driving will become more intelligent, and the industrialization of driverless cars will become achievable.
The entry of the Internet sector and non-traditional automobile enterprises into unmanned driving technology means that this technology, barely popularized, is already sinking into a "red sea" of competition. But a question remains: can an intelligent, robot-like driverless car improve people's ability to travel as thoroughly as the invention of the automobile replaced the horse-drawn carriage? The main reason for the poor road traffic conditions in China's large and medium-sized cities today is the sheer number of vehicles; moreover, under our national conditions, human factors account for a considerable proportion of accidents and congestion. A large number of rear-end collisions, scrapes and similar traffic accidents can be attributed to the driver's reliance on an "independent behaviour" mode, i.e. see-judge-act within visual range; accidents then arise from late discovery, late reaction, or insufficient reaction time. An intelligent unmanned driving mode that simply perpetuates this individual "independent behaviour" mode naturally still has these problems; since they are not solved at the root, the possibility of such accidents theoretically still exists. Meanwhile, the two main problems currently limiting the mass production of autonomous vehicles are technical difficulty and cost. The implementation of unmanned driving therefore needs to be improved, so as to increase its safety and stability while reducing technical difficulty and production cost.
Summary of the invention
1. technical problems to be solved
In view of the problems existing in the prior art, the purpose of the present invention is to provide a deep-learning intelligent driving environment perception system based on the Internet of Things. It improves the way unmanned driving is implemented: the perception system collects all moving and static obstacles on the road and sends the obstacle data to all intelligent driving vehicles travelling on that road, reducing technical difficulty and production cost while improving the safety and stability of unmanned driving.
2. technical solution
To solve the above problems, the present invention adopts the following technical solution.
A deep-learning intelligent driving environment perception system based on the Internet of Things comprises a perception system and intelligent driving vehicles. The perception system includes a sensing layer and a cognition layer; a decision layer and a control layer are provided on each intelligent driving vehicle. The sensing layer performs data acquisition and includes a radar unit, an inertial navigation unit, a positioning unit and a camera unit. The cognition layer performs data analysis. The decision layer processes the information transmitted by the cognition layer together with the route plan using algorithms, and outputs speed- and direction-adjustment instructions to the control layer. The control layer receives the decision layer's instructions and controls the vehicle's brake, throttle and gears. This scheme improves the way unmanned driving is implemented: the perception system collects all moving and static obstacles on the road and sends the obstacle data to all intelligent driving vehicles travelling on that road, reducing technical difficulty and production cost while improving the safety and stability of unmanned driving.
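The sensing-cognition-decision-control pipeline described above can be pictured as four cooperating components. The following is a minimal sketch, not the patent's implementation: all class and field names, the toy "slow down if an obstacle is on the route" policy, and the numeric values are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Command:
    """Speed/direction adjustment sent from the decision layer to the control layer."""
    speed_delta: float   # requested change in speed (m/s)
    steer_delta: float   # requested change in heading (degrees)

class CognitionLayer:
    def analyze(self, raw):
        """Turn raw sensor readings into labeled obstacles (type, position, speed)."""
        return [{"type": t, "pos": p, "speed": s} for (t, p, s) in raw]

class DecisionLayer:
    def decide(self, obstacles, route):
        """Toy policy: slow down if any obstacle lies on the planned route."""
        blocking = [o for o in obstacles if o["pos"] in route]
        if blocking:
            return Command(speed_delta=-2.0, steer_delta=0.0)
        return Command(speed_delta=0.0, steer_delta=0.0)

class ControlLayer:
    def __init__(self):
        self.speed = 10.0  # current vehicle speed (m/s)

    def apply(self, cmd: Command):
        """Actuate brake/throttle by adjusting speed; never go below zero."""
        self.speed = max(0.0, self.speed + cmd.speed_delta)
        return self.speed

# Roadside perception -> in-vehicle decision -> actuation
raw = [("pedestrian", (5, 0), 1.2), ("car", (40, 3), 13.9)]
obstacles = CognitionLayer().analyze(raw)
cmd = DecisionLayer().decide(obstacles, route=[(5, 0), (10, 0)])
new_speed = ControlLayer().apply(cmd)
```

The point of the sketch is the division of labour: cognition produces obstacle descriptions, decision turns them into speed/direction instructions, and control only actuates.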
Further, the radar unit, inertial navigation unit, positioning unit and camera unit can be mounted on the street lamps on both sides of the road. Using the existing street lamp installations for the perception system's mounting and layout requires no additional large-scale hardware facilities, saves a large amount of hardware configuration and cost, and takes up no extra space. The cognition layer includes the analysis of pedestrians, vehicles, traffic objects, traffic signs and lane lines.
Further, the radar unit includes a lidar and a millimetre-wave radar. A millimetre-wave radar is a device that transmits and receives radio waves to measure the distance, angle and relative velocity of vehicles around a motor vehicle; it is currently widely used as a vehicle-mounted radar. It is not easily affected by bad weather such as dense fog, sleet or snow, or by dust and dirt, and can detect vehicles stably. In the present system, the millimetre-wave radar uses a multi-target detection algorithm to detect the distance and speed of moving obstacles within a fixed area of the lane and above the road surface.
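The multi-target detection step can be pictured as filtering the radar's raw return list down to targets inside the fixed surveillance area. The patent does not give the algorithm, so the sketch below is only an assumed gating scheme: the tuple layout, lane half-width, range limit and speed threshold are all illustrative.

```python
import math

def detect_targets(returns, max_range=100.0, lane_half_width=5.0, min_speed=0.5):
    """Keep radar returns that fall inside the monitored lane strip and are moving.

    Each raw return is (range_m, azimuth_deg, radial_speed_mps); azimuth 0 points
    straight down the lane from the lamp-mounted radar.
    """
    targets = []
    for rng, az, v in returns:
        # Cross-range offset of the return from the radar boresight
        lateral = rng * math.sin(math.radians(az))
        if rng <= max_range and abs(lateral) <= lane_half_width and abs(v) >= min_speed:
            targets.append({"range": rng, "lateral": lateral, "speed": v})
    return targets

hits = detect_targets([
    (30.0, 2.0, -8.0),   # oncoming car inside the lane -> kept
    (60.0, 20.0, -5.0),  # far off to the side -> rejected
    (15.0, 0.0, 0.0),    # stationary clutter -> rejected
])
```

Each kept target carries exactly the quantities the text names: its distance and its speed.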
Further, the lidar is stationary and the environment it scans is fixed. The lidar first acquires environmental data and stores it in the computer in array form. The acquired environmental data is pre-processed to reject information such as trees and the ground; the lidar's range and reflection-intensity information is then subjected to non-planar segmentation and clustering, and the circumscribed-rectangle contour features of obstacles are extracted. The lidar uses a multiple-hypothesis tracking model algorithm to associate the obstacle information of two consecutive frames, and uses a Kalman filtering algorithm to continuously predict and track dynamic obstacles.
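The continuous predict-and-track step can be sketched with a fixed-gain (alpha-beta) filter, which is a simplified stand-in for the Kalman filter the patent names: it runs the same predict/update cycle but with constant gains instead of covariance-derived ones. The frame period, gains and measurement values below are illustrative assumptions.

```python
def track(measurements, dt=0.1, alpha=0.85, beta=0.4):
    """Fixed-gain predict/update cycle tracking one obstacle's position and
    velocity from per-frame position measurements (one per lidar frame)."""
    pos, vel = measurements[0], 0.0
    for z in measurements[1:]:
        # Predict the obstacle's position one frame ahead
        pred = pos + vel * dt
        # Correct with the residual between the new measurement and the prediction
        r = z - pred
        pos = pred + alpha * r
        vel = vel + (beta / dt) * r
    return pos, vel

# Obstacle moving at 2 m/s, sampled every 0.1 s (noise-free for brevity)
pos, vel = track([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
```

After a handful of frames the velocity estimate settles near the true 2 m/s, which is what lets the tracker predict where a dynamic obstacle will be in the next frame before associating it.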
Further, the perception system and the intelligent driving vehicles communicate wirelessly. The perception system sends the recognized data to all nearby intelligent driving vehicles, so that each vehicle has a clear picture of all surrounding obstacles, the other vehicles, and those vehicles' driving directions and speeds.
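One way to picture this roadside-to-vehicle broadcast is a periodic UDP datagram carrying the obstacle list as JSON. The patent does not specify a protocol or message schema, so the port number, field names and transport choice below are illustrative assumptions only.

```python
import json
import socket

def encode_obstacle_frame(frame_id, obstacles):
    """Serialize one perception frame for broadcast to nearby vehicles."""
    return json.dumps({"frame": frame_id, "obstacles": obstacles}).encode("utf-8")

def broadcast(payload, port=9999):
    """Send one frame to every listener on the local segment (fire-and-forget)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(payload, ("<broadcast>", port))

frame = encode_obstacle_frame(
    42,
    [{"type": "pedestrian", "pos": [5.0, 0.0], "speed": 1.2},
     {"type": "car", "pos": [40.0, 3.0], "speed": 13.9, "heading_deg": 182.0}],
)
# broadcast(frame)  # left commented out: sending requires an actual network
decoded = json.loads(frame)  # what a receiving vehicle would parse
```

A fire-and-forget broadcast suits the use case: every frame supersedes the last, so a lost datagram costs one perception cycle rather than requiring retransmission.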
Further, the sensor data fusion of the radar unit, inertial navigation unit, positioning unit and camera unit includes spatial fusion, temporal fusion and a sensor data fusion algorithm. Establishing an accurate lidar coordinate system, three-dimensional world coordinate system and millimetre-wave coordinate system is the key to realizing the spatial fusion of multi-sensor data: spatial fusion of the lidar and millimetre-wave radar means transforming the measurements from the different sensor coordinate systems into the same coordinate system. Besides being fused spatially, the lidar and millimetre-wave radar information also needs the sensors to acquire synchronously in time, realizing temporal fusion. The sampling frequencies of the two sensors differ, so to guarantee data reliability the lower-rate sensor is taken as the reference: each time the low-frequency sensor captures a frame, the most recently cached frame of the high-frequency sensor is selected, completing the joint sampling of one fused radar-vision frame and ensuring that the millimetre-wave radar data and the camera data are synchronized in time. The core of sensor data fusion is still the choice of a suitable fusion algorithm; the sensor data fusion algorithm of this system uses the extended Kalman filtering algorithm.
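The spatial and temporal fusion steps above can be sketched directly (the extended Kalman filter that then fuses the aligned measurements is not shown here). The mounting pose, sample rates and frame labels below are illustrative assumptions, not taken from the patent.

```python
import math

def to_world(sensor_xy, mount_xy, yaw_deg):
    """Spatial fusion: rotate and translate a 2-D measurement from a sensor's
    own frame (mounted on a street lamp) into the shared world frame."""
    c, s = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
    x, y = sensor_xy
    return (mount_xy[0] + c * x - s * y, mount_xy[1] + s * x + c * y)

def align_frames(low_rate, high_rate):
    """Temporal fusion: for each low-rate frame, pick the latest cached
    high-rate frame that is not newer than it. Frames are (timestamp, data)."""
    fused, j = [], 0
    for t, data in low_rate:
        while j + 1 < len(high_rate) and high_rate[j + 1][0] <= t:
            j += 1
        fused.append((t, data, high_rate[j][1]))
    return fused

# Radar at 10 Hz (the low-rate reference) paired with a 30 Hz camera
radar = [(0.10, "r0"), (0.20, "r1")]
camera = [(0.033, "c0"), (0.066, "c1"), (0.100, "c2"), (0.133, "c3"),
          (0.166, "c4"), (0.200, "c5")]
pairs = align_frames(radar, camera)

# A target 10 m ahead of a lamp at world (100, 50), sensor facing +y (yaw 90 deg)
pt = to_world((10.0, 0.0), mount_xy=(100.0, 50.0), yaw_deg=90.0)
```

Taking the low-rate sensor as the reference, as the text specifies, means every fused frame contains one radar sample and the freshest camera sample available at that instant.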
Further, the radar unit, inertial navigation unit, positioning unit and camera unit are installed on adjacent street lamps at fixed intervals, making the positional design rational and standardized.
Further, the radar units, inertial navigation units, positioning units and camera units on adjacent street lamps are connected in parallel, so that when a single powered component fails, the normal operation of the other powered devices is not affected, guaranteeing the stability of the whole system's operation.
3. beneficial effect
Compared with the prior art, the present invention has the following advantages:
(1) This scheme improves the way unmanned driving is implemented: the perception system collects all moving and static obstacles on the road and sends the obstacle data to all intelligent driving vehicles travelling on that road, reducing technical difficulty and production cost while improving the safety and stability of unmanned driving.
(2) The radar unit, inertial navigation unit, positioning unit and camera unit can be mounted on the street lamps on both sides of the road. Using the existing street lamp installations for the perception system's mounting and layout requires no additional large-scale hardware facilities, saves a large amount of hardware configuration and cost, and takes up no extra space. The cognition layer includes the analysis of pedestrians, vehicles, traffic objects, traffic signs and lane lines.
(3) The radar unit includes a lidar and a millimetre-wave radar. The millimetre-wave radar transmits and receives radio waves to measure the distance, angle and relative velocity of vehicles around a motor vehicle; it is currently widely used as a vehicle-mounted radar. It is not easily affected by bad weather such as dense fog, sleet or snow, or by dust and dirt, and can detect vehicles stably. In the present system, the millimetre-wave radar uses a multi-target detection algorithm to detect the distance and speed of moving obstacles within a fixed area of the lane and above the road surface.
(4) The lidar is stationary and the environment it scans is fixed. The lidar first acquires environmental data and stores it in the computer in array form; the acquired environmental data is pre-processed to reject information such as trees and the ground; the lidar's range and reflection-intensity information is then subjected to non-planar segmentation and clustering, and the circumscribed-rectangle contour features of obstacles are extracted. The lidar uses a multiple-hypothesis tracking model algorithm to associate the obstacle information of two consecutive frames, and uses a Kalman filtering algorithm to continuously predict and track dynamic obstacles.
(5) The perception system and the intelligent driving vehicles communicate wirelessly. The perception system sends the recognized data to all nearby intelligent driving vehicles, so that each vehicle has a clear picture of all surrounding obstacles, the other vehicles, and those vehicles' driving directions and speeds.
(6) The sensor data fusion of the radar unit, inertial navigation unit, positioning unit and camera unit includes spatial fusion, temporal fusion and a sensor data fusion algorithm. Establishing an accurate lidar coordinate system, three-dimensional world coordinate system and millimetre-wave coordinate system is the key to realizing the spatial fusion of multi-sensor data: spatial fusion of the lidar and millimetre-wave radar means transforming the measurements from the different sensor coordinate systems into the same coordinate system. Besides being fused spatially, the lidar and millimetre-wave radar information also needs the sensors to acquire synchronously in time, realizing temporal fusion. The sampling frequencies of the two sensors differ, so to guarantee data reliability the lower-rate sensor is taken as the reference: each time the low-frequency sensor captures a frame, the most recently cached frame of the high-frequency sensor is selected, completing the joint sampling of one fused radar-vision frame and ensuring that the millimetre-wave radar data and the camera data are synchronized in time. The core of sensor data fusion is still the choice of a suitable fusion algorithm; the sensor data fusion algorithm of this system uses the extended Kalman filtering algorithm.
(7) The radar unit, inertial navigation unit, positioning unit and camera unit are installed on adjacent street lamps at fixed intervals, making the positional design rational and standardized.
(8) The radar units, inertial navigation units, positioning units and camera units on adjacent street lamps are connected in parallel, so that when a single powered component fails, the normal operation of the other powered devices is not affected, guaranteeing the stability of the whole system's operation.
Detailed description of the invention
Fig. 1 is a schematic diagram of the environment perception system of the present invention;
Fig. 2 is a schematic diagram of unmanned driving according to the present invention;
Fig. 3 is a flow chart of the lidar recognition algorithm of the present invention;
Fig. 4 is a structural diagram of the high-precision electronic map of the adjacent area according to the present invention.
Specific embodiment
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings of the embodiments. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them; all other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the scope of protection of the present invention.
In the description of the present invention, it should be noted that orientation or positional terms such as "upper", "lower", "inner", "outer" and "top/bottom end" indicate orientations or positional relationships based on those shown in the drawings. They are used merely for convenience and simplicity of description, and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation; therefore they are not to be construed as limiting the invention. In addition, the terms "first" and "second" are used for description purposes only and cannot be understood as indicating or implying relative importance.
In the description of the present invention, it should also be noted that, unless otherwise clearly specified and limited, the terms "installed", "provided with", "arranged/connected" and "connected" should be understood broadly. For example, "connected" may be a fixed connection, a detachable connection, or an integral connection; it may be a mechanical connection or an electrical connection; it may be a direct connection, an indirect connection through an intermediate medium, or an internal connection between two elements. For those of ordinary skill in the art, the specific meaning of the above terms in the present invention can be understood according to the specific situation.
Embodiment 1:
Referring to Fig. 1, the deep-learning intelligent driving environment perception system based on the Internet of Things includes a perception system and intelligent driving vehicles. The perception system includes a sensing layer and a cognition layer; a decision layer and a control layer are provided on each intelligent driving vehicle. The sensing layer performs data acquisition and includes a radar unit, an inertial navigation unit, a positioning unit and a camera unit; an external computer serves as the cognition layer for data analysis. The decision layer provided on the intelligent driving vehicle processes the information transmitted by the cognition layer together with the route plan using algorithms, and outputs speed- and direction-adjustment instructions to the control layer; the control layer provided on the intelligent driving vehicle receives the decision layer's instructions and controls the vehicle's brake, throttle and gears. This scheme improves the way unmanned driving is implemented, improving both the perception system and the intelligent driving vehicle so that the two cooperate perfectly, reducing technical difficulty and production cost while improving the safety and stability of unmanned driving.
The radar unit, inertial navigation unit, positioning unit and camera unit can be mounted on the street lamps on both sides of the road. Using the existing street lamp installations for the perception system's mounting and layout requires no additional large-scale hardware facilities, saves a large amount of hardware configuration and cost, and takes up no extra space. The cognition layer includes the analysis of pedestrians, vehicles, traffic objects, traffic signs and lane lines.
The radar unit includes a lidar and a millimetre-wave radar. Referring to Fig. 3, the millimetre-wave radar transmits and receives radio waves to measure the distance, angle and relative velocity of vehicles around a motor vehicle; it is currently widely used as a vehicle-mounted radar. It is not easily affected by bad weather such as dense fog, sleet or snow, or by dust and dirt, and can detect vehicles stably. In the present system, the millimetre-wave radar uses a multi-target detection algorithm to detect the distance and speed of moving obstacles within a fixed area of the lane and above the road surface.
The lidar is stationary and the environment it scans is fixed. The lidar first acquires environmental data and stores it in the computer in array form; the acquired environmental data is pre-processed to reject information such as trees and the ground; the lidar's range and reflection-intensity information is then subjected to non-planar segmentation and clustering, and the circumscribed-rectangle contour features of obstacles are extracted. The lidar uses a multiple-hypothesis tracking model algorithm to associate the obstacle information of two consecutive frames, and uses a Kalman filtering algorithm to continuously predict and track dynamic obstacles.
Referring to Fig. 1, the perception system and the intelligent driving vehicles communicate wirelessly. The perception system sends the recognized data to all nearby intelligent driving vehicles, so that each vehicle has a clear picture of all surrounding obstacles, the other vehicles, and those vehicles' driving directions and speeds.
The sensor data fusion of the radar unit, inertial navigation unit, positioning unit and camera unit includes spatial fusion, temporal fusion and a sensor data fusion algorithm. Establishing an accurate lidar coordinate system, three-dimensional world coordinate system and millimetre-wave coordinate system is the key to realizing the spatial fusion of multi-sensor data: spatial fusion of the lidar and millimetre-wave radar means transforming the measurements from the different sensor coordinate systems into the same coordinate system. Besides being fused spatially, the lidar and millimetre-wave radar information also needs the sensors to acquire synchronously in time, realizing temporal fusion. The sampling frequencies of the two sensors differ, so to guarantee data reliability the lower-rate sensor is taken as the reference: each time the low-frequency sensor captures a frame, the most recently cached frame of the high-frequency sensor is selected, completing the joint sampling of one fused radar-vision frame and ensuring that the millimetre-wave radar data and the camera data are synchronized in time. The core of sensor data fusion is still the choice of a suitable fusion algorithm; the sensor data fusion algorithm of this system uses the extended Kalman filtering algorithm.
Referring to Fig. 4, the radar unit, inertial navigation unit, positioning unit and camera unit are installed on adjacent street lamps at fixed intervals, making the positional design rational and standardized.
The radar units, inertial navigation units, positioning units and camera units on adjacent street lamps are connected in parallel, so that when a single powered component fails, the normal operation of the other powered devices is not affected and repair is easy: through the positioning unit, maintenance personnel can quickly and accurately locate the faulty component, speeding up the repair process and guaranteeing the stability of the whole system's operation.
Compared with traditional technology, this scheme improves the way unmanned driving is implemented, improving both the perception system and the intelligent driving vehicle. The perception system collects all moving and static obstacles on the road and sends the obstacle data to all intelligent driving vehicles travelling on that road, reducing technical difficulty and production cost while improving the safety and stability of unmanned driving.
The above is only a preferred embodiment of the present invention, but the scope of protection of the present invention is not limited thereto. Any equivalent substitution or change made by any person skilled in the art within the technical scope disclosed by the present invention, according to the technical solution of the present invention and its inventive concept, shall be covered by the scope of protection of the present invention.
Claims (10)
1. A deep-learning intelligent driving environment perception system based on the Internet of Things, characterized in that it comprises a perception system and intelligent driving vehicles; the perception system includes a sensing layer and a cognition layer; a decision layer and a control layer are provided on the intelligent driving vehicle; the sensing layer performs data acquisition and includes a radar unit, an inertial navigation unit, a positioning unit and a camera unit; the cognition layer performs data analysis; the decision layer processes the information transmitted by the cognition layer together with the route plan using algorithms, and outputs speed- and direction-adjustment instructions to the control layer; and the control layer receives the decision layer's instructions and controls the vehicle's brake, throttle and gears.
2. The deep-learning intelligent driving environment perception system based on the Internet of Things according to claim 1, characterized in that: the radar unit, inertial navigation unit, positioning unit and camera unit can be mounted on the street lamps on both sides of the road, and the cognition layer includes the analysis of pedestrians, vehicles, traffic objects, traffic signs and lane lines.
3. The deep-learning intelligent driving environment perception system based on the Internet of Things according to claim 1, characterized in that: the radar unit includes a lidar and a millimetre-wave radar.
4. The deep-learning intelligent driving environment perception system based on the Internet of Things according to claim 3, characterized in that: the lidar is stationary and the environment it scans is fixed.
5. The deep-learning intelligent driving environment perception system based on the Internet of Things according to claim 4, characterized in that: the lidar first acquires environmental data and stores it in the computer in array form; the acquired environmental data is pre-processed to reject information such as trees and the ground; the lidar's range and reflection-intensity information is then subjected to non-planar segmentation and clustering, and the circumscribed-rectangle contour features of obstacles are extracted.
6. The deep-learning intelligent driving environment perception system based on the Internet of Things according to claim 5, characterized in that: the lidar uses a multiple-hypothesis tracking model algorithm to associate the obstacle information of two consecutive frames, and uses a Kalman filtering algorithm to continuously predict and track dynamic obstacles.
7. The deep-learning intelligent driving environment perception system based on the Internet of Things according to claim 1, characterized in that: the perception system and the intelligent driving vehicles communicate wirelessly.
8. The deep-learning intelligent driving environment perception system based on the Internet of Things according to claim 1, characterized in that: the sensor data fusion of the radar unit, inertial navigation unit, positioning unit and camera unit includes spatial fusion, temporal fusion and a sensor data fusion algorithm, and the sensor data fusion algorithm uses the extended Kalman filtering algorithm.
9. The deep-learning intelligent driving environment perception system based on the Internet of Things according to claim 2, characterized in that: the radar unit, inertial navigation unit, positioning unit and camera unit are installed on adjacent street lamps at fixed intervals.
10. The deep-learning intelligent driving environment perception system based on the Internet of Things according to claim 9, characterized in that: the radar units, inertial navigation units, positioning units and camera units on the adjacent street lamps are connected in parallel.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910396591.1A CN110091875A (en) | 2019-05-14 | 2019-05-14 | Deep learning type intelligent driving context aware systems based on Internet of Things |
PCT/CN2020/077066 WO2020228393A1 (en) | 2019-05-14 | 2020-02-28 | Deep learning type intelligent driving environment perception system based on internet of things |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910396591.1A CN110091875A (en) | 2019-05-14 | 2019-05-14 | Deep learning type intelligent driving context aware systems based on Internet of Things |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110091875A true CN110091875A (en) | 2019-08-06 |
Family
ID=67447890
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910396591.1A Pending CN110091875A (en) | 2019-05-14 | 2019-05-14 | Deep learning type intelligent driving context aware systems based on Internet of Things |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110091875A (en) |
WO (1) | WO2020228393A1 (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111326002A (en) * | 2020-02-26 | 2020-06-23 | 公安部交通管理科学研究所 | Prediction method, device and system for environment perception of automatic driving automobile |
CN111833631A (en) * | 2020-06-24 | 2020-10-27 | 武汉理工大学 | Target data processing method, system and storage medium based on vehicle-road cooperation |
WO2020228393A1 (en) * | 2019-05-14 | 2020-11-19 | 长沙理工大学 | Deep learning type intelligent driving environment perception system based on internet of things |
CN112249035A (en) * | 2020-12-16 | 2021-01-22 | 国汽智控(北京)科技有限公司 | Automatic driving method, device and equipment based on general data flow architecture |
CN112435466A (en) * | 2020-10-23 | 2021-03-02 | 江苏大学 | Method and system for predicting take-over time of CACC vehicle changing into traditional vehicle under mixed traffic flow environment |
CN113490178A (en) * | 2021-06-18 | 2021-10-08 | 天津大学 | Intelligent networking vehicle multistage cooperative sensing system |
CN113734197A (en) * | 2021-09-03 | 2021-12-03 | 合肥学院 | Unmanned intelligent control scheme based on data fusion |
CN113911139A (en) * | 2021-11-12 | 2022-01-11 | 湖北芯擎科技有限公司 | Vehicle control method and device and electronic equipment |
CN114056351A (en) * | 2021-11-26 | 2022-02-18 | 文远苏行(江苏)科技有限公司 | Automatic driving method and device |
CN114291114A (en) * | 2022-01-05 | 2022-04-08 | 天地科技股份有限公司 | Vehicle control system and method |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102135777A (en) * | 2010-12-14 | 2011-07-27 | 天津理工大学 | Vehicle-mounted infrared tracking system |
CN105866790A (en) * | 2016-04-07 | 2016-08-17 | 重庆大学 | Laser radar barrier identification method and system taking laser emission intensity into consideration |
CN106908783A (en) * | 2017-02-23 | 2017-06-30 | 苏州大学 | Obstacle detection method based on multi-sensor information fusion |
CN107193012A (en) * | 2017-05-05 | 2017-09-22 | 江苏大学 | Intelligent vehicle laser radar multiple-moving target tracking method based on IMM MHT algorithms |
CN107609522A (en) * | 2017-09-19 | 2018-01-19 | 东华大学 | A kind of information fusion vehicle detecting system based on laser radar and machine vision |
CN108196535A (en) * | 2017-12-12 | 2018-06-22 | 清华大学苏州汽车研究院(吴江) | Automated driving system based on enhancing study and Multi-sensor Fusion |
CN108417087A (en) * | 2018-02-27 | 2018-08-17 | 浙江吉利汽车研究院有限公司 | A kind of vehicle safety traffic system and method |
CN108458745A (en) * | 2017-12-23 | 2018-08-28 | 天津国科嘉业医疗科技发展有限公司 | A kind of environment perception method based on intelligent detection equipment |
CN108845579A (en) * | 2018-08-14 | 2018-11-20 | 苏州畅风加行智能科技有限公司 | A kind of automated driving system and its method of port vehicle |
CN109171684A (en) * | 2018-08-30 | 2019-01-11 | 上海师范大学 | A kind of automatic health monitor system based on wearable sensors and smart home |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4887849B2 (en) * | 2006-03-16 | 2012-02-29 | 日産自動車株式会社 | Vehicle obstacle detection device, road obstacle detection method, and vehicle with road obstacle detection device |
WO2016126315A1 (en) * | 2015-02-06 | 2016-08-11 | Delphi Technologies, Inc. | Autonomous guidance system |
CN108010360A (en) * | 2017-12-27 | 2018-05-08 | 中电海康集团有限公司 | A kind of automatic Pilot context aware systems based on bus or train route collaboration |
CN108646739A (en) * | 2018-05-14 | 2018-10-12 | 北京智行者科技有限公司 | A kind of sensor information fusion method |
CN110091875A (en) * | 2019-05-14 | 2019-08-06 | 长沙理工大学 | Deep learning type intelligent driving context aware systems based on Internet of Things |
- 2019
  - 2019-05-14: CN application CN201910396591.1A filed (published as CN110091875A), status Pending
- 2020
  - 2020-02-28: WO application PCT/CN2020/077066 filed (published as WO2020228393A1), status Application Filing
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102135777A (en) * | 2010-12-14 | 2011-07-27 | 天津理工大学 | Vehicle-mounted infrared tracking system |
CN105866790A (en) * | 2016-04-07 | 2016-08-17 | 重庆大学 | Laser radar barrier identification method and system taking laser emission intensity into consideration |
CN106908783A (en) * | 2017-02-23 | 2017-06-30 | 苏州大学 | Obstacle detection method based on multi-sensor information fusion |
CN107193012A (en) * | 2017-05-05 | 2017-09-22 | 江苏大学 | Intelligent vehicle laser radar multiple-moving target tracking method based on IMM MHT algorithms |
CN107609522A (en) * | 2017-09-19 | 2018-01-19 | 东华大学 | A kind of information fusion vehicle detecting system based on laser radar and machine vision |
CN108196535A (en) * | 2017-12-12 | 2018-06-22 | 清华大学苏州汽车研究院(吴江) | Automated driving system based on enhancing study and Multi-sensor Fusion |
CN108458745A (en) * | 2017-12-23 | 2018-08-28 | 天津国科嘉业医疗科技发展有限公司 | A kind of environment perception method based on intelligent detection equipment |
CN108417087A (en) * | 2018-02-27 | 2018-08-17 | 浙江吉利汽车研究院有限公司 | A kind of vehicle safety traffic system and method |
CN108845579A (en) * | 2018-08-14 | 2018-11-20 | 苏州畅风加行智能科技有限公司 | A kind of automated driving system and its method of port vehicle |
CN109171684A (en) * | 2018-08-30 | 2019-01-11 | 上海师范大学 | A kind of automatic health monitor system based on wearable sensors and smart home |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020228393A1 (en) * | 2019-05-14 | 2020-11-19 | 长沙理工大学 | Deep learning type intelligent driving environment perception system based on internet of things |
CN111326002A (en) * | 2020-02-26 | 2020-06-23 | 公安部交通管理科学研究所 | Prediction method, device and system for environment perception of automatic driving automobile |
CN111833631A (en) * | 2020-06-24 | 2020-10-27 | 武汉理工大学 | Target data processing method, system and storage medium based on vehicle-road cooperation |
CN111833631B (en) * | 2020-06-24 | 2021-10-26 | 武汉理工大学 | Target data processing method, system and storage medium based on vehicle-road cooperation |
CN112435466B (en) * | 2020-10-23 | 2022-03-22 | 江苏大学 | Method and system for predicting take-over time of CACC vehicle changing into traditional vehicle under mixed traffic flow environment |
CN112435466A (en) * | 2020-10-23 | 2021-03-02 | 江苏大学 | Method and system for predicting take-over time of CACC vehicle changing into traditional vehicle under mixed traffic flow environment |
CN112249035A (en) * | 2020-12-16 | 2021-01-22 | 国汽智控(北京)科技有限公司 | Automatic driving method, device and equipment based on general data flow architecture |
CN112249035B (en) * | 2020-12-16 | 2021-03-16 | 国汽智控(北京)科技有限公司 | Automatic driving method, device and equipment based on general data flow architecture |
CN113490178A (en) * | 2021-06-18 | 2021-10-08 | 天津大学 | Intelligent networking vehicle multistage cooperative sensing system |
CN113734197A (en) * | 2021-09-03 | 2021-12-03 | 合肥学院 | Unmanned intelligent control scheme based on data fusion |
CN113911139A (en) * | 2021-11-12 | 2022-01-11 | 湖北芯擎科技有限公司 | Vehicle control method and device and electronic equipment |
CN113911139B (en) * | 2021-11-12 | 2023-02-28 | 湖北芯擎科技有限公司 | Vehicle control method and device and electronic equipment |
CN114056351A (en) * | 2021-11-26 | 2022-02-18 | 文远苏行(江苏)科技有限公司 | Automatic driving method and device |
CN114056351B (en) * | 2021-11-26 | 2024-02-02 | 文远苏行(江苏)科技有限公司 | Automatic driving method and device |
CN114291114A (en) * | 2022-01-05 | 2022-04-08 | 天地科技股份有限公司 | Vehicle control system and method |
Also Published As
Publication number | Publication date |
---|---|
WO2020228393A1 (en) | 2020-11-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110091875A (en) | Deep learning type intelligent driving context aware systems based on Internet of Things | |
CN113002396B (en) | A environmental perception system and mining vehicle for automatic driving mining vehicle | |
EP3784989A1 (en) | Systems and methods for autonomous vehicle navigation | |
CN112009524B (en) | System and method for tramcar obstacle detection | |
CN111427348A (en) | Automatic drive mining dump truck environmental perception system and mining dump truck | |
CN110060467A (en) | Prediction means, prediction technique and storage medium | |
CN208149311U (en) | A kind of context aware systems for automatic Pilot passenger car | |
CN109905847B (en) | Collaborative correction system and method for accumulated errors of GNSS blind area intelligent vehicle auxiliary positioning system | |
CN104267721A (en) | Unmanned driving system of intelligent automobile | |
CN108021133A (en) | A kind of Multi-sensor Fusion high speed unmanned vehicle detects obstacle avoidance system | |
CN106114357A (en) | Device and method for preventing scratching during turning of vehicle | |
CN111383456B (en) | Localized artificial intelligence system for intelligent road infrastructure system | |
CN102997926A (en) | Method for acquiring navigation data | |
US6597984B2 (en) | Multisensory correlation of traffic lanes | |
CN103879404A (en) | Moving-object-traceable anti-collision warning method and device thereof | |
CN107977004A (en) | A kind of round-the-clock high speed unmanned vehicle detection obstacle avoidance system of Multi-sensor Fusion | |
CN107200016A (en) | Road adaptive forecasting method and the Vehicular system using this method | |
CN114415171A (en) | Automobile travelable area detection method based on 4D millimeter wave radar | |
CN212322114U (en) | Environment sensing and road environment crack detection system for automatic driving vehicle | |
CN111506069A (en) | All-weather all-ground crane obstacle identification system and method | |
CN202911633U (en) | Dynamic detection device based on multi-information fusion for hybrid electric vehicle lane identification lines | |
CN115236673A (en) | Multi-radar fusion sensing system and method for large vehicle | |
CN202130447U (en) | Novel lane line deviation detection device | |
CN211742265U (en) | Intelligent roadside system for intelligently driving bus | |
CN105678221A (en) | Pedestrian detection method and system in rainy and snowy weather |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | PB01 | Publication | |
 | SE01 | Entry into force of request for substantive examination | |
 | RJ01 | Rejection of invention patent application after publication | Application publication date: 20190806 |