CN113804211B - Overhead identification method and device

Overhead identification method and device

Info

Publication number
CN113804211B
CN113804211B (application CN202110904063.XA)
Authority
CN
China
Prior art keywords
vehicle
overhead
determining
road
parameter
Prior art date
Legal status
Active
Application number
CN202110904063.XA
Other languages
Chinese (zh)
Other versions
CN113804211A (en)
Inventor
邱宇
李庆奇
李康
黄鹏飞
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202110904063.XA
Publication of CN113804211A
Priority to PCT/CN2022/091520 (published as WO2023010923A1)
Application granted
Publication of CN113804211B


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3407 Route searching; Route guidance specially adapted for specific applications
    • G01C21/343 Calculating itineraries, i.e. routes leading from a starting point to a series of categorical destinations using a global route restraint, round trips, touristic trips
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00 Aspects of pattern recognition specially adapted for signal processing

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

An embodiment of the application provides an overhead identification method and device. In the method, a first parameter of a vehicle at a first moment is determined from GNSS signals, a second parameter of the vehicle at the first moment is determined by a sensor, and the driving state of the vehicle is then determined by combining the first parameter and the second parameter. When the driving state of the vehicle belongs to a target state, an overhead identification result of the vehicle is determined from the height change of the vehicle. The target state includes at least one of the following driving states: starting an up-ramp, ending an up-ramp, starting a down-ramp, and ending a down-ramp. The overhead identification result is that the vehicle is traveling on the road on the upper side of an overhead or that the vehicle is traveling on the road on the lower side of the overhead. The scheme of the embodiment solves the problem in the prior art that the overhead identification result of a vehicle cannot be determined, thereby reducing the number of navigation errors and improving navigation accuracy.

Description

Overhead identification method and device
Technical Field
The application relates to the technical field of navigation, in particular to an overhead identification method and device.
Background
With the development of cities and the improvement of living standards, the number of automobiles keeps increasing. To increase driving speed, ease congestion, and improve safety where roads and pedestrian routes cross, many cities have built overhead roads.
An overhead road, or simply an overhead, is a three-dimensional road erected above a ground road for vehicles to travel on. An overhead divides the road into the road on the upper side of the overhead and the road on the lower side of the overhead. Depending on the destination, a user may choose to drive the vehicle on the road on the upper side of the overhead or on the road on the lower side of the overhead.
In addition, the user often needs to navigate while driving the vehicle. In the course of navigation, a terminal device for navigation generally needs to determine position information of a vehicle and plan an appropriate route for the vehicle accordingly.
However, the road on the upper side of an overhead and the road on the lower side of the same overhead may have identical or similar position information. The terminal device therefore cannot always determine from the position information of the vehicle whether the vehicle is on the road on the upper side of the overhead or on the road on the lower side, which can cause navigation errors. A navigation error often leads the user to drive into an incorrect route, giving the user a very poor driving experience and wasting both driving time and fuel.
Disclosure of Invention
To solve the problem in the prior art that it cannot be identified whether a vehicle is on the road on the upper side of an overhead or on the road on the lower side of the overhead, an embodiment of the application provides an overhead identification method and an overhead identification device.
In a first aspect, an embodiment of the present application provides an overhead identifying method, including:
determining a first parameter of the vehicle at a first moment according to a Global Navigation Satellite System (GNSS) signal;
determining a second parameter of the vehicle at the first moment according to a sensor;
determining a driving state of the vehicle according to the first parameter and the second parameter;
determining an overhead identification result of the vehicle according to a height change of the vehicle when the driving state of the vehicle belongs to a target state, the target state including at least one of the following driving states: starting an up-ramp, ending an up-ramp, starting a down-ramp, and ending a down-ramp, wherein the overhead identification result of the vehicle is that the vehicle is traveling on the road on the upper side of an overhead or that the vehicle is traveling on the road on the lower side of the overhead.
In this implementation, a first parameter at a first moment may be determined from the GNSS signals, a second parameter of the vehicle at the first moment may be determined by the sensor, the driving state of the vehicle may then be determined by combining the first and second parameters, and the overhead identification result of the vehicle may then be determined by combining the driving state of the vehicle with the height change of the vehicle.
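The overall flow described above can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation: the state labels, the threshold value, and the container types are assumptions.

```python
from dataclasses import dataclass

# Hypothetical parameter containers; field names are illustrative only.
@dataclass
class FirstParam:      # derived from GNSS signals
    speed: float       # m/s
    heading: float     # degrees

@dataclass
class SecondParam:     # derived from on-board sensors
    pitch: float       # degrees
    roll: float
    yaw: float

TARGET_STATES = {"start_up_ramp", "end_up_ramp", "start_down_ramp", "end_down_ramp"}

def identify_overhead(first, second, classify, height_change, threshold=3.0):
    """Return 'upper', 'lower', or None when the driving state is not a
    target state. `classify` stands in for the driving-state classifier."""
    state = classify(first, second)
    if state not in TARGET_STATES:
        return None
    # Rising height on an up-ramp: the vehicle is entering the overhead.
    if state in ("start_up_ramp", "end_up_ramp") and height_change > threshold:
        return "upper"
    # Falling height on a down-ramp: the vehicle is leaving the overhead.
    if state in ("start_down_ramp", "end_down_ramp") and abs(height_change) > threshold:
        return "lower"
    return None
```

A classifier that always returns a non-target state yields no identification result, matching the condition that the result is only determined when the driving state belongs to a target state.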
In a possible implementation manner, the first parameter includes at least one of the following information: the speed and heading of the vehicle;
the second parameter includes at least one of the following information: the pitch angle, roll angle and heading angle of the vehicle.
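For illustration, the pitch and roll components of the second parameter can be estimated from a 3-axis accelerometer while the vehicle is not accelerating. This is a minimal sketch under that assumption (the function name and units are illustrative; a production system would fuse these estimates with gyroscope data):

```python
import math

def attitude_from_accel(ax, ay, az):
    """Estimate pitch and roll (degrees) from one static 3-axis
    accelerometer sample (m/s^2). Valid only when the vehicle is not
    accelerating, since linear acceleration is indistinguishable from
    gravity in a single sample."""
    pitch = math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll
```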
In a possible implementation manner, the determining a first parameter of the vehicle at a first moment includes:
when an operation of starting a navigation function is received, determining a first parameter of the vehicle at a first moment;
or, when a position-search operation is received, determining a first parameter of the vehicle at a first moment;
or, when an operation indicating a destination is received, determining a first parameter of the vehicle at a first moment;
or, when an operation indicating that the navigation mode is driving is received, determining a first parameter of the vehicle at a first moment;
or, when the speed of the terminal device is greater than a target speed threshold, determining a first parameter of the vehicle at a first moment;
or, when it is determined from the GNSS signals that a ramp entrance lies ahead of the vehicle, determining a first parameter of the vehicle at a first moment;
or, when it is determined from an image of the area in front of the vehicle that an overhead sign is ahead, determining a first parameter of the vehicle at a first moment.
With this scheme, the terminal device determines the first parameter of the vehicle at the first moment only when a certain condition is met, which reduces the operations needed to determine the first parameter, the amount of data to be processed, and the computing resources occupied in determining the first parameter.
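The trigger conditions above can be modeled as a simple gate. The event names below are illustrative placeholders, not identifiers from the patent:

```python
def should_sample_first_param(event):
    """Return True only for trigger events, so GNSS-derived parameters are
    computed only when overhead identification is likely to be needed."""
    triggers = {
        "navigation_started",      # navigation function started
        "location_searched",       # position-search operation received
        "destination_set",         # destination indicated
        "driving_mode_selected",   # navigation mode set to driving
        "speed_above_threshold",   # terminal speed exceeds target threshold
        "ramp_entrance_ahead",     # ramp entrance detected from GNSS signals
        "overhead_sign_detected",  # overhead sign detected in a camera image
    }
    return event in triggers
```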
In a possible implementation manner, the determining the driving state of the vehicle according to the first parameter and the second parameter includes:
transmitting the first parameter and the second parameter to a classifier, wherein the classifier is used for classifying the running state of the vehicle according to the parameters of the vehicle;
and determining the running state of the vehicle according to the output of the classifier.
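The patent does not specify the classifier's form. As a toy stand-in, a rule-based classifier over two of the parameters might look like this; the thresholds and labels are assumptions, and a real system would feed the full first- and second-parameter vectors to a trained model:

```python
def classify_driving_state(speed, pitch, pitch_threshold=4.0, min_speed=2.0):
    """Label the driving state from vehicle speed (m/s) and pitch (degrees).
    A sustained positive pitch at driving speed suggests an up-ramp, a
    sustained negative pitch suggests a down-ramp."""
    if speed < min_speed:
        return "stationary"
    if pitch > pitch_threshold:
        return "on_up_ramp"
    if pitch < -pitch_threshold:
        return "on_down_ramp"
    return "level_driving"
```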
In a possible implementation manner, the overhead has a single layer, and the determining the overhead identification result of the vehicle according to the height change of the vehicle includes:
if the driving state of the vehicle is starting an up-ramp and the height change of the vehicle in a first time period is greater than a first threshold, determining that the overhead identification result of the vehicle is that the vehicle is traveling on the road on the upper side of an overhead;
or, if the driving state of the vehicle is starting a down-ramp and the absolute value of the height change of the vehicle in the first time period is greater than the first threshold, determining that the overhead identification result of the vehicle is that the vehicle is traveling on the road on the lower side of the overhead;
or, if the driving state of the vehicle is ending an up-ramp and the height change of the vehicle in a preceding second time period is greater than the first threshold, determining that the overhead identification result of the vehicle is that the vehicle is traveling on the road on the upper side of the overhead;
or, if the driving state of the vehicle is ending a down-ramp and the absolute value of the height change of the vehicle in the preceding second time period is greater than the first threshold, determining that the overhead identification result of the vehicle is that the vehicle is traveling on the road on the lower side of the overhead.
Through these steps, when the overhead has a single layer, the overhead identification result of the vehicle can be determined by combining the height change of the vehicle with the driving state of the vehicle, improving navigation accuracy.
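The four single-layer rules can be written as one decision function. The state labels, the use of absolute values for down-ramp height changes, and the threshold value are assumptions for illustration:

```python
def single_layer_result(state, dh, dh_prev, threshold=3.0):
    """Decision rules for a single-layer overhead.
    state:   'start_up', 'start_down', 'end_up', or 'end_down'
    dh:      height change (m) over the first period, after the state begins
    dh_prev: height change (m) over the preceding second period
    Returns 'upper', 'lower', or None when the evidence is inconclusive."""
    if state == "start_up" and dh > threshold:
        return "upper"
    if state == "start_down" and abs(dh) > threshold:
        return "lower"
    if state == "end_up" and dh_prev > threshold:
        return "upper"
    if state == "end_down" and abs(dh_prev) > threshold:
        return "lower"
    return None
```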
In a possible implementation manner, the overhead includes at least two layers, and the determining the overhead identification result of the vehicle according to the height change of the vehicle includes:
if the driving state of the vehicle is starting an up-ramp and the height change of the vehicle in a first time period is greater than a first threshold, determining that the overhead identification result of the vehicle is that the vehicle is traveling on the road on the upper side of an overhead;
or, if the driving state of the vehicle is starting a down-ramp and the absolute value of the height change of the vehicle in the first time period is greater than the first threshold, determining the overhead identification result of the vehicle according to the state of the vehicle before the down-ramp;
or, if the driving state of the vehicle is ending an up-ramp and the height change of the vehicle in a preceding second time period is greater than the first threshold, determining that the overhead identification result of the vehicle is that the vehicle is traveling on the road on the upper side of the overhead;
or, if the driving state of the vehicle is ending a down-ramp and the absolute value of the height change of the vehicle in the preceding second time period is greater than the first threshold, determining the overhead identification result of the vehicle according to the state of the vehicle before the down-ramp.
Through these steps, when the overhead includes multiple layers, the overhead identification result of the vehicle can be determined by combining the height change of the vehicle with the driving state of the vehicle, improving navigation accuracy.
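The multi-layer rules differ from the single-layer ones only for down-ramp states, where the result depends on which floor the vehicle occupied before descending: leaving the first (lowest) overhead floor reaches the ground road, while leaving a higher floor only reaches a lower overhead floor. A hedged sketch, with illustrative labels and threshold:

```python
def multi_layer_result(state, dh, dh_prev, floor_before_down, threshold=3.0):
    """Decision rules for an overhead with two or more layers.
    floor_before_down: 1 if the vehicle was on the first overhead floor
    before descending, larger values for higher floors."""
    if state == "start_up" and dh > threshold:
        return "upper"
    if state == "end_up" and dh_prev > threshold:
        return "upper"
    if state in ("start_down", "end_down"):
        change = dh if state == "start_down" else dh_prev
        if abs(change) > threshold:
            # Descending from floor 1 reaches the ground; descending from a
            # higher floor still leaves the vehicle on an overhead road.
            return "lower" if floor_before_down == 1 else "upper"
    return None
```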
In a possible implementation, the state of the vehicle before the down-ramp includes one of the following states: the vehicle traveling on the first-floor overhead, or the vehicle traveling on an overhead of another floor, where the first-floor overhead is the overhead layer closest to the road on the lower side of the overhead.
In a possible implementation manner, the method further includes:
determining the state of the vehicle before the down-ramp according to the height of the vehicle and the height of each overhead layer;
or, after determining the overhead identification result of the vehicle, recording changes in the layer number of the overhead on which the vehicle is located;
and determining the state of the vehicle before the down-ramp according to the changes in the layer number of the overhead on which the vehicle is located.
Through these steps, the state of the vehicle before the down-ramp can be determined, so that the overhead identification result of the vehicle can be determined according to that state.
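The second option, recording changes in the layer number, can be sketched as a small counter. The class and method names are illustrative; an alternative, per the first option, is to compare the vehicle's absolute height against the known height of each layer:

```python
class FloorTracker:
    """Track the overhead floor the vehicle is on by counting completed
    up- and down-ramps. Floor 0 is the ground road; floor 1 is the first
    overhead layer (the one closest to the ground road)."""
    def __init__(self):
        self.floor = 0

    def on_ramp_completed(self, direction):
        if direction == "up":
            self.floor += 1
        elif direction == "down" and self.floor > 0:
            self.floor -= 1

    def was_on_first_floor(self):
        # True when a subsequent down-ramp would reach the ground road.
        return self.floor == 1
```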
In a possible implementation manner, the method further includes:
after determining the overhead identification result of the vehicle, reporting the overhead identification result of the vehicle to a server;
or, if changes in the layer number of the overhead on which the vehicle is located are recorded and the overhead identification result indicates that the vehicle is traveling on the road on the upper side of the overhead, reporting both the overhead identification result and the layer number of the overhead on which the vehicle is located to the server.
Through these steps, the accuracy with which the server determines the position of the vehicle can be improved, further improving navigation accuracy.
In a possible implementation manner, the method further includes:
if the overhead identification result of the vehicle indicates that the vehicle is traveling on the road on the lower side of the overhead and a weak-signal area lies ahead in the driving direction of the vehicle, the navigation method is adjusted from navigation by GNSS signals to navigation by a network positioning method, or to navigation by a network positioning method combined with an inertial navigation method.
With this implementation, the navigation method can be adjusted before the vehicle reaches the weak-signal area, reducing the influence of the weak-signal area on navigation accuracy and ensuring the navigation accuracy of the terminal device within it.
In a possible implementation manner, the weak-signal area includes a tunnel area or an occluded area.
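Switching positioning sources before entering a weak-signal area might be modeled as follows; the source names and function signature are assumptions for illustration:

```python
def select_positioning(overhead_result, weak_signal_ahead, has_inertial=True):
    """Pick positioning sources before the vehicle enters a weak-signal
    area (e.g. a tunnel or an occluded stretch under the overhead).
    Returns the list of sources to use, in priority order."""
    if overhead_result == "lower" and weak_signal_ahead:
        sources = ["network"]          # network positioning replaces GNSS
        if has_inertial:
            sources.append("inertial") # optionally fused with inertial nav
        return sources
    return ["gnss"]                    # default: GNSS-based navigation
```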
In a second aspect, an embodiment of the present application provides an overhead identifying apparatus, the apparatus including: a transceiver and a processor;
the transceiver is used for receiving GNSS signals of a global navigation satellite system;
the processor is configured to:
determining a first parameter of the vehicle at a first moment according to the GNSS signals;
determining a second parameter of the vehicle at the first moment according to a sensor;
determining a driving state of the vehicle according to the first parameter and the second parameter;
determining an overhead identification result of the vehicle according to a height change of the vehicle when the driving state of the vehicle belongs to a target state, the target state including at least one of the following driving states: starting an up-ramp, ending an up-ramp, starting a down-ramp, and ending a down-ramp, wherein the overhead identification result of the vehicle is that the vehicle is traveling on the road on the upper side of an overhead or that the vehicle is traveling on the road on the lower side of the overhead.
In a possible implementation manner, the first parameter includes at least one of the following information: the speed and heading of the vehicle;
the second parameter includes at least one of the following information: the pitch angle, roll angle and heading angle of the vehicle.
In a possible implementation manner, the processor is configured to determine a first parameter of the vehicle at a first moment, specifically:
when an operation of starting a navigation function is received, determining a first parameter of the vehicle at a first moment;
or, when a position-search operation is received, determining a first parameter of the vehicle at a first moment;
or, when an operation indicating a destination is received, determining a first parameter of the vehicle at a first moment;
or, when an operation indicating that the navigation mode is driving is received, determining a first parameter of the vehicle at a first moment;
or, when the speed of the terminal device is greater than a target speed threshold, determining a first parameter of the vehicle at a first moment;
or, when it is determined from the GNSS signals that a ramp entrance lies ahead of the vehicle, determining a first parameter of the vehicle at a first moment;
or, when it is determined from an image of the area in front of the vehicle that an overhead sign is ahead, determining a first parameter of the vehicle at a first moment.
In a possible implementation manner, the processor is configured to determine a driving state of the vehicle according to the first parameter and the second parameter, specifically:
transmitting the first parameter and the second parameter to a classifier, wherein the classifier is used for classifying the running state of the vehicle according to the parameters of the vehicle;
and determining the running state of the vehicle according to the output of the classifier.
In a possible implementation manner, the overhead has a single layer, and the processor is configured to determine the overhead identification result of the vehicle according to the height change of the vehicle, specifically:
if the driving state of the vehicle is starting an up-ramp and the height change of the vehicle in a first time period is greater than a first threshold, determining that the overhead identification result of the vehicle is that the vehicle is traveling on the road on the upper side of an overhead;
or, if the driving state of the vehicle is starting a down-ramp and the absolute value of the height change of the vehicle in the first time period is greater than the first threshold, determining that the overhead identification result of the vehicle is that the vehicle is traveling on the road on the lower side of the overhead;
or, if the driving state of the vehicle is ending an up-ramp and the height change of the vehicle in a preceding second time period is greater than the first threshold, determining that the overhead identification result of the vehicle is that the vehicle is traveling on the road on the upper side of the overhead;
or, if the driving state of the vehicle is ending a down-ramp and the absolute value of the height change of the vehicle in the preceding second time period is greater than the first threshold, determining that the overhead identification result of the vehicle is that the vehicle is traveling on the road on the lower side of the overhead.
In a possible implementation manner, the overhead includes at least two layers, and the processor is configured to determine the overhead identification result of the vehicle according to the height change of the vehicle, specifically:
if the driving state of the vehicle is starting an up-ramp and the height change of the vehicle in a first time period is greater than a first threshold, determining that the overhead identification result of the vehicle is that the vehicle is traveling on the road on the upper side of an overhead;
or, if the driving state of the vehicle is starting a down-ramp and the absolute value of the height change of the vehicle in the first time period is greater than the first threshold, determining the overhead identification result of the vehicle according to the state of the vehicle before the down-ramp;
or, if the driving state of the vehicle is ending an up-ramp and the height change of the vehicle in a preceding second time period is greater than the first threshold, determining that the overhead identification result of the vehicle is that the vehicle is traveling on the road on the upper side of the overhead;
or, if the driving state of the vehicle is ending a down-ramp and the absolute value of the height change of the vehicle in the preceding second time period is greater than the first threshold, determining the overhead identification result of the vehicle according to the state of the vehicle before the down-ramp.
In a possible implementation, the state of the vehicle before the down-ramp includes one of the following states: the vehicle traveling on the first-floor overhead, or the vehicle traveling on an overhead of another floor, where the first-floor overhead is the overhead layer closest to the road on the lower side of the overhead.
In a possible implementation, the processor is further configured to:
determining the state of the vehicle before the down-ramp according to the height of the vehicle and the height of each overhead layer;
or, after determining the overhead identification result of the vehicle, recording changes in the layer number of the overhead on which the vehicle is located;
and determining the state of the vehicle before the down-ramp according to the changes in the layer number of the overhead on which the vehicle is located.
In a possible implementation, the processor is further configured to:
after determining the overhead identification result of the vehicle, reporting the overhead identification result of the vehicle to a server;
or, if changes in the layer number of the overhead on which the vehicle is located are recorded and the overhead identification result indicates that the vehicle is traveling on the road on the upper side of the overhead, reporting both the overhead identification result and the layer number of the overhead on which the vehicle is located to the server.
In a possible implementation, the processor is further configured to:
if the overhead identification result of the vehicle indicates that the vehicle is traveling on the road on the lower side of the overhead and a weak-signal area lies ahead in the driving direction of the vehicle, the navigation method is adjusted from navigation by GNSS signals to navigation by a network positioning method, or to navigation by a network positioning method combined with an inertial navigation method.
In a possible implementation manner, the weak-signal area includes a tunnel area or an occluded area.
In a third aspect, an embodiment of the present application provides a terminal device including a processor that, when executing a computer program or instructions in a memory, performs the method according to the first aspect.
In a fourth aspect, an embodiment of the present application provides a terminal device, where the terminal device includes a processor and a memory; the memory is used for storing a computer program or instructions; the processor is configured to execute the computer program or instructions stored in the memory, to cause the terminal device to perform the method according to the first aspect.
In a fifth aspect, the present application provides a terminal device comprising a processor, a memory and a transceiver; the transceiver is used for receiving signals or transmitting signals; the memory is used for storing a computer program or instructions; the processor is configured to execute the computer program or instructions stored in the memory, to cause the terminal device to perform the method according to the first aspect.
In a sixth aspect, the present application provides a terminal device comprising a processor and an interface circuit; the interface circuit is used for receiving a computer program or instructions and transmitting the computer program or instructions to the processor; the processor is configured to execute the computer program or instructions to cause the terminal device to perform the method according to the first aspect.
In a seventh aspect, the present application provides a computer storage medium storing a computer program or instructions which, when executed, cause the method of the first aspect to be carried out.
In an eighth aspect, the application provides a computer program product comprising a computer program or instructions which, when executed, cause the method of the first aspect to be carried out.
In a ninth aspect, the present application provides a chip comprising a processor coupled to a memory for executing a computer program or instructions stored in the memory, which when executed, performs a method as described in the first aspect.
The embodiment of the application provides an overhead identification method and an overhead identification device. In the method, a first parameter at a first moment is determined from GNSS signals, a second parameter of the vehicle at the first moment is determined by a sensor, the driving state of the vehicle is then determined by combining the first and second parameters, and the overhead identification result of the vehicle is then determined by combining the driving state with the height change of the vehicle. This solves the problem in the prior art that the overhead identification result of a vehicle cannot be determined, reduces the frequency of navigation errors, and improves navigation accuracy.
Furthermore, because the scheme of the application improves navigation accuracy, it can improve the user's driving experience, reduce the time spent driving, and reduce the fuel consumption of the vehicle, thereby saving energy.
Drawings
FIG. 1 is a schematic diagram of a road network with overhead up-ramps and down-ramps for a vehicle;
FIG. 2 is a schematic diagram of a GNSS system;
FIG. 3 is an interface schematic diagram of an electronic map displayed by a terminal device;
fig. 4 is a schematic view of a driving scenario of a vehicle according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a mobile phone according to an embodiment of the present application;
fig. 6 is a software structure block diagram of a mobile phone according to an embodiment of the present application;
FIG. 7 is a schematic view of a vehicle according to an embodiment of the present application;
fig. 8 (a) is an exemplary diagram of an interface of a terminal device according to an embodiment of the present application;
fig. 8 (b) is an exemplary diagram of an interface of still another terminal device according to an embodiment of the present application;
fig. 8 (c) is an exemplary diagram of an interface of still another terminal device according to an embodiment of the present application;
FIG. 9 is a schematic workflow diagram of an overhead recognition method according to an embodiment of the present application;
fig. 10 (a) is a schematic view of a road driving scenario of a vehicle on the upper side of an overhead according to an embodiment of the present application;
Fig. 10 (b) is a plan view showing a vehicle traveling on a road on an upper side of an overhead according to an embodiment of the present application;
fig. 11 is an interface schematic diagram of an electronic map displayed by a terminal device according to an embodiment of the present application;
FIG. 12 is a block diagram illustrating an embodiment of a navigation device according to the present application;
fig. 13 is a block diagram illustrating a structure of an embodiment of a chip according to the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings in the embodiments of the present application.
The terminology used in the following examples is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of the application and the appended claims, the singular forms "a," "an," and "the" are intended to include expressions such as "one or more," unless the context clearly indicates otherwise. It should also be understood that in the following embodiments of the present application, "at least one" and "one or more" mean one, two, or more than two. The term "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may represent: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the objects before and after it.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
For clarity and conciseness in the description of the following embodiments, a brief description of the related art will be given first:
To maintain driving speed, ease congestion, and resolve the safety problems where roads cross pedestrian paths, many cities have constructed overhead roads to cope with the increasing number of automobiles.
An overhead road, referred to simply as an overhead, is a three-dimensional road erected above a ground road for vehicle traffic. An overhead divides the roadway into a road on the upper side of the overhead and a road on the lower side of the overhead. In a scene with only one overhead level, the road on the upper side of the overhead is the elevated road above the ground, and the road on the lower side of the overhead is the ground road beneath it; in a scene with two or more overhead levels, the road on the upper side of the overhead is any elevated level above the ground, and the road on the lower side of the overhead is the ground road below the elevated level closest to the ground.
In addition, an overhead is usually provided with ramps. The ramps generally comprise on-ramps and off-ramps: a vehicle must first pass through an on-ramp before it can enter the road on the upper side of an overhead. When a vehicle leaves an overhead, or descends from one elevated level to a lower one, it must first pass through an off-ramp before entering the road on the lower side of the overhead or the lower elevated level.
To clarify the scenario of a vehicle traveling on an overhead, fig. 1 is provided. The scene shown in fig. 1 includes a single-level overhead, and the corresponding roads include a road on the upper side of the overhead and a road on the lower side of the overhead. The solid arrow in front of the vehicle indicates its direction of travel. In the initial stage, the vehicle is at the left side of fig. 1, traveling on the road on the lower side of the overhead. After a while, the vehicle ascends via an on-ramp and drives onto the road on the upper side of the overhead, reaching the position of the vehicle shown on the right side of fig. 1. Then, after traveling on the upper road for a while, the vehicle descends via an off-ramp back to the road on the lower side of the overhead. In this process, the travel route of the vehicle is: the road on the lower side of the overhead, then the road on the upper side of the overhead, and finally the road on the lower side of the overhead again, i.e., back on the ground.
During the running of a vehicle, a user can usually navigate using a terminal device (e.g., a mobile phone or an in-vehicle terminal). For example, the user may turn on navigation on the terminal device before the vehicle reaches an overhead, or while the vehicle is traveling on the road on the lower side of an overhead. Currently, terminal devices determine their own position mainly by means of a global navigation satellite system (Global Navigation Satellite System, GNSS). GNSS is a space-based radio navigation and positioning system capable of providing all-weather three-dimensional coordinates, speed, and time information to a user at any place on the earth's surface or in near-earth space.
GNSS systems typically include the global positioning system (global positioning system, GPS) of the United States, the GLONASS (globalnaja nawigazionnaja sputnikowaja sistema) system of Russia, the GALILEO system of the European Union, the BeiDou satellite navigation system of China, and so on.
Among them, the GPS system is a satellite-based radio navigation and positioning system comprising 24 satellites with worldwide coverage. The BeiDou satellite navigation system is a global satellite navigation system independently developed by China; it is divided into two generations, the BeiDou-1 system and the BeiDou-2 system, and typically includes four geosynchronous orbit satellites.
Referring to the schematic diagram of the GNSS navigation system shown in FIG. 2, the GNSS navigation system generally comprises three major parts, namely a space part, a ground monitoring part and a user receiver.
As shown in fig. 2, the space portion of the GNSS navigation system includes a plurality of satellites 10, the ground monitoring portion includes a ground monitoring tracking station 20, the ground monitoring tracking station 20 generally includes a master station, a monitoring station, and an injection station, and a user receiver 30 of the GNSS navigation system may receive satellite signals transmitted by the plurality of satellites 10.
The basic principle of a GNSS navigation system is to determine the position of a user receiver from its distances to satellites whose positions are known. The position of a satellite can be found in the satellite ephemeris according to the time recorded by the satellite-borne clock, and the distance between the user receiver and the satellite can be determined from the time it takes the satellite signal (which may be called a GNSS signal) to travel from the satellite to the user receiver.
During navigation, the ground monitoring tracking station 20 may transmit satellite ephemeris and other information to the plurality of satellites 10. The plurality of satellites 10 continuously transmit satellite signals, which typically include the satellite ephemeris and the transmission time of the signal. The user receiver 30 searches for and receives satellite signals, determines the position of each satellite 10 from the ephemeris in the signal, determines its distance to each satellite 10 from its own clock and the transmission time of the signal, and then determines its own position based on the satellite positions and the distances to them.
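The positioning principle described above (ranges derived from signal travel time, combined with known satellite positions) can be sketched as a toy multilateration solver. This is an illustrative sketch only, not the receiver's actual algorithm: the coordinates below are synthetic, the receiver clock bias that a real GNSS solution must also estimate is ignored, and the position is found by plain Gauss-Newton least squares.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def pseudorange(t_transmit_s, t_receive_s):
    """Distance implied by signal travel time (ideal clocks assumed)."""
    return C * (t_receive_s - t_transmit_s)

def distance(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [v - f * w for v, w in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

def solve_position(sats, ranges, guess=(0.0, 0.0, 0.0), iters=25):
    """Gauss-Newton least squares on the 3 position unknowns."""
    x = list(guess)
    for _ in range(iters):
        # residual r_i = |x - s_i| - rho_i; Jacobian row i is the
        # unit vector from satellite i toward the current estimate
        J, r = [], []
        for s, rho in zip(sats, ranges):
            d = distance(x, s)
            J.append([(x[k] - s[k]) / d for k in range(3)])
            r.append(d - rho)
        # normal equations (J^T J) dx = -J^T r
        A = [[sum(row[a] * row[b] for row in J) for b in range(3)]
             for a in range(3)]
        rhs = [-sum(J[i][a] * r[i] for i in range(len(J))) for a in range(3)]
        dx = solve3(A, rhs)
        x = [x[k] + dx[k] for k in range(3)]
    return tuple(x)

if __name__ == "__main__":
    # four satellites at synthetic positions (metres), exact ranges
    sats = [(15e6, 1e6, 18e6), (-14e6, 2e6, 21e6),
            (2e6, 15e6, 19e6), (1e6, -15e6, 23e6)]
    true_pos = (1.2e6, 0.8e6, 0.0)
    ranges = [distance(true_pos, s) for s in sats]
    print(solve_position(sats, ranges))
```

With noise-free ranges the estimate converges to the true position; a real receiver additionally solves for its clock offset (a fourth unknown), which is why at least four satellites are needed in practice.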
The user can navigate through the terminal device, which can be a device with navigation functions such as a mobile terminal (e.g., a mobile phone) or an in-vehicle unit. In addition, the terminal device displays an electronic map during navigation, so that the user can conveniently look up a destination and plan a route.
During vehicle navigation, after determining its own position, the terminal device can display that position on an electronic map. The electronic map generally shows the environment around the position of the terminal device and marks the position of the vehicle; it may further include a route planned for the vehicle and indicate the vehicle's direction of travel, so as to meet the user's navigation needs.
It can be seen from the brief description above that, at present, the terminal device generally determines its own position information according to the received GNSS signals, and then navigates for the user according to that position information.
However, the position information of the road on the upper side and the road on the lower side of the same overhead may be identical or similar. In this case, the terminal device cannot determine from the position information alone whether the vehicle is on the road on the upper side of the overhead or on the road on the lower side, and navigation errors are therefore likely to occur.
Such navigation errors often lead the user to drive the vehicle onto an incorrect route, resulting in an extremely poor driving experience as well as longer travel times and higher fuel consumption.
Fig. 3 shows an example of an electronic map displayed by a terminal device when navigation accuracy is low. Referring to fig. 3, the navigation indicates that the user's vehicle is on the North Fourth Ring East Road auxiliary road, under the overhead bridge; the solid-line triangle in fig. 3 represents the position indicated by the navigation. The vehicle, however, has actually driven from under the overhead bridge onto it; the dotted-line triangle in fig. 3 represents the actual position of the vehicle. It can be seen that the actual position of the vehicle is inconsistent with the navigation position indicated in the electronic map displayed by the user's terminal device, i.e., the navigation has misidentified the vehicle's position.
In order to solve the above problems, embodiments of the present application provide an overhead identification method and apparatus to identify whether a vehicle is on the road on the upper side or the road on the lower side of an overhead, so as to improve navigation accuracy.
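This excerpt has not yet described the specific parameters the claimed method uses, so purely as an illustration of the problem being solved, the following sketch distinguishes the two states from the relative altitude derived from a barometric pressure sensor. The function names, the threshold, and the choice of sensor are assumptions for illustration, not the patent's method.

```python
def pressure_to_altitude(p_hpa, p0_hpa=1013.25):
    """International barometric formula: approximate altitude in metres
    implied by a pressure reading, relative to reference pressure p0."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

def classify_overhead(pressures_hpa, climb_threshold_m=4.0):
    """Hypothetical classifier: compare the altitude change between the
    first and last samples of a pressure trace against a threshold.
    Returns 'upper' (climbed onto the overhead), 'lower' (descended
    from it), or 'unknown' (no significant altitude change)."""
    alts = [pressure_to_altitude(p) for p in pressures_hpa]
    delta = alts[-1] - alts[0]
    if delta > climb_threshold_m:
        return "upper"
    if delta < -climb_threshold_m:
        return "lower"
    return "unknown"
```

For example, a trace whose pressure drops by about 1 hPa corresponds to roughly 8-10 m of climb, enough to flag an on-ramp transition under these assumptions; in practice such a signal would be fused with GNSS position, map data, and motion sensors rather than used alone.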
The technical solution of the present application can be applied to the field of vehicle driving, including but not limited to automated driving (ADS), intelligent driving, intelligent connected vehicles (Intelligent Connected Vehicle, ICV), and the like. The present application provides a technical solution for identifying whether a vehicle is in an on-overhead or under-overhead state; it can be applied to the field of vehicle driving to provide positioning and navigation services for vehicles.
The technical solution of the present application can be applied to any positioning or navigation system. Fig. 4 is a schematic view of a vehicle driving scene provided in this embodiment. The scenario involves a server, at least one terminal device, and the vehicle to which the terminal device corresponds. The server and the terminal device (such as a mobile phone) can be connected through a wireless network.
Further, the server may be a service platform or an Internet-of-Vehicles server for managing the mobile phone terminal; for example, the server receives messages sent by the mobile phone terminal, determines the vehicle position, and provides map and real-time navigation services for the user. The server may store electronic maps of a plurality of areas.
The terminal device is used for sending requests to the server and realizing real-time positioning and navigation of the vehicle. In addition, the vehicle comprises a communication module and a processing module, which are used for receiving signals sent by the server and/or the mobile phone terminal, controlling the starting and stopping of the vehicle according to those signals and a preset program, and obtaining the on-overhead or under-overhead state of the vehicle.
Alternatively, the server may be one or more independent servers, a server cluster, or a cloud platform service deployed in the cloud. The server may also be a network device, such as a base station (BS); the base station may be a base transceiver station (base transceiver station, BTS) in a global system for mobile communications (global system for mobile communication, GSM) or code division multiple access (code division multiple access, CDMA) network, a NodeB in wideband CDMA (wideband-CDMA, WCDMA), an evolved NodeB (eNB/e-NodeB) in LTE, a next-generation evolved NodeB (next generation eNB, ng-eNB), a base station (gNB) in NR, or an access node in a future mobile communication system or a wireless fidelity (wireless fidelity, WiFi) system, or the like.
The terminal device in the embodiments of the present application may be a device that provides voice and/or data connectivity to a user: a handheld device with wireless connectivity, or another processing device connected to a wireless modem, such as a wireless terminal, a vehicle-mounted wireless terminal, a portable device, a wearable device, a mobile phone (or "cellular" phone), or a portable, pocket-sized, or hand-held terminal that exchanges voice and/or data with a radio access network. Examples include personal communication service (personal communication service, PCS) phones, cordless phones, session initiation protocol (SIP) phones, wireless local loop (wireless local loop, WLL) stations, and personal digital assistants (personal digital assistant, PDA). The wireless terminal may also be called a subscriber unit (subscriber unit), an access terminal (access terminal), a user terminal (user terminal), a user agent (user agent), a user device, or user equipment (user equipment, UE); the present application does not limit the type of the terminal device.
The mobile phone is taken as an example of the terminal device; fig. 5 shows a schematic structural diagram of the mobile phone.
The handset may include, among other things, a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, and the like.
It will be appreciated that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the mobile phone. In other embodiments of the application, the handset may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The wireless communication function of the mobile phone can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset may be used to cover a single communication band or multiple communication bands. Different antennas may also be multiplexed to improve their utilization; for example, antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antennas may be used in combination with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied to a cell phone. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc. applied to a cell phone. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 of the handset is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160, so that the handset can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include a global positioning system (global positioning system, GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (beidou navigation satellite system, BDS), a quasi-zenith satellite system (quasi-zenith satellite system, QZSS), and/or a satellite-based augmentation system (satellite based augmentation systems, SBAS).
The cell phone implements display functions through the GPU, the display 194, and the application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a Miniled, a MicroLed, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the handset may include 1 or N display screens 194, where N is a positive integer greater than 1.
The cell phone may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter opens and light is transmitted through the lens to the camera's photosensitive element, which converts the optical signal into an electrical signal and passes it to the ISP, where it is processed into an image visible to the naked eye. The ISP can also optimize the noise, brightness, and skin color of the image, as well as parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a complementary metal-oxide-semiconductor (complementary metal-oxide-semiconductor, CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the handset may include 1 or N cameras 193, where N is a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the mobile phone selects a frequency point, the digital signal processor is used to perform fourier transform on the frequency point energy, etc.
Video codecs are used to compress or decompress digital video. The handset may support one or more video codecs, so that it can play or record videos in a variety of encoding formats, for example: moving picture experts group (moving picture experts group, MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and so on.
The external memory interface 120 may be used to connect to an external memory card, such as a Micro SD card, to extend the memory capabilities of the handset. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the cellular phone and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the handset (e.g., audio data, phonebook, etc.), etc. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The handset may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "loudspeaker," is used to convert audio electrical signals into sound signals. The handset can play music or conduct a hands-free call through the speaker 170A.
The receiver 170B, also referred to as an "earpiece," is used to convert audio electrical signals into sound signals. When answering a call or listening to a voice message on the handset, the user can listen by placing the receiver 170B close to the ear.
The microphone 170C, also referred to as a "mic" or "mike," is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak close to the microphone 170C to input a sound signal. The handset may be provided with at least one microphone 170C. In other embodiments, the handset may be provided with two microphones 170C, which can perform noise reduction in addition to collecting sound signals. In still other embodiments, the handset may be provided with three, four, or more microphones 170C to additionally identify sound sources, implement directional recording, and so on.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface, or a cellular telecommunications industry association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
The sensor module 180 may include a pressure sensor, a gyroscope sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
Of course, the mobile phone may further include a charging management module, a power management module, a battery, a key, an indicator, 1 or more SIM card interfaces, and the embodiment of the present application is not limited in this respect.
Still taking the mobile phone as an example of the terminal device, the software system of the mobile phone may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. The embodiments of the present application take the Android system with a layered architecture as an example to illustrate the software structure of the mobile phone.
Fig. 6 is a software architecture block diagram of an embodiment of a mobile phone according to the present application. Referring to fig. 6, the layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime (Android runtime) and system libraries, and the kernel layer.
The application layer may include a series of application packages. As shown in fig. 6, the application package may include applications such as cameras, gallery, phone calls, navigation, bluetooth, music, video, short messages, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions. As shown in fig. 6, the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager may obtain the size of the display screen, obtain parameters of each display area on the display interface, and so on.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including camera icons.
The telephone manager is used to provide the communication functions of the mobile phone, such as management of call status (e.g., connected or hung up).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages, which may disappear automatically after a short stay without user interaction; for example, the notification manager is used to announce that a download is complete, to give message alerts, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text may be prompted in the status bar, a prompt tone may sound, the electronic device may vibrate, or an indicator light may blink.
The Android runtime includes a core library and a virtual machine, and is responsible for scheduling and management of the Android system. The core library consists of two parts: the functions that the Java language needs to call, and the core library of Android. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files, and performs functions such as object life-cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (media library), three-dimensional graphics processing library (e.g., openGL ES), 2D graphics engine (e.g., SGL), etc. The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications. Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio video encoding formats, such as: MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc. The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer may contain display drivers, camera drivers, audio drivers, sensor drivers, etc.
The system library and the kernel layer below the application framework layer may also be called the underlying system. The underlying system includes a state monitoring service for identifying posture changes of the mobile phone, and the state monitoring service may be arranged in the system library and/or the kernel layer.
In another possible implementation manner, the terminal device performing the overhead identification method provided by the embodiment of the present application may be a vehicle. For example, the overhead identification method may be performed by a head unit within a vehicle, where the head unit is typically installed in the center console of the vehicle.
In this implementation, the vehicle may be an intelligent vehicle. Fig. 7 is a functional block diagram of a vehicle 100 provided by an embodiment of the present application. Referring to fig. 7, vehicle 100 may include various subsystems such as a travel system 1002, a sensor system 1004, a planning control system 1006, one or more peripheral devices 1008, as well as a power supply 1010, a computer system 1001, and a user interface 1016.
Alternatively, vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements. In addition, each of the subsystems and elements of the vehicle 100 may be interconnected by wires or wirelessly.
The travel system 1002 may include components that power the vehicle 100. In one embodiment, the travel system 1002 may include an engine 1018, an energy source 1019, a transmission 1020, and wheels 1021. The engine 1018 may be an internal combustion engine, an electric motor, an air-compression engine, or another type of engine or combination of engines; the combination of engines may include, for example, a hybrid engine composed of a gasoline engine and an electric motor, or a hybrid engine composed of an internal combustion engine and an air-compression engine. The engine 1018 converts the energy source 1019 into mechanical energy.
Examples of energy sources 1019 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electricity. The energy source 1019 may also provide energy to other systems of the vehicle 100.
The transmission 1020 may transmit mechanical power from the engine 1018 to the wheels 1021. The transmission 1020 may include a gearbox, a differential, and a drive shaft. In one embodiment, transmission 1020 may also include other devices, such as a clutch. Wherein the drive shaft may comprise one or more axles that may be coupled to one or more wheels 1021.
The sensor system 1004 may include several sensors that sense information about the vehicle 100 itself and the environment surrounding the vehicle 100. For example, the sensor system 1004 may include a positioning system 1022 (which may be a GNSS system and may include a GPS system, a BeiDou system, or another positioning system), an inertial measurement unit (inertial measurement unit, IMU) 1024, a radar 1026, a laser rangefinder 1028, a camera 1030, a computer vision system 1038, and a sensor fusion algorithm 1040. The sensor system 1004 may also include sensors of internal systems of the vehicle 100 (e.g., an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors may be used to detect objects and their corresponding characteristics (position, shape, direction, speed, etc.). Such detection and identification is a critical function for the safe operation of the vehicle 100.
The global positioning system 1022 may be used to estimate the geographic location of the vehicle 100. The IMU 1024 is used to sense position and orientation changes of the vehicle 100 based on inertial acceleration. In one embodiment, the IMU 1024 may be a combination of accelerometers and gyroscopes.
Radar 1026 may utilize radio signals to sense objects in the surroundings of vehicle 100. In some embodiments, radar 1026 may be used to sense the speed or direction of travel of an object in addition to sensing the object.
Laser rangefinder 1028 may utilize a laser to sense objects in the environment in which vehicle 100 is located. In some embodiments, laser rangefinder 1028 may include one or more laser sources, a laser scanner, and one or more detectors, among other system components.
The camera 1030 may be used to capture multiple images of the surrounding environment of the vehicle 100. The camera 1030 may be a still camera or a video camera.
The computer vision system 1038 may be operative to process and analyze images captured by the camera 1030 in order to identify objects or features in the environment surrounding the vehicle 100. The objects or features may include traffic signals, road boundaries, and targets. The computer vision system 1038 may use object recognition algorithms, structure-from-motion (Structure from Motion, SFM) algorithms, video tracking, and other computer vision techniques. In some embodiments, the computer vision system 1038 may be used to map the environment, track objects, estimate the speed of objects, and so on.
The planning control system 1006 controls the operation of the vehicle 100 and its components. The planning control system 1006 may include various elements, including a steering system 1032, a throttle 1034, a brake unit 1036, a route planning system 1042, and a control system 1044.
The steering system 1032 is operable to adjust the heading direction of the vehicle 100; for example, in one embodiment it may be a steering-wheel system.
The throttle 1034 is used to control the operating speed of the engine 1018 and thus the speed of the vehicle 100.
The brake unit 1036 is used to control the vehicle 100 to decelerate. The brake unit 1036 may use friction to slow the wheel 1021. In other embodiments, the brake unit 1036 may convert kinetic energy of the wheels 1021 into electrical current. The brake unit 1036 may take other forms to slow the rotational speed of the wheels 1021 to control the speed of the vehicle 100.
The route planning system 1042 is used to determine a travel route for the vehicle 100. In some embodiments, the route planning system 1042 may combine data from the computer vision system 1038, the positioning system 1022, and one or more predetermined maps to plan a travel route for the vehicle 100 that avoids potential targets in the environment. The trajectory planning method provided in the embodiment of the present application may be executed by the route planning system 1042 to output a target driving trajectory for the vehicle 100. The target driving trajectory includes a plurality of target waypoints, each of which includes the coordinates of the waypoint as well as a lateral allowable error and a speed allowable error of the waypoint. The lateral allowable error includes a value range of the lateral allowable error and may in some cases be understood as shorthand for that value range. The lateral direction here refers to a direction perpendicular or approximately perpendicular to the traveling direction of the vehicle; the lateral allowable error is, in fact, a lateral displacement allowable error, that is, the range of allowable displacement errors of the vehicle 100 in a direction perpendicular or approximately perpendicular to its traveling direction. This will not be repeated later.
The control system 1044 is configured to generate control quantities for the throttle, brake, and steering angle according to the travel route or travel trajectory output by the route planning system, thereby controlling the steering system 1032, the throttle 1034, and the brake unit 1036.
Of course, in one example, the planning control system 1006 may additionally or alternatively include components other than those shown and described, or some of the components shown above may be omitted.
The vehicle 100 interacts with external sensors, other vehicles, other computer systems, or users through the peripheral device 1008. Peripheral devices 1008 may include a wireless communication system 1046, a vehicle computer 1048, a microphone 1050, or a speaker 1052.
In some embodiments, the peripheral device 1008 provides a means for a user of the vehicle 100 to interact with the user interface 1016. For example, the vehicle computer 1048 may provide information to a user of the vehicle 100. The user interface 1016 may also operate the vehicle computer 1048 to receive user input. In one implementation, the vehicle computer 1048 may be operated via a touch screen. In other cases, the peripheral device 1008 may provide a means for the vehicle 100 to communicate with other devices located within the vehicle. For example, microphone 1050 may receive audio (e.g., voice commands or other audio input) from a user of vehicle 100. Similarly, speaker 1052 can output audio to a user of vehicle 100.
The wireless communication system 1046 may communicate wirelessly with one or more devices, directly or via a communication network. For example, the wireless communication system 1046 may use 3G cellular communication such as CDMA, EVDO, or GSM/GPRS; 4G cellular communication such as LTE; or 5G cellular communication. The wireless communication system 1046 may communicate with a wireless local area network (wireless local area network, WLAN) using WiFi. In some embodiments, the wireless communication system 1046 may communicate directly with a device using an infrared link, Bluetooth, or ZigBee. Other wireless protocols, such as various vehicle communication systems, are also possible; for example, the wireless communication system 1046 may include one or more dedicated short range communications (dedicated short range communications, DSRC) devices, which may carry public or private data communications between vehicles and/or roadside stations.
The power source 1010 may provide power to various components of the vehicle 100. In one embodiment, the power source 1010 may be a rechargeable lithium ion or lead acid battery. One or more battery packs of such batteries may be configured as a power source and provide power for various components of the vehicle 100. In some embodiments, the power source 1010 and the energy source 1019 may be implemented together, such as in an all-electric vehicle.
Some or all of the functions of the vehicle 100 are controlled by the computer system 1001. The computer system 1001 may include at least one processor 1013, the processor 1013 executing instructions 1015 stored in a non-transitory computer-readable medium such as a memory 1014. The computer system 1001 may also be a plurality of computing devices that control individual components or subsystems of the vehicle 100 in a distributed manner.
The processor 1013 may be any conventional processor, such as a commercially available CPU. Alternatively, the processor may be a dedicated device such as an ASIC or another hardware-based processor. Although FIG. 7 functionally illustrates the processor, memory, and other elements of the computer system 1001, those skilled in the art will appreciate that the processor and memory may in fact comprise multiple processors or memories that are not within the same physical housing. For example, the memory may be a hard disk drive or another storage medium located in a housing different from that of the computer system 1001. Thus, a reference to a processor will be understood to include a reference to a collection of processors or memories that may or may not operate in parallel. Rather than using a single processor to perform the steps described herein, some components, such as the steering component and the deceleration component, may each have their own processor that performs only the calculations related to that component's function; alternatively, subsystems such as the travel system, the sensor system, and the planning control system may each be provided with a processor that performs the calculations of the corresponding subsystem's tasks so as to realize the corresponding functions.
In various aspects described herein, the processor may be located remotely from the vehicle and in wireless communication with the vehicle. In other aspects, some of the processes described herein are performed on a processor disposed within the vehicle, while others are performed by a remote processor, including taking the necessary steps to perform a single maneuver.
In some embodiments, memory 1014 may contain instructions 1015 (e.g., program logic) that instructions 1015 may be executed by processor 1013 to perform various functions of vehicle 100, including those described above. The memory 1014 may also contain additional instructions, including instructions to send data to, receive data from, interact with, or control one or more of the travel system 1002, the sensor system 1004, the planning control system 1006, and the peripherals 1008.
In addition to instructions 1015, memory 1014 may store other related data such as road maps, route information, vehicle location, direction, speed, and other related information. Such information may be used by the vehicle 100 or, in particular, by the computer system 1001 during operation of the vehicle 100 in an autonomous, semi-autonomous, or manual mode.
The user interface 1016 is used to provide information to or receive information from a user of the vehicle 100. Optionally, the user interface 1016 may include one or more input/output devices from the set of peripheral devices 1008, such as the wireless communication system 1046, the vehicle computer 1048, the microphone 1050, and the speaker 1052.
The computer system 1001 may control functions of the vehicle 100 based on inputs received from various subsystems (e.g., the travel system 1002, the sensor system 1004, and the planning control system 1006) as well as from the user interface 1016. In some embodiments, computer system 1001 is operable to provide control over many aspects of vehicle 100 and its subsystems.
Alternatively, one or more of these components may be mounted separately from or associated with vehicle 100. For example, the memory 1014 may exist partially or completely separate from the vehicle 100. The above components may be communicatively coupled together in a wired or wireless manner.
Alternatively, the above components are only an example, and in practical applications, components in the above modules may be added or deleted according to actual needs, and fig. 7 should not be construed as limiting the embodiments of the present invention.
The vehicle 100 may be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, a recreational vehicle, an amusement-park vehicle, construction equipment, a tram, a golf cart, a train, a trolley, or the like; the embodiment of the present invention is not particularly limited in this respect.
In the embodiment of the present application, the vehicle 100 may receive GNSS signals and use them to determine its own location, thereby positioning itself.
The method for identifying an overhead provided by the embodiment of the present application is described below with reference to the terminal device shown in fig. 4 and the interface diagrams of the terminal device shown in fig. 8 (a) to 8 (c).
During driving, the vehicle can be navigated by means of the terminal device. In one possible implementation, the vehicle may be guided during its travel by a navigation APP installed in the terminal device, such as Baidu Maps, Amap (Gaode Map), or DiDi, which navigates for the vehicle.
Through the overhead identification method provided by the embodiment of the present application, the terminal device can determine a first parameter of the vehicle at a first moment, determine a second parameter of the vehicle at the first moment by means of a sensor, and, when it is determined from the first parameter and the second parameter that the driving state of the vehicle is a target state, determine from the height change of the vehicle whether the vehicle is located on the road on the upper side of the overhead or the road on the lower side of the overhead.
In some embodiments, the first parameter of the vehicle at the first moment may be determined periodically after each start-up of the terminal device.
In some embodiments, the terminal device may determine the first parameter of the vehicle at the first moment only when a certain trigger condition is met. For this case, for example, the following schemes are disclosed:
(1) When the terminal device receives an operation of starting a navigation function, a first parameter of the vehicle at a first moment is determined.
If the terminal device receives an operation of starting the navigation function, this indicates that the user needs to navigate with the terminal device; in this case, the first parameter of the vehicle at the first moment is determined.
The operation of starting the navigation function may include various forms, for example, touch operation or specific gesture operation on the navigation APP, which is not limited in the embodiment of the present application.
(2) When the terminal device receives the position searching operation, a first parameter of the vehicle at a first moment is determined.
If the terminal device receives a location search operation, this indicates that the user needs to view the surroundings of a certain location and thus often has a navigation need; the first parameter of the vehicle at the first moment is then determined.
For example, referring to the exemplary diagram of a display interface of a terminal device shown in fig. 8 (a): in the example corresponding to that diagram, the terminal device receives a location search operation for searching for the national library, where the location indicated by the circle containing a triangle is the national library. In this case, the first parameter of the vehicle at the first moment may be determined.
(3) When the terminal device receives an operation for indicating a destination, a first parameter of the vehicle at a first time is determined.
If the terminal device receives an operation for indicating a destination, this indicates that the user needs to go to a certain destination and thus often has a navigation need; the first parameter of the vehicle at the first moment is then determined.
For example, referring to an exemplary diagram of a display interface of a terminal device shown in fig. 8 (b), in an example corresponding to the diagram, the terminal device receives an operation for indicating that a destination is a national library in china, where a starting point is a current location of a vehicle, and an ending point is the national library in china. In this case, a first parameter of the vehicle at a first time may be determined.
(4) When the terminal device receives an operation for indicating that the navigation mode is driving, a first parameter of the vehicle at a first moment is determined.
In the navigation process, a user often selects different navigation modes according to navigation requirements. The navigation modes generally include: driving, public transportation, walking, riding, etc. If the navigation mode applied by the terminal equipment is driving, the user is indicated to drive the vehicle, and the user has navigation requirements. In this case, a first parameter of the vehicle at a first time may be determined.
For example, referring to an exemplary diagram of an interface of a terminal device shown in fig. 8 (c), in this example, the navigation mode applied by the terminal device is driving.
(5) When the speed of the terminal device is greater than the target speed threshold, a first parameter of the vehicle at a first moment is determined.
If the speed of the terminal device is greater than the target speed threshold, this indicates that the terminal device is moving relatively fast and that the user carrying the terminal device is driving a vehicle. While driving, the user may be about to drive onto the road on the upper side of the overhead, so the first parameter of the vehicle at the first moment may be determined.
In this scenario, the target speed threshold may be, for example, 30 km/h; of course, the target speed threshold may also be set to other values, which is not limited in this embodiment of the present application.
(6) When it is determined from the GNSS signals that there is a ramp junction ahead of the vehicle, the first parameter of the vehicle at the first moment is determined.
The terminal device can determine its own position information from the GNSS signals and transmit that position information to a remote server. The server stores the position of each ramp junction and determines, from the received position information, whether there is a ramp junction ahead of the vehicle. After determining that there is a ramp junction ahead of the vehicle, the server transmits corresponding prompt information to the terminal device to indicate the position of the ramp junction the vehicle is about to enter.
Alternatively, the terminal device may itself store the position information of the ramp junctions of each locality. After determining its own position information, the terminal device may match that position information against its stored data to determine whether there is a ramp junction ahead of the vehicle.
In addition, the terminal device may be connected to a device in the vehicle; for example, the terminal device may be connected to a head unit installed in the center console of the vehicle. In this case, that device may store the position information of the ramp junctions of each locality. The terminal device may transmit its position information to the device, and the device may determine from the position information whether there is a ramp junction ahead of the vehicle and, after determining that there is, transmit corresponding prompt information to the terminal device.
A ramp junction typically includes a ramp entrance and a ramp exit. While driving, a vehicle often needs to pass through a ramp entrance before entering the road on the upper side of the overhead; when driving from the road on the upper side of the overhead to the road on the lower side, the vehicle often needs to pass through a ramp exit before entering the road on the lower side of the overhead.
If it is determined that there is a ramp junction ahead of the vehicle, this indicates that the vehicle is about to go up or come down from the overhead. In this case, the first parameter of the vehicle at the first moment can be determined, so as to identify, by means of the solution provided by the embodiment of the present application, whether the vehicle is traveling on the road on the upper side of the overhead or on the lower side.
(7) When it is determined, from an image of the area ahead of the vehicle, that there is an overhead sign ahead of the vehicle, the first parameter of the vehicle at the first moment is determined.
The terminal device may acquire an image of the area ahead of the vehicle and determine through image analysis whether there is an overhead sign ahead. If it is determined that there is an overhead sign ahead of the vehicle, this indicates that the vehicle is about to go up or come down from the overhead; in this case, the first parameter of the vehicle at the first moment is determined, so as to identify, through the scheme provided by the embodiment of the present application, whether the vehicle is traveling on the road on the upper side of the overhead or on the lower side.
(8) When it is determined from the GNSS signals that the surroundings of the vehicle include an overhead, a first parameter of the vehicle at a first time is determined.
The terminal device can determine its own position from the GNSS signals and determine from an electronic map whether there is an overhead around that position. If so, the vehicle may be about to go up or come down from the overhead, so the first parameter of the vehicle at the first moment can be determined in order to identify whether the vehicle is on the road on the upper side or the lower side of the overhead.
Of course, the terminal device may also determine the first parameter of the vehicle at the first moment in other scenarios, which is not limited in the embodiment of the present application.
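The trigger scenarios (1)-(8) above can be collected into a single dispatch check. The following sketch is illustrative only: the event-dictionary keys and boolean structure are assumptions introduced here, and only the 30 km/h threshold of scenario (5) comes from the text.

```python
# Hypothetical sketch of the trigger logic in scenarios (1)-(8): the first
# parameter is (re)determined when any configured condition fires.
SPEED_THRESHOLD_KMH = 30.0   # example value from scenario (5)

def should_determine_first_parameter(event):
    return (
        event.get("navigation_started", False)                 # (1) navigation function started
        or event.get("location_searched", False)               # (2) location search received
        or event.get("destination_set", False)                 # (3) destination indicated
        or event.get("mode") == "driving"                      # (4) driving navigation mode
        or event.get("speed_kmh", 0.0) > SPEED_THRESHOLD_KMH   # (5) speed above threshold
        or event.get("ramp_ahead", False)                      # (6) ramp junction ahead (GNSS/map)
        or event.get("overhead_sign_ahead", False)             # (7) overhead sign in camera image
        or event.get("overhead_nearby", False)                 # (8) overhead around the position
    )

fire = should_determine_first_parameter({"speed_kmh": 45.0})
idle = should_determine_first_parameter({"speed_kmh": 20.0, "mode": "walking"})
```

In practice each condition would be fed by the corresponding subsystem (navigation APP events, GNSS position matching, image analysis); the flat dictionary here merely shows that the conditions are alternatives, any one of which suffices.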
In order to clarify the scheme provided by the present application, the scheme provided by the present application will be described below by way of various embodiments with reference to the accompanying drawings.
In order to solve the problem that whether a vehicle is positioned on a road on the upper side of an overhead or on a road on the lower side of the overhead cannot be identified in the prior art, the embodiment of the application provides an overhead identification method.
Referring to a workflow diagram shown in fig. 9, the method for identifying an overhead provided in the embodiment of the present application includes the following steps:
step S11, determining a first parameter of the vehicle at a first moment according to GNSS signals.
In one possible design of the embodiment of the present application, the first parameter includes at least one of the following information: speed and heading of the vehicle. Wherein the direction of travel of the vehicle can generally be taken as the heading of the vehicle.
According to the GNSS signals received at each moment, the positions of the terminal device at different moments can be determined. It can be understood that, when the terminal device is used for vehicle navigation, the positions of the terminal device at different moments reflect, to a certain extent, the track of the vehicle, and the terminal device can determine the speed and heading of the vehicle from this position information. The speed of the vehicle can be determined from the distance between its positions at different moments and the corresponding time difference.
For example, referring to the schematic view of a vehicle-travel scene shown in fig. 10 (a): in this example, the vehicle travels on the road on the upper side of an overhead, and its traveling direction is from left to right. For this scene, fig. 10 (b), corresponding to fig. 10 (a), is disclosed; fig. 10 (b) is a plan view of fig. 10 (a), the road shown in fig. 10 (b) is the overhead road, and the positions of the vehicle at the respective times are indicated in fig. 10 (b) by circles containing numerals, where a smaller numeral in a circle indicates an earlier time at which the vehicle was at that position. Suppose the vehicle is at the circle indicated by numeral 1 at time t1, at the circle indicated by numeral 2 at time t2, at the circle indicated by numeral 3 at time t3, and at the circle indicated by numeral 4 at time t4; since the vehicle travels from left to right, time t1 is earlier than time t2, time t2 is earlier than time t3, and time t3 is earlier than time t4.
In this example, the speed of the vehicle between time t1 and time t2 is the ratio of the difference in distance between the circle position indicated by numeral 1 and the circle position indicated by numeral 2 to the difference in time between time t1 and time t 2; the speed of the vehicle between the time t3 and the time t2 is the ratio of the distance difference between the circle position indicated by the number 3 and the circle position indicated by the number 2 to the time difference between the time t3 and the time t 2; the speed of the vehicle between the time t1 and the time t4 is the ratio of the difference between the distance between the circle position indicated by the number 1 and the circle position indicated by the number 4 to the time difference between the time t1 and the time t 4.
In this case, the speed of the vehicle is determined from the positions of the vehicle at the time t1, the time t2, the time t3, and the time t4, respectively, and the time differences between the different times. In addition, in the scene shown in fig. 10 (b), the heading of the vehicle is the direction indicated by the broken line including the arrow in fig. 10 (b).
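The speed and heading computation of step S11 can be sketched as follows, assuming the GNSS fixes have already been projected into a local east-north plane. The function name and coordinate convention are illustrative assumptions, not from the patent.

```python
import math

def speed_and_heading(p1, p2):
    """Estimate speed (m/s) and heading (degrees clockwise from north)
    from two timestamped GNSS fixes, each given as (t_seconds, east_m, north_m)."""
    t1, e1, n1 = p1
    t2, e2, n2 = p2
    de, dn = e2 - e1, n2 - n1
    distance = math.hypot(de, dn)          # straight-line distance between fixes
    speed = distance / (t2 - t1)           # ratio of distance difference to time difference
    heading = math.degrees(math.atan2(de, dn)) % 360.0
    return speed, heading

# Vehicle moves 30 m due east in 2 s: speed 15 m/s, heading 90 degrees.
v, h = speed_and_heading((0.0, 0.0, 0.0), (2.0, 30.0, 0.0))
```

As in the fig. 10 (b) example, the same ratio can be computed between any pair of moments (t1 and t2, t2 and t3, or t1 and t4), with longer baselines averaging out per-fix GNSS noise.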
Step S12, determining a second parameter of the vehicle at the first moment according to the sensor.
In one possible design, the first time is the current time, and accordingly, the first parameter of the vehicle at the first time may be the first parameter of the current time. In addition, in this design, the second parameter of the vehicle at the first time may be the second parameter of the vehicle at the current time.
In another possible design, the first time may be any time within a time period, in which case the first parameter of the vehicle at the first time may be a first parameter at a certain time within the time period, and the second parameter of the vehicle at the first time may be a second parameter of the vehicle at another time within the time period.
In some embodiments, the second parameter includes at least one of the following information: pitch angle, roll angle, and heading angle of the vehicle.
The pitch angle of a vehicle generally refers to the angle at which the vehicle "pitches" relative to the XOY plane of the inertial coordinate system; the roll angle of a vehicle generally refers to the lateral tilt angle of the vehicle in the inertial coordinate system; the heading angle of a vehicle generally refers to the angle between the vehicle's centroid velocity and the transverse axis of the inertial coordinate system.
The second parameter may be acquired by a sensor, where the sensor for acquiring the second parameter may typically include a gyroscope, and of course, may also include other sensors capable of acquiring the second parameter, which is not limited by the embodiment of the present application.
The sensor for acquiring the second parameter can be arranged in the terminal equipment for navigation or in the vehicle. If the sensor is located in the vehicle, the sensor may transmit the acquired second parameter to the terminal device via the network.
In some embodiments, the sensor may periodically acquire and buffer the second parameter during the process of receiving the GNSS signal by the terminal device. In this case, based on the cached second parameter, the second parameter of the vehicle at the first time may be determined.
In some embodiments, during the process of determining the first parameter of the vehicle at the first moment, the sensor may be triggered to acquire the second parameter, and acquire the second parameter acquired by the sensor.
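The periodic caching of the second parameter described above can be sketched as a small time-indexed buffer from which the cached sample nearest a given first moment is retrieved. The class name, sample layout, and nearest-timestamp policy are assumptions for illustration.

```python
from collections import deque
from bisect import bisect_left

class SensorBuffer:
    """Ring buffer of timestamped gyroscope readings (pitch, roll, heading angle);
    returns the cached sample whose timestamp is nearest a queried moment."""
    def __init__(self, maxlen=200):
        self._buf = deque(maxlen=maxlen)   # old samples are evicted automatically

    def push(self, t, pitch, roll, heading):
        self._buf.append((t, pitch, roll, heading))

    def nearest(self, t):
        times = [s[0] for s in self._buf]
        i = bisect_left(times, t)
        # Compare the neighbors on either side of the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
        best = min(candidates, key=lambda j: abs(times[j] - t))
        return self._buf[best]

buf = SensorBuffer()
for k in range(5):
    buf.push(k * 0.1, 0.5 * k, 0.0, 90.0)   # samples cached at 0.0 s ... 0.4 s
sample = buf.nearest(0.17)                   # closest cached sample is at t = 0.2 s
```

This corresponds to the first embodiment above (periodic acquisition plus caching); the alternative, triggering a fresh sensor read while the first parameter is determined, would bypass the buffer entirely.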
Step S13, determining the running state of the vehicle according to the first parameter and the second parameter.
In the solution provided in the embodiment of the present application, the driving states of the vehicle generally include a plurality of types. If the vehicle goes up or comes down from the overhead, its driving state may include: starting up a ramp, finishing going up a ramp, starting down a ramp, and finishing going down a ramp. In addition, if the vehicle neither goes up nor comes down from the overhead, its driving state may include: traveling uphill, traveling downhill, traveling on a road, and so on.
Of course, the running state of the vehicle may also include other types, which are not limited by the embodiment of the present application.
In this step, the running state of the vehicle is determined based on the first parameter and the second parameter. In one possible solution provided by the embodiment of the present application, the operation may be implemented by:
Firstly, transmitting a first parameter and a second parameter to a classifier, wherein the classifier is used for classifying the running state of a vehicle according to the parameters of the vehicle;
then, the running state of the vehicle is determined from the output of the classifier.
That is, the running state of the vehicle may be determined based on a classifier, which may be trained in advance based on vehicle information of the vehicle in different driving states. The classifier is trainable based on the first parameter and the second parameter, and the states output by the classifier may include the various driving states of the vehicle. For example, the classifier may be trained on the speed, heading, pitch angle, etc. of the vehicle, and the states output by the classifier may include start up-ramp, end up-ramp, start down-ramp, end down-ramp, etc.
In this case, when the running state of the vehicle is determined from the first parameter and the second parameter, the first parameter and the second parameter are inputs of the classifier, and the running state of the vehicle is an output of the classifier.
The application is not limited to the type of classifier, which in one possible example may comprise a support vector machine (support vector machine, SVM).
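The role of such a classifier can be illustrated with a toy stand-in. The patent names an SVM trained on speed, heading, pitch angle, etc.; for a self-contained sketch, a nearest-centroid rule over (speed, pitch angle) features shows the same input/output shape. All feature values, labels, and the `classify` helper below are illustrative assumptions, not from the source.

```python
import math

# Hypothetical training samples: (speed m/s, pitch angle deg) per state.
# A real implementation would train an SVM on logged drives; a
# nearest-centroid rule stands in here so the input/output shape is clear.
TRAINING = {
    "start_up_ramp":   [(12.0, 4.0), (10.0, 5.0)],   # climbing: positive pitch
    "end_up_ramp":     [(15.0, 0.5), (14.0, 0.2)],   # level again after climb
    "start_down_ramp": [(11.0, -4.5), (9.0, -5.0)],  # descending: negative pitch
    "end_down_ramp":   [(16.0, -0.3), (13.0, -0.1)], # level again after descent
}

def centroid(points):
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(2))

CENTROIDS = {label: centroid(pts) for label, pts in TRAINING.items()}

def classify(speed, pitch):
    """Return the driving state whose centroid is nearest to the sample."""
    return min(CENTROIDS,
               key=lambda lbl: math.dist((speed, pitch), CENTROIDS[lbl]))
```

In this shape, the first and second parameters are the classifier's input and the driving state is its output, mirroring the description above.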
And step S14, when the running state of the vehicle belongs to a target state, determining an overhead recognition result of the vehicle according to the height change of the vehicle.
Wherein the target state includes at least one of the following driving states: start up-ramp, end up-ramp, start down-ramp, and end down-ramp. The overhead recognition result of the vehicle is that the vehicle is traveling on the road on the upper side of the overhead, or that the vehicle is traveling on the road on the lower side of the overhead.
In the embodiment of the application, if the running state of the vehicle is one of the target states, it is indicated that the running state of the vehicle belongs to the target state.
A ramp is typically provided at the entrance of an overhead. When the vehicle enters the overhead, it first travels up a ramp and then enters the road on the upper side of the overhead. Correspondingly, when the vehicle leaves the overhead, it first travels down a ramp and then enters the road on the lower side of the overhead. Therefore, if the running state of the vehicle belongs to the target state, it means that the vehicle is starting to enter the overhead, has finished entering the overhead, is starting to leave the overhead, or has finished leaving the overhead.
Moreover, the height of the vehicle changes as it enters or leaves the overhead. In this case, the embodiment of the present application determines the overhead recognition result of the vehicle by combining the target state to which the vehicle belongs with the height change of the vehicle.
In practical road conditions, the overhead may have one layer or multiple layers. An overhead with one layer includes only a single layer of road on its upper side; an overhead with multiple layers includes multiple layers of road on its upper side.
In some embodiments, if the overhead has one layer, the overhead recognition result of the vehicle may be determined in the different driving states of the vehicle by the following steps:
(1) If the driving state of the vehicle is start up-ramp and the height change of the vehicle in the following first period is greater than a first threshold, it is determined that the overhead recognition result of the vehicle is that the vehicle is traveling on the road on the upper side of the overhead.
In this aspect, the first period is a period after the time at which the running state of the vehicle is determined. That is, the vehicle's height increases within a period after the running state of the vehicle is determined to be start up-ramp, and the increase is greater than the first threshold; in this case it can be determined that the vehicle has entered the overhead and is traveling on the road on the upper side of the overhead.
The first threshold is typically a positive number, and a specific value of the first threshold may be set in advance. Alternatively, the specific value of the first threshold may be determined by the local overhead height.
If the specific value of the first threshold is determined by the local overhead height, the first threshold is typically slightly less than the overhead height. Since the overhead has one layer, the height of the overhead generally refers to the height of the road on the upper side of the overhead, taking the road on the lower side of the overhead as the reference surface.
The terminal device may determine the local overhead height in a number of ways. In one possible way, the height of the overhead of each location may be stored in the terminal device, in which case the terminal device, after determining the location information of the terminal device from the GNSS signals, may determine the height of the overhead around the location by querying its own storage.
In another possible manner, after determining the position information of the terminal device according to the received GNSS signal, the terminal device may transmit the position information to a remote server, and then the remote server determines the height of the overhead around the position according to the position information, and then transmits the height of the overhead to the terminal device.
In another possible way, the terminal device may interact with the vehicle, in which case the terminal device may transmit its own location information to the vehicle, which determines the height of the overhead, and then transmits the height of the overhead to the terminal device.
Of course, the terminal device may also determine the local overhead height by other manners, which are not limited by the embodiment of the present application.
In one possible design of the embodiment of the present application, the duration of the first period of time may be preset.
In another possible embodiment, the first time period may be determined according to a time period t1 that is required to be spent when the vehicle is on or off the overhead, wherein the first time period may be a time period slightly longer than the time period t1. For example, if t1 is 20 seconds, the first period of time may be 25 seconds.
In traffic regulations, there is typically a limit to the speed of vehicles during the course of their ascent and descent. In this design, the time period t1 spent by the vehicle when moving up and down the overhead is determined based on the limitation of the vehicle speed and the length of the gradient when moving up and down the overhead.
Of course, the length of the first period may also be determined by other manners, for example, the terminal device may determine the length of the first period according to the received setting operation for the first period, which is not limited in the embodiment of the present application.
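The relationship described above, where the ramp speed limit and slope length determine the time t1 and the first period is chosen slightly longer, amounts to simple arithmetic. The concrete numbers below are assumed for illustration only:

```python
# Illustrative computation of the first period from an assumed ramp speed
# limit and ramp (slope) length, per the design described in the text.
ramp_length_m = 200.0     # assumed length of the ramp slope
speed_limit_m_s = 10.0    # assumed ramp speed limit (36 km/h)
margin_s = 5.0            # slack so the window fully covers the ramp

t1_s = ramp_length_m / speed_limit_m_s   # time to traverse the ramp: 20 s
first_period_s = t1_s + margin_s         # first period slightly longer: 25 s
```

This matches the worked example in the text: if t1 is 20 seconds, the first period may be set to 25 seconds.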
(2) If the running state of the vehicle is start down-ramp and the absolute value of the height change of the vehicle in the following first period is greater than the first threshold, it is determined that the overhead recognition result of the vehicle is that the vehicle is traveling on the road on the lower side of the overhead.
In this aspect, the first period is a period after the time at which the running state of the vehicle is determined. That is, the vehicle's height decreases within a period after the running state of the vehicle is determined to be start down-ramp, and the decrease is greater than the first threshold; in this case it can be determined that the vehicle has left the overhead and is traveling on the road on the lower side of the overhead.
(3) If the driving state of the vehicle is end up-ramp and the height change of the vehicle in the preceding second period is greater than the first threshold, it is determined that the overhead recognition result of the vehicle is that the vehicle is traveling on the road on the upper side of the overhead.
In this aspect, the second period is a period before the time at which the running state of the vehicle is determined. That is, the vehicle's height increased during the period before the running state was determined to be end up-ramp, and the increase is greater than the first threshold; in this case it can be determined that the vehicle has entered the overhead and is traveling on the road on the upper side of the overhead.
In one possible design of the embodiment of the present application, the duration of the second period of time may be preset.
In another possible embodiment, the second period may be determined according to the time t1 that the vehicle needs to spend when entering or leaving the overhead, wherein the second period may be slightly longer than t1.
In addition, the length of the second period may be the same as the length of the first period, or may be different from the first period, which is not limited in the embodiment of the present application.
(4) If the driving state of the vehicle is end down-ramp and the absolute value of the height change of the vehicle in the preceding second period is greater than the first threshold, it is determined that the overhead recognition result of the vehicle is that the vehicle is traveling on the road on the lower side of the overhead.
In this aspect, the second period is a period before the time at which the running state of the vehicle is determined. That is, the vehicle's height decreased during the period before the running state was determined to be end down-ramp, and the decrease is greater than the first threshold; in this case it can be determined that the vehicle has left the overhead and is traveling on the road on the lower side of the overhead.
Through the above steps, the overhead recognition result of the vehicle can be determined in different driving states. Moreover, because these steps combine the running state of the vehicle with its height change when determining the overhead recognition result, they can identify whether the vehicle is on the road on the upper side or the lower side of the overhead, which solves the problem in the prior art that the overhead recognition result of the vehicle cannot be determined.
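The four rules for the one-layer case, steps (1) through (4) above, can be collected into a small decision function. The state names, return values, and sign convention (positive height change for ascent, negative for descent) are illustrative assumptions:

```python
def single_layer_result(state, height_change, first_threshold):
    """Decision rules (1)-(4) for a one-layer overhead, as sketched in the
    text. height_change is measured over the following first period (for
    'start_*' states) or the preceding second period (for 'end_*' states)."""
    if state == "start_up_ramp" and height_change > first_threshold:
        return "upper_road"      # vehicle has driven onto the overhead
    if state == "start_down_ramp" and height_change < 0 \
            and abs(height_change) > first_threshold:
        return "lower_road"      # vehicle has descended from the overhead
    if state == "end_up_ramp" and height_change > first_threshold:
        return "upper_road"
    if state == "end_down_ramp" and height_change < 0 \
            and abs(height_change) > first_threshold:
        return "lower_road"
    return "undetermined"        # conditions not met; keep observing
```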
In some scenarios, the overhead comprises two or more floors, i.e. the road on the upper side of the overhead comprises at least two floors. In this scenario, in the different driving states of the vehicle, the overhead recognition result of the vehicle can be determined by the following steps:
(1) If the driving state of the vehicle is start up-ramp and the height change of the vehicle in the following first period is greater than a first threshold, it is determined that the overhead recognition result of the vehicle is that the vehicle is traveling on the road on the upper side of the overhead.
In this aspect, the first period is a period after the time at which the running state of the vehicle is determined. That is, the vehicle's height increases within a period after the running state of the vehicle is determined to be start up-ramp, and the increase is greater than the first threshold; in this case it can be determined that the vehicle has entered the overhead and is traveling on the road on the upper side of the overhead.
(2) If the running state of the vehicle is a starting down-ramp and the absolute value of the change in the height of the vehicle in a first period of time after the starting down-ramp is greater than a first threshold value, determining whether the overhead recognition result of the vehicle is that the vehicle is running on a road on the lower side of the overhead according to the state of the vehicle before the down-ramp.
If the running state of the vehicle is the start of the down-ramp and the absolute value of the change in the height of the vehicle in the first period is greater than the first threshold, it is indicated that the vehicle height decreases for a period of time after the running state of the vehicle is the start of the down-ramp.
Since the overhead may include multiple floors in this scenario, the vehicle may drive onto the road on the lower side of the overhead, but may also drive from a higher floor on the upper side of the overhead onto a lower floor on the upper side. For example, suppose the road on the upper side of the overhead includes three floors, numbered so that a higher floor has a larger number: the floor closest to the road on the lower side is the first floor, and the floor farthest from it is the third floor. If the vehicle starts a down-ramp and the absolute value of its height change in the following first period is greater than the first threshold, the vehicle may merely have driven from the third floor to the second floor on the upper side of the overhead. Therefore, whether the overhead recognition result of the vehicle is that the vehicle is traveling on the road on the lower side of the overhead must also be determined based on the state of the vehicle before the down-ramp.
In the solution provided by the embodiment of the present application, the state of the vehicle before the down-ramp generally includes: the vehicle traveling on the first floor of the overhead, which is the overhead floor closest to the road under the overhead, or the vehicle traveling on another floor.
Wherein, if the state of the vehicle before the down-ramp is that the vehicle was traveling on the first floor of the overhead, the above step determines that the overhead recognition result is that the vehicle is traveling on the road on the lower side of the overhead. If the state of the vehicle before the down-ramp is that the vehicle was traveling on another floor, the above step may determine that the overhead recognition result is that the vehicle is still traveling on a road on the upper side of the overhead, that is, the vehicle has driven from a higher floor onto a lower floor on the upper side of the overhead.
(3) If the driving state of the vehicle is end up-ramp and the height change of the vehicle in the preceding second period is greater than the first threshold, it is determined that the overhead recognition result of the vehicle is that the vehicle is traveling on the road on the upper side of the overhead.
In this aspect, the second period is a period before the time at which the running state of the vehicle is determined. That is, the vehicle's height increased during the period before the running state was determined to be end up-ramp, and the increase is greater than the first threshold; in this case it can be determined that the vehicle has entered the overhead and is traveling on the road on the upper side of the overhead.
(4) If the running state of the vehicle is end down-ramp and the absolute value of the height change of the vehicle in the preceding second period is greater than the first threshold, the overhead recognition result of the vehicle is determined according to the state of the vehicle before the down-ramp.
In this aspect, the second period is a period before the time at which the running state of the vehicle is determined. If the running state of the vehicle is end down-ramp and the absolute value of the height change of the vehicle in the second period is greater than the first threshold, it indicates that the vehicle's height decreased during the period before the running state was determined to be end down-ramp.
Since the overhead may include multiple floors in this scenario, the vehicle may have driven onto the road on the lower side of the overhead, but may also have driven from a higher floor on the upper side of the overhead onto a lower floor on the upper side. Therefore, whether the overhead recognition result of the vehicle is that the vehicle is traveling on the road on the lower side of the overhead must also be determined based on the state of the vehicle before the down-ramp.
Wherein, if the state of the vehicle before the down-ramp is that the vehicle was traveling on the first floor of the overhead, the above step determines that the overhead recognition result is that the vehicle is traveling on the road on the lower side of the overhead. If the state of the vehicle before the down-ramp is that the vehicle was traveling on another floor, the above step may determine that the overhead recognition result is that the vehicle is still traveling on a road on the upper side of the overhead, that is, the vehicle has driven from a higher floor onto a lower floor on the upper side of the overhead.
Through the above steps, the overhead recognition result of the vehicle in different driving states can be determined when the overhead includes at least two floors. These steps take into account the state of the vehicle before the down-ramp when determining the overhead recognition result. In some embodiments, the state of the vehicle before the down-ramp may be determined based on the height of the vehicle and the height of each floor of the overhead.
In this embodiment, the height of each floor of the overhead generally refers to the height of that floor's road, taking the road on the lower side of the overhead as the reference surface. For example, suppose the road on the upper side of the overhead consists of multiple floors, numbered so that a higher floor has a larger number, with the floor closest to the road on the lower side being the first floor; then the height of the nth floor refers to the height of the nth-floor road on the upper side of the overhead, with the road on the lower side of the overhead as the reference plane.
Wherein the height of the vehicle may be determined from a height sensor (e.g., barometer, etc.). The height sensor may be installed in the terminal device, or the height sensor may be provided in the vehicle and transmit the height of the vehicle to the terminal device after the height of the vehicle is collected.
In this embodiment, the terminal device may determine the height of each floor overhead in a number of ways. In one possible way, the height of each floor of the elevations in the respective places can be stored within the terminal device. In this case, after determining the location information of the terminal device according to the GNSS signal, the terminal device may determine the overhead existing around the location information, and determine the height of each floor of the overhead by querying its own storage.
In another possible manner, after determining the location information of the terminal device according to the received GNSS signals, the terminal device may transmit the location information to a remote server, which determines the height of each floor of the overhead and transmits it to the terminal device, so that the terminal device determines the height of each floor of the overhead according to the transmission of the server.
Of course, the terminal device may also determine the height of each floor of the overhead in other manners, which is not limited by the embodiment of the present application.
By the above scheme, the state of the vehicle before the down-ramp can be determined based on the height of the vehicle and the height of each floor of the overhead.
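One way to realize this height matching, under the assumption that per-floor heights are stored relative to the road under the overhead, is sketched below; the function, tolerance value, and floor numbering are hypothetical:

```python
def floor_before_down_ramp(vehicle_height, floor_heights, tolerance=2.0):
    """Match the vehicle's height (above the road under the overhead, e.g.
    from a barometer) against the stored per-floor heights. Returns the
    nearest floor number (1 = floor closest to the lower road) when the
    mismatch is within tolerance, else 0, meaning 'on the road under the
    overhead'. Illustrative helper, not the patent's implementation."""
    best_floor, best_err = 0, tolerance
    for floor, h in enumerate(floor_heights, start=1):
        err = abs(vehicle_height - h)
        if err < best_err:
            best_floor, best_err = floor, err
    return best_floor
```

For example, with assumed floor heights of 6 m, 12 m, and 18 m, a barometric height of about 11.5 m would place the vehicle on the second floor before the down-ramp.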
In some embodiments, the state of the vehicle before the down-ramp may be determined as follows:
first, after each determination of the overhead recognition result of the vehicle, the change in the floor count of the overhead where the vehicle is located is recorded;
second, the state of the vehicle before the down-ramp is determined from the recorded changes in the floor count of the overhead where the vehicle is located.
For example, if the overhead includes two or more floors, in this embodiment the terminal device may increase the recorded floor count by one each time it determines that the vehicle has driven onto a road on the upper side of the overhead, and decrease the recorded floor count by one each time it determines that the vehicle has driven down one level.
In this case, the recorded floor count is 0 before the vehicle enters the overhead; a floor count greater than 0 indicates that the vehicle is on a road on the upper side of the overhead, and a floor count of 0 indicates that the vehicle is on the road on the lower side of the overhead.
In addition, if the floor count recorded by the terminal device is n, and the terminal device then determines from the first parameter and the second parameter that the running state of the vehicle is start up-ramp and that the height change of the vehicle in the first period is greater than the first threshold, the terminal device determines that the vehicle has gone up one more level and records the floor count as n+1.
By this scheme, the state of the vehicle before the down-ramp can be determined from the record of changes in the floor count of the overhead where the vehicle is located.
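The counter-based bookkeeping described in these steps can be sketched as a small tracker; the class and method names are hypothetical:

```python
class OverheadFloorTracker:
    """Keeps the recorded floor count described in the text: +1 each time
    the recognition result is 'drove onto an upper-side road', -1 each time
    it is 'drove down one level'. A count of 0 means the road under the
    overhead."""

    def __init__(self):
        self.floor = 0          # vehicle starts below the overhead

    def on_result(self, went_up):
        """Update the count from one overhead recognition result."""
        if went_up:
            self.floor += 1
        elif self.floor > 0:
            self.floor -= 1
        return self.floor

    def state_before_down_ramp(self):
        # floor 1: a down-ramp leads to the road under the overhead;
        # floor > 1: it leads to a lower floor on the upper side.
        return "first_floor" if self.floor == 1 else "other_floor"
```

With this record, the n to n+1 update in the text corresponds to `on_result(True)`, and the state before a down-ramp falls out of the current count.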
The embodiment of the application provides an overhead identification method, which comprises the steps of determining a first parameter at a first moment according to GNSS signals, determining a second parameter of a vehicle at the first moment through a sensor, determining a running state of the vehicle by combining the first parameter and the second parameter, and determining an overhead identification result of the vehicle by combining the running state of the vehicle and the height change of the vehicle.
Furthermore, the scheme of the present application can improve navigation accuracy, thereby improving the in-vehicle user experience, reducing time spent driving, and reducing the vehicle's fuel consumption, achieving the aim of saving energy.
In addition, if the state of the vehicle before the down ramp is determined according to the change of the layer number of the overhead where the vehicle is located in the embodiment of the application, the layer number of the overhead where the vehicle is located can be determined, and the navigation accuracy is further improved.
Further, to improve the accuracy of navigation, in some embodiments the method further comprises the following steps:
after determining the overhead identification result of the vehicle, reporting the overhead identification result of the vehicle to a server;
or if the change of the layer number of the overhead where the vehicle is located is recorded, and the overhead identification result of the vehicle indicates that the vehicle is running on the road on the upper side of the overhead, reporting the overhead identification result of the vehicle and the layer number of the overhead where the vehicle is located to the server.
The server can be a navigation APP server, and the server determines whether the vehicle runs on the road on the upper side of the overhead or on the road on the lower side of the overhead according to the reported information. If the terminal equipment reports not only the overhead identification result of the vehicle but also the number of layers of the overhead where the vehicle is located, the server can also determine the number of layers of the overhead where the vehicle is located when the vehicle runs on a road on the upper side of the overhead, so that the position where the terminal equipment is located can be determined more accurately, and the navigation accuracy can be improved.
Further, in some embodiments, the method further comprises the steps of:
if the overhead recognition result of the vehicle indicates that the vehicle is traveling on a road on the lower side of the overhead and the front of the traveling direction of the vehicle includes a weak signal region, the navigation method is adjusted from navigation by GNSS signals to navigation by a network positioning method or a network positioning method and an inertial navigation method.
Wherein the weak signal region generally includes a tunnel area or an occluded area. An occluded area is an area where the signal is weakened by an obstruction, which may be a building, vegetation, or the like.
In one possible design, the terminal device may store the positions of the respective weak signal areas, and determine whether the weak signal area is in front of the traveling direction of the vehicle based on its own storage.
In another possible design, the remote server may determine whether the front of the traveling direction of the vehicle is a weak signal area, and if so, the server transmits a corresponding instruction to the terminal device, so that the terminal device determines whether the front is a weak signal area according to the received instruction.
When the vehicle is located in a weak signal area, the GNSS signals received by the terminal device are weak or absent. If the terminal device continues to navigate based on GNSS signals, navigation accuracy is low; and when no GNSS signal can be received at all, navigation fails entirely.
In this case, the terminal device performs navigation through the network positioning method or the network positioning method and the inertial navigation method, so that the navigation accuracy can be improved.
The network positioning is a positioning technology for determining the position of the terminal equipment through a network signal received by the terminal equipment. The network signal may originate from a base station or from a wireless fidelity (wireless fidelity, WIFI) hotspot.
If the network signal originates from base stations, the terminal device can determine its distance to different base stations from the transmission time and reception time of the network signals sent by those base stations, and then determine its own position from those distances and the positions of the base stations. Likewise, if the network signal originates from WIFI hotspots, the terminal device can determine its distance to different WIFI hotspots from the transmission time and reception time of their network signals, and then determine its own position from those distances and the positions of the hotspots.
Because the devices that generate network signals (such as base stations and WIFI hotspots) are positioned differently from satellites, an obstruction that strongly affects the reception of GNSS signals may have much less effect on the terminal device's reception of network signals, so navigating by the network positioning method can improve navigation accuracy.
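The time-of-flight ranging and position fix described above can be sketched with a standard 2-D trilateration. The patent does not specify a solver; the linearized three-circle solve below is one common choice, and the station coordinates and times are made up for illustration:

```python
C = 299_792_458.0  # speed of light, m/s

def distance(t_tx, t_rx):
    """Range from one station: signal flight time times the speed of light."""
    return (t_rx - t_tx) * C

def trilaterate(p1, r1, p2, r2, p3, r3):
    """2-D trilateration: subtracting the circle equations pairwise yields
    two linear equations in (x, y), solved by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

For example, with stations at (0, 0), (100, 0), and (0, 100) meters and ranges measured to a receiver at (30, 40), the solve recovers that position; real deployments must also handle clock offsets and measurement noise.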
Further, the terminal device can perform navigation in combination with a network positioning method and an inertial navigation method. Among other things, inertial navigation methods rely on vehicle dead reckoning (vehicle dead reckoning, VDR) techniques, which can calculate the instantaneous position of a vehicle through inertial navigation sensors (e.g., direction and speed sensors). The network positioning method and the inertial navigation method are used for navigation together, so that the navigation accuracy can be further improved.
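The core VDR update mentioned here, advancing the last known position along the current heading by the distance travelled, can be sketched as follows. The flat-plane kinematics and function signature are illustrative; a real VDR pipeline would also fuse gyroscope and odometer data and correct drift:

```python
import math

def dead_reckon(x, y, heading_deg, speed_m_s, dt_s):
    """One dead-reckoning step: advance position (x, y) in meters by the
    distance travelled over dt_s seconds along the given heading (degrees,
    measured from the +x axis)."""
    d = speed_m_s * dt_s
    return (x + d * math.cos(math.radians(heading_deg)),
            y + d * math.sin(math.radians(heading_deg)))
```

Calling this repeatedly with fresh sensor readings yields the instantaneous position even while no GNSS signal is available.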
The overhead identification method provided by the embodiment of the application can effectively reduce navigation errors and improve navigation precision. To clarify the advantages of the present application, an example is provided below.
In this example, the terminal device respectively navigates the vehicle through the solutions provided by the prior art and the embodiments of the present application.
Fig. 3 is an electronic map displayed by a terminal device navigating a vehicle according to the prior art. In the figure, navigation places the user's vehicle on the auxiliary road of North Fourth Ring East Road, under the overhead bridge, whereas the actual position of the user's vehicle is on the overhead road of the North Fourth Ring main road; the five-pointed star in fig. 3 represents the actual position of the user's vehicle. The actual position of the vehicle is thus inconsistent with the navigation position shown on the user's mobile phone, and the navigation deviates from the vehicle's true position.
Fig. 11 is an electronic map displayed by a terminal device for navigating a vehicle when the terminal device navigates the vehicle according to an embodiment of the present application. Referring to fig. 11, the overhead recognition method according to the embodiment of the application can accurately position the user vehicle on the overhead so that the navigation position of the mobile phone is consistent with the actual position of the vehicle, thereby realizing accurate positioning and navigation.
The method embodiments described herein may be independent schemes or may be combined according to internal logic, and these schemes fall within the protection scope of the present application.
It will be appreciated that in the various method embodiments described above, the methods and operations performed by the terminal device may also be performed by components (e.g., chips or circuits) that may be used in the terminal device.
The above embodiments describe the overhead recognition method provided by the present application. It will be appreciated that the terminal device, in order to implement the above-described functions, includes corresponding hardware structures and/or software modules that perform each of the functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application may divide the functional modules of the terminal device according to the above method examples; for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated in one processing module. The integrated modules may be implemented in hardware or as software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic and merely a logical function division; other division manners may be used in actual implementation.
The method provided by the embodiment of the application is described in detail above with reference to fig. 1 to 11. The following describes in detail the apparatus provided in the embodiment of the present application with reference to fig. 12 to 13. It should be understood that the descriptions of the apparatus embodiments and the descriptions of the method embodiments correspond to each other, and thus, descriptions of details not described may be referred to the above method embodiments, which are not repeated herein for brevity.
Referring to fig. 12, fig. 12 is a block diagram of an embodiment of a navigation device according to the present application. As shown in fig. 12, the apparatus 1000 may include: a transceiver 1001 and a processor 1002. The apparatus 1000 may perform the operations performed by the terminal device in the embodiment of the method shown in fig. 9.
Illustratively, in an alternative embodiment of the present application, the transceiver 1001 is configured to receive global navigation satellite system GNSS signals. The processor 1002 is configured to: determining a first parameter of the vehicle at a first moment according to the GNSS signals;
determining a second parameter of the vehicle at a first moment according to the sensor;
determining a driving state of the vehicle according to the first parameter and the second parameter;
determining an overhead identification result of the vehicle according to a change in height of the vehicle when the running state of the vehicle belongs to a target state, the target state including at least one of the following running states: starting an up-ramp, ending an up-ramp, starting a down-ramp, and ending a down-ramp, wherein the overhead identification result of the vehicle is that the vehicle runs on a road on the upper side of the overhead or that the vehicle runs on a road on the lower side of the overhead.
In a possible implementation manner, the first parameter includes at least one of the following information: the speed and heading of the vehicle;
the second parameter includes at least one of the following information: the pitch angle, roll angle and heading angle of the vehicle.
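For illustration, the first and second parameters above can be modeled as two small records. A minimal Python sketch, in which the field names and units are assumptions for the example rather than terms from the application:

```python
from dataclasses import dataclass

@dataclass
class FirstParameter:      # derived from GNSS signals
    speed: float           # vehicle speed, m/s (unit assumed)
    heading: float         # heading, degrees (convention assumed)

@dataclass
class SecondParameter:     # derived from the on-board sensor (e.g. an IMU)
    pitch: float           # pitch angle, degrees
    roll: float            # roll angle, degrees
    heading_angle: float   # heading angle, degrees
```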
In a possible implementation manner, the processor 1002 is configured to determine a first parameter of the vehicle at a first moment, specifically:
when an operation of starting a navigation function is received, determining a first parameter of the vehicle at a first moment;
or, when a position searching operation is received, determining a first parameter of the vehicle at a first moment;
or, when an operation for indicating a destination is received, determining a first parameter of the vehicle at a first moment;
or, when an operation for indicating that the navigation mode is driving is received, determining a first parameter of the vehicle at a first moment;
or, when the speed of the terminal device is greater than a target speed threshold, determining a first parameter of the vehicle at a first moment;
or, when it is determined according to the GNSS signals that a turning intersection is ahead of the vehicle, determining a first parameter of the vehicle at a first moment;
or, when it is determined, based on an image of the scene ahead of the vehicle, that an overhead sign is ahead of the vehicle, determining a first parameter of the vehicle at a first moment.
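The trigger conditions listed above can be collected into a single gating check. A minimal sketch, assuming hypothetical event labels and an illustrative speed threshold (the application specifies neither):

```python
from typing import Optional

# Hypothetical event labels for the trigger conditions listed above.
TRIGGER_EVENTS = {
    "navigation_started",     # navigation function started
    "position_search",        # position searching operation received
    "destination_entered",    # operation indicating a destination
    "driving_mode_selected",  # navigation mode set to driving
    "turn_ahead",             # turning intersection detected from GNSS signals
    "overhead_sign_ahead",    # overhead sign detected in a forward image
}

def should_determine_first_parameter(event: Optional[str],
                                     speed_mps: float,
                                     speed_threshold_mps: float = 8.0) -> bool:
    """True when any listed trigger event occurs or the speed threshold is exceeded."""
    return event in TRIGGER_EVENTS or speed_mps > speed_threshold_mps
```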
In a possible implementation manner, the processor 1002 is configured to determine a driving state of the vehicle according to the first parameter and the second parameter, specifically:
transmitting the first parameter and the second parameter to a classifier, wherein the classifier is used for classifying the running state of the vehicle according to the parameters of the vehicle;
and determining the running state of the vehicle according to the output of the classifier.
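The application leaves the classifier itself open; it could be any trained model. Purely to illustrate the interface, the following sketch stands in a simple rule on consecutive pitch-angle readings for the classifier. The angle threshold and the state labels are invented for the example:

```python
# State labels (illustrative names, not from the application).
UP_START, UP_END = "start_up_ramp", "end_up_ramp"
DOWN_START, DOWN_END = "start_down_ramp", "end_down_ramp"
LEVEL = "level"

def classify_driving_state(prev_pitch_deg: float, pitch_deg: float,
                           ramp_pitch_deg: float = 3.0) -> str:
    """Classify the running state from two consecutive pitch readings.

    A pitch magnitude above `ramp_pitch_deg` (assumed value) is treated
    as 'on a ramp'; transitions between level and ramp driving mark the
    start/end of an up- or down-ramp.
    """
    on_ramp_before = abs(prev_pitch_deg) >= ramp_pitch_deg
    on_ramp_now = abs(pitch_deg) >= ramp_pitch_deg
    if not on_ramp_before and on_ramp_now:
        return UP_START if pitch_deg > 0 else DOWN_START
    if on_ramp_before and not on_ramp_now:
        return UP_END if prev_pitch_deg > 0 else DOWN_END
    return LEVEL
```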
In a possible implementation manner, the overhead is a single layer, and the processor 1002 is configured to determine an overhead identification result of the vehicle according to a height change of the vehicle, specifically:
if the running state of the vehicle is starting an up-ramp, and the height change of the vehicle in a first time period is greater than a first threshold, determining that the overhead identification result of the vehicle is that the vehicle runs on a road on the upper side of the overhead;
or, if the running state of the vehicle is starting a down-ramp, and the absolute value of the height change of the vehicle in the first time period is greater than the first threshold, determining that the overhead identification result of the vehicle is that the vehicle runs on the road on the lower side of the overhead;
or, if the running state of the vehicle is ending an up-ramp, and the height change of the vehicle in a preceding second time period is greater than the first threshold, determining that the overhead identification result of the vehicle is that the vehicle runs on a road on the upper side of the overhead;
or, if the running state of the vehicle is ending a down-ramp, and the absolute value of the height change of the vehicle in the preceding second time period is greater than the first threshold, determining that the overhead identification result of the vehicle is that the vehicle runs on the road on the lower side of the overhead.
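The four single-layer rules above can be sketched as one decision function. The state and result labels are illustrative; the time-window lengths and the threshold (which the text derives from the road section's speed limit, ramp length, and overhead height) are assumed to be supplied by the caller, and the absolute value is used for the down-ramp windows, matching the parallel rules in the text:

```python
ON_OVERHEAD, UNDER_OVERHEAD, UNKNOWN = "on_overhead", "under_overhead", "unknown"

def identify_single_layer(state: str, height_change_m: float,
                          first_threshold_m: float) -> str:
    """Single-layer overhead decision from a driving state and a height change.

    `height_change_m` covers the first time period for the starting states,
    or the preceding second time period for the ending states.
    """
    if state == "start_up_ramp" and height_change_m > first_threshold_m:
        return ON_OVERHEAD
    if state == "start_down_ramp" and abs(height_change_m) > first_threshold_m:
        return UNDER_OVERHEAD
    if state == "end_up_ramp" and height_change_m > first_threshold_m:
        return ON_OVERHEAD
    if state == "end_down_ramp" and abs(height_change_m) > first_threshold_m:
        return UNDER_OVERHEAD
    return UNKNOWN  # no decision when the height change stays within the threshold
```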
In a possible implementation manner, the overhead includes at least two layers, and the processor 1002 is configured to determine an overhead identification result of the vehicle according to the height change of the vehicle, specifically:
if the running state of the vehicle is starting an up-ramp, and the height change of the vehicle in a first time period is greater than a first threshold, determining that the overhead identification result of the vehicle is that the vehicle runs on a road on the upper side of the overhead;
or, if the running state of the vehicle is starting a down-ramp, and the absolute value of the height change of the vehicle in the first time period is greater than the first threshold, determining the overhead identification result of the vehicle according to the state of the vehicle before the down-ramp;
or, if the running state of the vehicle is ending an up-ramp, and the height change of the vehicle in a preceding second time period is greater than the first threshold, determining that the overhead identification result of the vehicle is that the vehicle runs on a road on the upper side of the overhead;
or, if the running state of the vehicle is ending a down-ramp, and the absolute value of the height change of the vehicle in the preceding second time period is greater than the first threshold, determining the overhead identification result of the vehicle according to the state of the vehicle before the down-ramp.
In a possible implementation, the state of the vehicle before the down-ramp includes one of the following states: the vehicle travels on the first-layer overhead, or the vehicle travels on an overhead of another layer, the first-layer overhead being the overhead road closest to the road on the lower side of the overhead.
In a possible implementation, the processor 1002 is further configured to:
determining the state of the vehicle before the down-ramp according to the height of the vehicle and the height of each layer of the overhead;
or, after determining the overhead identification result of the vehicle, recording the change in the number of the layer of the overhead on which the vehicle is located;
and determining the state of the vehicle before the down-ramp according to the change in the number of the layer of the overhead on which the vehicle is located.
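The second option above — recording layer-number changes after each recognition result and reading the pre-down-ramp state back from them — might be sketched as follows; the class name and state labels are assumptions for the example:

```python
class LayerTracker:
    """Records changes of the overhead layer the vehicle is on (0 = ground road)."""

    def __init__(self) -> None:
        self.layer_history = [0]

    def record(self, layer: int) -> None:
        # Only record an entry when the layer number actually changes.
        if layer != self.layer_history[-1]:
            self.layer_history.append(layer)

    def state_before_down_ramp(self) -> str:
        """Read the pre-down-ramp state from the most recent layer."""
        current = self.layer_history[-1]
        return "first_layer_overhead" if current == 1 else "other_layer_overhead"
```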
In a possible implementation, the processor 1002 is further configured to:
after determining the overhead identification result of the vehicle, reporting the overhead identification result of the vehicle to a server;
or if the change of the layer number of the overhead where the vehicle is located is recorded, and the overhead identification result of the vehicle indicates that the vehicle runs on a road on the upper side of the overhead, reporting the overhead identification result of the vehicle and the layer number of the overhead where the vehicle is located to the server.
In a possible implementation, the processor 1002 is further configured to:
if the overhead identification result of the vehicle indicates that the vehicle is traveling on a road on the lower side of the overhead, and the front of the traveling direction of the vehicle includes a weak signal region, the navigation method is adjusted from navigation by GNSS signals to navigation by a network positioning method, or to navigation by the network positioning method together with an inertial navigation method.
In a possible implementation manner, the weak signal area includes: a tunnel area or an occluded area in which GNSS signals are blocked.
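The fallback rule above can be written as a small selector. A sketch under the assumption of illustrative method names (the application does not name the modes this way):

```python
def choose_navigation_method(overhead_result: str,
                             weak_signal_ahead: bool,
                             has_inertial: bool = True) -> str:
    """Switch away from GNSS only when the vehicle is under the overhead
    and a weak-signal region (e.g. a tunnel) lies ahead."""
    if overhead_result == "under_overhead" and weak_signal_ahead:
        return "network+inertial" if has_inertial else "network"
    return "gnss"
```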
That is, the apparatus 1000 may implement steps or procedures corresponding to those performed by the terminal device in the embodiment of the overhead identification method shown in fig. 9, and the apparatus 1000 may include modules for performing the method performed by the terminal device in the embodiment of the overhead identification method shown in fig. 9. It should be understood that the specific process of executing the corresponding steps by each module is already described in detail in the above embodiment of the overhead identification method, and is not described herein for brevity.
The embodiment of the application also provides a navigation device which comprises at least one processor and a communication interface. The communication interface is configured to provide information input and/or output to the at least one processor, which is configured to perform the method of the above-described method embodiments.
The present application also provides a terminal device comprising a processor, which when executing a computer program or instructions in a memory, performs a method as in the method embodiments described above.
The embodiment of the application also provides terminal equipment, which comprises a processor and a memory; the memory is used for storing a computer program or instructions; the processor is configured to execute the computer program or instructions stored in the memory, to cause the terminal device to perform a method as in the method embodiments described above.
The embodiment of the application also provides terminal equipment, which comprises a processor, a memory and a transceiver; the transceiver is used for receiving signals or transmitting signals; the memory is used for storing a computer program or instructions; the processor is configured to execute the computer program or instructions stored in the memory, to cause the terminal device to perform a method as in the method embodiments described above.
The embodiment of the application also provides terminal equipment, which comprises a processor and an interface circuit; the interface circuit is used for receiving a computer program or instructions and transmitting the computer program or instructions to the processor; the processor is configured to execute the computer program or instructions to cause the terminal device to perform a method as in the method embodiments described above.
It should be understood that the navigation device described above may be a chip. For example, referring to fig. 13, fig. 13 is a block diagram illustrating a structure of an embodiment of a chip according to the present application. The chip shown in fig. 13 may be a general-purpose processor or a special-purpose processor. The chip 1100 may include at least one processor 1101. Wherein the at least one processor 1101 may be configured to support the apparatus shown in fig. 12 to perform the technical solution shown in fig. 9.
Optionally, the chip 1100 may further include a transceiver 1102, where the transceiver 1102 operates under the control of the processor 1101 and is configured to support the apparatus shown in fig. 12 in performing the technical solution shown in fig. 9. Optionally, the chip 1100 shown in fig. 13 may further include a storage medium 1103. In particular, the transceiver 1102 may be replaced with a communication interface that provides information input and/or output to the at least one processor 1101.
It should be noted that the chip 1100 shown in fig. 13 may be implemented using the following circuits or devices: one or more field programmable gate arrays (field programmable gate array, FPGA), programmable logic devices (programmable logic device, PLD), application specific integrated circuits (application specific integrated circuit, ASIC), systems on chip (system on chip, SoC), central processing units (central processor unit, CPU), network processors (network processor, NP), digital signal processing circuits (digital signal processor, DSP), microcontrollers (micro controller unit, MCU), controllers, state machines, gate logic, discrete hardware components, any other suitable circuit, or any combination of circuits capable of executing the various functions described throughout this application.
In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The steps of a method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware processor for execution, or in a combination of hardware and software modules in the processor for execution. The software modules may be located in a random access memory, flash memory, read only memory, programmable read only memory, or electrically erasable programmable memory, registers, etc. as well known in the art. The storage medium is located in a memory, and the processor reads the information in the memory and, in combination with its hardware, performs the steps of the above method. To avoid repetition, a detailed description is not provided herein.
It should be noted that the processor in the embodiments of the present application may be an integrated circuit chip with signal processing capability. In implementation, the steps of the above method embodiments may be implemented by integrated logic circuits of hardware in a processor or instructions in software form. The processor may be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. The disclosed methods, steps, and logic blocks in the embodiments of the present application may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be embodied directly in the execution of a hardware decoding processor, or in the execution of a combination of hardware and software modules in a decoding processor. The software modules may be located in a random access memory, flash memory, read only memory, programmable read only memory, or electrically erasable programmable memory, registers, etc. as well known in the art. The storage medium is located in a memory, and the processor reads the information in the memory and, in combination with its hardware, performs the steps of the above method.
It will be appreciated that the memory in embodiments of the application may be volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be random access memory (random access memory, RAM), which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM). It should be noted that the memory of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
According to the method provided by the embodiment of the application, the embodiment of the application also provides a computer program product, which comprises: computer program or instructions which, when run on a computer, cause the computer to perform the method of any of the embodiments shown in fig. 9.
According to the method provided by the embodiment of the present application, the embodiment of the present application further provides a computer storage medium storing a computer program or instructions, which when executed on a computer, cause the computer to perform the method of any one of the embodiments shown in fig. 9.
According to the method provided by the embodiment of the application, the embodiment of the application also provides a terminal device, wherein the terminal device is an intelligent device, such as a smart phone, a tablet computer, or a personal digital assistant, and the intelligent device includes the above-described position information generating device.
Those of ordinary skill in the art will appreciate that the various illustrative logical blocks (illustrative logical block) and steps (steps) described in connection with the embodiments disclosed herein can be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described system, apparatus and module may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, and for example, the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The modules described as separate components may or may not be physically separate, and components shown as modules may or may not be physical units, may be located in one place, or may be distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional module in the embodiments of the present application may be integrated in one processing unit, or each module may exist alone physically, or two or more modules may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The position information generating device, the chip, the computer storage medium, the computer program product, and the terminal device provided in the embodiments of the present application are all configured to execute the method provided above; therefore, for the beneficial effects they achieve, reference may be made to the beneficial effects corresponding to the method provided above, which are not described herein again.
It should be understood that, in the embodiments of the present application, the execution order of the steps should be determined by their functions and internal logic; the sequence numbers of the steps do not imply an order of execution and do not limit the implementation process of the embodiments.
All parts of the specification are described in a progressive manner; for the parts of the embodiments that are the same as or similar to one another, reference may be made to each other, and each embodiment focuses on its differences from the other embodiments. In particular, for the embodiments of the position information generating device, chip, computer storage medium, computer program product, and terminal device, the description is relatively simple because they are substantially similar to the method embodiments; for relevant details, see the description of the method embodiments.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the application.
The embodiments of the present application described above do not limit the scope of the present application.

Claims (19)

1. An overhead identification method, wherein the method is applied to a terminal device, and the method comprises:
at a first moment, determining, according to global navigation satellite system GNSS signals, that a turning intersection is ahead of a vehicle, and determining a first parameter of the vehicle, wherein the first parameter comprises speed and heading;
periodically acquiring, by a sensor, a second parameter of the vehicle, the second parameter comprising a pitch angle, a roll angle, and a heading angle;
determining, according to the first parameter and the second parameter, that the vehicle is starting an up-ramp;
determining that the change in height of the vehicle in a first time period after the vehicle is determined to be starting an up-ramp is greater than a first threshold, and determining that the vehicle runs on a road on the upper side of an overhead; the duration of the first time period is determined according to the speed limit and ramp length of the road section ascending the overhead, the first threshold is determined from the height of the overhead road, and the height of the overhead road is determined based on the position determined by the GNSS signals;
at a second moment, determining, according to the height of the vehicle and the height of each layer of the overhead, that the vehicle runs on a second-layer overhead;
determining, according to the GNSS signals, that a turning intersection is ahead of the vehicle, and determining a third parameter of the vehicle, wherein the third parameter comprises speed and heading;
periodically acquiring, by the sensor, a fourth parameter of the vehicle, the fourth parameter comprising a pitch angle, a roll angle, and a heading angle;
determining, according to the third parameter and the fourth parameter, that the vehicle is starting a down-ramp;
determining that the change in height of the vehicle in a second time period after the vehicle is determined to be starting a down-ramp is greater than a first threshold, and determining that the vehicle runs on a first-layer overhead; the duration of the second time period is determined according to the speed limit and ramp length of the road section descending the overhead;
at a third moment, determining, according to the GNSS signals, that a turning intersection is ahead of the vehicle, and determining a fifth parameter of the vehicle, wherein the fifth parameter comprises speed and heading;
periodically acquiring, by the sensor, a sixth parameter of the vehicle, the sixth parameter comprising a pitch angle, a roll angle, and a heading angle;
determining, according to the fifth parameter and the sixth parameter, that the vehicle is starting a down-ramp;
determining that the change in height of the vehicle in a third time period after the vehicle is determined to be starting a down-ramp is greater than a first threshold, and determining that the vehicle runs on a road on the lower side of the overhead; the duration of the third time period is determined according to the speed limit and ramp length of the road section descending the overhead;
determining that a weak signal region is ahead in the traveling direction of the vehicle, and adjusting the navigation method from navigation by GNSS signals to navigation by a network positioning method, or to navigation by the network positioning method and an inertial navigation method.
2. The method of claim 1, wherein the determining, according to the first parameter and the second parameter, that the vehicle is starting an up-ramp comprises:
transmitting the first parameter and the second parameter to a classifier, wherein the classifier is used for classifying the running state of the vehicle according to the parameters of the vehicle;
and determining, according to the output of the classifier, that the vehicle is starting an up-ramp.
3. The method of claim 1, further comprising, if the overhead is a single layer:
if the running state of the vehicle is starting a down-ramp, and the absolute value of the height change of the vehicle in the first time period is greater than the first threshold, determining that the overhead identification result of the vehicle is that the vehicle runs on a road on the lower side of the overhead;
or, if the running state of the vehicle is ending an up-ramp, and the height change of the vehicle in a preceding second time period is greater than the first threshold, determining that the overhead identification result of the vehicle is that the vehicle runs on a road on the upper side of the overhead;
or, if the running state of the vehicle is ending a down-ramp, and the absolute value of the height change of the vehicle in the preceding second time period is greater than the first threshold, determining that the overhead identification result of the vehicle is that the vehicle runs on the road on the lower side of the overhead.
4. The method of claim 1, further comprising, if the overhead comprises at least two layers:
if the running state of the vehicle is ending an up-ramp, and the height change of the vehicle in a preceding second time period is greater than the first threshold, determining that the overhead identification result of the vehicle is that the vehicle runs on a road on the upper side of the overhead;
or, if the running state of the vehicle is ending a down-ramp, and the absolute value of the height change of the vehicle in the preceding second time period is greater than the first threshold, determining the overhead identification result of the vehicle according to the state of the vehicle before the down-ramp.
5. The method of claim 4, wherein
the state of the vehicle before the down-ramp includes one of the following states: the vehicle travels on the first-layer overhead, or the vehicle travels on an overhead of another layer, the first-layer overhead being the overhead road closest to the road on the lower side of the overhead.
6. The method as recited in claim 4, further comprising:
determining the state of the vehicle before the down-ramp according to the height of the vehicle and the height of each layer of the overhead;
or, after determining the overhead identification result of the vehicle, recording the change in the number of the layer of the overhead on which the vehicle is located;
and determining the state of the vehicle before the down-ramp according to the change in the number of the layer of the overhead on which the vehicle is located.
7. The method as recited in claim 1, further comprising:
after determining the overhead identification result of the vehicle, reporting the overhead identification result of the vehicle to a server;
or if the change of the layer number of the overhead where the vehicle is located is recorded, and the overhead identification result of the vehicle indicates that the vehicle runs on a road on the upper side of the overhead, reporting the overhead identification result of the vehicle and the layer number of the overhead where the vehicle is located to the server.
8. The method of claim 1, wherein
the weak signal region includes: a tunnel area or an occluded area.
9. An overhead identifying apparatus, the apparatus being applied to a terminal device, the apparatus comprising: a transceiver and a processor;
The transceiver is used for receiving GNSS signals of a global navigation satellite system;
the processor is configured to:
at a first moment, determining that the front part of a vehicle comprises a turn mouth according to GNSS signals of a global navigation satellite system, and determining first parameters of the vehicle, wherein the first parameters comprise speed and heading;
periodically acquiring, by a sensor, a second parameter of the vehicle, the second parameter including a pitch angle, a roll angle, and a heading angle;
determining that the vehicle is a starting ramp according to the first parameter and the second parameter;
determining that the change of the height of the vehicle is larger than a first threshold value in a first time period after the vehicle is determined to start an up-ramp, and determining that the vehicle runs on a road on the upper side of an overhead; the duration of the first time period is determined according to road section speed limit and gradient length when the overhead road is lifted, the first threshold is determined through the height of the overhead road, and the height of the overhead road is determined based on the position determined by the GNSS signals;
at a second moment, determining that the vehicle runs on a second layer of overhead according to the height of the vehicle and the height of each layer of overhead;
determining, according to the GNSS signals, that a turn road junction lies ahead of the vehicle, and determining a third parameter of the vehicle, the third parameter comprising speed and heading;
periodically acquiring, by the sensor, a fourth parameter of the vehicle, the fourth parameter comprising a pitch angle, a roll angle and a heading angle;
determining, according to the third parameter and the fourth parameter, that the vehicle is starting a down-ramp;
determining that the absolute value of the change in the height of the vehicle within a second time period after it is determined that the vehicle is starting a down-ramp is larger than the first threshold, and determining that the vehicle runs on the first floor of the overhead; the duration of the second time period is determined according to the speed limit and the slope length of the road section descending the overhead;
at a third moment, determining, according to the GNSS signals, that a turn road junction lies ahead of the vehicle, and determining a fifth parameter of the vehicle, the fifth parameter comprising speed and heading;
periodically acquiring, by the sensor, a sixth parameter of the vehicle, the sixth parameter comprising a pitch angle, a roll angle and a heading angle;
determining, according to the fifth parameter and the sixth parameter, that the vehicle is starting a down-ramp;
determining that the absolute value of the change in the height of the vehicle within a third time period after it is determined that the vehicle is starting a down-ramp is larger than the first threshold, and determining that the vehicle runs on a road on the lower side of the overhead; the duration of the third time period is determined according to the speed limit and the slope length of the road section descending the overhead;
determining that a weak signal area lies ahead in the travelling direction of the vehicle, and switching from navigation by the GNSS signals to navigation by a network positioning method, or to navigation by the network positioning method combined with an inertial navigation method.
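Taken together, the ramp steps of claim 9 amount to: flag a candidate ramp when a turn junction lies ahead, then confirm it by checking whether the altitude change within a window (sized from the section speed limit and slope length) exceeds a threshold derived from the overhead height. A minimal sketch of that confirmation logic follows; the function names, the 80% margin, and the numeric values are illustrative assumptions, not taken from the patent:

```python
# Illustrative sketch of the up-/down-ramp confirmation in claim 9.
# All names and numeric defaults are assumptions for demonstration.

def window_duration(speed_limit_mps: float, slope_length_m: float) -> float:
    """Duration of the confirmation window: the time needed to traverse
    the ramp at the section speed limit (claim 9 derives it from both)."""
    return slope_length_m / speed_limit_mps

def confirm_ramp(heights: list[float], overhead_height_m: float,
                 going_up: bool) -> bool:
    """Confirm a ramp if the altitude change over the window exceeds a
    threshold derived from the overhead height (here: 80% of it)."""
    threshold = 0.8 * overhead_height_m          # assumed margin
    change = heights[-1] - heights[0]            # signed height change
    return change > threshold if going_up else -change > threshold

# Example: heights sampled over the window while ascending (metres)
samples = [4.0, 5.5, 7.1, 8.6, 10.2]
on_upper_road = confirm_ramp(samples, overhead_height_m=7.0, going_up=True)
```

With a 7 m overhead, a 6.2 m rise over the window clears the assumed 5.6 m threshold, so the vehicle would be judged to be on the road on the upper side of the overhead.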
10. The apparatus of claim 9, wherein, to determine according to the first parameter and the second parameter that the vehicle is starting an up-ramp, the processor is specifically configured to:
input the first parameter and the second parameter into a classifier, the classifier being configured to classify the running state of the vehicle according to the parameters of the vehicle;
and determine the running state of the vehicle according to the output of the classifier.
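Claim 10 does not restrict the classifier to any particular model. As a purely illustrative stand-in, a rule that labels the running state mainly from the pitch angle could expose the same interface; the labels, thresholds, and function name below are my own assumptions:

```python
# Hypothetical stand-in for the running-state classifier of claim 10.
# A real implementation might be a trained model; this pitch-angle rule
# only illustrates the parameter-in / state-out interface. Thresholds
# (degrees, m/s) are assumed values.

def classify_running_state(speed: float, heading: float,
                           pitch: float, roll: float, yaw: float) -> str:
    """Map the first/second parameters of the vehicle to a state label."""
    if speed < 0.5:            # essentially stationary
        return "stopped"
    if pitch > 3.0:            # sustained nose-up attitude
        return "starting up-ramp"
    if pitch < -3.0:           # sustained nose-down attitude
        return "starting down-ramp"
    return "level driving"

state = classify_running_state(speed=12.0, heading=90.0,
                               pitch=4.5, roll=0.2, yaw=90.1)
```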
11. The apparatus of claim 9, wherein the overhead comprises a single floor, and the processor is further configured to:
if the running state of the vehicle is starting a down-ramp and the absolute value of the height change of the vehicle within the first time period is larger than the first threshold, determine that the overhead identification result of the vehicle is that the vehicle runs on a road on the lower side of the overhead;
or, if the running state of the vehicle is ending an up-ramp and the height change of the vehicle within the preceding second time period is larger than the first threshold, determine that the overhead identification result of the vehicle is that the vehicle runs on a road on the upper side of the overhead;
or, if the running state of the vehicle is ending a down-ramp and the absolute value of the height change of the vehicle within the preceding second time period is larger than the first threshold, determine that the overhead identification result of the vehicle is that the vehicle runs on a road on the lower side of the overhead.
12. The apparatus of claim 9, wherein if the overhead comprises at least two floors, the processor is further configured to:
if the running state of the vehicle is ending an up-ramp and the height change of the vehicle within the preceding second time period is larger than the first threshold, determine that the overhead identification result of the vehicle is that the vehicle runs on a road on the upper side of the overhead;
or, if the running state of the vehicle is ending a down-ramp and the absolute value of the height change of the vehicle within the preceding second time period is larger than the first threshold, determine the overhead identification result of the vehicle according to the state of the vehicle before the down-ramp.
13. The apparatus of claim 12, wherein the state of the vehicle before the down-ramp comprises one of the following states: the vehicle runs on the first-floor overhead, or the vehicle runs on an overhead of another floor, the first-floor overhead being the overhead floor closest to the road on the lower side of the overhead.
14. The apparatus of claim 12, wherein the processor is further configured to:
determine, according to the height of the vehicle and the height of each overhead floor, the state of the vehicle before the down-ramp;
or, after determining the overhead identification result of the vehicle, record the change in the number of the overhead floor where the vehicle is located,
and determine, according to the change in the number of the overhead floor where the vehicle is located, the state of the vehicle before the down-ramp.
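The first branch of claim 14, determining which floor the vehicle was on from its height and the per-floor heights, can be sketched as a nearest-height lookup. The floor heights, the ground-level convention (floor 0), and the function name below are assumed for illustration:

```python
# Illustrative lookup of the overhead floor from vehicle height (claim 14).
# Floor heights and the ground-level convention are assumptions.

def floor_from_height(vehicle_height_m: float,
                      floor_heights_m: list[float]) -> int:
    """Return the 1-based floor whose height is closest to the vehicle's,
    or 0 if the vehicle is closer to ground level (height 0)."""
    candidates = [0.0] + floor_heights_m   # index 0 = road below the overhead
    return min(range(len(candidates)),
               key=lambda i: abs(vehicle_height_m - candidates[i]))

# Example: a two-floor overhead with decks at 7 m and 14 m
floors = [7.0, 14.0]
before_down_ramp = floor_from_height(13.2, floors)   # nearest to the 14 m deck
```

A measured height of 13.2 m is nearest to the 14 m deck, so the state before the down-ramp would be "on the second floor of the overhead".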
15. The apparatus of claim 9, wherein the processor is further configured to:
after determining the overhead identification result of the vehicle, report the overhead identification result of the vehicle to a server;
or, if the change in the number of the overhead floor where the vehicle is located has been recorded and the overhead identification result of the vehicle indicates that the vehicle runs on a road on the upper side of the overhead, report the overhead identification result of the vehicle and the number of the overhead floor where the vehicle is located to the server.
16. The apparatus of claim 9, wherein the weak signal area comprises a tunnel area or a sheltered area.
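The weak-signal handling in claims 9 and 16 reduces to a source-selection policy: GNSS by default, network positioning (optionally fused with inertial navigation) when a tunnel or sheltered area lies ahead. A sketch of that policy, with the area labels and fallback order as illustrative assumptions:

```python
# Illustrative positioning-source selection for claims 9 and 16.
# Area labels and the fallback policy are assumptions for demonstration.

WEAK_SIGNAL_AREAS = {"tunnel", "sheltered"}

def select_navigation_sources(area_ahead, inertial_available):
    """GNSS by default; ahead of a weak-signal area, switch to network
    positioning, optionally combined with inertial navigation."""
    if area_ahead in WEAK_SIGNAL_AREAS:
        sources = ["network"]
        if inertial_available:
            sources.append("inertial")
        return sources
    return ["gnss"]

sources = select_navigation_sources("tunnel", inertial_available=True)
```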
17. A terminal device, characterized in that it comprises an apparatus according to any one of claims 9 to 16.
18. A computer storage medium having a computer program or instructions stored therein which, when executed, carry out the method of any one of claims 1 to 8.
19. A chip comprising a processor coupled to a memory, the processor being configured to execute a computer program or instructions stored in the memory which, when executed, carry out the method of any one of claims 1 to 8.
CN202110904063.XA 2021-08-06 2021-08-06 Overhead identification method and device Active CN113804211B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110904063.XA CN113804211B (en) 2021-08-06 2021-08-06 Overhead identification method and device
PCT/CN2022/091520 WO2023010923A1 (en) 2021-08-06 2022-05-07 Overpass identification method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110904063.XA CN113804211B (en) 2021-08-06 2021-08-06 Overhead identification method and device

Publications (2)

Publication Number Publication Date
CN113804211A CN113804211A (en) 2021-12-17
CN113804211B true CN113804211B (en) 2023-10-03

Family

ID=78942753

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110904063.XA Active CN113804211B (en) 2021-08-06 2021-08-06 Overhead identification method and device

Country Status (2)

Country Link
CN (1) CN113804211B (en)
WO (1) WO2023010923A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113804211B (en) * 2021-08-06 2023-10-03 荣耀终端有限公司 Overhead identification method and device
CN114509068A (en) * 2022-01-04 2022-05-17 海信集团控股股份有限公司 Method and device for judging positions of vehicles on multilayer road

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102012234A (en) * 2009-09-04 2011-04-13 陈宗炜 Vehicle navigation device
CN103335655A (en) * 2013-05-29 2013-10-02 周眉 Navigator and navigation method
CN106989743A (en) * 2017-03-31 2017-07-28 上海电机学院 A kind of energy automatic sensing passes in and out the apparatus for vehicle navigation of overpass information
CN107764274A (en) * 2016-08-17 2018-03-06 厦门雅迅网络股份有限公司 It is a kind of to differentiate whether vehicle travels the method in overpass
CN108195391A (en) * 2018-01-29 2018-06-22 千寻位置网络有限公司 Based on barometrical detection method on overhead or under overhead
CN109883438A (en) * 2019-03-21 2019-06-14 斑马网络技术有限公司 Automobile navigation method, device, medium and electronic equipment
CN110979339A (en) * 2019-11-26 2020-04-10 南京市德赛西威汽车电子有限公司 Front road form reconstruction method based on V2V
CN111127874A (en) * 2018-10-30 2020-05-08 上海擎感智能科技有限公司 Overhead identification method and identification system
CN112945244A (en) * 2021-02-03 2021-06-11 西华大学 Rapid navigation system and navigation method suitable for complex overpass
CN112945230A (en) * 2021-01-26 2021-06-11 腾讯科技(深圳)有限公司 Vehicle driving state identification method and device, computer equipment and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102401660A (en) * 2010-09-17 2012-04-04 环达电脑(上海)有限公司 Positioning method of elevated road
CN104422426B (en) * 2013-08-30 2017-07-11 博世汽车部件(苏州)有限公司 The method and apparatus that vehicle navigation information is provided in overpass region
JP6169945B2 (en) * 2013-10-25 2017-07-26 アルパイン株式会社 Navigation device and elevated vertical path determination method
CN111310675A (en) * 2020-02-20 2020-06-19 上海赛可出行科技服务有限公司 Overhead identification auxiliary positioning method based on convolutional neural network
CN113804211B (en) * 2021-08-06 2023-10-03 荣耀终端有限公司 Overhead identification method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on a Method for Recognizing the Passing State of Vehicles on Interchanges Based on Multi-Source Sensor Information Fusion; Bai Jie; China Master's Theses Full-text Database, Engineering Science and Technology II; 2020-10-15; pp. 18-42 *

Also Published As

Publication number Publication date
WO2023010923A1 (en) 2023-02-09
CN113804211A (en) 2021-12-17

Similar Documents

Publication Publication Date Title
CN113792589B (en) Overhead identification method and device
CN113804211B (en) Overhead identification method and device
CN110263688B (en) Driving related guidance providing method and apparatus, and computer readable recording medium
US9677899B2 (en) Electronic apparatus, control method thereof, computer program, and computer-readable recording medium
CN109445425A (en) Method for testing performance, device and the storage medium of automated driving system
CN115330923B (en) Point cloud data rendering method and device, vehicle, readable storage medium and chip
US20240017719A1 (en) Mapping method and apparatus, vehicle, readable storage medium, and chip
CN114882464B (en) Multi-task model training method, multi-task processing method, device and vehicle
CN114935334B (en) Construction method and device of lane topological relation, vehicle, medium and chip
WO2023169448A1 (en) Method and apparatus for sensing target
CN115170630B (en) Map generation method, map generation device, electronic equipment, vehicle and storage medium
CN114756700B (en) Scene library establishing method and device, vehicle, storage medium and chip
CN113790732B (en) Method and device for generating position information
CN115164910B (en) Travel route generation method, travel route generation device, vehicle, storage medium, and chip
CN114937351B (en) Motorcade control method and device, storage medium, chip, electronic equipment and vehicle
CN115205311B (en) Image processing method, device, vehicle, medium and chip
CN105651298A (en) Electronic apparatus and control method thereof
CN113820732A (en) Navigation method and device
CN113790733B (en) Navigation method and device
CN114842454B (en) Obstacle detection method, device, equipment, storage medium, chip and vehicle
CN115221260B (en) Data processing method, device, vehicle and storage medium
CN113790731B (en) Method and device for generating speed information
WO2024067078A1 (en) Vehicle positioning method and electronic device
CN115219151B (en) Vehicle testing method, system, electronic equipment and medium
CN115082886B (en) Target detection method, device, storage medium, chip and vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant