CN113449211B - Line navigation method and device, electronic equipment and computer readable medium - Google Patents


Info

Publication number
CN113449211B
CN113449211B
Authority
CN
China
Prior art keywords
route
transfer
image
preselected
replaceable
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110831980.XA
Other languages
Chinese (zh)
Other versions
CN113449211A (en)
Inventor
罗祎
申雪岑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202110831980.XA
Publication of CN113449211A
Application granted
Publication of CN113449211B
Legal status: Active


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90: Details of database functions independent of the retrieved data types
    • G06F 16/95: Retrieval from the web
    • G06F 16/953: Querying, e.g. by the use of web search engines
    • G06F 16/9537: Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C 21/34: Route searching; Route guidance
    • G01C 21/3407: Route searching; Route guidance specially adapted for specific applications
    • G01C 21/3415: Dynamic re-routing, e.g. recalculating the route when the user deviates from the calculated route or after detecting real-time traffic data or accidents
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • G06T 19/006: Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)

Abstract

The disclosure provides a line navigation method and device, and relates to technical fields including computer vision, image processing, and augmented reality. The specific implementation scheme is as follows: acquiring the current position of a terminal and an image to be subjected to augmented reality; determining, based on the image and the current position, a transfer area and a transfer type corresponding to the transfer area; detecting, based on the transfer type, whether there is a replaceable route between the transfer area and the destination relative to a preselected route, the preselected route being a preselected route of a preset type between the current position and the destination; and, in response to detecting the replaceable route, indicating the replaceable route and the preselected route on the image by augmented reality indication marks. This embodiment improves the user experience.

Description

Line navigation method and device, electronic equipment and computer readable medium
Technical Field
The present disclosure relates to the field of computer technology, and in particular, to the technical fields of computer vision, image processing, augmented reality, and the like, and more particularly, to a line navigation method and apparatus, an electronic device, a computer readable medium, and a computer program product.
Background
By superimposing virtual 3D AR (Augmented Reality) markers that fit the three-dimensional space onto the live-action camera view, navigation can be provided for the user intuitively.
In existing AR navigation, switching to or querying another type of traffic navigation scheme generally requires the user to search again or to interact across different product functions (such as a walking navigation interface and a public transportation navigation interface) through separate pages, so navigation efficiency is low and the user's navigation experience is often interrupted.
Disclosure of Invention
A line navigation method and apparatus, an electronic device, a computer readable medium, and a computer program product are provided.
According to a first aspect, there is provided a line navigation method, the method comprising: acquiring a current position of a terminal and an image to be subjected to augmented reality; determining, based on the image and the current position, a transfer area and a transfer type corresponding to the transfer area; detecting, based on the transfer type, whether there is a replaceable route between the transfer area and a destination relative to a preselected route, the preselected route being a preselected route of a preset type between the current position and the destination; and, in response to detecting the replaceable route, indicating the replaceable route and the preselected route on the image by augmented reality indication marks.
According to a second aspect, there is provided a line navigation apparatus, the apparatus comprising: an acquisition unit configured to acquire a current position of a terminal and an image to be subjected to augmented reality; a determination unit configured to determine, based on the image and the current position, a transfer area and a transfer type corresponding to the transfer area; a detection unit configured to detect, based on the transfer type, whether there is a replaceable route between the transfer area and a destination relative to a preselected route, the preselected route being a preselected route of a preset type between the current position and the destination; and an indication unit configured to indicate, in response to detecting the replaceable route, the replaceable route and the preselected route on the image by augmented reality indication marks.
According to a third aspect, there is provided an electronic device comprising: at least one processor; and a memory communicatively coupled to the at least one processor, wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method as described in any one of the implementations of the first aspect.
According to a fourth aspect, there is provided a non-transitory computer readable storage medium storing computer instructions for causing a computer to perform a method as described in any implementation of the first aspect.
According to a fifth aspect, there is provided a computer program product comprising a computer program which, when executed by a processor, implements a method as described in any of the implementations of the first aspect.
The embodiments of the disclosure provide a line navigation method and device. First, the current position of a terminal and an image to be subjected to augmented reality are acquired; second, a transfer area and a transfer type corresponding to the transfer area are determined based on the image and the current position; third, based on the transfer type, it is detected whether there is a replaceable route between the transfer area and the destination relative to a preselected route, the preselected route being a preselected route of a preset type between the current position and the destination; finally, in response to detecting the replaceable route, the replaceable route and the preselected route are indicated on the image by augmented reality indication marks. Thus, while the user travels on the current preselected route, a replaceable route is determined based on the acquired image to be subjected to augmented reality display, and an indication mark of the replaceable route is added on the image, so that travel schemes are recommended intelligently without the user exiting the current augmented reality navigation state, which improves the user experience.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
The drawings are for a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
FIG. 1 is a flow chart of one embodiment of a line navigation method according to the present disclosure;
FIG. 2 is a schematic diagram of augmented reality indication marks and annotation information in an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of two different styles of augmented reality indication marks in an embodiment of the present disclosure;
FIG. 4 is another schematic diagram of augmented reality indication marks and annotation information in an embodiment of the present disclosure;
FIG. 5 is a third schematic diagram of augmented reality indication marks and annotation information in an embodiment of the present disclosure;
FIG. 6 is a flow chart of another embodiment of a line navigation method according to the present disclosure;
FIG. 7 is a schematic structural view of an embodiment of a line navigation apparatus according to the present disclosure;
fig. 8 is a block diagram of an electronic device for implementing a line navigation method of an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present disclosure to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
When a user uses a map AR navigation product to travel to a certain place, there may be a demand to switch the arrival mode during the journey; for example, after walking for a period of time, the user wants to take public transportation for the rest of the way to the destination. The current solution for such a demand is: manually exit the AR navigation product, query information of public transportation schemes again based on the current position and the destination, and determine, according to the scheme information, whether to take public transportation to the destination. The disadvantage of this solution is that it requires manually switching and querying various pieces of information, and the interaction and time costs are high. For this reason, the present disclosure proposes a line navigation method to solve the above problems.
Fig. 1 shows a flow 100 according to one embodiment of a line navigation method of the present disclosure, comprising the steps of:
Step 101, acquiring the current position of a terminal and an image to be subjected to augmented reality.
In this embodiment, the terminal may be a mobile terminal held by an object (i.e., the user). An image capturing device is provided on the mobile terminal and may capture the scene around the terminal or the object in real time. When the object has an augmented reality navigation requirement, an application supporting augmented reality display is opened on the terminal, and an image superimposed with augmented reality indication marks can be viewed in real time on the interface of the application. The image to be subjected to augmented reality is the image captured in real time by the image capturing device after the object opens the application on the terminal to perform navigation.
In this embodiment, the execution body on which the line navigation method runs may also implement the functions of the above application. Based on the preselected route of the object, it may provide an augmented reality navigation function for the object by superimposing, in the image to be subjected to augmented reality, augmented reality indication marks corresponding to the object's preselected route or replaceable route, so that the object experiences a 3D augmented reality display effect during navigation (as shown in fig. 2).
In this embodiment, the preselected route is a preselected route of a preset type between the current position and the destination. It should be noted that the preset type is the navigation type set by the object for the preselected route in the above application, and the navigation type may be walking, cycling, self-driving, driving, or the like.
Step 102, determining a transfer area and a transfer type corresponding to the transfer area based on the image and the current position.
In this embodiment, the transfer area is an area for jumping from the preselected route to a replaceable route of another type; for example, if the preselected route is a walking navigation route and the replaceable route is a bus route, the transfer area is a bus stop, a station, or the like. The replaceable route is a route different from the preselected route; its transfer type may be the same as or different from the preset type of the preselected route, but both belong to navigation types of the same navigation system, i.e., the transfer type of the replaceable route and the preset type of the preselected route belong to the same range of navigation types. This range may be set based on the navigation requirement, for example: walking, public transportation, subway, self-driving, etc.
In this embodiment, the actual area where the object and its terminal are located may be determined from the image to be subjected to augmented reality; for example, regional characteristics in the image (such as buildings, the clothing of people, etc.) are identified, and the transfer area is determined from these regional characteristics. Further, the object may be positioned on the map through the current position; by matching the attributes of each position point on the preselected route in the map against the identified regional characteristics, the correct transfer area of the object is determined, and the transfer type corresponding to the transfer area can then be determined from the characteristics of that area. For example, if the transfer area is a subway station, the transfer types corresponding to the transfer area may be walking, public transportation, subway, etc.
Optionally, determining the transfer area and the transfer type corresponding to the transfer area based on the image and the current position includes: dividing out a circular area with the current position as the center and a preset distance (the preset distance may be determined by the walking path of the object, for example, 20 meters) as the radius, and taking the circular area as the transfer area; and, based on the image to be subjected to augmented reality, identifying the traffic signs and the road type in the image, and determining the transfer type corresponding to the transfer area from the road type and the traffic signs.
Optionally, determining the transfer area and the transfer type corresponding to the transfer area based on the image and the current position includes: dividing out a square area with the current position as the center of its diagonal and a preset distance (the preset distance may be determined by the walking path of the object, for example, 20 meters) as the length of the diagonal, and taking the square area as the transfer area; and, based on the image to be subjected to augmented reality, identifying the traffic signs and the road type in the image, and determining the transfer type corresponding to the transfer area from the road type and the traffic signs.
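For illustration, the following sketch expresses the first of the two optional constructions above (a circular transfer area of preset radius around the current position), assuming that the traffic-sign recognition step described later already yields a list of sign labels; the function, the TransferArea structure, and the sign-to-type mapping are illustrative names and are not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Iterable, Set, Tuple

@dataclass
class TransferArea:
    center: Tuple[float, float]  # (longitude, latitude) of the current position
    radius_m: float              # preset distance, e.g. 20 meters

# Illustrative mapping from recognized traffic-sign labels to transfer types.
SIGN_TO_TRANSFER_TYPE = {
    "bus_stop_sign": "bus",
    "subway_entrance_sign": "subway",
    "taxi_stand_sign": "taxi",
    "crosswalk_marking": "walking",
}

def determine_transfer_area_and_types(
    current_position: Tuple[float, float],
    recognized_signs: Iterable[str],
    preset_distance_m: float = 20.0,
) -> Tuple[TransferArea, Set[str]]:
    """Build a circular transfer area around the current position and derive
    the transfer types from traffic signs recognized in the camera image."""
    area = TransferArea(center=current_position, radius_m=preset_distance_m)
    types = {SIGN_TO_TRANSFER_TYPE[s] for s in recognized_signs
             if s in SIGN_TO_TRANSFER_TYPE}
    return area, types

# Example: a 20 m circle around the user that supports bus and walking transfers.
area, types = determine_transfer_area_and_types(
    (116.404, 39.915), ["bus_stop_sign", "crosswalk_marking"])
```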
Step 103, detecting, based on the transfer type, whether there is a replaceable route between the transfer area and the destination relative to the preselected route.
Wherein the preselected route is a preselected route of a preset type between the current position and the destination.
In this embodiment, detecting, based on the transfer type, whether there is a replaceable route between the transfer area and the destination relative to the preselected route includes: planning all pre-transfer routes from any point in the transfer area to the destination; and, in response to there being, among all the pre-transfer routes, a pre-transfer route belonging to the transfer type, taking that pre-transfer route as a replaceable route, i.e., a replaceable route is detected.
In this embodiment, a pre-transfer route is a transfer route, planned in advance by the execution body on which the line navigation method runs, between a position point of the transfer area and the destination. The type of a pre-transfer route may be any one of the navigation types supported by the navigation system, i.e., the type of a pre-transfer route is a navigation type.
In this embodiment, the transfer type of the replaceable route may be the same as the preset type of the preselected route, or may be different from the preset type of the preselected route.
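As a minimal sketch of this detection step, assuming the pre-transfer routes have already been planned from points of the transfer area to the destination, a replaceable route is simply any planned route whose navigation type falls within the transfer types of the area; the Route structure below is a hypothetical illustration, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import List, Set

@dataclass
class Route:
    nav_type: str          # e.g. "walking", "bus", "subway"
    duration_min: float    # estimated time consumption in minutes

def detect_replaceable_routes(pre_transfer_routes: List[Route],
                              transfer_types: Set[str]) -> List[Route]:
    """A replaceable route is detected if at least one pre-transfer route,
    planned from a point of the transfer area to the destination, belongs to
    one of the transfer types of the area."""
    return [r for r in pre_transfer_routes if r.nav_type in transfer_types]

# Example: the transfer area supports bus and subway transfers.
candidates = [Route("bus", 18.0), Route("walking", 30.0)]
print(detect_replaceable_routes(candidates, {"bus", "subway"}))  # keeps the bus route
```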
Step 104, in response to detecting the replaceable route, indicating the replaceable route and the preselected route on the image by augmented reality indication marks.
In this embodiment, the replaceable route and the preselected route may be annotated with annotation information. The annotation information for the replaceable route may be displayed on the terminal in an AR information format, and information such as the estimated arrival time and the specific trip scheme may be presented around the replaceable route through the annotation information.
After the annotation information and the replaceable route are rendered onto the image to be subjected to augmented reality displayed on the terminal, whether the object adopts the preselected route or the replaceable route can be judged from the real-time traveling position of the object. After the object travels a certain distance, the AR presentation content of the route that was not adopted (the replaceable route or the preselected route) is blanked out, so that travel schemes are recommended intelligently and switched on the same platform without exiting the AR navigation state.
In this embodiment, different navigation types may require different types of augmented reality indication marks to be superimposed on the image to be subjected to augmented reality. As shown in fig. 2, the augmented reality indication mark a is used to indicate a walking navigation mode, and the annotation information of the preselected route indicated by the augmented reality indication mark a is "go straight for 25 meters", that is, the object holding the terminal in the current walking navigation state needs to travel 25 meters straight ahead.
In this embodiment, when the transfer type of the replaceable route differs from the preset type of the preselected route and augmented reality indication marks need to be superimposed on the image to be subjected to augmented reality, the preselected route and the replaceable route may be indicated with augmented reality indication marks of different styles. As shown in fig. 3, the preselected route is a walking navigation route inside a subway station and the replaceable route is a subway navigation route inside the same station; the first mark b is used to indicate the preselected route, and the second mark c is used to indicate the replaceable route. In fig. 3, the first mark b and the second mark c are displayed dynamically in real time as the object moves in the actual environment.
Optionally, in one example of the present application, when the preselected route is a walking navigation route and there are multiple replaceable routes, augmented reality indication marks of the same type may be used to indicate the different replaceable routes, and the details of each replaceable route may be explained by annotation information with different content. As shown in fig. 4, the augmented reality indication mark d is used to indicate a replaceable route whose navigation type is taxi, and the annotation information in fig. 4 is: a taxi requires a wait of about 30 minutes. As shown in fig. 5, the augmented reality indication mark e is used to indicate a replaceable route whose navigation type is bus, and the annotation information in fig. 5 is: it is suggested to take bus line 15, saving you 15 minutes.
Optionally, the line navigation method may further include: detecting whether the current position is located outside the transfer area; and, if the current position is detected to be located outside the transfer area, stopping indicating the replaceable route in the image by the augmented reality indication mark.
Optionally, the line navigation method may further include: receiving a determination instruction for selecting the replaceable route, and, based on the instruction, stopping indicating the preselected route in the image by the augmented reality indication mark.
In this optional implementation, the determination instruction for selecting the replaceable route may be an instruction issued by the object holding the terminal to confirm the selection of the replaceable route; the instruction makes the intention of the object explicit: navigate along the replaceable route.
In this embodiment, after obtaining the replaceable route, the execution body also renders the replaceable route into the image to be subjected to augmented reality on the terminal, and when there are multiple replaceable routes, the execution body may prompt each replaceable route separately in different manners.
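A minimal sketch of the indication logic is given below, assuming a hypothetical ArOverlay object that stands in for the AR rendering layer: each route is drawn with a style and annotation information, and the route that the object does not adopt is blanked out without leaving the AR navigation state. All class, method, and route names here are illustrative.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class ArMark:
    route_id: str
    style: str        # e.g. "walking_arrow", "bus_badge"
    annotation: str   # e.g. "suggest bus line 15, saves you 15 minutes"
    visible: bool = True

class ArOverlay:
    """Holds the augmented reality indication marks rendered over the camera image."""

    def __init__(self) -> None:
        self.marks: Dict[str, ArMark] = {}

    def indicate(self, route_id: str, style: str, annotation: str) -> None:
        self.marks[route_id] = ArMark(route_id, style, annotation)

    def stop_indicating(self, route_id: str) -> None:
        # Blank out the AR presentation of a route that was not adopted.
        if route_id in self.marks:
            self.marks[route_id].visible = False

overlay = ArOverlay()
overlay.indicate("preselected", "walking_arrow", "go straight for 25 meters")
overlay.indicate("bus_line_15", "bus_badge", "suggest bus line 15, saves you 15 minutes")
overlay.stop_indicating("bus_line_15")  # the object stayed on the preselected route
```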
In the line navigation method provided by the embodiments of the present disclosure, first, the current position of a terminal and an image to be subjected to augmented reality are acquired; second, a transfer area and a transfer type corresponding to the transfer area are determined based on the image and the current position; third, based on the transfer type, it is detected whether there is a replaceable route between the transfer area and the destination relative to a preselected route, the preselected route being a preselected route of a preset type between the current position and the destination; finally, in response to detecting the replaceable route, the replaceable route and the preselected route are indicated on the image by augmented reality indication marks. Thus, while the user travels on the current preselected route, a replaceable route is determined based on the acquired image to be subjected to augmented reality display, and an indication mark of the replaceable route is added on the image, so that travel schemes are recommended intelligently without the user exiting the current augmented reality navigation state, which improves the user experience.
In some optional implementations of the present embodiment, detecting, based on the transfer type, whether there is a replaceable route between the transfer area and the destination relative to the preselected route includes: determining pre-transfer routes belonging to the transfer type from any point in the transfer area to the destination; screening out, among all the pre-transfer routes, the pre-transfer routes that coincide with the near end of the preselected route relative to the transfer area; and determining, based on the screened pre-transfer routes, a replaceable route that differs from the preselected route from any point in the transfer area to the destination.
In this optional implementation, a screened pre-transfer route coincides with the preselected route, the coinciding section is located within the transfer area, and the coinciding section is at the near end of the preselected route closest to the transfer area. By screening the pre-transfer routes, the pre-transfer routes that do not coincide with the preselected route, as well as those that deviate from the direction of the preselected route, can be filtered out.
In this optional implementation, determining, based on the screened pre-transfer routes, a replaceable route that differs from the preselected route from any point in the transfer area to the destination may include: comparing each screened pre-transfer route with the preselected route, and, if a screened pre-transfer route differs from the preselected route, taking it as a replaceable route.
In this optional implementation, by determining the replaceable route, multiple alternative navigation modes are provided for the user's navigation, which improves the user's navigation experience.
In this optional implementation, screening out, among all the pre-transfer routes, the pre-transfer routes that coincide with the near end of the preselected route relative to the transfer area removes the pre-transfer routes that do not coincide with the preselected route, and thereby removes the pre-transfer routes that head in the opposite direction to the preselected route. This provides a reliable basis for offering the object a preferable replaceable route and ensures the accuracy of route navigation.
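The screening step can be sketched as a simple proximity test, under the assumption that routes are polylines of (longitude, latitude) points and that "coinciding with the near end" is approximated by checking that the starting points of a pre-transfer route lie within a small tolerance of the near-end section of the preselected route; the tolerance and the planar distance approximation are illustrative choices, not part of the disclosure.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]  # (longitude, latitude)

def _close(p: Point, q: Point, tol_m: float = 10.0) -> bool:
    # Rough planar distance in meters; adequate for a short near-end segment.
    dx = (p[0] - q[0]) * 111_320 * math.cos(math.radians(p[1]))
    dy = (p[1] - q[1]) * 110_540
    return math.hypot(dx, dy) <= tol_m

def screen_by_near_end(pre_transfer_routes: List[List[Point]],
                       preselected_near_end: List[Point]) -> List[List[Point]]:
    """Keep only the pre-transfer routes whose starting (near-end) points
    coincide with the near end of the preselected route inside the transfer
    area, which filters out routes heading in the opposite direction."""
    kept = []
    for route in pre_transfer_routes:
        near_end = route[:len(preselected_near_end)]
        if all(any(_close(p, q) for q in preselected_near_end) for p in near_end):
            kept.append(route)
    return kept
```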
In this embodiment, the replaceable route may be determined based on the time consumption of the screened pre-transfer routes. In some optional implementations of this embodiment, determining, based on the screened pre-transfer routes, a replaceable route that differs from the preselected route from any point in the transfer area to the destination includes: calculating the time consumption of each screened pre-transfer route and of the preselected route; sorting the time consumption of all the screened pre-transfer routes and of the preselected route in ascending order; and determining, based on the ascending sort result, the replaceable routes among all the screened pre-transfer routes that differ from the preselected route.
In this optional implementation, determining, based on the ascending sort result, the replaceable routes among all the screened pre-transfer routes that differ from the preselected route includes: selecting the routes at the top preset positions of the ascending sort result (for example, the top 3), removing the preselected route from them, and taking the remaining pre-transfer routes as replaceable routes.
Optionally, determining the replaceable route among all the screened pre-transfer routes that differs from the preselected route based on the ascending sort result includes: judging whether the route with the shortest time consumption in the ascending sort result is the preselected route; and, in response to the route with the shortest time consumption not being the preselected route, taking the pre-transfer route with the shortest time consumption as a replaceable route.
In this optional implementation, by calculating the time consumption of all the screened pre-transfer routes and of the preselected route and sorting them in ascending order, the route with the shortest time consumption can be determined, which provides a reliable basis for determining the replaceable routes among all the screened pre-transfer routes that differ from the preselected route and ensures the reliability of the selection of the replaceable route.
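A sketch of the time-based selection is shown below, reusing the hypothetical Route structure from the earlier sketch (routes carry a duration_min attribute); both the top-k variant (for example, the top 3 positions) and the shortest-time variant are covered. The function names are illustrative.

```python
def pick_by_time(screened, preselected, top_k=3):
    """Sort the screened pre-transfer routes together with the preselected route
    by time consumption in ascending order, keep the top_k positions, and return
    the remaining pre-transfer routes as replaceable routes."""
    ranked = sorted(screened + [preselected], key=lambda r: r.duration_min)
    return [r for r in ranked[:top_k] if r is not preselected]

def shortest_if_not_preselected(screened, preselected):
    """Return the shortest-time pre-transfer route only if it beats the preselected route."""
    fastest = min(screened + [preselected], key=lambda r: r.duration_min)
    return None if fastest is preselected else fastest
```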
Optionally, determining, based on the screened pre-transfer routes, a replaceable route that differs from the preselected route from any point in the transfer area to the destination includes: calculating the time consumption of each screened pre-transfer route and of the preselected route; comparing the time consumption of each screened pre-transfer route with that of the preselected route; and taking a pre-transfer route whose time consumption is less than that of the preselected route as a replaceable route distinct from the preselected route.
In this embodiment, the replaceable route that differs from the preselected route may also be determined based on the lengths of the screened pre-transfer routes and the transfer type corresponding to each pre-transfer route. Determining, based on the screened pre-transfer routes, a replaceable route that differs from the preselected route from any point in the transfer area to the destination includes: calculating the equivalent length of each screened pre-transfer route and of the preselected route, where the equivalent length of each route corresponds to its respective transfer type; sorting all the equivalent lengths in ascending order; and determining, based on the sort result of all the equivalent lengths, the replaceable routes among all the screened pre-transfer routes that differ from the preselected route.
In this optional implementation, to allow a unified calculation over the pre-transfer routes and the preselected route, different transfer type coefficients are set for pre-transfer routes of different transfer types; for example, the walking transfer type has a coefficient of 1.0, the public transportation transfer type has a coefficient of 1.5, and the self-driving transfer type has a coefficient of 2.0. After the pre-transfer routes are obtained, the route length of each screened pre-transfer route is calculated and divided by the transfer type coefficient corresponding to that pre-transfer route, which yields the equivalent length corresponding to that screened pre-transfer route.
Similarly, since the preselected route is a determined route, its route length and preset type are known; the preset type belongs to the same set of types as the transfer types, that is, the preset type also has a transfer type coefficient, and the equivalent length corresponding to the preselected route is obtained by dividing the route length of the preselected route by the transfer type coefficient corresponding to the preselected route.
In this optional implementation, comparing the equivalent lengths of the screened pre-transfer routes and of the preselected route provides a reliable basis for determining the replaceable routes among all the screened pre-transfer routes that differ from the preselected route, and ensures the reliability of the selection of the replaceable route.
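The equivalent-length comparison can be sketched as follows, using the illustrative coefficients from the example above (walking 1.0, public transportation 1.5, self-driving 2.0); the Candidate structure and the rule of keeping candidates whose equivalent length is smaller than that of the preselected route are assumptions made for illustration.

```python
from collections import namedtuple

# Each candidate carries its length in meters and its transfer/navigation type.
Candidate = namedtuple("Candidate", ["name", "length_m", "nav_type"])

# Illustrative transfer-type coefficients from the example in the text.
TRANSFER_TYPE_COEFFICIENT = {"walking": 1.0, "public_transport": 1.5, "self_driving": 2.0}

def equivalent_length(route: Candidate) -> float:
    """Divide the route length by its transfer-type coefficient so that routes
    of different types can be compared on a single scale."""
    return route.length_m / TRANSFER_TYPE_COEFFICIENT[route.nav_type]

def pick_by_equivalent_length(screened, preselected):
    """Keep the screened pre-transfer routes whose equivalent length is smaller
    than that of the preselected route, in ascending order."""
    better = [r for r in screened if equivalent_length(r) < equivalent_length(preselected)]
    return sorted(better, key=equivalent_length)

# A 1800 m bus ride (equivalent 1200 m) beats a 1500 m walk (equivalent 1500 m).
walk = Candidate("preselected_walk", 1500, "walking")
bus = Candidate("bus_line_15", 1800, "public_transport")
print(pick_by_equivalent_length([bus], walk))
```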
In some optional implementations of the present embodiment, determining the transfer area and the transfer type corresponding to the transfer area based on the image and the current position includes: extracting, based on the image, a region identifier of the image, the region identifier being used to indicate the transfer area; determining the transfer area based on the region identifier and the current position; extracting, based on the image, the traffic signs in the image; and, in response to the traffic signs indicating different types of transfer paths, taking the type of transfer path corresponding to each traffic sign as a transfer type corresponding to the transfer area.
In this embodiment, traffic signs include indication signs, road signs, auxiliary signs, indicative marking lines, and the like. The sizes and rules of the various traffic signs have fixed requirements, and the types of transfer paths available relative to the current position of the object can be determined from the traffic signs in the image. Specifically, the transfer path may be of the walking, subway, bus, self-driving, or similar type.
In this embodiment, image regions with traffic sign features may be identified in the navigation image in real time by image recognition technology, so as to determine whether the navigation image contains a traffic sign. Recognizing the traffic signs in the navigation image in real time by image recognition technology includes: analyzing and judging whether the navigation image contains a traffic sign based on a specific image recognition algorithm. Image recognition algorithms include, but are not limited to, deep-learning-based target recognition algorithms such as Faster R-CNN (Faster Regions with CNN features), SSD (Single Shot MultiBox Detector), and YOLO (You Only Look Once), or other types of image target recognition algorithms.
In this optional implementation, after the image to be subjected to augmented reality is obtained, the region identifier in the image is identified, and the transfer area is determined from the region identifier and the current position; the traffic signs in the image are extracted, and the transfer type is determined from the types of the traffic signs. This provides a reliable basis for obtaining the transfer area and the transfer type and ensures the reliability of the augmented reality indication of the replaceable route. In this optional implementation, the region identifier may be an identifier such as a place name or a landmark.
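As one possible realization of the recognition step, the sketch below uses the off-the-shelf Faster R-CNN detector from torchvision; it assumes the model has been fine-tuned on traffic-sign classes, and the label-to-sign mapping and score threshold are hypothetical.

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor

# Hypothetical label map for a detector fine-tuned on traffic-sign classes.
LABEL_TO_SIGN = {1: "bus_stop_sign", 2: "subway_entrance_sign", 3: "crosswalk_marking"}

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def detect_traffic_signs(image, score_threshold: float = 0.6):
    """Run the detector on one camera frame (a PIL image) and return the
    recognized traffic-sign labels above the score threshold."""
    with torch.no_grad():
        prediction = model([to_tensor(image)])[0]
    return [LABEL_TO_SIGN[int(label)]
            for label, score in zip(prediction["labels"], prediction["scores"])
            if score >= score_threshold and int(label) in LABEL_TO_SIGN]
```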
Optionally, determining the transfer area and the transfer type corresponding to the transfer area based on the image and the current position includes: determining the transfer area based on the current position; and recognizing the image, extracting the traffic signs in the image, and, in response to determining that the traffic signs indicate different traffic types, taking all of those traffic types as transfer types.
In this optional implementation, the transfer type may be determined according to the traffic type indicated by the traffic signs, where the traffic type may be a walking type, a public transportation type, a self-driving type, a taxi or car-rental type, or the like.
Fig. 6 illustrates a flow chart 600 of another embodiment of a line navigation method according to the present disclosure, the line navigation method comprising:
step 601, acquiring a current position of a terminal and an image to be augmented reality.
Step 602, determining a transfer area and a transfer type corresponding to the transfer area based on the image and the current position.
Step 603, based on the transfer type, detects whether there is a replaceable route between the transfer area and the destination with respect to the preselected route.
Wherein the preselected route is a preselected, preset type of route between the current location and the destination.
In response to detecting the replaceable route, the replaceable route and the preselected route are indicated on the image by the augmented reality indication mark, step 604.
It should be understood that the operations and features in steps 601-604 described above correspond to the operations and features in steps 101-104, respectively, and thus the descriptions of the operations and features in steps 101-104 described above also apply to steps 601-604, and are not repeated herein.
Step 605, in response to the current position being located on the replaceable route and remaining on the replaceable route for a set duration, stopping indicating the preselected route in the image by the augmented reality indication mark.
In this embodiment, when the current position is located on the replaceable route and remains on the replaceable route for the set duration, it is determined that the object holding the terminal has selected the replaceable route; at this point, the preselected route may be replaced with the replaceable route, and the indication of the preselected route by the augmented reality indication mark is therefore stopped.
In this embodiment, the set duration may be determined according to the preset type of the preselected route; for example, the set duration is 10 seconds.
Alternatively, when the current position is located on the preselected route and remains on the preselected route for the set duration, it is determined that the object holding the terminal keeps navigating on the preselected route; at this point, the replaceable route may be discarded, and the indication of the replaceable route by the augmented reality indication mark is stopped.
Alternatively, before stopping the indication of the replaceable route, the replaceable route may continue to be indicated by the augmented reality indication mark for a predetermined time (e.g., 5 seconds), so as to give the object time to fully reconsider its route.
Optionally, the line navigation method further comprises: detecting whether the current position is located outside the transfer area; and, if the current position is detected to be located outside the transfer area, stopping indicating the replaceable route in the image by the augmented reality indication mark.
In this embodiment, by stopping the indication of the preselected route in the image by the augmented reality indication mark after the current position has been located on the replaceable route for the set duration, the intention of the user can be determined reliably, which provides a reliable basis for automatically displaying, and stopping the display of, multiple types of replaceable routes, and further improves the user experience.
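The dwell-time logic of step 605 can be sketched as a small monitor, assuming the ArOverlay object from the earlier sketch and a set duration of 10 seconds; the class and method names are illustrative.

```python
import time

class RouteSwitchMonitor:
    """Stops indicating the preselected route once the object has stayed on
    the replaceable route for the set duration (for example 10 seconds)."""

    def __init__(self, overlay, dwell_seconds: float = 10.0) -> None:
        self.overlay = overlay            # e.g. the ArOverlay sketched earlier
        self.dwell_seconds = dwell_seconds
        self._entered_at = None           # time at which the replaceable route was entered

    def update(self, on_replaceable_route: bool, now: float = None) -> None:
        now = time.monotonic() if now is None else now
        if not on_replaceable_route:
            self._entered_at = None       # left the replaceable route: reset the timer
            return
        if self._entered_at is None:
            self._entered_at = now
        elif now - self._entered_at >= self.dwell_seconds:
            # The object is treated as having adopted the replaceable route.
            self.overlay.stop_indicating("preselected")
```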
With further reference to fig. 7, as an implementation of the method shown in the foregoing figures, the present disclosure provides an embodiment of a line navigation apparatus, which corresponds to the method embodiment shown in fig. 1, and which is particularly applicable to various electronic devices.
As shown in fig. 7, the line navigation apparatus 700 provided in the present embodiment includes: an acquisition unit 701, a determination unit 702, a detection unit 703, and an indication unit 704. The acquisition unit 701 may be configured to acquire the current position of the terminal and an image to be subjected to augmented reality. The determination unit 702 may be configured to determine the transfer area and the transfer type corresponding to the transfer area based on the image and the current position. The detection unit 703 may be configured to detect, based on the transfer type, whether there is a replaceable route between the transfer area and the destination relative to a preselected route, the preselected route being a preselected route of a preset type between the current position and the destination. The indication unit 704 may be configured to indicate, in response to detecting the replaceable route, the replaceable route and the preselected route on the image by augmented reality indication marks.
In the present embodiment, in the line navigation apparatus 700: the specific processing of the obtaining unit 701, the determining unit 702, the detecting unit 703, and the indicating unit 704 and the technical effects thereof may refer to the descriptions related to step 101, step 102, step 103, and step 104 in the corresponding embodiment of fig. 1, and are not described herein.
In some optional implementations of this embodiment, the detection unit 703 includes: a determining module (not shown), a screening module (not shown), and a differentiating module (not shown). The determining module may be configured to determine pre-transfer routes belonging to the transfer type from any point in the transfer area to the destination. The screening module may be configured to screen out, among all the pre-transfer routes, the pre-transfer routes that coincide with the near end of the preselected route relative to the transfer area. The differentiating module may be configured to determine, based on the screened pre-transfer routes, a replaceable route that differs from the preselected route from any point in the transfer area to the destination.
In some optional implementations of this embodiment, the differentiating module includes: a calculation sub-module (not shown), a sorting sub-module (not shown), and a selection sub-module (not shown). The calculation sub-module may be configured to calculate the time consumption of each screened pre-transfer route and of the preselected route. The sorting sub-module may be configured to sort the time consumption of all the screened pre-transfer routes and of the preselected route in ascending order. The selection sub-module may be configured to determine, based on the ascending sort result, a replaceable route among all the screened pre-transfer routes that differs from the preselected route.
In some optional implementations of this embodiment, the differentiating module includes: an averaging sub-module (not shown), an ascending sub-module (not shown), and a selection sub-module (not shown). The averaging sub-module may be configured to calculate the equivalent length of each screened pre-transfer route and of the preselected route, the equivalent lengths corresponding to the respective transfer types. The ascending sub-module may be configured to sort all the equivalent lengths in ascending order. The selection sub-module may be configured to determine, based on the sort result of all the equivalent lengths, a replaceable route among all the screened pre-transfer routes that differs from the preselected route.
In some optional implementations of the present embodiment, the determination unit 702 includes: an extraction module (not shown), a localization module (not shown), an identification module (not shown), and a transfer module (not shown). The extraction module may be configured to extract, based on the image, a region identifier of the image, the region identifier being used to indicate the transfer area. The localization module may be configured to determine the transfer area based on the region identifier and the current position. The identification module may be configured to extract, based on the image, the traffic signs in the image. The transfer module may be configured to, in response to the traffic signs indicating different types of transfer paths, take the types of transfer paths corresponding to the respective traffic signs as the transfer types corresponding to the transfer area.
In some optional implementations of this embodiment, the apparatus 700 further includes: a stopping unit (not shown in the figure). The stopping unit may be configured to stop indicating the preselected route in the image by the augmented reality indication mark in response to the current position being located on the replaceable route and remaining on the replaceable route for a set duration.
In the line navigation apparatus provided by the embodiments of the present disclosure, first, the acquisition unit 701 acquires the current position of the terminal and an image to be subjected to augmented reality; next, the determination unit 702 determines the transfer area and the transfer type corresponding to the transfer area based on the image and the current position; then, the detection unit 703 detects, based on the transfer type, whether there is a replaceable route between the transfer area and the destination relative to a preselected route, the preselected route being a preselected route of a preset type between the current position and the destination; finally, in response to detecting the replaceable route, the indication unit 704 indicates the replaceable route and the preselected route on the image by augmented reality indication marks. Thus, while the user travels on the current preselected route, a replaceable route is determined based on the acquired image to be subjected to augmented reality display, and an indication mark of the replaceable route is added on the image, so that travel schemes are recommended intelligently without the user exiting the current augmented reality navigation state, which improves the user experience.
In the technical solution of the present disclosure, the acquisition, storage, application, and other processing of the user personal information involved all comply with the provisions of relevant laws and regulations, and do not violate public order and good customs.
According to embodiments of the present disclosure, the present disclosure also provides an electronic device, a readable storage medium and a computer program product.
Fig. 8 illustrates a schematic block diagram of an example electronic device 800 that may be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 8, the device 800 includes a computing unit 801 that can perform various appropriate actions and processes according to a computer program stored in a read-only memory (ROM) 802 or a computer program loaded from a storage unit 808 into a random access memory (RAM) 803. In the RAM 803, various programs and data required for the operation of the device 800 can also be stored. The computing unit 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
Various components in device 800 are connected to I/O interface 805, including: an input unit 806 such as a keyboard, mouse, etc.; an output unit 807 such as various types of displays, speakers, and the like; a storage unit 808, such as a magnetic disk, optical disk, etc.; and a communication unit 809, such as a network card, modem, wireless communication transceiver, or the like. The communication unit 809 allows the device 800 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The computing unit 801 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of computing unit 801 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, etc. The computing unit 801 performs the various methods and processes described above, such as a line navigation method. For example, in some embodiments, the line navigation method may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the storage unit 808. In some embodiments, part or all of the computer program may be loaded and/or installed onto device 800 via ROM 802 and/or communication unit 809. When a computer program is loaded into RAM 803 and executed by computing unit 801, one or more steps of the line navigation method described above may be performed. Alternatively, in other embodiments, the computing unit 801 may be configured to perform the line navigation method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable line navigation apparatus, such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local area networks (LANs), wide area networks (WANs), and the internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server incorporating a blockchain.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps recited in the present disclosure may be performed in parallel, sequentially, or in a different order, provided that the desired results of the disclosed aspects are achieved, and are not limited herein.
The above detailed description should not be taken as limiting the scope of the present disclosure. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present disclosure are intended to be included within the scope of the present disclosure.

Claims (12)

1. A line navigation method, the method comprising:
acquiring the current position of a terminal and an image to be subjected to augmented reality;
determining a transfer area and a transfer type corresponding to the transfer area based on the image and the current position;
detecting, based on the transfer type, whether there is a replaceable route, relative to a preselected route, between the transfer area and a destination, the preselected route being a preselected route of a preset type between the current position and the destination; and
in response to detecting a replaceable route, indicating the replaceable route and the preselected route on the image by an augmented reality indication mark;
wherein the detecting, based on the transfer type, whether there is a replaceable route, relative to a preselected route, between the transfer area and a destination comprises:
determining pre-transfer routes belonging to the transfer type from any point in the transfer area to the destination;
screening out the pre-transfer routes that coincide with the proximal end, relative to the transfer area, of the preselected route; and
determining, based on the screened pre-transfer routes, a replaceable route from any point in the transfer area to the destination that differs from the preselected route.
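For illustration only, the screening and differentiation recited in claim 1 might be sketched as follows; the `Route` record, the `coincides_with_proximal_end` test, and the point-based comparison are assumptions made for the sketch, not the claimed implementation.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Route:
    transfer_type: str  # e.g. "subway", "bus", "walk"
    points: List[Tuple[float, float]] = field(default_factory=list)  # ordered (lat, lon) pairs

def coincides_with_proximal_end(candidate: Route, preselected: Route, overlap: int = 3) -> bool:
    # Hypothetical test: the candidate shares the first few points of the
    # preselected route on the side nearest the transfer area.
    return candidate.points[:overlap] == preselected.points[:overlap]

def detect_replaceable_routes(pre_transfer_routes, preselected, transfer_type):
    """Keep pre-transfer routes of the given transfer type whose proximal end
    coincides with that of the preselected route, then drop any that do not
    actually differ from the preselected route."""
    screened = [
        r for r in pre_transfer_routes
        if r.transfer_type == transfer_type and coincides_with_proximal_end(r, preselected)
    ]
    return [r for r in screened if r.points != preselected.points]
```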
2. The method of claim 1, wherein the determining, based on the screened pre-transfer routes, a replaceable route from any point in the transfer area to the destination that differs from the preselected route comprises:
calculating the time consumption of each screened pre-transfer route and of the preselected route;
sorting the time consumptions of all the screened pre-transfer routes and the preselected route in ascending order; and
determining, based on the ascending sort result, a replaceable route among all the screened pre-transfer routes that differs from the preselected route.
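A minimal sketch of claim 2's time-consumption ranking, assuming a hypothetical `estimated_duration` helper that returns a travel time in seconds:

```python
def pick_by_time(screened_routes, preselected, estimated_duration):
    """Sort the screened routes together with the preselected route in
    ascending order of time consumption and return the best-ranked route
    that differs from the preselected one."""
    ranked = sorted(screened_routes + [preselected], key=estimated_duration)
    for route in ranked:
        if route is not preselected:
            return route
    return None  # no screened routes were supplied
```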
3. The method of claim 1, wherein the determining, based on the screened pre-transfer routes, a replaceable route from any point in the transfer area to the destination that differs from the preselected route comprises:
calculating an equivalent average length for each screened pre-transfer route and for the preselected route, each equivalent average length corresponding to the respective transfer type;
sorting all the equivalent average lengths in ascending order; and
determining, based on the sorting result of all the equivalent average lengths, a replaceable route among all the screened pre-transfer routes that differs from the preselected route.
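Claim 3 leaves "equivalent average length" to the description; one plausible reading, used purely for illustration below, normalizes each route's length by a per-transfer-type factor such as a nominal speed so that routes of different transfer types can be sorted on one scale. The speed table and the `length_km_of` helper are assumptions.

```python
# Illustrative normalization factors (km/h); the values are assumptions.
NOMINAL_SPEED = {"walk": 5.0, "bus": 20.0, "subway": 35.0}

def equivalent_average_length(route, length_km):
    # Normalize the physical length by a per-transfer-type nominal speed so
    # that routes of different transfer types share one comparable scale.
    return length_km / NOMINAL_SPEED.get(route.transfer_type, 5.0)

def pick_by_equivalent_length(screened_routes, preselected, length_km_of):
    """length_km_of(route) is an assumed helper returning the route length in km."""
    ranked = sorted(
        screened_routes + [preselected],
        key=lambda r: equivalent_average_length(r, length_km_of(r)),
    )
    return next((r for r in ranked if r is not preselected), None)
```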
4. The method of claim 1, wherein the determining a transfer area and a transfer type corresponding to the transfer area based on the image and the current position comprises:
extracting a region identifier from the image, the region identifier being used to indicate a transfer area;
determining the transfer area based on the region identifier and the current position;
extracting traffic identifiers from the image; and
in response to the traffic identifiers indicating different types of transfer paths, taking the type of the transfer path corresponding to each traffic identifier as a transfer type corresponding to the transfer area.
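An illustrative sketch of claim 4's image-based steps, with a hypothetical `detect_labels` recognition helper standing in for whatever model the implementation actually uses; the label names and the traffic-identifier-to-transfer-type mapping are likewise assumed.

```python
# Hypothetical mapping from recognized traffic identifiers to transfer types.
TRAFFIC_ID_TO_TRANSFER_TYPE = {
    "subway_entrance_sign": "subway",
    "bus_stop_sign": "bus",
    "taxi_stand_sign": "taxi",
}

def determine_transfer_area_and_types(image, current_position, detect_labels):
    """detect_labels(image) is an assumed recognition helper returning label
    strings found in the frame, e.g. region identifiers prefixed with
    'region:' plus traffic-sign labels."""
    labels = detect_labels(image)
    region_ids = [label for label in labels if label.startswith("region:")]
    # Simplified placeholder: anchor the transfer area at the current position
    # whenever a region identifier is visible in the frame.
    transfer_area = {"center": current_position, "ids": region_ids} if region_ids else None
    transfer_types = sorted(
        {TRAFFIC_ID_TO_TRANSFER_TYPE[label] for label in labels if label in TRAFFIC_ID_TO_TRANSFER_TYPE}
    )
    return transfer_area, transfer_types
```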
5. The method of claim 1, the method further comprising:
in response to the current position being located on the replaceable route for a set duration, ceasing to indicate the preselected route on the image by the augmented reality indication mark.
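Claim 5's behavior, ceasing to indicate the preselected route once the user has stayed on the replaceable route for a set duration, can be expressed with a simple hold timer; the 30-second default below is an assumed placeholder.

```python
import time

class PreselectedRouteMarker:
    """Hide the preselected-route AR marker once the current position has
    stayed on the replaceable route for `hold_seconds` (an assumed value)."""

    def __init__(self, hold_seconds: float = 30.0):
        self.hold_seconds = hold_seconds
        self._entered_at = None
        self.visible = True

    def update(self, on_replaceable_route: bool, now: float = None):
        now = time.monotonic() if now is None else now
        if not on_replaceable_route:
            self._entered_at = None        # left the replaceable route: reset the timer
        elif self._entered_at is None:
            self._entered_at = now         # just moved onto the replaceable route
        elif now - self._entered_at >= self.hold_seconds:
            self.visible = False           # stop indicating the preselected route
```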
6. A line navigation apparatus, the apparatus comprising:
an acquisition unit configured to acquire a current position of a terminal and an image to be subjected to augmented reality;
a determining unit configured to determine a transfer area and a transfer type corresponding to the transfer area based on the image and the current position;
a detection unit configured to detect, based on the transfer type, whether there is a replaceable route, relative to a preselected route, between the transfer area and a destination, the preselected route being a preselected route of a preset type between the current position and the destination; and
an indication unit configured to indicate, in response to detecting a replaceable route, the replaceable route and the preselected route on the image by an augmented reality indication mark;
wherein the detection unit comprises:
a determining module configured to determine pre-transfer routes belonging to the transfer type from any point in the transfer area to the destination;
a screening module configured to screen out the pre-transfer routes that coincide with the proximal end, relative to the transfer area, of the preselected route; and
a differentiating module configured to determine, based on the screened pre-transfer routes, a replaceable route from any point in the transfer area to the destination that differs from the preselected route.
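A structural sketch of how the units of claim 6 might be composed; only the wiring is shown, and every callable name is an assumption rather than the patented code.

```python
class RouteNavigationApparatus:
    """Structural sketch only: each unit is injected as a plain callable, and
    the implementations behind them are assumptions, not the patented code."""

    def __init__(self, acquire, determine, detect, indicate):
        self.acquire = acquire      # () -> (current_position, image)
        self.determine = determine  # (image, position) -> (transfer_area, transfer_type)
        self.detect = detect        # (area, type, destination, preselected) -> route or None
        self.indicate = indicate    # draws the AR indication marks on the image

    def step(self, destination, preselected_route):
        position, image = self.acquire()
        transfer_area, transfer_type = self.determine(image, position)
        replaceable = self.detect(transfer_area, transfer_type, destination, preselected_route)
        if replaceable is not None:
            self.indicate(image, replaceable, preselected_route)
        return replaceable
```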
7. The apparatus of claim 6, wherein the differentiating module comprises:
a calculation sub-module configured to calculate the time consumption of each screened pre-transfer route and of the preselected route;
a sorting sub-module configured to sort the time consumptions of all the screened pre-transfer routes and the preselected route in ascending order; and
a selection sub-module configured to determine, based on the ascending sort result, a replaceable route among all the screened pre-transfer routes that differs from the preselected route.
8. The apparatus of claim 6, wherein the differentiating module comprises:
an averaging sub-module configured to calculate an equivalent average length for each screened pre-transfer route and for the preselected route, each equivalent average length corresponding to the respective transfer type;
an ascending sub-module configured to sort all the equivalent average lengths in ascending order; and
a selection sub-module configured to determine, based on the sorting result of all the equivalent average lengths, a replaceable route among all the screened pre-transfer routes that differs from the preselected route.
9. The apparatus of claim 6, wherein the determining unit comprises:
an extraction module configured to extract a region identifier from the image, the region identifier being used to indicate a transfer area;
a positioning module configured to determine the transfer area based on the region identifier and the current position;
an identification module configured to extract traffic identifiers from the image; and
a transfer module configured to, in response to the traffic identifiers indicating different types of transfer paths, take the type of the transfer path corresponding to each traffic identifier as a transfer type corresponding to the transfer area.
10. The apparatus of claim 6, the apparatus further comprising:
a stop-display unit configured to cease indicating the preselected route on the image by the augmented reality indication mark in response to the current position being located on the replaceable route for a set duration.
11. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-5.
12. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1-5.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110831980.XA CN113449211B (en) 2021-07-22 2021-07-22 Line navigation method and device, electronic equipment and computer readable medium

Publications (2)

Publication Number Publication Date
CN113449211A CN113449211A (en) 2021-09-28
CN113449211B (en) 2023-09-19

Family

ID=77817032

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110831980.XA Active CN113449211B (en) 2021-07-22 2021-07-22 Line navigation method and device, electronic equipment and computer readable medium

Country Status (1)

Country Link
CN (1) CN113449211B (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11354728B2 (en) * 2019-03-24 2022-06-07 We.R Augmented Reality Cloud Ltd. System, device, and method of augmented reality based mapping of a venue and navigation within a venue

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120002858A (en) * 2010-07-01 2012-01-09 현대엠엔소프트 주식회사 A navigation apparatus and method for guiding transfer course
CN104019808A (en) * 2013-02-28 2014-09-03 联想(北京)有限公司 Navigation mode switching method and device
CN105136160A (en) * 2015-07-28 2015-12-09 浙江工业大学 Mobile terminal-augmented reality technology-based close range optimal bus station navigation method
CN107576332A (en) * 2016-07-04 2018-01-12 百度在线网络技术(北京)有限公司 A kind of method and apparatus of transfering navigation
KR20180117866A (en) * 2017-04-20 2018-10-30 김현우 Navigation system by using augmented reality
CN107478237A (en) * 2017-06-29 2017-12-15 百度在线网络技术(北京)有限公司 Real scene navigation method, device, equipment and computer-readable recording medium
CN110223199A (en) * 2018-08-01 2019-09-10 郎启红 A kind of application system and its implementation based on position simulation real community
KR101935040B1 (en) * 2018-09-12 2019-01-03 이현수 Method and system for providing speech recognition based route guidance service for public transportation
CN109297505A (en) * 2018-12-03 2019-02-01 深圳创维汽车智能有限公司 AR air navigation aid, car-mounted terminal and computer readable storage medium
CN110455303A (en) * 2019-08-05 2019-11-15 深圳市大拿科技有限公司 AR air navigation aid, device and the AR navigation terminal suitable for vehicle
KR20210076755A (en) * 2019-12-16 2021-06-24 건국대학교 글로컬산학협력단 Apparatus and method for augmented reality based road guidance
CN112163701A (en) * 2020-09-23 2021-01-01 佳都新太科技股份有限公司 Station hub transfer management method and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A New Approach for Navigation and Traffic Signs Indication Using Map Integrated Augmented Reality for Self-Driving Cars; Deore H et al.; Scalable Computing: Practice and Experience; full text *
Research on the application of augmented reality technology to Changsha public transit; 唐笑; 今日*** (06); full text *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant