CN108827307B - Navigation method, navigation device, terminal and computer readable storage medium


Info

Publication number
CN108827307B
CN108827307B
Authority
CN
China
Prior art keywords
navigation
information
marker information
point position
starting point
Prior art date
Legal status
Expired - Fee Related
Application number
CN201810572735.XA
Other languages
Chinese (zh)
Other versions
CN108827307A (en)
Inventor
廖新风
Current Assignee
Oppo Chongqing Intelligent Technology Co Ltd
Original Assignee
Oppo Chongqing Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Oppo Chongqing Intelligent Technology Co Ltd filed Critical Oppo Chongqing Intelligent Technology Co Ltd
Priority to CN201810572735.XA
Publication of CN108827307A
Application granted
Publication of CN108827307B

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations

Abstract

The application belongs to the technical field of mobile terminals, and particularly relates to a navigation method, a navigation device, a terminal, and a computer-readable storage medium. The method includes: acquiring image information of the environment where a user is currently located, and identifying first marker information in the image information; determining a first starting point position according to the first marker information, where the first starting point position indicates the user's current position; acquiring an end point position input by the user; and acquiring navigation route information according to the first starting point position and the end point position, and navigating according to the navigation route information, where the navigation route information carries second marker information associated with the corresponding positions of the navigation path in the navigation route information. This solves the technical problem that environmental interference with navigation signals degrades navigation accuracy, and achieves accurate navigation in environments with poor communication signals.

Description

Navigation method, navigation device, terminal and computer readable storage medium
Technical Field
The present application belongs to the technical field of mobile terminals, and in particular relates to a navigation method, a navigation device, a terminal, and a computer-readable storage medium.
Background
The mobile terminal is the most widely used communication tool today, and its navigation function has become an indispensable aid in modern society.
However, when the mobile terminal implements the navigation function using signals such as Global Positioning System (GPS), Bluetooth, WIFI, and 2G/3G/4G/5G, the navigation signal may be interfered with by the environment, which affects the accuracy of navigation.
Disclosure of Invention
The embodiment of the application provides a navigation method, a navigation device, a terminal and a computer readable storage medium, which can solve the technical problem that navigation accuracy is influenced by environmental interference of navigation signals.
A first aspect of an embodiment of the present application provides a navigation method, including:
acquiring image information of a current environment of a user, and identifying first marker information in the image information;
determining a first starting point position according to the first marker information; the first starting point position is used for indicating the current position of the user;
acquiring an end point position input by a user;
acquiring navigation route information according to the first starting point position and the end point position, and navigating according to the navigation route information; the navigation route information carries second marker information associated with a position corresponding to a navigation path in the navigation route information.
A second aspect of an embodiment of the present application provides a navigation device, including:
the acquisition unit is used for acquiring image information of the current environment of a user and identifying first marker information in the image information;
the determining unit is used for determining a first starting point position according to the first marker information; the first starting point position is used for indicating the current position of the user;
the input unit is used for acquiring an end point position input by a user;
and the navigation unit is used for acquiring navigation route information according to the first starting point position and the end point position and navigating according to the navigation route information, wherein the navigation route information carries second marker information associated with a position corresponding to a navigation path in the navigation route information.
A third aspect of the embodiments of the present application provides a terminal, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the method when executing the computer program.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the steps of the above method.
In the embodiments of the application, the user's current position, namely the first starting point position, is obtained by acquiring image information of the environment where the user is currently located and identifying the first marker information in that image information; that is, locating the user requires no GPS, Bluetooth, WIFI, or 2G/3G/4G/5G signals. Likewise, navigation itself is performed using the navigation route information carrying the second marker information, again without such signals. Accurate navigation can therefore still be achieved in environments where GPS, Bluetooth, WIFI, or 2G/3G/4G/5G signals are poor, solving the technical problem that environmental interference with navigation signals degrades navigation accuracy.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings required by the embodiments are briefly described below. The following drawings illustrate only some embodiments of the present application and should not be regarded as limiting its scope; those skilled in the art can derive other related drawings from them without inventive effort.
Fig. 1 is a schematic flow chart illustrating an implementation process of a navigation method according to an embodiment of the present application;
fig. 2 is a flowchart illustrating an implementation of step 102 of a navigation method according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of another specific implementation of step 102 of a navigation method according to an embodiment of the present application;
FIG. 4 is a schematic diagram for determining a current location of a user on a navigation application interface according to an embodiment of the present application;
fig. 5 is a flowchart illustrating a detailed implementation of step 104 of a navigation method according to an embodiment of the present application;
FIG. 6 is a schematic diagram illustrating displaying navigation path information on a navigation application interface according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a navigation device provided in an embodiment of the present application;
fig. 8 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to a determination", or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [described condition or event]", or "in response to detecting [described condition or event]".
In order to explain the technical solution of the present application, the following description will be given by way of specific examples.
In some environments with poor signal, implementing the navigation function with GPS, Bluetooth, WIFI, or 2G/3G/4G/5G signals can result in poor navigation accuracy.
As outlined above, in the embodiments of the application the user's current position (the first starting point position) is obtained by acquiring image information of the user's current environment and identifying the first marker information in it, and navigation is performed with navigation route information carrying second marker information. Neither step requires GPS, Bluetooth, WIFI, or 2G/3G/4G/5G signals, so accurate navigation remains possible where such signals are poor.
Fig. 1 is a schematic diagram of a first implementation flow of the navigation method provided by an embodiment of the present application. The method is applied to a terminal and can be executed by a navigation device configured on the terminal; it is suited to improving navigation accuracy in environments with poor communication signals. The terminal may be a device such as a smartphone, a tablet computer, a Personal Digital Assistant (PDA), a learning machine, or an electronic watch. The navigation method may include steps 101 to 104.
In step 101, image information of an environment where a user is currently located is acquired, and first marker information in the image information is identified.
In practical applications, when a user needs navigation data, a starting point position and an end point position are generally acquired first, so that navigation route information from the starting point position to the end point position can be determined from them.
In the embodiments of the application, the user's current position is obtained by acquiring image information of the user's current environment and identifying the first marker information in it, so the mobile terminal can still locate the user accurately where the communication signal is weak or absent, or where indoor interference with the communication signal would otherwise sharply reduce navigation accuracy.
Identifying the first marker information in the image information includes performing target detection on the image information: the foreground and background are classified at the pixel level, the background is removed, and one or more target objects, i.e., one or more pieces of first marker information, are retained.
In some embodiments of the present application, the method for detecting the target includes performing target detection on the image information by using a trained convolutional neural network model, and obtaining a detection result.
The trained convolutional neural network model is obtained by training on sample pictures in a database together with the detection result corresponding to each sample picture, where each detection result indicates all target objects, i.e., all first marker information, contained in that sample picture. The first marker information may be target objects that serve as landmarks and do not move frequently, such as fixed structures in a passageway (railings, windows, and the like) and ornaments, text, or patterns that seldom change within the passageway, for example specific text or patterns printed on a wall, such as the text or pattern of a merchant's billboard.
Optionally, training the convolutional neural network model may include: acquiring a sample image and its corresponding detection result; detecting the sample image with the convolutional neural network model; and adjusting the model's parameters according to the detection result until the adjusted model detects all target objects in the sample image, or detects target objects with an accuracy greater than a preset value, at which point the adjusted model is taken as the trained convolutional neural network model. The parameters may include the weights, biases, and regression-function coefficients of each convolutional layer, and may further include the learning rate, the number of iterations, the number of neurons per layer, and the like. Conventional models include RCNN (Region-based Convolutional Neural Network), Fast-RCNN, and Faster-RCNN. Faster-RCNN evolved from the RCNN and Fast-RCNN models; although it still cannot detect target objects in real time, it offers higher target detection accuracy and speed than the other two, so in some embodiments of the application Faster-RCNN is the preferred convolutional neural network model.
It should be noted that the above target detection method is only exemplified herein, and is not meant to limit the scope of the present application, and other methods capable of achieving target detection are also applicable to the present application, and are not listed herein.
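As an illustration only, the following is a minimal sketch of marker detection with torchvision's Faster R-CNN. The score threshold and the assumption that the model has been fine-tuned on marker classes (railings, windows, billboards, and so on) are ours, not the patent's; here the generic pre-trained weights stand in for such a model.

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# Stand-in for a model fine-tuned on marker classes.
model = fasterrcnn_resnet50_fpn(pretrained=True)
model.eval()

def detect_markers(image, score_threshold=0.8):
    """image: a 3xHxW float tensor scaled to [0, 1].
    Returns (boxes, labels) for detections above the score threshold."""
    with torch.no_grad():
        pred = model([image])[0]  # dict with 'boxes', 'labels', 'scores'
    keep = pred["scores"] > score_threshold
    return pred["boxes"][keep], pred["labels"][keep]
```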
Step 102, determining a first starting point position according to the first marker information; the first starting point position is used for indicating the current position of the user.
That is to say, in the embodiment of the present application, after the image information of the current environment where the user is located is obtained, and the first marker information in the image information is identified, the first starting point position of the user may be determined through the first marker information.
In some embodiments of the present application, determining a first starting point location from the first marker information comprises: and matching the first marker information with third marker information carried in a pre-stored navigation map, and taking a position corresponding to the third marker information successfully matched with the first marker information as the first starting point position.
For example, the user holds up a camera-equipped mobile terminal such as a mobile phone and photographs surrounding objects, thereby acquiring image information of the current environment. The terminal then identifies the first marker information in the image information, matches it with the third marker information carried in the pre-stored navigation map, and takes the position corresponding to the successfully matched third marker information as the first starting point position, thus determining the user's current position from the first marker information and the navigation map.
The pre-stored navigation map may be an offline navigation map acquired in advance; it may be a video navigation map composed of a number of picture frames, and it carries the third marker information. When the communication signal is weak or absent, or when indoor interference would sharply reduce navigation accuracy, the user's current position can still be determined by acquiring image information of the current environment and matching the first marker information in it against the third marker information carried in the offline navigation map acquired in advance.
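By way of illustration, here is a minimal sketch of this matching step under our own assumptions: markers are represented by feature descriptors (the patent does not fix a representation), and matching is cosine similarity against a threshold.

```python
import numpy as np

def match_start_position(first_markers, map_markers, threshold=0.9):
    """first_markers: descriptor vectors recognized in the captured image.
    map_markers: (descriptor, position) pairs from the offline navigation map.
    Returns the position of the best match above the threshold, or None."""
    best_pos, best_sim = None, threshold
    for desc in first_markers:
        for map_desc, pos in map_markers:
            sim = float(np.dot(desc, map_desc) /
                        (np.linalg.norm(desc) * np.linalg.norm(map_desc)))
            if sim > best_sim:
                best_sim, best_pos = sim, pos
    return best_pos
```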
In step 103, the end position input by the user is obtained.
In practical applications, when a user needs to acquire navigation data, a start position and an end position generally need to be acquired first, so as to determine navigation route information from the start position to the end position according to the start position and the end position.
In the embodiments of the application, acquiring the end point position input by the user includes the user entering the end point position on the mobile terminal by voice or text input, or entering it directly on the navigation application interface by selecting a map position in the navigation map.
Step 104, acquiring navigation route information according to the first starting point position and the end point position, and navigating according to the navigation route information; the navigation route information carries second marker information associated with a position corresponding to a navigation path in the navigation route information.
After the user's current position (the first starting point position) and the end point position input by the user are obtained, a navigation path from the first starting point position to the end point position can be calculated from the passing paths connecting the two positions, and second marker information associated with the navigation path is attached at the corresponding positions along the path.
In the embodiments of the application, the user's current position, namely the first starting point position, is obtained by acquiring image information of the user's current environment and identifying the first marker information in it, so locating the user requires no GPS, Bluetooth, WIFI, or 2G/3G/4G/5G signals; likewise, navigation uses the navigation route information carrying the second marker information and needs no such signals. Accurate navigation can therefore still be achieved, via the acquired navigation route information carrying the second marker information, in environments where those signals are poor.
In some embodiments of the present application, as shown in fig. 2, the step of matching the first marker information with the third marker information carried in a pre-stored navigation map and taking the position corresponding to the successfully matched third marker information as the first starting point position may include steps 201 to 202, which reduce the amount of data involved in matching the first marker information against the third marker information carried in the pre-stored navigation map.
In step 201, a second starting point position is obtained according to the received wireless signal, where the second starting point position is used to indicate a position where the user is most likely to be located currently.
Although poor GPS, Bluetooth, WIFI, or 2G/3G/4G/5G signals may not support accurate continuous navigation, they can still provide a coarse fix of the starting point. A second starting point position may therefore be obtained from the received wireless signal, where the second starting point position indicates where the user is most likely to be at present.
In step 202, the first marker information is matched with third marker information within a preset distance range from the second starting point position, and a position corresponding to the third marker information successfully matched with the first marker information is taken as the first starting point position.
After the second starting point position is obtained, the matching range of the first marker information and the third marker information can be narrowed according to the second starting point position, and the first marker information is matched with the third marker information within a preset distance range from the second starting point position, so that the accurate current position of the user is obtained.
In an embodiment of the present application, the matching range between the first marker information and the third marker information is reduced by selecting only the third marker information within the preset distance range of the second starting point position and matching the first marker information against that subset.
In some embodiments of the present application, the preset distance range may be a circular range whose radius is a preset distance, and the preset distance may be determined according to the actual application scenario. For example, if the application scenario is indoor navigation, there are more third markers within a unit distance (e.g., 10 m), so the preset distance may be set smaller, for example 100 m to 300 m; if the application scenario is outdoor navigation, fewer third markers are involved within a unit distance (e.g., 10 m), so the preset distance may be set larger, for example 300 m to 1000 m.
Optionally, if the first marker information is matched against the third marker information within the preset distance range of the second starting point position and no third marker information matches successfully, the preset distance range is doubled, and the doubling repeats until the range reaches a maximum preset multiple.
The maximum preset multiple can be determined according to the actual application scenario. For example, if the application scenario is indoor navigation, the maximum preset multiple may be set larger, for example 5 to 10 times; if the application scenario is outdoor navigation, it may be set smaller, for example 3 to 5 times.
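A minimal sketch of this radius-limited matching with doubling expansion, reusing the hypothetical match_start_position helper from the earlier sketch; the planar distance and the data layout are our assumptions.

```python
import math

def planar_distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def match_within_radius(first_markers, map_markers, coarse_pos,
                        radius_m=200.0, max_multiple=5):
    """Match only against map markers near the coarse (second) starting point,
    doubling the radius on failure up to radius_m * max_multiple."""
    r = radius_m
    while r <= radius_m * max_multiple:
        nearby = [(d, p) for d, p in map_markers
                  if planar_distance(p, coarse_pos) <= r]
        pos = match_start_position(first_markers, nearby)
        if pos is not None:
            return pos
        r *= 2  # no match: expand the preset distance range
    return None
```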
The first marker information contained in the image information of the user's current environment may comprise several pieces of sub-marker information, and the third marker information carried in the pre-stored navigation map may likewise comprise several pieces. A piece of sub-marker information in the first marker information may therefore match sub-marker information in more than one piece of third marker information, so that the user's current position cannot be uniquely determined.
Therefore, in some embodiments of the present application, matching the first marker information with the third marker information carried in the pre-stored navigation map, and taking the position corresponding to the third marker information successfully matched with the first marker information as the first starting point position, includes: matching the sub-marker information in the first marker information against the sub-marker information in the third marker information in sequence, screening out the third marker information uniquely matched with the first marker information, and taking the position corresponding to that uniquely matched third marker information as the first starting point position.
For example, the first piece of sub-marker information in the first marker information is matched against all sub-marker information in the third marker information; if exactly one piece of third marker information contains sub-marker information that matches it, the position corresponding to that third marker information is taken as the first starting point position.
If sub-marker information in several pieces of third marker information matches the first piece of sub-marker information in the first marker information, then for each of those candidates the remaining sub-markers (excluding the one already matched) are matched against the second piece of sub-marker information in the first marker information. If several candidates still match, the next round of matching continues, until only one piece of third marker information matches the first marker information; the position corresponding to that unique third marker information is taken as the first starting point position.
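A minimal sketch of this round-by-round elimination, with sub-markers reduced to recognized identifiers for brevity (the identifier representation is our assumption, not the patent's):

```python
def unique_match(observed_sub_ids, candidates):
    """observed_sub_ids: sub-marker identifiers recognized in the image, in order.
    candidates: (sub_marker_ids, position) entries from the offline map.
    Returns the position of the uniquely matching entry, or None if ambiguous."""
    remaining = list(candidates)
    for sub_id in observed_sub_ids:
        remaining = [(ids, pos) for ids, pos in remaining if sub_id in ids]
        if len(remaining) == 1:
            return remaining[0][1]  # unique third marker information found
        if not remaining:
            return None  # no candidate matches this sub-marker
    return None  # still ambiguous after all observed sub-markers
```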
In the above embodiments, the entire chain from acquiring the image information and identifying the first marker information to determining the first starting point position is processed by the mobile terminal itself, which may consume considerable data-processing resources. Therefore, in some embodiments of the present application, as shown in fig. 3, the step of determining the first starting point position from the first marker information may include steps 301 to 303.
In step 301, a second starting point position is obtained according to the received wireless signal, and a starting point identifier is added to a corresponding position of a map displayed on a navigation application interface, where the second starting point position is used to indicate a position where a user is most likely to be located currently.
As noted above, although poor GPS, Bluetooth, WIFI, or 2G/3G/4G/5G signals may not support accurate continuous navigation, they can still provide a coarse fix of the starting point, so a second starting point position may be obtained from the received wireless signal, indicating where the user is most likely to be at present.
In step 302, a movement instruction triggered by the user on the navigation application interface to the starting point identifier is obtained, and the starting point identifier is moved.
In step 303, the position corresponding to the moved start point identifier is determined as the first start point position.
For example, as shown in fig. 4, the mobile terminal obtains the second starting point position from the received wireless signal and adds a starting point identifier 41 at the corresponding position of the map displayed on the navigation application interface 40. The user then looks at the surrounding markers, finds his or her actual position 42 near the starting point identifier 41 on the displayed map, and moves the starting point identifier 41 to that position 42. The mobile terminal determines the position corresponding to the moved starting point identifier as the first starting point position, which reduces the amount of data processing the mobile terminal needs for calculating the first starting point position.
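Sketched below is how this coarse-fix-then-drag interaction might be wired up; the class and method names are purely illustrative, not an API from the patent.

```python
class StartPointPicker:
    """Hypothetical glue between the wireless coarse fix and the map UI."""

    def __init__(self, map_view, wireless_locator):
        self.map_view = map_view
        self.wireless_locator = wireless_locator
        self.first_start_position = None

    def place_start_marker(self):
        # Second starting point: coarse position from the received wireless signal.
        coarse = self.wireless_locator.estimate_position()
        self.marker = self.map_view.add_marker(coarse, draggable=True)
        self.marker.on_moved = self.on_marker_moved

    def on_marker_moved(self, new_position):
        # The user dragged the marker to the true location: first starting point.
        self.first_start_position = new_position
```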
Optionally, as shown in fig. 5, in some embodiments of the present application, navigating according to the navigation route information in step 104 includes steps 501 to 503.
In step 501, displaying the navigation route information in a navigation application interface in a text description mode;
in step 502, according to a navigation starting instruction triggered by a user on a navigation application interface, current movement direction information and current movement distance information of the user are acquired;
in step 503, the user's current movement direction information and current movement distance information are fitted to the navigation path in the text-described navigation route information, and the text description matching the user's current movement direction and distance is highlighted on the navigation application interface.
For example, as shown in fig. 6, navigation route information 61 such as "go straight about 20 m from the current position and turn left at marker A; go straight about 50 m, marker B is on the right; go straight about 10 m, marker C is on the right; go straight about 100 m, the destination is on the left" is displayed as a text description on the navigation application interface 60.
Because the mobile terminal cannot position itself from the communication signal when that signal is poor, during navigation it must acquire the user's current movement direction information and current movement distance information, fit them to the navigation path in the text-described navigation route information, and highlight the matching text description (e.g., "go straight about 20 m") on the navigation application interface.
It should be noted that the text description matching the user's current movement direction and distance may be highlighted with a different font color, a different font, a different size, and/or bold text.
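A minimal sketch of this fitting step under our own assumptions: the movement distance comes from a step counter (no GPS), and the route is held as (segment length, instruction text) pairs, so the instruction to highlight is simply the segment covering the cumulative distance travelled.

```python
# Route segments as (length in metres, instruction text); the values follow
# the example route above and are illustrative only.
ROUTE = [
    (20.0,  "go straight about 20 m, turn left at marker A"),
    (50.0,  "go straight about 50 m, marker B is on the right"),
    (10.0,  "go straight about 10 m, marker C is on the right"),
    (100.0, "go straight about 100 m, the destination is on the left"),
]

def instruction_to_highlight(travelled_m):
    """Map the cumulative dead-reckoned distance onto the current segment."""
    covered = 0.0
    for length, text in ROUTE:
        covered += length
        if travelled_m < covered:
            return text
    return ROUTE[-1][1]  # past the last segment: keep the final instruction

# e.g. instruction_to_highlight(15.0) returns the first instruction, so the
# interface would highlight "go straight about 20 m, turn left at marker A".
```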
Optionally, in some embodiments of the present application, acquiring the navigation route information according to the first starting point position and the end point position in step 104 includes: acquiring the passing paths connecting the first starting point position and the end point position; and calculating a navigation path from the first starting point position to the end point position from those passing paths, obtaining navigation route information containing the navigation path.
In this embodiment, the navigation route information includes the navigation path and the second marker information associated with the positions along that path. Therefore, after the navigation path from the first starting point position to the end point position is calculated from the passing paths, the second marker information associated with the corresponding positions of the navigation path is also acquired, yielding navigation route information that contains both.
When several navigation paths from the first starting point position to the end point position are calculated, they can be marked on the navigation application interface as route a, route b, route c, and so on, ordered by path length, for the user to choose from.
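For illustration, a minimal sketch of computing such candidate paths with Dijkstra's algorithm over a graph built from the passing paths; the graph layout is our assumption, and the second marker information would be attached at the positions of the returned path.

```python
import heapq

def shortest_path(graph, start, end):
    """graph: {node: [(neighbor, distance_m), ...]} built from the passing
    paths in the offline map. Returns (total distance, list of nodes)."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == end:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, dist in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (cost + dist, neighbor, path + [neighbor]))
    return float("inf"), []
```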
Optionally, in some embodiments of the present application, acquiring the navigation route information according to the first starting point position and the end point position in step 104 further includes: displaying, at the corresponding positions of the map shown on the navigation application interface, a starting point identifier for the first starting point position, an end point identifier for the end point position, and the passing paths connecting the two; and receiving a path planning operation from the first starting point position to the end point position triggered by the user on that map, obtaining a navigation path according to the path planning operation, and obtaining navigation route information containing the navigation path.
That is to say, after the first starting point position and the end point position are obtained, the mobile terminal displays the corresponding starting point and end point identifiers on the navigation application interface, along with all passing paths connecting the two positions, so that the user can plan a path directly on the displayed map as required.
For example, if the first starting point position and the end point position both lie within a shopping district, the user has plenty of time, and the user wants to stroll from the first starting point position to the end point position, the path between them can be planned along a strolling route.
In the embodiments of the application, the navigation path can thus be planned by the user independently, making it more personalized and better suited to the user's individual needs.
Fig. 7 shows a schematic structural diagram of a navigation device 700 provided in an embodiment of the present application, which includes an acquisition unit 701, a determination unit 702, an input unit 703, and a navigation unit 704.
An obtaining unit 701, configured to obtain image information of an environment where a user is currently located, and identify first marker information in the image information;
a determining unit 702, configured to determine a first starting point position according to the first marker information; the first starting point position is used for indicating the current position of the user;
an input unit 703, for acquiring an end point position input by a user;
and a navigation unit 704, configured to obtain navigation route information according to the first starting point position and the end point position, and perform navigation according to the navigation route information, where the navigation route information carries second marker information associated with a position corresponding to a navigation path in the navigation route information.
In some embodiments of the application, the determining unit is specifically configured to match the first marker information with third marker information carried in a navigation map stored in advance, and set a position corresponding to the third marker information successfully matched with the first marker information as the first starting point position.
In some embodiments of the present application, the determining unit is specifically configured to obtain a second starting point location according to the received wireless signal, where the second starting point location is used to indicate a location where the user is most likely to be located currently; and matching the first marker information with third marker information within a preset distance range from the second starting point position, and taking a position corresponding to the third marker information successfully matched with the first marker information as the first starting point position.
In some embodiments of the present application, the determining unit is further specifically configured to match sub-marker information in the first marker information with sub-marker information in third marker information in sequence, screen out third marker information uniquely matching the first marker information, and use a position corresponding to the third marker information uniquely matching the first marker information as the first starting point position.
In some embodiments of the present application, the determining unit is further specifically configured to obtain a second starting point location according to the received wireless signal, and add a starting point identifier to a corresponding location of a map displayed on the navigation application interface, where the second starting point location is used to indicate a location where the user is most likely to be located currently; acquiring a movement instruction triggered by a user on a navigation application interface to the starting point identifier, and moving the starting point identifier; and determining the position corresponding to the moved starting point identifier as the first starting point position.
In some embodiments of the present application, the navigation unit is specifically configured to display the navigation route information in a form of text description on a navigation application interface; acquiring current movement direction information and current movement distance information of a user according to a navigation starting instruction triggered by the user on a navigation application interface; and fitting the current movement direction information and the current movement distance information of the user with the navigation path in the navigation route information of the text description, and highlighting the text description matched with the current movement direction information and the current movement distance information of the user on the navigation application interface.
In some embodiments of the present application, the navigation unit is further specifically configured to acquire the passing paths connecting the first starting point position and the end point position, calculate a navigation path from the first starting point position to the end point position from those passing paths, and obtain navigation route information containing the navigation path; or to display, at the corresponding positions of the map shown on the navigation application interface, a starting point identifier for the first starting point position, an end point identifier for the end point position, and the passing paths connecting the two, receive a path planning operation from the first starting point position to the end point position triggered by the user on that map, acquire a navigation path according to the path planning operation, and obtain navigation route information containing the navigation path.
It should be noted that, for convenience and brevity of description, the specific working process of the navigation device 700 described above may refer to the corresponding process of the method described above in fig. 1 to fig. 6, and is not described again here.
As shown in fig. 8, the present application provides a terminal for implementing the above navigation method. The terminal may be a mobile terminal, such as a smartphone, a tablet computer, a personal computer (PC), a Personal Digital Assistant (PDA), or a learning machine, and includes: a processor 81, a memory 82, one or more input devices 83 (only one is shown in fig. 8), one or more output devices 84 (only one is shown in fig. 8), and a camera 85. The processor 81, memory 82, input device 83, output device 84, and camera 85 are connected by a bus 86.
It should be understood that, in the embodiments of the present application, the processor 81 may be a Central Processing Unit (CPU), or another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor or any conventional processor.
The input device 83 may include a virtual keyboard, a touch pad, a fingerprint sensor (for collecting fingerprint information of a user and direction information of the fingerprint), a microphone, etc., and the output device 84 may include a display, a speaker, etc.
The memory 82 may include a read-only memory and a random access memory, and provides instructions and data to the processor 81. Some or all of memory 82 may also include non-volatile random access memory. For example, the memory 82 may also store device type information.
The memory 82 stores a computer program that is executable on the processor 81, for example, a program of a navigation method. The processor 81 implements the steps of the navigation method, such as steps 101 to 104 shown in fig. 1, when executing the computer program. Alternatively, the processor 81 implements the functions of the modules/units in the device embodiments, such as the functions of the units 701 to 704 shown in fig. 7, when executing the computer program.
The computer program may be divided into one or more modules/units, which are stored in the memory 82 and executed by the processor 81 to implement the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, used to describe the execution of the computer program in the navigation terminal. For example, the computer program may be divided into an acquisition unit, a determining unit, an input unit, and a navigation unit, with the following specific functions: the acquisition unit is used for acquiring image information of the current environment of a user and identifying first marker information in the image information; the determining unit is used for determining a first starting point position according to the first marker information, the first starting point position indicating the user's current position; the input unit is used for acquiring an end point position input by a user; and the navigation unit is used for acquiring navigation route information according to the first starting point position and the end point position and navigating according to the navigation route information, where the navigation route information carries second marker information associated with a position corresponding to a navigation path in the navigation route information.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal and method may be implemented in other ways. For example, the above-described apparatus/terminal embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the method embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, realizes the steps of those method embodiments. The computer program comprises computer program code, which may be in source code, object code, executable file, or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content of a computer-readable medium may be increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media exclude electrical carrier signals and telecommunications signals.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (7)

1. A navigation method, comprising:
acquiring image information of a current environment where a user is located, and identifying first marker information in the image information, the identifying including: performing target detection on the image information to classify foreground and background at the pixel level, removing the background, and retaining one or more target objects; the first marker information being a target object that serves as a landmark and is not moved frequently;
determining a first starting point position according to the first marker information, the first starting point position being used for indicating the current position of the user, the determining including: matching the first marker information with third marker information carried in a pre-stored navigation map, and taking a position corresponding to the third marker information successfully matched with the first marker information as the first starting point position; wherein the pre-stored navigation map is an offline navigation map acquired in advance, is a video navigation map formed of picture frames, and carries the third marker information;
acquiring an end point position input by a user;
acquiring navigation route information according to the first starting point position and the end point position, and navigating according to the navigation route information, including: displaying the navigation route information on a navigation application interface in the form of a text description; acquiring current movement direction information and current movement distance information of the user according to a navigation starting instruction triggered by the user on the navigation application interface; and fitting the user's current movement direction information and current movement distance information to the navigation path in the text-described navigation route information, and highlighting on the navigation application interface the text description matching the user's current movement direction information and current movement distance information; wherein the navigation route information carries second marker information, from the image information, associated with a position corresponding to the navigation path in the navigation route information.
2. The navigation method according to claim 1, wherein the matching the first marker information with third marker information carried in a navigation map stored in advance, and taking a position corresponding to the third marker information that is successfully matched with the first marker information as the first starting point position includes:
acquiring a second starting point position according to the received wireless signal, wherein the second starting point position is used for indicating the position where the user is most likely to be located currently;
and matching the first marker information with third marker information within a preset distance range from the second starting point position, and taking a position corresponding to the third marker information successfully matched with the first marker information as the first starting point position.
3. The navigation method according to claim 1 or 2, wherein the matching the first marker information with third marker information carried in a navigation map stored in advance, and taking a position corresponding to the third marker information that is successfully matched with the first marker information as the first starting point position, includes:
matching the sub-marker information in the first marker information with sub-marker information in third marker information in sequence, screening out third marker information uniquely matched with the first marker information, and taking a position corresponding to the third marker information uniquely matched with the first marker information as the first starting point position.
4. The navigation method according to claim 1, wherein acquiring the navigation route information according to the first starting point position and the end point position comprises:
acquiring a passing path connecting the first starting point position and the end point position;
calculating a navigation path from the first starting point position to the end point position according to the passing path, and obtaining navigation route information containing the navigation path; or, alternatively, comprises:
displaying, at the corresponding positions of a map displayed on the navigation application interface, a starting point identifier corresponding to the first starting point position, an end point identifier corresponding to the end point position, and a passing path connecting the first starting point position and the end point position;
and receiving a path planning operation, from the first starting point position to the end point position, triggered by the user on the map displayed on the navigation application interface, acquiring a navigation path according to the path planning operation, and obtaining navigation route information containing the navigation path.
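The first branch of claim 4 can be read as shortest-path search over the passing paths that connect positions. A sketch using Dijkstra's algorithm; the claim names no particular algorithm, and the graph encoding here is an assumption:

```python
# A sketch of the first branch of claim 4: treat passing paths as a
# weighted graph and compute a navigation path from the first starting
# point position to the end point position.
import heapq

def navigation_path(passing_paths, start, end):
    """passing_paths: dict mapping node -> list of (neighbor, distance)."""
    frontier = [(0.0, start, [start])]
    seen = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == end:
            return path                    # basis of the route information
        if node in seen:
            continue
        seen.add(node)
        for nxt, dist in passing_paths.get(node, []):
            if nxt not in seen:
                heapq.heappush(frontier, (cost + dist, nxt, path + [nxt]))
    return None
```

In the second branch, the navigation path would instead be taken from the user's path planning operation on the displayed map rather than computed.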
5. A navigation device, comprising:
an acquiring unit, used for acquiring image information of the current environment where a user is located and identifying first marker information in the image information, comprising: performing target detection on the image information to achieve pixel-level classification of foreground and background, removing the background, and retaining one or more target objects; wherein the first marker information is a target object that serves as a distinctive landmark and is not frequently moved;
a determining unit, used for determining a first starting point position according to the first marker information, the first starting point position being used for indicating the current position of the user; the determining unit is specifically configured to match the first marker information with third marker information carried in a pre-stored navigation map, and to take the position corresponding to the third marker information successfully matched with the first marker information as the first starting point position; wherein the pre-stored navigation map is an offline navigation map acquired in advance, namely a video navigation map composed of picture frames and carrying the third marker information;
an input unit, used for acquiring an end point position input by the user;
a navigation unit, used for acquiring navigation route information according to the first starting point position and the end point position and navigating according to the navigation route information, wherein the navigation route information carries second marker information in the image information associated with the position corresponding to the navigation path in the navigation route information; the navigation unit is specifically configured to display the navigation route information on a navigation application interface in the form of a text description; to acquire current movement direction information and current movement distance information of the user according to a navigation starting instruction triggered by the user on the navigation application interface; and to fit the current movement direction information and current movement distance information of the user to the navigation path in the textually described navigation route information, highlighting, on the navigation application interface, the text description that matches the current movement direction information and current movement distance information of the user.
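The fitting-and-highlighting behaviour shared by claims 1 and 5 can be pictured as matching dead-reckoned movement against a heading and length attached to each textual instruction. A sketch; the step structure, field names, and tolerances below are assumptions rather than anything the patent specifies:

```python
# A sketch of fitting movement direction/distance to textually described
# navigation steps: the step whose expected heading and cumulative length
# the current movement fits is the one to highlight on the interface.

def step_to_highlight(steps, heading_deg, walked_m,
                      heading_tol=30.0, length_tol=5.0):
    """steps: ordered list of dicts, e.g.
    {"text": "Walk 20 m toward the elevator", "heading": 90.0, "length": 20.0}
    Returns the index of the step the current movement fits, or None."""
    travelled = 0.0
    for i, step in enumerate(steps):
        in_step = travelled <= walked_m <= travelled + step["length"] + length_tol
        # smallest angular difference between current and expected heading
        diff = abs((heading_deg - step["heading"] + 180.0) % 360.0 - 180.0)
        if in_step and diff <= heading_tol:
            return i                       # highlight steps[i]["text"]
        travelled += step["length"]
    return None
```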
6. A terminal, comprising a camera, a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein:
the camera is used for acquiring image information of the current environment where the user is located;
the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 4.
7. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the method according to any one of claims 1 to 4.
CN201810572735.XA 2018-06-05 2018-06-05 Navigation method, navigation device, terminal and computer readable storage medium Expired - Fee Related CN108827307B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810572735.XA CN108827307B (en) 2018-06-05 2018-06-05 Navigation method, navigation device, terminal and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810572735.XA CN108827307B (en) 2018-06-05 2018-06-05 Navigation method, navigation device, terminal and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN108827307A CN108827307A (en) 2018-11-16
CN108827307B (en) 2021-01-12

Family

ID=64144376

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810572735.XA Expired - Fee Related CN108827307B (en) 2018-06-05 2018-06-05 Navigation method, navigation device, terminal and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN108827307B (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109556603A (en) * 2018-11-19 2019-04-02 惠州Tcl移动通信有限公司 The determination method of starting direction when mobile terminal and its navigation
CN111380533B (en) * 2018-12-29 2023-03-24 深圳市优必选科技股份有限公司 Positioning navigation method, equipment and storage device
CN109682390B (en) * 2019-01-14 2021-04-02 Oppo广东移动通信有限公司 Navigation method and related product
WO2020155075A1 (en) * 2019-01-31 2020-08-06 华为技术有限公司 Navigation apparatus and method, and related device
CN109918768A (en) * 2019-03-04 2019-06-21 南方电网科学研究院有限责任公司 Path searching planning method and device of electric energy metering device based on password chip
CN110220530B (en) * 2019-06-17 2022-03-04 腾讯科技(深圳)有限公司 Navigation method and device, computer readable storage medium and electronic device
CN110275977B (en) * 2019-06-28 2023-04-21 北京百度网讯科技有限公司 Information display method and device
CN113532456A (en) * 2020-04-21 2021-10-22 百度在线网络技术(北京)有限公司 Method and device for generating navigation route
CN113945220A (en) * 2020-07-15 2022-01-18 奥迪股份公司 Navigation method and device
CN112146676B (en) * 2020-09-17 2022-10-25 北京小米移动软件有限公司 Information navigation method, device, equipment and storage medium
CN112686452A (en) * 2020-12-31 2021-04-20 深圳市元征科技股份有限公司 Path recommendation method, device, equipment and storage medium
CN113532444B (en) * 2021-09-16 2021-12-14 深圳市海清视讯科技有限公司 Navigation path processing method and device, electronic equipment and storage medium
CN114370884A (en) * 2021-12-16 2022-04-19 北京三快在线科技有限公司 Navigation method and device, electronic equipment and readable storage medium
CN114268771A (en) * 2021-12-29 2022-04-01 深圳市商汤科技有限公司 Video viewing method, mobile terminal and computer readable storage medium
CN114646320B (en) * 2022-02-09 2023-04-28 江苏泽景汽车电子股份有限公司 Path guiding method and device, electronic equipment and readable storage medium
CN114264309B (en) * 2022-02-28 2022-05-24 浙江口碑网络技术有限公司 Walking navigation method and device, electronic equipment and storage medium
CN115900713A (en) * 2022-11-11 2023-04-04 北京字跳网络技术有限公司 Auxiliary voice navigation method and device, electronic equipment and storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103249142B (en) * 2013-04-26 2016-08-24 东莞宇龙通信科技有限公司 Positioning method, system and mobile terminal
CN104422439B (en) * 2013-08-21 2017-12-19 希姆通信息技术(上海)有限公司 Air navigation aid, device, server, navigation system and its application method
CN104748740A (en) * 2014-05-14 2015-07-01 深圳视景文化科技有限公司 Navigation method, navigation terminal and navigation system based on augmented reality technique
CN105318881B (en) * 2014-07-07 2020-10-16 腾讯科技(深圳)有限公司 Map navigation method, device and system
CN104268147B (en) * 2014-08-23 2018-08-28 李全波 A kind of method of private car carpooling system in terms of network map function is realized
US20160356605A1 (en) * 2015-06-08 2016-12-08 Nissim Zur Active-Micro-Location-based Apparatus Method and services guide the user by light sign
CN105737824A (en) * 2016-02-03 2016-07-06 北京京东尚科信息技术有限公司 Indoor navigation method and device

Also Published As

Publication number Publication date
CN108827307A (en) 2018-11-16

Similar Documents

Publication Publication Date Title
CN108827307B (en) Navigation method, navigation device, terminal and computer readable storage medium
US11721098B2 (en) Augmented reality interface for facilitating identification of arriving vehicle
CN110121118B (en) Video clip positioning method and device, computer equipment and storage medium
US10677596B2 (en) Image processing device, image processing method, and program
CN109189879B (en) Electronic book display method and device
CN110443366B (en) Neural network optimization method and device, and target detection method and device
CN111311485A (en) Image processing method and related device
CN112115894B (en) Training method and device of hand key point detection model and electronic equipment
CN112147632A (en) Method, device, equipment and medium for testing vehicle-mounted laser radar perception algorithm
CN110619027A (en) House source information recommendation method and device, terminal equipment and medium
CN111209354A (en) Method and device for judging repetition of map interest points and electronic equipment
CN111046927B (en) Method and device for processing annotation data, electronic equipment and storage medium
CN114111813B (en) High-precision map element updating method and device, electronic equipment and storage medium
JP6993282B2 (en) Information terminal devices, programs and methods
CN110084187A (en) Location recognition method, device, equipment and storage medium based on computer vision
CN111612851B (en) Method, apparatus, device and storage medium for calibrating camera
CN111445499B (en) Method and device for identifying target information
CN112115293A (en) Content recommendation method and content recommendation device
CN111860074B (en) Target object detection method and device, and driving control method and device
CN110672086B (en) Scene recognition method, device, equipment and computer readable medium
CN114299192B (en) Method, device, equipment and medium for positioning and mapping
CN115393616A (en) Target tracking method, device, equipment and storage medium
CN115375774A (en) Method, apparatus, device and storage medium for determining external parameters of a camera
CN115062240A (en) Parking lot sorting method and device, electronic equipment and storage medium
CN112036268B (en) Component identification method and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20210112