CN114061600A - Navigation method, device, system and storage medium - Google Patents

Navigation method, device, system and storage medium

Info

Publication number: CN114061600A
Application number: CN202010763648.XA
Authority: CN (China)
Prior art keywords: navigation, navigation point, target, navigated object, screen
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 谢法, 夏云浩, 钟启权
Current and original assignee: Alibaba Group Holding Ltd
Application filed by Alibaba Group Holding Ltd

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26: Navigation specially adapted for navigation in a road network
    • G01C 21/28: Navigation in a road network with correlation of data from several navigational instruments
    • G01C 21/30: Map- or contour-matching
    • G01C 21/32: Structuring or formatting of map data
    • G01C 21/34: Route searching; Route guidance
    • G01C 21/3453: Special cost functions, i.e. other than distance or default speed limit of road segments
    • G01C 21/3492: Special cost functions employing speed data or traffic data, e.g. real-time or historical

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

Embodiments of the present application provide a navigation method, device, system, and storage medium. When a steering event occurs for a first navigation point, the rendering scale of the electronic map can be adjusted to a set target scale, so that an electronic map of higher geometric precision is displayed to the user. The camera depression angle is then adjusted according to the real-time distance between the navigated object and the target navigation point, so that the target navigation point lies in a predetermined position area of the screen. This both ensures that the target navigation point stays in the predetermined area and simulates the dynamic driving of the navigated object, so the user perceives the object moving toward the target navigation point; navigation becomes more intuitive, the navigation effect improves, and the probability that the user misses the turning position is reduced.

Description

Navigation method, device, system and storage medium
Technical Field
The present application relates to the field of navigation technologies, and in particular, to a navigation method, device, system, and storage medium.
Background
With the popularization of smart terminals, travel applications with a navigation function are widely installed and used. Such an application plans a navigation route from a start location to a destination entered or selected by the user, and guides the user along that route from the start location to the destination by voice and/or on-screen prompts.
These applications navigate using an electronic map. Existing travel applications render the electronic map and the navigation path on the screen with fixed rendering parameters, which makes navigation unintuitive and degrades the navigation effect.
Disclosure of Invention
Aspects of the present application provide a navigation method, device, system, and storage medium to improve the intuitiveness of navigation and thereby the navigation effect.
An embodiment of the present application provides a navigation method, which comprises the following steps:
in response to a steering event for a first navigation point, adjusting the rendering scale of the electronic map to a set target scale, the first navigation point being a position at which the navigated object is guided to turn;
determining the real-time distance from the navigated object to a target navigation point based on the real-time position information of the navigated object and the position information of the target navigation point;
adjusting the camera depression angle used to render the electronic map on a screen based on the real-time distance from the navigated object to the target navigation point, so that the target navigation point lies in a predetermined position area of the screen, where the camera depression angle increases as that real-time distance decreases;
rendering the electronic map and a pre-planned navigation path on the screen according to the target scale and the camera depression angle at which the target navigation point lies in the predetermined position area of the screen.
The target navigation point is the first navigation point or a second navigation point; the second navigation point is a position ahead of the first navigation point at which the navigated object is guided to turn, and its distance from the first navigation point is less than or equal to a set distance threshold.
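The patent leaves open how the real-time distance in the distance-determining step is computed; since the position information is later described as longitude and latitude, a haversine great-circle distance is one natural reading. A minimal Python sketch (the function and parameter names are illustrative, not from the patent):

```python
import math

def real_time_distance_m(obj_lat, obj_lon, point_lat, point_lon):
    """Great-circle (haversine) distance in meters between the
    navigated object and the target navigation point."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(obj_lat), math.radians(point_lat)
    dphi = math.radians(point_lat - obj_lat)
    dlam = math.radians(point_lon - obj_lon)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))
```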
An embodiment of the present application further provides a navigation system, including a navigation terminal and a server device, the navigation terminal being mounted on a navigated object.
The server device is configured to: determine the navigation guiding action of the navigated object at a first navigation point according to the real-time position information of the navigated object and a pre-planned navigation path; and, when the navigation guiding action at the first navigation point is a steering action, issue a steering guidance prompt for the first navigation point to the navigation terminal.
The navigation terminal is configured to: in response to the steering guidance prompt, adjust the electronic map rendering scale to a set target scale; determine the real-time distance from the navigated object to a target navigation point based on the real-time position information of the navigated object and the position information of the target navigation point; adjust the camera depression angle used to render the electronic map on a screen based on that real-time distance, so that the target navigation point lies in a predetermined position area of the screen, the camera depression angle increasing as the real-time distance decreases; and render the electronic map and the pre-planned navigation path on the screen according to the target scale and the camera depression angle at which the target navigation point lies in the predetermined position area. The target navigation point is the first navigation point or a second navigation point; the second navigation point is a position ahead of the first navigation point at which the navigated object is guided to turn, and its distance from the first navigation point is less than or equal to a set distance threshold.
An embodiment of the present application further provides an electronic device, including: a memory, a processor, and a screen; wherein the memory is used for storing a computer program;
the processor, coupled to the memory, executes the computer program to perform the steps of the navigation method described above.
Embodiments of the present application also provide a computer-readable storage medium storing computer instructions, which, when executed by one or more processors, cause the one or more processors to perform the steps of the above-mentioned navigation method.
In the embodiment of the application, a navigation terminal is mounted on the navigated object and displays an electronic map on its screen. When a steering event occurs for the first navigation point, the rendering scale of the electronic map can be adjusted to the set target scale, so that a map of higher geometric precision is displayed to the user, and the camera depression angle used to render the map is adjusted based on the real-time distance from the navigated object to the target navigation point so that the target navigation point lies in the predetermined position area of the screen. This resolves the conflict between enlarging the rendering scale and the rendering requirement that the target navigation point on the electronic map remain in the predetermined area of the screen. In addition, adjusting the camera depression angle according to the real-time distance both ensures that the target navigation point stays in the predetermined area and simulates the dynamic driving of the navigated object, so the user perceives the object moving toward the target navigation point; navigation becomes more intuitive, the navigation effect improves, and the probability that the user misses the turning position is reduced.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1a and fig. 1b are schematic structural diagrams of a navigation system according to an embodiment of the present application;
fig. 1c is a schematic view illustrating a display effect of a navigation interface provided in an embodiment of the present application;
figs. 1d to 1g are schematic diagrams illustrating display effects of other navigation interfaces provided by embodiments of the present application;
fig. 2 is a schematic flowchart of a navigation method according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
To address the technical problems of low intuitiveness and poor effect in existing navigation, in some embodiments of the application a navigation terminal is carried by the navigated object and displays an electronic map on its screen. When a steering event occurs at the first navigation point, the rendering scale of the electronic map can be adjusted to the set target scale, so that a map of higher geometric precision is displayed to the user, and the camera depression angle used to render the map is adjusted based on the real-time distance from the navigated object to the target navigation point so that the target navigation point lies in the predetermined position area of the screen; this resolves the conflict between enlarging the rendering scale and the rendering requirement that the target navigation point remain in the predetermined area of the screen. In addition, adjusting the camera depression angle according to the real-time distance both ensures that the target navigation point stays in the predetermined area and simulates the dynamic driving of the navigated object, so the user perceives the object moving toward the target navigation point; navigation becomes more intuitive, the navigation effect improves, and the probability that the user misses the turning position is reduced.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
It should be noted that: like reference numerals refer to like objects in the following figures and embodiments, and thus, once an object is defined in one figure or embodiment, further discussion thereof is not required in subsequent figures and embodiments.
Fig. 1a is a schematic structural diagram of a navigation system according to an embodiment of the present application. As shown in fig. 1a, the system comprises: a navigation terminal 11 and a server device 12. In this embodiment, the navigation terminal 11 may be mounted on the navigated object and move along with the movement of the navigated object. The navigated object can be any object that can move. For example, the navigated object may be a person or a bicycle, or the like, or may be a motor vehicle, such as a car, a taxi, a van, a motorcycle, an electric car, or the like, or an autonomous mobile device, such as a robot, an unmanned vehicle, or the like.
The server device 12 and the navigation terminal 11 may be connected wirelessly or by wire. Optionally, the server device 12 may be communicatively connected to the navigation terminal 11 through a mobile network; the network standard of the mobile network may be any one of 2G (GSM), 2.5G (GPRS), 3G (WCDMA, TD-SCDMA, CDMA2000, UMTS), 4G (LTE), 4G+ (LTE+), 5G, WiMax, and the like. Optionally, the server device 12 may also be communicatively connected to the navigation terminal 11 through Bluetooth, WiFi, infrared, or the like.
In this embodiment, the server device 12 is a computer device that can respond to service requests from the navigation terminal 11 and provide navigation-related services to the user, and it generally has the capability to undertake and guarantee those services. The server device 12 may run a navigation engine. It may be a single server device, a cloud server array, or a virtual machine (VM) running in a cloud server array; it may also be another computing device with the corresponding service capability, such as a computer terminal running a service program.
In this embodiment, the navigation terminal 11 is an electronic device that can provide a navigation function for the navigated object. It may be, for example, a smartphone, a tablet computer, a personal computer, or a smart wearable device, or a dedicated navigation device such as an in-vehicle navigation apparatus. The navigation terminal 11 may run navigation-related application (APP) software, which can provide an electronic map to the user; the navigation terminal 11 presents the electronic map on its screen.
In this embodiment, the navigation terminal 11 can locate the current position of the navigated object, and the user can provide its destination address. The navigation terminal 11 can then plan a navigation path from the current position information and the destination address, optionally also taking current road-condition information into account, and the navigated object can follow this path to the destination. Alternatively, the navigation terminal 11 may send the current position information of the navigated object and the user-provided destination address to the server device 12. The server device 12 plans the navigation path from this information, optionally together with current road-condition information, and returns the path to the navigation terminal 11, which renders it on the navigation guidance interface of the electronic map. Either way, the navigated object can reach the destination address along the navigation path.
In the process of moving along the navigation path, the navigated object often needs to turn. For example, if the navigated object is a person (also referred to as a user), the user may need to turn at a certain intersection while walking or riding along the navigation path; if the navigated object is a motor vehicle, the user may need to turn while driving it.
Based on the above analysis, to reduce the unnecessary extra travel a user incurs after missing a turning position, in this embodiment the server device 12 may acquire real-time position information of the navigated object while it moves along the navigation path. The specific way in which the server device 12 acquires this information is not limited; several alternative implementations are described below.
Embodiment 1: the navigation terminal 11 locates the navigated object. Specifically, the navigation terminal 11 acquires the real-time position information of the navigated object while it moves along the navigation path and provides it to the server device 12, which receives it. The real-time position information of the navigated object is actual geographic position information, such as longitude and latitude.
Optionally, the navigation terminal 11 may acquire the real-time position information according to a set positioning period while the navigated object moves along the navigation path. The positioning period is not limited in this embodiment and may be, for example, 30 s, 1 min, or 5 min.
The embodiment of the present application does not limit how the navigation terminal 11 acquires the real-time position information of the navigated object. Optionally, it may use GPS positioning, base-station positioning, WiFi positioning, or the like. However, the positioning accuracy of these technologies is relatively low.
In this embodiment, to improve the navigation positioning accuracy, an image capturing device 13 mounted on the navigated object may capture an image of the environment the navigated object is currently in while it moves along the navigation path. The number and form of the image capturing devices 13 are not limited. For example, when the user walks or rides, the image capturing device 13 may be a camera on the navigation terminal 11 (such as a smartphone); when the user drives a motor vehicle, it may be an in-vehicle camera or a camera on the navigation terminal 11. There may be one or more cameras, where "multiple" means two or more; for example, if the navigated object is a motor vehicle, multiple cameras can be deployed at different locations on the vehicle, such as the head, tail, and left and right sides.
Further, the image capturing device 13 may provide the captured environment image to the navigation terminal 11; for the communication mode between them, refer to that between the navigation terminal 11 and the server device 12, which is not repeated here. The navigation terminal 11 may then determine the real-time position information of the navigated object from the environment image and the electronic map data. Optionally, the navigation terminal 11 obtains feature descriptors of pixel points in the environment image and determines the real-time position information of the navigated object from those descriptors and the feature descriptors of the position points recorded in the electronic map data.
Optionally, when determining the position coordinates of pixel points of the environment image in the electronic map, the navigation terminal 11 may compute the similarity between the feature descriptor of each pixel point in the environment image and the feature descriptors of the position points in the electronic map data, and treat a position point whose similarity is greater than or equal to a set similarity threshold as the position point corresponding to that pixel point; that is, the position coordinates of that position point in the electronic map data are taken as the position coordinates of the corresponding pixel point. The navigation terminal 11 can then compute the real-time position information of the navigated object in the electronic map from the position coordinates of the pixel points of the environment image, and provide it to the server device 12, which receives it.
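As a rough sketch of this matching step (assuming fixed-length descriptor vectors and cosine similarity; the patent only specifies a similarity measure and a threshold, so both the metric and the value below are assumptions):

```python
import numpy as np

SIMILARITY_THRESHOLD = 0.9  # illustrative value; the patent leaves it open

def match_pixels_to_map(pixel_descriptors, map_descriptors, map_coords):
    """For each pixel descriptor, find the map position point whose
    descriptor is most similar; keep matches above the threshold.

    pixel_descriptors: (P, D) array, map_descriptors: (M, D) array,
    map_coords: (M, 2) array of map position coordinates.
    Returns a list of (pixel_index, map_coordinate) pairs.
    """
    # Normalize so a dot product equals cosine similarity.
    px = pixel_descriptors / np.linalg.norm(pixel_descriptors, axis=1, keepdims=True)
    mp = map_descriptors / np.linalg.norm(map_descriptors, axis=1, keepdims=True)
    sims = px @ mp.T                # (P, M) similarity matrix
    best = sims.argmax(axis=1)      # best map point per pixel
    matches = []
    for i, j in enumerate(best):
        if sims[i, j] >= SIMILARITY_THRESHOLD:
            matches.append((i, map_coords[j]))
    return matches
```

The real-time position of the navigated object can then be estimated from the matched map coordinates, for example by a robust average or a pose solve.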
Embodiment 2: the server device 12 locates the navigated object. The server device 12 obtains real-time position information of the navigated object during the process of moving the navigated object along the navigation path. Alternatively, the image capturing device 13 may be used to capture an image of the environment in which the navigated object is currently located during movement of the navigated object along the navigation path. The image pickup device 13 is mounted on the object to be navigated. For the implementation of the image capturing device 13, reference may be made to the relevant contents of the above embodiments, which are not described herein again.
Further, as shown in fig. 1b, the image capturing device 13 may provide the captured environment image to the server device 12. The communication mode between the image capturing device 13 and the server device 12 can refer to the communication mode between the navigation terminal 11 and the server device 12, which is not described herein again. Further, the server device 12 may determine real-time location information of the navigated object based on the environmental image and known electronic map data. Optionally, the server device 12 obtains feature descriptors of pixel points in the environment image; and determining real-time position information of the navigated object according to the feature descriptors of the pixel points in the environment image and the feature descriptors of the position points recorded in the electronic map data. The specific implementation of the server device 12 determining the real-time location information of the navigated object according to the feature descriptors of the pixel points in the environment image and the feature descriptors of the location points recorded in the electronic map data can refer to the above-mentioned related content of determining the real-time location information of the navigated object by the navigation terminal 11, which is not described herein again.
Further, after obtaining the real-time position information of the navigated object, the server device 12 may determine the next navigation guiding action according to the pre-planned navigation path and that position information. Optionally, it determines the next navigation point of the navigated object, namely the first navigation point, from the real-time position information and the pre-planned navigation path, and determines the navigation guiding action to be executed when the navigated object reaches the first navigation point. If that action is a steering action, the server device 12 issues a steering guidance prompt for the first navigation point to the navigation terminal 11; the steering guidance prompt tells the navigated object to turn at the first navigation point. The first navigation point is any navigation point on the navigation path; in this application scenario it is the next navigation point ahead of the navigated object. That is, if the navigation guiding action at the first navigation point is a steering action, the first navigation point is a position at which the navigated object is guided to turn.
The navigation terminal 11 receives the navigation guidance prompt provided by the server device 12. The prompt may be a straight-ahead guidance prompt, a turn guidance prompt, or a traffic-condition guidance prompt, but is not limited thereto. On this basis, the navigation terminal 11 may monitor whether the received navigation guidance prompt for the first navigation point is a turn guidance prompt, and if so, determine that a steering event for the first navigation point has occurred.
Accordingly, in response to the steering guidance prompt the navigation terminal 11 may determine that a steering event for the first navigation point has occurred, and in response to that event adjust the electronic map rendering scale to the set target scale. The electronic map rendering scale is the ratio of a distance on the electronic map to the actual physical distance.
Optionally, as shown in fig. 1a, before responding to a steering event for a first navigation point, the navigation terminal 11 may render the electronic map and the pre-planned navigation path on the screen according to a first scale and a first camera depression angle; the result is shown in the navigation interface C1 in fig. 1a. In practice, an icon of the navigated object may also be displayed on the screen of the navigation terminal 11; for example, in figs. 1a and 1b the navigated object is a car and its icon is the car marker. In this embodiment, the first camera depression angle may be a default camera depression angle, and its specific value is not limited; it may be, for example, 13°, 15°, or 20°.
Before the navigation terminal 11 responds to the steering event for the first navigation point, the navigated object moves toward the target navigation point and the real-time distance between them gradually shortens. To simulate the effect of driving toward the target navigation point, the on-screen distance between the target navigation point and the icon of the navigated object can be adjusted based on that real-time distance; the on-screen distance decreases as the real-time distance decreases. The real-time distance is the actual physical distance between the navigated object and the target navigation point.
The navigation terminal 11 may keep the position of the icon of the navigated object on the screen unchanged and adjust the position of the target navigation point on the screen based on the real-time distance from the navigated object to the target navigation point, thereby adjusting the on-screen distance between the target navigation point and the icon; this on-screen distance decreases as the real-time distance decreases.
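One way to realize this, keeping the icon's screen position fixed and moving the target navigation point proportionally to the real distance (a sketch; the linear mapping and all names are assumptions, since the patent only requires that the on-screen distance shrink as the real distance shrinks):

```python
def target_point_screen_y(real_distance_m, max_distance_m,
                          icon_y_px, top_y_px):
    """Map the real distance to the target point's on-screen y position.
    The icon stays fixed at icon_y_px (near the bottom, with y growing
    downward); the target point slides from top_y_px down toward the
    icon as the real distance shrinks to zero."""
    frac = min(real_distance_m / max_distance_m, 1.0)
    return icon_y_px - frac * (icon_y_px - top_y_px)
```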
Further, when the navigation terminal 11 determines that a steering event for the first navigation point A has occurred, it may, in response, adjust the electronic map rendering scale from the first scale to the target scale. The first scale may be the default electronic map rendering scale and is smaller than the target scale. By enlarging the rendering scale to the target scale in response to the steering event, the navigation terminal can display a more detailed electronic map and improve the geometric precision of the display.
Optionally, the navigation terminal 11 may gradually enlarge the scale according to a set scale gradient until it reaches the target scale, or it may adjust the rendering scale from the first scale to the target scale directly. The specific values of the first scale and the target scale are not limited; for example, the first scale may be 1 cm : 50 m and the target scale 1 cm : 10 m.
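A sketch of the stepwise variant (the gradient value and the per-frame call pattern are assumptions; the patent also allows jumping directly to the target scale):

```python
def step_scale(current_scale, target_scale, gradient):
    """Move the rendering scale one gradient step toward the target.
    Scales here are meters of ground per screen centimeter, so a more
    detailed (larger) map scale is the smaller number, e.g. 50 -> 10."""
    if current_scale > target_scale:
        return max(current_scale - gradient, target_scale)
    return target_scale

# Called once per rendered frame, for example:
# scale = step_scale(scale, target_scale=10.0, gradient=5.0)
```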
In some application scenarios, the first navigation point A is far from the second navigation point B that guides the next turn, and the navigated object can turn at A without paying attention to B; in that case A may be taken as the target navigation point. In other scenarios, A is close to B, and the user already needs to pay attention to B when turning at A; in that case B may be taken as the target navigation point.
Based on the above analysis, in the embodiment of the present application the navigation terminal 11 may further determine the position information of the first navigation point A from the real-time position information of the navigated object and the navigation path, and then judge, from the pre-planned navigation path and the position of A, whether a second navigation point B at which the navigated object needs to turn exists within a preset distance range of A. The preset distance range may be, for example, 200 m, 100 m, or 80 m, but is not limited thereto. If no such second navigation point B exists within the preset distance range of A, A is taken as the target navigation point and its position information as the position information of the target navigation point; if such a B exists, B is taken as the target navigation point and its position information as that of the target navigation point. Note that the position information of the target navigation point is its actual geographic position, such as longitude and latitude.
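This selection between the two candidate points can be read as a simple rule (a sketch; the names and the threshold value are illustrative):

```python
PRESET_DISTANCE_M = 100.0  # preset range, e.g. 200 m, 100 m, or 80 m

def choose_target_navigation_point(first_point, next_turn_point,
                                   distance_a_to_b_m):
    """Pick the point the camera should keep in the predetermined area.

    first_point: the upcoming turn (first navigation point A).
    next_turn_point: the turn after it (second navigation point B),
        or None if no further turn follows A.
    distance_a_to_b_m: distance between A and B along the path.
    """
    if next_turn_point is not None and distance_a_to_b_m <= PRESET_DISTANCE_M:
        return next_turn_point  # B is close: keep B on screen through A
    return first_point
```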
Further, because the rendering scale has been enlarged, the target navigation point may fall outside the screen of the navigation terminal 11. To improve the navigation effect, this embodiment keeps the target navigation point within a predetermined position area of the screen. For example, the target navigation point may be placed at 80% of the screen height, or the height of the target navigation point on the screen may be required to differ from 80% of the screen height by no more than a set height difference; the predetermined position area is then the band of 80% of the screen height ± that height difference.
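A membership test for such an area might look like the following (a sketch; the tolerance value and the screen-coordinate origin are assumptions the patent does not fix):

```python
HEIGHT_FRACTION = 0.8      # target line at 80% of the screen height
TOLERANCE_FRACTION = 0.05  # illustrative height-difference tolerance

def in_predetermined_area(point_y_px, screen_height_px):
    """True if the target navigation point's projected y coordinate lies
    within the tolerance band around 80% of the screen height (which end
    of the screen y is measured from is an assumption here)."""
    target_y = HEIGHT_FRACTION * screen_height_px
    return abs(point_y_px - target_y) <= TOLERANCE_FRACTION * screen_height_px
```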
In practice, an icon of the navigated object may be displayed at a fixed position on the screen of the navigation terminal 11; for example, in figs. 1a and 1b the navigated object is a car and its icon is the car marker. Because this embodiment also keeps the target navigation point within a predetermined position area of the screen, from the user's perspective the on-screen distance between the target navigation point and the navigated object would appear fixed even though the actual physical distance between them keeps shrinking as the navigated object approaches the target navigation point; the navigation effect would then no longer match the actual driving of the navigated object.
To solve this problem, as shown in fig. 1a, the navigation terminal 11 determines the real-time distance from the navigated object to the target navigation point from their position information, and adjusts the camera depression angle used to render the electronic map on the screen based on that distance, so that the target navigation point stays in the predetermined position area of the screen; the camera depression angle increases as the real-time distance decreases. The navigation terminal 11 then renders the electronic map and the navigation path on the screen according to the target scale and the camera depression angle at which the target navigation point lies in the predetermined position area, as shown in the navigation interface C2 in figs. 1a and 1b. Figs. 1a and 1b illustrate only the case where the target navigation point is the first navigation point A.
In this embodiment, a navigation terminal is mounted on the navigated object and displays an electronic map on its screen. When a steering event occurs for the first navigation point, the rendering scale of the electronic map can be adjusted to the set target scale, so that a map of higher geometric precision is displayed to the user, and the camera depression angle used to render the map is adjusted based on the real-time distance from the navigated object to the target navigation point so that the target navigation point lies in the predetermined position area of the screen. This resolves the conflict between enlarging the rendering scale and the rendering requirement that the target navigation point remain in the predetermined area of the screen. Adjusting the camera depression angle by the real-time distance both keeps the target navigation point in the predetermined area and simulates the dynamic driving of the navigated object: the user perceives the object moving toward the target navigation point, navigation becomes more intuitive, the navigation effect improves, and the probability of missing the turning position is reduced.
In this embodiment of the application, if no second navigation point B at which the navigated object needs to turn exists within the preset distance range of the first navigation point A, A is the target navigation point; otherwise B is. In either case, when adjusting the camera depression angle for rendering the electronic map, the navigation terminal 11 may increase the depression angle as the real-time distance between the navigated object and the target navigation point decreases, until the target navigation point lies in the predetermined position area of the screen. The shorter the physical distance between the navigated object and the turning position, the larger the camera depression angle; the rendered navigation interfaces are shown in fig. 1c, where the camera depression angles corresponding to interfaces C2 to C5 increase in turn.
Optionally, the navigation terminal 11 may increase the camera depression angle step by step according to a set angle-change gradient; after each increase it judges whether the target navigation point lies in the predetermined position area of the screen. If so, the adjusted camera depression angle is used as the camera depression angle for rendering the current electronic map; if not, the depression angle is increased by the gradient again until the target navigation point lies in the predetermined position area. Optionally, this stepwise adjustment can be performed each time the navigated object travels to a new geographic position: when it reaches a first geographic position, the depression angle is increased by the gradient until the target navigation point lies within the designated range of the screen, and the adjusted angle is used as the target camera depression angle at that position. Here the first geographic position is any position the navigated object reaches after the navigation terminal 11 responds to the steering event for the first navigation point A.
Further, once the camera depression angle grows beyond a certain point, the three-dimensional feel of the rendered electronic map degrades and the navigation effect suffers. For this reason, an upper limit on the camera depression angle may be set in the navigation terminal 11: when the depression angle has been adjusted up to the set upper limit, the navigation terminal 11 stops increasing it and renders the electronic map and the navigation path on the screen according to the target scale and the upper limit value. The specific upper limit is not restricted; it may be, for example, 50°, 55°, or 60°, and is preferably between 50° and 90°.
Alternatively, the navigation terminal 11 may stop increasing the camera depression angle once the real-time distance from the navigated object to the target navigation point is less than or equal to a set distance threshold, and render the electronic map and the navigation path on the screen according to the depression angle at which the increase stopped and the target scale. The specific threshold is not limited; it may be, for example, 50 m, 80 m, or 100 m.
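Putting the stop conditions together, the adjustment can be read as the following loop, reusing `in_predetermined_area` from the sketch above (`project_to_screen` stands in for the renderer's map-to-screen projection, which the patent does not specify, and the gradient, upper limit, and stop threshold are example values from the ranges given):

```python
ANGLE_GRADIENT_DEG = 2.0     # set angle-change gradient (illustrative)
DEPRESSION_UPPER_DEG = 55.0  # upper limit, e.g. between 50 and 90 degrees
STOP_DISTANCE_M = 80.0       # freeze the angle once the object is this close

def adjust_depression(depression_deg, real_distance_m,
                      project_to_screen, screen_height_px):
    """Increase the camera depression angle step by step until the target
    navigation point falls into the predetermined area, the upper limit
    is reached, or the navigated object is close enough to the point."""
    if real_distance_m <= STOP_DISTANCE_M:
        return depression_deg                   # stop increasing near the turn
    while depression_deg < DEPRESSION_UPPER_DEG:
        y = project_to_screen(depression_deg)   # target point's screen y
        if in_predetermined_area(y, screen_height_px):
            break
        depression_deg = min(depression_deg + ANGLE_GRADIENT_DEG,
                             DEPRESSION_UPPER_DEG)
    return depression_deg
```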
It is worth noting that in some cases, after the navigation terminal 11 enlarges the rendering scale to the target scale in response to the steering event for the first navigation point A, rendering the electronic map at the enlarged scale and the initial first camera depression angle may push the target navigation point off the screen. In this case, before adjusting the camera depression angle upward, the navigation terminal 11 may first judge whether the target navigation point would be shown on the screen when the electronic map is rendered at the target scale and the first camera depression angle; if not, and the first camera depression angle is greater than the depression-angle lower limit, it decreases the camera depression angle until the target navigation point marked on the electronic map appears on the screen.
Further, if the camera depression angle reaches the lower limit and the target navigation point is still not displayed on the screen of the navigation terminal 11, the electronic map and the navigation path are rendered according to the target scale and the lower limit value. The lower limit is smaller than the first camera depression angle.
Correspondingly, if the lower limit equals the first camera depression angle and the target navigation point lies off screen when the electronic map is rendered at the target scale and the first camera depression angle, the navigation terminal 11 simply renders the electronic map and the navigation path according to the target scale and the lower limit value.
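The symmetric correction, decreasing the depression angle when the enlarged scale pushes the target point off screen, might be sketched as follows (the step size, lower limit, and visibility test are assumptions):

```python
ANGLE_STEP_DEG = 2.0         # decrease step (illustrative)
DEPRESSION_LOWER_DEG = 10.0  # lower limit, below the first camera depression

def lower_depression_until_visible(depression_deg, point_on_screen):
    """Decrease the camera depression angle until the target navigation
    point reappears on screen or the lower limit is reached.
    point_on_screen(angle) -> bool is the renderer's visibility test."""
    while (depression_deg > DEPRESSION_LOWER_DEG
           and not point_on_screen(depression_deg)):
        depression_deg = max(depression_deg - ANGLE_STEP_DEG,
                             DEPRESSION_LOWER_DEG)
    return depression_deg
```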
Further, the navigation terminal 11 may also adjust the on-screen distance between the target navigation point and the navigated object according to the real-time distance between them until the target navigation point lies in the predetermined position area of the screen, keeping the icon of the navigated object fixed on the screen.
Accordingly, when the electronic map is rendered at the target scale and the first camera depression angle and the target navigation point is found to be on the screen of the navigation terminal 11, the camera depression angle is adjusted according to the real-time distance from the navigated object to the target navigation point so that the target navigation point lies in the predetermined position area of the screen. That is, as the navigated object moves toward the target navigation point, the depression angle is increased to bring the target navigation point into the predetermined position area; the shorter the real-time distance, the larger the camera depression angle.
Further, when the target navigation point is the first navigation point A, after the navigated object passes through A into the next road segment, the scale and camera depression angle for rendering the electronic map may be adjusted according to the navigation guidance prompts issued by the server device 12. If the server device 12 issues a turn guidance prompt for the next navigation point, the navigation terminal 11 treats that point as the new first navigation point A and repeats the above operations in response to the steering event for it. If the server device 12 instead issues an ordinary road-level navigation prompt, the navigation terminal 11 may reset the rendering scale to the first scale and the camera depression angle to the first camera depression angle, and render the electronic map and the navigation path accordingly.
It should be noted that the first navigation point A may be any turning position on the navigation path while the navigated object moves along the pre-planned path. In some application scenarios, A is far from the second navigation point B guiding the next turn, or no further turning point follows A; then A can be used as the target navigation point. For the rendering logic of the navigation terminal 11 when A is the target navigation point, refer to the above embodiments, which are not repeated here.
In other application scenarios, the distance between the first navigation point a and the second navigation point B for guiding the next turn is short, and when the user turns at the first navigation point a and needs to pay attention to the situation of the second navigation point B, the second navigation point B may be used as the target navigation point, and the position information of the second navigation point B may be used as the position information of the target navigation point. For the case that the navigation terminal 11 determines whether the target navigation point is the first navigation point a or the second navigation point B, reference may be made to the relevant contents of the above embodiments, and details are not repeated here.
Suppose the second navigation point B is the target navigation point. Then, before adjusting the camera depression angle for rendering the electronic map on the screen, the navigation terminal 11 may also adjust the camera rotation angle so that B lies on the screen, and render the electronic map and the navigation path according to the target scale and the adjusted camera rotation angle. The rendered navigation interface is shown in the lower part of fig. 1d; the upper part of fig. 1d shows the navigation interface before the navigation terminal 11 responds to the steering event for the first navigation point A.
Optionally, the navigation terminal 11 may determine, from the pre-planned navigation path and the moving direction of the navigated object, the relative position of the second navigation point B with respect to the navigated object as it moves toward B, and adjust the camera rotation angle according to that relative position.
Specifically, if B lies to the right of the navigated object as it moves toward B, the camera is rotated clockwise about the z-axis of the camera coordinate system by a set angle; if B lies to the left, the camera is rotated counterclockwise by the set angle. The set angle may be, for example, 20°, 30°, 35°, or 40°. In fig. 1d, B lies to the left of the navigated object as it moves toward B, so the camera is rotated counterclockwise about the z-axis by the set angle.
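A sketch of this rule (the sign convention for clockwise rotation about the camera z-axis is an assumption):

```python
SET_ROTATION_DEG = 30.0  # set angle, e.g. 20, 30, 35, or 40 degrees

def camera_rotation_deg(second_point_side):
    """Rotation about the camera coordinate system's z-axis toward the
    second navigation point B; second_point_side is 'left' or 'right',
    the side of the navigated object on which B lies."""
    if second_point_side == 'right':
        return -SET_ROTATION_DEG  # clockwise (sign convention assumed)
    return SET_ROTATION_DEG       # counterclockwise
```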
Further, the navigation terminal 11 may adjust a camera depression angle for rendering the electronic map according to a real-time distance between the navigated object and the second navigation point B before the navigated object passes through the first navigation point a, so that a position on the electronic map that identifies the second navigation point B is located in a predetermined position area of the screen of the navigation terminal 11. For a specific embodiment of the navigation terminal 11 adjusting the camera depression angle, reference may be made to the related contents of the above embodiments.
Further, the navigation terminal 11 may render an electronic map and a navigation path on the screen according to the increased target scale, the adjusted depression angle of the camera, and the adjusted rotation angle of the camera, where the rendered navigation interface is as shown in the lower diagram of fig. 1 d.
Further, the navigation terminal 11 may continuously adjust a camera depression angle for rendering the electronic map according to a real-time distance between the navigated object and the second navigation point B during the process that the navigated object moves to the first navigation point a, so that the second navigation point B is located in a predetermined position area of the screen of the navigation terminal 11. Wherein, the shorter the physical distance between the navigated object and the first navigation point a, the larger the corresponding camera depression when the location on the electronic map that identifies the second navigation point B is within the specified range of the screen of the navigation terminal 11. Further, the navigation terminal 11 renders an electronic map and a navigation path on the screen according to the target scale, the adjusted camera rotation angle, and the currently adjusted camera depression angle, and obtains a display effect that the navigation interface changes with the physical distance between the navigated object and the second navigation point a before the navigated object passes through the first navigation point a, as shown in fig. 1 e. In FIG. 1e, the physical distances between the navigated object and the second navigation point B in the navigation interfaces D1-D3 are successively shorter, and the camera angles of depression corresponding to the navigation interfaces D1-D3 are successively larger.
Further, as shown in fig. 1f, after the navigated object passes through the first navigation point a, the navigation terminal 11 may return to the camera rotation angle, i.e. readjust the camera rotation angle to the angle before rotating the set angle counterclockwise or clockwise, and the rendered navigation interface D5 is as shown in fig. 1f, where the navigation interface D4 shown in the upper diagram of fig. 1f is a schematic view of the display effect of the navigation interface before the navigated object passes through the first navigation point a.
Further, after the navigated object passes through the first navigation point a, the navigation terminal 11 may also adjust the camera depression angle according to the real-time distance between the navigated object and the second navigation point B, so that the second navigation point B is located in a predetermined position area of the screen of the navigation terminal 11. That is, during the process that the navigated object moves to the second navigation point B, the real-time distance between the navigated object and the second navigation point B gradually shortens, and in order to enable the second navigation point B to be located at the predetermined position area of the screen of the navigation terminal 11, the camera depression angle may be increased according to the real-time distance between the navigated object and the second navigation point B, so that the second navigation point B identified on the electronic map is located at the predetermined position area of the screen of the navigation terminal 11 until the navigated object passes through the second navigation point B. For a specific implementation manner of the navigation terminal 11 adjusting the camera depression angle according to the real-time distance between the navigated object and the second navigation point B, reference may be made to relevant contents in the foregoing embodiments, and details are not described herein again. Further, the navigation terminal 11 may render an electronic map and a navigation path on the screen according to the target scale, the corrected camera rotation angle, and the currently adjusted camera depression angle.
The shorter the real-time distance between the navigated object and the second navigation point B, the larger the camera depression angle; the navigation interface rendered with the increased camera depression angle is shown in fig. 1g. In fig. 1g, the real-time distances between the navigated object and the second navigation point B in the navigation interfaces D6-D8 are successively shorter, and the camera depression angles corresponding to the navigation interfaces D6-D8 are successively larger.
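To make the relation above concrete, the following Python sketch maps the real-time distance onto a camera depression angle by linear interpolation; the function name, the distance bounds and the angle bounds are illustrative assumptions of this sketch, not values taken from this application.

def depression_angle_for_distance(distance_m: float,
                                  near_m: float = 20.0,
                                  far_m: float = 200.0,
                                  min_deg: float = 30.0,
                                  max_deg: float = 70.0) -> float:
    """Illustrative mapping: the shorter the distance to the target
    navigation point, the larger the camera depression angle.
    All parameter values are assumptions of the sketch."""
    # Clamp the distance into [near_m, far_m] so the angle stays bounded.
    d = max(near_m, min(far_m, distance_m))
    # Linear interpolation: d == far_m gives min_deg, d == near_m gives max_deg.
    t = (far_m - d) / (far_m - near_m)
    return min_deg + t * (max_deg - min_deg)

# Example: as the distance shrinks from 200 m to 20 m, the angle grows.
for d in (200.0, 120.0, 60.0, 20.0):
    print(d, round(depression_angle_for_distance(d), 1))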
In addition to the navigation system provided in the above embodiment, the embodiment of the present application further provides a navigation method, which is applicable to a navigation terminal mounted on a navigated object or to any other electronic device with a processing function. The navigation method provided in this embodiment is exemplarily described below.
Fig. 2 is a schematic flowchart of a navigation method according to an embodiment of the present application. As shown in fig. 2, the method mainly includes:
201. Responding to a steering event for the first navigation point, adjusting the rendering scale of the electronic map to a set target scale; the first navigation point is a position at which the navigated object is guided to turn.
202. Determining the real-time distance between the navigated object and the target navigation point based on the real-time position information of the navigated object and the position information of the target navigation point.
203. Adjusting a camera depression angle for rendering the electronic map on the screen based on the real-time distance from the navigated object to the target navigation point, so that the target navigation point is located in a predetermined position area of the screen; wherein the camera depression angle becomes larger as the real-time distance from the navigated object to the target navigation point decreases.
204. Rendering the electronic map and a pre-planned navigation path on the screen according to the target scale and the camera depression angle at which the target navigation point is located in the predetermined position area of the screen.
In this embodiment, the target navigation point is the first navigation point or the second navigation point. The first navigation point is any position on the navigation path at which a steering action needs to be executed, and is used for guiding the navigated object to steer. The second navigation point lies ahead of the first navigation point on the pre-planned navigation path, within the set distance range of the first navigation point; it is the next position after the first navigation point at which the navigated object needs to perform a steering action.
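Before walking through the steps in detail, the following Python sketch outlines how steps 201-204 could fit together. It is a minimal sketch under assumed interfaces: the renderer and navigated-object methods and the angle_for_distance mapping are hypothetical, not an API defined by this application. The depression_angle_for_distance function shown earlier is one possible angle_for_distance.

def handle_steering_event(renderer, navigated_object, target_point,
                          target_scale, angle_for_distance):
    """Hedged sketch of steps 201-204; every method called on renderer
    and navigated_object is an assumed interface."""
    renderer.set_scale(target_scale)                              # step 201
    while not navigated_object.has_passed(target_point):
        # Step 202: real-time physical distance to the target navigation point.
        distance_m = navigated_object.distance_to(target_point)
        # Step 203: angle_for_distance is an assumed monotone mapping; the
        # smaller the distance, the larger the returned depression angle,
        # keeping the target point in the predetermined screen area.
        renderer.set_camera_depression(angle_for_distance(distance_m))
        # Step 204: render the map and the pre-planned navigation path with
        # the target scale and the adjusted camera depression angle.
        renderer.render()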
In this embodiment, the navigation terminal may be mounted on the navigated object and move along with it. The navigation terminal may be installed with software such as a navigation-related application (APP), which can provide an electronic map to a user, and the navigation terminal displays the electronic map on its screen. The navigation terminal can locate the current position information of the navigated object, and the user can provide the destination address of the navigated object; the navigation path can therefore be planned according to the current position information and the destination address of the navigated object. For a specific implementation of planning the navigation path, reference may be made to the related contents of the above system embodiments, and details are not described herein.
In the process of moving the navigated object along the navigation path, steering and the like are often needed. For example, the navigated object is a person, which may also be referred to as a user, and the user needs to turn at a certain intersection during walking or riding along the navigation path. For another example, the navigated object is a motor vehicle, and the user may need to turn a corner while driving the motor vehicle.
Based on the above analysis, in order to reduce the extra travel caused by the user missing the turning position, in step 201 of this embodiment the electronic map rendering scale may be adjusted to the set target scale in response to the steering event for the first navigation point. The electronic map rendering scale is the ratio of a distance on the electronic map to the corresponding actual physical distance.
In some embodiments, the navigation terminal may obtain the real-time location information of the navigated object. For a specific implementation of the navigation terminal acquiring the real-time location information of the navigated object, reference may be made to related contents in implementation 1 of the above system embodiment, which is not described herein again. Further, the navigation terminal can determine the next navigation guidance action according to the pre-planned navigation path information and the real-time position information of the navigated object. Optionally, the next navigation point of the navigated object, namely the first navigation point, is determined according to the real-time position information of the navigated object and the pre-planned navigation path; the navigation guidance action to be executed by the navigated object when reaching the first navigation point is then determined as the navigation guidance action for the first navigation point. When the navigation guidance action of the navigated object for the first navigation point is a steering action, it is determined that a steering event for the first navigation point has occurred.
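As a hedged illustration of this detection logic, the sketch below derives the first navigation point from the real-time position and the pre-planned path and checks whether the guidance action there is a steering action; both methods called on planned_path and the action labels are assumptions, not an API defined by this application.

def detect_steering_event(real_time_position, planned_path):
    """Hedged sketch: planned_path is an assumed object with illustrative
    methods; the action labels are placeholders."""
    # The next navigation point ahead of the navigated object on the
    # pre-planned navigation path is the first navigation point.
    first_point = planned_path.next_navigation_point(real_time_position)
    # The guidance action to be executed on arrival at that point.
    action = planned_path.guidance_action_at(first_point)
    # A steering event for the first navigation point occurs only when the
    # guidance action there is a steering action.
    is_steering = action in ("turn_left", "turn_right")
    return is_steering, first_point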
Alternatively, the navigation guidance action for the first navigation point may also be determined by the server device. For an implementation manner of determining the navigation guidance action for the first navigation point by the server device, reference may be made to relevant contents of the above system embodiment, and details are not described herein again. Further, after determining the navigation guidance action at the first navigation point, the server device may provide a navigation guidance prompt for the first navigation point to the terminal device. Correspondingly, the navigation terminal can monitor whether the received navigation guidance prompt aiming at the first navigation point is a steering guidance prompt; and determining that a steering event aiming at the first navigation point occurs under the condition that the navigation guidance prompt aiming at the first navigation point is monitored to be a steering guidance prompt.
Alternatively, for the navigation terminal, before responding to the steering event for the first navigation point, the electronic map and the pre-planned navigation path may be rendered on the screen according to a first scale and a first camera depression angle. In practical application, an icon of the navigated object can be displayed on the screen of the navigation terminal. Before the navigation terminal responds to the steering event for the first navigation point, the navigated object moves toward the target navigation point and the real-time distance between the navigated object and the target navigation point gradually shortens; in order to simulate the driving effect of the navigated object toward the target navigation point, the real-time distance between the navigated object and the target navigation point can be determined according to the real-time positioning information of the navigated object and the position information of the target navigation point. Optionally, the navigation terminal may further determine the position information of the target navigation point according to the real-time positioning information of the navigated object and the pre-planned navigation path; the specific implementation is described in detail below in the discussion of how to determine the target navigation point, and is not repeated here. Further, the navigation terminal may adjust the distance between the icon of the navigated object and the target navigation point on the screen based on the real-time distance from the navigated object to the target navigation point, and the distance between the target navigation point and the icon of the navigated object on the screen decreases as the real-time distance decreases. The real-time distance between the navigated object and the target navigation point is the actual physical distance between them.
Alternatively, the navigation terminal may keep the position of the icon of the navigated object on the screen unchanged, and adjust the position of the target navigation point on the screen based on the real-time distance from the navigated object to the target navigation point, thereby adjusting the distance between the target navigation point and the icon of the navigated object on the screen. The distance between the target navigation point and the icon of the navigated object on the screen decreases as the real-time distance decreases.
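A minimal sketch of this screen-distance adjustment, assuming the icon stays fixed and the target navigation point is drawn at an offset proportional to the real physical distance; the pixel-per-metre scale parameter and the function name are assumptions of the sketch.

def screen_offset_for_target(real_distance_m: float,
                             scale_px_per_m: float) -> float:
    """Illustrative: with the navigated object's icon fixed on screen, the
    target navigation point is drawn at an offset proportional to the real
    physical distance, so the on-screen gap shrinks as the object
    approaches. The pixel-per-metre scale is an assumed parameter."""
    return real_distance_m * scale_px_per_m

# Example: at 2 px per metre, a 150 m gap is drawn 300 px from the icon
# and a 50 m gap only 100 px away.
print(screen_offset_for_target(150.0, 2.0))  # 300.0
print(screen_offset_for_target(50.0, 2.0))   # 100.0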
Further, in an instance in which it is determined that a steering event for the first navigation point has occurred, the electronic map rendering scale may be adjusted from the first scale to the target scale in response to the steering event. The first scale may be a default electronic map rendering scale and is smaller than the target scale. By adjusting the electronic map rendering scale to the target scale in response to the steering event for the first navigation point, the rendering scale is increased, a more detailed electronic map can be displayed, and the geometric precision of the electronic map display is improved.
Optionally, the navigation terminal may gradually increase the scale of the electronic map according to the set scale gradient until the scale is increased to the target scale; or the navigation terminal can also directly increase the rendering scale of the electronic map from the first scale to the target scale.
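The gradual variant could be sketched as below; the gradient value and the function name are assumptions of the sketch, and the direct variant corresponds to calling the function once with a gradient at least as large as the remaining gap.

def ramp_scale(current_scale: float, target_scale: float,
               gradient: float) -> float:
    """One step of the gradual adjustment described above: raise the
    rendering scale by the set scale gradient without overshooting the
    target scale. The gradient value used below is an assumption."""
    return min(target_scale, current_scale + gradient)

# Example: ramping from a first scale of 1.0 to a target scale of 4.0.
scale = 1.0
while scale < 4.0:
    scale = ramp_scale(scale, 4.0, gradient=0.5)
print(scale)  # 4.0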
In some application scenarios, the first navigation point is far away from the second navigation point at which the navigated object is guided to turn next, and when the navigated object turns at the first navigation point there is no need to pay attention to the situation at the second navigation point; in this case, the first navigation point can be used as the target navigation point. In other application scenarios, the first navigation point is close to the second navigation point at which the navigated object is guided to turn next, and when the user turns at the first navigation point, the user needs to pay attention to the situation at the second navigation point; in this case, the second navigation point may be taken as the target navigation point.
Based on the above analysis, in the embodiment of the application, the navigation terminal can also determine the position information of the first navigation point according to the real-time position information of the navigated object and the navigation path, and judge, according to the pre-planned navigation path and the position information of the first navigation point, whether a second navigation point at which the navigated object needs to steer exists within the preset distance range of the first navigation point. Further, if no second navigation point at which the navigated object needs to steer exists within the preset distance range of the first navigation point, the first navigation point is taken as the target navigation point, and the position information of the first navigation point is used as the position information of the target navigation point. Correspondingly, if a second navigation point at which the navigated object needs to steer exists within the preset distance range of the first navigation point, the second navigation point is taken as the target navigation point, and the position information of the second navigation point is used as the position information of the target navigation point.
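A hedged sketch of this selection rule follows; the point representation, the distance argument and the preset range value are assumptions of the sketch.

def select_target_navigation_point(first_point, second_point,
                                   distance_between_m: float,
                                   preset_range_m: float):
    """Hedged sketch of the selection rule above."""
    # If a second steering point exists within the preset distance range
    # of the first navigation point, it becomes the target navigation
    # point; otherwise the first navigation point does.
    if second_point is not None and distance_between_m <= preset_range_m:
        return second_point
    return first_point

# Example with hypothetical points 50 m apart and a 100 m preset range:
print(select_target_navigation_point("A", "B", 50.0, 100.0))   # B
print(select_target_navigation_point("A", None, 0.0, 100.0))   # A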
Further, because the electronic map rendering scale is increased, the target navigation point may fall outside the screen of the navigation terminal. In this embodiment, in order to improve the navigation effect, the target navigation point should be kept within a predetermined position area of the screen.
In practical application, the icon of the navigated object is displayed at a fixed position on the screen of the navigation terminal, while this embodiment requires the navigation terminal to keep the target navigation point within a predetermined position area of the screen. From the user's perspective, the distance between the target navigation point and the navigated object on the electronic map would then appear fixed, even though the actual physical distance between the navigated object and the target navigation point is continuously shortening, i.e., the navigated object gradually moves toward the target navigation point. The displayed navigation effect would therefore be inconsistent with the actual driving behavior of the navigated object.
To solve this problem, in step 202, the real-time distance from the navigated object to the target navigation point may be determined according to the real-time location information of the navigated object and the location information of the target navigation point, and in step 203, the camera depression angle for rendering the electronic map on the screen is adjusted based on the real-time distance from the navigated object to the target navigation point, so that the target navigation point is located in the predetermined position area of the screen. The camera depression angle becomes larger as the real-time distance from the navigated object to the target navigation point decreases. Further, in step 204, the electronic map and the navigation path are rendered on the screen according to the target scale and the camera depression angle at which the target navigation point is located in the predetermined position area of the screen of the navigation terminal.
In this embodiment, a navigation terminal is mounted on a navigated object and an electronic map is displayed on the screen of the navigation terminal. When a steering event occurs for the first navigation point, the rendering scale of the electronic map can be adjusted to the set target scale, so that an electronic map with higher geometric accuracy is displayed to the user; and the camera depression angle for rendering the electronic map is adjusted based on the real-time distance from the navigated object to the target navigation point, so that the target navigation point is located in the predetermined position area of the screen, thereby resolving the contradiction between the adjustment of the rendering scale and the navigation rendering requirement that the target navigation point on the electronic map be located in the predetermined position area of the screen. In addition, because the camera depression angle is adjusted according to the real-time distance between the navigated object and the target navigation point, on one hand the target navigation point can be ensured to be located in the predetermined position area of the screen, and on the other hand the navigation effect of the dynamic driving of the navigated object can be simulated, so that the user can perceive the dynamic effect of the navigated object moving toward the target navigation point, which improves the intuitiveness of navigation, improves the navigation effect, and reduces the probability that the user misses the steering position.
In the embodiment of the application, whether the target navigation point is the first navigation point or the second navigation point, the camera depression angle can be increased as the real-time distance between the navigated object and the target navigation point decreases, until the target navigation point is located in the predetermined position area of the screen. The shorter the physical distance between the navigated object and the steering position, the larger the camera depression angle.
Optionally, the camera depression angle can be increased step by step according to a set angle change gradient. After each increase of the camera depression angle by the angle change gradient, it is judged whether the target navigation point is located in the predetermined position area of the screen; if so, the adjusted camera depression angle is taken as the camera depression angle for the currently rendered electronic map. Correspondingly, if not, the camera depression angle continues to be increased by the set angle change gradient until the target navigation point is located in the predetermined position area of the screen.
Further, when the camera depression angle is increased beyond a certain angle, the three-dimensional sense of the rendered electronic map is reduced, which affects the navigation effect. Based on this, a camera depression angle upper limit value for rendering the electronic map may also be set in the navigation terminal. Accordingly, the increase of the camera depression angle may be stopped when the camera depression angle is adjusted to the set depression angle upper limit value, and the electronic map and the navigation path are rendered on the screen according to the target scale and the depression angle upper limit value.
Alternatively, the increase of the camera depression angle is stopped when the real-time distance between the navigated object and the target navigation point is less than or equal to the set distance threshold, and the electronic map and the navigation path are rendered on the screen according to the camera depression angle at which the increase was stopped and the target scale.
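The three rules above (gradient-wise increase, the depression angle upper limit, and the distance-threshold stop) can be sketched together as follows; the point_in_area predicate and all numeric values are assumptions of the sketch.

def increase_depression_until_in_area(angle_deg: float,
                                      point_in_area,
                                      distance_m: float,
                                      step_deg: float = 2.0,
                                      upper_limit_deg: float = 75.0,
                                      stop_distance_m: float = 30.0) -> float:
    """Hedged sketch. point_in_area is an assumed predicate reporting
    whether, at a given depression angle, the target navigation point
    falls inside the predetermined screen area; the step, upper limit,
    and stop distance values are assumptions."""
    # Distance-threshold rule: stop increasing once the navigated object
    # is close enough to the target navigation point.
    if distance_m <= stop_distance_m:
        return angle_deg
    # Increase by the set angle change gradient and re-check each time.
    while not point_in_area(angle_deg):
        # Upper-limit rule: never exceed the set depression angle upper
        # limit value, to preserve the map's three-dimensional sense.
        if angle_deg + step_deg > upper_limit_deg:
            return upper_limit_deg
        angle_deg += step_deg
    return angle_deg

# Example with a toy predicate: the point enters the area at >= 50 degrees.
print(increase_depression_until_in_area(
    40.0, lambda a: a >= 50.0, distance_m=120.0))  # 50.0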
It is worth noting that, in some cases, after the navigation terminal increases the electronic map rendering scale to the target scale in response to the steering event for the first navigation point, rendering the electronic map on the screen according to the increased scale (such as the target scale) and the initial first camera depression angle may leave the target navigation point outside the screen of the navigation terminal. In this case, before the camera depression angle for rendering the electronic map on the screen is adjusted, it may be further judged whether the position of the target navigation point would be displayed on the screen of the navigation terminal if the electronic map were rendered on the screen according to the target scale and the first camera depression angle. If the judgment result is negative, the camera depression angle is reduced, provided that the first camera depression angle is larger than the depression angle lower limit value, until the target navigation point identified on the electronic map is located on the screen of the navigation terminal.
Further, if the camera depression angle is reduced to the depression angle lower limit value and the target navigation point is still not displayed on the screen of the navigation terminal, the electronic map and the navigation path are rendered on the screen according to the target scale and the depression angle lower limit value. The depression angle lower limit value is smaller than the first camera depression angle.
Correspondingly, if the depression angle lower limit value is equal to the first camera depression angle, and the position of the target navigation point would lie outside the screen of the navigation terminal when the electronic map is rendered on the screen according to the target scale and the first camera depression angle, the electronic map and the navigation path are rendered on the screen directly according to the target scale and the depression angle lower limit value.
Furthermore, the distance between the target navigation point and the navigated object on the screen can be adjusted according to the real-time distance between the navigated object and the target navigation point until the target navigation point is located in the preset position area of the screen. Wherein the position of the icon of the navigated object on the screen remains unchanged.
Correspondingly, if the judgment result is that the target navigation point is located on the screen of the navigation terminal when the electronic map is rendered according to the target scale and the first camera depression angle, the camera depression angle is adjusted according to the real-time distance from the navigated object to the target navigation point, so that the target navigation point is located in the predetermined position area of the screen. That is, during the process that the navigated object moves toward the target navigation point, the target navigation point is brought into the predetermined position area of the screen by increasing the camera depression angle, and the shorter the real-time distance from the navigated object to the target navigation point, the larger the camera depression angle.
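The visibility fallback described in the preceding paragraphs could look like the following sketch; the on_screen predicate, the function name and the numeric values are assumptions.

def ensure_target_on_screen(angle_deg: float, on_screen,
                            lower_limit_deg: float,
                            step_deg: float = 2.0) -> float:
    """Hedged sketch of the fallback above. on_screen is an assumed
    predicate reporting whether the target navigation point is displayed
    at a given camera depression angle."""
    # Reduce the depression angle step by step, but never go below the
    # depression angle lower limit value.
    while not on_screen(angle_deg) and angle_deg > lower_limit_deg:
        angle_deg = max(lower_limit_deg, angle_deg - step_deg)
    # If the point is still off screen here, the map is rendered with the
    # lower limit value and the on-screen distance between the target
    # point and the icon is adjusted instead, as described above.
    return angle_deg

# Example with a toy predicate: the point becomes visible below 46 degrees.
print(ensure_target_on_screen(52.0, lambda a: a < 46.0, 40.0))  # 44.0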
Further, for the case that the target navigation point is the first navigation point, after the navigated object passes through the first navigation point and enters the next road segment, the scale for rendering the electronic map and the camera depression angle can be adjusted according to the navigation guidance prompt issued by the server device. Optionally, after the navigated object enters the next road segment, if the server device issues a turn guidance prompt for the next navigation point, the navigation terminal takes the next navigation point as a new first navigation point and re-enters the operation of responding to the steering event for the first navigation point. If the server device issues a road-level navigation prompt after the navigated object enters the next road segment, the navigation terminal can readjust the scale for rendering the electronic map to the first scale, readjust the camera depression angle for rendering the electronic map to the first camera depression angle, and then render the electronic map and the navigation path on the screen according to the first scale and the first camera depression angle.
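A hedged sketch of these two branches, using an assumed prompt-kind label and a plain dictionary for the camera parameters; both are assumptions of the sketch, not structures defined by this application.

def apply_post_turn_prompt(camera: dict, prompt_kind: str,
                           first_scale: float, first_depression_deg: float,
                           target_scale: float) -> dict:
    """Hedged sketch of the post-turn behaviour above."""
    if prompt_kind == "road_level":
        # Road-level prompt: restore the first scale and the first camera
        # depression angle.
        camera["scale"] = first_scale
        camera["depression_deg"] = first_depression_deg
    elif prompt_kind == "turn":
        # Turn prompt for the next point: keep the target scale; the next
        # navigation point becomes a new first navigation point and the
        # steering-event handling is re-entered.
        camera["scale"] = target_scale
    return camera

print(apply_post_turn_prompt({"scale": 4.0, "depression_deg": 70.0},
                             "road_level", 1.0, 40.0, 4.0))
# {'scale': 1.0, 'depression_deg': 40.0}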
In some application scenarios, when the first navigation point is the target navigation point, the navigation terminal renders the electronic map and the execution logic of the navigation path, which can be referred to in the related contents of the above embodiments and will not be described herein again.
In other application scenarios, the distance between the first navigation point and the second navigation point for guiding the next turn is short, and when the user turns at the first navigation point and needs to pay attention to the situation of the second navigation point, the second navigation point can be used as the target navigation point, and the position information of the second navigation point can be used as the position information of the target navigation point. For the case that the navigation terminal determines whether the target navigation point is the first navigation point or the second navigation point, reference may be made to the relevant contents of the above embodiments, which is not described herein again.
The following describes the case in which the second navigation point is the target navigation point. Correspondingly, before the camera depression angle for rendering the electronic map on the screen is adjusted, the camera rotation angle for rendering the electronic map can be adjusted so that the second navigation point is located on the screen of the navigation terminal, and the electronic map and the navigation path are rendered on the screen according to the target scale and the adjusted camera rotation angle.
Optionally, the relative position relationship between the navigated object and the second navigation point when the navigated object moves to the second navigation point may be determined according to a pre-planned navigation path and the moving direction of the navigated object; and adjusting the rotation angle of the camera according to the relative position relationship.
Further, if the second navigation point will be located on the right side of the navigated object when the navigated object moves toward the second navigation point, the camera is rotated clockwise around the z-axis of the camera coordinate system by a set angle; if the second navigation point will be located on the left side of the navigated object when the navigated object moves toward the second navigation point, the camera is rotated counterclockwise around the z-axis of the camera coordinate system by a set angle.
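This left/right rule could be sketched as follows; the sign convention (clockwise as negative) and the set angle value are assumptions of the sketch.

def camera_rotation_for_turn(side: str, set_angle_deg: float) -> float:
    """Hedged sketch of the rule above: a second navigation point on the
    right triggers a clockwise camera rotation about the z-axis of the
    camera coordinate system by the set angle; on the left, a
    counterclockwise rotation. Clockwise is represented as negative here,
    which is an assumed convention."""
    if side == "right":
        return -set_angle_deg   # clockwise
    if side == "left":
        return +set_angle_deg   # counterclockwise
    return 0.0

print(camera_rotation_for_turn("right", 15.0))  # -15.0
print(camera_rotation_for_turn("left", 15.0))   # 15.0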
Further, before the navigated object passes through the first navigation point, the camera depression angle for rendering the electronic map can be adjusted according to the real-time distance between the navigated object and the second navigation point, so that the position on the electronic map for identifying the second navigation point is located in the predetermined position area of the screen of the navigation terminal. Further, the navigation terminal can render the electronic map and the navigation path on the screen according to the increased target scale, the adjusted depression angle of the camera and the adjusted rotation angle of the camera.
Further, after the navigated object passes through the first navigation point, the camera rotation angle may be restored, i.e., readjusted to the angle it had before being rotated counterclockwise or clockwise by the set angle.
Further, after the navigated object passes through the first navigation point, the navigation terminal can also adjust the camera depression angle according to the real-time distance between the navigated object and the second navigation point, so that the second navigation point is located in the predetermined position area of the screen of the navigation terminal. In order to keep the second navigation point in the predetermined position area of the screen of the navigation terminal, the camera depression angle can be increased according to the real-time distance between the navigated object and the second navigation point, so that the second navigation point identified on the electronic map remains in the predetermined position area of the screen of the navigation terminal until the navigated object passes through the second navigation point. For a specific implementation manner of the navigation terminal adjusting the camera depression angle according to the real-time distance between the navigated object and the second navigation point, reference may be made to relevant contents in the above embodiments, and details are not repeated herein. Furthermore, the navigation terminal can render the electronic map and the navigation path on the screen according to the target scale, the restored camera rotation angle, and the currently adjusted camera depression angle.
It should be noted that the execution subjects of the steps of the methods provided in the above embodiments may be the same device, or different devices may serve as the execution subjects of different steps. For example, the execution subject of steps 202 and 203 may be device A; for another example, the execution subject of step 201 may be device A, and the execution subject of step 203 may be device B; and so on. In the above embodiment, steps 201 to 203 may be executed independently by the navigation terminal or by the server device, or may be completed by the navigation terminal and the server device in cooperation; for example, steps 201 and 202 may be executed by the server device, and step 203 may be executed by the navigation terminal; and so on.
In addition, in some of the flows described in the above embodiments and the drawings, a plurality of operations are included in a specific order, but it should be clearly understood that the operations may be executed out of the order presented herein or in parallel, and the sequence numbers of the operations, such as 201, 202, etc., are merely used for distinguishing different operations, and the sequence numbers do not represent any execution order per se. Additionally, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel.
Accordingly, embodiments of the present application also provide a computer-readable storage medium storing computer instructions, which, when executed by one or more processors, cause the one or more processors to perform the steps of the navigation method described above.
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device can be implemented as a navigation terminal and mounted on a movable device. As shown in fig. 3, the electronic device includes: a memory 30a, a processor 30b and a screen 30c. The memory 30a is used for storing computer programs.
The processor 30b is coupled to the memory 30a and is configured to execute the computer program for: responding to a steering event for the first navigation point, and adjusting the rendering scale of the electronic map to a set target scale, where the first navigation point is a position for guiding the navigated object to turn; determining the real-time distance from the navigated object to the target navigation point based on the real-time location information of the navigated object and the location information of the target navigation point; adjusting the camera depression angle for rendering the electronic map on the screen 30c based on the real-time distance from the navigated object to the target navigation point, so that the target navigation point is located in a predetermined position area of the screen 30c, where the camera depression angle becomes larger as the real-time distance from the navigated object to the target navigation point decreases; and then rendering the electronic map and the pre-planned navigation path on the screen 30c according to the target scale and the camera depression angle at which the target navigation point is located in the predetermined position area of the screen 30c.
Optionally, the target navigation point is a first navigation point or a second navigation point; the second navigation point is a position which is positioned in front of the first navigation point and is used for guiding the navigated object to turn, and the distance between the second navigation point and the first navigation point is less than or equal to the set distance threshold.
In some embodiments, the processor 30b is further configured to: render the electronic map and the navigation path on the screen according to a first scale and a first camera depression angle before responding to the steering event for the first navigation point, where the first scale is smaller than the target scale; and adjust the distance between the target navigation point and the icon of the navigated object on the screen 30c based on the real-time distance from the navigated object to the target navigation point, where the distance between the target navigation point and the icon of the navigated object on the screen 30c decreases as the real-time distance decreases.
Optionally, when adjusting the distance between the target navigation point and the icon of the navigated object on the screen, the processor 30b is specifically configured to: keep the position of the icon of the navigated object on the screen unchanged, and adjust the position of the target navigation point on the screen 30c based on the real-time distance from the navigated object to the target navigation point, so as to adjust the distance between the target navigation point and the icon of the navigated object on the screen 30c.
In other embodiments, when adjusting the camera depression angle for rendering the electronic map on the screen 30c, the processor 30b is specifically configured to: increase the camera depression angle step by step according to a set angle change gradient during the process that the real-time distance between the navigated object and the target navigation point decreases; after each increase of the camera depression angle by the angle change gradient, judge whether the target navigation point is located in the predetermined position area of the screen 30c; and if so, take the adjusted camera depression angle as the camera depression angle for the currently rendered electronic map.
In still other embodiments, the processor 30b is further configured to: stop increasing the camera depression angle when the camera depression angle is adjusted to the set depression angle upper limit value, and render the electronic map and the navigation path on the screen 30c according to the target scale and the depression angle upper limit value.
Optionally, the processor 30b is further configured to: stop increasing the camera depression angle when the real-time distance between the navigated object and the target navigation point is less than or equal to the set distance threshold, and render the electronic map and the navigation path on the screen 30c according to the camera depression angle at which the increase was stopped and the target scale.
In some other embodiments, the processor 30b is further configured to: before adjusting the camera depression angle for rendering the electronic map on the screen 30c based on the real-time distance from the navigated object to the first navigation point, determine whether the target navigation point would be located on the screen if the electronic map were rendered on the screen 30c according to the target scale and the first camera depression angle; if the judgment result is negative and the first camera depression angle is larger than the depression angle lower limit value, reduce the camera depression angle until the target navigation point is located on the screen 30c; and if the judgment result is negative and the first camera depression angle is equal to the depression angle lower limit value, adjust the distance between the target navigation point and the icon of the navigated object on the screen 30c according to the real-time distance between the navigated object and the target navigation point until the target navigation point is located in the predetermined position area of the screen 30c.
Optionally, the processor 30b is further configured to: if the camera depression angle is reduced to the depression angle lower limit value and the target navigation point still does not appear on the screen 30c, rendering an electronic map and a navigation path on the screen 30c according to the target scale and the depression angle lower limit value; and adjusting the distance between the target navigation point and the icon of the navigated object on the screen 30c according to the real-time distance between the navigated object and the target navigation point until the target navigation point is located in the predetermined position area of the screen 30 c.
In some embodiments, the electronic device further comprises a communication component 30d. The processor 30b is further configured to receive, through the communication component 30d, the navigation guidance prompt issued by the server device. Accordingly, the processor 30b is further configured to: monitor whether the navigation guidance prompt for the first navigation point received by the communication component 30d is a steering guidance prompt, and determine that a guided steering event for the first navigation point has occurred when the navigation guidance prompt for the first navigation point is monitored to be a steering guidance prompt. The navigation guidance prompt is issued by the server device when the server device determines, according to the real-time position information of the navigated object and the pre-planned navigation path, the navigation guidance action of the navigated object at the first navigation point.
In other embodiments, the processor 30b is further configured to: determine the navigation guidance action of the navigated object at the first navigation point according to the real-time position information of the navigated object and the pre-planned navigation path, and determine that a guided steering event for the first navigation point has occurred when the navigation guidance action of the navigated object at the first navigation point is a steering action.
Optionally, the processor 30b is further configured to: acquiring real-time position information of a navigated object; and determining the position information of the target navigation point according to the real-time position information and the navigation path of the navigated object.
Optionally, when determining the position information of the target navigation point, the processor 30b is specifically configured to: determine the position information of the first navigation point according to the real-time position information of the navigated object and the navigation path; judge, according to the position information of the first navigation point and the navigation path, whether a second navigation point at which the navigated object needs to steer exists within the preset distance range of the first navigation point; if the judgment result is negative, take the first navigation point as the target navigation point and the position information of the first navigation point as the position information of the target navigation point; and correspondingly, if the judgment result is positive, take the second navigation point as the target navigation point and the position information of the second navigation point as the position information of the target navigation point.
In the embodiment of the present application, if the target navigation point is the first navigation point, the processor 30b is further configured to: after the navigated object passes through the first navigation point, if a road-level navigation prompt is received, adjusting the scale of the rendered electronic map to a first scale; adjusting the camera depression angle to a first camera depression angle; and rendering the electronic map and the navigation path on the screen according to the first scale and the first camera depression angle.
If the target navigation point is the second navigation point, the processor 30b is further configured to: adjusting a camera rotation angle at which the electronic map is rendered on the screen so that the second navigation point is located on the screen before adjusting a camera depression angle at which the electronic map is rendered on the screen based on a real-time distance of the navigated object to the target navigation point.
Further, when adjusting the rotation angle of the camera for rendering the electronic map, the processor 30b is specifically configured to: determining the relative position relation between the navigated object and the second navigation point when the navigated object moves to the second navigation point according to the navigation path and the moving direction of the navigated object; and adjusting the rotation angle of the camera according to the relative position relation.
Optionally, when adjusting the camera rotation angle according to the relative position relationship, the processor 30b is specifically configured to: if the second navigation point will be located on the right side of the navigated object when the navigated object moves toward the second navigation point, rotate the camera clockwise around the z-axis of the camera coordinate system by a set angle; and if the second navigation point will be located on the left side of the navigated object when the navigated object moves toward the second navigation point, rotate the camera counterclockwise around the z-axis of the camera coordinate system by a set angle.
Accordingly, when the processor 30b renders the electronic map and the pre-planned navigation path on the screen 30c, it is specifically configured to: and rendering the electronic map and the pre-planned navigation path on the screen 30c according to the target scale, the camera depression angle when the target navigation point is located in the preset position area of the screen and the adjusted camera rotation angle.
Further, when rendering the electronic map and the pre-planned navigation path on the screen, the processor 30b is further configured to: restore the camera rotation angle after the navigated object passes through the first navigation point, and render the electronic map and the navigation path on the screen 30c according to the target scale, the restored camera rotation angle, and the camera depression angle at which the second navigation point is located in the predetermined position area of the screen 30c.
In some optional embodiments, as shown in fig. 3, the electronic device may further include: a power supply component 30e and an audio component 30f. Only some of the components are schematically shown in fig. 3, which does not mean that the electronic device must include all of the components shown in fig. 3, nor that the electronic device can only include the components shown in fig. 3.
In embodiments of the present application, the memory is used to store computer programs and may be configured to store various other data to support operations on the device on which the memory is located. Wherein the processor may execute a computer program stored in the memory to implement the corresponding control logic. The memory may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
In the embodiments of the present application, the processor may be any hardware processing device that can execute the above described method logic. Alternatively, the processor may be a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or a Micro Controller Unit (MCU); programmable devices such as Field-Programmable Gate Arrays (FPGAs), Programmable Array Logic devices (PALs), General Array Logic devices (GALs), or Complex Programmable Logic Devices (CPLDs) may also be used; or an Advanced RISC Machine (ARM) processor, a System on Chip (SoC), or the like, but is not limited thereto.
In embodiments of the present application, the communication component is configured to facilitate wired or wireless communication between the device in which it is located and other devices. The device in which the communication component is located can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, 4G, 5G or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component may also be implemented based on Near Field Communication (NFC) technology, Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, or other technologies.
In the embodiment of the present application, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
In embodiments of the present application, a power supply component is configured to provide power to various components of the device in which it is located. The power components may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power component is located.
In embodiments of the present application, the audio component may be configured to output and/or input audio signals. For example, the audio component includes a Microphone (MIC) configured to receive an external audio signal when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in a memory or transmitted via a communication component. In some embodiments, the audio assembly further comprises a speaker for outputting audio signals. For example, for an electronic device with language interaction functionality, voice interaction with a user may be enabled through an audio component, and so forth.
The electronic device provided by this embodiment is mounted on a navigated object, and an electronic map is displayed on the screen of the electronic device. When a steering event occurs for the first navigation point, the rendering scale of the electronic map can be adjusted to the set target scale, so that an electronic map with higher geometric accuracy is displayed to the user; and the camera depression angle for rendering the electronic map is adjusted based on the real-time distance from the navigated object to the target navigation point, so that the target navigation point is located in the predetermined position area of the screen, thereby resolving the contradiction between the adjustment of the rendering scale and the navigation rendering requirement that the target navigation point on the electronic map be located in the predetermined position area of the screen. In addition, because the camera depression angle is adjusted according to the real-time distance between the navigated object and the target navigation point, on one hand the target navigation point can be ensured to be located in the predetermined position area of the screen, and on the other hand the navigation effect of the dynamic driving of the navigated object can be simulated, so that the user can perceive the dynamic effect of the navigated object moving toward the target navigation point, which improves the intuitiveness of navigation, improves the navigation effect, and reduces the probability that the user misses the turning position.
It should be noted that, the descriptions of "first", "second", etc. in this document are used for distinguishing different messages, devices, modules, etc., and do not represent a sequential order, nor limit the types of "first" and "second" to be different.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (22)

1. A navigation method, comprising:
responding to a steering event aiming at the first navigation point, and adjusting the rendering scale of the electronic map to a set target scale; the first navigation point is a position for guiding the navigated object to turn;
determining a real-time distance from the navigated object to a target navigation point based on the real-time location information of the navigated object and the location information of the target navigation point;
adjusting a camera depression angle for rendering an electronic map on a screen based on a real-time distance of the navigated object to the target navigation point so that the target navigation point is located in a predetermined location area of the screen; wherein the camera depression angle becomes larger as the real-time distance of the navigated object to the target navigation point decreases;
rendering an electronic map and a pre-planned navigation path on the screen according to a target scale and a camera depression angle when a target navigation point is located in a preset position area of the screen;
the target navigation point is a first navigation point or a second navigation point; the second navigation point is a position which is positioned in front of the first navigation point and used for guiding the navigated object to turn, and the distance between the second navigation point and the first navigation point is smaller than or equal to a set distance threshold.
2. The method of claim 1, wherein prior to responding to a steering event for a first navigation point, the method further comprises:
rendering the electronic map and the navigation path on the screen according to a first scale and a first camera depression; wherein the first scale is smaller than the target scale;
adjusting the distance between the target navigation point and the icon of the navigated object on the screen based on the real-time distance between the navigated object and the target navigation point;
wherein the distance of the target navigation point and the icon of the navigated object on the screen decreases as the real-time distance decreases.
3. The method of claim 2, wherein said adjusting a distance on the screen of the icon of the navigated object from a target navigation point based on a real-time distance of the navigated object to the target navigation point comprises:
keeping the position of the icon of the navigated object on the screen unchanged, and adjusting the position of the target navigation point on the screen based on the real-time distance from the navigated object to the target navigation point to adjust the distance from the first navigation point to the icon of the navigated object on the screen.
4. The method of claim 1, wherein said adjusting a camera depression angle for rendering an electronic map on a screen based on a real-time distance of the navigated object to the target navigation point comprises:
sequentially increasing the depression angle of the camera according to a set angle change gradient in the process of reducing the real-time distance between the navigated object and the target navigation point;
after increasing the camera depression angle by the angle change gradient each time, judging whether the target navigation point is located in a preset position area of the screen;
and if so, taking the camera depression angle after the adjustment as the camera depression angle of the currently rendered electronic map.
5. The method of claim 1, further comprising:
stopping increasing the camera depression when the camera depression is adjusted to a set depression upper limit value;
and rendering the electronic map and the navigation path on the screen according to the target scale and the depression upper limit value.
6. The method of claim 1, further comprising:
stopping increasing the camera depression if the real-time distance between the navigated object and the target navigation point is less than or equal to a set distance threshold;
and rendering the electronic map and the navigation path on the screen according to the camera depression angle when the increase is stopped and the target scale.
7. The method of claim 2, wherein prior to adjusting the camera depression angle at which the electronic map is rendered on the screen based on the real-time distance from the navigated object to the target navigation point, the method further comprises:
determining whether the target navigation point is located on the screen in the case where the electronic map is rendered on the screen according to the target scale and the first camera depression angle;
and if not, and the first camera depression angle is greater than a depression angle lower limit, reducing the camera depression angle until the target navigation point is located on the screen.
8. The method of claim 7, further comprising:
if the camera depression angle is reduced to the depression angle lower limit and the target navigation point still does not appear on the screen, rendering the electronic map and the navigation path on the screen according to the target scale and the depression angle lower limit;
and adjusting the distance between the target navigation point and the icon of the navigated object on the screen according to the real-time distance between the navigated object and the target navigation point, until the target navigation point is located in the preset position area of the screen.
9. The method of claim 7, further comprising:
if not, and the first camera depression angle is equal to the depression angle lower limit, adjusting the distance between the target navigation point and the icon of the navigated object on the screen according to the real-time distance between the navigated object and the target navigation point, until the target navigation point is located in the preset position area of the screen.
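Claims 7-9 describe a pre-check before the per-frame adjustment: if the target navigation point is off screen, the camera is flattened (a smaller depression angle shows terrain further ahead) down to a lower limit. A sketch under assumed limits:

```python
ANGLE_LOWER_LIMIT = 15.0   # assumed depression angle lower limit, degrees
ANGLE_STEP = 2.0           # assumed decrement per attempt

def ensure_point_on_screen(first_angle: float, point_on_screen) -> float:
    """Return the depression angle from which the adjustment should start.

    `point_on_screen` is a callable taking a candidate angle and returning
    True when the target navigation point is visible at that angle.
    """
    angle = first_angle
    while not point_on_screen(angle) and angle > ANGLE_LOWER_LIMIT:
        angle = max(angle - ANGLE_STEP, ANGLE_LOWER_LIMIT)
    # If the point is still off screen at the lower limit (claims 8-9), the
    # map is rendered at the lower limit and the point is brought into the
    # preset area by shrinking its on-screen distance to the icon instead.
    return angle
```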
10. The method according to any one of claims 1-9, further comprising:
monitoring whether a received navigation guidance prompt for the first navigation point is a turn guidance prompt; and determining that a steering event for the first navigation point occurs in a case where the navigation guidance prompt for the first navigation point is monitored to be a turn guidance prompt; wherein the navigation guidance prompt is issued by the server device after the server device determines, according to the real-time position information of the navigated object and a pre-planned navigation path, the navigation guidance action of the navigated object at the first navigation point;
or determining the navigation guidance action of the navigated object at the first navigation point according to the real-time position information of the navigated object and a pre-planned navigation path; and determining that a steering event for the first navigation point occurs in a case where the navigation guidance action of the navigated object at the first navigation point is a turn action.
11. The method of claim 10, further comprising:
acquiring the real-time position information of the navigated object;
and determining the position information of the target navigation point according to the real-time position information of the navigated object and the navigation path.
12. The method of claim 11, wherein the determining the position information of the target navigation point according to the real-time position information of the navigated object and the navigation path comprises:
determining the position information of the first navigation point according to the real-time position information of the navigated object and the navigation path;
determining, according to the position information of the first navigation point and the navigation path, whether a second navigation point at which the navigated object is to turn exists within a preset distance range of the first navigation point;
and if not, taking the first navigation point as the target navigation point, and taking the position information of the first navigation point as the position information of the target navigation point.
13. The method of claim 12, further comprising:
if so, taking the second navigation point as the target navigation point, and taking the position information of the second navigation point as the position information of the target navigation point.
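The selection logic of claims 12-13 in sketch form: a second turn close behind the first one takes over as the target, so the camera already frames the follow-up turn. The data model and range value are assumptions:

```python
from dataclasses import dataclass
from typing import Optional

PRESET_RANGE = 100.0   # assumed preset distance range between the two turns, metres

@dataclass
class NavigationPoint:
    position: tuple           # (longitude, latitude), for illustration only
    distance_along_path: float

def select_target(first: NavigationPoint,
                  second: Optional[NavigationPoint]) -> NavigationPoint:
    """Pick the navigation point the camera should keep in the preset area."""
    if (second is not None
            and second.distance_along_path - first.distance_along_path <= PRESET_RANGE):
        return second   # claim 13: the second point becomes the target
    return first        # claim 12: otherwise the first point is the target
```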
14. The method of claim 2, wherein if the target navigation point is the first navigation point, the method further comprises:
after the navigated object passes through the first navigation point, if a road-level navigation prompt is received, adjusting the scale of the rendered electronic map to the first scale, and adjusting the camera depression angle to the first camera depression angle;
rendering the electronic map and the navigation path on the screen according to the first scale and the first camera depression angle.
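Claim 14's fallback can be read as a simple reset once road-level guidance resumes; the pre-turn values below are assumptions:

```python
from dataclasses import dataclass

FIRST_SCALE = 15               # assumed pre-turn zoom level
FIRST_DEPRESSION_ANGLE = 30.0  # assumed first camera depression angle, degrees

@dataclass
class CameraState:
    scale: int
    depression_angle: float

def on_road_level_prompt(camera: CameraState) -> CameraState:
    """Restore the pre-steering scale and camera depression angle."""
    camera.scale = FIRST_SCALE
    camera.depression_angle = FIRST_DEPRESSION_ANGLE
    return camera
```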
15. The method of any of claims 1-9, wherein if the target navigation point is the second navigation point, prior to adjusting the camera depression angle at which the electronic map is rendered on the screen based on the real-time distance from the navigated object to the target navigation point, the method further comprises:
adjusting a camera rotation angle at which an electronic map is rendered on a screen so that the second navigation point is located on the screen.
16. The method of claim 15, wherein the adjusting the camera rotation angle at which the electronic map is rendered comprises:
determining, according to the navigation path and the moving direction of the navigated object, the relative positional relationship between the navigated object and the second navigation point when the navigated object moves to the second navigation point;
and adjusting the camera rotation angle according to the relative positional relationship.
17. The method of claim 16, wherein the adjusting the camera rotation angle according to the relative positional relationship comprises:
if the second navigation point is on the right side of the navigated object when the navigated object moves to the second navigation point, rotating the camera clockwise about the z-axis of the camera coordinate system by a set angle;
and if the second navigation point is on the left side of the navigated object when the navigated object moves to the second navigation point, rotating the camera counterclockwise about the z-axis of the camera coordinate system by a set angle.
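Claim 17's side-dependent rotation, sketched with an assumed sign convention and fixed rotation step:

```python
SET_ROTATION = 20.0   # assumed set rotation angle, degrees

def rotation_about_z(side: str) -> float:
    """Signed rotation about the z-axis of the camera coordinate system.

    Convention (an assumption): negative = clockwise, positive =
    counterclockwise, so the camera turns toward the second navigation point.
    """
    if side == "right":
        return -SET_ROTATION
    if side == "left":
        return +SET_ROTATION
    return 0.0
```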
18. The method of claim 15, wherein the rendering an electronic map and a pre-planned navigation path on the screen according to a target scale and a camera depression angle at which a target navigation point is located in a preset position area of the screen comprises:
and rendering the electronic map and the pre-planned navigation path on the screen according to the target scale, the camera depression angle at which the target navigation point is located in the preset position area of the screen, and the adjusted camera rotation angle.
19. The method of claim 15, wherein the rendering an electronic map and a pre-planned navigation path on the screen according to a target scale and a camera depression angle at which a target navigation point is located in a preset position area of the screen comprises:
after the navigated object passes through the first navigation point, restoring the camera rotation angle to its value before the adjustment;
and rendering the electronic map and the navigation path on the screen according to the target scale, the restored camera rotation angle, and the camera depression angle at which the second navigation point is located in the preset position area of the screen.
20. A navigation system, comprising: a navigation terminal and a server device; wherein the navigation terminal is carried on a navigated object;
the server device is configured to: determine a navigation guidance action of the navigated object at a first navigation point according to real-time position information of the navigated object and a pre-planned navigation path; and issue a turn guidance prompt for the first navigation point to the navigation terminal in a case where the navigation guidance action of the navigated object at the first navigation point is a turn action;
the navigation terminal is configured to: adjust the electronic map rendering scale to a set target scale in response to the turn guidance prompt; determine a real-time distance from the navigated object to a target navigation point based on the real-time position information of the navigated object and position information of the target navigation point; adjust the camera depression angle at which the electronic map is rendered on a screen based on the real-time distance from the navigated object to the target navigation point, so that the target navigation point is located in a preset position area of the screen, wherein the camera depression angle increases as the real-time distance from the navigated object to the target navigation point decreases; and render the electronic map and the pre-planned navigation path on the screen according to the target scale and the camera depression angle at which the target navigation point is located in the preset position area of the screen; wherein the target navigation point is the first navigation point or a second navigation point, the second navigation point being a position that is ahead of the first navigation point and at which the navigated object is guided to turn, and the distance between the second navigation point and the first navigation point being less than or equal to a set distance threshold.
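Finally, the division of labour in the system claim: the server classifies the guidance action and pushes a prompt, and the terminal starts the claim 1 flow on a turn prompt. The message shape below is an assumption:

```python
def server_decide_prompt(guidance_action: str) -> dict:
    """Server device: issue a turn guidance prompt when the action
    determined from the real-time position and the planned path is a turn."""
    if guidance_action == "turn":
        return {"type": "turn_guidance", "point": "first_navigation_point"}
    return {"type": "ordinary_guidance"}

def terminal_should_start_turn_flow(prompt: dict) -> bool:
    """Navigation terminal: True when the target-scale / depression-angle
    adjustment of claim 1 should begin."""
    return prompt.get("type") == "turn_guidance"
```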
21. An electronic device, comprising: a memory, a processor, and a screen; the memory for storing a computer program;
the processor, coupled to the memory, is configured to execute the computer program to perform the steps of the method of any one of claims 1-19.
22. A computer-readable storage medium storing computer instructions, which, when executed by one or more processors, cause the one or more processors to perform the steps of the method of any one of claims 1-19.
CN202010763648.XA 2020-07-31 2020-07-31 Navigation method, device, system and storage medium Pending CN114061600A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010763648.XA CN114061600A (en) 2020-07-31 2020-07-31 Navigation method, device, system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010763648.XA CN114061600A (en) 2020-07-31 2020-07-31 Navigation method, device, system and storage medium

Publications (1)

Publication Number Publication Date
CN114061600A (en) 2022-02-18

Family

ID=80231331

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010763648.XA Pending CN114061600A (en) 2020-07-31 2020-07-31 Navigation method, device, system and storage medium

Country Status (1)

Country Link
CN (1) CN114061600A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114863672A (en) * 2022-03-22 2022-08-05 Alibaba (China) Co., Ltd. Dynamic traffic display method and device
CN114863672B (en) * 2022-03-22 2023-09-15 Alibaba (China) Co., Ltd. Dynamic traffic display method and device

Similar Documents

Publication Publication Date Title
US20210208597A1 (en) Sensor aggregation framework for autonomous driving vehicles
US10816984B2 (en) Automatic data labelling for autonomous driving vehicles
US11367354B2 (en) Traffic prediction based on map images for autonomous driving
US20200124423A1 (en) Labeling scheme for labeling and generating high-definition map based on trajectories driven by vehicles
US20200043326A1 (en) Use sub-system of autonomous driving vehicles (adv) for police car patrol
US20230048230A1 (en) Method for displaying lane information and apparatus for executing the method
CN111311902B (en) Data processing method, device, equipment and machine readable medium
US10860868B2 (en) Lane post-processing in an autonomous driving vehicle
KR102123844B1 (en) Electronic apparatus, control method of electronic apparatus and computer readable recording medium
US11591019B2 (en) Control dominated three-point turn planning for autonomous driving vehicles
CN113792589B (en) Overhead identification method and device
US20180156627A1 (en) Display control device, display device, and display control method
US11180145B2 (en) Predetermined calibration table-based vehicle control system for operating an autonomous driving vehicle
CN114061600A (en) Navigation method, device, system and storage medium
US11247700B2 (en) Enumeration-based three-point turn planning for autonomous driving vehicles
CN117842083A (en) Vehicle wading travel processing method, device, computer equipment and storage medium
CN113739800A (en) Navigation guiding method and computer program product
KR20200070100A (en) A method for detecting vehicle and device for executing the method
US10916077B2 (en) User privacy protection on autonomous driving vehicles
CN114061598A (en) Navigation method, device, system and storage medium
CN113108801A (en) Green wave road navigation method and device
CN109115233B (en) Method, device, system and computer readable medium for non-destination navigation
US20220254140A1 (en) Method and System for Identifying Object
CN116704472B (en) Image processing method, device, apparatus, medium, and program product
CN104639820A (en) Shooting method and device and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination