CN112146656A - Indoor navigation visualization method based on augmented reality - Google Patents

Indoor navigation visualization method based on augmented reality Download PDF

Info

Publication number
CN112146656A
CN112146656A (application CN202010913376.7A)
Authority
CN
China
Prior art keywords
poi
information
mobile terminal
navigation
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010913376.7A
Other languages
Chinese (zh)
Other versions
CN112146656B (en)
Inventor
朱欣焰
李洁玮
呙维
刘武平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University WHU filed Critical Wuhan University WHU
Priority to CN202010913376.7A priority Critical patent/CN112146656B/en
Publication of CN112146656A publication Critical patent/CN112146656A/en
Application granted granted Critical
Publication of CN112146656B publication Critical patent/CN112146656B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • G01C21/206Instruments for performing navigational calculations specially adapted for indoor navigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

The invention belongs to the technical field of visualization, and discloses an indoor navigation visualization method based on augmented reality, which comprises the steps of: acquiring indoor POI information located within a first range of a target point according to navigation target information; screening the indoor POI information according to sensor information of a mobile terminal and position information of the mobile terminal to obtain first POI information; acquiring basic equipment information of the mobile terminal, acquiring sensor information of the mobile terminal in real time, and updating POI screen coordinate information in real time in combination with the basic equipment information of the mobile terminal; acquiring navigation road information according to the position information and the navigation target information of the mobile terminal, and calculating the rotation angle of a guide arrow in real time; and rendering the guide arrow and the POI screen coordinate information in real time. According to the invention, indoor navigation data are visualized and virtual navigation information is superimposed on the real scene video stream, so that a virtual-real combined indoor navigation effect is realized and the navigation information is clearer and more intuitive.

Description

Indoor navigation visualization method based on augmented reality
Technical Field
The invention relates to the technical field of visualization, in particular to an indoor navigation visualization method based on augmented reality.
Background
With the application and development of user-location-based technologies, location services have gradually extended from the macro level to the micro level. The Global Positioning System (GPS) has matured for outdoor positioning and navigation and basically meets people's outdoor travel needs. However, owing to occlusion by buildings and multipath effects, GPS signals undergo reflection, refraction, and scattering in indoor environments, and cannot provide accurate indoor position information. Meanwhile, indoor environments are becoming more and more complex as society develops, and roughly eighty percent of human activity takes place indoors.
Most conventional navigation systems are designed for outdoor use, and some include an indoor navigation module. However, these indoor modules merely superimpose a two-dimensional plane on the map view; functionally they cannot provide accurate indoor positioning and navigation services, so their help in wayfinding is very limited. In complex indoor environments, such as the interiors of large buildings (shopping centers, convention and exhibition centers, libraries, warehouses, underground parking lots, and the like), similar-looking structures repeat, the information provided by a vectorized navigation map is not clear enough, and users must still orient themselves and find paths on their own, which is very difficult for users with a poor sense of direction. Therefore, how to express indoor navigation information as intuitively and accurately as possible, so as to realize indoor navigation visualization, is an urgent problem to be solved.
Disclosure of Invention
In order to solve the problems in the prior art, the invention provides an indoor navigation visualization method based on augmented reality.
The invention provides an indoor navigation visualization method based on augmented reality, which comprises the following steps:
step 1, acquiring indoor POI information positioned in a first range of a target point according to navigation target information;
step 2, screening the indoor POI information according to sensor information of a mobile terminal and position information of the mobile terminal to obtain first POI information, wherein the first POI information is POI information within a visual angle range of the mobile terminal;
step 3, acquiring basic equipment information of the mobile terminal, acquiring sensor information of the mobile terminal in real time, and updating the POI screen coordinate information in real time by combining the basic equipment information of the mobile terminal; the POI screen coordinate information is screen mapping coordinate information obtained by overlaying the first POI information to a real video stream scene acquired by the mobile terminal;
step 4, obtaining navigation road information according to the position information of the mobile terminal and the navigation target information, and calculating a rotation angle of the guiding arrow in real time;
and 5, rendering the guide arrow and the POI screen coordinate information in real time.
Preferably, the step 2 comprises the following substeps:
step 2.1, storing the current position information of the mobile terminal as a first Object, wherein the first Object comprises an x coordinate value and a y coordinate value; respectively storing each POI in the indoor POI information as a second Object, wherein the second Object comprises an x coordinate value, a y coordinate value, a distance to a current coordinate of the mobile terminal and a direction angle to a current position of the mobile terminal of the POI;
step 2.2, acquiring a gyroscope angle θ of the mobile terminal and a camera field angle α of the mobile terminal, and, for each POI, if preset conditions are met, saving data of the POI to a visual array and forming first POI information; the preset conditions are as follows:
|θi − θ| ≤ α/2
di ≤ d0
wherein θi is the direction angle from the ith POI in the indoor POI information to the current coordinate of the mobile terminal, di is the distance from the ith POI to the current position of the mobile terminal, and d0 is a distance threshold.
Preferably, the step 3 comprises the following substeps:
step 3.1, obtaining basic equipment information of the mobile terminal, wherein the basic equipment information comprises a screen pixel width W, a screen pixel height H, and an initial gyroscope angle ω of the mobile terminal;
step 3.2, calculating initial screen coordinates of each POI in the first POI information according to the sensor information obtained in real time and by combining the basic information of the equipment, and adopting the following formula:
xi = W/2 + ((θi − ω)/α)·W
yi = H/2
wherein xi is the initial x coordinate of the ith POI in the first POI information, and yi is the initial y coordinate of the ith POI in the first POI information;
and 3.3, when the angle of the gyroscope of the mobile terminal does not exceed the field angle alpha of the camera of the mobile terminal, updating the screen coordinate of the POI according to the current angle of the gyroscope of the mobile terminal, wherein the updating adopts the following formula:
xi' = W/2 + ((θi − θ)/α)·W
wherein xi' is the updated x coordinate of the ith POI, θ is the gyroscope angle of the mobile terminal, and θi is the direction angle from the ith POI in the indoor POI information to the current coordinate of the mobile terminal;
and when the angle of the gyroscope of the mobile terminal exceeds the camera view angle alpha of the mobile terminal, or when the position movement of the mobile terminal exceeds a preset movement threshold, returning to the step 3.2.
Preferably, in the step 4, after the navigation road information is obtained, whether the mobile terminal moves on the navigation road is judged;
if so, calculating the rotation angle of the guide arrow according to the current road direction and the gyroscope angle of the mobile terminal; if not, calculating the rotation angle of the guide arrow according to the gyroscope angle of the mobile terminal and the direction angle of the perpendicular segment formed by the current position of the mobile terminal and the nearest road.
Preferably, said step 5 comprises the following sub-steps:
step 5.1, calculating the distance from each POI in the POI screen coordinate information to the current position of the mobile terminal, and obtaining the distance difference value R between the farthest POI and the nearest POI in the POI screen coordinate information according to the maximum distance and the minimum distance, wherein the following formula is adopted:
R = max(di) − min(di)
wherein di represents the distance from the ith POI to the current position of the mobile terminal, and R represents the difference between the distances of the farthest POI and the nearest POI in the POI screen coordinate information from the current position of the mobile terminal;
step 5.2, calculating the y coordinate of each POI in the POI screen coordinate information, and adopting the following formula:
yi = h + ((di − dmin)/R)·H
wherein h represents the height of the lower boundary of the POI display area on the interface, H represents the pixel height of the POI display area, and dmin is the minimum of the POI distances;
step 5.3, performing real-time visual rendering on the POI screen coordinate information according to the y coordinate of each POI in the POI screen coordinate information obtained by calculation in the step 5.2 and the updated initial x coordinate of the ith POI obtained by calculation in the step 3.3, and optimizing the label covering problem;
step 5.4, calculating the distance from the current position of the mobile terminal to each road inflection point in the navigation road information, finding the road r1 closest to the current position of the mobile terminal, and calculating the direction angle α3 from the current position of the mobile terminal to the road r1;
step 5.5, if the direction angle α3 is 0, judging that the mobile terminal moves on the navigation road, wherein the guide arrow points in the correct road direction using the following formula:
α = α1 − α2
wherein α is the rotation angle of the guide arrow, α1 is the correct road direction, and α2 is the current direction of the mobile terminal;
step 5.6, if the direction angle α3 is not 0, judging that the mobile terminal deviates from the navigation road, wherein the guide arrow points in the correct road direction using the following formula:
α = α3 − α2
wherein α is the rotation angle of the guide arrow, and the guide arrow points in the direction of the perpendicular segment formed by the current position of the mobile terminal and the nearest road.
One or more technical solutions provided in the embodiments of the present application have at least the following technical effects or advantages:
according to the indoor navigation visualization method based on augmented reality, indoor POI information located in a first range of a target point is obtained according to navigation target information; screening indoor POI information according to sensor information of a mobile terminal and position information of the mobile terminal to obtain first POI information, wherein the first POI information is POI information within a visual angle range of the mobile terminal; then, acquiring basic equipment information of the mobile terminal, acquiring sensor information of the mobile terminal in real time, and updating the POI screen coordinate information in real time by combining the basic equipment information of the mobile terminal; the method comprises the steps that POI screen coordinate information and screen mapping coordinate information obtained by superimposing first POI information on a real video stream scene obtained by a mobile terminal are obtained; then, acquiring navigation road information according to the position information and the navigation target information of the mobile terminal, and calculating the rotation angle of the obtained guide arrow in real time; and finally, rendering the guide arrow and the POI screen coordinate information in real time. The method obtains video stream data of a camera of a mobile terminal (such as a mobile phone), obtains POI data, obtains mobile phone sensor and position information, screens the POI data, calculates the mapping coordinates of a POI screen, obtains navigation data according to a starting point and an end point, calculates the rotation angle of a guide arrow according to an indoor positioning track, and renders the guide arrow and a POI label in real time. 
Real coordinates of the POI labels are converted into mobile phone screen coordinates through a screen coordinate conversion method, real-time rendering of the POI labels is achieved, indoor navigation data are visualized, virtual navigation information is overlaid on real scene video streams, and a virtual-real combined indoor navigation effect is achieved. The method and the device are suitable for indoor navigation by utilizing the augmented reality technology, can make navigation information more clear and intuitive, are convenient for users to know, and provide a better navigation experience for the users.
Drawings
Fig. 1 is a schematic frame diagram of an indoor navigation visualization method based on augmented reality according to the present invention;
fig. 2 is a detailed flowchart of an indoor navigation visualization method based on augmented reality according to the present invention.
Detailed Description
In order to better understand the technical solution, the technical solution will be described in detail with reference to the drawings and the specific embodiments.
The main idea of the technical scheme of the invention is as follows: departing from the two-dimensional vector visual expression of the traditional map, and in order to help users find indoor roads clearly and definitely, indoor navigation is visualized with an augmented-reality-based method. A real indoor scene is constructed by acquiring real video stream data, and POI data and indoor navigation data are superimposed on it, realizing a virtual-real combined visual effect. Indoor navigation information is thereby expressed more directly and accurately, the visual effect of real-time navigation is enriched, and a more humanized and more accurate navigation experience is brought to the user.
The embodiment provides an augmented reality-based indoor navigation visualization method, which is shown in fig. 1 and 2 and comprises the following steps:
step 1, indoor POI information located in a first range of a target point is obtained according to navigation target information.
Specifically, indoor POI data is acquired from a server side.
And 2, screening the indoor POI information according to the sensor information of the mobile terminal and the position information of the mobile terminal to obtain first POI information, wherein the first POI information is POI information within the visual angle range of the mobile terminal.
POI data screening is carried out according to the mobile terminal sensor data, and the POI data within the user's visual angle range are screened out.
Specifically, step 2 includes the following substeps:
step 2.1, storing the current position information of the mobile terminal as a first Object, wherein the first Object comprises an x coordinate value and a y coordinate value; and storing each POI point in the indoor POI information as a second Object, wherein the second Object comprises an x coordinate value, a y coordinate value, a distance to the current coordinate of the mobile terminal and a direction angle to the current position of the mobile terminal of the POI.
Step 2.2, acquiring the gyroscope angle θ of the mobile terminal and the camera field angle α of the mobile terminal (such as a mobile phone), and, for each POI, if the preset conditions are met, saving the data of the POI to a visual array and forming the first POI information; the preset conditions are as follows:
|θi − θ| ≤ α/2
di ≤ d0
That is, if the preset conditions are satisfied, the POI is considered to appear within the sight range, and the POI data is stored in the visual array.
wherein θi is the direction angle from the ith POI in the indoor POI information to the current coordinate of the mobile terminal, di is the distance from the ith POI to the current position of the mobile terminal, and d0 is a distance threshold.
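As an illustrative sketch (not the patent's own code), the step 2 screening can be implemented as follows; the POI record layout and function name are hypothetical, and the visibility test assumes the reconstructed conditions |θi − θ| ≤ α/2 and di ≤ d0:

```python
import math

def filter_visible_pois(pois, user_x, user_y, theta, alpha, d_max):
    """Screen indoor POIs down to those inside the camera's horizontal
    field of view and within the distance threshold (step 2 sketch).

    pois: list of dicts with map coordinates 'x' and 'y' (hypothetical schema)
    theta: current gyroscope heading of the mobile terminal, in degrees
    alpha: camera field angle in degrees; d_max: distance threshold d0
    """
    visible = []
    for poi in pois:
        dx, dy = poi["x"] - user_x, poi["y"] - user_y
        d_i = math.hypot(dx, dy)                            # distance to POI
        theta_i = math.degrees(math.atan2(dx, dy)) % 360.0  # bearing to POI
        # angular offset between bearing and heading, wrapped to [-180, 180)
        offset = (theta_i - theta + 180.0) % 360.0 - 180.0
        if abs(offset) <= alpha / 2.0 and d_i <= d_max:
            visible.append({**poi, "distance": d_i, "bearing": theta_i})
    return visible
```

The angle wrap keeps the comparison valid when the heading crosses 0°/360°, which a naive subtraction would break.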
Step 3, acquiring basic equipment information of the mobile terminal, acquiring sensor information of the mobile terminal in real time, and updating the POI screen coordinate information in real time by combining the basic equipment information of the mobile terminal; and the POI screen coordinate information is screen mapping coordinate information obtained by superposing the first POI information on a real video stream scene acquired by the mobile terminal.
And calculating screen mapping coordinates of the longitude and latitude coordinates of the POI by combining basic parameters of the mobile end hardware equipment and the indoor positioning track coordinates, and obtaining initial screen coordinates of each POI. And acquiring the data of the mobile terminal sensor and the indoor positioning track in real time, and updating the screen coordinates of the POI in real time.
Specifically, step 3 includes the following substeps:
and 3.1, obtaining basic information of the mobile terminal, wherein the basic information of the mobile terminal comprises the width W of a screen pixel, the height H of the screen pixel, and the initial gyroscope angle omega of the mobile terminal when the system is opened.
Step 3.2, calculating initial screen coordinates of each POI in the first POI information according to the sensor data obtained in real time and by combining the basic information of the equipment, wherein the initial screen coordinates are shown in the following formula:
xi = W/2 + ((θi − ω)/α)·W
yi = H/2
That is, the initial y coordinate of the POI is half the pixel height of the mobile phone screen.
wherein xi is the initial x coordinate of the ith POI in the first POI information, and yi is the initial y coordinate of the ith POI in the first POI information.
And 3.3, when the rotation of the gyroscope angle of the mobile terminal does not exceed the camera field angle of the mobile terminal, the POI slides left and right on the screen as the gyroscope angle changes. The screen coordinate of the POI is updated according to the current gyroscope angle of the mobile terminal (namely, the current gyroscope angle is obtained and the screen coordinate of the POI after sliding is calculated), wherein the updating adopts the following formula:
xi' = W/2 + ((θi − θ)/α)·W
wherein xi' is the updated x coordinate of the ith POI, θ is the current gyroscope angle of the mobile terminal, and θi is the direction angle from the ith POI in the indoor POI information to the current coordinate of the mobile terminal.
When the gyroscope angle of the mobile terminal exceeds the camera field angle of the mobile terminal, or the position of the mobile terminal moves beyond a certain threshold, the Euclidean distance and the direction angle from each POI to the user need to be recalculated, and the method returns to step 3.2.
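The coordinate mapping of steps 3.2 and 3.3 can be sketched as below; this assumes the reconstructed linear mapping xi = W/2 + ((θi − heading)/α)·W, which places a POI straight ahead at the screen centre and slides it by W/α pixels per degree of rotation (the function name is illustrative):

```python
def poi_screen_coords(theta_i, heading, W, H, alpha):
    """Map a POI bearing onto screen coordinates (steps 3.2/3.3 sketch).

    theta_i: direction angle from the POI to the mobile terminal, degrees
    heading: gyroscope angle used as reference (initial omega or current theta)
    W, H: screen pixel width and height; alpha: camera field angle, degrees
    """
    # wrap the angular offset to [-180, 180) so headings near 0/360 behave
    offset = (theta_i - heading + 180.0) % 360.0 - 180.0
    x = W / 2.0 + offset * (W / alpha)  # one degree of offset = W/alpha pixels
    y = H / 2.0                         # initial y is half the screen height
    return x, y
```

With this convention a POI at the edge of the field of view (offset = α/2) maps to the screen edge, matching the step 2 visibility screen.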
And 4, acquiring navigation road information according to the position information and the navigation target information of the mobile terminal, and calculating the rotation angle of the guiding arrow in real time.
Current navigation data is calculated according to the starting point and the end point of the navigation, and the road closest to the user position is found according to the indoor positioning coordinate; whether the user walks on the navigation road is judged; if so, the rotation angle of the guide arrow is calculated according to the current road direction and the gyroscope angle of the mobile terminal; if not, the rotation angle of the guide arrow is calculated according to the gyroscope angle of the mobile terminal and the direction angle of the perpendicular segment formed by the current position of the mobile terminal (namely the positioning point) and the nearest road.
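Under the conventions of steps 5.5 and 5.6 (α1 the correct road direction, α2 the current device direction, α3 the direction angle of the perpendicular back to the nearest road, taken as 0 when the user is on the road), the guide-arrow rotation of step 4 reduces to a small sketch like this (the helper name is hypothetical):

```python
def guide_arrow_rotation(alpha1, alpha2, alpha3):
    """Rotation angle for the guide arrow (step 4 sketch).

    alpha1: correct road direction; alpha2: current direction of the terminal;
    alpha3: direction angle of the perpendicular segment to the nearest road
            (0 when the mobile terminal is moving on the navigation road).
    """
    if alpha3 == 0:
        return alpha1 - alpha2  # on the road: point along the road heading
    return alpha3 - alpha2      # off the road: point back toward the road
```

This mirrors the formulas α = α1 − α2 and α = α3 − α2 used in the two cases below.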
And 5, rendering the guide arrow and the POI screen coordinate information in real time.
Specifically, step 5 includes the following substeps:
step 5.1, calculating the distance from each POI in the POI screen coordinate information to the current position of the mobile terminal, and obtaining the distance difference value R between the farthest POI and the nearest POI in the POI screen coordinate information according to the maximum distance and the minimum distance, wherein the following formula is adopted:
R = max(di) − min(di)
wherein di represents the distance from the ith POI to the current position of the mobile terminal, and R represents the difference between the distances of the farthest POI and the nearest POI in the POI screen coordinate information from the current position of the mobile terminal.
Step 5.2, calculating the y coordinate of each POI in the POI screen coordinate information, and adopting the following formula:
yi = h + ((di − dmin)/R)·H
where h represents the height of the lower boundary of the POI display area on the interface, H represents the pixel height of the POI display area, and dmin is the minimum of the POI distances.
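Assuming the reconstructed depth mapping yi = h + ((di − dmin)/R)·H, the label layout of steps 5.1 and 5.2 can be sketched as follows; the nearest label sits at the lower boundary h and the farthest one H pixels above it (function name illustrative):

```python
def poi_label_ys(distances, h, H):
    """Compute a screen y for each POI label from its distance (steps 5.1-5.2).

    distances: distance of each POI from the current position of the terminal
    h: height of the lower boundary of the POI display area on the interface
    H: pixel height of the POI display area
    """
    d_min, d_max = min(distances), max(distances)
    R = d_max - d_min  # spread between the farthest and nearest POI
    if R == 0:
        return [h for _ in distances]  # all equally distant: same row
    return [h + (d - d_min) / R * H for d in distances]
```

Spreading labels vertically by distance in this way also reduces overlap between labels of POIs that share a bearing, which is what step 5.3's occlusion optimization relies on.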
And 5.3, performing real-time visual rendering on the POI screen coordinate information according to the y coordinate of each POI in the POI screen coordinate information obtained by calculation in the step 5.2 and the updated initial x coordinate of the ith POI obtained by calculation in the step 3.3, and optimizing the label covering problem.
Step 5.4, calculating the distance from the current position of the mobile terminal (namely the current positioning point coordinate) to each road inflection point (namely each navigation trigger point) in the navigation road information, finding the road r1 closest to the current position of the mobile terminal, and calculating the direction angle α3 from the current position of the mobile terminal to the road r1.
Step 5.5, if the direction angle α3 is 0, it is judged that the mobile terminal moves on the navigation road (namely, the user walks on the navigation road); at this time, the guide arrow only needs to point in the correct road direction α1. Since the initial angle of the guide arrow coincides with the current direction α2 of the mobile terminal, the guide arrow only needs to be rotated by an angle α to point in the road heading direction. The guide arrow points in the correct road direction using the following formula:
α = α1 − α2
wherein α is the rotation angle of the guide arrow, α1 is the correct road direction, and α2 is the current direction of the mobile terminal.
Step 5.6, if the direction angle α3 is not 0, it is judged that the mobile terminal deviates from the navigation road (namely, the user has deviated from the correct road); the guide arrow should then point back to the correct road, that is, along the direction of the perpendicular segment formed by the current position of the mobile terminal (namely the positioning point) and the nearest road. The guide arrow points in the correct road direction using the following formula:
α = α3 − α2
wherein α is the rotation angle of the guide arrow.
In summary, the invention provides an indoor navigation visualization method based on augmented reality which, on the basis of a real video stream scene, superimposes Point of Interest (POI) information and real-time guide information around the positioning point, realizes the fusion of the real scene and virtual navigation information, resolves the confusion that two-dimensional plane visual expression brings to users, enables users to acquire navigation information more intuitively, accurately and conveniently, and provides users with a more intuitive, richer indoor navigation system with better human-computer interaction.
Finally, it should be noted that the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting, and although the present invention has been described in detail with reference to examples, it should be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention, which should be covered by the claims of the present invention.

Claims (5)

1. An indoor navigation visualization method based on augmented reality is characterized by comprising the following steps:
step 1, acquiring indoor POI information positioned in a first range of a target point according to navigation target information;
step 2, screening the indoor POI information according to sensor information of a mobile terminal and position information of the mobile terminal to obtain first POI information, wherein the first POI information is POI information within a visual angle range of the mobile terminal;
step 3, acquiring basic equipment information of the mobile terminal, acquiring sensor information of the mobile terminal in real time, and updating the POI screen coordinate information in real time by combining the basic equipment information of the mobile terminal; the POI screen coordinate information is screen mapping coordinate information obtained by overlaying the first POI information to a real video stream scene acquired by the mobile terminal;
step 4, obtaining navigation road information according to the position information of the mobile terminal and the navigation target information, and calculating a rotation angle of the guiding arrow in real time;
and 5, rendering the guide arrow and the POI screen coordinate information in real time.
2. The augmented reality based indoor navigation visualization method according to claim 1, wherein the step 2 comprises the following sub-steps:
step 2.1, storing the current position information of the mobile terminal as a first Object, wherein the first Object comprises an x coordinate value and a y coordinate value; respectively storing each POI in the indoor POI information as a second Object, wherein the second Object comprises an x coordinate value, a y coordinate value, a distance to a current coordinate of the mobile terminal and a direction angle to a current position of the mobile terminal of the POI;
step 2.2, acquiring a gyroscope angle θ of the mobile terminal and a camera field angle α of the mobile terminal, and, for each POI, if preset conditions are met, saving data of the POI to a visual array and forming first POI information; the preset conditions are as follows:
|θi − θ| ≤ α/2
di ≤ d0
wherein θi is the direction angle from the ith POI in the indoor POI information to the current coordinate of the mobile terminal, di is the distance from the ith POI to the current position of the mobile terminal, and d0 is a distance threshold.
3. The augmented reality based indoor navigation visualization method according to claim 2, wherein the step 3 comprises the following sub-steps:
step 3.1, obtaining basic equipment information of the mobile terminal, wherein the basic equipment information comprises a screen pixel width W, a screen pixel height H, and an initial gyroscope angle ω of the mobile terminal;
step 3.2, calculating initial screen coordinates of each POI in the first POI information according to the sensor information obtained in real time and by combining the basic information of the equipment, and adopting the following formula:
xi = W/2 + ((θi − ω)/α)·W
yi = H/2
wherein xi is the initial x coordinate of the ith POI in the first POI information, and yi is the initial y coordinate of the ith POI in the first POI information;
and 3.3, when the angle of the gyroscope of the mobile terminal does not exceed the field angle alpha of the camera of the mobile terminal, updating the screen coordinate of the POI according to the current angle of the gyroscope of the mobile terminal, wherein the updating adopts the following formula:
xi' = W/2 + ((θi − θ)/α)·W
wherein xi' is the updated x coordinate of the ith POI, θ is the gyroscope angle of the mobile terminal, and θi is the direction angle from the ith POI in the indoor POI information to the current coordinate of the mobile terminal;
and when the angle of the gyroscope of the mobile terminal exceeds the camera view angle alpha of the mobile terminal, or when the position movement of the mobile terminal exceeds a preset movement threshold, returning to the step 3.2.
4. The method for visualizing indoor navigation based on augmented reality of claim 1, wherein in step 4, after the navigation road information is obtained, it is determined whether the mobile terminal moves on the navigation road;
if so, calculating the rotation angle of the guide arrow according to the current road direction and the gyroscope angle of the mobile terminal; if not, calculating the rotation angle of the guide arrow according to the gyroscope angle of the mobile terminal and the direction angle of the perpendicular segment formed by the current position of the mobile terminal and the nearest road.
5. The augmented reality based indoor navigation visualization method according to claim 3, wherein the step 5 comprises the following sub-steps:
step 5.1, calculating the distance from each POI in the POI screen coordinate information to the current position of the mobile terminal, and obtaining the distance difference R between the farthest POI and the nearest POI from the maximum and minimum distances, using the following formula:
R = max(dᵢ) − min(dᵢ)
wherein dᵢ represents the distance from the i-th POI to the current position of the mobile terminal, and R represents the distance difference between the farthest POI and the nearest POI in the POI screen coordinate information relative to the current position of the mobile terminal;
step 5.2, calculating the y coordinate of each POI in the POI screen coordinate information, and adopting the following formula:
yᵢ = h + H × (dᵢ − min(dᵢ)) / R
wherein h represents the height of the lower boundary of the POI display area on the interface, and H represents the pixel height of the POI display area;
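A minimal sketch of the distance-to-y mapping of steps 5.1 and 5.2, under the assumption that the nearest POI renders at the lower boundary h and the farthest at h + H (the orientation of the mapping is our assumption; the claim defines only h, H, dᵢ and R):

```python
def poi_screen_y(distances, h, H):
    """Map each POI distance into the vertical POI display band (steps 5.1-5.2).

    distances : d_i, distance from each POI to the current mobile-terminal position
    h         : height of the lower boundary of the POI display area
    H         : pixel height of the POI display area
    """
    d_min, d_max = min(distances), max(distances)
    R = d_max - d_min  # step 5.1: spread between farthest and nearest POI
    if R == 0:
        # Degenerate case (all POIs equidistant) not covered by the claim:
        # center every label in the band.
        return [h + H / 2.0] * len(distances)
    # step 5.2: nearest POI at the bottom (y = h), farthest at y = h + H.
    return [h + H * (d - d_min) / R for d in distances]

print(poi_screen_y([10.0, 20.0, 30.0], 400.0, 200.0))  # [400.0, 500.0, 600.0]
```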
step 5.3, rendering the POI screen coordinate information visually in real time according to the y coordinate of each POI calculated in step 5.2 and the updated x coordinate of each POI calculated in step 3.3, and mitigating the label overlap problem;
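Step 5.3 names the label-overlap goal without specifying an algorithm. One common choice is a greedy vertical pass, offered here as an assumption rather than the patent's method:

```python
def deoverlap_labels(ys, label_h):
    """Greedy vertical de-overlap: visit labels bottom-up and push each one
    up just far enough to clear the label below it. This is one common
    heuristic for the overlap problem step 5.3 refers to, not necessarily
    the one used in the patent.
    """
    order = sorted(range(len(ys)), key=lambda i: ys[i])
    adjusted = list(ys)
    for prev, cur in zip(order, order[1:]):
        if adjusted[cur] < adjusted[prev] + label_h:
            adjusted[cur] = adjusted[prev] + label_h
    return adjusted

print(deoverlap_labels([400.0, 405.0, 600.0], 20.0))  # [400.0, 420.0, 600.0]
```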
step 5.4, calculating the distance from the current position of the mobile terminal to each road inflection point in the navigation road information, finding the road r₁ closest to the current position of the mobile terminal, and calculating the direction angle α₃ from the current position of the mobile terminal to the road r₁;
step 5.5, if the direction angle α₃ is 0, judging that the mobile terminal is moving on the navigation road, and pointing the guide arrow in the correct road direction using the following formula:
α = α₁ − α₂
wherein α is the rotation angle of the guide arrow, α₁ is the correct road direction, and α₂ is the current direction of the mobile terminal;
step 5.6, if the direction angle α₃ is not 0, judging that the mobile terminal has deviated from the navigation road, and pointing the guide arrow in the correct road direction using the following formula:
α = α₃ − α₂
wherein α is the rotation angle of the guide arrow, and the guide arrow points in the direction of the perpendicular segment between the current position of the mobile terminal and the nearest road.
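Steps 5.4 through 5.6 amount to a nearest-segment search plus a direction-angle computation. A sketch under the assumption that roads are given as 2D line segments and that "distance to the road" means the clamped point-to-segment distance (the claim speaks of road inflection points, so the exact distance measure is an assumption):

```python
import math

def nearest_road_deviation(pos, roads):
    """Sketch of steps 5.4-5.6: find the nearest road segment, the foot of the
    perpendicular from the current position, and the direction angle alpha_3
    toward it. Returns 0.0 when the terminal is already on a road (step 5.5);
    otherwise returns the angle of the perpendicular segment (step 5.6).

    pos   : (x, y) current position of the mobile terminal
    roads : list of segments [((x1, y1), (x2, y2)), ...] from the navigation road
    """
    px, py = pos
    best = None
    for (x1, y1), (x2, y2) in roads:
        dx, dy = x2 - x1, y2 - y1
        seg_len2 = dx * dx + dy * dy
        # Project the position onto the segment, clamping to its endpoints.
        t = 0.0 if seg_len2 == 0 else max(0.0, min(1.0, ((px - x1) * dx + (py - y1) * dy) / seg_len2))
        fx, fy = x1 + t * dx, y1 + t * dy
        dist = math.hypot(px - fx, py - fy)
        if best is None or dist < best[0]:
            best = (dist, fx, fy)
    dist, fx, fy = best
    if dist == 0.0:
        return 0.0  # step 5.5: on the road, the arrow follows the road direction
    # step 5.6: direction angle of the perpendicular toward the nearest road
    return math.degrees(math.atan2(fy - py, fx - px))

print(nearest_road_deviation((0.0, -3.0), [((-5.0, 0.0), (5.0, 0.0))]))  # 90.0
```

In practice the exact-zero test would be replaced by a small tolerance, since positioning noise makes α₃ almost never exactly 0.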
CN202010913376.7A 2020-09-03 2020-09-03 Indoor navigation visualization method based on augmented reality Active CN112146656B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010913376.7A CN112146656B (en) 2020-09-03 2020-09-03 Indoor navigation visualization method based on augmented reality


Publications (2)

Publication Number Publication Date
CN112146656A true CN112146656A (en) 2020-12-29
CN112146656B CN112146656B (en) 2023-02-17

Family

ID=73889244


Country Status (1)

Country Link
CN (1) CN112146656B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11785430B2 (en) 2021-04-13 2023-10-10 Research Foundation Of The City University Of New York System and method for real-time indoor navigation

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110153198A1 (en) * 2009-12-21 2011-06-23 Navisus LLC Method for the display of navigation instructions using an augmented-reality concept
US20120158287A1 (en) * 2010-12-15 2012-06-21 Francesco Altamura Methods and systems for augmented navigation
CN104101354A (en) * 2013-04-15 2014-10-15 北京四维图新科技股份有限公司 Method, apparatus and system for optimizing POI guiding coordinates in map data
KR20150126289A (en) * 2014-05-02 2015-11-11 한국전자통신연구원 Navigation apparatus for providing social network service based on augmented reality, metadata processor and metadata processing method in the augmented reality navigation system
CN105095314A (en) * 2014-05-22 2015-11-25 北京四维图新科技股份有限公司 Point of interest (POI) marking method, terminal, navigation server and navigation system
CN105371847A (en) * 2015-10-27 2016-03-02 深圳大学 Indoor live-action navigation method and system
CN107240156A (en) * 2017-06-07 2017-10-10 武汉大学 A kind of outdoor augmented reality spatial information of high accuracy shows system and method
CN109974733A (en) * 2019-04-02 2019-07-05 百度在线网络技术(北京)有限公司 POI display methods, device, terminal and medium for AR navigation


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
MARIA BEATRIZ CARMO et al.: "PoI Awareness, Relevance and Aggregation for Augmented Reality", 2016 20th International Conference Information Visualisation (IV), 22 July 2016 (2016-07-22), pages 300 - 305, XP032955365, DOI: 10.1109/IV.2016.47 *
Hou Xiaoning et al.: "Research on Application Modes of Augmented Reality Electronic Maps", Journal of Geomatics Science and Technology (测绘科学技术学报), vol. 33, no. 06, 31 December 2016 (2016-12-31), pages 639 - 643 *
Ying Shen et al.: "Implementation of an Indoor Augmented Reality System Based on Android", Geomatics World (地理信息世界), vol. 23, no. 01, 29 February 2016 (2016-02-29), pages 93 - 98 *
Cheng Xiong: "Research and Application of Augmented Reality Technology in an Indoor Navigation System on the iPhone Platform", China Masters' Theses Full-text Database (中国优秀博硕士学位论文全文数据库(硕士)), Information Science and Technology, no. 2, 15 December 2013 (2013-12-15), pages 136 - 1010 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11785430B2 (en) 2021-04-13 2023-10-10 Research Foundation Of The City University Of New York System and method for real-time indoor navigation

Also Published As

Publication number Publication date
CN112146656B (en) 2023-02-17

Similar Documents

Publication Publication Date Title
US10217288B2 (en) Method for representing points of interest in a view of a real environment on a mobile device and mobile device therefor
US10255726B2 (en) Systems and methods for augmented reality representations of networks
US10760922B2 (en) Augmented reality maps
CN108474666B (en) System and method for locating a user in a map display
EP2643822B1 (en) Guided navigation through geo-located panoramas
US20130162665A1 (en) Image view in mapping
EP3645971B1 (en) Map feature identification using motion data and surfel data
US11140510B2 (en) Contextual map view
Narzt et al. A new visualization concept for navigation systems
CN112146656B (en) Indoor navigation visualization method based on augmented reality
Ranasinghe et al. Pedestrian navigation and GPS deteriorations: User behavior and adaptation strategies
US20230134475A1 (en) Viewport system for dynamically framing of a map based on updating data
Stroila et al. Route visualization in indoor panoramic imagery with open area maps
US20230384871A1 (en) Activating a Handheld Device with Universal Pointing and Interacting Device
Mantoro et al. Pragmatic framework of 3D visual navigation for mobile user
Adya et al. Augmented Reality in Indoor Navigation
Gandhi et al. A* Algorithm and Unity for Augmented Reality-based Indoor Navigation.
Mower The augmented scene: integrating the map and the environment
Patel et al. Improving Navigation for Street Data Using Mobile Augmented Reality
Forward et al. overarching research challenges

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant