CN117470225A - Navigation method, navigation device, electronic equipment and storage medium - Google Patents

Navigation method, navigation device, electronic equipment and storage medium

Info

Publication number
CN117470225A
CN117470225A (application CN202210865737.4A)
Authority
CN
China
Prior art keywords
audio
navigation
angle
distance
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210865737.4A
Other languages
Chinese (zh)
Inventor
王凯
史润宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202210865737.4A priority Critical patent/CN117470225A/en
Publication of CN117470225A publication Critical patent/CN117470225A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/10 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration
    • G01C 21/12 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C 21/16 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C 21/165 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/20 Instruments for performing navigational calculations
    • G01C 21/206 Instruments for performing navigational calculations specially adapted for indoor navigation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 specially adapted for navigation in a road network
    • G01C 21/34 Route searching; Route guidance
    • G01C 21/36 Input/output arrangements for on-board computers
    • G01C 21/3626 Details of the output of route guidance instructions
    • G01C 21/3629 Guidance using speech or audio output, e.g. text-to-speech

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Navigation (AREA)

Abstract

The present disclosure relates to a navigation method, an apparatus, an electronic device, and a storage medium. The method is applied to a terminal device that has established a connection with an audio device in advance, and includes: during navigation, determining a first distance and a first angle of a target position relative to the current position of the terminal device; rendering audio to be played according to the first distance and the first angle to obtain target audio, where the virtual sound source direction of the target audio is the direction of the target position; and controlling the audio device to play the target audio. Because the virtual sound source direction of the target audio is the direction of the target position, the target audio allows the user to perceive the direction of the target position directly and accurately, which reduces the complexity and recognition difficulty of the navigation interaction, lowers the difficulty of using navigation, improves the user experience, and improves navigation accuracy.

Description

Navigation method, navigation device, electronic equipment and storage medium
Technical Field
The disclosure relates to the technical field of position navigation, in particular to a navigation method, a navigation device, electronic equipment and a storage medium.
Background
In recent years, the technology of terminal devices such as smartphones has advanced and their functionality has developed greatly, making terminal devices an indispensable part of daily life. Taking the navigation function as an example, the positioning accuracy and navigation modes of terminal devices have improved markedly, and navigation has become an indispensable function when people travel. However, in the related art, the navigation interaction of the terminal device is complex and difficult to interpret, so the user easily misreads the guidance, which leads to navigation errors.
Disclosure of Invention
To overcome the problems in the related art, embodiments of the present disclosure provide a navigation method, apparatus, electronic device, and storage medium, which are used to solve the drawbacks in the related art.
According to a first aspect of embodiments of the present disclosure, there is provided a navigation method applied to a terminal device that establishes a connection with an audio device in advance, the method including:
in the navigation process, determining a first distance and a first angle of a target position relative to the current position of the terminal equipment;
rendering the audio to be played according to the first distance and the first angle to obtain target audio, wherein the virtual sound source direction of the target audio is the direction of the target position;
and controlling the audio equipment to play the target audio.
In one embodiment, the determining the first distance and the first angle of the target location relative to the current location of the terminal device includes:
and acquiring the target position and the current position on a navigation map in real time, and determining the first distance and the first angle according to the target position and the current position.
In one embodiment, the determining the first distance and the first angle of the target location relative to the current location of the terminal device includes:
responding to a navigation starting instruction, acquiring the target position and the current position on a navigation map, and determining the first distance and the first angle according to the target position and the current position;
and updating the first distance and the first angle in real time by using an inertial navigation algorithm.
In one embodiment, the determining the first distance and the first angle of the target location relative to the current location of the terminal device includes:
and determining a first distance and a first angle of the target position relative to the current position of the terminal equipment in real time by utilizing an indoor navigation technology.
In one embodiment, the determining, during navigation, the first distance and the first angle of the target position relative to the current position of the terminal device includes:
in the indoor navigation process, a first distance and a first angle of a target position relative to the current position of the terminal device are determined.
In one embodiment, the rendering the audio to be played according to the first distance and the first angle to obtain the target audio includes:
determining a third angle of the target position relative to the audio device according to the first angle and a second angle between the terminal device and the audio device;
and rendering the audio to be played according to the first distance and the third angle to obtain target audio.
In one embodiment, the terminal device has at least one motion sensor and the audio device has at least one motion sensor; the method further comprises the steps of:
acquiring a first motion parameter acquired by at least one motion sensor of the terminal equipment and a second motion parameter acquired by at least one motion sensor of the audio equipment;
and determining a second angle between the terminal equipment and the audio equipment according to the first motion parameter and the second motion parameter.
In one embodiment, further comprising:
generating and displaying correction prompt information, wherein the correction prompt information is used for prompting a user to adjust the terminal equipment and the audio equipment to a preset angle;
and in response to receiving a confirmation instruction, carrying out coordinate system unification on the first motion parameter and the second motion parameter, wherein the confirmation instruction is used for representing that the terminal equipment and the audio equipment are adjusted to a preset angle.
In one embodiment, further comprising:
and updating the second angle in real time by using an inertial navigation algorithm.
In one embodiment, the audio to be played includes navigation audio or multimedia audio currently played by the terminal device.
According to a second aspect of embodiments of the present disclosure, there is provided a navigation apparatus applied to a terminal device that establishes a connection with an audio device in advance, the apparatus including:
the determining module is used for determining a first distance and a first angle of the target position relative to the current position of the terminal equipment in the navigation process;
the rendering module is used for rendering the audio to be played according to the first distance and the first angle to obtain target audio, wherein the virtual sound source direction of the target audio is the direction of the target position;
and the playing module is used for controlling the audio equipment to play the target audio.
In one embodiment, the determining module is specifically configured to:
and acquiring the target position and the current position on a navigation map in real time, and determining the first distance and the first angle according to the target position and the current position.
In one embodiment, the determining module is specifically configured to:
responding to a navigation starting instruction, acquiring the target position and the current position on a navigation map, and determining the first distance and the first angle according to the target position and the current position;
and updating the first distance and the first angle in real time by using an inertial navigation algorithm.
In one embodiment, the determining module is specifically configured to:
and determining a first distance and a first angle of the target position relative to the current position of the terminal equipment in real time by utilizing an indoor navigation technology.
In one embodiment, the determining module is specifically configured to:
in the indoor navigation process, a first distance and a first angle of a target position relative to the current position of the terminal device are determined.
In one embodiment, the rendering module is specifically configured to:
determining a third angle of the target position relative to the audio device according to the first angle and a second angle between the terminal device and the audio device;
and rendering the audio to be played according to the first distance and the third angle to obtain target audio.
In one embodiment, the terminal device has at least one motion sensor and the audio device has at least one motion sensor; the apparatus further comprises a motion module for:
acquiring a first motion parameter acquired by at least one motion sensor of the terminal equipment and a second motion parameter acquired by at least one motion sensor of the audio equipment;
and determining a second angle between the terminal equipment and the audio equipment according to the first motion parameter and the second motion parameter.
In one embodiment, the system further comprises a correction module for:
generating and displaying correction prompt information, wherein the correction prompt information is used for prompting a user to adjust the terminal equipment and the audio equipment to a preset angle;
and in response to receiving a confirmation instruction, carrying out coordinate system unification on the first motion parameter and the second motion parameter, wherein the confirmation instruction is used for representing that the terminal equipment and the audio equipment are adjusted to a preset angle.
In one embodiment, the motion module is further configured to:
and updating the second angle in real time by using an inertial navigation algorithm.
In one embodiment, the audio to be played includes navigation audio or multimedia audio currently played by the terminal device.
According to a third aspect of embodiments of the present disclosure, there is provided an electronic device comprising a memory and a processor, the memory being configured to store computer instructions executable on the processor, and the processor being configured to implement the navigation method of the first aspect when executing the computer instructions.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of the first aspect.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects:
According to the navigation method provided by the embodiments of the present disclosure, the first distance and the first angle of the target position relative to the current position of the terminal device are determined during navigation, so the audio to be played can be rendered according to the first distance and the first angle to obtain the target audio, and the audio device can then be controlled to play the target audio. Because the virtual sound source direction of the target audio is the direction of the target position, the target audio allows the user to perceive the direction of the target position directly and accurately, which reduces the complexity and recognition difficulty of the navigation interaction, lowers the difficulty of using navigation, improves the user experience, and improves navigation accuracy.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a flow chart of a navigation method shown in an exemplary embodiment of the present disclosure;
FIG. 2 is a schematic diagram illustrating a relationship between a target location and a current location in accordance with an exemplary embodiment of the present disclosure;
fig. 3 is a schematic diagram of preset positions of a terminal device and an audio device according to an exemplary embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a navigation device according to an exemplary embodiment of the present disclosure;
fig. 5 is a block diagram of an electronic device shown in an exemplary embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in this disclosure to describe various information, this information should not be limited by these terms. These terms are only used to distinguish one category of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. The word "if" as used herein may be interpreted as "when", "upon", or "in response to determining", depending on the context.
In recent years, the technology of terminal devices such as smartphones has advanced and their functionality has developed greatly, making terminal devices an indispensable part of daily life. Taking the navigation function as an example, the positioning accuracy and navigation modes of terminal devices have improved markedly, and navigation has become an indispensable function when people travel. However, in the related art, the navigation interaction of the terminal device is complex and difficult to interpret, so the user easily misreads the guidance, which leads to navigation errors.
Based on this, in a first aspect, at least one embodiment of the present disclosure provides a navigation method that may be applied to a terminal device that previously established a connection with an audio device. Referring to fig. 1, a flow of the method is shown, which includes steps S101 to S103.
The terminal device may be a smartphone, a tablet computer, a smart wearable device, or the like; the audio device may be a binaural audio device, such as headphones or earbuds. The audio device may establish a connection with the terminal device in a wired manner or wirelessly, for example via Bluetooth. After the connection is established, the terminal device can control the audio device to play audio; for example, a smartphone controls connected earphones to play audio such as music.
The method may be applied to outdoor or indoor navigation scenarios, and is particularly suited to indoor navigation. Indoor navigation in the related art depends on support such as positioning base stations, so navigation is available only in specific indoor venues. When no positioning equipment such as base stations is deployed indoors, the user can only search for the target position by consulting a map (a map in a map application, a physical map posted in the building, or the like) and signage, which is inconvenient and inaccurate. The method can therefore alleviate the problems caused by unavailable navigation, or by a cumbersome navigation interaction, in indoor scenarios.
In step S101, a first distance and a first angle of a target position relative to a current position of the terminal device are determined during navigation.
Illustratively, during indoor navigation, a first distance and a first angle of a target location relative to a current location of the terminal device are determined.
The navigation process may be initiated by a user, for example, the user enters a target location through a map application and initiates navigation, and the map application may generate navigation information based on the target location and the current location to plan a route.
The target position is the destination of the user's navigation; for example, if the user wants to navigate to a train station, the train station is the target position. The current position of the terminal device represents the current position of the user, since the user and the terminal device are co-located while the user navigates with the terminal device.
Referring to fig. 2, a relationship among a current position a, a target position O, a first distance D, and a first angle α is shown. The first distance and the first angle of the target position relative to the current position of the terminal device can be determined in real time, and the step S102 and the step S103 are executed after the first distance and the first angle are determined each time, so that real-time navigation is performed on the user.
In one possible embodiment, the target position and the current position may be acquired on a navigation map in real time, and the first distance and the first angle may be determined from them. The target position may be entered by the user in the navigation map application; the current position may likewise be entered by the user, or located automatically by the navigation map. This approach can determine the first distance and the first angle (and, in the subsequent steps, provide direction prompts to the user) even when no positioning equipment such as base stations is deployed in the indoor scene, sparing the user from searching for the target position by map on their own and improving the user experience.
In another possible embodiment, the target position and the current position may be acquired on a navigation map in response to a navigation start instruction, and the first distance and the first angle may be determined from them; afterwards, the first distance and the first angle are updated in real time using an inertial navigation algorithm. Like the previous approach, this can determine the first distance and the first angle (and, in the subsequent steps, provide direction prompts to the user) even when no positioning equipment such as base stations is deployed indoors; in addition, it avoids frequently acquiring the target position and the current position from the navigation map, which reduces complexity and improves navigation efficiency.
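One way the inertial real-time update could look is pedestrian dead reckoning: advance the current position by one detected step along the estimated heading, then recompute the first distance and first angle. This is a minimal illustrative sketch under assumed names and a planar frame, not the disclosure's algorithm:

```python
import math

def dead_reckon_update(current, target, heading_deg, step_len):
    """Illustrative dead-reckoning step (assumption, not from the patent):
    move the current position by step_len along heading_deg, then
    recompute the first distance and first angle to the target.
    Positions are (x, y) in a local planar frame; angles in degrees."""
    h = math.radians(heading_deg)
    x = current[0] + step_len * math.cos(h)
    y = current[1] + step_len * math.sin(h)
    d = math.hypot(target[0] - x, target[1] - y)
    alpha = math.degrees(math.atan2(target[1] - y, target[0] - x))
    return (x, y), d, alpha

pos, d, alpha = dead_reckon_update((0.0, 0.0), (10.0, 0.0), 0.0, 1.0)
```

In practice the heading and step length would come from the gyroscope and accelerometer, which is why only the initial positions need to be read from the navigation map.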
In yet another possible embodiment, the first distance and the first angle of the target position relative to the current position of the terminal device may be determined in real time using an indoor navigation technique. Indoor navigation may be implemented on the basis of positioning base stations, using methods such as proximity detection, triangulation, multilateration, fingerprint positioning, or combined positioning. An indoor navigation technique can determine the first distance and the first angle more accurately in an indoor scene, improving navigation accuracy and suitability for that scene.
In step S102, rendering the audio to be played according to the first distance and the first angle to obtain a target audio, where the virtual sound source direction of the target audio is the direction of the target position.
The audio to be played can be navigation audio or multimedia audio currently played by the terminal device.
The method may use spatial audio technology to render the audio to be played. Spatial audio rendering produces a spatial sound effect that simulates the direction of a sound source, so that after hearing the audio the user perceives it as coming from the simulated source direction and thereby obtains a very intuitive and accurate sense of direction.
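Production spatial-audio pipelines typically convolve the signal with head-related transfer functions (HRTFs); as a minimal sketch of the underlying idea, a constant-power interaural-level pan can already place a mono signal left or right of the listener. The function name, angle convention, and pan law below are illustrative assumptions, not the disclosure's renderer:

```python
import math

def render_direction(mono, angle_deg):
    """Minimal ILD-panning sketch (assumption, not the patent's method).
    angle_deg: 0 = straight ahead, +90 = fully right, -90 = fully left.
    Returns (left, right) sample lists using a constant-power pan law."""
    # Map [-90, 90] degrees onto a pan position in [0, 1].
    pan = (max(-90.0, min(90.0, angle_deg)) + 90.0) / 180.0
    gain_l = math.cos(pan * math.pi / 2.0)
    gain_r = math.sin(pan * math.pi / 2.0)
    left = [s * gain_l for s in mono]
    right = [s * gain_r for s in mono]
    return left, right

left, right = render_direction([0.5, -0.5, 0.25], 90.0)  # source fully to the right
```

A full implementation would also model interaural time differences and distance cues, which is what gives binaural audio its "very intuitive" directionality.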
In addition, since the first distance is updated in real time, the volume of the target audio can be adjusted as the first distance changes: for example, the volume is increased as the first distance decreases and decreased as the first distance increases. This intuitively indicates how the distance between the user and the target position is changing.
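A simple distance-to-volume mapping consistent with this behavior might look as follows; the reference distance and volume floor are illustrative parameters, not values from the disclosure:

```python
def volume_for_distance(d, d_ref=10.0, floor=0.1):
    """Sketch (assumed parameters): volume is full within d_ref metres of
    the target and falls off inversely with the first distance beyond it,
    never dropping below a floor so the cue stays audible."""
    if d <= 0:
        return 1.0
    return max(floor, min(1.0, d_ref / d))

v_near = volume_for_distance(5.0)
v_far = volume_for_distance(40.0)
```

As the user walks toward the target, the rising gain complements the directional cue from the virtual sound source.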
In step S103, the audio device is controlled to play the target audio.
The audio to be played includes navigation audio or multimedia audio currently played by the terminal device, and the corresponding target audio is therefore rendered navigation audio or rendered multimedia audio. Navigation audio carries navigation information, such as instructions to walk straight or turn; multimedia audio may be music, the soundtrack of a video, and the like. In other words, spatial-audio rendering can indicate the direction of the target position directly and accurately while navigation audio is played, and it can equally indicate that direction while music or a broadcast is playing. This is very convenient: navigation prompts are delivered while the user listens to music or watches a video, which improves the interaction experience during navigation.
When the audio device is a binaural audio device, binaural effects can be used to indicate the direction of the target position to the user.
According to the navigation method provided by the embodiments of the present disclosure, the first distance and the first angle of the target position relative to the current position of the terminal device are determined during navigation, so the audio to be played can be rendered according to the first distance and the first angle to obtain the target audio, and the audio device can then be controlled to play the target audio. Because the virtual sound source direction of the target audio is the direction of the target position, the target audio allows the user to perceive the direction of the target position directly and accurately, which reduces the complexity and recognition difficulty of the navigation interaction, lowers the difficulty of using navigation, improves the user experience, and improves navigation accuracy.
In indoor scenarios, spatial audio adds an extra dimension of information to pedestrian navigation: when the user wears a binaural audio device, the spatial sound effect of the played audio tracks the relative position between the user and the target position, so the user perceives the bearing and distance of the target. This provides guidance for pedestrian navigation and improves the navigation experience.
It will be appreciated that the position of the terminal device is the same as the position of the audio device, since both coincide with the position of the user; however, their orientations are not necessarily the same, and the orientation error between them may affect the virtual sound source direction of the target audio, thereby reducing directional accuracy.
Thus, in some embodiments of the present disclosure, the audio to be played may be rendered as follows to correct for the angle error described above: first, a third angle of the target position relative to the audio device is determined according to the first angle and a second angle between the terminal device and the audio device; then, the audio to be played is rendered according to the first distance and the third angle to obtain the target audio.
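Interpreting the second angle as the heading offset between the terminal device and the audio device, the correction reduces to an angle subtraction with wrap-around; this is an illustrative sketch under that assumption, not the disclosure's exact formula:

```python
def third_angle(first_angle, second_angle):
    """Sketch (assumption): the third angle of the target relative to the
    audio device is the first angle corrected by the second angle (the
    heading offset between terminal and audio device), wrapped into
    [-180, 180) degrees so the renderer receives a canonical bearing."""
    a = first_angle - second_angle
    return (a + 180.0) % 360.0 - 180.0
```

The renderer then receives (first distance, third angle) instead of (first distance, first angle), so the virtual source stays fixed in the world even when the user's head and phone point different ways.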
Optionally, the terminal device has at least one motion sensor, and the audio device also has at least one motion sensor. For example, the motion sensor of the terminal device may be an angular velocity sensor and/or a gyroscope sensor, and the motion sensor of the audio device may likewise be an angular velocity sensor and/or a gyroscope sensor.
Based on this, the second angle may be predetermined as follows: firstly, acquiring a first motion parameter acquired by at least one motion sensor of the terminal equipment and a second motion parameter acquired by at least one motion sensor of the audio equipment; next, a second angle between the terminal device and the audio device is determined based on the first motion parameter and the second motion parameter.
By way of example, the second angle may be determined as follows:
Δα(T) = ∫₀ᵀ (ω1(t) − ω2(t)) dt
where Δα(T) is the angle difference between the user's head and the mobile device at time T, ω1(t) is the angular velocity of the terminal device rotating about a reference direction (for example, the vertical direction) at time t, acquired by the gyroscope sensor of the terminal device, and ω2(t) is the angular velocity of the audio device rotating about the same reference direction at time t, acquired by the gyroscope sensor of the audio device.
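Since Δα is the integral of the yaw-rate difference ω1 − ω2, it can be accumulated one sample at a time; the single Euler step below is an illustrative sketch (names and integration scheme are assumptions, not from the disclosure):

```python
def update_second_angle(delta_alpha, omega1, omega2, dt):
    """Sketch (assumption): one Euler step of the integral
    delta_alpha += (omega1 - omega2) * dt, where omega1/omega2 are the
    yaw rates of the terminal and audio device over sample interval dt.
    Real implementations would also fuse accelerometer/magnetometer
    readings to bound gyroscope drift."""
    return delta_alpha + (omega1 - omega2) * dt
```

Calling this at each gyroscope sample keeps the second angle current between full recalibrations.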
For example, after the second angle has been determined from the first and second motion parameters for the first time, it may be updated in real time using an inertial navigation algorithm. This keeps the second angle current, meets the real-time requirements of navigation, and reduces the computational load of determining the second angle.
It will be appreciated that, before the second angle is determined, the first motion parameter and the second motion parameter may be unified into a common coordinate system as follows. First, correction prompt information is generated and displayed to prompt the user to adjust the terminal device and the audio device to a preset angle; for example, the display of the terminal device may prompt the user to hold the terminal device level with its top pointing forward (i.e., the terminal-device posture shown in fig. 3) and to wear the audio device facing forward (i.e., the user posture shown in fig. 3). Then, in response to receiving a confirmation instruction, the coordinate systems of the first motion parameter and the second motion parameter are unified, where the confirmation instruction indicates that the terminal device and the audio device have been adjusted to the preset angle. Unifying the coordinate systems makes the first and second motion parameters use the same world coordinate system, which facilitates calculation of the second angle and improves its accuracy.
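The calibration flow above can be sketched as zeroing both headings in a shared world frame once the user confirms the preset pose; the class and method names are illustrative assumptions, not from the disclosure:

```python
class HeadingTracker:
    """Sketch (assumption): after the user confirms that the terminal
    and the audio device are at the preset pose (Fig. 3), both headings
    are zeroed in one shared world frame, so subsequent gyroscope
    updates for the two devices are directly comparable."""

    def __init__(self):
        self.terminal_heading = None
        self.audio_heading = None

    def on_confirm(self):
        # Confirmation instruction received: unify the coordinate systems.
        self.terminal_heading = 0.0
        self.audio_heading = 0.0

    def second_angle(self):
        # Second angle = heading offset between the two devices.
        return self.terminal_heading - self.audio_heading

t = HeadingTracker()
t.on_confirm()
```

From this zeroed state, each device's heading is advanced by its own gyroscope readings, and their difference remains the second angle.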
According to a second aspect of the embodiments of the present disclosure, there is provided a navigation device applied to a terminal device that establishes a connection with an audio device in advance, referring to fig. 4, the device includes:
a determining module 401, configured to determine, during navigation, a first distance and a first angle of a target position relative to a current position of the terminal device;
the rendering module 402 is configured to render audio to be played according to the first distance and the first angle to obtain target audio, where a virtual sound source direction of the target audio is a direction of the target position;
and the playing module 403 is configured to control the audio device to play the target audio.
In some embodiments of the disclosure, the determining module is specifically configured to:
and acquiring the target position and the current position on a navigation map in real time, and determining the first distance and the first angle according to the target position and the current position.
In some embodiments of the disclosure, the determining module is specifically configured to:
responding to a navigation starting instruction, acquiring the target position and the current position on a navigation map, and determining the first distance and the first angle according to the target position and the current position;
and updating the first distance and the first angle in real time by using an inertial navigation algorithm.
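Between map fixes, the first distance and first angle can be kept current by dead reckoning from inertial data. The sketch below illustrates one such step under simplifying assumptions (flat 2-D coordinates, a known step length and heading); all names are illustrative:

```python
import math

# Hedged sketch of an inertial-navigation update of the first distance and
# first angle. Assumes a flat local x/y frame with y pointing north.

def dead_reckon(pos, heading_deg, step_m):
    # Advance the current position by one detected step along the heading
    # (bearing measured clockwise from north).
    x, y = pos
    h = math.radians(heading_deg)
    return (x + step_m * math.sin(h), y + step_m * math.cos(h))

def distance_and_angle(pos, target):
    # Recompute the first distance and first angle to the target position.
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(dx, dy)
    angle = math.degrees(math.atan2(dx, dy))  # bearing from north, clockwise
    return dist, angle
```

Each pedometer/gyro event would call `dead_reckon` and then `distance_and_angle`, so the navigation cues stay fresh without querying the map service on every frame.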
In some embodiments of the disclosure, the determining module is specifically configured to:
and determining a first distance and a first angle of the target position relative to the current position of the terminal equipment in real time by utilizing an indoor navigation technology.
In some embodiments of the disclosure, the determining module is specifically configured to:
in the indoor navigation process, a first distance and a first angle of a target position relative to the current position of the terminal device are determined.
In some embodiments of the disclosure, the rendering module is specifically configured to:
determining a third angle of the target position relative to the audio device according to the first angle and a second angle between the terminal device and the audio device;
and rendering the audio to be played according to the first distance and the third angle to obtain target audio.
In some embodiments of the present disclosure, the terminal device has at least one motion sensor, and the audio device has at least one motion sensor; the apparatus further comprises a motion module for:
acquiring a first motion parameter acquired by at least one motion sensor of the terminal equipment and a second motion parameter acquired by at least one motion sensor of the audio equipment;
and determining a second angle between the terminal equipment and the audio equipment according to the first motion parameter and the second motion parameter.
In some embodiments of the present disclosure, a correction module is further included for:
generating and displaying correction prompt information, wherein the correction prompt information is used for prompting a user to adjust the terminal equipment and the audio equipment to a preset angle;
and in response to receiving a confirmation instruction, carrying out coordinate system unification on the first motion parameter and the second motion parameter, wherein the confirmation instruction is used for representing that the terminal equipment and the audio equipment are adjusted to a preset angle.
In some embodiments of the present disclosure, the motion module is further configured to:
and updating the second angle in real time by using an inertial navigation algorithm.
In one embodiment, the audio to be played includes navigation audio or multimedia audio currently played by the terminal device.
The specific manner in which the various modules perform the operations in relation to the apparatus of the above embodiments has been described in detail in relation to the embodiments of the method of the first aspect and will not be described in detail here.
According to a third aspect of the embodiments of the present disclosure, fig. 5 schematically illustrates a block diagram of an electronic device. For example, the apparatus 500 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, or the like.
Referring to fig. 5, an apparatus 500 may include one or more of the following components: a processing component 502, a memory 504, a power supply component 506, a multimedia component 508, an audio component 510, an input/output (I/O) interface 512, a sensor component 514, and a communication component 516.
The processing component 502 generally controls overall operation of the apparatus 500, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 502 may include one or more processors 520 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 502 can include one or more modules that facilitate interaction between the processing component 502 and other components. For example, the processing component 502 may include a multimedia module to facilitate interaction between the multimedia component 508 and the processing component 502.
Memory 504 is configured to store various types of data to support operations at device 500. Examples of such data include instructions for any application or method operating on the apparatus 500, contact data, phonebook data, messages, pictures, videos, and the like. The memory 504 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power component 506 provides power to the various components of the device 500. The power components 506 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 500.
The multimedia component 508 includes a screen that provides an output interface between the device 500 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 508 includes a front-facing camera and/or a rear-facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 500 is in an operational mode, such as a navigation mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focus and optical zoom capability.
The audio component 510 is configured to output and/or input audio signals. For example, the audio component 510 includes a Microphone (MIC) configured to receive external audio signals when the device 500 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 504 or transmitted via the communication component 516. In some embodiments, the audio component 510 further comprises a speaker for outputting audio signals.
The I/O interface 512 provides an interface between the processing component 502 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 514 includes one or more sensors for providing status assessments of various aspects of the apparatus 500. For example, the sensor assembly 514 may detect the on/off state of the device 500 and the relative positioning of components, such as the display and keypad of the device 500; the sensor assembly 514 may also detect a change in position of the device 500 or a component of the device 500, the presence or absence of user contact with the device 500, the orientation or acceleration/deceleration of the device 500, and a change in temperature of the device 500. The sensor assembly 514 may also include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 514 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 514 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 516 is configured to facilitate communication between the apparatus 500 and other devices in a wired or wireless manner. The apparatus 500 may access a wireless network based on a communication standard, such as WiFi, 2G, 3G, 4G, or 5G, or a combination thereof. In one exemplary embodiment, the communication component 516 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 516 further includes a Near Field Communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 500 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for performing the navigation method described above.
In a fourth aspect, the present disclosure also provides, in an exemplary embodiment, a non-transitory computer-readable storage medium, such as the memory 504, comprising instructions executable by the processor 520 of the apparatus 500 to perform the navigation method described above. For example, the non-transitory computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (22)

1. A navigation method applied to a terminal device that establishes a connection with an audio device in advance, the method comprising:
in the navigation process, determining a first distance and a first angle of a target position relative to the current position of the terminal equipment;
rendering the audio to be played according to the first distance and the first angle to obtain target audio, wherein the virtual sound source direction of the target audio is the direction of the target position;
and controlling the audio equipment to play the target audio.
2. The navigation method according to claim 1, wherein the determining a first distance and a first angle of a target location relative to a current location of the terminal device comprises:
and acquiring the target position and the current position on a navigation map in real time, and determining the first distance and the first angle according to the target position and the current position.
3. The navigation method according to claim 1, wherein the determining a first distance and a first angle of a target location relative to a current location of the terminal device comprises:
responding to a navigation starting instruction, acquiring the target position and the current position on a navigation map, and determining the first distance and the first angle according to the target position and the current position;
and updating the first distance and the first angle in real time by using an inertial navigation algorithm.
4. The navigation method according to claim 1, wherein the determining a first distance and a first angle of a target location relative to a current location of the terminal device comprises:
and determining a first distance and a first angle of the target position relative to the current position of the terminal equipment in real time by utilizing an indoor navigation technology.
5. The navigation method according to any one of claims 1 to 4, characterized in that the determining a first distance and a first angle of a target position relative to a current position of the terminal device during navigation comprises:
in the indoor navigation process, a first distance and a first angle of a target position relative to the current position of the terminal device are determined.
6. The navigation method according to claim 1, wherein the rendering the audio to be played according to the first distance and the first angle to obtain the target audio includes:
determining a third angle of the target position relative to the audio device according to the first angle and a second angle between the terminal device and the audio device;
and rendering the audio to be played according to the first distance and the third angle to obtain target audio.
7. The navigation method of claim 6, wherein the terminal device has at least one motion sensor and the audio device has at least one motion sensor; the method further comprises the steps of:
acquiring a first motion parameter acquired by at least one motion sensor of the terminal equipment and a second motion parameter acquired by at least one motion sensor of the audio equipment;
and determining a second angle between the terminal equipment and the audio equipment according to the first motion parameter and the second motion parameter.
8. The navigation method of claim 7, further comprising:
generating and displaying correction prompt information, wherein the correction prompt information is used for prompting a user to adjust the terminal equipment and the audio equipment to a preset angle;
and in response to receiving a confirmation instruction, carrying out coordinate system unification on the first motion parameter and the second motion parameter, wherein the confirmation instruction is used for representing that the terminal equipment and the audio equipment are adjusted to a preset angle.
9. The navigation method of claim 7, further comprising:
and updating the second angle in real time by using an inertial navigation algorithm.
10. The navigation method according to claim 1, wherein the audio to be played comprises navigation audio or multimedia audio currently played by the terminal device.
11. A navigation apparatus applied to a terminal device that establishes a connection with an audio device in advance, the apparatus comprising:
the determining module is used for determining a first distance and a first angle of the target position relative to the current position of the terminal equipment in the navigation process;
the rendering module is used for rendering the audio to be played according to the first distance and the first angle to obtain target audio, wherein the virtual sound source direction of the target audio is the direction of the target position;
and the playing module is used for controlling the audio equipment to play the target audio.
12. The navigation device of claim 11, wherein the determining module is specifically configured to:
and acquiring the target position and the current position on a navigation map in real time, and determining the first distance and the first angle according to the target position and the current position.
13. The navigation device of claim 11, wherein the determining module is specifically configured to:
responding to a navigation starting instruction, acquiring the target position and the current position on a navigation map, and determining the first distance and the first angle according to the target position and the current position;
and updating the first distance and the first angle in real time by using an inertial navigation algorithm.
14. The navigation device of claim 11, wherein the determining module is specifically configured to:
and determining a first distance and a first angle of the target position relative to the current position of the terminal equipment in real time by utilizing an indoor navigation technology.
15. The navigation device according to any one of claims 11 to 14, characterized in that the determining module is specifically configured to:
in the indoor navigation process, a first distance and a first angle of a target position relative to the current position of the terminal device are determined.
16. The navigation device of claim 11, wherein the rendering module is specifically configured to:
determining a third angle of the target position relative to the audio device according to the first angle and a second angle between the terminal device and the audio device;
and rendering the audio to be played according to the first distance and the third angle to obtain target audio.
17. The navigation device of claim 16, wherein the terminal equipment has at least one motion sensor and the audio equipment has at least one motion sensor; the apparatus further comprises a motion module for:
acquiring a first motion parameter acquired by at least one motion sensor of the terminal equipment and a second motion parameter acquired by at least one motion sensor of the audio equipment;
and determining a second angle between the terminal equipment and the audio equipment according to the first motion parameter and the second motion parameter.
18. The navigation device of claim 17, further comprising a correction module to:
generating and displaying correction prompt information, wherein the correction prompt information is used for prompting a user to adjust the terminal equipment and the audio equipment to a preset angle;
and in response to receiving a confirmation instruction, carrying out coordinate system unification on the first motion parameter and the second motion parameter, wherein the confirmation instruction is used for representing that the terminal equipment and the audio equipment are adjusted to a preset angle.
19. The navigation device of claim 17, wherein the motion module is further configured to:
and updating the second angle in real time by using an inertial navigation algorithm.
20. The navigation device of claim 11, wherein the audio to be played comprises navigation audio or multimedia audio currently played by the terminal equipment.
21. An electronic device, comprising a memory and a processor, wherein the memory is configured to store computer instructions executable on the processor, and the processor is configured to implement the navigation method of any one of claims 1 to 10 when executing the computer instructions.
22. A computer readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the method of any one of claims 1 to 10.
CN202210865737.4A 2022-07-21 2022-07-21 Navigation method, navigation device, electronic equipment and storage medium Pending CN117470225A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210865737.4A CN117470225A (en) 2022-07-21 2022-07-21 Navigation method, navigation device, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN117470225A true CN117470225A (en) 2024-01-30

Family

ID=89633537

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210865737.4A Pending CN117470225A (en) 2022-07-21 2022-07-21 Navigation method, navigation device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117470225A (en)

Similar Documents

Publication Publication Date Title
EP3540571B1 (en) Method and device for editing virtual scene, and non-transitory computer-readable storage medium
EP3306432B1 (en) Flight control method for an aircraft, control device and computer program to implement the flight control method
EP3156767B1 (en) Method and device for navigating and method and device for generating a navigation video
CN109618212B (en) Information display method, device, terminal and storage medium
EP2927787B1 (en) Method and device for displaying picture
CN110022363B (en) Method, device and equipment for correcting motion state of virtual object and storage medium
US10110830B2 (en) Multiple streaming camera navigation interface system
CN103968846B (en) Positioning and navigation method and device
EP3848773A1 (en) Smart globe and control method therefor
JP2017505466A (en) Automatic focusing method, apparatus, program and recording medium
CN104613959A (en) Navigation method and device for wearable device and electronic equipment
CN111246095A (en) Method, device and equipment for controlling lens movement and storage medium
CN113989469A (en) AR (augmented reality) scenery spot display method and device, electronic equipment and storage medium
WO2022179080A1 (en) Positioning method and apparatus, electronic device, storage medium, program and product
CN113160031B (en) Image processing method, device, electronic equipment and storage medium
CN112019895B (en) Function operation control method, function operation control device, and storage medium
CN115950415A (en) Method and device for determining navigation direction and storage medium
CN117470225A (en) Navigation method, navigation device, electronic equipment and storage medium
KR20160019305A (en) Mobile terminal and method for controlling the same
CN114608591A (en) Vehicle positioning method and device, storage medium, electronic equipment, vehicle and chip
EP3208469A1 (en) Method and device for adjusting operating data
KR20170023648A (en) Mobile terminal and method of controlling the same
EP3401724A1 (en) Method, device, computer program and storage medium for controlling glasses
CN105446573B (en) Method and device for displaying direction
CN112860827B (en) Inter-device interaction control method, inter-device interaction control device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination