WO2017049796A1 - Navigation and navigation video generation method and apparatus - Google Patents
Navigation and navigation video generation method and apparatus
- Publication number
- WO2017049796A1 (PCT/CN2015/099304)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- navigation
- video
- request information
- end point
- module
- Prior art date
Classifications
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3647—Guidance involving output of stored or live camera images or video streams
- G01C21/3453—Special cost functions, i.e. other than distance or default speed limit of road segments
- G01C21/3492—Special cost functions employing speed data or traffic data, e.g. real-time or historical
- G01C21/3605—Destination input or retrieval
- G01C21/3623—Destination input or retrieval using a camera or code reader, e.g. for optical or magnetic codes
- G06V20/48—Matching video sequences
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
- G11B27/102—Programmed access in sequence to addressed parts of tracks of operating record carriers
- G11B27/34—Indicating arrangements
Definitions
- the present disclosure relates to the field of wireless communication technologies, and in particular, to a navigation and navigation video generation method and apparatus.
- conventional navigation systems are based on maps.
- users must interpret abstract maps and simplified symbols. Because some users are naturally slow at reading maps, it is easy to take a wrong way in complex architectural environments or multi-junction environments.
- Embodiments of the present disclosure provide a navigation method, a navigation video generation method, and corresponding apparatuses.
- the technical solution is as follows:
- a navigation method comprising: obtaining navigation request information; determining a navigation video that matches the navigation request information, the navigation video being a video obtained by taking a live view of the road; and performing navigation according to the navigation video.
- the navigation request information includes a navigation parameter: a navigation start point and a navigation end point
- the determining a navigation video that matches the navigation request information includes:
- acquiring a navigation start point and a navigation end point of the navigation video; and using, as the navigation video that matches the navigation start point and the navigation end point, a navigation video whose navigation start point is the same as the navigation start point of the navigation request information and whose navigation end point is the same as the navigation end point of the navigation request information.
- the navigation request information includes a navigation parameter: a navigation start point and a navigation end point
- the determining a navigation video that matches the navigation request information includes:
- the navigation video corresponding to the navigation route is intercepted from the navigation video that includes the navigation route, and the navigation video corresponding to the navigation route is used as a navigation video that matches the navigation start point and the navigation end point.
- the navigation request information includes a navigation parameter: a navigation start point and a navigation end point
- the determining a navigation video that matches the navigation request information includes:
- the navigation video corresponding to the navigation sub-route is spliced to obtain a navigation video that matches the navigation start point and the navigation end point.
- the navigation request information further includes at least one of the following navigation parameters: an area name, a link name, a season, a weather, an average travel speed, and a travel distance;
- the determining the navigation video that matches the navigation request information further includes: acquiring the navigation parameters of the navigation videos obtained by the query; calculating a matching degree between the navigation parameters of the navigation request information and the navigation parameters of each navigation video obtained by the query; and
- the navigation video obtained by the query is determined, according to the matching degree, to be a navigation video that matches the navigation request information.
- performing navigation according to the navigation video including:
- the navigation video is sent to a terminal, and the navigation video is played by the terminal.
- performing navigation according to the navigation video including: the navigation video is played (when the method is applied to a terminal).
- the navigating according to the navigation video includes: when at least two navigation videos matching the navigation request information are determined, displaying the matching navigation videos, receiving a selection operation by the user on one of the navigation videos, and playing the selected navigation video.
- the navigating according to the navigation video includes: determining a playback speed of the navigation video according to the current traveling speed, and playing the navigation video at the playback speed.
- when the method is applied to a terminal, the method further includes: synchronizing navigation data from the network side, the navigation data including the navigation video, and storing the navigation data in a local navigation database.
- Determining a navigation video that matches the navigation request information including:
- a navigation video matching the navigation request information is queried from a local navigation database.
- a navigation video generation method including: acquiring navigation parameters input by a user, the navigation parameters including a navigation start point and a navigation end point; taking a live view of the road, stopping shooting when the navigation end point is reached, and obtaining a captured video; associating the captured video with the navigation parameters to obtain a navigation video; and uploading the navigation video to the network side.
- the method further includes: recording the traveling speed and calculating an average traveling speed according to the traveling speed;
- Associating the captured video with the navigation parameters includes:
- the average travel speed is associated with the captured video as a navigation parameter.
- a navigation apparatus including:
- An obtaining module configured to obtain navigation request information
- a determining module configured to determine a navigation video that matches the navigation request information, where the navigation video is a video obtained by taking a live view of the road;
- a navigation module is configured to navigate according to the navigation video.
- the navigation request information includes navigation parameters: a navigation start point and a navigation end point
- the determining module includes:
- a first obtaining submodule configured to acquire a navigation starting point and a navigation end point of the navigation video
- a determining submodule configured to determine, as the navigation video that matches the navigation start point and the navigation end point, a navigation video whose navigation start point is the same as the navigation start point of the navigation request information and whose navigation end point is the same as the navigation end point of the navigation request information.
- the navigation request information includes navigation parameters: a navigation start point and a navigation end point
- the determining module includes:
- a first calculation submodule configured to calculate a navigation route according to the navigation starting point and the navigation end point
- a query submodule configured to query a navigation video including the navigation route
- the intercepting sub-module is configured to intercept the navigation video corresponding to the navigation route from the navigation video that includes the navigation route, and use the navigation video corresponding to the navigation route as the navigation video that matches the navigation start point and the navigation end point.
- the navigation request information includes navigation parameters: a navigation start point and a navigation end point
- the determining module includes:
- a first calculation submodule configured to calculate a navigation route according to the navigation start point and the navigation end point;
- a dividing submodule configured to divide the navigation route into at least two navigation sub-routes;
- a query submodule configured to separately query the navigation video corresponding to each navigation sub-route;
- a splicing submodule configured to splice the navigation videos corresponding to the navigation sub-routes to obtain a navigation video that matches the navigation start point and the navigation end point.
- the navigation request information further includes at least one of the following navigation parameters: an area name, a link name, a season, a weather, an average travel speed, and a travel distance; and the determining module includes:
- a second obtaining submodule configured to: when the query obtains at least two navigation videos that match the navigation start point and the navigation end point, acquire the navigation parameters of the navigation videos obtained by the query;
- a second calculation sub-module configured to calculate a matching degree between the navigation parameter of the navigation request information and the navigation parameter of the navigation video obtained by the query;
- the determining submodule is configured to determine the navigation video obtained by the query with the highest matching degree as the navigation video that matches the navigation request information; or determine the navigation videos obtained by the query whose matching degree exceeds a preset threshold as the navigation videos that match the navigation request information; or determine a preset number of navigation videos obtained by the query with the highest matching degrees as the navigation videos that match the navigation request information.
- the navigation module is configured to send the navigation video to the terminal when the device is applied to the network side, and play the navigation video by the terminal.
- the navigation module is configured to play the navigation video when the device is applied to the terminal.
- the navigation module includes:
- a display submodule configured to: when the determined navigation videos matching the navigation request information are at least two, display the navigation videos that match the navigation request information;
- a receiving submodule configured to receive a selection operation by the user on one of the navigation videos;
- a playing submodule configured to play the selected navigation video.
- the navigation module includes:
- Determining a submodule configured to determine a playback speed of the navigation video according to the current traveling speed
- a playing submodule for playing the navigation video according to the playing speed.
- the device further includes:
- a synchronization module configured to synchronize navigation data from a network side when the device is applied to a terminal, where the navigation data includes the navigation video
- a storage module configured to store the navigation data in a local navigation database
- the determining module is configured to query, from the local navigation database, a navigation video that matches the navigation request information.
- a navigation video generating apparatus includes:
- An acquiring module configured to acquire a navigation parameter input by a user, where the navigation parameter includes a navigation start point and a navigation end point;
- a shooting module configured to take a live view of the road, stop shooting when the navigation end point is reached, and obtain a captured video;
- An association module configured to associate the captured video with the navigation parameter to obtain a navigation video
- the uploading module is configured to upload the navigation video to the network side.
- the method further includes:
- a recording module for recording the traveling speed
- a calculation module configured to calculate an average traveling speed according to the traveling speed
- the association module is configured to associate the average traveling speed as a navigation parameter with the captured video.
- a navigation apparatus including:
- a memory for storing processor executable instructions
- the processor is configured to: obtain navigation request information; determine a navigation video that matches the navigation request information, the navigation video being a video obtained by taking a live view of the road; and perform navigation according to the navigation video.
- a navigation video generating apparatus includes:
- a memory for storing processor executable instructions
- the processor is configured to: acquire navigation parameters input by a user, the navigation parameters including a navigation start point and a navigation end point; take a live view of the road, stop shooting when the navigation end point is reached, and obtain a captured video; associate the captured video with the navigation parameters to obtain a navigation video; and upload the navigation video to the network side.
- in the above technical solution, navigation is performed through real video: the user spends less time interpreting maps and simplified symbols while driving, perceives road conditions and directions more readily, and is guided more intuitively, avoiding wrong turns in complex architectural or multi-junction environments. This improves navigation accuracy and user experience.
- the corresponding navigation video is queried according to the navigation start point and the navigation end point in the navigation request information, and can be directly compared with the navigation start point and the navigation end point of the navigation video, so that the navigation video to be queried by the user can be accurately obtained.
- when the start and end points that the user wants to query are very close together, there may be no navigation video that directly matches the user's request; however, there may be navigation videos that include the route the user wants to query, and the video corresponding to that route can be intercepted from them. In this way, the user can be accurately matched to the required navigation video, improving navigation accuracy.
- the navigation video obtained by the query is filtered by the navigation parameter to provide the user with a more accurate navigation video with higher matching with the user requirement, thereby improving navigation accuracy and user experience.
- the server needs to push the matched navigation video to the requesting terminal. In this way, it is not necessary to occupy a large amount of storage space of the terminal to store navigation video data, but it requires a certain amount of data traffic of the terminal.
- the terminal pre-stores a large amount of navigation video data, which occupies the terminal's storage space but saves its data traffic; and because the query is completed locally, it is also faster.
- the navigation video obtained by the query can be displayed to the user, and the user selects which video to use for navigation, so that the navigation is more intelligent and the user experience is better.
- the playback speed of the navigation video is adjusted according to the current traveling speed, so that the video playback speed matches the current traveling speed and the picture the user sees in the navigation video corresponds to the actual road currently being traveled.
- the navigation effect is more accurate and the user experience is better.
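As an illustrative sketch of this speed adaptation (the disclosure does not specify a formula; deriving the rate from the ratio of current speed to the video's recorded average speed, and the clamping range, are assumptions here):

```python
def playback_rate(current_speed_kmh, video_avg_speed_kmh, min_rate=0.25, max_rate=4.0):
    """Derive a video playback rate so that on-screen progress matches
    the vehicle's actual progress along the road.

    If the video was shot at an average of 40 km/h and the user now
    drives 60 km/h, the video must play at 1.5x to keep the picture in
    step with the real road.
    """
    if video_avg_speed_kmh <= 0:
        return 1.0  # no usable speed metadata: play at normal speed
    rate = current_speed_kmh / video_avg_speed_kmh
    # clamp to a range that common video players can render smoothly
    return max(min_rate, min(max_rate, rate))
```

The average traveling speed stored as a navigation parameter of the video (see the recording and calculation modules above) is what makes this ratio computable at playback time.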
- the navigation data can be synchronized from the network side in advance so that no more data traffic is consumed when navigating in real time. Synchronizing navigation data from the network side can be periodic, such as once a week.
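A minimal sketch of such periodic synchronization, assuming hypothetical stand-ins for the server API (`fetch_updates`) and the local navigation database (a dict keyed by video id):

```python
import time

SYNC_INTERVAL = 7 * 24 * 3600  # once a week, as suggested above


def sync_navigation_data(fetch_updates, local_db, last_sync=0.0, now=time.time):
    """Pull new navigation videos from the network side into the local
    navigation database when the sync interval has elapsed.

    Returns the timestamp of the last completed sync, unchanged if the
    interval has not yet elapsed (so no data traffic is spent).
    """
    if now() - last_sync < SYNC_INTERVAL:
        return last_sync  # not due yet
    for video in fetch_updates(since=last_sync):
        local_db[video["id"]] = video
    return now()
```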
- the terminal is configured to capture a video on the real road, associate the navigation parameters, generate a navigation video, and send the navigation video to the network side server.
- the navigation video can be used for navigation, reducing the time the user spends interpreting maps and simplified symbols while driving and making road conditions and directions easier to perceive, thereby improving navigation accuracy and user experience.
- the average traveling speed can also be added to the navigation parameters, so that subsequent matching of navigation videos is more accurate, which improves navigation accuracy.
- FIG. 1 is a flow chart showing a navigation method according to an exemplary embodiment.
- FIG. 2 is a flow chart showing a navigation method according to another exemplary embodiment.
- FIG. 3 is a flowchart of a navigation method according to another exemplary embodiment.
- FIG. 4 is a flowchart of a navigation method according to another exemplary embodiment.
- FIG. 5 is a flowchart of a navigation method according to another exemplary embodiment.
- FIG. 6 is a flowchart of a navigation method according to another exemplary embodiment.
- FIG. 7 is a flowchart of a navigation method according to another exemplary embodiment.
- FIG. 8 is a flowchart of a navigation method according to another exemplary embodiment.
- FIG. 9 is a flowchart of a navigation method according to another exemplary embodiment.
- FIG. 10 is a flowchart of a method for generating a navigation video according to an exemplary embodiment.
- FIG. 11 is a block diagram of a navigation device, according to an exemplary embodiment.
- FIG. 12 is a block diagram of a determination module, according to an exemplary embodiment.
- FIG. 13 is a block diagram of a determination module, according to another exemplary embodiment.
- FIG. 14 is a block diagram of a determination module, according to another exemplary embodiment.
- FIG. 15 is a block diagram of a determination module, according to another exemplary embodiment.
- FIG. 16 is a block diagram of a navigation module, according to an exemplary embodiment.
- FIG. 17 is a block diagram of a navigation module, according to another exemplary embodiment.
- FIG. 18 is a block diagram of a navigation device, according to another exemplary embodiment.
- FIG. 19 is a block diagram of a navigation video generating apparatus, according to an exemplary embodiment.
- FIG. 20 is a block diagram of an apparatus for navigating or generating a navigation video, according to an exemplary embodiment.
- FIG. 21 is a block diagram of an apparatus for navigation, according to an exemplary embodiment.
- the navigation video is a video of a live view of the road.
- Each navigation video has its corresponding navigation parameters, including at least: a navigation start point and a navigation end point, and may further include: a region name, a link name, a season, a weather, an average travel speed, a travel distance, and the like.
- the user queries the corresponding navigation video by inputting the navigation parameters in the navigation request, and plays the video at the terminal.
- Navigation through real video reduces the time the user spends interpreting the map and various simplified symbols while driving, and makes road conditions and directions easier to perceive, thus improving navigation accuracy and user experience.
- the navigation start point and the navigation end point may be input by the user, or the GPS may automatically locate the current position as a navigation start point.
- the following navigation parameters may be input by the user, or obtained by GPS positioning, automatic detection, and the like.
- the navigation parameters may include at least one of the following: area name, link name, season, weather, and average travel speed.
- FIG. 1 is a flowchart of a navigation method according to an exemplary embodiment. As shown in FIG. 1 , the navigation method is used on a terminal or a network side, and includes the following steps S11-S13:
- step S11 acquiring navigation request information
- step S12 a navigation video matching the navigation request information is determined, and the navigation video is a video obtained by taking a live view of the road;
- step S13 navigation is performed according to the navigation video.
- by navigating through real video, the user spends less time interpreting maps and simplified symbols while driving, perceives road conditions and directions more readily, and is guided more intuitively, avoiding wrong turns in complex architectural or multi-junction environments. This improves navigation accuracy and user experience.
- the navigation request information includes navigation parameters: a navigation start point and a navigation end point, and step S12 can be implemented in the following three manners:
- FIG. 2 is a flowchart of a navigation method according to another exemplary embodiment. As shown in FIG. 2, step S12 includes:
- step S21 acquiring a navigation start point and a navigation end point of the navigation video
- step S22 a navigation video whose navigation start point is the same as the navigation start point of the navigation request information and whose navigation end point is the same as the navigation end point of the navigation request information is used as the navigation video matching the navigation start point and the navigation end point.
- the corresponding navigation video is queried according to the navigation start point and the navigation end point in the navigation request information, and can be directly compared with the navigation start point and the navigation end point of the navigation video, so that the navigation video to be queried by the user can be accurately obtained.
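The comparison of steps S21 and S22 can be sketched as follows; the dict fields (`"start"`, `"end"`) are assumptions, since the disclosure does not prescribe a data model:

```python
def match_by_endpoints(request, videos):
    """Return the navigation videos whose recorded navigation start point
    and end point equal those in the navigation request information.

    `request` and each entry of `videos` are hypothetical dicts carrying
    the navigation parameters described above.
    """
    return [
        v for v in videos
        if v["start"] == request["start"] and v["end"] == request["end"]
    ]
```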
- FIG. 3 is a flowchart of a navigation method according to another exemplary embodiment. As shown in FIG. 3, step S12 includes:
- step S31 the navigation route is calculated according to the navigation start point and the navigation end point;
- step S32 querying a navigation video including a navigation route
- step S33 the navigation video corresponding to the navigation route is intercepted from the navigation video including the navigation route, and the navigation video corresponding to the navigation route is used as the navigation video matching the navigation start point and the navigation end point.
- for example, suppose the navigation start point that the user wants to query is location A, the navigation end point is location B, and the corresponding navigation route is AB.
- There is a navigation video VideoCD taken on the navigation route CD (whose navigation start point is location C and navigation end point is location D), and the navigation route AB is a segment of the navigation route CD, so the navigation video VideoAB corresponding to the navigation route AB can be intercepted from the navigation video VideoCD.
- Specifically, when the navigation parameters of the navigation video VideoCD include the name of the road segment corresponding to route AB, or marks of location A and location B, and the direction of travel of VideoCD between location A and location B is from location A to location B, the navigation video VideoAB corresponding to the navigation route AB can be intercepted from VideoCD.
- when the start and end points that the user wants to query are very close together, there may be no navigation video that directly matches the user's request; however, there may be navigation videos that include the route the user wants to query, and the video corresponding to that route can be intercepted from them. In this way, the user can be accurately matched to the required navigation video, improving navigation accuracy.
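A minimal sketch of the interception step, under the assumption (not stated in the disclosure) that each navigation video stores a playback timestamp for every place mark it passes:

```python
def intercept_segment(video, start_place, end_place):
    """Find the cut points for the sub-video of route start_place->end_place
    inside a longer navigation video (e.g. VideoAB inside VideoCD).

    `video["waypoints"]` maps place marks to playback timestamps in
    seconds, in travel order; this field is an assumption for the sketch.
    Returns the (start, end) timestamps to cut, or None if the route is
    not contained in the video in the direction of travel.
    """
    wp = video["waypoints"]
    if start_place not in wp or end_place not in wp:
        return None
    t0, t1 = wp[start_place], wp[end_place]
    if t0 >= t1:
        return None  # video travels the opposite way: direction must match
    return (t0, t1)
```

This mirrors the direction-of-travel check described above: a video shot from B to A cannot serve a request from A to B.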
- FIG. 4 is a flowchart of a navigation method according to another exemplary embodiment. As shown in FIG. 4, step S12 includes:
- step S41 the navigation route is calculated according to the navigation start point and the navigation end point;
- step S42 the navigation route is divided into at least two navigation sub-routes
- step S43 the navigation videos corresponding to the navigation sub-routes are respectively queried
- step S44 the navigation videos corresponding to the navigation sub-routes are spliced to obtain a navigation video that matches the navigation start point and the navigation end point.
- suppose again that the navigation start point the user wants to query is location A, the navigation end point is location B, and the corresponding navigation route is AB.
- the navigation route AB is divided into three navigation sub-routes AE, EF, and FB.
- the navigation videos VideoAE, VideoEF, and VideoFB are queried for the respective navigation sub-routes, and the three navigation videos are spliced to obtain the navigation video VideoAB corresponding to the navigation route AB.
- the user can accurately match the required navigation video and improve the navigation accuracy.
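The splicing step above can be sketched as follows, assuming each sub-route video is a hypothetical dict recording its start mark, end mark, and frames:

```python
def splice_route(sub_videos):
    """Concatenate per-sub-route navigation videos into one route video,
    checking that each segment's end point is the next segment's start
    point so the spliced video is continuous.
    """
    for prev, nxt in zip(sub_videos, sub_videos[1:]):
        if prev["end"] != nxt["start"]:
            raise ValueError(f'gap between {prev["end"]} and {nxt["start"]}')
    return {
        "start": sub_videos[0]["start"],
        "end": sub_videos[-1]["end"],
        "frames": [f for v in sub_videos for f in v["frames"]],
    }
```

The continuity check is the design point: splicing only makes sense when the sub-routes chain end-to-start, as in A to E, E to F, F to B.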
- when inputting the navigation request information, the user may supply additional navigation parameters, for example the area name and the link name; the terminal may also automatically determine parameters such as the season from the current date, obtain the current weather parameters through a network query, detect the current traveling speed of the vehicle, and so on. The more navigation parameters the navigation request information contains, the more accurate the finally queried navigation video.
- FIG. 5 is a flowchart of a navigation method according to another exemplary embodiment.
- the navigation request information further includes at least one of the following navigation parameters: an area name, a link name, a season, a weather, an average travel speed, and a travel distance.
- step S12 further includes:
- step S51 acquiring navigation parameters of the navigation video obtained by the query
- step S52 calculating a matching degree between the navigation parameter of the navigation request information and the navigation parameter of the navigation video obtained by the query;
- step S53 the navigation video obtained by the query with the highest matching degree is determined as the navigation video matching the navigation request information; or, the navigation videos obtained by the query whose matching degree exceeds a preset threshold are determined as the navigation videos matching the navigation request information; or, a preset number of navigation videos obtained by the query with the highest matching degrees are determined as the navigation videos matching the navigation request information.
- the query obtains three navigation videos that match the navigation start point A and the navigation end point B, respectively, Video1, Video2, and Video3.
- the navigation request information also includes the following navigation parameters:
- the navigation parameters of the navigation videos Video1, Video2, and Video3 are as follows:
- suppose Video1 has the highest matching degree with the navigation parameters of the navigation request information; then Video1 can be used as the navigation video the user queried for. Alternatively, the navigation videos whose matching degree reaches a preset threshold (for example, 50% or more), or a preset number (for example, two) of navigation videos with the highest matching degrees, can be presented to the user, and the user selects which navigation video to use.
- consider video also taken at 6:00 p.m.: if it is shot in summer, the sky is still bright and the footage is clear, whereas if it is shot in winter, the sky is already dark and only buildings or road signs with lighting can be photographed clearly. Similarly, footage in weather such as rain, snow, and fog differs greatly from footage on sunny days. In addition, for the same start and end points, the navigation videos may differ because the intermediate road segments may differ. To match the user to a more accurate navigation video, as many navigation parameters as possible need to be considered.
- the navigation video obtained by the query is filtered by the navigation parameter to provide the user with a more accurate navigation video with higher matching with the user requirement, thereby improving navigation accuracy and user experience.
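One simple way to realize steps S51-S53 is to score each candidate by the fraction of shared navigation parameters that agree; the disclosure does not define the matching-degree formula, so this equal-weight scheme and the dict-based parameters are assumptions:

```python
def matching_degree(request_params, video_params):
    """Fraction of the request's navigation parameters (area name, link
    name, season, weather, average travel speed, ...) on which the video
    agrees; 1.0 is a perfect match, 0.0 no agreement (step S52)."""
    keys = [k for k in request_params if k in video_params]
    if not keys:
        return 0.0
    hits = sum(request_params[k] == video_params[k] for k in keys)
    return hits / len(keys)


def best_matches(request_params, videos, threshold=0.5):
    """Return the videos whose matching degree reaches the threshold,
    best first (the preset-threshold variant of step S53)."""
    scored = [(matching_degree(request_params, v), v) for v in videos]
    return [v for score, v in sorted(scored, key=lambda s: -s[0]) if score >= threshold]
```

With the 50% threshold mentioned above, a video agreeing on season but not weather (degree 0.5) would still be offered to the user.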
- the navigation method provided by the embodiments of the present disclosure may be executed by a navigation server on the network side or by a terminal with a navigation function.
- the terminal in this embodiment may be any device having a video playing function, such as a mobile phone, a computer, a digital broadcasting terminal, a messaging device, a game console, a tablet device, a personal digital assistant, and the like.
- Embodiments of the present disclosure accordingly provide two sets of navigation methods, depending on which subject executes them, as follows:
- FIG. 6 is a flowchart of a navigation method according to another exemplary embodiment. As shown in FIG. 6, the method includes the following steps:
- step S61: the server receives the navigation request information sent by the terminal;
- step S62: the server determines a navigation video that matches the navigation request information;
- step S63: the server sends the navigation video to the terminal, and the terminal plays the navigation video for the user to navigate.
- the server pushes the matched navigation video to the requesting terminal. This way, the terminal does not need a large amount of storage space for navigation video data, but some of the terminal's data traffic is consumed.
- FIG. 7 is a flowchart of a navigation method according to another exemplary embodiment. As shown in FIG. 7, the method includes the following steps:
- step S71: the terminal receives navigation request information input by the user;
- step S72: the terminal determines a navigation video that matches the navigation request information;
- step S73: the terminal plays the navigation video and performs navigation.
- the terminal pre-stores a large amount of navigation video data, which occupies terminal storage space but saves the terminal's data traffic; and because the query is completed locally, it is faster.
- whether the server or the terminal performs the navigation video matching, if there are multiple matching navigation videos, the user can select one when playing the navigation video.
- FIG. 8 is a flowchart of a navigation method according to another exemplary embodiment. As shown in FIG. 8 , optionally, when the determined navigation video is at least two, the following steps are performed:
- step S81: the navigation videos matching the navigation request information are arranged and displayed;
- step S82: a selection operation by the user on one of the navigation videos is received;
- step S83: the selected navigation video is played.
- the navigation video obtained by the query can be displayed to the user, and the user selects which video to use for navigation, so that the navigation is more intelligent and the user experience is better.
- the playback speed of the navigation video is adjusted according to the driving speed.
- FIG. 9 is a flowchart of a navigation method according to another exemplary embodiment. As shown in FIG. 9 , optionally, the following steps are performed:
- step S91: the current traveling speed is acquired;
- step S92: a playing speed of the navigation video is determined according to the current traveling speed;
- step S93: the navigation video is played at the playing speed.
- the playing speed of the navigation video is adjusted according to the current driving speed. For example, when the user decelerates, the playing speed of the navigation video slows down accordingly; when the user stops, the navigation video stops playing; and when the user accelerates, the playback speed of the navigation video accelerates accordingly. In this way, the video playing speed matches the current traveling speed, the picture the user sees in the navigation video corresponds to the actual road at that moment, the navigation effect is more accurate, and the user experience is better.
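One way to realize the speed matching described above is to scale the playback rate by the ratio of the current driving speed to the average speed recorded for the video. This is only a sketch under assumptions: the function name and the upper clamp are illustrative, not part of the disclosure.

```python
# Hypothetical sketch: derive the playback rate from the driver's current
# speed and the average speed recorded when the video was shot.
# The clamp at max_rate is an illustrative assumption.

def playback_rate(current_speed_kmh: float, video_avg_speed_kmh: float,
                  max_rate: float = 4.0) -> float:
    """Playback rate: 1.0 means real time; 0.0 pauses the video."""
    if current_speed_kmh <= 0:          # vehicle stopped -> pause playback
        return 0.0
    rate = current_speed_kmh / video_avg_speed_kmh
    return min(rate, max_rate)          # avoid absurdly fast playback
```

Driving at half the recorded average speed would play the video at half speed; stopping pauses it.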
- the method further includes: synchronizing navigation data from the network side, the navigation data includes a navigation video; and storing the navigation data in a local navigation database.
- the terminal queries the navigation video that matches the navigation request information from the local navigation database.
- the navigation data can be synchronized from the network side in advance.
- the user downloads the navigation videos of the current city or the required area in advance, so that no data traffic is consumed during real-time navigation.
- Synchronizing navigation data from the network side can be periodic, such as once a week.
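The periodic synchronization described above might be sketched as follows. The record layout, the injected fetch function, and the one-week interval are illustrative assumptions for the example.

```python
# Hypothetical sketch of periodic synchronization of navigation data into a
# local store. The one-week interval and dict-based store are assumptions.

SYNC_INTERVAL_S = 7 * 24 * 3600  # e.g. once a week

def sync_if_due(local_db: dict, fetch_navigation_data, now: float) -> bool:
    """Pull navigation videos from the network side when the interval has
    elapsed since the last synchronization; return True if a sync ran."""
    last = local_db.get("last_sync", 0.0)
    if now - last < SYNC_INTERVAL_S:
        return False                     # not due yet
    local_db["videos"] = fetch_navigation_data()
    local_db["last_sync"] = now
    return True
```

Subsequent calls within the interval are no-ops, so real-time navigation queries hit only the local database.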
- the embodiment of the present disclosure further provides a navigation video generation method, which is applied to a terminal.
- the terminal can be any mobile recording device, a computer, a digital broadcasting terminal, a messaging device, a game console, a tablet device, a personal digital assistant, a driving recorder, and the like.
- FIG. 10 is a flowchart of a method for generating a navigation video according to an exemplary embodiment. As shown in FIG. 10, the method includes the following steps:
- step S101: navigation parameters input by a user are acquired, where the navigation parameters include a navigation start point and a navigation end point;
- step S102: the road is shot in real scene, and when the navigation end point is reached, shooting stops, producing a captured video;
- step S103: the captured video is associated with the navigation parameters to obtain a navigation video;
- step S104: the navigation video is uploaded to the network side.
- the terminal is configured to capture a video on the real road, associate the navigation parameters, generate a navigation video, and send the video to the network side server.
- the navigation video can be used for navigation, reducing the time the user spends interpreting the map and its simplified symbols while driving and making road conditions and directions easier to recognize, thereby improving both navigation accuracy and user experience.
- the method further includes: recording the traveling speed, and calculating an average traveling speed according to the traveling speed.
- step S103 then further includes associating the average traveling speed, as a navigation parameter, with the captured video.
- the average driving speed can also be added to the navigation parameters, so that subsequent matching of the navigation video is more accurate, improving navigation accuracy.
- the terminal may also record navigation parameters such as the travel distance corresponding to the navigation video, so as to facilitate subsequent accurate query matching.
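Step S103, associating navigation parameters with the captured video, amounts to attaching metadata to the footage. The sketch below illustrates one possible record layout; the type and field names are assumptions for the example, not the disclosed format.

```python
# Hypothetical sketch of step S103: attach navigation parameters to the
# captured video as metadata. Field names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class NavigationVideo:
    video_file: str                              # path to captured footage
    start: str                                   # navigation start point
    end: str                                     # navigation end point
    extra: dict = field(default_factory=dict)    # season, weather, ...

def associate(video_file, start, end, avg_speed_kmh=None, distance_km=None):
    """Build the navigation video record; optional parameters such as the
    average speed and travel distance are included when recorded."""
    extra = {}
    if avg_speed_kmh is not None:
        extra["avg_speed_kmh"] = avg_speed_kmh
    if distance_km is not None:
        extra["distance_km"] = distance_km
    return NavigationVideo(video_file, start, end, extra)
```

The resulting record is what would be uploaded to the network side in step S104.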
- FIG. 11 is a block diagram of a navigation device that can be implemented as part or all of an electronic device by software, hardware, or a combination of both, according to an exemplary embodiment. As shown in FIG. 11, the navigation device includes:
- the obtaining module 111 is configured to obtain navigation request information.
- a determining module 112 configured to determine a navigation video that matches the navigation request information, where the navigation video is a video obtained by real-scene shooting of the road;
- the navigation module 113 is configured to perform navigation according to the navigation video.
- FIG. 12 is a block diagram of a determining module according to an exemplary embodiment.
- the navigation request information includes navigation parameters: a navigation starting point and a navigation end point
- the determining module 112 includes:
- a first obtaining sub-module 121 configured to acquire a navigation starting point and a navigation end point of the navigation video
- a determining sub-module 122 configured to take, as a navigation video matching the navigation start point and the navigation end point, a navigation video whose navigation start point is the same as that of the navigation request information and whose navigation end point is the same as that of the navigation request information.
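The exact start/end matching performed by the determining sub-module 122 amounts to a simple filter, sketched below under an assumed record layout (dicts with `start` and `end` keys are an illustrative assumption).

```python
# Hypothetical sketch of the exact start/end match: keep only videos whose
# start and end points equal those of the navigation request.
def exact_matches(videos, start, end):
    """Videos whose navigation start and end points both equal the request's."""
    return [v for v in videos if v["start"] == start and v["end"] == end]
```

Several videos may survive this filter, in which case the further parameter-based ranking described elsewhere in this disclosure applies.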
- FIG. 13 is a block diagram of a determination module according to another exemplary embodiment.
- the navigation request information includes navigation parameters: a navigation start point and a navigation end point
- the determining module 112 includes:
- a first calculation sub-module 131 configured to calculate a navigation route according to the navigation starting point and the navigation end point;
- the query sub-module 132 is configured to query a navigation video that includes the navigation route
- the intercepting sub-module 133 is configured to intercept a navigation video corresponding to the navigation route from a navigation video that includes the navigation route, and use a navigation video corresponding to the navigation route as a navigation video that matches the navigation start point and the navigation end point.
- FIG. 14 is a block diagram of a determining module according to another exemplary embodiment.
- the navigation request information includes navigation parameters: a navigation starting point and a navigation end point
- the determining module 112 includes:
- a first calculation sub-module 131 configured to calculate a navigation route according to the navigation starting point and the navigation end point;
- a dividing sub-module 142 configured to divide the navigation route into at least two navigation sub-routes;
- the query sub-module 143 is configured to separately query a navigation video corresponding to the navigation sub-route;
- a splicing sub-module 144 configured to splice the navigation videos corresponding to the navigation sub-routes to obtain a navigation video that matches the navigation start point and the navigation end point.
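The divide/query/splice strategy of FIG. 14 can be sketched as follows: split the route at intermediate waypoints, look up a clip per sub-route, and join the clips in order. Looking clips up by a (start, end) pair is an illustrative assumption, not the disclosed storage scheme.

```python
# Hypothetical sketch of the divide/query/splice strategy of FIG. 14.
def split_route(route):
    """[A, B, C] -> [(A, B), (B, C)]: consecutive sub-routes."""
    return list(zip(route, route[1:]))

def splice(route, clip_index):
    """Concatenate per-sub-route clips in order; None if any is missing."""
    clips = []
    for leg in split_route(route):
        clip = clip_index.get(leg)
        if clip is None:
            return None          # no video covers this sub-route
        clips.append(clip)
    return clips
```

This lets a full-route navigation video be assembled even when no single captured video covers the entire route.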
- FIG. 15 is a block diagram of a determination module according to another exemplary embodiment.
- the navigation request information further includes at least one of the following navigation parameters: an area name, a road segment name, a season, a weather condition, an average driving speed, and a driving distance;
- the determining module further includes:
- a second obtaining sub-module 151 configured to, when at least two navigation videos matching the navigation start point and the navigation end point are obtained by the query, acquire the navigation parameters of the navigation videos obtained by the query;
- a second calculation sub-module 152 configured to calculate a matching degree between the navigation parameter of the navigation request information and the navigation parameter of the navigation video obtained by the query;
- the determining sub-module 122 is configured to determine the queried navigation video with the highest matching degree as the navigation video that matches the navigation request information, or to determine the queried navigation videos whose matching degree exceeds a preset threshold as navigation videos matching the navigation request information, or to determine a preset number of queried navigation videos with the highest matching degrees as navigation videos matching the navigation request information.
- the navigation module is configured to send the navigation video to the terminal when the device is applied to the network side, and play the navigation video by the terminal.
- the navigation module is configured to play the navigation video when the device is applied to the terminal.
- FIG. 16 is a block diagram of a navigation module, according to an exemplary embodiment. As shown in FIG. 16, the navigation module 113 includes:
- the display sub-module 161 is configured to, when at least two navigation videos matching the navigation request information are determined, arrange and display the navigation videos matching the navigation request information;
- a receiving submodule 162 configured to receive a user selected operation of one of the navigation videos
- the playing sub-module 163 is configured to play the selected navigation video.
- FIG. 17 is a block diagram of a navigation module, according to another exemplary embodiment. As shown in FIG. 17, optionally, the navigation module 113 includes:
- an obtaining sub-module 171 configured to acquire the current traveling speed;
- a determining sub-module 172 configured to determine a playing speed of the navigation video according to the current traveling speed;
- the playing sub-module 173 is configured to play the navigation video at the playing speed.
- FIG. 18 is a block diagram of a navigation device, according to another exemplary embodiment. As shown in FIG. 18, optionally, the device further includes:
- the synchronization module 114 is configured to synchronize navigation data from the network side when the device is applied to the terminal, where the navigation data includes the navigation video;
- a storage module 115 configured to store the navigation data in a local navigation database
- the determining module 112 is configured to query, from the local navigation database, a navigation video that matches the navigation request information.
- the embodiment of the present disclosure further provides a navigation video generating apparatus, which is applied to a terminal.
- FIG. 19 is a block diagram of a navigation video generating apparatus, according to an exemplary embodiment. As shown in Figure 19, the device includes:
- An obtaining module 191 configured to acquire a navigation parameter input by a user, where the navigation parameter includes a navigation start point and a navigation end point;
- the photographing module 192 is configured to perform real-time shooting on the road, and when the navigation end point is reached, stop shooting and obtain a shooting video;
- the association module 193 is configured to associate the captured video with the navigation parameter to obtain a navigation video.
- the uploading module 194 is configured to upload the navigation video to the network side.
- the apparatus further includes:
- a recording module 195 configured to record a traveling speed
- a calculation module 196 configured to calculate an average traveling speed according to the traveling speed
- the association module 193 is configured to associate the average traveling speed as a navigation parameter with the captured video.
- the embodiment of the present disclosure further provides a navigation apparatus, including:
- a processor;
- a memory for storing processor-executable instructions;
- wherein the processor is configured to: obtain navigation request information; determine a navigation video that matches the navigation request information, the navigation video being a video obtained by real-scene shooting of the road; and perform navigation according to the navigation video.
- the embodiment of the present disclosure further provides a navigation video generating apparatus, including:
- a processor;
- a memory for storing processor-executable instructions;
- wherein the processor is configured to: acquire navigation parameters input by a user, the navigation parameters including a navigation start point and a navigation end point; shoot the road in real scene and, upon reaching the navigation end point, stop shooting to obtain a captured video; associate the captured video with the navigation parameters to obtain a navigation video; and upload the navigation video to the network side.
- FIG. 20 is a block diagram of an apparatus for navigating or generating a navigation video, the apparatus being applicable to a terminal device, according to an exemplary embodiment.
- device 2000 can be a video camera, a recording device, a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, and the like.
- Apparatus 2000 can include one or more of the following components: processing component 2002, memory 2004, power component 2006, multimedia component 2008, audio component 2010, input/output (I/O) interface 2012, sensor component 2014, and communication component 2016 .
- Processing component 2002 typically controls the overall operation of device 2000, such as operations associated with display, telephone calls, data communication, camera operations, and recording operations.
- Processing component 2002 may include one or more processors 2020 to execute instructions to perform all or part of the steps of the above described methods.
- processing component 2002 can include one or more modules to facilitate interaction between component 2002 and other components.
- processing component 2002 can include a multimedia module to facilitate interaction between multimedia component 2008 and processing component 2002.
- the memory 2004 is configured to store various types of data to support operation at the device 2000. Examples of such data include instructions for any application or method operating on device 2000, contact data, phone book data, messages, pictures, videos, and the like.
- the memory 2004 can be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
- Power component 2006 provides power to various components of device 2000.
- Power component 2006 can include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 2000.
- the multimedia component 2008 includes a screen that provides an output interface between the device 2000 and the user.
- the screen can include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen can be implemented as a touch screen to receive input signals from the user.
- the touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may sense not only the boundary of the touch or sliding action, but also the duration and pressure associated with the touch or slide operation.
- the multimedia component 2008 includes a front camera and/or a rear camera. When the device 2000 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front and rear camera can be a fixed optical lens system or have focal length and optical zoom capabilities.
- the audio component 2010 is configured to output and/or input audio signals.
- audio component 2010 A microphone (MIC) is included that is configured to receive an external audio signal when device 2000 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode.
- the received audio signal may be further stored in memory 2004 or transmitted via communication component 2016.
- the audio component 2010 also includes a speaker for outputting an audio signal.
- the I/O interface 2012 provides an interface between the processing component 2002 and the peripheral interface module, which may be a keyboard, a click wheel, a button, or the like. These buttons may include, but are not limited to, a home button, a volume button, a start button, and a lock button.
- the sensor assembly 2014 includes one or more sensors for providing a status assessment of various aspects to the device 2000.
- sensor assembly 2014 can detect the open/closed state of device 2000 and the relative positioning of components, such as the display and keypad of device 2000, and can also detect a change in position of device 2000 or of one of its components, the presence or absence of user contact with device 2000, the orientation or acceleration/deceleration of device 2000, and a change in the temperature of device 2000.
- the sensor assembly 2014 can include a proximity sensor configured to detect the presence of nearby objects without any physical contact.
- Sensor assembly 2014 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications.
- the sensor assembly 2014 can also include an acceleration sensor, a gyro sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
- Communication component 2016 is configured to facilitate wired or wireless communication between device 2000 and other devices.
- the device 2000 can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof.
- the communication component 2016 receives broadcast signals or broadcast associated information from an external broadcast management system via a broadcast channel.
- the communication component 2016 also includes a near field communication (NFC) module to facilitate short range communication.
- the NFC module can be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
- apparatus 2000 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above methods.
- Also provided is a non-transitory computer-readable storage medium comprising instructions, such as the memory 2004 comprising instructions executable by the processor 2020 of the apparatus 2000 to perform the above methods.
- the non-transitory computer readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device.
- FIG. 21 is a block diagram of an apparatus for navigation, according to an exemplary embodiment.
- device 2100 can be provided as a server.
- Apparatus 2100 includes a processing component 2122 that further includes one or more processors, and memory resources represented by memory 2132 for storing instructions executable by processing component 2122, such as an application.
- the application stored in memory 2132 may include one or more modules each corresponding to a set of instructions.
- processing component 2122 is configured to execute instructions to perform the methods described above.
- the device 2100 can also include a power supply component 2126 configured to perform power management of the device 2100, a wired or wireless network interface 2150 configured to connect the device 2100 to the network, and an input/output (I/O) interface 2158.
- the device 2100 can operate based on an operating system stored in the memory 2132, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
- Also provided is a non-transitory computer-readable storage medium; when instructions in the storage medium are executed by a processor of the apparatus 2000, the apparatus 2000 is enabled to perform the navigation video generation method described above.
- the method may further include recording the traveling speed and calculating an average traveling speed according to the traveling speed;
- associating the captured video with the navigation parameters then includes:
- associating the average traveling speed, as a navigation parameter, with the captured video.
- Also provided is a non-transitory computer-readable storage medium; when instructions in the storage medium are executed by a processor of the apparatus 2000 or the apparatus 2100, the apparatus 2000 or the apparatus 2100 is enabled to perform the navigation method described above, the method comprising:
- the navigation video being a video obtained by taking a live view of the road;
- the navigation request information includes a navigation parameter: a navigation start point and a navigation end point
- the determining a navigation video that matches the navigation request information includes:
- taking, as a navigation video matching the navigation start point and the navigation end point, a navigation video whose navigation start point is the same as that of the navigation request information and whose navigation end point is the same as that of the navigation request information.
- the navigation request information includes a navigation parameter: a navigation start point and a navigation end point
- the determining a navigation video that matches the navigation request information includes:
- the navigation video corresponding to the navigation route is intercepted from the navigation video that includes the navigation route, and the navigation video corresponding to the navigation route is used as a navigation video that matches the navigation start point and the navigation end point.
- the navigation request information includes a navigation parameter: a navigation start point and a navigation end point
- the determining a navigation video that matches the navigation request information includes:
- the navigation video corresponding to the navigation sub-route is spliced to obtain a navigation video that matches the navigation start point and the navigation end point.
- the navigation request information further includes at least one of the following navigation parameters: an area name, a link name, a season, a weather, an average travel speed, and a travel distance;
- the determining the navigation video that matches the navigation request information further includes:
- performing navigation according to the navigation video includes:
- the navigation video is sent to a terminal, and the navigation video is played by the terminal.
- performing navigation according to the navigation video includes:
- the navigating according to the navigation video includes:
- the navigation video is played at the playback speed.
- when the method is applied to a terminal, the method includes:
- Determining a navigation video that matches the navigation request information including:
- a navigation video matching the navigation request information is queried from a local navigation database.
Claims (26)
- A navigation method, characterized in that the method comprises: obtaining navigation request information; determining a navigation video that matches the navigation request information, the navigation video being a video obtained by real-scene shooting of a road; and performing navigation according to the navigation video.
- The method according to claim 1, characterized in that the navigation request information includes navigation parameters: a navigation start point and a navigation end point, and determining a navigation video that matches the navigation request information comprises: acquiring the navigation start point and navigation end point of a navigation video; and taking, as a navigation video matching the navigation start point and the navigation end point, a navigation video whose navigation start point is the same as that of the navigation request information and whose navigation end point is the same as that of the navigation request information.
- The method according to claim 1, characterized in that the navigation request information includes navigation parameters: a navigation start point and a navigation end point, and determining a navigation video that matches the navigation request information comprises: calculating a navigation route according to the navigation start point and the navigation end point; querying navigation videos that include the navigation route; and intercepting, from a navigation video that includes the navigation route, the navigation video corresponding to the navigation route, and taking the navigation video corresponding to the navigation route as a navigation video matching the navigation start point and the navigation end point.
- The method according to claim 1, characterized in that the navigation request information includes navigation parameters: a navigation start point and a navigation end point, and determining a navigation video that matches the navigation request information comprises: calculating a navigation route according to the navigation start point and the navigation end point; dividing the navigation route into at least two navigation sub-routes; separately querying the navigation videos corresponding to the navigation sub-routes; and splicing the navigation videos corresponding to the navigation sub-routes to obtain a navigation video matching the navigation start point and the navigation end point.
- The method according to any one of claims 2-4, characterized in that the navigation request information further includes at least one of the following navigation parameters: an area name, a road segment name, a season, a weather condition, an average driving speed, and a driving distance; and when at least two navigation videos matching the navigation start point and the navigation end point are obtained by the query, determining a navigation video that matches the navigation request information further comprises: acquiring the navigation parameters of the navigation videos obtained by the query; calculating the matching degree between the navigation parameters of the navigation request information and the navigation parameters of the navigation videos obtained by the query; and determining the queried navigation video with the highest matching degree as the navigation video matching the navigation request information, or determining the queried navigation videos whose matching degree exceeds a preset threshold as navigation videos matching the navigation request information, or determining a preset number of queried navigation videos with the highest matching degrees as navigation videos matching the navigation request information.
- The method according to claim 1, characterized in that when the method is applied to the network side, performing navigation according to the navigation video comprises: sending the navigation video to a terminal, the navigation video being played by the terminal.
- The method according to claim 1, characterized in that when the method is applied to a terminal, performing navigation according to the navigation video comprises: playing the navigation video.
- The method according to claim 7, characterized in that when at least two navigation videos matching the navigation request information are determined, performing navigation according to the navigation video comprises: arranging and displaying the navigation videos matching the navigation request information; receiving a selection operation by the user on one of the navigation videos; and playing the selected navigation video.
- The method according to claim 7, characterized in that performing navigation according to the navigation video comprises: acquiring the current traveling speed; determining a playing speed of the navigation video according to the current traveling speed; and playing the navigation video at the playing speed.
- The method according to claim 1, characterized in that when the method is applied to a terminal, the method comprises: synchronizing navigation data from the network side, the navigation data including the navigation video; storing the navigation data in a local navigation database; and determining a navigation video that matches the navigation request information comprises: querying the local navigation database for a navigation video matching the navigation request information.
- A navigation video generation method, characterized by comprising: acquiring navigation parameters input by a user, the navigation parameters including a navigation start point and a navigation end point; shooting the road in real scene and, upon reaching the navigation end point, stopping shooting to obtain a captured video; associating the captured video with the navigation parameters to obtain a navigation video; and uploading the navigation video to the network side.
- The method according to claim 11, characterized in that the method further comprises: recording the traveling speed; and calculating an average traveling speed according to the traveling speed; and associating the captured video with the navigation parameters comprises: associating the average traveling speed, as a navigation parameter, with the captured video.
- A navigation device, characterized by comprising: an obtaining module configured to obtain navigation request information; a determining module configured to determine a navigation video that matches the navigation request information, the navigation video being a video obtained by real-scene shooting of a road; and a navigation module configured to perform navigation according to the navigation video.
- The device according to claim 13, characterized in that the navigation request information includes navigation parameters: a navigation start point and a navigation end point, and the determining module comprises: a first obtaining sub-module configured to acquire the navigation start point and navigation end point of a navigation video; and a determining sub-module configured to take, as a navigation video matching the navigation start point and the navigation end point, a navigation video whose navigation start point is the same as that of the navigation request information and whose navigation end point is the same as that of the navigation request information.
- The device according to claim 13, characterized in that the navigation request information includes navigation parameters: a navigation start point and a navigation end point, and the determining module comprises: a first calculation sub-module configured to calculate a navigation route according to the navigation start point and the navigation end point; a query sub-module configured to query navigation videos that include the navigation route; and an intercepting sub-module configured to intercept, from a navigation video that includes the navigation route, the navigation video corresponding to the navigation route, and to take the navigation video corresponding to the navigation route as a navigation video matching the navigation start point and the navigation end point.
- The device according to claim 13, characterized in that the navigation request information includes navigation parameters: a navigation start point and a navigation end point, and the determining module comprises: a first calculation sub-module configured to calculate a navigation route according to the navigation start point and the navigation end point; a dividing sub-module configured to divide the navigation route into at least two navigation sub-routes; a query sub-module configured to separately query the navigation videos corresponding to the navigation sub-routes; and a splicing sub-module configured to splice the navigation videos corresponding to the navigation sub-routes to obtain a navigation video matching the navigation start point and the navigation end point.
- The device according to any one of claims 14-16, characterized in that the navigation request information further includes at least one of the following navigation parameters: an area name, a road segment name, a season, a weather condition, an average driving speed, and a driving distance; and the determining module comprises: a second obtaining sub-module configured to, when at least two navigation videos matching the navigation start point and the navigation end point are obtained by the query, acquire the navigation parameters of the navigation videos obtained by the query; a second calculation sub-module configured to calculate the matching degree between the navigation parameters of the navigation request information and the navigation parameters of the navigation videos obtained by the query; and the determining sub-module configured to determine the queried navigation video with the highest matching degree as the navigation video matching the navigation request information, or to determine the queried navigation videos whose matching degree exceeds a preset threshold as navigation videos matching the navigation request information, or to determine a preset number of queried navigation videos with the highest matching degrees as navigation videos matching the navigation request information.
- The device according to claim 13, characterized in that the navigation module is configured to, when the device is applied to the network side, send the navigation video to a terminal, the navigation video being played by the terminal.
- The device according to claim 13, characterized in that the navigation module is configured to play the navigation video when the device is applied to a terminal.
- The device according to claim 19, characterized in that the navigation module comprises: a display sub-module configured to, when at least two navigation videos matching the navigation request information are determined, arrange and display the navigation videos matching the navigation request information; a receiving sub-module configured to receive a selection operation by the user on one of the navigation videos; and a playing sub-module configured to play the selected navigation video.
- The device according to claim 19, characterized in that the navigation module comprises: an obtaining sub-module configured to acquire the current traveling speed; a determining sub-module configured to determine a playing speed of the navigation video according to the current traveling speed; and a playing sub-module configured to play the navigation video at the playing speed.
- The device according to claim 13, characterized in that the device further comprises: a synchronization module configured to, when the device is applied to a terminal, synchronize navigation data from the network side, the navigation data including the navigation video; a storage module configured to store the navigation data in a local navigation database; and the determining module being configured to query the local navigation database for a navigation video matching the navigation request information.
- A navigation video generating device, characterized by comprising: an obtaining module configured to acquire navigation parameters input by a user, the navigation parameters including a navigation start point and a navigation end point; a shooting module configured to shoot the road in real scene and, upon reaching the navigation end point, stop shooting to obtain a captured video; an association module configured to associate the captured video with the navigation parameters to obtain a navigation video; and an uploading module configured to upload the navigation video to the network side.
- The device according to claim 23, characterized by further comprising: a recording module configured to record the traveling speed; a calculation module configured to calculate an average traveling speed according to the traveling speed; and the association module being configured to associate the average traveling speed, as a navigation parameter, with the captured video.
- A navigation device, characterized by comprising: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to: obtain navigation request information; determine a navigation video that matches the navigation request information, the navigation video being a video obtained by real-scene shooting of a road; and perform navigation according to the navigation video.
- A navigation video generating device, characterized by comprising: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to: acquire navigation parameters input by a user, the navigation parameters including a navigation start point and a navigation end point; shoot the road in real scene and, upon reaching the navigation end point, stop shooting to obtain a captured video; associate the captured video with the navigation parameters to obtain a navigation video; and upload the navigation video to the network side.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
MX2016004211A MX2016004211A (es) | 2015-09-22 | 2015-12-29 | Metodo y dispositivo para navegar y metodo y dispositivo para generar video de navegacion. |
RU2016112333A RU2630709C1 (ru) | 2015-09-22 | 2015-12-29 | Способ и устройство навигации, способ и устройство формирования навигационного видеоизображения |
JP2017541156A JP2017538948A (ja) | 2015-09-22 | 2015-12-29 | ナビゲーション、ナビゲーションビデオ生成方法および装置 |
KR1020167005586A KR20180048221A (ko) | 2015-09-22 | 2015-12-29 | 네비게이션, 네비게이션 비디오 생성 방법 및 장치 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510609516.0A CN105222802A (zh) | 2015-09-22 | 2015-09-22 | 导航、导航视频生成方法及装置 |
CN201510609516.0 | 2015-09-22 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017049796A1 true WO2017049796A1 (zh) | 2017-03-30 |
Family
ID=54991890
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2015/099304 WO2017049796A1 (zh) | 2015-09-22 | 2015-12-29 | 导航、导航视频生成方法及装置 |
Country Status (8)
Country | Link |
---|---|
US (1) | US20170082451A1 (zh) |
EP (1) | EP3156767B1 (zh) |
JP (1) | JP2017538948A (zh) |
KR (1) | KR20180048221A (zh) |
CN (1) | CN105222802A (zh) |
MX (1) | MX2016004211A (zh) |
RU (1) | RU2630709C1 (zh) |
WO (1) | WO2017049796A1 (zh) |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105222773B (zh) * | 2015-09-29 | 2018-09-21 | 小米科技有限责任公司 | 导航方法及装置 |
CN105828288A (zh) * | 2016-03-21 | 2016-08-03 | 乐视网信息技术(北京)股份有限公司 | 一种多媒体控制方法及装置 |
CN107402019A (zh) * | 2016-05-19 | 2017-11-28 | 北京搜狗科技发展有限公司 | 一种视频导航的方法、装置及服务器 |
CN105973227A (zh) * | 2016-06-21 | 2016-09-28 | 上海磐导智能科技有限公司 | 可视化实景导航方法 |
CN107576332B (zh) * | 2016-07-04 | 2020-08-04 | 百度在线网络技术(北京)有限公司 | 一种换乘导航的方法和装置 |
CN107621265A (zh) * | 2016-07-14 | 2018-01-23 | 百度在线网络技术(北京)有限公司 | 一种用于进行室内导航的方法和装置 |
CN108020231A (zh) * | 2016-10-28 | 2018-05-11 | 大辅科技(北京)有限公司 | 一种基于视频的地图***及导航方法 |
EP3545672A4 (en) * | 2016-11-22 | 2020-10-28 | Volkswagen Aktiengesellschaft | METHOD AND DEVICE FOR VIDEO PROCESSING |
CN107588782A (zh) * | 2017-08-25 | 2018-01-16 | 上海与德科技有限公司 | Driving method and system for virtual navigation |
DE102018003249A1 (de) | 2018-04-20 | 2018-09-27 | Daimler Ag | Method for integrated video image display on a display device of a navigation system |
CN110646002B (zh) * | 2018-06-27 | 2021-07-06 | 百度在线网络技术(北京)有限公司 | Method and device for processing information |
DE102018008560A1 (de) | 2018-10-30 | 2019-03-28 | Daimler Ag | Navigation system |
JP7275556B2 (ja) * | 2018-12-14 | 2023-05-18 | トヨタ自動車株式会社 | Information processing system, program, and information processing method |
DE102019206250A1 (de) * | 2019-05-01 | 2020-11-05 | Siemens Schweiz Ag | Regulation and control of the playback speed of a video |
CN111735473B (zh) * | 2020-07-06 | 2022-04-19 | 无锡广盈集团有限公司 | BeiDou navigation system capable of uploading navigation information |
CN113899359B (zh) * | 2021-09-30 | 2023-02-17 | 北京百度网讯科技有限公司 | Navigation method, device, equipment and storage medium |
CN114370884A (zh) * | 2021-12-16 | 2022-04-19 | 北京三快在线科技有限公司 | Navigation method and device, electronic equipment and readable storage medium |
CN114844987A (zh) * | 2022-04-02 | 2022-08-02 | 咪咕音乐有限公司 | Panoramic navigation method, device, equipment and storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102679989A (zh) * | 2012-05-23 | 2012-09-19 | 李杰波 | Logistics distribution navigation method and logistics distribution vehicle applying same |
CN104180814A (zh) * | 2013-05-22 | 2014-12-03 | 北京百度网讯科技有限公司 | Navigation method in the real-scene function of a mobile terminal and electronic map client |
CN104266654A (zh) * | 2014-09-26 | 2015-01-07 | 广东好帮手电子科技股份有限公司 | Vehicle-mounted real-scene navigation system and method |
CN104897164A (zh) * | 2014-03-06 | 2015-09-09 | 宇龙计算机通信科技(深圳)有限公司 | Video map sharing method, device and system |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5982298A (en) * | 1996-11-14 | 1999-11-09 | Microsoft Corporation | Interactive traffic display and trip planner |
US6133853A (en) * | 1998-07-30 | 2000-10-17 | American Calcar, Inc. | Personal communication and positioning system |
AUPP152098A0 (en) * | 1998-01-28 | 1998-02-19 | Joyce Russ Advertising Pty Ltd | Navigational apparatus using visual description of the route |
JP2003046969A (ja) * | 2001-07-30 | 2003-02-14 | Sony Corp | Information processing device and method, recording medium, and program |
DE10151354A1 (de) * | 2001-10-22 | 2003-05-08 | Andreas Berger | Information and navigation system |
JP2004005493A (ja) * | 2002-04-24 | 2004-01-08 | Vehicle Information & Communication System Center | Driver assistance information transmitting device, driver assistance information receiving device, and driver assistance information providing system |
US20030210806A1 (en) * | 2002-05-07 | 2003-11-13 | Hitachi, Ltd. | Navigational information service with image capturing and sharing |
GB0215217D0 (en) * | 2002-06-29 | 2002-08-14 | Spenwill Ltd | Position referenced multimedia authoring and playback |
JP2005140638A (ja) * | 2003-11-06 | 2005-06-02 | Mitsubishi Electric Corp | Navigation device, road video information creation device, road video information utilization system using these, and recording medium |
JP4503393B2 (ja) * | 2004-08-10 | 2010-07-14 | 省吾 吉村 | Destination guidance device, program, and recording medium therefor |
US7177761B2 (en) * | 2004-10-27 | 2007-02-13 | Navteq North America, Llc | Map display for a navigation system |
US20070150188A1 (en) * | 2005-05-27 | 2007-06-28 | Outland Research, Llc | First-person video-based travel planning system |
DE102006056874B4 (de) * | 2006-12-01 | 2015-02-12 | Siemens Aktiengesellschaft | Navigation device |
CN101459808A (zh) * | 2007-12-10 | 2009-06-17 | 英业达股份有限公司 | Image recording method combined with a positioning system |
US20090254265A1 (en) * | 2008-04-08 | 2009-10-08 | Thimmannagari Chandra Reddy | Video map technology for navigation |
US20110102637A1 (en) * | 2009-11-03 | 2011-05-05 | Sony Ericsson Mobile Communications Ab | Travel videos |
CN101719130A (zh) * | 2009-11-25 | 2010-06-02 | 中兴通讯股份有限公司 | Method and system for implementing a street view map |
JP2013134225A (ja) * | 2011-12-27 | 2013-07-08 | Nomura Research Institute Ltd | Navigation device, system, method, and computer program |
US8666655B2 (en) * | 2012-07-30 | 2014-03-04 | Aleksandr Shtukater | Systems and methods for navigation |
JP5958228B2 (ja) * | 2012-09-21 | 2016-07-27 | 株式会社Jvcケンウッド | Video information providing device and method |
CN104729520B (zh) * | 2015-03-18 | 2018-08-07 | 广东好帮手电子科技股份有限公司 | Navigation recorder and method based on track recording and time search |
2015
- 2015-09-22 CN CN201510609516.0A patent/CN105222802A/zh active Pending
- 2015-12-29 RU RU2016112333A patent/RU2630709C1/ru active
- 2015-12-29 WO PCT/CN2015/099304 patent/WO2017049796A1/zh active Application Filing
- 2015-12-29 MX MX2016004211A patent/MX2016004211A/es unknown
- 2015-12-29 JP JP2017541156A patent/JP2017538948A/ja active Pending
- 2015-12-29 KR KR1020167005586A patent/KR20180048221A/ko not_active Application Discontinuation
2016
- 2016-09-14 US US15/265,621 patent/US20170082451A1/en not_active Abandoned
- 2016-09-15 EP EP16188955.5A patent/EP3156767B1/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102679989A (zh) * | 2012-05-23 | 2012-09-19 | 李杰波 | Logistics distribution navigation method and logistics distribution vehicle applying same |
CN104180814A (zh) * | 2013-05-22 | 2014-12-03 | 北京百度网讯科技有限公司 | Navigation method in the real-scene function of a mobile terminal and electronic map client |
CN104897164A (zh) * | 2014-03-06 | 2015-09-09 | 宇龙计算机通信科技(深圳)有限公司 | Video map sharing method, device and system |
CN104266654A (zh) * | 2014-09-26 | 2015-01-07 | 广东好帮手电子科技股份有限公司 | Vehicle-mounted real-scene navigation system and method |
Also Published As
Publication number | Publication date |
---|---|
KR20180048221A (ko) | 2018-05-10 |
US20170082451A1 (en) | 2017-03-23 |
EP3156767B1 (en) | 2020-11-04 |
EP3156767A2 (en) | 2017-04-19 |
EP3156767A3 (en) | 2017-07-05 |
MX2016004211A (es) | 2017-11-15 |
RU2630709C1 (ru) | 2017-09-12 |
JP2017538948A (ja) | 2017-12-28 |
CN105222802A (zh) | 2016-01-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017049796A1 (zh) | Navigation and navigation video generation method and device | |
CN107957266B (zh) | Positioning method, device and storage medium |
KR101870052B1 (ko) | Navigation method, device, program and recording medium |
KR101668352B1 (ko) | Shooting control method, device, terminal, program and recording medium |
US9477687B2 (en) | Mobile terminal and metadata setting method thereof | |
RU2648625C2 (ru) | Способ и устройство для определения пространственного параметра на основе изображения, а также оконечное устройство | |
CN108259991B (zh) | Video processing method and device |
CN106250430B (zh) | Method and device for sorting a list of smart devices |
KR20170072942A (ko) | Audio cover display method and device |
CN104123339A (zh) | Image management method and device |
CN105956091B (zh) | Extended information acquisition method and device |
WO2017096973A1 (zh) | Device display method and device |
CN105509735B (zh) | Information prompting method, device and terminal |
EP3352453B1 (en) | Photographing method for intelligent flight device and intelligent flight device | |
CN105516592A (zh) | Photographing method and device |
US20170090684A1 (en) | Method and apparatus for processing information | |
CN105959587A (zh) | Shutter speed acquisition method and device |
WO2019006767A1 (zh) | Scenic-spot navigation method and device for an unmanned aerial vehicle |
CN105488074B (zh) | Photo clustering method and device |
WO2021237592A1 (zh) | Anchor point information processing method, device, equipment and storage medium |
CN112146676B (zh) | Information navigation method, device, equipment and storage medium |
WO2015180383A1 (zh) | Method and device for determining a position |
CN109961646B (zh) | Method and device for correcting road condition information |
JP2018503988A (ja) | User information push method and device |
WO2022110801A1 (zh) | Data processing method and device, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2017541156 Country of ref document: JP Kind code of ref document: A
Ref document number: 20167005586 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: MX/A/2016/004211 Country of ref document: MX |
|
ENP | Entry into the national phase |
Ref document number: 2016112333 Country of ref document: RU Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15904670 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15904670 Country of ref document: EP Kind code of ref document: A1 |