CN103913174A - Navigation information generation method and system, mobile client and server


Info

Publication number
CN103913174A
CN103913174A (application CN201210592118.9A)
Authority
CN
China
Prior art keywords
information
user
mobile client
image
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201210592118.9A
Other languages
Chinese (zh)
Other versions
CN103913174B (en)
Inventor
乔宇
邹静
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS filed Critical Shenzhen Institute of Advanced Technology of CAS
Priority to CN201210592118.9A priority Critical patent/CN103913174B/en
Publication of CN103913174A publication Critical patent/CN103913174A/en
Application granted granted Critical
Publication of CN103913174B publication Critical patent/CN103913174B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 Map- or contour-matching
    • G01C21/32 Structuring or formatting of map data

Abstract

The invention, which is applicable to the field of information technology, provides a navigation information generation method and system. In the method, a mobile client receives or captures a digital image of the surroundings of the user's current location. After the image passes a quality check, image features are extracted according to the image type and a key region that is either divided automatically by the system or delineated by the user. The image features are sent to a server together with geographic positioning information; the server retrieves the environmental information for the key region of the image and at the same time calculates the shooting position and the user's current viewing angle. The server returns the environmental information and viewing-angle information to the mobile client, which fuses them with the digital image by augmented-reality means to form an augmented-reality navigation map that integrates environmental information and is based on the user's viewing angle. The method improves the accuracy of the navigation system, provides a user-viewing-angle-based, environment-aware augmented-reality navigation map, and enhances the user experience.

Description

Navigation information generation method and system, mobile client, and server
Technical field
The invention belongs to the field of information technology, and in particular relates to a navigation information generation method and system, a mobile client, and a server.
Background art
Existing augmented-reality vehicle navigation systems extract various road-condition features from images collected in real time by an image acquisition device, judge the road situation (including traffic-light features, lane features, pedestrian features, features of motor vehicles ahead, bicycle features, and so on), generate navigation prompts in combination with a predefined route (including turning information, lane-merging information, danger warnings, distance to the destination, road names, and device status), and superimpose these prompts on the current road image. Such systems mainly provide road-condition recognition and navigation; they cannot recognize the buildings in the image, the positioning information they provide is limited, and they can only judge the user's current heading after the user has moved an appreciable distance.
Traditional navigation systems for mobile devices mostly obtain the current position through satellite GPS or wireless Wi-Fi positioning, then load the electronic map of the current location and generate a two-dimensional or three-dimensional navigation map. The positioning information such systems provide is also rather limited: most offer only a top-down two-dimensional electronic map, fail to provide information about the user's surroundings (such as the names of nearby landmarks), and cannot judge the user's heading while the user is stationary. For most handheld mobile users, however, geographic coordinates alone are not enough; they also need salient environmental information (such as landmark buildings) to understand precisely where they are. A conventional navigation system only reports the street or the latitude and longitude of the current position, which does not let the user truly grasp their location. For example, a navigation system will normally only indicate that the user is on a certain street or at a certain intersection, but in an unfamiliar environment that information alone rarely makes the position clear; if the system also presents information about the buildings around the user, the user can quickly understand where they are.
It follows that, faced with increasingly complex and multi-level urban traffic layouts, a traditional navigation system that relies solely on GPS to locate the user on a two-dimensional electronic map easily provides wrong or ambiguous positioning information and navigation services, which greatly affects its accuracy.
On the other hand, most existing navigation systems cannot judge the user's heading while the user is stationary; they usually require the user to move a certain distance first, which is very inconvenient in practice, especially for handheld mobile users.
Meanwhile, the navigation maps provided by existing navigators are mostly bird's-eye views that do not match the user's actual viewing angle. In densely built cities, many users cannot work out their correct heading from the directions shown in a top-down navigation map, so the navigation system needs to provide directions based on the user's own viewing angle.
Summary of the invention
The object of the present invention is to provide a navigation information generation method and system, a mobile client, and a server, so as to solve the following problems of the prior art: existing navigation systems locate the user on a two-dimensional plane by a single geolocation means and therefore easily provide wrong or ambiguous positioning information and navigation services, which greatly affects accuracy; most existing navigation systems can only judge the user's heading after the user has moved a certain distance, so a method for determining the heading while the user is stationary is needed; most existing navigators provide only a top-down navigation map, which does not match the actual viewing angle of the user, in particular a handheld mobile user, and cannot meet the user's need to determine the correct heading; and most existing navigators provide a two-dimensional or virtual map that lacks an explanation of the user's current surroundings and has little sense of reality, making the navigation information hard to understand and accept.
The present invention is implemented as a navigation information generation method comprising the following steps:
the mobile client receives a digital image, captured by the user, of the surroundings of the user's current location;
the mobile client receives type information produced by the system automatically classifying the captured image, or calibration information supplied by the user;
the mobile client extracts building features and/or text features from the captured image according to the type information or the calibration information;
the mobile client obtains the user's current geographic location information;
the mobile client sends the extracted building features and/or the text features or text recognition result, together with the user's current geographic location information, to the server;
the mobile client receives the information fed back by the server and fuses it with the digital image by augmented-reality means to form an augmented-reality navigation map that integrates environmental information and is based on the user's viewing angle.
Another object of the present invention is to provide a navigation information generation system, comprising:
a digital image receiving/capturing module, configured to receive or capture a digital image of the surroundings of the user's current location;
a type division receiving module, configured to receive type information produced by the system automatically classifying the captured image, or calibration information supplied by the user;
a building/text feature extraction module, configured to extract building features and/or text features from the captured image according to the type information or the calibration information;
a geographic location acquisition module, configured to obtain the mobile user's current geographic location information;
a communication module, configured to send the extracted building features and/or text features and the mobile user's current geographic location information to the server, and to receive the environment identification information, geographic information, user viewing angle, and the like fed back by the server;
a synthesis module, configured to fuse the received environment identification information, geographic information, and user viewing angle with the digital image by augmented-reality means, forming an augmented-reality navigation map that integrates environmental information and is based on the user's viewing angle.
Another object of the present invention is to provide a mobile client comprising the navigation information generation system described above.
Another object of the present invention is to provide a navigation information generation method comprising the following steps:
the server receives the building features and/or text sent by the mobile client and the mobile user's current geographic location information;
the server extracts, from a building image database, building images relevant to that geographic position according to the geographic location information;
the server extracts the building features of each of the relevant building images;
the server compares the extracted building features with the building features in the received image, thereby identifying the building;
the server extracts the environmental information associated with the building with the highest similarity;
the server compares the received building image features with the features of the most similar building image and, using the actual geographic position of the building, calculates the user's current viewing angle;
the server retrieves the corrected text sent by the mobile client in a geographic information database in combination with the received geographic location information, obtaining environmental information about the mobile user's current geographic surroundings;
the server extracts, from a geographic image database, the geographic image corresponding to that geographic position according to the geographic location information;
the server sends the above geographic image and/or environmental information to the mobile client.
Another object of the present invention is to provide a navigation information generation system, comprising:
a receiving module, configured to receive the building features and/or text sent by the mobile client and the mobile user's current geographic location information;
a building image extraction module, configured to extract, from a building image database, building images relevant to that geographic position according to the geographic location information;
a building feature extraction module, configured to extract the building features of each of the relevant building images;
a comparison module, configured to compare the extracted building features with the building features in the received image, thereby identifying the building;
an environmental information extraction module, configured to extract the environmental information associated with the building with the highest similarity;
a viewing angle calculation module, configured to compare the received image features with the features of the most similar image, thereby calculating the user's viewing angle;
a retrieval module, configured to retrieve the corrected text sent by the mobile client in a geographic information database in combination with the received geographic location information, obtaining environmental information about the mobile user's current geographic surroundings;
a geographic image extraction module, configured to extract, from a geographic image database, the geographic image corresponding to that geographic position according to the geographic location information;
a sending module, configured to send the above environmental information to the mobile client.
Another object of the present invention is to provide a server comprising the navigation information generation system described above.
In the present invention, an environment recognition and navigation scheme based on the user's viewing angle and image content analysis is proposed. The scheme analyzes the digital image taken by the user with a mobile device, identifies the building information in it, and, in combination with GPS positioning and electronic map information, fuses this information with the digital image through augmented reality. Besides giving the user richer environmental information, the augmented-reality image helps the user accurately understand their own position from multiple dimensions. When there is no distinctive landmark building nearby, the user can instead photograph text such as road signs or bus stop boards; the text is analyzed and recognized and, combined with the GPS positioning information, the user's current position can be located accurately. This effectively solves the problem that existing navigation systems mostly rely on geolocation alone to position the user on a two-dimensional electronic map. It also avoids positioning errors and invalid navigation information caused by urban road construction or out-of-date map data, improving navigation accuracy.
On the other hand, the present invention can judge the user's actual heading and mark it on the image without the user moving at all, solving the problem that most existing navigators require the user to move an appreciable distance before their heading can be judged.
On the other hand, the user's current viewing angle is calculated from the image and fused with the photograph taken by the user through augmented reality, providing the user with a navigation map based on their actual viewing angle.
On the other hand, the present invention addresses the weak interactivity and single form of information presentation of existing augmented-reality navigators. A rich interactive interface and a function for delineating the key recognition region improve the accuracy of the system's positioning and environment identification information. By presenting both the photograph taken by the user, with the delineated key region, and a two-dimensional geographic map, the system offers the user two complementary viewpoints: the user's own viewing angle and a top-down two-dimensional map. This gives a more comprehensive, intuitive, and realistic visual experience, strengthens the expressiveness of the positioning and environment identification information, and makes it easier for the user to understand their surroundings.
In addition, the present invention has a personalized interactive interface. Compared with conventional navigation systems it is more interactive and can provide personalized services: the user can operate the interface icons according to their own needs, which improves the usability and friendliness of the system and enhances the user experience.
Brief description of the drawings
Fig. 1 is a flowchart of the navigation information generation method provided by the first embodiment of the invention.
Fig. 2 is a structural diagram of the navigation information generation system provided by the first embodiment of the invention.
Fig. 3 is a flowchart of the navigation information generation method provided by the second embodiment of the invention.
Fig. 4 is a structural diagram of the navigation information generation system provided by the second embodiment of the invention.
Fig. 5 is a diagram of the interaction architecture between the mobile client and the server provided by an embodiment of the invention.
Detailed description of the embodiments
To make the objects, technical solutions, and advantages of the present invention clearer, the invention is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the invention, not to limit it.
The embodiments of the present invention design an environment recognition and navigation scheme based on the user's viewing angle and image content analysis. From an image taken by the user with a mobile terminal, the server analyzes and recognizes the landmark buildings and the road-sign text in the image, computes the geographic position of the image and the shooting orientation, determines the direction of the user's viewing angle, and marks the corresponding building information on the image, thereby providing the user with rich positioning and environmental information on the mobile client. Because the viewing-angle direction is marked on the image, the user can obtain their current heading without moving.
Referring to Fig. 1, the navigation information generation method provided by the first embodiment of the invention mainly comprises the following steps.
In step S101, the mobile client receives a digital image, captured by the user, of the surroundings of the user's current location.
In step S102, the mobile client receives type information produced by the system automatically classifying the captured image, or calibration information supplied by the user.
In the embodiment of the invention, a human-computer interaction interface is provided through which the user can classify the captured image, determining whether building recognition or text recognition is to be performed.
In step S103, the mobile client extracts building features and/or text features from the captured image according to the type information or the calibration information.
In the embodiment of the invention, extracting the building features from the captured image is specifically as follows:
features such as color, texture, and shape are extracted from the captured image to obtain a feature representation of the image. In addition, a SIFT keypoint detector is used to detect SIFT keypoints in the image and to compute their local SIFT descriptors. The image features are uploaded to the server for further analysis and recognition of the image content. The color, texture, and shape features are used for scene classification, while the SIFT features are mainly used for building recognition.
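For illustration only (this sketch is not part of the patent text), such client-side feature extraction could be implemented with OpenCV in Python roughly as follows; the histogram parameters are assumptions.

    import cv2

    def extract_features(image_path):
        """Extract a global color descriptor and local SIFT descriptors from a photo."""
        img = cv2.imread(image_path)
        if img is None:
            raise ValueError("could not read image: " + image_path)

        # Global color feature: a coarse HSV histogram, standing in for the
        # color/texture/shape scene descriptor described in the text.
        hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
        color_hist = cv2.calcHist([hsv], [0, 1, 2], None, [8, 8, 8],
                                  [0, 180, 0, 256, 0, 256])
        color_hist = cv2.normalize(color_hist, None).flatten()

        # Local features: SIFT keypoints and descriptors, used for building matching.
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        sift = cv2.SIFT_create()
        keypoints, descriptors = sift.detectAndCompute(gray, None)

        return color_hist, keypoints, descriptors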
The step of extracting the text features from the captured image is specifically as follows:
the text on road signs, bus stop boards, and the like in the image is recognized. The distribution characteristics of color and text are used to detect and segment the text regions in the image, and OCR is then used to recognize the text. The final recognition result is marked on the image and uploaded to the server.
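As an illustrative sketch only, one possible implementation of this text-region detection and OCR step is shown below using OpenCV and pytesseract; the morphological parameters, size thresholds, and the choice of Tesseract as the OCR engine are assumptions rather than details from the patent.

    import cv2
    import pytesseract

    def recognize_sign_text(image_path, lang="chi_sim+eng"):
        """Detect candidate text regions in a road-sign photo and run OCR on them."""
        img = cv2.imread(image_path)
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

        # Rough text-region segmentation: binarize, then close with a wide kernel
        # so that characters on one sign merge into a single connected block.
        _, binary = cv2.threshold(gray, 0, 255,
                                  cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
        kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (25, 5))
        merged = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)

        results = []
        contours, _ = cv2.findContours(merged, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        for cnt in contours:
            x, y, w, h = cv2.boundingRect(cnt)
            if w < 40 or h < 15:      # skip regions too small to be sign text
                continue
            text = pytesseract.image_to_string(gray[y:y + h, x:x + w],
                                               lang=lang).strip()
            if text:
                results.append(((x, y, w, h), text))
        return results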
In step S104, the mobile client obtains the user's current geographic location information.
In the embodiment of the invention, the GPS or Wi-Fi module is used to obtain the mobile user's current geographic location information.
In step S105, the mobile client sends the extracted building features and/or the proofread text, together with the user's current geographic location information, to the server.
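A minimal sketch of what this client-to-server upload could look like is given below; the endpoint URL, field names, and JSON encoding are assumptions, since the patent does not specify a transport format.

    import base64
    import json
    import urllib.request

    def upload_to_server(descriptors, recognized_text, latitude, longitude,
                         url="https://example.com/api/navigate"):  # hypothetical endpoint
        """Send extracted SIFT descriptors, proofread text, and the GPS position to the server."""
        payload = {
            "sift_descriptors": base64.b64encode(descriptors.tobytes()).decode("ascii"),
            "descriptor_shape": list(descriptors.shape),
            "text": recognized_text,
            "lat": latitude,
            "lon": longitude,
        }
        req = urllib.request.Request(
            url,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read().decode("utf-8"))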
In the embodiment of the invention, the navigation information generation method further comprises the following steps:
the mobile client receives the relevant environmental information and viewing angle sent by the server;
the mobile client fuses the environment identification information, the geographic information, the viewing angle, and the like into the digital image taken by the user by augmented-reality means.
In one embodiment of the invention, to make the recognition target explicit and improve recognition efficiency, the user can delineate the target recognition region on the touch screen. This is implemented as follows:
the mobile client receives the region requiring recognition that the user delineates in the captured image;
the mobile client extracts the building features and/or text features corresponding to the delineated region.
In the embodiment of the invention, when the region requiring recognition is too small to delineate in the captured image, the user can zoom in on the captured image so that the region can be delineated.
In another preferred embodiment of the invention, to make delineation of the recognition region quick and convenient, different quick-delineation templates are provided for building recognition and text recognition according to their different characteristics. The user aims the quick-delineation region at the recognition target and shoots, after which the system analyzes and recognizes the image automatically without any further user operation. This is implemented as follows:
the mobile client provides a quick-delineation template;
when the region requiring recognition falls within the quick-delineation template, the mobile client performs the shooting operation;
the mobile client extracts the building and/or text features of the image within the delineation template.
In one embodiment of the invention, the navigation information generation method further comprises the following steps:
the mobile client extracts the text features corresponding to the delineated region requiring recognition;
the mobile client recognizes the text according to the extracted text features;
the mobile client displays the recognized text for the user to proofread;
the mobile client sends the corrected text to the server.
In one embodiment of the invention, a prompt function is provided, which prompts the user about their operations on the interactive interface, for example reminding the user when the sharpness of the delineated image region is inadequate, thereby guiding the user's operation.
In one embodiment of the invention, the user can switch between the augmented-reality digital image and the geographic map through the interactive interface, meeting different needs of the user.
In one embodiment of the invention, the navigation information generation method further comprises the following steps:
the mobile client detects whether the image quality meets the processing requirements; if it does, step S102 is performed; if it does not, the user is prompted to shoot again.
The specific criteria include image size, sharpness, and so on. The digital image is first checked as a whole; once it passes this basic check, the region delineated by the user is checked in detail. If the delineated region meets the requirements of the next analysis step, the building features and/or text features of the captured image are extracted; otherwise the interactive interface is returned to and the user is prompted to shoot again.
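As an illustrative sketch only, one way to implement such a quality check, a minimum-resolution test plus a variance-of-Laplacian sharpness test, is shown below; the thresholds are assumptions, not values given in the patent.

    import cv2

    def image_quality_ok(image_path, min_width=640, min_height=480, blur_threshold=100.0):
        """Return True if the photo is large enough and sharp enough to analyze."""
        img = cv2.imread(image_path)
        if img is None:
            return False

        h, w = img.shape[:2]
        if w < min_width or h < min_height:   # image too small for reliable recognition
            return False

        # Variance of the Laplacian is a common sharpness measure: blurry images
        # have little high-frequency content and therefore a low variance.
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
        return sharpness >= blur_threshold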
Referring to Fig. 2, the structure of the navigation information generation system provided by the first embodiment of the invention is shown. For convenience of explanation, only the parts relevant to the embodiment are shown. The system comprises: a digital image receiving/capturing module 101, a type division receiving module 102, a building/text feature extraction module 103, a geographic location acquisition module 104, and a communication module 105. The system may be a software unit, a hardware unit, or a combined software and hardware unit built into the mobile client.
The digital image receiving/capturing module 101 is configured to receive or capture a digital image of the surroundings of the user's current location.
The type division receiving module 102 is configured to receive type information produced by the system automatically classifying the captured image, or calibration information supplied by the user.
The building/text feature extraction module 103 is configured to extract building features and/or text features from the captured image according to the type information or the calibration information.
The geographic location acquisition module 104 is configured to obtain the mobile user's current geographic location information.
The communication module 105 is configured to send the extracted building features and/or text and the mobile user's current geographic location information to the server.
In the embodiment of the invention, the navigation information generation system further comprises a synthesis module.
The communication module is also configured to receive the relevant environmental information and the viewing angle sent by the server.
The synthesis module is configured to fuse the environment identification information, the geographic information, and the viewing angle into the digital image taken by the user by augmented-reality means.
In one embodiment of the invention, to make the recognition target explicit and improve recognition efficiency, the user can delineate the target recognition region on the touch screen; the navigation information generation system then further comprises a region receiving module.
The region receiving module is configured to receive the region requiring recognition that the user delineates in the captured image.
The building/text feature extraction module 103 is also configured to extract the building features and/or text features corresponding to the delineated region.
In the embodiment of the invention, the navigation information generation system further comprises a template module.
The template module is configured to provide a quick-delineation template.
The digital image capture module is also configured to perform the shooting operation when the region requiring recognition falls within the quick-delineation template.
The building/text feature extraction module 103 is also configured to extract the building and/or text features of the image within the delineation template.
In one embodiment of the invention, the navigation information generation system further comprises an image detection module and a prompt module.
The image detection module is configured to detect whether the image quality meets the processing requirements.
The type division receiving module 102 is also configured to receive, when the image passes the check, the type information produced by the system automatically classifying the captured image or the calibration information supplied by the user.
The prompt module is configured to prompt the user to shoot again when the image fails the check.
Referring to Fig. 3, the navigation information generation method provided by the second embodiment of the invention mainly comprises the following steps.
In step S201, the server receives the building features and/or text sent by the mobile client and the mobile user's current geographic location information.
In step S202, the server extracts, from a building image database, building images relevant to that geographic position according to the geographic location information.
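For illustration, a possible server-side lookup of candidate building images near the reported position is sketched below with SQLite; the table schema and the 500 m search radius are assumptions, since the patent does not describe how the building image database is organized.

    import math
    import sqlite3

    def nearby_building_images(db_path, lat, lon, radius_m=500):
        """Return (building_id, image_path, lat, lon) rows within radius_m of the user."""
        # Convert the radius to an approximate bounding box in degrees so the query
        # can use simple range comparisons; one degree of latitude is about 111 km.
        dlat = radius_m / 111000.0
        dlon = radius_m / (111000.0 * max(math.cos(math.radians(lat)), 1e-6))

        conn = sqlite3.connect(db_path)
        try:
            rows = conn.execute(
                "SELECT building_id, image_path, lat, lon FROM building_images "
                "WHERE lat BETWEEN ? AND ? AND lon BETWEEN ? AND ?",
                (lat - dlat, lat + dlat, lon - dlon, lon + dlon),
            ).fetchall()
        finally:
            conn.close()
        return rows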
In step S203, the server extracts the building features of each of the relevant building images.
In step S204, the server compares the extracted building features with the building features in the received image, thereby identifying the building.
In step S205, the server extracts the environmental information associated with the building with the highest similarity.
In the embodiment of the invention, this environmental information mainly includes the building name and the like.
In step S206, the server retrieves the corrected text sent by the mobile client in a geographic information database in combination with the received geographic location information, obtaining environmental information about the mobile user's current geographic surroundings.
In the embodiment of the invention, this environmental information mainly includes road-sign and bus-stop-board information and the like.
In step S207, the server extracts, from a geographic image database, the geographic image corresponding to that geographic position according to the geographic location information.
In step S208, the server sends the above environmental information to the mobile client.
In one embodiment of the invention, the navigation information generation method further comprises:
the server compares the building features in the photograph with the building information in the geographic information system, and calculates the viewing angle and position from which the user took the photograph.
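The patent does not specify an algorithm for this comparison. As one plausible sketch, the query photo's SIFT descriptors could be matched against a reference image of the candidate building and the relative camera rotation recovered from the essential matrix; the camera matrix, ratio-test threshold, and yaw approximation below are all assumptions.

    import cv2
    import numpy as np

    def estimate_relative_view(desc_query, kp_query, desc_ref, kp_ref, camera_matrix):
        """Match SIFT features against a reference building image and recover the
        relative camera rotation, a proxy for the user's viewing direction."""
        matcher = cv2.BFMatcher(cv2.NORM_L2)
        knn = matcher.knnMatch(desc_query, desc_ref, k=2)

        # Lowe's ratio test keeps only distinctive matches.
        good = [p[0] for p in knn if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
        if len(good) < 8:
            return None   # too few matches to identify the building reliably

        pts_q = np.float32([kp_query[m.queryIdx].pt for m in good])
        pts_r = np.float32([kp_ref[m.trainIdx].pt for m in good])

        E, mask = cv2.findEssentialMat(pts_q, pts_r, camera_matrix,
                                       method=cv2.RANSAC, threshold=1.0)
        if E is None:
            return None
        _, R, t, _ = cv2.recoverPose(E, pts_q, pts_r, camera_matrix, mask=mask)

        # The yaw of the recovered rotation approximates the horizontal viewing
        # angle of the query photo relative to the reference image.
        yaw = float(np.degrees(np.arctan2(R[0, 2], R[2, 2])))
        return {"num_matches": len(good), "yaw_degrees": yaw}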
Referring to Fig. 4, the structure of the navigation information generation system provided by the second embodiment of the invention is shown. For convenience of explanation, only the parts relevant to the embodiment are shown. The system comprises: a receiving module 201, a building image extraction module 202, a building feature extraction module 203, a comparison module 204, an environmental information extraction module 205, a retrieval module 206, a geographic image extraction module 207, and a sending module 208. The system may be a software unit, a hardware unit, or a combined software and hardware unit built into the server.
The receiving module 201 is configured to receive the building features and/or text sent by the mobile client and the mobile user's current geographic location information.
The building image extraction module 202 is configured to extract, from a building image database, building images relevant to that geographic position according to the geographic location information.
The building feature extraction module 203 is configured to extract the building features of each of the relevant building images.
The comparison module 204 is configured to compare the extracted building features with the building features in the received image, thereby identifying the building.
The environmental information extraction module 205 is configured to extract the environmental information associated with the building with the highest similarity.
The retrieval module 206 is configured to retrieve the corrected text sent by the mobile client in a geographic information database in combination with the received geographic location information, obtaining environmental information about the mobile user's current geographic surroundings.
In the embodiment of the invention, this environmental information mainly includes road-sign and bus-stop-board information and the like.
The geographic image extraction module 207 is configured to extract, from a geographic image database, the geographic image corresponding to that geographic position according to the geographic location information.
The sending module 208 is configured to send the above environmental information to the mobile client.
In one embodiment of the invention, the navigation information generation system further comprises a viewing angle calculation module.
The viewing angle calculation module is configured to compare the building features in the photograph with the building information in the geographic information system and calculate the viewing angle and position from which the user took the photograph.
Referring to Fig. 5, the interaction between the mobile client and the server is described in detail below.
The mobile user takes a digital photograph of the surroundings of their current location with the mobile terminal. The image detection module first checks the image size, sharpness, and so on. Once the image passes this basic check, the region delineated by the user is checked in detail. If the delineated region meets the requirements of the next analysis step, the user's type classification of the image, made through the interactive interface, is obtained to determine whether building recognition or text recognition should be performed (if the quick-delineation template provided by the system is used, the user simply aims the quick-delineation region at the recognition target and shoots, and the system analyzes and recognizes the image automatically without a further type classification by the user); otherwise the interactive interface is returned to and the user is prompted to shoot again. The building/text feature extraction module then extracts building or text features from the digital image; the mobile client recognizes the text; the recognized text is displayed through the interactive interface for the user to proofread and correct; and the communication module sends the extracted building features and/or the corrected text, together with the user's current geographic location information, to the server.
According to the GPS (or Wi-Fi) positioning information received from the client (the mobile client), the server first determines the approximate range in which the image was taken and then extracts the feature data of the buildings in that range from the database. The interest-point features obtained from the image are matched against the feature points of the reference buildings, locating and identifying the key buildings in the image. The positions of these key buildings in the image are also used to determine the viewing angle at which the image was taken. Finally, the identified key buildings and the orientation information are sent to the synthesis module. If image recognition fails, the failure information is returned to the client.
In addition, the server compares the GPS (or Wi-Fi) positioning information and the text sent by the client with the road-sign and bus-stop-board information in the geographic information database, obtaining the specific geographic location information of the mobile user's current surroundings, and sends this information, together with the building image recognition results, to the synthesis module.
The synthesis module uses the received GPS (or Wi-Fi) positioning information to retrieve the corresponding map image from the map database, and sends the received key building information, the user's viewing orientation, the text recognition information, the geographic environment location information, and so on to the mobile client, which then fuses this information onto the captured photograph through augmented reality and displays it.
The mobile client receives the building identification information, text identification information, geographic environment information, user viewing angle, shooting position, and other information fed back by the server, synthesizes them into the digital image taken by the user through augmented reality, and then displays, through the interactive interface, an augmented-reality image containing rich environmental information for the user to browse.
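A minimal sketch of this final overlay step is shown below; it simply draws building labels and a heading banner onto the photo with OpenCV, which is one of many ways the augmented-reality fusion described here could be rendered. The structure of the server response (a list of labels with bounding boxes) is an assumption.

    import cv2

    def render_overlay(image_path, annotations, heading_degrees, out_path="overlay.jpg"):
        """Draw building/text labels and the computed heading onto the captured photo.

        `annotations` is assumed to be a list of dicts such as
        {"label": "Building A", "box": (x, y, w, h)} returned by the server.
        """
        img = cv2.imread(image_path)

        for ann in annotations:
            x, y, w, h = ann["box"]
            cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.putText(img, ann["label"], (x, max(y - 10, 20)),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)

        # Show the computed heading as a simple text banner at the top of the image.
        cv2.putText(img, "heading: %.1f deg" % heading_degrees, (20, 40),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 0, 255), 2)

        cv2.imwrite(out_path, img)
        return out_path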
In addition, the mobile client receives the building and text identification information and the geographic information fed back by the server and synthesizes them into the digital image through augmented reality, strengthening the environmental expressiveness of the map image.
In summary, the embodiments of the present invention propose an environment recognition and navigation scheme based on the user's viewing angle and image content analysis. The scheme analyzes the digital image taken by the user with a mobile device, identifies the building information in it, and, in combination with GPS positioning and electronic map information, also derives the user's actual viewing angle and current heading; the above information is then fused with the digital image through augmented reality. Besides giving the user richer environmental information, the augmented-reality image helps the user accurately understand their own position from multiple dimensions. When there is no distinctive landmark building nearby, the user can instead photograph text such as road signs or bus stop boards; the text is analyzed and recognized and, combined with the GPS positioning information, the user's current position can be located accurately. This effectively solves the problem that existing navigation systems mostly rely on geolocation alone to position the user on a two-dimensional electronic map, avoids positioning errors and invalid navigation information caused by urban road construction or out-of-date map data, and improves navigation accuracy.
On the other hand, the server can calculate the user's current viewing angle from the image features and mark it on the image through augmented reality, so that the user can determine their current heading even while standing still.
On the other hand, the present invention addresses the weak interactivity and single form of information presentation of existing augmented-reality navigators. A rich interactive interface and a function for delineating the key recognition region improve the accuracy of the system's positioning and environment identification information. By presenting both the photograph taken by the user, with the delineated key region, and a two-dimensional geographic map, the system offers the user two complementary viewpoints: the user's own viewing angle and a top-down two-dimensional map. This gives a more comprehensive, intuitive, and realistic visual experience, strengthens the expressiveness of the positioning and environment identification information, and makes it easier for the user to understand their surroundings.
In addition, the present invention has a personalized interactive interface. Compared with conventional navigation systems it is more interactive and can provide personalized services: the user can operate the interface icons according to their own needs, which improves the usability and friendliness of the system and enhances the user experience.
Those of ordinary skill in the art will appreciate that all or some of the steps of the methods in the above embodiments can be implemented by a program instructing the relevant hardware, and the program can be stored in a computer-readable storage medium such as a ROM/RAM, a magnetic disk, or an optical disc.
The above is only a preferred embodiment of the present invention and is not intended to limit the invention. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within the protection scope of the invention.

Claims (17)

1. A navigation information generation method, characterized in that the method comprises the following steps:
a mobile client receives a digital image, captured by the user, of the surroundings of the user's current location;
the mobile client receives type information produced by the system automatically classifying the captured image, or calibration information supplied by the user;
the mobile client extracts building features and/or text features from the captured image according to the type information or the calibration information;
the mobile client obtains the user's current geographic location information;
the mobile client sends the extracted building features and/or the text features or text recognition result, together with the user's current geographic location information, to a server.
2. The method of claim 1, characterized in that the navigation information generation method further comprises the following steps:
the mobile client receives the relevant environmental information and viewing angle sent by the server;
the mobile client synthesizes the building and text environment identification information, the geographic information, and the viewing angle into the digital image taken by the user by augmented-reality means.
3. The method of claim 1, characterized in that the navigation information generation method further comprises the following steps:
the mobile client receives the region requiring recognition that the user delineates in the captured image;
the mobile client extracts the building features and/or text features corresponding to the delineated region.
4. The method of claim 1, characterized in that the navigation information generation method further comprises the following steps:
the mobile client provides a quick-delineation template;
when the region requiring recognition falls within the quick-delineation template, the mobile client performs the shooting operation;
the mobile client extracts the building and/or text features of the image within the delineation template.
5. The method of claim 1, characterized in that the navigation information generation method further comprises the following steps:
the mobile client detects whether the image quality meets the processing requirements;
when the image meets the requirements, the step of receiving the type information produced by the system automatically classifying the captured image, or the calibration information supplied by the user, is performed;
when the image does not meet the requirements, the user is prompted to shoot again.
6. The method of claim 1, characterized in that the navigation information generation method further comprises the following steps:
the mobile client extracts the text features corresponding to the delineated region requiring recognition;
the mobile client or the server recognizes the text according to the extracted text features;
the mobile client displays the recognized text for the user to proofread;
the mobile client sends the text features or the text recognition result to the server.
7. A navigation information generation system, characterized in that the system comprises:
a digital image receiving/capturing module, configured to receive or capture a digital image of the surroundings of the user's current location;
a type division receiving module, configured to receive type information produced by the system automatically classifying the captured image, or calibration information supplied by the user;
a building/text feature extraction module, configured to extract building features and/or text features from the captured image according to the type information or the calibration information;
a geographic location acquisition module, configured to obtain the mobile user's current geographic location information;
a communication module, configured to send the extracted building features and/or the text features or text recognition result, together with the mobile user's current geographic location information, to a server.
8. The system of claim 7, characterized in that the navigation information generation system further comprises:
the communication module is also configured to receive the relevant environmental information and viewing angle sent by the server;
a synthesis module, configured to synthesize the building and text recognition results, the relevant environmental information, and the shooting angle and position information into the digital image taken by the user by augmented-reality means.
9. The system of claim 7, characterized in that the navigation information generation system further comprises:
a region receiving module, configured to receive the region requiring recognition that the user delineates in the captured image;
the building/text feature extraction module is also configured to extract the building features and/or text features corresponding to the delineated region.
10. The system of claim 7, characterized in that the navigation information generation system further comprises:
a template module, configured to provide a quick-delineation template;
a digital image capture module, also configured to perform the shooting operation when the region requiring recognition falls within the quick-delineation template;
the building/text feature extraction module is also configured to extract the building and/or text features of the image within the delineation template.
11. The system of claim 7, characterized in that the navigation information generation system further comprises:
an image detection module, configured to detect whether the image quality meets the processing requirements;
the type division receiving module is also configured to receive, when the image meets the requirements, the type information produced by the system automatically classifying the captured image or the calibration information supplied by the user;
a prompt module, configured to prompt the user to shoot again when the image does not meet the requirements.
12. A mobile client comprising the navigation information generation system of any one of claims 7 to 11.
13. A navigation information generation method, characterized in that the method comprises the following steps:
a server receives the building features and/or text sent by a mobile client and the mobile user's current geographic location information;
the server extracts, from a building image database, building images relevant to that geographic position according to the received geographic location information;
the server extracts the building features of each of the relevant building images;
the server compares the extracted building features with the building features in the received image, thereby identifying the building;
the server extracts the environmental information associated with the building with the highest similarity;
the server retrieves the corrected text sent by the mobile client in a geographic information database in combination with the received geographic location information, obtaining environmental information about the mobile user's current geographic surroundings;
the server extracts, from a geographic image database, the geographic image corresponding to that geographic position according to the geographic location information;
the server sends the above environmental information to the mobile client.
14. The method of claim 13, characterized in that the navigation information generation method further comprises the following step:
the server compares the building features in the photograph with the building information in the geographic information system, and calculates the viewing angle and position from which the user took the photograph.
15. A navigation information generation system, characterized in that the system comprises:
a receiving module, configured to receive the building features and/or text sent by a mobile client and the mobile user's current geographic location information;
a building image extraction module, configured to extract, from a building image database, building images relevant to that geographic position according to the geographic location information;
a building feature extraction module, configured to extract the building features of each of the relevant building images;
a comparison module, configured to compare the extracted building features with the building features in the received image, thereby identifying the building;
an environmental information extraction module, configured to extract the environmental information associated with the building with the highest similarity;
a retrieval module, configured to retrieve the corrected text sent by the mobile client in a geographic information database in combination with the received geographic location information, obtaining environmental information about the mobile user's current geographic surroundings;
a geographic image extraction module, configured to extract, from a geographic image database, the geographic image corresponding to that geographic position according to the geographic location information;
a sending module, configured to send the above relevant information to the mobile client.
16. The system of claim 15, characterized in that the navigation information generation system further comprises:
a viewing angle calculation module, configured to compare the building image features in the photograph with the building information in the geographic information system, thereby calculating the viewing angle and position from which the user took the photograph.
17. A server comprising the navigation information generation system of claim 15 or 16.
CN201210592118.9A 2012-12-31 2012-12-31 Navigation information generation method and system, mobile client and server Active CN103913174B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210592118.9A CN103913174B (en) 2012-12-31 2012-12-31 Navigation information generation method and system, mobile client and server

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210592118.9A CN103913174B (en) 2012-12-31 2012-12-31 Navigation information generation method and system, mobile client and server

Publications (2)

Publication Number Publication Date
CN103913174A true CN103913174A (en) 2014-07-09
CN103913174B CN103913174B (en) 2016-10-19

Family

ID=51039038

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210592118.9A Active CN103913174B (en) 2012-12-31 2012-12-31 Navigation information generation method and system, mobile client and server

Country Status (1)

Country Link
CN (1) CN103913174B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1612707A2 (en) * 2004-06-30 2006-01-04 Navteq North America, LLC Method of collecting information for a geographic database for use with a navigation system
CN1854757A (en) * 2005-04-28 2006-11-01 中国科学院遥感应用研究所 Remote-sensing image set interpretation system and method
US20070233380A1 (en) * 2006-03-29 2007-10-04 Denso Corporation Navigation device and method of navigating vehicle
CN101340661A (en) * 2008-08-14 2009-01-07 北京中星微电子有限公司 Mobile apparatus and server for implementing guide control, and guide control method
CN102243078A (en) * 2010-05-13 2011-11-16 上海宾华信息科技有限公司 Release-upon-vision navigation device and realization method thereof
CN101945327A (en) * 2010-09-02 2011-01-12 郑茂 Wireless positioning method and system based on digital image identification and retrieval
CN102456132A (en) * 2010-10-22 2012-05-16 英业达股份有限公司 Location method and electronic device applying same

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104266654A (en) * 2014-09-26 2015-01-07 广东好帮手电子科技股份有限公司 Vehicle real scene navigation system and method
CN104819723A (en) * 2015-04-29 2015-08-05 京东方科技集团股份有限公司 Positioning method and positioning server
CN104819723B (en) * 2015-04-29 2017-10-13 京东方科技集团股份有限公司 Positioning method and positioning server
CN105320725A (en) * 2015-05-29 2016-02-10 杨振贤 Method and apparatus for acquiring geographic object in collection point image
CN105320725B (en) * 2015-05-29 2019-02-22 杨振贤 Method and apparatus for acquiring geographic objects in a collection point image
CN106652556A (en) * 2015-10-28 2017-05-10 ***通信集团公司 Human-vehicle anti-collision method and apparatus
CN105373610A (en) * 2015-11-17 2016-03-02 广东欧珀移动通信有限公司 Indoor map updating method and server
CN107481342A (en) * 2016-06-07 2017-12-15 腾讯科技(深圳)有限公司 Attendance checking system, method, server and terminal
WO2018032180A1 (en) * 2016-08-14 2018-02-22 阮元 Method and system for stopping visual information push according to market feedback
US11645368B2 (en) 2016-12-30 2023-05-09 Google Llc Hash-based dynamic restriction of content on information resources
CN106683473A (en) * 2017-03-30 2017-05-17 深圳市科漫达智能管理科技有限公司 Reverse-direction vehicle-finding navigation method and mobile terminal
CN107220726A (en) * 2017-04-26 2017-09-29 消检通(深圳)科技有限公司 Fire-fighting equipment localization method, mobile terminal and system based on augmented reality
CN108827325A (en) * 2017-05-04 2018-11-16 大众汽车有限公司 Method, equipment and the computer-readable storage medium that data are positioned
CN107764273B (en) * 2017-10-16 2020-01-21 北京耘华科技有限公司 Vehicle navigation positioning method and system
CN107764273A (en) * 2017-10-16 2018-03-06 北京耘华科技有限公司 Vehicle navigation and positioning method and system
US10582338B2 (en) 2017-11-07 2020-03-03 Beijing Xiaomi Mobile Software Co., Ltd. Positioning method and device
CN107948956A (en) * 2017-11-07 2018-04-20 北京小米移动软件有限公司 Positioning method and device
CN110019608A (en) * 2017-11-16 2019-07-16 腾讯科技(深圳)有限公司 Information acquisition method, apparatus and system, and storage device
CN110019608B (en) * 2017-11-16 2022-08-05 腾讯科技(深圳)有限公司 Information acquisition method, device and system and storage equipment
WO2019100699A1 (en) * 2017-11-24 2019-05-31 Guangdong Kang Yun Technologies Limited Method and system to configure scanning bot
CN109992089A (en) * 2017-12-29 2019-07-09 索尼公司 Electronic equipment, wireless communication system, method and computer readable storage medium
CN108507541A (en) * 2018-03-01 2018-09-07 广东欧珀移动通信有限公司 Building recognition method and system and mobile terminal
CN108427947A (en) * 2018-03-16 2018-08-21 联想(北京)有限公司 Image recognition method and electronic device
CN108917744A (en) * 2018-05-11 2018-11-30 中国地质大学(武汉) Accurate three-dimensional positioning method based on GPS and a street view database
CN108833834A (en) * 2018-06-21 2018-11-16 苏州博学智能科技有限公司 Child anti-loss finding system
CN109635145A (en) * 2018-11-23 2019-04-16 积成电子股份有限公司 Power equipment inspection information identification method based on multidimensional comprehensive information
CN109857945A (en) * 2019-01-02 2019-06-07 珠海格力电器股份有限公司 Information display method, device, storage medium and terminal
CN112789480A (en) * 2019-05-24 2021-05-11 谷歌有限责任公司 Method and apparatus for navigating two or more users to meeting location
CN112789480B (en) * 2019-05-24 2023-01-03 谷歌有限责任公司 Method and apparatus for navigating two or more users to meeting location
CN110470315A (en) * 2019-06-27 2019-11-19 安徽四创电子股份有限公司 Scenic spot tourist navigation method
CN113091763B (en) * 2021-03-30 2022-05-03 泰瑞数创科技(北京)有限公司 Navigation method based on live-action three-dimensional map
CN113091763A (en) * 2021-03-30 2021-07-09 泰瑞数创科技(北京)有限公司 Navigation method based on live-action three-dimensional map
CN113484889A (en) * 2021-07-07 2021-10-08 中国人民解放军国防科技大学 Immersive navigation system based on augmented reality and satellite positioning of mobile terminal
CN113792726A (en) * 2021-11-16 2021-12-14 北京长隆讯飞科技有限公司 Method and system for rapidly generating POI (Point of interest) based on visual image

Also Published As

Publication number Publication date
CN103913174B (en) 2016-10-19

Similar Documents

Publication Publication Date Title
CN103913174A (en) Navigation information generation method and system, mobile client and server
US10664708B2 (en) Image location through large object detection
US20230400317A1 (en) Methods and Systems for Generating Route Data
CN107133325B (en) Internet photo geographic space positioning method based on street view map
US8315456B2 (en) Methods and apparatus for auditing signage
US9324003B2 (en) Location of image capture device and object features in a captured image
US9497581B2 (en) Incident reporting
US9874454B2 (en) Community-based data for mapping systems
CN103632626A (en) Intelligent tour guide realization method and device based on mobile network and mobile client
CN112101339B (en) Map interest point information acquisition method and device, electronic equipment and storage medium
US20200167603A1 (en) Method, apparatus, and system for providing image labeling for cross view alignment
KR100533033B1 (en) Position tracing system and method using digital video process technic
KR20140038355A (en) Computerized method and device for annotating at least one feature of an image of a view
JP2023516502A (en) Systems and methods for image-based location determination and parking monitoring
US20150154607A1 (en) Systems and methods of correlating business information to determine spam, closed businesses, and ranking signals
US20130135446A1 (en) Street view creating system and method thereof
US20210350189A1 (en) Bayesian Methodology for Geospatial Object/Characteristic Detection
JP2007122247A (en) Automatic landmark information production method and system
CN104748739A (en) Intelligent machine augmented reality implementation method
WO2023160722A1 (en) Interactive target object searching method and system and storage medium
CN105845020B (en) Live-action map production method and device
JP4685286B2 (en) Information update processing device
KR20030084855A (en) Method for implementing Video GIS system of car navigation system having GPS receiver
CN111275823B (en) Target associated data display method, device and system
Wang et al. A markerless augmented reality mobile navigation system with multiple targets display function

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant