WO2020006685A1 - Method for establishing a map, terminal and computer-readable storage medium - Google Patents

Method for establishing a map, terminal and computer-readable storage medium

Info

Publication number
WO2020006685A1
Authority
WO
WIPO (PCT)
Prior art keywords
map
description information
information
maps
points
Prior art date
Application number
PCT/CN2018/094329
Other languages
English (en)
Chinese (zh)
Inventor
韩立明
林义闽
廉士国
Original Assignee
深圳前海达闼云端智能科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳前海达闼云端智能科技有限公司
Priority to PCT/CN2018/094329 priority Critical patent/WO2020006685A1/fr
Priority to CN201880001181.6A priority patent/CN109074757B/zh
Publication of WO2020006685A1 publication Critical patent/WO2020006685A1/fr

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00: Maps; Plans; Charts; Diagrams, e.g. route diagrams
    • G09B29/003: Maps
    • G09B29/006: Representation of non-cartographic information on maps, e.g. population distribution, wind direction, radiation levels, air and sea routes
    • G09B29/007: Representation of non-cartographic information on maps using computer methods

Definitions

  • the present application relates to the field of detection, and in particular, to a method, a terminal, and a computer-readable storage medium for establishing a map.
  • in current Visual Simultaneous Localization and Mapping (VSLAM) technology, matching the feature points in the image captured by the terminal against the feature points in the map is the core problem.
  • the feature points extracted from an image remain largely invariant over a short time and within a small scene, so the current VSLAM technical solution is applicable in such situations.
  • the existing solution is to establish multiple maps of the same space for terminal positioning: over a short period of time, multiple sets of images are collected at fixed angles, VSLAM technology is used to establish a feature point map for each group of images, and the multiple maps are then labeled and managed separately.
  • this mapping scheme has at least the following disadvantage: because multiple maps exist, the terminal needs to expand, update, and delete each map separately, so management efficiency is low. It can be seen that improving the management efficiency of maps is a problem that needs to be solved.
  • a technical problem to be solved in some embodiments of the present application is how to improve map management efficiency.
  • An embodiment of the present application provides a method for establishing a map, including: obtaining N maps describing the same space, where N is an integer greater than 1; extracting the spatially invariant features of each of the N maps; and merging the information of the map points in the N maps according to the spatially invariant features of the N maps, to obtain a merged map.
  • An embodiment of the present application further provides a terminal including at least one processor and a memory communicatively connected to the at least one processor, where the memory stores instructions executable by the at least one processor; the instructions are executed by the at least one processor to enable the at least one processor to execute the method for establishing a map mentioned in the foregoing embodiment.
  • An embodiment of the present application further provides a computer-readable storage medium storing a computer program.
  • when the computer program is executed by a processor, the method for establishing a map mentioned in the foregoing embodiment is implemented.
  • in the embodiment of the present application, operations such as expanding, deleting, and updating map point information only need to be performed on the merged map, which improves the management efficiency of the map.
  • when the terminal uses the merged map for positioning, it is not necessary to switch maps for matching, which improves the positioning efficiency.
  • FIG. 1 is a flowchart of a method for establishing a map according to a first embodiment of the present application;
  • FIG. 2 is a flowchart of a method for merging the information of the map points of N-1 maps into a reference map according to a second embodiment of the present application;
  • FIG. 3 is a schematic diagram showing the relationship between the map establishing method and the positioning method according to the second embodiment of the present application;
  • FIG. 4 is a flowchart of a method for positioning using a merged map according to the second embodiment of the present application;
  • FIG. 5 is a schematic structural diagram of a terminal according to a third embodiment of the present application.
  • the first embodiment of the present application relates to a method for establishing a map, which is applied to a terminal.
  • the method for establishing a map includes the following steps:
  • Step 101 Obtain N maps describing the same space.
  • N is an integer greater than 1.
  • the terminal collects multiple image sequences in the same space through a visual sensor, extracts the information and spatially invariant features of the map points from each group of image sequences, and establishes a map for each image sequence based on the extracted map point information and spatially invariant features.
  • the map may be a map established by the terminal according to the image sequence collected by the visual sensor, or a map transmitted to the terminal by the cloud or other terminals, and the source of the map is not limited in this embodiment.
  • Step 102 Extract the spatially invariant features of each of the N maps.
  • the spatially invariant feature may be any one or any combination of line features, semantic features, and marker information.
  • the line feature refers to the characteristics of the line segments in the map, including the length, angle, and intersection of the line segments.
  • Semantic features refer to features assigned to the same map points in different maps.
  • the terminal recognizes the same map points in different maps by using an image recognition method, and assigns the same features to the same map points.
  • marker information refers to the representation on the map of the positioning markers in the space.
  • a positioning marker is a marker arranged at one of a plurality of fixed positions in the space; the marker may be a Quick Response (QR) code or a Data Matrix (DM) code.
  • Step 103 Combine the information of the map points in the N maps according to the space invariant features of the N maps to obtain a merged map.
  • the method of merging the information of the map points in the N maps to obtain the merged maps according to the spatially invariant features of the N maps is exemplified below.
  • Method 1: the terminal creates a new map as a reference map, and determines the relative pose relationship between the reference map and map A among the N maps.
  • map A is any one of the N maps.
  • the terminal adds the information of the spatially invariant features and map points in the map A to the reference map according to the relative pose relationship between the reference map and the map A.
  • the terminal matches the spatially invariant features in the N-1 maps other than map A with the spatially invariant features in the reference map, and accordingly determines the relative pose relationship between each of the N-1 maps and the reference map.
  • the terminal combines the information of the map points of the N-1 maps into the reference map according to the relative pose relationship between the N-1 maps and the reference map, to obtain a merged map.
  • Method 2: the terminal selects one of the N maps as the reference map, matches the spatially invariant features of the other N-1 maps with the spatially invariant features of the reference map, and determines the relative pose relationship between each of the N-1 maps and the reference map according to the respective matching results. The terminal then merges the information of the map points of the N-1 maps into the reference map according to these relative pose relationships, to obtain a merged map.
  • taking the determination of the relative pose relationship between map B and map C as an example, the following describes how, in methods 1 and 2, the terminal determines the relative pose relationship between each of the N-1 maps and the reference map.
  • the terminal matches the spatially invariant features in map B with the spatially invariant features in map C, and based on the matching results uses a Perspective-n-Point (PnP) pose measurement algorithm to solve the relative pose relationship between map B and map C.
  • the relative pose relationship is then optimized by a bundle adjustment (BA) algorithm.
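  • As a rough illustration of recovering a relative pose from matched features, the sketch below aligns matched 3D map points from two maps with the Kabsch algorithm. This is a simplification: the embodiment describes a PnP pose measurement algorithm refined by bundle adjustment, whereas this sketch solves the simpler 3D-3D alignment case; the function name and data are illustrative.

```python
import numpy as np

def relative_pose_from_matches(points_b, points_c):
    """Estimate the rigid transform (R, t) that maps matched map-B points
    onto map-C points, via the Kabsch/Umeyama algorithm."""
    pb = np.asarray(points_b, dtype=float)
    pc = np.asarray(points_c, dtype=float)
    cb, cc = pb.mean(axis=0), pc.mean(axis=0)      # centroids of both maps
    H = (pb - cb).T @ (pc - cc)                    # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T        # optimal rotation
    t = cc - R @ cb                                # translation from centroids
    return R, t
```

  Given at least three non-collinear matched points the transform is recovered exactly for noise-free matches; with noisy matches, the SVD solution minimizes the least-squares alignment error, playing a role loosely analogous to the optimization step above.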
  • the terminal may merge map points whose distance in the merged map is less than a preset value.
  • the position information of the merged map point is determined according to the position information of the map points before merging; the preset value can be chosen according to actual needs.
  • suppose the spatial coordinates of map point a in the merged map are (xa, ya, za) and the spatial coordinates of map point b are (xb, yb, zb), where xa, ya, and za denote the horizontal, longitudinal, and vertical coordinates of map point a, and xb, yb, and zb denote the horizontal, longitudinal, and vertical coordinates of map point b.
  • the terminal calculates the distance between map point a and map point b, and judges whether the distance between map point a and map point b is less than a preset value.
  • if the distance is determined to be smaller than the preset value, map point a and map point b are merged to obtain map point c.
  • the spatial coordinates of map point c are ((xa + xb) / 2, (ya + yb) / 2, (za + zb) / 2).
  • the description information set of map point c includes the description information set of map point a and the description information set of map point b.
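  • The merge rule above can be sketched as follows; the dict-based point representation and the function name are assumptions for illustration, not the embodiment's actual data structures.

```python
import math

def merge_points(point_a, point_b, preset_distance):
    """Merge two map points if they are closer than preset_distance.

    Each point is a dict with 'xyz' (spatial coordinates) and
    'descriptions' (its description information set). Returns the merged
    map point c, or None if the points are too far apart to merge.
    """
    ax, ay, az = point_a["xyz"]
    bx, by, bz = point_b["xyz"]
    if math.dist((ax, ay, az), (bx, by, bz)) >= preset_distance:
        return None
    return {
        # position of c is the coordinate-wise average of a and b
        "xyz": ((ax + bx) / 2, (ay + by) / 2, (az + bz) / 2),
        # description set of c combines both description sets
        "descriptions": point_a["descriptions"] + point_b["descriptions"],
    }
```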
  • determining the position information of the merged map points based on the position information of the map points before merging is equivalent to collecting the position information of the same map point multiple times, which improves the accuracy of the position information of the map points.
  • a merged map is established, which improves the management efficiency of the map.
  • because the merged map is established by the method for establishing a map mentioned in this embodiment, when the terminal needs to expand the map it only needs to determine the position in the merged map where the additional information belongs and add the information at that position, instead of expanding each of the N maps with the map information separately.
  • when the terminal needs to delete or update map point information, it only needs to delete or update the map point information in the merged map, and does not need to delete or update the map point information in each of the N maps.
  • the second embodiment of the present application relates to a method for establishing a map.
  • This embodiment is a further refinement of the first embodiment, and specifically illustrates the process of merging the information of the map points of the N-1 maps into the reference map according to the relative pose relationship between the N-1 maps and the reference map.
  • a flowchart of the method for merging the information of the map points of the N-1 maps into the reference map is shown in FIG. 2 and includes the following steps:
  • Step 201 Determine the correspondence between the map points in the N-1 maps and the map points in the reference map according to the relative pose relationship between the N-1 maps and the reference map.
  • Step 202 Determine the description information set of the map points in the merged map according to the correspondence between the map points in the N-1 maps and the map points in the reference map, and the description information of the map points in the N-1 maps.
  • the terminal may cluster the description information in the description information set of the map points by using a clustering algorithm to obtain the clustered description information set.
  • the merged map is then determined according to the clustered description information set.
  • specifically, the terminal clusters the description information in the description information set of the map points, classifies the description information of similar map points into one category, and records the center point of each category as the clustered description information. For example, while there is still description information of unclassified map points, the terminal randomly selects the description information of one map point from the unclassified description information as a center point.
  • for this center point, the terminal performs the following operations: find all description information within a first preset value of the center point and record it as set M; determine the vector from the center point to each element in set M, and add all the vectors to obtain an offset vector; move the center point in the direction of the offset vector by half the modulus of the offset vector; determine whether the modulus of the offset vector is less than a second preset value, and if so, record the center point; otherwise, recompute the offset vector from the current center point to the elements of set M and repeat the move, until the modulus of the offset vector is less than the second preset value. After the terminal determines that all description information of the map points has been classified, the clustered description information is determined according to the recorded center points.
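  • The iteration above is closely related to the mean-shift algorithm. The sketch below is a hedged approximation over 1-D descriptor values: where the text moves the center by half the modulus of the summed offset vector, this sketch moves the center to the mean of the neighborhood set M (the standard mean-shift step) for numerical stability, while keeping the two preset values as the neighborhood radius and the convergence threshold. All data and names are illustrative.

```python
def cluster_descriptors(descriptors, radius, tol):
    """Cluster 1-D descriptor values with a mean-shift-style iteration.

    descriptors: list of floats (stand-ins for description information)
    radius: first preset value (neighborhood radius around the center)
    tol: second preset value (convergence threshold on the shift)
    Returns the recorded center points, one per cluster.
    """
    unclassified = list(descriptors)
    centers = []
    while unclassified:
        center = unclassified[0]            # pick an unclassified descriptor
        while True:
            m = [d for d in unclassified if abs(d - center) <= radius]
            new_center = sum(m) / len(m)    # mean-shift step toward set M
            shift = abs(new_center - center)
            center = new_center
            if shift < tol:                 # offset small enough: record it
                break
        centers.append(center)
        # everything inside the radius of the recorded center is classified
        unclassified = [d for d in unclassified if abs(d - center) > radius]
    return centers
```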
  • the map point information further includes shooting information corresponding to the description information of the map point, where the shooting information includes shooting brightness and shooting angle.
  • the terminal clusters the description information in the description information set of the map points through a clustering algorithm to obtain the clustered description information set. If the clustering algorithm aggregates L pieces of description information into one category, the terminal determines the shooting information corresponding to the clustered description information according to the shooting brightness and shooting angle corresponding to the L pieces of description information.
  • the terminal calculates the average value of the shooting brightness corresponding to the L pieces of description information as a first average value, and uses the first average value as the shooting brightness corresponding to the clustered description information.
  • the terminal calculates the average value of the shooting angles corresponding to the L pieces of description information as a second average value, and uses the second average value as the shooting angle corresponding to the clustered description information.
  • the following describes the process of determining the shooting information corresponding to the clustered description information in combination with the actual scene.
  • the description information corresponding to map point P includes first description information (ka), second description information (kb), third description information (kc), fourth description information (kd), fifth description information (ke), and so on.
  • the shooting brightness of ka is the first shooting brightness (hka)
  • the shooting brightness of kb is the second shooting brightness (hkb)
  • the shooting brightness of kc is the third shooting brightness (hkc)
  • the shooting angle of ka is the first shooting angle (tka)
  • the shooting angle of kb is the second shooting angle (tkb)
  • the shooting angle of kc is the third shooting angle (tkc) ...
  • if ka, kb, and kc are clustered into one group, the shooting brightness of the clustered description information is (hka + hkb + hkc) / 3, and the shooting angle of the clustered description information is (tka + tkb + tkc) / 3.
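  • The per-cluster averaging of shooting information can be sketched as below; the tuple representation of the shooting information is an assumption for illustration.

```python
def clustered_shooting_info(shooting_infos):
    """Average the shooting information of the L descriptors in a cluster.

    shooting_infos: list of (brightness, angle) tuples, one per descriptor.
    Returns (average brightness, average angle) as the shooting information
    of the clustered description information.
    """
    n = len(shooting_infos)
    avg_brightness = sum(b for b, _ in shooting_infos) / n  # first average
    avg_angle = sum(a for _, a in shooting_infos) / n       # second average
    return avg_brightness, avg_angle
```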
  • merging N maps reduces the storage space of the maps.
  • before merging, the storage format of the description information of a map point is: (shooting information 1, map point id, position information, map point description information 1); (shooting information 2, map point id, position information, map point description information 2); (shooting information 3, map point id, position information, map point description information 3); ...; (shooting information p, map point id, position information, map point description information p)
  • p represents the number of pieces of description information of the map point
  • the map point id is a number the terminal assigns to each map point in order to identify the same map point across different maps.
  • in the merged map, the description information of the map points is stored in the form of a description information set, with the storage format: {map point id, position information, (shooting information 1, map point description information 1), (shooting information 2, map point description information 2), (shooting information 3, map point description information 3), ..., (shooting information q, map point description information q)}.
  • after the description information in the description information set of the merged map is clustered, q is less than or equal to p.
  • the data type of the shooting information is unsigned character data (unsigned char), with a data size of 1 byte
  • the data type of the map point id is unsigned integer data (unsigned int), with a data size of 4 bytes
  • the position information is the spatial coordinates of the map point; its data type is floating point data (float), with a data size of 4 bytes
  • the data type of the map point description information is unsigned char, with a data size of 8 bytes.
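  • Assuming the field sizes listed above (a 1-byte unsigned char for the shooting information, a 4-byte unsigned int map point id, 4-byte floats for each spatial coordinate, and an 8-byte descriptor), one pre-merge record could be packed as follows. The exact field layout and the use of Python's struct module are illustrative assumptions, not the embodiment's actual encoding.

```python
import struct

# one pre-merge record: (shooting info, map point id, x, y, z, descriptor)
RECORD_FORMAT = "<B I 3f 8s"  # little-endian, no padding between fields

def pack_record(shooting_info, point_id, xyz, descriptor):
    """Serialize one map-point description record into bytes."""
    x, y, z = xyz
    return struct.pack(RECORD_FORMAT, shooting_info, point_id, x, y, z,
                       descriptor)
```

  With this layout a record occupies 1 + 4 + 12 + 8 = 25 bytes, which makes the storage saving from clustering (q records instead of p) easy to quantify.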
  • the terminal may perform positioning based on the merged map.
  • the relationship between the method of establishing a map and the positioning method is shown in FIG. 3.
  • map merging refers to the merging of maps corresponding to different shooting information.
  • single-state positioning refers to positioning according to the shooting information at the time the terminal initiates a positioning request.
  • the method for positioning a terminal using a merged map is shown in FIG. 4 and includes the following steps:
  • Step 301 Acquire an image for positioning.
  • Step 302 Extract description information of the map points in the image.
  • Step 303 Determine description information in the merged map that matches description information of map points in the image.
  • the terminal matches the description information of the map points in the image with all the description information in the merged map, and determines the matching description information.
  • when the terminal extracts the description information of the map points in the image, the terminal determines the shooting information of the image by analyzing the image or by using a sensor (such as a light sensor) on the terminal.
  • the terminal determines the shooting information corresponding to the description information of the map point in the image according to the shooting information of the image.
  • the terminal filters the description information used for matching out of the description information set of the map points in the merged map, according to the shooting information corresponding to the description information of the map points in the image and the shooting information corresponding to each piece of description information in the description information set of the map points in the merged map.
  • the terminal determines a temporary map according to the description information used for matching, and performs positioning according to the temporary map.
  • the terminal determines the description information that matches the description information of the map point in the image according to the description information for matching in the temporary map.
  • the specific process for the terminal to determine the description information used for matching is as follows: the terminal calculates a first difference between the shooting brightness of the image and the shooting brightness corresponding to each piece of description information, and a second difference between the shooting angle of the image and the shooting angle corresponding to each piece of description information; according to the first difference and the second difference, the terminal selects M pieces of description information from the description information set of the map points as the description information used for matching, where M is a positive integer.
  • the following describes the process by which the terminal obtains the descriptive information for matching from the feature information set of the map points in the map in combination with the actual scenario.
  • the terminal acquires the shooting brightness of the map points in the second image as H and the shooting angle as L. The feature information set of the map points in the map is {(description information a1, shooting brightness h1, shooting angle l1), (description information a2, shooting brightness h2, shooting angle l2), (description information a3, shooting brightness h3, shooting angle l3), ...}; that is, the shooting brightness corresponding to description information a1 is h1 and the corresponding shooting angle is l1, the shooting brightness corresponding to description information a2 is h2 and the corresponding shooting angle is l2, and so on.
  • the terminal calculates the difference between h1 and H and the difference between l1 and L, and sets different weights for the two differences according to actual needs; in this way, the terminal calculates the distance between the shooting information of the map points of the second image and the shooting information corresponding to each piece of description information in the feature information set of the map points in the map.
  • the terminal sorts the description information in the feature information set of the map points in the map in order of distance from small to large, and selects the first M pieces as the description information used for matching.
  • alternatively, a preset distance value can be used to filter the feature information set: description information whose shooting-information distance from the description information of the map points in the second image is smaller than the preset distance value is used as the description information for matching.
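  • Both selection strategies described above (keeping the M nearest candidates, or keeping candidates within a preset distance) can be sketched as follows; the weighted absolute-difference distance and all names are assumptions for illustration.

```python
def select_for_matching(query_info, candidates, m=None, max_dist=None,
                        w_brightness=1.0, w_angle=1.0):
    """Select candidate descriptors whose shooting information is close
    to the shooting information of the query image.

    query_info: (brightness H, angle L) of the query image
    candidates: list of (description, brightness, angle) tuples
    m: keep the m nearest candidates (top-M strategy), or
    max_dist: keep candidates whose distance is below this preset value
    w_brightness, w_angle: weights trading off the two differences.
    """
    H, L = query_info
    scored = []
    for desc, h, l in candidates:
        # weighted distance from the first and second differences
        dist = w_brightness * abs(h - H) + w_angle * abs(l - L)
        scored.append((dist, desc))
    scored.sort(key=lambda pair: pair[0])   # smallest distance first
    if m is not None:
        return [desc for _, desc in scored[:m]]
    return [desc for dist, desc in scored if dist < max_dist]
```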
  • Step 304 Obtain position information of the map points corresponding to the matching description information from the merged map, and determine a positioning result according to the obtained position information.
  • the terminal determines a positioning result by using a pose estimation algorithm, for example, a PnP pose measurement algorithm, according to a matching result between a map point in the map and a map point in the second image.
  • the positioning result includes the pose information of the terminal.
  • the terminal may also first determine that at least T map points in the image have been successfully matched with the map.
  • T is a positive integer, for example, T is equal to 10.
  • the pose information of the terminal is then calculated, avoiding the waste of resources that would be caused by computing a positioning result when the number of successful matches is insufficient.
  • since the information of the map points of multiple maps is stored in one merged map, when the terminal performs operations such as expanding, deleting, and updating map point information, it only needs to operate on the merged map, which improves the management efficiency of the map.
  • the terminal uses the merged map for positioning, it is not necessary to switch the map for matching, which improves the positioning efficiency.
  • the terminal merges the description information of similar map points through a clustering algorithm, reducing the amount of map data.
  • a third embodiment of the present application relates to a terminal.
  • the terminal includes at least one processor 401; and a memory 402 that is communicatively connected to the at least one processor 401.
  • the memory 402 stores instructions that can be executed by the at least one processor 401, and the instructions are executed by the at least one processor 401, so that the at least one processor 401 can execute the method for establishing a map described above.
  • the processor 401 is, for example, a central processing unit (CPU)
  • the memory 402 is, for example, a readable and writable random access memory (RAM).
  • the processor 401 and the memory 402 may be connected through a bus or other methods. In FIG. 5, the connection through the bus is taken as an example.
  • the memory 402 is a non-volatile computer-readable storage medium and can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules. For example, the description information of the map points in the embodiments of the present application is stored in the memory 402.
  • the processor 401 executes various functional applications and data processing of the device by running the non-volatile software programs, instructions, and modules stored in the memory 402, that is, the above method for establishing a map is implemented.
  • the memory 402 may include a storage program area and a storage data area, where the storage program area may store an operating system and an application program required for at least one function; the storage data area may store a list of options and the like.
  • the memory 402 may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage device.
  • the memory 402 may optionally include a memory remotely set relative to the processor, and these remote memories may be connected to an external device through a network. Examples of the above network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
  • One or more modules are stored in the memory, and when executed by one or more processors, execute the method for establishing a map in any of the foregoing method embodiments.
  • the above product can execute the method provided in the embodiment of the present application, and has the corresponding functional modules and beneficial effects of the execution method.
  • a fourth embodiment of the present application relates to a computer-readable storage medium storing a computer program.
  • when the computer program is executed by the processor, the method for establishing a map described in any of the above method embodiments is implemented.
  • the program is stored in a storage medium and includes several instructions to make a device (which may be a single-chip microcomputer, a chip, or the like) or a processor execute all or part of the steps of the method described in each embodiment of the present application.
  • the aforementioned storage media include: USB flash drives, removable hard disks, read-only memory (ROM), random access memory (RAM), magnetic disks, optical disks, and other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Ecology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Navigation (AREA)
  • Processing Or Creating Images (AREA)

Abstract

According to some embodiments, the invention relates to a method for establishing a map, as well as a terminal and a computer-readable storage medium. The method for establishing the map includes: acquiring N maps describing the same space, N being an integer greater than one; extracting the respective spatially invariant features of the N maps; and merging, according to the spatially invariant features of the N maps, the information of the map points in the N maps to obtain a merged map.
PCT/CN2018/094329 2018-07-03 2018-07-03 Method for establishing a map, terminal and computer-readable storage medium WO2020006685A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2018/094329 WO2020006685A1 (fr) 2018-07-03 2018-07-03 Method for establishing a map, terminal and computer-readable storage medium
CN201880001181.6A CN109074757B (zh) 2018-07-03 2018-07-03 Method for establishing a map, terminal and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/094329 WO2020006685A1 (fr) 2018-07-03 2018-07-03 Method for establishing a map, terminal and computer-readable storage medium

Publications (1)

Publication Number Publication Date
WO2020006685A1 (fr) 2020-01-09

Family

ID=64789292

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/094329 WO2020006685A1 (fr) 2018-07-03 2018-07-03 Method for establishing a map, terminal and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN109074757B (fr)
WO (1) WO2020006685A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111652934A (zh) * 2020-05-12 2020-09-11 Oppo广东移动通信有限公司 Positioning method and map construction method, apparatus, device, and storage medium
CN115001835A (zh) * 2022-06-15 2022-09-02 覃惠玲 Data encryption *** based on an Internet-of-Things terminal
CN115705670A (zh) * 2021-08-06 2023-02-17 北京小米移动软件有限公司 Map management method and apparatus therefor

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
WO2020223974A1 (fr) * 2019-05-09 2020-11-12 珊口(深圳)智能科技有限公司 Map updating method and mobile robot
CN112148742A (zh) * 2019-06-28 2020-12-29 Oppo广东移动通信有限公司 Map updating method and apparatus, terminal, and storage medium
CN118095809A (zh) * 2024-04-28 2024-05-28 炬星科技(深圳)有限公司 Robot multi-task processing method and apparatus, and robot

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106503011A (zh) * 2015-09-07 2017-03-15 高德软件有限公司 Map data processing method and apparatus
JP2018050149A (ja) * 2016-09-21 2018-03-29 キヤノン株式会社 Image processing apparatus
CN107885867A (zh) * 2017-11-22 2018-04-06 苏州联讯图创软件有限责任公司 Method and *** for synthesizing map tile data

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
NL2008690C2 (en) * 2011-04-25 2014-07-15 Google Inc Dynamic highlighting of geographic entities on electronic maps.
US10136054B2 (en) * 2012-10-24 2018-11-20 Morpho, Inc. Image processing device for compositing panoramic images, image processing program and recording medium
US9177404B2 (en) * 2012-10-31 2015-11-03 Qualcomm Incorporated Systems and methods of merging multiple maps for computer vision based tracking
CN103136782B (zh) * 2013-02-22 2016-05-18 广东威创视讯科技股份有限公司 Method and apparatus for dynamic rendering of a three-dimensional model map
US9201019B2 (en) * 2013-05-30 2015-12-01 Seagate Technology Llc Article edge inspection
US9723109B2 (en) * 2014-05-28 2017-08-01 Alexander Hertel Platform for constructing and consuming realm and object feature clouds
CN106156245B (zh) * 2015-04-28 2020-02-14 高德软件有限公司 Method and apparatus for merging line elements in an electronic map
CN105096386B (zh) * 2015-07-21 2017-11-17 中国民航大学 Automatic generation method for geometric maps of large-scale complex urban environments
CN105674993A (zh) * 2016-01-15 2016-06-15 武汉光庭科技有限公司 High-precision visual positioning map generation *** and method based on binocular cameras
US20190333239A1 (en) * 2016-12-02 2019-10-31 Cloudminds (Shenzhen) Robotics Systems Co., Ltd. Positioning method and device
CN107223269B (zh) * 2016-12-29 2021-09-28 达闼机器人有限公司 Three-dimensional scene positioning method and apparatus
CN106802954B (zh) * 2017-01-18 2021-03-26 中国科学院合肥物质科学研究院 Method for constructing a semantic map model for unmanned vehicles and its application to unmanned vehicles
CN107680135B (zh) * 2017-11-16 2019-07-23 珊口(上海)智能科技有限公司 Positioning method, ***, and applicable robot
CN107885871A (zh) * 2017-11-24 2018-04-06 南京华捷艾米软件科技有限公司 Cloud-computing-based simultaneous localization and mapping method, ***, and interactive ***
CN108108748A (zh) * 2017-12-08 2018-06-01 联想(北京)有限公司 Information processing method and electronic device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111652934A (zh) * 2020-05-12 2020-09-11 Oppo广东移动通信有限公司 Positioning method and map construction method, apparatus, device, and storage medium
CN115705670A (zh) * 2021-08-06 2023-02-17 北京小米移动软件有限公司 Map management method and apparatus
CN115705670B (zh) * 2021-08-06 2024-06-04 北京小米移动软件有限公司 Map management method and apparatus
CN115001835A (zh) * 2022-06-15 2022-09-02 覃惠玲 Data encryption *** based on an Internet-of-Things terminal

Also Published As

Publication number Publication date
CN109074757A (zh) 2018-12-21
CN109074757B (zh) 2021-11-09

Similar Documents

Publication Publication Date Title
WO2020006685A1 (fr) Method for establishing a map, terminal, and computer-readable storage medium
WO2020259481A1 (fr) Positioning method and apparatus, electronic device, and readable storage medium
WO2021223367A1 (fr) Method and apparatus for single-lens-based online tracking of multiple pedestrians, device, and storage medium
WO2021057744A1 (fr) Positioning method and apparatus, device, and storage medium
CN109740004B (zh) Archiving method and apparatus
Sheng et al. Unsupervised collaborative learning of keyframe detection and visual odometry towards monocular deep SLAM
CN110399789B (zh) Pedestrian re-identification method, model construction method, apparatus, device, and storage medium
CN111652934A (zh) Positioning method and map construction method, apparatus, device, and storage medium
EP4174716A1 (fr) Pedestrian tracking method and device, and computer-readable storage medium
JP2007259415A (ja) Image processing apparatus and image processing method, server and control method therefor, program, and storage medium
CN111105459B (zh) Descriptor map generation method, positioning method, apparatus, device, and storage medium
KR20160109761A (ko) Method and system for recognizing and tracking heavy equipment/workers at construction sites using customized image analysis technology
WO2024077935A1 (fr) Visual-SLAM-based vehicle positioning method and apparatus
CN111079535B (zh) Human skeleton action recognition method, apparatus, and terminal
CN113989744A (zh) Pedestrian target detection method and *** based on ultra-large high-resolution images
CN113298871B (zh) Map generation method, positioning method and *** thereof, and computer-readable storage medium
Wilson et al. Visual and object geo-localization: A comprehensive survey
CN111767839B (zh) Vehicle driving trajectory determination method, apparatus, device, and medium
JP6976731B2 (ja) Information processing apparatus, information processing method, and program
CN111401482B (zh) Feature point matching method and apparatus, device, and storage medium
CN115049731B (zh) Visual mapping and positioning method based on binocular cameras
CN109074676B (zh) Method for establishing a map, positioning method, terminal, and computer-readable storage medium
CN113361392B (zh) Unsupervised multimodal pedestrian re-identification method based on cameras and wireless positioning
CN110827340A (зh) Map updating method, apparatus, and storage medium
CN115615436A (зh) Unmanned aerial vehicle positioning method with multi-machine relocalization

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18925276

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 14/05/2021)

122 Ep: pct application non-entry in european phase

Ref document number: 18925276

Country of ref document: EP

Kind code of ref document: A1