CN118149838A - Instant positioning and map construction method, device, terminal and storage medium


Info

Publication number: CN118149838A
Authority: CN (China)
Prior art keywords: map, image, mapping, current, instant
Legal status: Pending
Application number: CN202211558817.1A
Other languages: Chinese (zh)
Inventors: 程林, 方迟
Current Assignee: Beijing Zitiao Network Technology Co Ltd
Original Assignee: Beijing Zitiao Network Technology Co Ltd
Application filed by Beijing Zitiao Network Technology Co Ltd
Priority to CN202211558817.1A
Publication of CN118149838A

Landscapes

  • Navigation (AREA)

Abstract

The disclosure provides an instant positioning and map construction method and device, a terminal and a storage medium. The instant positioning and map construction method comprises the following steps: acquiring a first image through a current device; performing instant positioning and map construction based on the first image to obtain a first map; receiving a second image from another device; determining relative position information between the current device and the other device; and performing fusion map construction, on the basis of the first map, based on the second image and the relative position information, to obtain a second map. By adopting multiple devices for collaborative map construction, the completeness and accuracy of the constructed map can be improved.

Description

Instant positioning and map construction method, device, terminal and storage medium
Technical Field
The disclosure relates to the field of information technology, and in particular, to an instant positioning and map construction method and device, a terminal and a storage medium.
Background
Visual instant positioning and mapping (SLAM, Simultaneous Localization and Mapping) is the process of sensing and mapping the environment with a camera. In current extended reality (XR) devices, the construction of a virtual reality or mixed reality environment is mainly performed by mapping through single-device visual SLAM, and the resulting map is often not complete and accurate enough due to limitations of device performance, user position and the like. Thus, further improvements in this respect are expected.
Disclosure of Invention
In order to solve the existing problems, the present disclosure provides an instant positioning and map construction method and device, a terminal and a storage medium.
The present disclosure adopts the following technical solutions.
The embodiment of the disclosure provides an instant positioning and map construction method, which comprises the following steps: acquiring a first image through a current device; performing instant positioning and map construction based on the first image to obtain a first map; receiving a second image from another device; determining relative position information between the current device and the other device; and performing fusion map construction, on the basis of the first map, based on the second image and the relative position information, to obtain a second map.
Another embodiment of the present disclosure provides an instant positioning and mapping apparatus including: an image acquisition module located in a current device and configured to acquire a first image through the current device, wherein the image acquisition module is further configured to receive a second image from another device; a map construction module located in the current device and configured to perform instant positioning and map construction based on the first image to obtain a first map; and a position determination module located in the current device and configured to determine relative position information between the current device and the other device; wherein the map construction module is further configured to perform fusion map construction, on the basis of the first map, based on the second image and the relative position information, to obtain a second map.
In some embodiments, the present disclosure provides a terminal comprising: at least one memory and at least one processor; the memory is used for storing program codes, and the processor is used for calling the program codes stored in the memory to execute the instant positioning and map construction method.
In some embodiments, the present disclosure provides a storage medium for storing program code for performing the above-described instant localization and mapping method.
In some embodiments, the present disclosure provides a computer program product comprising instructions that, when executed by a computer device, cause the computer device to perform the above-described instant localization and mapping method.
The embodiment of the disclosure can increase the integrity and accuracy of the constructed map by adopting multiple devices for collaborative map construction.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
Fig. 1 shows a flow chart of an instant localization and mapping method of an embodiment of the present disclosure.
Fig. 2 illustrates a partial module for an instant positioning and mapping device in accordance with an embodiment of the present disclosure.
Fig. 3 shows a schematic structural diagram of an electronic device of an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the accompanying drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are open-ended, i.e., "including, but not limited to". The term "based on" means "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Related definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "a" and "an" in this disclosure are intended to be illustrative rather than limiting, and those of ordinary skill in the art will appreciate that they are to be construed as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
Visual SLAM can be implemented by the following steps: extracting feature points from two temporally successive image frames and matching them to obtain a set of matched feature points; based on the matched feature points, computing the transformation relation between the two images; and, based on the transformation relation, converting the most recently obtained environment map in a first coordinate system (for example, the robot coordinate system) and adding it to the previously built environment map in a second coordinate system (for example, the global coordinate system), thereby realizing incremental environment map construction. Various algorithms exist in the art for implementing visual SLAM; for simplicity, they will not be described in detail in this specification.
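The transformation relation between two sets of matched feature points described above can be computed as a least-squares rigid alignment (the classic Kabsch/Umeyama method). The following is only an illustrative sketch, not the patent's implementation; the function name and the use of NumPy are assumptions:

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Estimate rotation R and translation t such that dst ≈ R @ src + t.

    src, dst: (N, 3) arrays of matched 3-D feature points.
    Uses the Kabsch/Umeyama least-squares alignment.
    """
    src_mean = src.mean(axis=0)
    dst_mean = dst.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (src - src_mean).T @ (dst - dst_mean)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = dst_mean - R @ src_mean
    return R, t
```

With noiseless matches the true rotation and translation are recovered exactly; in practice the matched feature points are noisy and the estimate is a least-squares fit, typically refined inside a RANSAC loop to reject mismatches.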
FIG. 1 provides a flow chart of an instant localization and mapping method of an embodiment of the present disclosure. In some embodiments, the instant localization and mapping method of the present disclosure may include step S101, acquiring a first image through a current device. In some embodiments, the current device is, for example, a first device and includes a camera or other photosensitive element, so that an image of the surrounding environment can be acquired in real time. In some embodiments, the positioning information of the current device may be obtained from the acquired first image through a SLAM algorithm. In some embodiments, the current device may also position itself through GPS or BeiDou navigation, so as to obtain its positioning information. In some embodiments, the first image may be an RGB image or a depth image.
In some embodiments, the method of the present disclosure may further include step S102, performing instant positioning and mapping based on the first image, to obtain a first map. In some embodiments, as described above, by the SLAM algorithm, the spatial position coordinates may be constructed based on the first image, and the feature points identified from the first image may be spatially filled based on the immediately located position information, so as to construct the first map.
In some embodiments, the method of the present disclosure may further include step S103 of receiving a second image from another device. The map built by the current device alone is often not complete and accurate enough, subject to device performance, user position, and the like. Here, other devices participate in the collaborative construction of the map. In some embodiments, the other devices may include one or more devices, each of which may include a camera or other photosensitive element, so that images of the surrounding environment can be acquired in real time. In some embodiments, positioning information of the other devices may be obtained from the acquired second image through a SLAM algorithm. In some embodiments, the other devices may also position themselves through GPS or BeiDou navigation, so as to obtain their positioning information. In some embodiments, the second image may be an RGB image or a depth image.
In some embodiments, the method of the present disclosure may further include step S104 of determining relative position information between the current device and the other devices. In some embodiments, after the positioning information of the current device is acquired through the first image and the positioning information of the other device is acquired through the second image, the relative position information between the current device and the other device, for example, the position information of the other device relative to the current device, can be determined.
In some embodiments, the method of the present disclosure may further include step S105 of performing fusion map construction based on the second image and the relative position information on the basis of the first map, resulting in a second map. In some embodiments, using the second image and the determined relative position information obtained from the other device, the spatial position information in the second image may be converted into spatial position information in the constructed first map, thereby supplementing the construction of the first map. Because other devices and the current device can be in different positions, states and the like at the same time, the constructed second map can be more complete and accurate relative to the first map.
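A minimal sketch of this fusion step, assuming the relative position information has already been expressed as a rotation `R_rel` and translation `t_rel` of the other device in the current device's map frame (these names, and the voxel de-duplication step, are illustrative assumptions rather than the patent's method):

```python
import numpy as np

def fuse_point_clouds(map1_pts, map2_pts, R_rel, t_rel, voxel=0.05):
    """Fuse a second device's points into the first device's map frame.

    map1_pts, map2_pts: (N, 3) point arrays from each device.
    R_rel, t_rel: pose of the second device in the first device's frame.
    Points are transformed, concatenated, and de-duplicated on a voxel
    grid so overlapping regions are not stored twice.
    """
    map2_in_1 = map2_pts @ R_rel.T + t_rel            # re-express in map-1 frame
    merged = np.vstack([map1_pts, map2_in_1])
    keys = np.round(merged / voxel).astype(np.int64)  # voxel keys for de-dup
    _, idx = np.unique(keys, axis=0, return_index=True)
    return merged[np.sort(idx)]
```

Because the two devices view the scene from different positions, the fused map covers regions neither device sees alone, which is the source of the completeness gain described above.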
In some embodiments, the construction process of the first map and the second map may be performed in the current device or other devices or any devices performing map collaborative construction, or the data obtained by these devices may be transmitted to other computing devices, so as to save the computing power required by the current device or other devices. In the present disclosure, as an example, the construction process of the first map and the second map is performed in the current device.
Thus, by adopting multiple devices to perform collaborative map construction, the integrity and accuracy of the constructed map can be increased.
In some embodiments, the methods of the present disclosure may further comprise: acquiring first point cloud data based on the first image. Point cloud technology refers to representing the distribution of objects in space by building a 3-D spatial model from a large collection of points. In some embodiments, the first point cloud data may be obtained by calculating and determining the three-dimensional position of each point in the first image based on the acquired first image. In some embodiments, performing the instant localization and mapping based on the first image includes: performing instant localization and mapping based on the first point cloud data. In some embodiments, a large number of points in the point cloud data may be fitted to planes, and matching, conversion, and fusion of points or planes between image frames may be performed, so as to construct the current first map.
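For a depth image, computing the three-dimensional position of each point can be sketched as a pinhole back-projection. The intrinsics `fx, fy, cx, cy` are assumed inputs; the patent does not specify a camera model, so this is a generic illustration:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (H, W, in metres) into an (N, 3)
    point cloud using pinhole camera intrinsics."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]   # drop invalid (zero-depth) pixels
```

For an RGB image without depth, the same 3-D positions would instead come from triangulating feature matches across frames.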
In some embodiments, the methods of the present disclosure may further comprise: and acquiring second point cloud data based on the second image. In some embodiments, the second point cloud data may be obtained by calculating and determining a three-dimensional position of each point in the second image based on the received second image. In some embodiments, fusing the map construction based on the second image and the relative position information on the basis of the first map comprises: and carrying out fusion map construction on the basis of the first map based on the second point cloud data and the relative position information.
In some embodiments, at least one of the current device and the other device is a head mounted device. In some embodiments, the head mounted device may include a VR head mounted display. In some embodiments, the current device and/or other devices may also be other devices with image acquisition capabilities, such as VR glasses.
In some embodiments, after obtaining the first map, the method further comprises: acquiring the pose and displacement of the current equipment; and carrying out real-time construction and supplementation on the first map based on the pose and displacement of the current equipment. In some embodiments, the 6DoF pose and displacement of the present device may be obtained through a camera or an Inertial Measurement Unit (IMU) of the present device, and the first map may be further constructed and supplemented in real time by using the pose and displacement. In some embodiments, by utilizing the pose and displacement to construct and supplement the first map in real time, the map can be constructed under different positions and angles, so that the constructed map is more complete and accurate.
In some embodiments, after obtaining the second map, the method further comprises: acquiring the pose and displacement of other equipment; and carrying out real-time construction and supplementation on the second map based on the pose and displacement of other equipment. In some embodiments, the 6DoF pose and displacement of the device may be obtained through a camera or an Inertial Measurement Unit (IMU) of other devices, and the second map may be further constructed and supplemented in real time by using the pose and displacement. In some embodiments, by utilizing the pose and displacement to construct and supplement the second map in real time, the map can be constructed under different positions and angles, so that the constructed map is more complete and accurate.
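The real-time supplementation described above relies on accumulating each device's incremental 6-DoF motion onto its world pose. A minimal sketch using 4x4 homogeneous transforms (the function name and inputs are illustrative; real systems would obtain `R_delta`, `t_delta` from visual-inertial odometry):

```python
import numpy as np

def compose_pose(T_world_prev, R_delta, t_delta):
    """Accumulate an incremental motion (e.g. from the camera/IMU)
    onto the device's previous world pose.

    T_world_prev: (4, 4) homogeneous pose of the device in the world.
    R_delta, t_delta: incremental rotation (3, 3) and translation (3,)
    expressed in the device's previous body frame.
    """
    T_delta = np.eye(4)
    T_delta[:3, :3] = R_delta
    T_delta[:3, 3] = t_delta
    return T_world_prev @ T_delta
```

Newly observed points can then be mapped into the world frame with the updated pose before being added to the first or second map.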
In some embodiments, absolute location information for the current device and other devices may be obtained using network or GPS technology, among other techniques. In some embodiments, when the current device and other devices are close in distance, communication can be performed through Ultra Wideband (UWB), laser communication, and the like, so as to obtain relative position information between the other devices and the current device. In some embodiments, in the case that the position of the current device is known, the absolute position information of the other devices or the relative position information between the other devices and the current device are obtained, and map fusion construction can be performed based on the second images obtained by the other devices. Thus, the map collaborative construction among multiple devices is facilitated.
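Once absolute positions (e.g., from GPS) and the current device's orientation are known, the other device's relative position follows from simple frame geometry. A hedged sketch; the function name and frame conventions below are assumptions, not from the patent:

```python
import numpy as np

def relative_position(p_current, p_other, R_current):
    """Express the other device's position in the current device's
    body frame.

    p_current, p_other: absolute (world-frame) positions, shape (3,).
    R_current: world-from-body rotation of the current device, (3, 3).
    """
    return R_current.T @ (np.asarray(p_other) - np.asarray(p_current))
```

A UWB ranging link would instead measure the relative distance directly; combining it with orientation yields the same relative position vector.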
In some embodiments, the relative position information between the other device and the current device includes a relative distance between the other device and the current device and a relative pose between the other device and the current device. In some embodiments, because the current device and the other devices may have changes of pose and displacement, the relative position information between the other devices and the current device also includes relative distance and relative pose, and further, the map is accurately fused and constructed according to the relative distance and the relative pose, so that the constructed map is more accurate.
In some embodiments, the relative camera distances and poses of multiple devices (e.g., the current device and the other devices) can be correlated, and loop detection and map fusion stitching can be performed on the SLAM maps acquired by the multiple devices. The main purpose of a loop detection algorithm is to determine whether the device has returned to a previously visited location, which helps reduce the pose error of the device and the visual information redundancy of the three-dimensional map.
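As a toy illustration of the loop-detection idea, the check below asks whether the device has come back within a radius of a pose visited sufficiently long ago. Production SLAM systems use appearance-based methods (e.g., bag-of-visual-words over image descriptors) rather than this purely positional test; the sketch only conveys the concept:

```python
import numpy as np

def detect_loop(trajectory, current_pos, radius=0.5, min_gap=10):
    """Return the index of a revisited past pose, or None.

    trajectory: list of past (x, y, z) positions, oldest first.
    min_gap: ignore the most recent poses so that trivially-close
    neighbours along the path are not reported as loops.
    """
    if len(trajectory) <= min_gap:
        return None
    past = np.asarray(trajectory[:-min_gap])
    d = np.linalg.norm(past - np.asarray(current_pos), axis=1)
    i = int(np.argmin(d))
    return i if d[i] < radius else None
```

When a loop is confirmed, the accumulated drift between the current and revisited poses is redistributed over the trajectory (pose-graph optimization), which is what reduces the pose error mentioned above.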
In some embodiments, the methods of the present disclosure further comprise: and after the second map is obtained, adjusting and supplementing the second map based on the image information and the positioning information newly acquired by the current equipment and/or other equipment. In some embodiments, for example, the current device and/or other devices may move with the user, and the current device and/or other devices may continually generate new images and positioning information, which in turn may supplement the second map, resulting in a more complete and accurate map.
In some embodiments, the second map may be saved locally or to a cloud map. In some embodiments, with the user's permission, the second map is anchored with absolute positioning such as GPS (Global Positioning System), spliced, and de-duplicated to form a big-data cloud map. By adopting the method of the present disclosure, the completeness and accuracy of the SLAM map constructed by an XR device can be improved.
In some embodiments, the local cloud map and the server cloud map are improved by recording a more complete map, so that a user can enjoy a more continuous customized interaction experience, and map path assistance can later be provided to autonomous smart products. In some embodiments, for example, when an earthquake or fire occurs indoors, the XR device can supplement the existing constructed map in real time and can identify building passages blocked by collapsed objects and the like, thereby facilitating the user's escape.
In some embodiments, with the user's permission, the map collaboratively built by users and user groups is used to interactively adapt to larger map environments. For example, when the current device arrives for the first time in a map that was built via other devices, it may, with the user's consent, obtain access rights to the other devices' secure areas, virtual item ornaments, customized interactions, and the like. In some embodiments, when the current device reaches a second map collaboratively constructed via another device, it issues a request to use the secure area and/or virtual item ornaments of the other device; after receiving the use license, it uses the secure area and/or virtual item ornaments of the other device. In this way, data interaction between sharing users of XR devices is facilitated and enriched.
In some embodiments, collaborative mapping assists large multiplayer map games, allowing users within a mutually recognizable environment to share XR virtual and mixed reality environments and positions in real time. For example, in a multiplayer virtual game, the surrounding environment and positioning can be shared among teammates in the same group, improving the richness and interest of multiplayer team games and thus the game experience.
Embodiments of the present disclosure also provide an instant positioning and mapping apparatus 400. The instant positioning and mapping apparatus 400 includes an image acquisition module 401, a position determination module 402, and a map construction module 403. In some embodiments, the image acquisition module 401 is located in the current device and is configured to acquire the first image through the current device. In some embodiments, the map construction module 403 is located in the current device and is configured to perform the instant localization and mapping based on the first image to obtain the first map. In some embodiments, the image acquisition module is further configured to receive the second image from another device. In some embodiments, the position determination module 402 is located in the current device and is configured to determine relative position information between the current device and the other device. In some embodiments, the map construction module 403 is further configured to perform fusion map construction, on the basis of the first map, based on the second image and the relative position information, to obtain the second map.
It should be appreciated that the descriptions regarding the instant localization and mapping method also apply to the instant localization and mapping apparatus 400 herein, and for the sake of simplicity, will not be described in detail herein.
In some embodiments, the instant positioning and mapping apparatus further comprises: a point cloud acquisition module configured to acquire first point cloud data based on the first image; and performing the instant localization and mapping based on the first image includes: performing instant localization and mapping based on the first point cloud data. In some embodiments, the point cloud acquisition module is further configured to acquire second point cloud data based on the second image; and performing fusion map construction based on the second image and the relative position information on the basis of the first map includes: performing fusion map construction, on the basis of the first map, based on the second point cloud data and the relative position information. In some embodiments, at least one of the current device and the other device is a head mounted device. In some embodiments, the instant positioning and mapping apparatus further includes a pose and displacement acquisition module configured to acquire the pose and displacement of the current device after the first map is obtained; the map construction module is further configured to construct and supplement the first map in real time based on the pose and displacement of the current device. In some embodiments, the pose and displacement acquisition module is configured to acquire the pose and displacement of the other device after the second map is obtained; the map construction module is further configured to construct and supplement the second map in real time based on the pose and displacement of the other device. In some embodiments, the relative position information between the other device and the current device includes a relative distance between the other device and the current device and a relative pose between the other device and the current device.
In some embodiments, the map construction module is further configured to adjust and supplement the second map based on the image information and positioning information newly acquired by the current device and/or other devices after the second map is obtained. In some embodiments, the instant positioning and mapping apparatus further comprises: a request module located in the current device and configured to issue a request to use the secure area and/or virtual item ornaments of another device when the current device reaches the second map collaboratively constructed via the other device, and, after receiving the use license, to use the secure area and/or virtual item ornaments of the other device.
In addition, the present disclosure also provides a terminal, including: at least one memory and at least one processor; the memory is used for storing program codes, and the processor is used for calling the program codes stored in the memory to execute the instant positioning and map construction method.
In addition, the present disclosure also provides a computer storage medium storing program codes for executing the above-mentioned instant localization and mapping method.
The present disclosure further provides a computer program product comprising instructions that, when executed by a computer device, cause the computer device to perform the above-described instant localization and mapping method.
The instant positioning and map construction method and device of the present disclosure are described above based on the embodiments and the application. In addition, the present disclosure also provides a terminal and a storage medium, which are described below.
Referring now to fig. 3, a schematic diagram of an electronic device (e.g., a terminal device or server) 500 suitable for use in implementing embodiments of the present disclosure is shown. The terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 3 is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
As shown in fig. 3, the electronic device 500 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 501, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage means 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data required for the operation of the electronic device 500 are also stored. The processing device 501, the ROM 502, and the RAM 503 are connected to each other via a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
In general, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 507 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 508 including, for example, magnetic tape, hard disk, etc.; and communication means 509. The communication means 509 may allow the electronic device 500 to communicate with other devices wirelessly or by wire to exchange data. While fig. 3 shows an electronic device 500 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 509, or from the storage means 508, or from the ROM 502. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 501.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed networks.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to perform the methods of the present disclosure described above.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including object oriented programming languages such as Java, Smalltalk, or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software or by means of hardware. The names of the units do not, in some cases, constitute a limitation of the units themselves.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, there is provided an instant localization and mapping method including: acquiring a first image through a current device; performing instant localization and mapping based on the first image to obtain a first map; receiving a second image from another device; determining relative position information between the current device and the other device; and performing fused map construction on the basis of the first map, based on the second image and the relative position information, to obtain a second map.
According to one or more embodiments of the present disclosure, the instant localization and mapping method further includes: acquiring first point cloud data based on the first image; and performing instant localization and mapping based on the first image includes performing instant localization and mapping based on the first point cloud data.
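The disclosure does not fix how the point cloud data is obtained from the image; one common way, shown here as a minimal sketch (function name and camera parameters are illustrative, not from the disclosure), is to back-project a depth image through a pinhole camera model:

```python
import numpy as np

def depth_to_point_cloud(depth: np.ndarray, fx: float, fy: float,
                         cx: float, cy: float) -> np.ndarray:
    """Back-project a depth image into an N x 3 point cloud using a
    pinhole camera model. Pixels with zero depth are discarded."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx  # horizontal offset scaled by depth
    y = (v - cy) * z / fy  # vertical offset scaled by depth
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]
```

For example, a 2×2 depth image of all ones with fx = fy = 1 and the principal point at (0.5, 0.5) yields four points at depth 1.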
According to one or more embodiments of the present disclosure, the instant localization and mapping method further includes: acquiring second point cloud data based on the second image; and performing fused map construction on the basis of the first map, based on the second image and the relative position information, includes performing fused map construction on the basis of the first map based on the second point cloud data and the relative position information.
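The fusion step above can be sketched as a rigid transform followed by a merge: the other device's point cloud is brought into the current device's map frame via the relative pose, then appended. This is a simplified illustration (names are hypothetical); a practical system would additionally deduplicate points and refine the alignment:

```python
import numpy as np

def fuse_point_clouds(first_map: np.ndarray, second_cloud: np.ndarray,
                      rel_rotation: np.ndarray,
                      rel_translation: np.ndarray) -> np.ndarray:
    """Transform the other device's N x 3 cloud into the current
    device's frame using the relative pose (3x3 rotation plus
    3-vector translation), then merge it with the first map."""
    aligned = second_cloud @ rel_rotation.T + rel_translation
    return np.vstack([first_map, aligned])
```

With an identity rotation and a translation of one unit along z, a point at (1, 0, 0) from the other device lands at (1, 0, 1) in the fused map.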
According to one or more embodiments of the present disclosure, at least one of the current device and the other device is a head-mounted device.
According to one or more embodiments of the present disclosure, after the first map is obtained, the method further includes: acquiring the pose and displacement of the current device; and constructing and supplementing the first map in real time based on the pose and displacement of the current device.
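The real-time supplement step can be sketched as follows: points sensed after the device moves are mapped into the map frame using the device's pose and displacement and appended to the existing map. This is a hypothetical sketch (class and method names are not from the disclosure):

```python
import numpy as np

class LiveMap:
    """Incrementally supplement a point map as the device moves.

    The pose is modeled as a 3x3 rotation and the displacement as a
    3-vector translation relative to the map origin.
    """

    def __init__(self, points: np.ndarray):
        self.points = points  # N x 3 map points

    def supplement(self, pose: np.ndarray, displacement: np.ndarray,
                   local_points: np.ndarray) -> None:
        # Move locally-sensed points into the map frame, then append.
        world = local_points @ pose.T + displacement
        self.points = np.vstack([self.points, world])
```

A point sensed at the device origin after a displacement of (0, 1, 0) is appended at (0, 1, 0) in the map frame.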
According to one or more embodiments of the present disclosure, after the second map is obtained, the method further includes: acquiring the pose and displacement of the other device; and constructing and supplementing the second map in real time based on the pose and displacement of the other device.
In accordance with one or more embodiments of the present disclosure, the relative position information between the other device and the current device includes a relative distance between the other device and the current device, and a relative pose between the other device and the current device.
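The relative position information described above (a relative distance plus a relative pose) can be captured in a small structure; here the relative distance is derived from the translation component. Field names are illustrative, not from the disclosure:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class RelativePosition:
    """Relative position information between two devices: a relative
    pose (rotation + translation from the other device's frame into
    the current device's frame) and the implied relative distance."""
    rotation: np.ndarray     # 3x3 relative rotation
    translation: np.ndarray  # 3-vector relative translation

    @property
    def distance(self) -> float:
        # Relative distance is the length of the translation.
        return float(np.linalg.norm(self.translation))
```

A translation of (3, 4, 0) implies a relative distance of 5.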
According to one or more embodiments of the present disclosure, the method further comprises: after the second map is obtained, adjusting and supplementing the second map based on image information and positioning information newly acquired by the current device and/or the other device.
According to one or more embodiments of the present disclosure, the method further comprises: when the current device reaches the second map cooperatively constructed with the other device, sending a request to use the safety area and/or virtual object decorations of the other device; and, after a use license is received, using the safety area and/or virtual object decorations of the other device.
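The request-then-license flow above can be sketched as a toy handshake; the disclosure specifies no concrete protocol, so all names here are illustrative, and the owner's decision stands in for what would be a network round trip:

```python
class SharingSession:
    """Toy sketch of the safety-area / decoration sharing handshake:
    the arriving device requests use, and the owning device grants
    or denies a use license."""

    def __init__(self, owner_grants: bool):
        self._owner_grants = owner_grants
        self.licensed = False

    def request_use(self, resource: str) -> bool:
        # In practice this would be a request sent to the other device.
        self.licensed = self._owner_grants
        return self.licensed

    def use(self, resource: str) -> str:
        if not self.licensed:
            raise PermissionError(f"no license for {resource}")
        return f"using {resource}"
```

If the owner denies the request, any attempt to use the resource raises a `PermissionError`.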
There is also provided, in accordance with one or more embodiments of the present disclosure, an instant localization and mapping apparatus including: an image acquisition module, located in the current device and configured to acquire a first image through the current device, the image acquisition module being further configured to receive a second image from another device; a map construction module, located in the current device and configured to perform instant localization and mapping based on the first image to obtain a first map; and a position determination module, located in the current device and configured to determine relative position information between the current device and the other device; wherein the map construction module is further configured to perform fused map construction on the basis of the first map, based on the second image and the relative position information, to obtain a second map.
According to one or more embodiments of the present disclosure, there is provided a terminal including: at least one memory and at least one processor; wherein the at least one memory is configured to store program code, and the at least one processor is configured to invoke the program code stored by the at least one memory to perform any of the methods described above.
According to one or more embodiments of the present disclosure, there is provided a storage medium for storing program code for performing the above-described method.
According to one or more embodiments of the present disclosure, a computer program product is provided, comprising instructions that, when executed by a computer device, cause the computer device to perform the above-described instant localization and mapping method.
The foregoing description is merely illustrative of the preferred embodiments of the present disclosure and of the technical principles employed. It will be appreciated by persons skilled in the art that the scope of the disclosure is not limited to the specific combinations of the features described above, but also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, solutions formed by substituting the above features with technical features having similar functions disclosed in the present disclosure (but not limited thereto).
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (12)

1. An instant positioning and map construction method, characterized by comprising:
acquiring a first image through a current device;
performing instant positioning and map construction based on the first image to obtain a first map;
receiving a second image from another device;
determining relative position information between the current device and the other device; and
performing fused map construction on the basis of the first map, based on the second image and the relative position information, to obtain a second map.
2. The instant positioning and map construction method of claim 1, further comprising: acquiring first point cloud data based on the first image;
wherein performing instant positioning and map construction based on the first image comprises: performing instant positioning and map construction based on the first point cloud data.
3. The instant positioning and map construction method of claim 1, further comprising: acquiring second point cloud data based on the second image;
wherein performing fused map construction on the basis of the first map based on the second image and the relative position information comprises: performing fused map construction on the basis of the first map based on the second point cloud data and the relative position information.
4. The instant positioning and map construction method of claim 1, wherein at least one of the current device and the other device is a head-mounted device.
5. The instant positioning and map construction method of claim 1, further comprising, after obtaining the first map:
acquiring the pose and displacement of the current device; and
constructing and supplementing the first map in real time based on the pose and displacement of the current device.
6. The instant positioning and map construction method of claim 1, further comprising, after obtaining the second map:
acquiring the pose and displacement of the other device; and
constructing and supplementing the second map in real time based on the pose and displacement of the other device.
7. The instant positioning and map construction method of claim 1, wherein the relative position information between the current device and the other device comprises a relative distance between the other device and the current device and a relative pose between the other device and the current device.
8. The instant positioning and map construction method of claim 1, further comprising:
when the current device reaches the second map cooperatively constructed with the other device, sending a request to use the safety area and/or virtual object decorations of the other device; and,
after a use license is received, using the safety area and/or virtual object decorations of the other device.
9. An instant positioning and mapping apparatus, characterized by comprising:
an image acquisition module, located in a current device and configured to acquire a first image through the current device, the image acquisition module being further configured to receive a second image from another device;
a map construction module, located in the current device and configured to perform instant positioning and map construction based on the first image to obtain a first map; and
a position determination module, located in the current device and configured to determine relative position information between the current device and the other device;
wherein the map construction module is further configured to perform fused map construction on the basis of the first map, based on the second image and the relative position information, to obtain a second map.
10. A terminal, comprising:
at least one memory and at least one processor;
wherein the at least one memory is configured to store program code, and the at least one processor is configured to invoke the program code stored in the at least one memory to perform the instant positioning and map construction method of any one of claims 1 to 8.
11. A storage medium for storing program code for performing the on-the-fly positioning and mapping method of any one of claims 1 to 8.
12. A computer program product, characterized in that it comprises instructions that, when executed by a computer device, cause the computer device to perform the instant localization and mapping method according to any one of claims 1 to 8.
CN202211558817.1A 2022-12-06 2022-12-06 Instant positioning and map construction method, device, terminal and storage medium Pending CN118149838A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211558817.1A CN118149838A (en) 2022-12-06 2022-12-06 Instant positioning and map construction method, device, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211558817.1A CN118149838A (en) 2022-12-06 2022-12-06 Instant positioning and map construction method, device, terminal and storage medium

Publications (1)

Publication Number Publication Date
CN118149838A true CN118149838A (en) 2024-06-07

Family

ID=91290812

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211558817.1A Pending CN118149838A (en) 2022-12-06 2022-12-06 Instant positioning and map construction method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN118149838A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination