CN110941265A - Map entry method and device, computer equipment and storage medium

Map entry method and device, computer equipment and storage medium

Info

Publication number
CN110941265A
Authority
CN
China
Prior art keywords
moving
reference object
map information
mobile robot
map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911071479.7A
Other languages
Chinese (zh)
Inventor
刘欣欣
宋兆辉
王海峰
顾达玲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meng Guang Information Technology Co Ltd
Original Assignee
Meng Guang Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Meng Guang Information Technology Co Ltd filed Critical Meng Guang Information Technology Co Ltd
Priority to CN201911071479.7A
Publication of CN110941265A
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0257 Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G01S17/32 Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
    • G01S17/36 Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated, with phase comparison between the received signal and the contemporaneously transmitted signal
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides a map entry method and apparatus, a computer device, and a storage medium. In response to a map entry start instruction, a mobile robot moves along with a moving reference object so as to keep a preset distance from it, and begins entering map information once it starts moving. The method simplifies deployment and lowers setup requirements: as soon as a map entry start command is sent, the robot follows behind a path guide and scans and enters the map while walking.

Description

Map entry method and device, computer equipment and storage medium
Technical Field
The invention relates to the technical field of automation, in particular to a map entry method, a map entry device, computer equipment and a storage medium.
Background
A robot is a machine that performs work automatically. An intelligent robot can accept human commands, run pre-programmed routines, and act according to principles defined by artificial intelligence; its task is to assist or replace human labor in, for example, production, construction, or dangerous work.
In the prior art, a robot can scan and enter maps of various scenes; for example, a computer-controlled robot or a human-propelled robot may be used to scan the map.
However, these prior-art scenarios for scanning and entering a map require a complicated pre-configuration process, take a long time, and do not support rapid deployment, which increases operation and maintenance costs.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a map entry method, apparatus, computer device, and storage medium.
A map entry method performed by a mobile robot, the method comprising: in response to a map entry start instruction, moving along with a moving reference object so as to keep a preset distance from the moving reference object; and entering map information once the mobile robot starts moving.
In one embodiment, the method further comprises: acquiring an initial position of the moving reference object; and moving to a target position at the preset distance from the initial position.
In one embodiment, the mobile robot is configured with an image recognition component and a map information entry component, and entering map information once the mobile robot starts moving comprises: entering the map information with the map information entry component once the mobile robot starts moving; wherein the map information entry component comprises a lidar.
In one embodiment, the method further comprises: identifying the moving reference object with the image recognition component; wherein the image recognition component comprises a 3D camera, and the moving reference object comprises a person who guides the robot's moving path.
A map entry apparatus deployed in a mobile robot, the apparatus comprising: a first moving module, configured to move the mobile robot along with a moving reference object in response to a map entry start instruction, so as to keep a preset distance from the moving reference object; and an entry module, configured to enter map information once the mobile robot starts moving.
In one embodiment, the apparatus further comprises: an acquisition module, configured to acquire an initial position of the moving reference object; and a second moving module, configured to move to a target position at the preset distance from the initial position.
In one embodiment, the mobile robot is configured with an image recognition component and a map information entry component, and the entry module comprises: an entry unit that enters map information with the map information entry component once the mobile robot starts moving; wherein the map information entry component comprises a lidar.
In one embodiment, the apparatus further comprises: an identification module, configured to identify the moving reference object with the image recognition component; wherein the image recognition component comprises a 3D camera, and the moving reference object comprises a path guide.
An electronic device, comprising: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to perform the method described in the above embodiments.
A computer-readable storage medium having computer program instructions stored thereon, wherein the computer program instructions, when executed by a processor, implement the method described in the above embodiments.
With the above map entry method, the mobile robot responds to a map entry start instruction by moving along with the moving reference object so as to keep a preset distance from it, and enters map information once it starts moving. This simplifies deployment and lowers setup requirements: in an essentially foolproof setup, the robot only needs a map entry start command to follow behind the path guide and scan and enter the map while walking.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
FIG. 1 is a flow diagram illustrating a map entry method in accordance with one embodiment.
Fig. 2 is a block diagram showing the structure of a map entry apparatus according to an embodiment.
Fig. 3 shows a block diagram of an electronic device in accordance with an embodiment of the disclosure.
Fig. 4 shows a block diagram of an electronic device in accordance with an embodiment of the disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In one embodiment, as shown in fig. 1, a map entry method is provided, which specifically includes the following steps:
S110: in response to a map entry start instruction, the mobile robot moves along with the moving reference object so as to keep a preset distance from it.
The map entry method in this implementation can be applied to entering maps of various kinds of scenes, such as hotels, restaurants, and amusement parks.
The map entry start instruction may be issued by a designated worker; for example, it may be triggered by a start button on the mobile robot or from a worker's mobile terminal. The moving reference object can be any person or object capable of moving, such as a worker.
As an example, when the map is entered for a hotel, the moving reference object may be a hotel attendant.
In this implementation, the distance between the mobile robot and the moving reference object can be measured in real time by a ranging sensor and held at the preset distance, so that the robot follows the moving reference object; as an example, the preset distance may be 1 to 2.5 m.
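To make the distance-keeping concrete, the following is a minimal sketch of a proportional follow controller, assuming a preset distance inside the 1 to 2.5 m band described above; read_range and set_speed are hypothetical placeholders for the robot's ranging sensor and drive interface, which the text does not specify.

    # Minimal sketch: keep a preset following distance with proportional control.
    # read_range() and set_speed() are hypothetical placeholders; the text does
    # not name the ranging-sensor or drive APIs.
    PRESET_DISTANCE = 1.75   # target following distance in meters (assumed)
    KP = 0.8                 # proportional gain: speed per meter of error
    MAX_SPEED = 1.2          # forward speed limit in m/s

    def follow_step(read_range, set_speed):
        """One control cycle: forward speed proportional to the distance error."""
        measured = read_range()              # real-time distance to the reference object
        error = measured - PRESET_DISTANCE   # positive means the robot lags too far behind
        speed = max(0.0, min(MAX_SPEED, KP * error))
        set_speed(speed)                     # 0 m/s once at or inside the preset distance

Calling follow_step on every sensor update keeps the robot trailing the reference object without overshooting it.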
S120: map information is entered once the mobile robot starts moving.
In this implementation, the map information may be depth information describing the three-dimensional structure of the current environment; a three-dimensional representation of the environment is built from this depth information.
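As one illustration of how entered depth information could be turned into a map, the sketch below accumulates range readings into a simple 2D occupancy grid; the grid representation and all names are our assumptions, since the text does not fix a map format.

    import math

    def scan_to_grid(scan, pose, grid, resolution=0.05):
        """Mark the endpoints of one range scan as occupied cells in a 2D grid.

        scan: iterable of (angle_rad, range_m) pairs from the depth sensor
        pose: (x_m, y_m, heading_rad) of the robot while moving
        grid: dict mapping (i, j) cell indices to occupancy counts
        """
        x, y, heading = pose
        for angle, rng in scan:
            # Project each depth measurement into world coordinates.
            wx = x + rng * math.cos(heading + angle)
            wy = y + rng * math.sin(heading + angle)
            cell = (int(wx / resolution), int(wy / resolution))
            grid[cell] = grid.get(cell, 0) + 1  # accumulate obstacle evidence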
In this embodiment, by responding to the map entry start instruction, the mobile robot moves along with the moving reference object so as to keep a preset distance from it, and enters map information once it starts moving. This simplifies deployment and lowers setup requirements: in an essentially foolproof setup, the robot only needs a map entry start command to follow behind the path guide and scan and enter the map while walking.
In one embodiment, the method further comprises: acquiring an initial position of the moving reference object; and moving to a target position at the preset distance from the initial position.
This implementation can be used to position the moving reference object and the mobile robot before the method begins. The initial position is the position of the moving reference object, and the target position is the position, at the preset distance from the initial position, to which the mobile robot moves.
In another embodiment, the moving reference object may transmit its own position information (e.g., coordinates) to the mobile robot, which determines the target position from this information and the preset distance and then moves to it.
In a further embodiment, the mobile robot is provided with drive means (e.g. powered wheels) by which it can be moved to a target position at a preset distance from the initial position.
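A small sketch of how the target position could be computed from the reference object's reported coordinates and the preset distance; placing the target directly behind the reference along its heading is an assumption, as the text only requires a point at the preset distance from the initial position.

    import math

    def target_position(ref_x, ref_y, ref_heading, preset_distance):
        """Point at preset_distance behind the reference object, along its heading.

        The "behind along heading" placement is our assumption; any point at the
        preset distance from (ref_x, ref_y) would satisfy the described method.
        """
        return (ref_x - preset_distance * math.cos(ref_heading),
                ref_y - preset_distance * math.sin(ref_heading))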
Optionally, the mobile robot may further be configured with an image recognition component and a map information entry component; in that case, entering map information once the mobile robot starts moving comprises entering the map information with the map information entry component, where the map information entry component comprises a lidar.
In this implementation, the map information entry component may be any sensor that acquires scene depth information: for example, a passive ranging component, or an active ranging sensor such as a TOF camera, a Kinect motion-sensing device, or a lidar. Different sensors correspondingly use different map entry methods.
As an example, a TOF camera acquires depth images by emitting continuous near-infrared pulses toward the target scene and receiving, with a sensor, the light reflected back by objects. Comparing the phase of the emitted light with that of the reflected light yields the propagation delay, from which the distance between the object and the emitter is obtained, and finally a depth image.
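The phase-comparison relation described here (and reflected in classification G01S17/36) can be written as d = c * delta_phi / (4 * pi * f): the phase shift delta_phi at modulation frequency f corresponds to a round-trip delay of delta_phi / (2 * pi * f), and the one-way distance is half the round-trip path. A worked sketch, with an assumed 20 MHz modulation frequency in the example:

    import math

    C = 299_792_458.0  # speed of light in m/s

    def tof_phase_distance(phase_shift_rad, modulation_freq_hz):
        """Distance from the phase difference of a continuous-wave TOF camera.

        The round trip adds a delay of phase_shift / (2*pi*f); the one-way
        distance is c*t/2, i.e. d = c * phase_shift / (4*pi*f).
        """
        return C * phase_shift_rad / (4.0 * math.pi * modulation_freq_hz)

    # e.g. a pi/2 phase shift at a 20 MHz modulation frequency (assumed value):
    # tof_phase_distance(math.pi / 2, 20e6) -> about 1.87 m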
As another example, lidar ranging obtains three-dimensional scene information by laser scanning. The basic principle is to emit laser into the scene at fixed time intervals and record the signal of each scan point; the interval a signal takes to travel from the lidar to an object in the scene and be reflected back gives the distance between the object's surface and the lidar, yielding depth data.
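For pulsed lidar ranging the relation is simpler: the distance is half the round-trip time multiplied by the speed of light, d = c * t / 2. A minimal sketch:

    C = 299_792_458.0  # speed of light in m/s

    def lidar_pulse_distance(round_trip_time_s):
        """Distance from the emit-to-return interval of one laser pulse: d = c*t/2."""
        return C * round_trip_time_s / 2.0

    # e.g. a pulse returning after 20 ns corresponds to roughly 3 m:
    # lidar_pulse_distance(20e-9) -> about 2.998 m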
In one embodiment, the method further comprises: identifying the moving reference object with the image recognition component; wherein the image recognition component comprises a 3D camera, and the moving reference object comprises a person who guides the robot's moving path.
In this implementation, the moving reference object can be such a path guide (a worker). Before map entry, an image of the guide is captured by the 3D camera and the guide is identified using face and/or body recognition; if the identification passes, the map entry operation starts.
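A hypothetical sketch of this identification step, using the open-source face_recognition library as one possible implementation; the text only requires face and/or body recognition on images from the 3D camera and does not name a library, so the enrollment flow and function names below are assumptions.

    # Hypothetical sketch: verify the path guide before map entry starts.
    import face_recognition

    def guide_is_recognized(enrolled_image_path, camera_frame):
        """Compare an RGB camera frame (numpy array) against the enrolled guide's face."""
        enrolled = face_recognition.load_image_file(enrolled_image_path)
        enrolled_encodings = face_recognition.face_encodings(enrolled)
        frame_encodings = face_recognition.face_encodings(camera_frame)
        if not enrolled_encodings or not frame_encodings:
            return False  # no face found in one of the images
        matches = face_recognition.compare_faces(enrolled_encodings, frame_encodings[0])
        return any(matches)  # True -> identification passes, map entry may start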
It should be understood that, although the steps in the flowcharts of the figures are shown in an order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the execution of these steps is not strictly ordered, and they may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or stages that are not necessarily completed at the same moment but may be executed at different times, and their execution order is not necessarily sequential: they may be performed in turn or alternately with other steps, or with at least some of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 2, a map entry apparatus is provided, including a first moving module 201 and an entry module 202.
The first moving module 201 is configured to move the mobile robot along with the moving reference object in response to a map entry start instruction, so as to keep a preset distance from the moving reference object.
The entry module 202 is configured to enter map information once the mobile robot starts moving.
In one embodiment, the apparatus further comprises: an acquisition module, configured to acquire an initial position of the moving reference object; and a second moving module, configured to move to a target position at the preset distance from the initial position.
In one embodiment, the mobile robot is configured with an image recognition component and a map information entry component, and the entry module comprises: an entry unit that enters map information with the map information entry component once the mobile robot starts moving; wherein the map information entry component comprises a lidar.
In one embodiment, the apparatus further comprises: an identification module, configured to identify the moving reference object with the image recognition component; wherein the image recognition component comprises a 3D camera, and the moving reference object comprises a path guide.
Embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the above-mentioned method. The computer readable storage medium may be a non-volatile computer readable storage medium.
An embodiment of the present disclosure further provides an electronic device, including: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to perform the above method.
The electronic device may be provided as a terminal, server, or other form of device.
Fig. 3 is a block diagram illustrating an electronic device 800 according to an example embodiment. For example, the electronic device 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or a similar terminal.
Referring to fig. 3, electronic device 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the electronic device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the electronic device 800. Examples of such data include instructions for any application or method operating on the electronic device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 806 provides power to the various components of the electronic device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the electronic device 800.
The multimedia component 808 includes a screen that provides an output interface between the electronic device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the electronic device 800 is in an operation mode, such as a shooting mode or a video mode. Each front or rear camera may be a fixed optical lens system or have focusing and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the electronic device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the electronic device 800. For example, the sensor assembly 814 may detect an open/closed state of the electronic device 800 and the relative positioning of components, such as the display and keypad of the electronic device 800; it may also detect a change in the position of the electronic device 800 or of one of its components, the presence or absence of user contact with the electronic device 800, the orientation or acceleration/deceleration of the electronic device 800, and a change in its temperature. The sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the electronic device 800 and other devices. The electronic device 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium, such as the memory 804, is also provided that includes computer program instructions executable by the processor 820 of the electronic device 800 to perform the above-described methods.
Fig. 4 is a block diagram illustrating an electronic device 1900 according to an example embodiment. For example, the electronic device 1900 may be provided as a server. Referring to fig. 4, electronic device 1900 includes a processing component 1922 further including one or more processors and memory resources, represented by memory 1932, for storing instructions, e.g., applications, executable by processing component 1922. The application programs stored in memory 1932 may include one or more modules that each correspond to a set of instructions. Further, the processing component 1922 is configured to execute instructions to perform the above-described method.
The electronic device 1900 may also include a power component 1926 configured to perform power management of the electronic device 1900, a wired or wireless network interface 1950 configured to connect the electronic device 1900 to a network, and an input/output (I/O) interface 1958. The electronic device 1900 may operate based on an operating system stored in memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
In an exemplary embodiment, a non-transitory computer readable storage medium, such as the memory 1932, is also provided that includes computer program instructions executable by the processing component 1922 of the electronic device 1900 to perform the above-described methods.
The present disclosure may be systems, methods, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement various aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present disclosure may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry that can execute the computer-readable program instructions, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), implements aspects of the present disclosure by utilizing the state information of the computer-readable program instructions to personalize the electronic circuitry.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terms used herein were chosen in order to best explain the principles of the embodiments, the practical application, or technical improvements to the techniques in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (10)

1. A map entry method, characterized in that the method is performed by a mobile robot, the method comprising:
in response to a map entry start instruction, moving along with a moving reference object so as to keep a preset distance from the moving reference object;
and entering map information when the mobile robot starts moving.
2. The method of claim 1, further comprising:
acquiring an initial position of the moving reference object;
and moving to a target position at the preset distance from the initial position.
3. The method according to claim 2, characterized in that the mobile robot is provided with an image recognition component and a map information entry component;
wherein entering map information when the mobile robot starts moving comprises:
entering the map information with the map information entry component when the mobile robot starts moving; wherein the map information entry component comprises a lidar.
4. The method of claim 3, further comprising:
identifying the moving reference object with the image recognition component;
wherein the image recognition component comprises a 3D camera; and the moving reference object comprises a person who guides the robot's moving path.
5. A map entry apparatus, characterized in that the apparatus is deployed in a mobile robot, the apparatus comprising:
a first moving module, configured to move the mobile robot along with a moving reference object in response to a map entry start instruction, so as to keep a preset distance from the moving reference object;
and an entry module, configured to enter map information when the mobile robot starts moving.
6. The apparatus of claim 5, further comprising:
an acquisition module, configured to acquire an initial position of the moving reference object;
and a second moving module, configured to move to a target position at the preset distance from the initial position.
7. The apparatus of claim 5, wherein the mobile robot is configured with an image recognition component and a map information entry component, and the entry module comprises:
an entry unit that enters map information with the map information entry component when the mobile robot starts moving; wherein the map information entry component comprises a lidar.
8. The apparatus of claim 7, further comprising:
an identification module, configured to identify the moving reference object with the image recognition component; wherein the image recognition component comprises a 3D camera; and the moving reference object comprises a person who guides the robot's moving path.
9. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to: performing the method of any one of claims 1 to 4.
10. A computer readable storage medium having computer program instructions stored thereon, which when executed by a processor implement the method of any one of claims 1 to 4.
CN201911071479.7A 2019-11-05 2019-11-05 Map entry method and device, computer equipment and storage medium Pending CN110941265A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911071479.7A CN110941265A (en) 2019-11-05 2019-11-05 Map entry method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911071479.7A CN110941265A (en) 2019-11-05 2019-11-05 Map entry method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN110941265A true CN110941265A (en) 2020-03-31

Family

ID=69906299

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911071479.7A Pending CN110941265A (en) 2019-11-05 2019-11-05 Map entry method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110941265A (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106292657A (en) * 2016-07-22 2017-01-04 北京地平线机器人技术研发有限公司 Mobile robot and patrol path setting method thereof
CN107607117A (en) * 2017-08-09 2018-01-19 华南理工大学 A kind of robot based on laser radar builds figure navigation system and method
WO2019061964A1 (en) * 2017-09-27 2019-04-04 广东宝乐机器人股份有限公司 Map creation method of mobile robot and mobile robot
CN107807652A (en) * 2017-12-08 2018-03-16 灵动科技(北京)有限公司 Merchandising machine people, the method for it and controller and computer-readable medium
CN108614563A (en) * 2018-06-12 2018-10-02 北京康力优蓝机器人科技有限公司 A method of realizing that mobile robot target follows by location navigation
CN109725580A (en) * 2019-01-17 2019-05-07 深圳市锐曼智能装备有限公司 The long-range control method of robot

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113778086A (en) * 2021-09-03 2021-12-10 上海擎朗智能科技有限公司 Map construction and use method, robot and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200331)