CN111324129A - Navigation method and device based on face recognition - Google Patents

Navigation method and device based on face recognition

Info

Publication number
CN111324129A
CN111324129A
Authority
CN
China
Prior art keywords
navigation
client
face recognition
distance
spacing distance
Prior art date
Legal status
Granted
Application number
CN202010196330.8A
Other languages
Chinese (zh)
Other versions
CN111324129B (en)
Inventor
翁伟东
郭敏鸿
Current Assignee
CCB Finetech Co Ltd
Original Assignee
China Construction Bank Corp
CCB Finetech Co Ltd
Priority date
Filing date
Publication date
Application filed by China Construction Bank Corp, CCB Finetech Co Ltd filed Critical China Construction Bank Corp
Priority to CN202010196330.8A priority Critical patent/CN111324129B/en
Publication of CN111324129A publication Critical patent/CN111324129A/en
Application granted granted Critical
Publication of CN111324129B publication Critical patent/CN111324129B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G05D1/0242 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01S13/08 Systems for measuring distance only
    • G05D1/0223 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0257 Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Navigation (AREA)

Abstract

The invention provides a navigation method and device based on face recognition. The method includes: acquiring a face photo and destination information of a client; generating a first navigation walking route according to the destination information; while navigating along the first navigation walking route, calculating in real time the spacing distance between the robot and the client according to the client's face photo; and adjusting the robot's own navigation walking speed based on the spacing distance. The invention can guide clients, cope with complex environments, and improve navigation efficiency and reliability, thereby improving the efficiency and safety of guided movement.

Description

Navigation method and device based on face recognition
Technical Field
The invention relates to the technical field of navigation robots, and in particular to a navigation method and device based on face recognition.
Background
In large indoor public places such as airports, exhibition halls, and office halls, the public frequently find it difficult to locate an intended destination accurately and quickly; even with guidance from a front desk or an attendant, people often take wrong turns or detours.
With the development of intelligent technology, intelligent robots are now used to perform guided-navigation tasks, and some robots can navigate autonomously to bring a client to a designated destination. After receiving a client's destination information, the robot calculates a walking route from its current position to the destination using indoor positioning and navigation technology (the robot generally has built-in map information, can compute a walking route between two points on the map, and, by receiving sensor signals, determines a practical walking route and speed and controls walking and obstacle avoidance), and then starts the navigation walking function to bring the client to the destination.
The existing robot navigation function solves the problem of the robot moving to the destination and can objectively provide a guiding service for a client, but it cannot detect whether the client is keeping up in time, or whether the client is distracted by other things or blocked by obstacles along the way and falls behind. As a result, the robot may reach the destination along the set route while the client remains in place or gets lost halfway.
Disclosure of Invention
In view of the problems in the prior art, the invention provides a navigation method and device based on face recognition that can cope with complex environments and improve navigation efficiency and reliability.
In order to solve the technical problems, the invention provides the following technical scheme:
in a first aspect, the present invention provides a navigation method based on face recognition, including:
acquiring a face photo and destination information of a client;
generating a first navigation walking route according to the destination information;
while navigating along the first navigation walking route, calculating in real time the spacing distance between the robot itself and the client according to the client's face photo;
and adjusting the robot's own navigation walking speed based on the spacing distance.
Wherein calculating in real time the spacing distance between the robot and the client according to the client's face photo includes the following steps:
performing face recognition according to the client's face photo to determine the client's position;
and measuring the spacing distance between the robot and the client based on the client's position and radar ranging.
Wherein adjusting the robot's own navigation walking speed based on the spacing distance includes:
when the spacing distance is greater than or equal to a preset distance, reducing the robot's own navigation walking speed and continuing navigation;
and when the spacing distance is greater than the preset distance and the duration for which the spacing distance exceeds the preset distance is greater than a preset time, generating a second navigation walking route according to the client's position and navigating based on the second navigation walking route until the spacing distance is less than the preset distance.
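The speed-adjustment rule above can be sketched as a per-tick policy. The following Python sketch is purely illustrative — the function name, threshold values, and return convention are assumptions, not the patented implementation:

```python
# Hypothetical sketch of the claimed speed-adjustment rule.
# preset_distance and preset_time are the thresholds named in the claims;
# their values and the slow_factor are illustrative assumptions.

def adjust_speed(current_speed, distance, over_threshold_secs,
                 preset_distance=3.0, preset_time=5.0, slow_factor=0.5):
    """Return (new_speed, replan) for one control tick.

    distance            -- measured robot-client spacing in metres
    over_threshold_secs -- how long the spacing has exceeded preset_distance
    replan              -- True when a second route toward the client is needed
    """
    if distance < preset_distance:
        return current_speed, False            # client is keeping up: no change
    if over_threshold_secs > preset_time:
        return 0.0, True                       # client likely lost: stop, replan
    return current_speed * slow_factor, False  # slow down, keep navigating
```

Once the spacing drops back below the preset distance, the caller would restore the normal navigation speed.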
Further, after the generating the first navigation traveling route according to the destination information, the method further includes:
and detecting the client following the robot by means of infrared detection.
Further, the method also comprises the following steps:
and stopping navigation according to a stop instruction input by the client.
In a second aspect, the present invention provides a navigation device based on face recognition, including:
the acquisition unit is used for acquiring a face photo and destination information of a client;
the route unit is used for generating a first navigation walking route according to the destination information;
the searching unit is used for calculating, in real time while navigating along the first navigation walking route, the spacing distance between the device and the client according to the client's face photo;
and the adjusting unit is used for adjusting the device's own navigation walking speed based on the spacing distance.
Wherein the search unit comprises:
the face recognition subunit is used for performing face recognition according to the client's face photo and determining the client's position;
and the distance measuring subunit is used for measuring the spacing distance between the device and the client based on the client's position and radar ranging.
Wherein the adjusting unit includes:
the first adjusting subunit is used for reducing the device's own navigation walking speed and continuing navigation when the spacing distance is greater than or equal to a preset distance;
and the second adjusting subunit is used for generating a second navigation walking route according to the client's position, when the spacing distance is greater than the preset distance and the duration for which it exceeds the preset distance is greater than a preset time, and navigating based on the second navigation walking route until the spacing distance is less than the preset distance.
Further, the method also comprises the following steps:
and the infrared unit is used for detecting the client following the device by means of infrared detection.
Further, the method also comprises the following steps:
and the termination unit is used for stopping navigation according to a stop instruction input by the client.
In a third aspect, the present invention provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the navigation method based on face recognition when executing the program.
In a fourth aspect, the present invention provides a computer-readable storage medium on which a computer program is stored, which, when executed by a processor, implements the steps of the navigation method based on face recognition.
According to the technical scheme, the invention provides a navigation method and device based on face recognition. The method obtains a face photo and destination information of a client; generates a first navigation walking route according to the destination information; while navigating along the first navigation walking route, calculates in real time the spacing distance between the robot and the client according to the client's face photo; and adjusts the robot's own navigation walking speed based on the spacing distance. In this way the client can be guided, complex environments can be handled, and navigation efficiency and reliability are improved, thereby improving the efficiency and safety of guided movement.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a first flowchart of a navigation method based on face recognition in an embodiment of the present invention.
Fig. 2 is a second flowchart of a navigation method based on face recognition in the embodiment of the present invention.
Fig. 3 is a third flowchart of a navigation method based on face recognition in the embodiment of the present invention.
Fig. 4 is a schematic structural diagram of a navigation device based on face recognition in an embodiment of the present invention.
Fig. 5 is a schematic structural diagram of an electronic device in an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention provides an embodiment of a navigation method based on face recognition; referring to fig. 1, the method specifically includes the following contents:
S101: acquiring a face photo and destination information of a client;
In this step, the purpose of obtaining the client's destination information is to generate a navigation walking route and guide the client along it, thereby realizing guided navigation. The client's face photo is acquired so that face recognition can be performed to determine whether the guided client is following.
In implementation, the client's destination information is obtained through at least one of voice input, text input, and text recognition, and the client's face photo is acquired through a camera.
S102: generating a first navigation walking route according to the destination information;
In this step, the location the client wishes to reach can be obtained from the destination information, and a walking route between the two map points can be calculated from the built-in map information according to that location and the robot's current location. This route is the first navigation walking route.
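As an illustrative stand-in for the built-in map routing described above, the sketch below computes a walking route between two points with a breadth-first search over a small occupancy grid. The grid representation and function name are assumptions; a real robot would use its indoor positioning and navigation stack:

```python
from collections import deque

def plan_route(grid, start, goal):
    """Shortest walking route on an occupancy grid (0 = free, 1 = obstacle).

    start/goal are (row, col) cells; returns the list of cells from start
    to goal, or None when no route exists.
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # BFS predecessor map, doubles as visited set
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:          # reconstruct the route by walking back
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None                   # no route found
```

For example, on a 3x3 grid with a wall across the middle row, `plan_route(grid, (0, 0), (2, 0))` returns the detour around the obstacle.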
S103: while navigating along the first navigation walking route, calculating in real time the spacing distance between the robot and the client according to the client's face photo;
In this step, while navigating according to the first navigation walking route, surrounding video or images are collected by the camera, a specific target is identified from them, and the distance between the robot and that target is calculated.
Note that the specific target is the client corresponding to the face photograph in step S101.
In implementation, face recognition is performed according to the client's face photo: specifically, face recognition is performed on the collected surrounding video or images, the client's face is recognized in them, and the client's position can be determined with the client as the center. The spacing distance between the robot and the client is then measured based on the client's position and radar ranging; specifically, the spacing distance can be obtained by performing radar ranging toward the client's position.
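The combination of face matching and radar ranging in step S103 can be sketched as follows. Everything here is a hedged assumption: the face detections are represented as (bearing, embedding) pairs, the match is a plain cosine-similarity comparison, and `radar_range_at` stands in for whatever ranging interface the robot exposes — no specific face-recognition or radar library API is implied:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity of two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def locate_client(detections, client_embedding, radar_range_at, threshold=0.8):
    """Find the stored client among camera face detections, then range them.

    detections     -- list of (bearing_deg, embedding) from the current frame
    radar_range_at -- callback: bearing_deg -> distance in metres (assumed)
    Returns (bearing_deg, distance) for the best match above threshold,
    or None when the client is not in view.
    """
    best = max(detections,
               key=lambda d: cosine_similarity(d[1], client_embedding),
               default=None)
    if best is None or cosine_similarity(best[1], client_embedding) < threshold:
        return None
    bearing = best[0]
    return bearing, radar_range_at(bearing)
```

The design point this illustrates is the division of labour in the claim: the camera answers "where is the client in the scene", and the radar answers "how far away is that position".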
S104: adjusting the robot's own navigation walking speed based on the spacing distance.
In this step, the spacing distance between the robot and the client is determined, and whether the client is likely to be lost is judged based on this distance. Different preset distances are therefore set according to usage requirements, and whether the client is being lost is judged from the relation between the spacing distance and the preset distance. If the client is likely to be lost, the robot adjusts its own traveling speed so that the client can catch up in time, realizing accurate following navigation.
In specific implementation, if the spacing distance is greater than or equal to the preset distance, it is determined that the client may be lost.
When the spacing distance is greater than or equal to the preset distance, the robot reduces its own navigation walking speed and continues navigation; and when the spacing distance is greater than the preset distance and the duration for which it exceeds the preset distance is greater than a preset time, a second navigation walking route is generated according to the client's position and navigation proceeds based on the second navigation walking route until the spacing distance is less than the preset distance.
As can be seen from the above description, the navigation method based on face recognition provided by the embodiment of the present invention obtains a face photo and destination information of a client; generates a first navigation walking route according to the destination information; while navigating along the first navigation walking route, calculates in real time the spacing distance between the robot and the client according to the client's face photo; and adjusts the robot's own navigation walking speed based on the spacing distance. In this way the client can be guided, complex environments can be handled, and navigation efficiency and reliability are improved, thereby improving the efficiency and safety of guided movement.
In an embodiment of the present invention, referring to fig. 2, after step S102 in the embodiment of the navigation method based on face recognition, step S105 is further included, which specifically includes the following contents:
S105: detecting the client following the robot by means of infrared detection.
During navigation and guidance, the human body position is located by infrared detection, and the camera captures the face for comparison, ensuring that the client is following at all times. This further improves the accuracy of determining the client's position and thus the efficiency of navigation and guidance.
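The two sensing channels described above play complementary roles: infrared detection confirms that *some* human body is following, while the face match confirms it is the *right* client. A minimal sketch of that cross-check, with both inputs standing in for real sensor readings (an assumption, not a sensor API):

```python
def follower_status(ir_detects_body, face_matches_client):
    """Combine infrared and face-recognition results into one follower state."""
    if not ir_detects_body:
        return "no-follower"      # nobody behind the robot: pause and search
    if not face_matches_client:
        return "wrong-follower"   # a body is present but it is not the client
    return "following"            # the client is keeping up
```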
In an embodiment of the present invention, referring to fig. 3, the embodiment of the navigation method based on face recognition further includes step S106, which specifically includes the following contents:
S106: stopping navigation according to a stop instruction input by the client.
In specific implementation, a client who can reach the destination on his or her own and no longer needs guidance can input a stop instruction to stop navigation.
It should be noted that the client can decide whether to continue navigation according to his or her own needs; if continued navigation is not needed, the client can input a stop instruction to stop it.
To further explain the scheme, the invention provides a full-flow embodiment of the navigation method based on face recognition, which specifically includes the following contents:
1. acquiring the client's navigation and guidance requirements, and calculating a navigation walking route according to the destination information and the robot's own positioning information;
2. calling the front camera to capture a face photo of the client, storing the photo in the temporary navigation task information, and starting navigation and guidance;
3. calling infrared detection to detect whether a human body is walking along with the robot;
4. calling the radar to measure distance and checking whether the distance to the human body is within a set threshold; if the distance exceeds the threshold, decelerating or stopping, and once the distance returns to within the threshold, accelerating again;
5. calling the rear camera to capture the face of the following client and sending it to the face recognition module;
6. comparing the received picture with the face stored when the navigation task started, to confirm that the nearest follower is the original client;
7. if the photos are consistent, walking as usual; if they are inconsistent, pausing walking;
8. rotating the camera in place to perform an omnidirectional search, looking for the client by face;
9. after finding the client, walking toward the client, and continuing navigation once the distance returns to within the threshold;
10. ending the navigation guidance task after reaching the destination or receiving an instruction from the client to cancel guidance.
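The ten steps above can be combined into a single control loop. The sketch below is a hypothetical harness: every sensor and actuator call (`infrared_body_present`, `face_matches`, `radar_distance`, `step_along_route`, and so on) is an assumed callback supplied by the caller, not a real robot API, and the thresholds are illustrative:

```python
def guide_loop(robot, client_photo, max_ticks=1000,
               preset_distance=3.0, preset_time=5.0):
    """One guided-navigation task: returns 'done' or 'timeout'."""
    over_secs = 0.0
    for _ in range(max_ticks):
        if robot.at_destination() or robot.stop_requested():
            return "done"                            # step 10: task complete
        if not robot.infrared_body_present():        # step 3: anyone following?
            robot.pause()
            robot.search_for(client_photo)           # steps 7-8: rotate, search
            continue
        if not robot.face_matches(client_photo):     # steps 5-7: right person?
            robot.pause()
            continue
        distance = robot.radar_distance()            # step 4: range the client
        if distance > preset_distance:
            over_secs += robot.tick_seconds
            if over_secs > preset_time:
                robot.replan_toward_client()         # generate a second route
            else:
                robot.slow_down()
        else:
            over_secs = 0.0
            robot.resume_speed()                     # back within threshold
        robot.step_along_route()                     # advance along the route
    return "timeout"
```

In use, `robot` would be an adapter object wrapping the platform's camera, radar, infrared, and drive interfaces.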
An embodiment of the present invention provides a specific implementation manner of a navigation device based on face recognition, which is capable of implementing all contents in the navigation method based on face recognition, and referring to fig. 4, the navigation device based on face recognition specifically includes the following contents:
the acquisition unit 10 is used for acquiring a face photo and destination information of a client;
a route unit 20 for generating a first navigation travel route according to the destination information;
the searching unit 30 is used for calculating, in real time while navigating along the first navigation walking route, the spacing distance between the device and the client according to the client's face photo;
and the adjusting unit 40 is used for adjusting the device's own navigation walking speed based on the spacing distance.
Wherein, the search unit 30 includes:
the face recognition subunit is used for performing face recognition according to the client's face photo and determining the client's position;
and the distance measuring subunit is used for measuring the spacing distance between the device and the client based on the client's position and radar ranging.
Wherein the adjusting unit 40 includes:
the first adjusting subunit is used for reducing the device's own navigation walking speed and continuing navigation when the spacing distance is greater than or equal to a preset distance;
and the second adjusting subunit is used for generating a second navigation walking route according to the client's position, when the spacing distance is greater than the preset distance and the duration for which it exceeds the preset distance is greater than a preset time, and navigating based on the second navigation walking route until the spacing distance is less than the preset distance.
Further, the method also comprises the following steps:
and an infrared unit 50 for detecting the client following the device by means of infrared detection.
Further, the method also comprises the following steps:
and a termination unit 60 for stopping the navigation according to a stop instruction input by the client.
The navigation device embodiment provided by the present invention can be used to execute the processing flow of the navigation method embodiment described above; its functions are not repeated here, and reference can be made to the detailed description of the method embodiment.
As can be seen from the above description, the navigation device based on face recognition according to the embodiment of the present invention obtains a face photo and destination information of a client; generates a first navigation walking route according to the destination information; while navigating along the first navigation walking route, calculates in real time the spacing distance between the device and the client according to the client's face photo; and adjusts the device's own navigation walking speed based on the spacing distance. In this way the client can be guided, complex environments can be handled, and navigation efficiency and reliability are improved, thereby improving the efficiency and safety of guided movement.
The application provides an embodiment of an electronic device for implementing all or part of contents in the navigation method based on face recognition, and the electronic device specifically includes the following contents:
a processor (processor), a memory (memory), a communication Interface (Communications Interface), and a bus; the processor, the memory and the communication interface complete mutual communication through the bus; the communication interface is used for realizing information transmission between related devices; the electronic device may be a desktop computer, a tablet computer, a mobile terminal, and the like, but the embodiment is not limited thereto. In this embodiment, the electronic device may be implemented with reference to the embodiment of the navigation method based on face recognition and the embodiment of the navigation device based on face recognition, which are incorporated herein, and repeated details are not repeated herein.
Fig. 5 is a schematic block diagram of a system configuration of an electronic device 9600 according to an embodiment of the present application. As shown in fig. 5, the electronic device 9600 can include a central processor 9100 and a memory 9140; the memory 9140 is coupled to the central processor 9100. Notably, this FIG. 5 is exemplary; other types of structures may also be used in addition to or in place of the structure to implement telecommunications or other functions.
In one embodiment, the navigation function based on face recognition may be integrated into the central processor 9100. The central processor 9100 may be configured to control as follows:
acquiring a face photo and destination information of a client;
generating a first navigation walking route according to the destination information;
while navigating along the first navigation walking route, calculating in real time the spacing distance between the robot and the client according to the client's face photo;
and adjusting the robot's own navigation walking speed based on the spacing distance.
As can be seen from the above description, the electronic device provided in the embodiments of the present application obtains a face photo and destination information of a client; generates a first navigation walking route according to the destination information; while navigating along the first navigation walking route, calculates in real time the spacing distance between the robot and the client according to the client's face photo; and adjusts the robot's own navigation walking speed based on the spacing distance. In this way the client can be guided, complex environments can be handled, and navigation efficiency and reliability are improved, thereby improving the efficiency and safety of guided movement.
In another embodiment, the navigation device based on face recognition may be configured separately from the central processor 9100, for example, the navigation device based on face recognition may be configured as a chip connected to the central processor 9100, and the navigation function based on face recognition may be realized by the control of the central processor.
As shown in fig. 5, the electronic device 9600 may further include: a communication module 9110, an input unit 9120, an audio processor 9130, a display 9160, and a power supply 9170. It is noted that the electronic device 9600 also does not necessarily include all of the components shown in fig. 5; further, the electronic device 9600 may further include components not shown in fig. 5, which may be referred to in the art.
As shown in fig. 5, a central processor 9100, sometimes referred to as a controller or operational control, can include a microprocessor or other processor device and/or logic device, which central processor 9100 receives input and controls the operation of the various components of the electronic device 9600.
The memory 9140 can be, for example, one or more of a buffer, a flash memory, a hard drive, removable media, volatile memory, non-volatile memory, or other suitable devices. It can store relevant information as well as the programs that process that information, and the central processor 9100 can execute the programs stored in the memory 9140 to realize information storage or processing.
The input unit 9120 provides input to the central processor 9100. The input unit 9120 is, for example, a key or a touch input device. The power supply 9170 is used to provide power to the electronic device 9600. The display 9160 is used for displaying objects such as images and text. The display may be, for example, an LCD display, but is not limited thereto.
The memory 9140 can be a solid-state memory, e.g., read-only memory (ROM), random access memory (RAM), a SIM card, or the like. It may also be a memory that retains information even when power is off, and that can be selectively erased and provided with new data; such a memory is sometimes referred to as an EPROM or the like. The memory 9140 could also be some other type of device. The memory 9140 includes a buffer memory 9141 (sometimes referred to as a buffer), and may include an application/function storage portion 9142 for storing application programs and function programs, or a flow for the central processor 9100 to execute operations of the electronic device 9600.
The memory 9140 can also include a data store 9143, the data store 9143 being used to store data, such as contacts, digital data, pictures, sounds, and/or any other data used by an electronic device. The driver storage portion 9144 of the memory 9140 may include various drivers for the electronic device for communication functions and/or for performing other functions of the electronic device (e.g., messaging applications, contact book applications, etc.).
The communication module 9110 is a transmitter/receiver 9110 that transmits and receives signals via an antenna 9111. The communication module (transmitter/receiver) 9110 is coupled to the central processor 9100 to provide input signals and receive output signals, which may be the same as in the case of a conventional mobile communication terminal.
Based on different communication technologies, a plurality of communication modules 9110, such as a cellular network module, a bluetooth module, and/or a wireless local area network module, may be provided in the same electronic device. The communication module (transmitter/receiver) 9110 is also coupled to a speaker 9131 and a microphone 9132 via an audio processor 9130 to provide audio output via the speaker 9131 and receive audio input from the microphone 9132, thereby implementing ordinary telecommunications functions. The audio processor 9130 may include any suitable buffers, decoders, amplifiers and so forth. In addition, the audio processor 9130 is also coupled to the central processor 9100, thereby enabling recording locally through the microphone 9132 and enabling locally stored sounds to be played through the speaker 9131.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements all the steps of the navigation method based on face recognition in the foregoing embodiment; for example, when the processor executes the computer program, the following steps are implemented:
acquiring a face photo and destination information of a client;
generating a first navigation walking route according to the destination information;
when the navigation is carried out based on the first navigation walking route, calculating the spacing distance between the navigation device and the client in real time according to the face photo of the client;
and adjusting the navigation walking speed of the navigation device based on the spacing distance.
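The speed-adjustment step above, as elaborated in claim 3 below, can be sketched as one step of a hypothetical control loop. The threshold values and the replan flag are illustrative assumptions rather than values given in the embodiment:

```python
# Illustrative thresholds -- the embodiment does not fix concrete values.
PRESET_DISTANCE = 1.5   # metres: spacing distance threshold
PRESET_TIME = 5.0       # seconds: how long the lag may persist

def adjust_speed(speed, distance, lag_since, now):
    """One control-loop step of the speed adjustment described above.

    Returns (new_speed, new_lag_since, replan), where lag_since is the
    time at which the spacing distance first exceeded the threshold
    (None while the client keeps up), and replan signals that a second
    navigation walking route should be generated.
    """
    if distance is None or distance < PRESET_DISTANCE:
        return speed, None, False          # client close enough: keep pace
    if lag_since is None:
        lag_since = now                    # lag just started
    if now - lag_since > PRESET_TIME:
        return speed, lag_since, True      # persistent lag: regenerate route
    return speed * 0.5, lag_since, False   # slow down and keep navigating
```

The caller would feed in the real-time spacing distance each cycle and, on a replan signal, generate the second navigation walking route from the client's current position.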
As can be seen from the above description, the computer-readable storage medium provided in the embodiment of the present invention acquires a face photo of a client and destination information; generates a first navigation walking route according to the destination information; while navigating based on the first navigation walking route, calculates the spacing distance between the navigation device and the client in real time according to the face photo of the client; and adjusts the navigation walking speed of the navigation device based on the spacing distance. The client can thereby be guided to the destination, and because the device adapts to the client's position even in a complex environment, the efficiency, reliability and safety of the guided movement are improved.
Although the present invention provides method steps as described in the embodiments or flowcharts, more or fewer steps may be included through conventional or non-inventive labor. The order of steps recited in the embodiments is merely one of many possible execution orders and does not represent the only order of execution. When an actual apparatus or client product executes, the steps may be performed sequentially or in parallel (for example, in a parallel-processor or multi-threaded environment) according to the methods shown in the embodiments or figures.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, apparatus (system) or computer program product. Accordingly, embodiments of the present description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The embodiments in the present specification are described in a progressive manner; the same or similar parts among the embodiments may be referred to one another, and each embodiment focuses on its differences from the other embodiments. In particular, since the system embodiment is substantially similar to the method embodiment, its description is relatively brief, and for relevant points reference may be made to the corresponding description of the method embodiment.

In this document, relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual relationship or order between such entities or actions. The terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus.

Terms such as "upper" and "lower" indicate orientations or positional relationships based on those shown in the drawings; they are used only for convenience and simplification of description, and do not indicate or imply that the devices or elements referred to must have a specific orientation or be constructed and operated in a specific orientation, and therefore should not be construed as limiting the present invention. Unless expressly stated or limited otherwise, the terms "mounted," "connected," and "coupled" are to be understood broadly: the connection may be fixed, detachable, or integral; mechanical or electrical; direct, or indirect through an intervening medium, or internal to two elements.
The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations. It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict. The present invention is not limited to any single aspect, nor is it limited to any single embodiment, nor is it limited to any combination and/or permutation of these aspects and/or embodiments. Moreover, each aspect and/or embodiment of the present invention may be utilized alone or in combination with one or more other aspects and/or embodiments thereof.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solution of the present invention, not to limit it. While the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the present invention and should be covered by the claims and description of the present invention.

Claims (12)

1. A navigation method based on face recognition is characterized by comprising the following steps:
acquiring a face photo and destination information of a client;
generating a first navigation walking route according to the destination information;
when the navigation is carried out based on the first navigation walking route, calculating the spacing distance between the navigation device and the client in real time according to the face photo of the client;
and adjusting the navigation walking speed of the navigation device based on the spacing distance.
2. The navigation method based on face recognition of claim 1, wherein the step of calculating the separation distance between the client and the navigation device in real time according to the face picture of the client comprises the following steps:
carrying out face recognition according to the face photo of the client to determine the position of the client;
and measuring the distance between the navigation device and the client by radar ranging based on the position of the client.
3. The navigation method based on face recognition according to claim 1, wherein the adjusting of the navigation walking speed based on the separation distance comprises:
when the spacing distance is greater than or equal to a preset distance, reducing the navigation walking speed of the navigation device and continuing the navigation;
and when the spacing distance is greater than the preset distance and the duration for which the spacing distance is greater than the preset distance exceeds a preset time, generating a second navigation walking route according to the position of the client and navigating based on the second navigation walking route until the spacing distance is less than the preset distance.
4. The method for navigating based on human face recognition according to claim 1, further comprising, after the generating a first navigation walking route according to the destination information:
and detecting, by infrared detection, whether the client is following the navigation device.
5. The navigation method based on face recognition according to claim 1, further comprising:
and stopping navigation according to a stop instruction input by a client.
6. A navigation device based on face recognition, comprising:
the acquisition unit is used for acquiring a face photo and destination information of a client;
the route unit is used for generating a first navigation walking route according to the destination information;
the searching unit is used for calculating the spacing distance between the navigation device and the client in real time according to the face photo of the client when the navigation is carried out based on the first navigation walking route;
and the adjusting unit is used for adjusting the navigation walking speed of the navigation device based on the spacing distance.
7. The navigation device based on face recognition according to claim 6, wherein the search unit comprises:
the face recognition subunit is used for carrying out face recognition according to the face picture of the client and determining the position of the client;
and the distance measuring subunit is used for measuring the distance between the navigation device and the client by radar ranging based on the position of the client.
8. The face recognition-based navigation device of claim 6, wherein the adjustment unit comprises:
the first adjusting subunit is used for reducing the navigation walking speed of the navigation device and continuing the navigation when the spacing distance is greater than or equal to a preset distance;
and the second adjusting subunit is used for generating a second navigation walking route according to the position of the client and navigating based on the second navigation walking route, when the spacing distance is greater than the preset distance and the duration for which the spacing distance is greater than the preset distance exceeds a preset time, until the spacing distance is less than the preset distance.
9. The face recognition-based navigation device of claim 6, further comprising:
and the infrared unit is used for detecting, by infrared detection, whether the client is following the navigation device.
10. The face recognition-based navigation device of claim 6, further comprising:
and the termination unit is used for stopping navigation according to a stop instruction input by the client.
11. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the steps of the method for navigating based on face recognition according to any one of claims 1 to 5 are implemented when the processor executes the program.
12. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the face recognition based navigation method of any one of claims 1 to 5.
CN202010196330.8A 2020-03-19 2020-03-19 Navigation method and device based on face recognition Active CN111324129B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010196330.8A CN111324129B (en) 2020-03-19 2020-03-19 Navigation method and device based on face recognition

Publications (2)

Publication Number Publication Date
CN111324129A true CN111324129A (en) 2020-06-23
CN111324129B CN111324129B (en) 2023-07-18

Family

ID=71171682

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010196330.8A Active CN111324129B (en) 2020-03-19 2020-03-19 Navigation method and device based on face recognition

Country Status (1)

Country Link
CN (1) CN111324129B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040111273A1 (en) * 2002-09-24 2004-06-10 Yoshiaki Sakagami Receptionist robot system
US20060178817A1 (en) * 2003-10-23 2006-08-10 Navitime Japan Co., Ltd Navigation apparatus, server apparatus, navigation method, and navigation program
CN106426213A (en) * 2016-11-23 2017-02-22 深圳哈乐派科技有限公司 Accompanying and nursing robot
CN107390721A (en) * 2017-07-26 2017-11-24 歌尔科技有限公司 Robot retinue control method, device and robot
CN108734083A (en) * 2018-03-21 2018-11-02 北京猎户星空科技有限公司 Control method, device, equipment and the storage medium of smart machine
WO2019179468A1 (en) * 2018-03-21 2019-09-26 北京猎户星空科技有限公司 Control method for smart device, apparatus, device, and storage medium
CN109333535A (en) * 2018-10-25 2019-02-15 同济大学 A kind of guidance method of autonomous mobile robot
CN109571499A (en) * 2018-12-25 2019-04-05 广州天高软件科技有限公司 A kind of intelligent navigation leads robot and its implementation
CN109935310A (en) * 2019-03-08 2019-06-25 弗徕威智能机器人科技(上海)有限公司 A kind of robot guidance system and method applied to medical institutions
CN110405767A (en) * 2019-08-01 2019-11-05 深圳前海微众银行股份有限公司 Intelligent exhibition room leads method, apparatus, equipment and storage medium

Also Published As

Publication number Publication date
CN111324129B (en) 2023-07-18

Similar Documents

Publication Publication Date Title
US10641613B1 (en) Navigation using sensor fusion
RU2642150C2 (en) Method and device for programmable control of user's motion path to elevator/escalator
KR101948728B1 (en) Method and system for collecting data
CN105318881B (en) Map navigation method, device and system
US8526677B1 (en) Stereoscopic camera with haptic feedback for object and location detection
US10663302B1 (en) Augmented reality navigation
US10989559B2 (en) Methods, systems, and devices for displaying maps
EP2769333A1 (en) Image and video based pedestrian traffic estimation
CN105222774A (en) A kind of indoor orientation method and user terminal
US20220230350A1 (en) Position recognition method and system based on visual information processing
CN113910224B (en) Robot following method and device and electronic equipment
CN112085445A (en) Robot destination arrival determining method and device, electronic equipment and storage medium
CN113063421A (en) Navigation method and related device, mobile terminal and computer readable storage medium
CN115420275A (en) Loop path prediction method and device, nonvolatile storage medium and processor
CN114012740B (en) Target place leading method and device based on robot and robot
US9594148B2 (en) Estimation device and estimation method using sound image localization processing
US11181381B2 (en) Portable pedestrian navigation system
CN105528385B (en) Information acquisition method, information acquisition system, and information acquisition program
JP6746735B2 (en) Information presenting method, information presenting system, and information presenting program
KR20190068006A (en) Method for providing route through marker recognition and server using the same
CN111324129B (en) Navigation method and device based on face recognition
CN116533987A (en) Parking path determination method, device, equipment and automatic driving vehicle
CN115675528A (en) Automatic driving method and vehicle based on similar scene mining
KR101954800B1 (en) Positioninng service system, method and providing service apparatus for location information, mobile in the system thereof
US10735902B1 (en) Method and computer program for taking action based on determined movement path of mobile devices

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220930

Address after: 12 / F, 15 / F, 99 Yincheng Road, Pudong New Area pilot Free Trade Zone, Shanghai, 200120

Applicant after: Jianxin Financial Science and Technology Co.,Ltd.

Address before: 25 Financial Street, Xicheng District, Beijing 100033

Applicant before: CHINA CONSTRUCTION BANK Corp.

Applicant before: Jianxin Financial Science and Technology Co.,Ltd.

GR01 Patent grant