CN108445881A - Robot-based person-finding method and robot - Google Patents

Robot-based person-finding method and robot

Info

Publication number
CN108445881A
Authority
CN
China
Prior art keywords
person
finding
robot
image information
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810219259.3A
Other languages
Chinese (zh)
Inventor
洪帆
周能文
刘雄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Loy Intelligent Technology Co Ltd
Original Assignee
Shanghai Loy Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Loy Intelligent Technology Co Ltd filed Critical Shanghai Loy Intelligent Technology Co Ltd
Priority to CN201810219259.3A priority Critical patent/CN108445881A/en
Publication of CN108445881A publication Critical patent/CN108445881A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0253Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0223Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/0285Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using signals transmitted via a public communication network, e.g. GSM network

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Manipulator (AREA)

Abstract

Embodiments of the present invention relate to the field of smart devices and disclose a robot-based person-finding method and a robot. In the present invention, the person-finding method is applied to a mobile robot and includes: determining a moving area according to a preset position; moving within the moving area and collecting image information; analyzing the collected image information; if image information matching an identification feature of the person to be found is detected in the collected image information, determining a travel direction according to the matched image information and moving until the position of the person to be found is reached; and sending the position of the person to be found to a user terminal bound to the robot, so that the user can quickly find the person to be found while knowing only an approximate location.

Description

Robot-based person-finding method and robot
Technical field
Embodiments of the present invention relate to the field of smart devices, and in particular to person-finding technology for smart devices.
Background technology
A robot is an automatic device that performs work. It can accept human commands, run pre-programmed routines, and also act according to principles formulated with artificial-intelligence technology. Its task is to assist or replace human work, for example in manufacturing, construction, or dangerous jobs. Nowadays robots are entering households, and it is the trend of the times for robots to provide services in the home, such as vehicle-care services, errand services and cooking services; with the rapid development of artificial intelligence, robots are able to carry out more and more complex work.
In modern cities, people's social activities are often concentrated in a few areas. When meeting up with someone, it is often found that although a meeting place has been agreed, the exact spot cannot be specified precisely: there may be too many pedestrians, or the site environment may be complex, so that even when both parties arrive at the meeting place at the same time it can still be difficult to find each other. This is especially problematic in places such as railway stations and shopping malls, where even a phone call on the spot is sometimes not enough to describe one's position clearly.
Summary of the invention
An object of embodiments of the present invention is to provide a robot-based person-finding method and a robot, so that a user can quickly find the person to be found while knowing only an approximate location.
To solve the above technical problem, embodiments of the present invention provide a robot-based person-finding method applied to a mobile robot, including: determining a moving area according to a preset position; moving within the moving area and collecting image information; analyzing the collected image information; if image information matching an identification feature of the person to be found is detected in the collected image information, determining a travel direction according to the matched image information and moving until the position of the person to be found is reached; and sending the position of the person to be found to a user terminal bound to the robot.
Embodiments of the present invention also provide a robot, including: at least one processor; and a memory communicatively connected to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor so that the at least one processor can perform the above robot person-finding method.
Embodiments of the present invention also provide a computer-readable storage medium storing a computer program which, when executed by a processor, implements the above robot person-finding method.
Compared with the prior art, the main difference and effect of the embodiments of the present invention are as follows: the mobile robot moves within the moving area and collects surrounding image information while moving; the collected information is then analyzed, and information matching the identification feature of the person to be found is sought. Once matching information is found, the direction of the person to be found is known and is taken as the robot's moving direction until the position of the person to be found is reached. The position of the person to be found can then be sent to the user, so that the user can find the person as soon as possible. It can thus be seen that embodiments of the present invention enable the user to quickly find the person to be found while knowing only an approximate location.
As a further improvement, after sending the position of the person to be found to the user terminal bound to the robot, the method further includes: planning a route between the position of the person to be found and the position of the user terminal. This further specifies that, after obtaining the position of the person to be found, the robot can obtain a route to the user, making it easier for the person to be found and the user to find each other.
As a further improvement, after planning the route between the position of the person to be found and the position of the user terminal, the method further includes: sending voice guidance information according to the route. This specifies that voice guidance can also be sent, so the person to be found can follow the voice guidance and the user can find the person even more easily.
As a further improvement, determining the moving area according to the preset position is specifically: when a person-finding instruction is received, taking the current position as a starting point and determining a region satisfying a preset condition as the moving area. This further specifies how the moving area is determined; the determination is flexible and meets practical needs.
As a further improvement, the preset condition is that the distance to the starting point is less than or equal to a preset value. Searching a small range around the user's position makes it easier to quickly capture image information of the person to be found, since the person is more likely to be nearby.
As a further improvement, the method further includes: if no image information matching the identification feature of the person to be found is detected in the collected image information for a preset duration, moving to the preset position. This further specifies that when the person cannot be found after the preset duration, the robot returns to a preset position.
As a further improvement, the preset position is the position of the user terminal bound to the robot. This further specifies that the position the robot returns to is the position of the user.
As a further improvement, the identification feature is a facial image. Using a facial image as the identification feature allows the person to be found to be matched quickly and efficiently from the collected image information.
Description of the drawings
One or more embodiments are illustrated by the figures in the corresponding drawings. These exemplary illustrations do not limit the embodiments; elements with the same reference numerals in the drawings denote similar elements, and unless otherwise stated the figures in the drawings are not drawn to scale.
Fig. 1 is a flowchart of the robot person-finding method according to the first embodiment of the present invention;
Fig. 2 shows the positional relationship between the robot and the person to be found in the robot person-finding method according to the first embodiment of the present invention;
Fig. 3 is a flowchart of the robot person-finding method according to the third embodiment of the present invention;
Fig. 4 is a schematic structural diagram of the smart device according to the fourth embodiment of the present invention.
Detailed description of the embodiments
To make the objectives, technical solutions and advantages of the embodiments of the present invention clearer, the embodiments of the present invention are explained in detail below with reference to the accompanying drawings. Those skilled in the art will understand, however, that many technical details are set out in the embodiments of the present invention so that the reader may better understand the present application; even without these technical details, and with various changes and modifications based on the following embodiments, the technical solution claimed in the present application can still be realized.
The first embodiment of the present invention relates to a robot person-finding method.
The person-finding method in this embodiment is applied to a mobile robot. The robot is bound to a user terminal, which may be the terminal of the robot's owner, such as a mobile phone or a remote controller. The flow of the person-finding method in this embodiment is shown in Fig. 1 and is as follows:
Step 101: determine the moving area according to the preset position.
Specifically, the preset position may be a specific position, such as the starting point or the position of the user terminal. More specifically, the user may also move during the person-finding process, in which case the robot can learn the user's accurate location from the position of the user terminal.
In this embodiment, when the person-finding instruction is received, the current position is taken as the starting point, and the region satisfying a preset condition is determined as the moving area. The region satisfying the preset condition may be the region whose distance to the starting point is less than or equal to a preset value.
In practical applications, the user can search in the same area as the robot to speed up the search, or search in a different area from the robot to enlarge the search region, which can also speed up the search.
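As a rough illustration of step 101 only, the following Python sketch models the moving area as the set of points whose distance to the starting point does not exceed a preset value. The planar (x, y) coordinate frame, the meter unit and the 50 m radius are assumptions made for the example and are not values given in this embodiment.

```python
import math

def in_moving_area(point, start, max_radius_m=50.0):
    """Return True if `point` lies within the moving area.

    The moving area is modeled as the set of positions whose distance to the
    starting point is less than or equal to a preset value, as in step 101.
    Coordinates are assumed to be planar (x, y) in meters, and 50 m is an
    illustrative default rather than a value from this embodiment.
    """
    dx = point[0] - start[0]
    dy = point[1] - start[1]
    return math.hypot(dx, dy) <= max_radius_m

# The robot receives the person-finding instruction at (0, 0) and checks
# whether two candidate waypoints fall inside the area.
print(in_moving_area((30.0, 20.0), (0.0, 0.0)))  # True: about 36 m from the start
print(in_moving_area((60.0, 10.0), (0.0, 0.0)))  # False: about 61 m from the start
```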
Step 102: move within the moving area and collect image information.
Specifically, the robot can use various traveling patterns while moving, such as straight-line traveling or spiral traveling, which can be set according to the user's actual needs. Furthermore, the robot collects image information while moving; the image information can be collected with the robot's built-in camera module, and the robot can also rotate while moving and collect image information while rotating, so as to enlarge the search area as much as possible.
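Purely as an illustration of the spiral traveling pattern mentioned in step 102, the sketch below generates waypoints on an Archimedean spiral around the starting point. The spacing, the number of turns and the tangent heading are illustrative assumptions; a real robot could additionally rotate in place at each waypoint to widen the camera's coverage, as described above.

```python
import math

def spiral_waypoints(start, step_m=0.5, turns=5, points_per_turn=24):
    """Yield (x, y, heading) waypoints on an Archimedean spiral around `start`.

    `step_m` is the growth of the radius per full turn; all values here are
    illustrative assumptions rather than parameters from this embodiment.
    """
    for i in range(turns * points_per_turn):
        theta = 2 * math.pi * i / points_per_turn
        r = step_m * theta / (2 * math.pi)      # radius grows by step_m each turn
        x = start[0] + r * math.cos(theta)
        y = start[1] + r * math.sin(theta)
        heading = theta + math.pi / 2           # travel tangent to the spiral
        yield (x, y, heading)

# First few waypoints of a spiral search started at the origin.
for waypoint in list(spiral_waypoints((0.0, 0.0)))[:3]:
    print(waypoint)
```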
Step 103: analyze the collected image information.
Specifically, the collected image information may be single-frame still images or video images; the specific analysis and processing of the images is not described in detail here.
Step 104: judge whether image information matching the identification feature of the person to be found is detected in the collected image information; if so, go to step 105; if not, return to step 104.
Specifically, the collected image information is searched to determine whether there is image information matching the identification feature of the person to be found. The identification feature in this embodiment may be a facial image: since there may be many passers-by in the person-finding environment, a facial image allows the person to be found to be identified quickly and accurately from the crowd.
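This embodiment does not specify a particular face-recognition algorithm, so the following sketch only assumes that some face-embedding extractor exists and compares embeddings against the target's facial image by cosine similarity. The 128-dimensional vectors, the 0.6 threshold and the matching rule are all illustrative assumptions.

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine of the angle between two feature vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def first_match(frame_embeddings, target_embedding, threshold=0.6):
    """Return the index of the first face embedding matching the target, else None.

    `frame_embeddings` stands for face feature vectors extracted from one
    collected frame by some (unspecified) face-recognition model.
    """
    for i, emb in enumerate(frame_embeddings):
        if cosine_similarity(emb, target_embedding) >= threshold:
            return i
    return None

# Toy example: random vectors stand in for real face embeddings.
rng = np.random.default_rng(0)
target = rng.normal(size=128)
faces = [rng.normal(size=128), target + 0.05 * rng.normal(size=128)]
print(first_match(faces, target))  # 1: the slightly perturbed copy of the target
```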
Step 105: determine the travel direction according to the matched image information and move to the position of the person to be found.
Specifically, when the robot's camera module shoots the scene in front of it, once the person to be found is identified in the captured image 21, the bearing of the person relative to the robot can be determined from the person's position in the image: the directions to which the left and right edges of the image correspond are determined by the shooting angle of the camera module, for example 30 degrees, so different positions in the image correspond to different directions relative to the robot, and that direction becomes the robot's subsequent travel direction. As shown in Fig. 2, if the image of the person to be found is at position 230, the person is directly in front of the robot, so the travel direction is the straight line at 220; if the image of the person is at position 231, the person is to the robot's left front, so the travel direction is the straight line at 221; if the image of the person is at position 232, the person is to the robot's right front, so the travel direction is the straight line at 222. Other cases are not enumerated one by one.
More specifically, once the person to be found is identified, the robot moves in the determined travel direction, which is equivalent to entering a mode of approaching the person to be found.
It should also be noted that after the travel direction is determined, image information continues to be collected and analyzed and the travel direction continues to be updated until the position of the person to be found is reached. As the robot gets closer to the person to be found, the proportion of the collected image occupied by the person's facial image becomes larger and larger; from the robot's moving speed and the change in this proportion, the distance between the robot and the person can be determined, and hence whether the robot has reached the person's position. The specific algorithm is not repeated here.
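As a sketch of the geometry described for step 105, the function below maps the horizontal pixel position of the matched face to a bearing relative to the robot's heading using the camera's field of view, and a second helper checks that the face is taking up a growing share of the frame. The 60-degree field of view (30 degrees per half-image, as in the example above) and the simple size-based closing check are illustrative assumptions.

```python
def bearing_from_pixel(x_pixel, image_width, horizontal_fov_deg=60.0):
    """Map a horizontal pixel position to a bearing in degrees.

    0 means straight ahead; negative values are to the left front and positive
    values to the right front, matching positions 231, 230 and 232 in Fig. 2.
    """
    half_fov = horizontal_fov_deg / 2.0
    # Normalized offset in [-1, 1] from the image center.
    offset = (x_pixel - image_width / 2.0) / (image_width / 2.0)
    return offset * half_fov

def getting_closer(face_width_prev, face_width_now):
    # Crude check that the face occupies a growing share of the frame,
    # i.e. the robot is closing in on the person to be found.
    return face_width_now > face_width_prev

print(bearing_from_pixel(960, 1920))   # 0.0: person straight ahead (position 230)
print(bearing_from_pixel(480, 1920))   # -15.0: person to the left front (position 231)
```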
Step 106: send the position of the person to be found to the user terminal bound to the robot.
Specifically, once the robot reaches the position of the person to be found, it can take its own position as the position of the person to be found and send that position to the user, so that the user can confirm where the person is. In practical applications, after learning the accurate position of the person to be found, the user can go directly to that position, which is convenient and efficient.
Compared with the prior art, the main difference and effect of this embodiment are as follows: the mobile robot moves within the moving area and collects surrounding image information while moving; the collected information is then analyzed, and information matching the identification feature of the person to be found is sought. Once matching information is found, the direction of the person to be found is known and is taken as the robot's moving direction until the position of the person to be found is reached. The position of the person can then be sent to the user, so that the user finds the person as soon as possible. It can thus be seen that this embodiment enables the user to quickly find the person to be found while knowing only an approximate location. In addition, by taking the position at the time the person-finding instruction is received as the starting point, a moving area is delimited for the robot to search with emphasis, which helps to find the person to be found faster.
The second embodiment of the present invention relates to a robot person-finding method. The second embodiment is a further improvement on the basis of the first embodiment, the main improvement being that in the second embodiment of the present invention a route is planned for the user and the person to be found, making it easier for them to find each other.
The flow of the robot person-finding method in this embodiment is shown in Fig. 3 and is as follows:
Steps 301 to 306 in this embodiment are similar to steps 101 to 106 in the first embodiment and are not repeated here.
Step 307: plan a route between the position of the person to be found and the position of the user terminal.
Specifically, the route can be planned using a pre-stored map. In practical applications, the planned route can be shown on a display screen, sent to a preset terminal, and so on. In this embodiment, voice guidance information can also be sent according to the route to help the person to be found walk towards the user's position. Since the user and the person to be found may both move during the person-finding process, the route can be updated with position information obtained in real time, so that the user finds the person to be found more quickly.
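As a minimal sketch of step 307, the snippet below plans a placeholder straight-line route between the two positions and produces a guidance message that could be played back as voice guidance. A real system would query the pre-stored map for a walkable path; the planar coordinates and the wording are assumptions for the example.

```python
import math

def plan_route(person_pos, user_pos):
    """Return a trivial two-point route and its length.

    Positions are assumed to be planar (x, y) coordinates in meters; a real
    planner would use the pre-stored map rather than a straight segment.
    """
    length = math.dist(person_pos, user_pos)
    return [person_pos, user_pos], length

def guidance_text(length_m):
    # A short message that could be played back as voice guidance.
    return f"Please walk about {length_m:.0f} meters towards the other party."

route, length = plan_route((12.0, 5.0), (0.0, 0.0))
print(guidance_text(length))  # "Please walk about 13 meters towards the other party."
```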
It can thus be seen that this embodiment further specifies that, after obtaining the position of the person to be found, the robot can obtain a route to the user, making it easier for the person to be found and the user to find each other. It also specifies that voice guidance information can be sent, so the person to be found can follow the voice guidance and the user can find the person even more easily.
The third embodiment of the present invention relates to a robot person-finding method. The third embodiment is a further improvement on the basis of the first embodiment, the main improvement being that in the third embodiment of the present invention a return mechanism is added to the robot, which better meets the user's actual needs.
Specifically, while judging the collected image information, if no match is found, the search can continue. In this embodiment, during the continued judgment, the elapsed time can be further checked: if the search has gone on for too long without success, it is determined that the person cannot be found, the robot returns to the preset position, and the search is no longer needed. That is, if no image information matching the identification feature of the person to be found is detected in the collected image information for a preset duration, the robot moves to the preset position. In practical applications, the number of times the judgment step is repeated can also be used as the basis for deciding that the person cannot be found; these variants are not enumerated one by one here.
Furthermore, the preset position in this embodiment is the position of the user terminal bound to the robot. In practical applications, the preset position can also be a predetermined specific position, such as the starting point.
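The sketch below illustrates the return mechanism of this embodiment: the search loop gives up after a preset duration and signals that the robot should move back to the preset position. The callables and the 10-minute limit are placeholders and assumptions, not values from this embodiment.

```python
import time

def search_with_timeout(collect_frame, frame_matches_target, max_duration_s=600):
    """Search until the target is matched or the preset duration elapses.

    `collect_frame` and `frame_matches_target` stand for the image-collection
    and matching steps of the method; on timeout the caller is expected to
    drive the robot back to the preset position (for example, the position of
    the bound user terminal).
    """
    deadline = time.monotonic() + max_duration_s
    while time.monotonic() < deadline:
        frame = collect_frame()
        if frame_matches_target(frame):
            return "found"
    return "timed_out"
```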
It can thus be seen that this embodiment specifies that when the person cannot be found after the preset duration, the robot returns to a preset position.
The division of the above methods into steps is only for clarity of description. When implemented, the steps may be merged into a single step or split into several steps; as long as the same logical relationship is included, they all fall within the protection scope of this patent. Adding insignificant modifications to the algorithm or flow, or introducing insignificant designs, without changing the core design of the algorithm and flow, also falls within the protection scope of this patent.
The fourth embodiment of the present invention relates to a robot, as shown in Fig. 4, including:
at least one processor; and a memory communicatively connected to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor so that the at least one processor can perform the robot person-finding method of any one of the first to third embodiments.
The memory and the processor are connected by a bus. The bus may include any number of interconnected buses and bridges, and links together one or more processors and the various circuits of the memory. The bus may also link together various other circuits, such as peripheral devices, voltage regulators and power-management circuits, which are well known in the art and therefore not described further here. A bus interface provides an interface between the bus and a transceiver. The transceiver may be a single element or multiple elements, such as multiple receivers and transmitters, providing a unit for communicating with various other apparatuses over a transmission medium. Data processed by the processor are transmitted over a wireless medium via an antenna; the antenna also receives data and transfers the data to the processor.
The processor is responsible for managing the bus and general processing, and may also provide various functions including timing, peripheral interfaces, voltage regulation, power management and other control functions. The memory may be used to store data used by the processor when performing operations.
The fifth embodiment of the present invention relates to a computer-readable storage medium storing a computer program. The computer program, when executed by a processor, implements the above method embodiments.
That is, those skilled in the art will understand that all or part of the steps of the methods in the above embodiments can be completed by a program instructing the relevant hardware; the program is stored in a storage medium and includes several instructions for causing a device (which may be a microcontroller, a chip, etc.) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
Those skilled in the art will understand that the above embodiments are specific embodiments for realizing the present invention, and that in practical applications various changes in form and detail can be made to them without departing from the spirit and scope of the present invention.

Claims (10)

1. A robot person-finding method, characterized in that it is applied to a mobile robot and includes:
determining a moving area according to a preset position;
moving within the moving area and collecting image information;
analyzing the collected image information;
if image information matching an identification feature of a person to be found is detected in the collected image information, determining a travel direction according to the matched image information and moving until the position of the person to be found is reached;
sending the position of the person to be found to a user terminal bound to the robot.
2. The robot person-finding method according to claim 1, characterized in that after sending the position of the person to be found to the user terminal bound to the robot, the method further includes:
planning a route between the position of the person to be found and the position of the user terminal.
3. The robot person-finding method according to claim 2, characterized in that after planning the route between the position of the person to be found and the position of the user terminal, the method further includes:
sending voice guidance information according to the route.
4. The robot person-finding method according to claim 1, characterized in that determining the moving area according to the preset position is specifically:
when a person-finding instruction is received, taking the current position as a starting point and determining a region satisfying a preset condition as the moving area.
5. The robot person-finding method according to claim 4, characterized in that the preset condition is that the distance to the starting point is less than or equal to a preset value.
6. The robot person-finding method according to claim 1, characterized in that the method further includes: if no image information matching the identification feature of the person to be found is detected in the collected image information for a preset duration, moving to the preset position.
7. The robot person-finding method according to claim 6, characterized in that the preset position is the position of the user terminal bound to the robot.
8. The robot person-finding method according to claim 1, characterized in that the identification feature is a facial image.
9. A robot, characterized by including:
at least one processor; and
a memory communicatively connected to the at least one processor; wherein
the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor, so that the at least one processor is able to perform the robot person-finding method according to any one of claims 1 to 8.
10. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the robot person-finding method according to any one of claims 1 to 8.
CN201810219259.3A 2018-03-16 2018-03-16 Robot-based person-finding method and robot Pending CN108445881A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810219259.3A CN108445881A (en) 2018-03-16 2018-03-16 Robot-based person-finding method and robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810219259.3A CN108445881A (en) 2018-03-16 2018-03-16 Robot-based person-finding method and robot

Publications (1)

Publication Number Publication Date
CN108445881A true CN108445881A (en) 2018-08-24

Family

ID=63195653

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810219259.3A Pending CN108445881A (en) 2018-03-16 2018-03-16 Robot-based person-finding method and robot

Country Status (1)

Country Link
CN (1) CN108445881A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110163448A (en) * 2018-12-28 2019-08-23 山东浪潮商用***有限公司 Indoor-positioning-based intelligent tax-service robot and tax-guidance method
CN112008735A (en) * 2020-08-24 2020-12-01 北京云迹科技有限公司 Tour robot-based rescue method, device and system
CN115709468A (en) * 2022-11-16 2023-02-24 京东方科技集团股份有限公司 Guide control method and device, electronic equipment and readable storage medium
CN115963825A (en) * 2022-12-23 2023-04-14 美的集团(上海)有限公司 Intelligent device, control method and device thereof, and computer program product

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090147994A1 (en) * 2007-10-10 2009-06-11 Rakesh Gupta Toro: tracking and observing robot
CN103049734A (en) * 2011-10-12 2013-04-17 杜惠红 Method and system for finding person in public place
CN105654512A (en) * 2015-12-29 2016-06-08 深圳羚羊微服机器人科技有限公司 Target tracking method and device
CN106774326A (en) * 2016-12-23 2017-05-31 湖南晖龙股份有限公司 A kind of shopping guide robot and its shopping guide method
CN107172198A (en) * 2017-06-27 2017-09-15 联想(北京)有限公司 A kind of information processing method, apparatus and system
CN107278369A (en) * 2016-12-26 2017-10-20 深圳前海达闼云端智能科技有限公司 Person-finding method, device and communication system
CN107292240A (en) * 2017-05-24 2017-10-24 深圳市深网视界科技有限公司 Person-finding method and system based on face and human-body recognition
CN107316028A (en) * 2017-06-30 2017-11-03 广东工业大学 Method and device for quickly locating and tracking a specific face in a crowd based on a mobile terminal
CN107330369A (en) * 2017-05-27 2017-11-07 芜湖星途机器人科技有限公司 Human-body recognition robot
CN107770366A (en) * 2017-08-31 2018-03-06 珠海格力电器股份有限公司 Method and device for searching equipment, storage medium and equipment

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090147994A1 (en) * 2007-10-10 2009-06-11 Rakesh Gupta Toro: tracking and observing robot
CN103049734A (en) * 2011-10-12 2013-04-17 杜惠红 Method and system for finding person in public place
CN105654512A (en) * 2015-12-29 2016-06-08 深圳羚羊微服机器人科技有限公司 Target tracking method and device
CN106774326A (en) * 2016-12-23 2017-05-31 湖南晖龙股份有限公司 A kind of shopping guide robot and its shopping guide method
CN107278369A (en) * 2016-12-26 2017-10-20 深圳前海达闼云端智能科技有限公司 Person-finding method, device and communication system
CN107292240A (en) * 2017-05-24 2017-10-24 深圳市深网视界科技有限公司 Person-finding method and system based on face and human-body recognition
CN107330369A (en) * 2017-05-27 2017-11-07 芜湖星途机器人科技有限公司 Human-body recognition robot
CN107172198A (en) * 2017-06-27 2017-09-15 联想(北京)有限公司 A kind of information processing method, apparatus and system
CN107316028A (en) * 2017-06-30 2017-11-03 广东工业大学 Method and device for quickly locating and tracking a specific face in a crowd based on a mobile terminal
CN107770366A (en) * 2017-08-31 2018-03-06 珠海格力电器股份有限公司 Method and device for searching equipment, storage medium and equipment

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110163448A (en) * 2018-12-28 2019-08-23 山东浪潮商用***有限公司 Indoor-positioning-based intelligent tax-service robot and tax-guidance method
CN112008735A (en) * 2020-08-24 2020-12-01 北京云迹科技有限公司 Tour robot-based rescue method, device and system
CN115709468A (en) * 2022-11-16 2023-02-24 京东方科技集团股份有限公司 Guide control method and device, electronic equipment and readable storage medium
CN115963825A (en) * 2022-12-23 2023-04-14 美的集团(上海)有限公司 Intelligent device, control method and device thereof, and computer program product
CN115963825B (en) * 2022-12-23 2024-03-26 美的集团(上海)有限公司 Intelligent device, control method and device thereof and computer program product

Similar Documents

Publication Publication Date Title
CN108445881A (en) Robot-based person-finding method and robot
CN110146100B (en) Trajectory prediction method, apparatus and storage medium
WO2023065395A1 (en) Work vehicle detection and tracking method and system
CN105513403A (en) Method and system for finding vehicle in parking lot based on image recognition
CN111784729B (en) Object tracking method and device, electronic equipment and storage medium
KR20210006511A (en) Lane determination method, device and storage medium
CN110497901A (en) A kind of parking position automatic search method and system based on robot VSLAM technology
WO2019148491A1 (en) Human-computer interaction method and device, robot, and computer readable storage medium
CN109540155A (en) Path planning and navigation method for a sweeping robot, computer device and computer-readable storage medium
CN106205161A (en) traffic information transmission method and device
CN110021033A (en) Target tracking method based on a pyramid twin network
CN104766490A (en) Intelligent vehicle finding system
CN107370989A (en) Target seeking method and server
CN110533685A (en) Object tracking method and apparatus, storage medium and electronic device
CN110533700A (en) Object tracking method and apparatus, storage medium and electronic device
CN112596024B (en) Motion identification method based on environment background wireless radio frequency signal
CN109141453A (en) A kind of route guiding method and system
CN109615904A (en) Parking management method, device, computer equipment and storage medium
WO2023197232A1 (en) Target tracking method and apparatus, electronic device, and computer readable medium
CN113705417B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN115049954B (en) Target identification method, device, electronic equipment and medium
CN110335495A (en) Vehicle position information processing method, device, computer equipment and storage medium
CN106935059A (en) Vehicle-finding system based on positioning, vehicle-finding method and position determination method
WO2023236514A1 (en) Cross-camera multi-object tracking method and apparatus, device, and medium
CN109147379A (en) Intelligent parking garage guidance terminal and control method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20180824