CN114428501B - Home service robot for the elderly and control method thereof - Google Patents


Info

Publication number
CN114428501B
CN114428501B (granted from application CN202111554673A)
Authority
CN
China
Prior art keywords
robot · acquired · control · indoor map · control instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111554673.8A
Other languages
Chinese (zh)
Other versions
CN114428501A (en)
Inventor
孙贇
姚郁巍
苏瑞
衡进
Current Assignee
Chongqing Terminus Technology Co Ltd
Original Assignee
Chongqing Terminus Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Chongqing Terminus Technology Co Ltd filed Critical Chongqing Terminus Technology Co Ltd
Priority to CN202111554673.8A priority Critical patent/CN114428501B/en
Publication of CN114428501A publication Critical patent/CN114428501A/en
Application granted granted Critical
Publication of CN114428501B publication Critical patent/CN114428501B/en
Legal status: Active

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0257 - Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Manipulator (AREA)

Abstract

The application provides a home service robot for the elderly and a control method thereof. The method comprises the following steps: acquiring an indoor map on which the bed position, medicine storage location, takeout pickup location, cup placement location, and sanitary article storage location are marked; receiving a control instruction input by voice, remote control, or manual operation, parsing it, and converting it into text information; extracting keywords from the text information and finding the marker point corresponding to the object to be fetched on the indoor map; controlling the robot, according to its position and the marker point of the object, to move near that marker point, and controlling the mechanical arm through laser ranging to grip the object; and controlling the robot, according to its position and the marker point of the delivery destination, to carry the object to the delivery destination. The robot thus provides intelligent home care for bedridden elderly people with disabilities: it can be commanded remotely by voice, mobile phone, and other means to complete basic care services such as fetching medicine, delivering water, and delivering food.

Description

Home service robot for the elderly and control method thereof
Technical Field
The application relates to the field of robotics, and in particular to a home service robot for the elderly and a control method thereof.
Background
With the continuous development of the robotics field, household robot products are being brought to families at ever lower prices. Existing household robots offer autonomous walking with collision avoidance, voice and touch-screen interaction, and integrated safety-monitoring sensors, and can provide early education, tutoring, customized information query and push, safety monitoring, psychological counseling, auxiliary medical diagnosis, and other functions according to the needs of users of different ages. Meanwhile, as China's population continues to age, the number of elderly people is growing rapidly and the number of empty-nest elderly keeps rising, so ensuring a healthy life for the elderly has become a pressing social problem.
Existing household service robots for the elderly can only complete simple tasks such as video monitoring, smoke alarms, heart-rate alarms, and companion chat. Bedridden elderly people with disabilities, however, frequently need food, water, medicine, and other supplies, and these supplies are not necessarily placed near the bedside, so the elderly cannot reach them on their own.
Disclosure of Invention
In view of the above, the purpose of the application is to provide a control method for an elderly home service robot that directly addresses the home-care needs of bedridden elderly people.
Based on the above object, the application provides a control method for an elderly home service robot, comprising the following steps:
acquiring an indoor map on which the bed position, medicine storage location, takeout pickup location, cup placement location, and sanitary article storage location are marked;
receiving a control instruction input by voice, remote control, or manual operation, parsing it, and converting it into text information;
extracting keywords from the text information and matching them against a preset control intent library to obtain a control intent, the control intent comprising the name of the object to be fetched and the delivery destination, and finding the marker point corresponding to the object on the indoor map according to that name;
performing a first path planning on the indoor map according to the robot's position and the marker point of the object to be fetched, controlling the robot to move near that marker point, and controlling the mechanical arm through machine vision to grip the object;
and performing a second path planning on the indoor map according to the robot's position and the marker point of the delivery destination, and controlling the robot to carry the object to the delivery destination.
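The five steps above can be sketched as a single control pipeline. The following is a minimal illustration, not the patented implementation: the names (`MARKERS`, `INTENT_LIBRARY`, `plan_path`, `handle_instruction`) and all coordinate values are hypothetical, and path planning is reduced to a placeholder.

```python
from dataclasses import dataclass

@dataclass
class Intent:
    item: str          # name of the object to fetch
    destination: str   # delivery destination marker

# Hypothetical marker points on the indoor map: label -> (x, y).
MARKERS = {"bed": (0.5, 1.0), "medicine": (4.0, 2.5), "cup": (3.0, 0.5)}

# Hypothetical control intent library: keyword -> control intent.
INTENT_LIBRARY = {
    "medicine": Intent("medicine", "bed"),
    "water": Intent("cup", "bed"),
}

def parse_intent(text: str) -> Intent:
    """Steps 2-3: convert instruction text into a control intent."""
    for keyword, intent in INTENT_LIBRARY.items():
        if keyword in text:
            return intent
    raise ValueError("no matching control intent")

def plan_path(start, goal):
    """Placeholder for steps 4-5; real planning uses turning points."""
    return [start, goal]

def handle_instruction(text: str, robot_pos):
    """End-to-end sketch of the five-step control flow."""
    intent = parse_intent(text)
    pick = MARKERS[intent.item]                       # marker of the object
    first_path = plan_path(robot_pos, pick)           # move to the object
    second_path = plan_path(pick, MARKERS[intent.destination])  # deliver it
    return first_path, second_path
```

A call such as `handle_instruction("please fetch my medicine", (0.0, 0.0))` yields the two legs of the errand: robot to medicine, then medicine to bed.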
Further, acquiring the indoor map on which the bed position, medicine storage location, takeout pickup location, cup placement location, and sanitary article storage location are marked comprises:
the robot traversing each indoor room, avoiding obstacles with a laser radar, and scanning the entire indoor space with its 360° camera to obtain an indoor space scan image;
building the indoor map from the robot's internal positioning module and travel track, combined with laser-radar ranging of walls and doors;
capturing indoor object images from the scan image with an image recognition algorithm and feeding them to a neural network classification module to identify beds, medicines, doors, water cups, and sanitary articles;
and locating the positions of the beds, medicines, doors, cups, and sanitary articles with laser-radar ranging and marking them on the indoor map.
Alternatively, acquiring the indoor map on which the bed position, medicine storage location, takeout pickup location, cup placement location, and sanitary article storage location are marked comprises:
the robot receiving an indoor map sent by an external server or terminal, on which the bed position, medicine storage location, takeout pickup location, cup placement location, and sanitary article storage location have been marked manually through software or an APP.
Further, receiving a control instruction input by voice, remote control, or manual operation, parsing it, and converting it into text information comprises:
receiving a voice-input control instruction, parsing it with a speech recognition algorithm, and converting it into text information; or
receiving a remote-control instruction, parsing it according to a preset remote-control command code lookup table, and converting it into text information; or
receiving a control instruction manually entered by the user through software or an APP on a computer or mobile phone and sent to the robot, parsing it, and converting it into text information.
Further, extracting keywords from the text information and matching them against a preset control intent library to obtain a control intent comprising the name of the object to be fetched and the delivery destination, and finding the marker point corresponding to the object on the indoor map according to that name, comprises:
extracting the action, time, object name, and place information in the text information as keywords;
looking up the keywords in a preset correspondence between keywords and device control commands, and taking the matched device control command as the control intent, which comprises the name of the object to be fetched and the delivery destination;
and matching the object name against the label information of all marker points on the indoor map to find the marker point whose label matches.
Further, performing the first path planning on the indoor map according to the robot's position and the marker point of the object to be fetched, controlling the robot to move near that marker point, and controlling the mechanical arm through machine vision to grip the object comprises:
marking the minimum number of turning points between the robot's position and the marker point of the object, and completing the first path planning by connecting each pair of consecutive turning points with a straight line;
controlling the robot to move near the marker point of the object along the first planned path;
and calculating the distance between the mechanical arm and the object by laser-radar ranging, moving the arm by the calculated distance, opening the mechanical gripper to grip the object, and, once the grip is secure, issuing a voice or light prompt that gripping is complete.
Further, performing the second path planning on the indoor map according to the robot's position and the marker point of the delivery destination, and controlling the robot to carry the object to the delivery destination, comprises:
marking the minimum number of turning points between the robot's position and the marker point of the delivery destination, and completing the second path planning by connecting each pair of consecutive turning points with a straight line;
and controlling the robot to move near the delivery destination along the second planned path and to issue a voice or light prompt that the task is complete.
Based on the above object, the application also provides an elderly home service robot, comprising:
a map acquisition module for acquiring an indoor map on which the bed position, medicine storage location, takeout pickup location, cup placement location, and sanitary article storage location are marked;
an instruction receiving module for receiving a control instruction input by voice, remote control, or manual operation, parsing it, and converting it into text information;
an intent acquisition module for extracting keywords from the text information, matching them against a preset control intent library to obtain a control intent comprising the name of the object to be fetched and the delivery destination, and finding the marker point corresponding to the object on the indoor map according to that name;
a target grasping module for performing a first path planning on the indoor map according to the robot's position and the marker point of the object, controlling the robot to move near that marker point, and controlling the mechanical arm through machine vision to grip the object;
and a delivery execution module for performing a second path planning on the indoor map according to the robot's position and the marker point of the delivery destination, and controlling the robot to carry the object to the delivery destination.
Overall, the advantages of the application and the experience it brings to users are as follows:
it realizes intelligent home care for bedridden elderly people with disabilities; the robot can be commanded remotely by voice, mobile phone, and other means to complete basic care services such as fetching medicine, delivering water, and delivering food.
Drawings
In the drawings, the same reference numerals refer to the same or similar parts or elements throughout the several views unless otherwise specified. The figures are not necessarily drawn to scale. It is appreciated that these drawings depict only some embodiments according to the disclosure and are not therefore to be considered limiting of its scope.
Fig. 1 shows a schematic diagram of the system architecture of the present application.
Fig. 2 shows a flowchart of a control method of an elderly home service robot according to an embodiment of the present application.
Fig. 3 shows a block diagram of an elderly home service robot according to an embodiment of the present application.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 5 shows a schematic diagram of a storage medium according to an embodiment of the present application.
Detailed Description
The application is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be noted that, for convenience of description, only the portions related to the present application are shown in the drawings.
It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other. The application will be described in detail below with reference to the drawings in connection with embodiments.
Fig. 1 shows a schematic diagram of the system architecture of the present application. In an embodiment of the application, an indoor map is obtained on which the bed position, medicine storage location, takeout pickup location, cup placement location, and sanitary article storage location are marked; a control instruction input by voice, remote control, or manual operation is received, parsed, and converted into text information; keywords are extracted from the text information and the marker point corresponding to the object to be fetched is found on the indoor map; the robot is controlled, according to its position and the marker point of the object, to move near that marker point, and the mechanical arm is controlled through laser ranging to grip the object; and the robot is controlled, according to its position and the marker point of the delivery destination, to carry the object to the delivery destination.
Fig. 2 shows a flowchart of a control method of an elderly home service robot according to an embodiment of the present application. As shown in fig. 2, the control method comprises:
Step 101: an indoor map is acquired on which the bed position, medicine storage location, takeout pickup location, cup placement location, and sanitary article storage location are marked.
The application provides two ways of acquiring the indoor map: one active, one passive.
In the first, active way, the robot traverses each indoor room, avoiding obstacles with a laser radar, and scans the entire indoor space with its 360° camera to obtain an indoor space scan image; the indoor map is built from the robot's internal positioning module and travel track, combined with laser-radar ranging of walls and doors; indoor object images are captured from the scan image with an image recognition algorithm and fed to a neural network classification module to identify beds, medicines, doors, water cups, and sanitary articles; and the positions of the beds, medicines, doors, cups, and sanitary articles are located with laser-radar ranging and marked on the indoor map.
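The marking step can be sketched with a little trigonometry: a laser-radar range reading is projected into map coordinates using the robot's pose from the positioning module. The function name and the example pose and reading below are illustrative assumptions, not the embodiment's actual code.

```python
import math

def mark_landmark(robot_pose, bearing_deg, distance_m):
    """Project a lidar range reading into indoor-map coordinates.

    robot_pose: (x, y, heading_deg) from the positioning module.
    bearing_deg: direction of the detected object relative to the heading.
    distance_m: laser-radar measured distance to the object.
    """
    x, y, heading = robot_pose
    theta = math.radians(heading + bearing_deg)
    return (x + distance_m * math.cos(theta), y + distance_m * math.sin(theta))

# Example: robot at (2, 3) facing 0°, a bed detected 30° to the left at 2 m.
bed_pos = mark_landmark((2.0, 3.0, 0.0), 30.0, 2.0)
indoor_map = {"bed": bed_pos}  # the marker stored on the indoor map
```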
In this embodiment, object detection in the captured images may be performed with OpenCV. For example, OpenCV's flexible dynamic data structures are used to extract target feature points and compute descriptors for the acquired image; the descriptors are then matched against the data in the database to complete target recognition, and the result is shown on the LCD. OpenCV provides hundreds of C function application programming interfaces, does not depend on external libraries, can run independently, and offers high performance, with its algorithms written in C or C++. In this embodiment, application development is done with OpenCV on an embedded ARM platform: OpenCV is cross-compiled and ported, the compiled dynamic link library is downloaded to the embedded Linux file system, and the relevant OpenCV functions become usable once the OpenCV environment variables are configured.
Specifically, this embodiment completes image target recognition with the feature point detectors, feature point descriptors, and matching framework of OpenCV's feature2d module; a database of feature descriptors for the relevant objects should be established in advance. Since embedded Qt cannot support GTK+ windows, the GTK+-based image display function cvShowImage in OpenCV is masked out during cross-compilation, so that function cannot be used to display image data under embedded Qt. Instead, the image data in OpenCV's IplImage structure is converted to 32-bit true-color four-channel image data and displayed with the relevant Qt classes.
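The matching framework can be illustrated, independently of OpenCV, by nearest-neighbour descriptor matching with Lowe's ratio test (a match is accepted only when the best candidate is clearly closer than the second best). This pure-Python sketch shows only the general technique; it is an assumption, not the embodiment's code, and it requires a database of at least two descriptors.

```python
def match_descriptors(query, database, ratio=0.75):
    """Nearest-neighbour descriptor matching with Lowe's ratio test.

    query/database: lists of equal-length numeric descriptor vectors
    (database must contain at least two entries).
    Returns (query_index, database_index) pairs that pass the test.
    """
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    matches = []
    for qi, q in enumerate(query):
        # Rank database descriptors by distance to the query descriptor.
        ranked = sorted(range(len(database)), key=lambda di: dist(q, database[di]))
        best, second = ranked[0], ranked[1]
        # Accept only unambiguous matches: best clearly beats second best.
        if dist(q, database[best]) < ratio * dist(q, database[second]):
            matches.append((qi, best))
    return matches
```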
The neural network classification module in this embodiment uses a convolutional neural network for fine-grained object classification. The steps are: train a deep convolutional network on the original input images; detect objects from the feature maps of a hidden layer; crop image patches tightly around the detected objects based on the detection result; and finally construct a convolutional network initialized from the original training network and merge the two into a combined convolutional neural network for classification and detection. The application can therefore handle object detection against a variety of backgrounds, provides a matching framework for the convolutional neural network, and improves object detection accuracy.
In the second, passive way, the robot receives an indoor map sent by an external server or terminal, on which the bed position, medicine storage location, takeout pickup location, cup placement location, and sanitary article storage location have been marked manually through software or an APP.
Step 102: a control instruction input by voice, remote control, or manual operation is received, parsed, and converted into text information, in one of the following ways.
In the first, voice-control way, a voice-input control instruction is received, parsed with a speech recognition algorithm, and converted into text information. Speech recognition is well studied across many fields, and any existing algorithm that can recognize voice control instructions can be used here, so it is not described in detail.
The second, remote-control way works like the infrared remote controls of televisions, air conditioners, and similar appliances: a remote-control instruction is received, parsed according to a preset remote-control command code lookup table, and converted into text information. The lookup table is downloaded in advance from an external server or a mobile terminal such as a mobile phone.
In the third, manual way, a control instruction manually entered by the user through software or an APP on a computer or mobile phone and sent to the robot is received, parsed, and converted into text information.
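The remote-control command code lookup table can be sketched as a simple mapping from received codes to text information. The codes and command strings below are invented for illustration; a real table would be downloaded from the server as described above.

```python
# Hypothetical remote-control command code lookup table (codes and
# command texts are illustrative assumptions, not from the patent).
REMOTE_CODE_TABLE = {
    0x10: "fetch medicine to bed",
    0x11: "fetch water cup to bed",
    0x12: "fetch takeout to bed",
}

def parse_remote_code(code: int) -> str:
    """Translate a received remote-control code into text information."""
    try:
        return REMOTE_CODE_TABLE[code]
    except KeyError:
        raise ValueError(f"unknown remote code: {code:#x}")
```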
Step 103: keywords are extracted from the text information and matched against a preset control intent library to obtain a control intent comprising the name of the object to be fetched and the delivery destination, and the marker point corresponding to the object is found on the indoor map according to that name, as follows:
the action, time, object name, and place information in the text information are extracted as keywords;
the keywords are looked up in a preset correspondence between keywords and device control commands, and the matched device control command is taken as the control intent, which comprises the name of the object to be fetched and the delivery destination;
and the object name is matched against the label information of all marker points on the indoor map to find the marker point whose label matches.
In the present application, since the robot's purpose is home service for the elderly, the main keywords, such as water, medicines, and sanitary products, are nouns, while the actions are verbs. For example, in the command "please take the water in the living room for me", the keywords are "living room" and "water". The keywords extracted by the application can thus be actions, times, object names, places, and so on. Extraction can follow a predetermined strategy; for instance, particles and pronouns such as "yes", "no", "you", "me", and "he" can be ignored. Any existing extraction strategy may be used, and the application does not limit the specific implementation.
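A toy version of this extraction strategy might filter out particles and pronouns and keep only words from a known keyword vocabulary. The word lists below are assumptions made for illustration, not part of the patent.

```python
# Illustrative word lists (assumptions): particles/pronouns to ignore,
# and the action/item/place vocabulary to keep as keywords.
STOP_WORDS = {"please", "the", "for", "me", "my", "a", "to", "you", "he", "she"}
KNOWN_KEYWORDS = {"take", "bring", "water", "medicine", "living", "room",
                  "bed", "cup", "kitchen"}

def extract_keywords(text: str) -> list:
    """Keep vocabulary words, drop stop words, preserve order."""
    words = text.lower().rstrip(".!?").split()
    return [w for w in words if w in KNOWN_KEYWORDS and w not in STOP_WORDS]
```

For the example command in the text, `extract_keywords("Please take the water in living room for me")` keeps the action, the item, and the place words.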
Step 104: a first path planning is performed on the indoor map according to the robot's position and the marker point of the object to be fetched, the robot is controlled to move near that marker point, and the mechanical arm is controlled through machine vision to grip the object, as follows:
the minimum number of turning points between the robot's position and the marker point of the object is marked, and the first path planning is completed by connecting each pair of consecutive turning points with a straight line; as shown in fig. 1, a shortest path around the obstacles can be planned between the robot and the medicine, saving the time the robot takes to reach the target object;
the robot is controlled to move near the marker point of the object along the first planned path;
and the distance between the mechanical arm and the object is calculated by laser-radar ranging, the arm is moved by the calculated distance, the mechanical gripper is opened to grip the object, and, once the grip is secure, a voice or light prompt that gripping is complete is issued.
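One way to realize the turning-point idea, sketched here under the assumption of an axis-aligned occupancy grid (the patent does not specify the map representation), is to find a shortest grid path by breadth-first search and then keep only the waypoints where the direction changes, so each pair of retained points is connected by a straight segment.

```python
from collections import deque

def plan_turning_points(grid, start, goal):
    """BFS shortest path on an occupancy grid, reduced to turning points.

    grid: list of strings, '#' marks an obstacle; start/goal: (row, col).
    Assumes the goal is reachable from the start.
    """
    rows, cols = len(grid), len(grid[0])
    prev, queue, seen = {}, deque([start]), {start}
    while queue:
        cur = queue.popleft()
        if cur == goal:
            break
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] != '#' and nxt not in seen):
                seen.add(nxt)
                prev[nxt] = cur
                queue.append(nxt)
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    path.reverse()
    # Keep only turning points: drop waypoints where the direction persists.
    turns = [path[0]]
    for a, b, c in zip(path, path[1:], path[2:]):
        if (b[0] - a[0], b[1] - a[1]) != (c[0] - b[0], c[1] - b[1]):
            turns.append(b)
    turns.append(path[-1])
    return turns
```

On a small grid with a central obstacle, the planner returns just the start, one corner, and the goal, matching the "connect turning points with straight lines" description.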
Step 105: a second path planning is performed on the indoor map according to the robot's position and the marker point of the delivery destination, and the robot is controlled to carry the object to the delivery destination, as follows:
the minimum number of turning points between the robot's position and the marker point of the delivery destination is marked, and the second path planning is completed by connecting each pair of consecutive turning points with a straight line; as shown in fig. 1, a shortest path around the obstacles can be planned between the medicine and the bed, saving the time the robot takes to carry the medicine to the bedside;
and the robot is controlled to move near the delivery destination along the second planned path and to issue a voice or light prompt that the task is complete.
This realizes intelligent home care for bedridden elderly people with disabilities; the robot can be commanded remotely by voice, mobile phone, and other means to complete basic care services such as fetching medicine, delivering water, and delivering food.
An embodiment of the application provides an elderly home service robot configured to execute the control method described in the foregoing embodiment. As shown in fig. 3, the robot comprises:
a map acquisition module 501 for acquiring an indoor map on which the bed position, medicine storage location, takeout pickup location, cup placement location, and sanitary article storage location are marked;
an instruction receiving module 502 for receiving a control instruction input by voice, remote control, or manual operation, parsing it, and converting it into text information;
an intent acquisition module 503 for extracting keywords from the text information, matching them against a preset control intent library to obtain a control intent comprising the name of the object to be fetched and the delivery destination, and finding the marker point corresponding to the object on the indoor map according to that name;
a target grasping module 504 for performing a first path planning on the indoor map according to the robot's position and the marker point of the object, controlling the robot to move near that marker point, and controlling the mechanical arm through machine vision to grip the object;
and a delivery execution module 505 for performing a second path planning on the indoor map according to the robot's position and the marker point of the delivery destination, and controlling the robot to carry the object to the delivery destination.
The control method of the elderly people home service robot provided by the embodiment of the application has the same beneficial effects as the method adopted, operated or realized by the stored application program because of the same inventive concept.
An embodiment of the application further provides an electronic device corresponding to the control method of the elderly home service robot provided by the foregoing embodiment, for executing that control method; this embodiment of the application is not limited thereto.
Referring to fig. 4, a schematic diagram of an electronic device according to some embodiments of the present application is shown. As shown in fig. 4, the electronic device 2 includes: a processor 200, a memory 201, a bus 202 and a communication interface 203, the processor 200, the communication interface 203 and the memory 201 being connected by the bus 202; the memory 201 stores a computer program that can be executed by the processor 200, and the processor 200 executes the control method of the elderly home service robot according to any one of the embodiments of the present application when executing the computer program.
The memory 201 may include a high-speed random access memory (RAM) and may further include a non-volatile memory, such as at least one disk memory. The communication connection between the system network element and at least one other network element is implemented via at least one communication interface 203 (which may be wired or wireless); the internet, a wide area network, a local area network, a metropolitan area network, etc. may be used.
Bus 202 may be an ISA bus, a PCI bus, an EISA bus, or the like. The bus may be classified into an address bus, a data bus, a control bus, etc. The memory 201 is configured to store a program; after receiving an execution instruction, the processor 200 executes the program. The control method of the elderly home service robot disclosed in any of the foregoing embodiments of the present application may be applied to, or implemented by, the processor 200.
The processor 200 may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor 200 or by instructions in the form of software. The processor 200 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), etc.; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. Such a processor may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be executed directly by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software modules may be located in a random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, registers, or another storage medium well known in the art. The storage medium is located in the memory 201; the processor 200 reads the information in the memory 201 and, in combination with its hardware, performs the steps of the above method.
Because the electronic device provided by this embodiment of the application is based on the same inventive concept as the control method of the elderly home service robot provided by the embodiment of the application, it has the same beneficial effects as the method it adopts, runs or implements.
An embodiment of the present application further provides a computer readable storage medium corresponding to the control method of the elderly home service robot provided in the foregoing embodiment. Referring to fig. 5, the computer readable storage medium is shown as an optical disc 30 on which a computer program (i.e., a program product) is stored; when executed by a processor, the computer program performs the control method of the elderly home service robot provided in any of the foregoing embodiments.
It should be noted that examples of the computer readable storage medium may also include, but are not limited to, a phase change memory (PRAM), a Static Random Access Memory (SRAM), a Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a flash memory, or other optical or magnetic storage medium, which will not be described in detail herein.
Because the computer readable storage medium provided by the above embodiment of the present application is based on the same inventive concept as the control method of the elderly home service robot provided by the embodiment of the present application, it has the same beneficial effects as the method adopted, run or implemented by the application program stored thereon.
It should be noted that:
The algorithms and displays presented herein are not inherently related to any particular computer, virtual system, or other apparatus. Various general-purpose systems may also be used with the teachings herein. The structure required to construct such a system is apparent from the description above. In addition, the present application is not directed to any particular programming language. It will be appreciated that the teachings of the present application described herein may be implemented in a variety of programming languages, and the above description of specific languages is provided to disclose the enablement and best mode of the present application.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the above description of exemplary embodiments of the application, various features of the application are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed application requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this application.
Those skilled in the art will appreciate that the modules in the apparatus of the embodiments may be adaptively changed and disposed in one or more apparatuses different from the embodiments. The modules or units or components of the embodiments may be combined into one module or unit or component and, furthermore, they may be divided into a plurality of sub-modules or sub-units or sub-components. Any combination of all features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be used in combination, except insofar as at least some of such features and/or processes or units are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the application and form different embodiments. For example, in the following claims, any of the claimed embodiments can be used in any combination.
Various component embodiments of the application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some or all of the components in a virtual machine creation system according to embodiments of the application may be implemented in practice using a microprocessor or Digital Signal Processor (DSP). The present application can also be implemented as an apparatus or system program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the present application may be stored on a computer readable medium, or may have the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the application, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily recognize that various changes and substitutions are possible within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (8)

1. A control method of an elderly home service robot, characterized by comprising the following steps:
Acquiring an indoor map, wherein the indoor map marks a bed position, a medicine storage place, a takeout collection place, a cup placement place, and a sanitary article storage place;
Receiving a control instruction input by voice, remote control or manual operation, parsing the control instruction, and converting it into text information;
Extracting keywords from the text information and matching them in a preset control intention library to obtain a control intention, wherein the control intention comprises the name of the object to be acquired and the delivery destination point, and searching the indoor map for the mark point corresponding to the object to be acquired according to the object name;
Performing a first path planning on the indoor map according to the position of the robot and the mark point position of the object to be acquired, controlling the robot to move to the vicinity of the mark point of the object to be acquired, and controlling a mechanical arm to clamp the object to be acquired through laser ranging, comprising: marking the minimum number of turning points between the position of the robot and the mark point position of the object to be acquired, and completing the first path planning by connecting every two adjacent turning points with a straight line; controlling the robot to move to the vicinity of the mark point position of the object to be acquired along the route of the first path planning; calculating the distance between the mechanical arm and the object to be acquired through laser radar ranging, controlling the mechanical arm to move by the calculated distance, unfolding a mechanical gripper to clamp the object to be acquired, and, after the clamping is fixed, issuing a voice or light prompt that clamping is completed;
Performing a second path planning on the indoor map according to the position of the robot and the mark point position of the delivery destination point, and controlling the robot to deliver the object to be acquired to the delivery destination point, comprising: marking the minimum number of turning points between the position of the robot and the mark point position of the delivery destination point, and completing the second path planning by connecting every two adjacent turning points with a straight line; and controlling the robot to move to the vicinity of the delivery destination point along the route of the second path planning, and issuing a voice or light prompt that the task is completed.
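Once the turning points have been marked, both claimed path plannings reduce to connecting consecutive points with straight segments. A minimal sketch of that step (function and parameter names are illustrative; selecting the minimum set of turning points around obstacles is assumed to be done beforehand by the map layer):

```python
import math


def plan_path(start, goal, turning_points):
    # Connect start -> turning points -> goal with straight-line
    # segments, as in the claimed first and second path plannings.
    # start, goal and each turning point are (x, y) map coordinates.
    waypoints = [start, *turning_points, goal]
    # Total route length: sum of the Euclidean lengths of the segments.
    length = sum(math.dist(a, b) for a, b in zip(waypoints, waypoints[1:]))
    return waypoints, length
```

With one turning point at (4, 0) between (0, 0) and (4, 3), the route is two perpendicular segments whose lengths simply add up.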
2. The method of claim 1, wherein acquiring the indoor map that marks the bed position, the medicine storage place, the takeout collection place, the cup placement place and the sanitary article storage place comprises:
the robot traverses each indoor room, avoiding obstacles by means of a laser radar, and scans the entire indoor space through the robot's 360-degree camera to obtain an indoor space scanning image;
establishing an indoor map according to a positioning module and the running track of the robot, combined with laser radar ranging of walls and doors;
capturing indoor article images in the indoor space scanning image according to an image recognition algorithm and inputting them into a neural network classification module to obtain images of the bed, medicines, doors, cups, and sanitary articles;
locating the positions of the bed, medicines, doors, cups, and sanitary articles using laser radar ranging, and marking them on the indoor map.
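The final marking step above amounts to projecting a laser-radar range/bearing measurement from the robot's pose into map coordinates. A minimal sketch under that assumption (all names are illustrative, not from the patent):

```python
import math


def mark_object(indoor_map, label, robot_pose, bearing_rad, range_m):
    # Place a recognized object on the indoor map from a laser-radar
    # range/bearing measurement taken at the robot's current pose.
    # robot_pose = (x, y, heading_rad); indoor_map = {label: (x, y)}.
    x, y, heading = robot_pose
    obj_x = x + range_m * math.cos(heading + bearing_rad)
    obj_y = y + range_m * math.sin(heading + bearing_rad)
    indoor_map[label] = (round(obj_x, 3), round(obj_y, 3))
    return indoor_map
```

For example, an object seen dead ahead (bearing 0) at 2 m from a robot at the map origin facing along the x-axis lands at (2.0, 0.0).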
3. The method of claim 1, wherein acquiring the indoor map that marks the bed position, the medicine storage place, the takeout collection place, the cup placement place and the sanitary article storage place comprises:
the robot receiving an indoor map sent by an external server or terminal, on which the bed position, medicine storage place, takeout collection place, cup placement place and sanitary article storage place have been marked manually through software or an APP.
4. The method according to claim 2 or 3, wherein receiving the control instruction input by voice, remote control or manual operation, parsing the control instruction, and converting it into text information comprises:
receiving a control instruction input by voice, parsing it with a speech recognition algorithm, and converting it into text information; or
receiving a control instruction input by remote control, parsing it according to a preset remote-control command-code lookup table, and converting it into text information; or
receiving a control instruction manually input by a user through software or an APP and sent to the robot via a computer or mobile phone, parsing it, and converting it into text information.
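The three input channels above can be sketched as a single dispatch function. The speech-recognition step is replaced by a stub, and the command-code table format is an assumption for illustration, not part of the patent:

```python
def parse_control_instruction(channel, payload, remote_code_table=None):
    # Convert a voice / remote-control / manual instruction into text.
    # channel: "voice", "remote", or "manual"; names are illustrative.
    if channel == "voice":
        # Stand-in for a speech-recognition result, assumed already text.
        return payload
    if channel == "remote":
        # Look the command code up in the preset lookup table.
        return remote_code_table[payload]
    if channel == "manual":
        # Text typed by the user in software or an APP.
        return payload.strip()
    raise ValueError(f"unknown channel: {channel}")
```

All three channels converge on the same text representation, which is what the intent-matching step consumes.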
5. The method of claim 4, wherein extracting keywords from the text information and matching them in the preset control intention library to obtain the control intention, wherein the control intention comprises the name of the object to be acquired and the delivery destination point, and searching the indoor map for the mark point corresponding to the object to be acquired according to the object name, comprises:
extracting the action, time, article name and place information in the text information as keywords;
searching for the keywords in a preset correspondence between keywords and device control commands, and obtaining the device control command from the matched correspondence information as the control intention, the control intention comprising the name of the object to be acquired and the delivery destination point;
and matching the name of the object to be acquired against the mark information of all mark points in the indoor map, and finding the mark point corresponding to the matched mark information.
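The keyword extraction and correspondence-table lookup above can be sketched as follows. The slot names ("action", "item", "place") and the table layout are illustrative assumptions:

```python
def extract_keywords(text, vocabulary):
    # Pick out the action / time / item-name / place words that occur
    # in the text; `vocabulary` maps each known keyword to its slot.
    found = {}
    for word, slot in vocabulary.items():
        if word in text:
            found.setdefault(slot, word)
    return found


def match_intent(keywords, intent_table):
    # Look the extracted item name up in the preset keyword ->
    # control-command correspondence table to obtain the control
    # intention (item to fetch plus delivery destination).
    item = keywords.get("item")
    if item is None or item not in intent_table:
        return None
    return {"item": item, "destination": intent_table[item]["destination"]}
```

The resulting item name is then matched against the mark information of the indoor map's marked points, as the claim describes.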
6. An elderly home service robot employing the method of any of claims 1-5, comprising:
a map acquisition module, configured to acquire an indoor map that marks the bed position, the medicine storage place, the takeout collection place, the cup placement place and the sanitary article storage place;
an instruction receiving module, configured to receive a control instruction input by voice, remote control or manual operation, parse the control instruction, and convert it into text information;
an intention acquisition module, configured to extract keywords from the text information and match them in a preset control intention library to obtain a control intention, the control intention comprising the name of the object to be acquired and the delivery destination point, and to search the indoor map for the mark point corresponding to the object to be acquired according to the object name;
a grabbing target module, configured to perform a first path planning on the indoor map according to the position of the robot and the mark point position of the object to be acquired, control the robot to move to the vicinity of that mark point, and control the mechanical arm to clamp the object to be acquired through laser ranging;
and a delivery execution module, configured to perform a second path planning on the indoor map according to the position of the robot and the mark point position of the delivery destination point, and control the robot to deliver the object to be acquired to the delivery destination point.
7. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor runs the computer program to implement the method of any one of claims 1-5.
8. A computer readable storage medium having stored thereon a computer program, wherein the program is executed by a processor to implement the method of any of claims 1-5.
CN202111554673.8A 2021-12-17 2021-12-17 Old man home service robot and control method thereof Active CN114428501B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111554673.8A CN114428501B (en) 2021-12-17 2021-12-17 Old man home service robot and control method thereof

Publications (2)

Publication Number Publication Date
CN114428501A CN114428501A (en) 2022-05-03
CN114428501B true CN114428501B (en) 2024-06-21

Family

ID=81311734

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111554673.8A Active CN114428501B (en) 2021-12-17 2021-12-17 Old man home service robot and control method thereof

Country Status (1)

Country Link
CN (1) CN114428501B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN207467472U (en) * 2017-10-30 2018-06-08 东软集团股份有限公司 Intelligent service system
CN108885459A (en) * 2018-06-08 2018-11-23 珊口(深圳)智能科技有限公司 Air navigation aid, navigation system, mobile control system and mobile robot
CN109976325A (en) * 2017-12-27 2019-07-05 深圳市优必选科技有限公司 Method, device and equipment for managing articles by robot

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108320558B (en) * 2017-01-16 2020-07-21 重庆特斯联智慧科技股份有限公司 Intelligent navigation garage-based reservation service system and control method thereof
US11378956B2 (en) * 2018-04-03 2022-07-05 Baidu Usa Llc Perception and planning collaboration framework for autonomous driving
CN109116841B (en) * 2018-07-23 2021-09-14 昆明理工大学 Path planning smooth optimization method based on ant colony algorithm
CN110333714B (en) * 2019-04-09 2022-06-10 武汉理工大学 Unmanned vehicle path planning method and device
CN113238549A (en) * 2021-03-31 2021-08-10 珠海市一微半导体有限公司 Path planning method and chip for robot based on direct nodes and robot
CN113111144A (en) * 2021-04-14 2021-07-13 北京云迹科技有限公司 Room marking method and device and robot movement method

Similar Documents

Publication Publication Date Title
Coşar et al. ENRICHME: Perception and Interaction of an Assistive Robot for the Elderly at Home
US10650117B2 (en) Methods and systems for audio call detection
CN114516049B (en) Agent robot control system, agent robot control method, and storage medium
US9355368B2 (en) Computer-based method and system for providing active and automatic personal assistance using a robotic device/platform
US9223837B2 (en) Computer-based method and system for providing active and automatic personal assistance using an automobile or a portable electronic device
Flynn Combining sonar and infrared sensors for mobile robot navigation
Lee et al. An intelligent emergency response system: preliminary development and testing of automated fall detection
US20180043542A1 (en) Customer service robot and related systems and methods
Di Paola et al. An autonomous mobile robotic system for surveillance of indoor environments
US11350809B2 (en) Cleaning robot and method of performing task thereof
WO2019205760A1 (en) Intelligent device, inventory taking method, device and apparatus
CN114428502B (en) Logistics robot based on networking with household appliances and control method thereof
US20170132561A1 (en) Product delivery unloading assistance systems and methods
US20200105403A1 (en) Hospital support apparatus and operation method and operation program of hospital support apparatus
JP2005056213A (en) System, server and method for providing information
CN114428501B (en) Old man home service robot and control method thereof
CN114510022B (en) Medical transport robot and control method thereof
CN111462884A (en) Intelligent nursing management system based on 5G
Loghmani et al. Towards socially assistive robots for elderly: An end-to-end object search framework
US20200175804A1 (en) Availability of medicinal products
US12050464B2 (en) Robot paired with user to perform specific task
US20210072750A1 (en) Robot
JP6944173B1 (en) Information processing system, information processing method and computer program
JP7104755B2 (en) Information processing systems, programs, and information processing methods
LeMasurier Design of a Smartphone Application to Provide Shopping Assistance to People Who Are Blind or Have Low Vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant