CN111571611A - Facial operation robot track planning method based on facial and skin features - Google Patents

Facial operation robot track planning method based on facial and skin features

Info

Publication number
CN111571611A
Authority
CN
China
Prior art keywords
facial
robot
track
skin
face
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010457760.0A
Other languages
Chinese (zh)
Other versions
CN111571611B (en)
Inventor
陈彦彪
翟敬梅
胡燕
唐骢
陈家骊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Nali Biotechnology Co ltd
Original Assignee
Guangzhou Nali Biotechnology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Nali Biotechnology Co ltd
Priority to CN202010457760.0A
Publication of CN111571611A
Application granted
Publication of CN111571611B
Legal status: Active
Anticipated expiration

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/008Manipulators for service tasks
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1072Measuring physical dimensions, e.g. size of the entire body or parts thereof measuring distances on the body, e.g. measuring length, height or thickness
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1075Measuring physical dimensions, e.g. size of the entire body or parts thereof for measuring dimensions by non-invasive methods, e.g. for determining thickness of tissue layer
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H7/00Devices for suction-kneading massage; Devices for massaging the skin by rubbing or brushing not otherwise provided for
    • A61H7/002Devices for suction-kneading massage; Devices for massaging the skin by rubbing or brushing not otherwise provided for by rubbing or brushing
    • A61H7/004Devices for suction-kneading massage; Devices for massaging the skin by rubbing or brushing not otherwise provided for by rubbing or brushing power-driven, e.g. electrical
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H7/00Devices for suction-kneading massage; Devices for massaging the skin by rubbing or brushing not otherwise provided for
    • A61H7/007Kneading
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning

Abstract

The invention discloses a facial operation robot based on facial and skin characteristics and a trajectory planning method for it. The method comprises the following steps: (1) dividing the face into operation and non-operation areas based on adult head-and-face dimensions; (2) jointly analyzing two skin tension lines of the facial skin, the Langer lines and the wrinkle lines, and designing robot operation tracks for the facial operation areas at different positions; (3) obtaining the trajectory curves of the facial operation robot by combining the iso-planar method with a successive bisection method; (4) optimizing the robot operation track based on the curvature of the facial skin; and (5) selecting the optimal track-point sequence of the facial operation areas with a nearest neighbor algorithm to complete the robot trajectory planning and obtain the operation track of the facial operation robot. The trajectory is designed and planned according to the characteristics of facial operation on the human body, such as skin viscoelasticity, skin anisotropy and human sensitivity, which improves the stability of the robot operation, reduces impact, and enables efficient and continuous operation over the facial area.

Description

Facial operation robot track planning method based on facial and skin features
Technical Field
The invention relates to the technical field of robots, in particular to a facial operation robot based on facial and skin characteristics and a track planning method thereof.
Background
Because the fields of medical rehabilitation, massage health care and beauty care suffer from a large demand for practitioners, high skill requirements, long training periods and high labor costs, the demand for robots that operate in direct contact with the human face is growing. In current research on skin-operation robot trajectory planning at home and abroad, rehabilitation robots for limb rehabilitation training and treatment mainly follow preset fixed trajectories; robots for body massage mainly use preset point-to-point motion tracks or straight-line tracks fitted through feature points; robots for facial and oral rehabilitation obtain an initial track by marking and fitting feature points on a CT image of the face and correct it with a skin-elasticity compensation term, yielding only a simple, local oral massage track. The operation tracks obtained by these trajectory planning methods contain few track points, have poor applicability and cover only local facial regions; a trajectory planning method for robot operation over the whole facial area has not yet been explored.
Unlike an industrial or mobile robot, a robot that works on the surface of the human body operates on human skin, which is viscoelastic and anisotropic, has a complex anatomical structure and extremely complex mechanical properties. The skin at different facial positions exhibits different mechanical behavior because of differences in the underlying tissue distribution, and the rich nervous tissue in facial skin perceives external stimuli and responds to them differently, which directly affects how a person experiences the robot's facial operation. To reduce friction during facial operation and improve the person's perception, the robot trajectory must therefore be designed, optimized and planned with the viscoelasticity, anisotropy and large curvature variation of the facial skin taken into account.
The prior art cannot guarantee that key parts such as the eyes, nose and mouth are avoided during robot facial operation, nor improve operation stability, reduce impact and complete the facial operation efficiently and continuously; a new facial skin operation robot based on facial skin characteristics and a trajectory planning method for it therefore need to be designed.
Disclosure of Invention
The invention aims to provide a facial operation robot based on facial and skin characteristics and a trajectory planning method for it which, for a robot operating on the human face, improve human comfort while ensuring the safety and effectiveness of the robot's facial operation.
The invention solves the technical problem and adopts the following technical scheme:
a facial operation robot track planning method based on face and skin features is characterized in that the track planning of a facial operation robot is carried out based on skin features such as a facial safe operation area and skin anisotropy, so that key parts such as eyes, a nose and a mouth of the robot can be avoided in the operation process, the operation stability is improved, the impact is reduced, and the operation of the facial area is efficiently and continuously completed, and specifically comprises the following steps:
(1) dividing the face into operation and non-operation areas based on adult head-and-face dimensions;
(2) jointly analyzing the two skin tension lines of the facial skin, the Langer lines and the wrinkle lines, and designing robot operation tracks for the facial operation areas at different positions;
(3) obtaining the trajectory curves of the facial operation robot by combining the iso-planar method with a successive bisection method;
(4) optimizing the robot operation track based on the curvature of the facial skin;
(5) selecting the optimal track-point sequence of the facial operation areas with a nearest neighbor algorithm to complete the robot trajectory planning and obtain the operation track of the facial operation robot.
In step (1), part of the dimensional data of the female head-and-face items in the national standard "Head-face dimensions of adults" (GB/T 2428-1998) is selected and analyzed as the basis for dividing the facial regions. The reference dimensions for the region division are set to D_G-N and D_M1-M2. Feature points of the key facial parts are extracted; with the eyebrow-center point (EB) and the nose-tip point (S) as reference points, the facial operation areas are delimited through the transverse width dimension D_M1-M2 and the longitudinal height dimension D_G-N.
Further, in step (2) the two skin tension lines that most strongly influence skin extension and tension, the facial wrinkle lines and the Langer lines, are analyzed, and different operation tracks are designed according to the skin characteristics of the different facial areas so as to achieve a better facial skin operation effect;
further, the trajectory curves obtained with the iso-planar method in step (3) bend sharply, so the intersection-point data are processed with a successive bisection method to improve the bending of the trajectory curves;
further, in step (4) the normal curvature at the facial track points is calculated with an approximate solution method and a threshold is set; when the normal-curvature deviation between adjacent track points on a trajectory line exceeds the threshold, several points are interpolated between the two track points;
further, in step (5) the initial position of the robot is set, the facial operation track points are processed with a nearest neighbor algorithm, and the optimal track-point sequence of the facial operation areas is selected to obtain the operation track of the facial operation robot.
The invention has the advantages that:
compared with the existing skin operation robot and the track planning method thereof, the method comprehensively considers the facial skin characteristics, the human body sensitivity, the robot operation safety and the like to plan the track of the facial operation robot. Dividing a face operation area to ensure that key parts such as eyes, a nose, a mouth and the like can be avoided in the face operation process of the robot; the method mainly considers the anisotropy of facial skin, combines two skin tension lines of a facial wrinkle line and a Langerhans line, and designs the operation tracks of the skin in different facial areas so as to realize better facial operation effect; the operation track of the robot is optimized based on the curvature of the face, the operation stability is improved, and the impact is reduced; and planning the track of the facial operation robot based on a nearest neighbor algorithm to realize the coherent operation of the facial area of the robot.
Drawings
Fig. 1 is a flow chart of a facial operation robot trajectory planning method based on facial and skin features according to the invention.
FIG. 2a is a schematic diagram of the measurement items of the head and face of a woman in the division of the face work area according to the present invention.
FIG. 2b is a schematic diagram of the division of the facial working area according to the present invention.
FIG. 3a is a schematic view of a facial wrinkle line according to the present invention.
FIG. 3b is a schematic view of Langer's line on the face according to the present invention.
FIG. 4a is a schematic diagram of obtaining a face operation track point by an iso-planar method according to the present invention.
FIG. 4b is a schematic diagram of a trajectory generation method based on successive dichotomy according to the present invention.
FIG. 5a is a schematic diagram of the approximate solution of the normal curvature of the face according to the present invention.
Fig. 5b is a simplified robot face operation trace diagram according to the present invention.
Fig. 6 is a schematic diagram of a simplified robot face operation track obtained based on a nearest neighbor algorithm.
Detailed Description
The invention is described in further detail below through a specific example; not every embodiment can be recounted here, and the embodiments of the present invention are not limited to the following example.
Referring to fig. 1 to 6, the facial operation robot based on facial and skin features and its trajectory planning method provided by the embodiment of the invention can be used in the field of robot trajectory planning. The facial operation areas are divided based on adult head-and-face dimensions; considering skin anisotropy and facial wrinkles, the two skin tension lines are jointly analyzed and robot operation tracks are designed for the facial operation areas at different positions; the operation trajectory curves of the facial robot are obtained by combining the iso-planar method with a bisection method; the robot operation track is optimized based on the facial curvature; and the optimal track-point sequence of the facial operation areas is selected with a nearest neighbor algorithm to complete the robot trajectory planning and finally obtain the operation track of the facial operation robot. The method specifically comprises the following steps:
s1, selecting partial size data of the female head and face project in the national standard 'adult head and face size' (GB/T2428.98) for analysis, and taking the partial size data as a basis for dividing the face operation region, wherein a schematic diagram of the female head and face measurement project is shown in figure 2 a. In the figure, point V is the vertex of the head; point G is the glabellar point; point N is the pre-auricular point; points M1 and M2 are a left corner point and a right corner point respectively; dG-NIs the distance between the glabellar point and the anterior auricular point, DM1-M2The transverse distance between the left and right corner points; 1. 2, 3 are female head and face measurement items.
Setting reference size data of face region division to DG-NAnd DM1-M2,DG-NThe value can be determined from the distance from the vertex to the eyebrow and the height of the head and ears, DM1-M2The value is the mouth width dimension. As shown in fig. 2b, the feature points of key parts of the face are extracted, and the central point (EB) of the eyebrow and the nose tip point (S) are taken as reference points and pass through the transverse width dimension DM1-M2And a longitudinal height dimension DG-NThe areas of the three key parts of the eyes, the nose and the mouth are defined as non-operation areas, and the rest areas of the face are defined as operation areas including the forehead area and the left and right cheek areas.
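To make the region-division rule concrete, the following Python sketch builds the eyes-nose-mouth non-operation rectangle from the reference point and dimensions named above. The patent does not spell out exactly how D_G-N is anchored between the eyebrow-center point EB and the nose-tip point S, so the vertical anchoring used here (rectangle starting at the eyebrow line and extending D_G-N downward) and all function names are assumptions.

```python
def non_operation_rect(eb, d_m1m2, d_gn):
    """Axis-aligned rectangle covering the eyes-nose-mouth non-operation band.

    eb      -- (x, y) frontal-plane coordinates of the eyebrow-center point EB
    d_m1m2  -- transverse width dimension D_M1-M2 (mouth-corner width)
    d_gn    -- longitudinal height dimension D_G-N

    The rectangle is centered horizontally on EB, D_M1-M2 wide, and extends
    D_G-N downward from the eyebrow line (this vertical anchoring is an
    assumption of the sketch, not taken from the patent text).
    """
    half_w = d_m1m2 / 2.0
    return (eb[0] - half_w, eb[1], eb[0] + half_w, eb[1] + d_gn)


def is_operation_point(p, rect):
    """A face point belongs to the operation area (forehead or cheeks) if it
    lies outside the non-operation rectangle."""
    x_min, y_min, x_max, y_max = rect
    return not (x_min <= p[0] <= x_max and y_min <= p[1] <= y_max)


# usage: rect = non_operation_rect((0.0, 0.0), 50.0, 110.0)
#        is_operation_point((40.0, 20.0), rect)  -> True (cheek side)
```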
S2: The facial wrinkle lines and the Langer's lines are the two skin tension lines that most strongly affect skin extension and tension. The facial wrinkle lines are the naturally formed ridges and furrows on the skin surface; they allow the skin to stretch and give it elasticity, much like a pleated skirt. In daily facial care, to counteract facial wrinkles, lifting and massage are usually performed perpendicular to the wrinkle lines, which promotes facial blood circulation, reduces wrinkles and delays skin aging. The Langer lines indicate the preferred direction of skin extensibility and reflect the anisotropy of the skin. The elastin and collagen fibers inside the skin extend more easily along the Langer lines, so performing the facial care operation along the direction of skin extensibility reduces friction resistance during the operation and improves human comfort.
Fig. 3a and 3b show the distribution of the facial wrinkle lines and Langer lines on the face, respectively, where H is the forehead region and C is the cheek region. In region H the wrinkle lines run largely parallel to the Langer lines; because the skin there has low viscoelasticity, its elasticity dominates over its extensibility, so the wrinkle-removing effect is given priority and the operation track is set perpendicular to the facial wrinkle lines. In region C the wrinkle lines are not exactly perpendicular to the Langer lines but cross them at varying angles at different positions; since the viscoelasticity of the skin in region C is pronounced and its extensibility dominates, the operation track follows the Langer lines on the face in order to comply with the direction of skin extensibility, reduce friction, improve human comfort and still achieve some wrinkle-removing effect.
S3: The forehead and the left and right cheek regions are each sliced with the iso-planar method to obtain intercept lines. As shown in fig. 4a, a set of section planes S = {S_1, S_2, …, S_i, …, S_m} parallel to the X-Z plane is defined along the Y direction, where m is the total number of section planes and the offset between adjacent planes is the line spacing L. Suppose section plane S_i intersects the facial skin surface and the resulting intersection line contains n intersection points. All triangular patches of the face model that intersect the section plane are traversed, the position relationship between each patch and the plane is determined, and the different intersection cases are analyzed to obtain the intersection points P_C of the operation area. The intersection points P_C are sorted by bubble sort and connected to obtain the operation-area intercept lines in the operation direction:
C = {C_1, C_2, …, C_i, …, C_m}  (i = 1, 2, …, m)
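As an illustration of the iso-planar slicing just described, the Python sketch below intersects one section plane y = y0 with a triangular face mesh by linearly interpolating along every mesh edge that crosses the plane. The array layouts, the function name and the simple sort along X (standing in for the bubble sort mentioned in the patent) are assumptions, and degenerate cases such as a vertex lying exactly on the plane are ignored.

```python
import numpy as np


def slice_mesh_with_plane(vertices, faces, y0):
    """Intersect the cutting plane y = y0 (parallel to the X-Z plane) with a
    triangular mesh.

    vertices -- (N, 3) array of mesh vertex coordinates
    faces    -- (M, 3) array of vertex indices per triangular patch
    Returns the intersection points on this section plane, sorted along X.
    """
    points = []
    for tri in faces:
        p = vertices[tri]                      # 3 x 3 triangle vertex coordinates
        d = p[:, 1] - y0                       # signed distance of each vertex to the plane
        for a, b in ((0, 1), (1, 2), (2, 0)):  # the three triangle edges
            if d[a] * d[b] < 0:                # edge strictly crosses the plane
                t = d[a] / (d[a] - d[b])
                points.append(p[a] + t * (p[b] - p[a]))
    if not points:
        return np.empty((0, 3))
    # adjacent triangles share edges, so duplicate points are merged (rounded)
    points = np.unique(np.round(points, 6), axis=0)
    return points[np.argsort(points[:, 0])]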
To improve the bending of the operation track, the intersection-point data are processed with a successive bisection method. All intersection points on an intercept line C_i form a closed interval B_i1; the midpoint of B_i1 is computed and taken as a boundary value of the new interval B_i2, whose midpoint is computed in turn. The interval midpoints are computed in this loop until the interval B_ij is reached, where j is the total number of bisection iterations, giving the midpoint of every intercept-point interval B_ij. The intersection points at corresponding positions of the intercept lines are then connected in sequence; where an interval contains no intersection point, its midpoint is connected instead.
FIG. 4b illustrates the trajectory generation based on the successive bisection method. The purple points are the midpoints of the intercept-point intervals B_i1, B_i2, B_i3 (i = 1, 2, 3, 4, 5). Connecting the first intersection points of B_13, B_23, B_33, B_43 and B_53 gives the red trajectory line T_C1. When the second intersection points are connected, B_13, B_43 and B_53 have no second intersection point, so their midpoints are connected instead, giving T_C2. Connecting B_i1, B_i2 and B_i3 in the same way gives the remaining trajectory lines, which improves the bending of the resulting facial operation track.
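The Python sketch below mirrors the connection rule of FIG. 4b as read here: the k-th intersection points of successive intercept lines are joined into one trajectory, and where a line has no k-th point the midpoint of that line's point interval is used instead. The full interval bookkeeping B_ij of the successive bisection is simplified to a single midpoint per intercept line, so this is an approximation of the patent's procedure rather than a faithful reproduction; names and data layout are assumptions.

```python
import numpy as np


def build_trajectories(intercept_lines):
    """Connect the k-th intersection points across intercept lines into
    trajectories, falling back to each line's interval midpoint when the
    k-th point is missing.

    intercept_lines -- list of (n_i, 3) arrays, the sorted intersection points
                       on each intercept line C_i
    Returns a list of trajectories, each an (m, 3) array with one point per line.
    """
    k_max = max(len(c) for c in intercept_lines)
    trajectories = []
    for k in range(k_max):
        traj = []
        for c in intercept_lines:
            if k < len(c):
                traj.append(c[k])                  # k-th intersection point exists
            else:
                traj.append(0.5 * (c[0] + c[-1]))  # fall back to the interval midpoint
        trajectories.append(np.asarray(traj))
    return trajectories
```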
S4: The massage tracks of the different facial regions are optimized based on the facial curvature: track points are interpolated where the curvature changes strongly, increasing the number of track points. The reconstructed facial skin surface is a triangular mesh model, so the normal curvature at a mesh vertex cannot be computed directly and is obtained by an approximate solution method.
As shown in FIG. 5a, take a vertex v on the face mesh model and let T_v denote the set of the n triangular patches incident to v, where v_ij and v_i(j+1) are the two vertices other than v of the i-th patch incident to v. From the vertices of each triangular patch t_vi its unit normal vector n_i is obtained; the normal vector N_v at vertex v is then estimated from the unit normal vectors of the incident patches, and the normal curvature k_v at vertex v is calculated from the curvature formula for an arbitrary point on a triangular mesh.
The curvature of every track point of the massage track is obtained in this way, and a threshold T_H is set. When the normal-curvature deviation between adjacent track points on a trajectory line exceeds the threshold T_H, several points are interpolated between the two track points. Where the curvature changes strongly, the number of track points is thus increased and the change of the normal vector between adjacent track points is reduced, so that the change of the robot end pose becomes smoother.
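Since the patent's exact normal-vector and curvature formulas are not reproduced here, the following sketch uses a common mesh approximation: the vertex normal N_v as the normalized mean of the unit normals of the incident patches, plus the interpolation rule that inserts a point between adjacent track points whenever their curvature difference exceeds T_H. The unweighted mean, the single-midpoint interpolation and all names are assumptions.

```python
import numpy as np


def vertex_normal(v, incident_pairs):
    """Estimate the unit normal N_v at vertex v from the unit normals of its
    incident triangular patches (v, v_ij, v_i(j+1)).  A plain, unweighted mean
    is used here; the patent's exact weighting is not reproduced."""
    v = np.asarray(v, dtype=float)
    normals = []
    for a, b in incident_pairs:                 # the two other vertices of each patch
        n = np.cross(np.asarray(a) - v, np.asarray(b) - v)
        normals.append(n / np.linalg.norm(n))
    n_v = np.mean(normals, axis=0)
    return n_v / np.linalg.norm(n_v)


def densify_track(points, curvatures, t_h):
    """Step-S4 interpolation rule: where |k_(i+1) - k_i| > T_H between adjacent
    track points, insert the midpoint between them (the patent inserts
    'several' points; one midpoint is used here for brevity)."""
    out = [points[0]]
    for p0, p1, k0, k1 in zip(points[:-1], points[1:], curvatures[:-1], curvatures[1:]):
        if abs(k1 - k0) > t_h:
            out.append(0.5 * (p0 + p1))
        out.append(p1)
    return np.asarray(out)
```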
FIG. 5b shows the simplified robot facial operation track obtained with step S4. The black arrows indicate the direction of the operation tracks; the yellow points are track points, which are denser where the curvature is larger. The forehead operation track is t_h, the three tracks of the left cheek are t_lf1, t_lf2 and t_lf3, and the three tracks of the right cheek are t_rf1, t_rf2 and t_rf3.
S5: The robot facial operation track is planned with a nearest neighbor algorithm, which selects the optimal facial operation starting point and the order of the trajectory segments so as to obtain the shortest facial massage path for the robot; the resulting simplified robot facial operation track is shown in FIG. 6.
For a facial operation robot based on facial and skin features that implements the trajectory planning method, as shown in FIG. 6, the robot takes the forehead track point H_1 as the starting point of the facial massage, performs a lifting massage of the forehead region while moving to track point H_2, and thereby completes the 1st segment of the facial massage; it then moves to track point CL_11 and follows trajectory t_lf1 to track point CL_12, completing the 2nd segment; by analogy, the massage of segments 3 to 7 is completed in turn along trajectories t_lf2, t_lf3, t_rf1, t_rf2 and t_rf3.
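A greedy nearest-neighbor ordering of the trajectory segments, in the spirit of step S5, can be sketched as follows: from the current position the robot next visits the unvisited segment whose nearer endpoint is closest, traverses it from that end, and repeats, which keeps the idle travel between segments short. The segment representation, the choice of allowing either traversal direction, and the function name are assumptions; the patent fixes the forehead point H_1 as the starting point.

```python
import numpy as np


def order_segments_nearest_neighbor(segments, start):
    """Greedy nearest-neighbor ordering of facial trajectory segments.

    segments -- list of (n, 3) arrays of track points per segment
    start    -- (3,) starting position, e.g. the forehead point H_1
    Returns the visiting order as (segment index, reversed?) pairs.
    """
    current = np.asarray(start, dtype=float)
    remaining = set(range(len(segments)))
    order = []
    while remaining:
        best = None
        for i in remaining:
            # a segment may be entered from either of its two endpoints
            for rev, end in ((False, segments[i][0]), (True, segments[i][-1])):
                d = np.linalg.norm(end - current)
                if best is None or d < best[0]:
                    best = (d, i, rev)
        _, i, rev = best
        order.append((i, rev))
        remaining.remove(i)
        current = segments[i][0] if rev else segments[i][-1]   # exit at the far end
    return order
```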
The above example merely illustrates the invention clearly and does not limit its embodiments. Those skilled in the art may make other variations or modifications on the basis of the above description, and it is neither necessary nor possible to enumerate all embodiments here. Any modification, equivalent replacement or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the claims of the present invention.

Claims (7)

1. A facial operation robot trajectory planning method based on facial and skin features, characterized in that the trajectory of the facial operation robot is planned based on skin features such as the safe facial operation area and skin anisotropy, so that the robot avoids key parts such as the eyes, nose and mouth during operation, the operation stability is improved, impact is reduced, and the operation over the facial area is completed efficiently and continuously, the method specifically comprising the following steps:
(1) dividing the face into operation and non-operation areas based on adult head-and-face dimensions;
(2) jointly analyzing the two skin tension lines of the facial skin, the Langer lines and the wrinkle lines, and designing robot operation tracks for the facial operation areas at different positions;
(3) obtaining the trajectory curves of the facial operation robot by combining the iso-planar method with a successive bisection method;
(4) optimizing the robot operation track based on the curvature of the facial skin;
(5) selecting the optimal track-point sequence of the facial operation areas with a nearest neighbor algorithm to complete the robot trajectory planning and obtain the operation track of the facial operation robot.
2. The facial and skin feature based facial operation robot trajectory planning method of claim 1, characterized in that: in step (1), specific facial feature points are extracted with reference to partial data from the national standard for adult head-and-face dimensions, and the facial operation and non-operation areas are divided by combining the reference data.
3. The facial and skin feature based facial operation robot trajectory planning method of claim 1, characterized in that: in step (2), the two skin tension lines of the facial skin, the Langer lines and the wrinkle lines, are jointly analyzed, and the robot facial operation tracks are designed for the operation areas at the different facial positions.
4. The facial and skin feature based facial operation robot trajectory planning method of claim 1, characterized in that: in step (3), the spacing of the intersection points obtained by the iso-planar method is non-uniform; if the corresponding intersection points of each intercept line are connected directly in sequence, the track bends sharply, so the successive bisection method is used to process the intersection-point data and improve the bending of the resulting operation track.
5. The facial and skin feature based facial operation robot trajectory planning method of claim 1, characterized in that: in step (4), the normal curvature of the skin in the cheekbone and cheek regions changes strongly and the robot's end normal vector varies greatly during operation; therefore the operation tracks of the different facial regions are optimized based on the facial curvature, and track points are interpolated where the curvature changes strongly, increasing the number of track points.
6. The facial and skin feature based facial operation robot trajectory planning method of claim 1, characterized in that: in step (5), the order in which the different facial areas are massaged directly affects the total length of the robot's facial massage path; the optimal track-point sequence of the facial operation areas is selected with a nearest neighbor algorithm to complete the robot trajectory planning, reduce the robot's idle travel and improve the operation efficiency.
7. A facial operation robot based on facial and skin features implementing the trajectory planning method according to one of claims 1 to 6.
CN202010457760.0A 2020-05-26 2020-05-26 Facial operation robot track planning method based on facial and skin features Active CN111571611B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010457760.0A CN111571611B (en) 2020-05-26 2020-05-26 Facial operation robot track planning method based on facial and skin features

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010457760.0A CN111571611B (en) 2020-05-26 2020-05-26 Facial operation robot track planning method based on facial and skin features

Publications (2)

Publication Number Publication Date
CN111571611A (en) 2020-08-25
CN111571611B (en) 2021-09-21

Family

ID=72117852

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010457760.0A Active CN111571611B (en) 2020-05-26 2020-05-26 Facial operation robot track planning method based on facial and skin features

Country Status (1)

Country Link
CN (1) CN111571611B (en)


Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008041457A1 (en) * 2006-09-29 2008-04-10 Waseda University Massage robot, control program therefor, and robot for specifying portion of human body
US20100016658A1 (en) * 2007-04-03 2010-01-21 Hui Zou Anatomical visualization and measurement system
JP2015186568A (en) * 2014-03-13 2015-10-29 パナソニックIpマネジメント株式会社 massage device and massage method
CN105574484A (en) * 2014-11-04 2016-05-11 三星电子株式会社 Electronic device, and method for analyzing face information in electronic device
CN204700888U (en) * 2015-06-12 2015-10-14 唐志伟 A kind of novel feeding robot
US20170069052A1 (en) * 2015-09-04 2017-03-09 Qiang Li Systems and Methods of 3D Scanning and Robotic Application of Cosmetics to Human
CN105740781A (en) * 2016-01-25 2016-07-06 北京天诚盛业科技有限公司 Three-dimensional human face in-vivo detection method and device
CN105913416A (en) * 2016-04-06 2016-08-31 中南大学 Method for automatically segmenting three-dimensional human face model area
CN110831537A (en) * 2017-06-23 2020-02-21 奥瑞斯健康公司 Robotic system for determining a pose of a medical device in a lumen network
FR3067957A1 (en) * 2017-06-26 2018-12-28 Capsix ROBOT DISPLACEMENT MANAGEMENT DEVICE AND ASSOCIATED CARE ROBOT
US20190160684A1 (en) * 2017-11-29 2019-05-30 Midea Group Co., Ltd Massage Robot Using Machine Vision
CN209221348U (en) * 2018-06-28 2019-08-09 诺思科技有限公司 Artificial intelligence robot for skin treating
KR101950148B1 (en) * 2018-09-13 2019-02-19 주식회사 바디프랜드 Method and apparatus for providing massage for stimulating physeal plate for promoting growth
CN110900597A (en) * 2018-09-14 2020-03-24 上海沃迪智能装备股份有限公司 Jumping motion track planning method with settable vertical height and corner height
CN109940626A (en) * 2019-01-23 2019-06-28 浙江大学城市学院 A kind of thrush robot system and its control method based on robot vision
CN109938842A (en) * 2019-04-18 2019-06-28 王小丽 Facial surgical placement air navigation aid and device
CN110472605A (en) * 2019-08-21 2019-11-19 广州纳丽生物科技有限公司 A kind of skin problem diagnostic method based on deep learning face subregion

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHAI Jingmei: "Research and Implementation of a Robot Virtual Simulation and Remote Control System", Computer Engineering and Applications *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115741732A (en) * 2022-11-15 2023-03-07 福州大学 Interactive path planning and motion control method of massage robot
CN115847449A (en) * 2023-02-22 2023-03-28 深圳市德壹医疗科技有限公司 Intelligent massage method, device and equipment based on path planning and storage medium

Also Published As

Publication number Publication date
CN111571611B (en) 2021-09-21

Similar Documents

Publication Publication Date Title
CN111571611B (en) Facial operation robot track planning method based on facial and skin features
He et al. A wireless BCI and BMI system for wearable robots
Safavynia et al. Task-level feedback can explain temporal recruitment of spatially fixed muscle synergies throughout postural perturbations
KR102006019B1 (en) Method, system and non-transitory computer-readable recording medium for providing result information about a procedure
Perl et al. Nonequilibrium brain dynamics as a signature of consciousness
Muret et al. Beyond body maps: Information content of specific body parts is distributed across the somatosensory homunculus
Figueiredo et al. Individual profiles of spatio-temporal coordination in high intensity swimming
US11123140B1 (en) Computing platform for improved aesthetic outcomes and patient safety in medical and surgical cosmetic procedures
Memar et al. Objective assessment of human workload in physical human-robot cooperation using brain monitoring
CN110782528A (en) Free deformation human face shaping simulation method, system and storage medium
JP6993291B2 (en) Computer and emotion estimation method
CN114842522A (en) Artificial intelligence auxiliary evaluation method applied to beauty treatment
CN204377989U (en) A kind of perpendicular oval mouth mask
Lanitis Age estimation based on head movements: A feasibility study
CN112329640A (en) Facial nerve palsy disease rehabilitation detection system based on eye muscle movement analysis
Karg et al. Human movement analysis: Extension of the f-statistic to time series using hmm
CN113221958A (en) Method, device and system for matching massage track with massage area and storage medium
US20230200907A1 (en) Computing platform for improved aesthetic outcomes and patient safety in medical and surgical cosmetic procedures
US11497418B2 (en) System and method for neuroactivity detection in infants
Wendel et al. Measuring tissue thicknesses of the human head using centralized and normalized trajectories
Manaka et al. A parsimonious laboratory system for the evaluation of rat reaching task: recovery from the massive destruction of motor areas
WO2022173056A1 (en) Skin state inference method, device, program, system, trained model generation method, and trained model
PARDO RAMOS Characterization of bilateral muscular synergies during gait in Down syndrome and control group
Huang et al. Non-rigid tracking of musk shrews in video for detection of emetic episodes
JP2009297209A (en) Massage method for lifting up face

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant