CN116747431A - Method and device for detecting action position of beauty instrument and energy output and beauty instrument - Google Patents



Publication number
CN116747431A
CN116747431A (application CN202310568109.4A)
Authority
CN
China
Prior art keywords: beauty instrument; energy output; target image; beauty
Prior art date
Legal status: Pending
Application number
CN202310568109.4A
Other languages
Chinese (zh)
Inventor
郦轲
王念欧
万进
董明志
安云霖
Current Assignee
Shenzhen Accompany Technology Co Ltd
Original Assignee
Shenzhen Accompany Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Accompany Technology Co Ltd filed Critical Shenzhen Accompany Technology Co Ltd
Priority to CN202310568109.4A
Publication of CN116747431A
Legal status: Pending


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61N: ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
    • A61N 1/00: Electrotherapy; Circuits therefor
    • A61N 1/18: Applying electric currents by contact electrodes
    • A61N 1/32: Applying electric currents by contact electrodes, alternating or intermittent currents
    • A61N 1/36: Applying electric currents by contact electrodes, alternating or intermittent currents, for stimulation
    • A61N 1/3605: Implantable neurostimulators for stimulating central or peripheral nerve system
    • A61N 1/36128: Control systems
    • A61N 1/36135: Control systems using physiological parameters
    • A61N 1/36139: Control systems using physiological parameters with automatic adjustment
    • A61N 1/328: Applying electric currents by contact electrodes, alternating or intermittent currents, for improving the appearance of the skin, e.g. facial toning or wrinkle treatment
    • A61N 7/00: Ultrasound therapy

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Public Health (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • General Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • Physiology (AREA)
  • Neurosurgery (AREA)
  • Biophysics (AREA)
  • Neurology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Plastic & Reconstructive Surgery (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The application provides a method for detecting the action position of a beauty instrument, an energy output method, a detection device, and a beauty instrument. The method for detecting the action position of the beauty instrument comprises the following steps: acquiring a target image; acquiring scene parameters from the target image; and matching the scene parameters in the target image against a database to determine the action position of the beauty instrument. The energy output method of the beauty instrument comprises the following steps: receiving the action position of the beauty instrument in the current use scenario; determining the working state and/or energy output gear of the beauty instrument based on that action position; and, when the working state of the beauty instrument is the energy output state, outputting energy at the determined gear. The beauty instrument can thus adjust its working state and/or energy output gear according to its current action position, which improves the user's experience.

Description

Method and device for detecting action position of beauty instrument and energy output and beauty instrument
Technical Field
The application relates to the technical field of detection, and in particular to a method for detecting the action position of a beauty instrument, an energy output method, a detection device, and a beauty instrument.
Background
Because the fat content, muscle content, and nerve distribution of each region of human skin differ, the current-stimulation intensity and sound-wave intensity each region can tolerate also differ. In the prior art, however, the user must adjust the energy output gear of the beauty instrument manually according to the skin region being treated, which requires the user to know the appropriate gear for every region and greatly increases the difficulty of use. A method for intelligently adjusting the energy output gear of the beauty instrument according to the skin region in use, so as to improve the user's experience, therefore merits further research.
Disclosure of Invention
The application therefore provides a method for detecting the action position of a beauty instrument, an energy output method, a detection device, and a beauty instrument, so as to solve the above technical problem.
The first aspect of the application provides a method for detecting the action position of a beauty instrument, which comprises the following steps:
acquiring a target image;
acquiring scene parameters in the target image based on the target image; wherein the scene parameters comprise at least one of the existence of a beauty instrument, an environmental parameter, the degree to which the beauty instrument is shielded, the inclination angle of the beauty instrument, the gesture of a user holding the beauty instrument, and the distance between the beauty instrument and a beauty area of the user;
matching scene parameters in the target image with a database to determine the position of action of the beauty instrument; wherein, the database stores the corresponding relation between the scene parameter and the action position of the beauty instrument.
A second aspect of the present application provides a method of energy output of a cosmetic device, the method comprising:
receiving the action position of the beauty instrument under the current use scene;
determining the working state and/or energy output gear of the beauty instrument based on the acting position of the beauty instrument under the current use scene, wherein the working state of the beauty instrument comprises a standby state and an energy output state;
and when the working state of the beauty instrument is an energy output state, performing energy output according to the determined energy output gear.
A third aspect of the present application provides a detection apparatus comprising:
the first acquisition module is used for acquiring a target image;
the second acquisition module acquires scene parameters in the target image based on the target image; wherein the scene parameters comprise at least one of the existence of a beauty instrument, an environmental parameter, the degree to which the beauty instrument is shielded, the inclination angle of the beauty instrument, the gesture of a user holding the beauty instrument, and the distance between the beauty instrument and a beauty area of the user;
the determining module is used for matching scene parameters in the target image with a database so as to determine the acting position of the beauty instrument; wherein, the database stores the corresponding relation between the scene parameter and the action position of the beauty instrument.
A fourth aspect of the present application provides a cosmetic device comprising:
the receiving module is used for receiving the target image acquired by the detection device and determining the action position of the beauty instrument under the current use scene according to the target image or receiving the action position of the beauty instrument under the current use scene sent by the detection device;
the determining module is used for determining the working state and/or the energy output gear of the beauty instrument based on the position of the beauty instrument in the current use scene, wherein the working state of the beauty instrument comprises a standby state and an energy output state;
and the energy output module is used for outputting energy according to the determined energy output gear when the working state of the beauty instrument is the energy output state.
The fifth aspect of the present application provides a cosmetic detection system, comprising a detection device and a cosmetic instrument, wherein the detection device is the detection device, and the cosmetic instrument is the cosmetic instrument.
A sixth aspect of the present application provides a computer-readable storage medium storing a computer program which, when invoked and executed by a processor, implements the aforementioned method for detecting the action position of the beauty instrument and/or the aforementioned energy output method of the beauty instrument.
According to the application, a database storing the correspondence between scene parameters and action positions of the beauty instrument is established. The detection device acquires a target image; the detection device or the beauty instrument extracts scene parameters from that image; and the action position of the beauty instrument is determined from those scene parameters and the database. The beauty instrument can then adjust its working state and/or energy output gear according to its current action position, so the user does not need to know the optimal energy output gear for each skin area, and the user's experience is improved.
Drawings
In order to more clearly illustrate the technical solutions of the present application, the drawings that are needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flow chart of a method for detecting a position of action of a cosmetic instrument according to some embodiments of the present application;
fig. 2 is a flow chart of a method for outputting energy of a cosmetic instrument according to some embodiments of the present application;
FIG. 3 is a schematic block diagram of a detection device according to some embodiments of the present application;
fig. 4 is a schematic block diagram of a cosmetic apparatus according to some embodiments of the present application;
fig. 5 is a schematic block diagram of a beauty treatment detection system according to some embodiments of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without any inventive effort, are intended to be within the scope of the application.
Referring to fig. 1, fig. 1 is a flow chart of a method for detecting a position of an action of a cosmetic instrument according to some embodiments of the application.
As shown in fig. 1, in some embodiments, the method for detecting the action position of the beauty instrument includes:
s101: a target image is acquired.
S102: scene parameters in the target image are acquired based on the target image.
Wherein the scene parameters include at least one of the presence of a beauty instrument, an environmental parameter, a degree to which the beauty instrument is shielded, an inclination angle of the beauty instrument, a posture of a user holding the beauty instrument, and a distance between the beauty instrument and a beauty area of the user.
S103: the scene parameters in the target image are matched to a database to determine the location of the cosmetic instrument's action.
Wherein, the database stores the corresponding relation between the scene parameter and the action position of the beauty instrument.
According to the application, the action position of the beauty instrument is determined by extracting scene parameters from the target image and matching them against a database. This provides an intelligent way to determine the action position, which is more scientific and objective than determination by the user's naked eye.
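A minimal sketch of this matching step, assuming scene parameters are discretized into hashable records and the database is an in-memory lookup table (all field names and values below are illustrative assumptions, not the patent's data format):

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical scene-parameter record; field names are illustrative only.
@dataclass(frozen=True)
class SceneParams:
    instrument_present: bool
    tilt_angle_deg: int   # bucketed inclination angle of the instrument
    occlusion_level: int  # 0 = fully visible, 2 = heavily shielded
    grip_pose: str        # coarse label for the user's holding gesture

# Toy "database": scene parameters -> action position (S103's lookup table).
POSITION_DB = {
    SceneParams(True, 0, 0, "upright"): "cheek",
    SceneParams(True, 30, 0, "tilted"): "forehead",
    SceneParams(True, 60, 1, "tilted"): "perioral",
}

def match_action_position(params: SceneParams) -> Optional[str]:
    """Look the extracted scene parameters up in the database.

    Returns None when no entry matches, i.e. the action position is unknown.
    """
    return POSITION_DB.get(params)

print(match_action_position(SceneParams(True, 0, 0, "upright")))  # cheek
print(match_action_position(SceneParams(False, 0, 0, "none")))    # None
```

A frozen dataclass is used so the parameter record is hashable and can serve directly as a dictionary key.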
Wherein the ambient parameters include, but are not limited to, ambient light brightness.
In some embodiments, the acquiring the target image in step S101 includes: the target image is acquired based on a video data stream.
In some other embodiments, the acquiring the target image in step S101 includes: capturing an image periodically.
In some embodiments, the acquiring the target image based on the video data stream comprises:
and judging whether the currently acquired one frame of image meets the requirement or not based on the video data stream.
And if the requirements are met, determining the target image.
If the requirements are not met, another frame of image is acquired again and whether the requirements are met is judged.
Wherein the another frame of image re-acquired is the previous frame of image or the next frame of image of the previous acquired image.
The requirements include a definition requirement, specifically, whether a currently acquired frame of image meets the requirement can be judged through the resolution of the image, whether a blurred object exists, whether a noise exists or not, and the like.
Compared with the method of acquiring the target image by taking pictures, the method of acquiring the target image by using the video data stream can select the image of the adjacent frame when one frame of image is not satisfied with the requirement, and the interval time between the two frames of adjacent images in the video data stream is often smaller than the interval time of taking two pictures, so that the method of acquiring the target image by using the video data stream is easier to be close to the preset time point at the preset time point.
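The frame-selection logic can be sketched as follows. For simplicity this version scans forward through the stream for the first frame whose (precomputed) sharpness score meets an assumed threshold, rather than stepping to one specific adjacent frame; the threshold value and scoring scheme are illustrative assumptions:

```python
from typing import Iterable, Optional, Tuple

SHARPNESS_THRESHOLD = 0.5  # assumed definition requirement; value is illustrative

def select_target_frame(frames: Iterable[Tuple[str, float]],
                        threshold: float = SHARPNESS_THRESHOLD) -> Optional[str]:
    """Walk the video stream and return the id of the first frame that meets
    the definition requirement; return None if no frame in this segment does.
    """
    for frame_id, sharpness in frames:
        if sharpness >= threshold:
            return frame_id  # this frame becomes the target image
    return None  # no acceptable frame; caller would keep sampling the stream

# Frames arrive in order with a precomputed sharpness score each.
stream = [("f0", 0.2), ("f1", 0.3), ("f2", 0.8), ("f3", 0.9)]
print(select_target_frame(stream))  # f2
```

In a real pipeline the score would come from an image-quality metric (e.g. a blur or noise estimate) computed per frame.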
In some embodiments, the acquiring the target image in step S101 includes: the target image is acquired periodically.
In other embodiments, the acquiring the target image in step S101 includes: and acquiring the target image according to a preset time interval.
In some embodiments, before the target image is acquired, the method for detecting the action position of the beauty instrument further includes:
obtaining calibration images of the beauty instrument under various use scenarios;
acquiring scene parameters from each calibration image;
and setting the action position of the beauty instrument under the current scene parameters, so as to establish a database of the correspondence between scene parameters and the action positions of the beauty instrument.
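The calibration steps above can be sketched as a simple database-building routine. The tuple-based parameter keys, sample format, and function name here are illustrative assumptions (parameter extraction from the calibration image itself is stubbed out):

```python
def build_position_db(calibration_samples):
    """Build the scene-parameter -> action-position database.

    calibration_samples: iterable of (scene_params, action_position) pairs,
    where scene_params is whatever hashable record the extraction step
    produced from a calibration image, and action_position is the label
    set by the operator for that scenario.
    """
    db = {}
    for params, position in calibration_samples:
        db[params] = position  # later samples overwrite earlier duplicates
    return db

# Toy calibration run: one labelled sample per use scenario.
samples = [
    (("cheek-pose",), "cheek"),
    (("forehead-pose",), "forehead"),
]
db = build_position_db(samples)
print(db[("cheek-pose",)])  # cheek
```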
Referring to fig. 2, fig. 2 is a flow chart of a method for outputting energy of a cosmetic instrument according to some embodiments of the application.
As shown in fig. 2, in some embodiments, the energy output method of the beauty instrument includes:
s201: the position of the cosmetic instrument action in the current use scenario is received.
S202: and determining the working state and/or the energy output gear of the beauty instrument based on the action position of the beauty instrument under the current use scene.
The working states of the beauty instrument comprise a standby state and an energy output state.
S203: and when the working state of the beauty instrument is an energy output state, performing energy output according to the determined energy output gear.
According to the application, the working state and/or energy output gear of the beauty instrument can be adjusted directly according to the received action position in the current use scenario. The working state and/or gear are thus adjusted intelligently, the user no longer needs to memorize and manually set the optimal gear for each skin area, and the user's experience is improved.
For example: when the action position of the beauty instrument in the current use scenario is unknown (for example, no beauty instrument is present in the scene), the working state can be determined to be the standby state. When the action position is the cheek, the working state can be determined to be the energy output state with the first gear; when it is the forehead, the energy output state with the second gear; and when it is the perioral region, the energy output state with the third gear. The gears are ordered by output intensity: first gear > second gear > third gear. For example, the first gear outputs 18.5-25.2 J (joules), the second gear 9.2-18.4 J, and the third gear 0.1-9.1 J.
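The position-to-gear example above can be sketched as a lookup table. The joule ranges in the comments restate the text; the labels, table name, and function name are assumptions for illustration:

```python
# Illustrative mapping from action position to (working_state, gear).
GEAR_TABLE = {
    "cheek":    ("output", 1),  # first gear: 18.5-25.2 J
    "forehead": ("output", 2),  # second gear: 9.2-18.4 J
    "perioral": ("output", 3),  # third gear: 0.1-9.1 J
}

def determine_output(position):
    """Return (working_state, gear) for an action position.

    An unknown or absent position puts the instrument into standby with
    no gear selected.
    """
    return GEAR_TABLE.get(position, ("standby", None))

print(determine_output("forehead"))  # ('output', 2)
print(determine_output(None))        # ('standby', None)
```

The standby fallback mirrors the case above where no beauty instrument is detected in the scene.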
In some embodiments, each time the action position of the beauty instrument in the current use scenario is received, the working state and/or energy output gear is determined anew based on that position. For example, if the action position is received every 0.2 s, the working state and/or energy output gear is re-determined every 0.2 s.
In some embodiments, the determining the working state and/or the energy output gear of the cosmetic instrument based on the position of the cosmetic instrument in the current usage scenario includes:
and determining the working state and/or the energy output gear of the beauty instrument based on the relation between the working state and/or the energy output gear of the beauty instrument and the working position of the beauty instrument under the current use scene.
Such as: the relation between the position of the action of the beauty instrument and the working state and/or the energy output gear of the beauty instrument comprises: when the action position of the cosmetic instrument is cheek, the working state of the cosmetic instrument is an energy output state, and the energy output gear is a first gear. If the active position of the beauty instrument in the current use scene is the cheek, the working state of the beauty instrument can be determined to be an energy output state, and the energy output gear is the first gear.
Because the skin positions are different, the best energy stimulus born by the skin is different, the corresponding energy output gear is set for different skin positions, and the use effect of the beauty instrument can be improved.
In other embodiments, the determining the working state and/or the energy output gear of the cosmetic instrument based on the position of the cosmetic instrument in the current use scenario includes:
determining the current skin level according to the position acted by the beauty instrument under the current use scene and the corresponding relation between the position acted by the beauty instrument and the skin level; wherein the skin level is divided based on any one or more of skin resistance information, fat content, and the number of nerve tissues.
And determining the working state and/or the energy output gear of the beauty instrument based on the relation between the skin level and the working state and/or the energy output gear of the beauty instrument and the current skin level.
For example, dividing by fat content: the lower-middle part of the cheek has a thick fat layer; the forehead, temples, and upper cheek a medium one; and the perioral and periocular regions a thin one. A thick fat layer corresponds to the first gear, a medium one to the second gear, and a thin one to the third gear. When the action position in the current use scenario is the periocular region, the correspondence between action position and skin level gives a thin skin level, so, based on the relation between skin level and working state and/or energy output gear, the working state can be determined to be the energy output state and the gear the third gear.
Since skin resistance information, fat content, and the amount of nerve tissue differ across the skin, grouping skin regions by any one or more of these factors reduces the number of energy output gear settings that need to be defined.
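A sketch of this two-step lookup via skin level, using the fat-content example above; the region labels, table names, and function name are illustrative assumptions:

```python
# Step 1: action position -> skin level (here, fat-content class).
FAT_LEVEL = {
    "lower_cheek": "thick",
    "forehead": "medium", "temple": "medium", "upper_cheek": "medium",
    "perioral": "thin", "periocular": "thin",
}

# Step 2: skin level -> energy output gear.
LEVEL_TO_GEAR = {"thick": 1, "medium": 2, "thin": 3}

def gear_from_skin_level(position):
    """Map the action position to an energy output gear via the skin level.

    Returns None when the position is unknown, leaving the gear unselected.
    """
    level = FAT_LEVEL.get(position)
    return LEVEL_TO_GEAR.get(level)

print(gear_from_skin_level("periocular"))  # 3
print(gear_from_skin_level("forehead"))    # 2
```

Because many positions share a skin level, only one gear per level needs to be defined, which is the reduction in gear settings the text describes.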
Referring to fig. 3, fig. 3 is a schematic block diagram of a detection device according to some embodiments of the application.
As shown in fig. 3, the detection apparatus 1 includes: the first acquisition module 10, the second acquisition module 11 and the first determination module 12. The first acquisition module 10 is configured to acquire a target image. The second acquisition module 11 acquires scene parameters in the target image based on the target image; wherein the scene parameters include at least one of the presence of a beauty instrument, an environmental parameter, a degree to which the beauty instrument is shielded, an inclination angle of the beauty instrument, a posture of a user holding the beauty instrument, and a distance between the beauty instrument and a beauty area of the user. The first determining module 12 matches scene parameters in the target image with a database to determine the location of the cosmetic instrument action; wherein, the database stores the corresponding relation between the scene parameter and the action position of the beauty instrument.
In the application, the detection device 1 can acquire a target image, extract the scene parameters in it, and match those parameters against a database to determine the action position of the beauty instrument. The action position is thus determined intelligently, which is more scientific and objective than determination by the user's naked eye.
The detection device 1 may be, but is not limited to, a cosmetic mirror, a smart phone, an iPad, a computer, a camera, etc.
Further, the detection device 1 also has a communication function: the acquired target image and/or the determined action position of the beauty instrument can be sent to the beauty instrument, where the detection device 1 can communicate via any one of a radio-frequency antenna, Wi-Fi, and Bluetooth.
Referring to fig. 4, fig. 4 is a schematic block diagram of a cosmetic apparatus according to some embodiments of the present application.
As shown in fig. 4, in some embodiments, the cosmetic device 2 includes: a receiving module 20, a second determining module 21 and an energy output module 22. The receiving module 20 is configured to receive the target image acquired by the detecting device and determine, according to the target image, a position where the beauty instrument 2 acts in the current use scenario, or receive the position where the beauty instrument 2 acts in the current use scenario sent by the detecting device. The second determining module 21 determines an operating state and/or an energy output gear of the cosmetic instrument 2 based on the position of the cosmetic instrument in the current use scenario, wherein the operating state of the cosmetic instrument 2 includes a standby state and an energy output state. The energy output module 22 is configured to output energy according to the determined energy output gear when the working state of the cosmetic apparatus 2 is an energy output state.
According to the application, the beauty instrument 2 can adjust its working state and/or energy output gear directly according to the received action position of the beauty instrument 2 in the current use scenario. The working state and/or gear are thus adjusted intelligently, the user no longer needs to memorize and manually set the optimal gear for each skin area, and the user's experience is improved.
The receiving module 20 may receive the target image, or the action position of the beauty instrument, sent by the detection device through any one of a radio-frequency antenna, Wi-Fi, and Bluetooth.
In other embodiments, the beauty instrument 2 may itself derive the action position in the current use scenario from the received target image and then adjust its working state and/or energy output gear. That is, the detection device 1 only needs to acquire the target image and send it to the beauty instrument 2, and the beauty instrument 2 processes the received image to obtain the action position in the current use scenario.
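The on-instrument variant described above can be sketched end to end; every callable and table here is an assumed interface passed in for illustration, not the patent's actual modules:

```python
def on_image_received(image, extract_params, position_db, gear_table):
    """End-to-end sketch of the on-instrument variant: the beauty instrument
    receives a raw target image, extracts scene parameters itself, matches
    them against the database, and returns (working_state, gear).
    """
    params = extract_params(image)        # scene-parameter extraction
    position = position_db.get(params)    # database matching -> action position
    # Unknown positions fall back to standby with no gear selected.
    return gear_table.get(position, ("standby", None))

# Toy wiring: stubbed extractor, one-entry database and gear table.
state, gear = on_image_received(
    "raw-frame",
    extract_params=lambda img: "cheek-pose",
    position_db={"cheek-pose": "cheek"},
    gear_table={"cheek": ("output", 1)},
)
print(state, gear)  # output 1
```

Passing the extractor and tables in as parameters keeps the same routine usable whether the matching runs on the detection device or on the instrument.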
Referring to fig. 5, fig. 5 is a schematic block diagram of a beauty treatment detection system according to some embodiments of the application.
As shown in fig. 5, in some embodiments, the beauty treatment detection system 3 includes the beauty treatment instrument 2 and the detection device 1 provided in the foregoing embodiments.
Some embodiments of the present application further provide a computer-readable storage medium storing a computer program which, when invoked and executed by a processor, implements the method for detecting the action position of the beauty instrument and/or the energy output method of the beauty instrument provided by the foregoing embodiments.
Those of ordinary skill in the art will appreciate that all or a portion of the steps in the various methods of the above embodiments may be implemented by a program that instructs associated hardware, and the program may be stored in a computer readable memory, which may include: flash disk, read-only memory, random access memory, magnetic or optical disk, etc.
The foregoing is a description of embodiments of the present application, and it should be noted that, for those skilled in the art, modifications and variations can be made without departing from the principles of the embodiments of the present application, and such modifications and variations are also considered to be within the scope of the present application.

Claims (12)

1. A method for detecting the position of action of a cosmetic instrument, the method comprising:
acquiring a target image;
acquiring scene parameters in the target image based on the target image; wherein the scene parameters comprise at least one of the existence of a beauty instrument, an environmental parameter, the degree to which the beauty instrument is shielded, the inclination angle of the beauty instrument, the gesture of a user holding the beauty instrument, and the distance between the beauty instrument and a beauty area of the user;
matching scene parameters in the target image with a database to determine the position of action of the beauty instrument; wherein, the database stores the corresponding relation between the scene parameter and the action position of the beauty instrument.
2. The method for detecting a position of action of a cosmetic instrument according to claim 1, wherein the acquiring the target image includes:
the target image is acquired based on a video data stream.
3. The method of claim 2, wherein the capturing the target image based on the video data stream comprises:
acquiring a frame of image based on the video data stream and judging whether the current image meets the requirement;
if the requirements are met, determining the target image;
if the requirements are not met, another frame of image is acquired again and whether the requirements are met is judged.
4. The method for detecting a position of action of a cosmetic instrument according to claim 1, wherein the acquiring the target image includes:
periodically acquiring the target image; or
and acquiring the target image according to a preset time interval.
5. The method for detecting a position of action of a cosmetic instrument according to claim 1, wherein before the acquisition of the target image, the method further comprises:
obtaining calibration images of the beauty instrument under various use scenes;
acquiring scene parameters in the calibration image based on the calibration image;
setting the action position of the beauty instrument under the current scene parameters so as to establish a database of the corresponding relation between the scene parameters and the action position of the beauty instrument.
6. A beauty instrument energy output method, the method comprising:
receiving an action position of the beauty instrument in a current use scene;
determining a working state and/or an energy output gear of the beauty instrument based on the action position of the beauty instrument in the current use scene, wherein the working state of the beauty instrument comprises a standby state and an energy output state;
when the working state of the beauty instrument is the energy output state, performing energy output according to the determined energy output gear.
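The control logic of claim 6 can be sketched as a lookup table from action position to working state and gear. The positions, gear values, and return strings below are illustrative assumptions, not specified by the patent.

```python
# Sketch of claim 6: map action position to working state and energy gear.
# Table contents are illustrative; the patent does not fix these values.
STATE_TABLE = {
    None: ("standby", 0),          # instrument not on a treatment area
    "cheek": ("output", 2),
    "forehead": ("output", 1),     # e.g. thinner skin -> lower gear
}

def drive_output(action_position):
    """Return the control decision for the given action position."""
    state, gear = STATE_TABLE.get(action_position, ("standby", 0))
    if state == "output":
        return f"emit at gear {gear}"   # energy output per determined gear
    return "hold"                       # standby state: no energy output
```

An unrecognized position falls back to standby, which is the safe default for an energy-emitting device.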
7. The beauty instrument energy output method according to claim 6, wherein the determining the working state and/or the energy output gear of the beauty instrument based on the action position of the beauty instrument in the current use scene comprises:
determining the working state and/or the energy output gear of the beauty instrument based on the action position in the current use scene and a correspondence between action positions and the working state and/or the energy output gear of the beauty instrument.
8. The beauty instrument energy output method according to claim 6, wherein the determining the working state and/or the energy output gear of the beauty instrument based on the action position of the beauty instrument in the current use scene comprises:
determining a current skin level according to the action position of the beauty instrument in the current use scene and a correspondence between action positions of the beauty instrument and skin levels; wherein the skin levels are divided based on any one or more of skin resistance information, fat content, and amount of nerve tissue;
determining the working state and/or the energy output gear of the beauty instrument based on the current skin level and a correspondence between skin levels and the working state and/or the energy output gear of the beauty instrument.
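Claim 8 refines claim 6 into a two-step lookup: position → skin level → (state, gear). The level names and tables below are hypothetical buckets, standing in for levels derived from resistance, fat content, or nerve-tissue amount as the claim allows.

```python
# Sketch of claim 8: two-step lookup, position -> skin level -> (state, gear).
# Level names and table contents are hypothetical, not from the patent.
POSITION_TO_LEVEL = {"forehead": "thin", "cheek": "medium", "jaw": "thick"}
LEVEL_TO_OUTPUT = {
    "thin": ("output", 1),     # delicate area: lowest gear
    "medium": ("output", 2),
    "thick": ("output", 3),    # more fat/tissue tolerates a higher gear
}

def output_for_position(position):
    """Resolve a position to a (working_state, gear) decision via skin level."""
    level = POSITION_TO_LEVEL.get(position)
    if level is None:
        return ("standby", 0)  # unrecognized area: do not emit energy
    return LEVEL_TO_OUTPUT[level]
```

Separating the two correspondences means the position map can be recalibrated without touching the safety-relevant level-to-gear table.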
9. A detection device, characterized in that the detection device comprises:
the first acquisition module is used for acquiring a target image;
the second acquisition module is used for acquiring scene parameters in the target image based on the target image; wherein the scene parameters comprise at least one of: whether a beauty instrument is present, an environmental parameter, a degree to which the beauty instrument is occluded, an inclination angle of the beauty instrument, a posture of a user holding the beauty instrument, and a distance between the beauty instrument and a beauty area of the user;
the first determining module is used for matching the scene parameters in the target image against a database so as to determine an action position of the beauty instrument; wherein the database stores correspondences between scene parameters and action positions of the beauty instrument.
10. A beauty instrument, comprising:
the receiving module is used for receiving a target image acquired by a detection device and determining an action position of the beauty instrument in a current use scene according to the target image, or receiving the action position of the beauty instrument in the current use scene sent by the detection device;
the second determining module is used for determining a working state and/or an energy output gear of the beauty instrument based on the action position of the beauty instrument in the current use scene, wherein the working state of the beauty instrument comprises a standby state and an energy output state;
the energy output module is used for performing energy output according to the determined energy output gear when the working state of the beauty instrument is the energy output state.
11. A beauty detection system, comprising a detection device and a beauty instrument, wherein the detection device is the detection device according to claim 9, and the beauty instrument is the beauty instrument according to claim 10.
12. A computer-readable storage medium, wherein a computer program is stored in the computer-readable storage medium, and the computer program, when called and executed by a processor, implements the method for detecting an action position of a beauty instrument according to any one of claims 1 to 5 and/or the beauty instrument energy output method according to any one of claims 6 to 8.
CN202310568109.4A 2023-05-18 2023-05-18 Method and device for detecting action position of beauty instrument and energy output and beauty instrument Pending CN116747431A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310568109.4A CN116747431A (en) 2023-05-18 2023-05-18 Method and device for detecting action position of beauty instrument and energy output and beauty instrument

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310568109.4A CN116747431A (en) 2023-05-18 2023-05-18 Method and device for detecting action position of beauty instrument and energy output and beauty instrument

Publications (1)

Publication Number Publication Date
CN116747431A true CN116747431A (en) 2023-09-15

Family

ID=87957982

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310568109.4A Pending CN116747431A (en) 2023-05-18 2023-05-18 Method and device for detecting action position of beauty instrument and energy output and beauty instrument

Country Status (1)

Country Link
CN (1) CN116747431A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112546438A (en) * 2020-11-10 2021-03-26 添可智能科技有限公司 Beauty instrument control and working method and equipment
CN113397480A (en) * 2021-05-10 2021-09-17 深圳数联天下智能科技有限公司 Control method, device and equipment of beauty instrument and storage medium
CN218220818U (en) * 2022-07-22 2023-01-06 彦霖实验室(深圳)有限公司 Beauty instrument intelligent positioning system and auxiliary positioning device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117138228A (en) * 2023-11-01 2023-12-01 深圳市宗匠科技有限公司 Control method of beauty instrument, beauty instrument and computer readable storage medium
CN117138228B (en) * 2023-11-01 2024-02-13 深圳市宗匠科技有限公司 Control method of beauty instrument, beauty instrument and computer readable storage medium

Similar Documents

Publication Publication Date Title
RU2762142C1 (en) Method and apparatus for determining the key point of the face, computer apparatus, and data storage
CN107886484B (en) Beautifying method, beautifying device, computer-readable storage medium and electronic equipment
CN107833197B (en) Image processing method and device, computer readable storage medium and electronic equipment
US8957943B2 (en) Gaze direction adjustment for video calls and meetings
WO2019120029A1 (en) Intelligent screen brightness adjustment method and apparatus, and storage medium and mobile terminal
EP3664016B1 (en) Image detection method and apparatus, and terminal
US20160352996A1 (en) Terminal, image processing method, and image acquisition method
KR102548317B1 (en) Dye detection method and electronic device
CN107820017B (en) Image shooting method and device, computer readable storage medium and electronic equipment
CN111510630A (en) Image processing method, device and storage medium
CN111445413B (en) Image processing method, device, electronic equipment and storage medium
CN116747431A (en) Method and device for detecting action position of beauty instrument and energy output and beauty instrument
CN111880640B (en) Screen control method and device, electronic equipment and storage medium
CN110677592B (en) Subject focusing method and device, computer equipment and storage medium
CN107424117B (en) Image beautifying method and device, computer readable storage medium and computer equipment
CN111353368A (en) Pan-tilt camera, face feature processing method and device and electronic equipment
CN113850726A (en) Image transformation method and device
CN113572956A (en) Focusing method and related equipment
CN105279498B (en) A kind of eyeball recognition methods, device and terminal
WO2022184084A1 (en) Skin test method and electronic device
CN115830668A (en) User authentication method and device based on facial recognition, computing equipment and medium
US10009545B2 (en) Image processing apparatus and method of operating the same
CN110519526B (en) Exposure time control method and device, storage medium and electronic equipment
KR101507410B1 (en) Live make-up photograpy method and apparatus of mobile terminal
US20170163852A1 (en) Method and electronic device for dynamically adjusting gamma parameter

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination