CN117574502A - Method, device, equipment and medium for determining building elements based on emotion data - Google Patents

Info

Publication number
CN117574502A
CN117574502A
Authority
CN
China
Prior art keywords
emotion
building
image information
initial
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311550935.2A
Other languages
Chinese (zh)
Inventor
张静
任洪国
刘玉晨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hebei University of Engineering
Original Assignee
Hebei University of Engineering
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hebei University of Engineering filed Critical Hebei University of Engineering
Priority to CN202311550935.2A priority Critical patent/CN117574502A/en
Publication of CN117574502A publication Critical patent/CN117574502A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 30/00 Computer-aided design [CAD]
    • G06F 30/10 Geometric CAD
    • G06F 30/13 Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q 10/063 Operations research, analysis or management
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/174 Facial expression recognition

Abstract

The disclosure provides a method, device, equipment and medium for determining building elements based on emotion data, comprising the following steps: acquiring building element image information and expression image information corresponding to the building element image information; inputting the building element image information and the expression image information into a trained building emotion assessment model, and outputting, through the model, the emotion data corresponding to them; storing the emotion data and the building element image information correspondingly in a database; and acquiring target emotion data, searching the database for the target building element corresponding to the target emotion data, and outputting the target building element. With the method and device, during building design the target building elements corresponding to the emotion data in the database are retrieved according to the user's preferences, so that the designed building better meets the user's requirements and later modification during construction is avoided.

Description

Method, device, equipment and medium for determining building elements based on emotion data
Technical Field
The disclosure relates to the field of architectural design, and in particular relates to a method, a device, equipment and a medium for determining architectural elements based on emotion data.
Background
Traditional architectural design focuses mainly on the function and intended use of a space and relies chiefly on the architect's experience and knowledge, so a user can evaluate a given design only through imagination. Because the user cannot directly experience different design schemes and give corresponding evaluations, the architect cannot accurately learn the user's preferences when designing.
Meanwhile, building design and construction consume a great deal of time and money, and a finished building is difficult to modify in practice; the various building elements should therefore be settled during design so that modification after construction is avoided.
Disclosure of Invention
In view of the above, an object of the present disclosure is to provide a method, a device, an apparatus and a medium for determining building elements based on emotion data, which are used for solving or partially solving the above problems.
With the above object in view, a first aspect of the present disclosure provides a building element determining method based on emotion data, the method including:
acquiring building element image information and expression image information corresponding to the building element image information, wherein the building element image information contains building elements, i.e., factors that embody building features;
inputting the building element image information and the expression image information into a trained building emotion assessment model, and outputting emotion data corresponding to the building element image information and the expression image information through the building emotion assessment model, wherein the building emotion assessment model is a model which is obtained by training an initial assessment model and outputs emotion data corresponding to different building elements;
storing the emotion data and the building element image information in a database correspondingly;
and acquiring target emotion data, searching a target building element corresponding to the target emotion data from the database according to the target emotion data, and outputting the target building element.
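Illustratively, the four steps above can be sketched end to end as follows; all names are illustrative stand-ins, not taken from the patent, and `model` represents any callable standing in for the trained building emotion assessment model:

```python
def determine_building_elements(element_images, expression_images,
                                target_emotion, model, database):
    """Sketch of the claimed method: evaluate, store, then search by emotion."""
    # Steps 1-2: run each building element image and its paired
    # expression image through the trained assessment model.
    for element_img, expression_img in zip(element_images, expression_images):
        emotion = model(element_img, expression_img)
        # Step 3: store the emotion data with the element image information.
        database.setdefault(emotion, []).append(element_img)
    # Step 4: search the database for elements matching the target emotion data.
    return database.get(target_emotion, [])
```

Any mapping from an image pair to an emotion label fits this sketch; the real model is the trained network described in the embodiments below.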
Based on the same inventive concept, a second aspect of the present disclosure proposes a building element determining device based on emotion data, including:
an information acquisition module configured to acquire building element image information and expression image information corresponding to the building element image information, wherein the building element image information contains building elements, i.e., factors that embody building features;
The emotion data determining module is configured to input the building element image information and the expression image information into a trained building emotion assessment model, and output emotion data corresponding to the building element image information and the expression image information through the building emotion assessment model, wherein the building emotion assessment model is a model which is obtained by training an initial assessment model and outputs emotion data corresponding to different building elements;
the data storage module is configured to store the emotion data and the building element image information into a database correspondingly;
the target building element determining module is configured to acquire target emotion data, search target building elements corresponding to the target emotion data from the database according to the target emotion data, and output the target building elements.
Based on the same inventive concept, a third aspect of the present disclosure proposes an electronic device comprising a memory, a processor and a computer program stored on the memory and executable by the processor, the processor implementing the method for determining architectural elements based on emotion data as described above when executing the computer program.
Based on the same inventive concept, a fourth aspect of the present disclosure proposes a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the emotion data-based building element determination method as described above.
As can be seen from the foregoing, the present disclosure provides a method, device, equipment and medium for determining building elements based on emotion data, where building elements are factors that embody building features. Building element image information and the user's expression image information corresponding to it are acquired and input together into a trained building emotion assessment model to obtain the corresponding emotion data. The emotion data, determined for each building element through the building emotion assessment model, represents the user's degree of preference for that building element. The building element image information and its emotion data are stored correspondingly in a database so that building elements can later be selected according to emotion data. Target emotion data is then acquired, the target building element corresponding to it is searched from the database, and the target building element is output. Because the user's emotion data for each building element is stored, during building design the target building elements corresponding to the emotion data in the database can be retrieved according to the user's preferences, so that the designed building better meets the user's requirements and later modification during construction is avoided.
Drawings
In order to more clearly illustrate the technical solutions of the present disclosure or of the related art, the drawings required by the embodiments or by the related-art description are briefly introduced below. The drawings in the following description are merely embodiments of the present disclosure; other drawings may be obtained from them by those of ordinary skill in the art without inventive effort.
FIG. 1 is a flow chart of a method of determining architectural elements based on affective data in accordance with an embodiment of the present disclosure;
FIG. 2 is a block diagram of a construction element determination device based on emotion data according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure.
Detailed Description
For the purposes of promoting an understanding of the principles and advantages of the disclosure, reference will now be made to the embodiments illustrated in the drawings and specific language will be used to describe the same.
It should be noted that, unless otherwise defined, technical or scientific terms used in the embodiments of the present disclosure have the ordinary meaning understood by one of ordinary skill in the art to which the present disclosure pertains. The terms "first," "second," and the like used in the embodiments of the present disclosure do not denote any order, quantity, or importance, but merely distinguish one element from another. The word "comprising," "comprises," or the like means that the element or item preceding the word encompasses the elements or items listed after the word and their equivalents, without excluding other elements or items. The terms "connected" and the like are not limited to physical or mechanical connections and may include electrical connections, whether direct or indirect. "Upper," "lower," "left," "right," etc. merely indicate relative positional relationships, which may change when the absolute position of the described object changes.
The terms referred to in this disclosure are explained as follows:
MobileNet model: a lightweight neural-network model proposed by Google in 2017 for mobile phones and embedded devices.
Based on the above description, the present embodiment proposes a building element determining method based on emotion data, as shown in fig. 1, where the method includes:
step 101, acquiring building element image information and expression image information corresponding to the building element image information, wherein the building element image information comprises building elements which are factors comprising building features.
In a specific implementation, the building element image information contains a building element, i.e., a factor that embodies a building feature. In this embodiment the building element includes at least one of the following: material, color, dimension, light, design, circulation line, and internal structure. The expression image information corresponding to the building element image information is the image information of the expression the user displays when viewing the building element image information. The means of acquiring the expression image information includes at least one of the following: a camera, an image capturing device, a video capturing device, etc.
For example, a user may experience building elements with different parameters through real space or a virtual reality space. Taking the color of a building element as an example, the user may experience the same building painted yellow or painted blue, respectively, by viewing virtual reality content or a real video. The user's face image is captured by a fixed camera that is positioned opposite and facing the user's face, in real space or in the virtual reality space, so as to capture the front of the user's face.
Step 102, inputting the building element image information and the expression image information into a trained building emotion estimation model, and outputting emotion data corresponding to the building element image information and the expression image information through the building emotion estimation model, wherein the building emotion estimation model is a model which is obtained by training an initial estimation model and outputs emotion data corresponding to different building elements.
In a specific implementation, the initial evaluation model is trained in advance to obtain a building emotion evaluation model that outputs emotion data corresponding to different building elements. In this embodiment, the initial evaluation model adopts a MobileNet model, which is lightweight and therefore easier to deploy.
The building element image information and the expression image information are input into the pre-trained building emotion assessment model, which processes them and outputs the emotion data corresponding to them; the emotion data represents the user's degree of preference for the building element in the building element image information. The form of the emotion data includes at least one of the following: basic emotion data or a two-dimensional emotion value. The basic emotion data includes at least one of the following: data corresponding to happiness, surprise, fear, disgust, neutrality, sadness, or anger. The two-dimensional emotion value lies on at least one of two axes, positive-negative or arousal-relaxation, and takes the form of a pair of percentages.
Illustratively, when the two-dimensional emotion value is on the positive-negative axis, positive corresponds to a value of a% and negative to b%, with a ∈ [0,100], b ∈ [0,100], and a + b = 100.
For another example, when the two-dimensional emotion value is on the arousal-relaxation axis, arousal corresponds to a value of c% and relaxation to d%, with c ∈ [0,100], d ∈ [0,100], and c + d = 100.
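Illustratively, complementary percentages summing to 100 can be derived from two raw scores as follows; the function name and rounding behavior are assumptions for illustration, not taken from the patent:

```python
def to_percent_pair(score_a: float, score_b: float) -> tuple:
    """Convert two non-negative raw scores (e.g. positive vs. negative,
    or arousal vs. relaxation) into percentages a and b with a + b = 100."""
    total = score_a + score_b
    if total <= 0:
        raise ValueError("at least one score must be positive")
    a = round(100 * score_a / total, 1)
    return a, round(100 - a, 1)

# A positive-negative reading derived from raw scores 3 and 7:
print(to_percent_pair(3, 7))  # (30.0, 70.0)
```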
Illustratively, suppose the emotion data takes the form of basic emotion data and the building element in the acquired building element image information is color, specifically yellow; that is, a building image whose color is yellow and the user's expression image for that building image are acquired. The building element image information and the expression image information are input into the trained building emotion assessment model, which processes them and outputs the basic emotion data: happiness.
As another example, suppose the emotion data takes the form of a two-dimensional emotion value on the positive-negative axis and the building element in the acquired building element image information is color, specifically blue; that is, a building image whose color is blue and the user's expression image for that building image are acquired. The building element image information and the expression image information are input into the trained building emotion assessment model, which processes them and outputs the two-dimensional emotion value: 30% positive, 70% negative.
And step 103, storing the emotion data and the building element image information in a database correspondingly.
In a specific implementation, the emotion data and the building element image information are stored correspondingly in a database: each building element is stored with its emotion data as a label, so that the corresponding building element can be retrieved according to the user's preference.
Illustratively, the emotion data takes the form of basic emotion data and the building elements are color, light and material: the colors are yellow, blue and green; the light is bright or dim; the materials are steel, wood and aluminium alloy. The corresponding emotion data are: yellow is happy, blue is sad, green is neutral; bright light is happy, dim light is fear; steel is anger, wood is surprise, aluminium alloy is neutral. When stored, the three color variants of the building element are stored together (yellow/happy, blue/sad, green/neutral), the two light variants are stored together (bright/happy, dim/fear), and the three material variants are stored together (steel/anger, wood/surprise, aluminium alloy/neutral).
Step 104, obtaining target emotion data, searching a target building element corresponding to the target emotion data from the database according to the target emotion data, and outputting the target building element.
In a specific implementation, target emotion data is acquired, the target building element corresponding to the target emotion data is searched from the database according to the target emotion data, and the target building element is output.
Illustratively, when a user wants a cheerful style for a building design, the database is searched for the building elements whose emotion data is happiness, and these are taken as the target building elements. Based on the above example, searching the database yields yellow color and bright light as the building elements whose emotion data correspond to happiness, so yellow color and bright light are taken as the target building elements.
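Illustratively, the storage and lookup in steps 103 and 104 can be sketched with an in-memory table; the schema, table name, and emotion labels below mirror the yellow/happy examples of this embodiment but are otherwise assumptions for illustration:

```python
import sqlite3

# In-memory table pairing each building-element value with its emotion label.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE elements (category TEXT, value TEXT, emotion TEXT)")
rows = [
    ("color", "yellow", "happy"), ("color", "blue", "sad"),
    ("color", "green", "neutral"),
    ("light", "bright", "happy"), ("light", "dim", "fear"),
    ("material", "steel", "anger"), ("material", "wood", "surprise"),
    ("material", "aluminium alloy", "neutral"),
]
conn.executemany("INSERT INTO elements VALUES (?, ?, ?)", rows)

def find_target_elements(target_emotion: str) -> list:
    """Step 104: return every building element whose stored emotion
    data matches the target emotion data."""
    cur = conn.execute(
        "SELECT category, value FROM elements WHERE emotion = ?",
        (target_emotion,))
    return cur.fetchall()

print(sorted(find_target_elements("happy")))  # [('color', 'yellow'), ('light', 'bright')]
```

Grouping by category, as described above, falls out of the `category` column; a production system would index the `emotion` column for fast lookup.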
Through the above scheme, building element image information and the user's expression image information corresponding to it are acquired and input together into the trained building emotion assessment model to obtain the corresponding emotion data. The emotion data, determined for each building element through the building emotion assessment model, represents the user's degree of preference for that building element. The building element image information and its emotion data are stored correspondingly in a database so that building elements can later be selected according to emotion data. Target emotion data is then acquired, the target building element corresponding to it is searched from the database, and the target building element is output. Because the user's emotion data for each building element is stored, during building design the target building elements corresponding to the emotion data in the database can be retrieved according to the user's preferences, so that the designed building better meets the user's requirements and later modification during construction is avoided.
In some embodiments, step 102 specifically includes:
Step 1021, identifying the expression image information to obtain facial feature point information in the expression image information.
Step 1022 of inputting the building element image information and the facial feature point information into a trained building emotion estimation model, and outputting emotion data corresponding to the building element image information and the facial feature point information via the building emotion estimation model.
In a specific implementation, facial feature point detection is performed on the expression image information, and the facial feature point information of the expression image information is obtained through recognition. Facial feature point detection refers to locating, given a face image, the key points of the face, including points in the eyebrow, eye, nose, mouth and facial contour regions.
And inputting the building element image information and the facial feature point information into a trained building emotion estimation model, and outputting emotion data corresponding to the building element image information and the facial feature point information through the building emotion estimation model.
In some embodiments, the training process of the building emotion estimation model specifically includes:
Step 10A, obtaining initial building element image information and initial emotion information corresponding to the initial building element image information.
And step 10B, building emotion data sets are built according to the initial building element image information and the initial emotion information.
And step 10C, dividing the building emotion data set into a training data set and a test data set.
And 10D, training the initial emotion estimation model by using the data in the training data set to obtain a building emotion estimation model.
In a specific implementation, initial building element image information and the initial emotion information corresponding to it are acquired, and a building emotion data set is established based on them. The building emotion data set is divided into a training data set and a test data set; the training data set is used to train the initial evaluation model, and the test data set is used to test the trained model to determine whether training is complete. The initial emotion estimation model is trained with the data in the training data set to obtain the building emotion estimation model.
Illustratively, the data in the building emotion data set is randomly shuffled, 80% is selected as the training data set and 20% as the test data set.
As another example, the data in the building emotion data set is randomly shuffled, with 70% selected as the training data set and 30% as the test data set.
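The shuffle-and-split step can be sketched as follows; the function name, seed, and default fraction are assumptions for illustration, not taken from the patent:

```python
import random

def split_dataset(samples: list, train_fraction: float = 0.8, seed: int = 42):
    """Randomly shuffle the building emotion data set and split it into
    training and test subsets (80/20 by default; 70/30 is also common)."""
    shuffled = samples[:]                     # leave the caller's list intact
    random.Random(seed).shuffle(shuffled)     # deterministic for a fixed seed
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]

data = [f"sample_{i}" for i in range(100)]
train, test = split_dataset(data)
print(len(train), len(test))  # 80 20
```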
In some embodiments, step 10A specifically includes:
step 10A1, initial building element image information is acquired.
Step 10A2, displaying the initial building element image information, obtaining initial expression image information and generating emotion prompt information, wherein the emotion prompt information is used for prompting to input emotion data corresponding to the initial building element image information.
In a specific implementation, the initial building element image information is acquired and displayed, the user's initial expression image information for it is acquired, and emotion prompt information is generated to prompt the user to input emotion data corresponding to the initial building element image information. The prompt mode of the emotion prompt information includes at least one of the following: a voice prompt, a virtual reality prompt, a text prompt, a picture prompt, a video prompt, or a rich-text prompt.
Illustratively, suppose the prompt mode is a voice prompt and the building element contained in the initial building element image information is the color yellow. A building picture whose color is yellow is displayed to the user, the emotion prompt information is generated, and at the same time the user's expression image information while viewing the picture is acquired. The emotion prompt information, such as "please input your emotion data for this picture," is sent to the user.
Step 10A3, receiving feedback information corresponding to the emotion prompt information input by a user, and obtaining initial emotion information corresponding to the feedback information.
In a specific implementation, the user inputs feedback information corresponding to the emotion prompt information when facing the initial building element image information, and the initial emotion information corresponding to the feedback information is obtained.
For example, after the emotion prompt information is displayed, the user may also be shown options for basic emotion data or for two-dimensional emotion data, so that the user can select the emotion data corresponding to the initial building element image information.
For another example, for two-dimensional emotion data, after the emotion prompt information is displayed, two slider bars per axis may be shown to the user. For the positive-negative axis, one slider is for positive emotion data and one for negative emotion data. If, facing the color blue, the user drags the positive slider to 3 and the negative slider to 7, the emotion data for blue is positive 3 and negative 7; since negative dominates, the user's corresponding initial emotion information for blue is negative. Likewise, for the arousal-relaxation axis, one slider is for arousal emotion data and one for relaxation emotion data; if, facing the color blue, the user drags the arousal slider to 2 and the relaxation slider to 8, the emotion data for blue is arousal 2 and relaxation 8.
Alternatively, for two-dimensional emotion data, a single slider bar per axis may be shown, whose two ends represent the two poles of that axis, such as positive-negative or arousal-relaxation. For the first slider, the leftmost end represents positive, the rightmost end represents negative, and the midpoint represents 50% of each; for the second slider, the leftmost end represents arousal, the rightmost end represents relaxation, and the midpoint represents 50% of each. By sliding, the user determines the positive and negative emotion data and the arousal and relaxation emotion data. If the user's slider positions determine that, for the building element color blue, the positive emotion data is 30%, the negative 70%, the arousal 20% and the relaxation 80%, then the emotion data corresponding to blue is negative and relaxed.
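The mapping from slider readings to an initial emotion label can be sketched as follows; the function and label names are assumptions for illustration, not taken from the patent:

```python
def label_from_sliders(positive: float, negative: float,
                       arousal: float, relaxation: float) -> tuple:
    """Derive initial emotion information from two-dimensional slider
    readings: whichever pole of each axis dominates becomes the label."""
    valence = "positive" if positive >= negative else "negative"
    activation = "aroused" if arousal >= relaxation else "relaxed"
    return valence, activation

# The blue example above: 30% positive / 70% negative, 20% arousal / 80% relaxation.
print(label_from_sliders(30, 70, 20, 80))  # ('negative', 'relaxed')
```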
In some embodiments, step 10B specifically includes:
Step 10B1, identifying the initial expression image information, and obtaining initial facial feature point information in the initial expression image information.
And step 10B2, building emotion data sets are built according to the initial facial feature point information, the initial building element image information and the initial emotion information.
In specific implementation, facial feature point detection is carried out on the initial expression image information, and facial feature point information of the initial expression image information is obtained through recognition. Building emotion data sets are established based on the identified initial facial feature point information, the initial building element image information and the initial emotion information input by the user.
In some embodiments, step 10D specifically includes:
Step 10D1, setting a corresponding correct label for each data item in the training data set.
And step 10D2, inputting the data into an initial emotion estimation model for training, and obtaining a training result.
And step 10D3, constructing a loss function based on the training result and the correct label.
And 10D4, iterating the initial emotion estimation model by using the loss function until the loss function converges to a preset convergence threshold value, and determining to complete iteration to obtain a building emotion estimation model.
In specific implementation, the prediction deviation is described by a loss function; in this embodiment, the cross-entropy loss function is used, constructed from the training result and the correct label. When the loss function converges to the preset convergence threshold, it is determined that the iteration of the initial emotion estimation model is complete, and the building emotion estimation model is obtained. Constructing the loss function both supplies gradients for optimization through differentiation and clearly reflects the loss of the initial emotion estimation model during the iterative process.
The model adopts the Adam optimization algorithm as its optimizer. Adam maintains an independent adaptive learning rate for each network weight, which gives the model good convergence behavior.
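A minimal pure-Python sketch of a single Adam update may make the "independent adaptive learning rate per weight" point concrete (hyperparameter defaults follow the original Adam formulation; all names here are illustrative, not from the embodiment):

```python
def adam_step(weights, grads, state, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update. Each weight keeps its own first-moment (m) and
    second-moment (v) estimate, which is what gives every weight an
    independent, adaptive effective learning rate."""
    state["t"] += 1
    t = state["t"]
    new_w = []
    for i, (w, g) in enumerate(zip(weights, grads)):
        state["m"][i] = b1 * state["m"][i] + (1 - b1) * g
        state["v"][i] = b2 * state["v"][i] + (1 - b2) * g * g
        m_hat = state["m"][i] / (1 - b1 ** t)  # bias-corrected moments
        v_hat = state["v"][i] / (1 - b2 ** t)
        new_w.append(w - lr * m_hat / (v_hat ** 0.5 + eps))
    return new_w

state = {"t": 0, "m": [0.0, 0.0], "v": [0.0, 0.0]}
updated = adam_step([1.0, 1.0], [0.5, -2.0], state)
# On the first step both weights move by ~lr in the gradient direction,
# even though the raw gradient magnitudes differ by a factor of four.
```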
Illustratively, the loss function in this embodiment is formulated as:

CEE = -∑_i t_i · log(y_i)

where CEE is the loss function, t_i is the correct label, and y_i is the training result.
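A sketch of this loss on one-hot labels, together with the convergence-threshold check of step 10D4 (the threshold value 0.1 is an assumed placeholder, not a value from the embodiment):

```python
import math

def cross_entropy(y_pred, t_true, eps=1e-12):
    """CEE = -sum_i t_i * log(y_i) for a one-hot correct label t."""
    return -sum(t * math.log(max(y, eps)) for y, t in zip(y_pred, t_true))

# A confident correct prediction yields a small loss, a wrong one a large loss.
good = cross_entropy([0.95, 0.03, 0.02], [1, 0, 0])   # ~0.051
bad  = cross_entropy([0.05, 0.90, 0.05], [1, 0, 0])   # ~3.0

# Step 10D4: iterate until the loss falls below the preset convergence
# threshold (placeholder value).
CONVERGENCE_THRESHOLD = 0.1
print(good < CONVERGENCE_THRESHOLD)  # True: iteration would stop
print(bad < CONVERGENCE_THRESHOLD)   # False: keep training
```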
In some embodiments, the method specifically further comprises:
step 10a, acquiring data in a test data set, wherein each data in the test data set comprises initial facial feature point information, initial building element image information and initial emotion information.
And 10b, inputting the initial facial feature point information and the initial building element image information into a building emotion estimation model for each data in the test data set, and processing the building emotion estimation model to obtain first emotion information.
In specific implementation, data in a test data set in the building emotion data set is obtained, wherein each data in the test data set comprises initial facial feature point information, initial building element image information and initial emotion information. And inputting the initial facial feature point information and the initial building element image information in the data in the test data set into the building emotion estimation model to obtain corresponding first emotion information.
And step 10c, calculating the evaluation accuracy of the building emotion evaluation model based on the initial emotion information and the first emotion information.
And 10d, determining that the evaluation accuracy is higher than a preset accuracy threshold, and finishing the training of the building emotion evaluation model.
In specific implementation, the first emotion information corresponding to each data in the test data set is compared with the initial emotion information to judge whether the two are identical, and the evaluation accuracy of the building emotion evaluation model is calculated. When the evaluation accuracy is higher than the preset accuracy threshold, it is determined that the training of the building emotion evaluation model is complete.
For example, the test data set includes 10 pieces of data. The first emotion information corresponding to each piece is compared with the initial emotion information; the number of pieces whose first emotion information is the same as the initial emotion information is determined to be 7, and the number of pieces whose first emotion information differs is 3, so the evaluation accuracy of the building emotion evaluation model is 70%.
The preset accuracy threshold is 60%, the evaluation accuracy is larger than the preset accuracy threshold, and the building emotion evaluation model training is determined to be completed.
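The accuracy computation in this worked example can be sketched as follows (function and label names are illustrative):

```python
def evaluation_accuracy(predicted, actual):
    """Fraction of test samples whose predicted emotion information
    matches the initial emotion information recorded for that sample."""
    hits = sum(p == a for p, a in zip(predicted, actual))
    return hits / len(actual)

# The worked example: 10 test samples, 7 predictions match.
predicted = ["calm"] * 7 + ["tense"] * 3
actual    = ["calm"] * 10
acc = evaluation_accuracy(predicted, actual)
print(acc)        # 0.7
print(acc > 0.6)  # above the preset threshold: training is complete
```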
It should be noted that the method of the embodiments of the present disclosure may be performed by a single device, such as a computer or a server. The method of the embodiment can also be applied to a distributed scene, and is completed by mutually matching a plurality of devices. In the case of such a distributed scenario, one of the devices may perform only one or more steps of the methods of embodiments of the present disclosure, the devices interacting with each other to accomplish the methods.
It should be noted that the foregoing describes some embodiments of the present disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments described above and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
Based on the same inventive concept, the present disclosure also provides a building element determining device based on emotion data, corresponding to the method of any embodiment.
Referring to fig. 2, fig. 2 is a construction element determining apparatus based on emotion data according to an embodiment, including:
an information acquisition module 201 configured to acquire building element image information and expression image information corresponding to the building element image information, wherein the building element image information includes a building element, and the building element is a factor including a building feature;
the emotion data determining module 202 is configured to input the building element image information and the expression image information into a trained building emotion estimation model, and output emotion data corresponding to the building element image information and the expression image information through the building emotion estimation model, wherein the building emotion estimation model is a model which is obtained by training an initial estimation model and outputs emotion data corresponding to different building elements;
a data storage module 203 configured to store the emotion data and the building element image information in a database in correspondence;
the target building element determining module 204 is configured to obtain target emotion data, search a target building element corresponding to the target emotion data from the database according to the target emotion data, and output the target building element.
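A minimal in-memory stand-in for modules 203 and 204, storing emotion data alongside building-element records and looking elements up by target emotion data, might look like this (class, method, and file names are all assumptions for illustration):

```python
class EmotionElementStore:
    """Minimal in-memory stand-in for the database: emotion data is the
    key, building-element image records are the stored values."""

    def __init__(self):
        self._db = {}

    def store(self, emotion, element_image):
        # Module 203: store emotion data and element image correspondingly.
        self._db.setdefault(emotion, []).append(element_image)

    def lookup(self, target_emotion):
        # Module 204: search target building elements by target emotion data.
        return self._db.get(target_emotion, [])

store = EmotionElementStore()
store.store(("negative", "relaxed"), "blue_facade.png")
store.store(("positive", "awake"), "red_atrium.png")
print(store.lookup(("positive", "awake")))  # -> ['red_atrium.png']
```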
In some embodiments, the emotion data determination module 202 specifically includes:
an image recognition unit configured to recognize the expression image information, and obtain facial feature point information in the expression image information;
and a emotion data determination unit configured to input the building element image information and the facial feature point information to a trained building emotion estimation model, and to output emotion data corresponding to the building element image information and the facial feature point information via the building emotion estimation model.
In some embodiments, the apparatus further comprises a model training module, the model training module comprising:
an initial information acquisition unit configured to acquire initial building element image information and initial emotion information corresponding to the initial building element image information;
a data set establishing unit configured to establish a building emotion data set according to the initial building element image information and the initial emotion information;
a data set dividing unit configured to divide the architectural emotion data set into a training data set and a test data set;
and the model training unit is configured to train the initial emotion estimation model by utilizing the data in the training data set to obtain a building emotion estimation model.
In some embodiments, the initial information acquisition unit specifically includes:
an image information acquisition subunit configured to acquire initial building element image information;
the information display subunit is configured to display the initial building element image information, acquire initial expression image information and generate emotion prompt information, wherein the emotion prompt information is used for prompting the input of emotion data corresponding to the initial building element image information;
and the feedback information receiving subunit is configured to receive feedback information corresponding to the emotion prompt information and input by a user and acquire initial emotion information corresponding to the feedback information.
In some embodiments, the data set creating unit specifically includes:
an information recognition subunit configured to recognize the initial expression image information and obtain initial facial feature point information in the initial expression image information;
and the data set establishing subunit is configured to establish a building emotion data set according to the initial facial feature point information, the initial building element image information and the initial emotion information.
In some embodiments, the model training unit specifically includes:
a tag setting subunit configured to set a corresponding correct tag for the data in each training dataset;
the model training subunit is configured to input the data into an initial emotion estimation model for training to obtain a training result;
a loss function construction subunit configured to construct a loss function based on the training result and the correct label;
and the iteration subunit is configured to iterate the initial emotion estimation model by using the loss function until the loss function converges to a preset convergence threshold value, and determine that the iteration is completed to obtain a building emotion estimation model.
In some embodiments, the apparatus specifically further comprises a model test module specifically configured to:
acquiring data in a test data set, wherein each data in the test data set comprises initial facial feature point information, initial building element image information and initial emotion information;
inputting the initial facial feature point information and the initial building element image information into a building emotion assessment model aiming at each data in the test data set, and processing the building emotion assessment model to obtain first emotion information;
calculating the evaluation accuracy of the building emotion evaluation model based on the initial emotion information and the first emotion information;
and determining that the evaluation accuracy is higher than a preset accuracy threshold, and finishing the training of the building emotion evaluation model.
For convenience of description, the above devices are described as being functionally divided into various modules, respectively. Of course, the functions of the various modules may be implemented in the same one or more pieces of software and/or hardware when implementing the present disclosure.
The device of the foregoing embodiment is configured to implement the corresponding method for determining building elements based on emotion data in any of the foregoing embodiments, and has the beneficial effects of the corresponding method embodiment, which is not described herein.
Based on the same inventive concept, the present disclosure also provides an electronic device corresponding to the method of any embodiment, including a memory, a processor, and a computer program stored on the memory and capable of running on the processor, where the processor implements the method for determining building elements based on emotion data according to any embodiment when executing the program.
Fig. 3 shows a more specific hardware architecture of an electronic device according to this embodiment, where the device may include: a processor 1010, a memory 1020, an input/output interface 1030, a communication interface 1040, and a bus 1050. Wherein processor 1010, memory 1020, input/output interface 1030, and communication interface 1040 implement communication connections therebetween within the device via a bus 1050.
The processor 1010 may be implemented by a general-purpose CPU (Central Processing Unit ), microprocessor, application specific integrated circuit (Application Specific Integrated Circuit, ASIC), or one or more integrated circuits, etc. for executing relevant programs to implement the technical solutions provided in the embodiments of the present disclosure.
The Memory 1020 may be implemented in the form of ROM (Read Only Memory), RAM (Random Access Memory ), static storage device, dynamic storage device, or the like. Memory 1020 may store an operating system and other application programs, and when the embodiments of the present specification are implemented in software or firmware, the associated program code is stored in memory 1020 and executed by processor 1010.
The input/output interface 1030 is used to connect with an input/output module for inputting and outputting information. The input/output module may be configured as a component in a device (not shown) or may be external to the device to provide corresponding functionality. Wherein the input devices may include a keyboard, mouse, touch screen, microphone, various types of sensors, etc., and the output devices may include a display, speaker, vibrator, indicator lights, etc.
Communication interface 1040 is used to connect communication modules (not shown) to enable communication interactions of the present device with other devices. The communication module may implement communication through a wired manner (such as USB, network cable, etc.), or may implement communication through a wireless manner (such as mobile network, WIFI, bluetooth, etc.).
Bus 1050 includes a path for transferring information between components of the device (e.g., processor 1010, memory 1020, input/output interface 1030, and communication interface 1040).
It should be noted that although the above-described device only shows processor 1010, memory 1020, input/output interface 1030, communication interface 1040, and bus 1050, in an implementation, the device may include other components necessary to achieve proper operation. Furthermore, it will be understood by those skilled in the art that the above-described apparatus may include only the components necessary to implement the embodiments of the present description, and not all the components shown in the drawings.
The electronic device of the foregoing embodiment is configured to implement the corresponding method for determining building elements based on emotion data in any of the foregoing embodiments, and has the beneficial effects of the corresponding method embodiment, which is not described herein.
Based on the same inventive concept, corresponding to any of the above-described embodiment methods, the present disclosure further provides a non-transitory computer-readable storage medium storing computer instructions for causing the computer to perform the emotion data-based building element determination method described in any of the above-described embodiments.
The computer readable media of the present embodiments, including both permanent and non-permanent, removable and non-removable media, may be used to implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of storage media for a computer include, but are not limited to, phase change memory (PRAM), static Random Access Memory (SRAM), dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), read Only Memory (ROM), electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device.
The storage medium of the above embodiment stores computer instructions for causing the computer to perform the method for determining a building element based on emotion data according to any one of the above embodiments, and has the advantages of the corresponding method embodiments, which are not described herein.
It will be appreciated that before using the technical solutions of the various embodiments in the disclosure, the user may be informed of the type of personal information involved, the range of use, the use scenario, etc. in an appropriate manner, and obtain the authorization of the user.
For example, in response to receiving an active request from a user, prompt information is sent to the user to explicitly indicate that the operation the user requests to perform will require acquiring and using the user's personal information. The user can thus decide, according to the prompt information, whether to provide personal information to the software or hardware, such as electronic equipment, application programs, servers, or storage media, that executes the operations of the technical scheme.
As an alternative but non-limiting implementation, in response to receiving an active request from a user, the manner in which the prompt information is sent to the user may be, for example, a popup, in which the prompt information may be presented in a text manner. In addition, a selection control for the user to select to provide personal information to the electronic device in a 'consent' or 'disagreement' manner can be carried in the popup window.
It will be appreciated that the above-described notification and user authorization process is merely illustrative, and not limiting of the implementations of the present disclosure, and that other ways of satisfying relevant legal regulations may be applied to the implementations of the present disclosure.
Those of ordinary skill in the art will appreciate that: the discussion of any of the embodiments above is merely exemplary and is not intended to suggest that the scope of the disclosure, including the claims, is limited to these examples; the technical features of the above embodiments or in the different embodiments may also be combined under the idea of the present disclosure, the steps may be implemented in any order, and there are many other variations of the different aspects of the embodiments of the present disclosure as described above, which are not provided in details for the sake of brevity.
Additionally, well-known power/ground connections to Integrated Circuit (IC) chips and other components may or may not be shown within the provided figures, in order to simplify the illustration and discussion, and so as not to obscure the embodiments of the present disclosure. Furthermore, the devices may be shown in block diagram form in order to avoid obscuring the embodiments of the present disclosure, and this also accounts for the fact that specifics with respect to implementation of such block diagram devices are highly dependent upon the platform on which the embodiments of the present disclosure are to be implemented (i.e., such specifics should be well within purview of one skilled in the art). Where specific details (e.g., circuits) are set forth in order to describe example embodiments of the disclosure, it should be apparent to one skilled in the art that embodiments of the disclosure can be practiced without, or with variation of, these specific details. Accordingly, the description is to be regarded as illustrative in nature and not as restrictive.
While the present disclosure has been described in conjunction with specific embodiments thereof, many alternatives, modifications, and variations of those embodiments will be apparent to those skilled in the art in light of the foregoing description. For example, other memory architectures (e.g., dynamic RAM (DRAM)) may use the embodiments discussed.
The disclosed embodiments are intended to embrace all such alternatives, modifications and variances which fall within the broad scope of the appended claims. Accordingly, any omissions, modifications, equivalents, improvements, and the like, which are within the spirit and principles of the embodiments of the disclosure, are intended to be included within the scope of the disclosure.

Claims (10)

1. A method for determining construction elements based on emotion data, comprising:
acquiring building element image information and expression image information corresponding to the building element image information, wherein the building element image information comprises building elements which are factors comprising building features;
inputting the building element image information and the expression image information into a trained building emotion assessment model, and outputting emotion data corresponding to the building element image information and the expression image information through the building emotion assessment model, wherein the building emotion assessment model is a model which is obtained by training an initial assessment model and outputs emotion data corresponding to different building elements;
storing the emotion data and the building element image information in a database correspondingly;
and acquiring target emotion data, searching a target building element corresponding to the target emotion data from the database according to the target emotion data, and outputting the target building element.
2. The method according to claim 1, wherein the inputting the building element image information and the expression image information into the trained building emotion estimation model and outputting emotion data corresponding to the building element image information and the expression image information via the building emotion estimation model includes:
identifying the expression image information to obtain facial feature point information in the expression image information;
and inputting the building element image information and the facial feature point information into a trained building emotion estimation model, and outputting emotion data corresponding to the building element image information and the facial feature point information through the building emotion estimation model.
3. The method of claim 1, wherein the training process of the architectural emotion assessment model comprises:
acquiring initial building element image information and initial emotion information corresponding to the initial building element image information;
building a building emotion data set according to the initial building element image information and the initial emotion information;
dividing the building emotion data set into a training data set and a test data set;
and training the initial emotion estimation model by utilizing the data in the training data set to obtain a building emotion estimation model.
4. The method of claim 3, wherein the acquiring initial building element image information and initial emotion information corresponding to the initial building element image information comprises:
acquiring initial building element image information;
displaying the initial building element image information, acquiring initial expression image information and generating emotion prompt information, wherein the emotion prompt information is used for prompting the input of emotion data corresponding to the initial building element image information;
and receiving feedback information corresponding to the emotion prompt information, which is input by a user, and acquiring initial emotion information corresponding to the feedback information.
5. The method of claim 4, wherein the creating a building emotion data set from the initial building element image information and the initial emotion information comprises:
identifying the initial expression image information to obtain initial facial feature point information in the initial expression image information;
and building a building emotion data set according to the initial facial feature point information, the initial building element image information and the initial emotion information.
6. The method of claim 3, wherein training the initial emotion assessment model using the data in the training dataset to obtain a building emotion assessment model comprises:
setting a corresponding correct label for the data in each training data set;
inputting the data into an initial emotion estimation model for training to obtain a training result;
constructing a loss function based on the training result and the correct label;
and iterating the initial emotion estimation model by using the loss function until the loss function converges to a preset convergence threshold value, and determining to complete iteration to obtain the building emotion estimation model.
7. A method according to claim 3, further comprising:
acquiring data in a test data set, wherein each data in the test data set comprises initial facial feature point information, initial building element image information and initial emotion information;
inputting the initial facial feature point information and the initial building element image information into the building emotion assessment model for each data in the test data set, and obtaining first emotion information through processing by the building emotion assessment model;
calculating the evaluation accuracy of the building emotion evaluation model based on the initial emotion information and the first emotion information;
and determining that the evaluation accuracy is higher than a preset accuracy threshold, and finishing the training of the building emotion evaluation model.
8. A construction element determining apparatus based on emotion data, comprising:
an information acquisition module configured to acquire building element image information and expression image information corresponding to the building element image information, wherein the building element image information contains building elements which are factors containing building features;
the emotion data determining module is configured to input the building element image information and the expression image information into a trained building emotion assessment model, and output emotion data corresponding to the building element image information and the expression image information through the building emotion assessment model, wherein the building emotion assessment model is a model which is obtained by training an initial assessment model and outputs emotion data corresponding to different building elements;
the data storage module is configured to store the emotion data and the building element image information into a database correspondingly;
the target building element determining module is configured to acquire target emotion data, search target building elements corresponding to the target emotion data from the database according to the target emotion data, and output the target building elements.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method of any one of claims 1 to 7 when the program is executed.
10. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1 to 7.
CN202311550935.2A 2023-11-20 2023-11-20 Method, device, equipment and medium for determining building elements based on emotion data Pending CN117574502A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311550935.2A CN117574502A (en) 2023-11-20 2023-11-20 Method, device, equipment and medium for determining building elements based on emotion data


Publications (1)

Publication Number Publication Date
CN117574502A true CN117574502A (en) 2024-02-20

Family

ID=89891297



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination