CN111523169A - Decoration scheme generation method and device, electronic equipment and storage medium - Google Patents

Decoration scheme generation method and device, electronic equipment and storage medium

Info

Publication number
CN111523169A
CN111523169A (application CN202010331237.3A; granted as CN111523169B)
Authority
CN
China
Prior art keywords
sequence data
data
decorated
room
information
Prior art date
Legal status
Granted
Application number
CN202010331237.3A
Other languages
Chinese (zh)
Other versions
CN111523169B (en)
Inventor
苏旭
袁道鸣
周琳琳
麦广柱
吴翔南
刘聪
Current Assignee
Guangdong Bozhilin Robot Co Ltd
Original Assignee
Guangdong Bozhilin Robot Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Bozhilin Robot Co Ltd filed Critical Guangdong Bozhilin Robot Co Ltd
Priority to CN202010331237.3A priority Critical patent/CN111523169B/en
Publication of CN111523169A publication Critical patent/CN111523169A/en
Application granted granted Critical
Publication of CN111523169B publication Critical patent/CN111523169B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 30/00: Computer-aided design [CAD]
    • G06F 30/10: Geometric CAD
    • G06F 30/13: Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 30/00: Computer-aided design [CAD]
    • G06F 30/20: Design optimisation, verification or simulation
    • G06F 30/27: Design optimisation, verification or simulation using machine learning, e.g. artificial intelligence, neural networks, support vector machines [SVM] or training a model
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology
    • G06N 3/045: Combinations of networks
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30: Computing systems specially adapted for manufacturing


Abstract

The embodiment of the invention discloses a decoration scheme generation method and device, electronic equipment and a storage medium. In the method, the obtained drawing to be decorated is input into a pre-trained feature extraction model to obtain feature data of each room of the drawing to be decorated, so that the feature data can be determined quickly simply by inputting the drawing into the feature extraction model, without relying on a display platform, which improves the accuracy, efficiency and flexibility of feature data determination. The feature data is converted into intermediate sequence data of each room of the drawing to be decorated, sequence data distances between the intermediate sequence data and at least one piece of pre-stored sample sequence data are calculated respectively, target sequence data is determined from the sample sequence data according to the sequence data distances, and the layout information corresponding to the target sequence data is determined as the decoration scheme of the drawing to be decorated, so that the decoration scheme of the drawing to be decorated can be determined accurately and quickly.

Description

Decoration scheme generation method and device, electronic equipment and storage medium
Technical Field
The embodiment of the invention relates to computer technology, in particular to a decoration scheme generation method and device, electronic equipment and a storage medium.
Background
As living standards improve, more and more people purchase commercial housing, and demand in the housing decoration market keeps growing. Customers also place higher requirements on room decoration design: they expect designers to design according to their own wishes, they want to see design renderings quickly, and they hope to save on decoration costs at the same time. These growing demands pose significant challenges for designers at traditional decoration design companies.
Currently, some decoration design companies use a deep adversarial network to perform functional partitioning on an acquired bare-shell floor plan and design a decoration scheme for each functional partition. However, this approach partitions the functions of the whole bare-shell unit poorly, and the resulting decoration scheme is relatively rigid.
Therefore, the decoration scheme generated by the prior art is relatively poor in effect and needs to be improved.
Disclosure of Invention
The embodiment of the invention provides a decoration scheme generation method and device, electronic equipment and a storage medium, so as to improve the design quality and flexibility of the decoration scheme.
In a first aspect, an embodiment of the present invention provides a decoration scheme generation method, where the method includes:
inputting the obtained drawing paper to be decorated into a pre-trained feature extraction model to obtain feature data of each room of the drawing paper to be decorated;
converting the characteristic data into intermediate sequence data of each room of the drawing to be decorated, wherein the intermediate sequence data comprises tag word vectors and spatial characteristic information of each room;
respectively calculating sequence data distances between the intermediate sequence data and at least one piece of pre-stored sample sequence data;
and determining target sequence data from the sample sequence data according to the sequence data distance, and determining layout information corresponding to the target sequence data as a decoration scheme of the drawing to be decorated.
In a second aspect, an embodiment of the present invention further provides a decoration scheme generating apparatus, where the apparatus includes:
the characteristic data determining module is used for inputting the obtained drawing paper to be decorated into a pre-trained characteristic extraction model to obtain characteristic data of each room of the drawing paper to be decorated;
the intermediate sequence determining module is used for converting the characteristic data into intermediate sequence data of each room of the drawing to be decorated, wherein the intermediate sequence data comprises tag word vectors and spatial characteristic information of each room;
a sequence data distance calculation module for calculating a sequence data distance between the intermediate sequence data and at least one pre-stored sample sequence data respectively;
and the decoration scheme determining module is used for determining target sequence data from the sample sequence data according to the sequence data distance and determining layout information corresponding to the target sequence data as a decoration scheme of the drawing to be decorated.
In a third aspect, an embodiment of the present invention further provides an electronic device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor executes the computer program to implement the decoration scheme generation method according to any one of the first aspect.
In a fourth aspect, embodiments of the present invention further provide a storage medium containing computer-executable instructions which, when executed by a computer processor, implement the decoration scheme generation method according to the first aspect.
In the technical solution provided by the embodiment of the invention, the obtained drawing to be decorated is input into a pre-trained feature extraction model to obtain the feature data of each room of the drawing to be decorated, so that the feature data can be determined quickly simply by inputting the drawing into the feature extraction model, without relying on a display platform, which improves the accuracy, efficiency and flexibility of feature data determination. The feature data is then converted into intermediate sequence data of each room of the drawing to be decorated, the sequence data distances between the intermediate sequence data and at least one piece of pre-stored sample sequence data are calculated respectively, target sequence data is determined from the sample sequence data according to the sequence data distances, and the layout information corresponding to the target sequence data is determined as the decoration scheme of the drawing to be decorated, so that the decoration scheme can be determined accurately and quickly. This solves the problem in the prior art that the generated decoration scheme is relatively poor, and improves the accuracy and efficiency of determining the decoration scheme of the drawing to be decorated.
Drawings
FIG. 1 is a schematic flow chart illustrating a decoration scheme generating method according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart illustrating a decoration scheme generating method according to a second embodiment of the present invention;
FIG. 3 is a schematic diagram of the logic of training and application of a feature extraction model according to a second embodiment of the present invention;
FIG. 4 is a schematic structural diagram of a decoration scheme generation apparatus according to a third embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a schematic flow chart of a decoration scheme generation method according to an embodiment of the present invention. The embodiment is applicable to the case where intermediate sequence data is determined by a feature extraction model and a decoration scheme is determined according to the intermediate sequence data and sample sequence data. The method may be performed by a decoration scheme generation apparatus, which may be implemented by software and/or hardware and is generally integrated in a terminal or an electronic device. Referring to fig. 1, the method may include the following steps:
and S110, inputting the obtained drawing paper to be decorated to a pre-trained feature extraction model to obtain feature data of each room of the drawing paper to be decorated.
The drawing to be decorated may be a bare-shell floor plan, and the feature extraction model is used to extract feature data from the bare-shell floor plan. Optionally, the feature data may include label information, position information, room wall information, door and window information, and the like of the bare-shell unit corresponding to the floor plan. The label information may be, for example, living-dining room, master bedroom, parents' room, boy's room, girl's room, staff room, multifunctional room, kitchen, bathroom, service balcony, landscape balcony, water-supply balcony, and the like; the position information may be the coordinate information of each room; the room wall information may include the width and height of each wall; and the door and window information may include the width and height of the doors and windows.
Optionally, the feature extraction model may be obtained by training an initial network with sample decoration drawings and sample feature data. In this embodiment, the feature extraction model may be an SSD (Single Shot MultiBox Detector) network that uses VGG-16 (Visual Geometry Group 16) as its base network and extracts features from the drawing to be decorated through a plurality of convolutional layers, thereby obtaining the feature data of each room. The advantage of determining the feature data through the feature extraction model is that the feature data can be determined quickly simply by inputting the drawing to be decorated into the model, without relying on a display platform, which improves the accuracy, efficiency and flexibility of feature data determination.
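For illustration only, the following sketch shows how a VGG-16-based SSD detector could be run on a floor-plan image to obtain per-room labels and bounding boxes. It assumes torchvision's ssd300_vgg16 model, a hypothetical ROOM_CLASSES list, and weights trained elsewhere on labelled sample decoration drawings; none of these specifics come from the patent.

```python
import torch
import torchvision

# Hypothetical room categories for the floor-plan detector (not from the patent).
ROOM_CLASSES = ["background", "living_dining", "master_bedroom", "kitchen",
                "bathroom", "balcony"]

# SSD with a VGG-16 backbone, as described above. In practice the weights would
# come from training on labelled sample decoration drawings, which is not shown here.
model = torchvision.models.detection.ssd300_vgg16(num_classes=len(ROOM_CLASSES))
model.eval()

def extract_room_features(image_tensor, score_threshold=0.5):
    """Run the detector on a (3, H, W) float tensor of the drawing to be decorated
    and return a label, bounding box and confidence score for each detected room."""
    with torch.no_grad():
        detections = model([image_tensor])[0]  # dict with 'boxes', 'labels', 'scores'
    rooms = []
    for box, label, score in zip(detections["boxes"], detections["labels"],
                                 detections["scores"]):
        if score < score_threshold:
            continue
        rooms.append({"label": ROOM_CLASSES[int(label)],
                      "box": [float(v) for v in box],   # x1, y1, x2, y2
                      "score": float(score)})
    return rooms
```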
And S120, converting the characteristic data into intermediate sequence data of each room of the drawing to be decorated.
The intermediate sequence data comprises tag word vectors and spatial feature information of all rooms. As described in the foregoing steps, the feature data includes tag information, location information, room wall information, door and window information, and the like, the tag word vector in this step is vector data corresponding to the tag information, and the spatial feature information includes location information, room wall information, door and window information, and the like.
Optionally, the intermediate sequence data is determined as follows: the label information in the feature data is input into a pre-trained word vector model to obtain the label word vector of each room of the drawing to be decorated; initial sequence data of each room of the drawing to be decorated is obtained based on the label word vector and the second coordinate information and size information in the feature data; and the initial sequence data is spliced in a specific splicing order to obtain the intermediate sequence data of each room.
Optionally, the word vector model may be trained with sample label information and sample label word vectors; the second coordinate information may be the coordinate information of the walls or of the doors and windows of each room; the size information may be the width and height of the walls or of the doors and windows of each room; and the specific splicing order may be clockwise. In this embodiment, the word vector model may output a 50-dimensional label word vector for each room. The label word vector, the second coordinate information and the size information of each room are concatenated in turn to obtain the initial sequence data; then, starting from the door of each room, the initial sequence data of the rooms are spliced again in the clockwise direction to obtain the intermediate sequence data of each room.
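For illustration, a minimal sketch of this splicing step is shown below. The per-room input format (a 50-dimensional label word vector plus wall/door/window components already ordered clockwise starting from the door) is an assumption made here, not a structure mandated by the patent.

```python
import numpy as np

def room_initial_sequence(label_vec, components):
    """Concatenate the room's 50-d label word vector with the (x, y) coordinates and
    (w, h) size of each component; components are assumed to be listed clockwise,
    starting from the door."""
    parts = [np.asarray(label_vec, dtype=np.float32)]
    for comp in components:                     # e.g. {"xy": (x, y), "wh": (w, h)}
        parts.append(np.asarray([*comp["xy"], *comp["wh"]], dtype=np.float32))
    return np.concatenate(parts)

def intermediate_sequence(rooms):
    """Splice the per-room initial sequences, again in clockwise room order,
    to obtain the intermediate sequence data of the drawing to be decorated."""
    return np.concatenate([room_initial_sequence(r["label_vec"], r["components"])
                           for r in rooms])
```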
And S130, respectively calculating sequence data distances between the intermediate sequence data and at least one pre-stored sample sequence data.
In this embodiment, sample feature data of each room of a plurality of sample blank house-type diagrams may be stored in advance, and the sample feature data may be input to the word vector model trained in advance to obtain the sample sequence data. Alternatively, the sample feature data may include sample label information, sample coordinate information, sample size information, and the like of each room, and the sample sequence data includes label word vectors of each room of the sample rough floor plan.
Optionally, the sequence data distance may be determined as follows: based on the spatial feature information of any sequence in the intermediate sequence data and the spatial feature information of any sequence in the sample sequence data, the inter-sequence transfer amounts between the intermediate sequence data and the sample sequence data and the tag vector distances between the two sequences are determined; and the sequence data distance between the intermediate sequence data and the sample sequence data is determined based on the inter-sequence transfer amounts and the tag vector distances, where the sequence data distance is the minimum, over all feasible transfer amounts, of the sum of the products of each inter-sequence transfer amount and the corresponding tag vector distance.
Specifically, the sequence data distance may be calculated according to the WMD (Word Mover's Distance) algorithm, which is built on the Earth Mover's Distance. The objective of the WMD algorithm is:

$$\min_{T \ge 0} \sum_{i,j=1}^{n} T_{ij}\, c(i,j)$$

subject to the constraints:

$$\sum_{j=1}^{n} T_{ij} = d_i \quad \forall i, \qquad \sum_{i=1}^{n} T_{ij} = d'_j \quad \forall j$$

where $T_{ij}$ is the transfer amount from the i-th tag vector of the intermediate sequence data to the j-th tag vector of the sample sequence data, $c(i,j)$ is the Euclidean distance between the i-th value in the intermediate sequence data and the j-th value in the sample sequence data, i.e. the distance between the tag vectors of the two sequences, and $d_i$ and $d'_j$ are the probabilities of the i-th tag vector appearing in the intermediate sequence data and of the j-th tag vector appearing in the sample sequence data, respectively.
And S140, determining target sequence data from the sample sequence data according to the sequence data distance, and determining layout information corresponding to the target sequence data as a decoration scheme of the drawing to be decorated.
In this embodiment, the sample feature data and the sample layout information may be stored in advance in correspondence with the label information of each room of the sample bare-shell floor plan. It can be understood that the sequence data distance can be used to measure the similarity between the intermediate sequence data and each piece of sample sequence data, and a smaller sequence data distance indicates that the intermediate sequence data is closer to that sample sequence data, which makes it convenient to determine the target sequence data from the sample sequence data according to the sequence data distance. As described in the previous step, the sample sequence data includes the label word vectors of the rooms of the sample bare-shell floor plan, so the target sequence data also includes these label word vectors. Optionally, the label word vectors in the target sequence data may be acquired and parsed, the arrangement information corresponding to the target sequence data may be determined according to the parsing result, and this arrangement information may be determined as the decoration scheme of the drawing to be decorated. The arrangement information here is the sample layout information.
Optionally, the sample sequence data whose sequence data distance is smaller than a preset threshold may be determined as the target sequence data. The preset threshold may be set to a small value and is used to screen the target sequence data for the intermediate sequence data. In this embodiment, the sample sequence data may also be sorted based on the calculated sequence data distances, and the sample sequence data at preset sorting positions may be determined as the target sequence data. For example, the sample sequence data may be sorted by sequence data distance, and the sample sequence data corresponding to the one to three smallest distances may be taken as the target sequence data. The advantage of this is that the decoration scheme of the drawing to be decorated can be determined quickly and accurately.
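As a simple illustration of this screening step, the sketch below supports both the threshold rule and the top-k rule; the parameter names and the default of three candidates are assumptions chosen for the example.

```python
def select_target_sequences(sample_ids, distances, k=3, threshold=None):
    """Return the ids of the sample sequence data chosen as target sequence data,
    either all samples closer than `threshold` or the k samples with the smallest
    sequence data distance."""
    order = sorted(range(len(distances)), key=lambda i: distances[i])  # ascending
    if threshold is not None:
        chosen = [i for i in order if distances[i] < threshold]
    else:
        chosen = order[:k]
    return [sample_ids[i] for i in chosen]
```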
In the technical solution provided by the embodiment of the invention, the obtained drawing to be decorated is input into a pre-trained feature extraction model to obtain the feature data of each room of the drawing to be decorated, so that the feature data can be determined quickly simply by inputting the drawing into the feature extraction model, without relying on a display platform, which improves the accuracy, efficiency and flexibility of feature data determination. The feature data is then converted into intermediate sequence data of each room of the drawing to be decorated, the sequence data distances between the intermediate sequence data and at least one piece of pre-stored sample sequence data are calculated respectively, target sequence data is determined from the sample sequence data according to the sequence data distances, and the layout information corresponding to the target sequence data is determined as the decoration scheme of the drawing to be decorated, so that the decoration scheme can be determined accurately and quickly. This solves the problem in the prior art that the generated decoration scheme is relatively poor, and improves the accuracy and efficiency of determining the decoration scheme of the drawing to be decorated.
Example two
Fig. 2 is a schematic flow chart of a decoration scheme generation method according to a second embodiment of the present invention. The technical solution of this embodiment is refined on the basis of the above embodiment. Optionally, the feature extraction model includes a first feature extraction model and at least one second feature extraction model. Correspondingly, inputting the obtained drawing to be decorated into a pre-trained feature extraction model to obtain the feature data of each room of the drawing to be decorated includes: inputting the drawing to be decorated into the first feature extraction model to obtain the room type label information and the first coordinate information of each room of the drawing to be decorated; and inputting the room type label information, the first coordinate information and the corresponding drawing to be decorated into the second feature extraction model corresponding to the room type label information to obtain the feature data of each room of the drawing to be decorated, where the feature data includes the label information, second coordinate information and size information of the components in each room. This refinement clarifies how the feature data is specifically determined. For parts not described in detail in this embodiment, reference is made to the above embodiment. Referring to fig. 2, the method may include the following steps:
s210, inputting the drawing paper to be decorated into the first feature extraction model to obtain the room type label information and the first coordinate information of each room of the drawing paper to be decorated.
In this embodiment, the feature extraction model may include a first feature extraction model and a second feature extraction model. Optionally, the first feature extraction model may be trained by inputting sample decoration drawings into an initial model to obtain first predicted feature information, calculating a first loss function between the first predicted feature information and the first sample feature information, and adjusting the network parameters of the initial model by back-propagating the first loss function. Optionally, the first predicted feature information includes first predicted room type label information and first predicted coordinate information, the first sample feature information includes first sample room type label information and first sample coordinate information, and the first sample coordinate information may be the position information of each room of the sample decoration drawing. After the first feature extraction model is obtained, the drawing to be decorated can be input into it to obtain the room type label information and the first coordinate information of each room of the drawing to be decorated.
Optionally, the expression of the first loss function is:

$$L(x, c, l, g) = \frac{1}{N}\left( L_{conf}(x, c) + \alpha L_{loc}(x, l, g) \right)$$

where $L_{conf}(x, c)$ is the class confidence error, $L_{loc}(x, l, g)$ is the position error, $N$ is the number of positive-sample prior boxes, $\alpha$ is a weight balancing the two terms, and $x_{ij}^{p} \in \{0, 1\}$ is an indicator parameter: $x_{ij}^{p} = 1$ indicates that the i-th prior box is matched to the j-th ground-truth box whose correctly labelled category is $p$. $c$ is the predicted class confidence, $l$ is the predicted position of the bounding box corresponding to the prior box, and $g$ is the position parameter of the ground truth.
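As a rough sketch only, a simplified PyTorch version of this confidence-plus-localization loss is given below; it omits SSD's hard negative mining and box encoding, and the tensor names are assumptions rather than anything specified in the patent.

```python
import torch
import torch.nn.functional as F

def first_loss(conf_pred, loc_pred, conf_target, loc_target, pos_mask, alpha=1.0):
    """L = (1/N) * (L_conf + alpha * L_loc): cross-entropy confidence error over all
    prior boxes plus smooth-L1 position error over the positive (matched) priors,
    divided by the number N of positive priors."""
    num_pos = pos_mask.sum().clamp(min=1).float()          # N

    # L_conf: class confidence error.
    conf_loss = F.cross_entropy(conf_pred.view(-1, conf_pred.size(-1)),
                                conf_target.view(-1), reduction="sum")

    # L_loc: position error, computed only on matched prior boxes.
    loc_loss = F.smooth_l1_loss(loc_pred[pos_mask], loc_target[pos_mask],
                                reduction="sum")

    return (conf_loss + alpha * loc_loss) / num_pos
```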
Fig. 3 is a schematic diagram of the training and application logic of the feature extraction model. As can be seen from fig. 3, before the first feature extraction model is trained, flood-fill processing may be performed on the sample decoration drawings. Specifically, each sample decoration drawing of the feature extraction model is segmented into at least one segmented region according to the spatial feature information of each of its rooms; starting from a set seed point in each segmented region, the pixel values of the pixels of that region are traversed and the region is colour-filled according to the traversal result; the initial model is then trained with the colour-filled sample decoration drawings to obtain the feature extraction model.
Specifically, the sample decoration drawing may be segmented into at least one segmented region according to spatial feature information such as the position information and label information of each room, for example into the master bedroom, parents' room, living-dining room, bathroom, study and other regions. Any point in each segmented region is then selected as a seed point, and starting from the seed point the colour values of the pixels of that region are traversed, so that each region is filled with a different colour. The advantage of this is that it improves the training accuracy and detection precision of the first feature extraction model, and helps filter out text, annotation lines and other information in the sample decoration drawing, thereby reducing the influence of noise.
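A minimal OpenCV sketch of this flood-fill colouring is shown below; the seed-point list, the colour choice and the loDiff/upDiff tolerances are illustrative assumptions, not values given in the patent.

```python
import cv2
import numpy as np

def fill_room_regions(drawing, seed_points):
    """Flood-fill each segmented room region of a sample decoration drawing with its
    own colour, starting from one set seed point per region."""
    filled = drawing.copy()
    h, w = filled.shape[:2]
    for idx, (sx, sy) in enumerate(seed_points):
        # floodFill requires a mask two pixels larger than the image in each dimension.
        mask = np.zeros((h + 2, w + 2), dtype=np.uint8)
        colour = (37 * (idx + 1) % 256, 97 * (idx + 1) % 256, 173 * (idx + 1) % 256)
        # Traverse the pixels reachable from the seed whose values lie within the
        # given tolerance of the seed pixel, and paint them with the region colour.
        cv2.floodFill(filled, mask, (sx, sy), colour,
                      loDiff=(5, 5, 5), upDiff=(5, 5, 5))
    return filled
```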
Optionally, before the first feature extraction model is trained, normalization and data enhancement may also be performed on the sample decoration drawings. Specifically, the sample decoration drawings may be uniformly resized to 512 x 512 pictures, and then horizontally flipped and randomly sampled to complete the data enhancement, which increases the generalization ability of the model; the data is then normalized, which improves the convergence speed of the model and shortens the training time of the first feature extraction model.
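The sketch below illustrates one way to implement this preprocessing; the crop ratio, flip probability and per-image standardisation are illustrative choices made here, not values given in the patent.

```python
import cv2
import numpy as np

def preprocess_sample(drawing, train=True):
    """Resize a sample decoration drawing to 512 x 512, optionally apply a random
    horizontal flip and random crop (data enhancement), then normalise the pixels."""
    img = cv2.resize(drawing, (512, 512))
    if train:
        if np.random.rand() < 0.5:
            img = cv2.flip(img, 1)                       # horizontal flip
        # Random sampling: crop a random sub-window and resize back to 512 x 512.
        scale = np.random.uniform(0.8, 1.0)
        ch = cw = int(512 * scale)
        y0 = np.random.randint(0, 512 - ch + 1)
        x0 = np.random.randint(0, 512 - cw + 1)
        img = cv2.resize(img[y0:y0 + ch, x0:x0 + cw], (512, 512))
    img = img.astype(np.float32) / 255.0
    img = (img - img.mean()) / (img.std() + 1e-6)        # speeds up model convergence
    return img
```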
And S220, inputting the room type label information, the first coordinate information and the to-be-decorated drawing corresponding to the room type label information into a second feature extraction model corresponding to the room type label information to obtain feature data of each room of the to-be-decorated drawing.
Wherein the feature data comprises tag information, second coordinate information and dimension information of the component in the room. For example, the feature data includes label information, coordinate information, and size information of a wall of each room, and may further include label information, coordinate information, and size information of a door or window of each room.
In this embodiment, the second feature extraction models may be obtained by training the initial network with the sample room type label information, the sample coordinate information, the sample decoration drawings and the feature data of each room of the sample decoration drawings, so that a second feature extraction model is obtained for each kind of sample room type label information. The loss function of the second feature extraction model has the same expression as that of the first feature extraction model and is not repeated here.
It can be understood that, since the size information and category information of the components in each room differ from one sample decoration drawing to another, in order to train the second feature extraction models in a targeted way, the prior-box (preselected-frame) aspect ratios and categories of each second feature extraction model can be adjusted according to the size information and category information of the components in the corresponding rooms of the sample decoration drawings, which improves the efficiency and accuracy of feature data extraction by the second feature extraction models.
According to the embodiment, the drawing to be decorated is firstly input into the first feature extraction model to obtain the room type label information and the first coordinate information of each room of the drawing to be decorated, and then the feature data of each room of the drawing to be decorated is determined through the second feature extraction model corresponding to the room type label information, so that the accuracy of determining the feature data can be improved, and the accuracy of determining the decoration scheme is further improved.
And S230, converting the characteristic data into intermediate sequence data of each room of the drawing to be decorated.
And S240, respectively calculating sequence data distances between the intermediate sequence data and at least one pre-stored sample sequence data.
And S250, determining target sequence data from the sample sequence data according to the sequence data distance, and determining layout information corresponding to the target sequence data as a decoration scheme of the drawing to be decorated.
In the technical solution provided by this embodiment of the invention, a first feature extraction model and at least one second feature extraction model are trained separately in advance, and the feature information of the drawing to be decorated is extracted by the trained first feature extraction model and the second feature extraction model corresponding to each room type label, which improves the accuracy of determining the decoration scheme. In addition, before the first feature extraction model is trained, flood-fill processing, normalization and data enhancement are performed on the sample decoration drawings, which improves the training accuracy and detection precision of the first feature extraction model, increases the convergence speed of the model and shortens its training time.
EXAMPLE III
Fig. 4 is a schematic structural diagram of a decoration scheme generating apparatus according to a third embodiment of the present invention. As shown in fig. 4, the apparatus includes: a characteristic data determination module 31, an intermediate sequence determination module 32, a sequence data distance calculation module 33, and a finishing scheme determination module 34.
The characteristic data determining module 31 is configured to input the obtained drawing paper to be decorated to a pre-trained characteristic extraction model to obtain characteristic data of each room of the drawing paper to be decorated;
the intermediate sequence determining module 32 is configured to convert the feature data into intermediate sequence data of each room of the drawing to be finished, where the intermediate sequence data includes a tag word vector and spatial feature information of each room;
a sequence data distance calculation module 33 for calculating a sequence data distance between the intermediate sequence data and at least one sample sequence data stored in advance, respectively;
and the decoration scheme determining module 34 is configured to determine target sequence data from the sample sequence data according to the sequence data distance, and determine layout information corresponding to the target sequence data as a decoration scheme of the drawing to be decorated.
On the basis of the above embodiments, the feature extraction model includes a first feature extraction model and at least one second feature extraction model;
on the basis of the foregoing embodiments, the feature data determining module 31 is further configured to input the drawing paper to be decorated to the first feature extraction model, so as to obtain room type label information and first coordinate information of each room of the drawing paper to be decorated;
inputting the room type label information, the first coordinate information and the to-be-decorated drawing paper corresponding to the room type label information into a second feature extraction model corresponding to the room type label information to obtain feature data of each room of the to-be-decorated drawing paper, wherein the feature data comprises the label information, the second coordinate information and the size information of components in the room.
The intermediate sequence determining module 32 is further configured to input the label information in the feature data to a pre-trained word vector model to obtain label word vectors of each room of the drawing to be decorated;
obtaining initial sequence data of each room of the drawing to be decorated based on the label word vector and second coordinate information and size information in the feature data;
and splicing the initial sequence data according to a specific splicing sequence to obtain intermediate sequence data of each room.
On the basis of the foregoing embodiments, the sequence data distance calculating module 33 is further configured to determine, based on spatial feature information of any sequence in the intermediate sequence data and spatial feature information of any sequence in the sample sequence data, an inter-sequence transfer amount between the intermediate sequence data and the sample sequence data and a tag vector distance between the two sequences;
determining a sequence data distance between the intermediate sequence data and the sample sequence data based on the two inter-sequence transfer amount and a tag vector distance in the two sequences, wherein the sequence data distance is a minimum of a sum of products of the two inter-sequence transfer amount and the tag vector distance in the two sequences.
On the basis of the foregoing embodiments, the decoration scheme determining module 34 is further configured to determine, as the target sequence data, sample sequence data corresponding to a sequence data distance smaller than a preset threshold;
or,
ranking the at least one sample sequence data based on the calculated sequence data distance;
and determining the sample sequence data corresponding to the sequence data distance of the preset sequencing position as the target sequence data.
On the basis of the above embodiments, the apparatus further includes: the tag word vector acquisition module and the analysis module; the tag word vector acquisition module is used for acquiring tag word vectors in the target sequence data;
and the analysis module is used for analyzing the label word vector of the target sequence data and determining the arrangement information corresponding to the target sequence data according to an analysis result.
On the basis of the above embodiments, the apparatus further includes a preprocessing module, which includes a segmentation module and a filling module. The segmentation module is used for performing region segmentation on the sample decoration drawing of the feature extraction model according to the spatial feature information of each room of the sample decoration drawing, to obtain at least one segmented region;
and the filling module is used for traversing, starting from the set seed point of each segmented region, the pixel values corresponding to the pixels of that region, performing colour filling on the segmented region according to the traversal result, and training the initial model with the colour-filled sample decoration drawing to obtain the feature extraction model.
In the technical solution provided by the embodiment of the invention, the obtained drawing to be decorated is input into a pre-trained feature extraction model to obtain the feature data of each room of the drawing to be decorated, so that the feature data can be determined quickly simply by inputting the drawing into the feature extraction model, without relying on a display platform, which improves the accuracy, efficiency and flexibility of feature data determination. The feature data is then converted into intermediate sequence data of each room of the drawing to be decorated, the sequence data distances between the intermediate sequence data and at least one piece of pre-stored sample sequence data are calculated respectively, target sequence data is determined from the sample sequence data according to the sequence data distances, and the layout information corresponding to the target sequence data is determined as the decoration scheme of the drawing to be decorated, so that the decoration scheme can be determined accurately and quickly. This solves the problem in the prior art that the generated decoration scheme is relatively poor, and improves the accuracy and efficiency of determining the decoration scheme of the drawing to be decorated.
Example four
Fig. 5 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention. FIG. 5 illustrates a block diagram of an exemplary electronic device 12 suitable for use in implementing embodiments of the present invention. The electronic device 12 shown in fig. 5 is only an example and should not bring any limitation to the function and the scope of use of the embodiment of the present invention.
As shown in FIG. 5, electronic device 12 is embodied in the form of a general purpose computing device. The components of electronic device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
Electronic device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by electronic device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache memory 32. The electronic device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 5, and commonly referred to as a "hard drive"). Although not shown in FIG. 5, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. The memory 28 may include at least one program product having a set of program modules (e.g., the feature data determination module 31, intermediate sequence determination module 32, sequence data distance calculation module 33 and decoration scheme determination module 34 of the decoration scheme generation apparatus) configured to perform the functions of embodiments of the present invention.
A program/utility 44 having a set of program modules 46 (e.g., the feature data determination module 31, intermediate sequence determination module 32, sequence data distance calculation module 33 and decoration scheme determination module 34 of the decoration scheme generation apparatus) may be stored, for example, in memory 28; such program modules 46 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may include an implementation of a network environment. Program modules 46 generally carry out the functions and/or methodologies of the described embodiments of the invention.
Electronic device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with electronic device 12, and/or with any devices (e.g., network card, modem, etc.) that enable electronic device 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, the electronic device 12 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) via the network adapter 20. As shown, the network adapter 20 communicates with other modules of the electronic device 12 via the bus 18. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with electronic device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 16 executes various functional applications and data processing by executing programs stored in the system memory 28, for example, to implement the decoration scheme generation method provided by an embodiment of the present invention, the method including:
inputting the obtained drawing paper to be decorated into a pre-trained feature extraction model to obtain feature data of each room of the drawing paper to be decorated;
converting the characteristic data into intermediate sequence data of each room of the drawing to be decorated, wherein the intermediate sequence data comprises tag word vectors and spatial characteristic information of each room;
respectively calculating sequence data distances between the intermediate sequence data and at least one piece of pre-stored sample sequence data;
and determining target sequence data from the sample sequence data according to the sequence data distance, and determining layout information corresponding to the target sequence data as a decoration scheme of the drawing to be decorated.
That is, the processing unit 16 executes the programs stored in the system memory 28 to implement the decoration scheme generation method provided by an embodiment of the present invention.
Of course, those skilled in the art will appreciate that the processor may also implement the technical solution of the decoration scheme generation method provided in any embodiment of the present invention.
EXAMPLE five
An embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements a decoration scheme generation method provided in an embodiment of the present invention, where the method includes:
inputting the obtained drawing paper to be decorated into a pre-trained feature extraction model to obtain feature data of each room of the drawing paper to be decorated;
converting the characteristic data into intermediate sequence data of each room of the drawing to be decorated, wherein the intermediate sequence data comprises tag word vectors and spatial characteristic information of each room;
respectively calculating sequence data distances between the intermediate sequence data and at least one piece of pre-stored sample sequence data;
and determining target sequence data from the sample sequence data according to the sequence data distance, and determining layout information corresponding to the target sequence data as a decoration scheme of the drawing to be decorated.
Of course, the computer program stored on the computer-readable storage medium provided by the embodiment of the present invention is not limited to the above method operations, and may also perform related operations in a decoration scheme generation method provided by any embodiment of the present invention.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example carrying the feature data, intermediate sequence data, sequence data distances and target sequence data described above. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, or the like, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It should be noted that, in the above apparatus embodiment, the included modules are divided only according to functional logic, but the division is not limited thereto as long as the corresponding functions can be implemented; in addition, the specific names of the functional units are only for ease of distinguishing them from each other and are not intended to limit the protection scope of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A decoration scheme generation method, comprising:
inputting the obtained drawing paper to be decorated into a pre-trained feature extraction model to obtain feature data of each room of the drawing paper to be decorated;
converting the characteristic data into intermediate sequence data of each room of the drawing to be decorated, wherein the intermediate sequence data comprises tag word vectors and spatial characteristic information of each room;
respectively calculating sequence data distances between the intermediate sequence data and at least one piece of pre-stored sample sequence data;
and determining target sequence data from the sample sequence data according to the sequence data distance, and determining layout information corresponding to the target sequence data as a decoration scheme of the drawing to be decorated.
2. The method of claim 1, wherein the feature extraction models comprise a first feature extraction model and at least one second feature extraction model;
correspondingly, inputting the obtained drawing paper to be decorated into a pre-trained feature extraction model to obtain feature data of each room of the drawing paper to be decorated, and the feature data comprises the following steps:
inputting the drawing paper to be decorated to the first feature extraction model to obtain room type label information and first coordinate information of each room of the drawing paper to be decorated;
inputting the room type label information, the first coordinate information and the to-be-decorated drawing paper corresponding to the room type label information into a second feature extraction model corresponding to the room type label information to obtain feature data of each room of the to-be-decorated drawing paper, wherein the feature data comprises the label information, the second coordinate information and the size information of components in the room.
3. The method of claim 2, wherein the converting the feature data into the intermediate sequence data of the rooms of the drawing to be decorated comprises:
inputting the label information in the characteristic data into a pre-trained word vector model to obtain label word vectors of all rooms of the drawing to be decorated;
obtaining initial sequence data of each room of the drawing to be decorated based on the label word vector and second coordinate information and size information in the feature data;
and splicing the initial sequence data according to a specific splicing sequence to obtain intermediate sequence data of each room.
4. The method of claim 1, wherein the separately calculating the sequence data distance between the intermediate sequence data and at least one pre-stored sample sequence data comprises:
determining the inter-sequence transfer amount of the intermediate sequence data and the sample sequence data and the label vector distance in the two sequences based on the spatial feature information of any one of the intermediate sequence data and the spatial feature information of any one of the sample sequence data;
determining a sequence data distance between the intermediate sequence data and the sample sequence data based on the two inter-sequence transfer amount and a tag vector distance in the two sequences, wherein the sequence data distance is a minimum of a sum of products of the two inter-sequence transfer amount and the tag vector distance in the two sequences.
5. The method of claim 1, wherein determining target sequence data from the sample sequence data based on the sequence data distance comprises:
determining sample sequence data corresponding to the sequence data distance smaller than a preset threshold value as the target sequence data;
or,
ranking the sample sequence data based on the calculated sequence data distance;
and determining the sample sequence data corresponding to the sequence data distance of the preset sequencing position as the target sequence data.
6. The method of claim 1, wherein before the determining the layout information corresponding to the target sequence data as the finishing scheme of the drawing to be finished, the method further comprises:
acquiring a tag word vector of the target sequence data;
and analyzing the label word vector of the target sequence data, and determining the arrangement information corresponding to the target sequence data according to the analysis result.
7. The method of claim 1, further comprising:
performing region segmentation on a sample decoration drawing for the feature extraction model according to the spatial feature information of each room of the sample decoration drawing, to obtain at least one segmentation region;
starting from a set seed point of each segmentation region, traversing the pixel values of the pixel points in the segmentation region, and filling the segmentation region with color according to the traversal result;
and training an initial model with the color-filled sample decoration drawing to obtain the feature extraction model.
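The color-filling step of claim 7 can be read as a seeded region fill: traverse the pixels reachable from the seed whose values stay close to the seed value and paint them. The sketch below is a plain BFS flood fill over one segmentation region; the tolerance and fill color are assumptions, and the patent does not prescribe this particular traversal.

from collections import deque
import numpy as np

def fill_region(image, seed, fill_color, tol=10):
    """Paint, in place, the pixels connected to `seed` whose values lie within `tol` of the seed value."""
    height, width = image.shape[:2]
    seed_value = image[seed].astype(int)
    visited = np.zeros((height, width), dtype=bool)
    visited[seed] = True
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        image[y, x] = fill_color
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < height and 0 <= nx < width and not visited[ny, nx]:
                if np.all(np.abs(image[ny, nx].astype(int) - seed_value) <= tol):
                    visited[ny, nx] = True
                    queue.append((ny, nx))
    return image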
8. A decoration scheme generation apparatus, comprising:
a feature data determining module, configured to input the obtained drawing to be decorated into a pre-trained feature extraction model to obtain feature data of each room of the drawing to be decorated;
an intermediate sequence determining module, configured to convert the feature data into intermediate sequence data of each room of the drawing to be decorated, wherein the intermediate sequence data comprises label word vectors and spatial feature information of each room;
a sequence data distance calculation module, configured to respectively calculate a sequence data distance between the intermediate sequence data and at least one pre-stored sample sequence data;
and a decoration scheme determining module, configured to determine target sequence data from the sample sequence data according to the sequence data distance, and determine layout information corresponding to the target sequence data as a decoration scheme of the drawing to be decorated.
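For illustration only, the four modules of claim 8 can be composed as below; the callables and the sample-store fields are hypothetical stand-ins for the modules named in the claim.

class DecorationSchemeGenerator:
    """Hypothetical composition of the modules in claim 8."""

    def __init__(self, extract_features, build_sequence, sequence_distance, sample_store):
        self.extract_features = extract_features    # feature data determining module
        self.build_sequence = build_sequence        # intermediate sequence determining module
        self.sequence_distance = sequence_distance  # sequence data distance calculation module
        self.sample_store = sample_store            # pre-stored sample sequence data with layouts

    def generate(self, drawing):
        features = self.extract_features(drawing)
        intermediate = self.build_sequence(features)
        # Decoration scheme determining module: return the layout of the closest sample.
        scored = [(self.sequence_distance(intermediate, sample["sequence"]), sample)
                  for sample in self.sample_store]
        _, best = min(scored, key=lambda pair: pair[0])
        return best["layout"]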
9. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the decoration scheme generation method according to any one of claims 1 to 7 when executing the computer program.
10. A storage medium containing computer-executable instructions which, when executed by a computer processor, implement the decoration scheme generation method according to any one of claims 1 to 7.
CN202010331237.3A 2020-04-24 2020-04-24 Decoration scheme generation method and device, electronic equipment and storage medium Active CN111523169B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010331237.3A CN111523169B (en) 2020-04-24 2020-04-24 Decoration scheme generation method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111523169A true CN111523169A (en) 2020-08-11
CN111523169B CN111523169B (en) 2023-06-13

Family

ID=71911046

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010331237.3A Active CN111523169B (en) 2020-04-24 2020-04-24 Decoration scheme generation method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111523169B (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104778756A (en) * 2015-04-10 2015-07-15 北京明兰网络科技有限公司 Intelligent home decoration design system
US20180268220A1 (en) * 2017-03-17 2018-09-20 Magic Leap, Inc. Room layout estimation methods and techniques
CN107832476A (en) * 2017-12-01 2018-03-23 北京百度网讯科技有限公司 A kind of understanding method of search sequence, device, equipment and storage medium
US20190354850A1 (en) * 2018-05-17 2019-11-21 International Business Machines Corporation Identifying transfer models for machine learning tasks
CN108984904A (en) * 2018-07-17 2018-12-11 北京理工大学 A home decoration design method based on a deep neural network
WO2020051776A1 (en) * 2018-09-11 2020-03-19 Intel Corporation Method and system of deep supervision object detection for reducing resource usage
CN109933662A (en) * 2019-02-15 2019-06-25 北京奇艺世纪科技有限公司 Model training method, information generating method, device, electronic equipment and computer-readable medium
CN110363853A (en) * 2019-07-15 2019-10-22 贝壳技术有限公司 Furniture puts scheme generation method, device and equipment, storage medium
KR20190106867A (en) * 2019-08-27 2019-09-18 엘지전자 주식회사 An artificial intelligence apparatus for guiding arrangement location of furniture and operating method thereof
CN110929319A (en) * 2019-11-01 2020-03-27 深圳市彬讯科技有限公司 Decoration design case information processing method and system
CN110826328A (en) * 2019-11-06 2020-02-21 腾讯科技(深圳)有限公司 Keyword extraction method and device, storage medium and computer equipment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LIU, Milan et al.: "Automatic layout method for indoor areas based on a case library", Journal of Computer-Aided Design &amp; Computer Graphics *
WANG, Tingkui et al.: "Research on personalized design of fully decorated houses based on BIM", Chinese Journal of Engineering Design *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112651801A (en) * 2020-12-23 2021-04-13 北京城市网邻信息技术有限公司 Method and device for displaying house source information
CN113486432A (en) * 2021-07-15 2021-10-08 苏州苏明装饰股份有限公司 Assembly decoration module combination method and system based on standardization
CN113486432B (en) * 2021-07-15 2022-02-18 苏州苏明装饰股份有限公司 Assembly decoration module combination method and system based on standardization
CN115796556A (en) * 2023-02-01 2023-03-14 北京有竹居网络技术有限公司 Decoration scheme determination method and device, electronic equipment and readable storage medium

Also Published As

Publication number Publication date
CN111523169B (en) 2023-06-13

Similar Documents

Publication Publication Date Title
US11586785B2 (en) Information processing apparatus, information processing method, and program
JP6584478B2 (en) Method and apparatus for improved segmentation and recognition of images
CN111523169B (en) Decoration scheme generation method and device, electronic equipment and storage medium
CN112016638B (en) Method, device and equipment for identifying steel bar cluster and storage medium
CN110442856B (en) Address information standardization method and device, computer equipment and storage medium
EP3996055A1 (en) Machine learning techniques for extracting floorplan elements from architectural drawings
CN110929802A (en) Information entropy-based subdivision identification model training and image identification method and device
CN110263218B (en) Video description text generation method, device, equipment and medium
CN109544516B (en) Image detection method and device
CN113780098A (en) Character recognition method, character recognition device, electronic equipment and storage medium
CN113378712A (en) Training method of object detection model, image detection method and device thereof
CN111563429A (en) Drawing verification method and device, electronic equipment and storage medium
CN114581732A (en) Image processing and model training method, device, equipment and storage medium
CN114581794A (en) Geographic digital twin information acquisition method and device, electronic equipment and storage medium
CN108460335B (en) Video fine-granularity identification method and device, computer equipment and storage medium
CN114387642A (en) Image segmentation method, device, equipment and storage medium
CN113537026A (en) Primitive detection method, device, equipment and medium in building plan
CN110929499B (en) Text similarity obtaining method, device, medium and electronic equipment
CN112347776B (en) Medical data processing method and device, storage medium and electronic equipment
CN112857746A (en) Tracking method and device of lamplight detector, electronic equipment and storage medium
CN113963211B (en) Unsupervised domain adaptation training method and system for gesture recognition
CN114973333A (en) Human interaction detection method, human interaction detection device, human interaction detection equipment and storage medium
CN111612890B (en) Method and device for automatically generating three-dimensional model by two-dimensional house type graph and electronic equipment
CN109815307B (en) Position determination method, apparatus, device, and medium
CN114220163A (en) Human body posture estimation method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant