CN111476875B - Smart building Internet of things object simulation method and building cloud server - Google Patents


Info

Publication number
CN111476875B
Authority
CN
China
Prior art keywords
rendering
animation
internet
building
synchronous
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010262005.7A
Other languages
Chinese (zh)
Other versions
CN111476875A (en)
Inventor
张志云
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ANHUI ANTAI TECHNOLOGY Co.,Ltd.
Original Assignee
ANHUI ANTAI TECHNOLOGY CO LTD
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ANHUI ANTAI TECHNOLOGY CO LTD
Priority to CN202011150566.4A (CN112907738A)
Priority to CN202010262005.7A (CN111476875B)
Priority to CN202011150309.0A (CN112288868A)
Publication of CN111476875A
Application granted
Publication of CN111476875B
Legal status: Active

Links

Images

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 - Geographic models
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 - Animation
    • G06T13/20 - 3D [Three Dimensional] animation
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L67/01 - Protocols
    • H04L67/12 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00 - Indexing scheme for image generation or computer graphics
    • G06T2210/04 - Architectural design, interior design

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Remote Sensing (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the invention provides a smart building Internet of things object simulation method and a building cloud server. Building object entities in each smart building simulation space are classified according to predetermined building functions, so that the differences between the building functions of the smart building system are taken into account and rendering conflicts during the rendering process are reduced. In addition, the operation results generated after synchronous rendering of each individual associated Internet of things object entity are simulated separately, so that during actual observation the associated Internet of things object entities can be taken as independent observation objects for subsequent service updating.

Description

Smart building Internet of things object simulation method and building cloud server
Technical Field
The invention relates to the technical field of smart buildings, and in particular to a smart building Internet of things object simulation method and a building cloud server.
Background
With the rapid development of Internet of things and 5G technology, the Internet of things plays an increasingly important role. A smart building Internet of things object simulation system built with Internet of things technology can provide more humanized and intelligent terminal solution services while realizing a smart building. At present, when a smart building is planned, three-dimensional model rendering is usually performed in advance on the smart building Internet of things object simulation system; for example, the operating conditions of each Internet of things object entity in the system (such as a human-computer interaction terminal, a security terminal and a mobile application terminal) are rendered in advance, so as to facilitate subsequent service updating.
In traditional schemes, the differences between the building functions of the smart building Internet of things object simulation system are generally not considered, so rendering conflicts easily arise during simulation rendering. In addition, during rendering there may be linked operation effects between different Internet of things object entities, which can help a user grasp an overview of the current terminal solution service. However, a scheme for separately simulating the operation result of each associated Internet of things object entity is currently lacking, so the associated Internet of things object entities cannot be used as independent observation objects in a targeted manner for subsequent service updating during actual observation.
Disclosure of Invention
In order to overcome at least the above-mentioned deficiencies in the prior art, the present invention aims to provide a smart building Internet of things object simulation method and a building cloud server. Building object entities in each smart building simulation space are classified according to predetermined building functions, so that the differences between the building functions of the smart building system are taken into account and rendering conflicts during the rendering process are reduced. In addition, the operation results generated after synchronous rendering of each individual associated Internet of things object entity are simulated separately, so that the associated Internet of things object entities can be used as independent observation objects for subsequent service updating during actual observation.
In a first aspect, the invention provides a smart building Internet of things object simulation method applied to a building cloud server, the building cloud server being in communication connection with a plurality of building service terminals. The method comprises the following steps:
obtaining, from each building service terminal, the building object entities of a target building three-dimensional model in the smart building simulation space of each smart building object, classifying the building object entities in the smart building simulation space of each smart building object according to predetermined building functions, and respectively generating a building object entity set for each building function;
for each building function, acquiring the model rendering data corresponding to each Internet of things object entity in the building object entity set of the building function, and performing operation simulation on the model rendering data corresponding to each Internet of things object entity;
monitoring, during the operation simulation, whether there is rendering linkage information representing rendering linkage between Internet of things object entities, and, when the rendering linkage information is detected, extracting first model rendering data of a first Internet of things object entity corresponding to the rendering linkage information and second model rendering data of at least one second Internet of things object entity having a rendering synchronization relationship with the first Internet of things object entity;
associating the first model rendering data and the second model rendering data to a preset synchronous rendering queue, and establishing a plurality of first synchronous rendering parameters of the first model rendering data and a plurality of second synchronous rendering parameters of the second model rendering data based on the synchronous rendering queue;
determining a first skeleton reconstruction parameter of the first Internet of things object entity according to each first synchronous rendering parameter, and determining a second skeleton reconstruction parameter of the second Internet of things object entity according to each second synchronous rendering parameter; mapping the first skeleton reconstruction parameter and the second skeleton reconstruction parameter to a preset simulation space to obtain a first simulated reconstruction three-dimensional animation corresponding to the first skeleton reconstruction parameter and a second simulated reconstruction three-dimensional animation corresponding to the second skeleton reconstruction parameter; determining a plurality of linkage animation frames in the preset simulation space, and summarizing the plurality of linkage animation frames to obtain summary animation sequences of at least a plurality of different categories; and, for each summary animation sequence, running the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in the summary animation sequence in a preset simulation running process;
and determining complete linkage record information between the first Internet of things object entity and the at least one second Internet of things object entity according to the running results of the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in the summary animation sequence.
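For illustration only, the following simplified Python sketch outlines how a building cloud server might organize the classification, simulation and linkage-extraction steps described above; the function names, dictionary fields (for example the "rendering_linkage" flag) and data shapes are assumptions of this sketch and do not appear in the claims.

    from collections import defaultdict

    def classify_by_building_function(building_object_entities):
        """Group building object entities by their predetermined building function."""
        entity_sets = defaultdict(list)
        for entity in building_object_entities:
            entity_sets[entity["building_function"]].append(entity)
        return dict(entity_sets)

    def simulate_building_function(entity_set, fetch_rendering_data, run_operation_simulation):
        """Acquire model rendering data for every IoT object entity in one building function's
        entity set and run an operation simulation over it."""
        rendering_data = {entity["id"]: fetch_rendering_data(entity) for entity in entity_set}
        # The simulation is expected to emit records, including rendering linkage events.
        return run_operation_simulation(rendering_data)

    def extract_linked_rendering_data(simulation_records):
        """Collect (first, second) model rendering data pairs whenever rendering linkage
        information is detected in the simulation records."""
        linked_pairs = []
        for record in simulation_records:
            if record.get("rendering_linkage"):
                linked_pairs.append((record["first_model_data"], record["second_model_data"]))
        return linked_pairs

    # Example of the classification step only:
    entities = [
        {"id": "hvac-01", "building_function": "climate"},
        {"id": "cam-07", "building_function": "security"},
        {"id": "door-03", "building_function": "security"},
    ]
    print(classify_by_building_function(entities))
    # {'climate': [...], 'security': [...]} grouped by building function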
In a possible implementation manner of the first aspect, the step of obtaining model rendering data corresponding to each internet of things object entity in a building object entity set of the building function includes:
judging whether each Internet of things object entity is associated with a rendering service relationship; the rendering service relationship is used for setting the rendering service of the model rendering data corresponding to the Internet of things object entity, each Internet of things object entity corresponds to one rendering service relationship, and different rendering service relationships have different service functions;
if an Internet of things object entity is not associated with a corresponding rendering service relationship, acquiring the database information of that Internet of things object entity; the database information comprises the Internet of things service type corresponding to the Internet of things object entity, namely the Internet of things service type corresponding to the model rendering data generated by the Internet of things object entity;
parsing and identifying each piece of database information according to its function distinguishing character to obtain at least a plurality of function distinguishing fields corresponding to each piece of database information, and determining, from the function distinguishing fields corresponding to each piece of database information, a target function distinguishing field carrying service header information; the service header information is an identifier marking a function distinguishing field as the function distinguishing field corresponding to the Internet of things service type;
associating a corresponding rendering service relationship with each Internet of things object entity according to the service resource identification segment in the target function distinguishing field corresponding to that Internet of things object entity, wherein the rendering service relationship is determined according to the rendering service relationship corresponding to each service resource in the service resource identification segment of the target function distinguishing field;
and obtaining the model rendering data corresponding to each Internet of things object entity from a pre-configured model rendering database according to the rendering service relationship associated with each Internet of things object entity, wherein the model rendering database comprises the model rendering data of each Internet of things object entity under different rendering service relationships.
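For illustration only, the following sketch shows one way the database-information parsing and rendering-service lookup described above could be realized, assuming the database information is a delimited string; the separator, header marker and table layout are assumptions of this sketch rather than the claimed format.

    def resolve_rendering_service_relationship(db_info, field_separator, header_marker,
                                               resource_relation_table):
        """Split database information into function-distinguishing fields, pick the target
        field that carries service header information, and map the service resources in its
        identification segment to a rendering service relationship."""
        fields = db_info.split(field_separator)
        target_field = next(f for f in fields if f.startswith(header_marker))
        resource_ids = target_field[len(header_marker):].split(",")
        # One relationship per service resource; here they are simply combined.
        return "+".join(resource_relation_table[r] for r in resource_ids)

    def fetch_model_rendering_data(entity_id, relationship, model_rendering_db):
        """Look up pre-configured model rendering data keyed by entity and relationship."""
        return model_rendering_db[(entity_id, relationship)]

    relationship = resolve_rendering_service_relationship(
        db_info="typ:lighting|hdr:res1,res2|misc:x",
        field_separator="|",
        header_marker="hdr:",
        resource_relation_table={"res1": "dim", "res2": "schedule"},
    )
    print(relationship)  # dim+schedule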
In a possible implementation manner of the first aspect, the step of associating the first model rendering data and the second model rendering data to a preset synchronous rendering queue includes:
determining synchronous rendering configuration information of the synchronous rendering queue; the synchronous rendering configuration information represents the synchronous rendering unit allocated when the synchronous rendering queue processes the successively associated model rendering data, and the synchronous rendering unit represents the rendering vector information used when the synchronous rendering queue renders the associated model rendering data;
determining, based on the synchronized rendering configuration information, first rendering vector information corresponding to the synchronized rendering queue to which the first model rendering data is associated and second rendering vector information corresponding to the synchronized rendering queue to which the second model rendering data is associated;
determining, from the first rendering vector information and the second rendering vector information, whether there is rendering synchronization when associating the first model rendering data and the second model rendering data to the synchronized rendering queue; wherein the rendering synchronization is used for representing that the rendering of the synchronous rendering queue has synchronization behavior;
if not, adjusting the second rendering vector information to obtain third rendering vector information, and associating the first model rendering data and the second model rendering data to the synchronous rendering queue based on the first rendering vector information and the third rendering vector information, wherein the vector difference between the third rendering vector information and the second rendering vector information is matched with the vector difference between the first rendering vector information and the second rendering vector information;
and if so, continuously adopting the first rendering vector information and the second rendering vector information to associate the first model rendering data and the second model rendering data to the synchronous rendering queue.
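For illustration only, the following sketch interprets the rendering vector information as a numeric tuple and shows the synchronization check and adjustment described above; the matching tolerance and the concrete adjustment rule are assumptions of this sketch.

    def vectors_match(v1, v2, tolerance=1e-6):
        """Rendering synchronization exists when the two rendering vectors match."""
        return len(v1) == len(v2) and all(abs(a - b) <= tolerance for a, b in zip(v1, v2))

    def associate_to_sync_queue(queue, first_data, second_data, v1, v2):
        """Associate both sets of model rendering data to the synchronous rendering queue,
        adjusting the second rendering vector when no rendering synchronization exists."""
        if vectors_match(v1, v2):
            queue.append((first_data, v1))
            queue.append((second_data, v2))
        else:
            # Offset the second vector by the difference between the first and second vectors:
            # one possible reading of "the vector difference between the third and second
            # rendering vector information is matched with that between the first and second".
            v3 = tuple(b + (a - b) for a, b in zip(v1, v2))
            queue.append((first_data, v1))
            queue.append((second_data, v3))
        return queue

    sync_queue = associate_to_sync_queue([], "first_model", "second_model",
                                         v1=(1.0, 0.0), v2=(0.5, 0.5))
    print(sync_queue)  # [('first_model', (1.0, 0.0)), ('second_model', (1.0, 0.0))]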
In a possible implementation manner of the first aspect, the step of establishing a plurality of first synchronous rendering parameters of the first model rendering data and a plurality of second synchronous rendering parameters of the second model rendering data based on the synchronous rendering queue includes:
determining, based on the synchronized rendering queue, a first sequence of rendering nodes for the first model rendering data and a second sequence of rendering nodes for the second model rendering data; the rendering node sequence is used for representing the animation coordination relation of model rendering data under different rendering nodes;
establishing a plurality of first synchronous rendering parameters of the first model rendering data and a plurality of second synchronous rendering parameters of the second model rendering data in the synchronous rendering queue according to the first rendering node sequence and the second rendering node sequence respectively.
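For illustration only, the following sketch derives synchronous rendering parameters from a rendering node sequence represented as ordered (node, animation coordination relation) pairs; this representation is an assumption of this sketch.

    def build_sync_rendering_parameters(rendering_node_sequence):
        """Derive one synchronous rendering parameter per rendering node, carrying the node,
        its animation coordination relation and its position in the sequence."""
        return [{"node": node_id, "coordination": coordination, "order": index}
                for index, (node_id, coordination) in enumerate(rendering_node_sequence)]

    first_parameters = build_sync_rendering_parameters(
        [("n1", "transition"), ("n2", "overlay"), ("n3", "add")])
    second_parameters = build_sync_rendering_parameters(
        [("m1", "overlay"), ("m2", "transition")])
    print(first_parameters[0])  # {'node': 'n1', 'coordination': 'transition', 'order': 0}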
In a possible implementation manner of the first aspect, the step of determining a first skeleton reconstruction parameter of the first Internet of things object entity according to each first synchronous rendering parameter, and determining a second skeleton reconstruction parameter of the second Internet of things object entity according to each second synchronous rendering parameter includes:
determining a rendering node timing axis corresponding to each first synchronous rendering parameter according to the rendering nodes in that first synchronous rendering parameter and the rendering animation continuity parameters between every two adjacent rendering nodes;
determining the first skeleton reconstruction parameter of the first Internet of things object entity based on the rendering node timing axis; each rendering node in the first synchronous rendering parameters is provided with corresponding rendering animation continuity input and output parameters, the matching parameters between the continuity input and output parameters of one rendering node and those of any other rendering node serve as the corresponding rendering animation continuity parameters, and the continuity input and output parameters are determined according to the rendering tracks of the rendering nodes in the first synchronous rendering parameters;
listing the rendering nodes of each second synchronous rendering parameter and the rendering animation continuity input and output parameters corresponding to those rendering nodes, to obtain a first rendering script and a second rendering script corresponding to each second synchronous rendering parameter; the first rendering script corresponds to the rendering nodes of the second synchronous rendering parameter, and the second rendering script corresponds to the rendering animation continuity input and output parameters of the second synchronous rendering parameter;
determining a first association relationship of the first rendering script relative to the second rendering script and a second association relationship of the second rendering script relative to the first rendering script;
acquiring at least three target associated nodes with the same node continuity in the first association relationship and the second association relationship, and determining the second skeleton reconstruction parameter of the second synchronous rendering parameter according to the target associated nodes; the node continuity characterizes the rendering animation continuity input-output relationship between every two associated nodes.
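For illustration only, the following sketch shows one way a skeleton reconstruction parameter could be derived from synchronous rendering parameters via a rendering node timing axis; the timing-axis representation and the "continuity_to_previous" field are assumptions of this sketch.

    def rendering_node_timing_axis(sync_parameters):
        """Order rendering nodes on a timing axis, spacing adjacent nodes by their rendering
        animation continuity parameter."""
        axis, t = [], 0.0
        for index, parameter in enumerate(sync_parameters):
            if index > 0:
                t += parameter.get("continuity_to_previous", 1.0)
            axis.append({"node": parameter["node"], "t": t})
        return axis

    def skeleton_reconstruction_parameter(sync_parameters):
        """Reduce the timing axis to a skeleton reconstruction parameter: the ordered
        node/time pairs plus the total continuity span they cover."""
        axis = rendering_node_timing_axis(sync_parameters)
        span = axis[-1]["t"] - axis[0]["t"] if axis else 0.0
        return {"axis": axis, "span": span}

    print(skeleton_reconstruction_parameter(
        [{"node": "n1"}, {"node": "n2", "continuity_to_previous": 0.4}]))
    # {'axis': [{'node': 'n1', 't': 0.0}, {'node': 'n2', 't': 0.4}], 'span': 0.4}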
In a possible implementation manner of the first aspect, the step of summarizing the plurality of linked animation frames to obtain at least a plurality of summarized animation sequences of different categories includes:
determining the number of simulated reconstruction three-dimensional animations corresponding to each linkage animation frame in the preset simulation space;
determining the category spread range of the simulated reconstruction three-dimensional animations corresponding to each linkage animation frame; the category spread range is the overlap proportion between the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame;
determining the geometric primitive information of the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame; the geometric primitive information is obtained by calculating the image characteristic values of the animation areas corresponding to a set number of first simulated reconstruction three-dimensional animations and second simulated reconstruction three-dimensional animations;
determining a frame characteristic sequence of each linkage animation frame according to the number, the category spread range and the geometric primitive information of the simulated reconstruction three-dimensional animation corresponding to each linkage animation frame;
and summarizing each linkage animation frame based on the frame feature sequence of each linkage animation frame to obtain the summarized animation sequences of at least a plurality of different categories.
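For illustration only, the following sketch groups linkage animation frames into summary animation sequences by their frame feature sequence; treating the feature triple as a grouping key is an assumption of this sketch.

    from collections import defaultdict

    def frame_feature_sequence(linkage_frame):
        """Combine animation count, category spread range and geometric primitive information
        into a single hashable feature key."""
        return (linkage_frame["animation_count"],
                round(linkage_frame["category_spread_range"], 2),
                linkage_frame["geometric_primitive_info"])

    def summarize_linkage_frames(linkage_frames):
        """Group linkage animation frames whose feature sequences coincide into one summary
        animation sequence per category."""
        sequences = defaultdict(list)
        for frame in linkage_frames:
            sequences[frame_feature_sequence(frame)].append(frame)
        return list(sequences.values())

    frames = [
        {"animation_count": 2, "category_spread_range": 0.50, "geometric_primitive_info": "tri"},
        {"animation_count": 2, "category_spread_range": 0.50, "geometric_primitive_info": "tri"},
        {"animation_count": 3, "category_spread_range": 0.25, "geometric_primitive_info": "quad"},
    ]
    print(len(summarize_linkage_frames(frames)))  # 2 summary animation sequences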
In a possible implementation manner of the first aspect, the step of running the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in the summary animation sequence in a preset simulation running process includes:
determining synchronous rendering configuration information of a frame feature sequence corresponding to each linkage animation frame in each summary animation sequence;
determining, according to the synchronous rendering configuration information, the synchronous rendering errors of the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in each summary animation sequence; the synchronous rendering error represents the rendering error condition of the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame;
judging whether the difference between each synchronous rendering error and the reference rendering error corresponding to the simulation running process falls within a preset difference interval; the preset difference interval represents the interval in which each synchronous rendering error lies when the simulation running process runs normally;
when the difference between each synchronous rendering error and the reference rendering error corresponding to the simulation running process falls within the preset difference interval, running the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in the summary animation sequence based on the simulation running process;
otherwise, correcting, according to the thread script of the simulation running process, the synchronous rendering configuration information corresponding to any synchronous rendering error whose difference does not fall within the preset difference interval, and returning to the step of determining the synchronous rendering errors of the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in each summary animation sequence according to the synchronous rendering configuration information.
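For illustration only, the following sketch shows the error check and correction loop described above; the error metric, the reference rendering error and the correction callback are assumptions of this sketch.

    def within_difference_interval(error, reference_error, interval):
        """Check whether the difference from the reference error falls in the interval."""
        low, high = interval
        return low <= (error - reference_error) <= high

    def run_with_error_check(linkage_frames, compute_error, correct_configuration, run_frame,
                             reference_error, interval, max_rounds=10):
        """Run the animations of every linkage frame only once all synchronous rendering
        errors fall within the preset difference interval; otherwise correct the
        corresponding configuration and re-evaluate."""
        for _ in range(max_rounds):
            errors = {frame["id"]: compute_error(frame) for frame in linkage_frames}
            out_of_range = [frame_id for frame_id, error in errors.items()
                            if not within_difference_interval(error, reference_error, interval)]
            if not out_of_range:
                return [run_frame(frame) for frame in linkage_frames]
            for frame_id in out_of_range:
                correct_configuration(frame_id)  # adjust synchronous rendering configuration
        raise RuntimeError("synchronous rendering errors did not converge")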
In a possible implementation manner of the first aspect, the step of determining complete linkage record information between the first internet-of-things object entity and the at least one second internet-of-things object entity according to the running results of the first simulated reconstructed three-dimensional animation and the second simulated reconstructed three-dimensional animation corresponding to each linkage animation frame in the summary animation sequence includes:
determining the complete linkage record information between the first Internet of things object entity and the at least one second Internet of things object entity according to a simulated rendering stream obtained by splicing, in time order, the running results of the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in the summary animation sequence.
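For illustration only, the following sketch splices per-frame running results into a time-ordered simulated rendering stream that serves as the complete linkage record; the record fields are assumptions of this sketch.

    def build_complete_linkage_record(run_results):
        """Splice per-frame running results into a time-ordered simulated rendering stream
        and summarize which entities took part in the linkage."""
        stream = sorted(run_results, key=lambda result: result["timestamp"])
        return {
            "first_entity": stream[0]["first_entity"] if stream else None,
            "second_entities": sorted({result["second_entity"] for result in stream}),
            "rendering_stream": stream,
        }

    record = build_complete_linkage_record([
        {"timestamp": 2, "first_entity": "hmi-1", "second_entity": "door-3", "result": "open"},
        {"timestamp": 1, "first_entity": "hmi-1", "second_entity": "cam-7", "result": "pan"},
    ])
    print([r["timestamp"] for r in record["rendering_stream"]])  # [1, 2]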
In a second aspect, an embodiment of the present invention further provides an intelligent building internet of things object simulation apparatus, which is applied to a building cloud server, where the building cloud server is in communication connection with a plurality of building service terminals, and the apparatus includes:
the classification module is used for acquiring building object entities of the target building three-dimensional model in the intelligent building simulation space of each intelligent building object from each building service terminal, classifying the building object entities in the intelligent building simulation space according to the preset building functions, and respectively generating a building object entity set of each building function;
the operation simulation module is used for acquiring model rendering data corresponding to each Internet of things object entity in a building object entity set of each building function according to each building function, and performing operation simulation on the model rendering data corresponding to each Internet of things object entity;
the extraction module is used for monitoring, during the operation simulation, whether there is rendering linkage information representing rendering linkage between Internet of things object entities, and extracting, when the rendering linkage information is detected, the first model rendering data of the first Internet of things object entity corresponding to the rendering linkage information and the second model rendering data of at least one second Internet of things object entity having a rendering synchronization relationship with the first Internet of things object entity;
the association establishing module is used for associating the first model rendering data and the second model rendering data to a preset synchronous rendering queue, and establishing a plurality of first synchronous rendering parameters of the first model rendering data and a plurality of second synchronous rendering parameters of the second model rendering data based on the synchronous rendering queue;
an operation module for determining a first skeleton reconstruction parameter of the first Internet of things object entity according to each first synchronous rendering parameter, and determining a second skeleton reconstruction parameter of the second Internet of things object entity according to each second synchronous rendering parameter; mapping the first skeleton reconstruction parameter and the second skeleton reconstruction parameter to a preset simulation space to obtain a first simulated reconstruction three-dimensional animation corresponding to the first skeleton reconstruction parameter and a second simulated reconstruction three-dimensional animation corresponding to the second skeleton reconstruction parameter; determining a plurality of linkage animation frames in the preset simulation space, and summarizing the plurality of linkage animation frames to obtain summary animation sequences of at least a plurality of different categories; and, for each summary animation sequence, running the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in the summary animation sequence in a preset simulation running process;
and the determining module is used for determining complete linkage record information between the first internet of things object entity and the at least one second internet of things object entity according to the running results of the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in the summary animation sequence.
In a third aspect, the embodiment of the invention also provides a smart building internet of things object simulation system, which comprises a building cloud server and a plurality of building service terminals in communication connection with the building cloud server;
the building service terminal is used for sending building object entities of the target building three-dimensional model in the intelligent building simulation space of each intelligent building object to the building cloud server
The building cloud server is used for acquiring building object entities of the target building three-dimensional model in the intelligent building simulation space of each intelligent building object from each building service terminal, classifying the building object entities in the intelligent building simulation space according to the preset building functions, and respectively generating a building object entity set of each building function;
the building cloud server is used for acquiring model rendering data corresponding to each Internet of things object entity in a building object entity set of each building function according to each building function, and performing operation simulation on the model rendering data corresponding to each Internet of things object entity;
the building cloud server is used for monitoring whether rendering linkage information used for representing rendering linkage of the Internet of things object entities exists or not in the operation simulation process, and extracting first model rendering data of a first Internet of things object entity corresponding to the rendering linkage information in operation simulation and second model rendering data of at least one second Internet of things object entity having rendering synchronous relation with the first Internet of things object entity when the rendering linkage information is detected;
the building cloud server is used for associating the first model rendering data and the second model rendering data to a preset synchronous rendering queue, and establishing a plurality of first synchronous rendering parameters of the first model rendering data and a plurality of second synchronous rendering parameters of the second model rendering data based on the synchronous rendering queue;
the building cloud server is used for determining a first bone reconstruction parameter of the first Internet of things object entity according to each first synchronous rendering parameter, and determining second skeleton reconstruction parameters of the second networked object entity according to each second synchronous rendering parameter, then mapping the first skeleton reconstruction parameter and the second skeleton reconstruction parameter to a preset simulation space to obtain a first simulated reconstruction three-dimensional animation corresponding to the first skeleton reconstruction parameter and a second simulated reconstruction three-dimensional animation corresponding to the second skeleton reconstruction parameter, and determining a plurality of linkage animation frames in the preset simulation space, summarizing the plurality of linked animation frames to obtain at least a plurality of summarized animation sequences of different classes, for each summarized animation sequence, running a first simulated reconstruction three-dimensional animation and a second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in the summary animation sequence in a preset simulated running process;
and the building cloud server is used for determining complete linkage record information between the first Internet of things object entity and the at least one second Internet of things object entity according to the running results of the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in the summary animation sequence.
In a fourth aspect, an embodiment of the present invention further provides a building cloud server, where the building cloud server includes a processor, a machine-readable storage medium, and a network interface, where the machine-readable storage medium, the network interface, and the processor are connected through a bus system, the network interface is configured to be communicatively connected to at least one building service terminal, the machine-readable storage medium is configured to store a program, an instruction, or a code, and the processor is configured to execute the program, the instruction, or the code in the machine-readable storage medium to perform the method for simulating the smart building internet of things object in any one of the possible designs in the first aspect or the first aspect.
In a fifth aspect, an embodiment of the present invention provides a computer-readable storage medium, where instructions are stored, and when executed, cause a computer to perform the method for simulating an internet of things object of a smart building in the first aspect or any one of the possible designs of the first aspect.
Based on any one of the above aspects, the building object entities in the building simulation space of each smart building are classified according to predetermined building functions, so that the differences between the building functions of the smart building system are taken into account and rendering conflicts during the rendering process are reduced. In addition, the operation results generated after synchronous rendering of each individual associated Internet of things object entity are simulated separately, so that the associated Internet of things object entities can be used as independent observation objects in a targeted manner for subsequent service updating during actual observation.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and therefore should not be regarded as limiting the scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic view of an application scenario of an intelligent building internet of things object simulation system according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a method for simulating an internet of things object of an intelligent building according to an embodiment of the present invention;
fig. 3 is a schematic view of a functional module of an intelligent building internet of things object simulation device according to an embodiment of the present invention;
fig. 4 is a block diagram illustrating a structure of a building cloud server for implementing the method for simulating the internet of things object of the smart building according to the embodiment of the present invention.
Detailed Description
The present invention is described in detail below with reference to the drawings, and the specific operation methods in the method embodiments can also be applied to the apparatus embodiments or the system embodiments.
Fig. 1 is an interaction schematic diagram of an intelligent building internet of things object simulation system 10 according to an embodiment of the present invention. The intelligent building internet of things object simulation system 10 may comprise a building cloud server 100 and a building service terminal 200 in communication connection with the building cloud server 100. The intelligent building internet of things object simulation system 10 shown in fig. 1 is only one possible example; in other possible embodiments, the intelligent building internet of things object simulation system 10 may include only some of the components shown in fig. 1 or may also include other components.
In this embodiment, the building service terminal 200 may include a mobile device, a tablet computer, a laptop computer, or the like, or any combination thereof. In some embodiments, the mobile device may include an internet of things object entity device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof. In some embodiments, the internet of things object entity device may include a control device of an intelligent electrical appliance, an intelligent monitoring device, a smart television, a smart camera, and the like, or any combination thereof. In some embodiments, the wearable device may include a smart bracelet, a smart lace, smart glasses, a smart helmet, a smart watch, smart clothing, a smart backpack, a smart accessory, or the like, or any combination thereof. In some embodiments, the smart mobile device may include a smartphone, a personal digital assistant, a gaming device, and the like, or any combination thereof. In some embodiments, the virtual reality device and the augmented reality device may include a virtual reality helmet, virtual reality glasses, a virtual reality patch, an augmented reality helmet, augmented reality glasses, an augmented reality patch, or the like, or any combination thereof. For example, the virtual reality device and the augmented reality device may include various virtual reality products and the like.
In this embodiment, the building cloud server 100 and the building service terminal 200 in the smart building internet of things object simulation system 10 may cooperate to execute the smart building internet of things object simulation method described in the following method embodiment, and reference may be made to the detailed description of the method embodiment below for the steps executed by the building cloud server 100 and the building service terminal 200.
In this embodiment, the intelligent building internet of things object simulation system 10 may be applied in various application scenarios, for example, a blockchain application scenario, a smart home application scenario, an intelligent control application scenario, and the like.
To solve the technical problem in the background art, fig. 2 is a schematic flow chart of a smart building internet of things object simulation method according to an embodiment of the present invention, which can be executed by the building cloud server 100 shown in fig. 1, and the following describes the smart building internet of things object simulation method in detail.
Step S110, building object entities of the target building three-dimensional model in the intelligent building simulation space of each intelligent building object are obtained from each building service terminal, the building object entities in the intelligent building simulation space of each intelligent building object are classified according to the preset building functions, and building object entity sets of each building function are respectively generated.
The building object entity may be used to represent the entity rendering model of the target building three-dimensional model as specifically displayed in the smart building simulation space of each smart building object.
Step S120, aiming at each building function, obtaining model rendering data corresponding to each Internet of things object entity in a building object entity set of the building function, and performing operation simulation on the model rendering data corresponding to each Internet of things object entity.
Step S130, whether rendering linkage information used for representing the rendering linkage of the Internet of things object entities exists is monitored in the operation simulation process, and when the rendering linkage information is detected, first model rendering data of a first Internet of things object entity corresponding to the rendering linkage information of the operation simulation and second model rendering data of at least one second Internet of things object entity having rendering synchronization relation with the first Internet of things object entity are extracted.
Step S140, associating the first model rendering data and the second model rendering data to a preset synchronous rendering queue, and establishing a plurality of first synchronous rendering parameters of the first model rendering data and a plurality of second synchronous rendering parameters of the second model rendering data based on the synchronous rendering queue.
Step S150, determining a first skeleton reconstruction parameter of the first Internet of things object entity according to each first synchronous rendering parameter, and determining a second skeleton reconstruction parameter of the second Internet of things object entity according to each second synchronous rendering parameter; mapping the first skeleton reconstruction parameter and the second skeleton reconstruction parameter to a preset simulation space to obtain a first simulated reconstruction three-dimensional animation corresponding to the first skeleton reconstruction parameter and a second simulated reconstruction three-dimensional animation corresponding to the second skeleton reconstruction parameter; determining a plurality of linkage animation frames in the preset simulation space, and summarizing the plurality of linkage animation frames to obtain summary animation sequences of at least a plurality of different categories; and, for each summary animation sequence, running the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in the summary animation sequence in a preset simulation running process.
Step S160, according to the running results of the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in the summary animation sequence, complete linkage record information between the first Internet of things object entity and at least one second Internet of things object entity is determined.
Based on the above steps, the building object entities in each smart building simulation space are classified according to the predetermined building functions, so that the differences between the building functions of the smart building system are taken into account and rendering conflicts during the rendering process are reduced.
In one possible implementation manner, for step S110, in order to improve the accuracy of the division and reduce redundant information, this embodiment may acquire the building objects corresponding to each predetermined building function to form a building object sequence for each predetermined building function, and acquire the building object information associating each target building object of each smart building simulation space with the building objects of the building object sequence.
On this basis, the density of the key building objects of each target building function can be calculated according to the target building objects and the building object information associated with the building objects of the building object sequence, and building objects are selected from the building object sequence according to the density of the key building objects of each target building function, so as to obtain an initial building object distribution space.
In one possible example, if the total building object distribution density of the initial building object distribution space is greater than the maximum total building object distribution density specified by the total building object distribution density requirement, a first key building object in the initial building object distribution space is dispersed to a first distribution density and a second key building object in the initial building object distribution space is aggregated to the first distribution density.
It should be noted that a second key building object refers to a key building object whose building unit has a unit intensity less than a set intensity, and a first key building object refers to a key building object whose building unit has a unit intensity not less than the set intensity. The first distribution density can be set according to actual requirements, but it should not differ too much from the maximum total building object distribution density specified by the total building object distribution density requirement.
Then, the total building object distribution density of the updated initial building object distribution space is calculated, and if it is still greater than the maximum total building object distribution density, the above processing is performed on the updated initial building object distribution space again.
For another example, if the total building object distribution density of the initial building object distribution space after the current update is less than or equal to the maximum total building object distribution density, the initial building object distribution space before the current update may be used as the first update distribution space, and the target building functions may be sorted according to the building functions from low priority to high priority to obtain the target building function sequence.
On the basis, building object entities under the building simulation space of each intelligent building can be classified according to the building function sequence of the target building, and a building object entity set of each building function is generated respectively.
For example, the target building functions may be grouped according to the target building function sequence, each group including a first building function and a second building function that are related in the function hierarchy of the target building function sequence and separated by a consistent hierarchy-level difference, the first building function having a lower priority than the second building function.
Then, each group is taken in turn as the target group in order of function-hierarchy priority from low to high, and the following second update processing is performed on the target group: the key building objects of the first building function of the target group in the first update distribution space are increased by a set number, and the key building objects of the second building function of the target group in the first update distribution space are decreased by the set number.
On this basis, it can be judged whether the total building object distribution density of the updated first update distribution space is greater than the required total building object distribution density. If it is, the updated first update distribution space is used as the final building object distribution space; if it is not, the next group is taken as the new target group and the second update processing is performed on the new target group.
For another example, if the total building object distribution density of the initial building object distribution space is less than the minimum total building object distribution density specified by the total building object distribution density requirement, the following third update processing is performed on the initial building object distribution space: the first key building object in the initial building object distribution space is increased by the first distribution density and the second key building object in the initial building object distribution space is decreased by the first distribution density.
On this basis, the total building object distribution density of the initial building object distribution space after the current update is calculated; if it is still smaller than the minimum total building object distribution density, the third update processing is performed on the updated initial building object distribution space again. If the total building object distribution density of the updated initial building object distribution space is greater than or equal to the minimum total building object distribution density, the initial building object distribution space before the current update is taken as a second update distribution space, and the target building functions are sorted in order of building-function priority from low to high to obtain the target building function sequence.
Thus, the target building functions can be grouped according to the target building function sequence, each group including a first building function and a second building function that are related in the function hierarchy of the target building function sequence and separated by a consistent hierarchy-level difference, the first building function having a lower priority than the second building function.
Then, each group is taken in turn as the target group in order of function-hierarchy priority from low to high, and the following fourth update processing is performed on the target group: the key building objects of the first building function of the target group in the second update distribution space are decreased by a set number, and the key building objects of the second building function of the target group in the second update distribution space are increased by the set number.
Further, this embodiment may judge whether the total building object distribution density of the updated second update distribution space is greater than the required total building object distribution density. If it is, the updated second update distribution space is used as the final building object distribution space; if it is not, the next group is taken as the new target group and the fourth update processing is performed on the new target group.
In this way, the building object entities of each building object in the final building object distribution space of each target building function can be classified as a building object entity set of the building function.
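For illustration only, the following sketch condenses the density-driven adjustment described above into a single loop; the density function, the adjustment direction in each branch and the fixed step size are assumptions of this sketch.

    def adjust_distribution_space(space, total_density, density_bounds, first_delta,
                                  max_rounds=100):
        """Move key building objects between the first group (unit intensity at or above the
        threshold) and the second group (unit intensity below it) until the total building
        object distribution density lies inside [min_density, max_density]."""
        min_density, max_density = density_bounds
        for _ in range(max_rounds):
            density = total_density(space)
            if density > max_density:
                # One reading of the text: disperse first key objects, aggregate second ones.
                space["first_key_objects"] -= first_delta
                space["second_key_objects"] += first_delta
            elif density < min_density:
                space["first_key_objects"] += first_delta
                space["second_key_objects"] -= first_delta
            else:
                return space
        raise RuntimeError("distribution density did not converge")

The direction of adjustment in each branch mirrors one possible reading of the dispersal and aggregation operations above and may need to be reversed for a concrete density definition.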
In a possible implementation manner, regarding step S120, it is considered that a part of the internet of things object entities may be added after updating, and therefore, it is further required to determine whether a rendering service relationship is associated with each internet of things object entity.
The rendering service relationship can be used for setting a rendering service of the model rendering data corresponding to the object entities of the internet of things, each object entity of the internet of things corresponds to one rendering service relationship, and the service functions of different rendering service relationships are different.
For example, if an Internet of things object entity is not associated with a corresponding rendering service relationship, the database information of that Internet of things object entity is obtained. The database information comprises the Internet of things service type corresponding to the Internet of things object entity, namely the Internet of things service type corresponding to the model rendering data generated by the Internet of things object entity.
Then, each piece of database information can be parsed and identified according to its function distinguishing character to obtain at least a plurality of function distinguishing fields corresponding to each piece of database information, and the target function distinguishing field carrying service header information is determined from the function distinguishing fields corresponding to each piece of database information. The service header information is an identifier marking a function distinguishing field as the function distinguishing field corresponding to the Internet of things service type.
On this basis, a corresponding rendering service relationship can be associated with each Internet of things object entity according to the service resource identification segment in the target function distinguishing field corresponding to that Internet of things object entity, where the rendering service relationship is determined according to the rendering service relationship corresponding to each service resource in the service resource identification segment of the target function distinguishing field.
In this way, the model rendering data corresponding to each Internet of things object entity can be obtained from the pre-configured model rendering database according to the rendering service relationship associated with each Internet of things object entity, where the model rendering database comprises the model rendering data of each Internet of things object entity under different rendering service relationships.
In a possible implementation manner, for step S130, the first model rendering data of the first Internet of things object entity corresponding to the rendering linkage information and the second model rendering data of the at least one second Internet of things object entity having a rendering synchronization relationship with the first Internet of things object entity may be extracted from the operation simulation record information generated during the operation simulation. The at least one second Internet of things object entity having a rendering synchronization relationship with the first Internet of things object entity may refer to a second Internet of things object entity whose linkage effect is associated with the presence of the first Internet of things object entity. For example, if a certain Internet of things object entity needs to be operated synchronously while the first Internet of things object entity is being rendered, that Internet of things object entity can be understood as a second Internet of things object entity having a rendering synchronization relationship with the first Internet of things object entity.
In one possible implementation manner, for step S140, the present embodiment may first determine synchronous rendering configuration information of the synchronous rendering queue.
It should be noted that the synchronous rendering configuration information is used to represent the synchronous rendering unit allocated when the synchronous rendering queue processes successively associated model rendering data, and the synchronous rendering unit may be used to represent the rendering vector information with which the synchronous rendering queue renders the associated model rendering data.
On this basis, first rendering vector information corresponding to associating the first model rendering data to the synchronous rendering queue and second rendering vector information corresponding to associating the second model rendering data to the synchronous rendering queue may be determined based on the synchronous rendering configuration information, and whether rendering synchronization exists when the first model rendering data and the second model rendering data are associated to the synchronous rendering queue may then be determined according to the first rendering vector information and the second rendering vector information. Here, rendering synchronization is used to represent that rendering in the synchronous rendering queue exhibits synchronous behavior.
For example, if the first rendering vector information and the second rendering vector information match, it is determined that rendering synchronization exists when the first model rendering data and the second model rendering data are associated to the synchronous rendering queue; otherwise, it is determined that rendering synchronization does not exist when they are associated to the synchronous rendering queue.
If it is determined that rendering synchronization does not exist when the first model rendering data and the second model rendering data are associated to the synchronous rendering queue, the second rendering vector information may be adjusted to obtain third rendering vector information, and the first model rendering data and the second model rendering data are associated to the synchronous rendering queue based on the first rendering vector information and the third rendering vector information, wherein a vector difference between the third rendering vector information and the second rendering vector information is matched with a vector difference between the first rendering vector information and the second rendering vector information.
If it is determined that rendering synchronization exists when the first model rendering data and the second model rendering data are associated to the synchronized rendering queue, the first model rendering data and the second model rendering data may be associated to the synchronized rendering queue continuously using the first rendering vector information and the second rendering vector information.
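The following sketch illustrates the synchronization check and the adjustment of the second rendering vector information, under the simplifying assumption that rendering vector information can be treated as a numeric vector; the reading of the matched vector difference used here is only one possible interpretation.

    import numpy as np

    def associate_to_sync_queue(queue, first_vec, second_vec,
                                first_data, second_data, tol=1e-6):
        first_vec = np.asarray(first_vec, dtype=float)
        second_vec = np.asarray(second_vec, dtype=float)
        if np.allclose(first_vec, second_vec, atol=tol):
            # Rendering synchronization exists: keep both vectors as they are.
            queue.append((first_data, first_vec))
            queue.append((second_data, second_vec))
            return
        # No rendering synchronization: adjust the second vector so that the
        # difference between the adjusted (third) vector and the second vector
        # equals the difference between the first and second vectors, which
        # under this reading moves the second vector onto the first.
        delta = first_vec - second_vec
        third_vec = second_vec + delta
        queue.append((first_data, first_vec))
        queue.append((second_data, third_vec))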
In a possible implementation manner, still referring to step S140, the present embodiment may specifically determine, based on the synchronous rendering queue, a first rendering node sequence of the first model rendering data and a second rendering node sequence of the second model rendering data.
It should be noted that the rendering node sequence may be used to represent the animation coordination relationship of the model rendering data under different rendering nodes; for example, it may represent a transition animation coordination relationship, an overlay animation coordination relationship, an additive animation coordination relationship, and the like, which is not specifically limited herein.
Then, a plurality of first synchronous rendering parameters of the first model rendering data and a plurality of second synchronous rendering parameters of the second model rendering data may be established in the synchronous rendering queue according to the first rendering node sequence and the second rendering node sequence, respectively.
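As a simple illustration, one way to build the synchronous rendering parameters from a rendering node sequence is sketched below; representing each parameter as a per-node grouping of the model rendering data is an assumption made for this example, not a structure fixed by the method.

    def build_sync_render_params(node_sequence, model_rendering_data):
        """Return one synchronous rendering parameter per rendering node, pairing
        the node with the rendering data items assigned to it."""
        params = []
        for node in node_sequence:
            items = [d for d in model_rendering_data if d.get("node") == node]
            params.append({"node": node, "items": items})
        return params

Called once with the first rendering node sequence and the first model rendering data and once with the second, this yields the plurality of first and second synchronous rendering parameters in the synchronous rendering queue.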
In a possible implementation manner, for step S150, in order to ensure synchronicity and coherence and facilitate subsequent observation, the present embodiment may determine a rendering node timing axis corresponding to each first synchronous rendering parameter according to the rendering nodes in each first synchronous rendering parameter and the rendering animation consecutive parameter between every two adjacent rendering nodes, and then determine a first skeleton reconstruction parameter of the first internet of things object entity based on the rendering node timing axis.
Each rendering node in the first synchronous rendering parameters is provided with corresponding rendering animation consecutive input and output parameters; the matching parameter between the rendering animation consecutive input and output parameters of one rendering node and those of any other rendering node serves as the corresponding rendering animation consecutive parameter, and the rendering animation consecutive input and output parameters are determined according to the rendering tracks of the rendering nodes in the first synchronous rendering parameters.
On this basis, the rendering nodes of each second synchronous rendering parameter and the rendering animation consecutive input and output parameters corresponding to those rendering nodes can be listed to obtain a first rendering script and a second rendering script corresponding to each second synchronous rendering parameter.
The first rendering script may be a rendering script corresponding to a rendering node of the second synchronous rendering parameter, and the second rendering script may be a rendering script corresponding to a rendering animation consecutive input/output parameter of the second synchronous rendering parameter.
On this basis, a first association relationship of the first rendering script relative to the second rendering script and a second association relationship of the second rendering script relative to the second rendering script can be determined; then at least three target associated nodes with the same node continuity in the first association relationship and the second association relationship are obtained, and the second skeleton reconstruction parameter of the second synchronous rendering parameter is determined according to these target associated nodes. The node continuity is used to represent the rendering animation consecutive input-output relationship between every two associated nodes.
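The sketch below illustrates the two determinations in a highly simplified form: the rendering node timing axis is accumulated from the rendering animation consecutive parameters between adjacent nodes, and the target associated nodes are those sharing the same node continuity in both association relationships. The data structures used (a callable for the continuity and dictionaries for the association relationships) are assumptions for illustration.

    def first_skeleton_params(render_nodes, continuity_between):
        """Place the rendering nodes of a first synchronous rendering parameter on
        a timing axis weighted by the continuity between adjacent nodes."""
        axis, t = [], 0.0
        for i, node in enumerate(render_nodes):
            axis.append((t, node))
            if i + 1 < len(render_nodes):
                t += continuity_between(render_nodes[i], render_nodes[i + 1])
        return axis  # basis of the first skeleton reconstruction parameter

    def second_skeleton_params(first_relation, second_relation, min_nodes=3):
        """Pick target associated nodes sharing the same node continuity in both
        association relationships (at least three are required)."""
        targets = [n for n, c in first_relation.items()
                   if second_relation.get(n) == c]
        if len(targets) < min_nodes:
            raise ValueError("fewer than three target associated nodes found")
        return sorted(targets)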
In a possible implementation manner, still referring to step S150, in the process of summarizing the plurality of linkage animation frames to obtain at least a plurality of summarized animation sequences of different categories, the present embodiment may determine the number of simulated reconstruction three-dimensional animations corresponding to each linkage animation frame in the preset simulation space, and also determine the category spread range of the simulated reconstruction three-dimensional animations corresponding to each linkage animation frame.
The category spread range may be the overlap proportion between the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation among the simulated reconstruction three-dimensional animations corresponding to each linkage animation frame.
On this basis, the geometric primitive information of the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame can be determined.
The geometric primitive information may be obtained by calculating image feature values (e.g., grayscale feature values, mean feature values of RGB color values, etc.) of a set number of animation regions corresponding to the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation.
Then, the frame feature sequence of each linkage animation frame (i.e., the sequence formed, in order, by the number of simulated reconstruction three-dimensional animations, the category spread range and the geometric primitive information) can be determined according to the number of simulated reconstruction three-dimensional animations, the category spread range and the geometric primitive information corresponding to each linkage animation frame. Each linkage animation frame is then summarized based on its frame feature sequence to obtain at least a plurality of summarized animation sequences of different categories. For example, linkage animation frames whose frame feature sequences share at least one identical feature parameter may be aggregated into the summarized animation sequence of the category corresponding to that shared feature parameter, thereby obtaining at least a plurality of summarized animation sequences of different categories.
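A minimal sketch of this summarization is shown below; it assumes each linkage animation frame is described by a dictionary holding the three features named above, with the geometric primitive information reduced to a hashable value.

    from collections import defaultdict

    def summarize_frames(linkage_frames):
        """linkage_frames: iterable of dicts with keys 'count', 'spread_range'
        and 'primitive_info' (one dict per linkage animation frame)."""
        sequences = defaultdict(list)
        for frame in linkage_frames:
            feature_seq = (frame["count"],
                           round(frame["spread_range"], 2),
                           frame["primitive_info"])
            # Frames sharing at least one identical feature parameter end up in
            # the summarized sequence of the category keyed by that parameter.
            for position, value in enumerate(feature_seq):
                sequences[(position, value)].append(frame)
        return list(sequences.values())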
In a possible implementation manner, still referring to step S150, in the process of running the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in the summarized animation sequence in the preset simulated running process, this embodiment may specifically determine the synchronous rendering configuration information of the frame feature sequence corresponding to each linkage animation frame in each summarized animation sequence, and then determine, according to the synchronous rendering configuration information, the synchronous rendering error of the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in each summarized animation sequence.
The synchronous rendering error can be used to represent the rendering error condition of the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame.
Then, whether the difference value between each synchronous rendering error and the reference rendering error corresponding to the simulation running process is within a preset difference value interval can be judged.
The preset difference value interval can be used to represent the interval in which each synchronous rendering error falls when the simulated running process runs normally.
Therefore, when the difference value between each synchronous rendering error and the reference rendering error corresponding to the simulated running process falls within the preset difference value interval, the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in the summarized animation sequence can be run based on the simulated running process.
Otherwise, when the difference value between a synchronous rendering error and the reference rendering error corresponding to the simulated running process does not fall within the preset difference value interval, the synchronous rendering configuration information corresponding to that synchronous rendering error can be corrected according to the thread script of the simulated running process, and the method returns to the step of determining, according to the synchronous rendering configuration information, the synchronous rendering error of the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in each summarized animation sequence.
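The correction-and-retry behavior described above can be sketched as the following loop; compute_sync_error, correct_config and run_animations stand in for logic that the text leaves abstract, and the preset difference value interval is assumed to be a (low, high) pair.

    def run_with_error_check(sequences, reference_error, interval, thread_script,
                             compute_sync_error, correct_config, run_animations,
                             max_rounds=10):
        low, high = interval
        for _ in range(max_rounds):
            corrected = False
            for seq in sequences:
                for frame in seq:
                    error = compute_sync_error(frame["sync_config"])
                    if low <= error - reference_error <= high:
                        # Difference within the interval: run both animations.
                        run_animations(frame)
                    else:
                        # Correct the configuration with the thread script and
                        # re-evaluate the synchronous rendering errors next round.
                        frame["sync_config"] = correct_config(frame["sync_config"],
                                                              thread_script)
                        corrected = True
            if not corrected:
                break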
In a possible implementation manner, for step S160, the complete linkage record information between the first internet of things object entity and the at least one second internet of things object entity may be determined from the simulated rendering stream generated by splicing, in time order, the running results of the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in the summarized animation sequence. In this way, during actual observation, the associated internet of things object entities can be treated as independent observation objects for targeted subsequent service updating.
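For illustration, a sketch of splicing the running results into a time-ordered simulated rendering stream is given below; the 'timestamp' and 'frames' keys are assumptions about the shape of a run result.

    def build_linkage_record(run_results, first_entity_id, second_entity_ids):
        """Splice the animation run results into a time-ordered simulated rendering
        stream and wrap it as the complete linkage record information."""
        stream = [frame
                  for result in sorted(run_results, key=lambda r: r["timestamp"])
                  for frame in result["frames"]]
        return {
            "first_entity": first_entity_id,
            "second_entities": list(second_entity_ids),
            "rendering_stream": stream,
        }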
Fig. 3 is a schematic diagram of functional modules of a smart building internet of things object simulation apparatus 300 according to an embodiment of the present invention, in this embodiment, functional modules of the smart building internet of things object simulation apparatus 300 may be classified according to a method embodiment executed by the building cloud server 100, that is, the following functional modules corresponding to the smart building internet of things object simulation apparatus 300 may be used to execute various method embodiments executed by the building cloud server 100. The intelligent building internet of things object simulation device 300 may include a classification module 310, an operation simulation module 320, an extraction module 330, an association establishment module 340, an operation module 350, and a determination module 360, and the functions of the functional modules of the intelligent building internet of things object simulation device 300 are described in detail below.
The classification module 310 is configured to obtain, from each building service terminal, building object entities of the target building three-dimensional model in the building simulation space of each smart building object, classify the building object entities in the building simulation space of each smart building according to a predetermined building function, and generate a building object entity set of each building function. The classifying module 310 may be configured to perform the step S110, and the detailed implementation of the classifying module 310 may refer to the detailed description of the step S110.
The operation simulation module 320 is configured to, for each building function, obtain model rendering data corresponding to each internet of things object entity in a building object entity set of the building function, and perform operation simulation on the model rendering data corresponding to each internet of things object entity. The operation simulation module 320 may be configured to perform the step S120, and for a detailed implementation of the operation simulation module 320, reference may be made to the detailed description of the step S120.
The extracting module 330 is configured to monitor whether rendering linkage information for representing that an internet of things object entity has rendering linkage exists in an operation simulation process, and extract first model rendering data of a first internet of things object entity corresponding to the rendering linkage information of the operation simulation and second model rendering data of at least one second internet of things object entity having rendering synchronization relationship with the first internet of things object entity when the rendering linkage information is detected. The extracting module 330 may be configured to perform the step S130, and the detailed implementation of the extracting module 330 may refer to the detailed description of the step S130.
The association establishing module 340 is configured to associate the first model rendering data and the second model rendering data to a preset synchronous rendering queue, and establish a plurality of first synchronous rendering parameters of the first model rendering data and a plurality of second synchronous rendering parameters of the second model rendering data based on the synchronous rendering queue. The association establishing module 340 may be configured to perform the step S140, and the detailed implementation manner of the association establishing module 340 may refer to the detailed description of the step S140.
The operation module 350 is configured to determine a first skeleton reconstruction parameter of the first internet of things object entity according to each first synchronous rendering parameter and a second skeleton reconstruction parameter of the second internet of things object entity according to each second synchronous rendering parameter, map the first skeleton reconstruction parameter and the second skeleton reconstruction parameter to a preset simulation space to obtain a first simulated reconstruction three-dimensional animation corresponding to the first skeleton reconstruction parameter and a second simulated reconstruction three-dimensional animation corresponding to the second skeleton reconstruction parameter, determine a plurality of linkage animation frames in the preset simulation space, summarize the plurality of linkage animation frames to obtain at least a plurality of summarized animation sequences of different categories, and, for each summarized animation sequence, run the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in the summarized animation sequence in a preset simulated running process. The operation module 350 may be configured to perform the step S150, and the detailed implementation of the operation module 350 may refer to the detailed description of the step S150.
The determining module 360 is configured to determine complete linkage record information between the first internet-of-things object entity and the at least one second internet-of-things object entity according to the operation results of the first simulated reconstructed three-dimensional animation and the second simulated reconstructed three-dimensional animation corresponding to each linkage animation frame in the summarized animation sequence. The determining module 360 may be configured to perform the step S160, and for a detailed implementation of the determining module 360, reference may be made to the detailed description of the step S160.
Further, fig. 4 is a schematic structural diagram of a building cloud server 100 for executing the method for simulating an internet of things object of a smart building according to an embodiment of the present invention. As shown in fig. 4, the building cloud server 100 may include a network interface 110, a machine-readable storage medium 120, a processor 130, and a bus 140. The processor 130 may be one or more, and one processor 130 is illustrated in fig. 4 as an example. The network interface 110, the machine-readable storage medium 120, and the processor 130 may be connected by a bus 140 or otherwise, as exemplified by the connection by the bus 140 in fig. 4.
The machine-readable storage medium 120 is a computer-readable storage medium and can be used for storing software programs, computer-executable programs, and modules, such as the program instructions/modules corresponding to the method for simulating the smart building internet of things object in the embodiment of the present invention (for example, the classification module 310, the operation simulation module 320, the extraction module 330, the association establishing module 340, the operation module 350, and the determining module 360 of the smart building internet of things object simulation apparatus 300 shown in fig. 3). The processor 130 executes the software programs, instructions and modules stored in the machine-readable storage medium 120, so as to perform various functional applications and data processing of the terminal device, that is, to implement the above-mentioned method for simulating the internet of things object of the intelligent building, which is not repeated here.
The machine-readable storage medium 120 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and the application programs required for at least one function, and the data storage area may store data created according to the use of the terminal, and the like. Further, the machine-readable storage medium 120 may be volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be Random Access Memory (RAM), which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM), Double Data Rate Synchronous Dynamic Random Access Memory (DDR SDRAM), Enhanced Synchronous Dynamic Random Access Memory (ESDRAM), Synchronous Link Dynamic Random Access Memory (SLDRAM), and Direct Rambus Random Access Memory (DR RAM). It should be noted that the memories of the systems and methods described herein are intended to comprise, without being limited to, these and any other suitable types of memory. In some examples, the machine-readable storage medium 120 may further include memory located remotely from the processor 130, which may be connected to the building cloud server 100 over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The processor 130 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method embodiments may be performed by integrated logic circuits of hardware in the processor 130 or by instructions in the form of software. The processor 130 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like. The steps of the methods disclosed in connection with the embodiments of the present invention may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in a decoding processor.
The building cloud server 100 may interact with other devices (e.g., the building service terminal 200) through the network interface 110. Network interface 110 may be a circuit, bus, transceiver, or any other device that may be used to exchange information. Processor 130 may send and receive information using network interface 110.
Finally, it should be noted that: as will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
For the above-mentioned apparatus embodiments, since they basically correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, wherein the modules described as separate parts may or may not be physically separate, and the parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the application. One of ordinary skill in the art can understand and implement it without inventive effort.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims. It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (10)

1. An intelligent building internet of things object simulation method, applied to a building cloud server, wherein the building cloud server is in communication connection with a plurality of building service terminals, and the method comprises the following steps:
obtaining, from each building service terminal, building object entities of a target building three-dimensional model in a building simulation space of each smart building object, classifying the building object entities in the building simulation space of each smart building object according to predetermined building functions, and respectively generating a building object entity set of each building function;
for each building function, acquiring model rendering data corresponding to each Internet of things object entity in a building object entity set of the building function, and performing operation simulation on the model rendering data corresponding to each Internet of things object entity;
monitoring whether rendering linkage information used for representing rendering linkage of the Internet of things object entities exists or not in the operation simulation process, and extracting first model rendering data of a first Internet of things object entity corresponding to the rendering linkage information of the operation simulation and second model rendering data of at least one second Internet of things object entity having rendering synchronous relation with the first Internet of things object entity when the rendering linkage information is detected;
associating the first model rendering data and the second model rendering data to a preset synchronous rendering queue, and establishing a plurality of first synchronous rendering parameters of the first model rendering data and a plurality of second synchronous rendering parameters of the second model rendering data based on the synchronous rendering queue;
determining a first skeleton reconstruction parameter of the first internet of things object entity according to each first synchronous rendering parameter, and determining a second skeleton reconstruction parameter of the second internet of things object entity according to each second synchronous rendering parameter; mapping the first skeleton reconstruction parameter and the second skeleton reconstruction parameter to a preset simulation space to obtain a first simulated reconstruction three-dimensional animation corresponding to the first skeleton reconstruction parameter and a second simulated reconstruction three-dimensional animation corresponding to the second skeleton reconstruction parameter; determining a plurality of linkage animation frames in the preset simulation space, and summarizing the plurality of linkage animation frames to obtain at least a plurality of summarized animation sequences of different categories; and for each summarized animation sequence, running the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in the summarized animation sequence in a preset simulated running process;
and determining complete linkage record information between the first Internet of things object entity and the at least one second Internet of things object entity according to the running results of the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in the summary animation sequence.
2. The intelligent building internet of things object simulation method according to claim 1, wherein the step of obtaining model rendering data corresponding to each internet of things object entity in the building object entity set of the building function comprises:
judging whether a rendering business relation is associated with each object entity of the Internet of things; the rendering service relationship is used for setting a rendering service of model rendering data corresponding to the object entities of the Internet of things, each object entity of the Internet of things corresponds to one rendering service relationship, and the service functions of different rendering service relationships are different;
if the rendering business relation corresponding to each Internet of things object entity is not associated, acquiring database information of each Internet of things object entity; the database information comprises an internet of things service type corresponding to the internet of things object entity, wherein the internet of things service type is the internet of things service type corresponding to the model rendering data generated by the internet of things object entity;
analyzing and identifying each database information according to the function distinguishing character corresponding to each database information to obtain at least a plurality of function distinguishing fields corresponding to each database information, and determining a target function distinguishing field with service header information from the function distinguishing field corresponding to each database information; the service header information is an identifier representing a function distinguishing field to a function distinguishing field corresponding to the service type of the Internet of things;
associating a corresponding rendering service relationship with each Internet of things object entity according to a service resource identification segment in a target function distinguishing field corresponding to each Internet of things object entity, wherein the rendering service relationship is determined according to the rendering service relationships corresponding to the service resources in the service resource identification segment in the target function distinguishing field;
and obtaining model rendering data corresponding to each Internet of things object entity from a pre-configured model rendering database according to the rendering service relationship associated with each Internet of things object entity, wherein the model rendering database comprises model rendering data of each Internet of things object entity under different rendering service relationships.
3. The intelligent building internet of things object simulation method according to claim 1, wherein the step of associating the first model rendering data and the second model rendering data to a preset synchronous rendering queue comprises:
determining synchronous rendering configuration information of the synchronous rendering queue; the synchronous rendering configuration information is used for representing a synchronous rendering unit which is distributed when the synchronous rendering queue processes the successively associated model rendering data, and the synchronous rendering unit is used for representing rendering vector information when the synchronous rendering queue renders the associated model rendering data;
determining, based on the synchronized rendering configuration information, first rendering vector information corresponding to the synchronized rendering queue to which the first model rendering data is associated and second rendering vector information corresponding to the synchronized rendering queue to which the second model rendering data is associated;
determining, from the first rendering vector information and the second rendering vector information, whether there is rendering synchronization when associating the first model rendering data and the second model rendering data to the synchronized rendering queue; wherein the rendering synchronization is used for representing that the rendering of the synchronous rendering queue has synchronization behavior;
if not, adjusting the second rendering vector information to obtain third rendering vector information, and associating the first model rendering data and the second model rendering data to the synchronous rendering queue based on the first rendering vector information and the third rendering vector information, wherein the vector difference between the third rendering vector information and the second rendering vector information is matched with the vector difference between the first rendering vector information and the second rendering vector information;
and if so, continuously adopting the first rendering vector information and the second rendering vector information to associate the first model rendering data and the second model rendering data to the synchronous rendering queue.
4. The intelligent building internet of things object simulation method according to any one of claims 1 to 3, wherein the step of establishing a plurality of first synchronous rendering parameters of the first model rendering data and a plurality of second synchronous rendering parameters of the second model rendering data based on the synchronous rendering queue comprises:
determining, based on the synchronized rendering queue, a first sequence of rendering nodes for the first model rendering data and a second sequence of rendering nodes for the second model rendering data; the rendering node sequence is used for representing the animation coordination relation of model rendering data under different rendering nodes;
establishing a plurality of first synchronous rendering parameters of the first model rendering data and a plurality of second synchronous rendering parameters of the second model rendering data in the synchronous rendering queue according to the first rendering node sequence and the second rendering node sequence respectively.
5. The intelligent building internet of things object simulation method according to any one of claims 1 to 3, wherein the step of determining a first skeleton reconstruction parameter of the first internet of things object entity according to each first synchronous rendering parameter and a second skeleton reconstruction parameter of the second internet of things object entity according to each second synchronous rendering parameter comprises:
determining a rendering node time sequence axis corresponding to each first synchronous rendering parameter according to the rendering nodes in each first synchronous rendering parameter and rendering animation consecutive parameters between every two adjacent rendering nodes;
determining a first skeletal reconstruction parameter of the first internet of things object entity based on the rendering node timing axis; each rendering node in the first synchronous rendering parameters is correspondingly provided with rendering animation consecutive input and output parameters, matching parameters between the rendering animation consecutive input and output parameters and rendering animation consecutive input and output parameters of any one rendering node serve as corresponding rendering animation consecutive parameters, and the rendering animation consecutive input and output parameters are determined according to rendering tracks of the rendering nodes in the first synchronous rendering parameters;
listing rendering nodes of each second synchronous rendering parameter and rendering animation consecutive input and output parameters corresponding to the rendering nodes to obtain a first rendering script and a second rendering script corresponding to each second synchronous rendering parameter; the first rendering script is a rendering script corresponding to a rendering node of a second synchronous rendering parameter, and the second rendering script is a rendering script corresponding to a rendering animation consecutive input and output parameter of the second synchronous rendering parameter;
determining a first association relationship of the first rendering script relative to the second rendering script and a second association relationship of the second rendering script relative to the second rendering script;
acquiring at least three target associated nodes with the same node continuity in the first associated relationship and the second associated relationship, and determining a second skeleton reconstruction parameter of the second synchronous rendering parameter according to the target associated nodes; wherein the node continuity is used to characterize rendering animation consecutive input-output relationships between each two associated nodes.
6. The intelligent building internet of things object simulation method according to claim 5, wherein the step of summarizing the plurality of linkage animation frames to obtain at least a plurality of summarized animation sequences of different categories comprises:
determining the number of simulated and reconstructed three-dimensional animations corresponding to each linkage animation frame in the preset simulation space;
determining the category spread range of the simulated reconstruction three-dimensional animation corresponding to each linkage animation frame; the category spread range is the superposition proportion of a first simulated reconstruction three-dimensional animation and a second simulated reconstruction three-dimensional animation in the simulated reconstruction three-dimensional animation corresponding to each linkage animation frame;
determining geometric primitive information of the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame; wherein the geometric primitive information is obtained by calculating image characteristic values of a set number of animation areas corresponding to the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation;
determining a frame characteristic sequence of each linkage animation frame according to the number, the category spread range and the geometric primitive information of the simulated reconstruction three-dimensional animation corresponding to each linkage animation frame;
and summarizing each linkage animation frame based on the frame feature sequence of each linkage animation frame to obtain the summarized animation sequences of at least a plurality of different categories.
7. The intelligent building internet of things object simulation method according to claim 6, wherein the step of running the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in the summary animation sequence in a preset simulated running process comprises the following steps:
determining synchronous rendering configuration information of a frame feature sequence corresponding to each linkage animation frame in each summary animation sequence;
determining, according to the synchronous rendering configuration information, synchronous rendering errors of the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in each summary animation sequence; the synchronous rendering error is used for representing the rendering error conditions of the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame;
judging whether the difference value of each synchronous rendering error and the reference rendering error corresponding to the simulation running process is within a preset difference value interval or not; the preset difference interval is used for representing an interval where each synchronous rendering error is located when the simulation running process is in normal running;
when the difference value between each synchronous rendering error and the reference rendering error corresponding to the simulation running process falls into the preset difference value interval, running a first simulation reconstruction three-dimensional animation and a second simulation reconstruction three-dimensional animation corresponding to each linkage animation frame in the summary animation sequence based on the simulation running process;
and otherwise, correcting synchronous rendering configuration information corresponding to synchronous rendering errors corresponding to the difference values which do not fall into the preset difference value interval according to the thread script of the simulation running process, and returning to the step of determining the synchronous rendering errors of the first simulation reconstruction three-dimensional animation and the second simulation reconstruction three-dimensional animation corresponding to each linkage animation frame in each summary animation sequence according to the synchronous rendering configuration information.
8. The intelligent building internet of things object simulation method according to claim 1, wherein the step of determining complete linkage record information between the first internet of things object entity and the at least one second internet of things object entity according to the running results of the first simulation reconstruction three-dimensional animation and the second simulation reconstruction three-dimensional animation corresponding to each linkage animation frame in the summary animation sequence comprises the following steps:
and determining complete linkage record information between the first Internet of things object entity and the at least one second Internet of things object entity according to a simulated rendering stream generated by splicing the running results of the first simulated reconstruction three-dimensional animation and the second simulated reconstruction three-dimensional animation corresponding to each linkage animation frame in the summary animation sequence according to a time sequence.
9. A building cloud server, characterized in that the building cloud server comprises a processor, a machine-readable storage medium, and a network interface, the machine-readable storage medium, the network interface, and the processor are connected by a bus system, the network interface is configured to be communicatively connected to at least one building service terminal, the machine-readable storage medium is configured to store a program, an instruction, or a code, and the processor is configured to execute the program, the instruction, or the code in the machine-readable storage medium to execute the smart building internet of things object simulation method according to any one of claims 1 to 8.
10. A computer-readable storage medium, wherein the computer-readable storage medium is configured with a program, instructions or code, which when executed, implements the method for simulating an internet of things object for a smart building according to any one of claims 1 to 8.
CN202010262005.7A 2020-04-06 2020-04-06 Smart building Internet of things object simulation method and building cloud server Active CN111476875B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202011150566.4A CN112907738A (en) 2020-04-06 2020-04-06 Smart building Internet of things object simulation method and building cloud server
CN202010262005.7A CN111476875B (en) 2020-04-06 2020-04-06 Smart building Internet of things object simulation method and building cloud server
CN202011150309.0A CN112288868A (en) 2020-04-06 2020-04-06 Intelligent building Internet of things object simulation method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010262005.7A CN111476875B (en) 2020-04-06 2020-04-06 Smart building Internet of things object simulation method and building cloud server

Related Child Applications (2)

Application Number Title Priority Date Filing Date
CN202011150309.0A Division CN112288868A (en) 2020-04-06 2020-04-06 Intelligent building Internet of things object simulation method and system
CN202011150566.4A Division CN112907738A (en) 2020-04-06 2020-04-06 Smart building Internet of things object simulation method and building cloud server

Publications (2)

Publication Number Publication Date
CN111476875A (en) 2020-07-31
CN111476875B (en) 2021-04-30

Family

ID=71749731

Family Applications (3)

Application Number Title Priority Date Filing Date
CN202011150309.0A Withdrawn CN112288868A (en) 2020-04-06 2020-04-06 Intelligent building Internet of things object simulation method and system
CN202011150566.4A Withdrawn CN112907738A (en) 2020-04-06 2020-04-06 Smart building Internet of things object simulation method and building cloud server
CN202010262005.7A Active CN111476875B (en) 2020-04-06 2020-04-06 Smart building Internet of things object simulation method and building cloud server

Family Applications Before (2)

Application Number Title Priority Date Filing Date
CN202011150309.0A Withdrawn CN112288868A (en) 2020-04-06 2020-04-06 Intelligent building Internet of things object simulation method and system
CN202011150566.4A Withdrawn CN112907738A (en) 2020-04-06 2020-04-06 Smart building Internet of things object simulation method and building cloud server

Country Status (1)

Country Link
CN (3) CN112288868A (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112364421B (en) * 2020-11-12 2023-10-27 天河超级计算淮海分中心 Rendering method and device of building information model, computer equipment and storage medium
CN112529742B (en) * 2020-12-23 2021-08-10 红石阳光(北京)科技股份有限公司 Building comprehensive management method and system based on intelligent brain
CN116582571B (en) * 2023-07-14 2023-09-12 绿城科技产业服务集团有限公司 Remote terminal equipment interaction method and device based on building management

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106249607A (en) * 2016-07-28 2016-12-21 桂林电子科技大学 Virtual Intelligent household analogue system and method
CN109978727A (en) * 2019-03-28 2019-07-05 上海荷福人工智能科技(集团)有限公司 A kind of AI intelligent building operation platform

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2475599Y (en) * 2001-04-21 2002-02-06 张志云 Tail gas pressure reducing noise silencer
CN103970920A (en) * 2013-02-05 2014-08-06 周柏贾 Earthquake emergency exercise virtual simulation system
US10068385B2 (en) * 2015-12-15 2018-09-04 Intel Corporation Generation of synthetic 3-dimensional object images for recognition systems
CN105931288A (en) * 2016-04-12 2016-09-07 广州凡拓数字创意科技股份有限公司 Construction method and system of digital exhibition hall
CN109583108B (en) * 2018-12-06 2023-04-28 中国电力工程顾问集团西南电力设计院有限公司 Extra-high voltage transmission line scene construction method based on GIS
CN209590841U (en) * 2019-05-06 2019-11-05 福建中科智与科技有限公司 A kind of system integrated based on three-dimensional visualization technique combination internet of things data

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106249607A (en) * 2016-07-28 2016-12-21 桂林电子科技大学 Virtual Intelligent household analogue system and method
CN109978727A (en) * 2019-03-28 2019-07-05 上海荷福人工智能科技(集团)有限公司 A kind of AI intelligent building operation platform

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Design and Implementation of a Django-based Smart Park Platform ***; Niu Ning; China Master's Theses Full-text Database, Information Science and Technology; 2019-03-15; I138-503 *

Also Published As

Publication number Publication date
CN111476875A (en) 2020-07-31
CN112907738A (en) 2021-06-04
CN112288868A (en) 2021-01-29

Similar Documents

Publication Publication Date Title
CN111476875B (en) Smart building Internet of things object simulation method and building cloud server
CN111490990B (en) Network security analysis method based on big data platform, big data platform server and computer readable storage medium
CN111352670B (en) Virtual reality scene loading method and device, virtual reality system and equipment
CN111310057B (en) Online learning mining method and device, online learning system and server
CN112184872A (en) Game rendering optimization method based on big data and cloud computing center
CN110826799B (en) Service prediction method, device, server and readable storage medium
CN112291227A (en) Attack behavior mining method and system based on image big data and big data platform
CN111414540B (en) Online learning recommendation method and device, online learning system and server
CN110930254A (en) Data processing method, device, terminal and medium based on block chain
CN112069075A (en) Fashion testing method and device for game role and game client
CN116796552B (en) Simulation thinking asynchronous cooperative processing method and device
CN111476886B (en) Smart building three-dimensional model rendering method and building cloud server
CN112800241A (en) Big data processing method and big data processing system based on block chain offline payment
CN112269976A (en) Artificial intelligence face verification method and system of Internet of things
CN111787081B (en) Information processing method based on Internet of things interaction and intelligent communication and cloud computing platform
CN111797170B (en) Medical data information coordination processing method, device and system
CN112565450A (en) Data processing method, system and platform based on artificial intelligence and Internet of things interaction
CN111026371A (en) Game development method and device, electronic equipment and storage medium
CN106547696A (en) A kind of method for generating test case and device of Workflow-oriented system
CN113971181A (en) Data synchronization method and device based on remote multi-active system
CN117808816A (en) Image anomaly detection method and device and electronic equipment
CN114332501A (en) Repeated video checking method and device based on frame characteristics
CN110278120A (en) A kind of test environment with suitable method and proxy server

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: School of information engineering and automation, Kunming University of technology, 253 Xuefu Road, Wuhua District, Kunming City, Yunnan Province

Applicant after: Zhang Zhiyun

Address before: School of information engineering, Huaqiao University, 269 Chenghua North Road, Fengze District, Quanzhou City, Fujian Province

Applicant before: Zhang Zhiyun

TA01 Transfer of patent application right

Effective date of registration: 20210412

Address after: 8 / F, building A1, Huayi Science Park, 71 Tianda Road, high tech Zone, Hefei City, Anhui Province

Applicant after: ANHUI ANTAI TECHNOLOGY Co.,Ltd.

Address before: School of information engineering and automation, Kunming University of technology, 253 Xuefu Road, Wuhua District, Kunming City, Yunnan Province, 650093

Applicant before: Zhang Zhiyun

GR01 Patent grant