CN116777184A - Automatic monitoring method for construction progress of assembled building - Google Patents

Automatic monitoring method for construction progress of assembled building

Info

Publication number
CN116777184A
CN116777184A (application CN202311043244.3A)
Authority
CN
China
Prior art keywords
marking
image
hoisting
component
frames
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311043244.3A
Other languages
Chinese (zh)
Other versions
CN116777184B (en)
Inventor
卢延锋
梁冰
李东
魏媛
韩贵金
李伟
杨震卿
宋萍萍
周文武
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Sada Intelligent Technology Co ltd
BEIJING LIUJIAN CONSTRUCTION GROUP CO LTD
Beijing Construction Engineering Group Co Ltd
Original Assignee
Beijing Sada Intelligent Technology Co ltd
BEIJING LIUJIAN CONSTRUCTION GROUP CO LTD
Beijing Construction Engineering Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Sada Intelligent Technology Co ltd, BEIJING LIUJIAN CONSTRUCTION GROUP CO LTD, Beijing Construction Engineering Group Co Ltd filed Critical Beijing Sada Intelligent Technology Co ltd
Priority to CN202311043244.3A priority Critical patent/CN116777184B/en
Publication of CN116777184A publication Critical patent/CN116777184A/en
Application granted granted Critical
Publication of CN116777184B publication Critical patent/CN116777184B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0631 Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q10/06311 Scheduling, planning or task assignment for a person or group
    • G06Q10/063114 Status monitoring or status determination for a person or group
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66C CRANES; LOAD-ENGAGING ELEMENTS OR DEVICES FOR CRANES, CAPSTANS, WINCHES, OR TACKLES
    • B66C13/00 Other constructional features or details
    • B66C13/16 Applications of indicating, registering, or weighing devices
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01G WEIGHING
    • G01G19/00 Weighing apparatus or methods adapted for special purposes not provided for in the preceding groups
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/08 Construction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/774 Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30242 Counting objects in image

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Computing Systems (AREA)
  • General Business, Economics & Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Tourism & Hospitality (AREA)
  • Databases & Information Systems (AREA)
  • Marketing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Artificial Intelligence (AREA)
  • Game Theory and Decision Science (AREA)
  • Computational Linguistics (AREA)
  • Primary Health Care (AREA)
  • Quality & Reliability (AREA)
  • Operations Research (AREA)
  • Educational Administration (AREA)
  • Development Economics (AREA)
  • Mechanical Engineering (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to the technical field of resource, workflow, personnel or project management and discloses an automatic monitoring method for fabricated building construction progress. Sample data are generated and a target detection model is trained so that marking members, which characterize the construction progress, are machine-recognized during hoisting. The lifting hook, which is hard to misidentify, assists the image recognition, overcoming the problem of irrelevant objects being misidentified as marking members; multiple different frames of the member are captured during a single lift and recognized separately, overcoming the interference of the hoisting angle with the recognition result; and the member type is further verified against the hoisting weight, so that members identical in shape but of a different type are not misidentified as marking members. These three measures ensure an accurate result; the number of marking members is then counted to judge the construction progress, so that the progress can be monitored automatically and accurately and project managers can organize construction efficiently and precisely.

Description

Automatic monitoring method for construction progress of assembled building
Technical Field
The invention relates to the technical field of resource, workflow, personnel or project management, and in particular to an automatic monitoring method for fabricated building construction progress.
Background
A fabricated building is assembled on the construction site from prefabricated members, much like building blocks. Construction is fast and the construction period extremely short, so the demands on project management and construction organization are far higher than for ordinary buildings. Construction project managers of fabricated buildings must therefore track the on-site construction progress accurately and in real time, and coordinate the supply of components and the allocation of labour according to that progress.
The traditional way of monitoring construction progress is either visual inspection, by eye or through surveillance cameras, or listening to the reports of on-site workers for each work section and summarizing them. Visual inspection is obstructed and disturbed by auxiliary construction equipment such as scaffolding, and a fabricated building often has several construction locations working at the same time, so observations from multiple locations must be summarized; that summarizing step is somewhat subjective and biases the result. Judging from workers' reports introduces the subjective judgement of both the site workers and the project manager, so the deviation is even larger.
Counting the quantity of components that have been used is another way to assist progress monitoring, but when the counting is done manually, first, nobody can watch the components being used around the clock, so some installed components are missed; second, misidentification can occur during counting, for example mistaking a plasterboard for a laminated slab or vice versa (the two look similar from below).
If component usage is counted by machine recognition instead, two further problems arise in addition to the misidentification of similar-looking items:
Problem 1: the types and quantities of components on a construction site are enormous, that is, a huge number of components can appear in the image used for machine recognition, so misidentification is severe; and since only components being installed should be counted, the machine cannot tell at all which components in the image are being installed and which are merely stacked.
Problem 2: the position and attitude of a component change greatly during installation while the camera is fixed, so the apparent shape of the component in the image changes greatly; this undermines the stability and consistency of the recognition result and leads to missed detections.
Disclosure of Invention
The invention provides an automatic monitoring method for fabricated building construction progress.
The technical problem to be solved is that fabricated building construction requires timely and accurate progress monitoring, while the existing monitoring methods are highly subjective (inaccurate) and time- and labour-consuming (untimely).
To solve this technical problem, the invention adopts the following technical scheme: an automatic monitoring method for fabricated building construction progress, in which prefabricated members that can characterize the construction progress are designated as marking members and the progress is monitored by counting the number of marking members hoisted, comprising the following steps:
Step one: select the marking members, which must satisfy the following conditions:
Condition 1: only one piece is hoisted at a time, and a tower crane is used for hoisting;
Condition 2: the quantity used and the installation positions are determined before site construction;
Condition 3: the specification is one of one or more fixed specifications;
after the marking members are selected, weigh them and record the mass of each specification of marking member;
Step two: for the marking members, build a sample data set and train a target detection model;
Step three: during hoisting, acquire images of the member being hoisted from the same viewing angle as the sample data set and perform image recognition;
Step four: after image recognition, compare the hoisting weight of the member being hoisted with the masses of the marking members recorded in step one; if both the image recognition result and the hoisting weight match a marking member of a certain specification, add one to the hoisting count of that marking member;
Step five: judge the construction progress from the hoisting counts of the marking members. A high-level sketch of this monitoring loop follows.
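As a hedged illustration only, a minimal Python sketch of the five-step loop is given below. The injected helpers (recognize, read_weight, floor_reached) and the mass_tolerance check are placeholders standing in for the detector, the crane load cell and the weighted judgement described later; they are not part of the patented method itself.

```python
# Hypothetical orchestration of the five steps; all injected helpers are
# placeholders, not an API defined by the patent.
from collections import Counter

def monitor_progress(hoist_events, reference_mass, design_counts,
                     recognize, read_weight, floor_reached,
                     mass_tolerance=0.05):
    """reference_mass: {spec_id: mass_kg} from step one;
    design_counts: {floor: {spec_id: qty}} fixed before construction;
    recognize / read_weight / floor_reached: stand-ins for the detector,
    the crane load cell and the step-five judgement."""
    hoisted = Counter()
    for event in hoist_events:                  # one record per tower-crane lift
        spec = recognize(event)                 # step three: image recognition
        if spec is None:
            continue
        weight = read_weight(event)             # step four: hoisting-weight check
        ref = reference_mass[spec]
        if abs(weight - ref) / ref <= mass_tolerance:   # simplified weight verification
            hoisted[spec] += 1                  # both checks agree: count the lift
    return floor_reached(hoisted, design_counts)        # step five: progress judgement
```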
In step two, a camera mounted on the tower crane trolley is used to acquire the sample images; the camera points vertically downward at the lifting hook, so that in each sample image the hook lies above the member being hoisted.
In step three, if several prediction boxes appear in one frame during image recognition of the member being hoisted, the box with the largest intersection with the hook's target box is kept as the image recognition result.
In step three, multiple frames of the member being hoisted are acquired for each lift and recognized separately, and the recognition results of all frames are considered together to judge which marking member the member being hoisted matches.
In step three, the start and end of a lift are judged from the hoisting weight; the video of the whole process from start to end is recorded, and frames are taken from the video stream at intervals to obtain the multiple different images used for image recognition.
Further, the marking members are prefabricated laminated slabs, prefabricated staircases and prefabricated exterior walls.
Further, when the fabricated building to be constructed is a multi-storey building, the marking members should also satisfy the following condition:
Condition 4: at least one piece is required per floor;
if there is only one kind of marking member, the quantity and specifications used on each floor are counted, and when the number of marking members hoisted matches the total design quantity of marking members for a given floor and all floors below it, the construction progress has reached that floor;
if there are several kinds of marking member, the quantity and specifications of every kind used on each floor are counted, and when the hoisting count of each kind matches the total design quantity of the same kind for a given floor and all floors below it, the construction progress has reached that floor;
when counting the hoisted marking members, if one kind of marking member has several specifications, the specifications are counted together without distinction.
Further, step two specifically comprises the following sub-steps:
Step 2.1: hoist a marking member, record video with the vertically downward camera on the tower crane trolley, take frames from the video stream at intervals to obtain original images of the marking member, annotate the marking member in the images, crop and save the marking member according to the annotation, and transform the component image into an upright rectangular component image with an image perspective transformation algorithm;
Step 2.2: apply image rotation and image flipping to the upright rectangular component image;
Step 2.3: record video of the lifting hook being hoisted with the vertically downward camera on the tower crane trolley, take frames from the video stream at intervals, obtain original images of the hook, annotate the hook, and use the annotated hook information to superimpose and fuse the component images obtained in steps 2.1 and 2.2;
Step 2.4: preprocess the images before training with an image fogging algorithm and an image blurring algorithm to improve the robustness of the model;
the image fogging algorithm synthesizes fog around a centre point: fog is generated and diffused from one centre point of the image, and the fogging effect weakens with distance from that centre point; the image blurring algorithm is motion blur, realized by convolving the image with a blur kernel;
Step 2.5: train the target detection model.
Further, in step three, when several prediction boxes appear in the image recognition of the member being hoisted, the following method is used to keep, as the image recognition result, the box with the largest intersection with the hook's target box:
First the hooks are judged: if several hooks are detected, the hook target with the highest confidence is kept and the other hook targets are suppressed; if no hook is detected, the component target with the highest confidence is kept; if there is exactly one hook, originally or after suppression, it is checked whether any component prediction box exists, and if none exists, the next frame is inferred; if component prediction boxes exist, the intersections of the existing component target boxes with the hook target box are evaluated: boxes with no intersection are suppressed directly, the component target box with the largest intersection is kept, and if several component target boxes have equal intersections with the hook target box, the one with the highest confidence is kept.
In step four, the data processing centre reads the recognition result of every frame and counts, for every specification of every kind of marking member, the number of frames n_i in which it is recognized and the total number of frames n; it then computes in turn the ratio a_i = n_i / n of recognition frames to total frames for each specification, and if a_i for some specification exceeds a set threshold, the image recognition result is judged to match the marking member of that specification.
In step three, at least 100 different frames of the member being hoisted are acquired for each lift and recognized separately; in step four, if a_i is greater than 0.7, the image recognition result is considered to match the marking member of that specification.
Compared with the prior art, the automatic monitoring method for fabricated building construction progress of the invention has the following beneficial effects:
By generating sample data, training a target detection model, machine-recognizing during hoisting the components that characterize the construction progress, and counting their number, the construction progress can be judged automatically and accurately, so that project managers can organize construction efficiently and precisely;
By adjusting the angle of the images taken when recognizing the member being hoisted so that the hook in the image lies directly above the member, and keeping only recognition results that overlap the hook, the problem of misidentifying irrelevant objects as marking members during image recognition is overcome, and the recognized component is guaranteed to be the member actually being hoisted;
By taking multiple different frames of the member during a single lift and recognizing them separately, the interference of the hoisting angle with the recognition result is overcome and a consistent, stable result is guaranteed;
By additionally checking the hoisting weight of the member being hoisted, its type is further verified, which prevents components identical in shape but of a different type from being misidentified as marking members.
Drawings
FIG. 1 is a flow chart of an automatic monitoring method for the progress of fabricated building construction according to the present invention;
FIG. 2 is a flow chart of creating the sample data set in step two;
FIG. 3 is a flow chart of filtering the result when several prediction boxes appear in one frame during image recognition of the member being hoisted in step three;
FIG. 4 is a schematic view of how images of the member being hoisted are acquired;
FIG. 5 is a schematic diagram of the hardware used in the method for automatically monitoring the progress of the fabricated building construction of the present invention;
In the figures: 1 is the camera, 2 is the lifting hook, and 3 is the member being hoisted.
Detailed Description
As shown in FIG. 1, in the automatic monitoring method for fabricated building construction progress, prefabricated members that can characterize the construction progress during construction are designated as marking members, and the progress is monitored by counting the number of marking members hoisted. The method comprises the following steps:
Step one: select the marking members, which must satisfy the following conditions:
Condition 1: only one piece is hoisted at a time, and a tower crane is used for hoisting.
Several components are sometimes hoisted in a single lift during construction, which would interfere with image recognition, so this situation must be avoided. The marking members must all be hoisted by the tower crane: if some were hoisted by other lifting equipment they would be missed in the count and cause a systematic error, so it must be ensured that the marking members are hoisted by the tower crane.
Condition 2: the quantity used and the installation positions are determined before site construction, which ensures that the construction progress can be characterized by the consumption of these components.
Condition 3: the specification is one of one or more fixed specifications, which ensures a stable shape and mass. A specification includes dimensions, properties, material and mass.
After the marking members are selected, weigh them and record the mass of each specification. Preferably, several marking members of the same specification are weighed and the result averaged, so that the manufacturing tolerance of a single member does not affect the result. A sketch of how this per-specification information might be organized follows.
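A minimal sketch, assuming Python, of how the per-specification information collected in step one (reference mass, dimensions, material) and the per-floor design quantities fixed before construction might be organized; the field names are illustrative, not prescribed by the patent:

```python
from dataclasses import dataclass, field

@dataclass
class MarkingMemberSpec:
    """One specification of a marking member (field names are illustrative)."""
    kind: str                 # e.g. "laminated_slab", "staircase", "exterior_wall"
    spec_id: str              # drawing or catalogue number
    length_mm: float
    width_mm: float
    thickness_mm: float
    material: str
    reference_mass_kg: float  # averaged over several weighed members (step one)

@dataclass
class FloorPlan:
    """Designed quantity of each marking-member kind on one floor (conditions 2 and 4)."""
    floor: int
    design_counts: dict = field(default_factory=dict)  # {kind: quantity}
```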
Step two: aiming at the marking component, a sample data set is established and a target detection model is trained;
in the prior art, no sample data set exists, so that the sample data set needs to be built by itself, but as long as the building is completed on one construction project, the later construction project can be directly used. The invention has established sample data set, other construction projects can be directly borrowed if the invention needs to be applied.
Step three: acquiring an image of a component in the hoisting process and performing image recognition by using the same view angle as the sample data set;
the hoisting mode of the components in the hoisting process is required to be the same as that of the sample data and the mark components in the building process, and the same angle is adopted for shooting.
Step four: after image recognition, comparing the hoisting weight of the member 3 in hoisting with the quality of the marked member recorded in the step one, and if the image recognition result and the hoisting weight are identical with those of the marked member with certain specification, adding one to the hoisting number of the marked member;
the hoisting is used for verification to avoid that some components similar to the marking components in appearance are mistakenly identified as the marking components. Note that the sling is used for verification only, and cannot be used as a criterion alone, and if the sling is used as a criterion alone, various broken materials (such as cement, grouting materials and the like) in the construction process are identified as marking members. Verification by crane weight is performed on the premise that the shape is judged to be similar to the marking member, and the scattered materials are removed, so that the marking member can be used.
Step five: and judging the construction progress according to the number of the lifting marks of the marking members.
The hardware required to implement the above steps in this embodiment, and how it is connected, is shown in FIG. 5.
In step two, as shown in FIG. 4, a camera 1 mounted on the tower crane trolley is used to acquire the sample images; the camera 1 points vertically downward at the lifting hook 2, so that in every sample image the hook 2 lies above the member 3 being hoisted. In step three, if several prediction boxes appear in one frame during image recognition of the member 3 being hoisted, the box with the largest intersection with the target box of the hook 2 is kept as the image recognition result.
In this way the hook 2 is at the centre of the image, unique in number and directly above the member 3 being hoisted. Because the hook 2 is at the image centre, stable and unique in shape and unique in number (these are the constraint conditions), it cannot be misidentified; on that basis, the component overlapping its box is guaranteed to be the member 3 being hoisted, so components that are not being hoisted are not recognized and irrelevant components are not counted. For example, if the prefabricated laminated slab is a marking member, slabs waiting to be lifted or lying in a stack are excluded.
In step three, multiple frames of the member 3 being hoisted are recognized separately for every lift, and the recognition results of all frames are considered together to judge which marking member the member 3 being hoisted matches.
This ensures that changes of position and attitude during hoisting do not affect the image recognition: a component that meets the conditions is recognized correctly in most positions and attitudes and may fail only in some, without affecting the final judgement; conversely, a component that does not meet the conditions may be misrecognized in some positions and attitudes, again without affecting the final judgement.
In step three, the start and end of a lift are judged from the hoisting weight; the video of the whole process from start to end is recorded, and frames are taken from the video stream at intervals to obtain the multiple different images used for image recognition.
The step increase of the hoisting weight when a lift starts and the step decrease when it ends are obvious criteria from which the start and end of the lift can be judged, which ensures that the captured video covers exactly the hoisting process. A sketch of this segmentation follows.
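A minimal sketch, assuming Python with OpenCV, of segmenting one lift by the step change in hoisting weight and taking frames at intervals; the threshold values and the injected read_weight_kg reader are illustrative assumptions, not values given in the patent:

```python
import cv2

def collect_lift_frames(video_source, read_weight_kg,
                        start_step_kg=500.0,    # assumed size of the weight step at lift start
                        frame_interval=15):     # assumed sampling interval in frames
    """Capture frames between the step increase and step decrease of the hoist weight."""
    cap = cv2.VideoCapture(video_source)
    lifting, baseline, idx, frames = False, None, 0, []
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            weight = read_weight_kg()           # current crane hook load (injected reader)
            if baseline is None:
                baseline = weight               # empty-hook reference
            if not lifting and weight - baseline > start_step_kg:
                lifting = True                  # step increase: the lift has started
            elif lifting and weight - baseline < start_step_kg * 0.5:
                break                           # step decrease: the lift has ended
            if lifting and idx % frame_interval == 0:
                frames.append(frame)            # take frames "at intervals"
            idx += 1
    finally:
        cap.release()
    return frames
```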
If a multi-storey building is being constructed, the marking members preferably also satisfy the following condition:
Condition 4: at least one piece is required per floor. This makes it easy for project managers to organize construction, since the floor currently under construction is known explicitly, which gives an intuitive picture and is easy to communicate.
In this embodiment the marking members are prefabricated laminated slabs, prefabricated staircases and prefabricated exterior walls, all of which satisfy the above conditions.
If there is only one kind of marking member, the quantity and specifications used on each floor are counted; when the number of marking members hoisted matches the total design quantity of marking members for a given floor and all floors below it, the construction progress has reached that floor.
If there are several kinds of marking member, the quantity and specifications of every kind used on each floor are counted; when the hoisting count of each kind matches the total design quantity of the same kind for a given floor and all floors below it, the construction progress has reached that floor.
When counting the hoisted marking members, if one kind of marking member has several specifications, the specifications are counted together without distinction, which saves counting effort without affecting the accuracy of the result. A sketch of this floor-level judgement follows.
As shown in FIG. 2, step two specifically comprises the following sub-steps:
Step 2.1: hoist a marking member, record video with the vertically downward camera 1 on the tower crane trolley, take frames from the video stream at intervals to obtain original images of the marking member, annotate the marking member in the images, crop and save the marking member according to the annotation, and transform the component image into an upright rectangular component image with an image perspective transformation algorithm.
Step 2.2: apply image rotation and image flipping to the upright rectangular component image.
Step 2.3: record video of the lifting hook 2 being hoisted with the vertically downward camera 1 on the tower crane trolley, take frames from the video stream at intervals, obtain original images of the hook 2, annotate the hook 2, and use the annotated hook 2 information to superimpose and fuse the component images obtained in steps 2.1 and 2.2.
Steps 2.1 to 2.3 adapt the sample data set to the way the invention uses the hook 2 to assist image recognition. Building the samples is itself part of the construction process, and the equipment used to build them (see FIG. 5) is needed later anyway, so no extra cost is incurred.
Step 2.4: preprocess the images before training with an image fogging algorithm and an image blurring algorithm to improve the robustness of the model.
The image fogging algorithm synthesizes fog around a centre point: fog is generated and diffused from one centre point of the image, and the fogging effect weakens with distance from that centre point. The image blurring algorithm is motion blur, realized by convolving the image with a blur kernel. A sketch of these two augmentations follows after step 2.5.
Step 2.5: train the target detection model.
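A minimal sketch, assuming Python with OpenCV and NumPy, of the two augmentations of step 2.4: centre-point fog whose strength decays with distance from a chosen fog centre, and motion blur by convolution with a line-shaped kernel. The decay law, kernel construction and parameter values are illustrative assumptions, not specified in the patent:

```python
import cv2
import numpy as np

def add_center_fog(img, center=None, fog_gray=230, strength=0.8, falloff=1.2):
    """Blend a grey fog layer whose weight decays with distance from the fog centre.

    img is expected to be a 3-channel BGR image; all parameter values are assumed.
    """
    h, w = img.shape[:2]
    cy, cx = (h // 2, w // 2) if center is None else center
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.sqrt((ys - cy) ** 2 + (xs - cx) ** 2)
    dist /= dist.max()
    alpha = (strength * (1.0 - dist) ** falloff)[..., None]   # weaker fog farther away
    fog = np.full_like(img, fog_gray, dtype=np.float32)
    out = (1.0 - alpha) * img.astype(np.float32) + alpha * fog
    return out.clip(0, 255).astype(np.uint8)

def motion_blur(img, kernel_size=9, angle_deg=0.0):
    """Convolve with a line-shaped kernel to imitate motion blur."""
    kernel = np.zeros((kernel_size, kernel_size), np.float32)
    kernel[kernel_size // 2, :] = 1.0                         # horizontal line kernel
    rot = cv2.getRotationMatrix2D((kernel_size / 2 - 0.5,) * 2, angle_deg, 1.0)
    kernel = cv2.warpAffine(kernel, rot, (kernel_size, kernel_size))
    kernel /= kernel.sum()
    return cv2.filter2D(img, -1, kernel)
```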
As shown in FIG. 3, in step three, when several prediction boxes appear in the image recognition of the member 3 being hoisted, the following method, sketched in code after the formula below, is used to keep, as the image recognition result, the box with the largest intersection with the target box of the hook 2:
First the hooks 2 are judged: if several hooks 2 are detected, the hook 2 target with the highest confidence is kept and the other hook 2 targets are suppressed; if no hook 2 is detected, the component target with the highest confidence is kept; if there is exactly one hook 2, originally or after suppression, it is checked whether any component prediction box exists, and if none exists, the next frame is inferred; if component prediction boxes exist, the intersections of the existing component target boxes with the hook 2 target box are evaluated: boxes with no intersection are suppressed directly, the component target box with the largest intersection is kept, and if several component target boxes have equal intersections with the hook 2 target box, the one with the highest confidence is kept.
The intersection of an existing component target box with the target box of the hook 2 is judged with the formula
Z = (w × h) / ((x2 - x1) × (y2 - y1))
where w is the width of the overlap between the component target box and the hook 2 target box,
w = min(x2, x2') - max(x1, x1')
and h is the height of that overlap,
h = min(y2, y2') - max(y1, y1')
with w and h taken as 0 when negative, i.e. when there is no overlap. (x1, y1) and (x2, y2) are the upper-left and lower-right corner coordinates of the hook 2 target box, (x1', y1') and (x2', y2') are the upper-left and lower-right corner coordinates of the component target box, and Z is the computed ratio of the intersection to the hook 2 target box, which is used to judge the intersection state of the component target box and the hook 2 target box.
After the Z values of the n component prediction boxes against the hook 2 prediction box have been obtained, each Z value is examined: if Z is 0, the component target box is filtered out; if Z is greater than 0, the component prediction box with the largest Z is kept and the rest are filtered out; if several Z values are equal in size, the component target box with the highest confidence is kept.
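A minimal sketch, assuming Python, of the filtering of FIG. 3 using the intersection ratio Z defined above; the detection-result format (lists of (box, confidence) tuples) is an illustrative assumption:

```python
def hook_intersection_ratio(hook_box, comp_box):
    """Z: overlap area divided by the area of the hook target box.

    Boxes are (x1, y1, x2, y2) with (x1, y1) the upper-left corner.
    """
    x1, y1, x2, y2 = hook_box
    xa, ya, xb, yb = comp_box
    w = max(0.0, min(x2, xb) - max(x1, xa))
    h = max(0.0, min(y2, yb) - max(y1, ya))
    return (w * h) / ((x2 - x1) * (y2 - y1))

def filter_predictions(hooks, components):
    """hooks / components: lists of (box, confidence); returns the kept component or None."""
    if not hooks:                                       # no hook: keep the most confident component
        return max(components, key=lambda c: c[1]) if components else None
    hook = max(hooks, key=lambda h: h[1])               # keep the most confident hook, suppress others
    if not components:
        return None                                     # no component box: go on to the next frame
    scored = [(hook_intersection_ratio(hook[0], c[0]), c) for c in components]
    best_z = max(z for z, _ in scored)
    if best_z == 0:
        return None                                     # no component box intersects the hook box
    candidates = [c for z, c in scored if z == best_z]
    return max(candidates, key=lambda c: c[1])          # equal Z: highest confidence wins
```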
In step four, the data processing centre reads the recognition result of every frame and counts, for every specification of every kind of marking member, the number of frames n_i in which it is recognized and the total number of frames n; it then computes in turn the ratio a_i = n_i / n of recognition frames to total frames for each specification, and if a_i for some specification exceeds the set threshold, the image recognition result is judged to match the marking member of that specification. In this embodiment, in step three at least 100 different frames of the member 3 being hoisted are acquired for each lift and recognized separately; in step four, if a_i is greater than 0.7, the image recognition result is considered to match the marking member of that specification. A sketch of this frame-ratio check follows.
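A minimal sketch, assuming Python, of the frame-ratio check: per-frame recognition labels are tallied per specification and compared with the 0.7 threshold; the label format is an illustrative assumption:

```python
from collections import Counter

def frame_ratios(frame_labels):
    """frame_labels: one recognized specification id (or None) per processed frame."""
    n = len(frame_labels)
    counts = Counter(label for label in frame_labels if label is not None)
    return {spec: n_i / n for spec, n_i in counts.items()}    # a_i = n_i / n

def image_match(frame_labels, threshold=0.7):
    """Return the specification whose ratio a_i exceeds the threshold, or None."""
    for spec, a_i in frame_ratios(frame_labels).items():
        if a_i > threshold:
            return spec
    return None
```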
The threshold of 0.7 should be tuned on site according to actual conditions. In this embodiment 0.7 ensures that the image recognition result is accurate; when the invention is applied to other construction projects, the threshold can be adjusted starting from the 0.7 used in this embodiment until the recognition accuracy meets the project requirements.
Step four proceeds specifically as follows:
The data processing centre reads the hoisting-weight record of the lift and computes, in turn, the absolute difference |m - m_i| between the hoisting weight m and the mass m_i of each kind of marking member recorded in step one, as well as the difference m_max - m_min between the largest and smallest marking-member masses.
The data processing centre then computes, with a weighting formula, a judgement value P_i that the member finally recognized in this lift is a given marking member; the member category with the largest P_i is the final recognition result. The weighting formula is
P_i = w_1 × a_i + w_2 × b_i
where w_1 and w_2 are weighting coefficients, a_i is the proportion of recognition frames of each kind of member to the total number of frames, and b_i reflects the relation between the hoisting-weight information and the actual member mass, computed from the absolute difference |m - m_i| normalized by the mass range m_max - m_min.
determination using a maximum estimation methodw 1 Andw 2
w 1 andw 2 using a maximum estimation method, using known sample result information to extrapolate back the most probable occurrence of the correct sample resultw 1 Andw 2 the method comprises the steps of carrying out a first treatment on the surface of the Order theIn order for the model parameters to be solved,S j the maximum estimation formula is as follows:
P j for the component class to which the greatest confidence corresponds,R j as true category, ifP j And (3) withR j In accordance with the method, the device and the system,S j 1, otherwiseS j Is 0.
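A minimal sketch, assuming Python, of the fused judgement value and a simple grid search standing in for the maximum estimation of w_1 and w_2. The linear form of P_i, the normalization inside b_i and the constraint w1 + w2 = 1 are illustrative assumptions consistent with the description above, not formulas given verbatim in the patent:

```python
def judgement_values(a, masses, hoist_weight):
    """Return (a_i, b_i) per specification; b_i uses an assumed range normalization."""
    m_max, m_min = max(masses.values()), min(masses.values())
    span = max(m_max - m_min, 1e-6)
    b = {s: max(0.0, 1.0 - abs(hoist_weight - m) / span) for s, m in masses.items()}
    return {s: (a.get(s, 0.0), b[s]) for s in masses}

def classify(a, masses, hoist_weight, w1, w2):
    """P_i = w1 * a_i + w2 * b_i; return the specification with the largest P_i."""
    ab = judgement_values(a, masses, hoist_weight)
    return max(ab, key=lambda s: w1 * ab[s][0] + w2 * ab[s][1])

def estimate_weights(samples, masses, grid=None):
    """Grid search for (w1, w2) maximizing the number of correct samples (sum of S_j).

    samples: list of (frame_ratios, hoist_weight, true_spec); w1 + w2 = 1 is assumed.
    """
    grid = grid or [i / 10 for i in range(11)]
    best, best_score = (0.5, 0.5), -1
    for w1 in grid:
        w2 = 1.0 - w1
        score = sum(classify(fr, masses, hw, w1, w2) == truth
                    for fr, hw, truth in samples)         # S_j summed over the samples
        if score > best_score:
            best, best_score = (w1, w2), score
    return best
```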
The above embodiment only illustrates a preferred implementation of the invention and does not limit its scope of protection; various modifications and improvements made by those skilled in the art to the technical solution of the invention without departing from its spirit fall within the scope of protection defined by the claims.

Claims (10)

1. An automatic monitoring method for fabricated building construction progress, characterized in that: prefabricated members that can characterize the construction progress during construction are designated as marking members, and the construction progress is monitored by counting the number of marking members hoisted, the method comprising the following steps:
step one: selecting the marking members, which must satisfy the following conditions:
condition 1: only one piece is hoisted at a time, and a tower crane is used for hoisting;
condition 2: the quantity used and the installation positions are determined before site construction;
condition 3: the specification is one of one or more fixed specifications;
after the marking members are selected, weighing them and recording the mass of each specification of marking member;
step two: for the marking members, building a sample data set and training a target detection model;
step three: during hoisting, acquiring images of the member being hoisted from the same viewing angle as the sample data set and performing image recognition;
step four: after image recognition, comparing the hoisting weight of the member (3) being hoisted with the masses of the marking members recorded in step one, and if both the image recognition result and the hoisting weight match a marking member of a certain specification, adding one to the hoisting count of that marking member;
step five: judging the construction progress from the hoisting counts of the marking members.
2. The automatic monitoring method for fabricated building construction progress according to claim 1, characterized in that: in step two, a camera (1) mounted on the tower crane trolley is used to acquire the sample images, the camera (1) pointing vertically downward at the lifting hook (2), so that in the sample images the hook (2) lies above the member (3) being hoisted;
and in step three, if several prediction boxes appear in one frame during image recognition of the member (3) being hoisted, the box with the largest intersection with the target box of the hook (2) is kept as the image recognition result.
3. The automatic monitoring method for fabricated building construction progress according to claim 1, characterized in that: in step three, multiple frames of the member (3) being hoisted are acquired for each lift and recognized separately, and the recognition results of all frames are considered together to judge which marking member the member (3) being hoisted matches.
4. The automatic monitoring method for fabricated building construction progress according to claim 3, characterized in that: in step three, the start and end of a lift are judged from the hoisting weight, the video of the whole process from start to end is recorded, and frames are taken from the video stream at intervals to obtain the multiple different images used for image recognition.
5. The automatic monitoring method for fabricated building construction progress according to claim 1, characterized in that: the marking members are prefabricated laminated slabs, prefabricated staircases and prefabricated exterior walls.
6. The automatic monitoring method for fabricated building construction progress according to claim 1, characterized in that: when the fabricated building is a multi-storey building, the marking members should satisfy the following condition:
condition 4: at least one piece is required per floor;
if there is only one kind of marking member, the quantity and specifications used on each floor are counted, and when the number of marking members hoisted matches the total design quantity of marking members for a given floor and all floors below it, the construction progress has reached that floor;
if there are several kinds of marking member, the quantity and specifications of every kind used on each floor are counted, and when the hoisting count of each kind matches the total design quantity of the same kind for a given floor and all floors below it, the construction progress has reached that floor;
when counting the hoisted marking members, if one kind of marking member has several specifications, the specifications are counted together without distinction.
7. The automatic monitoring method for fabricated building construction progress according to claim 2, characterized in that: step two specifically comprises the following sub-steps:
step 2.1: hoisting a marking member, recording video with the vertically downward camera (1) on the tower crane trolley, taking frames from the video stream at intervals to obtain original images of the marking member, annotating the marking member in the images, cropping and saving the marking member according to the annotation, and transforming the component image into an upright rectangular component image with an image perspective transformation algorithm;
step 2.2: applying image rotation and image flipping to the upright rectangular component image;
step 2.3: recording video of the lifting hook (2) being hoisted with the vertically downward camera (1) on the tower crane trolley, taking frames from the video stream at intervals, obtaining original images of the hook (2), annotating the hook (2), and using the annotated hook (2) information to superimpose and fuse the component images obtained in step 2.1 and step 2.2;
step 2.4: preprocessing the images before training with an image fogging algorithm and an image blurring algorithm to improve the robustness of the model;
the image fogging algorithm synthesizes fog around a centre point, generating and diffusing fog from one centre point of the image, with the fogging effect weakening with distance from that centre point; the image blurring algorithm is motion blur, realized by convolving the image with a blur kernel;
step 2.5: training the target detection model.
8. The automatic monitoring method for fabricated building construction progress according to claim 2, characterized in that: in step three, when several prediction boxes appear in the image recognition of the member (3) being hoisted, the following method is used to keep, as the image recognition result, the box with the largest intersection with the target box of the hook (2):
first the hooks (2) are judged: if several hooks (2) are detected, the hook (2) target with the highest confidence is kept and the other hook (2) targets are suppressed; if no hook (2) is detected, the component target with the highest confidence is kept; if there is exactly one hook (2), originally or after suppression, it is checked whether any component prediction box exists, and if none exists, the next frame is inferred; if component prediction boxes exist, the intersections of the existing component target boxes with the hook (2) target box are evaluated: boxes with no intersection are suppressed directly, the component target box with the largest intersection is kept, and if several component target boxes have equal intersections with the hook (2) target box, the one with the highest confidence is kept.
9. The automatic monitoring method for fabricated building construction progress according to claim 3, characterized in that: in step four, the data processing centre reads the recognition result of every frame and counts, for every specification of every kind of marking member, the number of frames n_i in which it is recognized and the total number of frames n; it then computes in turn the ratio a_i = n_i / n of recognition frames to total frames for each specification, and if a_i for some specification exceeds a set threshold, the image recognition result is judged to match the marking member of that specification.
10. The automatic monitoring method for fabricated building construction progress according to claim 9, characterized in that: in step three, at least 100 different frames of the member (3) being hoisted are acquired for each lift and recognized separately; in step four, if a_i is greater than 0.7, the image recognition result is considered to match the marking member of that specification.
CN202311043244.3A 2023-08-18 2023-08-18 Automatic monitoring method for construction progress of assembled building Active CN116777184B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311043244.3A CN116777184B (en) 2023-08-18 2023-08-18 Automatic monitoring method for construction progress of assembled building

Publications (2)

Publication Number Publication Date
CN116777184A true CN116777184A (en) 2023-09-19
CN116777184B CN116777184B (en) 2023-12-12

Family

ID=87991579

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311043244.3A Active CN116777184B (en) 2023-08-18 2023-08-18 Automatic monitoring method for construction progress of assembled building

Country Status (1)

Country Link
CN (1) CN116777184B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100020861A (en) * 2008-08-13 2010-02-23 건국대학교 산학협력단 Apparatus and method for managing progress of steel structure construction
CN110287519A (en) * 2019-05-14 2019-09-27 深圳大学 A kind of the building engineering construction progress monitoring method and system of integrated BIM
CN112561487A (en) * 2020-12-21 2021-03-26 广联达科技股份有限公司 Method, device and equipment for calculating construction progress and readable storage medium
CN113971781A (en) * 2021-12-03 2022-01-25 上海建工四建集团有限公司 Building structure construction progress identification method and device, client and storage medium
CN115564381A (en) * 2022-10-11 2023-01-03 上海东方投资监理有限公司 Information control method for construction progress of fabricated building
CN116363586A (en) * 2023-03-24 2023-06-30 中铁七局集团第三工程有限公司 Intelligent bridge construction progress identification method based on improved YOLOV5S


Also Published As

Publication number Publication date
CN116777184B (en) 2023-12-12

Similar Documents

Publication Publication Date Title
CN112561487B (en) Method, device and equipment for calculating construction progress and readable storage medium
CN108960668A (en) The early warning and reminding method and device of construction quality monitoring, calculate equipment at storage medium
CN111898990A (en) Building construction progress management method
CN109557935A (en) A kind of safety monitoring during construction method and system based on unmanned plane
CN110146035B (en) Embedded part detection method, device, equipment and system for component production line
CN112827772A (en) Workpiece identification automatic spraying control method and system
CN115329446B (en) Digital twinning modeling method for intelligent hoisting process of prefabricated parts of fabricated building
CN110097087A (en) A kind of automatic binding reinforcing bars location recognition method
CN114511199A (en) Engineering project delivery acceptance management method, equipment and computer storage medium
CN115797850B (en) Oil field production safety early warning analysis system based on video stream
CN112859681B (en) Intelligent monitoring method for safety and stability of building steel structure based on big data analysis and cloud monitoring platform
CN112037327A (en) Method, device, equipment and storage medium for monitoring construction progress of building at arrangement point
CN116777184B (en) Automatic monitoring method for construction progress of assembled building
CN112819306A (en) Method, system, device and medium for evaluating work efficiency based on computer vision
CN109815568A (en) A method of protective device is added based on the model automatic identification hole BIM
CN113256269A (en) Engineering management system based on BIM, cloud computing and big data technology
CN115690693A (en) Intelligent monitoring system and monitoring method for construction hanging basket
CN117474321A (en) BIM model-based construction site risk intelligent identification method and system
CN113720283A (en) Building construction height identification method and device, electronic equipment and system
CN113971781A (en) Building structure construction progress identification method and device, client and storage medium
CN117236892A (en) Digital twin-based bridge steel web water transportation hoisting system and control method
CN108975116B (en) Elevator floor height measuring method, device, equipment and storage medium
CN116993110B (en) Component type identification method in hoisting process based on vision and hoisting weight
CN116228159A (en) Construction progress monitoring method, monitoring device, equipment and medium
CN113658239A (en) Building construction progress identification method and device, electronic equipment and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant