Abstract

Civil engineering structures, especially the widely used reinforced concrete structures, are generally large in volume and scale. They contain many internal defects such as micro-cracks and air bubbles, and concrete cracks easily under tension. A short-gauge (point) sensor is affected by local damage and cannot accurately reflect the overall working state of the structure. Long-gauge strain sensing technology can reflect the average strain of the measured region even when local cracks are present in the structure and can be deployed over the whole structure to form a distributed sensor network for comprehensive monitoring. Long-gauge sensing is therefore better suited to civil engineering structure monitoring. This study focuses on fuzzy systems and introduces a generalized probability decision process model to describe the behavior of such fuzzy systems. Maximum-likelihood scheduling and minimum-likelihood scheduling are discussed. The corresponding model checking methods for probabilistic linear time properties cover final reachability, always reachability, persistent reachability, and repeated reachability; the advantage of this approach is that the verification process of model checking is transformed into fuzzy matrix operations. The model checking problems of generalized likelihood-regular security properties and generalized likelihood-correcting-regular properties are studied, and both are transformed into the model checking of generalized likelihood linear time properties that always hold. The failure state of the beam and the displacement of the beam bottom are obtained from the data of each unit sensor. The experimental results show that the sensor can effectively capture cracks and that its measured data are relatively accurate. The dynamic performance of the sensor is studied through a damage monitoring test on a vertical cantilever flat beam. After spectrum analysis, the sensor accurately obtains the participation coefficient of each mode shape; the identification result is close to that of the FBG sensor, the frequency identification error does not exceed 0.2%, and the long-gauge strain modes of the structure are also accurately identified. The long-gauge strain modes obtained from sensor monitoring can accurately locate damage and can quantitatively identify the damage with good accuracy.

1. Introduction

Engineering quality is the guarantee for the survival and development of civil engineering enterprises. The extensive quality management model of civil engineering makes it extremely difficult for enterprises to shift toward quality- and benefit-oriented intensive development. The application of information technology in the civil engineering industry has become an important way to transform the development mode and improve the quality of civil engineering [1]. Logical equivalent technology is a key technology of civil engineering informatization and creates conditions for data-based quality management and big data analysis. However, research on integrating logical equivalence and big data technology in quality management has only just begun, and research on application methods is still largely blank. Logic synthesis is a complicated process, and reasonable constraints need to be imposed on the design to produce good synthesis results. After synthesis, it is necessary to check whether the timing and the various constraints are satisfied [2–5]. This study takes civil engineering quality management as its research object, applies logical equivalence and big data technology to civil engineering quality management, and systematically studies the realization path and application methods of logical equivalence and big data in quality management, providing a reference for civil engineering quality management based on logical equivalence and big data.

However, current research on civil engineering networks based on complex networks is mostly limited to discussing morphological complexity and common structural features and lacks quantitative indicators that describe the diversity, heterogeneity, and completeness of civil engineering cross-connections from a structural perspective. Entropy theory and fractal theory are representative methods of complexity research [6–8]. Scholars have proposed topological information entropy and several structural fractal dimension measures. However, current topological information entropy suffers from inconsistent methods and models and can only describe certain aspects of the network. Research on the fractal characteristics of civil engineering network structures is mostly limited to calculation; the box dimension, volume dimension, and degree volume dimension of civil engineering networks are rarely studied, and the explanation of the underlying geographical mechanism needs to be deepened. Existing map topology information measures pay more attention to the diversity of map symbol connectivity or the closeness of connectivity and lack methods to quantitatively describe the spatial pattern and structural characteristics of map symbols, especially the compactness and heterogeneity of map symbol topology. This paper defines the compactness of map symbol topology and heterogeneous information entropy, proposes calculation methods for them, and applies them to part of the civil engineering network. The results show that the method describes the topology of the civil engineering network more accurately [9–11].

This article first takes the status quo of quality management as its starting point and, through an analysis of total quality management theory, logical equivalent technology, and big data theory, expounds the feasibility and necessity of applying logical equivalence and big data technology to civil engineering quality management. We put forward a realization path for logical equivalence and big data in quality management, which provides ideas for civil engineering quality management based on logical equivalence and big data. Through a systematic analysis of the whole cycle of logically equivalent quality management data, a framework for the big data analysis process of logically equivalent quality management is designed. Under the guidance of this framework, the structure of logically equivalent quality management data, the methods for extracting and storing such data, and a quality text big data analysis method based on text mining are proposed, thereby realizing research on big data analysis methods for logically equivalent quality management and providing data support for total quality management based on logical equivalence. By integrating logical equivalent technology and big data analysis methods into the traditional quality management workflow and information flow, a comprehensive quality management process based on logical equivalence and big data is constructed. Guided by this process, we systematically analyze the integration, storage, sharing, and application of logically equivalent quality information, as well as the application points and implementation steps of logical equivalence in total quality management, and propose a quality management approach based on logical equivalence in a big data environment.

A large body of research has examined the structural characteristics of civil engineering networks based on complex network theory. Qing et al. [12] used duality to characterize the civil engineering network of a regional city, revealing its small-world characteristics; they selected six urban civil engineering networks with different geometric forms and historical origins and analyzed their complex network structure characteristics. Clark and Watson [13] analyzed the betweenness centrality of six domestic civil engineering networks and found that the betweenness centrality values follow a power-law distribution and can effectively reflect the civil engineering grade. Qi et al. [14] tested the homogeneity and heterogeneity of the civil engineering network and found that the two coexist within it. These empirical studies reveal potential structural characteristics of civil engineering networks, but most are limited to describing certain aspects and lack a systematic description of structural complexity. In complex network research, network structure entropy is used to describe the disorder and heterogeneity of a network; it compensates for the difficulty of fitting power-law coefficients accurately and has good versatility.

Map information entropy is an extension of entropy theory in the field of cartography. It mainly describes the amount of information contained in map symbols from the perspectives of geometry, topology, and theme in order to evaluate their complexity. Map topology information entropy can effectively describe the complexity of the adjacency relationships of map symbols, and it has great potential for measuring the complexity of civil engineering network structures. It has already been applied in civil engineering selection and map generalization [15–17]. However, the existing measurement models of map topology information entropy are not uniform, their definitions differ, and the description of the complexity of civil engineering network structures is still not accurate enough. The first issue to consider in reversible logic synthesis is the cascade of reversible logic gates, that is, the order in which reversible logic gates are used to realize a given reversible logic function. Second, the realization cost of the reversible network should be as small as possible, since every technology used to realize the network comes at a price. Finally, the numbers of input and output bits must be equal in reversible logic design, so useless information bits are often added, which is another important factor in reversible logic synthesis. Yan et al. [18] used a variety of logical equivalent software packages to deepen the design of a prestressed steel structure and realized construction simulation and dynamic monitoring. The Xsteel software was used in the deepening design and construction of the project, demonstrating that logical equivalent technology has advantages for the design and construction of large-span, special-shaped steel structures in improving efficiency and supporting sustainable development. Matheran et al. [19], for a sports center cycling hall grid project, carried out secondary development based on ProSteel software using the VB language and established a system for the rapid generation of spatial structure models.

Subsequently, scholars have continued to propose new structural fractal dimension measures. So far, small-world characteristics, scale-freeness, and structural fractality have been summarized as the three major characteristics of complex networks. Using the box covering method, scholars tested the fractal characteristics and self-similarity of the civil engineering network structures of the top 50 cities in the United States and found that civil engineering networks are also fractal in structure. However, compared with research on the small-world and scale-free characteristics of civil engineering networks, exploration of their structural fractal characteristics is only in its infancy. This is mainly because structural fractals are not as intuitive and easy to understand as the small-world and scale-free properties; they are more abstract, and their mechanism is still difficult to explain. Although studies have shown that the emergence of structural fractals is largely due to the mutual repulsion of hub nodes [20–23], this explanation remains largely theoretical, and there is an urgent need to describe the fractal characteristics of real complex network structures. Researchers have applied logical equivalent technology to prefabricated houses, used Tekla to achieve deepened design and collision detection, and proposed establishing a prefabricated apartment library and a prefabricated component product library to standardize prefabricated civil structures. They studied the storage, protection, and arrangement procedures for scaffolding engineering and construction-site components of prefabricated civil construction and concluded that the application of logical equivalent technology is of great help in solving the current problems of prefabricated civil engineering. In their research, they also exploited the unique advantages of logically equivalent technologies such as collaborative design, 4D construction simulation, and collision detection [24–26].

3. Logical Equivalent Model Space Architecture

3.1. Logical Level Distribution

The core of logical equivalent technology is information, which uses software as a carrier to realize model construction, model application, and information management. The ultimate goal is to realize civil engineering informatization. According to the connotation of logical equivalence, a complete information model should be able to integrate engineering data and business data at different stages of the entire life cycle, dynamically realize the creation, management, and sharing of information, and always maintain the consistency and completeness of information.

Logical equivalent technology changes the way of information sharing and improves management efficiency with its own characteristics as follows. In different stages of the life cycle of civil engineering, the information remains consistent. The same information only needs to be created once, and it can automatically evolve on its basis. The model objects are modified and expanded in different stages without recreation, which reduces the errors of inconsistent information.

The clustering (agglomeration) coefficient and the characteristic path length are the two main indicators used to determine whether a network has small-world characteristics: if a network has both a relatively large clustering coefficient and a relatively small characteristic path length, it is a small-world network. Degree distribution, characteristic path length, and clustering coefficient are the three most basic statistical characteristics of complex network structures. Hierarchy is a common law and phenomenon in the objective world and one of the basic organizational forms that exist widely in complex systems in nature and human society.
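As a concrete illustration of this criterion (not part of the original study), the following Python sketch builds a hypothetical small-world graph with networkx, computes the two indicators, and compares them against a random graph of the same size; the graph parameters are illustrative assumptions.

```python
# Illustrative sketch: small-world check via clustering coefficient and
# characteristic path length. The graph stands in for a road-network dual graph.
import networkx as nx

G = nx.connected_watts_strogatz_graph(n=200, k=6, p=0.1, seed=42)
clustering = nx.average_clustering(G)              # clustering (agglomeration) coefficient
path_length = nx.average_shortest_path_length(G)   # characteristic path length

# Random graph with the same number of nodes and edges, for comparison.
R = nx.gnm_random_graph(n=200, m=G.number_of_edges(), seed=42)
if not nx.is_connected(R):
    R = R.subgraph(max(nx.connected_components(R), key=len)).copy()

print(f"C = {clustering:.3f}, L = {path_length:.3f}")
print(f"C_rand = {nx.average_clustering(R):.3f}, "
      f"L_rand = {nx.average_shortest_path_length(R):.3f}")
# The network is judged small-world when C >> C_rand while L is close to L_rand.
```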

The hierarchy in a civil engineering network is a hierarchical structure that is organized artificially or emerges automatically according to the function or importance of the civil engineering elements. Streets at the same level have similar properties and relatively close connections, while streets at different levels differ progressively in features and functions. Neither random networks nor regular networks possess a hierarchical structure; hierarchy profoundly affects the distribution of traffic flow in the network and causes differences in network complexity.

Nodes connect to each other to form a network, and their connection relationships directly affect the complexity of the network structure. The complexity of the connection relationships is reflected in the diversity of, and differences among, connection modes. In a network, the connection relationships of nodes largely determine the network structure and the properties that emerge from it. In a regular network, the connection relationships of nodes are uniform; by rewiring edges with a certain probability on the basis of the regular network, the connection relationships change from uniform to diverse, and the network structure changes from simple and regular to complex and varied.

3.2. Equivalent Data Indicators

Civil engineering quality management consists of coordinated activities that direct and control an organization with regard to quality, so as to construct, economically and efficiently, civil engineering works of qualified quality that meet design requirements, standards, and user needs. Quality management implements all quality management functions by determining and establishing quality policies, quality objectives, and responsibilities and, within the quality management system, by means of quality planning, quality control, quality assurance, and quality improvement.

The compactness of the topology and the heterogeneity of the topology comprehensively consider the pattern and structural characteristics of the nodes in the network. By comparing the importance of each node in the network (through the degree value) and the difference between a node and its neighbors in terms of adjacency, the spatial adjacency information of nodes at the global and local levels is revealed. After extracting the feature description parameters of the topological structure in Table 1, the improved information entropy calculation model is used to define the information entropy of the entire network.
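The paper's improved entropy model and the Table 1 parameters are not reproduced here; as a minimal sketch of the general idea, the following Python snippet computes a conventional Shannon entropy over a hypothetical degree sequence of a road-network dual graph.

```python
# Hedged sketch: a conventional degree-based network information entropy.
# This only illustrates the entropy calculation over node degree values;
# the paper's own "improved" model may differ.
import math

def degree_entropy(degrees):
    """Shannon entropy (bits) of the normalized degree distribution."""
    total = sum(degrees)
    return -sum((d / total) * math.log2(d / total) for d in degrees if d > 0)

# Hypothetical degree sequence of a small road-network dual graph.
degrees = [1, 2, 2, 3, 3, 3, 4, 6]
print(f"H = {degree_entropy(degrees):.3f} bits")
```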

From the design flow point of view, synthesis starts from the RTL code and the initial UPF: the RTL describes the logic function of the design, and the UPF describes its power intent. The RTL and UPF exist in separate files, which makes it easy to adjust them independently. The synthesis tool reads in the RTL code and the initial UPF and outputs the gate-level netlist together with the UPF' corresponding to the synthesized gate-level netlist. The newly generated UPF' contains the information of the original UPF and also describes the power connections of the special cells inserted during synthesis (such as isolation cells and level shifter cells).

Logically equivalent civil engineering design deals not with unrelated points, lines, and surfaces but with civil components that contain rich attribute information and have bidirectional relationships, such as columns, beams, walls, slabs, and doors. Each component object inherits the attributes of its own class and is expressed and calculated through parameters and parameter values. Component entity objects include geometric attributes such as length, width, and height that describe their own characteristics; physical structural attributes such as material; basic data such as functional attributes; and extended attributes such as technical parameters, cost data, schedule data, construction information, and maintenance information. Over the entire life cycle of a civil engineering project, it is only necessary to continuously update and improve the parameter information of the building information model.

3.3. Model Weight Iteration

Each link of the PDCA cycle in civil engineering quality management, the setting of quality control points, the quality management of inspection lots, subitem projects, unit projects, and individual projects, and the completion and acceptance of the entire project all require accurate and reliable information and data for support. In existing quality management, most information is archived on paper or stored separately in each participant's enterprise management information system, so integrated sharing of information cannot be realized. These "information gaps" and "information islands" make communication between the participants difficult and reduce the predictability and controllability of civil engineering quality for the participants.

Static timing analysis tools (such as PrimeTime) and power analysis tools (such as PrimeTime PX) can read the gate-level netlist and UPF after synthesis or physical implementation to perform timing analysis and power analysis for low-power designs. PrimeTime uses the information contained in the UPF to construct a virtual power supply network model and back-annotates the voltage onto the power pins of each gate-level netlist instance. Through corresponding constraints during synthesis, the synchronous circuit in Figure 1 meets the setup time and hold time requirements.

Logical equivalent technology provides a platform for information integration with a powerful back-end storage system, including data layer, model layer, and information application layer. The process of constructing logically equivalent models defines basic data such as geometric properties, physical structure properties, and functional properties of components to form a 3D model. With the progress of the project and the in-depth application of the 3D model, the extended information in the model is continuously enriched and improved. The information in the design phase, construction preparation phase, construction phase, completion phase, and operation and maintenance phase is continuously integrated on the basis of the 3D model to ensure the continuity and consistency of the information in each phase and finally form the information of the project product and business process.

3.4. Spatial Factor Recursion

In the current quality management process, formal communication between the parties is usually carried out through emails, contact sheets, meetings, and the like. Although these can solve communication problems, they constitute a "point-to-point" communication mode that is prone to untimely communication, information asymmetry, and information islands among the parties, which seriously affects the efficiency of quality management. A unified "point-to-face" information platform among all parties for improving work efficiency is lacking.

An R node can serve as both a premise and a conclusion, so the model structure of a production rule is an irregular directed graph. The irregularity manifests itself as follows: in the topological graph of production rules, not only are nodes related to one another, but there are also connections within nodes. The relationship between nodes is, in fact, the rule: conditions and entities that belong to the same rule share the same parent node, while the internal connections of a node represent the relationships between the premises within a rule. These two kinds of relationships differ in many ways, so the production rule is an irregular data structure.

Because the quality planning and assurance work in Table 2 is insufficient, the quality of the controlled entities tends to be uneven. Information throughout the quality management process cannot be effectively integrated, leaving quality improvement work without a data basis and forming a vicious circle. Quality management based on logical equivalent technology realizes the collection and storage of information through front-end collection equipment and a back-end database, creating the conditions for logically equivalent big data. Logically equivalent big data provide a reference for the quality planning of similar projects and allow planning schemes to be simulated to ensure that they are economically feasible; through the information collection equipment, the process can be tracked and the project dynamics controlled in real time.

With the help of any of the 16 main roads in Figure 2, the horizontal roads can easily reach one another. In the same way, the vertical roads can easily reach one another by means of the horizontal roads. Although the grid-like civil engineering network is not as dense as the sample civil engineering network, with the help of the two ring roads any two roads can reach each other through a small number of transfers. The sample civil engineering network of Gavle City, delineated by its outer ring road, has the lowest compactness.

The proposed topological information entropy calculation method only measures, from a probabilistic perspective, the uniformity of the road degree value distribution in each civil engineering network; it does not consider the degree values themselves, that is, the compactness of the topology and the differences in degree value between connected roads. Therefore, it does not describe the complexity of the topology of the sample road networks well.

The safety rule system is an expandable database, so inputting safety rules in a uniform format is the basis for establishing the database. When applying safety rules to natural language specifications that have been processed through knowledge engineering, the accident subject of the safety accident is determined first, and safety rules are then applied to conduct rule inspections within the scope of the identified accident subject. The first node, K11, is therefore the accident subject. By searching the specification entries associated with node K11, safety information such as attributes, parameters, and safety measures is retrieved, the conclusion node Ck at the end of the search path is extracted, and the handling measures and other information in the clauses can be used for the safety inspection.
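A minimal sketch of this search, assuming a hypothetical dictionary-based rule graph (the node names K11 and Ck follow the text, but the fields and contents are illustrative):

```python
# Hedged sketch: depth-first search from the accident-subject node (K11) to the
# conclusion node (Ck). The structure and field names are hypothetical.
rule_base = {
    "K11": {"subject": "opening", "children": ["A1", "A2"]},
    "A1":  {"attribute": "depth >= 2 m", "children": ["Ck"]},
    "A2":  {"attribute": "no guardrail", "children": ["Ck"]},
    "Ck":  {"conclusion": "install guardrail and safety net", "children": []},
}

def search_conclusions(node_id, base):
    """Collect conclusion nodes reachable from a subject node."""
    node = base[node_id]
    found = [node["conclusion"]] if "conclusion" in node else []
    for child in node["children"]:
        found.extend(search_conclusions(child, base))
    return list(dict.fromkeys(found))   # deduplicate, keep order

print(search_conclusions("K11", rule_base))
```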

4. Construction of a Civil Engineering Complexity Structure Measurement Model Based on Logical Equivalence

4.1. Civil Engineering Quality Management

The quality of civil engineering products refers to the use value of the product, that is, the natural attributes of the product that can meet the needs of the country and the people, such as applicability, safety (reliability), durability, aesthetics, economy, and environmental coordination. The operating quality of the quality management system can be reflected by its operational effectiveness, which can be summarized as comprehensive implementation, behavior in place, timely management, moderate control, effective identification, and advancing with the times; that is, whether the enterprise achieves its project quality objectives and whether the efficiency and level of its management, organizational, and technical work are up to standard.

The volume structure fractal dimension of all twelve sample civil engineering networks is smaller than the degree volume structure fractal dimension. This is because, in the dual graph of a civil engineering network based on road planning, the heterogeneity of the network nodes is very strong and the degree values mostly obey a power-law distribution. In addition, civil engineering networks have "hubs" serving local structures at different scales, so there is a certain degree of heterogeneity between global and local node degree values. Therefore, as the measurement radius increases, the average number of nodes reachable from any node grows more slowly than the total degree value of the reachable nodes in Table 3.

By analyzing and comparing the above knowledge representation methods, we can divide them into two categories. The first category comprises knowledge modeling and expression methods based on artificial intelligence (AI) technology. These methods focus more on knowledge reasoning; production rule knowledge representation, Bayesian network knowledge representation, logic-based knowledge representation, and neural network knowledge representation are the most representative methods. The second category comprises knowledge modeling methods based on semantic Web technology.

Among the latter, knowledge representation based on the semantic Web and ontologies, together with frame-based and object-oriented knowledge representation methods, are typical. Of the two categories, the knowledge representation methods based on AI technology are more inclined toward the logical relationships between knowledge and structure, and the represented knowledge can be derived through logical reasoning, whereas the semantic Web-based methods emphasize the description of knowledge, which is convenient for knowledge classification and later query and retrieval.

The reference design and the implementation design shown in Figure 3 are the two designs whose equivalence Formality checks. The reference design serves as the standard for comparison: it should be a functionally correct design, the form of the design at some stage before the implementation. The implementation design is the changed design, the one that needs to be verified as equivalent to the previous design. For example, the gate-level design after synthesis is the implementation design, and the RTL design is the reference design.

4.2. Complexity Quantification Post-Processing

The complexity quantification framework provides the possibility for the rapid development of IFC-based applications to support the complex business processes of the civil engineering industry. During the development process, software developers only need to add the binary files of IFC Java Toolbox to the path of the software project to realize the integration of Toolbox and the software and then achieve full access to the logical equivalent model based on IFC. If you bind Toolbox in the initial stage of logical equivalent model creation, you can also read, write, modify, and create a Java-based IFC model. We use IFC Java Toolbox to achieve complete access to the IFC-based logical equivalent model, as well as to read, write, modify, and create IFC files, mainly through the following modules of IFC Java Toolbox.

When a pedestrian updates its position, it selects the target grid point for the next moment according to the probabilities in the preference matrix. If the target grid point is empty and no other pedestrian moves to it, the move is executed. If the target grid point is occupied, the pedestrian stands still. If several pedestrians choose the same target grid point, a position conflict arises; the probability of moving to the target grid point must then be calculated, and the pedestrian with the higher relative probability in Table 4 occupies it.
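A minimal Python sketch of this conflict-resolution step, with hypothetical pedestrians, target cells, and preference probabilities (not the study's actual parameters):

```python
# Hedged sketch: each pedestrian claims a target cell with a move probability;
# when several claim the same cell, the winner is drawn in proportion to the
# relative probabilities and the others stand still. All values are hypothetical.
import random
from collections import defaultdict

def resolve_moves(desired):
    """desired: {pedestrian_id: (target_cell, move_probability)}."""
    by_cell = defaultdict(list)
    for ped, (cell, p) in desired.items():
        by_cell[cell].append((ped, p))
    moves = {}
    for cell, claimants in by_cell.items():
        weights = [p for _, p in claimants]
        winner = random.choices(claimants, weights=weights, k=1)[0][0]
        moves[winner] = cell          # winner moves; the rest keep their positions
    return moves

desired = {"p1": ((3, 4), 0.6), "p2": ((3, 4), 0.3), "p3": ((2, 5), 0.8)}
print(resolve_moves(desired))
```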

In order for formality to perform complete verification, all comparison points should be verifiable. There must be a one-to-one correspondence between the reference design and the implementation design. However, in some cases, it is not necessary to achieve complete comparison when verifying consistency. For example, the implementation design has additional output ports, or the implementation logic or reference logic has additional nodes. The initial comparison point match will be matched according to the design object name, if the design object name is different. The phase of a node may be inverted, and this change will be recorded in the automatic setting file. When the automatic setting file is read in formality, the tool can recognize the phase reversal during the matching and verification of the comparison points. The SVF file is helpful for successful verification, but it can also be verified when there is no SVF file.

To better simulate the aggregation phenomenon, the "background field" can be divided into two types: the static field and the dynamic field. The static field S is related only to the pedestrian's movement environment and has nothing to do with the evolution time of the system or the pedestrian's movement state. The distance between each cell and the exit determines the static field value S: the larger S is, the closer the cell is to the exit. We extend the Burstedde cellular automaton model, adopt its method of calculating the static field, introduce local density, consider the dynamic changes in pedestrian density in the surrounding area, and establish a model that takes local density into account.
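A minimal sketch of such a static field, assuming a hypothetical grid size and exit position; larger S marks cells nearer the exit, as described above:

```python
# Hedged sketch: static field S depends only on each cell's distance to the exit,
# independent of time and of pedestrian states. Grid size and exit are hypothetical.
import numpy as np

def static_field(width, length, exit_cell):
    ys, xs = np.mgrid[0:width, 0:length]
    dist = np.hypot(ys - exit_cell[0], xs - exit_cell[1])
    return dist.max() - dist      # larger value for cells nearer the exit

S = static_field(width=10, length=15, exit_cell=(5, 0))
print(S.round(1))
```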

Numerical simulation studies the evacuation process in single and multiple exits, as well as the dynamic behavior of high-density crowds in turning and circular passages. The results show that the new model can well reproduce the macroscopic phenomena of group movement, arching, following, and stop-and-go during pedestrian movement in different scenes.

4.3. Structural Level Measurement

Traditional data mining is aimed at structured data, that is, data with a predefined model that can be logically expressed in a two-dimensional table structure and stored in a relational database after exchange or parsing; models described on the basis of the IFC standard fall into this category of structured data. However, not all data are structured. With the development of network technology, organizations are full of unstructured data such as text documents, Web pages, and emails. According to statistics, 80% of an organization's information is stored in the form of text, which is the most common form of information.

Compared with structured data, unstructured data are data that cannot conveniently be expressed logically in the two-dimensional table structure of a database. They are generally text, pictures, audio, video, and the like, described in unstructured natural language; their content cannot be parsed directly by a computer, and they are difficult to store and retrieve in a database. A common approach to storing unstructured data in a database is to create a data table with three fields: number, content keyword, and content summary. The content can then be referenced through the number field and retrieved through the content keywords. Semi-structured data lie between structured and unstructured data: their structure and content are intertwined, with no obvious separation.
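As an illustration of this storage scheme (the table and column names are hypothetical, not drawn from the paper), a small sketch using Python's built-in sqlite3:

```python
# Hedged sketch: a table with number, content keywords, and content summary,
# so unstructured quality texts can be referenced and retrieved by keyword.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE quality_doc (doc_no INTEGER PRIMARY KEY, "
    "keywords TEXT, summary TEXT)"
)
conn.execute(
    "INSERT INTO quality_doc VALUES (1, 'crack,beam,repair', "
    "'Inspection report on mid-span cracking of beam B12')"
)
# Retrieval through the content keywords field.
rows = conn.execute(
    "SELECT doc_no, summary FROM quality_doc WHERE keywords LIKE ?", ("%crack%",)
).fetchall()
print(rows)
```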

The object-oriented knowledge representation method expresses knowledge mainly through objects, classes, inheritance, encapsulation, interfaces, messages, and methods. This method holds that the most basic unit of knowledge is the object and that any form of knowledge can be decomposed into objects. Through data abstraction and information hiding, the method in Figure 4 can not only deduce knowledge from the general to the special but also generalize knowledge from the special to the general.
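A minimal object-oriented sketch, with hypothetical classes and attributes, showing how component knowledge can be encapsulated and specialized through inheritance:

```python
# Hedged sketch: general component knowledge in a base class, specialized knowledge
# in a subclass (general -> special); instances still answer general queries
# (special -> general). Classes and attributes are illustrative only.
class Component:
    def __init__(self, length, width, height, material):
        self.length, self.width, self.height = length, width, height
        self.material = material

    def volume(self):
        return self.length * self.width * self.height

class Beam(Component):                      # special knowledge inherits general knowledge
    def __init__(self, length, width, height, material, span):
        super().__init__(length, width, height, material)
        self.span = span

b = Beam(length=6.0, width=0.3, height=0.6, material="C30 concrete", span=5.7)
print(b.volume(), isinstance(b, Component))
```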

The signal is divided into many small time intervals, and each interval is analyzed by a Fourier transform to determine the frequencies that occur within it. Equivalence verification requires the relevant preparation steps and the support of library cells, such as the library files for standard cell libraries, level shifter cells, and isolation cells. When running Formality, all the relevant logic library files need to be read in; they should contain the definitions of the corresponding logic functions, power pins, and power shutdown behavior. Before reading in the reference design and the implementation design, the SVF file generated during synthesis should be read in. The SVF file contains information such as design optimizations and renamings performed during synthesis, which improves verification performance and helps the verification succeed.
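The first sentence describes a short-time Fourier analysis; a minimal sketch with a synthetic signal (sampling rate, window length, and test frequencies are illustrative assumptions) is:

```python
# Hedged sketch: cut the signal into short intervals, Fourier-transform each one,
# and report the dominant frequency per interval; the set of spectra shows how
# the frequency content changes over time. The signal here is synthetic.
import numpy as np

fs = 1024                                   # sampling rate (Hz), hypothetical
t = np.arange(0, 2.0, 1 / fs)
x = np.where(t < 1.0, np.sin(2 * np.pi * 8 * t), np.sin(2 * np.pi * 64 * t))

win = 256                                   # samples per time interval
for start in range(0, len(x) - win + 1, win):
    seg = x[start:start + win] * np.hanning(win)
    spectrum = np.abs(np.fft.rfft(seg))
    freqs = np.fft.rfftfreq(win, 1 / fs)
    peak = freqs[np.argmax(spectrum)]
    print(f"t = {start / fs:.2f} s  dominant frequency = {peak:.1f} Hz")
```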

5. Application and Analysis of Civil Engineering Complexity Structure Measurement Model Based on Logical Equivalence

5.1. Logical Equivalent Data Preprocessing

The experiment divides the constructed logically equivalent rule knowledge base into eight "tables" to store data: the main body table, the secondary main body (sub-subject) table, the direct parameter table, the indirect parameter table, the basis specification table, the accident type table, the handling measure table, and the risk level table; certain logical relationships hold among these tables in the database. When we extract a main body or sub-subject from the database, its corresponding direct and indirect parameters are extracted at the same time, and by comparing these parameters we can judge whether the main body (or sub-subject) is a source of danger. Once it is confirmed as a source of danger, the applicable specifications, accident types, and handling measures are output in this logical order.

As the scope of the sample data shrinks, the scale of the civil engineering network also becomes smaller. Generally, it is easier to find high-level transportation hubs or core nodes in large-scale civil engineering networks, and some sub-hubs or sub-centers are also arranged locally. In a medium-scale civil engineering network that forms part of the large-scale network data, the level of the core nodes is slightly lower, the number of secondary central nodes is limited, and the connectivity values are not particularly high.

In other words, the heterogeneity of nodes is highest in the large-scale civil engineering network (i.e., the differences in road connectivity values are greatest), intermediate at the medium scale, and lowest in the small-scale road network. From the basic principle and implementation of the box covering method, it can be seen that the higher the heterogeneity of the network, the faster the covering may converge. In other words, as the box size increases, the minimum number of boxes required to cover the network in Figure 5 decreases faster for networks with high heterogeneity, and thus the fractal dimension of the box covering structure becomes higher.
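A minimal sketch of the box covering idea, using a greedy ball-covering approximation on a synthetic scale-free graph rather than the paper's road networks or its exact algorithm:

```python
# Hedged sketch: cover the network with topological "balls" of increasing radius,
# count the (approximate, greedy) number of boxes needed, and estimate a fractal
# dimension from the slope of log N_B versus log r_B. Not an exact minimum cover.
import networkx as nx
import numpy as np

def greedy_box_count(G, radius):
    uncovered = set(G.nodes)
    boxes = 0
    while uncovered:
        center = next(iter(uncovered))
        ball = nx.single_source_shortest_path_length(G, center, cutoff=radius)
        uncovered -= set(ball)
        boxes += 1
    return boxes

G = nx.barabasi_albert_graph(500, 2, seed=1)   # synthetic stand-in for a dual graph
radii = [1, 2, 3, 4, 5]
counts = [greedy_box_count(G, r) for r in radii]
slope, _ = np.polyfit(np.log(radii), np.log(counts), 1)
print("estimated box-covering dimension d_B =", round(-slope, 2))
```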

The first is the acquisition of target knowledge. For the knowledge that this research needs to acquire, that is, the current construction safety regulations, the norms are explicit knowledge expressed through text, images, and symbols, so the method of acquisition is relatively simple. The second step is knowledge expression, that is, to transform the acquired knowledge and store the knowledge in the established knowledge base using computer language. The third step is the application of knowledge; that is, knowledge is extracted from the established knowledge base, and the problems encountered are solved through the extracted knowledge. The totality of these spectra represents how the spectrum changes over time.

In fact, in this process, the computer plays the role of an expert in solving practical problems. Using the knowledge engineering method, we convert the current construction safety specifications extracted above into a knowledge representation form that can be read by a computer, building a knowledge base of current construction safety specifications for subsequent rule inspections.

Each component has its own unique identifier, the Element ID. When Revit is operated through Dynamo, the Element ID is exported as a condition item in the ".csv" file together with the component name, geometric dimensions, elevation parameters, and other information; this information is used for rule inspection in the safety rule knowledge base. Whether in the ".csv" file or in the logically equivalent model, the Element ID always accompanies the component information.

Therefore, the Element ID can serve as a bridge between the safety rule check results and the visual rule check platform. The chosen rule check platform must be able to link to an external database. We import the original logically equivalent model into the visual rule check platform and at the same time link the platform to the safety rule knowledge base in which the rule check has been completed, so that, through Element ID matching, the knowledge base entries carrying Element ID information and specific rule check results can be mapped onto the corresponding logically equivalent model components imported into the platform.

5.2. Simulation of Structural Measurement of Civil Engineering Complexity

In the simulation, compared with the frame structure system, the shear wall structure system is a structure of medium-to-high rigidity, and the lateral displacement of the building under horizontal load is significantly reduced, while the lateral stiffness of the frame system is very small compared with that of the shear wall system. The plan layout of the shear wall structure system is free and flexible. When adding a load, the 3D analysis view or the analysis view of each story can be used; in the analysis view, the components to be loaded can easily be selected. Before structural analysis, the load information can be tallied in a schedule. If a load value needs to be modified at this point, it only needs to be changed in the schedule, and the system automatically reflects the modification in the model rather than requiring the model itself to be edited. This makes the entire design process simple and efficient while saving engineers time and letting them focus on design.

The internal space can be divided into an aisle area and a seating area, but different layouts have different numbers of horizontal and vertical aisles. In this study, the cinema hall is divided into a two-dimensional grid of W × L cells with equal exit widths, where W represents the number of cells in the width direction and L the number of cells in the length direction. Each cell measures 0.4 m × 0.4 m, the area occupied by a single pedestrian, and each cell is either occupied by a pedestrian or empty. Among the twelve sample data sets, the box covering structure fractal dimension of nine samples is close to the degree volume structure fractal dimension, while the volume structure fractal dimension is significantly smaller than the other two. Suppose x is an M-dimensional parameter vector and its density p(x) is the prior probability density.

This is determined by the definition of the structural fractal dimension. As mentioned above, the volume structure fractal dimension does not consider differences between nodes, whereas the degree volume structure fractal dimension is equivalent to giving each node a weight, namely the node's degree value. Since the civil engineering network is scale-free, this weight may be very large, reaching hundreds or even thousands, which directly affects the calculated degree volume structure fractal dimension in Figure 6.
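A minimal sketch contrasting the two definitions on a synthetic network (not the paper's data): the volume dimension counts reachable nodes within a radius, while the degree volume dimension weights each reachable node by its degree value.

```python
# Hedged sketch: estimate the volume and degree-volume dimensions from the growth
# of (degree-weighted) node counts within increasing topological radii.
import networkx as nx
import numpy as np

def volume_growth(G, radii, weighted):
    means = []
    for r in radii:
        vals = []
        for v in G.nodes:
            ball = nx.single_source_shortest_path_length(G, v, cutoff=r)
            vals.append(sum(G.degree(u) for u in ball) if weighted else len(ball))
        means.append(np.mean(vals))
    return means

G = nx.barabasi_albert_graph(300, 2, seed=3)   # synthetic scale-free network
radii = [1, 2, 3, 4]
d_vol = np.polyfit(np.log(radii), np.log(volume_growth(G, radii, False)), 1)[0]
d_deg = np.polyfit(np.log(radii), np.log(volume_growth(G, radii, True)), 1)[0]
print(f"volume dimension ~ {d_vol:.2f}, degree-volume dimension ~ {d_deg:.2f}")
```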

A civil engineering project is a multidisciplinary complex with wide professional coverage and many work subjects. Simply relying on the main body table therefore sometimes fails to pinpoint the specific information and location. For example, an opening as a main body can exist in civil engineering in forms divided into vertical openings and non-vertical openings, and scaffolding as a main body can be divided into portal scaffolding, bowl-buckle steel pipe scaffolding, bamboo scaffolding, and so on. We regard these different existence forms of the same main body as sub-subjects: the sub-subject is the specific manifestation of the main body, so a sub-subject table is designed in the knowledge base to reflect the specific existence forms of the main body.

In the box covering method, although the heterogeneity of nodes, that is, the difference in node degree values, is considered in the box covering, this difference only affects the number of nodes contained in each box and does not affect the minimum number of boxes required to cover the network. In other words, the differences between the boxes themselves are not directly reflected in the calculation process of the box covering method. Civil engineering networks differ in the diversity of their connectivity, so the degree to which node heterogeneity influences the corresponding structural fractal dimension also differs.

The fractal dimension measurements of the three structures in different regions are very similar. The value ranges of the box covering structure fractal dimension, the volume structure fractal dimension, and the degree volume structure fractal dimension for the twelve measured data sets are [3.0987, 4.7685], [3.1460, 4.3539], and [3.9573, 5.2727], with mean values of 3.9464, 3.7410, and 4.5958, respectively. These results show that the topological fractal dimension of a civil engineering network is greater than its geometric fractal dimension. As is well known, the geometric fractal dimension of a civil engineering network lies between 1 and 2, whereas the topological fractal dimensions here are all above 3. This is consistent with the definition of fractal dimension: the geometric fractal dimension describes the capacity of geographic objects to fill two-dimensional space, while the topological structure can exceed the limits of geometric space and fill higher-dimensional space.

5.3. Case Application and Analysis

After the calculation of the civil engineering model is completed, the calculation results can be analyzed to adjust the cross section or structural layout of the unreasonable components. After multiple calculations, the modified structural model data can be fed back to the Revit model to obtain a reasonable and effective model. After entering the load transfer interface, we can check the basic structure analysis model settings such as the geometry, constraints, load conditions and combinations, and loads of the structure. You can choose to display the load settings under different load conditions layer by layer or as a whole to avoid serious errors when imported into the structural analysis software.

After the rule is executed, the result of the rule check is output. According to the check result, it can be judged whether the logically equivalent component meets the safety standard. If it does, the rule matching and rule execution described above are performed on the next component in the ".csv" file. Through a sensor, an N-dimensional observation vector z is generated, and the task of parameter estimation is to use z to restore the original parameter vector x.

If the result of the rule inspection judges that the logically equivalent component does not meet the safety standard, the safety inspection report will be output in the safety rule knowledge base, and the report content will include “accident type,” “safety specification,” “processing measures,” and “risk level” items, which are embedded in the content of the condition result table of the safety rule knowledge base.

The rule check for direct parameters is relatively simple: explicit parameter information such as the geometric dimensions and elevation of the accident subject is compared against the "direct parameter table" in the safety rule knowledge base to judge whether the logically equivalent component meets the safety standard. For the rule check on indirect parameters, we first extract the calculation condition values of the indirect parameters from the ".csv" file, such as the length and cross-sectional dimensions of the formwork support column, and then use the calculation formula embedded in the "indirect parameter table" rules to determine whether the slenderness ratio of the support column is safe.
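A minimal sketch of such an indirect-parameter check, assuming a circular steel tube section and a hypothetical allowable slenderness ratio (the knowledge base's actual formula and limit are not given in the text):

```python
# Hedged sketch: compute the slenderness ratio of a formwork support column from
# its length and section, then compare against an assumed allowable limit.
import math

def slenderness_ratio(length_mm, outer_d_mm, wall_t_mm):
    """lambda = effective length / radius of gyration for a circular steel tube."""
    d, t = outer_d_mm, wall_t_mm
    area = math.pi / 4 * (d**2 - (d - 2 * t) ** 2)
    inertia = math.pi / 64 * (d**4 - (d - 2 * t) ** 4)
    return length_mm / math.sqrt(inertia / area)

LIMIT = 210                               # hypothetical allowable slenderness ratio
lam = slenderness_ratio(length_mm=1800, outer_d_mm=48, wall_t_mm=3.5)
print(f"lambda = {lam:.1f} ->", "safe" if lam <= LIMIT else "unsafe")
```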

We import the ".csv" file into the structured safety rule knowledge base in Figure 7. The first step is rule matching against the knowledge base; the matching condition is the "component name" in the ".csv" file, and the matching scope is the safety rule knowledge base. For example, if the component name is "opening" (dongkou), it matches the main body with subjectID "4" in the main body table of the knowledge base; if the component name is "wood scaffolding," it matches the sub-subject with subjectID "10" in the sub-subject table, which is in turn associated with the "scaffold" main body with subjectID "7" in the main body table, completing the identification of the accident subject. The second stage of rule matching is to obtain the attribute parameter information from the ".csv" file.
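A minimal sketch of this first matching stage, with the table contents reduced to the example IDs mentioned in the text and everything else hypothetical:

```python
# Hedged sketch: match component names from the ".csv" export against main-body
# and sub-subject tables to identify the accident subject. Table contents follow
# the text's example IDs; the structure and file contents are illustrative.
import csv, io

main_table = {4: "opening", 7: "scaffold"}
sub_table = {10: {"name": "wood scaffolding", "parent_id": 7}}

def match_subject(component_name):
    for sid, name in main_table.items():
        if name == component_name:
            return {"subject_id": sid, "sub_subject_id": None}
    for ssid, row in sub_table.items():
        if row["name"] == component_name:
            return {"subject_id": row["parent_id"], "sub_subject_id": ssid}
    return None

rows = csv.DictReader(io.StringIO("ElementID,name\n301542,wood scaffolding\n"))
for row in rows:
    print(row["ElementID"], match_subject(row["name"]))
```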

This can be done by an estimation function. In addition, the conditional probability density p(z|x) gives the relationship between the parameter vector and the observation vector. At this stage, the safety rule knowledge base identifies each piece of parameter information as a "direct parameter" or an "indirect parameter condition value" according to the parameter structure; that is, after the accident subject has been identified, the parameter information is extracted and classified so that parameter verification can be carried out.
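Written out under this notation, with the prior p(x) and likelihood p(z|x) from the text, the Bayesian relation that links the observation vector to the parameter vector is given below (the MAP estimator shown is one illustrative way to "restore" x from z, not necessarily the paper's estimator):

```latex
% Posterior of the parameter vector x given the observation vector z (Bayes' rule),
% and a maximum a posteriori (MAP) estimate as one way to recover x from z.
p(x \mid z) = \frac{p(z \mid x)\, p(x)}{\int p(z \mid x')\, p(x')\, \mathrm{d}x'},
\qquad
\hat{x}_{\mathrm{MAP}} = \arg\max_{x}\; p(z \mid x)\, p(x).
```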

6. Conclusion

In this study, the civil engineering structure model is sent to different structural analysis software packages, and the model complexity in the different analysis packages is compared and analyzed; in particular, the use of analysis software that is closely integrated with the Revit structural modeling core is examined in detail. At the same time, logical equivalent technology is applied to assist and optimize quality management work, promoting the informatization and standardization of quality management and improving its efficiency. By analyzing the structural fractal characteristics of civil engineering networks of different forms, different scales, and multiple scales, and by calculating the box covering structure, volume structure, and degree volume structure fractal dimensions, the comparative analysis of the obtained fractal dimension values shows that the civil engineering network is a typical structural fractal body. The box covering structure fractal dimension decreases as the scale decreases, and the volume-type fractal dimensions are closely related to the compactness and heterogeneity of the network. The paper further confirms that morphology affects structure: the shape of the civil engineering network affects the integrity, heterogeneity, accessibility, and compactness of its structure, and the three structural fractal dimensions jointly describe this influence from different angles. Finally, applying big data analysis to mine the value of logically equivalent quality management big data will help consolidate civil engineering quality management experience and lessons and improve quality.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by Shaanxi Dijian Real Estate Development Group Co., Ltd.