Abstract

Based on the selected indicators, an evaluation index system for convergent media communication efficiency is constructed, together with its corresponding hierarchical structure model. Using the 1–9 scale of the analytic hierarchy process (AHP), an expert questionnaire on indicator weights is designed, the indicators are scored, and pairwise comparison judgment matrices are built from the data collected through expert consultation. The YAAHP analytic hierarchy process software is used to compute the matrix results and run the consistency test: a matrix whose result falls within the consistency range passes, while a failing matrix is fine-tuned and retested, after which the weight vector is calculated to obtain the final indicator weights. After studying a class of matrix-factorization methods that enhance the independence of random matrix column vectors, Gram-Schmidt orthogonalization of the random matrix is proposed to minimize the intrinsic correlation of the matrix. After analyzing the reconstruction characteristics of the orthogonal matching pursuit (OMP) algorithm, a new signal reconstruction algorithm is given by combining the new optimization method with the reconstruction algorithm. Drawing on relevant theories of physics and molecular dynamics, as well as descriptions from information fusion theory, this paper introduces the basic concepts of “energy” and “collision” and constructs an energy description model for data objects before fusion and a collision model for data objects during fusion. Applied to the radio mobile client log access and record data resources that have completed situation assessment and state transition, the approach effectively handles the many low-efficiency regional data object processing problems generated during data object fusion and deepens the fusion of data objects.

1. Introduction

In recent years, with the development of the Internet and the popularization of smartphones, emerging media have developed rapidly, traditional media have been severely impacted, and county-level media have faced enormous pressure to survive [1]. Issues such as the loss of audiences and the sharp drop in advertising revenue constantly remind county-level media that, if they want to develop, they must find new paths. With the development of information technology and the growing number of netizens, the era in which everyone has a microphone has fully arrived [2, 3]. Netizens can publish their own news anytime and anywhere through mobile phones, for example by uploading live videos of breaking news. At the same time, rumors spread faster and more widely than before. The absolute advantage of traditional media in occupying the mainstream public opinion position is no longer guaranteed; only by promoting media integration can the dominance of public opinion be maintained.

In the process of promoting the integration of traditional media and emerging media, we must pay attention to the cultivation of Internet thinking and be aware of the respective advantages of traditional media and new media, so that they can learn from and complement each other. At the same time, we must pay attention to the power of technology, making full use of advanced technology to help high-quality content be better presented and more deeply rooted in the hearts of the people. We must respect the laws of news dissemination and media development; on that basis, we should promote the in-depth integration of traditional and new media in aspects such as content, channels, and management and create a batch of new mainstream media outlets and new media groups.

Precisely because the construction of convergence media centers has only just started, most current research on them remains at the theoretical level, such as analyses of the relevant policies proposed by the Party Central Committee for their construction or of the current state of construction [4–6]. There is a lack of investigation and discussion of how converged media centers are actually being built. Secondly, most current research articles on convergence media centers are written by front-line media practitioners; although they are concerned with the various practical problems in current construction, they lack theoretical support, and their analysis and summarization are not deep enough [7].

Based on the convergent media communication index system, the Delphi expert consultation questionnaire method is used, and the indicators are scored through a survey of experts from industry and academia. The hierarchical structure model of the indicators at all levels of the evaluation system is established with the AHP method, judgment matrices for pairwise comparison are constructed, and the consistency of each matrix level is tested. The relative weight of each evaluation indicator is determined through group decision-making, and the final specific scores of the convergent media communication efficiency index system are obtained through calculation. The column vectors of the random matrix must satisfy a certain degree of linear independence to ensure that two different K-term sparse signals are not mapped to the same sample set during sampling, so that enough information for signal reconstruction is preserved. This paper introduces two methods for reducing the linear correlation of random matrix column vectors, both of which do so indirectly by increasing the minimum singular value of the matrix. The performance evaluation of data object fusion based on energy collision converts the user’s thematic needs into a certain amount of applied energy through certain rules and applies it to data objects or data object fusions, so as to optimize the data object fusion bodies. When the resulting data object fusions, or the connections between them, are not enough to meet the user’s thematic needs, local or global “collision adjustment” is started, in which local data objects within a fusion collide in some form so that information that was originally concealed comes to the fore.

2. Related Work

Relevant scholars have put forward the idea of optimizing the random matrix to improve signal reconstruction by improving the matrix’s performance, but the performance gain of their proposed optimization method is not very obvious [8]. Other scholars improve random matrix performance by reducing the low-power average correlation, constructing a deterministic random matrix that satisfies the RIP criterion, and reducing the maximum column correlation by threshold iteration [9]. These improvements in random matrix performance greatly increase the signal recovery rate, providing a good theoretical basis for the practical application of compressed sensing.

In research on the influence of media forms, the West has moved from hard media determinism to soft media determinism [10–12]. McLuhan held that the medium is the message: the medium is the driving force that shapes the world, and the medium itself is the truly meaningful message; that is, human beings can only engage in communication and other social activities compatible with a medium once they possess it. Innis’s theory of media bias in Empire and Communications divides media into time-biased and space-biased media: the former can be preserved for a long time, while the latter can overcome the obstacles of space [13]. Although Innis and McLuhan had different theoretical perspectives, both fell into the misunderstanding of exaggerating the decisive role of media forms and ignored the influence of people and other factors. Since then, scholars led by Paul Levinson and Joshua Meyrowitz have gradually realized that the role of media is not a panacea and not the decisive factor in the development of everything; human agency also plays an active role in the communication process [14].

Relevant scholars have proposed that the establishment of converged media centers gives county-level media, long on the fringe of the industry, the opportunity to finally enter the focus of policy attention and obtain development opportunities supported by policy [15]. They believe that, compared with their foreign counterparts, domestic traditional media lack the kind of dependence on communities and target users that forms strong stickiness, and that the construction of converged media centers and the arrival of the Internet era can effectively make up for this.

Relevant scholars have pointed out that, owing to differences in culture, media environment, and ways of thinking around the world, scholars at home and abroad define the concept of “media fusion” differently [16]. Since the definition of media fusion is inconclusive, they put forward their own view: the development of digital and network technology has boosted media convergence, blurring the boundaries between different media and promoting their mutual integration [17].

Relevant scholars have proposed that media integration arises from the realistic needs of a changing era, competition between media and political pressure, and the transformation and renewal of a technological society [18]. On this basis, they argue that it is precisely the media industry’s continuous innovation in technology and markets that accelerates the renewal of the media industry system, thereby realizing market innovation.

Relevant scholars have pointed out that the rapid development of emerging media has brought subversive changes to the media ecosystem, and traditional media must achieve transformation and development [19]. They propose that, to improve the communication effect of TV media, it is first necessary to change the way audiences are linked, so that atomized audiences become users linked to the media; second, traditional media such as TV must integrate resources across platforms, expand communication channels, and use Internet thinking and cross-border thinking to achieve product innovation; at the same time, they must also integrate external capital, pay attention to the guiding role of the market, and build a media integration ecosphere.

Relevant scholars have pointed out a misunderstanding in discussions of media integration [20]. They believe that media integration is not shallow digitalization, nor simply transferring traditional media content to the Internet, but rather forming a digital mindset, exploring the impact of digital technology on the entire news industry, and adjusting and transforming without delay.

Relevant scholars believe that, in the era of media integration and development, the evolution of media formats places higher requirements on journalists and poses many new challenges to their professional quality [21]. They argue that journalists should continuously improve their ability to adapt to new forms of communication and raise their professional quality. Today, journalists are no longer professionals responsible only for gathering and writing traditional news products; in the convergent media context, they are required to become omnimedia talents with editing capabilities across multiple media [22].

3. Methods

3.1. Construction of the Theoretical Framework of the Communication Effectiveness of the Convergent Media

The effect at the cognitive level means that information changes the audience’s knowledge and original cognition; the effect at the psychological and attitudinal level means that it affects the audience’s thoughts and feelings; the effect at the behavioral level refers to the changes in audience behavior that follow from changes in cognition and attitude.

The information dissemination efficiency of converged media should therefore be comprehensively evaluated across the cognitive, attitudinal, and behavioral dimensions of the dissemination effect.

According to the communication characteristics of convergent media, this paper starts from the government–converged media–audience information dissemination process, combines it with the theoretical model of political communication, and incorporates the influencing factors of the five social functions of convergent media into the three levels of cognition, attitude, and behavior. As the main body of communication, converged media can, from the perspective of the social functions of the media, formulate and implement effective measures so that information spreads in a timely and effective manner. Whether their service supply and quality meet the audience’s value expectations and needs determines whether user loyalty and psychological identification can be built, and whether they can promote benign political communication between the government and the public and thus shoulder the dual responsibility of guiding public opinion and serving the audience. In view of the above, this paper starts from the three levels of cognition, attitude, and behavior, combines the five social functions of the media (environmental monitoring, cultural inheritance, information dissemination, relationship coordination, and social service), and relies on the political communication model to construct the communication efficiency model of converged media, as shown in Figure 1.

Taking convergent media communication effectiveness as the target layer and relying on the convergent media communication effectiveness model, the first-level indicators are divided into five dimensions: social communication, political communication, social service, environmental monitoring, and cultural value. The secondary indicators are subdivided through a review of relevant literature and indicator systems, combined with the characteristics of convergent media and the factors influencing its dissemination. They are readability, originality, depth, timeliness, quantity, government response, media interaction, user feedback, information notification, platform provision, economic environment, people’s livelihood environment, core value guidance, traditional value guidance, and diversified value guidance (such as the inheritance of science and education culture and regional culture), as shown in Table 1.
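To make the hierarchy concrete, the sketch below encodes the target layer, the five first-level dimensions, and the secondary indicators as a nested Python dictionary. The grouping of secondary indicators under each dimension is an illustrative assumption; the authoritative assignment is the one given in Table 1.

```python
# Two-level indicator hierarchy: target layer -> first-level dimensions
# -> secondary indicators. The grouping below is an illustrative
# assumption; the authoritative assignment is given in Table 1.
indicator_hierarchy = {
    "convergent media communication effectiveness": {
        "social communication": ["readability", "originality", "depth",
                                 "timeliness", "quantity"],
        "political communication": ["government response",
                                    "media interaction", "user feedback"],
        "social service": ["information notification", "platform provision"],
        "environmental monitoring": ["economic environment",
                                     "people's livelihood environment"],
        "cultural value": ["core value guidance", "traditional value guidance",
                           "diversified value guidance"],
    }
}
```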

3.2. Method of Determining the Indicator Weight

The analytic hierarchy process (AHP) is the main research method used in this study. Following the media communication efficiency model obtained above, the specific indicator composition is initially determined through literature review and expert consultation, the evaluation index system is then constructed, and the final indicators are obtained through group decision-making in the YAAHP analytic hierarchy process software.

In the evaluation of convergent media communication efficiency in this paper, the AHP is first used to establish the specific structural model of the evaluation system. Second, the 1–9 scale method proposed by Saaty (the scale table is shown in Table 1 in the appendix) is used to construct the judgment matrix for each layer of the structure, and a questionnaire on the assignment of indicator weights is designed. Third, the Delphi method is used to distribute the indicator-weight questionnaires to scholars for scoring. When the Delphi method is used to collect opinions, 15–20 experts are usually invited, and the number can be adjusted according to the specific problem in actual operation. Therefore, according to the research needs, 20 experts in related fields are invited in this paper, including converged media staff, scholars, and other media workers, to ensure that the questionnaire results are scientific and reasonable. Finally, the YAAHP software is used to calculate and sort the sample data and obtain the specific weight of each indicator in the evaluation of media communication efficiency.
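YAAHP performs the weight calculation and consistency test internally; as a minimal sketch of what that computation involves, the following Python fragment derives the weight vector from a pairwise judgment matrix by the eigenvector method and checks the consistency ratio CR = CI/RI against the usual 0.1 threshold. The 3 × 3 judgment matrix shown is hypothetical.

```python
import numpy as np

# Saaty's random consistency index (RI) for judgment matrices of order 1..9.
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
      6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def ahp_weights(judgment, cr_threshold=0.1):
    """Weight vector of a pairwise judgment matrix (eigenvector method)
    plus the consistency test CR = CI / RI <= 0.1 (valid for order >= 3)."""
    n = judgment.shape[0]
    eigvals, eigvecs = np.linalg.eig(judgment)
    k = int(np.argmax(eigvals.real))            # principal eigenvalue index
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                # normalized weight vector
    ci = (eigvals[k].real - n) / (n - 1)        # consistency index
    cr = ci / RI[n]                             # consistency ratio
    return w, cr, cr <= cr_threshold

# Hypothetical 3 x 3 judgment matrix on the 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
weights, cr, passed = ahp_weights(A)
print(weights, cr, passed)       # a failing matrix is fine-tuned and retested
```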

3.3. Random Matrix Optimization Design

The elements of a completely random matrix all independently obey a certain random distribution and are uncorrelated with most sparse signals, so the accuracy of the reconstructed signal is high. But it is precisely this complete randomness that makes hardware implementation difficult: the storage requirements of the matrix are large, and the complexity of actual operations is relatively high.

A Bernoulli random matrix is a matrix \(\Phi\) of size \(M \times N\) whose elements independently follow a symmetric Bernoulli distribution:

\[
\Phi_{ij} =
\begin{cases}
+1/\sqrt{M}, & \text{with probability } 1/2, \\
-1/\sqrt{M}, & \text{with probability } 1/2.
\end{cases}
\]

A Gaussian random matrix is a matrix \(\Phi\) of size \(M \times N\) whose elements are independent Gaussian variables with mean 0 and variance \(1/M\), namely \(\Phi_{ij} \sim \mathcal{N}(0, 1/M)\). Such a \(\Phi\) satisfies the RIP property with high probability provided that

\[
M \ge c\,K \log(N/K),
\]

where \(c\) is a small constant and \(K\) is the signal sparsity.
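A minimal sketch of the two constructions, assuming the standard normalizations above (entries \(\pm 1/\sqrt{M}\) for the Bernoulli matrix and variance \(1/M\) for the Gaussian matrix):

```python
import numpy as np

def bernoulli_matrix(M, N, rng=None):
    """M x N matrix with i.i.d. entries +1/sqrt(M) or -1/sqrt(M),
    each taken with probability 1/2."""
    rng = np.random.default_rng() if rng is None else rng
    return rng.choice([1.0, -1.0], size=(M, N)) / np.sqrt(M)

def gaussian_matrix(M, N, rng=None):
    """M x N matrix with i.i.d. N(0, 1/M) entries (std = 1/sqrt(M))."""
    rng = np.random.default_rng() if rng is None else rng
    return rng.normal(0.0, 1.0 / np.sqrt(M), size=(M, N))
```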

Inspired by representative random matrices, researchers have sought better encoding methods to optimize the elements of random matrices. Random matrix designs based on chaotic sequences and on m-sequences have since appeared; these methods inherit some advantages of structured matrix design and perform well in related application fields. In addition, the optimization of existing random matrices opens a new way to improve random matrix performance.

Singular value decomposition (SVD) is widely used in principal component analysis, image compression, least squares solutions, and generalized matrix inversion. Based on a study of the singular value distribution after decomposition, an SVD-based method for the random matrix is proposed to raise the minimum singular value of the matrix and improve its reconstruction accuracy.

Let \(A \in \mathbb{C}^{m \times n}\) with \(\mathrm{rank}(A) = r\); then there exist an \(m\)-order unitary matrix \(U\) and an \(n\)-order unitary matrix \(V\) such that

\[
A = U \Sigma V^{H}, \qquad
\Sigma = \begin{pmatrix} \Sigma_r & 0 \\ 0 & 0 \end{pmatrix}, \qquad
\Sigma_r = \mathrm{diag}(\sigma_1, \dots, \sigma_r),
\]

where \(\sigma_1 \ge \sigma_2 \ge \dots \ge \sigma_r > 0\) are the singular values of \(A\).

The singular values of a matrix are closely related to its norms: the 2-norm of a matrix is its largest singular value, and the F-norm of a matrix is the square root of the sum of the squares of all its singular values.

The number of nonzero singular values equals the rank of the matrix. The larger a singular value, the greater the amount of information it carries. For example, in image compression, after singular value decomposition the largest singular values are retained and the smaller ones are discarded, and little of the image’s information is lost.
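The norm identities and the truncation behavior described above can be checked numerically; the following sketch verifies them with NumPy on a small random matrix (the matrix size and the truncation rank k = 3 are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(6, 8))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# 2-norm = largest singular value; F-norm = sqrt of sum of squared values.
assert np.isclose(np.linalg.norm(A, 2), s.max())
assert np.isclose(np.linalg.norm(A, "fro"), np.sqrt(np.sum(s**2)))

# Rank-k truncation keeps the k largest singular values; the information
# lost (F-norm of the error) is exactly the energy of the discarded ones.
k = 3
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
assert np.isclose(np.linalg.norm(A - A_k, "fro"), np.sqrt(np.sum(s[k:]**2)))
```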

Singular values are rotation invariant: if \(P\) is a unitary matrix, then \(PA\) and \(A\) have the same singular values. The random matrix optimization method based on singular value decomposition generates a new random matrix by averaging the singular values obtained from the decomposition:

Performing the SVD on the generated random matrix \(\Phi\) of size \(M \times N\) gives

\[
\Phi = U \Sigma V^{H}, \qquad \Sigma = \mathrm{diag}(\sigma_1, \dots, \sigma_M);
\]

replacing each singular value with the mean \(\bar{\sigma} = \frac{1}{M}\sum_{i=1}^{M}\sigma_i\) yields \(\bar{\Sigma}\), and the optimized matrix is \(\hat{\Phi} = U \bar{\Sigma} V^{H}\).

The minimum singular value of the new matrix \(\hat{\Phi}\) is increased, the linear independence between the matrix column vectors is enhanced, and the signal reconstruction quality of the new matrix is improved accordingly.
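A minimal sketch of the singular-value-averaging step, assuming all singular values are replaced by their mean as described above; since the mean is at least as large as the minimum, the smallest singular value can only increase:

```python
import numpy as np

def svd_average_optimize(Phi):
    """Replace every singular value of Phi by their mean, raising the
    minimum singular value while keeping the singular vectors unchanged."""
    U, s, Vt = np.linalg.svd(Phi, full_matrices=False)
    return U @ np.diag(np.full_like(s, s.mean())) @ Vt
```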

3.4. Random Matrix Optimization Based on Gram-Schmidt Orthogonalization

The random matrix optimizations based on SVD decomposition described above all optimize the matrix by increasing its minimum singular value, based on the relationship between the linear independence of the matrix column vectors and the minimum singular value of the matrix.

In this section, by analyzing the characteristics of the OMP reconstruction algorithm, we propose applying Gram-Schmidt orthogonalization to the random matrix (which can be regarded as composed of redundant base atoms) to maximize the linear independence between its column vectors. The processed matrix performs excellently when the OMP algorithm is chosen for reconstruction.

Gram-Schmidt orthogonalization (also called Schmidt orthogonalization) is a classical basis orthogonalization algorithm that converts a set of linearly independent vectors into a set of orthonormal vectors in Euclidean space. Schmidt orthogonalization plays an important role in advanced algebra, and the use of an orthonormal basis improves the numerical stability of computations. Classical Schmidt orthogonalization deals with a set of linearly independent vectors, while the random matrix \(\Phi\) to be processed in this paper contains N column vectors of which only M are linearly independent.

For an \(M \times N\) random matrix \(\Phi\), there are N column vectors but only M of them are linearly independent; the remaining N−M column vectors can be represented linearly by these M columns, that is, the matrix contains a great deal of redundant information. If the redundant information is removed as far as possible, the most concise representation of the signal is obtained, and the incoherence between the column vectors of \(\Phi\) is maximized. This analysis takes the perspective of signal representation and measurement; however, sampling based on compressed sensing theory is an undersampling system, and the redundant information contained in these column vectors can be put to reasonable use during signal reconstruction, which enhances the robustness of reconstruction. There is no established criterion in random matrix optimization for how much of the redundant information in the column vectors should be retained, so this trade-off is difficult to control, and removing the redundant information to the maximum extent may cause some reconstruction algorithms to fail.

In each iteration, the orthogonal matching pursuit (OMP) algorithm searches for the column of the random matrix most similar to the current signal residual. If the random matrix \(\Phi\) were fully processed by Gram-Schmidt orthogonalization, the similarity with each signal residual would be maximized; in practice, most of the information after linear measurement is concentrated in the low-frequency band of the signal (the first M columns of the measurement result). At the same time, OMP was designed to improve on the slowly converging matching pursuit algorithm by orthogonalizing the selected column vectors after each iteration, which coincides with the idea of Gram-Schmidt orthogonalization. Therefore, when OMP is selected as the reconstruction algorithm, performing Gram-Schmidt orthogonalization on the random matrix columns yields better measurements and better reconstruction quality.
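For reference, a minimal textbook sketch of the OMP loop described above (greedy atom selection followed by a least-squares re-fit on the selected columns, which keeps the residual orthogonal to the chosen atoms); this is a generic implementation, not the exact variant used in the experiments:

```python
import numpy as np

def omp(Phi, y, K):
    """Orthogonal matching pursuit: recover a K-sparse x from y = Phi @ x.
    Each iteration picks the column most correlated with the residual and
    re-solves a least-squares problem on the selected columns."""
    M, N = Phi.shape
    support, residual = [], y.astype(float).copy()
    for _ in range(K):
        j = int(np.argmax(np.abs(Phi.T @ residual)))   # best-matching atom
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef          # orthogonal residual
    x = np.zeros(N)
    x[support] = coef
    return x
```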

For \(\beta_i\), the following recursion is used:

\[
\beta_1 = \alpha_1, \qquad
\beta_i = \alpha_i - \sum_{j=1}^{i-1} \frac{\langle \alpha_i, \beta_j \rangle}{\langle \beta_j, \beta_j \rangle}\, \beta_j, \quad i = 2, 3, \dots, N,
\]

where \(\alpha_i\) denotes the \(i\)-th column vector of \(\Phi\).

After column normalization of the \(\beta_i\), a new random matrix is formed. The linear independence between the column vectors of the new matrix is enhanced, and its reconstruction performance is excellent when the OMP algorithm is selected for signal reconstruction.
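A minimal sketch of the orthogonalization-plus-normalization step, assuming a classical (non-pivoted) Gram-Schmidt pass over the columns. Note that for N > M the trailing columns are linearly dependent on the earlier ones, so their residuals vanish numerically and are left unnormalized here; this is exactly the redundancy trade-off discussed above.

```python
import numpy as np

def gram_schmidt_optimize(Phi, eps=1e-12):
    """Gram-Schmidt pass over the columns of Phi followed by column
    normalization. The first M independent columns become orthonormal;
    columns linearly dependent on earlier ones leave a near-zero
    residual, which is left as-is."""
    B = Phi.astype(float).copy()
    N = B.shape[1]
    for i in range(N):
        for j in range(i):
            # Subtract the projection onto the (already normalized) beta_j.
            B[:, i] -= (B[:, j] @ B[:, i]) * B[:, j]
        norm = np.linalg.norm(B[:, i])
        if norm > eps:
            B[:, i] /= norm          # column normalization of beta_i
    return B
```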

3.5. Efficiency Evaluation Process after Data Object Fusion

Through the preliminary results of data object fusion, the approximate distribution of the data objects contained in each fusion is clarified, and the “core” of each fusion is determined, i.e., the data object (or data object model) that can represent the corresponding fusion.

The performance evaluation process of data object fusion based on energy collision proposed in this paper reduces a high-dimensional data space to a low-dimensional space by a specific method, uses an ordinary data object fusion method, and introduces the energy and collision mechanism. Therefore, whichever fusion method is adopted has little impact on the subsequent stages; it is only necessary to describe each resulting fusion in enough detail to determine its “core.”

The user theme is received and understood, the user’s thematic requirements are refined and distilled and converted into a specific energy value according to certain rules, and the energy is applied, according to need, to the data objects in a fusion body or to the core of a fusion body. Depending on the energy value applied by the user, collisions of different forms occur between data objects, between data objects and fusion cores, or between fusion bodies, generating new fusion results.

While applying energy to the data objects in fusion bodies, the user must also evaluate the isolated data objects produced by the preliminary fusion. With the specific energy applied by the user, an isolated data object may be fused into a particular fusion body, or a data object inside a fusion body may become isolated.

The performance evaluation process of data object fusion based on energy collision brings the user’s thematic needs into the evaluation, so that the process can deliberately exclude data objects not required by those needs and can change dynamically as the needs change.

Optimizing the original fusion results in different ways can not only blur or sharpen the boundaries between fusion bodies as required, but also let some fusion bodies spontaneously split or merge with similar ones, thereby increasing or reducing the number of fusions, and enable purposeful reactions between different fusion bodies to generate new ones.

The performance evaluation process of data object fusion based on energy collision applies the energy derived from the user’s thematic needs to a data space that already has a preliminary fusion shape and promotes continuous “collision” processing of the data objects. In the collision process, the collision form changes as the user’s thematic needs change, thereby describing the relationship between fusion bodies. The process also considers single data objects: driven by the energy of the user’s thematic needs, a data object within a fusion body can become isolated, and an isolated data object can join a specific fusion body. Figure 2 shows a schematic diagram of the performance evaluation process of data object fusion based on energy collision.

4. Results and Analysis

4.1. Collision of Outlier Data Objects

Because outlier data objects differ markedly from the behavior or model of the data objects in other fusions, they are often regarded as noise or abnormal data and discarded.

However, such arbitrary discarding ignores the information contained in outliers that meets the user’s thematic needs (in some special applications, rare events may be more meaningful than normal occurrences) and is not conducive to a comprehensive performance evaluation of the data objects.

For the performance evaluation of outliers, the basic principle is to follow the dynamic changes of user needs: the states of outliers and of data objects within fusions change accordingly. A non-isolated data object may, under the action of the user demand topic, separate from the ground-state energy surface of its fusion body and become an isolated point; conversely, an isolated point may, under the same action, enter a fusion body and cease to be isolated.

During the collision process, a data object in some fusion body of the preliminary result acquires energy reaching a certain limit value, so that it separates from the ground-state energy surface of its fusion body, but its energy does not allow it to enter the ground-state energy surface of any other fusion body. This kind of collision, turning a non-isolated data object into an isolated one, is called a dissociative collision.

As the subject of user needs is refined, when an isolated data object is given a certain amount of energy, if it can enter the ground-state energy surface of some fusion bodies, it becomes a non-isolated data object; by comparing its correlation energy with each of the qualifying fusion bodies, the fusion body to enter is determined (usually the one with the largest correlation energy). In Figure 3, the data object is an isolated point; after a sticky collision, it enters one of the fusion bodies, as shown in Figure 4.

During the collision process, a data object in some fusion body may likewise acquire energy reaching the limit value and separate from the ground-state energy surface of its fusion body, but this time its energy allows it to enter the ground-state energy surface of another fusion body. This transformation of a data object from one fusion body into another is called a jump collision.

In a jump collision, under the influence of the user’s demand theme, negative energy is applied to a data object in a fusion body, reducing its relative energy below a certain threshold so that it separates from the original fusion body.

By virtue of its remaining energy, however, it can still enter the ground-state energy surface of another fusion body (usually the one with the largest correlation energy with the object), completing the jump collision process.
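The three collision forms are described qualitatively above; the sketch below is a purely illustrative decision rule under assumed quantities (an object’s energy, a fusion body’s ground-state entry energy, a correlation_energy function, and a separation threshold), none of which are fixed by the paper.

```python
from dataclasses import dataclass

@dataclass
class Fusion:
    entry_energy: float        # hypothetical ground-state entry energy

@dataclass
class DataObject:
    energy: float              # hypothetical intrinsic energy

def collide(obj, current_fusion, fusions, applied_energy, threshold,
            correlation_energy):
    """Classify the collision triggered by applying energy to `obj`.
    Returns (collision_form, target_fusion)."""
    e = obj.energy + applied_energy
    # Fusion bodies whose ground-state energy surface the object can enter.
    reachable = [f for f in fusions
                 if f is not current_fusion and e >= f.entry_energy]
    best = (max(reachable, key=lambda f: correlation_energy(obj, f))
            if reachable else None)
    if current_fusion is None:                 # isolated data object
        return ("sticky", best) if best else ("none", None)
    if e < threshold:                          # not enough to separate
        return ("none", current_fusion)
    # Separated from its fusion body: jump if another surface is reachable,
    # otherwise it becomes an isolated point (dissociative collision).
    return ("jump", best) if best else ("dissociative", None)
```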

4.2. Basic Analysis of Preliminary Fusion

Principal components were extracted using the distributed principal component transformation (DPCT) method based on the Lambda architecture, and the preliminary fusion results of the data objects were obtained with a basic density-based algorithm. For ease of observation, a perspective view spanned by the first and second principal components is given, as shown in Figure 5.

The feature of Fusion 1 is that all of its data objects fall in the first month of the initial stage of the streaming media system: publicity had not yet spread widely, users were still accumulating, and overall listening volume was not high. Fusion 2 consists mainly of Monday–Friday days with low listening rates plus some Saturdays and Sundays with relatively high listening rates. The data objects in Fusion 3 are basically Saturdays and Sundays; analysis suggests that weekend programs are all recorded rather than live, so their interactivity and listenability are weak, and since there is no commute on weekends, the listening peak for those time segments is missing. Fusion 4 consists mainly of Monday–Friday days with higher listening rates; the listening rate curve peaks from Monday to Friday and dips on the weekend. The data objects in Fusion 5 are Mondays–Thursdays in July–August 2020; since their third principal component differs from that of the data objects in Fusion 4, they form a separate fusion, which is related to the canvassing and selection of the annual golden songs in the morning column at the end of 2020.

In view of the performance evaluation of data object fusions and the excessive number of outlier data objects, it is necessary to adopt an appropriate fusion-deletion strategy that merges similar fusions and outlier data objects, in order to reduce the number of fusions, expand their scale, reduce the number of isolated data objects, and highlight the connections between similar or related fusions and their high-performance areas.

The main feature of the outlier data objects is that they consist mostly of dates with particularly high listening rates, such as July 1st and January 1st of each year and the dates of the retrospective broadcast and sweepstakes for the 80 golden songs of the year selected by listeners. As a result, some data objects cannot be included in a specific fusion because of the limitations of the density-based method itself. So that these “forgotten and isolated” data objects from the preliminary fusion can be submitted to users for evaluation and decision-making as members of the corresponding fusions, “kernel”-based model construction and “energy”-based energy description are introduced.

For the data objects inside fusion bodies and the outlier data objects, specific energy is applied around the user’s particular needs to generate corresponding collisions, so as to effectively reduce the number of fusions and outliers and reasonably strengthen the internal structure of the fusions.

4.3. Analysis of Evaluation Results

Figure 6 shows the optimized performance evaluation result after energy analysis and collision processing (a perspective view spanned by the first and second principal components).

During the performance evaluation, the critical data objects that did not enter the corresponding fusions in the preliminary results are merged into those fusions; compared with the scale of data objects included before optimization (Figure 7), the scale after optimization is greatly improved, and the corresponding data object sizes (Figure 8) change in a clearly ordered manner.

Referring to the actual optimization results, after the energy-collision-based performance evaluation of data object fusion and the optimization of outlier data objects, the matching degree of the optimization results exceeds 90%, clearly better than before optimization. This fully verifies the practicability and operability of the methods proposed in this paper for improving data object fusion by means of performance evaluation.

5. Conclusion

Judgment matrices are constructed from the expert questionnaire data, the single-level results are tested for consistency, and the weight vectors are calculated; finally, the values are aggregated to assign weights to each indicator of the convergent media communication efficiency evaluation index system. This paper studies a class of optimization methods that improve random matrix reconstruction performance by enhancing the linear independence of the matrix column vectors and proposes minimizing the linear correlation between column vectors by Gram-Schmidt orthogonalization of the random matrix. It also studies reducing the cross-correlation of the matrix by optimizing the cross-correlation coefficient of the perception matrix, obtaining a better-performing random matrix when the sparse transformation matrix is given. The specific collision process of data object fusion is explained by computing the changes in the parameter values of the fusions. The optimization process emphasizes the role of users: all effective collisions are driven by the user’s thematic needs, so the collision process contains two user-based links, “user driving” and “result evaluation.” Through the research and practice of this paper, from the perspective of information fusion, it becomes possible to explore the logic, paths, modes, and strategies of integrating the media resources accumulated in the broadcasting field and of integrating traditional media with new media, and to deepen integration within the traditional broadcasting field itself. It provides ideas for the development of new media integration, with strong pertinence, practical significance, and application value.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgments

This paper was supported by 2022 Major Achievements in Discipline Research Funding Project of College of Chinese & ASEAN Arts of Chengdu University.