Survey of probabilistic graphical models

Li Hongmei, Hao Wenning, Gan Wenyan, Chen Gang

Institute of Command Information System

PLA Univ. of Sci. & Tech.

Nanjing, China


Abstract—The probabilistic graphical model (PGM) is a generic model that represents the probability-based relationships among random variables by a graph, and is a general method for knowledge representation and inference involving uncertainty. In recent years, PGM has provided an important means for handling uncertainty in the field of intelligent information, and has become a research focus in machine learning, artificial intelligence, and related fields. In this paper, PGM and its three basic models are reviewed, covering their learning and inference theory, research status, application, and extension.

Keywords—probabilistic graphical model; Bayesian network; Markov network; factor graph; learning and inference

I. INTRODUCTION

The probabilistic graphical model is a kind of model that represents the probability-based relationships among random variables; the concept was first proposed by J. Whittaker in 1990 from a statistical point of view [1]. PGM provides an effective tool for solving uncertain and complex system problems, and has become a research hotspot in artificial intelligence, machine learning, and data mining. PGM is the combination of graph theory and probability theory. Graph theory provides a unified and intuitive modeling framework for expressing multivariate dependencies and causal relationships, and a natural structure for the design of model algorithms. Probability theory has a strict theoretical foundation, which gives PGM the advantage that the joint probability distribution is compact and concise and exploits the independence relationships among variables, simplifying knowledge acquisition and domain modeling and reducing the complexity of model computation.

PGM provides a modeling framework for processing uncertainty, which allows probability-based relationships to be clearly expressed and calculated. Currently, most probabilistic statistical models can be seen as special cases of PGM.

It was in 1988 that J. Pearl [2] gave a detailed description of two important PGMs, the directed graph model (Bayesian network) and the undirected graph model (Markov network), and pointed out their prospects in uncertain information processing. Along with the development and application of PGM theory, a new PGM, the factor graph, was proposed in 2001; it provides a uniform representation for different PGMs and shows great vitality in data mining, artificial intelligence, machine learning, pattern recognition, and other important areas.

In this paper, three widely researched and applied types of PGMs are reviewed: the Bayesian network, the Markov network, and the factor graph, covering their theory, development, and application. Section 2 presents an overview of PGM in terms of its concept, theoretical development, and application. Section 3 gives an introduction to the Bayesian network, including its learning and inference algorithms, model extensions, and applications. In Section 4, we give an overview of the Markov network, including its theoretical development, model extensions, and application status. In Section 5, we introduce a newer PGM, the factor graph, summarize its modeling and inference theory, and present the state of its applications. Finally, we give the conclusions.

II. PROBABILISTIC GRAPHICAL MODEL

The probabilistic graphical model is a graphical construction based on Bayesian rules; it graphically expresses the probability-based relationships among sets of random variables, supporting the representation of, and inference over, uncertain information and knowledge. Before PGM appeared, probability-based Bayesian inference lacked simple and formal methods for expressing and processing conditional independence, and it depended on the acquisition and processing of large amounts of probabilistic data, slowing the progress of probability-based uncertain information processing in the field of artificial intelligence. It was only with the emergence of PGM that it became the most widely used graphical model in computer science and artificial intelligence.

PGM can be seen as a family of probability distributions defined on a graph, which is used to represent and operate on joint probability distributions. From the viewpoint of graph theory, PGM provides a tool to express the conditional dependency and independency relationships among variables, to establish the joint probability distribution model based on multivariate relationships in combination with the actual complex conditions of a field, and to provide a means of uncertainty propagation. From the perspective of probability theory, PGM can be used for probability-based uncertainty inference by calculating marginal probabilities and other conditional probabilities among variables.

PGM theory provides a unified and flexible framework of models and methods for solving actual complex problems, and many models widely used for statistical decision and uncertainty inference


can be represented by PGM, such as the hidden Markov model, dynamic Bayesian network, evolutionary tree, Kalman filter, Markov random field, conditional random field, etc., of which the most representative PGMs are the Bayesian network, Markov network, and factor graph. In order to adapt to the special application requirements in different areas, these basic models are usually extended, shaped, and combined into a variety of derived models, and have been successfully used in various fields, with promising prospects.

III. BAYESIAN NETWORK

A. Bayesian Network

The Bayesian network (BN), also known as a belief network, is a directed acyclic graph (DAG) used to express probabilistic causal dependencies among variables; the concept was first proposed by Pearl in 1986 [3]. Since then, BN has become a research focus in uncertainty knowledge representation and inference, and is regarded as a standard cognitive model for probability-based uncertainty reasoning [4].

BN is an extension of the Markov chain as a network model. It consists of two components: the network structure given by a DAG, and the conditional probability tables of the nodes in the network. BN provides an efficient tool to represent probability-based cause-effect relationships; it has clear semantics and powerful inference ability, and through the learning and inference of BN, many uncertain phenomena can be recognized, reasoned about, classified, predicted, and so on. In real applications, BN is usually used to model asymmetric relationships and dependencies owing to its directedness, for example, the temporal relationships in voice analysis and the causal relationships in fault diagnosis. After nearly 30 years of development, the theory of BN has been extended and applied, combining with the ideas of artificial intelligence, expert systems, decision theory, etc. Foreign researchers, such as R. Neapolitan [5], F.V. Jensen [6], E. Castillo [7], R.G. Cowell [8], and F.V. Jensen [9], have given detailed descriptions of BN. Domestic scholars have also researched BN deeply, and successfully applied it in various domains, including statistical decision, medical diagnosis, and expert systems.

B. Learning and Inference

Learning is an important ability of BN, and it is particularly important to learn a PGM from data when no knowledge base is available [10]. According to the integrity of the data, the learning of BN can be grouped into two categories: learning from complete datasets and learning from incomplete datasets.

With complete datasets, two classical methods are usually chosen to learn the parameters of a BN: maximum likelihood estimation and Bayesian parameter estimation. Compared with structure learning, parameter learning is simple and its theory is relatively mature. In actual conditions, however, more structure learning methods are required to solve complex problems. Most of the algorithms can be grouped into two categories: methods based on a scoring metric and a search algorithm [11], and constraint-based methods using conditional independence tests [12]. Commonly, the two basic methods are combined into new hybrid algorithms, such as the max-min hill-climbing algorithm [13]. Besides, papers [14] and [15] separately introduced ant colony optimization and particle swarm optimization into the hybrid algorithm mentioned above, showing high convergence rates and good learning quality, and alleviating the problem that traditional hybrid algorithms easily fall into local rather than global optima.
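To make the complete-data case concrete, the following minimal Python sketch (our illustration, not from the original paper) estimates the parameters of a hypothetical two-node network Rain → WetGrass by maximum likelihood, i.e., as relative frequencies of the observed configurations; all variable names and data values are invented.

```python
from collections import Counter

# Maximum likelihood parameter estimation for a hypothetical two-node
# Bayesian network Rain -> WetGrass, assuming complete binary data.
data = [  # each record: (rain, wet_grass), fully observed
    (1, 1), (1, 1), (1, 0), (0, 0), (0, 0), (0, 1), (0, 0), (1, 1),
]

# MLE for the root node P(Rain=1): relative frequency of rain=1.
p_rain = sum(r for r, _ in data) / len(data)

# MLE for P(WetGrass=1 | Rain=r): counts of (parent, child) pairs
# normalized by the count of each parent value.
joint = Counter(data)
parent = Counter(r for r, _ in data)
p_wet_given_rain = {r: joint[(r, 1)] / parent[r] for r in (0, 1)}

print(f"P(Rain=1) = {p_rain:.2f}")
for r in (0, 1):
    print(f"P(WetGrass=1 | Rain={r}) = {p_wet_given_rain[r]:.2f}")
```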

For incomplete datasets, new parameter learning algorithms have appeared, such as the EM algorithm [16] and its improved variants [17], the Gibbs sampling algorithm, the bound-and-collapse algorithm, the natural gradient learning algorithm [18], etc. For structure learning from incomplete data, the main algorithms include the structural expectation maximization algorithm and score-and-search algorithms. To overcome the tendency to fall into local optima, the niche particle swarm algorithm [19] and mutual information [20] have been introduced into BN structure learning, which reduces the search space and enhances learning speed to some extent. With the expanding scope of research, many methods with good learning ability and adaptability have been proposed, such as methods that learn with prior domain knowledge [21] and methods that refine structures incrementally from new data [22].
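The following compact sketch illustrates EM parameter learning on the same hypothetical Rain → WetGrass network when some parent values are missing (None): the E-step weights each incomplete record by the posterior over the missing value under the current parameters, and the M-step re-estimates the parameters from the expected counts. The data and initialization are illustrative assumptions.

```python
# EM parameter learning for the hypothetical Rain -> WetGrass network
# when the Rain value is missing (None) in some records.
data = [(1, 1), (None, 1), (0, 0), (None, 0), (1, 1), (0, 0), (None, 1)]

p_r = 0.5               # P(Rain=1), arbitrary initial guess
p_w = {0: 0.5, 1: 0.5}  # P(WetGrass=1 | Rain=r), arbitrary initial guess

for _ in range(50):     # fixed number of EM iterations
    exp_r = {0: 0.0, 1: 0.0}   # E-step: expected counts of Rain=r ...
    exp_rw = {0: 0.0, 1: 0.0}  # ... and of (Rain=r, WetGrass=1)
    for r, w in data:
        if r is None:  # missing parent: weight by posterior P(Rain | w)
            like = {k: (p_r if k else 1 - p_r)
                       * (p_w[k] if w else 1 - p_w[k]) for k in (0, 1)}
            z = like[0] + like[1]
            post = {k: like[k] / z for k in (0, 1)}
        else:          # observed parent: all weight on the observed value
            post = {r: 1.0, 1 - r: 0.0}
        for k in (0, 1):
            exp_r[k] += post[k]
            exp_rw[k] += post[k] * w
    # M-step: maximum likelihood estimates from the expected counts.
    p_r = exp_r[1] / len(data)
    p_w = {k: exp_rw[k] / exp_r[k] for k in (0, 1)}

print(f"P(Rain=1) = {p_r:.3f}, P(Wet=1|Rain) = {p_w}")
```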

The major function of BN is uncertainty inference, and its methods are divided into two types: exact inference and approximate inference. Classical exact inference includes the polytree algorithm for singly connected networks proposed by Pearl [23] and the junction tree algorithm proposed by Lauritzen. Later, Cooper proved that inference in an unconstrained BN is an NP-hard problem [24]. Hence, approximate inference methods appeared. The most widely used approach is random sampling instead of exploiting conditional independence, also called the Monte Carlo (MC) method, which is simple and commonly used. In view of the MC method's lack of error bounds, a search-based method, the bounded conditioning algorithm, was proposed by Cooper et al. [25] to improve performance. Besides, Poole, Murphy, Jordan, Tarjan, et al., and domestic scholars such as Li H.T. [26] and Gao B. [27], have studied approximate inference, and the algorithms have been continuously optimized. Although approximate inference was later also proved to be NP-hard, it performs better than exact inference in actual conditions because of its good compromise between time and precision.
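As a simple illustration of Monte Carlo inference, the sketch below estimates a posterior in the hypothetical Rain → WetGrass network by rejection sampling: forward-sample the network and keep only samples consistent with the evidence. All probabilities are invented for illustration.

```python
import random

# Rejection sampling in the hypothetical Rain -> WetGrass network:
# estimate P(Rain=1 | WetGrass=1) by forward-sampling and keeping only
# samples consistent with the evidence. Parameter values are invented.
random.seed(0)
P_RAIN = 0.3
P_WET = {0: 0.1, 1: 0.9}  # P(WetGrass=1 | Rain)

accepted = rain_count = 0
for _ in range(100_000):
    rain = random.random() < P_RAIN           # sample the root node
    wet = random.random() < P_WET[int(rain)]  # sample child given parent
    if wet:                                   # evidence: WetGrass = 1
        accepted += 1
        rain_count += rain

print(f"P(Rain=1 | Wet=1) ~ {rain_count / accepted:.3f}")
# Exact answer by Bayes' rule: 0.3*0.9 / (0.3*0.9 + 0.7*0.1) ~ 0.794
```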

C. Application and Extension

The learning and reasoning methods of BNs have been widely used in a variety of domains, including artificial intelligence, pattern recognition, machine learning, and data mining. For instance, in the field of artificial intelligence, BNs are mainly used to solve problems of knowledge representation and intelligent inference, reflecting the complexity of objective things in the real world and supporting uncertainty reasoning. In the field of pattern recognition, BNs are used for classifier design, etc. BNs have achieved great success in practical application fields.

Currently, in order to adapt to different complex applications, Bayesian network models have been deformed and extended to address the acyclic and static limitations: for iterative and feedback processes (such as the cyclic Bayesian network [28]), for structured data (such as the object-oriented Bayesian network (OOBN) [29] and the hierarchical Bayesian network (HBN) [30]), for sequential time-series systems (such as the dynamic Bayesian network (DBN) [31] and the hidden Markov model (HMM)), and for mixed discrete and continuous variables (the hybrid Bayesian network [32]). These extended models provide efficient modeling and inference tools for resolving actual complex problems. In particular, DBNs and HMMs have gained increasing importance in current research and application.

The DBN is the extension of BN in the time domain; it can perform inference for complex and uncertain problems with non-linear relationships and random evolution. DBN has been applied to various fields, including voice recognition, economic forecasting, target recognition [33], and the regulation of gene sequences [34]. The HMM is a special case of the DBN and an extension of the Markov chain, whose state is not visible at any time, so the transition probabilities must be inferred from the observation series. HMM has been successfully applied in speech recognition [35], machine translation, handwriting recognition, image processing, and stock prediction and investment.
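To illustrate, the following short sketch implements the standard HMM forward algorithm, which computes the likelihood of an observation sequence by summing over hidden state paths; the two-state weather model and all probabilities are invented.

```python
# Forward algorithm for a hidden Markov model: compute the likelihood
# of an observation sequence. The two-state weather model is invented.
states = (0, 1)           # hidden states: 0 = Sunny, 1 = Rainy
start = [0.6, 0.4]        # initial state distribution
trans = [[0.7, 0.3],      # trans[i][j] = P(state j at t+1 | state i at t)
         [0.4, 0.6]]
emit = [[0.8, 0.2],       # emit[i][o] = P(observation o | state i),
        [0.3, 0.7]]       # observations: 0 = Dry, 1 = Wet

def forward(obs):
    """Return P(obs) by dynamic programming over hidden state paths."""
    alpha = [start[s] * emit[s][obs[0]] for s in states]
    for o in obs[1:]:
        alpha = [sum(alpha[p] * trans[p][s] for p in states) * emit[s][o]
                 for s in states]
    return sum(alpha)

print(forward([0, 1, 1]))  # likelihood of observing Dry, Wet, Wet
```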

IV. MARKOV NETWORK

A. Overview

The Markov network (MN) [36] is a probabilistic graphical model that uses an undirected graph to represent the independency relationships among variables; it defines the joint probability distribution of random variables with the Markov property, meaning that the probability of a state transition depends only on the adjacent states, and the domain of definition may be continuous or discrete. MN is mainly used to analyze non-causal contexts and the spatial relationships of physical phenomena, and has been widely used in pattern recognition, statistical inference, and other fields. The Markov network differs from the Bayesian network in that MN is commonly used to model symmetric relationships and dependencies, and it is capable of representing dependencies that the traditional BN cannot, such as cyclic dependencies, so MN has a wider range of representation. Although the concept of the cyclic BN has been proposed in recent years, its probabilistic inference is more complex than that of the traditional BN.

The Markov network is an undirected graphical model (UGM), which can be represented by a tuple with two elements, G = (V, E), where G is an undirected graph, V is the set of nodes denoting the random variables, and E is the set of edges representing the dependency relationships among variables. The joint probability distribution is defined over a set of potential functions as the normalization of the product of all potential functions, in the following form:

P(V = v) = (1/Z) ∏_{c∈C} ψ_c(v_c)

where ψ_c is a non-negative potential function defined over a maximal clique c (MN is often divided into a collection of maximal cliques C). We define V = {V1, V2, …, Vn} and v = {v1, v2, …, vn}, the state values of V. Z is a normalization factor ensuring that ∑_v P(V = v) = 1.
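The sketch below makes this definition concrete for a toy network of three binary variables with maximal cliques {V1, V2} and {V2, V3}: the joint probability is the product of the clique potentials divided by the normalization factor Z. The potential values are arbitrary illustrative choices.

```python
from itertools import product

# Toy Markov network: three binary variables, maximal cliques {V1,V2}
# and {V2,V3}; joint = product of clique potentials / Z. Potential
# values are arbitrary illustrative choices.
psi_12 = {(0, 0): 3.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0}
psi_23 = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 2.0}

def unnorm(v1, v2, v3):
    """Unnormalized measure: product of all clique potentials."""
    return psi_12[(v1, v2)] * psi_23[(v2, v3)]

# Normalization factor Z sums the product over all joint assignments.
Z = sum(unnorm(*v) for v in product((0, 1), repeat=3))

def joint(v1, v2, v3):
    return unnorm(v1, v2, v3) / Z

# The distribution sums to 1, as guaranteed by Z.
print(joint(0, 0, 0), sum(joint(*v) for v in product((0, 1), repeat=3)))
```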

Part of the Bayesian network learning and inference algorithms can also be applied to the Markov network. Exact inference in a Markov network is also an NP-hard problem, so approximate inference is generally used, such as the Markov chain Monte Carlo algorithm [33] and the belief propagation algorithm.
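As a hedged illustration of the Markov chain Monte Carlo approach, the following sketch runs a Gibbs sampler on a toy pairwise Markov network of the same style as above, estimating a marginal by repeatedly resampling each variable from its conditional given its neighbors; the potentials, burn-in length, and iteration counts are arbitrary.

```python
import random

# Gibbs sampling (an MCMC method) on a toy pairwise Markov network:
# estimate the marginal P(V1=1) by resampling each variable from its
# conditional given its neighbors. All numbers are illustrative.
psi = {("v1", "v2"): {(0, 0): 3.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0},
       ("v2", "v3"): {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 2.0}}
edges = {"v1": [("v1", "v2")],
         "v2": [("v1", "v2"), ("v2", "v3")],
         "v3": [("v2", "v3")]}

def prob_one(var, state):
    """P(var=1 | neighbors): product of the potentials touching var."""
    weight = {0: 1.0, 1: 1.0}
    for val in (0, 1):
        trial = dict(state, **{var: val})
        for a, b in edges[var]:
            weight[val] *= psi[(a, b)][(trial[a], trial[b])]
    return weight[1] / (weight[0] + weight[1])

random.seed(0)
state = {"v1": 0, "v2": 0, "v3": 0}
burn_in, n_iter, count = 2_000, 20_000, 0
for it in range(n_iter):
    for var in state:
        state[var] = int(random.random() < prob_one(var, state))
    if it >= burn_in:  # discard burn-in samples before averaging
        count += state["v1"]
print(f"P(V1=1) ~ {count / (n_iter - burn_in):.3f}")
```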

B. Promotion and Application

Commonly, MN only considers discrete probability distributions; in practical applications, the most widely used models are the Markov random field (MRF), the conditional random field (CRF), and, in the area of statistical relational learning, the relational Markov network and the Markov logic network.

In the fields of image processing and computer vision, the MRF is the most widely used undirected graphical model. Essentially, MN can be seen as the network structure of the MRF; they deal with problems in different ways but share a similar theory, namely, to study and reveal the neighborhood relationships of objects. Besag [37] developed the MRF into the Markov joint distribution, and this work provides MN with a Bayesian framework in which vision problems can be analyzed and modeled via mathematical methods. In image processing, MRF can be applied to low-level tasks such as image restoration and segmentation, image surface reconstruction, image edge detection, and image texture analysis.

In the field of natural language processing, the most representative UGM is the CRF, which is based on MN. The CRF is a probabilistic model that can be used for the annotation and classification of sequential structural data, and it is widely used for Chinese named entity recognition and Chinese word segmentation, the study of which has been flourishing.

In the area of statistical relational learning, the Markov network has been combined with relational schemas and logic into the relational Markov network (RMN) and the Markov logic network (MLN), with more flexible representation and a wider range of application than existing methods. The RMN can capture the dependencies among data and increase the accuracy of model classification and prediction by making full use of discriminative learning and collective data analysis, and has gained tremendous applicable value in cluster analysis, relation prediction, and social network analysis. The MLN is a statistical relational model [38] that introduces first-order logic into MN, and has achieved great success in information extraction, data mining, and target identification.

V. FACTOR GRAPH

A. Overview

The factor graph is a kind of bipartite graph that describes how a multivariate global function is decomposed into the product of local functions, simplifying large-scale, complex global computations into simple local computations; its concept and the sum-product algorithm were proposed by F.R. Kschischang et al. [39] in 2001. In the statistical inference field, many iterative processes can be interpreted by factor graphs, and since studies have found the factor graph to be generic, intuitive, and unifying, and able to simplify calculations, the representations of the majority of probabilistic graphical models (such as Tanner graphs, BNs, MNs, etc.) can be unified by the factor graph. It applies iterative message-processing methods to various fields, including channel coding, signal processing, video detection, scenario analysis, and graph-theoretic planning, and has become a general tool for iterative receiver technology. Compared with MN and BN, the factor graph can represent not only the conditional independencies that BN and MN can, but also independencies they cannot; it has more powerful representation capability and can express the factorized forms of probability distributions in a clearer and more intuitive way.

The factor graph is a bipartite graph that decomposes a global function into a plurality of local functions, and it contains two types of nodes: variable nodes and factor nodes. Assume the global function is g(x1, x2, …, x5), with the structure shown in Figure 1.

Figure 1 Factor graph

Then the factor graph above can be decomposed into the product of several local functions, in the following form:

g(x1, x2, …, x5) = fA(x1) fB(x2) fC(x1, x2, x3) fD(x3, x4) fE(x3, x5)
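The sketch below represents this factorization directly in code, with each local function as a small table over binary variables, and computes a marginal of the global function by brute-force summation, the quantity that the sum-product algorithm computes efficiently. All factor values are placeholders.

```python
from itertools import product

# The factorization from Figure 1 written as code: five local factors
# over binary variables; the factor values are placeholders.
fA = lambda x1: [0.4, 0.6][x1]
fB = lambda x2: [0.7, 0.3][x2]
fC = lambda x1, x2, x3: 1.0 if x3 == (x1 ^ x2) else 0.2
fD = lambda x3, x4: [[0.9, 0.1], [0.5, 0.5]][x3][x4]
fE = lambda x3, x5: [[0.8, 0.2], [0.3, 0.7]][x3][x5]

def g(x1, x2, x3, x4, x5):
    """Global function: the product of the five local factors."""
    return fA(x1) * fB(x2) * fC(x1, x2, x3) * fD(x3, x4) * fE(x3, x5)

# Marginal of x3 by brute force: sum g over all other variables. The
# sum-product algorithm computes the same quantity without the full sum.
marg_x3 = {v: sum(g(x1, x2, v, x4, x5)
                  for x1, x2, x4, x5 in product((0, 1), repeat=4))
           for v in (0, 1)}
print(marg_x3)  # unnormalized marginal function of x3
```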

B. The Sum-Product Algorithm

The essential problem of the factor graph is inference, which is preceded by modeling. Factor graph modeling consists of two components: modeling the topological structure and modeling the local functions corresponding to the factor nodes. In practical applications, we can use domain knowledge to build a prototype of the factor graph, and then use sample-based learning methods to refine the prototype.

The inference of a factor graph can be translated into the computation of the marginal functions of the global function, and the nature of the sum-product algorithm is to compute these marginal functions, that is, the distributed calculation of each marginal function of the global function using an iterative message-passing mechanism. For an acyclic factor graph, the sum-product algorithm yields exact inference results; for a cyclic factor graph, only approximate marginal functions can be obtained, in terms of approximate inference.

The sum-product algorithm is a general algorithm defined on acyclic factor graphs. It consists of a series of "summation" and "product" operations implementing message passing between variable nodes and local function nodes, where a "message" refers to a function (for continuous variables) or a table (for discrete variables) computed during the inference process. The message-passing mechanism of the factor graph is that, during each iteration, all variable nodes and factor nodes send messages to all adjacent nodes, updating the data and obtaining more exact information.
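The following condensed sketch carries out sum-product message passing on a tiny acyclic factor graph (x1 connected to a unary factor fA and, through a pairwise factor fB, to x2 with unary factor fC): leaf factors send their tables, variables forward messages, the pairwise factor marginalizes out the other variable, and each variable's marginal is the normalized product of its incoming messages. The factor tables and the hand-rolled message schedule are our own illustrative choices.

```python
# Sum-product message passing on a tiny acyclic factor graph:
#   fA -- x1 -- fB -- x2 -- fC, all variables binary.
# Messages are length-2 tables; the schedule is written by hand for
# this chain. All factor tables are illustrative.
fA = [0.4, 0.6]                # unary factor on x1
fC = [0.7, 0.3]                # unary factor on x2
fB = [[0.9, 0.1], [0.2, 0.8]]  # pairwise factor fB(x1, x2)

# Leaf factor nodes send their tables to their variable nodes.
m_fA_x1, m_fC_x2 = fA, fC
# Each variable forwards the message to fB, its only other neighbor.
m_x1_fB, m_x2_fB = m_fA_x1, m_fC_x2
# fB sends each variable the "sum of products" over the other variable.
m_fB_x2 = [sum(fB[a][b] * m_x1_fB[a] for a in (0, 1)) for b in (0, 1)]
m_fB_x1 = [sum(fB[a][b] * m_x2_fB[b] for b in (0, 1)) for a in (0, 1)]

def normalize(table):
    z = sum(table)
    return [v / z for v in table]

# Each variable's marginal is the normalized product of its incoming
# messages; on an acyclic graph these marginals are exact.
p_x1 = normalize([m_fA_x1[a] * m_fB_x1[a] for a in (0, 1)])
p_x2 = normalize([m_fC_x2[b] * m_fB_x2[b] for b in (0, 1)])
print(p_x1, p_x2)
```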

With the development of the factor graph across domains, the sum-product algorithm has successfully explained a large number of algorithms, such as the generalized belief propagation algorithm [40], the iterative turbo decoding algorithm, the fast Fourier transform, and the forward-backward algorithm, becoming a general algorithm for iterative processing.

C. Applied Research

The factor graph is a PGM that has emerged in recent years, simple and general. Since it was proposed in channel coding, a group of international and domestic academics have researched it and applied it to many areas, including channel coding, communication, and signal processing. Some domestic scholars have also applied it to network faulty link identification and ECG diagnosis, with success.

In 2001, F.R. Kschischang et al. first proposed the concept of the factor graph and gave a unified explanation of existing coding algorithms. Then, based on the factor graph, Andrew P. Worthen et al. [41] first designed unified iterative receivers and implemented channel estimation, channel equalization, and channel decoding in different channel environments. Joseph Boutros et al. [42] unified turbo multiuser detection algorithms in CDMA systems into the factor graph model. Zhang A.P. [43] proposed an iterative multiuser receiver based on the factor graph for asynchronous coded CDMA systems, efficiently improving the bit error performance. Then Xiang H.G. [44] proposed a joint iterative detection and decoding maximum a posteriori probability algorithm based on the factor graph, with regard to the belief propagation decoding of LDPC codes for MIMO systems. In 2007, Wu Wei-ling [45] proposed a cyclic probability decoding algorithm and an improved algorithm based on the factor graph for LDPC codes, effectively enhancing coding performance. Li Ping et al. proposed GMP technology based on the factor graph for IDMA systems, optimizing the equalization performance [46]. In 2009, Professor Hao Y.L. [47] proposed a pseudo-random code iterative acquisition algorithm based on the Tanner graph and gave the sum-product algorithm flow on the Tanner graph; in 2010, his student Deng Z.X. [48] proposed an improved iterative PN code acquisition method, a multi-iteration pseudo-code acquisition algorithm, solving the rapid acquisition problem of long pseudo-random codes in spread spectrum communication systems. In the same year, Hao Y.L. [49] applied the factor-graph-based iterative m-sequence acquisition algorithm to Gold code acquisition, reducing the complexity and acquisition time without loss of detection probability. In 2012, Wang Z.Y. [50] improved the turbo code equalization algorithm based on the factor graph. Dr. Mao Ling [51] proposed the concept of the hierarchical factor graph and applied it to research on automatic ECG diagnosis. Lv X.L. et al. [52] applied the factor graph and the sum-product algorithm to network faulty link identification, showing great performance.

VI. CONCLUSIONS

In this paper, we gave a summary of the probabilistic graphical model and its three basic models, including their learning and inference theories, research status, applications, and extensions. In reviewing these models and their methods for modeling and inference, we found the relations and distinctions between the Bayesian network, Markov network, and factor graph. Broadly speaking, the BN is commonly used to model and reason about causal or sequential relationships by a directed graph; the MN is usually applied to model non-causal relationships and physical-space phenomena by an undirected graph; while the factor graph is used to process iterative problems by a bipartite graph, unifying the representation and processing methods of the BN and MN. The three models and their learning and inference methods have been applied to various domains, achieving significant performance and promising a bright future. Meanwhile, there is also much room for improvement. Taking BN for instance, more work remains to be done on the DBN and the cyclic BN, in view of their complexity and availability. For the factor graph, in our view, more research could be devoted to extending its structure and iteration mechanism to solve problems that BN and MN cannot.

REFERENCES

[1] J. Whittaker. Graphical Models in Applied Multivariate Statistics. Wiley, 1990.

[2] J. Pearl. Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann, 1988.
[3] Pearl J. Fusion, propagation, and structuring in belief networks. Artificial Intelligence, 1986, 29(3):241-288.
[4] Pearl J. Bayesian networks. In: Handbook of Brain Theory and Neural Networks. MIT Press, 2001.
[5] R. Neapolitan. Probabilistic Reasoning in Expert Systems. John Wiley & Sons, 1990.
[6] F.V. Jensen. An Introduction to Bayesian Networks. UCL Press, 1996.
[7] E. Castillo, J.M. Gutierrez, and A.S. Hadi. Expert Systems and Probabilistic Network Models. Springer-Verlag, 1997.
[8] R.G. Cowell, A.P. Dawid, S. Lauritzen, and D.J. Spiegelhalter. Probabilistic Networks and Expert Systems. Springer, 1999.
[9] F.V. Jensen. Bayesian Networks and Decision Graphs. Springer, 2001.
[10] Zhao Yue. Probabilistic Graphical Model Learning Theory and Its Application. Tsinghua University Press, 2012:33-56.
[11] Suzuki J. A construction of Bayesian networks from databases based on an MDL principle. Proc. 9th Conference on Uncertainty in Artificial Intelligence, 1993:266-273.
[12] Cheng J, Bell D A, Liu W. Learning belief networks from data: an information theory based approach. Proc. 6th International Conference on Information and Knowledge Management, 1997:325-331.
[13] Brown L E, Aliferis C F. The Max-Min Hill-Climbing Bayesian Network Structure Learning Algorithm. Netherlands: Kluwer Academic Publishers, 2005.
[14] Ji Jun-zhong, Hu Ren-bing, Zhang Hong-xun, et al. A hybrid algorithm for Bayesian network structure learning. Journal of Computer Research and Development, 2009, 46(9):1498-1507.
[15] Shen Jia-jie, Lin Feng. Structure learning of Bayesian network using adaptive hybrid memetic algorithm. Systems Engineering and Electronics, 2012, 34(6).
[16] Dempster A P, Laird N, Rubin D. Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society, Series B, 1977, 39:1-38.
[17] Friedman N. The Bayesian structural EM algorithm. Proc. 14th Conference on Uncertainty in Artificial Intelligence, 1998:129-138.
[18] Huang You-ping. Research on Bayesian Network. Institute of Computing Technology, Chinese Academy of Sciences, 2005.
[19] Wang Shuang-cheng, Yuan Miao-miao. Research on learning Bayesian networks structure with missing data. Journal of Software, 2004, 15(7):1042.
[20] Wang Yue, Tan Shu-qiu, Liu Ya-hui. Bayesian network structural learning algorithm based on mutual information. Computer Engineering, 2011, 37(7).
[21] Feelders A, van der Gaag L. Learning Bayesian network parameters with prior knowledge about context-specific qualitative influences. Proc. 21st Conference on Uncertainty in Artificial Intelligence, 2005:193-200.
[22] Zeng Y F, Xiang Y P, Saulius P. Refinement of Bayesian network structures upon new data. International Journal of Granular Computing, Rough Sets and Intelligent Systems, 2009.
[23] Pearl J. Fusion, propagation, and structuring in belief networks. Artificial Intelligence, 1986, 29(3):241-288.
[24] Cooper G. The computational complexity of probabilistic inference using Bayesian belief networks. Artificial Intelligence, 1990, 42:393-405.
[25] Horvitz E, Suermondt H J, Cooper G F. Bounded conditioning: flexible inference for decisions under scarce resources. Proc. 5th Conference on Uncertainty in Artificial Intelligence, Windsor, Ontario, 1989:182-193.
[26] Li Hai-tao, Jin Guang, Zhou Jing-lun, et al. Survey of Bayesian network inference algorithms. Systems Engineering and Electronics, 2008, 30(5):935-939.
[27] Gao Bing, Hu Guo-ping. Research on diagnosis of BNs networks normal approximate reasoning arithmetic. Computer Knowledge and Technology, 2011, 7(25).
[28] Zhou Zhong-bao. Probabilistic safety assessment research based on Bayesian networks. Journal of Systems Engineering, 2006, 21(6).
[29] Koller D, Pfeffer A. Object-oriented Bayesian networks. Proc. UAI-97. San Francisco: Morgan Kaufmann, 1997:302-313.
[30] Gyftodimos E, Flach P A. Hierarchical Bayesian networks: a probabilistic reasoning model for structured domains. Proc. of the ICML-2002 Workshop on Development of Representations, 2002:23-30.
[31] Murphy K P. Dynamic Bayesian Networks: Representation, Inference and Learning. PhD dissertation, UC Berkeley, Computer Science Division, 2002.
[32] Wang Shuang-cheng. Research on learning the hidden variables of hybrid Bayesian network. Chinese Journal of Computers, 2005, 28(9).
[33] Chen Haiyang, Gao Xiaoguang, Zheng Jingsong. Air target identification method based on data-repair DBNs. Journal of System Simulation, 2010, 22(3).
[34] Wang Kai-jun, Zhang Jun-ying, Zhao Feng, et al. Geometric-pattern dynamic Bayesian networks for reasoning gene regulatory networks. Journal of XiDian University (Natural Science), 2007, 34(6).
[35] Wu Jun. The Beauty of Mathematics. People's Posts and Telecommunications Press, 2012.
[36] Pearl J. Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. San Francisco: Morgan Kaufmann, 1988.
[37] Besag J. Spatial interaction and the statistical analysis of lattice systems. Journal of the Royal Statistical Society, Series B, 1974, 36:192-236.
[38] Domingos P, Richardson M. Markov logic: a unifying framework for statistical relational learning. Proc. of the ICML-2004 Workshop on Statistical Relational Learning and its Connections to Other Fields, 2004.
[39] F.R. Kschischang, B.J. Frey, H.-A. Loeliger. Factor graphs and the sum-product algorithm. IEEE Trans. Inf. Theory, 2001, 47(2):498-519.
[40] J.S. Yedidia, W.T. Freeman, Y. Weiss. Constructing free-energy approximations and generalized belief propagation algorithms. IEEE Transactions on Information Theory, 2005, 51(7):2282-2312.
[41] Worthen A P, Stark W E. Unified design of iterative receivers using factor graphs. IEEE Trans. on Information Theory, 2001, 47(2):843-849.
[42] Boutros J, Caire G. Iterative multiuser joint decoding: unified framework and asymptotic analysis. IEEE Trans. Inform. Theory, 2002, 48(7):1772-1793.
[43] Zhang Ai-ping, Luo Han-wen, Wang Hao-xing. Iterative multiuser receiver based on factor graph for asynchronous coded CDMA systems. Acta Electronica Sinica, 2003, 31(4).
[44] Xue Ying-jian, Xiang Hai-ge. LDPC coded MIMO system scheme. Journal of Electronics and Information Technology, 2004, 26(10).
[45] Lin Xue-hong, Wu Wei-ling. Improved decoding algorithm of low-density parity-check codes. Journal of Circuits and Systems, 2007, 12(3).
[46] Guo Qing-hua, Li Ping. LMMSE turbo equalization based on factor graphs. IEEE J. Sel. Areas Commun., 2008, 26(2):311-319.
[47] Deng Zhi-xin, Hao Yan-ling. PN code acquisition sum-product algorithm and performance analysis based on Tanner graph. Journal of Beijing University of Posts and Telecommunications, 2009, 32(3).
[48] Deng Zhi-xin, Hao Yan-ling, Zu Bing-fa. Improved method of iterative PN code acquisition algorithms. Journal of Astronautics, 2010, 31(1).
[49] Hao Yan-ling, Deng Zhi-xin. A Gold code factor graph iterative acquisition method. Journal of Beijing University of Posts and Telecommunications, 2010, 33(1).
[50] Wang Zhong-yong, Lu Su, Duan Lin-lin. Modified extrinsic information feedback mechanism based on FG-turbo equalization. Computer Engineering and Applications, 2012, 48(8).
[51] Mao Ling. ECG Automatic Diagnosis Method Based on Hierarchical Factor Graph. PhD dissertation, Beijing: National Defense Science and Technology University, 2009.
[52] Lv Xiang-ling, Zhang Zhi-yong, Hu Guang-min. Faulty link identification based on factor graph and sum-product algorithm. Journal of Computer Applications, 2012, 32(2):343-346.
