## Bayesian Belief Networks

A Bayesian belief network (BBN), also called a Bayesian causal probabilistic network, is a graphical data structure that compactly represents the joint probability distribution of a problem domain by exploiting conditional dependencies. A BBN can capture knowledge of a given problem domain in a natural and efficient way [19]. A BBN is built on an acyclic graph in which nodes represent feature variables and connections between nodes represent direct probabilistic influences between the variables. Because of the acyclicity and d-separation properties of a BBN, there are no feedback loops between nodes, and the absence of a connection between two nodes indicates the probabilistic independence of the two variables.

Each node in a BBN represents one feature variable, and each variable must have two or more discrete states. For a discrete variable, its numerical or symbolic values map directly to the states of the node. A continuous variable must first be discretized, with each interval corresponding to one state. Each state is then associated with a prior probability value; for each node, the probabilities of all states sum to 1. The connections between nodes are quantified by conditional probabilities, whose number is determined by the structure of the BBN. These prior and conditional probabilities can be either assigned from established statistical data or computed from a set of training data. In general, when the network structure is given in advance and the variables are fully observable in the training examples, learning the prior and conditional probabilities is straightforward [19].
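The discretization and normalization steps above can be sketched in a few lines. This is a hypothetical example (the variable "age", its sample values, and the bin edges are all illustrative assumptions, not from the source): a continuous variable is cut into three discrete states, and the prior probability of each state is estimated as its relative frequency in the data, so the node's probabilities sum to 1.

```python
import numpy as np

# Hypothetical example: discretize a continuous variable ("age") into
# three states and estimate the node's prior probabilities from data.
ages = np.array([34, 41, 55, 62, 48, 70, 29, 58, 66, 45])

# Bin edges chosen purely for illustration; each bin is one discrete state.
bins = [0, 40, 60, 120]                 # states: under 40, 40-59, 60+
states = np.digitize(ages, bins) - 1    # 0, 1, or 2 for each sample

# Prior probability of each state = relative frequency in the data.
priors = np.bincount(states, minlength=3) / len(ages)

print(priors)        # one prior per state
print(priors.sum())  # must equal 1 for a valid node
```

The same frequency-counting idea extends to conditional probability tables: one normalized distribution per combination of parent states.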

Unlike a three-layer feed-forward ANN, which has a fixed topology, the structure of a BBN can be flexibly changed for different applications based on human knowledge. It allows investigators to specify the dependence and independence of features in a natural way through the network topology. For example, Fig. 2 shows a BBN used in a computer-assisted diagnosis scheme to detect breast cancer [28]. The

[Figure 2 node labels include: Skin Thickness, Alcohol, Smoking, Nipple Discharge, Breast Pain, Hormones, Lump(s), Menopause, Breast Cancer, Pregnancy History, Mass(es), Family Breast Cancer History, Asymmetry, Architectural Distortion]

FIGURE 2 A BBN used for diagnosis of breast cancer. Adapted from Wang, X.H., Zheng, B., Good, W.F., and Chang, Y.H. (1999). Computer-assisted diagnosis of breast cancer using a data-driven Bayesian belief network. Int. J. Med. Informatics 54, 115-126.

network includes 13 features, of which 5 are related to the patient's clinical history, 4 are obtained from general findings in a physical examination, and 4 are extracted from findings in mammograms. Because the topology varies for different applications, writing a computer program to build a BBN is much more complex than writing one to build an ANN. Fortunately, several software packages (e.g., Hugin Demo [11] and Microsoft BBN [7]) are available for users to build BBNs for their specific applications.

To build a BBN, a series of prior and conditional probabilities must first be determined. For example, to build the BBN shown in Fig. 2 to predict the probability of breast cancer, the conditional probabilities are determined as follows. In this network, the nodes representing the five types of patient clinical history information are located in the same layer. These five nodes independently point to (connect to) the breast cancer node. Because each of these five nodes (Yi, i = 1, ..., 5) and the breast cancer node has two states, yes and no, a total of 2^6 = 64 conditional probabilities must be determined:

P1(Cancer = yes | Y1 = yes, Y2 = yes, Y3 = yes, Y4 = yes, Y5 = yes)
P2(Cancer = no | Y1 = yes, Y2 = yes, Y3 = yes, Y4 = yes, Y5 = yes)
P3(Cancer = yes | Y1 = yes, Y2 = yes, Y3 = yes, Y4 = yes, Y5 = no)
...
P64(Cancer = no | Y1 = no, Y2 = no, Y3 = no, Y4 = no, Y5 = no).
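The enumeration above can be checked mechanically. The sketch below (node and state names follow the text; the table itself holds no real values) lists every combination of the five binary parents with each state of the cancer node, confirming the count of 64 entries, of which 32 are independent:

```python
from itertools import product

# Enumerate all entries of the conditional probability table
# P(Cancer | Y1, ..., Y5) for five binary parent nodes.
parents = ["Y1", "Y2", "Y3", "Y4", "Y5"]

entries = []
for parent_states in product(["yes", "no"], repeat=len(parents)):
    for cancer in ["yes", "no"]:
        entries.append((cancer, parent_states))

print(len(entries))   # 64 = 2 cancer states x 2^5 parent combinations

# Only half are independent: for each fixed parent combination,
# P(yes | ...) + P(no | ...) = 1, so the "no" entry is determined.
independent = len(entries) // 2
print(independent)    # 32
```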

Of these 64 conditional probabilities, 32 are independent. Next, the breast cancer node also connects to eight other nodes that represent the features extracted from the patient's physical and mammographic examinations. These eight nodes are considered independent in this topology. Six of them have two states and the other two have three states [28]. Because all the connections among these nine nodes start from the breast cancer node, the conditional probabilities are computed as P(Xi | Cancer = yes) and P(Xi | Cancer = no), i = 1, ..., 8. The total number of conditional probabilities between the breast cancer node and these eight feature nodes is 36 (6 x 2 x 2 + 2 x 3 x 2), of which 20 are independent. Meanwhile, based on the states of each node in the network, the total number of prior probabilities in these 14 nodes is 30, of which 16 are independent. Therefore, a complete weight structure in this BBN requires 30 prior probabilities and 100 conditional probabilities. Unlike the ANN, which has a fixed output layer or nodes, each node in a BBN can be designated as an output (or test) node. In the preceding example, the probability of a patient having breast cancer can be tested when the required diagnostic information (features) of the patient is presented to the network. The probability of each diagnostic symptom for normal and abnormal cases can also be examined using the same BBN.
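Testing a node in this way amounts to computing a posterior from the factored joint distribution. The sketch below is a minimal, hypothetical reduction of the Fig. 2 topology (one clinical-history parent Y, the Cancer node, and one finding child X; all probability values are invented for illustration), showing how P(Cancer = yes | X) is obtained by marginalizing over the unobserved parent:

```python
# Reduced hypothetical network: Y -> Cancer -> X, all nodes binary.
# The joint factorizes as P(Y, Cancer, X) = P(Y) P(Cancer | Y) P(X | Cancer).
p_y = {"yes": 0.1, "no": 0.9}                        # prior on Y
p_c_given_y = {"yes": {"yes": 0.05, "no": 0.95},     # P(Cancer | Y)
               "no":  {"yes": 0.01, "no": 0.99}}
p_x_given_c = {"yes": {"yes": 0.80, "no": 0.20},     # P(X | Cancer)
               "no":  {"yes": 0.10, "no": 0.90}}

def posterior_cancer(x_obs):
    """P(Cancer = yes | X = x_obs), marginalizing over the unobserved Y."""
    joint = {}
    for c in ("yes", "no"):
        joint[c] = sum(p_y[y] * p_c_given_y[y][c] * p_x_given_c[c][x_obs]
                       for y in ("yes", "no"))
    return joint["yes"] / (joint["yes"] + joint["no"])

# The posterior rises above the marginal P(Cancer = yes) = 0.014
# when the finding X is present, and falls when it is absent.
print(posterior_cancer("yes"))
print(posterior_cancer("no"))
```

Exact enumeration like this is feasible only for small networks; practical BBN packages use more efficient inference over the same factorization.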
