DE424 Practical 3 - Report
Name & Surname: Robert Jaret Thompson Student Number: 24704806 Date: 2 March 2025
Plagiarism Declaration
This submission is entirely my own work, excepting the inclusion of resources explicitly permitted for this assignment, and assistance as noted in the following two items.
- DE424 Practical 3 outline
- Emdw userguide
I have not and will not facilitate plagiarism, such as by distributing any written work or source code created by fellow students. I understand that any code I submit may be inspected for plagiarism detection (either manually or by automated systems).
Signed:
The Hamming (7,4) error correction code
In the Hamming (7,4) code, we have:
- Input bits b0, b1, b2, b3: the original data bits to be transmitted
- Parity check bits b4, b5, b6: computed from the input bits
- Received bits r0, ..., r6: the possibly corrupted versions of the transmitted bits observed at the receiver
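The parity structure of the code can be sketched in plain C++, independent of emdw; the XOR equations below match the conditional truth tables constructed later in this report (the function name encode is illustrative, not part of the practical):

```cpp
#include <array>
#include <cassert>

// Encode four data bits (b0..b3) into a Hamming (7,4) codeword using
// the parity equations of this report:
//   b4 = b0 XOR b1 XOR b2
//   b5 = b0 XOR b2 XOR b3
//   b6 = b0 XOR b1 XOR b3
std::array<int,7> encode(int b0, int b1, int b2, int b3) {
    return { b0, b1, b2, b3,
             b0 ^ b1 ^ b2,   // b4
             b0 ^ b2 ^ b3,   // b5
             b0 ^ b1 ^ b3 }; // b6
}
```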
Figure 1
3.2 PGM to calculate the check bits b4, b5 and b6 given input bits b0, b1, b2 and b3
a)
Figure 2: Bayes Network (BN) of relationship between input & check bits
b) From Figures 1 & 2, it is observed that b4, b5 and b6 are the bits that depend on the input bits.
- The check bit b4 is dependent on the input bits b0, b1 and b2. It gives the conditional distribution p(b4 | b0, b1, b2).
- The check bit b5 is dependent on the input bits b0, b2 and b3. It gives the conditional distribution p(b5 | b0, b2, b3).
- The check bit b6 is dependent on the input bits b0, b1 and b3. It gives the conditional distribution p(b6 | b0, b1, b3).
The input bits are assumed to be independent, which factorises the joint distribution of all these components into the following equation:
p(b0, b1, b2, b3, b4, b5, b6) = p(b0) p(b1) p(b2) p(b3) p(b4 | b0, b1, b2) p(b5 | b0, b2, b3) p(b6 | b0, b1, b3)
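Because the parity factors are deterministic and the input-bit priors are uniform, the joint distribution places probability 1/16 on each of the 16 valid codewords and 0 everywhere else. A plain C++ sketch (outside emdw; the function name is illustrative) that verifies the size of this support:

```cpp
#include <cassert>

// Count the support of the joint p(b0..b6): an assignment has nonzero
// probability exactly when all three parity constraints hold, so the
// count should equal 2^4 = 16 (one codeword per data-bit combination).
int countValidCodewords() {
    int valid = 0;
    for (int w = 0; w < 128; ++w) {        // all 2^7 assignments
        int b[7];
        for (int i = 0; i < 7; ++i) b[i] = (w >> i) & 1;
        bool ok = (b[4] == (b[0] ^ b[1] ^ b[2])) &&
                  (b[5] == (b[0] ^ b[2] ^ b[3])) &&
                  (b[6] == (b[0] ^ b[1] ^ b[3]));
        if (ok) ++valid;
    }
    return valid;
}
```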
c) The truth table of p(b4 | b0, b1, b2) assigns probability 1 to every row where b4 = b0 ⊕ b1 ⊕ b2, and probability 0 otherwise. Similar results for the truth tables of b5 and b6 are obtained.
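The 16-row table can also be generated programmatically. This sketch (plain C++, outside emdw; the function name b4Table is illustrative) reproduces the same row ordering and values as the DT factor listed later in the report:

```cpp
#include <cassert>
#include <vector>

// Build the deterministic table p(b4 | b0,b1,b2): probability 1 when
// b4 equals b0 XOR b1 XOR b2, otherwise 0. Rows are enumerated in the
// order (b0, b1, b2, b4), matching the DT listing in this report.
std::vector<double> b4Table() {
    std::vector<double> t;
    for (int b0 = 0; b0 <= 1; ++b0)
      for (int b1 = 0; b1 <= 1; ++b1)
        for (int b2 = 0; b2 <= 1; ++b2)
          for (int b4 = 0; b4 <= 1; ++b4)
            t.push_back(b4 == (b0 ^ b1 ^ b2) ? 1.0 : 0.0);
    return t;
}
```

Exactly half the rows carry probability 1, one per combination of the three parent bits.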
d) Conversion of a Bayes Network (BN) to a Markov Network (MN) involves identifying the directed edges from the BN:
Then we marry the parents (moralisation): change the directed edges into undirected edges and connect the parents of each node.
- b4 has parents b0, b1 and b2, so we add edges between b0, b1 and b2.
- b5 has parents b0, b2 and b3, so we add edges between b0, b2 and b3.
- b6 has parents b0, b1 and b3, so we add edges between b0, b1 and b3.
Figure 3: Markov Network
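The two moralisation steps can be sketched in plain C++ (the function and type names are illustrative): undirect every parent-child edge, then add an edge between every pair of parents that share a child. Nodes 0–3 stand for b0–b3 and nodes 4–6 for b4–b6:

```cpp
#include <algorithm>
#include <cassert>
#include <set>
#include <utility>
#include <vector>

using Edge = std::pair<int,int>;  // undirected edge, stored as (min,max)

Edge edge(int a, int b) { return { std::min(a, b), std::max(a, b) }; }

// Moralise a BN given each node's parent list: turn directed edges into
// undirected ones, then "marry" (connect) all co-parents of each node.
std::set<Edge> moralise(const std::vector<std::vector<int>>& parents) {
    std::set<Edge> edges;
    for (int child = 0; child < (int)parents.size(); ++child) {
        const auto& ps = parents[child];
        for (int p : ps) edges.insert(edge(p, child));  // undirect edges
        for (size_t i = 0; i < ps.size(); ++i)          // marry parents
            for (size_t j = i + 1; j < ps.size(); ++j)
                edges.insert(edge(ps[i], ps[j]));
    }
    return edges;
}
```

For the Hamming BN this yields 9 parent-child edges plus 6 distinct marriage edges among the input bits.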
In an MN, the joint distribution is represented as a product of factors over maximal cliques:
p(b0, ..., b6) = (1/Z) φ1(b0, b1, b2, b4) φ2(b0, b2, b3, b5) φ3(b0, b1, b3, b6)
where φ1 = p(b4 | b0, b1, b2), φ2 = p(b5 | b0, b2, b3) and φ3 = p(b6 | b0, b1, b3) (with the uniform priors p(b0) ... p(b3) absorbed into the cliques), and Z is the normalisation constant.
e)
f)
The prior distributions p(b0), p(b1), p(b2) and p(b3) are defined in emdw as follows:
rcptr<Factor> ptrb0 =
uniqptr<DT>(
new DT(
{b0},
{binDom},
defProb,
{
{{0}, 1},
{{1}, 1},
} ));
Similar code is used to declare ptrb1, ptrb2 and ptrb3.
rcptr<Factor> ptrb4gb0b1b2 =
uniqptr<DT>(
new DT(
{b0,b1,b2,b4},
{binDom,binDom,binDom,binDom},
defProb,
{
{{0,0,0,0}, 1},
{{0,0,0,1}, 0},
{{0,0,1,0}, 0},
{{0,0,1,1}, 1},
{{0,1,0,0}, 0},
{{0,1,0,1}, 1},
{{0,1,1,0}, 1},
{{0,1,1,1}, 0},
{{1,0,0,0}, 0},
{{1,0,0,1}, 1},
{{1,0,1,0}, 1},
{{1,0,1,1}, 0},
{{1,1,0,0}, 1},
{{1,1,0,1}, 0},
{{1,1,1,0}, 0},
{{1,1,1,1}, 1},
} ));
Similar code is used to declare ptrb5gb0b2b3 and ptrb6gb0b1b3, which denote p(b5 | b0, b2, b3) and p(b6 | b0, b1, b3).
g) The factors are multiplied together with absorb to obtain the joint probability, and normalize() rescales the result so that the probabilities sum to 1:
rcptr<Factor> ptrb0b1b2b3b4b5b6 =
  ptrb4gb0b1b2->absorb(ptrb5gb0b2b3)->absorb(ptrb6gb0b1b3)
    ->absorb(ptrb0)->absorb(ptrb1)->absorb(ptrb2)->absorb(ptrb3)
    ->normalize();
std::cout << __FILE__ << __LINE__ << ": " << *ptrb0b1b2b3b4b5b6 << std::endl;
Figure 5: Joint Probability Distribution
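The semantics of absorb and normalize can be illustrated on a toy single-bit table. This is only a sketch and not emdw's actual implementation: emdw's absorb also aligns factors with differing scopes, whereas this toy version assumes both factors share the same scope.

```cpp
#include <cassert>
#include <cmath>
#include <map>
#include <vector>

// Toy stand-in for emdw's discrete table: a map from an assignment
// (vector of bit values) to an unnormalised potential.
using Factor = std::map<std::vector<int>, double>;

// absorb: pointwise product of two factors over the SAME scope.
Factor absorb(const Factor& f, const Factor& g) {
    Factor out;
    for (const auto& [x, p] : f) {
        auto it = g.find(x);
        out[x] = p * (it != g.end() ? it->second : 0.0);
    }
    return out;
}

// normalize: rescale the potentials so that they sum to 1.
Factor normalize(Factor f) {
    double z = 0.0;
    for (const auto& [x, p] : f) z += p;
    for (auto& [x, p] : f) p /= z;
    return f;
}
```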
h) Arbitrary input bits are given, and observeAndReduce clamps the four input bits to those values; the resulting distribution then yields the corresponding check bits.
std::cout << __FILE__ << __LINE__ << ": " << *ptrb0b1b2b3b4b5b6->observeAndReduce({b0},{1})->observeAndReduce({b1},{0})->observeAndReduce({b2},{1})->observeAndReduce({b3},{0})->normalize() << std::endl;
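The effect of observeAndReduce can be sketched on a toy two-variable table (again, a sketch rather than emdw's implementation): keep only the rows consistent with the observation, and drop the observed variable from the scope.

```cpp
#include <cassert>
#include <cmath>
#include <map>
#include <vector>

// Toy version of observeAndReduce on a table over bits: keep only the
// rows where variable `var` (an index into each assignment) equals
// `val`, and remove that variable from each remaining assignment.
using Table = std::map<std::vector<int>, double>;

Table observeAndReduce(const Table& t, int var, int val) {
    Table out;
    for (const auto& [x, p] : t) {
        if (x[var] != val) continue;             // discard inconsistent rows
        std::vector<int> rest;
        for (int i = 0; i < (int)x.size(); ++i)
            if (i != var) rest.push_back(x[i]);  // drop observed variable
        out[rest] += p;
    }
    return out;
}
```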
3.3 PGM to decode the received bits: binary symmetric channel (BSC) case.
a)
Figure 6: BN with received bits included
b) The joint probability distribution for the receiver side builds upon the transmitter-side BN while incorporating an additional noise factor p(ri | bi) for each received bit:
p(b0, ..., b6, r0, ..., r6) = p(b0, ..., b6) p(r0 | b0) p(r1 | b1) ... p(r6 | b6)
c) The error probability p is the likelihood that a received bit ri differs from the transmitted bit bi. If p = 0.1, then when the transmitted bit is 0, there is a 90% chance that ri is also 0, and a 10% chance it gets flipped to 1.
b0 | r0 | p(r0 | b0)
0 | 0 | 1 − p = 0.9
0 | 1 | p = 0.1
1 | 0 | p = 0.1
1 | 1 | 1 − p = 0.9
d)
Figure 7: MN with received bits included
e) New factors p(ri | bi) for the received bits are created, with the error probability p = 0.1 entered directly in the table:
rcptr<Factor> ptrr0 =
uniqptr<DT>(
new DT(
{r0,b0},
{binDom,binDom},
defProb,
{
{{0, 0}, 0.9},
{{0, 1}, 0.1},
{{1, 0}, 0.1},
{{1, 1}, 0.9},
} ));
std::cout << __FILE__ << __LINE__ << ": " << *ptrr0 << std::endl;
Similar code is used to declare ptrr1, ptrr2, ptrr3, ptrr4, ptrr5 and ptrr6.
f)
g)
rcptr<Factor> ptrbgr =
  ptrb0b1b2b3b4b5b6->absorb(ptrr0)->absorb(ptrr1)->absorb(ptrr2)->absorb(ptrr3)
    ->absorb(ptrr4)->absorb(ptrr5)->absorb(ptrr6)->normalize();
std::cout << __FILE__ << __LINE__ << ": " << *ptrbgr << std::endl;
Figure 8: Normalized joint probability distribution
We are now implementing error correction using probabilistic inference. By multiplying all factors together and normalizing, we compute the posterior joint distribution p(b0, ..., b3 | r0, ..., r6). The goal is to find the most probable original sequence and confirm that the error correction mechanism works. In this case, the most probable sequence for b0 b1 b2 b3 (shown in Figure 9) has a posterior probability of 72%.
Figure 9: Posterior marginal beliefs
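The posterior inference above can be cross-checked with a brute-force decoder in plain C++, independent of emdw (the function name mapDecode is illustrative): enumerate the 16 valid codewords, weight each by the BSC likelihood with p = 0.1, and return the data bits of the highest-scoring one. Because the code has minimum distance 3, any single flipped bit is corrected.

```cpp
#include <array>
#include <cassert>

// Brute-force MAP decoding for Hamming (7,4) over a BSC with flip
// probability p: enumerate all 16 codewords, score each by the channel
// likelihood p(r | codeword), and return the best data bits.
std::array<int,4> mapDecode(const std::array<int,7>& r, double p = 0.1) {
    std::array<int,4> best{};
    double bestScore = -1.0;
    for (int w = 0; w < 16; ++w) {
        int b0 = w & 1, b1 = (w >> 1) & 1, b2 = (w >> 2) & 1, b3 = (w >> 3) & 1;
        std::array<int,7> c = { b0, b1, b2, b3,
                                b0 ^ b1 ^ b2, b0 ^ b2 ^ b3, b0 ^ b1 ^ b3 };
        double score = 1.0;                         // likelihood p(r | c)
        for (int i = 0; i < 7; ++i)
            score *= (r[i] == c[i]) ? (1.0 - p) : p;
        if (score > bestScore) { bestScore = score; best = {b0, b1, b2, b3}; }
    }
    return best;
}
```

For example, transmitting the data bits 1 0 1 0 gives the codeword 1 0 1 0 0 0 1; flipping its third bit still decodes back to 1 0 1 0.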