Bit error rate: uncoded vs. Hamming(7,4)-coded digital communication


[Figure: BER vs. Eb/No for BPSK over AWGN — pink: uncoded, black: Hamming(7,4) coded]

Above is a graph showing the BER (bit error rate) at different Eb/No values using BPSK over an AWGN channel. The pink curve shows the BER of the uncoded system (without a channel encoder and decoder), while the black curve shows the BER of the system that uses a Hamming(7,4) code for channel encoding. However, I can't explain why the two curves intersect and cross over at about 6 dB.
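
For reference, here is a minimal Monte Carlo sketch of this comparison (Python/NumPy). It assumes coherent BPSK, hard-decision syndrome decoding, and one common choice of systematic Hamming(7,4) generator/parity-check matrices; the simulation behind the graph may differ in these details.

```python
import numpy as np

rng = np.random.default_rng(0)

# One common systematic Hamming(7,4): G = [I | P], H = [P^T | I].
# (The matrices used for the graph above may differ.)
G = np.array([[1,0,0,0,1,1,0],
              [0,1,0,0,1,0,1],
              [0,0,1,0,0,1,1],
              [0,0,0,1,1,1,1]])
H = np.array([[1,1,0,1,1,0,0],
              [1,0,1,1,0,1,0],
              [0,1,1,1,0,0,1]])
R = 4 / 7  # code rate

# Syndrome -> error-position lookup: column j of H is the syndrome of a
# single bit error at position j; syndrome 0 means "no correction".
weights = np.array([4, 2, 1])
lut = np.full(8, -1)
lut[H.T @ weights] = np.arange(7)

def ber_uncoded(ebno_db, n_bits=500_000):
    ebno = 10 ** (ebno_db / 10)
    bits = rng.integers(0, 2, n_bits)
    tx = 1.0 - 2 * bits                       # BPSK: 0 -> +1, 1 -> -1
    sigma = np.sqrt(1 / (2 * ebno))           # unit energy per data bit
    rx = tx + sigma * rng.standard_normal(n_bits)
    return np.mean((rx < 0) != bits.astype(bool))

def ber_hamming(ebno_db, n_words=100_000):
    ebno = 10 ** (ebno_db / 10)
    data = rng.integers(0, 2, (n_words, 4))
    code = data @ G % 2                        # 4 data bits -> 7 coded bits
    tx = 1.0 - 2 * code
    # Same energy per *data* bit: each coded bit carries only R*Eb,
    # so the per-symbol SNR drops by the code rate.
    sigma = np.sqrt(1 / (2 * R * ebno))
    rx = tx + sigma * rng.standard_normal(code.shape)
    hard = (rx < 0).astype(int)                # hard-decision demodulation
    pos = lut[((hard @ H.T) % 2) @ weights]    # error position per word (-1: none)
    rows = np.where(pos >= 0)[0]
    hard[rows, pos[rows]] ^= 1                 # flip the suspected bit
    return np.mean(hard[:, :4] != data)        # compare decoded data bits

for db in range(0, 11):
    print(f"{db:2d} dB  uncoded {ber_uncoded(db):.1e}  Hamming(7,4) {ber_hamming(db):.1e}")
```

With enough bits per point, a run like this shows the same qualitative picture: the coded curve is worse at low Eb/No and better at high Eb/No, with the crossover near the value observed in the graph.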

1 Answer


I started writing this in a comment and it got long. I'm not certain this is correct, but it makes sense to me, so you may want to do more research beyond this.

Note: I am aware BER is normally measured over long observation windows, but for our purposes we will look at something smaller.

My first assumption (based on your graph) is that the BER is measured on the actual data bits, not on the raw channel bits. Suppose the channel flips 1 bit in every 7 for both schemes. The uncoded system then delivers 1 error per 7 data bits, while Hamming(7,4) can correct that single error per 7-bit codeword, delivering 0 errors in the 4 decoded data bits.

Initial:

  • Uncoded: 1 error every 7 bits received
  • Hamming(7,4): 0 errors per 4 decoded data bits, provided the single error per codeword is corrected (see the quick check after this list)
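
That "provided it is corrected" caveat is worth checking. A quick sketch (Python, assuming independent bit flips at the raw BER of 1/7 used in this example): Hamming(7,4) fixes any single flip in a 7-bit word, so the decoder only fails when 2 or more bits flip.

```python
from math import comb

p = 1 / 7  # assumed raw channel BER from the example above
# A decoded word is wrong only when 2+ of its 7 bits are flipped,
# since a single flip is always corrected.
p_fail = sum(comb(7, k) * p**k * (1 - p)**(7 - k) for k in range(2, 8))
print(f"P(word with 2+ flips) = {p_fail:.3f}")  # ~0.26, so "0 errors" is optimistic
```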

Now let's increase the noise, thereby increasing the error rate of the entire signal.

Highly increased BER:

  • Uncoded: 3.5 errors in 7 bits (50%, averaged over many sequences)
  • Hamming(7,4): 2 errors in 4 bits (50%); see the sweep after this list
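
To see why the decoder stops helping as the noise grows, here is the same word-failure calculation swept over raw BERs (again a sketch assuming independent flips):

```python
from math import comb

def p_word_fail(p):
    # Probability that a 7-bit codeword suffers 2+ flips, which a
    # single-error-correcting Hamming(7,4) decoder cannot fix.
    return sum(comb(7, k) * p**k * (1 - p)**(7 - k) for k in range(2, 8))

for p in (1 / 7, 0.25, 0.5):
    print(f"raw BER {p:.2f} -> words with 2+ flips {p_word_fail(p):.2f}")
```

At a raw BER of 0.5 about 94% of words have 2 or more flips, and a wrong "correction" flips yet another bit, so at that point decoding can actually add errors.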

Somewhere during that increase in BER the two curves must cross over, which is what you are seeing on your graph. Beyond the crossover I would expect the Hamming side to be worse because of less data per error (lower actual data density): the parity bits spend transmit energy without carrying data, and the decoder can no longer keep up with the errors. I am sure you could calculate this mathematically (a rough attempt follows); doing it rigorously would take more time than I care to spend, as it just intuitively makes sense to me.
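
For what it's worth, here is a rough version of that calculation (a sketch, not a definitive derivation: it assumes coherent BPSK, hard-decision decoding, and a textbook-style approximation that a word with m > t flips leaves on the order of m + t bit errors after decoding):

```python
from math import comb, erfc, sqrt

def Q(x):
    # Gaussian tail probability
    return 0.5 * erfc(x / sqrt(2))

R, n, t = 4 / 7, 7, 1  # Hamming(7,4): rate, block length, corrects t = 1 error

def ber_uncoded(ebno):
    return Q(sqrt(2 * ebno))        # standard BPSK-over-AWGN result

def ber_coded(ebno):
    p = Q(sqrt(2 * R * ebno))       # raw BER; each coded bit carries only R*Eb
    # Approximate post-decoding data BER: a word with m > t flips is
    # decoded wrongly, leaving roughly m + t bit errors out of n.
    return sum((m + t) * comb(n, m) * p**m * (1 - p)**(n - m)
               for m in range(t + 1, n + 1)) / n

for db in range(0, 11):
    ebno = 10 ** (db / 10)
    u, c = ber_uncoded(ebno), ber_coded(ebno)
    print(f"{db:2d} dB  uncoded {u:.2e}  coded {c:.2e}  {'coded better' if c < u else ''}")
```

With these assumptions the two expressions cross in the neighborhood of 6 dB, consistent with the graph: below it, the energy spent on parity bits costs more than single-error correction recovers; above it, the correction wins.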