# Why isn't my equation evaluating correctly?

So I'm making a BCH decoder for a uni project in Python. It takes a 10-digit codeword, with or without errors, and is supposed to detect and correct any errors it finds. For double error correction there is a mathematical formula for the position and magnitude of these errors:

```
i = (-Q + (Q^2 - 4*P*R)^(1/2)) / (2*P)
j = (-Q - (Q^2 - 4*P*R)^(1/2)) / (2*P)

b = (i*s1 - s2) / (i - j)
a = s1 - b
```

I have already correctly calculated P, Q, R and s1, s2.
So, I'm debugging with this example (2-error BCH example). The number `8888880747` is transmitted as `8899880747`, so there are `2` errors in positions 3 and 4, both with a magnitude of 1. So far my program generates the correct syndromes (`s1` to `s4` = `2,7,3,3`) and the correct P, Q, R values (`10,7,10`), but when doing the calculation for `i` and `j` I get different values from the example: `10.7` and `10.6` as opposed to `3` and `4`. Here is my code for `i` and `j`:

```
#work out error positions i and j
sqrt = ((Q**2 - 4*P*R) % 11)**(1/2)
i = (((-Q+sqrt)/(2*P))%11)
j = (((-Q-sqrt)/(2*P))%11)
```

Can anyone see what I'm doing wrong? Thanks.

The operations are meant to be carried out in `GF(11)`, in other words they are meant to be carried out mod 11. Square roots and division mod 11 work differently than they do in the real numbers: `**(1/2)` gives a real-valued root, and `/` gives a real-valued quotient, which is why you end up with non-integer results like `10.7`. In `GF(11)` you need a modular square root and a modular multiplicative inverse instead.
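As a sketch of what that looks like in practice, here is one way to solve the quadratic in `GF(11)`, using a brute-force modular square root and Fermat's little theorem for the inverse (the variable names `P, Q, R, s1, s2` are taken from the worked example in the question; the helper functions are illustrative, not part of any library):

```python
# Values from the question's worked example (8888880747 -> 8899880747)
P, Q, R = 10, 7, 10
s1, s2 = 2, 7

def sqrt_mod11(n):
    """Return a square root of n mod 11 by brute force, or None if n
    is a non-residue (in which case there are not two correctable errors)."""
    n %= 11
    for x in range(11):
        if (x * x) % 11 == n:
            return x
    return None

def inv_mod11(n):
    """Multiplicative inverse mod 11. Since 11 is prime, Fermat's little
    theorem gives n^(11-2) = n^9 as the inverse of n."""
    return pow(n % 11, 9, 11)

# Quadratic formula, but every operation stays inside GF(11):
disc = (Q * Q - 4 * P * R) % 11
root = sqrt_mod11(disc)
i = ((-Q + root) * inv_mod11(2 * P)) % 11   # error position 1
j = ((-Q - root) * inv_mod11(2 * P)) % 11   # error position 2

# Error magnitudes, again with modular inverse instead of /
b = ((i * s1 - s2) * inv_mod11(i - j)) % 11
a = (s1 - b) % 11

print(i, j, a, b)   # positions 3 and 4, both magnitude 1
```

With the example's values this yields `i = 3`, `j = 4` and magnitudes `a = b = 1`, matching the expected answer. (Note that a modular square root, like a real one, has two values, `root` and `11 - root`; picking the other one just swaps `i` and `j`.)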