Let $x^2 + bx + c$ be a quadratic over a field $k$ with $\operatorname{char}(k) = 2$ and $b \neq 0$. Setting it equal to $0$, we can divide by $b^2$ and substitute $y = x/b$ to get $y^2 + y + c/b^2 = 0$.
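
Spelled out, the substitution is just the expansion
$$x^2 + bx + c = (by)^2 + b(by) + c = b^2\Bigl(y^2 + y + \frac{c}{b^2}\Bigr), \qquad x = by,$$
so the roots of the original quadratic are exactly $b$ times the roots of the normalized one.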

The exercise actually says we may assume there is some $d \in k$ such that $d(d+1) = b^{-2}c$. After looking this up, it seems to come from Hensel's lemma: in characteristic $2$ the derivative $\frac{d}{dy}(y^2 + y + c/b^2) = 2y + 1 = 1$ is a unit, which is the hypothesis Hensel's lemma needs to lift a root, giving $c/b^2 = d(d+1)$ for some $d$. But I don't understand Hensel's lemma well enough to say how or why this works.

In any case, we now have $y^2 + y + d^2 + d = 0$. Let $f : k \to k$ be the map defined by $f(n) = n^2 + n$. Since $f(n) = n(n+1)$, its only roots are $0$ and $1$. In addition, $f$ is additive, because $(n+m)^2 = n^2 + m^2$ in characteristic $2$: $f(n+m) = (n+m)^2 + n + m = n^2 + m^2 + n + m = f(n) + f(m)$. Our equation says $f(y) = d^2 + d = f(d)$ (note $-1 = 1$ here), so $f(y+d) = f(y) + f(d) = 2f(d) = 0$. Hence $y + d$ is a root of $f$, which implies $y = d$ or $y = d + 1$.
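
As a sanity check (not part of the proof), here is a small Python brute force of these three facts over $\mathbb{F}_4$, encoding elements as pairs $(a, b)$ meaning $a + b\omega$ with $\omega^2 = \omega + 1$; the encoding and the helper names are my own, not from the exercise.

```python
# Sanity check over GF(4) = {a + b*w : a, b in GF(2)}, with w^2 = w + 1.
# Elements are pairs (a, b); these helpers are ad hoc, not a library API.
from itertools import product

def add(p, q):
    return ((p[0] + q[0]) % 2, (p[1] + q[1]) % 2)

def mul(p, q):
    a, b = p
    c, d = q
    # (a + b*w)(c + d*w) = ac + (ad + bc)w + bd*w^2, and w^2 = w + 1
    return ((a*c + b*d) % 2, (a*d + b*c + b*d) % 2)

def f(n):
    return add(mul(n, n), n)  # f(n) = n^2 + n

F4 = list(product((0, 1), repeat=2))
ZERO, ONE = (0, 0), (1, 0)

# f has exactly the roots 0 and 1
assert [n for n in F4 if f(n) == ZERO] == [ZERO, ONE]

# f is additive
assert all(f(add(n, m)) == add(f(n), f(m)) for n in F4 for m in F4)

# for every d, the solutions of y^2 + y = d^2 + d are exactly d and d + 1
for d in F4:
    assert {y for y in F4 if f(y) == f(d)} == {d, add(d, ONE)}

print("all checks pass over GF(4)")
```

Brute force over one small field proves nothing in general, of course, but it catches characteristic-$2$ sign slips quickly.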

Thus $x^2 + bx + c$ has the roots $bd$ and $b(d+1)$, where $d(d+1) = b^{-2}c$.
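
For a concrete instance (my example, not from the exercise): over $\mathbb{F}_4 = \mathbb{F}_2(\omega)$ with $\omega^2 = \omega + 1$, take $b = c = 1$. Then $d = \omega$ works, since $d(d+1) = \omega^2 + \omega = 1 = b^{-2}c$, and indeed
$$x^2 + x + 1 = (x + \omega)(x + \omega + 1)$$
in $\mathbb{F}_4[x]$, so the roots are $bd = \omega$ and $b(d+1) = \omega + 1$.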

Is my reasoning in this solution correct? And is there a way to express the roots in terms of $b$ and $c$ alone?

Are you missing something? When $k = \mathbb{Z}/2\mathbb{Z}$ and $b = c = 1$, the quadratic $x^2 + x + 1$ is obviously irreducible, so no such $d$ exists. Or is the existence of such a $d$ a hypothesis to begin with? (In that case it's trivial to solve the quadratic, since polynomials over a field factor uniquely.)
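
A quick brute force confirms this; a minimal sketch with $b = c = 1$ hard-coded:

```python
# Counterexample check over k = Z/2Z with b = c = 1.
k = (0, 1)

# x^2 + x + 1 has no root in k (so, being a quadratic, it is irreducible) ...
assert all((x*x + x + 1) % 2 != 0 for x in k)

# ... and no d in k satisfies d(d+1) = b^(-2) c = 1
assert all((d * (d + 1)) % 2 != 1 for d in k)

print("x^2 + x + 1 is irreducible over Z/2Z and no such d exists")
```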