Is it possible for a sequence of independent random variables to converge in probability even when their variances don't converge?

Suppose that P(X\_n = 0) = 1 - 2/n, P(X\_n = n) = 1/n, and P(X\_n = -n) = 1/n. Then I believe that the X\_n converge in probability to 0 because, for every e with 0 < e < 1/2,

lim\_n  P( |X\_n|> e) = lim\_n (2/n) = 0,

but

Var(X\_n) = (2/n)\*n\^2 = 2n.

(Edit: added "for all e between 0 and 1/2" and fixed  |X\_n|> e inequality.)
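As a quick numerical sanity check of this first example, here is a short Monte Carlo sketch (the sampling function name and sample sizes are my own choices, not from the post): for each n it estimates P(|X\_n| > e), which should shrink like 2/n, and the sample variance, which should grow like 2n.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_X(n, size):
    """Draw `size` independent copies of X_n, where X_n = +n w.p. 1/n,
    X_n = -n w.p. 1/n, and X_n = 0 w.p. 1 - 2/n (valid for n >= 2)."""
    u = rng.random(size)
    x = np.zeros(size)
    x[u < 1/n] = n
    x[(u >= 1/n) & (u < 2/n)] = -n
    return x

eps = 0.25
for n in [10, 100, 1000]:
    xs = sample_X(n, 200_000)
    p_exceed = np.mean(np.abs(xs) > eps)  # should be close to 2/n
    var = xs.var()                        # should be close to 2n
    print(f"n={n}: P(|X_n|>{eps}) ~ {p_exceed:.4f}, Var ~ {var:.1f}")
```

The estimated exceedance probability falls toward 0 while the sample variance keeps growing, matching the claim that convergence in probability does not require the variances to converge.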
I believe that the example below shows that it is possible for Var(X\_n) to go to infinity while the X\_n converge almost surely.

Suppose that Y is uniformly distributed over the interval (0, 1), and assume that

P(X\_n = n) = 1/2 and P(X\_n = -n) = 1/2 if Y < 1/n, and

X\_n = 0 if Y >= 1/n,

for all positive integers n.


For any positive integer N,

P( X\_n = 0 for all n >= N) = P( Y >= 1/n for all n >= N) = P( Y >= 1/N) = 1 - 1/N,

since 1/n <= 1/N for all n >= N,

so

P( lim\_n X\_n = 0) >= 1-1/N for all N,

thus the X\_n converge almost surely to 0, and

Var(X\_n) = (1/n)\*n\^2 = n.
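The coupling in this second example can also be sketched in code (function names and path lengths are my own, not from the post): a single draw of Y drives the entire sequence, so the nonzero terms of any simulated path form an initial segment and every path is 0 from some index on.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_path(N):
    """Draw Y ~ Uniform(0,1) once, then build X_1, ..., X_N from it:
    X_n = +n or -n (fair sign) if Y < 1/n, and X_n = 0 otherwise."""
    y = rng.random()
    signs = rng.choice([-1, 1], size=N)
    ns = np.arange(1, N + 1)
    return np.where(y < 1/ns, signs * ns, 0)

path = sample_path(1000)
# Because 1/n is decreasing, Y < 1/n can only hold for an initial block
# of indices, so each realized path is eventually identically 0.
print("nonzero terms:", np.count_nonzero(path))
```

This illustrates the almost-sure convergence: the tail of every path is 0, even though Var(X\_n) = n grows without bound, because the X\_n here are built from one shared Y and are far from independent.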


In the example above, the X\_n are strongly dependent. I wonder whether the following is true or false:

"If  Var(X\_n) is not bounded and the X\_n are independent, then the X\_n do not converge almost surely."
