Here is a brute-force matrix method for obtaining the t-statistics. I had to rearrange the data generated in your example into the desired vectors and matrices so that the order of the t-statistics matches the output of LinearModelFit:
In[213]:= (* Generate data for LinearModelFit *)
data = Table[{3 + i + RandomReal[{-3, 7}], i + RandomReal[{-2, 5}],
i + RandomReal[{-3, 10}]}, {i, 1, 20000}];
(* Modify structure of that data for direct matrix approach *)
n = Dimensions[data][[1]] (* Number of observations *);
p = Dimensions[data][[2]] (* Number of parameters *);
x = data; (* Construct design matrix: start from a copy of the data *)
x[[All, 3]] = 1; (* Column of ones for the intercept *)
x = x[[All, {3, 1, 2}]]; (* Rearrange predictor variables to match output of LinearModelFit *)
y = data[[All, 3]]; (* Get dependent variable *)
(* Now time the two methods *)
Timing[{model = LinearModelFit[data, {x1, x2}, {x1, x2}];
model["ParameterTStatistics"]}]
Timing[{xpxInv = Inverse[Transpose[x].x]; (* x'x inverse *)
b = xpxInv.Transpose[x].y ; (* Estimates of parameters *)
mse = Total[(y - x.b)^2]/(n - p); (* Mean square error *)
t = b/(mse Diagonal[xpxInv])^0.5 (* t-statistics *)}]
Out[218]= {0.327602, {{13.9661, 40.6508, 80.0327}}}
Out[219]= {0.031200, {{13.9661, 40.6508, 80.0327}}}
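For reference, the quantities computed in the matrix code above are the standard OLS formulas (with X the design matrix, y the response, n observations, and p parameters):

```latex
b = (X'X)^{-1} X' y, \qquad
\mathrm{MSE} = \frac{(y - Xb)'(y - Xb)}{n - p}, \qquad
t_j = \frac{b_j}{\sqrt{\mathrm{MSE}\,\left[(X'X)^{-1}\right]_{jj}}}
```

The t-statistic for each coefficient is just the estimate divided by its standard error, which is why only xpxInv, b, and mse are needed.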
The matrix approach takes about 1/10 the time of LinearModelFit (if all you want are the t-statistics).
If you are repeating this while removing one variable at a time, there are things you can do to speed it up, such as computing x'x just once for all variables and then deleting the desired row and column for each reduced model.
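As a rough sketch of that idea (the helper name tStatsWithout is hypothetical, not built in): if you precompute x'x, x'y, and y.y once for the full model, then the reduced-model t-statistics after dropping variable k only require taking a submatrix and a small inverse, using the identity SSE = y'y - b'x'y:

```mathematica
(* Assumes xpx = Transpose[x].x, xpy = Transpose[x].y, and yty = y.y
   have already been computed once for the full design matrix. *)
tStatsWithout[xpx_, xpy_, yty_, n_, k_] :=
 Module[{keep, xpxInv, b, sse, mse},
  keep = Delete[Range[Length[xpy]], k]; (* indices of retained variables *)
  xpxInv = Inverse[xpx[[keep, keep]]];  (* reduced (x'x)^-1: drop row & column k *)
  b = xpxInv.xpy[[keep]];               (* reduced-model estimates *)
  sse = yty - b.xpy[[keep]];            (* residual sum of squares: y'y - b'x'y *)
  mse = sse/(n - Length[keep]);         (* mean square error *)
  b/Sqrt[mse Diagonal[xpxInv]]          (* t-statistics of the reduced model *)
  ]
```

This avoids rebuilding the n-row design matrix for each of the reduced models; only p-by-p objects are touched per call.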
And do you really have thousands of variables or thousands of observations? (Or maybe both?)