I have a set of data points $x_i$ and $y_i$ with individual errors $d_i$ and $e_i$, and I make a linear fit through the data of the form $y = ax + b$. How do the errors $d_i$ and $e_i$ translate into errors on $a$ and $b$? So far I can only find errors on $a$ and $b$ based on the residuals of the fit.
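For reference, this is a minimal sketch (in Python with numpy, using placeholder data) of the residual-based estimate I currently use; note that it ignores the individual errors $d_i$ and $e_i$ entirely:

```python
import numpy as np

# placeholder data standing in for (x_i, y_i); d_i and e_i are not used here
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# ordinary least-squares fit y = a*x + b
# with cov=True, polyfit scales the covariance matrix by the residuals,
# so the resulting uncertainties reflect only the scatter about the line
(a, b), cov = np.polyfit(x, y, 1, cov=True)
sigma_a, sigma_b = np.sqrt(np.diag(cov))

print(f"a = {a:.3f} +/- {sigma_a:.3f}")
print(f"b = {b:.3f} +/- {sigma_b:.3f}")
```

What I am looking for is how to propagate the known per-point errors into the uncertainties on $a$ and $b$ instead of (or in addition to) this residual-based estimate.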