As Henrik shows, Mathematica is performing properly.
However, you really need to consider a better model or additional data. If "prediction" is the most important objective, then ListInterpolation works well here, since there is so little variability about the line:
(* interpolate the observed values as a function of wavelength *)
f = ListInterpolation[FresnelFit1[[All, 2]], {FresnelFit1[[All, 1]]}];

(* overlay the data and the interpolating function *)
Show[{
  ListPlot[FresnelFit1, Frame -> True,
   PlotRange -> {Automatic, {0.0071, 0.0077}},
   PlotStyle -> {Red, PointSize[0.01]}],
  Plot[f[\[Lambda]], {\[Lambda], Min[FresnelFit1[[All, 1]]],
    Max[FresnelFit1[[All, 1]]]}, PlotStyle -> Blue]},
 ImageSize -> Large]

For noisier data a nonparametric fit, such as a GAM (generalized additive model), would be better. However, as far as I know, Mathematica does not supply a GAM function or other nonparametric fits (Developers: please!).
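
In the absence of a built-in GAM, a simple kernel smoother can serve as a rough nonparametric stand-in. A minimal sketch using a Nadaraya-Watson (Gaussian-kernel) smoother on FresnelFit1; the function name nwSmooth and the bandwidth choice are my own placeholders, and the bandwidth would need tuning (e.g., by cross-validation):

(* Nadaraya-Watson (Gaussian-kernel) smoother; a rough stand-in for a GAM *)
nwSmooth[data_, h_][x_] :=
 Module[{w = Exp[-((x - data[[All, 1]])^2)/(2 h^2)]},
  w.data[[All, 2]]/Total[w]]

(* bandwidth as a fraction of the wavelength range -- an assumption to be tuned *)
h = 0.05 (Max[FresnelFit1[[All, 1]]] - Min[FresnelFit1[[All, 1]]]);
g = nwSmooth[FresnelFit1, h];

Plot[g[\[Lambda]], {\[Lambda], Min[FresnelFit1[[All, 1]]],
  Max[FresnelFit1[[All, 1]]]}, PlotStyle -> Blue]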
If interpretation of the coefficients is the more important objective, then you need a different model: the fit is not good, and there is no point in interpreting or reporting coefficient estimates from a poorly fitting model. Something must be missing either in the model or in the description of how the data are actually generated.
My experience is in fields where large amounts of variability are the norm. If I saw such data in one of those fields, I would wonder whether the observations are highly serially correlated; in that case NonlinearModelFit is inappropriate, as it does not allow for serial correlation. Is the data from a single run of some measurement process? If so, I'd say you have a sample size of one and would need replication of the measurement process to fight off the claim (at least potentially) that the model as presented is not a good fit. But here, too, with multiple sets of measurements, NonlinearModelFit does not handle mixed models (i.e., more than one source of variability: between sets of measurements plus serial correlation within sets). Adding mixed models to Mathematica's repertoire would also be great.
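
If you want a quick check for serial correlation, the residual autocorrelation is a reasonable place to start. A minimal sketch, assuming nlm is the FittedModel returned by NonlinearModelFit in the question (the name nlm is just a placeholder):

(* residuals from the fitted model *)
res = nlm["FitResiduals"];

(* lag-1 sample autocorrelation of the residuals: values well away from 0
   suggest serial correlation that NonlinearModelFit does not account for *)
Correlation[Most[res], Rest[res]]

(* Durbin-Watson-style statistic: values near 2 indicate little lag-1
   correlation, values near 0 indicate strong positive correlation *)
Total[Differences[res]^2]/Total[res^2]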