Dear Vitaliy,
I know that the single call did not work, whereas calling it on smaller batches did work, at least up to some small-ish number of words. In other words:
wordfreq = {#, WordFrequencyData[#]} & /@ words[[1 ;; 100]];
works and
wordfreq = Select[Transpose[{words, Normal[WordFrequencyData[words]][[All, 2]]}], NumberQ[#[[2]]] &];
appears to time out because of
WordFrequencyData[words]
I can cut everything into tiny pieces and do it 100 at a time, which works OK but is very slow. If I call it on the entire list of words, it always times out. I can do something like
WordFrequencyData[words[[1 ;; 100]]]
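Along those lines, a minimal sketch of automating the chunked calls might look like this (batch size of 100 chosen arbitrarily, and assuming the timeout happens per request, so that merging the per-batch associations gives the same result as the single call):
(* sketch: query in batches and merge the per-batch associations *)
batches = Partition[words, UpTo[100]];
wordfreqAssoc = Join @@ (WordFrequencyData /@ batches);
wordfreq = Select[Normal[wordfreqAssoc], NumberQ[Last[#]] &];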
But chunking like that kind of defeats the purpose. I am just running an alternative approach to see whether I can get around the problem. The main idea is to use frequency data from a corpus. I guess I am still making a mistake in interpreting the data (there is one bit I don't understand yet), but this is the outline of the procedure. I use frequency data from the British National Corpus from this website. The data set is this one. It also contains information about the frequency of the different grammatical variations of the words. If I import the file
wordfreqsBNC = Import["/Users/thiel/Desktop/1_1_all_fullalpha.txt", "TSV"];
and then clean it up a wee bit:
wordfreqsBNC = {If[#[[2]] != "@", #[[2]], #[[4]]], #[[6]], #[[7]]} & /@ wordfreqsBNC;
it looks like this:
wordfreqsBNC[[-27502 ;; -27490]] // TableForm

I believe that the second column is a sort of number of occurrences. If that is right, this should do the trick:
wordfreqsBNCinDictionary =
  Select[wordfreqsBNC, (DictionaryWordQ[ToString[#[[1]]]] &&
      ! StringContainsQ[ToString[#[[1]]], CharacterRange["0", "9"]]) &];
allcharacters = {#[[1, 1]], Total[#[[All, 2]]]} & /@
   GatherBy[
    Flatten[Thread @ {ToLowerCase[Characters[#[[1]]]], #[[2]]} & /@
       wordfreqsBNCinDictionary[[All, {1, 2}]], 1], First];
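Just to illustrate what the Thread/GatherBy step does, here is a toy example with two made-up entries; each character inherits the occurrence count of its word, and the counts are then totalled per character:
toy = {{"the", 100}, {"tea", 40}};
Flatten[Thread @ {ToLowerCase[Characters[#[[1]]]], #[[2]]} & /@ toy, 1]
(* {{"t", 100}, {"h", 100}, {"e", 100}, {"t", 40}, {"e", 40}, {"a", 40}} *)
{#[[1, 1]], Total[#[[All, 2]]]} & /@ GatherBy[%, First]
(* {{"t", 140}, {"h", 100}, {"e", 140}, {"a", 40}} *)
With the real data I then keep only the letters a to z and sort: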
standardchars =
  Reverse@SortBy[
    Select[allcharacters, MemberQ[CharacterRange["a", "z"], #[[1]]] &],
    Last]
This gives:

which is obviously very far from what we would expect based on your first post and the analysis of a larger number of texts...
BarChart[standardchars[[All, 2]], BarOrigin -> Bottom,
BaseStyle -> 15, ChartLabels -> standardchars[[All, 1]],
AspectRatio -> 1, PlotTheme -> "Detailed"]

There is a little problem with how I use the data. I think that the list gives one number of occurrences for the "main entry" and then further numbers for the different grammatical forms of the word. I might be doing some double counting here. I'll try to sort this out.
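If it turns out that the main-entry line already carries the summed count of its grammatical variants (which at this point is only a guess about the file format), one way around the double counting might be to keep just those main-entry rows, i.e. the ones whose second field is not "@", before the cleanup step. The variable names here are just for illustration:
(* guess: keep only the main-entry lines, assuming they already hold the summed counts *)
rawBNC = Import["/Users/thiel/Desktop/1_1_all_fullalpha.txt", "TSV"];
wordfreqsBNCmainOnly = {#[[2]], #[[6]], #[[7]]} & /@ Select[rawBNC, #[[2]] =!= "@" &];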
Cheers,
Marco