Thanks for the references.
The afterword to Lakoff and Johnson's book contains a short discussion of a new viewpoint: a Neural Theory of Language. Some of the references are a bit old for such a rapidly evolving field, but I will check them out.
I think that one problem, particularly for people like me doing unfunded explorations, is that the dictionary (word list) in Wolfram Language is too wimpy for anything more involved than "bag of words" computations. I found that the first edition of the OED is out of copyright, and there are PDF scans of the 20 volumes, but extracting the words into something useful is simply too big a task for one person. (The scans are not that sharp, so the error rate would be too high for accurate automation, in my opinion.) It would be great if Oxford University Press would license the OED to Wolfram, but I doubt that will happen. It might be possible to cobble together something analogous to the dictionary's usage quotations from the out-of-copyright literature that is already available as plain text, but I suspect that would need expert curation.
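For concreteness, here is a minimal sketch of the kind of bag-of-words computation the built-in word list does support (the sample text and the filtering choices are mine, just for illustration):

    vocab = WordList[];  (* built-in word list; tens of thousands of entries, nowhere near the OED's ~800,000 *)
    text = "The quick brown fox jumps over the lazy dog; the dog sleeps.";
    words = ToLowerCase /@ TextWords[text];
    bag = KeySelect[Counts[words], MemberQ[vocab, #] &]  (* counts of in-dictionary words only *)
    WordData["dog", "Definitions"]  (* terse definitions; no dated usage quotations like the OED's *)

Counting is about as far as this gets you; anything deeper, such as sense disambiguation or tracking historical usage, runs straight into the thinness of the entries.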
My suspicion is that this will be just another stage in the development of AI: great things are promised, then hardware (and possibly conceptual) limitations are hit, resulting in stasis until the next breakthrough. I am not sure that current hardware can do a decent job with an 800,000-entry dictionary carrying the kind of added information that the OED has.
The main downside is that people may make commercial use of the fruits of the current state of NLP without realizing its (severe) limitations.
Right now, this problem is on the back burner for me until I can find an economical way to use a GPU for neural nets on macOS.
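For the record, what I would like to be able to run locally is just this (a trivial sketch; as far as I know, TargetDevice -> "GPU" currently requires a supported NVIDIA/CUDA card, which is exactly the sticking point on a Mac):

    net = NetChain[{LinearLayer[1]}, "Input" -> 1];
    data = Table[{x} -> {2 x + 1}, {x, -1, 1, 0.1}];
    NetTrain[net, data, TargetDevice -> "GPU"]  (* TargetDevice -> "CPU" works everywhere; the GPU path is what I lack on macOS *)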