I received the following information:
"Programming languages will end up being increasingly high-level until they become identical to natural language. You will eventually be able to write all your programs in English, Portuguese, or any other natural language, although you will also be able to mix that with instructions of the kind used in today's programming languages whenever you think that's more efficient or clearer,"
Why will programming languages end up being increasingly high-level until they become identical to natural language?
When will programming languages begin to be increasingly high-level?
Are Wolfram Language programmers already using human-language-level programming?
If yes, is Wolfram Language the only human-language-level programming language that exists?
P.S. I suspect most of the readers of this post are too young to have experience programming in COBOL. Consider yourselves fortunate. Nevertheless, it was a major programming language for at least 35 years and is still in use.
I remember COBOL, though fortunately, I never had to code in it.
I think that people who make statements like this significantly underestimate the complexity and ambiguity of natural language. There is a tendency to think that all terms are as well-defined as mathematical language.
Certainly, understanding language at the level of James Joyce or most "art" poetry is beyond the reach of current AI, and everyday language is almost as complex in semantics.
I read a book about musical interpretation that stated that the notation conveyed about 35% of the composer's intent, and the performer had to supply the rest by learning about context. Where the author got that number from is unknown, but it seems about right for understanding language as well. Of course, spoken language is more ambiguous and incomplete than written language, which is way more complicated than any coding language.
As an illustration, the dictionary used by WL has about 35,000 words and is at about the level of one of those cheap paperback dictionaries students used to use. To make any sense of English in its full expression, you really need the equivalent of the Oxford English Dictionary, with its hundreds of thousands of entries, plus historical context and multiple definitions for many words. As I recall, the definition for "is" runs for several pages.
One of the primary complaints of programmers (not just newbies) is that the program did what they typed, not what they meant. We can tell a person "you know what I mean" and most of the time they do, despite syntax errors and ambiguity, because of a shared cultural history and years of experience. I don't think that a computer will have an equal level of understanding until it can share the same cultural history and experience.
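A minimal illustration of "did what they typed, not what they meant" (a classic Python pitfall, chosen here as an example; it is not from the original discussion): a mutable default argument is evaluated once, so every call shares the same list, which is almost never what the author intended.

```python
# The programmer "means" a fresh list per call; the language does what
# was typed: the default list is created once and shared across calls.
def append_item(item, items=[]):
    items.append(item)
    return items

first = append_item(1)
second = append_item(2)
print(second)   # [1, 2], not the [2] the programmer likely meant
print(first is second)  # True: both names refer to the one shared list
```

A human collaborator would likely infer the intent from context; the compiler faithfully executes the literal text.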
Try using Wolfram|Alpha (a system that takes natural language input only) for something non-trivial and get it to do exactly what you need. I think that experience will answer your question.
It is a constant source of frustration that W|A does not return what I expect in many cases. I have had many discussions about this issue with tech support since the service was implemented.
In addition, the W|A "universe" is a small subset of common discourse. You get far too many "I am not programmed to respond in that area"-type answers, even to what might be considered technical inquiries.
W|A does a fairly good job at providing the type of information that was in the old CRC handbooks or a decent almanac. You need patience to phrase the query in the form that W|A's parsers can handle.
However, this is by no means close to full natural language discourse.
I will know that there has been progress when you can ask W|A (or, more probably, its descendant) about the meaning of a phrase from Finnegans Wake and have it provide a meaningful answer on the fly (that is, not a canned response from a human authority).
For now, I am satisfied that it is a decent interface to the CRC tables and Information Please, and this is a major accomplishment. However, it does not cover the full range of natural language.
Your point that W|A is the 21st century version of the CRC handbooks (or Abramowitz and Stegun) with a natural language interface is an excellent one. One area of advance over such handbooks is that W|A can use the information in its repository to make calculations expressed in natural language terms. But the capability to perform calculations falls far short of the capability to write computer programs.
I think a lot of people confuse those two disparate capabilities. I know that the corporate executive I mentioned in my first post in this thread suffered from exactly such a confusion, but even allowing him that confusion, he was vastly over-optimistic when he thought something like W|A would emerge by 1985.
I believe the time when one can sit down in front of a screen and build a program with the full collaboration of an artificial intelligence — and that is what it will take to really have natural language programming — is so far off that I will never experience it. Frankly, I'm not even sure that it can be accomplished at a practical level, i.e., for everyday use rather than as a laboratory curiosity.
[quote="Morton Goldberg, Retired"]
Please provide a reference for the quotation that appears in your question. I would like to know the author, where it was published or expressed, and its date.
The following link is the reference to the quotation that appears in my question: https://forum.osdev.org/viewtopic.php?p=286902#p286902
[quote="George Woodrow III, Freelance Mathematician"]
I think that people who make statements like this significantly underestimate the complexity and ambiguity of natural language.
@George Woodrow III,
An intelligent compiler will ask for clarification whenever there’s an ambiguity and may suggest improved wordings to resolve the issue. Writing a program will end up being a conversation with an intelligent machine which anyone could handle even if they know nothing about programming - it will be a collaboration with an intelligent system which is in itself an expert programmer. The error messages will be comments and questions just like the ones you’d get if you were co-writing a program with a human programmer. (“When you say “print the result of that part”, do you mean this part [a section of the code is highlighted], and do you want it printed to the screen or the printer?”)
None of that will stop you putting in a line of C or any other programming language if you want to, but most of the work will simply be done in natural language, typically at a much higher level with the compiler working out how to carry out the tasks asked of it. The end user will also become a programmer, telling the machine how (s)he would prefer things to be done, and the machine will comply. That will rarely be done through anything other than natural language.
Where natural language is ambiguous, the machine can simply ask for clarification to make sure it has understood the instruction the right way, and if it hasn’t, it can help the programmer improve the wording of the instruction.
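The clarification loop described above can be sketched in a few lines. This is a toy illustration under invented assumptions (the `AMBIGUOUS_TERMS` table, the `compile_instruction` function, and the scripted user replies are all hypothetical stand-ins, not any real system): when an instruction contains a term with more than one reading, the "compiler" asks the user to choose instead of guessing.

```python
# Hypothetical sketch of an "intelligent compiler" that asks for
# clarification on ambiguous terms rather than picking a reading itself.
AMBIGUOUS_TERMS = {
    "print": ["write to the screen", "send to the printer"],
    "sort": ["ascending order", "descending order"],
}

def compile_instruction(instruction, ask):
    """Resolve each ambiguous term in the instruction by asking the user."""
    choices = {}
    for term, readings in AMBIGUOUS_TERMS.items():
        if term in instruction:
            options = " or ".join(readings)
            choices[term] = ask(f"When you say '{term}', do you mean {options}?")
    return choices

# A scripted "user" stands in for the interactive conversation:
replies = iter(["write to the screen"])
resolved = compile_instruction("print the result", lambda q: next(replies))
print(resolved)  # {'print': 'write to the screen'}
```

The hard part, of course, is not the dialogue loop but detecting the ambiguity in the first place, which is exactly where the complexity of natural language lives.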
Not sure what your background is (no info in your profile), but here goes.
I'm not sure that what you describe is even possible with our current understanding of epistemology. Stuff like what you describe is shown on Star Trek all the time, but so is warp drive: just because something can be described does not make it possible.
I daresay that what you describe is idealized compared to current interactions with human programmers. I have dealt with these conversations from both sides: as a coder and as a 'customer' or as a designer.
There is also some question about what you mean by "natural language". As it is used in the CS community, it seems to refer to some type of language that resembles English (typically) but is restricted in scope or domain. For me, the term "natural language" means all human-human interaction, from everyday conversation to legal contracts, to Milton, Shakespeare, Austen, and T. S. Eliot. In reality, it should include conversations in French, Greek, Urdu, or Sanskrit as well, but while a considerable number of people are fluent in two or more languages, that does not seem to be the norm in the US or UK.
I stick with my assertion that real language is much more complex than what most computer scientists consider to be "natural language". Those of us who are devoted to the arts and humanities must not allow that doppelgänger to take the place of the medium we love so well, even if we use it imperfectly.
No. I have been using Mathematica for 29+ years, and I can state unequivocally that it is not a human-level language.
If you consider a scale where 0 is machine code and 1 is Star Trek-style programming, I'd guess we are at about the 0.1 level. Compared to assembly code (which I programmed in), Wolfram Language is a big advance, and even compared to C or Swift it is more expressive, but it is by no means close to human-level language.
Each era has its own hype about "natural language programming", and certainly COBOL was more natural than Assembly or even FORTRAN, and so forth. This is understandable, of course, but it is still hype.
[quote="George Woodrow III"]
There is also some question about what you mean by "natural language".
@George Woodrow III,
I said natural language meaning human language, for example, English, the language that English-speaking humans use when they are speaking to other English-speaking humans.
I had an interesting discussion at the recent Wolfram conference.
Someone used the example of a quote that President Obama used on several occasions: "the last, full measure of devotion".
The first question was whether a natural language processing system should be able to tell that this quote is from Abraham Lincoln. Given access to the Gettysburg Address and enough time, it is probably within the realm of possibility for current systems, although it would probably have trouble with quotes that are not exact.
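That first task is indeed roughly tractable today. As a hedged sketch (the `find_quote` helper and the abridged Gettysburg excerpt are illustrative, not any production system), standard-library fuzzy matching can locate an inexact quote, including the comma the spoken version adds:

```python
# Sketch: locating an inexact quote in a source text with stdlib difflib.
# The excerpt from the Gettysburg Address below is abridged.
import difflib

gettysburg = (
    "that from these honored dead we take increased devotion to that cause "
    "for which they gave the last full measure of devotion"
)

def find_quote(quote, source):
    """Slide a window of the quote's length over the source; score similarity."""
    words = source.split()
    n = len(quote.split())
    best = (0.0, "")
    for i in range(len(words) - n + 1):
        candidate = " ".join(words[i:i + n])
        score = difflib.SequenceMatcher(
            None, quote.lower(), candidate.lower()
        ).ratio()
        if score > best[0]:
            best = (score, candidate)
    return best

score, match = find_quote("the last, full measure of devotion", gettysburg)
print(match)  # the last full measure of devotion
```

String similarity gets you this far; it gets you nowhere on the second question below, which is about meaning rather than surface form.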
The second question was far more interesting. Should a natural language processing system know that the phrase was talking about death? Even if a native English speaker did not recognize the source of the quote (or that it was a quote in the first place), it is highly probable that the person would be able to infer that the phrase referred to death.
As far as I know, no "natural language programming environment" would have a clue.
As far as I know, no AI system has even a rudimentary grasp of metaphor. Far from being merely a literary tool, metaphor is a key component of everyday speech. There is a vast literature on the role of metaphor in language, which I will not rehearse here.
Certainly, W|A doesn't have a clue about metaphor. Wolfram Language does not, either.
I have started reading a book that may be pertinent to this thread:
Language at the Speed of Sight: How We Read, Why So Many Can't, and What Can Be Done About It
Mark Seidenberg, Basic Books, 2017
I have only completed the first couple of chapters, but the discussion is quite relevant to this topic. Later chapters focus on how we learn to read, and that may be relevant for a different reason.
The examples that the author uses make it clear that it is not just AI's inability to recognize common metaphors, but the more immediate ambiguities of language that are at issue in "natural" language processing. Things such as words whose meaning changes depending on which syllable is stressed make AI understanding difficult.
My concern here is that people in the field (as well as people merely wanting to exploit the nascent technology) end up redefining the term to mean something that is tractable with current technology, forgetting that it is a small subset of everyday language.
I believe that Wittgenstein discusses this problem in his Blue and Brown books or Philosophical Investigations, but it's been a while and I cannot recall exactly what he said. It is on my list of things to do to re-read this material.
Anyway, the book I recommend is quite interesting.
I'm unsure whether Apple's Siri (or MS Cortana) can be enabled in Mathematica to do anything beyond dictation or simple commands (i.e., whether "integrate" is in its vocabulary of loaded special actions).
All language becomes binary in the end, and conversion time is an issue. Coders don't like being "too distant" from machine-level programming because there is the real possibility that the resulting algorithms never finish (big O). Simplicity is good in limited situations; Mathematica's power goes far beyond the simple side of things.
Unix sh (bash), if I remember, was developed by NASA engineers to be a human-readable scripting language because a previously used language had more than once caused "an issue" due to legibility. Mathematica has better legibility than sh, of course :)
You can say "let's add a bot" that understands I'm doing a specific schoolbook problem. That's a lot of bots, and it could end up being gigabytes of extra install.
You can say "let's add some chemistry formulas and computation" or "celestial sciences", and they did: it's kept out of the normal distribution (the package is pulled on request) because it's just too big. And some of the data owners also don't agree to give users the right to "just download and keep" it all.
A better question is how, with terabyte drives, we are still in the position of "trying to stay thin"?
Thanks for the responses!
Good luck to all!