Message Boards


Why are some professors negative on Mathematica?

Posted 10 years ago

In discussions with several university physics professors, all theoreticians, they profess a strong dislike for Mathematica and caution their students about using it. I can fully understand cautioning a user to make sure they have used the correct syntax or correctly formulated a problem or model, but their caution was stronger than that. They, in effect, prefer to roll their own algorithms. They also claim Mathematica gives bad results, though it was unclear whether the fault lay in the execution or the formulation of a problem.

I am troubled by this attitude, since users in all technical disciplines use Mathematica and rely on it to supply solutions to various designs, models, and analyses, some that are mission critical.

I performed a literature search on evaluation of Mathematica and the most recent published critique and evaluation I found was for Mathematica 5. Other than this, there does not appear to be an undercurrent of suspicion except from these specific profs.

What is going on? I can defend my mathematical models but I cannot defend the outcomes of executions of these models if there is skepticism over the validity of solutions obtained by Mathematica. I can also understand that anyone who has not come up on the learning curve might simply be covering their own inadequacies, hence the attitude of rolling their own. However, everyone should be skeptical of published results from the use of personal algorithms, for which no validation or user community exists.

How can the quality of the results from the use of Mathematica be supported? Are there published evaluations? What might some organizations such as DARPA do to validate some work for which Mathematica has been a cornerstone of the analyses?

Or am I the only person to have run into this level of skepticism...which is ironic since I am more skeptical of analytical results than most.

POSTED BY: Luther Nayhm
31 Replies

Something that I learned years ago (very likely from Feynman, or quoted about him to me by someone else) is that, when looking at a dataset in a presentation and listening to the presenter's description of the meaning of the data, in your mind drop the data points at the extremes of the graph and decide if the conclusions still make sense. Why? The data at the extremes are often there because the experiment stopped being trustworthy near those limits, so the strength of those points would often be in question....

POSTED BY: David Reiss

Some comments on "attitude" and "culture". One of the greatest influences on me has been the books on data graphics by Edward Tufte. It is one reason I believe in writing literate notebooks with textual explanations and multiple, carefully designed presentations. Tufte writes: "Those who discover an explanation are often those who construct its representation." Mathematica gives you plenty of tools for constructing representations and exploring their behavior.

And why textual explanation? Because if you can't give a simple clear explanation it should leave you a bit skeptical as to your level of understanding. The same goes for documentation of a routine. If it's difficult to document maybe it should be redesigned or scrapped for some other routine. Textual explanation and documentation are not onerous chores, they are part of the learning or creative process.

I tend to eschew the terms "code" and "programming" in favor of terms such as "writing definitions, axioms, rules and specifications". After all, we're trying to do math and science. WRI can hire the programmers. It's a matter of attitude and culture.

Lots of people in academia write papers and even books as part of the process of better learning the material. Do you think they always knew it all before they started? So even if it's self-study, writing literate notebooks is good practice. And they might very well be good enough to interest other people. The same applies to having students write literate notebooks, even if they are relatively short. When they're finished they actually have something to show off. Science and technology are no good if they can't be communicated.

Also, if students can design new and clear presentations along with explanations, they are not only learning but they are adding value because these do not always exist together now.

And thanks to Luther for starting this discussion.

Posted 10 years ago

This is not intended to criticize Edward Tufte or diminish any of the fine work he has done. If you carefully study each of his books you might notice that there is one thing which has not been included. There is not a hint of uncertainty in any of the graphics that he presents. For one example, I cannot imagine in the fog of war in 1812 that Napoleon's march on Moscow was anywhere near as precise and tidy as the famous graphic leads our minds to believe. Yes, the graphic does round the numbers to thousands, but even that I suspect is a misrepresentation.

I spent time some years ago trying to get vendors to use all the newfound computing power to collect from the user, and then display, the uncertainty associated with graphical presentations. Yes, Mathematica keeps track of the precision associated with some numeric quantities, but Plot does not then display graphics that are just fuzzy enough to force us to realize how much we don't know. I don't think the classic error bars are the answer, but fuzz might be.
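As a rough sketch of the idea (the model and the plus-or-minus 10% band here are made-up numbers, not a recommended method), Plot's Filling option can render a translucent uncertainty band around a curve, a soft "fuzz" rather than hard error bars:

```mathematica
(* Hypothetical model with a made-up 10% uncertainty band;
   the band is drawn as translucent filling between the bounds *)
model[x_] := Exp[-x/5] Sin[x];
Plot[{1.1 model[x], model[x], 0.9 model[x]}, {x, 0, 10},
  Filling -> {1 -> {2}, 3 -> {2}},
  FillingStyle -> Directive[Opacity[0.2], Gray],
  PlotStyle -> {None, Thick, None}]
```

The central curve stays crisp while the gray region keeps the reader honest about what is actually known.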

Several simulation tool vendors will quietly admit that their calculations are really not as precise as their tool displays. One vendor even suggested that I diddle the coefficients a little each time, do ten runs, print all the results, tape them to a window and try to guess what the uncertainty really is.
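That vendor's "diddle the coefficients" advice can at least be automated instead of taped to a window. This sketch (the model and perturbation sizes are invented for illustration) overlays ten runs with randomly perturbed coefficients so the spread is visible in a single figure:

```mathematica
(* Invented model; each of ten runs perturbs the coefficients slightly,
   and Show overlays the translucent curves to reveal the spread *)
runs = Table[
   With[{a = 1 + RandomReal[{-0.05, 0.05}],
         k = 0.2 + RandomReal[{-0.01, 0.01}]},
     Plot[a Exp[-k x] Sin[x], {x, 0, 10},
       PlotStyle -> Directive[Opacity[0.3], Blue]]],
   {10}];
Show[runs, PlotRange -> All]
```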

I listened to an invited speaker years ago. He presented slide after slide of very impressive graphics showing the behavior of something. He was asked if it would be possible to include an indication of the uncertainty in his slides. He replied no, because if he did then it would be obvious that his results had no predictive power. I suspect even the audience members who already knew this still made incorrect assumptions when looking at the graphics.

Consider how many predictions we see for projects large and small. We almost never see the uncertainties of the inputs or outputs, or any indication of the uncertainty associated with the project's success or failure. The only current use of uncertainty is to claim there is none if it is my project, or that it is a stupid idea we shouldn't pursue if it is yours.

It seems that everyone wants to know "the number"; nobody wants to hear that there is any uncertainty. Everyone wants to see the graph with the spidery little line showing the economy going smoothly off to infinity. There was a book not too long ago about how everyone wants to be told the number, usually the mean, and how badly we err when we don't see and plan for the uncertainties. Ah! Found it! The Flaw of Averages: Why We Underestimate Risk in the Face of Uncertainty by Sam Savage, Jeff Danziger and Harry Markowitz. This should not be taken as license to make up numbers for the uncertainties to go along with the numbers we made up for the predictions.

I considered writing a letter to Mr. Tufte asking him if in his next book he might introduce the world to the knowledge that nothing in the real world is precise, when we don't know the uncertainty that we reach false conclusions and we should all begin demanding credible uncertainty information be included in graphics. But the last word I heard is that he is deep into what will likely be his final book and I doubt that he could incorporate such a dramatic change in presentation now. If Mr. Tufte can't be the one to do this then perhaps someone else can.

POSTED BY: Bill Simpson

If results had error bars, and regression analyses showed the correlation coefficient, a lot of what passes for "knowledge" in sociology, psychology, and medicine would be seen as very tenuous at best.

I think that this is a major cultural problem. No one gets through Physics 1 or any decent engineering course without learning that any measurement that lacks an estimate of error is useless. I used to work for a clinical lab, and the entire notion of knowing what the error bars on a result were was antithetical to the thinking of both the management of the company and the clients.

As I recall, Tufte does discuss error, and I believe that there is a section where he shows radically different sets of data give identical least squares fits. However, this is not the main thrust of his discussion, as you point out.
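Getting error estimates and correlation strength out of a fit is not much extra work in Mathematica. A minimal sketch with synthetic data (the slope, intercept, and noise level here are made up):

```mathematica
(* Synthetic noisy linear data, then a fit that reports its own errors *)
SeedRandom[1];
data = Table[{x, 2 x + 1 + RandomReal[{-1, 1}]}, {x, 0, 10}];
fit = LinearModelFit[data, x, x];
fit["ParameterTable"]       (* estimates with standard errors and p-values *)
fit["RSquared"]             (* how much variance the fit actually explains *)
fit["MeanPredictionBands"]  (* confidence band around the fitted line *)
```

Plotting the prediction bands alongside the data makes "identical fits, different data" situations much harder to hide.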

This whole topic is overdue for a treatment that can be understood by people who just want to use software, and not have a deep understanding of numerical analysis. I'm pretty sure that Mathematica is a good tool to use for this.

Posted 10 years ago

There was an article published recently, but I can't find it here. Someone in the social sciences at a university was accused of cooking his results. Both sides are now claiming that they are blameless and the other is at fault. Somehow a side effect of this was a substantial effort to carefully replicate a few dozen major long accepted classical results in the field. Approximately one third of the replications either failed to support the hypothesis or had sufficiently serious problems with the experiment that no claim could be made. In every case mentioned the original experimenter is furious that they have been targeted by a witch hunt, their reputation has been permanently damaged, they have no way of defending themselves and nothing can be done to make up for this. You should track down the original source of this before using it, just to make certain that I have not somehow skewed the claims. It might have been published in Science in 2013, but I can't be sure.

If the likes of Tufte's standards could begin to make it the norm that uncertainty be clearly shown with every claim and Mathematica could position itself as one of the leading tools to simply and easily produce graphics for claims then, as mentioned above, perhaps everyone might be in a slightly better position to see at a glance what has sufficiently small uncertainty to likely be accepted and what has so much uncertainty that it can be dismissed.

POSTED BY: Bill Simpson
Posted 10 years ago

The quintessential example that comes to mind for me is the Fleischmann–Pons claim of cold fusion.


Bill, that seems a bit off the topic of whether academics are missing out by not making greater use of Mathematica. But maybe not completely. Because of the active and dynamic character of notebooks, Mathematica provides the opportunity of presenting higher quality results with higher integrity. One can include, in the document, raw data and procedures for analyzing and representing the results. Others can check whether they actually work, or have entry points for exposing problems. If people want to lie about data I suppose they can, but it's not quite as easy to get by with just plain sloppiness.

I once worked as a computer consultant for a group of biochemists. At one point I was asked to look at an experiment one of them was doing to measure the rate of some biochemical reaction. The person was very nice and very meticulous in his biochemistry. But when I looked at the reaction system and what he was measuring I discovered to my horror that it had no relation at all to the reaction rate he was trying to measure. Needless to say, he was not pleased with this. I left the group but later learned from someone who replaced me that they continued the experiment and analyzed it with a statistical program. The result was something like 1.0 +/- 10^6 millimole/minute - and they published it!

Bill, Bravo. Very well articulated. Thanks.

Sadly, careful error propagation, null hypotheses, standard error and sample sizes, p-values, etc. are disappearing from the canonical topics of a physical science education, and many papers appear in press with no discussion of error and uncertainty.

The typical excuse for omission is "curriculum pressure". I think we are just not teaching effectively.

POSTED BY: W. Craig Carter

Bill, your remarks are well put. I think that Tufte somewhere does have a maxim to Be Honest. And he does give several suggestions in Visual Explanations page 34. This might include multiple presentations based on different measures or viewpoints and textual explanation that might address various concerns. That's why I'm an advocate of literate notebooks that make a case and try to clarify and present a true picture. I would be very skeptical of any analysis summarized in a single presentation without extended discussion. In the final analysis it largely depends on the integrity of the person making or responsible for the analysis.

Mathematica probably has the tools to present uncertainty and risk. You just have to figure out how to put them together. With a dynamic presentation you can provide information in tooltips. You can provide information in numerical side reports, which may vary as you move a locator about the primary data representation. You might be able to toggle a presentation between various assumptions. And of course there are the old fashioned error bars and multiple presentations. Your suggestion of fuzzing the data representation might work very well. To go one step further, in a Mathematica notebook (or application) you can give the reader active tools for performing various types of data reduction and the actual data itself. I'm not knowledgeable about statistical analysis of data; I'm sure you can work out much better methods. The point is: Mathematica does give you better tools to present honest results.
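One of the simplest of those tools is Tooltip: attaching the uncertainty to each point so it appears on mouse-over (the measurements and uncertainties below are invented for illustration):

```mathematica
(* Invented measurements {x, y, dy}; the uncertainty pops up on hover *)
data = {{1, 2.1, 0.3}, {2, 3.9, 0.5}, {3, 6.2, 0.4}};
ListPlot[
  Tooltip[{#[[1]], #[[2]]},
    Row[{#[[2]], " \[PlusMinus] ", #[[3]]}]] & /@ data,
  PlotStyle -> PointSize[Large]]
```

In more recent versions, wrapping values as Around[y, dy] and passing them to ListPlot draws error bars directly, which makes honest plots nearly effortless.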

Posted 10 years ago

Dusty trails indeed, and in my case, of the handful of engineering companies I've worked for, if I wasn't the only Mathematica user, there may have been 2 or 3 others that let it collect, you guessed it . . . dust. For some reason Matlab/Simulink dominates the engineering world and Mathematica is more popular in science, education and maybe finance. One area of computation that is bound to continue to explode is medical research. I wonder if Mathematica played a role in cracking the human genome.

Macsyma over a 300 baud modem predates me somewhat, but nice job in catching that error. I do remember running test cases with it from my CRC Standard Math Tables handbook and having much difficulty interpreting the results that Macsyma produced on my bulky, heat-producing cathode ray tube. Of course Mathematica has FullSimplify and similar commands to push results into a more human-readable form.

On the topic of supportive learning materials in the 90's I subscribed to Mathematica in Education and Research put out by Telos/Springer. I found this journal to be highly readable and useful as I don't have advanced degrees. It was full of plots, images and code. Thinking back, I learned quite a bit from that little journal as it was very practical in nature.

Does anyone know if this or a similar journal is still in print? I'm aware of The Mathematica Journal, but I've found that to be too theoretical and advanced for my needs.


I first used Macsyma over a 300 baud modem over the Arpanet from Caltech to MIT back in the mid 70s. And I found a bug. They had the general solution to the cubic entered incorrectly.
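Mis-entered formulas of exactly that sort are what a symbolic check catches cheaply today. A sketch of verifying the general solution of the depressed cubic by substituting the roots back into the equation:

```mathematica
(* Solve the depressed cubic symbolically, then substitute back;
   each residual should simplify to zero if the roots are correct *)
sol = Solve[x^3 + p x + q == 0, x];
FullSimplify[x^3 + p x + q /. sol]
```

A hand-copied cubic formula with a wrong sign would fail this check immediately.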

POSTED BY: David Reiss

That was possibly a bug, so to speak, in the CRC reference at the time. It did not really account for choices of roots used, at least in the quartic formula. This may also have been an issue with the cubic. This was stuff I had to work through in 1992.

POSTED BY: Daniel Lichtblau
Posted 10 years ago

"All of the manuals on how to use Mathematica have been written by people who know too much. Their work should be edited by someone like me who is trying to learn what the developers already know."

Luther, I find your basic thesis here by and large accurate. Mathematica does have a steep learning curve and to directly use the more advanced features (particularly the various programming paradigms) one has to think like Mathematica, which for me has not come naturally.

Like others here, I started off with FORTRAN punch cards in college and was simply amazed at what could be done with the "simple" Do loop. Ten or so years later I stumbled upon Macsyma; anybody remember that? It was a command line math tool that claimed it could do symbolic math; maybe it could, but I did not find it useful. Then I noticed that our Technical Librarian had a copy of Mathematica sitting on her shelf, version 2.2. After inquiring about it I learned that one of the Chief Engineers owned it originally but returned it to her because he couldn't figure out how to use it. I convinced my company to purchase the Premier Support package, which offers superb technical support, and over the years would amaze him (and myself) with what could be done with this tool. I also read a few Mathematica books which I enjoyed at the time, but that didn't boost my usability knowledge as I had hoped. During and prior to these years I became somewhat proficient with FORTRAN, MathCad, Excel, BASIC, Minitab and Matlab. And from my experience only Matlab comes close to the power of Mathematica, and then only if various toolboxes are acquired. Out of the box, Mathematica was a clear winner for my technical computing and data processing needs. And when serious statistics were added (version 8, I think) there was no need for any other tool.

Fast forward to today: I am still a clunky Mathematica user, and I still rely on the basic Do loop from yesteryear even though much more efficient programming techniques have been available in Mathematica for years. But I've done some pretty neat stuff that no one else in the company has been able to do.
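For what it's worth, the jump from the Do loop to the functional idioms is often smaller than it looks. A small sketch of the same computation both ways:

```mathematica
(* Squares of 1..5, procedural style with a Do loop *)
result = {};
Do[AppendTo[result, i^2], {i, 5}];
result               (* -> {1, 4, 9, 16, 25} *)

(* The same thing in the functional style *)
Table[i^2, {i, 5}]   (* -> {1, 4, 9, 16, 25} *)
Map[#^2 &, Range[5]]
```

The functional forms avoid the repeated AppendTo, which also makes them much faster on large lists.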

I share the learning curve frustration but I will take a hard to use yet capable tool over an easy to use limited tool any day. And for my technical computing needs, Mathematica has never let me down in the sense that if I could conceive it, Mathematica could be coaxed to solve it; that is powerful.

On a practical note, the approach I have found very useful is to create and maintain a private library of "How To" notebooks, each describing a narrow task that you know you will face again in the future. The source material for these may come from the hard slogging and self-learning that you have produced, a call to Tech Support, the online Help, books, or the Mathematica Stack Exchange site. If I look at my "How To" folder I see file names like

How to align entries nicely using TableForm.nb
How to compute running averages.nb
How to conditionally extract values from a list.nb
How to convert time formatted strings to numerical values.nb
How to control minor ticks on a log axis.nb
How to create a 3-D surface plot.nb
How to extract algebraic solutions produced by the Solve command.nb
etc.
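To illustrate the spirit of such notebooks, the running-averages item above comes down to a single built-in once you know it exists:

```mathematica
(* MovingAverage computes running averages over a sliding window *)
data = {1, 2, 3, 4, 5, 6};
MovingAverage[data, 3]  (* -> {2, 3, 4, 5} *)
```

The value of the "How To" file is remembering that MovingAverage is the name to reach for.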

I find this far more useful than trying to rely on the built-in Help, which, in general, offers a very simple example or two (material that you already know) and then jumps to examples so complex that [literally] some Tech Support Engineers at Wolfram have had difficulty with them, and these guys are good!

Kudos to Stephen Wolfram and his team for inventing, enhancing and maintaining what I think is the strongest technical computing environment on the planet. The professors mentioned at the beginning of this discussion don't know what they are missing.

For an example, that I have put to use in my own work, Google and download the Sandia report Sample Sizes for Confidence Limits and Reliability, by J. Darby.

Posted 10 years ago

We have walked the same dusty trail. I had forgotten about Macsyma.

I do some of the things you suggest, but in a less ordered fashion. I have several worksheets that I have input various solutions and approaches to problems, and I just keep using the same worksheets. My memory is good enough that I remember that they are in the worksheets and I open one and start scrolling. I call it my cluttered desktop approach.

I second your admiration for Wolfram. And I will look up the reference you suggested. Thanks.

POSTED BY: Luther Nayhm

Hello. Yes, the "symbolic-program-x" I referred to above was Macsyma as well. (I thought the general rule here was not to mention other software by name? Or perhaps that was the MathGroup newsgroup?)

Back towards the original topic: I am thinking about the ways that I use Mathematica. They don't fit easily into a single category. Mathematica is something of a Swiss Army knife.

Academics have less time on their hands than many people think; most of my faculty colleagues work long (and thankless) hours seven days a week. And I believe that, in most cases, the time is not spent on research or teaching. So I was wondering why I am able to open Mathematica at all.

From a research perspective, I use Mathematica to:

1) derive stuff,

2) simulate stuff,

3) keep track of stuff that I've derived, and

4) visualize things that I've learned.

It serves as workbench and as scratch paper. As much as I agree with David Park's viewpoint about using Mathematica to publish, I can't afford to wait for the research community to catch up or adopt it. It's a battle that I can't win, and I'm either too afraid or too lazy to fight it. However, it is easy and eye-opening to give presentations with Mathematica; hopefully this might help get the ball rolling toward archival publication.

From a teaching perspective, I use Mathematica:

1) to show students that they can understand a physical phenomenon much, much better by coding it and visualizing it;

2) to reduce the barrier for physical scientists to learning the math they should know (the documentation and electronic communities are excellent modules for math as well as for Mathematica); and

3) to simulate physical systems for pedagogy.

I believe that being able to code is integral to scientific literacy. For physical scientists, a symbolic computing language like Mathematica threads together many educational goals.

From a recreational perspective,

1) It is fun just to play with new ideas by coding them up.

2) I have collaborated on pieces, created with Mathematica, that have been exhibited at MoMA, the Pompidou, the Paris Fashion Show, and other places.

Finally, while we are quoting about death and revolution, there is a nice one from Henry Ford: "Anyone who stops learning is old, whether at twenty or eighty. Anyone who keeps learning stays young." Just playing with Mathematica is a good strategy for staying young.

POSTED BY: W. Craig Carter

"Just playing with Mathematica is a good strategy for staying young"

I tried that Craig and it doesn't work! Running very very fast might work in a relative sense.

More accurately, playing with Mathematica is a good strategy for maximizing the use of your time.

I basically use Mathematica for all my work unless otherwise forced asunder. I write all my documents in it and have done since around version 3 (when the typesetting became publication class).

POSTED BY: David Reiss
Posted 10 years ago

Clearly, that is something I have to start doing.

POSTED BY: Luther Nayhm

I think it was either Planck or Boltzmann who said that progress in Physics is made at funerals. You advance a new theory or process and wait for the old physicists to die. Having a Ph.D. in Physics myself, I'm allowed to say this. It's probably true for almost all fields of human endeavor.

POSTED BY: Frank Kampas
Posted 10 years ago

I had heard of it as a Planck quote, but I have also heard that it was a quote of a quote. It is like Newton's "standing on the shoulders of giants"; I think that is a quote of a quote, too. The more modern version of Planck's quote is about progress or acceptance occurring one funeral at a time. Planck's version was kinder: acceptance comes through the younger generation being more familiar with whatever is new and novel, having grown up with it. I am always reminded of the idea of continental drift and how hard it was for the geologists of not so long ago to get their heads around the idea despite the circumstantial evidence. Alvarez's asteroid extinction hypothesis is still resisted more than it should be.

I can understand the push back if it were something truly novel, but in my case it was a redo using more advanced techniques available through Mathematica. The approach was simply a re-reading of something and realizing there was a disconnect between what was said and what was actually done. I am of the opinion that everything should be periodically revisited in light of newer findings, insights, and techniques.

When I started my career, I was performing contract R&D and we were selling applied physics and technology. I remember plowing though the journals to get ideas for applications, since I knew what my customers' problem areas were from a technology perspective. If something was novel, I tested it for applicability to various applications. Got some things so wrong, but got some very right, too. It was a fun time to get into the technology business.

The times were different. We took chances. Even a dumb idea might have a nugget of goodness buried somewhere inside.

POSTED BY: Luther Nayhm

I would like to weigh in here even though I'm not an academic, because it's a topic dear to my heart. I think the most valid complaint on Craig's list is number 2. Here's the test: How much time is spent in doing the math and science and how much time in trying to coerce Mathematica into doing what you want? If it tilts too much to the latter Mathematica will lose.

Nevertheless, Mathematica has tremendous advantages, which many academics and users seem to be unaware of and which are basically out of sight. It could and should be the premiere publication medium for technical (and maybe even a lot of non-technical) material. It should blow LaTeX and PDF documents out of the water. This is because of the active and dynamic capabilities of Mathematica notebooks (and CDF documents): the ability to transmit usable routines with the documents, and the ability to accumulate capability in a documented form, not to speak of all the capability built into Mathematica itself.

And yet, Mathematica has completely failed at this. Look at what actually gets published: how many Mathematica notebooks will you find? I haven't tried to check or search all entries, but it looks like the score is something like LaTeX/PDF 1,000,000, Mathematica notebooks 0. (Of course, some of the PDFs have graphics or mathematical results copied from Mathematica.) So how many academics even see examples of what can be done?

Academics need a positive reason to switch to Mathematica for routine work. There are plenty of very good reasons. They just don't see them.

On the other hand, the second item on Craig's list is quite important. Mathematica is difficult to use. Wolfram Research has not made sufficient effort in usability. Their approach has been to add more bells and whistles and top-down routines. This only makes usage more difficult. Often you may find that you have to modify a high level routine, generally using options or special constructions, to get what you want - only to find out after several hours that it can't be done. It would be much simpler if, as much as possible, capability was built from the bottom up and made available to the user. It's all right to offer some common high level routines (built from the lower level objects) but the user should always have the option to fall back to the more primitive constructions.

For graphics and dynamic displays, the paradigm should be "writing on a piece of paper". That's what academics do all the time. They kind of understand that. In part, that means separating the calculation from the writing.

Another difficulty is that this active and dynamic medium is quite new and revolutionary. Even if Mathematica were easy to use, it's still not all that clear or simple to know the best usage. It's easy to fill displays with "computer junk". Just taking a 19th or 20th century diagram and making something move is not necessarily very informative. There are many possibilities and matters of taste that need to be creatively explored. This is why WRI has to leave the user free to write displays adapted to special cases at hand. They can't even come close to anticipating what form these might take - or providing high level routines for them.

Finally, I would like to include two display examples, which Mathematica's high level routines are not particularly good at doing. I didn't use standard methods for either of these, and yet they should be the kind of thing that would be reasonably easy to do.

The following is adapted from the first table I found in Arfken & Weber, Mathematical Methods for Physicists, Fifth Edition, 2001, Academic Press. The original was somewhat confusing. This was composed using a set of routines that allow me to write information and formatting directly on a "piece of paper" without using the rather complicated options in Grid. Would you find it easy to compose this using Grid? Does technical work ever use custom tables?

[Image: custom table adapted from Arfken & Weber]

The second example is from a notebook on the use of the Weierstrass elliptic P function in representing EXACT solutions of orbits in the Schwarzschild geometry. I was introduced to this by Ron Burns. This was done using a DynamicModule. I never use the Manipulate statement; I find it too difficult to work out its behavior, and I want to arrange the layout just the way I want. The Weierstrass function is characterized by two parameters, g2 and g3, and the green region shows the domain for bound orbits. The region is a thin wedge where most of the physical cases occur. The g3 slider is dynamically adapted to this and varies only across the green domain; the variation is in the 5th or 6th place. Even though the Weierstrass function was known well before Einstein and Schwarzschild, not to speak of Misner, Thorne and Wheeler, the high precision required probably militated against its use. But you can see that I used quite high precision and Mathematica had no difficulty with it at all. That might be one powerful reason in favor of Mathematica.
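As a sketch of the precision point (the invariants below are made-up values, not the ones from the orbit problem), Mathematica carries arbitrary precision through WeierstrassP automatically:

```mathematica
(* Evaluate the Weierstrass P function with 50-digit invariants;
   the result carries the working precision along with it *)
g2 = N[1/10, 50];
g3 = N[1/100, 50];
WeierstrassP[N[1/2, 50], {g2, g3}]
Precision[%]  (* remains close to 50 digits *)
```

No special handling is needed; requesting more digits in the inputs is enough to get a correspondingly precise result.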

[Image: dynamic display of bound orbits in the Schwarzschild geometry, built with DynamicModule]

Posted 10 years ago

I agree with your assessment, though I would not have thought that I could or should write a document wholly in Mathematica. That is a novel idea that I will try out. I am as guilty as everyone else of importing the results from Mathematica into PDF documents. The irony to me is that I write everything as a first-draft document, including insertion of equations and rationale for their use via MathType within Word, as I develop a thesis and solutions, along with reams of paper scratching and associated Mathematica notebook pages. If I don't write it down, it is lost forever, since my notes are never indexed except by various page number styles.

That said, I use Mathematica at its most rudimentary levels compared to the things you discussed, but I get your drift. All of the manuals on how to use Mathematica have been written by people who know too much. Their work should be edited by someone like me who is trying to learn what the developers already know. In commercial areas, you find that software invariably suffers from this issue: developers know too much and don't realize that they need to make their programs and documentation user friendly, where the user can be totally ignorant of the underlying algorithms and couldn't care less. The irony is that when the users have a problem that in turn becomes an issue, the developers run in a panic to the human factors and user interface groups to have them fix the garbage the developers have created. Wolfram unwittingly operates with this same paradigm, along with most other software product developers.

In my early years at Miami, I had Prof. Arfken as an instructor during the time he was contemplating his book. I am sure we were guinea pigs for some of his efforts. He was a pleasant man and fun to listen to. In a symposium he once stated an object lesson I still adhere to. He said he had spent weeks trying to solve some problem with a methodology with which he was unfamiliar. He finally made arrangements to visit a distant heavyweight in mathematical physics to try to resolve what he was trying to do. In his words, after a day's travel he went into the heavyweight's office, sat down, and presented his problem, to which within 5 minutes the "great man," as Prof. Arfken referred to him with a twinkle in his eye, laid out the approach and solution. The lesson: when all else fails, ask, and don't wait forever to do so.

POSTED BY: Luther Nayhm

Luther, why don't you write me at and tell me what your project is.

Posted 10 years ago

With the central claim for Mathematica being that it is the single tool needed for everything from jotting down mathematical notes, to doing initial exploratory calculations, to creating graphics, to implementing entire sets of tools for a problem domain, to writing and publishing papers and textbooks, I am surprised that I have not seen more published on how to carry out that last step. From the style of it, I am assuming that Ruskeepää's "Mathematica Navigator" was written almost entirely within Mathematica.

There was a thin little book published perhaps twenty years ago showing some of how, entirely within Mathematica, to create documents for publishing. I held a copy in my hands for a minute, but didn't buy it and have never been able to track down that title again. If you could find that book, even though it is old now, then it might help you get started. Perhaps someone else can recall that title.

On the larger question of reception, culture and attitude are far more powerful and persuasive than most might imagine. If you want to do another literature search, then look at the reception that automated theorem proving, or even computer proof checking, has received from the vast majority of the mathematics community over the last fifty years. Then you can compare and contrast that with the reception that Mathematica in particular, and other computer algebra systems in general, have received. Perhaps that would help you better understand what you are seeing.

Some have claimed that ideas which are rejected now will be embraced and adopted when the previous generation dies. But culture and attitude may be more persistent than that. If you are creating something new, then choose your culture and attitude carefully, because you will pay the price and live with the consequences of that forever. For example, if you happened to be around and watching during the 1980s, when the culture and attitude were instilled into Mathematica, you might be better able to notice, and possibly even understand, some of the behavior of people inside and outside of Wolfram Inc. thirty years ago, behavior that in some cases is unchanged today.

Lastly, I try to sense your attitude from what you have written above. You might contemplate that. It has a way of leaking out, and there almost always seems to be a price to pay for that. I do realize that some of my attitude has leaked out in what I have written here.

POSTED BY: Bill Simpson

Probably not what you had in mind, but maybe related:

This is by Paul Wellin, and he may have been the author or a coauthor of the work you have in mind.

POSTED BY: Daniel Lichtblau
Posted 10 years ago

Good points, and I plead guilty to the leakage observation. I was in the "business" at one time and have seen a variety of cultures at work. My knee-jerk reaction was a recollection of the times I was burned as a "customer". Among software developers there is an undercurrent of certainty that they know best, or, as with many technical people, they come across a better way of doing things while working on something and shift gears without letting anyone know. This is the bane of project managers.

I recall meeting a past graduate-school colleague at a conference. He had set up a successful company, and one of its lines of business was developing specialized scientific analysis software for the DoD in the area of spectroscopy. We chatted, and he discovered I was associating with the same types of customers and developers as he was, so he asked my opinion on what it would take to commercialize his products for a wider audience. One of my own staff had developed a different type of specialized analysis software, and we were wrestling with the same question: how to expand the market for that intellectual property. I had no answer for my friend or myself. Some fifteen or so years later, the software my group had developed did show up as a commercial product developed independently elsewhere, and their secret was the user interface and ease of use, which our intellectual property never possessed...nor did we have the capacity to address that issue. Plus, the market for the new product had never occurred to us.

Mathematica's documentation is as good as it has to be for Wolfram not to suffer any commercial consequences. Wolfram is blessed with an outstanding product and a user community so smart that it overcomes the deficiencies in the documentation; to be crass about it, Wolfram sells support services. I think where Wolfram may be somewhat shortsighted is that issues such as those being raised in this thread indicate that they could promote a groundswell of broader acceptance and use by "commoditizing" their product. This is just the businessman in me speaking. I would reject the idea of a slimmed-down or crippled version just to get a cheaper product out, but too many bells and hidden whistles is pretty frustrating in any product. As an example, I just bought a new phone that uses Android and tried to set up my voice-mail account. I had to go online to find someone to tell me where the setup was hidden, and it was definitely hidden. The documentation did not even address voice mail, but it supplied overkill (from my perspective) in the areas of applications and data.

Mathematica is what it is and from a commercial perspective, Wolfram is doing what it needs to do to stay on top of their game...they have a business strategy that does not depend on better transparent documentation. Mathematica is the best mousetrap around from my perspective. Could it be better? Probably, but define better.

POSTED BY: Luther Nayhm

Dear Luther, I've been puzzling over your observation for nearly 15 years, and I also fail to understand why Mathematica/Wolfram Language gets dismissed by my colleagues so haphazardly. In discussions with my colleagues about this question, I tend to go into listening mode and don't advocate for one side or the other. I'll try to summarize what they say below.

Some background:

I'm a professor at MIT in physical sciences and engineering. I've been teaching a course on mathematics and problem solving in materials science for about twelve years, and I use Mathematica extensively in my course. I can't judge the efficacy of my course objectively, but my students generally give very positive evaluations; I've been given MIT's highest institutional teaching awards, as well as the School of Engineering's. I believe that these awards reflect the quality of the Mathematica-based course. I give nearly all of my research presentations using Mathematica, and these are received well (however, such feedback tends to be biased towards the positive).

The summary (I don't agree with many of these, and am still honing my counterarguments):

1) Mathematica is not as fast as X (X may or may not be a compiled language).

2) Mathematica's syntax creates too steep of a learning curve.

3) Our (particular) scientific community uses X, so there are many more routines and working examples in X.

4) I am already using X, why should I change?

I believe most of the above are the result of "user inertia".

I don't hear the comment that Mathematica gives "bad results" so much now. I doubt that many faculty roll their own algorithms; they are probably thinking back to when they were postdocs or graduate students. However, there are happy exceptions to this.

I believe that the best way forward is to offer students a choice, and hope that they will make an unfettered choice of their preferred programming language. However, I am not so optimistic: a student who went to graduate school at a university down the street wrote to me saying, "I'm taking this class and they insist on using X and I asked if I could use Mathematica instead, and he said no. We were doing more advanced things in 3.016 (the sophomore class I teach)."

POSTED BY: W. Craig Carter
Posted 10 years ago

I concur with your assessments and those of Park based on a more general exposure to technologists with particular preferences. I started with a blank slate in terms of exposure and preferences, since I had avoided programming at any level after my early pre-PC experiences with Fortran and Basic and more punch cards than any one graduate student should have to carry. I was never required to do much programming, from my first job at Michigan (we used a Wang with punch cards!!!) to Battelle, where I was fortunate to manage a talented group of scientists and technologists who loved programming. Now, as I close in on my dotage, I am on my own.

I had to find a muscular program to work out some problems that had never been properly solved because all work on them stopped a hundred years ago. A reading of a basic paper showed that the author had not actually done what he said he was going to do, and did something else because there were no tools like Mathematica available then. The author did what he could. I reread the paper, identified an asyllogistic conclusion, and set out to test the author's methodology. I got results that contradicted the textbooks. However, I did not know if the fault was mine (misinterpretation or misstatement of the real problem), whether the issue was Mathematica (again, was it my use of Mathematica or something within Mathematica), or whether the results were true. So, after research that led to no resolution on any of these points, I asked.

The results of my consulting with several faculty were disappointing. Other than criticizing Mathematica and my methodology, I simply could not get anyone to actually read the statement of the problem as originally presented, nor my interpretation and subsequent modeling using Mathematica. I had expected pushback but not total out-of-hand rejection of my thesis, and the criticism of Mathematica seemed to be a way out of actually having to address the issues and possible consequences that the problem presented. I am, at this point, working down the list of criticisms to eliminate them. It is a hard slog.

POSTED BY: Luther Nayhm

"starting with my first job at Michigan (we used a Wang with punch cards!!!)"

Around 1982 I had a job overhauling a Wang Fortran compiler (the project name was, not surprisingly, WANGFOR). I was also asked to augment its functionality in the so-called "Wang enhancement" project (WANGENH). (No, I'm not making this up. Not even the project names. It was a Fortran-66 compiler and they wanted to add some of the Fortran-77 capabilities, if I recall correctly.)

My company and Wang Labs had a falling out several months later and, as best I can tell, mutually fired one another. But the work on that decrepit compiler was, I think, pretty good.

So this makes two of us who worked with Wang's Fortran. Also their assembler code, in my case.

POSTED BY: Daniel Lichtblau

Hello Danny and Luther. That makes three of us who started with punch cards.

As bizarre as it might seem, I think punch cards were a beneficial way to learn to program. I spent so much more time thinking about the program and doing thoughtful debugging because submission of a job was so painful. I confess that I often debug now with less reflective thought because I can do dozens of haphazard experiments in minutes--this probably speeds the debugging process as often as it slows it down.

I was sold on symbolic computation (pre-Wolfram-Language) when I was able to solve classical mechanics homework faster and more accurately than my classmates. I remember that just being able to do a Taylor expansion around a point, copy down the results, and redraw CRT-rendered plots into my homework felt like cheating.
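For readers who have not seen it, the kind of expansion described above is a one-liner in Mathematica. The function and expansion point here are my own illustrative choice, not the original homework:

```mathematica
(* Illustrative: sixth-order Taylor expansion of the pendulum potential about theta = 0 *)
Series[1 - Cos[theta], {theta, 0, 6}]
(* theta^2/2 - theta^4/24 + theta^6/720 + O[theta]^7 *)
```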

I also remember my own "inertia" of switching from "symbolic-program-x" which was free at Berkeley (and had opaque syntax that I had learned moderately well) to Mathematica 1.


POSTED BY: W. Craig Carter
Posted 10 years ago

Ah, memories!

POSTED BY: Luther Nayhm