
Why are some professors negative on Mathematica?

Posted 11 years ago

In discussions with several university physics professors (all theoreticians), they profess a strong dislike for Mathematica and caution their students about using it. I can fully understand cautioning a user to make sure they have used the correct syntax or correctly formulated a problem or model, but their caution was stronger than that. They, in effect, prefer to roll their own algorithms. They also claim Mathematica gives bad results, though it was unclear whether the fault lay in the execution or the formulation of a problem.

I am troubled by this attitude, since users in all technical disciplines use Mathematica and rely on it to supply solutions to various designs, models, and analyses, some that are mission critical.

I performed a literature search on evaluation of Mathematica and the most recent published critique and evaluation I found was for Mathematica 5. Other than this, there does not appear to be an undercurrent of suspicion except from these specific profs.

What is going on? I can defend my mathematical models but I cannot defend the outcomes of executions of these models if there is skepticism over the validity of solutions obtained by Mathematica. I can also understand that anyone who has not come up on the learning curve might simply be covering their own inadequacies, hence the attitude of rolling their own. However, everyone should be skeptical of published results from the use of personal algorithms, for which no validation or user community exists.

How can the quality of the results from the use of Mathematica be supported? Are there published evaluations? What might organizations such as DARPA do to validate work for which Mathematica has been a cornerstone of the analyses?

Or am I the only person to have run into this level of skepticism...which is ironic since I am more skeptical of analytical results than most.

POSTED BY: Luther Nayhm

Something I learned years ago (very likely from Feynman, or quoted about him to me by someone else): when looking at a dataset in a presentation and listening to the presenter's description of the meaning of the data, mentally drop the data points at the extremes of the graph and decide whether the conclusions still make sense. Why? The data at the extremes are often there because the experiment stopped being trustworthy near those limits, so the strength of those points is often in question....

POSTED BY: David Reiss
Posted 11 years ago

This is not intended to criticize Edward Tufte or diminish any of the fine work he has done. If you carefully study each of his books you might notice that there is one thing which has not been included. There is not a hint of uncertainty in any of the graphics that he presents. For one example, I cannot imagine in the fog of war in 1812 that Napoleon's march on Moscow was anywhere near as precise and tidy as the famous graphic leads our minds to believe. Yes, the graphic does round the numbers to thousands, but even that I suspect is a misrepresentation.

I spent time some years ago trying to get vendors to use all the new found compute power to demand from the user and then display the uncertainty associated with graphical presentations. Yes, Mathematica keeps track of the precision associated with some numeric quantities, but Plot does not then display graphics which are just fuzzy enough to force us to realize how much we don't know. I don't think the classic error bars are the answer, but fuzz might be.
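To make the "fuzz" idea concrete, here is a minimal sketch in Mathematica, assuming a made-up model (a sine curve) and a made-up uncertainty function; the `Filling` option shades the region between the lower and upper bounds so the curve reads as a band rather than a precise line:

```mathematica
(* Hypothetical model and made-up uncertainty, for illustration only *)
f[x_] := Sin[x];
sigma[x_] := 0.05 + 0.1 Abs[x - Pi]/Pi;  (* assumed: uncertainty grows away from the center *)

(* Shade between the lower and upper bounds; the band is the "fuzz" *)
Plot[{f[x] - sigma[x], f[x] + sigma[x]}, {x, 0, 2 Pi},
 Filling -> {1 -> {2}},
 PlotStyle -> None]
```

Suppressing the boundary curves with `PlotStyle -> None` leaves only the shaded band, which is closer to the "just fuzzy enough" presentation described above than classic error bars.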

Several simulation tool vendors will quietly admit that their calculations are really not as precise as their tool displays. One vendor even suggested that I diddle the coefficients a little each time, do ten runs, print all the results, tape them to a window and try to guess what the uncertainty really is.
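The vendor's "diddle the coefficients" suggestion is essentially a crude Monte Carlo, and it can be automated rather than taped to a window. A sketch, with a made-up damped-oscillator model and made-up coefficient spreads, that overlays ten perturbed runs so the spread is visible directly:

```mathematica
(* Made-up model with nominal coefficients a = 1, b = 0.3; spreads are assumptions *)
runs = Table[
   With[{a = RandomVariate[NormalDistribution[1, 0.05]],
         b = RandomVariate[NormalDistribution[0.3, 0.03]]},
     Plot[a Exp[-b t] Cos[2 Pi t], {t, 0, 5},
       PlotStyle -> Directive[Opacity[0.3], Blue]]],
   {10}];

Show[runs]  (* the visual spread of the overlaid curves is the uncertainty *)
```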

I listened to an invited speaker years ago. He presented slide after slide of very impressive graphics showing the behavior of something. He was asked if it would be possible to include an indication of the uncertainty in his slides. He replied no, because if he did then it would be obvious that his results had no predictive power. I suspect even the audience members who already knew this still made incorrect assumptions when looking at the graphics.

Consider how many predictions we see for projects large and small. We almost never see the uncertainties of the inputs or outputs, or any indication of the uncertainty associated with the project's success or failure. The only use of uncertainty currently is to claim there is none if it is my project, or that it is a stupid idea and we shouldn't do it if it is yours.

It seems that everyone wants to know "the number"; nobody wants to hear that there is any uncertainty. Everyone wants to see the graph with the spidery little line showing the economy going smoothly off to infinity. There was a book not too long ago which talked about how everyone wants to be told the number, usually the mean, and how badly we err when we don't see and plan for the uncertainties. Ah! Found it! The Flaw of Averages: Why We Underestimate Risk in the Face of Uncertainty by Sam Savage, Jeff Danziger and Harry Markowitz. This should not be a license to make up numbers for the uncertainties to go along with the numbers we made up for the predictions.

I considered writing a letter to Mr. Tufte asking if, in his next book, he might introduce the world to the knowledge that nothing in the real world is precise, that when we don't know the uncertainty we reach false conclusions, and that we should all begin demanding that credible uncertainty information be included in graphics. But the last word I heard is that he is deep into what will likely be his final book, and I doubt that he could incorporate such a dramatic change in presentation now. If Mr. Tufte can't be the one to do this then perhaps someone else can.

POSTED BY: Bill Simpson
Posted 11 years ago
POSTED BY: Bill Simpson
Posted 11 years ago

The quintessential example that comes to mind for me is the Fleischmann–Pons claim of cold fusion.

http://en.wikipedia.org/wiki/Cold_fusion

POSTED BY: Steve M

Bill, that seems a bit off the topic of whether academics are missing out by not making greater use of Mathematica. But maybe not completely. Because of their active and dynamic character, Mathematica notebooks provide the opportunity to present higher quality results with higher integrity. One can include, in the document, the raw data and the procedures for analyzing and representing the results. Others can check whether they actually work, or use them as entry points for exposing problems. If people want to lie about data I suppose they can, but it's not quite as easy to get by with just plain sloppiness.

I once worked as a computer consultant for a group of biochemists. At one point I was asked to look at an experiment one of them was doing to measure the rate of some biochemical reaction. The person was very nice and very meticulous in his biochemistry. But when I looked at the reaction system and what he was measuring I discovered to my horror that it had no relation at all to the reaction rate he was trying to measure. Needless to say, he was not pleased with this. I left the group but later learned from someone who replaced me that they continued the experiment and analyzed it with a statistical program. The result was something like 1.0 +/- 10^6 millimole/minute - and they published it!

Bill, Bravo. Very well articulated. Thanks.

Sadly, careful error propagation, null hypotheses, standard error and sample sizes, p values, etc are disappearing from the canonical topics of a physical science education, and many papers appear in press with no discussion of error and uncertainty.

The typical excuse for omission is "curriculum pressure". I think we are just not teaching effectively.

POSTED BY: W. Craig Carter

Bill, your remarks are well put. I think that Tufte somewhere does have a maxim to Be Honest. And he does give several suggestions in Visual Explanations page 34. This might include multiple presentations based on different measures or viewpoints and textual explanation that might address various concerns. That's why I'm an advocate of literate notebooks that make a case and try to clarify and present a true picture. I would be very skeptical of any analysis summarized in a single presentation without extended discussion. In the final analysis it largely depends on the integrity of the person making or responsible for the analysis.

Mathematica probably has the tools to present uncertainty and risk. You just have to figure out how to put them together. With a dynamic presentation you can provide information in tooltips. You can provide information in numerical side reports, which may vary as you move a locator about the primary data representation. You might be able to toggle a presentation between various assumptions. And of course there are the old fashioned error bars and multiple presentations. Your suggestion of fuzzing the data representation might work very well. To go one step further, in a Mathematica notebook (or application) you can give the reader active tools for performing various types of data reduction and the actual data itself. I'm not knowledgeable about statistical analysis of data; I'm sure you can work out much better methods. The point is: Mathematica does give you better tools to present honest results.
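As a small illustration of the dynamic options mentioned above, here is a sketch with made-up data and an assumed ±0.05 uncertainty that attaches the uncertainty to each point as a tooltip; in recent versions (12 and later), wrapping values in `Around` also draws error bars automatically:

```mathematica
(* Made-up data with an assumed ±0.05 uncertainty on each point *)
data = Table[{x, Sin[x]}, {x, 0, 2 Pi, 0.5}];

(* Hover over a point to see its value and assumed uncertainty *)
ListPlot[Tooltip[#, Row[{#[[2]], " \[PlusMinus] 0.05"}]] & /@ data]

(* In version 12+, Around draws error bars directly *)
ListPlot[{#[[1]], Around[#[[2]], 0.05]} & /@ data]
```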

Posted 11 years ago
POSTED BY: Steve M

I first used Macsyma over a 300 baud modem over the Arpanet from Caltech to MIT back in the mid 70s. And I found a bug. They had the general solution to the cubic entered incorrectly.

POSTED BY: David Reiss

That was possibly a bug, so to speak, in the CRC reference at the time. It did not really account for choices of roots used, at least in the quartic formula. This may also have been an issue with the cubic. This was stuff I had to work through in 1992.

POSTED BY: Daniel Lichtblau
Posted 11 years ago

"All of the manuals on how to use Mathematica have been written by people who know too much. Their work should be edited by someone like me who is trying to learn what the developers already know."

Luther, I find your basic thesis here by and large accurate. Mathematica does have a steep learning curve and to directly use the more advanced features (particularly the various programming paradigms) one has to think like Mathematica, which for me has not come naturally.

Like others here, I started off with FORTRAN punch cards in college and was simply amazed at what could be done with the "simple" Do loop. Ten or so years later I stumbled upon Macsyma--anybody remember that? It was a command line math tool that claimed it could do symbolic math; maybe it could, but I did not find it useful. Then I noticed that our Technical Librarian had a copy of Mathematica sitting on her shelf, version 2.2. After inquiring about it I learned that one of the Chief Engineers had owned it originally but returned it to her because he couldn't figure out how to use it. I convinced my company to purchase the Premier Support package, which offers superb technical support, and over the years I would amaze him (and myself) with what could be done with this tool. I also read a few Mathematica books, which I enjoyed at the time, but that didn't boost my usability knowledge as I had hoped. During and prior to these years I became somewhat proficient with FORTRAN, MathCad, Excel, BASIC, Minitab and Matlab. From my experience only Matlab comes close to the power of Mathematica, and only if various toolboxes are acquired. Out of the box, Mathematica was a clear winner for my technical computing and data processing needs. And when serious statistics were added (version 8, I think) there was no need for any other tool.

Fast forward to today, I am still a clunky Mathematica user and I still rely on the basic Do Loop from yesteryear even though much more efficient programming techniques have been available in Mathematica for years. But I've done some pretty neat stuff that no one else in the company has been able to do.

I share the learning curve frustration but I will take a hard to use yet capable tool over an easy to use limited tool any day. And for my technical computing needs, Mathematica has never let me down in the sense that if I could conceive it, Mathematica could be coaxed to solve it; that is powerful.

On a practical note, the approach I have found very useful is to create and maintain a private library of "How To" notebooks, each describing a narrow task that you know you will be facing again in the future. The source material for these may come from your own hard slogging and self-learning, a call to Tech Support, the online Help, books, or the Mathematica StackExchange site. If I look at my "How To" folder I see file names like

How to align entries nicely using TableForm.nb
How to compute running averages.nb
How to conditionally extract values from a list.nb
How to convert time formatted strings to numerical values.nb
How to control minor ticks on a log axis.nb
How to create a 3-D surface plot.nb
How to extract algebraic solutions produced by the Solve command.nb
etc.
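For readers building a similar library, here are sketches of the kind of one-liners a few of those notebooks might contain (the outputs in comments are for these specific toy inputs):

```mathematica
(* How to compute running averages: window of 3 *)
MovingAverage[{1, 2, 3, 4, 5, 6}, 3]   (* -> {2, 3, 4, 5} *)

(* How to conditionally extract values from a list *)
Select[{1, 2, 3, 4, 5, 6}, EvenQ]      (* -> {2, 4, 6} *)

(* How to extract algebraic solutions produced by Solve *)
x /. Solve[x^2 - 5 x + 6 == 0, x]      (* -> {2, 3} *)
```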

I find this far more useful than trying to rely on the built-in Help which, in general, offers a very simple example or two (material that you already know) and then jumps to examples so complex that some Tech Support Engineers at Wolfram have literally had difficulty with them--and these guys are good!

Kudos to Stephen Wolfram and his team for inventing, enhancing and maintaining what I think is the strongest technical computing environment on the planet. The professors mentioned at the beginning of this discussion don't know what they are missing.

For an example, that I have put to use in my own work, Google and download the Sandia report Sample Sizes for Confidence Limits and Reliability, by J. Darby.

POSTED BY: Steve M
Posted 11 years ago
POSTED BY: Luther Nayhm

Hello, Yes, the "symbolic-program-x" I referred to above was Macsyma as well (I thought the general rule here was not to mention other software by name? Or, perhaps that was the mathgroup newsgroup?)

Back towards the original topic: I am thinking about the ways that I use Mathematica. They don't fit easily into a single category. Mathematica is something of a Swiss Army knife.

Academics have less time on their hands than many people think--most of my faculty colleagues work long (and thankless) hours seven days a week. And I believe that, in most cases, that time is not spent on research or teaching. So I began to wonder how I am able to open Mathematica at all.

From a research perspective, I use Mathematica:

1) to derive stuff

2) to simulate stuff

3) to keep track of stuff that I've derived

4) to visualize things that I've learned.

It serves as workbench and as scratch paper. As much as I agree with David Park's viewpoint about using Mathematica to publish, I can't afford to wait for the research community to catch up or adopt it. It's a battle that I can't win, and I'm either too afraid or too lazy to fight it. However, it is easy and eye-opening to give presentations with Mathematica--hopefully this might help get the ball rolling toward archival publication.

From a teaching perspective, I use Mathematica:

1) to show students that they can understand a physical phenomenon much better by coding it and visualizing it.

2) to reduce the barrier for physical scientists to learning the maths they should know (the documentation and electronic communities are excellent modules for math as well as Mathematica).

3) to simulate physical systems for pedagogy.

I believe that being able to code is integral to scientific literacy. For physical scientists, symbolic computing languages like Mathematica thread together many educational goals.

From a recreational perspective,

1) It is fun just to play with new ideas by coding them up.

2) I have collaborated, using Mathematica, on pieces that have been exhibited at MoMA, the Pompidou, the Paris Fashion Show, and other places.

Finally, while we are quoting about death and revolution, there is a nice one by Henry Ford: "Anyone who stops learning is old, whether at twenty or eighty. Anyone who keeps learning stays young." Just playing with Mathematica is a good strategy for staying young.

POSTED BY: W. Craig Carter

I basically use Mathematica for all my work unless forced to do otherwise. I write all my documents in it and have done so since around version 3 (when the typesetting became publication class).

POSTED BY: David Reiss
Posted 11 years ago

Clearly, that is something I have to start doing.

POSTED BY: Luther Nayhm

I think it was either Planck or Boltzmann who said that progress in Physics is made at funerals. You advance a new theory or process and wait for the old physicists to die. Having a Ph.D. in Physics myself, I'm allowed to say this. It's probably true for almost all fields of human endeavor.

POSTED BY: Frank Kampas
Posted 11 years ago
POSTED BY: Luther Nayhm
Posted 11 years ago

I agree with your assessment, though I would not have thought that I could or should write a document wholly in Mathematica. That is a novel idea that I will try out. I am guilty, like everyone else, of importing the results from Mathematica into PDF documents. The irony is that I write everything as a first-draft document, including insertion of equations and the rationale for their use via MathType within Word, as I develop a thesis and solutions, along with reams of paper scratchings and associated Mathematica notebook pages. If I don't write it down, it is lost forever, since my notes are never indexed except by various page number styles.

That said, I use Mathematica at its most rudimentary levels compared to the things you discussed, but I get your drift. All of the manuals on how to use Mathematica have been written by people who know too much. Their work should be edited by someone like me who is trying to learn what the developers already know. In commercial areas, you find that software invariably suffers from this issue: developers know too much and don't realize that they need to make their programs and documentation user friendly--where the user can be totally ignorant of the underlying algorithms and couldn't care less. The irony is that when users have a problem that in turn becomes an issue, the developers run in a panic to the human factors and user interface groups to have them fix the garbage the developers have created. Wolfram unwittingly operates with this same paradigm, along with most other software product developers.

In my early years at Miami, I had Prof. Arfken as an instructor during the time he was contemplating his book. I am sure we were guinea pigs for some of his efforts. He was a pleasant man and fun to listen to. In a symposium he once stated an object lesson I still adhere to. He said he had spent weeks trying to solve some problem with a methodology with which he was unfamiliar. He finally made arrangements to visit a distant heavyweight in mathematical physics to try to resolve what he was trying to do. In his words, after a day's travel he went into the heavyweight's office, sat down, and presented his problem, to which within five minutes the "great man," as Prof. Arfken referred to him with a twinkle in his eye, laid out the approach and solution. The lesson: when all else fails, ask, and don't wait forever to do so.

POSTED BY: Luther Nayhm
Posted 11 years ago

With the central claim for Mathematica being that it is the single tool needed for everything from jotting down mathematical notes, to doing initial exploratory calculations, to creating graphics, to implementing entire sets of tools for a problem domain, to writing and publishing papers and textbooks, I am surprised that I have not seen more published on how to carry out that last step. From the style of it, I am assuming that Ruskeepää's "Mathematica Navigator" was written almost entirely within Mathematica.

There was a thin little book published perhaps twenty years ago showing some of how, entirely within Mathematica, to create documents for publishing. I held a copy in my hands for a minute, but didn't buy it and have never been able to track down that title again. If you could find that book, even though it is old now, then it might help you get started. Perhaps someone else can recall that title.

On the larger question of reception, culture and attitude are far more powerful and persuasive than most might imagine. If you want to do another literature search, then look at the reception that automated theorem proving, or even computer proof checking, has received from the vast majority of the mathematics community over the last fifty years. Then you can compare and contrast that with the reception that Mathematica in particular, and other computer algebra systems in general, have received. Perhaps that would help you better understand what you are seeing.

Some have claimed that ideas which are rejected now will be embraced and adopted when the previous generation dies. But culture and attitude may be more persistent than that. If you are creating something new, then choose your culture and attitude carefully, because you will pay the price and live with the consequences of that forever. For example, if you happened to be around and watching during the 1980s, when the culture and attitude were instilled into Mathematica, you might be better able to notice, and possibly even understand, some of the behavior of people inside and outside of Wolfram Inc. thirty years ago and, in some cases, unchanged today.

Lastly, I tried to sense your attitude from what you have written above. You might contemplate that: attitude has a way of leaking out, and there almost always seems to be a price to pay for that. I do realize that some of my own attitude has leaked out in what I have written here.

POSTED BY: Bill Simpson

Probably not what you had but maybe related:

http://library.wolfram.com/infocenter/Conferences/5782/

This is by Paul Wellin and he may have been author or a coauthor of the work you have in mind.

POSTED BY: Daniel Lichtblau
Posted 11 years ago

Good points, and I plead guilty to the leakage observation. I was in the "business" at one time and have seen a variety of cultures at work. My knee-jerk reaction was a recollection of the times I was burned as a "customer". Among software developers there is an undercurrent of certainty that they know best or, as with many technical people, they come across a better way of doing things while working on some project...so they shift gears without letting anyone know. This is the bane of project managers.

I recall meeting a past graduate school colleague at a conference. He had set up a successful company, and one of its lines of business was developing specialized scientific analysis software for the DoD in the area of spectroscopy. We chatted and he discovered I was associating with the same types of customers and developers as he was, so he asked my opinion on what it would take to commercialize his products for a wider audience. One of my own staff had developed a different type of specialized analysis software, and we were wrestling with the same question: how to expand the market for that intellectual property. I had no answer for my friend or myself. Some fifteen or so years later, the software my group had developed did show up as a commercial product developed independently elsewhere, and their secret was the user interfaces and ease of use, which our intellectual property never possessed...nor did we have the capacity to address that issue. Plus, the market for the new product had never occurred to us.

Mathematica's documentation is as good as it has to be for Wolfram not to suffer any commercial consequences. Wolfram is blessed with an outstanding product and a user community that is so smart that it overcomes the deficiencies in the documentation...plus, to be crass about it, Wolfram sells support services. Where I think Wolfram may be somewhat short sighted is that issues such as those being raised in this thread indicate that it could promote a groundswell of broader acceptance and use by "commoditizing" its product. This is just the businessman in me speaking. I would reject the idea of a slimmed down or crippled version just to get a cheaper product out, but too many bells and hidden whistles is pretty frustrating in any product. As an example, I just bought a new phone that uses Android and tried to set up my voice mail account. I had to go online to find someone to tell me where the setup was hidden, and it was definitely hidden. The documentation did not even address voice mail, yet it supplied overkill (from my perspective) in the areas of applications and data.

Mathematica is what it is and from a commercial perspective, Wolfram is doing what it needs to do to stay on top of their game...they have a business strategy that does not depend on better transparent documentation. Mathematica is the best mousetrap around from my perspective. Could it be better? Probably, but define better.

POSTED BY: Luther Nayhm

Dear Luther, I've been puzzling over your observation for nearly 15 years, and I also fail to understand why Mathematica/Wolfram Language gets dismissed by my colleagues so haphazardly. In discussions with my colleagues about this question, I tend to go in listening mode and don't advocate on one side or another. I'll try to summarize what they say below.

Some background:

I'm a professor at MIT in physical sciences and engineering. I've been teaching a course on mathematics and problem solving in materials science for about twelve years, and I use Mathematica extensively in my course. I can't judge the efficacy of my course objectively, but my students generally give very positive evaluations; I've been given MIT's highest institutional teaching awards, as well as the School of Engineering's. I believe that these awards reflect the quality of the Mathematica-based course. I give nearly all of my research presentations using Mathematica and these are received well (though such feedback tends to be biased towards the positive).

The summary (I don't agree with many of these, and am still honing my counter-arguments):

1) Mathematica is not as fast as X (X may or may not be a compiled language).

2) Mathematica's syntax creates too steep of a learning curve.

3) Our (particular) scientific community uses X; so there are many more routines and working examples in X

4) I am already using X, why should I change?

I believe most of the above are the result of "user inertia".

I don't hear the comment that Mathematica gives "bad results" so much now. I doubt that many faculty roll their own algorithms; they are probably thinking back to when they were postdocs or graduate students--however, there are happy exceptions to this.

I believe that the best way forward is to offer students a choice, and hope that they will make an unfettered choice of their preferred programming language. However, I am not so optimistic: a student who went to graduate school at a university down the street wrote to me saying, "I'm taking this class and they insist on using X. I asked if I could use Mathematica instead, and the instructor said no. We were doing more advanced things in 3.016 (the sophomore class I teach)."

POSTED BY: W. Craig Carter
Posted 11 years ago

I concur with your assessments and those of Park, based on a more general exposure to technologists with particular preferences. I started with a blank slate in terms of exposure and preferences, since I had avoided programming at any level after my early pre-PC experiences with Fortran and Basic and more punch cards than any one graduate student should have to carry. I was never required to do much programming, from my first job at Michigan (we used a Wang with punch cards!!!) to Battelle, where I was fortunate to manage a talented group of scientists and technologists who loved programming. Now, as I close in on my dotage, I am on my own.

I had to find a muscular program to work out some problems that had never been properly solved, because all work on those problems stopped a hundred years ago. A reading of a basic paper showed that the author had not actually done what he said he was going to do; he did something else, because there were no tools like Mathematica available then. The author did what he could. I reread the paper, identified an asyllogistic conclusion, and set out to test the author's methodology. I got results that contradicted the textbooks. However, I did not know if the fault was mine (misinterpretation or misstatement of the real problem), whether the issue was Mathematica (again, was it my use of Mathematica or something within Mathematica), or whether the results were true. So, after research that led to no resolution on any of these points, I asked.

The results of my consulting with several faculty were disappointing. Other than criticizing Mathematica and my methodology, I simply could not get anyone to actually read the statement of the problem as originally presented, nor my interpretation and subsequent modeling using Mathematica. I had expected push back, but not total out-of-hand rejection of my thesis, and the criticism of Mathematica seemed to be a way of avoiding having to address the issues and possible consequences that the problem presented. I am, at this point, working down the list of criticisms to eliminate them. Hard slog.

POSTED BY: Luther Nayhm

"starting with my first job at Michigan (we used a Wang with punch cards!!!)"

Around 1982 I had a job overhauling a Wang Fortran compiler (the project name was, not surprisingly, WANGFOR). I was also asked to augment its functionality in the so-called "Wang enhancement" project (WANGENH). (No, I'm not making this up. Not even the project names. It was a Fortran-66 compiler and they wanted to add some of the Fortran-77 capabilities, if I recall correctly.)

My company and Wang Labs had a falling out several months later and, as best I can tell, mutually fired one another. But the work on that decrepit compiler was, I think, pretty good.

So this makes two of us who worked with Wang's Fortran. Also their assembler code, in my case.

POSTED BY: Daniel Lichtblau

Hello Danny and Luther, That makes three of us who started with punch cards.

As bizarre as it might seem, I think punch cards were a beneficial way to learn to program. I spent so much more time thinking about the program and doing thoughtful debugging because submission of a job was so painful. I confess that I often debug now with less reflective thought, because I can do dozens of haphazard experiments in minutes--this probably speeds the debugging process as often as it slows it down.

I was sold on symbolic computation (pre-Wolfram-Language) when I was able to solve classical mechanics homework faster and more accurately than my classmates. I remember that just being able to do a Taylor expansion around a point, copy down the results, and redraw CRT-rendered plots into my homework felt like cheating.
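That classical mechanics workflow is still a one-liner today; a small sketch of the Taylor expansion step in modern Mathematica:

```mathematica
(* Taylor expansion of Sin[x] about x = 0, to 5th order *)
Series[Sin[x], {x, 0, 5}]
(* -> x - x^3/6 + x^5/120 + O[x]^6 *)

(* Normal strips the O[...] term, leaving an ordinary polynomial *)
Normal[Series[Sin[x], {x, 0, 5}]]
```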

I also remember my own "inertia" of switching from "symbolic-program-x" which was free at Berkeley (and had opaque syntax that I had learned moderately well) to Mathematica 1.

Craig

POSTED BY: W. Craig Carter
Posted 11 years ago

Ah, memories!

POSTED BY: Luther Nayhm