HOWTO Use Quantity and UnitConvert etc. without a huge performance penalty?

I have posted this to Wolfram Support, who have kindly replied and are investigating this, but I also feel this warrants some public discussion.


EDIT 2022-04-28: If you have arrived at this post, which addresses what I consider a very important matter for the future of the Quantity system in Mathematica, it will help if you mostly ignore the first posting and instead read the later comments, which have improved notebooks. I've left the original posting basically as is, but please note the following:

  • The use of Table in the original example is not relevant; it is not a good way of mimicking a large iteration in the actual application of concern, and everything to do with QuantityArray (while useful to know about) is likewise not relevant. See the Timing examples using For instead.

  • The reason For is used is merely to mimic a large iteration in a real application calculation, and to ensure that the test invokes a given Quantity-based function more than once with different arguments each time (as might happen during a root solve).

  • The initial use of a "predefined unit Quantity" as the 2nd argument to UnitConvert was basically a lazy, irregular practice; while it works, it is much slower than using a unit string as the second argument. Nevertheless, it's not clear to me why extracting the unit from a Quantity in that context is so very slow.

  • The main issue is NOT UnitConvert. In the real application, I barely ever even have to use it.

  • The main issue is that performing basic arithmetic with Quantity (between Quantitys, or even just multiplying a Quantity by a Real) is, as far as I can tell, impractically slow. If my tests are correct, it's many hundreds of times slower than doing the same calculations with plain Real values.
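The For-based comparison referred to above can be sketched as follows (a minimal sketch: the loop bodies here are illustrative stand-ins for the real application code, not the actual functions):

```
(* Plain Real arithmetic vs. the same arithmetic on a Quantity.
   Exact timings are machine-dependent; only the ratio matters. *)
n = 10000;
Timing[For[i = 1, i <= n, i++, x = 1.5 (300. + i/n)]]
Timing[For[i = 1, i <= n, i++, q = 1.5 Quantity[300. + i/n, "Kelvins"]]]
```

On the author's figures, the second loop runs hundreds of times slower than the first.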


The original post (see instead comments below)

I am using the (otherwise wonderful) Quantity system right across a very comprehensive client project. I've hit performance problems, and using some quick tests I think I have just determined that using UnitConvert - or even just multiplying or dividing by a unit Quantity - increases computation time for simple computations by a factor of about 2000!

If what I am seeing is true, I am going to have to spend days or maybe even weeks retro-fitting my MMA application to not use Quantity (at least in computationally crucial places).

The following Timings are on an iMac 5K 3.5 GHz Quad-Core Intel Core i5.

The functions shown are not very interesting, they serve merely to demonstrate the problem.

First, I like to use some predefined units (I actually have a large Package library with a special naming convention for this across my app):

unit$K = Quantity[1, "Kelvins"]

unitC = Quantity[1, "DegreesCelsius"]

The inclusion of UnitConvert massively slows this simple conversion computation down!

test$UnitConvert[n_, tK_] :=  Table[ UnitConvert[(tK + i/n) unit$K, unitC], {i, 1, n}]

test$UnitConvert[10000, 300.]; // Timing

{2.839933, Null}

Even just the inclusion of multiplication by a unit Quantity greatly slows it down:

test$UnitConvertDirect[n_, tK_] := Table[ ((tK + i/n) - 273.15) unitC, {i, 1, n}]

test$UnitConvertDirect[10000, 300.]; // Timing

{0.49317, Null}

If the multiplication by the Celsius unit is removed, it's much faster!

test$ConvertNoQuantity[n_, tK_] := Table[ ((tK + i/n) - 273.15), {i, 1, n}]

test$ConvertNoQuantity[10000, 300.]; // Timing

{0.001202, Null}

Estimate of slow-down factor due to combined UnitConvert and multiplication by a unit Quantity:

2.839933/0.001202

2362.67304493

Yes, that's over 2000 times slower. Ouch. Either I am doing something very wrong, or the otherwise brilliant Quantity system is completely unsuitable for use on any substantial real-world project.

To make matters worse, Compile does not work with Quantity:

cfTemp = Compile[{temp}, temp Quantity[1, "Kelvins"] ]

CompiledFunction[....]

cfTemp[300.]

ERROR: CompiledFunction::cfse: Compiled expression 1K should be a machine-size real number.

WARN: CompiledFunction::cfex: Could not complete external evaluation at instruction 1; proceeding with uncompiled evaluation.

300 K

Compile also does not seem to accept a Quantity as an argument to any called user-defined function, and does not like a Quantity returned from a called function, even if you scale it to unitless on the way out.

Now you could of course get around that by (tediously) writing units-aware wrappers that divide Quantity arguments by a unit Quantity before invoking a unitless CompiledFunction, and multiply the unitless result by the appropriate unit Quantity on the way out. But because of the huge performance hit described above, that is completely untenable (and also error-prone).
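A minimal sketch of such a wrapper, assuming a unitless CompiledFunction named cfDistance (the name, formula, and unit choices here are illustrative assumptions, not code from the application):

```
(* Hypothetical units-aware wrapper around a unitless CompiledFunction.
   cfDistance and the particular units are illustrative assumptions. *)
cfDistance = Compile[{{v, _Real}, {t, _Real}, {a, _Real}},
  v t + 0.5 a t^2];

distanceQ[v_Quantity, t_Quantity, a_Quantity] :=
 Quantity[
  cfDistance[
   QuantityMagnitude[v, "Meters"/"Seconds"],   (* strip units on the way in *)
   QuantityMagnitude[t, "Seconds"],
   QuantityMagnitude[a, "Meters"/"Seconds"^2]],
  "Meters"]                                    (* reattach units on the way out *)
```

QuantityMagnitude with a second argument performs the conversion and extraction in one step, which is exactly where the Quantity overhead described above is incurred on every call.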

Yes, yes, I should have tested all this before committing to using Quantity across an entire large application. I knew there would be a performance hit, but I did not imagine it would be a factor of thousands.

I've been extolling the virtues of the Wolfram Language's Quantity system as widely as possible for some time. It's a bit hard to convince the JavaScript-obsessed commercially oriented youth to embrace the Wolfram Language for the benefits of the Quantity system if it is indeed that slow. I hope I'm wrong. I don't ever enjoy it when JavaScript programmers find even the tiniest excuse to laugh at me.

POSTED BY: Darren Kelly
5 Replies
Posted 3 years ago

Not sure what kind of discussion you want to have, but here are a couple of observations.

test$UnitConvert2[n_, tK_] := 
 Table[UnitConvert[Quantity[(tK + i/n), "Kelvins"], "DegreesCelsius"], {i, 1, n}]

is over twice as fast as your original test$UnitConvert. And I did find it a bit odd that you'd want to multiply by a predefined base Quantity as your method of pulling a value into the Quantity space. I don't know how much of the performance gain is due to simply removing extra arithmetic and how much is due to removing any sort of Quantity-related inefficiencies.

Similarly,

test$UnitConvertDirect2[n_, tK_] := 
 Table[Quantity[((tK + i/n) - 273.15), "DegreesCelsius"], {i, 1, n}]

is about 30 times faster than your original test$UnitConvertDirect.

As for the JavaScript kiddos, I'm not really sure you're making the right point. For an apples-to-apples comparison, we need to compare Mathematica's Quantity system with a similarly-featured quantity library implemented in JavaScript. Any such system is going to incur some performance hit. And, to indulge in my own frustrations for a bit, I doubt that using a JavaScript quantity library is interesting to most JavaScript developers (same goes for any other language in place of JavaScript). Unit conversion errors are rampant, and no one really seems to care. I don't think you can make Mathematica any more enticing by showing off the Quantity system--in fact it's likely to have the opposite effect.

But if your primary concern is improving performance in the application you're building, I don't know of any magic formula for using Quantity without degrading computational performance, other than the kinds of things you already alluded to.

POSTED BY: Eric Rimbey

@Eric Rimbey

Many thx for your reply. I agree that using a Quantity (my predefined "unit Quantity" approach) as the 2nd argument to UnitConvert is clearly irregular and not a good idea, and as you showed it comes with some overhead. I also tested your version of this, which uses an explicit unit string "DegreesCelsius":

test$UnitConvertDirect2[n_, tK_] :=  Table[Quantity[((tK + i/n) - 273.15), "DegreesCelsius"], {i, 1, n}]

I also found it to be about 30 times faster. That's something I can easily address in my application. However, it's still massively slower than not using a Quantity at all (Quantity creation and UnitConvert are still dreadfully slow), and it does not address the other main issue.

Here's another example, with no explicit use of UnitConvert (see also the attached notebook and screenshots). Perform the classic distance calculation d = v t + (1/2) a t^2 with and without Quantity arguments and a Quantity return. The versions using Quantity are over 500 times slower.
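The definitions of distanceNoUnits and distanceWithUnits were not included inline; a plausible reconstruction consistent with the formula above might look like this (the exact signatures, units, and timing loops are my assumptions, not the author's notebook code):

```
(* Hypothetical reconstruction: same formula, with and without units. *)
distanceNoUnits[v_, t_, a_] := v t + (1/2) a t^2

distanceWithUnits[v_Quantity, t_Quantity, a_Quantity] := v t + (1/2) a t^2

Timing[For[i = 1, i <= 10000, i++,
  distanceNoUnits[10., 2. + i/10000., 9.8]]]
Timing[For[i = 1, i <= 10000, i++,
  distanceWithUnits[Quantity[10., "Meters"/"Seconds"],
   Quantity[2. + i/10000., "Seconds"],
   Quantity[9.8, "Meters"/"Seconds"^2]]]]
```

The point of the comparison is that the arithmetic itself is identical; only the Quantity wrapping differs between the two timings.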

EDIT 2022-07-28 14:30 AEST: I have removed a notebook screenshot that was causing confusion, and replaced the notebook with a new one that only uses For (not Table) in the timing tests. (If you wish to help, please do not recommend QuantityArray; it is not relevant for the real-world application example.) Consider instead a For loop over a complex network of physics functions, each of which returns a Quantity, computes the units internally and automatically, and is then consumed by many other units-aware physics functions, each of which returns a Quantity, and so on.

POSTED BY: Darren Kelly
Posted 3 years ago

The definitions for distanceNoUnits and distanceWithUnits are missing. Can you please provide them?

POSTED BY: Rohit Namjoshi

@Rohit I've removed the screenshot and updated the notebook to make it clearer and more consistent. I've removed any use of Table in the timing tests.

POSTED BY: Darren Kelly