Get data faster while using PlanetData?

Posted 3 months ago | 472 Views | 3 Replies | 1 Total Likes

A typical situation is that it takes a very long time to get a result from PlanetData (and it is similar with other data). For example, getting 7 coordinates of the planet Mars takes almost 10 seconds. Requesting a larger amount of data means minutes of waiting. Is this normal behaviour? Is there a way to get the data faster?
Thanks for your help!

In[1]:=  (PlanetData["Mars", 
     Dated["HelioCoordinates", Today + (Tomorrow - Today) #]] & /@ 
   Range[7]) // Timing

Out[1]= {9.84375, {{Quantity[0.757483, "AstronomicalUnit"], 
   Quantity[-1.17841, "AstronomicalUnit"], 
   Quantity[-0.0432667, "AstronomicalUnit"]}, {Quantity[0.769738, 
    "AstronomicalUnit"], Quantity[-1.16958, "AstronomicalUnit"], 
   Quantity[-0.043384, "AstronomicalUnit"]}, {Quantity[0.781909, 
    "AstronomicalUnit"], Quantity[-1.16062, "AstronomicalUnit"], 
   Quantity[-0.0434967, "AstronomicalUnit"]}, {Quantity[0.793996, 
    "AstronomicalUnit"], Quantity[-1.15154, "AstronomicalUnit"], 
   Quantity[-0.0436047, "AstronomicalUnit"]}, {Quantity[0.805997, 
    "AstronomicalUnit"], Quantity[-1.14233, "AstronomicalUnit"], 
   Quantity[-0.043708, "AstronomicalUnit"]}, {Quantity[0.817911, 
    "AstronomicalUnit"], Quantity[-1.133, "AstronomicalUnit"], 
   Quantity[-0.0438065, "AstronomicalUnit"]}, {Quantity[0.829736, 
    "AstronomicalUnit"], Quantity[-1.12355, "AstronomicalUnit"], 
   Quantity[-0.0439003, "AstronomicalUnit"]}}}
3 Replies

On my computer it is faster:

{4.28125, {{Quantity[0.756971, "AstronomicalUnit"], Quantity[-1.17878, "AstronomicalUnit"], Quantity[-0.0432617, "AstronomicalUnit"]}, …

Posted 3 months ago

Sometimes it is faster, sometimes slower:

In[5]:= Table[((PlanetData["Mars", 
        Dated["HelioCoordinates", Today + (Tomorrow - Today) #]] & /@ 
      Range[7]) // Timing)[[1]], 20]

Out[5]= {6.42188, 7.875, 8.78125, 8.59375, 6.84375, 6.84375, 6.89063, 6.8125, 7.125, 8.79688, 6.79688, 6.79688, 6.79688, 6.8125, 7.,  7.51563, 7.1875, 6.82813, 7.71875, 6.70313}

Tomas,

I've experienced similar issues, with computation times increasing by several orders of magnitude when units are involved. I raised the issue with Wolfram's technical support and received a response that included the following comments.

 - "The Quantity objects are designed as a convenience feature rather than a computationally efficient objects."

 - "The best method to handle the efficiency issue is to keep the Quantity objects out of repetitive computations, e.g. Table, loops, or plots, especially if you have a computation of which you know the result unit."

My experience is that, if run time is important, units are best avoided for all but the most trivial of computations.
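
To illustrate that point, here is a minimal sketch of my own (not from the support response): the same reduction run once over a list of Quantity objects and once over their bare magnitudes. The exact timings will differ from machine to machine, but the unit-free version is consistently much faster.

    (* a list of Quantity objects and the corresponding plain numbers *)
    qs = Quantity[#, "Meters"] & /@ RandomReal[1, 10^4];
    xs = QuantityMagnitude /@ qs;

    (* summing the Quantity objects does unit bookkeeping at every step *)
    Total[qs] // AbsoluteTiming

    (* summing the bare magnitudes is a purely numeric operation *)
    Total[xs] // AbsoluteTiming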

As to the reason for the huge slow-down, I'm not sure, although I suspect one factor might be that there is such a vast number of units in the rule base (many of which are extremely obscure) that need to be searched each time a unit appears in an expression. If that is the case, then maybe part of the solution could be to ditch all the units that aren't being used in a particular notebook and work only with the required subset. But that, I think, would be something WR would need to address.

With respect to your current problem, you could try applying QuantityMagnitude[] to the expressions involved in the computations and evaluate only the de-unitised numeric values (although you may need to use UnitConvert first to express the values in consistent units, unless all variables are already expressed in consistent units).
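
For example, something along these lines (just a sketch, reusing the query from your original post and assuming all values should end up in astronomical units, as they do above):

    (* fetch the Quantity-valued coordinates once *)
    coords = PlanetData["Mars",
         Dated["HelioCoordinates", Today + (Tomorrow - Today) #]] & /@ Range[7];

    (* strip the units, converting each value to astronomical units to be safe;
       any further repetitive computation can then use the plain numbers *)
    numeric = Map[QuantityMagnitude[#, "AstronomicalUnit"] &, coords, {2}]

Note that this does not make the PlanetData call itself any faster; it only keeps the units out of whatever you do with the result afterwards.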

I hope this helps,

Ian
