Estimated remaining time: monitoring lengthy computations

Posted 12 years ago
I sometimes run lengthy computations, overnight or even over a few days. We have the built-in Monitor function to track the progress of a variable, but I often find it useful to estimate how much time is left until a computation is done. It is easy to estimate, but it would be nice to have it monitored automatically, the way many operating systems do for file transfers, for example.

Below I give a prototype of a function that does this. The idea is very simple and assumes that the computation is approximately uniform in time. A typical use case would be some image processing over a large directory of images. We compute the rate from the progress of a variable (e.g. the number of images processed) and the time elapsed; the rate then lets us estimate the remaining time. Here is the legend for the function's arguments:
  • content – the computation to track
  • ticker – the variable to track
  • min – minimum (starting) value of ticker
  • max – maximum (ending) value of ticker
  • period – periodicity to check the status of computation with

“period” is needed because monitoring, in principle, slows down the original computation. To reduce the slow-down, check the computation status at time intervals longer than a single iteration.
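Just to spell out the arithmetic behind the estimate (the numbers below are made up purely for illustration): the estimated total time is the elapsed time divided by the fraction of the work completed, and the remainder follows by subtraction.

 (* made-up snapshot: 60 s elapsed, ticker at 100 on a range from 0 to 1000 *)
 elapsed = 60.; current = 100; tickerMin = 0; tickerMax = 1000;
 totalEstimate = (tickerMax - tickerMin) elapsed/(current - tickerMin);
 totalEstimate - elapsed   (* 540. seconds estimated to remain at the current average rate *)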

 Clear["Global`*"];
 SetAttributes[RemainingTime, HoldAll];
 RemainingTime[content_, ticker_, min_, max_, period_] :=
  Block[{T0 = AbsoluteTime[], toTimE, timE},
   ticker = min;
   Monitor[content,
    Refresh[
     (* elapsed time so far, and the estimated total time from the average rate;
        the small offset in the denominator avoids division by zero at the start *)
     timE = AbsoluteTime[] - T0;
     toTimE = (max - min) timE/(ticker - min + (max - min) 10^-6);
     Panel@Grid[
       {{"current value  ", ProgressIndicator[ticker, {min, max}], ticker},
        {"remaining time ", ProgressIndicator[toTimE - timE, {0, toTimE}],
         Round[toTimE - timE]}},
       Alignment -> Left],
     UpdateInterval -> period, TrackedSymbols -> {}]
    ]
   ]
Now let us see how the function works. For computations that are uniform in time it works fine:
RemainingTime[Do[Pause[0.01], {i, 1, 500}], i, 1, 500, .1]

For computations that are non-uniform in time, like finding prime numbers, we can clearly see some trouble:
RemainingTime[
 primes = {};
 Do[If[PrimeQ[2^i - 1], AppendTo[primes, i]], {i, 1000, 5000}];
 primes,
 i, 1000, 5000, .1]

Let me know of any suggestions, or post any improvements or variations. For instance, how do we account for computations that are non-uniform in time?
POSTED BY: Vitaliy Kaurov
5 Replies
Posted 12 years ago
Very interesting post, Vitaliy. I have given it a little thought during the day, and to tell the truth, I have found it quite a quest to figure out how to account for non-uniformity in computations. Initially I thought of the solution as a GPS-like algorithm: by imposing a "speed limit", one could calculate the required time for a calculation, and slower calculations would lead to a recalculation of the time estimate. But who would want to slow down a computation? My conclusion was thus that "estimate" is the keyword.

Please forgive my novice level of Mathematica programming. Based on your code:
 Clear["Global`*"];

 SetAttributes[RemainingNonUniformTime, HoldAll];
 RemainingNonUniformTime[content_, ticker_, min_, max_, period_] :=
  Block[{T0 = AbsoluteTime[], lstTickerTracker, tickerTracker = {}},
   ticker = min;
   Monitor[content,
    Refresh[
     lstTickerTracker = AppendTo[tickerTracker, AbsoluteTime[]];
     Panel@Grid[{
        {"Initiated at timestamp:", ToString[DateString[T0]]},
        {"Current progress:  ", ProgressIndicator[ticker, {min, max}], ticker},
        {"Avg. Exp. process termination: ",
         ToString[DateString[
           T0 + (0.5*First@Commonest@Differences@lstTickerTracker/10*(max - min)) +
            (0.5*Max@Differences@Take[lstTickerTracker, -10]/10*(max - min))]]},
        {"Realtime prediction:  ",
         ToString[DateString[
           AbsoluteTime[] + (Mean@Differences@Take[lstTickerTracker, -10]/10)*(max - ticker)]]},
        {"Current timestamp: ", ToString[DateString[AbsoluteTime[]]]}},
       Alignment -> Left],
     UpdateInterval -> period, TrackedSymbols -> {}]]]

There are most likely better ways, but I thought I'd contribute to the discussion, despite my poor proficiency :-)

The code delivers two estimates. The first is averaged, based partly on the most common computation speed and partly on the slowest computation speeds within the last few iterations. The second is based on the current computation speed, applied to the number of calculations remaining. Of course it does not predict the exact end time of the computation, but it gives an idea of how long it will take, provided one does not pause or stall the computation.
POSTED BY: Simon P.
Great, Simon, very interesting! Thank you for your reply. Are you trying to estimate an upper bound for the remaining time based on a few recent "tail" iterations? By the way, I think there is a small initialization error in your code: it gives an error for a brief time when it starts, then resumes properly. Below I give an image of its nice-looking interface.
POSTED BY: Vitaliy Kaurov
Posted 12 years ago
Hello again, Vitaliy. Yes, my apologies, there is an error. I attempt to take the few most recent iterations before the required number of them actually exists, hence the error.

Yes, the "Realtime prediction" is based on some of the tail iterations. It might of course seem that this is not completely valid and not a true forecasting algorithm. Some preliminary investigation made prior to posting the code showed that the time a single computation takes, in a long row of computations, increases and then flattens out, i.e. the longer a computation runs, the more appropriate it should be to assume that the latest iterations mirror the progression speed of the entire process. I have to admit I am not entirely sure my logic holds, but nonetheless, I gave it a shot ;-)

The image below shows a longer computation. As you can see, the pace of the computation "flattens out" (computation progress on the y axis, AbsoluteTime on the x axis).

[attached image: computation progress vs. AbsoluteTime for a longer run]

This approach truly minimizes the resources required for forecasting, which can instead be spent on the actual computation.

I am not sure an algorithm for predicting the progress of a non-uniform computation exists. But I am sure that if we combine the "Realtime prediction" with some kind of logarithmic FindFit(?), we can pull it off :-)
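Just to make the FindFit idea a bit more concrete, here is a rough sketch of the kind of extrapolation I have in mind; the data points and the quadratic model are made up for illustration, not taken from a real run:

 (* hypothetical (ticker value, elapsed seconds) pairs collected during a run *)
 data = {{100, 12.}, {200, 26.}, {300, 43.}, {400, 63.}, {500, 86.}};

 (* fit elapsed time as a function of progress; the quadratic model is only a guess *)
 fit = FindFit[data, a + b t + c t^2, {a, b, c}, t];
 model[t_] = a + b t + c t^2 /. fit;

 (* extrapolate to the final ticker value and subtract the time already spent *)
 model[1000] - data[[-1, 2]]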

Edit: Not to leave erroneous code behind in the community, I took the liberty of correcting the previous excerpt of code.
 SetAttributes[RemainingNonUniformTime, HoldAll];
 RemainingNonUniformTime[content_, ticker_, min_, max_, period_] :=
  Block[{T0 = AbsoluteTime[], lstTickerTracker, tickerTracker = {}},
   ticker = min;
   Monitor[content,
    Refresh[
     (* record a timestamp at every refresh *)
     lstTickerTracker = AppendTo[tickerTracker, AbsoluteTime[]];
     Panel@Grid[{
        {"Initiated at timestamp:", ToString[DateString[T0]]},
        {"Current progress:  ", ProgressIndicator[ticker, {min, max}], ticker},
        {"Avg. Exp. process termination: ",
         If[Length[lstTickerTracker] > 9,
          (* mix of an estimate from the most common refresh interval
             and one from the slowest of the last ten *)
          ToString[DateString[
            T0 + (0.5*First@Commonest@Differences@lstTickerTracker/10*(max - min)) +
             (0.5*Max@Differences@Take[lstTickerTracker, -10]/10*(max - min))]],
          "Initializing..."]},
        {"Realtime prediction:  ",
         If[Length[lstTickerTracker] > 9,
          (* recent average pace projected onto the remaining ticker range *)
          ToString[DateString[
            AbsoluteTime[] + (Mean@Differences@Take[lstTickerTracker, -10]/10)*(max - ticker)]],
          "Initializing..."]},
        {"Current timestamp: ", ToString[DateString[AbsoluteTime[]]]}},
       Alignment -> Left],
     UpdateInterval -> period, TrackedSymbols -> {}]]]
POSTED BY: Simon P.
You could perhaps do a FindFit on the times each successive result took, and extrapolate using that?
POSTED BY: Patrick Stevens
@PatrickStevens Yes, good idea, we could do that type of forecasting, but it would require accumulating points and spending time on the fitting, which would slow down the monitored computation. So I guess the question is how to find a proper balance between a small enough slow-down and sophisticated enough forecasting. I wonder if there are some standard algorithms for this.
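For what it is worth, one cheap and fairly standard trick (roughly what file-transfer dialogs do) is to keep an exponentially smoothed rate, so each refresh costs only a couple of arithmetic operations. A minimal sketch, with the smoothing factor chosen arbitrarily:

 (* exponentially weighted moving average of the progress rate; alpha = 0.3 is an arbitrary choice *)
 alpha = 0.3;
 updateRate[oldRate_, dProgress_, dTime_] := alpha dProgress/dTime + (1 - alpha) oldRate;

 (* remaining time estimated from the smoothed rate *)
 remainingEstimate[rate_, ticker_, max_] := (max - ticker)/rate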
POSTED BY: Vitaliy Kaurov