Large amount of RAM needed for a Fourier transform

Posted 8 years ago

Hello, I would like a machine with >100 GB of RAM for a Fourier transform. Is such a thing available on the Wolfram Cloud? If not, does anyone know of a slick way to run Mathematica on the Amazon cloud?

Thank you in advance,

John

POSTED BY: John F
4 Replies
Posted 8 years ago

Hi SH,

Thanks for your reply. I'm afraid I do require all those data points. I'm looking to compute the occupation density (local time) of a fractional OU sheet, which is a pain because it amounts to a sort of "two-scale" system: the fractal structure of the fOU sheet needs to be present at a scale much smaller than the resolution of the local-time process. The long-range correlation of the fBm means I can't really do it in sections.

Your suggestion is much appreciated however :)

John

POSTED BY: John F

With that amount of data it is also very hard to do the summation manually (though you could cut your data into batches of, say, 10^7 data points):

[image: the DFT written as an explicit summation]

but, as you can imagine, this summation method scales very badly: the direct sum is O(N^2) work, versus O(N log N) for the FFT...

Hmm, you might indeed be stuck with finding a big machine (some supercomputers certainly have this much memory), or with writing your own FFT-style routine that works on the data in pieces (not sure if that is possible).
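To make the "summation in pieces" idea concrete, here is a sketch in plain Python (an illustration only, not Mathematica code). The outer loop consumes the input one batch at a time, so only a batch plus the output array needs to be held alongside the accumulator, but every sample still contributes to every output bin, which is exactly the O(N^2) cost mentioned above:

```python
import cmath

def dft_in_batches(x, batch):
    # Direct DFT with the input consumed in batches:
    #   X[k] = sum over batches of sum_{n in batch} x[n] * exp(-2*pi*i*k*n/N)
    # Memory-friendly (one batch of input at a time), but still O(N^2) work,
    # versus O(N log N) for an FFT on the whole record.
    N = len(x)
    X = [0 + 0j] * N
    for start in range(0, N, batch):
        for n in range(start, min(start + batch, N)):
            for k in range(N):
                X[k] += x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
    return X
```

Because addition is associative, the batch size does not change the result; it only trades memory for (a lot of) extra bookkeeping relative to an FFT.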

Sorry for wasting your time :-/

POSTED BY: Sander Huisman

Hi John

I don't know the answers to your questions; I'm just wondering: do you really have ~10^11 data points, and do you want the spectrum with 10^11 points as well? That sounds like absurdly high resolution for a spectrum. Can you divide the data into (say) pieces of 10^6 points, calculate the FFT for each, and average them? That gives a spectrum with 10^6 points, which is generally more than enough to capture the typical frequencies and features. I've done this many times on multi-GB datasets; it works very well.
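The divide-transform-average recipe above (essentially Welch's method, without windowing or overlap) can be sketched in plain Python; the naive `dft` below stands in for a real FFT so the sketch stays self-contained, and it is an illustration rather than Mathematica code:

```python
import cmath

def dft(x):
    # Naive O(N^2) DFT; a stand-in for a proper FFT in this stdlib-only sketch.
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def averaged_spectrum(x, chunk):
    # Cut x into non-overlapping chunks, transform each, and average the
    # power |X[k]|^2 across chunks. The result has `chunk` frequency bins,
    # far fewer than len(x), but averaging reduces the variance of each
    # bin's estimate.
    nseg = len(x) // chunk
    acc = [0.0] * chunk
    for s in range(nseg):
        X = dft(x[s * chunk:(s + 1) * chunk])
        for k in range(chunk):
            acc[k] += abs(X[k]) ** 2
    return [a / nseg for a in acc]
```

A sine sitting at bin 4 of the chunk length shows up as a peak at k = 4 in the averaged spectrum, however many chunks the record is cut into; only memory proportional to one chunk is ever needed.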

--SH

POSTED BY: Sander Huisman