The following question concerns a phase of development where I'm working with a new data set to figure out its features...
Concerning higher-level Mathematica constructs such as TimeSeries, EventSeries, Associations, and Datasets, my initial tendency is to convert raw imports of my data almost immediately into these higher-level constructs. However, during the exploration phase of figuring out the features of my data, I quickly realize I am frequently invoking //Normal in order to use Mathematica functions that are not tuned for the higher-level constructs. That then leads me to worry about runtime inefficiency (or worse) because I am committing too early to converting my raw data into these higher-level constructs.
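For concreteness, here is a minimal sketch of the round trip I am describing, using synthetic data as a stand-in for a raw import (the column names and the symbols raw, ds, ts are just placeholders of my own):

    (* synthetic stand-in for a raw import: a list of {time, value} rows *)
    raw = Table[{t, Sin[t] + RandomReal[0.1]}, {t, 0., 100., 0.01}];

    (* immediate conversion to higher-level constructs *)
    ds = Dataset[AssociationThread[{"time", "value"}, #] & /@ raw];
    ts = TimeSeries[raw];

    (* exploration keeps dropping back to plain lists via //Normal *)
    vals = ds[All, "value"] // Normal;   (* Dataset column -> plain list *)
    path = ts // Normal;                 (* TimeSeries -> list of {time, value} pairs *)

    (* rough cost of the repeated round trip *)
    AbsoluteTiming[Do[Normal[ds[All, "value"]], {10}];]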
Lately, I'm thinking that I ought to restructure my Mathematica notebook development into an exploration phase where I use only basic Mathematica constructs. Later, when I am confident that I understand the features of my data set, a second section of the notebook is where I commit to using the higher-level constructs, knowing I won't need to devolve them with //Normal.
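As a rough sketch of that two-phase layout (again, the data and symbol names here are placeholders of my own, not a real pipeline):

    (* Phase 1: exploration with basic constructs only *)
    rawRows = Table[{t, Sin[t]}, {t, 0., 10., 0.1}];
    Dimensions[rawRows]      (* shape of the raw data *)
    ListPlot[rawRows]        (* quick look at its structure *)

    (* Phase 2: once the features are understood, commit to higher-level constructs *)
    ts = TimeSeries[rawRows];
    Mean[ts]                         (* many statistics accept TimeSeries directly *)
    TimeSeriesWindow[ts, {0, 5}]     (* no //Normal needed in this phase *)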
In summary: if there are indeed negative runtime consequences from invoking //Normal too frequently, what is a good practice for leaving a breadcrumb trail of initial data analysis to discover the essential features of an unknown data set, followed by a full-fledged jump into the higher-level Mathematica constructs as the data processing pipeline takes shape?