When working with "big data" it is important to keep memory requirements under control, and oftentimes it is necessary to optimize analyses to reduce memory needs. There are actually two main sources driving Spotfire's memory needs:
1. The size of the actual data tables
The data canvas offers the possibility to review the memory needs of a given data table (even though this feature is not bug-free; I will open a case for that).
So there is at least a tool to analyze this source of memory needs (see also the rough script sketch at the end of this post).
2. Visualizations
Depending on its configuration, a visualization can increase Spotfire's memory needs massively (we've seen up to a factor of 25!).
There is no tool to analyze this source of memory needs.
So, to make analyses "big data ready", it would be good if the data canvas (or any other place) had a tool that showed the memory requirements originating from visualizations.
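In the meantime, a minimal sketch of what I use as a rough workaround for the data-table side: an IronPython script run from a script control in the analysis that lists row and column counts per data table. This is only a crude proxy, not an official feature. It does not report actual bytes in memory and says nothing about the visualization-side overhead described above.

```python
# Rough sketch (assumption: run as an IronPython script inside Spotfire,
# where the Document object is provided by the scripting context).
# Lists row and column counts per data table as a crude proxy for the
# data-side footprint -- it does NOT measure actual memory in bytes.
for table in Document.Data.Tables:
    print("%s: %d rows x %d columns" % (table.Name, table.RowCount, table.Columns.Count))
```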
Hello Mark, have you checked the information in Help -> Support Diagnostics and Logging?