We use SQL Server for Spotfire, which imposes a 2 GB size limit on all files stored in the library. To speed up load times for users, we run a nightly automation job that opens an analysis containing numerous large tables and then saves it to the library with the data embedded. However, this no longer works once the file exceeds 2 GB. One option is to run automation jobs that use "Export data table to library" to save each individual table as an SBDF file, but this also fails when any single table exceeds 2 GB, which happens quickly when you are dealing with millions of records. The "Export data to file" task gets around the 2 GB limit because it saves to a network location instead of the library. My request is to add an automation task called "Save Analysis to File", alongside the existing "Save Analysis to Library". This would let me get around the 2 GB limitation in one step instead of several. Hope to see this soon!
Hi Patrick,
I'm curious about how it went for you.
As you know, and as stated in the Spotfire Server 10.10 LTS System Requirements, the problem is that Microsoft SQL Server has a 2 GB size limit for BLOB storage.
Did you manage to work around the MS SQL Server 2 GB limitation by using another database (e.g. Oracle or PostgreSQL) or AWS S3 for the Spotfire library?
Anyhow, we are considering adding a "Save Analysis to File" task to Automation Jobs.
Thanks.
If there were no limitation, then no, I would not need this feature; but that limitation currently exists.
Hi Patrick,
Would you still need a "Save Analysis to File" task in Automation Jobs if there were no 2 GB size limitation in the library due to MS SQL Server (i.e. if you used another database)?
Thanks. Can you help me understand the benefits of that, other than avoiding the file size limit? My main concern is analysis speed, where importing the data into memory seems superior to loading data on demand from a database or the cloud. If I move all the data into the cloud to avoid the file size issue, wouldn't I still have to wait for the data to load when I open the DXP file? Data on demand does not work for what we are trying to do because load times become cumbersome.
Hi Patrick, to minimize the size of your Spotfire database, you can store your organization's Spotfire library content (analyses and analysis data) in the cloud using Amazon Web Services (AWS) S3, or in a file system elsewhere. For more information, see the Spotfire Server Installation and Administration Manual.