I've had a number of situations where I needed more complex data extraction jobs than can be handled through Information Link caching or scheduled updates. Being able to create a DXP that does the necessary data cleansing and mashup across sources is a big plus in Spotfire, but that logic becomes locked into that DXP. Using that same DXP as more of an ETL job and staging the data in the library as an .SBDF file allows the complex data prep and transformation logic to be reused easily across multiple DXPs: each user DXP points to data stored in the library that was staged by the ETL-like DXP, and Automation Services can then automate the refreshing of that data.

What I'm looking for is an Export Data Table to Library task that lets you pick from the list of available data tables in the currently opened analysis and specify a file name and folder location for the extract.
I've built a DXP template that uses IronPython to address some of these needs, but having this built into the product is necessary in order to leverage this capability more widely.
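For reference, here is a minimal sketch of the IronPython export step such a template might perform. The table name and output path are hypothetical, and this writes the SBDF to a staging location on disk; the staged file would still need to be published to the library.

    # Runs as an IronPython script inside Spotfire, where 'Document'
    # is supplied by the script context.
    from Spotfire.Dxp.Data import IndexSet
    from Spotfire.Dxp.Data.Export import DataWriterTypeIdentifiers
    from System.IO import File

    table = Document.Data.Tables["Staged Data"]  # hypothetical table name
    writer = Document.Data.CreateDataWriter(DataWriterTypeIdentifiers.SbdfDataWriter)

    rows = IndexSet(table.RowCount, True)      # all rows
    columns = [c.Name for c in table.Columns]  # all columns

    # Hypothetical staging path; write the table out as SBDF.
    stream = File.OpenWrite("C:/staging/staged_data.sbdf")
    try:
        writer.Write(stream, table, rows, columns)
    finally:
        stream.Close()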
Implemented in: 7.6
The new Export Data Table to Library task in Spotfire 7.6 allows saving SBDF (Spotfire Binary Data Format) files in the library. Analyses (DXPs) can then use the data from these SBDFs, so an analysis opens quickly and without hitting the database. See the following community article for more information: https://community.tibco.com/wiki/caching-data-tibco-spotfire-automation-services
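For the consuming side, here is a minimal IronPython sketch of adding or refreshing a table in a user DXP from a staged SBDF file. The path and table name are hypothetical; the same SBDF stored in the library can instead be added through the Add Data Tables dialog.

    # 'Document' is supplied by the Spotfire IronPython script context.
    from Spotfire.Dxp.Data.Import import SbdfFileDataSource

    source = SbdfFileDataSource("C:/staging/staged_data.sbdf")  # hypothetical path
    if Document.Data.Tables.Contains("Staged Data"):
        # Refresh the existing table in place instead of adding a duplicate.
        Document.Data.Tables["Staged Data"].ReplaceData(source)
    else:
        Document.Data.Tables.Add("Staged Data", source)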
I just recently found out that 7.6 now includes an "Export Data Table to Library" Automation Services task.
You can see information regarding it here: https://community.tibco.com/wiki/caching-data-tibco-spotfire-automation-services
Oliver, we would be interested in your solution. Any way you can post it on community.tibco.com and share the link here?
I have created this task and can share my code if necessary. The use case is exactly the same as above.