We have run into an issue that prevents us from developing any valuable datasets in Spotfire.
So the situation is the following:
When I run a query based on a table definition, Spotfire fires off a bigquery#table job, which runs fine with no issues.
When I develop a custom query, Spotfire fires off a bigquery#job job, which has a default response timeout of 10 seconds.
After those 10 seconds the BigQuery API responds with "jobComplete": false. The query itself still completes within BigQuery; the API simply stops waiting and returns without data.
The same happens with BigQuery views, except that after the first run the query result is cached in a temporary table, so on the second run the API returns the data even if the query originally took 30 minutes to run.
So when we create a custom query with joins and a WHERE clause, 99% of the time it takes longer than 10 seconds to run on the server side and the API returns no data.
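For reference, here is a minimal sketch of what we believe is happening underneath, assuming the connector goes through the public BigQuery REST endpoints jobs.query and jobs.getQueryResults (the project id and query text below are placeholders). When the 10-second timeoutMs expires, jobs.query returns "jobComplete": false with no rows, and a client has to keep polling jobs.getQueryResults to eventually receive the data:

```python
import time

import google.auth
from google.auth.transport.requests import AuthorizedSession

PROJECT = "my-project"  # placeholder: your GCP project id
QUERY = "SELECT name FROM `my-project.my_dataset.my_table` WHERE name IS NOT NULL"  # placeholder query

credentials, _ = google.auth.default(scopes=["https://www.googleapis.com/auth/bigquery"])
session = AuthorizedSession(credentials)

# jobs.query waits at most timeoutMs for the job to finish; the default is 10000 ms (10 s).
resp = session.post(
    f"https://bigquery.googleapis.com/bigquery/v2/projects/{PROJECT}/queries",
    json={"query": QUERY, "useLegacySql": False, "timeoutMs": 10000},
).json()

# If the job is still running when timeoutMs expires, the response contains
# "jobComplete": false and no rows, even though the job keeps running in BigQuery.
job_ref = resp["jobReference"]
while not resp.get("jobComplete", False):
    time.sleep(5)
    # jobs.getQueryResults polls the same job and returns rows once it finishes.
    params = {"timeoutMs": 10000}
    if "location" in job_ref:
        params["location"] = job_ref["location"]
    resp = session.get(
        f"https://bigquery.googleapis.com/bigquery/v2/projects/{PROJECT}/queries/{job_ref['jobId']}",
        params=params,
    ).json()

print(resp.get("rows", []))
```

If the connection either exposed timeoutMs or kept polling getQueryResults the way the loop above does, the 10-second default would not be a problem for us.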
Please add a timeout parameter to the connection so this setting can be changed if needed.
Hi Thomas, we are currently using 10.6.1. Yes, I think we can try a newer version; I have reached out to our repository manager to get 10.10. I'll keep you posted.
Hi Balazs, thanks for reaching out. This used to be an issue but was fixed in Spotfire 10.9. If you are using an earlier version, would you be able to test with Spotfire 10.10 or 11.0? Thanks, Thomas