Cannot import large datasets from Google BigQuery via Cloud Storage

Currently, when you try to import a large amount of data from Google BigQuery via Google Cloud Storage, the import fails and you see the following error in the exploratory.log file.

Error in read_tokens_(data, tokenizer, col_specs, col_names, locale_, : ignoring SIGPIPE signal

This is a known Exploratory Desktop issue, and the fix will be included in the next patch.
As a workaround, please add a WHERE clause to your SQL query to reduce the import size.

For example, if you try to import the natality table, which is available in the samples dataset of bigquery-public-data, you need to reduce the result to around 1 million rows for now.

https://bigquery.cloud.google.com/table/bigquery-public-data:samples.natality?pli=1
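Below is a minimal sketch of such a query, written in BigQuery legacy SQL (the dialect the classic web UI linked above uses by default). The year = 2005 filter and the LIMIT value are illustrative assumptions only; choose a predicate that fits your own table so the result stays around or under 1 million rows.

    -- Illustrative workaround only: restrict the natality table to
    -- roughly 1 million rows before importing into Exploratory.
    -- The year = 2005 filter is an assumed example predicate; any
    -- condition that shrinks the result will do. LIMIT caps the row
    -- count as a safety net in case the filter still matches too much.
    SELECT *
    FROM [bigquery-public-data:samples.natality]
    WHERE year = 2005
    LIMIT 1000000

Once the import succeeds at this smaller size, you can split a larger import into several filtered queries (for example, one per year) until the fix ships.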