I am trying to export some data from BigQuery. This is done by first saving the table and then exporting it to Google Cloud Storage. This used to work just fine, but recently some tables apparently have nested schemas, so exporting as CSV no longer works. Exporting as JSON should work, and the export job claims to succeed, but the data never appears in Google Cloud Storage. Is anyone experiencing similar issues? Is Google having problems?
Thanks!
Apparently there is a 5 GB limit on the buckets in Google Cloud Storage, which prevented us from exporting our datasets. It's a pity the job doesn't just fail with an out-of-storage error.
The Cloud Storage URI, which is necessary to inform BigQuery where to export the file to, is a simple format: gs://<bucket-name>/<file-name>.
If you wish to place the file in a series of directories, simply add those to the URI path: gs://<bucket-name>/<parent-directory>/<child-directory>/<file-name>.
To export a BigQuery table to a file via the Web UI, the process couldn't be simpler:

1. Select the table you wish to export.
2. Click Export Table in the top-right.
3. Select the Export format and Compression, if necessary.
4. Alter the Google Cloud Storage URI as necessary to match the bucket, optional directories, and file-name you wish to export to.
5. Click OK and wait for the job to complete.

To export a BigQuery table using the BigQuery API, you'll need to make a call to the Jobs.insert
method with the appropriate configuration. The basic configuration structure is given below:
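As a sketch, the extract job configuration passed to Jobs.insert takes roughly the following shape; the project, dataset, table, and bucket names here are placeholders you would replace with your own:

```
{
  "configuration": {
    "extract": {
      "sourceTable": {
        "projectId": "my-project",
        "datasetId": "my_dataset",
        "tableId": "my_table"
      },
      "destinationUris": ["gs://my-bucket/exported-file.json"],
      "destinationFormat": "NEWLINE_DELIMITED_JSON",
      "compression": "NONE"
    }
  }
}
```

The destinationFormat field also accepts CSV and AVRO; note that CSV cannot represent tables with nested or repeated fields, which is why newline-delimited JSON (or Avro) is used for those.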