Replies: 2 comments 1 reply
-
I am also looking for a solution to this problem. I want to import the data into another database that only accepts CSVs where all strings are quoted. However, the BigQuery API currently has no option for this: https://cloud.google.com/bigquery/docs/reference/rest/v2/Job#JobConfigurationExtract
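For reference, these are the knobs `JobConfigurationExtract` does expose per the doc linked above, and none of them controls quoting. A sketch of the job config (project, dataset, table, and bucket names are placeholders):

```python
# CSV extract options available in JobConfigurationExtract (see the REST doc
# above). Note the absence of any quote/quoting field.
extract_config = {
    "extract": {
        "sourceTable": {
            "projectId": "my-project",
            "datasetId": "my_dataset",
            "tableId": "my_table",
        },
        "destinationUris": ["gs://my-bucket/export-*.csv"],
        "destinationFormat": "CSV",
        "fieldDelimiter": ",",
        "printHeader": True,
    }
}
```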
-
You can extend the BQ operator and post-process the file (a rough sketch follows below). You could use the csv library in the Python standard library to read the file and write it back with QUOTE_ALL: https://stackoverflow.com/questions/30991735/python-csv-reader-writer-handling-quotes-how-can-i-wrap-row-fields-in-quotes. But you would have to download the file locally and push it to GCS rather than use "extract". The extract approach has the benefit that it happens inside Google Cloud and the file never gets downloaded/uploaded by Airflow, but there you are limited to what the BQ extract service can do. I think you should raise your question/request with the BigQuery team :)
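A minimal sketch of that post-processing, assuming the file has already been extracted to GCS; the function name, bucket, object, and local paths here are illustrative, not a drop-in operator:

```python
import csv

from airflow.providers.google.cloud.hooks.gcs import GCSHook


def quote_all_and_reupload(bucket: str, object_name: str,
                           local_path: str = "/tmp/extract.csv") -> None:
    """Download an extracted CSV, wrap every field in quotes, push it back."""
    gcs = GCSHook()
    gcs.download(bucket_name=bucket, object_name=object_name,
                 filename=local_path)

    # Rewrite the CSV so every field is quoted (csv.QUOTE_ALL).
    quoted_path = local_path + ".quoted"
    with open(local_path, newline="") as src, \
            open(quoted_path, "w", newline="") as dst:
        writer = csv.writer(dst, quoting=csv.QUOTE_ALL)
        writer.writerows(csv.reader(src))

    # Overwrite the original object with the fully quoted version. Unlike the
    # in-cloud extract job, this download/upload runs on the Airflow worker.
    gcs.upload(bucket_name=bucket, object_name=object_name,
               filename=quoted_path)
```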
-
I'm encountering an issue when using the `run_extract` method to export table data into a Cloud Storage bucket. I need to export all the fields surrounded by double quotes (`"`), but I didn't find a way to do this. There's no `quote_character` option as in the `run_load` method. When I manually put double quotes in the data source (`"field value"`) and export it using `airflow.providers.google.cloud.hooks.bigquery.BigQueryHook.run_extract`, it surrounds the data with 3 double quotes (`"""field value"""`), which is not what I need. Is there any way to do that?
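To make the asymmetry concrete, a hedged sketch of the two hook calls; the connection ID, table, and bucket names are placeholders, and the exact keyword arguments may vary by provider version:

```python
from airflow.providers.google.cloud.hooks.bigquery import BigQueryHook

hook = BigQueryHook(gcp_conn_id="google_cloud_default")

# run_load accepts quote_character for parsing quoted input...
hook.run_load(
    destination_project_dataset_table="my-project.my_dataset.my_table",
    source_uris=["gs://my-bucket/input.csv"],
    autodetect=True,
    quote_character='"',
)

# ...but run_extract exposes no comparable parameter, so the exported CSV's
# quoting cannot be forced to quote-all here.
hook.run_extract(
    source_project_dataset_table="my-project.my_dataset.my_table",
    destination_cloud_storage_uris=["gs://my-bucket/export.csv"],
    export_format="CSV",
)
```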