Unknown type JSON #544

Closed
Kvit opened this issue Oct 28, 2023 · 6 comments · Fixed by #581
Labels
download ⬇️, feature (a feature request or enhancement)

Comments


Kvit commented Oct 28, 2023

I'm getting the error "Unknown type JSON" from bq_table_download() when trying to download a table that includes a field of type JSON.

To reproduce, run the following in the Google Cloud SQL workspace:

CREATE TABLE bigquery (
  id INT,
  data JSON
);
INSERT INTO bigquery (id, data)
VALUES (1, '{"name": "John", "age": 30, "city": "New York"}');

Then download the table with bq_table_download().
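
A minimal sketch of the failing call (the project and dataset names below are placeholders, not from the issue):

library(bigrquery)

tb <- bq_table("my-project", "my_dataset", "bigquery")
bq_table_download(tb)  # fails with "Unknown type JSON"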


Kvit commented Oct 28, 2023

I can get around the issue by converting the JSON column to a plain string in a new table with TO_JSON_STRING(json), dropping the original column, and then converting the string back to JSON after download, but that is a lot of steps.
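
Roughly, that workaround looks like this (table and column names are illustrative, and con is assumed to be a DBI connection to the dataset):

DBI::dbExecute(con, "
  CREATE TABLE bigquery_str AS
  SELECT id, TO_JSON_STRING(data) AS data_str
  FROM bigquery
")
df <- DBI::dbReadTable(con, "bigquery_str")
# data_str is a plain STRING column, so the download succeeds;
# it can be re-parsed into JSON on the R side afterwards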

hadley added the feature and download ⬇️ labels on Nov 2, 2023

hadley commented Nov 6, 2023

In R, would you expect to get a parsed object or just a string containing JSON?


Kvit commented Nov 7, 2023

The object would be ideal, but the string will do too. Perhaps it could be an option, e.g. parse_json = c("string", "object")?
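
To illustrate the suggestion, the hypothetical argument (not an existing bigrquery parameter) might be used like this:

bq_table_download(tb, parse_json = "string")  # return the raw JSON text
bq_table_download(tb, parse_json = "object")  # parse each value into an R list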


hadley commented Nov 9, 2023

Could you help me get your SQL to work? This is what I see:

library(bigrquery)
options(gargle_oauth_email = TRUE)

con <- DBI::dbConnect(bq_test_dataset())
#> ℹ The bigrquery package is using a cached token for 'hadley@posit.co'.

DBI::dbExecute(con, "CREATE TABLE bigquery (
  id INT,
  data JSON
);")
#> [1] 0
DBI::dbExecute(con, r"(INSERT INTO bigquery (id, data)
VALUES (1, '{"name": "John", "age": 30, "city": "New York"}');)")
#> Error in `bq_job_wait()` at bigrquery/R/dbi-result.R:21:3:
#> ! Job 'gargle-169921.job_56LyAlmx4LgUkfwQ_97LO2Q2LaAd.US' failed
#> ✖ Value has type STRING which cannot be inserted into column data, which has type JSON at [2:12] [invalidQuery]

Created on 2023-11-09 with reprex v2.0.2.9000


hadley commented Nov 9, 2023

Don't worry, I found an example in the Google docs:

library(bigrquery)
options(gargle_oauth_email = TRUE)

ds <- bq_test_dataset()
#> ℹ The bigrquery package is using a cached token for 'hadley@posit.co'.
con <- DBI::dbConnect(ds)

DBI::dbExecute(con, "CREATE TABLE table1(
  id INT64,
  cart JSON
)")
#> [1] 0
DBI::dbExecute(con, r"(
  INSERT INTO table1 VALUES
  (1, JSON '{"name": "Alice", "age": 30}'),
  (2, JSON_ARRAY(10, ['foo', 'bar'], [20, 30])),
  (3, JSON_OBJECT('foo', 10, 'bar', ['a', 'b']));
)")
#> [1] 3
DBI::dbReadTable(con, "table1")
#> Error in eval(expr, envir, enclos): Unknown type JSON

Created on 2023-11-09 with reprex v2.0.2.9000


hadley commented Nov 9, 2023

Hmmm, I think I'm going to move to a strategy where for unknown field types I just return the string that BigQuery sends me. Then it'll be up to you to parse it.
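
With that strategy, parsing happens client-side. A sketch using the table from the reprex above, assuming the jsonlite package:

df <- DBI::dbReadTable(con, "table1")
df$cart <- lapply(df$cart, jsonlite::fromJSON)  # parse each JSON string into an R object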

hadley added a commit that referenced this issue Nov 9, 2023
* Parse dates, datetimes, and times in R with the clock package. Fixes #285. Fixes #430
* Return all other unknown types as strings. Fixes #435. Fixes #544.
hadley closed this as completed in 1ce92e1 on Nov 10, 2023