Refactor old BigQuery samples and add new ones. (#187)
jmdobry committed Aug 26, 2016
1 parent 1a4bf80 commit 2c780b4
Showing 31 changed files with 1,490 additions and 1,054 deletions.
117 changes: 71 additions & 46 deletions bigquery/README.md
@@ -12,9 +12,8 @@ analytics data warehouse.
* [Setup](#setup)
* [Samples](#samples)
* [Create A Simple Application With the API](#create-a-simple-application-with-the-api)
* [Calculate size of dataset](#calculate-size-of-dataset)
* [Loading Data with a POST Request](#loading-data-with-a-post-request)
* [Loading Data from Cloud Storage](#loading-data-from-cloud-storage)
* [Datasets](#datasets)
* [Tables](#tables)

## Setup

@@ -39,46 +38,72 @@ __Run the sample:__
[basics_docs]: https://cloud.google.com/bigquery/create-simple-app-api
[basics_code]: getting_started.js

### Calculate size of dataset

View the [source code][size_code].

__Run the sample:__

Usage: `node dataset_size <projectId> <datasetId>`

Example:

node dataset_size bigquery-public-data hacker_news

[size_code]: dataset_size.js
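At its core, the size calculation sums the `numBytes` metadata field across every table in the dataset. A minimal sketch of that aggregation step (the `numBytes` field name follows BigQuery's table metadata; the helper names are hypothetical, not taken from `dataset_size.js`):

```javascript
// Sum the size of every table in a dataset, given each table's metadata.
// The BigQuery API returns `numBytes` as a string.
function sumTableBytes(tables) {
  return tables.reduce(function (total, table) {
    return total + parseInt(table.numBytes, 10);
  }, 0);
}

// Render a byte count in human-readable units.
function formatSize(bytes) {
  var units = ['B', 'KB', 'MB', 'GB', 'TB'];
  var i = 0;
  while (bytes >= 1024 && i < units.length - 1) {
    bytes /= 1024;
    i++;
  }
  return bytes.toFixed(2) + ' ' + units[i];
}

console.log(formatSize(sumTableBytes([
  { numBytes: '1048576' },
  { numBytes: '524288' }
]))); // → 1.50 MB
```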

### Loading Data with a POST Request

View the [documentation][file_docs] or the [source code][file_code].

__Run the sample:__

Usage: `node load_data_from_csv <path-to-file> <dataset-id> <table-name>`

Example:

node load_data_from_csv resources/data.csv my-dataset my-table

[file_docs]: https://cloud.google.com/bigquery/loading-data-post-request
[file_code]: load_data_from_csv.js

### Loading Data from Cloud Storage

View the [documentation][gcs_docs] or the [source code][gcs_code].

__Run the sample:__

Usage: `node load_data_from_gcs <bucket-name> <filename> <dataset-id> <table-name>`

Example:

node load_data_from_gcs my-bucket data.csv my-dataset my-table

[gcs_docs]: https://cloud.google.com/bigquery/docs/loading-data-cloud-storage
[gcs_code]: load_data_from_gcs.js
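Both loading samples ultimately describe a BigQuery load job; they differ mainly in where the source data comes from. A rough sketch of how the two configurations compare (field names follow BigQuery's load-job configuration, but see each sample's source for what it actually sends):

```javascript
// Build a load-job configuration. If a bucket is given, the job reads from
// Cloud Storage via a gs:// URI; otherwise the file body is uploaded
// directly with the request (the "POST request" variant).
function makeLoadConfig(datasetId, tableName, fileName, bucketName) {
  var load = {
    sourceFormat: 'CSV',
    destinationTable: {
      datasetId: datasetId,
      tableId: tableName
    }
  };
  if (bucketName) {
    load.sourceUris = ['gs://' + bucketName + '/' + fileName];
  }
  return { configuration: { load: load } };
}
```

For example, `makeLoadConfig('my-dataset', 'my-table', 'data.csv', 'my-bucket')` produces a job that reads `gs://my-bucket/data.csv`, while omitting the bucket produces a job with no `sourceUris`, i.e. one expecting inline data.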
### Datasets

View the [documentation][datasets_docs] or the [source code][datasets_code].

__Usage:__ `node datasets --help`

```
Commands:
create <name> Create a new dataset.
delete <datasetId> Delete the specified dataset.
list List datasets in the authenticated project.
size <datasetId> Calculate the size of the specified dataset.

Options:
--projectId, -p Optionally specify the project ID to use.
[string]
--help Show help [boolean]

Examples:
node datasets create my_dataset Create a new dataset named "my_dataset".
node datasets delete my_dataset Delete "my_dataset".
node datasets list List datasets.
node datasets list -p bigquery-public-data List datasets in a project other than the
authenticated project.
node datasets size my_dataset Calculate the size of "my_dataset".
node datasets size hacker_news -p Calculate the size of
bigquery-public-data "bigquery-public-data:hacker_news".

For more information, see https://cloud.google.com/bigquery/docs
```

[datasets_docs]: https://cloud.google.com/bigquery/docs
[datasets_code]: datasets.js
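The `datasets` tool follows the usual subcommand pattern: the first argument names a command, the rest become that command's arguments. The real sample builds this with yargs; the dispatcher below is only an illustrative stand-in with hypothetical handlers in place of the real BigQuery calls:

```javascript
// Map the first CLI argument to a handler and pass along the rest.
function dispatch(commands, argv) {
  var name = argv[0];
  if (typeof commands[name] !== 'function') {
    throw new Error('Unknown command: ' + name);
  }
  return commands[name].apply(null, argv.slice(1));
}

// Hypothetical handlers standing in for the real BigQuery operations.
var commands = {
  create: function (name) { return 'Created dataset ' + name; },
  delete: function (datasetId) { return 'Deleted dataset ' + datasetId; }
};

console.log(dispatch(commands, ['create', 'my_dataset']));
// → Created dataset my_dataset
```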

### Tables

View the [documentation][tables_docs] or the [source code][tables_code].

__Usage:__ `node tables --help`

```
Commands:
create <dataset> <table> Create a new table in the specified dataset.
list <dataset> List tables in the specified dataset.
delete <dataset> <table> Delete a table in the specified dataset.
import <dataset> <table> <file> Import data from a local file or a Google Cloud Storage
file into BigQuery.
export <dataset> <table> <bucket> <file> Export a table from BigQuery to Google Cloud Storage.

Options:
--help Show help [boolean]

Examples:
node tables create my_dataset my_table Create table "my_table" in "my_dataset".
node tables list my_dataset List tables in "my_dataset".
node tables delete my_dataset my_table Delete "my_table" from "my_dataset".
node tables import my_dataset my_table ./data.csv Import a local file into a table.
node tables import my_dataset my_table data.csv Import a GCS file into a table.
--bucket my-bucket
node tables export my_dataset my_table my-bucket Export my_dataset:my_table to
my-file gcs://my-bucket/my-file as raw CSV
node tables export my_dataset my_table my-bucket Export my_dataset:my_table to
my-file -f JSON --gzip gcs://my-bucket/my-file as gzipped JSON

For more information, see https://cloud.google.com/bigquery/docs
```

[tables_docs]: https://cloud.google.com/bigquery/docs
[tables_code]: tables.js
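Note how `import` accepts either a local path or, with `--bucket`, an object in Cloud Storage. One way that decision can be expressed (hypothetical logic for illustration; check `tables.js` for the actual behavior):

```javascript
// Resolve the import source: a gs:// URI when a bucket is supplied,
// otherwise the local file path as given.
function resolveSource(file, bucket) {
  return bucket ? 'gs://' + bucket + '/' + file : file;
}

console.log(resolveSource('data.csv', 'my-bucket')); // → gs://my-bucket/data.csv
console.log(resolveSource('./data.csv'));            // → ./data.csv
```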
147 changes: 0 additions & 147 deletions bigquery/dataset_size.js

This file was deleted.

