
Commit f51fc74: Refactor old BigQuery samples and add new ones.
1 parent 1cb30c5

31 files changed: 1490 additions & 1054 deletions

bigquery/README.md

Lines changed: 71 additions & 46 deletions
@@ -12,9 +12,8 @@ analytics data warehouse.
 * [Setup](#setup)
 * [Samples](#samples)
   * [Create A Simple Application With the API](#create-a-simple-application-with-the-api)
-  * [Calculate size of dataset](#calculate-size-of-dataset)
-  * [Loading Data with a POST Request](#loading-data-with-a-post-request)
-  * [Loading Data from Cloud Storage](#loading-data-from-cloud-storage)
+  * [Datasets](#datasets)
+  * [Tables](#tables)

 ## Setup

@@ -39,46 +38,72 @@ __Run the sample:__
 [basics_docs]: https://cloud.google.com/bigquery/create-simple-app-api
 [basics_code]: getting_started.js

-### Calculate size of dataset
-
-View the [source code][size_code].
-
-__Run the sample:__
-
-Usage: `node dataset_size <projectId> <datasetId>`
-
-Example:
-
-    node dataset_size bigquery-public-data hacker_news
-
-[size_code]: dataset_size.js
-
-### Loading Data with a POST Request
-
-View the [documentation][file_docs] or the [source code][file_code].
-
-__Run the sample:__
-
-Usage: `node load_data_from_csv <path-to-file> <dataset-id> <table-name>`
-
-Example:
-
-    node load_data_from_csv resources/data.csv my-dataset my-table
-
-[file_docs]: https://cloud.google.com/bigquery/loading-data-post-request
-[file_code]: load_data_from_csv.js
-
-### Loading Data from Cloud Storage
-
-View the [documentation][gcs_docs] or the [source code][gcs_code].
-
-__Run the sample:__
-
-Usage: `node load_data_from_gcs <bucket-name> <filename> <dataset-id> <table-name>`
-
-Example:
-
-    node load_data_from_gcs my-bucket data.csv my-dataset my-table
-
-[gcs_docs]: https://cloud.google.com/bigquery/docs/loading-data-cloud-storage
-[gcs_code]: load_data_from_gcs.js
+### Datasets
+
+View the [documentation][datasets_docs] or the [source code][datasets_code].
+
+__Usage:__ `node datasets --help`
+
+```
+Commands:
+  create <name>       Create a new dataset.
+  delete <datasetId>  Delete the specified dataset.
+  list                List datasets in the authenticated project.
+  size <datasetId>    Calculate the size of the specified dataset.
+
+Options:
+  --projectId, -p  Optionally specify the project ID to use.        [string]
+  --help           Show help                                       [boolean]
+
+Examples:
+  node datasets create my_dataset             Create a new dataset named "my_dataset".
+  node datasets delete my_dataset             Delete "my_dataset".
+  node datasets list                          List datasets.
+  node datasets list -p bigquery-public-data  List datasets in a project other than the
+                                              authenticated project.
+  node datasets size my_dataset               Calculate the size of "my_dataset".
+  node datasets size hacker_news -p           Calculate the size of
+    bigquery-public-data                      "bigquery-public-data:hacker_news".
+
+For more information, see https://cloud.google.com/bigquery/docs
+```
+
+[datasets_docs]: https://cloud.google.com/bigquery/docs
+[datasets_code]: datasets.js
+
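The `size` command totals the storage used by every table in a dataset. A minimal sketch of that logic, assuming the sample uses the `@google-cloud/bigquery` Node.js client; the function names here are illustrative, not taken from `datasets.js`:

```javascript
// Pure helper: sum the numBytes metadata field across a list of tables.
// (Illustrative name; not necessarily what datasets.js calls it.)
function sumTableBytes (tables) {
  return tables.reduce(
    (sum, table) => sum + parseInt(table.metadata.numBytes || '0', 10),
    0
  );
}

// Fetch every table in a dataset, load full metadata, and total the bytes.
async function datasetSize (projectId, datasetId) {
  // Assumed dependency: the @google-cloud/bigquery client library.
  const {BigQuery} = require('@google-cloud/bigquery');
  const bigquery = new BigQuery({projectId});
  const [tables] = await bigquery.dataset(datasetId).getTables();
  // The list call returns partial metadata, so refresh each table first.
  await Promise.all(tables.map((table) => table.get()));
  return sumTableBytes(tables);
}
```

Passing `projectId` through to the client constructor is what lets `node datasets size hacker_news -p bigquery-public-data` measure a public dataset outside the authenticated project.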
+### Tables
+
+View the [documentation][tables_docs] or the [source code][tables_code].
+
+__Usage:__ `node tables --help`
+
+```
+Commands:
+  create <dataset> <table>                  Create a new table in the specified dataset.
+  list <dataset>                            List tables in the specified dataset.
+  delete <dataset> <table>                  Delete a table in the specified dataset.
+  import <dataset> <table> <file>           Import data from a local file or a Google
+                                            Cloud Storage file into BigQuery.
+  export <dataset> <table> <bucket> <file>  Export a table from BigQuery to Google
+                                            Cloud Storage.
+
+Options:
+  --help  Show help                                                [boolean]
+
+Examples:
+  node tables create my_dataset my_table             Create table "my_table" in "my_dataset".
+  node tables list my_dataset                        List tables in "my_dataset".
+  node tables delete my_dataset my_table             Delete "my_table" from "my_dataset".
+  node tables import my_dataset my_table ./data.csv  Import a local file into a table.
+  node tables import my_dataset my_table data.csv    Import a GCS file into a table.
+    --bucket my-bucket
+  node tables export my_dataset my_table my-bucket   Export my_dataset:my_table to
+    my-file                                          gcs://my-bucket/my-file as raw CSV
+  node tables export my_dataset my_table my-bucket   Export my_dataset:my_table to
+    my-file -f JSON --gzip                           gcs://my-bucket/my-file as gzipped JSON
+
+For more information, see https://cloud.google.com/bigquery/docs
+```
+
+[tables_docs]: https://cloud.google.com/bigquery/docs
+[tables_code]: tables.js
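The `export` command writes a table to a Cloud Storage object such as `gs://my-bucket/my-file`, optionally as gzipped JSON. A sketch of the core call, assuming the `@google-cloud/bigquery` and `@google-cloud/storage` clients; the function and variable names are illustrative, not taken from `tables.js`:

```javascript
// Pure helper: build the destination URI shown in the examples above.
function gcsUri (bucket, file) {
  return `gs://${bucket}/${file}`;
}

// Start an extract job for the table and wait for it to complete.
// format may be 'CSV' or 'JSON'; gzip compresses the exported file.
async function exportTable (datasetId, tableId, bucket, file, format, gzip) {
  // Assumed dependencies: @google-cloud/bigquery and @google-cloud/storage.
  const {BigQuery} = require('@google-cloud/bigquery');
  const {Storage} = require('@google-cloud/storage');
  const destination = new Storage().bucket(bucket).file(file);
  const [job] = await new BigQuery()
    .dataset(datasetId)
    .table(tableId)
    .extract(destination, {format, gzip});
  console.log(`Exported ${datasetId}.${tableId} to ${gcsUri(bucket, file)}`);
  return job;
}
```

The destination is passed as a Storage `File` object rather than a plain string, which is how the BigQuery client's `extract()` identifies the target bucket and object.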

bigquery/dataset_size.js

Lines changed: 0 additions & 147 deletions
This file was deleted.
