Node.js client for Google Cloud BigQuery: A fast, economical and fully-managed enterprise data warehouse for large-scale data analytics.
Published by lukesneeringer almost 7 years ago
The import and export methods have been renamed to use the verbs load and extract. (#39)
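To illustrate the rename, here is a minimal sketch (not taken from the release notes; the local file path, bucket, file, dataset, and table names are hypothetical, and extra callback arguments are omitted):

var bigquery = require('@google-cloud/bigquery')();
var storage = require('@google-cloud/storage')();

var table = bigquery.dataset('my_dataset').table('my_table');
var gcsFile = storage.bucket('my-bucket').file('export.csv');

// Previously table.import(...):
table.load('./data.csv', function(err) {
  // The local file was loaded into the table.
});

// Previously table.export(...):
table.extract(gcsFile, function(err) {
  // The table was extracted to the Cloud Storage file.
});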
Job-creating methods have been renamed from start- methods to create*Job methods. This also encompasses the renames in the previous item. (#42)
The startQuery method on the base object:
- BigQuery#startQuery
+ BigQuery#createQueryJob
The job-creating methods on a table object:
- BigQuery/table#startCopy
+ BigQuery/table#createCopyJob
- BigQuery/table#startCopyFrom
+ BigQuery/table#createCopyFromJob
- BigQuery/table#startExport
+ BigQuery/table#createExtractJob
- BigQuery/table#startImport
+ BigQuery/table#createLoadJob
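For example, a rough sketch of the new names in use (the query options object and the Job helper call below are assumptions for illustration, not part of the release notes):

var bigquery = require('@google-cloud/bigquery')();
var dataset = bigquery.dataset('my_dataset');
var table = dataset.table('my_table');

// Formerly bigquery.startQuery(...); the callback still receives a Job.
bigquery.createQueryJob({ query: 'SELECT 1' }, function(err, job, apiResponse) {
  if (err) {
    return;
  }
  job.getQueryResults(function(err, rows) {});
});

// Formerly table.startCopy(...).
table.createCopyJob(dataset.table('my_table_copy'), function(err, job, apiResponse) {});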
Query results are now paginated automatically; users should use the startQuery method (now createQueryJob, per the rename above) if they need manual pagination. (#38)
Published by stephenplusplus almost 7 years ago
The code has been migrated from the GoogleCloudPlatform repository to googleapis/nodejs-bigquery.
$ npm install @google-cloud/bigquery
The default value of useLegacySql has changed from true to false.
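If you still rely on legacy SQL, you can opt back in per query. A minimal sketch, assuming the standard query options object (the table reference is hypothetical):

var bigquery = require('@google-cloud/bigquery')();

bigquery.query({
  query: 'SELECT * FROM [my_dataset.my_table]', // legacy SQL table syntax
  useLegacySql: true // restore the previous default
}, function(err, rows) {});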
table.copy no longer passes a Job to its callback:

table.copy(function(err, job, apiResponse) {});

has become

table.copy(function(err, apiResponse) {});

If you still need a reference to the Job, use the new startCopy method:

table.copy(function(err, job, apiResponse) {});

has become

table.startCopy(function(err, job, apiResponse) {});
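A sketch of how startCopy might be used to track the copy; the destination table and the Job event names ('complete'/'error') are assumptions based on the library's Job objects:

var bigquery = require('@google-cloud/bigquery')();
var dataset = bigquery.dataset('my_dataset');
var table = dataset.table('my_table');

table.startCopy(dataset.table('my_table_copy'), function(err, job, apiResponse) {
  if (err) {
    return;
  }

  job
    .on('error', function(err) {})
    .on('complete', function(metadata) {
      // The copy job finished.
    });
});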
$ npm install @google-cloud/[email protected]
We've officially dropped support for Node v0.12.x in all of our APIs (https://github.com/GoogleCloudPlatform/google-cloud-node/pull/2171). It may still work, but it is not officially supported and may stop working at any time.
$ npm install @google-cloud/bigquery
$ npm install @google-cloud/bigquery
In many cases, it is no longer necessary to provide a projectId. See Authentication for more.
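For illustration, a sketch of the two ways to construct the client (the project, key file path, and whether auto-detection applies in your environment are assumptions):

// Explicit configuration still works:
var configured = require('@google-cloud/bigquery')({
  projectId: 'my-project',
  keyFilename: '/path/to/keyfile.json'
});

// Where the project and credentials can be detected automatically,
// no options are needed:
var detected = require('@google-cloud/bigquery')();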
$ npm install @google-cloud/bigquery
Previously, when inserting multiple rows of data into a table, if any of them failed to insert, the failed insertion errors were passed in a separate argument to the callback. We've since introduced a custom error type called PartialFailureError, which is returned as the first err argument to your callback.

Before:
table.insert(data, function(err, insertErrors, apiResponse) {
if (err) {
// An API error occurred.
}
if (insertErrors.length > 0) {
// Some rows failed to insert, while others may have succeeded.
//
// insertErrors[].row (original row object passed to `insert`)
// insertErrors[].errors[].reason
// insertErrors[].errors[].message
}
});
After:

table.insert(data, function(err, apiResponse) {
if (err) {
// An API error or partial failure occurred.
if (err.name === 'PartialFailureError') {
// Some rows failed to insert, while others may have succeeded.
// err.errors[].row (original row object passed to `insert`)
// err.errors[].errors[].reason
// err.errors[].errors[].message
}
}
});
$ npm install @google-cloud/bigquery
It's been a long time coming, but we're finally here. We've shipped promise support in the Google Cloud BigQuery module!
Do I have to change my existing code? Nope, carry on. (But keep reading if you use streams.)
To get a promise instead, simply don't pass a callback:
var bigquery = require('@google-cloud/bigquery')();
bigquery.getDatasets()
.then(function(data) {
var datasets = data[0];
})
.catch(function(err) {});
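The same pattern applies to every callback-style method. For instance, a sketch of running a query with promises (the data[0] convention mirrors the getDatasets example above):

var bigquery = require('@google-cloud/bigquery')();

bigquery.query('SELECT 1')
  .then(function(data) {
    var rows = data[0];
  })
  .catch(function(err) {});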
All of the streaming functionality from our methods has been moved into dedicated stream methods.
var bigquery = require('@google-cloud/bigquery')();
- bigquery.getDatasets()
+ bigquery.getDatasetsStream()
.on('data', function(dataset) {})
- bigquery.getJobs()
+ bigquery.getJobsStream()
.on('data', function(job) {})
- bigquery.query('SELECT ...')
+ bigquery.createQueryStream('SELECT ...')
.on('data', function(row) {})
var dataset = bigquery.dataset('my-dataset-name');
- dataset.getTables()
+ dataset.getTablesStream()
.on('data', function(table) {})
- dataset.query('SELECT ...')
+ dataset.createQueryStream('SELECT ...')
.on('data', function(row) {})
var table = dataset.table('my-table-name');
- table.getRows()
+ table.createReadStream()
.on('data', function(row) {})
- table.query('SELECT ...')
+ table.createQueryStream('SELECT ...')
.on('data', function(row) {})
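Putting it together, a sketch of consuming one of the new stream methods with the standard Node.js stream events:

var bigquery = require('@google-cloud/bigquery')();

bigquery.createQueryStream('SELECT 1')
  .on('error', function(err) {})
  .on('data', function(row) {
    // Each row arrives as an object.
  })
  .on('end', function() {
    // All rows have been received.
  });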