A CLI module that assists in creating and running Postgres migrations using async/await.

MIT License.
This module will generate template migration files, in a directory of your choosing, that contain empty function closures for `up` and `down` migrations. These closures will be given a knex client connection per your provided database configuration. Running a migration creates a table on your database's `public` schema called `node_pg_migrate` that tracks the state of any migrations executed.
You'll notice all generated migrations are `async` functions, which means at least Node 7 is required. The module parses and sorts migrations into a sequence and runs them one at a time, awaiting each before starting the next. This gives you, the migration author, control over what actually comprises your migration and allows you to capture errors for "optional" or failure-allowed migrations.
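The sequential, error-capturing flow described above can be sketched roughly as follows. This is an illustration, not the module's actual implementation; the runner function, the index name, and the `client.raw` call are all placeholders:

```javascript
'use strict';

// Illustrative sketch: a runner that awaits each migration in order,
// so one migration finishes before the next begins.
async function runInSequence(migrations, client) {
  for (const migration of migrations) {
    await migration.up(client);
  }
}

// A failure-allowed migration: the author catches the error themselves,
// so a failure here does not stop the rest of the sequence.
const optionalMigration = {
  async up(client) {
    try {
      await client.raw('CREATE INDEX idx_example ON example (col)');
    } catch (err) {
      console.warn('optional migration skipped:', err.message);
    }
  }
};
```

Because each migration is awaited, a required migration that throws will reject the whole run, while an "optional" one like the above simply logs and moves on.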
I (@ktstowell) personally dislike the fragmentation of JavaScript libraries; however, I felt compelled to make this for the following reasons:

- async/await support.
- I never want environment checks like `if (env === 'local')` in my code. That should be the responsibility of whatever is orchestrating the migrations. Now in my npm scripts I have `npm run migrate` and `npm run migrate:local`, so the code itself remains environment agnostic.
- With the `types` api you can specify what kinds of migrations you want and it will create sub folders for them.
- The `mode: 'high-availability'` option will inject a `target` and a `source` connection into your migration files.

To install:

```
npm install @liveaxle/node-pg-migrate
```
Add `"npgm": "node ./node_modules/@liveaxle/node-pg-migrate"` to the `scripts` section of your application's package.json so you don't have to type the whole binary path.
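For example, a minimal package.json carrying that script entry might look like this (the `name` field is a placeholder; only the `scripts` entry matters):

```json
{
  "name": "my-app",
  "scripts": {
    "npgm": "node ./node_modules/@liveaxle/node-pg-migrate"
  }
}
```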
If you see any issues, feel free to report them on the issues tab of this repo. If you have an improvement or change you'd like to make, please submit a PR. If approved and the PR is merged to master, we will determine where the change falls under the SemVer convention, then update the package version and publish accordingly.
`.npgmrc` files are supported and can contain the following global options:

- `ordering`:
  - `sequential`: default option. Will create files numbered as `01.<name>.migration.js`.
  - `timestamp`: will create files as `<Date.now()>.<name>.migration.js`.
- `directory`: folder where your migrations will live. Will be created for you if it doesn't exist.

Database connection config can be loaded in one of two ways:
Environment variables: we use the `dotenv` package to load any `.env` that exists in `process.cwd()` - which is the root of the project you are invoking this module from. We support the following variable names:

```
POSTGRES_HOST=
POSTGRES_PORT=
POSTGRES_DB= | POSTGRES_DATABASE=
POSTGRES_PASSWORD=
POSTGRES_USER=
POSTGRES_SCHEMA=
```
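For example, a `.env` at your project root might look like this (all values below are placeholders):

```
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_DB=myapp
POSTGRES_USER=postgres
POSTGRES_PASSWORD=changeme
POSTGRES_SCHEMA=public
```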
Alternatively, we accept command line arguments, which take precedence over any environment variables if provided. These arguments can be passed as `--<arg name>` to all cli methods. You can also just pass the whole configuration in as `--connection`, a Postgres connection string.
Name | values | required | default |
---|---|---|---|
user | db user name | yes, here or in .env | null |
host | db host | yes, here or in .env | null |
port | db port | yes, here or in .env | null |
password | db password | yes, here or in .env | null |
schema | db schema | yes, here or in .env | null |
connection | db conn string | this or as above args | null |
```
npm run npgm create (<name> | --name)
```
Name | values | required | default |
---|---|---|---|
ordering | sequential,timestamp | no | sequential |
directory | path to your migrations dir | yes, can be in .rc | 'migrations' |
types | schema, data | no | schema |
mode | standard, high-availability | no | 'standard' |
```
npm run npgm create -- --name <name> --before <othername>
```
```
npm run npgm up
```
Name | values | required | default |
---|---|---|---|
include | any specific migrations to run, if empty will run all. Use one --flag per migration | no | [] |
exclude | any specific migrations to ignore, if empty will run all. Use one --flag per migration | no | [] |
We recommend using an `.npgmrc` file so your configuration can be consistent across method calls. If you have created your migrations with both `schema` and `data` types, running `up` will execute your migration types in the order they were provided, i.e. `["schema", "data"]` or `--types=schema --types=data` will both run schema migrations first. This is because the migrate create config can support `n` number of types should you choose; it's not actually coupled to the words `schema` or `data`.
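As a sketch of the recommended rc file, and assuming it is plain JSON (the exact format accepted isn't shown here), an `.npgmrc` consolidating the global options might look like:

```json
{
  "ordering": "sequential",
  "directory": "migrations"
}
```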
```
npm run npgm down
```
Name | values | required | default |
---|---|---|---|
directory | path to your migrations dir | yes, can be in .rc | 'migrations' |
include | any specific migrations to run, if empty will run all. Use one --flag per migration | no | [] |
exclude | any specific migrations to ignore, if empty will run all. Use one --flag per migration | no | [] |
See the `up` instructions for multiple types. If you have set multiple types in your configuration, `down` will run the types in reverse order, as well as the migrations themselves.
```
npm run npgm reset
```
Will simply drop the migrations table from your database. It DOES NOT run `down` migrations. This is typically most useful in a development setting.
In your repository where migrations will be stored:

```
npm run npgm create foo
```

This will create:

```
./migrations/<timestamp>.foo.migration.js
```

The file will contain:
```javascript
'use strict';

/***********************************************************************************************************************************************
 * NODE DB MIGRATE - FOO
 ***********************************************************************************************************************************************
 * @author File generated by @liveaxle/node-pg-migrate
 * @description
 *
 */

/**
 * [exports description]
 * @type {Object}
 */
module.exports = {
  up, down
};

/**
 * [up description]
 * @return {[type]} [description]
 */
async function up(client) {

}

/**
 * [down description]
 * @return {[type]} [description]
 */
async function down(client) {

}
```
In each of the above functions you can now write whatever the "foo" migration means to your application. Down migrations should be the opposite, in action and order, of your up migrations. Then, at some later point when you actually need to execute your migrations, you can run `npm run npgm up` and it will execute the sql in the `up` closure above in sequence.
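For instance, a filled-in `up`/`down` pair might look like the following. The `accounts` table and its columns are purely illustrative; `client` is the injected knex connection, used here via `client.raw`:

```javascript
'use strict';

// Hypothetical example of a completed migration. The table name and
// columns are illustrative only.
module.exports = { up, down };

async function up(client) {
  // Apply the change: create the table if it does not already exist.
  await client.raw(`
    CREATE TABLE IF NOT EXISTS accounts (
      id SERIAL PRIMARY KEY,
      email TEXT NOT NULL UNIQUE
    )
  `);
}

async function down(client) {
  // Reverse of up: drop the table the up migration created.
  await client.raw('DROP TABLE IF EXISTS accounts');
}
```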
```
npgm create foo --mode=high-availability
```
```javascript
'use strict';

/***********************************************************************************************************************************************
 * NODE DB MIGRATE - FOO
 ***********************************************************************************************************************************************
 * @author File generated by @liveaxle/node-pg-migrate
 * @description
 *
 */

/**
 * [exports description]
 * @type {Object}
 */
module.exports = {
  up, down
};

/**
 * [up description]
 * @return {[type]} [description]
 */
async function up(target, source) {

}

/**
 * [down description]
 * @return {[type]} [description]
 */
async function down(target, source) {

}
```
The intent of this `mode` is to inject your two db clients into your migration closure, enabling you to export data from the target and apply it to the source.
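A filled-in high-availability migration might look like the sketch below. The table, columns, and copy direction are illustrative assumptions, not part of the generated template; adapt them to your own topology:

```javascript
'use strict';

// Hypothetical high-availability migration body: read rows over one
// connection and apply them over the other.
module.exports = { up, down };

async function up(target, source) {
  // Export rows from the target connection...
  const result = await target.raw('SELECT id, email FROM accounts');

  // ...and apply them to the source connection, one insert per row,
  // using knex's `?` bindings rather than string interpolation.
  for (const row of result.rows) {
    await source.raw(
      'INSERT INTO accounts (id, email) VALUES (?, ?)',
      [row.id, row.email]
    );
  }
}

async function down(target, source) {
  // Reverse: remove the copied rows from the source.
  await source.raw('DELETE FROM accounts');
}
```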
```
npgm create foo --types=data --types=schema
```

This will create a migration folder structure like:

```
/migrations
  /data
    <ordering>.foo.migration.js
  /schema
    <ordering>.foo.migration.js
```