Tutorial for learning OpenAPI spec + Connexion + AioHTTP + Google API Design Guide
MIT License
The purpose of this tutorial is to facilitate learning about the OpenAPI 3 spec + Connexion + AioHTTP stack, and about designing APIs according to the Google API Design Guide. Proper linting and Python formatting are also encouraged through a pre-commit hook. With some knowledge of how to use a shell (Mac/Linux/Unix - I prefer Bash) + Git + Python 3, you should be able to have this tutorial working for you pretty quickly. The interesting part of this tutorial is that there are pytest tests for each step, which also works as a demo of test-driven development. Each part of the tutorial has a different pytest file and a corresponding Makefile entry to make running it simple. Answers for each step in the tutorial are included in separate Git branches named for the step.
This tutorial assumes previous knowledge of a Unix-like shell, Git, and Python 3.
These instructions will get you a copy of the project up and running on your local machine.
Install Python 3.7 however you like. If you are using a Mac, Homebrew is an easy way to manage package installs. Pyenv is a great tool if you need to work with multiple versions of Python.
This project requires Docker. Set it up by following their instructions.
Once the project is cloned to your computer, dependencies must be installed (preferably in a virtual environment). Install the necessary pip packages listed in the `requirements.txt` file.
If you receive an error about not finding the `pg_config` file while installing the requirements, you need to install a PostgreSQL client before the `psycopg2` install will work. If you are on a Mac with Homebrew, running

```shell
brew install postgresql
```

should fix the issue.
If you can get the AioHTTP server running, then you know your environment is set up correctly for the rest of the tutorials.
Either you must run everything from the root of this project or add the project's directory to your `PYTHONPATH`.
To begin, start up your Docker containers via:

```shell
make docker
```

After the PostgreSQL container starts, it initializes some data. You can verify your PostgreSQL database is running using the command:

```shell
make psql
```

At the prompt you could then run the select query:

```sql
select * from kudos;
```

You should receive a single row of data back. Use `\q` to exit the `psql` command line.
After the Docker containers are set up, you can run the app like:

```shell
make run
```

This will run your server in the foreground and monopolize the shell until you `CTRL+C`. Then you can test the API by using `curl` in a new shell:

```shell
curl -v "http://127.0.0.1:9000/"
```
If you go to the URL in your browser, it will redirect you back to the GitHub project + README.
While the app is running, you can also view the local Swagger UI docs by opening your browser to http://127.0.0.1:9000/ui.
Once you are finished exploring, you can safely `CTRL+C` to shut down the web server.
There are pytest tests which validate that you've correctly completed each step in this tutorial. You can run the tests by executing:

```shell
make tests
```
The first time you run this all the tests will fail. This is expected. You'll be doing the work in this tutorial to get the tests working.
Finally, if you want to contribute code back to the project, please set up the provided githooks. You can do this with:

```shell
make githooks
```
Each of the tests will require a few steps to get it working. First, you should modify `index.yaml` inside the `spec` folder. Each test will require adding new YAML under the `paths` directive. In addition, for test #2 you should add a new item under `components:schemas` for the `Kudo` object, which will be reused in tests #3 & #4.
Each time you modify the spec file, you'll need to run

```shell
make spec
```

to compile the new `resolved.yaml`.
To get test 1 passing, you'll need to do a few items:

- Use the existing `/` path as an example.
- Add a new `/hello-world` route to the `index.yaml`, with `x-openapi-router-controller` (optional) and `operationId` (required).
- Write a handler in `handlers/root.py` that returns a response containing `"Hello"` and its `"World"`. Hint: don't forget the `Content-Type` header.
- Run `make run` and hit your new route, then run `make test_1`.
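As a sketch, the new path entry in `index.yaml` might look like the following. The module path `handlers.root` follows the tutorial's hint about `handlers/root.py`, but the function name `hello_world` and the response description are assumptions; use whatever names your own handler has.

```yaml
paths:
  /hello-world:
    get:
      x-openapi-router-controller: handlers.root   # optional: module holding the handler
      operationId: hello_world                     # required: the handler function to call
      responses:
        '200':
          description: A Hello World greeting
          content:
            application/json:
              schema:
                type: object
```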
To get test 2 passing, you'll need to do the following:
- Add a `/kudos` route to the `index.yaml`. The `Kudo` schema is already defined for you in the `index.yaml` file under schemas; to make the tests pass, use this schema.
- Include `x-openapi-router-controller` (optional) and `operationId` (required).
- You will run into an issue with Python's default `json.dumps` not being able to handle datetime objects natively. There are several clever workarounds, and it will be up to you to choose and implement one.
- Use the `db_conn` setup in `main.py`. This database connection uses the aiopg library.
- The `kudo` resource should contain all the attributes listed in the schema of `index.yaml`.
- Run `make run` and hit your new route, then run `make test_2`.
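One common workaround for the `json.dumps` datetime issue mentioned above is the `default=` hook, which lets you convert any `datetime` into an ISO-8601 string. This is only a sketch of one option; the dict keys below are illustrative, not the actual kudo attributes from your schema.

```python
import json
from datetime import datetime

def encode_datetime(obj):
    # json.dumps calls this hook for any object it can't serialize natively.
    if isinstance(obj, datetime):
        return obj.isoformat()
    raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")

# Illustrative kudo-like dict; real attribute names come from your schema.
kudo = {"id": 1, "created_dt": datetime(2020, 1, 2, 3, 4, 5)}
print(json.dumps(kudo, default=encode_datetime))
# → {"id": 1, "created_dt": "2020-01-02T03:04:05"}
```

Other options include subclassing `json.JSONEncoder` or converting the datetime columns to strings before serializing.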
To get test 3 passing, you'll need to do the following:

- Find the `/kudos` route in `index.yaml` and add a POST method. It should return the `Kudo`.
- Include `x-openapi-router-controller` (optional) and `operationId` (required).
- Use the `db_conn` setup in `main.py`.
- The `kudo` resource returned should contain all the attributes listed in the schema of `index.yaml`.
- Run `make run` and hit your new route, then run `make test_3`.
To get test 4 passing, you'll need to do the following:

- Find the `/kudos/{id}` route in `index.yaml` and add a GET method. It should return the `Kudo`.
- Include `x-openapi-router-controller` (optional) and `operationId` (required).

To get test 5 passing, you'll need to do the following:
- Find the `/kudos/{id}` route in `index.yaml` and add a PUT method. It should take a kudo.
- Include `x-openapi-router-controller` (optional) and `operationId` (required).
- Make sure you update your `updated_dt` column with a new timestamp using SQL `NOW()` when the object is updated.
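In SQL terms, the update might look like this sketch. The `kudos` table name and `updated_dt` column come from the tutorial; the `message` column and the psycopg2-style `%(…)s` parameters are assumptions, so match them to your own schema and query style.

```sql
UPDATE kudos
SET message = %(message)s,   -- assumed column; use your schema's attributes
    updated_dt = NOW()       -- refresh the timestamp on every update
WHERE id = %(id)s;
```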
To get test 6 passing, you'll need to do the following:
- Find the `/kudos/{id}` route in `index.yaml` and add a DELETE method. It should return a …
- Include `x-openapi-router-controller` (optional) and `operationId` (required).