rules for this service-style

Services based on this style should follow these rules:

  • adhere to PEP8!
  • use type-annotations inside function- and method-definitions!
  • write descriptive, well-written docstrings for your modules, functions, methods and classes!
  • use meaningful variable-names inside your code
  • test all relevant parts of your code using pytest!
  • develop inside conda environments and use cenv to keep your conda-environment up to date
  • use the feature-branch workflow for git as described below
  • write an understandable and well-written README.md that allows other developers to build the conda-package and the documentation for the service, run the service and deploy it on another host.
  • keep a changelog for your service and its APIs!

PEP8

To ensure correct usage of PEP8 use a linter like pylint inside your IDE.

For details about PEP8 see https://pep8.org/

type-annotations

Type annotations are an important part of these services, because they are used by the following packages:

  • pydantic
  • attrs

pydantic uses the type annotations inside the definition of the Model-classes (schemata).
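A minimal sketch of such a Model-class (the class- and field-names are purely illustrative) shows how pydantic picks up the annotations:

```python
from pydantic import BaseModel


class MeasurementSchema(BaseModel):
    """Schema of a single measurement (illustrative example)."""

    sensor_id: str
    value: float
    unit: str = 'kelvin'


# pydantic validates and converts the input based on the annotations,
# so value='3.5' is accepted and stored as the float 3.5:
measurement = MeasurementSchema(sensor_id='s1', value='3.5')
```

Leaving out the required field sensor_id would raise a ValidationError instead of silently creating an invalid object.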

attrs-classes are much easier to write using the parameter auto_attribs of the class-decorator @attr.s, because the attributes can then be declared with plain type annotations. Please keep this style of class-definition.
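A minimal sketch of this class-definition-style (the class- and attribute-names are made up):

```python
import attr


@attr.s(auto_attribs=True)
class Sensor:
    """A sensor declared only through annotated attributes (illustrative)."""

    name: str
    sampling_rate: float = 50.0


# attrs generates __init__, __repr__ and __eq__ from the annotations:
sensor = Sensor(name='thermo')
```

Because attrs generates __eq__ as well, two instances built with the same values compare equal, which is convenient in tests.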

Type annotations are also used for creation of the documentation by sphinx. This is done using the sphinx-plugin sphinx_autodoc_typehints.

So please make use of the type annotations!

Tip

For a short introduction on how to use type-annotations read the following tutorials and documentation:

usage of pydantic-models / schemata

Using schemata for the input- and output-definitions of the APIs allows clients to access the APIs of a service easily and to understand what data is needed and in what form it will be returned. That's why the schemata-definitions should be placed in a separate project / conda-package. This allows clients to use the package to access the APIs.
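As a sketch, such a shared schemata-package (all names here are hypothetical) could contain the input- and output-schemata of an endpoint, so the service and its clients validate against the same definitions:

```python
# e.g. inside a shared package like example_service_schemata/orders.py (hypothetical)
from pydantic import BaseModel


class OrderRequest(BaseModel):
    """Input-schema of the create-order endpoint."""

    item_id: int
    quantity: int


class OrderResponse(BaseModel):
    """Output-schema of the create-order endpoint."""

    order_id: int
    total_price: float
```

A client installing this conda-package can build valid requests and parse responses with the exact classes the service uses, instead of guessing the payload-format from documentation.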

google-documentation-style

The sphinx-tool extracts the docstrings of the service, so please write them carefully. The documentation-style we use is the google-documentation-style.

Example

The following example shows how to document your functions using the google-documentation-style, combined with type-annotations. The package sphinx-autodoc-typehints extracts the type-annotations from the function-signature, so the types don't have to be defined twice (in the annotations and in the docstring).

def example_function(param_1: float, param_2: int) -> float:
    """Short description of this function.

    More detailed description of this function.

    Args:
        param_1: Description of param_1
        param_2: Description of param_2

    Returns:
        Description of the result
    """
    result = param_1 * param_2
    return result

Tip

See the following links for details about the google-docstring-style:

pytest

testing defaults

In general you will have to test two scenarios for each function / method / class:

  • everything works as expected
  • in specific cases a defined exception is raised

The naming-convention for these test-functions is:

  • test_<name_of_function_to_test>_ok
  • test_<name_of_function_to_test>_fails

In the ok-tests you would mostly use assert to check that the result matches the expected result.

For example:

import pytest

def test_example_ok():
    assert add_function(1, 2) == 3
    assert add_function(3, 7) == 10

In the fail-tests you can use the with pytest.raises(<exception_expected_to_be_raised>) syntax. For example:

import pytest

def test_example_fails():
    with pytest.raises(ZeroDivisionError):
        res = 1 / 0

using parametrized tests

Tests can also be parametrized, which greatly reduces the number of test-functions to write.

An example for a parametrized test could be:

import pytest

@pytest.mark.parametrize('number', [1, 2, 3, 4, 5])
@pytest.mark.parametrize('word', ['ab', 'ee', 'iv', 'ou', 'uu'])
def test_example_ok(number, word):
    pass

This tests each combination of the parametrized variables number and word, passing them to the test-function test_example_ok. All 25 combinations are tested:

  • number = 1 and word = 'ab'
  • number = 2 and word = 'ab'
  • number = 3 and word = 'ab'
  • number = 4 and word = 'ab'
  • number = 5 and word = 'ab'
  • number = 1 and word = 'ee'
  • number = 2 and word = 'ee'
  • number = 3 and word = 'ee'
  • number = 4 and word = 'ee'
  • number = 5 and word = 'ee'
  • … and so on, up to number = 5 and word = 'uu'

This saves a lot of writing and increases the readability of the tests.

using fixtures

For details about pytest-fixtures see the documentation at https://docs.pytest.org/en/latest/fixture.html.
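As a minimal sketch (the fixture-name and its content are made up), a fixture provides a fresh object to every test that requests it by parameter-name:

```python
import pytest


@pytest.fixture
def service_config():
    """Provide a fresh config-dict to each test requesting it,
    so tests cannot leak state into each other."""
    return {'host': 'localhost', 'port': 8080}


def test_service_config_ok(service_config):
    # pytest injects the return-value of the fixture here
    assert service_config['host'] == 'localhost'
    assert service_config['port'] == 8080
```

Because the fixture-function is called once per test (by default), mutating the dict inside one test does not affect the next one.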

testing aiohttp-api-endpoints

For details on how to test your APIs with mocked request-objects see https://docs.aiohttp.org/en/v2.2.0/testing.html#faking-request-object.

An example of a possible test can be seen in ouroboros.tests.test_service.
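A minimal sketch of such a test, assuming a hypothetical handler handle_status, could look like this (make_mocked_request builds a fake request-object, so no real server has to be started):

```python
import asyncio

from aiohttp import web
from aiohttp.test_utils import make_mocked_request


async def handle_status(request: web.Request) -> web.Response:
    """Hypothetical API-endpoint returning the service-status."""
    return web.json_response({'status': 'ok'})


def test_handle_status_ok():
    # fake request-object instead of a running server
    request = make_mocked_request('GET', '/status')
    response = asyncio.run(handle_status(request))
    assert response.status == 200
```

In a real test-suite you would typically let a plugin like pytest-aiohttp drive the event-loop instead of calling asyncio.run directly; the sketch above only shows the mocked-request idea.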

developing inside conda environments

The services are developed inside their own conda-environment. To create the conda-environment and to update it use the conda-env-manager cenv.

Install it using:

conda create -n cenv_tool cenv

This creates a new conda-environment named cenv_tool with only cenv as a major-dependency.

You could run cenv now using /shared/conda/envs/cenv_tool/bin/cenv, but this is a bit too much typing, so you should create an alias.

As an example, to create such an alias for bash or zsh you could add the following line to your .bashrc / .zshrc:

alias cenv=/shared/conda/envs/cenv_tool/bin/cenv

Now cenv can be run simply with:

cenv

Attention

cenv uses the path /opt/conda as default conda-installation-folder and /shared/conda/envs as default conda-environments-folder. You can overwrite these settings with a .cenv.yml inside your home-folder with the following content:

conda_folder: /opt/conda
env_folder: /shared/conda/envs

There you can define your own conda-installation-path and the conda-environments-folder.

cenv creates and updates the conda-environment (the name and dependencies are defined inside the file conda-build/meta.yaml) for your service.

Attention

Each environment is only created, updated and modified using cenv! This means the commands conda install and conda remove are not used anymore. Changes of the dependencies of the environment are defined inside the meta.yaml and applied by running cenv.

This means:

  • new dependency required => add it in meta.yaml and run cenv.
  • dependency not needed anymore => remove it from meta.yaml and run cenv.
  • another version of a dependency required => change the version of the dependency in meta.yaml and run cenv.

Example for a valid meta.yaml used by cenv

{% set data = load_setup_py_data() %}

package:
    name: "example_package"
    version: {{ data.get("version") }}

source:
    path: ..

build:
    build: {{ environ.get('GIT_DESCRIBE_NUMBER', 0) }}
    preserve_egg_dir: True
    script: python setup.py install

requirements:
    build:
      - python 3.6.8
      - setuptools
    run:
      - python 3.6.8
      - attrs >=18.2
      - jinja2 >=2.10
      - ruamel.yaml >=0.15.23
      - six >=1.12.0
      - yaml >=0.1.7
      - marshmallow >=3.0.0rc1*

test:
    imports:
        - example_package

extra:
    env_name: example
    dev_requirements:
      - ipython >=7.3.0
      - pylint >=2.2.2

Attention

In the requirements-run-section and the extra-dev_requirements-section the version of each package has to be defined! Without defined versions cenv will not create or update the conda-environment, because unpinned dependencies defeat the purpose of the conda-usage. The validity of the meta.yaml is checked by cenv using the marshmallow package.

creating conda-package

You can build a conda-package of the service using the following command:

conda build . --python=<python_version>

If the service is developed for python 3.6 use the following:

conda build . --python=3.6

git-workflow

Always develop inside feature-branches not inside the master-branch!

Danger

Never add files directly to the master!

So before implementing new features, we make sure we are on the master-branch and that the local master-branch is up to date:

git checkout master
git pull

Now we create a feature-branch for the new features to develop:

git checkout -b <your_feature_branch_name>

Then we develop, changing and adding files, etc.

We add the changes for a file with:

git add <filename_of_file_with_changed_content>
git commit -m '<short_description_what_changed>' -m '<long_description_what_changed>' -m '- <detail 1>' -m '- <detail 2>' ...

When you are done with your feature-branch, you have to merge the new content into the master-branch.

First make sure the master-branch is up to date:

git checkout master
git pull

Then rebase your local feature-branch onto the updated master-branch:

git checkout <your_feature_branch_name>
git rebase master

Merge the changes from the rebased local feature-branch into the updated local master-branch:

git checkout master
git merge --no-ff <your_feature_branch_name>

Now push the updates of your local master-branch to the remote master-branch:

git push

Tip

Useful git-tools like the git-info command are contained in the git-extras package. Installation and usage instructions: https://github.com/tj/git-extras.

write a changelog