Create a CI for code quality and testing

When developing your project, it is highly recommended to use pre-commit to check many things in your code, such as typing, English spelling, tests, …
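As a reminder, pre-commit is driven by a .pre-commit-config.yaml file at the root of the repository. A minimal sketch covering a few of the checks mentioned above might look like this (the rev values are illustrative; pin whatever versions your project actually uses):

```yaml
# .pre-commit-config.yaml (sketch; rev values are illustrative)
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.5.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
  - repo: https://github.com/psf/black
    rev: 24.3.0
    hooks:
      - id: black
  - repo: https://github.com/codespell-project/codespell
    rev: v2.2.6
    hooks:
      - id: codespell
  - repo: https://github.com/pre-commit/mirrors-mypy
    rev: v1.9.0
    hooks:
      - id: mypy
```

The hook ids used later in this article (isort, ruff, markdownlint-cli2, …) are declared the same way, each under its own repo entry.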

A good practice is to run those verifications on GitLab whenever you push or merge a branch. To do that, you need a CI pipeline that executes pre-commit.

Setting up the CI

To build your CI you need a .gitlab-ci.yml file and a runner. To create a runner, see the GitLab documentation.

To write your gitlab-ci file, follow the steps below.

You can find the GitLab CI documentation here

1. Parent image

First of all, the CI runs in a Docker container built from a custom image. You can see How to make a docker image for gitlab-ci.

image: "gitlab.obspm.fr:4567/<your_group>/<your_docker_project>:latest"

2. Stages

Next, you must define your stages. Here we use four stages:

  • install_poetry to install your dependency manager
  • install_dependencies to install all dependencies
  • pre-commit to run all pre-commit hooks
  • cov_tests to run your coverage

stages:
  - install_poetry
  - install_dependencies
  - pre-commit
  - cov_tests

3. Install poetry

Now you can create your first job: install_poetry.

Define the name of your job:

install_poetry:

Give the stage to which your job belongs:

stage: install_poetry

If your program needs PostgreSQL, you must tell the CI to use the postgres service:

  services:
    - postgres

And set up some environment variables:

  variables:
    POSTGRES_DB: exoimport
    POSTGRES_USER: toto
    POSTGRES_HOST_AUTH_METHOD: trust

Now you can write your script:

  script:
    - which poetry | grep -o poetry > /dev/null &&  curl -sSL https://install.python-poetry.org | python3 - --uninstall || echo "poetry not installed"
    - curl -sSL https://install.python-poetry.org | python3 -

  1. Check whether a version of poetry is already installed and uninstall it if so.
  2. Install poetry.

The complete job looks like:

install_poetry:
  stage: install_poetry
  services:
    - postgres
  variables:
    POSTGRES_DB: exoimport
    POSTGRES_USER: toto
    POSTGRES_HOST_AUTH_METHOD: trust
  script:
    - echo "Installing poetry"
    - which poetry | grep -o poetry > /dev/null &&  curl -sSL https://install.python-poetry.org | python3 - --uninstall || echo "poetry not installed"
    - curl -sSL https://install.python-poetry.org | python3 -
    - poetry --version
    - echo "Poetry is installed"

4. Install dependencies

Define the name of your job:

install_dependencies:

Tell the CI which jobs this one depends on:

  needs: [install_poetry]

Give the stage to which your job belongs:

stage: install_dependencies

Declare an environment variable for poetry, to work around a known poetry bug:

You can read the issue here.

  variables:
    PYTHON_KEYRING_BACKEND: keyring.backends.null.Keyring

Finally write your script:

  script:
    - poetry install
    - poetry run pre-commit clean
    - poetry run pre-commit install --hook-type pre-merge-commit
    - poetry run pre-commit install --hook-type pre-push
    - poetry run pre-commit install --hook-type post-rewrite
    - poetry run pre-commit install-hooks
    - poetry run pre-commit install

  1. Install dependencies with poetry
  2. Clean pre-commit
  3. Install the pre-merge-commit hooks of pre-commit
  4. Install the pre-push hooks of pre-commit
  5. Install the post-rewrite hooks of pre-commit
  6. Install all other hooks of pre-commit
  7. Finalize the installation of all hooks

The complete job looks like:

install_dependencies:
  needs: [install_poetry]
  stage: install_dependencies
  variables:
    PYTHON_KEYRING_BACKEND: keyring.backends.null.Keyring
  script:
    - poetry install
    - poetry run pre-commit clean
    - poetry run pre-commit install --hook-type pre-merge-commit
    - poetry run pre-commit install --hook-type pre-push
    - poetry run pre-commit install --hook-type post-rewrite
    - poetry run pre-commit install-hooks
    - poetry run pre-commit install

5. Run pre-commit

Define the name of your job:

pre-commit_hooks_formater_linter:

Tell the CI which jobs this one depends on:

  needs: [install_poetry, install_dependencies]

Give the stage to which your job belongs:

  stage: pre-commit

Now you can write your script:

  script:
    - poetry run pre-commit run no-commit-to-branch --all-files
    - poetry run pre-commit run check-merge-conflict --all-files
    - poetry run pre-commit run debug-statements --all-files
    - poetry run pre-commit run trailing-whitespace --all-files
    - poetry run pre-commit run end-of-file-fixer --all-files
    - poetry run pre-commit run check-yaml --all-files
    - poetry run pre-commit run check-toml --all-files
    - poetry run pre-commit run check-added-large-files --all-files
    - poetry run pre-commit run check-shebang-scripts-are-executable --all-files
    - poetry run pre-commit run isort --all-files
    - poetry run pre-commit run black --all-files
    - poetry run pre-commit run codespell --all-files
    - poetry run pre-commit run mypy --all-files
    - poetry run pre-commit run ruff --all-files
    - poetry run pre-commit run markdownlint-cli2 --all-files

  • Lines 1 to 9: Run the basic pre-commit hooks
  • Line 10: Run isort
  • Line 11: Run black
  • Line 12: Run codespell
  • Line 13: Run mypy
  • Line 14: Run ruff
  • Line 15: Run markdownlint

The complete job looks like:

pre-commit_hooks_formater_linter:
  needs: [install_poetry, install_dependencies]
  stage: pre-commit
  script:
    - poetry run pre-commit run no-commit-to-branch --all-files
    - poetry run pre-commit run check-merge-conflict --all-files
    - poetry run pre-commit run debug-statements --all-files
    - poetry run pre-commit run trailing-whitespace --all-files
    - poetry run pre-commit run end-of-file-fixer --all-files
    - poetry run pre-commit run check-yaml --all-files
    - poetry run pre-commit run check-toml --all-files
    - poetry run pre-commit run check-added-large-files --all-files
    - poetry run pre-commit run check-shebang-scripts-are-executable --all-files
    - poetry run pre-commit run isort --all-files
    - poetry run pre-commit run black --all-files
    - poetry run pre-commit run codespell --all-files
    - poetry run pre-commit run mypy --all-files
    - poetry run pre-commit run ruff --all-files
    - poetry run pre-commit run markdownlint-cli2 --all-files

Note

If you want, you can split this step across more than one job.

6. Run tests with pre-commit

Define the name of your job:

tests_for_pre-commit:

Tell the CI which jobs this one depends on:

  needs: [install_poetry, install_dependencies]

Give the stage to which your job belongs:

  stage: pre-commit

If your program needs PostgreSQL and uses environment variables, you must give the CI all the environment variables required to connect to the database.

Set up those environment variables:

  variables:
    EXOIMPORT_CI: "true"
    EXOIMPORT_DBUSER: toto
    EXOIMPORT_DBPASSWORD: totopassword
    EXOIMPORT_DBHOST: localhost
    EXOIMPORT_DBPORT: 7000
    EXOIMPORT_DBNAME: exoimport

Now you can write your script:

  script:
    - poetry run pre-commit run doctest_md --all-files
    - poetry run pytest --junitxml=testsunit.xml

  1. Run the tests embedded in markdown files
  2. Run pytest, producing a JUnit file for the test report
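doctest_md is assumed here to be a project-local hook rather than a published one. As a sketch, such a hook might be declared in .pre-commit-config.yaml like this (the entry command is an assumption for illustration, not the project's actual implementation):

```yaml
# .pre-commit-config.yaml fragment (sketch): a hypothetical local hook
# that runs doctests embedded in markdown files.
repos:
  - repo: local
    hooks:
      - id: doctest_md
        name: Run doctests in markdown files
        entry: poetry run python -m doctest  # assumed command
        language: system
        files: \.md$
```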

To get your test report you need to generate an artifact:

  artifacts:
    expire_in: never
    when: always
    paths:
      - testsunit.xml
    reports:
      junit: testsunit.xml

  1. Declare the artifacts section of the job
  2. Tell when the artifact expires, here: never
  3. Tell when the artifact is generated, here: always
  4. Give the path of the artifact
  5. Give the method used to generate the report

The complete job looks like:

tests_for_pre-commit:
  needs: [install_poetry, install_dependencies]
  stage: pre-commit
  variables:
    EXOIMPORT_CI: "true"
    EXOIMPORT_DBUSER: toto
    EXOIMPORT_DBPASSWORD: totopassword
    EXOIMPORT_DBHOST: localhost
    EXOIMPORT_DBPORT: 7000
    EXOIMPORT_DBNAME: exoimport
  script:
    - poetry run pre-commit run doctest_md --all-files
    - poetry run pytest --junitxml=testsunit.xml
  artifacts:
    expire_in: never
    when: always
    paths:
      - testsunit.xml
    reports:
      junit: testsunit.xml

7. Run coverage

Define the name of your job:

tests_for_coverage:

Tell the CI which jobs this one depends on:

  needs: [install_poetry, install_dependencies]

Give the stage to which your job belongs:

  stage: cov_tests

If your program needs PostgreSQL and uses environment variables, you must give the CI all the environment variables required to connect to the database.

Set up those environment variables:

  variables:
    EXOIMPORT_CI: "true"
    EXOIMPORT_DBUSER: toto
    EXOIMPORT_DBPASSWORD: totopassword
    EXOIMPORT_DBHOST: localhost
    EXOIMPORT_DBPORT: 7000
    EXOIMPORT_DBNAME: exoimport

Now you can write your script:

  script:
    - poetry run coverage run -m pytest
    - poetry run coverage report
    - poetry run coverage xml

  1. Run the tests under coverage
  2. Print the coverage report
  3. Export the report to XML

Give the regular expression that extracts the coverage value:

  coverage: '/(?i)total.*? (100(?:\.0+)?\%|[1-9]?\d(?:\.\d+)?\%)$/'
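GitLab applies this regular expression to the job log and displays the first capture group as the job's coverage. As a sketch (assuming the usual final TOTAL line of `coverage report` output, with made-up numbers), the snippet below applies the same pattern, minus the surrounding / delimiters, in Python:

```python
import re

# The `coverage:` pattern from the job above, without the "/" delimiters.
PATTERN = r"(?i)total.*? (100(?:\.0+)?\%|[1-9]?\d(?:\.\d+)?\%)$"

# Hypothetical final line of `coverage report` output.
line = "TOTAL                            120     18    85%"

match = re.search(PATTERN, line)
print(match.group(1) if match else "no coverage found")  # → 85%
```

The alternation accepts either exactly 100% (with optional decimals) or any value from 0% to 99.9…%, so a stray number like 120% in the log cannot be picked up by mistake.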

Generate the artifact:

  artifacts:
    expire_in: never
    when: always
    reports:
      coverage_report:
        coverage_format: cobertura
        path: coverage.xml

  1. Declare the artifacts section of the job
  2. Tell when the artifact expires, here: never
  3. Tell when the artifact is generated, here: always
  4. Give the coverage format
  5. Give the path of the coverage file

The complete job looks like:

tests_for_coverage:
  needs: [install_poetry, install_dependencies]
  stage: cov_tests
  variables:
    EXOIMPORT_CI: "true"
    EXOIMPORT_DBUSER: toto
    EXOIMPORT_DBPASSWORD: totopassword
    EXOIMPORT_DBHOST: localhost
    EXOIMPORT_DBPORT: 7000
    EXOIMPORT_DBNAME: exoimport
  script:
    - poetry run coverage run -m pytest
    - poetry run coverage report
    - poetry run coverage xml
  coverage: '/(?i)total.*? (100(?:\.0+)?\%|[1-9]?\d(?:\.\d+)?\%)$/'
  artifacts:
    expire_in: never
    when: always
    reports:
      coverage_report:
        coverage_format: cobertura
        path: coverage.xml

Example of a CI file

In the end you have a .gitlab-ci.yml file that looks like this. In this file we:

  • install poetry
  • install dependencies for Python 3.10
  • launch our pre-commit hooks in several jobs for Python 3.10
  • launch our tests for Python 3.10
  • install dependencies for Python 3.11
  • run our pre-commit hooks in several jobs for Python 3.11
  • run our tests for Python 3.11
  • run the coverage

To download this file, click here

image: "gitlab.obspm.fr:4567/exoplanet/docker-exo:latest"

stages:
  - install_poetry
  - install_dependencies
  - pre-commit
  - cov_tests

install_poetry:
  stage: install_poetry
  services:
    - postgres
  variables:
    POSTGRES_DB: exoimport
    POSTGRES_USER: toto
    POSTGRES_HOST_AUTH_METHOD: trust
  script:
    - echo "Installing poetry"
    - which poetry | grep -o poetry > /dev/null &&  curl -sSL https://install.python-poetry.org | python3 - --uninstall || echo "poetry not installed"
    - curl -sSL https://install.python-poetry.org | python3 -
    - poetry --version
    - echo "Poetry is installed"

# ---------------
# | PYTHON 3.10 |
# ---------------

install_dependencies_py310:
  needs: [install_poetry]
  variables:
    PYTHON_KEYRING_BACKEND: keyring.backends.null.Keyring
  stage: install_dependencies
  script:
    - poetry env use python3.10
    - echo "Installing dependencies with poetry"
    - poetry install
    - echo "Installing pre-commit"
    - poetry run pre-commit clean
    - poetry run pre-commit install --hook-type pre-merge-commit
    - poetry run pre-commit install --hook-type pre-push
    - poetry run pre-commit install --hook-type post-rewrite
    - poetry run pre-commit install-hooks
    - poetry run pre-commit install
    - echo "End of installations"

py310_pre-commit_hooks:
  stage: pre-commit
  needs: [install_poetry, install_dependencies_py310]
  script:
    - echo "Running pre-commit 'pre-commit hooks'"
    - poetry run pre-commit run no-commit-to-branch --all-files
    - poetry run pre-commit run check-merge-conflict --all-files
    - poetry run pre-commit run debug-statements --all-files
    - poetry run pre-commit run trailing-whitespace --all-files
    - poetry run pre-commit run end-of-file-fixer --all-files
    - poetry run pre-commit run check-yaml --all-files
    - poetry run pre-commit run check-toml --all-files
    - poetry run pre-commit run check-added-large-files --all-files
    - poetry run pre-commit run check-shebang-scripts-are-executable --all-files
    - echo "'pre-commit hooks' run finished"
  dependencies:
    - install_dependencies_py310

py310_formater:
  stage: pre-commit
  needs: [install_poetry, install_dependencies_py310]
  script:
    - echo "Running formater"
    - poetry run pre-commit run isort --all-files
    - poetry run pre-commit run black --all-files
    - poetry run pre-commit run codespell --all-files
    - echo "Formater run finished"
  dependencies:
    - install_dependencies_py310

py310_linters:
  stage: pre-commit
  needs: [install_poetry, install_dependencies_py310]
  script:
    - echo "Running linters"
    - poetry run pre-commit run mypy_exoimport --all-files
    - poetry run pre-commit run mypy_cli_app --all-files
    - poetry run pre-commit run ruff --all-files
    - poetry run pre-commit run markdownlint-cli2 --all-files
    - echo "Linters run finished"
  dependencies:
    - install_dependencies_py310

py310_tests_for_pre-commit:
  stage: pre-commit
  needs: [install_poetry, install_dependencies_py310]
  variables:
    EXOIMPORT_CI: "true"
    EXOIMPORT_DBUSER: toto
    EXOIMPORT_DBPASSWORD: totopassword
    EXOIMPORT_DBHOST: localhost
    EXOIMPORT_DBPORT: 7000
    EXOIMPORT_DBNAME: exoimport
  script:
    - echo "Running tests"
    - poetry run pre-commit run doctest_md --all-files
    - poetry run pytest --junitxml=testsunit_py310.xml
    - echo "Tests run finished"
  dependencies:
    - install_dependencies_py310
  artifacts:
    expire_in: never
    when: always
    paths:
      - testsunit_py310.xml
    reports:
      junit: testsunit_py310.xml

# ---------------
# | PYTHON 3.11 |
# ---------------

install_dependencies_py311:
  needs: [install_poetry]
  variables:
    PYTHON_KEYRING_BACKEND: keyring.backends.null.Keyring
  stage: install_dependencies
  script:
    - poetry env use python3.11
    - echo "Installing dependencies with poetry"
    - poetry install
    - echo "Installing pre-commit"
    - poetry run pre-commit clean
    - poetry run pre-commit install --hook-type pre-merge-commit
    - poetry run pre-commit install --hook-type pre-push
    - poetry run pre-commit install --hook-type post-rewrite
    - poetry run pre-commit install-hooks
    - poetry run pre-commit install
    - echo "End of installations"

py311_pre-commit_hooks:
  stage: pre-commit
  needs: [install_poetry, install_dependencies_py311]
  script:
    - echo "Running pre-commit 'pre-commit hooks'"
    - poetry run pre-commit run no-commit-to-branch --all-files
    - poetry run pre-commit run check-merge-conflict --all-files
    - poetry run pre-commit run debug-statements --all-files
    - poetry run pre-commit run trailing-whitespace --all-files
    - poetry run pre-commit run end-of-file-fixer --all-files
    - poetry run pre-commit run check-yaml --all-files
    - poetry run pre-commit run check-toml --all-files
    - poetry run pre-commit run check-added-large-files --all-files
    - poetry run pre-commit run check-shebang-scripts-are-executable --all-files
    - echo "'pre-commit hooks' run finished"
  dependencies:
    - install_dependencies_py311

py311_formater:
  stage: pre-commit
  needs: [install_poetry, install_dependencies_py311]
  script:
    - echo "Running formater"
    - poetry run pre-commit run isort --all-files
    - poetry run pre-commit run black --all-files
    - poetry run pre-commit run codespell --all-files
    - echo "Formater run finished"
  dependencies:
    - install_dependencies_py311

py311_linters:
  stage: pre-commit
  needs: [install_poetry, install_dependencies_py311]
  script:
    - echo "Running linters"
    - poetry run pre-commit run mypy_exoimport --all-files
    - poetry run pre-commit run mypy_cli_app --all-files
    - poetry run pre-commit run ruff --all-files
    - poetry run pre-commit run markdownlint-cli2 --all-files
    - echo "Linters run finished"
  dependencies:
    - install_dependencies_py311

py311_tests_for_pre-commit:
  stage: pre-commit
  needs: [install_poetry, install_dependencies_py311]
  variables:
    EXOIMPORT_CI: "true"
    EXOIMPORT_DBUSER: toto
    EXOIMPORT_DBPASSWORD: totopassword
    EXOIMPORT_DBHOST: localhost
    EXOIMPORT_DBPORT: 7000
    EXOIMPORT_DBNAME: exoimport
  script:
    - echo "Running tests"
    - poetry run pre-commit run doctest_md --all-files
    - poetry run pytest --junitxml=testsunit_py311.xml
    - echo "Tests run finished"
  dependencies:
    - install_dependencies_py311
  artifacts:
    expire_in: never
    when: always
    paths:
      - testsunit_py311.xml
    reports:
      junit: testsunit_py311.xml

# ------------
# | COVERAGE |
# ------------

tests_for_coverage:
  stage: cov_tests
  needs: [install_poetry, install_dependencies_py311]
  variables:
    EXOIMPORT_CI: "true"
    EXOIMPORT_DBUSER: toto
    EXOIMPORT_DBPASSWORD: totopassword
    EXOIMPORT_DBHOST: localhost
    EXOIMPORT_DBPORT: 7000
    EXOIMPORT_DBNAME: exoimport
  before_script:
    - echo "Cleaning temporary files and folders"
    - find . \( -name __pycache__ -o -name "*.pyc" \) -delete
  script:
    - poetry run coverage run -m pytest
    - poetry run coverage report
    - poetry run coverage xml
  after_script:
    - echo "Cleaning temporary files and folders"
    - find . \( -name __pycache__ -o -name "*.pyc" \) -delete
  dependencies:
    - install_dependencies_py311
  coverage: '/(?i)total.*? (100(?:\.0+)?\%|[1-9]?\d(?:\.\d+)?\%)$/'
  artifacts:
    expire_in: never
    when: always
    reports:
      coverage_report:
        coverage_format: cobertura
        path: coverage.xml