Python and CI/CD adventure

Overview

Talking points

  • Traditionally a Java-based shop
  • The goal is to create a CI/CD pipeline that supports Python, Docker images, and the OpenShift target platform.
  • Sub-goal: adhere to enterprise security requirements for container and dependency scanning.
  • Consistency with existing pipelines is desirable.
  • Greater security comes at the cost of developer speed and the “Continuous” part of Continuous Delivery.

Infrastructure

 

Category             Tool       Purpose
-------------------  ---------  -----------------------------------------------
Build                Jenkins    Build and deploy scripts for applications
Security             VeraCode   Security scanning
Repository           Nexus      Artifact repository (wheels and Docker images)
Security             Nexus IQ   Security scanning of dependencies
Runtime Environment  OpenShift  Container orchestration and runtime environment

Python

Pytest

PIPs

  • pytest
  • pytest-mock
  • pytest-pythonpath
  • pytest-env

PIPs go in the requirements.txt file
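For reference, a minimal requirements.txt covering the packages above might look like the following (versions left unpinned here; the team may prefer pinning):

```
pytest
pytest-mock
pytest-pythonpath
pytest-env
```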

pytest.ini

Explicit markers were defined to allow selective running of different types of tests.

  • unit
  • integration

A subset of tests can then be selected with the -m flag, e.g. python -m pytest -m unit.
# content of pytest.ini
[pytest]
#...
markers = 
  unit: unit tests - explicit rather than implicitly defined
  integration: integration tests reach out to external dependencies
python_paths=src
env=
  LOGGING_LEVEL=DEBUG
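As a sketch of how the env= section above is consumed: pytest-env exports those variables before the tests run, so ordinary configuration code can pick them up from os.environ. The configure_logging helper below is hypothetical, not part of the original pipeline:

```python
import logging
import os


def configure_logging() -> int:
    """Resolve the logging level from the environment.

    During test runs, pytest-env injects LOGGING_LEVEL=DEBUG from
    pytest.ini; outside of tests we fall back to INFO.
    """
    level_name = os.environ.get("LOGGING_LEVEL", "INFO")
    level = logging.getLevelName(level_name)  # "DEBUG" -> 10, "INFO" -> 20
    logging.basicConfig(level=level)
    return level
```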

Example unit test

import logging

import pytest

# issue_type_to_table_name_dynamic is imported from the service under test

@pytest.mark.unit
def test_dynamic_table_name(caplog):
  caplog.set_level(logging.DEBUG)
  logger = logging.getLogger()
  issue_type = 'Story'
  table_name = issue_type_to_table_name_dynamic(issue_type)
  logger.debug(f'{issue_type=}, {table_name=}')
  assert table_name == 'stories'
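The function under test is not shown in this post; a plausible implementation of issue_type_to_table_name_dynamic, assuming a simple lowercase-and-pluralize convention, could look like this:

```python
def issue_type_to_table_name_dynamic(issue_type: str) -> str:
    """Derive a table name from an issue type.

    Hypothetical sketch: lowercase the type and apply a basic English
    pluralization rule ('Story' -> 'stories', 'Bug' -> 'bugs').
    """
    name = issue_type.lower()
    if name.endswith('y'):
        return name[:-1] + 'ies'
    return name + 's'
```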

The CI/CD Pipeline will execute Pytest unit tests with the following command

python -m pytest

Python packaging

Files

  • setup.cfg
  • setup.py

setup.py

setup.py is required for two reasons. First, it triggers setuptools to use the setup.cfg configuration. Second, setup.py allows the requirements.txt contents to be used by the packager.

If the team prefers, they still have the option of:

  1. Explicitly maintaining packages in both requirements.txt and setup.cfg (install_requires). This violates DRY.
  2. Specifying packages explicitly in setup.cfg and setting requirements.txt to “-e .”. That would rule out using requirements.txt directly in their own Docker builds for local runs. This is my preference, even though it is a little wonky.
  3. Specifying packages explicitly in setup.cfg and not defining requirements.txt at all. That’s fine, but it assumes we always package the local service before running it, which makes it harder to just build / test / run the service locally without Docker.
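As a sketch, option 2 reduces requirements.txt to a single editable-install line, so that pip install -r requirements.txt installs the local package and resolves its dependencies from setup.cfg’s install_requires:

```
-e .
```

With this layout, a local pip install -r requirements.txt and the packaged wheel both draw from the same install_requires list.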
from setuptools import setup

with open('requirements.txt') as f:
  required = f.read().splitlines()

setup(
  install_requires=required
)
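The splitlines() approach above also feeds blank lines and comments into install_requires; a slightly more defensive sketch (same idea, with filtering; read_requirements is an illustrative name) would be:

```python
from typing import List  # typing.List keeps this compatible with Python 3.8


def read_requirements(path: str = 'requirements.txt') -> List[str]:
    """Read requirement specifiers from a requirements file, skipping
    blank lines, comments, and pip options such as '-e .' so they never
    reach install_requires."""
    with open(path) as f:
        lines = (line.strip() for line in f)
        return [
            line for line in lines
            if line and not line.startswith(('#', '-'))
        ]
```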

setup.cfg

[metadata]
name = servicename
version = 1.0.0
description = Service Description
# ... more attributes

[options]
package_dir = 
  = src
packages = find:
python_requires = >= 3.8
include_package_data = True

scripts = 
  app.py
# app.py is the standardized Python entry point for packages/services built in this pipeline.

[options.packages.find]
where = src

app.py

import bogusco.servicename

if __name__ == '__main__':
  bogusco.servicename.main('/etc/servicename/secrets')

Artifacts

  • Python
    • tar gz
    • wheel
  • Docker
    • Image

Shared Dockerfile

FROM rhel8/python-39
ENV PIP_CONFIG_FILE=/etc/pip.conf
# COPY certificates here
COPY example.pem /etc/pki/ca-trust/source/anchors/example.crt
# ...
USER root
RUN update-ca-trust extract
ENV REQUESTS_CA_BUNDLE=...

# COPY secrets into image
# COPY pip.conf into image

# Download PIP for our package/service from nexus
# PIP install it

CMD ["python", "/opt/app-root/bin/app.py"]
