Developer Guidelines

Git repo structure

.
├── .github
│   └── workflows
│       ├── artifacts.yml
│       ├── build.yml
│       └── test.yml
├── .gitignore
├── .pre-commit-config.yaml
├── Makefile
├── README.md
├── application
│   ├── resources
│   └── Dockerfile
├── documentation
│   ├── diagrams
│   ├── images
│   └── index.md
├── dist
├── htmlcov
├── logs
├── tests
│   └── resources
└── tox.ini

Makefile

A Makefile is a simple and widely used build automation tool in the software development world. It is primarily used to manage the build process of a project, including compiling and linking source code files. Makefiles are commonly associated with C and C++ projects, but they can be used for any project that requires automated building and dependency management.

The common actions that should be in every project are:

  • help (show menu options)
  • clean (remove all environment and build files)
  • setup-env (create environment for running)
  • tests (run tests)
  • open-tests-results (open tests results in browser)
  • validate (run pre-commit validations)
  • build (build project)

An example Makefile is shown below:

SHELL=/bin/bash
export APP_NAME=raas_utilities
export BUILD_DIR=$(shell pwd)
export VIRTUAL_ENV=${BUILD_DIR}/.venv
export ARTIFACTS=${BUILD_DIR}/.platform
export POETRY_DIR=${BUILD_DIR}/${APP_NAME}


# Colors for echos 
ccend = $(shell tput sgr0)
ccbold = $(shell tput bold)
ccgreen = $(shell tput setaf 2)
ccso = $(shell tput smso)

.DEFAULT_GOAL := help

.PHONY: help
help:	
	@awk 'BEGIN {FS = ":.*##"; printf "\nUsage:\n  make \033[36m<target>\033[0m\n"} /^[a-zA-Z_0-9-]+:.*?##/ { printf "  \033[36m%-15s\033[0m %s\n", $$1, $$2 } /^##@/ { printf "\n\033[1m%s\033[0m\n", substr($$0, 5) } ' $(MAKEFILE_LIST)
	@echo ""
	@echo "${ccbold}Note: to activate the environment in your local shell type:"
	@echo "   $$ source $(VIRTUAL_ENV)/bin/activate"


##@ Development

clean: ## >> remove all environment and build files
	@echo ""
	@echo "$(ccso)--> Removing virtual environment $(ccso)"
	find ${BUILD_DIR} | grep -E "(/__pycache__$|\.pyc$|\.pyo$|\.pytest_cache$)" | xargs rm -rf
	rm -rf ${BUILD_DIR}/${VIRTUAL_ENV} || true	
	rm -rf ${BUILD_DIR}/target  || true	
	rm -rf ${BUILD_DIR}/.mypy_cache  || true	
	rm -rf ${BUILD_DIR}/.pytest_cache  || true	
	rm -rf ${BUILD_DIR}/.ruff_cache  || true		
	rm -rf ${BUILD_DIR}/htmlcov || true

setup-env: ## >> setup environment for development
	@echo "$(ccgreen)--> Environment setup $(ccgreen)"
	rm -rf $(VIRTUAL_ENV)  || true &&\
	pyenv local 3.11.3 &&\
	python3 -m venv $(VIRTUAL_ENV) &&\
	source $(VIRTUAL_ENV)/bin/activate &&\
	pip install --upgrade pip &&\
	pip install poetry &&\
	pip install tox &&\
	pip install pre-commit &&\
	pre-commit	

.PHONY: tests
tests: ## run tests
	source $(VIRTUAL_ENV)/bin/activate &&\
	pushd ${POETRY_DIR} &&\
	tox &&\
	popd

open-tests-results: ## open tests results in browser
	open ${POETRY_DIR}/htmlcov/index.html

validate: ## validate project files
	pre-commit run --all-files 
		
build: ## build project
	poetry build
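
With this Makefile in place, a typical first run on a fresh clone looks like this (target names as defined above):

make setup-env   # create the virtual environment and install tooling
make tests       # run the test suite via tox
make validate    # run the same pre-commit checks CI will run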

gitignore

A .gitignore file specifies intentionally untracked files that Git should ignore. Files already tracked by Git are not affected. Each line in a .gitignore file specifies a pattern. When deciding whether to ignore a path, Git normally checks gitignore patterns from multiple sources, with the following order of precedence, from highest to lowest (within one level of precedence, the last matching pattern decides the outcome):

  • Patterns read from the command line, for those commands that support them.
  • Patterns read from a .gitignore file in the same directory as the path, or in any parent directory up to the top level of the working tree.
  • Patterns read from $GIT_DIR/info/exclude.
  • Patterns read from the file specified by the configuration variable core.excludesFile.

gitignore.io is a web service that generates .gitignore files for you. You can select from a wide variety of operating systems, IDEs, and programming languages. It is a great way to get started with a new project.
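
As an illustration, a minimal .gitignore for a Python project laid out like the repository tree above might contain (entries chosen to cover the generated folders: the virtual environment, caches, and build output):

.venv/
__pycache__/
*.py[cod]
.pytest_cache/
.mypy_cache/
.ruff_cache/
.tox/
.coverage
dist/
htmlcov/
logs/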

pre-commit-config

The .pre-commit-config.yaml file is where you configure pre-commit hooks. It is a YAML file that lives in the root of your repository. It specifies a list of hook repositories, each pinned to a rev, and for each repository the hooks to run against your files; each hook is configured with an id and an optional list of additional arguments. The hooks run in the order they are listed in the file, and if any hook returns a non-zero exit code the commit is aborted (by default the remaining hooks still run and report their results). By default pre-commit only checks the files staged for commit; to check every file in the repository, add the --all-files flag to the pre-commit run command.

An example pre-commit-config.yaml is shown below:

repos:
-   repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0
    hooks:
      - id: end-of-file-fixer # Ensures that a file is either empty, or ends with one newline.
      - id: check-ast # Simply check whether the files parse as valid python
      - id: check-toml # This hook checks toml files for parseable syntax.
      - id: check-yaml # This hook checks yaml files for parseable syntax.
      - id: check-added-large-files # Prevent giant files from being committed (500kB)
      - id: check-merge-conflict # Check for files that contain merge conflict strings.
      - id: detect-private-key # Detects the presence of private keys
      - id: debug-statements # Check for debugger imports and py37+ `breakpoint()` calls in python source.

-   repo: https://github.com/codespell-project/codespell
    rev: 'v2.2.5'
    hooks:
      - id: codespell
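
To activate and exercise the hooks locally, the standard pre-commit commands are:

pip install pre-commit
pre-commit install           # install the git hook into .git/hooks
pre-commit run --all-files   # run every hook against the whole repository
pre-commit autoupdate        # bump each hook's rev to its latest tag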

Documentation

Documenting a developer application is crucial for its usability, maintainability, and for enabling other developers to understand and work with the application. Here are some key types of documentation you should consider for a developer application:

  1. Getting Started Guide:

    • Introduction: Provide an overview of what the application does.
    • Prerequisites: List the system requirements, dependencies, and any prerequisites.
    • Installation: Explain how to install and set up the application.
    • Quick Start: Walk developers through a simple "Hello World" or basic usage example.
  2. API Documentation (if your application has an API):

    • Endpoint descriptions: Document each API endpoint, including its URL, HTTP method, and purpose.
    • Request parameters: List the parameters that can be sent with each request, their data types, and any validation rules.
    • Response format: Describe the structure of API responses, including status codes and data formats (e.g., JSON).
    • Authentication: Explain how to authenticate with the API, including API keys, tokens, or OAuth flows.
    • Rate limits: If applicable, specify rate limits and any other usage restrictions.
  3. Code Examples:

    • Provide code examples in multiple programming languages to demonstrate how to use the application or API.
    • Examples should cover common use cases and scenarios.
  4. Tutorials and Guides:

    • Create tutorials and guides for common tasks or workflows using your application.
    • Address more complex usage scenarios step by step.
  5. Configuration:

    • Document configuration options and how to modify them, such as configuration files, environment variables, or command-line arguments.
  6. Testing and Debugging:

    • Provide information on how to test and debug the application, including logging and debugging tools.

The documentation should be stored in the documentation folder in the root of the repository. The documentation folder should contain the following subfolders:

├── documentation
│   ├── diagrams
│   ├── images
│   └── index.md

You should consider using a documentation generator such as MkDocs or Sphinx to generate your documentation.
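
As a sketch, a minimal MkDocs configuration for the layout above could be the following mkdocs.yml at the repository root (site_name is a placeholder; docs_dir redirects MkDocs from its default docs folder to documentation):

site_name: My Application     # placeholder, use your project name
docs_dir: documentation       # MkDocs defaults to "docs"
nav:
  - Home: index.md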

CI Build

Continuous Integration (CI) is a software development practice that involves automatically building, testing, and integrating code changes into a shared repository multiple times a day. The CI process typically includes a "CI build," which is a key part of this practice. Here's what a CI build involves:

  1. Automated Build: When a developer makes changes to the code and pushes those changes to the repository, an automated build process is triggered. This process involves compiling, packaging, or building the application from the source code. The exact steps in the build process depend on the project and its technology stack.

  2. Testing: After the build is completed, automated tests are executed. These tests can include unit tests, integration tests, and other types of testing depending on the project. The purpose is to verify that the code changes haven't introduced any new defects or regressions.

  3. Artifact Creation: If the build and tests are successful, the CI system may create artifacts like deployable packages or binaries. These artifacts are ready for deployment to staging or production environments.

GitHub Actions

GitHub Actions is a CI/CD platform built into GitHub. It allows you to automate your software development workflows. You can create custom workflows to build, test, package, release, or deploy any code project on GitHub. GitHub Actions is free for public repositories and offers a generous free tier for private repositories.

You should have the following GitHub Actions workflows in your repository:

├── .github
│   └── workflows
│       ├── artifacts.yml
│       ├── build.yml
│       └── test.yml

build.yml

The CI steps should mirror the local developer experience as closely as possible, so that developers can find issues as fast as possible. For example, the build.yml file should have the following steps:

name: build

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v3
    - name: Set up Python 3.11
      uses: actions/setup-python@v4
      with:
        python-version: 3.11

    - name: Validate project
      run: |
        pip install pre-commit
        pre-commit run --all-files

test.yml

name: test

on: [push]

env:
  AWS_REGION: 'eu-central-1'

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - uses: actions/setup-python@v4
        with:
          python-version: 3.11

      - name: cache poetry install
        uses: actions/cache@v3
        with:
          path: ~/.local
          key: poetry-1.6.1

      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-session-token: ${{ secrets.AWS_SESSION_TOKEN }}
          aws-region: ${{ env.AWS_REGION }}
  

      - name: Install tox & poetry
        working-directory: raas_utilities
        run: |
          pip install --upgrade pip
          pip install poetry==1.6.1
          pip install tox

      - name: run tests
        working-directory: raas_utilities
        run: tox

artifacts.yml

Once the code has passed build and tests and has been merged to the main branch, the artifacts.yml workflow is triggered. This workflow builds the Docker image and the Helm chart and pushes them to the ECR repository.

name: release

on:
  push:
    branches:
      - 'main'

env:
  AWS_REGION: 'eu-central-1'
  GITHUB_TOKEN: ${{secrets.GITHUB_TOKEN}}

jobs:
  tag:
    runs-on: ubuntu-latest
    outputs:
      tag_release: ${{ steps.tag_release.outputs.tag }}
    steps:
      - name: Check out code
        uses: actions/checkout@v3
        with:
          fetch-depth: 0
      - name: tag the git repository
        id: tag_release
        env:
          TAG_COMMENT: ${{ github.sha }}
        run: |  
          git config --global user.email "github@qti.qualcomm.com"
          git config --global user.name "Github Actions"

          LATEST_VERSION=`git tag --sort=taggerdate | grep -E '[0-9]' | tail -1`
          PREFIX_VERSION=`date +"%Y.%m"`
          
          if [[ "$LATEST_VERSION" =~ ^"$PREFIX_VERSION." ]]; then
            TAG_VERSION=$(echo ${LATEST_VERSION} | awk -F. -v OFS=. '{$NF += 1 ; print}')
          else
            TAG_VERSION="${PREFIX_VERSION}.1"
          fi
          echo "tag=$TAG_VERSION" >> $GITHUB_OUTPUT
          git tag -a $TAG_VERSION -m "$TAG_COMMENT"
          git push origin $TAG_VERSION
  docker:
    needs: [tag]
    runs-on: ubuntu-latest
    steps:
      - name: Check out code
        uses: actions/checkout@v3
        with:
          fetch-depth: 0

      - uses: actions/setup-python@v4
        with:
          python-version: 3.11

      - name: Install poetry
        working-directory: raas_utilities
        run: |
          pip install --upgrade pip
          pip install poetry==1.6.1  

      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ env.AWS_REGION }}

      - name: Login to Amazon ECR
        id: login-ecr
        uses: aws-actions/amazon-ecr-login@v1

      - name: Build, tag, and push docker image to Amazon ECR
        working-directory: raas_utilities
        env:
          REGISTRY: ${{ steps.login-ecr.outputs.registry }}
          REPOSITORY: app-dev-eu-central-1-raas-utilities
          TAG_VERSION: ${{ needs.tag.outputs.tag_release }}
        run: |
          export VERSION_PREFIX=$(date +"%Y.%m")
          poetry version ${VERSION_PREFIX}
          poetry build          
          docker build -t $REGISTRY/$REPOSITORY:$TAG_VERSION -f Dockerfile . --build-arg PROJECT_NAME=raas_utilities --build-arg PROJECT_VERSION=${VERSION_PREFIX}
          docker push $REGISTRY/$REPOSITORY:$TAG_VERSION

      - id: workflow-helm
        name: package and push workflow helm chart to Amazon ECR
        working-directory: artifacts/helm/argo_workflow
        env:
          REGISTRY: ${{ steps.login-ecr.outputs.registry }}
          CHART: app-dev-eu-central-1-raas-pipeline-workflow
          IMAGE_TAG: ${{needs.tag.outputs.tag_release}}
        run: |
          mv raas_pipeline_workflow/bk_25 app-dev-eu-central-1-raas-pipeline-workflow
          helm package $CHART --version $IMAGE_TAG 
          aws ecr get-login-password | helm registry login --username AWS --password-stdin $REGISTRY
          helm push $CHART-$IMAGE_TAG.tgz oci://$REGISTRY

Development Strategy

Developing features using Pull Requests (PRs) is a common and effective approach for collaborative software development. Here are some guidelines on how to develop features using PRs:

  1. Start with a Clear Goal:
    • Before you begin working on a feature, make sure you have a clear understanding of what the feature is supposed to accomplish. This may involve reviewing user stories, requirements, or project documentation.
  2. Create a Feature Branch:
    • Create a new branch in your version control system (e.g., Git) specifically for the feature you're developing. The branch should have a descriptive name related to the feature.
  3. Use Descriptive Commit Messages:
    • Write meaningful commit messages that describe the changes you made. This helps others understand your changes easily.
  4. Frequent Updates:
    • Regularly update your feature branch with the latest changes from the main development branch (usually master or main). This helps avoid merge conflicts and keeps your feature branch up to date.
  5. Write Tests:
    • Whenever possible, write automated tests for your code changes. This ensures that your code works as expected and helps prevent regressions.
  6. Code Reviews:
    • When you're ready to share your work, create a Pull Request (PR); the PR is where your code will be reviewed.
    • Assign reviewers, including at least one team member, to review your code. Reviewers can provide feedback, suggest improvements, and ensure the code aligns with project standards.
    • Be open to feedback and constructive criticism. The goal of the review is to improve the code and the overall quality of the project.
  7. Approval and Merge:
    • Once the PR is approved and passes all tests, you can merge it into the main development branch. Be sure to follow your team's specific merge process.
  8. Documentation:
    • If your feature introduces changes that require updates to documentation, make those updates in a separate PR or commit.
  9. Release Notes:
    • If the feature is significant or has a user-facing impact, consider including information in release notes or documentation for users and stakeholders.
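
The branch-and-PR flow above can be summarized in commands (the branch name is illustrative, and gh is GitHub's official CLI):

git checkout -b feature/my-feature          # 2. create a descriptive feature branch
git commit -m "Add X so that Y"             # 3. meaningful commit messages
git fetch origin && git rebase origin/main  # 4. keep the branch up to date with main
git push -u origin feature/my-feature
gh pr create --fill                         # 6. open the PR for review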

Database Migrations

Database migrations are a technique used in software development to manage changes to a database's schema (structure) and data over time. Migrations allow developers to evolve the database schema in a controlled and repeatable manner, making it easier to keep databases up-to-date with changes in the application. Here are some best practices for database management and migration:

  1. Automated Migration Tools:
    • Use automated database migration tools like Flyway, Liquibase, or your database management system's built-in tools to manage and apply migrations. These tools help ensure consistent, versioned, and repeatable migrations.
  2. Keep Migrations Atomic:
    • Each migration should represent a single, logical change to the database schema. Avoid creating overly complex migrations that combine multiple changes.
  3. Consistent Naming Conventions:
    • Establish consistent naming conventions for database objects (e.g., tables, columns, indexes) to make the schema more readable and maintainable.
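
For example, with Flyway each migration is a single versioned SQL file whose name encodes its version and purpose (PostgreSQL syntax; the file name and table are illustrative):

-- V2023.10.1__create_users_table.sql
-- One atomic, logical change: create the users table and its index.
CREATE TABLE users (
    id         BIGSERIAL PRIMARY KEY,
    email      VARCHAR(255) NOT NULL UNIQUE,
    created_at TIMESTAMP NOT NULL DEFAULT now()
);
CREATE INDEX idx_users_created_at ON users (created_at);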

Rest API

If your application includes a REST API, you should include Swagger in your project. Swagger is an open-source framework, now part of the OpenAPI ecosystem, that simplifies the design, building, and documentation of RESTful web APIs. It provides tools for developers, product owners, and testers to collaborate on API development, streamlining the development process, enhancing usability, and improving communication between the teams working on an API project.
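
As a sketch for a Python service, FastAPI (one assumption among several suitable frameworks) generates an OpenAPI schema automatically and serves Swagger UI at /docs; the endpoint below is illustrative:

from fastapi import FastAPI

app = FastAPI(title="raas_utilities", version="2023.10.1")  # illustrative metadata

@app.get("/health", summary="Liveness probe")
def health() -> dict:
    """Return a trivial payload so monitoring can verify the service is up."""
    return {"status": "ok"}

# Run with: uvicorn main:app --reload
# Swagger UI is then served at http://localhost:8000/docs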

Deployment Strategy

Deployment is the process of making software available for use. It involves building, testing, and releasing software changes to production environments.

Artifacts

An artifact is a deployable component of your application. It can be a binary, package, or other type of file. Artifacts are created during the build process and are ready for deployment to production environments.

Each project should supply the following artifacts:

  • Docker image
  • Helm chart

Versioning

A versioning strategy, often used in software development, helps manage and communicate changes to your software over time. It provides a structured way to indicate the evolution of your software and ensures that stakeholders can understand which version they are using.

Our strategy (Calendar Versioning): in CalVer, version numbers are based on dates, making it easy to determine when a release occurred. The format is [YYYY].[MM].[INC], where INC is an increment within the month: for example, 2023.10.3 is the third release of October 2023, and the first release in November would be 2023.11.1.

An example of a versioning script is shown below (TAG_COMMENT and GITHUB_OUTPUT are provided by the GitHub Actions environment, as in the tag job above):

git config --global user.email "github@qti.qualcomm.com"
git config --global user.name "Github Actions"

LATEST_VERSION=`git tag --sort=taggerdate | grep -E '[0-9]' | tail -1`
PREFIX_VERSION=`date +"%Y.%m"`

if [[ "$LATEST_VERSION" =~ ^"$PREFIX_VERSION." ]]; then
    TAG_VERSION=$(echo ${LATEST_VERSION} | awk -F. -v OFS=. '{$NF += 1 ; print}')
else
    TAG_VERSION="${PREFIX_VERSION}.1"
fi
echo "tag=$TAG_VERSION" >> $GITHUB_OUTPUT
git tag -a $TAG_VERSION -m "$TAG_COMMENT"
git push origin $TAG_VERSION

Dev Isolation Deployment

When we talk about dev isolation, we are referring to the ability to deploy a feature branch to a separate environment (e.g., dev) without affecting other environments (e.g., staging, production). This allows developers to test their changes in a production-like environment without impacting other users.

The guidelines for dev isolation deployment are:

  • Ability to install in private namespace
  • Ability to install in private database schema
  • Ability to install in private storage

For this to work, you need to have all the above as part of your infrastructure. For example, if you are using helm, you should have the following:

  • Helm chart that can be installed in a private namespace
  • Database schema name should be parameterized according to the namespace
  • Storage name should be parameterized according to the namespace
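
A sketch of such an isolated install (the chart path, release name, and value keys are illustrative and depend on how your chart parameterizes schema and storage):

helm install my-feature ./charts/my-app \
  --namespace feature-my-feature --create-namespace \
  --set database.schema=feature_my_feature \
  --set storage.bucketPrefix=feature-my-feature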