Building a CI/CD Pipeline with GitHub Actions: A Real-World Guide

· 12 min read · DevOps & Cloud

Step-by-step guide to building a production CI/CD pipeline with GitHub Actions including testing, Docker builds, and automated deployment.

Your team merges to main, someone manually runs the tests, someone else builds the Docker image, and a third person deploys to the server. Each step takes 15 minutes of human attention, roughly 45 minutes per deploy. Multiply by 5 deploys per week and you lose nearly half a work day to deployment ceremony.

CI/CD eliminates this. Push to main, go get coffee, come back to a deployed application.

Pipeline Architecture

A production CI/CD pipeline has four stages:

Push → Lint/Type Check → Test → Build → Deploy

Each stage gates the next. If linting fails, tests never run. If tests fail, the build never starts. Only code that passes every check reaches production.
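
In GitHub Actions, this gating maps directly onto `needs:`. A stripped-down sketch of the dependency chain (the full jobs appear in the next section):

```yaml
jobs:
  lint-and-typecheck: {}         # no dependencies: runs first
  test:
    needs: lint-and-typecheck    # runs only if linting succeeded
  build-and-push:
    needs: test                  # runs only if tests passed
  deploy:
    needs: build-and-push        # runs only if the image was pushed
```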

The Workflow File

The example below is from a full-stack repository: the lint job checks a TypeScript codebase with npm and tsc, while the test job exercises a Python backend with pytest. Adapt the jobs to your own stack.

name: CI/CD Pipeline

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

env:
  REGISTRY: ghcr.io
  IMAGE_NAME: ${{ github.repository }}

jobs:
  lint-and-typecheck:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: 20
          cache: npm

      - name: Install dependencies
        run: npm ci

      - name: Lint
        run: npm run lint

      - name: Type check
        run: npx tsc --noEmit

  test:
    needs: lint-and-typecheck
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:16
        env:
          POSTGRES_USER: test
          POSTGRES_PASSWORD: test
          POSTGRES_DB: test_db
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    steps:
      - uses: actions/checkout@v4

      - name: Setup Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.12"
          cache: pip

      - name: Install dependencies
        run: pip install -r requirements.txt -r requirements-test.txt

      - name: Run tests
        env:
          DATABASE_URL: postgresql://test:test@localhost:5432/test_db
        run: pytest --cov=app --cov-report=xml -v

      - name: Upload coverage
        uses: codecov/codecov-action@v4
        with:
          token: ${{ secrets.CODECOV_TOKEN }}

  build-and-push:
    needs: test
    if: github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write

    steps:
      - uses: actions/checkout@v4

      - name: Login to Container Registry
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Build and push
        uses: docker/build-push-action@v5
        with:
          context: .
          push: true
          tags: |
            ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:latest
            ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ github.sha }}
          cache-from: type=gha
          cache-to: type=gha,mode=max

  deploy:
    needs: build-and-push
    if: github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    environment: production

    steps:
      - name: Deploy to production
        uses: appleboy/ssh-action@v1
        with:
          host: ${{ secrets.DEPLOY_HOST }}
          username: ${{ secrets.DEPLOY_USER }}
          key: ${{ secrets.DEPLOY_KEY }}
          script: |
            docker pull ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ github.sha }}
            # Assumes docker-compose.yml references the image via ${IMAGE_TAG},
            # so the freshly pulled sha-tagged image is the one started
            IMAGE_TAG=${{ github.sha }} docker compose up -d --no-deps app

Key Design Decisions

npm ci instead of npm install. npm ci removes any existing node_modules, installs the exact versions pinned in package-lock.json, and fails fast if the lockfile is out of sync with package.json. It is also faster in CI because it skips dependency resolution.

Service containers for tests. GitHub Actions can spin up PostgreSQL as a service. No mocking the database — tests run against a real database instance.
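
Inside the test suite, the service container is just a normal Postgres reachable through DATABASE_URL. A hypothetical helper for turning that URL into connection parameters (the helper name and default are illustrative, not part of the workflow):

```python
import os
from urllib.parse import urlparse


def db_params(url=None):
    """Parse DATABASE_URL (as set in the workflow) into connection kwargs."""
    url = url or os.environ.get(
        "DATABASE_URL", "postgresql://test:test@localhost:5432/test_db"
    )
    p = urlparse(url)
    return {
        "user": p.username,
        "password": p.password,
        "host": p.hostname,
        "port": p.port,
        "dbname": p.path.lstrip("/"),
    }
```

Because the CI job and local development both go through the same environment variable, tests need no CI-specific branches.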

github.sha tags. Every image is tagged with the commit hash. If a deploy goes wrong, you can roll back to the exact previous version.
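
A rollback then amounts to reconstructing the image reference for an earlier commit. A sketch with illustrative registry, repository, and sha values (the Docker commands are commented out since they only make sense on the server):

```shell
REGISTRY=ghcr.io
IMAGE_NAME=acme/app     # hypothetical repository
SHA=3f9d2ab             # the known-good commit to roll back to
IMAGE_REF="$REGISTRY/$IMAGE_NAME:$SHA"
echo "$IMAGE_REF"
# On the server:
# docker pull "$IMAGE_REF"
# IMAGE_TAG="$SHA" docker compose up -d --no-deps app
```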

GHA cache for Docker. cache-from: type=gha reuses Docker layer cache across workflow runs. A build that takes 5 minutes on first run takes 45 seconds on subsequent runs if only application code changed.
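
The cache only pays off if the Dockerfile copies dependency manifests before application code, so the expensive install layer survives code-only changes. A sketch, assuming the Python backend above:

```dockerfile
FROM python:3.12-slim
WORKDIR /app
# Dependencies change rarely: this layer is reused from cache on most builds
COPY requirements.txt .
RUN pip install -r requirements.txt
# Application code changes often: only the layers below rebuild
COPY . .
CMD ["python", "-m", "app"]
```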

Branch Protection

Enforce the pipeline with branch protection rules:

  1. Require status checks to pass before merging
  2. Require pull request reviews
  3. Require linear history (no merge commits)
  4. Disable force pushes to main

This ensures every change goes through the full pipeline.
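
The same rules can be applied programmatically via GitHub's branch-protection REST API (`PUT /repos/{owner}/{repo}/branches/{branch}/protection`). A sketch of the request payload, with status-check names matching the workflow's jobs:

```json
{
  "required_status_checks": {
    "strict": true,
    "contexts": ["lint-and-typecheck", "test"]
  },
  "enforce_admins": true,
  "required_pull_request_reviews": {
    "required_approving_review_count": 1
  },
  "restrictions": null,
  "required_linear_history": true,
  "allow_force_pushes": false
}
```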

Secrets Management

Never hardcode secrets in workflow files:

# BAD
env:
  DATABASE_URL: postgresql://admin:password@prod-server/db

# GOOD
env:
  DATABASE_URL: ${{ secrets.DATABASE_URL }}

Use GitHub Environments for production secrets. Environments support required reviewers — the deploy job pauses until a team member approves.
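
In the workflow itself, an environment can also carry a display URL. A sketch of the deploy job's header (the URL is illustrative):

```yaml
deploy:
  environment:
    name: production
    url: https://app.example.com  # shown on the deployment record in GitHub
```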

Monitoring Deploys

Add a notification step after deployment:

- name: Notify deployment
  if: success()
  run: |
    curl -X POST "${{ secrets.SLACK_WEBHOOK }}" \
      -H "Content-Type: application/json" \
      -d '{"text": "Deployed `${{ github.sha }}` to production by ${{ github.actor }}"}'
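
The success message has a natural counterpart. A sketch of a failure alert using the same webhook:

```yaml
- name: Notify failure
  if: failure()
  run: |
    curl -X POST "${{ secrets.SLACK_WEBHOOK }}" \
      -H "Content-Type: application/json" \
      -d '{"text": ":x: Deploy of `${{ github.sha }}` failed; check the workflow logs"}'
```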

Takeaways

A CI/CD pipeline is an investment that pays for itself within a week. The initial setup takes 2–4 hours. After that, every deploy is automatic, consistent, and auditable. The key is to start simple — lint, test, build, deploy — and add complexity only when needed.

See also: Docker Multi-Stage Builds for Python for optimizing the build step.