Deployment Pipelines for Craft CMS: From Git Push to Production
If your deployment process involves SSH-ing into a server and running git pull, we need to talk. It works, sure. But it's error-prone, not repeatable, and it's going to bite you eventually.
I've set up deployment pipelines for dozens of Craft sites, and the pattern I've settled on is straightforward, reliable, and adaptable to whatever hosting you're using. This post covers the full setup: what goes in your pipeline, how Project Config fits in, and the specific scripts I use.
What a Good Deploy Looks Like
A good deployment pipeline does these things in order, every time, without you touching the server:
- You push code to your main branch (or merge a PR)
- A CI service pulls the code, installs dependencies, and builds front-end assets
- The built artifacts get deployed to the server
- Post-deploy scripts run migrations, apply Project Config, and clear caches
- The site is live with the new changes
Let's build this step by step.
Understanding Project Config
Before we get into the pipeline, you need to understand Project Config because it's central to how Craft handles deployments.
Project Config is a set of YAML files in config/project/ that represent your site's structural settings: sections, fields, entry types, volumes, plugin settings, and more. When you make changes in the control panel on your local environment, these YAML files update. You commit them to Git, and on the server, you run php craft project-config/apply to sync those changes to the database.
The key rule: always make structural changes on your local environment, never on production. If you add a field on production, the YAML files on the server will change, and the next deploy will overwrite those changes with whatever's in Git. This is the most common deployment mistake I see.
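One way to catch that mistake before it bites is to check for drift as part of the deploy. Here's a minimal sketch; the helper name `config_drift` is my own, and it assumes the project root on the server is a Git checkout. If it prints anything, someone changed structure on production and those YAML changes would be clobbered by the next pull.

```shell
# Sketch: detect Project Config drift on the server before deploying.
# Non-empty output means the YAML in config/project/ no longer matches
# what's committed to Git.
config_drift() {
  # List uncommitted changes, limited to the Project Config directory
  git -C "$1" status --porcelain -- config/project
}

# Example: config_drift /home/forge/your-site.com
```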
Option 1: Laravel Forge Deploy Script
If you're using Laravel Forge (which is what I use for most Craft sites), the deploy script is the simplest approach. Forge pulls from Git and runs your script on every push.
```bash
cd /home/forge/your-site.com

# Pull latest code
git pull origin main

# Install PHP dependencies (no dev packages in production)
composer install --no-dev --no-interaction --prefer-dist --optimize-autoloader

# Install Node dependencies and build front-end assets
npm ci
npm run build

# Run Craft migrations and apply Project Config
php craft up

# Clear all caches
php craft clear-caches/all

# Restart PHP-FPM for opcache
sudo -S service php8.2-fpm reload
```
That's the whole thing. Forge triggers this script whenever you push to the configured branch. A few notes:
- `composer install --no-dev` skips development dependencies. You don't need PHPUnit or debug tools on production.
- `npm ci` is faster than `npm install` for CI environments because it does a clean install from the lockfile.
- `php craft up` is a shortcut that runs pending migrations and applies Project Config changes in one command.
- Reloading PHP-FPM clears the opcache so PHP picks up the new files immediately.
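One hardening tweak worth adding to the top of the script, shown here as a sketch: by default a shell deploy script keeps going after a failed command, so a broken `composer install` can still end in a "successful" deploy with stale code behind it. Bash strict mode makes any failure abort the run instead.

```shell
#!/usr/bin/env bash
# Abort the deploy on the first failed command (-e), on use of an
# unset variable (-u), and on a failure anywhere in a pipeline.
set -euo pipefail
```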
Option 2: GitHub Actions + Forge
For a more robust pipeline, run the build step in GitHub Actions and then trigger a Forge deployment. This way the server never needs Node.js installed, and the build doesn't slow down your deploy.
```yaml
name: Deploy to Production

on:
  push:
    branches: [main]

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: 'npm'

      - name: Install and build front-end
        run: |
          npm ci
          npm run build

      - name: Deploy to server
        uses: appleboy/ssh-action@v1
        with:
          host: ${{ secrets.SERVER_HOST }}
          username: ${{ secrets.SERVER_USER }}
          key: ${{ secrets.SSH_PRIVATE_KEY }}
          script: |
            cd /home/forge/your-site.com
            git pull origin main
            composer install --no-dev --no-interaction --prefer-dist --optimize-autoloader
            php craft up
            php craft clear-caches/all
            sudo -S service php8.2-fpm reload
```
One thing to watch: the workflow builds the front-end in CI, but the SSH script only pulls code from Git, so the built assets never reach the server on their own. Use rsync or scp after the build step to sync the dist/ folder to the server. Add your build output to .gitignore and let CI be the only thing that generates it.
Option 3: GitHub Actions with rsync
This is the approach I use on projects where I want full control over what gets deployed. CI builds everything, then rsync copies the exact files to the server.
```yaml
name: Build and Deploy

on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Set up PHP
        uses: shivammathur/setup-php@v2
        with:
          php-version: '8.2'
          tools: composer:v2

      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: 'npm'

      - name: Install PHP dependencies
        run: composer install --no-dev --no-interaction --prefer-dist --optimize-autoloader

      - name: Build front-end
        run: |
          npm ci
          npm run build

      - name: Deploy via rsync
        uses: burnett01/rsync-deployments@6.0.0
        with:
          switches: -avzr --delete --exclude='.env' --exclude='storage/' --exclude='web/cpresources/'
          path: ./
          remote_path: /home/forge/your-site.com/
          remote_host: ${{ secrets.SERVER_HOST }}
          remote_user: ${{ secrets.SERVER_USER }}
          remote_key: ${{ secrets.SSH_PRIVATE_KEY }}

      - name: Run post-deploy commands
        uses: appleboy/ssh-action@v1
        with:
          host: ${{ secrets.SERVER_HOST }}
          username: ${{ secrets.SERVER_USER }}
          key: ${{ secrets.SSH_PRIVATE_KEY }}
          script: |
            cd /home/forge/your-site.com
            php craft up
            php craft clear-caches/all
            sudo -S service php8.2-fpm reload
```
The key part is the rsync `--exclude` flags. You never want to overwrite:

- `.env` because it contains production secrets
- `storage/` because it contains runtime data, logs, and compiled templates
- `web/cpresources/` because Craft regenerates these as needed
The Deploy Script Breakdown
Regardless of which approach you use, the post-deploy commands are the same. Let me explain each one:
```bash
# Run all pending migrations (Craft core + plugins)
# AND apply any Project Config changes from YAML files
php craft up

# Clear all caches (template, data, asset transforms)
php craft clear-caches/all

# Optional: clear only the {% cache %} tag output instead of everything
# php craft clear-caches/template-caches

# Reload PHP-FPM to clear opcache
sudo -S service php8.2-fpm reload
```
The php craft up command is doing the heavy lifting. It checks for pending database migrations (from Craft updates or plugin updates), runs them, and then applies any Project Config changes. If you added a new field locally and committed the YAML, this is where it gets created in the production database.
Database Backups Before Deploy
I always add a database backup step before running migrations. If something goes wrong, you can roll back quickly.
```bash
cd /home/forge/your-site.com

# Backup database before making changes
php craft db/backup

# Pull and install
git pull origin main
composer install --no-dev --no-interaction --prefer-dist --optimize-autoloader
npm ci
npm run build

# Apply changes
php craft up
php craft clear-caches/all
sudo -S service php8.2-fpm reload
```
Craft's db/backup command creates a SQL dump in the storage/backups/ directory. If a migration fails, you can restore from this backup. I also set up automated daily backups separately, but having one right before deploy gives you a clean restore point.
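A backup on every deploy means storage/backups/ grows without bound. Here's a small pruning helper I'd pair with it; this is my own sketch (the function name and the keep-count are assumptions, and it assumes backup filenames contain no spaces, which is true of Craft's generated names).

```shell
# Sketch: keep only the newest N .sql dumps in a backup directory
# and delete the rest.
prune_backups() {
  local dir="$1" keep="$2"
  # List newest-first, skip the first $keep entries, delete the rest
  ls -1t "$dir"/*.sql 2>/dev/null | tail -n +"$((keep + 1))" | while read -r f; do
    rm -- "$f"
  done
}

# Example: prune_backups /home/forge/your-site.com/storage/backups 5
```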
Environment Variables
Your .env file should never be in Git. Each environment (local, staging, production) has its own .env with different values. Here's what a production .env typically looks like for Craft:
```bash
# Craft
CRAFT_APP_ID=your-unique-app-id
CRAFT_ENVIRONMENT=production
CRAFT_SECURITY_KEY=your-security-key
CRAFT_DEV_MODE=false
CRAFT_ALLOW_ADMIN_CHANGES=false
CRAFT_DISALLOW_ROBOTS=false

# Database
CRAFT_DB_DRIVER=mysql
CRAFT_DB_SERVER=127.0.0.1
CRAFT_DB_PORT=3306
CRAFT_DB_DATABASE=your_database
CRAFT_DB_USER=your_user
CRAFT_DB_PASSWORD=your_password

# URLs
PRIMARY_SITE_URL=https://your-site.com

# Assets
CRAFT_ASSETS_URL=https://your-site.com/uploads

# Build
VITE_DEV_SERVER=false
```
The important one is CRAFT_ALLOW_ADMIN_CHANGES=false. This prevents anyone from making structural changes (adding fields, sections, etc.) on production. Changes can only come through Project Config via deployment. This is what keeps your environments in sync.
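Because that variable matters so much, I like to verify it as part of the deploy. A sketch (the helper name is mine, and it assumes the plain `KEY=value` .env format shown above):

```shell
# Sketch: abort the deploy if the production .env allows admin
# changes, which would let production structure drift away from Git.
check_admin_changes() {
  if grep -q '^CRAFT_ALLOW_ADMIN_CHANGES=false' "$1"; then
    echo "ok"
  else
    echo "CRAFT_ALLOW_ADMIN_CHANGES is not false in $1" >&2
    return 1
  fi
}

# Example: check_admin_changes /home/forge/your-site.com/.env || exit 1
```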
Staging Environment
For most projects I run three environments: local, staging, and production. Staging is where I test deployments before they hit production. The staging deploy is identical to production except it runs on a separate server with a separate database.
```yaml
# In GitHub Actions, deploy to different servers based on branch
on:
  push:
    branches:
      - main     # deploys to production
      - staging  # deploys to staging

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # ... build steps ...

      - name: Set deploy target
        run: |
          if [ "${{ github.ref }}" = "refs/heads/main" ]; then
            echo "DEPLOY_HOST=${{ secrets.PROD_HOST }}" >> $GITHUB_ENV
            echo "DEPLOY_PATH=/home/forge/your-site.com" >> $GITHUB_ENV
          else
            echo "DEPLOY_HOST=${{ secrets.STAGING_HOST }}" >> $GITHUB_ENV
            echo "DEPLOY_PATH=/home/forge/staging.your-site.com" >> $GITHUB_ENV
          fi
```
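The branch-to-target mapping is easy to sanity-check locally if you pull it out as a plain function. A sketch using the same placeholder paths; anything that isn't main is treated as staging:

```shell
# Sketch: map a Git ref to its deploy path, mirroring the workflow's
# if/else above.
deploy_target() {
  local ref="$1"
  if [ "$ref" = "refs/heads/main" ]; then
    echo "/home/forge/your-site.com"
  else
    echo "/home/forge/staging.your-site.com"
  fi
}
```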
Common Deployment Gotchas
Project Config Conflicts
If two developers add different fields at the same time, the Project Config YAML files will conflict on merge, just like any other code conflict. Resolve them the same way you'd resolve any Git conflict: pick the right version, commit, and deploy.
To minimize this, I try to coordinate structural changes. If I'm adding a bunch of fields, I let the team know so nobody else is touching the content model at the same time.
Content Created on Production
Content (entries, assets) lives only in the database and doesn't go through Project Config. If someone creates a blog post on production while you're deploying, that's fine. The deploy only touches code and structure, not content.
But if someone adds a new field on production (and CRAFT_ALLOW_ADMIN_CHANGES isn't set to false), that field will get wiped on the next deploy because it's not in the Git-tracked YAML files. This is why that environment variable matters.
Large Composer Installs
Running composer install on a server can be slow and memory-hungry. If you're hitting memory limits, add this to your deploy script:
```bash
COMPOSER_MEMORY_LIMIT=-1 composer install --no-dev --no-interaction --prefer-dist --optimize-autoloader
```
Or better yet, run Composer in CI and rsync the vendor/ directory to the server.
Failed Migrations
If php craft up fails during deployment, your site might be in a partially migrated state. This is why the database backup step matters. If a migration fails:
- Read the error message carefully
- If it's a Project Config conflict, resolve it in your local environment and redeploy
- If it's a real migration error, restore the database backup and investigate
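Restoring from the pre-deploy backup can be scripted too. A sketch, assuming Craft's default storage/backups/ location; the helper name `latest_backup` is mine:

```shell
# Sketch: find the most recent dump so it can be fed to
# `php craft db/restore` after a failed migration.
latest_backup() {
  # Newest .sql file in the given directory (empty output if none)
  ls -1t "$1"/*.sql 2>/dev/null | head -n 1
}

# Usage on the server (not run here):
# php craft db/restore "$(latest_backup storage/backups)"
```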
My Recommended Setup
For most Craft projects, here's what I recommend:
- Hosting: Laravel Forge + DigitalOcean (or any cloud provider Forge supports)
- CI: GitHub Actions for front-end builds
- Deploy trigger: Push to main branch
- Deploy method: Forge deploy script with `git pull` + post-deploy commands
- Staging: Separate Forge server, deployed from a staging branch
- Backups: Automated daily backups + pre-deploy backup
- Monitoring: Oh Dear or UptimeRobot for uptime, Craft's built-in system report for health checks
This setup handles sites getting anywhere from a few hundred to tens of thousands of page views per day without breaking a sweat. It's simple enough that any developer can understand and maintain it.
A good deployment pipeline takes about an hour to set up and saves you countless hours of manual, error-prone deployments. Once it's running, deploying a change to production is literally just merging a PR. If you're still deploying by hand, this is one of the highest-leverage improvements you can make to your workflow.
Need help setting up a deployment pipeline for your Craft site? I'd be happy to help.