# GitHub Actions Upload

This GitHub Actions workflow is an example of how to automatically build your dbt project and upload the artifacts to Euno after a successful run.
## Prerequisites

- A GitHub repository containing your dbt project
- A Euno integration key stored as a GitHub secret
## Setup

### 1. Add GitHub Secrets

In your GitHub repository, add the following secrets (Settings → Secrets and variables → Actions):

- `EUNO_INTEGRATION_KEY`: Your Euno integration key
- `EUNO_ENDPOINT_URL`: Your Euno endpoint URL (e.g., `https://api.app.euno.ai/accounts/YOUR_ACCOUNT_ID/integrations/YOUR_INTEGRATION_ID/run`)
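To catch a mistyped endpoint before a workflow run fails, you can sanity-check the URL locally. This is a minimal sketch, not part of the workflow; the pattern is inferred from the example URL above, so adjust it if your Euno account uses a different path layout.

```python
import re

# Pattern inferred from the example endpoint URL above (an assumption,
# not an official Euno URL specification).
ENDPOINT_PATTERN = re.compile(
    r"^https://api\.app\.euno\.ai/accounts/[^/]+/integrations/[^/]+/run$"
)

def looks_like_euno_endpoint(url: str) -> bool:
    """Return True if the URL matches the documented endpoint shape."""
    return bool(ENDPOINT_PATTERN.match(url))

print(looks_like_euno_endpoint(
    "https://api.app.euno.ai/accounts/123/integrations/456/run"
))  # True
print(looks_like_euno_endpoint("https://api.app.euno.ai/run"))  # False
```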
### 2. Create Workflow File

Create `.github/workflows/dbt-euno-upload.yml` in your repository:
```yaml
name: dbt Build and Upload to Euno

on:
  push:
    branches: [ main, master ]
  pull_request:
    branches: [ main, master ]
  workflow_dispatch: # Allow manual trigger

env:
  DBT_PROFILES_DIR: /tmp/profiles
  DBT_PROJECT_DIR: .

jobs:
  dbt-build-and-upload:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.9'

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install dbt-core dbt-snowflake # Replace dbt-snowflake with your adapter

      - name: Set up dbt profile
        run: |
          mkdir -p $DBT_PROFILES_DIR
          cat << EOF > $DBT_PROFILES_DIR/profiles.yml
          # Your dbt profiles configuration here
          # This is a template - replace with your actual configuration
          your_project_name:
            target: prod
            outputs:
              prod:
                type: snowflake # or your warehouse type
                account: ${{ secrets.DBT_SNOWFLAKE_ACCOUNT }}
                user: ${{ secrets.DBT_SNOWFLAKE_USER }}
                password: ${{ secrets.DBT_SNOWFLAKE_PASSWORD }}
                role: ${{ secrets.DBT_SNOWFLAKE_ROLE }}
                database: ${{ secrets.DBT_SNOWFLAKE_DATABASE }}
                warehouse: ${{ secrets.DBT_SNOWFLAKE_WAREHOUSE }}
                schema: ${{ secrets.DBT_SNOWFLAKE_SCHEMA }}
                threads: 4
                keepalives_idle: 240
          EOF

      - name: Run dbt build
        run: |
          dbt deps
          dbt build
          dbt docs generate

      - name: Create and upload artifacts zip
        run: |
          cd target

          # Check if required files exist
          for file in catalog.json manifest.json run_results.json; do
            if [ ! -f "$file" ]; then
              echo "Error: Required file $file not found"
              exit 1
            fi
          done

          # Create zip file with artifacts
          echo "Creating zip file with dbt artifacts..."
          zip -r dbt_artifacts.zip catalog.json manifest.json run_results.json

          # Add semantic_manifest.json if it exists
          if [ -f "semantic_manifest.json" ]; then
            zip dbt_artifacts.zip semantic_manifest.json
            echo "Added semantic_manifest.json to zip"
          fi

          # Upload to Euno
          echo "Uploading artifacts to Euno..."
          response=$(curl -s -w "\n%{http_code}" \
            -X POST \
            -H "Authorization: Bearer ${{ secrets.EUNO_INTEGRATION_KEY }}" \
            -F "files=@dbt_artifacts.zip;type=application/zip" \
            "${{ secrets.EUNO_ENDPOINT_URL }}")

          # Extract status code and response body
          status_code=$(echo "$response" | tail -n1)
          response_body=$(echo "$response" | head -n -1)

          echo "Response: $response_body"
          echo "Status Code: $status_code"

          # Check if upload was successful
          if [ "$status_code" -eq 200 ]; then
            echo "✅ Artifacts uploaded successfully!"
          else
            echo "❌ Upload failed with status code: $status_code"
            exit 1
          fi

      - name: Upload artifacts on failure
        if: failure()
        uses: actions/upload-artifact@v4
        with:
          name: dbt-artifacts
          path: target/
          retention-days: 3
```
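To debug the packaging step locally before wiring it into CI, the zip logic from the "Create and upload artifacts zip" step can be sketched in Python. This is a convenience sketch, not part of the workflow; it assumes you run it from your dbt project root after `dbt docs generate`, and the `bundle_artifacts` helper name is ours.

```python
import os
import zipfile

# Same file lists the shell step checks for.
REQUIRED = ["catalog.json", "manifest.json", "run_results.json"]
OPTIONAL = ["semantic_manifest.json"]

def bundle_artifacts(target_dir: str, out_path: str) -> list:
    """Zip the dbt artifacts Euno expects; fail fast if a required file is missing."""
    missing = [f for f in REQUIRED
               if not os.path.isfile(os.path.join(target_dir, f))]
    if missing:
        raise FileNotFoundError(f"Required artifacts not found: {missing}")
    included = REQUIRED + [f for f in OPTIONAL
                           if os.path.isfile(os.path.join(target_dir, f))]
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for name in included:
            zf.write(os.path.join(target_dir, name), arcname=name)
    return included
```

`bundle_artifacts("target", "dbt_artifacts.zip")` mirrors the shell step; the resulting zip can then be POSTed with the same `curl` command the workflow uses.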
## Configuration Notes

### Database Adapter

Update the workflow to install the correct dbt adapter for your data warehouse:

- Snowflake: `pip install dbt-snowflake`
- BigQuery: `pip install dbt-bigquery`
- Redshift: `pip install dbt-redshift`
- Databricks: `pip install dbt-databricks`
### Profiles Configuration

Replace the `profiles.yml` template with your actual dbt configuration. You'll also need to add the corresponding database-credential secrets to your GitHub repository.
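Before committing, you can sanity-check the generated `profiles.yml` structure locally. A minimal sketch, assuming PyYAML is installed; the required keys here are the Snowflake ones from the template above (other adapters differ), and `check_profile` is a hypothetical helper, not a dbt API.

```python
import yaml  # PyYAML: pip install pyyaml

def check_profile(profiles_text: str, profile_name: str, target: str = "prod") -> list:
    """Return the keys missing from the given target's output block (sorted)."""
    # Snowflake keys from the template above; adjust for your adapter.
    required = {"type", "account", "user", "password",
                "database", "warehouse", "schema"}
    profiles = yaml.safe_load(profiles_text)
    output = profiles[profile_name]["outputs"][target]
    return sorted(required - output.keys())
```

An empty return value means every required key is present; the profile name passed in must match the `profile:` entry in your `dbt_project.yml`.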
### Required GitHub Secrets

For all warehouses:

- `EUNO_INTEGRATION_KEY`
- `EUNO_ENDPOINT_URL`
## Workflow Triggers

The workflow runs on:

- **Push to main/master branch**: Automatically uploads production artifacts
- **Pull requests**: Validates the dbt build; can be configured to upload test artifacts
- **Manual trigger**: Use the "Actions" tab to run the workflow manually
## Customization

- **Conditional upload**: Add conditions to upload only on specific branches
- **Slack/Teams notifications**: Add notification steps for success/failure
- **Artifact retention**: Adjust the retention period for failed-build artifacts
- **Parallel jobs**: Split into separate build and upload jobs for complex workflows
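The conditional-upload customization, for example, can be done with an `if:` condition on the upload step, so pull requests still build and validate but only pushes to the default branches publish. A sketch, reusing the step name from the workflow above:

```yaml
      - name: Create and upload artifacts zip
        if: github.ref == 'refs/heads/main' || github.ref == 'refs/heads/master'
        run: |
          # ... same zip-and-curl commands as in the workflow above ...
```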