forked from apache/atlas
MLH-1378 Atlas Helm Package + Version + Publish - Master #5570
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Merged

Changes from 3 commits

Commits (9 in total, all by krishnanunni-atlan):
b5a18a5 initial commit
c0ece65 sync with atlan preprod #1
6d5fd3d sync with atlan preprod
118c701 Merge remote-tracking branch 'origin/master' into MLH-1378-master
7d41100 make atlas dependencies oci
ee8aebc optimize maven wf
e7337e6 optimize maven build
50e6749 Merge remote-tracking branch 'origin/master' into MLH-1378-master
643c4c1 sync with atlan
@@ -34,400 +34,576 @@

````yaml
      - mlh-1240-improve-cm-refresh-master

jobs:
  helm-lint:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        chart: ['atlas', 'atlas-read']

    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Install Helm
        uses: azure/setup-helm@v3
        with:
          version: '3.12.0'

      - name: Update helm dependencies
        run: |
          cd helm/${{ matrix.chart }}
          helm dependency update

          echo "Chart dependencies:"
          ls -la charts/

      - name: Lint helm chart
        run: |
          helm lint helm/${{ matrix.chart }}/
          echo "✅ ${{ matrix.chart }} chart lint passed!"

      - name: Validate Chart.yaml
        run: |
          # Check for required fields
          if ! grep -q "^version:" helm/${{ matrix.chart }}/Chart.yaml; then
            echo "❌ Error: version field missing in Chart.yaml"
            exit 1
          fi
          if ! grep -q "^appVersion:" helm/${{ matrix.chart }}/Chart.yaml; then
            echo "❌ Error: appVersion field missing in Chart.yaml"
            exit 1
          fi
          echo "✅ Chart.yaml validation passed!"

  build:
    needs: helm-lint
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v3

      # Set up Docker
      - name: Set up Docker
        uses: docker/setup-buildx-action@v2
        with:
          driver-opts: image=moby/buildkit:master
          install: true

      - name: Set up JDK 17
        uses: actions/setup-java@v1
        with:
          java-version: 17

      - name: Print JDK version
        run: java -version

      # Verify Docker is available
      - name: Verify Docker
        run: |
          docker --version
          docker info

      - name: Cache Maven packages
        uses: actions/cache@v3
        with:
          path: ~/.m2
          key: ${{ runner.os }}-m2-${{ hashFiles('**/build.sh') }}
          restore-keys: ${{ runner.os }}-m2

      - name: Get branch name
        run: |
          echo "BRANCH_NAME=${GITHUB_REF#refs/heads/}" >> $GITHUB_ENV
          echo BRANCH_NAME=${GITHUB_REF#refs/heads/}

      - name: Create Maven Settings
        uses: s4u/[email protected]
        with:
          servers: |
            [{
              "id": "github",
              "username": "atlan-ci",
              "password": "${{ secrets.ORG_PAT_GITHUB }}"
            }]

      - name: Build with Maven
        run: |
          echo "build without dashboard"
          chmod +x ./build.sh && ./build.sh

      - name: Check disk space before tests
        id: check_disk
        run: |
          echo "=========================================="
          echo "DISK SPACE CHECK"
          echo "=========================================="
          df -h / | grep -E '^/dev/' || df -h / | tail -1
          echo ""

          # Get disk usage percentage (remove % sign)
          DISK_USAGE=$(df / | tail -1 | awk '{print $5}' | sed 's/%//')
          echo "Current disk usage: ${DISK_USAGE}%"
          echo "disk_usage=$DISK_USAGE" >> $GITHUB_OUTPUT

          if [ "$DISK_USAGE" -gt 70 ]; then
            echo "⚠️ Disk usage is high (${DISK_USAGE}%), cleanup will run"
          else
            echo "✅ Disk space is adequate (${DISK_USAGE}%), skipping cleanup"
          fi

      - name: Free up disk space for tests
        if: steps.check_disk.outputs.disk_usage > 70
        run: |
          echo "=========================================="
          echo "CLEANING UP DISK SPACE"
          echo "=========================================="

          # Clean Docker system
          echo "Cleaning Docker system..."
          docker system prune -af --volumes || true

          # Clean apt cache
          echo "Cleaning apt cache..."
          sudo apt-get clean || true
          sudo rm -rf /var/cache/apt/archives/* || true

          # Clean temp files
          echo "Cleaning temp files..."
          sudo rm -rf /tmp/* || true

          # Clean old GitHub Actions logs
          echo "Cleaning GitHub Actions logs..."
          sudo rm -rf /home/runner/work/_temp/_runner_file_commands/* || true

          # Clean hostedtoolcache if needed (keep essentials)
          echo "Cleaning hostedtoolcache (non-essential tools)..."
          sudo rm -rf /opt/hostedtoolcache/CodeQL || true
          sudo rm -rf /opt/hostedtoolcache/go || true
          sudo rm -rf /opt/hostedtoolcache/PyPy || true
          sudo rm -rf /opt/hostedtoolcache/node || true
          sudo rm -rf /opt/hostedtoolcache/Ruby || true

          echo ""
          echo "Disk space after cleanup:"
          df -h / | grep -E '^/dev/' || df -h / | tail -1

      - name: Verify sufficient disk space
        run: |
          echo "=========================================="
          echo "VERIFYING DISK SPACE"
          echo "=========================================="

          DISK_USAGE=$(df / | tail -1 | awk '{print $5}' | sed 's/%//')
          AVAILABLE_GB=$(df -h / | tail -1 | awk '{print $4}')

          echo "Current disk usage: ${DISK_USAGE}%"
          echo "Available space: ${AVAILABLE_GB}"

          # Fail if disk usage is still above 85%
          if [ "$DISK_USAGE" -gt 85 ]; then
            echo "ERROR: Insufficient disk space (${DISK_USAGE}% used)"
            echo "Tests require at least 15% free space to run reliably"
            echo "Elasticsearch will fail with high disk watermark errors at 90%+"
            exit 1
          else
            echo "Sufficient disk space available (${DISK_USAGE}% used)"
          fi

      - name: Run Integration Tests
        id: integration_tests
        continue-on-error: true
        env:
          # Configure Testcontainers for GitHub Actions
          TESTCONTAINERS_RYUK_DISABLED: true
          TESTCONTAINERS_CHECKS_DISABLE: true
          DOCKER_HOST: unix:///var/run/docker.sock
        run: |
          echo "Running integration tests..."
          chmod +x ./run-integration-tests.sh && ./run-integration-tests.sh

      - name: Upload container logs as artifact
        if: always()  # Upload logs even if tests pass (for debugging)
        uses: actions/upload-artifact@v4
        with:
          name: container-logs-${{ github.run_id }}
          path: target/test-logs/
          retention-days: 5

      - name: Setup tmate session on test failure
        if: steps.integration_tests.outcome == 'failure'
        uses: mxschmitt/action-tmate@v3
        timeout-minutes: 30
        with:
          detached: true
          limit-access-to-actor: false

      - name: Fail the workflow if tests failed
        if: steps.integration_tests.outcome == 'failure'
        run: exit 1

      - name: Clean up after integration tests
        run: |
          echo "=========================================="
          echo "CLEANING UP AFTER INTEGRATION TESTS"
          echo "=========================================="

          # Remove test containers and images
          echo "Removing test containers and images..."
          docker system prune -af --volumes || true

          # Clean Maven artifacts to free up space
          echo "Cleaning Maven artifacts..."
          rm -rf ~/.m2/repository/org/apache/atlas/ || true

          # Clean test artifacts
          echo "Cleaning test artifacts..."
          rm -rf webapp/target/surefire-reports/ || true
          rm -rf test-debug-logs/ || true

          # Clean temp files
          echo "Cleaning temp files..."
          sudo rm -rf /tmp/* || true

          echo ""
          echo "Disk space after cleanup:"
          df -h / | tail -1

      - name: Get Repository Name
        run: echo "REPOSITORY_NAME=`echo "$GITHUB_REPOSITORY" | awk -F / '{print $2}' | sed -e "s/:refs//"`" >> $GITHUB_ENV
        shell: bash

      - name: Get version tag
        # run: echo "##[set-output name=version;]$(echo `git ls-remote https://${{ secrets.ORG_PAT_GITHUB }}@github.com/atlanhq/${REPOSITORY_NAME}.git ${{ env.BRANCH_NAME }} | awk '{ print $1}' | cut -c1-7`)abcd"
        run: |
          echo "VERSION=$(git ls-remote https://${{ secrets.ORG_PAT_GITHUB }}@github.com/atlanhq/${REPOSITORY_NAME}.git ${{ env.BRANCH_NAME }} | awk '{ print $1}' | cut -c1-7 | head -n 1)abcd"
          echo "VERSION=$(git ls-remote https://${{ secrets.ORG_PAT_GITHUB }}@github.com/atlanhq/${REPOSITORY_NAME}.git ${{ env.BRANCH_NAME }} | awk '{ print $1}' | cut -c1-7 | tr -d '[:space:]')abcd"
          echo "VERSION=$(git ls-remote https://${{ secrets.ORG_PAT_GITHUB }}@github.com/atlanhq/${REPOSITORY_NAME}.git ${{ env.BRANCH_NAME }} | awk '{ print $1}' | cut -c1-7 | tr -d '[:space:]')abcd" >> $GITHUB_ENV

      - name: Get commit ID
        run: echo "COMMIT_ID=$(echo ${GITHUB_SHA} | cut -c1-7)abcd" >> $GITHUB_ENV

      # QEMU is required to build arm from a non-arm build machine
      - name: Set up QEMU
        id: qemu
        uses: docker/setup-qemu-action@v3
        with:
          image: tonistiigi/binfmt:qemu-v7.0.0-28
          platforms: arm64

      - name: Set up Buildx
        id: buildx
        uses: docker/setup-buildx-action@v1

      - name: Login to GitHub Registry
        uses: docker/login-action@v1
        with:
          registry: ghcr.io
          username: $GITHUB_ACTOR
          password: ${{ secrets.ORG_PAT_GITHUB }}

      - name: Build and push
        id: docker_build
        uses: docker/build-push-action@v3
        with:
          platforms: linux/amd64,linux/arm64
          context: .
          file: ./Dockerfile
          no-cache: true
          sbom: true
          provenance: true
          push: true
          tags: |
            ghcr.io/atlanhq/${{ github.event.repository.name }}-${{ env.BRANCH_NAME }}:latest
            ghcr.io/atlanhq/${{ github.event.repository.name }}-${{ env.BRANCH_NAME }}:${{ env.COMMIT_ID }}

      - name: Check Image Manifest
        run: docker buildx imagetools inspect --raw ghcr.io/atlanhq/${{ github.event.repository.name }}-${{ env.BRANCH_NAME }}:${{ env.COMMIT_ID }}

      - name: Scan Image
        uses: aquasecurity/trivy-action@master
        with:
          image-ref: 'ghcr.io/atlanhq/${{ github.event.repository.name }}-${{ env.BRANCH_NAME }}:${{ env.COMMIT_ID }}'
          vuln-type: 'os,library'
          format: 'sarif'
          output: 'trivy-image-results.sarif'

      - name: Upload Trivy scan results to GitHub Security tab
        uses: github/codeql-action/[email protected]
        with:
          sarif_file: 'trivy-image-results.sarif'

  # Smoke test on vclusters (parallel with single VPN)
  smoke-test:
    name: Multi-Cloud Smoke Test
    needs: build
    runs-on: ubuntu-latest

    env:
      VCLUSTER_AWS_NAME: ${{ vars.VCLUSTER_AWS_NAME }}
      VCLUSTER_AZURE_NAME: ${{ vars.VCLUSTER_AZURE_NAME }}
      VCLUSTER_GCP_NAME: ${{ vars.VCLUSTER_GCP_NAME }}
      VCLUSTER_PROJECT: ${{ vars.VCLUSTER_PROJECT }}

    steps:
      - name: Checkout
        uses: actions/checkout@v3

      - name: Get branch name
        run: echo "BRANCH_NAME=${GITHUB_REF#refs/heads/}" >> $GITHUB_ENV

      - name: Get commit ID
        run: echo "COMMIT_ID=$(echo ${GITHUB_SHA} | cut -c1-7)abcd" >> $GITHUB_ENV

      - name: Set test image
        run: echo "TEST_IMAGE=ghcr.io/atlanhq/${{ github.event.repository.name }}-${{ env.BRANCH_NAME }}:${{ env.COMMIT_ID }}" >> $GITHUB_ENV

      - name: Install kubectl
        uses: azure/setup-kubectl@v3

      - name: Install vCluster CLI
        uses: loft-sh/setup-vcluster@main

      - name: Install jq
        run: sudo apt-get install -y jq

      - name: Connect to GlobalProtect VPN
        env:
          VCLUSTER_PLATFORM_URL: ${{ secrets.VCLUSTER_PLATFORM_URL }}
        run: |
          echo "=================================================="
          echo "CONNECTING TO VPN (Shared for all clouds)"
          echo "=================================================="

          # Install OpenConnect
          sudo apt-get update -qq
          sudo apt-get install -y openconnect

          # Connect to VPN (using default DTLS/ESP for AWS compatibility)
          echo "${{ secrets.GLOBALPROTECT_PASSWORD }}" | sudo openconnect \
            --protocol=gp \
            --user="${{ secrets.GLOBALPROTECT_USERNAME }}" \
            --passwd-on-stdin \
            --background \
            "${{ vars.GLOBALPROTECT_PORTAL_URL }}"

          # Wait for connection to establish
          echo "Waiting for VPN connection to stabilize..."
          sleep 20

          # Check if VPN is running
          if ! pgrep -x openconnect > /dev/null; then
            echo "ERROR: OpenConnect exited unexpectedly"
            exit 1
          fi
          echo "VPN process is running (PID: $(pgrep -x openconnect))"

          # Configure routing for vCluster Platform (172.17.0.0/16)
          VPN_INTERFACE=$(ip addr show | grep -E '^[0-9]+: tun' | head -1 | cut -d: -f2 | tr -d ' ' || echo "tun0")
          echo "Using VPN interface: $VPN_INTERFACE"

          sudo ip route del 172.17.0.0/16 dev docker0 2>/dev/null || true
          sudo ip route add 172.17.0.0/16 dev $VPN_INTERFACE

          # Verify connectivity
          if curl -k -sS $VCLUSTER_PLATFORM_URL -o /dev/null --max-time 30; then
            echo "✓ VPN connected successfully"
          else
            echo "ERROR: VPN connectivity test failed"
            exit 1
          fi

      - name: Login to vCluster Platform
        env:
          VCLUSTER_PLATFORM_URL: ${{ secrets.VCLUSTER_PLATFORM_URL }}
          VCLUSTER_ACCESS_KEY: ${{ secrets.VCLUSTER_ACCESS_KEY }}
        run: |
          echo "=================================================="
          echo "LOGGING IN TO VCLUSTER PLATFORM (Shared)"
          echo "=================================================="
          vcluster platform login $VCLUSTER_PLATFORM_URL --access-key $VCLUSTER_ACCESS_KEY
          echo "✓ Login successful"

      - name: Connect to all vClusters
        run: |
          echo "=================================================="
          echo "CONNECTING TO ALL VCLUSTERS"
          echo "=================================================="

          # Connect to AWS vCluster
          echo "Connecting to AWS vCluster ($VCLUSTER_AWS_NAME)..."
          KUBECONFIG=kubeconfig-aws.yaml vcluster platform connect vcluster $VCLUSTER_AWS_NAME --project $VCLUSTER_PROJECT
          echo "✓ AWS kubeconfig saved to kubeconfig-aws.yaml"

          # Connect to Azure vCluster
          echo "Connecting to Azure vCluster ($VCLUSTER_AZURE_NAME)..."
          KUBECONFIG=kubeconfig-azure.yaml vcluster platform connect vcluster $VCLUSTER_AZURE_NAME --project $VCLUSTER_PROJECT
          echo "✓ Azure kubeconfig saved to kubeconfig-azure.yaml"

          # Connect to GCP vCluster
          echo "Connecting to GCP vCluster ($VCLUSTER_GCP_NAME)..."
          KUBECONFIG=kubeconfig-gcp.yaml vcluster platform connect vcluster $VCLUSTER_GCP_NAME --project $VCLUSTER_PROJECT
          echo "✓ GCP kubeconfig saved to kubeconfig-gcp.yaml"

          echo ""
          echo "Verifying kubeconfigs..."
          ls -lh kubeconfig-*.yaml

          echo ""
          echo "Testing AWS connection..."
          KUBECONFIG=kubeconfig-aws.yaml kubectl cluster-info | head -1

          echo ""
          echo "Testing Azure connection..."
          KUBECONFIG=kubeconfig-azure.yaml kubectl cluster-info | head -1

          echo ""
          echo "Testing GCP connection..."
          KUBECONFIG=kubeconfig-gcp.yaml kubectl cluster-info | head -1

          echo ""
          echo "✓ All vCluster connections established"

      - name: Run parallel smoke tests
        run: ./scripts/multi-cloud-smoke-test.sh ${{ env.TEST_IMAGE }}

      - name: Upload smoke test logs
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: smoke-test-logs-${{ github.run_id }}
          path: smoke-test-logs/

  helm-publish:
    needs: smoke-test  # Only publish if smoke tests pass in all clouds
    runs-on: ubuntu-latest
    strategy:
      matrix:
        chart: ['atlas', 'atlas-read']

    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Get branch name
        id: branch
        run: |
          echo "name=${GITHUB_REF#refs/heads/}" >> $GITHUB_OUTPUT

      - name: Get commit ID
        id: commit
        run: |
          echo "id=$(echo ${GITHUB_SHA} | cut -c1-7)abcd" >> $GITHUB_OUTPUT

      - name: Generate chart version
        id: version
        run: |
          # Semantic version: 1.0.0-branch.commitid
          # Replace underscores with hyphens for semver compliance
          BRANCH_NAME_NORMALIZED=$(echo "${{ steps.branch.outputs.name }}" | tr '_' '-')
          CHART_VERSION="1.0.0-${BRANCH_NAME_NORMALIZED}.${{ steps.commit.outputs.id }}"
          echo "chart=${CHART_VERSION}" >> $GITHUB_OUTPUT
          echo "Generated chart version: ${CHART_VERSION}"

      - name: Install Helm
        uses: azure/setup-helm@v3
        with:
          version: '3.12.0'

      - name: Update Chart.yaml with version
        run: |
          sed -i "s/^version: .*/version: ${{ steps.version.outputs.chart }}/" helm/${{ matrix.chart }}/Chart.yaml
          sed -i "s/^appVersion: .*/appVersion: \"${{ steps.commit.outputs.id }}\"/" helm/${{ matrix.chart }}/Chart.yaml

          echo "Updated ${{ matrix.chart }}/Chart.yaml:"
          cat helm/${{ matrix.chart }}/Chart.yaml | grep -E "^(version|appVersion):"

      - name: Update values.yaml with image tags
        run: |
          # Replace placeholders with actual values
          sed -i "s/ATLAS_LATEST_IMAGE_TAG/${{ steps.commit.outputs.id }}/g" helm/${{ matrix.chart }}/values.yaml
          sed -i "s/ATLAS_BRANCH_NAME/${{ steps.branch.outputs.name }}/g" helm/${{ matrix.chart }}/values.yaml

          echo "Image configuration in ${{ matrix.chart }}/values.yaml:"
          grep -A 3 "image:" helm/${{ matrix.chart }}/values.yaml | head -10

      - name: Update helm dependencies
        run: |
          cd helm/${{ matrix.chart }}
          helm dependency update

          echo "Chart dependencies:"
          ls -la charts/

      - name: Package helm chart
        run: |
          mkdir -p helm-packages
          helm package helm/${{ matrix.chart }}/ --destination ./helm-packages/

          echo "Packaged charts:"
          ls -lh helm-packages/

      - name: Login to GitHub Container Registry
        uses: docker/login-action@v2
        with:
          registry: ghcr.io
          username: $GITHUB_ACTOR
          password: ${{ secrets.ORG_PAT_GITHUB }}

      - name: Push chart to GHCR (OCI Registry)
        run: |
          CHART_FILE=$(ls helm-packages/${{ matrix.chart }}-*.tgz)
          echo "Pushing chart: ${CHART_FILE}"

          helm push ${CHART_FILE} oci://ghcr.io/atlanhq/helm-charts

          echo "✅ Chart published successfully!"
          echo "📦 Chart: ${{ matrix.chart }}"
          echo "📌 Version: ${{ steps.version.outputs.chart }}"
          echo "🏷️ Registry: oci://ghcr.io/atlanhq/helm-charts/${{ matrix.chart }}"

      - name: Create GitHub Release
        uses: ncipollo/release-action@v1
        with:
          tag: helm-${{ matrix.chart }}-v${{ steps.version.outputs.chart }}
          name: "${{ matrix.chart }} Helm Chart v${{ steps.version.outputs.chart }}"
          body: |
            ## 📦 ${{ matrix.chart }} Helm Chart Release

            **Chart**: `${{ matrix.chart }}`
            **Chart Version**: `${{ steps.version.outputs.chart }}`
            **App Version**: `${{ steps.commit.outputs.id }}`
            **Branch**: `${{ steps.branch.outputs.name }}`

            ### 🐳 Docker Image
            ```
            ghcr.io/atlanhq/atlas-metastore-${{ steps.branch.outputs.name }}:${{ steps.commit.outputs.id }}
            ```

            ### 📥 Installation

            **Via OCI Registry (Recommended):**
            ```bash
            helm install ${{ matrix.chart }} oci://ghcr.io/atlanhq/helm-charts/${{ matrix.chart }} \
              --version ${{ steps.version.outputs.chart }}
            ```

            **Via Downloaded Chart:**
            ```bash
            helm install ${{ matrix.chart }} ./${{ matrix.chart }}-${{ steps.version.outputs.chart }}.tgz
            ```
          artifacts: "./helm-packages/${{ matrix.chart }}-*.tgz"
          token: ${{ secrets.GITHUB_TOKEN }}
          makeLatest: false

      - name: Chart publish summary
        run: |
          echo "## 🎉 Helm Chart Published Successfully!" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "**Chart**: ${{ matrix.chart }}" >> $GITHUB_STEP_SUMMARY
          echo "**Version**: ${{ steps.version.outputs.chart }}" >> $GITHUB_STEP_SUMMARY
          echo "**Registry**: oci://ghcr.io/atlanhq/helm-charts/${{ matrix.chart }}" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "### Installation Command" >> $GITHUB_STEP_SUMMARY
          echo '```bash' >> $GITHUB_STEP_SUMMARY
          echo "helm install ${{ matrix.chart }} oci://ghcr.io/atlanhq/helm-charts/${{ matrix.chart }} --version ${{ steps.version.outputs.chart }}" >> $GITHUB_STEP_SUMMARY
          echo '```' >> $GITHUB_STEP_SUMMARY
````
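For reviewers who want to exercise the publish path before CI runs, the helm-publish steps above can be approximated locally. The sketch below is a convenience under stated assumptions, not part of the change: it presumes a local checkout with the `helm/atlas` chart, GNU sed, Helm 3.8+ for OCI support, and an existing `helm registry login ghcr.io` session; the trailing `abcd` suffix mirrors what the workflow appends to commit IDs.

```bash
#!/usr/bin/env bash
# Sketch: approximate the helm-publish job locally (assumes a checked-out repo,
# GNU sed, Helm >= 3.8, and prior authentication against ghcr.io).
set -euo pipefail

CHART="atlas"                                    # or atlas-read
BRANCH_NAME=$(git rev-parse --abbrev-ref HEAD | tr '_' '-')
COMMIT_ID="$(git rev-parse --short=7 HEAD)abcd"  # CI appends "abcd" the same way
CHART_VERSION="1.0.0-${BRANCH_NAME}.${COMMIT_ID}"

# Stamp the chart the way the "Update Chart.yaml with version" step does
sed -i "s/^version: .*/version: ${CHART_VERSION}/" "helm/${CHART}/Chart.yaml"
sed -i "s/^appVersion: .*/appVersion: \"${COMMIT_ID}\"/" "helm/${CHART}/Chart.yaml"

# Resolve file:// dependencies into charts/ and validate
(cd "helm/${CHART}" && helm dependency update)
helm lint "helm/${CHART}/"

# Package, then (optionally) push to the OCI registry used by CI
mkdir -p helm-packages
helm package "helm/${CHART}/" --destination ./helm-packages/
# helm push "helm-packages/${CHART}-${CHART_VERSION}.tgz" oci://ghcr.io/atlanhq/helm-charts
```

The push is left commented out so the sketch has no side effects; CI also rewrites the values.yaml image placeholders, which is skipped here.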
@@ -0,0 +1,28 @@

```yaml
apiVersion: v2
name: atlas-read
description: Apache Atlas Read Replica for Metadata Management
type: application
version: 1.0.0
appVersion: "3.0.0"  # Will be updated by CI with commit ID
maintainers:
  - name: Atlan Engineering
    email: [email protected]
keywords:
  - atlas
  - atlas-read
  - metadata
  - read-replica
  - apache-atlas
sources:
  - https://github.com/atlanhq/atlas-metastore
home: https://github.com/atlanhq/atlas-metastore
dependencies:
  - name: cassandra-online-dc
    repository: file://./charts/cassandra-online-dc
    version: 0.x.x
  - name: elasticsearch-read
    repository: file://./charts/elasticsearch-read
    version: 7.x.x
  - name: elasticsearch-exporter-read
    repository: file://./charts/elasticsearch-exporter-read
    version: 3.3.0
```
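Because every dependency above uses a `file://` repository, nothing is fetched from a remote chart index; `helm dependency update` (run by both the helm-lint and helm-publish jobs) archives the local subcharts into `charts/` before linting and packaging. A minimal local sketch of that resolution, assuming the `helm/atlas-read` layout referenced by the workflow:

```bash
# Resolve the file:// dependencies declared in Chart.yaml into charts/.
# The helm/atlas-read path is the layout assumed by the workflow.
cd helm/atlas-read
helm dependency update   # writes Chart.lock and archives each local subchart into charts/
helm dependency list     # cassandra-online-dc, elasticsearch-read, elasticsearch-exporter-read should report "ok"
ls -la charts/
```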
@@ -0,0 +1,2 @@

# atlas
This chart will install Apache Atlas, which uses Elasticsearch and Cassandra.
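The README itself is brief; the release notes generated by the workflow describe how the published chart is meant to be consumed. A hedged consumer-side example, with a placeholder version string in the `1.0.0-<branch>.<commit>` format that CI generates:

```bash
# Example only: install the atlas chart from the OCI registry the workflow pushes to.
# The version shown is a placeholder following the CI scheme 1.0.0-<branch>.<commit>.
helm registry login ghcr.io -u <github-username>   # token needs read:packages
helm install atlas oci://ghcr.io/atlanhq/helm-charts/atlas \
  --version 1.0.0-master.1234567abcd
```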
@@ -0,0 +1,17 @@

```
# Patterns to ignore when building packages.
# This supports shell glob matching, relative path matching, and
# negation (prefixed with !). Only one pattern per line.
.DS_Store
# Common VCS dirs
.git/
.gitignore
# Common backup files
*.swp
*.bak
*.tmp
*~
# Various IDEs
.project
.idea/
*.tmproj
OWNERS
```
@@ -0,0 +1,19 @@

```yaml
apiVersion: v2
appVersion: 3.11.5
description: Apache Cassandra is a free and open-source distributed database management
  system designed to handle large amounts of data across many commodity servers, providing
  high availability with no single point of failure.
engine: gotpl
home: http://cassandra.apache.org
icon: https://upload.wikimedia.org/wikipedia/commons/thumb/5/5e/Cassandra_logo.svg/330px-Cassandra_logo.svg.png
keywords:
- cassandra
- database
- nosql
maintainers:
- email: [email protected]
  name: KongZ
- email: [email protected]
  name: maorfr
name: cassandra-online-dc
version: 0.14.4
```