gh-org-iac

GitHub Organization Infrastructure-as-Code - Declarative, idempotent tooling for managing GitHub organizations, repositories, teams, and fork synchronization.

Gist: https://gist.github.com/cprima/5e6dbc305e584d35d334b7a3355cc2a4

This is a multi-org shared gist. Each organization has its own manifest file (e.g., my-org.json, another-org.json).

Features

  • Idempotent Operations - Safe to run multiple times, only applies changes when needed
  • Fork Management - Automatically sync forks with upstream repositories
  • Organization Settings - Centrally manage org-wide configurations
  • Team Management - Create teams and assign repository permissions
  • Repository Configuration - Settings, Pages, collaborators, and more
  • Dry Run Mode - Preview changes before applying them
  • Template-based Seeding - Initialize repos from template repositories
  • Delta Inspection - Deep diff analysis between forks and upstream

Multi-Organization Workflow

This gist enables managing multiple GitHub organizations from a single, shared tooling repository:

How It Works

  1. Shared Tooling (tracked in gist) - bootstrap.py, documentation, and *.example.* templates
  2. Org-Specific Config (local only) - Each org copies templates to create local config files
  3. Template Pattern:
    • Copy pyproject.example.toml → pyproject.toml (local, gitignored)
    • Copy org.example.json → your-org.json (local, gitignored)
    • Copy .org.example → .org (local, gitignored)
  4. Safe Updates - git pull only updates shared tooling, never overwrites your local configs

File Structure

gh-org-iac/  (gist clone)
├── bootstrap.py              # Shared IaC script (tracked)
├── pyproject.example.toml    # Template (tracked)
├── pyproject.toml            # Your local copy (gitignored)
├── org.example.json          # Template (tracked)
├── your-org.json             # Your local config (gitignored)
├── .org.example              # Template (tracked)
├── .org                      # Your local marker (gitignored)
├── !README.md                # Documentation (tracked)
└── .gitignore                # Git configuration (tracked)

Local files stay local forever - they are never committed or pushed back to the gist.

Prerequisites

  • GitHub CLI (gh), authenticated with an account that has admin access to the organization
  • uv for managing Python dependencies
  • Python 3.12+
  • git (needed for delta inspection)

Quick Start

This gist contains shared bootstrap tooling for managing multiple GitHub organizations. Each org creates local config files from templates.

# 1. Clone the gist
gh gist clone 5e6dbc305e584d35d334b7a3355cc2a4 my-org
cd my-org

# 2. Create local files from templates
cp pyproject.example.toml pyproject.toml
cp org.example.json my-org.json
cp .org.example .org
echo "my-org" > .org

# 3. Customize your local files (these stay local, not pushed)
# Edit my-org.json with your organization settings

# 4. Install dependencies
uv sync

# 5. Run the bootstrap script
uv run python bootstrap.py --dry  # Preview changes
uv run python bootstrap.py        # Apply configuration

Organization Flag File (.org)

The .org file tells the script which organization manifest to use. This is especially useful when multiple manifests exist in the same directory.
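Under the hood this is just a file read; a minimal sketch of the resolution logic (the same steps bootstrap.py performs in main(), minus its error handling):

from pathlib import Path

# Read the org name from the .org flag file and derive the manifest path,
# e.g. a .org containing "my-org" resolves to my-org.json.
org_name = Path(".org").read_text().strip()
manifest = Path(f"{org_name}.json")
print(f"Using manifest: {manifest}")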

# Create .org file
echo "my-org" > .org

# Script automatically uses my-org.json
uv run python bootstrap.py --dry

# Override with --manifest if needed
uv run python bootstrap.py --manifest another-org.json --dry

When to commit .org:

  • ✅ Commit if this directory is dedicated to one org
  • ❌ Don't commit if clones will manage different orgs
  • The .gitignore makes it optional

Pulling Updates from Gist

# Pull latest updates to shared tooling and templates
git pull

# Your local config files are NOT touched by git pull:
# - pyproject.toml (your customizations stay intact)
# - your-org.json (your org config stays intact)
# - .org (your org marker stays intact)

# Only these are updated:
# - bootstrap.py (shared script)
# - *.example.* files (templates)
# - Documentation

Backing Up Your Configuration

Your org-specific config files are not stored in the shared gist. To back up your configuration:

  • Private Git Repo: Create a private repository for your org's config
  • Encrypted Storage: Use encrypted cloud storage
  • Secrets Manager: Use AWS Secrets Manager, 1Password, etc.

Example with private repo:

git init
git add pyproject.toml your-org.json .org
git remote add backup git@github.com:your-org/iac-config-private.git
git commit -m "Backup org config"
git push backup main

Basic Usage

# Preview changes (dry run)
uv run python bootstrap.py --dry

# Apply configuration
uv run python bootstrap.py

# Only sync forks with upstream
uv run python bootstrap.py --sync-only

# Force sync (hard reset to upstream)
uv run python bootstrap.py --force-sync --dry

Manifest Configuration

Each organization has its own manifest JSON file (e.g., my-org.json, another-org.json) that defines the organization structure. The manifest filename should match the pattern <org-name>.json.

Organization Settings

{
  "org": "my-org",
  "template_repo": "fork-template",
  "org_settings": {
    "description": "Community forks and experimental repositories",
    "default_repository_permission": "read",
    "members_can_create_repositories": true,
    ...
  }
}

Repository Configuration

{
  "repos": [
    {
      "name": "awesome-project-fork",
      "description": "Fork of upstream/awesome-project for experimentation",
      "topics": ["fork", "python", "learning"],
      "visibility": "public",
      "upstream": "upstream-org/awesome-project",
      "sync_branch": "main",
      "init_readme": false,
      "has_discussions": true,
      "has_issues": true,
      "allow_forking": true
    }
  ]
}

Key Repository Fields

Field            Required  Description
name             Yes       Repository name
description      No        Repository description
visibility       No        public or private (default: from defaults)
upstream         No        Source repo for fork syncing (e.g., owner/repo)
sync_branch      No        Branch to sync (default: main)
topics           No        Array of topic tags
init_readme      No        Initialize with template (default: false for forks)
has_issues       No        Enable issues (default: true)
has_discussions  No        Enable discussions (default: true)
has_wiki         No        Enable wiki (default: false)
allow_forking    No        Allow others to fork (default: true)
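Per-repo fields override manifest-level defaults via a plain dictionary merge, the same pattern bootstrap.py uses; a minimal sketch with illustrative values:

repo_defaults = {"has_issues": True, "has_wiki": False, "has_discussions": True}
repo = {"name": "awesome-project-fork", "has_discussions": False}

# Later keys win, so anything set on the repo overrides the default.
effective = {**repo_defaults, **{k: v for k, v in repo.items() if k != "name"}}
print(effective)  # {'has_issues': True, 'has_wiki': False, 'has_discussions': False}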

Fork Synchronization

The bootstrap script can automatically keep your forks up-to-date with their upstream repositories.

How It Works

  1. Fork Detection - Verifies repo is actually a fork via GitHub API
  2. Sync Command - Uses gh repo sync to update from upstream
  3. Branch Targeting - Syncs specified branch (defaults to default branch)
  4. Conflict Handling - Fast-forward by default, force sync optional
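Each sync boils down to a single gh repo sync invocation; a minimal sketch of the command the script assembles (mirroring sync_fork in bootstrap.py, without its fork check and error handling):

import subprocess

def sync_one_fork(org: str, name: str, upstream: str, branch: str, force: bool = False) -> None:
    # gh repo sync fast-forwards the fork's branch from its upstream;
    # --force discards diverging commits, so it stays opt-in.
    cmd = ["gh", "repo", "sync", f"{org}/{name}", "--branch", branch, "--source", upstream]
    if force:
        cmd.append("--force")
    subprocess.run(cmd, check=True)

sync_one_fork("my-org", "cli-fork", "cli/cli", "trunk")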

Sync Examples

# Sync all forks configured in manifest
uv run python bootstrap.py

# Only sync forks, skip repo creation/updates
uv run python bootstrap.py --sync-only

# Preview sync operations
uv run python bootstrap.py --sync-only --dry

# Force sync (discards diverging changes)
uv run python bootstrap.py --force-sync

# Disable syncing entirely
uv run python bootstrap.py --no-sync

Fork Configuration Example

{
  "name": "cli-fork",
  "description": "Fork of GitHub CLI for custom features",
  "upstream": "cli/cli",
  "sync_branch": "trunk",
  "topics": ["fork", "go", "cli"]
}

Delta Inspection

The --inspect-delta flag provides deep insights into what has changed between your fork and its upstream repository before syncing.

What It Shows

  • Commit Status - How many commits ahead/behind
  • Upstream Changes - New commits that would be synced (up to 10)
  • File Changes - Which files would be modified, added, or deleted
  • Fork-specific Commits - Changes unique to your fork that would be lost with --force-sync
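The ahead/behind numbers come from plain git range queries against a local clone that has both an origin and an upstream remote; a minimal sketch using the same ranges as inspect_fork_delta:

import subprocess

def ahead_behind(clone_path: str, branch: str) -> tuple[int, int]:
    # origin/X..upstream/X counts commits only in upstream (behind);
    # the reversed range counts commits only in the fork (ahead).
    def count(rev_range: str) -> int:
        out = subprocess.run(
            ["git", "-C", clone_path, "rev-list", "--count", rev_range],
            capture_output=True, text=True, check=True,
        ).stdout
        return int(out.strip() or 0)

    behind = count(f"origin/{branch}..upstream/{branch}")
    ahead = count(f"upstream/{branch}..origin/{branch}")
    return ahead, behind

print(ahead_behind("./typer", "main"))  # e.g. (2, 5)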

Delta Inspection Examples

# Inspect all forks without making changes
uv run python bootstrap.py --inspect-delta --no-sync

# Inspect and then sync
uv run python bootstrap.py --inspect-delta

# Inspect only specific forks (with --sync-only)
uv run python bootstrap.py --inspect-delta --sync-only

# See what would happen with force sync
uv run python bootstrap.py --inspect-delta --force-sync --dry

Sample Output

Inspecting Delta: typer <-> fastapi/typer

Status: 5 commits behind | 2 commits ahead

New commits in upstream (would be synced):
  • abc1234 Fix typo in documentation
  • def5678 Add support for Python 3.12
  • ghi9012 Update dependencies
  • jkl3456 Improve error messages
  • mno7890 Add new CLI option

Files affected: 12
  M README.md
  M pyproject.toml
  M typer/main.py
  A tests/test_new_feature.py
  ... and 8 more files

Commits in fork not in upstream:
  • xyz1111 Custom feature for my use case
  • abc2222 Local configuration changes
⚠ These commits would be LOST with --force-sync

CLI Options

--manifest, -m PATH       Path to manifest JSON (auto-detected from .org file if not specified)
--dry / --no-dry         Plan only, no writes (default: --no-dry)
--sync / --no-sync       Sync forks with upstream (default: --sync)
--sync-only              Only sync forks, skip repo creation
--force-sync             Force sync (hard reset to upstream)
--inspect-delta          Show detailed diff between fork and upstream
--help                   Show help message

Workflow

The bootstrap script executes in this order:

  1. Organization Settings - Apply org-wide configurations
  2. Team Management - Create/update teams and permissions
  3. For Each Repository:
    • Create/verify repository exists
    • Clone locally if needed
    • Seed from template (if init_readme: true)
    • Set default branch
    • Apply repository settings
    • Configure GitHub Pages
    • Add collaborators
    • Sync with upstream (if upstream field present)

Examples

Adding a New Fork

  1. Edit my-org.json:
{
  "repos": [
    {
      "name": "typer-fork",
      "description": "Fork of typer for custom CLI features",
      "upstream": "tiangolo/typer",
      "sync_branch": "master",
      "topics": ["fork", "python", "cli"],
      "visibility": "public"
    }
  ]
}
  2. Run bootstrap:
# Preview first
uv run python bootstrap.py --dry

# Apply changes
uv run python bootstrap.py

Keeping Forks Updated

Schedule regular syncs with a cron job or CI/CD:

# Daily sync job
0 2 * * * cd /path/to/my-org && uv run python bootstrap.py --sync-only

Organization-wide Updates

Change repo defaults for all repositories:

{
  "repo_defaults": {
    "has_discussions": true,
    "delete_branch_on_merge": true,
    "allow_forking": true
  }
}

Then run: uv run python bootstrap.py

Development

Project Structure

my-org/
├── bootstrap.py          # Main IaC script
├── my-org.json           # Organization manifest
├── pyproject.toml        # Python dependencies
└── !README.md            # This file

Code Quality

# Format and lint
uv run ruff format bootstrap.py
uv run ruff check bootstrap.py

# Run tests (when added)
uv run pytest

Safety Features

  • Dry run mode - Always test with --dry first
  • Idempotent operations - Safe to run multiple times
  • Fork verification - Confirms fork relationship before syncing
  • Error handling - Graceful failures with informative messages
  • Fast-forward sync - Won't overwrite diverging changes without --force-sync

Troubleshooting

Authentication Issues

# Verify gh CLI is authenticated
gh auth status

# Re-authenticate if needed
gh auth login

Fork Sync Failures

  • Ensure upstream field matches actual fork parent
  • Use --force-sync if branches have diverged (caution: overwrites changes)
  • Check that the fork relationship exists on GitHub

Permission Errors

  • Verify you have admin access to the organization
  • Check that gh CLI is authenticated with correct account

License

This tooling is shared openly via the gist model for managing GitHub organizations.

Related

.gitignore

# Virtual environment
.venv/
venv/
env/
# Python cache
__pycache__/
*.pyc
*.pyo
*.pyd
.Python
# UV lock file (regenerated on sync)
uv.lock
# Org-specific customizable files (copy from *.example.* templates)
pyproject.toml
.org
*.json
!*.example.json
# Cloned repositories (exclude all directories except gist files)
# Each directory is a cloned fork/repo
*/
# But include these specific patterns
!.gitignore
!bootstrap.py
!pyproject.example.toml
!*.md
!*.txt
# OS files
.DS_Store
Thumbs.db
*.swp
*.swo
*~
# IDE
.vscode/
.idea/
*.iml

.org.example

your-org-name

bootstrap.py

#!/usr/bin/env python
import json
import shutil
import subprocess
from pathlib import Path

import typer
from rich.console import Console

console = Console()


def sh(cmd, dry=False, check=True):
    if dry:
        console.print(f"[cyan]DRY:[/cyan] {' '.join(cmd)}")
        return ""
    r = subprocess.run(cmd, capture_output=True, text=True)
    if check and r.returncode != 0:
        raise RuntimeError(r.stderr.strip() or r.stdout.strip())
    return r.stdout.strip()


def repo_exists(org, name) -> bool:
    try:
        sh(["gh","repo","view",f"{org}/{name}","--json","name"])
        return True
    except Exception:
        return False


def ensure_repo(org, name, vis, desc, topics, dry):
    """Create or update repository - idempotent"""
    created = False
    if not repo_exists(org, name):
        sh(["gh","repo","create", f"{org}/{name}",
            f"--{vis}", "--disable-issues", "--disable-wiki", "--confirm"], dry)
        console.print(f"[green]Created {org}/{name}[/green]")
        created = True
    else:
        console.print(f"[dim]Repo exists {org}/{name}[/dim]")
    # Only update visibility if it changed (check current state)
    if not created:
        try:
            current_vis = sh(
                ["gh", "repo", "view", f"{org}/{name}", "--json", "visibility", "-q", ".visibility"], check=True
            )
            current_vis = current_vis.strip().lower()
            target_vis = vis.lower()
            if current_vis != target_vis:
                sh(["gh","repo","edit",f"{org}/{name}",f"--visibility={target_vis}"], dry, check=False)
                console.print(f"[yellow]Visibility changed: {current_vis} => {target_vis}[/yellow]")
        except Exception:
            # If check fails, try to update anyway
            sh(["gh","repo","edit",f"{org}/{name}",f"--visibility={vis}"], dry, check=False)
    if desc:
        sh(["gh","repo","edit",f"{org}/{name}","--description",desc], dry, check=False)
    if topics:
        sh(["gh","repo","edit",f"{org}/{name}","--add-topic",",".join(topics)], dry, check=False)


def set_default_branch(org, name, default_branch, dry):
    if default_branch:
        sh(["gh","api","-X","PATCH",f"repos/{org}/{name}","-f",f"default_branch={default_branch}"], dry, check=False)


def ensure_local_clone(base: Path, org, name, dry):
    local = base / name
    if not local.exists():
        sh(["gh","repo","clone",f"{org}/{name}",str(local)], dry)
        console.print(f"Cloned => {local}")
    else:
        console.print("Local exists => " + str(local))
    return local


def seed_repo(local: Path, name: str, org: str, template_repo: str, dry: bool):
    """Initialize repo from template repository instead of local seed folder"""
    if dry:
        console.print(f"[cyan]DRY:[/cyan] seed repo {local} from template {template_repo}")
        return
    # Check if repo already has commits
    try:
        sh(["git","-C",str(local),"rev-parse","HEAD"], check=True)
        console.print("Repo already initialized")
        return
    except Exception:
        pass  # No commits yet, continue with seeding
    # Clone template repo to temp location
    import tempfile
    temp_dir = Path(tempfile.mkdtemp()) / template_repo
    try:
        sh(["gh","repo","clone",f"{org}/{template_repo}",str(temp_dir)], check=True)
        # Remove .git to avoid conflicts
        git_dir = temp_dir / ".git"
        if git_dir.exists():
            shutil.rmtree(git_dir)
        # Copy template files to target repo
        rsync_dir(temp_dir, local)
        # Substitute placeholders
        substitute_placeholders(local, name, org)
    finally:
        # Clean up temp dir
        if temp_dir.exists():
            shutil.rmtree(temp_dir.parent)
    # Stage all files and commit
    sh(["git","-C",str(local),"add","."], check=False)
    st = sh(["git","-C",str(local),"status","--porcelain"], check=False)
    if st.strip():
        sh(["git","-C",str(local),"commit","-m","bootstrap: initial commit from template"], check=False)
        sh(["git","-C",str(local),"branch","-M","main"], check=False)
        sh(["git","-C",str(local),"push","-u","origin","main"], check=False)
        console.print(f"Initialized repo => {name}")
    else:
        console.print("No changes to commit")


def rsync_dir(src: Path, dst: Path):
    """Copy files from src to dst, respecting .seedignore"""
    ignore = set()
    ig = src / ".seedignore"
    if ig.exists():
        ignore = {p.strip() for p in ig.read_text(encoding="utf-8").splitlines()
                  if p.strip() and not p.startswith("#")}
    for item in src.rglob("*"):
        rel = item.relative_to(src)
        # Skip .seedignore itself and ignored patterns
        if str(rel) == ".seedignore":
            continue
        if any(str(rel).startswith(p) for p in ignore):
            continue
        target = dst / rel
        if item.is_dir():
            target.mkdir(parents=True, exist_ok=True)
        else:
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(item, target)


def substitute_placeholders(path: Path, name: str, org: str):
    """Replace placeholders in seed files"""
    from datetime import datetime
    year = str(datetime.utcnow().year)
    for file in path.rglob("*"):
        if file.is_file() and file.suffix.lower() in {".md", ".txt", ".yml", ".yaml", ".json"}:
            try:
                text = file.read_text(encoding="utf-8")
                new_text = (text.replace("{{REPO_NAME}}", name)
                                .replace("{{ORG}}", org)
                                .replace("{{YEAR}}", year))
                if new_text != text:
                    file.write_text(new_text, encoding="utf-8")
            except Exception:
                pass


def inspect_fork_delta(org, name, upstream, branch, base_path):
    """Inspect differences between fork and upstream repository"""
    if not upstream:
        console.print(f"[dim]No upstream configured for {name}[/dim]")
        return
    console.print(f"\n[bold yellow]Inspecting Delta: {name} <-> {upstream}[/bold yellow]")
    local = base_path / name
    # Clone or update local repo
    if not local.exists():
        try:
            sh(["gh", "repo", "clone", f"{org}/{name}", str(local)], check=True)
            console.print(f"[dim]Cloned {name} for inspection[/dim]")
        except Exception as e:
            console.print(f"[red]Failed to clone {name}: {e}[/red]")
            return
    else:
        console.print(f"[dim]Using local copy: {local}[/dim]")
    # Add upstream remote if not exists
    try:
        remotes = sh(["git", "-C", str(local), "remote"], check=True)
        if "upstream" not in remotes:
            upstream_url = f"https://github.com/{upstream}.git"
            sh(["git", "-C", str(local), "remote", "add", "upstream", upstream_url], check=True)
            console.print(f"[dim]Added upstream remote: {upstream}[/dim]")
    except Exception as e:
        console.print(f"[yellow]Warning: Could not add upstream remote: {e}[/yellow]")
    # Fetch from both remotes
    try:
        console.print("[dim]Fetching from origin and upstream...[/dim]")
        sh(["git", "-C", str(local), "fetch", "origin", branch], check=False)
        sh(["git", "-C", str(local), "fetch", "upstream", branch], check=False)
    except Exception as e:
        console.print(f"[yellow]Warning: Fetch failed: {e}[/yellow]")
    # Get commit comparison
    try:
        # Commits behind (in upstream but not in fork)
        behind = sh(["git", "-C", str(local), "rev-list", "--count", f"origin/{branch}..upstream/{branch}"], check=True)
        # Commits ahead (in fork but not in upstream)
        ahead = sh(["git", "-C", str(local), "rev-list", "--count", f"upstream/{branch}..origin/{branch}"], check=True)
        behind_count = int(behind.strip()) if behind.strip() else 0
        ahead_count = int(ahead.strip()) if ahead.strip() else 0
        # Status summary
        if behind_count == 0 and ahead_count == 0:
            console.print("[green]✓ Fork is up-to-date with upstream[/green]")
            return
        else:
            status_parts = []
            if behind_count > 0:
                status_parts.append(f"[red]{behind_count} commits behind[/red]")
            if ahead_count > 0:
                status_parts.append(f"[cyan]{ahead_count} commits ahead[/cyan]")
            console.print(f"Status: {' | '.join(status_parts)}")
        # Show commits behind (what would be synced)
        if behind_count > 0:
            console.print(f"\n[yellow]New commits in upstream (would be synced):[/yellow]")
            commits = sh(["git", "-C", str(local), "log", "--oneline", "--no-decorate",
                          f"origin/{branch}..upstream/{branch}", "-10"], check=True)
            for line in commits.strip().split('\n')[:10]:
                if line.strip():
                    console.print(f" • {line}")
            if behind_count > 10:
                console.print(f" ... and {behind_count - 10} more commits")
            # Show files that would change
            try:
                files_changed = sh(["git", "-C", str(local), "diff", "--name-status",
                                    f"origin/{branch}...upstream/{branch}"], check=True)
                if files_changed.strip():
                    lines = files_changed.strip().split('\n')
                    file_count = len(lines)
                    console.print(f"\n[yellow]Files affected: {file_count}[/yellow]")
                    for line in lines[:15]:
                        if line.strip():
                            console.print(f" {line}")
                    if file_count > 15:
                        console.print(f" ... and {file_count - 15} more files")
            except Exception:
                pass
        # Show commits ahead (would be lost with force sync)
        if ahead_count > 0:
            console.print(f"\n[cyan]Commits in fork not in upstream:[/cyan]")
            commits = sh(["git", "-C", str(local), "log", "--oneline", "--no-decorate",
                          f"upstream/{branch}..origin/{branch}", "-5"], check=True)
            for line in commits.strip().split('\n')[:5]:
                if line.strip():
                    console.print(f" • {line}")
            if ahead_count > 5:
                console.print(f" ... and {ahead_count - 5} more commits")
            console.print("[red]⚠ These commits would be LOST with --force-sync[/red]")
    except Exception as e:
        console.print(f"[red]Failed to inspect delta: {e}[/red]")


def sync_fork(org, name, upstream, branch, force_sync, dry):
    """Sync fork with upstream repository - idempotent"""
    if not upstream:
        return
    # Check if repo is actually a fork
    try:
        repo_info = sh(["gh", "repo", "view", f"{org}/{name}", "--json", "isFork", "-q", ".isFork"], check=True)
        is_fork = repo_info.strip().lower() == "true"
        if not is_fork:
            console.print(f"[yellow]Skipping sync: {name} is not a fork[/yellow]")
            return
    except Exception:
        console.print(f"[dim]Could not verify fork status for {name}[/dim]")
        return
    # Build sync command
    cmd = ["gh", "repo", "sync", f"{org}/{name}"]
    if branch:
        cmd.extend(["--branch", branch])
    if upstream:
        cmd.extend(["--source", upstream])
    if force_sync:
        cmd.append("--force")
        console.print(f"[yellow]Force syncing {name} with {upstream}[/yellow]")
    try:
        sh(cmd, dry, check=True)
        console.print(f"[green]Synced {name} with {upstream}[/green]")
    except Exception as e:
        console.print(f"[yellow]Failed to sync {name}: {str(e)}[/yellow]")


def apply_org_settings(org, settings, dry):
    """Apply organization-wide settings"""
    if not settings:
        return
    # Build API call with all settings
    cmd = ["gh", "api", "-X", "PATCH", f"orgs/{org}"]
    applied = []
    # Add string fields
    for field in ["description", "name", "blog", "location", "email", "company", "twitter_username",
                  "default_repository_permission", "default_repository_branch"]:
        if field in settings and settings[field]:
            cmd.extend(["-f", f"{field}={settings[field]}"])
            applied.append(field)
    # Add boolean fields
    for field in ["members_can_create_repositories", "members_can_create_public_repositories",
                  "members_can_create_private_repositories", "members_can_delete_repositories",
                  "members_can_change_repo_visibility", "members_can_fork_private_repositories",
                  "members_can_delete_issues", "members_can_invite_outside_collaborators",
                  "members_can_create_teams", "members_can_create_pages", "readers_can_create_discussions",
                  "web_commit_signoff_required", "has_organization_projects", "has_repository_projects",
                  "dependabot_alerts_enabled_for_new_repositories",
                  "dependabot_security_updates_enabled_for_new_repositories",
                  "dependency_graph_enabled_for_new_repositories"]:
        if field in settings:
            cmd.extend(["-F", f"{field}={str(settings[field]).lower()}"])
            applied.append(field)
    if len(cmd) > 4:  # Only run if we have settings to apply
        sh(cmd, dry)
        console.print(f"[green]Org settings:[/green] {len(applied)} settings applied")


def apply_repo_settings(org, name, settings, is_private, is_fork, dry):
    """Apply repository settings"""
    if not settings:
        return
    # Build API call with repo settings
    cmd = ["gh", "api", "-X", "PATCH", f"repos/{org}/{name}"]
    notable = []
    # Add string fields
    for field in ["description", "homepage"]:
        if field in settings and settings[field]:
            cmd.extend(["-f", f"{field}={settings[field]}"])
    # Add boolean fields and track notable ones
    for field in ["has_issues", "has_projects", "has_wiki", "has_discussions",
                  "allow_squash_merge", "allow_merge_commit", "allow_rebase_merge",
                  "allow_auto_merge", "delete_branch_on_merge", "allow_update_branch",
                  "web_commit_signoff_required", "archived", "has_downloads", "is_template"]:
        if field in settings:
            cmd.extend(["-F", f"{field}={str(settings[field]).lower()}"])
            # Track settings that differ from typical defaults or are noteworthy
            if field in ["has_discussions", "has_wiki", "archived", "is_template"] and settings[field]:
                notable.append(f"{field}=true")
            elif field in ["has_issues", "has_projects"] and not settings[field]:
                notable.append(f"{field}=false")
            elif field in ["has_downloads"] and not settings[field]:
                notable.append(f"{field}=false")
    # Only include allow_forking for public, non-fork repos
    # Forks cannot control their forking settings (inherited from parent)
    if not is_private and not is_fork and "allow_forking" in settings:
        cmd.extend(["-F", f"allow_forking={str(settings['allow_forking']).lower()}"])
    if len(cmd) > 4:  # Only run if we have settings to apply
        sh(cmd, dry, check=False)
        if notable:
            console.print(f"[dim]Repo settings: {', '.join(notable)}[/dim]")
        else:
            console.print("[dim]Repo settings: defaults applied[/dim]")


def apply_pages_settings(org, name, pages_config, is_private, dry):
    """Apply GitHub Pages configuration - idempotent"""
    if not pages_config:
        return
    enabled = pages_config.get("enabled", False)
    # Check if Pages currently exists and get current config
    pages_exists = False
    current_pages = None
    try:
        result = sh(["gh", "api", f"repos/{org}/{name}/pages"], check=True)
        if result:
            current_pages = json.loads(result)
            pages_exists = True
    except Exception:
        pass
    # Handle enabling Pages
    if enabled:
        if is_private:
            console.print(f"[dim]Pages skipped: {name} is private (requires paid plan)[/dim]")
            return
        # Get target configuration
        source = pages_config.get("source", {})
        target_branch = source.get("branch", "main")
        target_path = source.get("path", "/docs")
        target_build_type = pages_config.get("build_type", "workflow")
        # Get target custom domain
        target_cname = pages_config.get("cname")
        # Check if Pages needs updating
        needs_update = not pages_exists
        if pages_exists and current_pages:
            current_build = current_pages.get("build_type")
            current_source = current_pages.get("source", {})
            current_branch = current_source.get("branch")
            current_path = current_source.get("path")
            current_cname = current_pages.get("cname")
            if (current_build != target_build_type or
                    current_branch != target_branch or
                    current_path != target_path or
                    current_cname != target_cname):
                needs_update = True
        if needs_update:
            # Determine method: POST to create, PUT to update
            method = "PUT" if pages_exists else "POST"
            cmd = ["gh", "api", "-X", method, f"repos/{org}/{name}/pages"]
            cmd.extend(["-F", f"source[branch]={target_branch}"])
            cmd.extend(["-F", f"source[path]={target_path}"])
            cmd.extend(["-F", f"build_type={target_build_type}"])
            # Add custom domain if specified
            if target_cname:
                cmd.extend(["-F", f"cname={target_cname}"])
            sh(cmd, dry, check=False)
            action = "updated" if pages_exists else "enabled"
            console.print(f"[green]Pages {action}: {target_build_type} from {target_branch}{target_path}[/green]")
        else:
            console.print(f"[dim]Pages unchanged: {target_build_type} from {target_branch}{target_path}[/dim]")
        # Handle HTTPS enforcement (separate API call, only if Pages exists and custom domain is set)
        if pages_exists and target_cname:
            target_https = pages_config.get("https_enforced", True)
            current_https = current_pages.get("https_enforced") if current_pages else False
            if target_https != current_https:
                # Note: HTTPS can only be enforced after DNS is configured and certificate is issued
                # This API call may fail if certificate is not ready yet
                sh(["gh", "api", "-X", "PUT", f"repos/{org}/{name}/pages",
                    "-F", f"https_enforced={str(target_https).lower()}"], dry, check=False)
                console.print(f"[green]HTTPS enforcement: {target_https}[/green]")
            elif target_https:
                console.print("[dim]HTTPS already enforced[/dim]")
    # Handle disabling Pages
    elif pages_exists and not enabled:
        sh(["gh", "api", "-X", "DELETE", f"repos/{org}/{name}/pages"], dry)
        console.print("[dim]Pages disabled[/dim]")


def ensure_members(org, name, members, dry):
    for m in members or []:
        user = m["user"]
        perm = m.get("permission","push")
        sh(["gh","api","-X","PUT", f"repos/{org}/{name}/collaborators/{user}",
            "-f", f"permission={perm}"], dry)
        console.print(f"Member {user} <= {perm}")


def set_deploy_key_secret(org, name, is_private, dry):
    """Set CATALOG_PUBLISH_TOKEN secret for private repos"""
    if not is_private:
        return
    # Check if deploy key file exists
    key_file = Path("deploy-key")
    if not key_file.exists():
        console.print("[dim]Deploy key not found, skipping secret setup[/dim]")
        return
    try:
        # Read private key
        private_key = key_file.read_text()
        # Set secret via gh CLI
        cmd = ["gh", "secret", "set", "CATALOG_PUBLISH_TOKEN",
               "--repo", f"{org}/{name}", "--body", private_key]
        sh(cmd, dry, check=False)
        console.print("[green]Deploy key secret set for private repo[/green]")
    except Exception as e:
        console.print(f"[yellow]Failed to set deploy key secret: {e}[/yellow]")


def team_exists(org, team_slug) -> bool:
    """Check if team exists by slug"""
    try:
        sh(["gh", "api", f"orgs/{org}/teams/{team_slug}"], check=True)
        return True
    except Exception:
        return False


def ensure_teams(org, teams_config, dry):
    """Create and configure teams - idempotent"""
    if not teams_config:
        return
    console.print("\n[bold yellow]Teams Management[/bold yellow]")
    for team in teams_config:
        team_name = team["name"]
        team_slug = team_name.lower().replace(" ", "-")
        privacy = team.get("privacy", "closed")
        description = team.get("description", "")
        repos = team.get("repos", {})
        # Check if team exists
        exists = team_exists(org, team_slug)
        if not exists:
            # Create team
            cmd = ["gh", "api", "-X", "POST", f"orgs/{org}/teams",
                   "-f", f"name={team_name}",
                   "-f", f"privacy={privacy}"]
            if description:
                cmd.extend(["-f", f"description={description}"])
            sh(cmd, dry, check=False)
            console.print(f"[green]Created team: {team_name}[/green]")
        else:
            # Update team settings if they exist
            cmd = ["gh", "api", "-X", "PATCH", f"orgs/{org}/teams/{team_slug}",
                   "-f", f"name={team_name}",
                   "-f", f"privacy={privacy}"]
            if description:
                cmd.extend(["-f", f"description={description}"])
            sh(cmd, dry, check=False)
            console.print(f"[dim]Team exists: {team_name}[/dim]")
        # Assign repository permissions
        for repo_name, permission in repos.items():
            # Map permission names to GitHub API permission values
            # GitHub API accepts: pull, push, admin, maintain, triage
            permission_map = {
                "read": "pull",
                "write": "push",
                "admin": "admin",
                "maintain": "maintain",
                "triage": "triage"
            }
            api_permission = permission_map.get(permission.lower(), permission.lower())
            cmd = ["gh", "api", "-X", "PUT",
                   f"orgs/{org}/teams/{team_slug}/repos/{org}/{repo_name}",
                   "-f", f"permission={api_permission}"]
            sh(cmd, dry, check=False)
            console.print(f" [dim]+ {repo_name}: {permission}[/dim]")


def main(
    manifest: Path = typer.Option(None, "--manifest", "-m", help="Path to manifest JSON (auto-detected from .org file if not specified)"),
    dry: bool = typer.Option(False, "--dry/--no-dry", help="Plan only (no writes)"),
    sync: bool = typer.Option(True, "--sync/--no-sync", help="Sync forks with upstream (default: True)"),
    sync_only: bool = typer.Option(False, "--sync-only", help="Only sync existing forks, skip repo creation"),
    force_sync: bool = typer.Option(False, "--force-sync", help="Force sync (hard reset to upstream)"),
    inspect_delta: bool = typer.Option(False, "--inspect-delta", help="Show detailed diff between fork and upstream")
):
    # Auto-detect manifest from .org flag file if not specified
    if manifest is None:
        org_file = Path(".org")
        if org_file.exists():
            org_name = org_file.read_text().strip()
            manifest = Path(f"{org_name}.json")
            console.print(f"[dim]Using manifest from .org file: {manifest}[/dim]")
        else:
            console.print(f"[red]No manifest specified and .org file not found[/red]")
            console.print(f"[yellow]Create .org file with org name, or use --manifest flag[/yellow]")
            raise typer.Exit(1)
    if not manifest.exists():
        console.print(f"[red]Missing manifest: {manifest}[/red]")
        raise typer.Exit(1)
    cfg = json.loads(manifest.read_text())
    org = cfg["org"]
    org_settings = cfg.get("org_settings", {})
    dfl = cfg.get("defaults", {})
    repo_defaults = cfg.get("repo_defaults", {})
    pages_defaults = cfg.get("pages_defaults", {})
    template_repo = cfg.get("template_repo", "api-template")  # <--- template repository
    base = Path.cwd()
    console.rule(f"[bold yellow]Apply (IaC) {org}[/bold yellow]")
    # Apply org-wide settings first
    if org_settings:
        apply_org_settings(org, org_settings, dry)
    # Create and configure teams
    teams_config = cfg.get("teams", [])
    if teams_config:
        ensure_teams(org, teams_config, dry)
    for r in cfg["repos"]:
        name = r["name"]
        desc = r.get("description","")
        topics = r.get("topics",[])
        # visibility precedence: repo.visibility > repo.private(bool) > defaults.visibility > "private"
        if "visibility" in r:
            vis = str(r["visibility"]).lower()
        elif "private" in r:
            vis = "private" if r["private"] else "public"
        else:
            vis = str(dfl.get("visibility","private")).lower()
        default_branch = r.get("default_branch") or dfl.get("default_branch") or "main"
        init_readme = bool(r.get("init_readme", dfl.get("init_readme", True)))
        members = r.get("members", [])
        upstream = r.get("upstream")
        sync_branch = r.get("sync_branch") or default_branch
        # Merge repo_defaults with per-repo settings (per-repo overrides defaults)
        repo_settings = {**repo_defaults, **{k: v for k, v in r.items()
                         if k in ["homepage", "has_issues", "has_projects", "has_wiki", "has_discussions",
                                  "allow_squash_merge", "allow_merge_commit", "allow_rebase_merge",
                                  "allow_auto_merge", "delete_branch_on_merge", "allow_update_branch",
                                  "allow_forking", "web_commit_signoff_required", "archived",
                                  "has_downloads", "is_template"]}}
        # Merge pages_defaults with per-repo pages config (per-repo overrides defaults)
        pages_config = None
        if "pages" in r:
            pages_config = {**pages_defaults, **r["pages"]}
        elif pages_defaults.get("enabled", False):
            pages_config = pages_defaults
        console.print(f"[bold cyan]=> {name}[/bold cyan]")
        # Detect if this is a fork (has upstream field)
        is_fork = bool(upstream)
        if not sync_only:
            if is_fork:
                # Forks already exist on GitHub - just verify and update metadata
                console.print(f"[dim]Fork of {upstream} - skipping creation[/dim]")
                if desc or topics:
                    if desc:
                        sh(["gh","repo","edit",f"{org}/{name}","--description",desc], dry, check=False)
                    if topics:
                        sh(["gh","repo","edit",f"{org}/{name}","--add-topic",",".join(topics)], dry, check=False)
            else:
                # Regular repos - full creation workflow
                # 1. Create remote repo
                ensure_repo(org, name, vis, desc, topics, dry)
                # 2. Clone locally and seed only if init_readme is true (new repos)
                if init_readme:
                    local = ensure_local_clone(base, org, name, dry)
                    seed_repo(local, name, org, template_repo, dry)
            # Common operations for both forks and regular repos
            # 3. Set default branch
            set_default_branch(org, name, default_branch, dry)
            # 4. Apply repo settings
            is_private = (vis == "private")
            apply_repo_settings(org, name, repo_settings, is_private, is_fork, dry)
            # 5. Apply Pages settings
            apply_pages_settings(org, name, pages_config, is_private, dry)
            # 6. Add collaborators
            ensure_members(org, name, members, dry)
            # 7. Set deploy key secret for private repos
            set_deploy_key_secret(org, name, is_private, dry)
        # 8. Inspect delta if requested (clones locally as needed)
        if inspect_delta and upstream:
            inspect_fork_delta(org, name, upstream, sync_branch, base)
        # 9. Sync fork with upstream if configured
        if sync and upstream:
            sync_fork(org, name, upstream, sync_branch, force_sync, dry)
    console.rule("[bold green]Done[/bold green]")


if __name__ == "__main__":
    typer.run(main)

CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

Project Overview

gh-org-iac is a declarative, idempotent Infrastructure-as-Code tool for managing GitHub organizations. It's designed as a multi-org shared gist where the tooling (bootstrap.py) is shared across organizations, but each org has its own uniquely-named manifest JSON file.

Gist URL: https://gist.github.com/cprima/5e6dbc305e584d35d334b7a3355cc2a4

Key Architecture Principles

  1. Shared Tooling, Isolated Configs: bootstrap.py is shared; manifests like my-org.json, another-org.json are org-specific
  2. Idempotent by Design: All operations are safe to run multiple times; script checks current state before applying changes
  3. Gist-based Distribution: Updates to bootstrap.py and templates propagate via git pull; org-specific manifests stay local and are never pushed to the gist

Common Commands

Setup and Dependencies

# Install dependencies (creates .venv and uv.lock)
uv sync

# Run script (always use uv run to ensure correct environment)
uv run python bootstrap.py --help

Running the Script

# Always test with dry run first
uv run python bootstrap.py --dry

# Apply changes with organization-specific manifest
uv run python bootstrap.py --manifest my-org.json

# Sync forks only (skip repo creation/updates)
uv run python bootstrap.py --sync-only

# Inspect what would change before syncing
uv run python bootstrap.py --inspect-delta --no-sync

Code Quality

# Format code
uv run ruff format bootstrap.py

# Lint code
uv run ruff check bootstrap.py

Gist Synchronization

# Pull latest tooling updates
git pull

# Push tooling or template updates (org manifests stay local, never pushed)
git add bootstrap.py *.example.*
git commit -m "Update shared tooling"
git push

Architecture

Core Components

bootstrap.py (single-file architecture):

  • sh() - Wrapper for subprocess calls with dry-run support
  • ensure_repo() - Idempotent repo creation/updates
  • sync_fork() - Fork synchronization using gh repo sync
  • inspect_fork_delta() - Deep diff analysis (clones locally, fetches from origin+upstream, compares)
  • apply_org_settings() - Organization-level configuration
  • ensure_teams() - Team creation and repo permissions
  • apply_repo_settings() - Repository-level configuration
  • apply_pages_settings() - GitHub Pages configuration
  • main() - Orchestrates all operations in order

Execution Flow

  1. Parse manifest JSON
  2. Apply org settings (if not --sync-only)
  3. Create/configure teams (if defined)
  4. For each repository:
    • Create/verify repo exists
    • Seed from template (if init_readme: true)
    • Apply settings (issues, wiki, discussions, etc.)
    • Configure GitHub Pages
    • Add collaborators
    • Inspect delta (if --inspect-delta)
    • Sync with upstream (if upstream field present and not --no-sync)

Manifest Structure

Each org has a JSON manifest with these top-level keys:

  • org - Organization name
  • template_repo - Template for new repos
  • org_settings - Org-wide GitHub settings
  • teams - Team definitions with repo permissions
  • defaults - Default values for repos (visibility, branch, etc.)
  • repo_defaults - Default repo settings (issues, wiki, etc.)
  • pages_defaults - Default GitHub Pages config
  • repos - Array of repository definitions

Critical repo fields for fork management:

  • upstream - Source repository (e.g., "owner/repo") - triggers fork sync
  • sync_branch - Branch to sync (defaults to default_branch)

Delta Inspection Feature

When --inspect-delta is used, the script:

  1. Clones fork locally (or uses existing clone)
  2. Adds upstream remote if missing
  3. Fetches from both origin and upstream
  4. Compares using git rev-list and git log
  5. Shows: commits behind, commits ahead, files affected
  6. Warns if --force-sync would lose local commits

This happens before syncing, allowing informed decisions.

Important Conventions

File Naming

  • Manifest files: <org-name>.json (matches org name)
  • README: !README.md (the ! makes it sort first in gist listings)
  • Script name: bootstrap.py (never rename; multiple orgs depend on this)
  • Org flag file: .org (contains org name, auto-selects manifest)

Organization Flag File (.org)

When multiple manifests exist in the same directory, create a .org file containing the org name:

echo "my-org" > .org

The script will automatically use my-org.json without requiring --manifest flag:

# Uses manifest from .org file
uv run python bootstrap.py --dry

# Override if needed
uv run python bootstrap.py --manifest other-org.json

Committing .org:

  • Commit if directory is dedicated to one org (provides default)
  • Don't commit if different clones manage different orgs (keep local)
  • The .gitignore entry for .org makes this optional (remove or override it if you want to commit)

Manifest Defaults

When setting defaults for forks vs. new repos:

  • Forks: "init_readme": false (don't overwrite fork content)
  • Forks: "upstream": "owner/repo" (required for sync)
  • Forks: "visibility": "public" (typical for open source forks)

Idempotency Patterns

All functions check current state before applying changes:

  • repo_exists() before creating
  • Check current visibility before updating
  • Verify fork status before syncing
  • Check if repo has commits before seeding
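A minimal sketch of that check-before-apply pattern, using the same gh calls the script relies on (error handling and dry-run support trimmed):

import subprocess

def repo_exists(org: str, name: str) -> bool:
    # gh repo view exits non-zero when the repository does not exist.
    r = subprocess.run(["gh", "repo", "view", f"{org}/{name}", "--json", "name"],
                       capture_output=True, text=True)
    return r.returncode == 0

def ensure_repo(org: str, name: str, visibility: str = "private") -> None:
    # Create only when missing, so re-running is a no-op.
    if repo_exists(org, name):
        print(f"Repo exists {org}/{name}")
        return
    subprocess.run(["gh", "repo", "create", f"{org}/{name}", f"--{visibility}"], check=True)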

Safety Features

  • Dry run mode (--dry) prints commands without executing
  • Fork sync uses fast-forward by default (won't overwrite diverging changes)
  • --force-sync is required to hard reset (and shows warnings)
  • All gh API calls include error handling

Multi-Org Workflow

Each organization clones the gist and creates local copies of template files:

# Org 1
gh gist clone 5e6dbc305e584d35d334b7a3355cc2a4 my-org
cd my-org
cp pyproject.example.toml pyproject.toml  # Local customization
cp org.example.json my-org.json            # Local config
cp .org.example .org                       # Local org marker
echo "my-org" > .org                       # Write org name
uv sync
uv run python bootstrap.py --dry

# Org 2
gh gist clone 5e6dbc305e584d35d334b7a3355cc2a4 another-org
cd another-org
cp pyproject.example.toml pyproject.toml
cp org.example.json another-org.json
cp .org.example .org
echo "another-org" > .org
uv sync
uv run python bootstrap.py --dry

Important:

  • Files tracked in gist: bootstrap.py, docs, *.example.* templates
  • Files kept local: pyproject.toml, .org, *.json configs
  • Updates via git pull are safe - local files never overwritten

Common Tasks

Adding a New Org

  1. Clone gist: gh gist clone 5e6dbc305e584d35d334b7a3355cc2a4 <new-org-name>
  2. Create local files from templates:
    cp pyproject.example.toml pyproject.toml
    cp org.example.json <new-org-name>.json
    cp .org.example .org
    echo "<new-org-name>" > .org
  3. Edit <new-org-name>.json to configure org settings and repos
  4. Test: uv run python bootstrap.py --dry
  5. Apply: uv run python bootstrap.py

Note: Config files stay local and are never pushed to the gist.

Updating Shared Tooling

  1. Edit bootstrap.py or update templates (*.example.*)
  2. Test with multiple org configs
  3. Commit and push to gist
  4. All org clones pull to get update (their local configs remain untouched)

Fetching Existing Forks from GitHub

gh repo list <org-name> --json name,isFork,parent,description,defaultBranchRef,isPrivate --limit 100

Use this output to populate the manifest's repos array.
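A sketch of turning that output into manifest entries (the JSON field names match the gh flags above; the exact shape of parent and the mapping choices are assumptions to adjust):

import json
import subprocess

# Fetch fork metadata for an org and emit manifest-style repo entries.
out = subprocess.run(
    ["gh", "repo", "list", "my-org", "--json",
     "name,isFork,parent,description,defaultBranchRef,isPrivate", "--limit", "100"],
    capture_output=True, text=True, check=True,
).stdout

repos = []
for r in json.loads(out):
    if not r["isFork"]:
        continue
    parent = r["parent"]  # assumed shape: {"name": ..., "owner": {"login": ...}}
    repos.append({
        "name": r["name"],
        "description": r.get("description") or "",
        "upstream": f"{parent['owner']['login']}/{parent['name']}",
        "sync_branch": r["defaultBranchRef"]["name"],
        "visibility": "private" if r["isPrivate"] else "public",
    })

print(json.dumps({"repos": repos}, indent=2))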

Adding Fork Sync to Existing Manifest

For each fork, add:

{
  "name": "repo-name",
  "upstream": "original-owner/original-repo",
  "sync_branch": "main"
}

External Dependencies

  • GitHub CLI (gh): All GitHub API operations (must be authenticated)
  • uv: Package manager for Python dependencies
  • Python 3.12+: Required for type hints and modern syntax
  • typer: CLI framework (handles argument parsing)
  • rich: Terminal formatting and colors
  • git: For delta inspection (cloning, fetching, comparing)

Constraints and Limitations

  • No README.md: Use !README.md (the ! prefix sorts first in gist listings)
  • No commits without permission: Never force push; never add author info to commits (per user's .claude/CLAUDE.md)
  • Manifest uniqueness: Each org must have uniquely named manifest to avoid conflicts
  • Delta inspection requires git: The --inspect-delta feature clones repos locally for comparison

org.example.json

{
  "org": "your-org-name",
  "template_repo": "repo-template",
  "org_settings": {
    "description": "Your organization description",
    "name": "Your Org Display Name",
    "blog": "https://example.com/blog",
    "location": "San Francisco, CA",
    "email": "[email protected]",
    "company": "Example Corp",
    "twitter_username": "example",
    "default_repository_permission": "read",
    "default_repository_branch": "main",
    "members_can_create_repositories": false,
    "members_can_create_public_repositories": false,
    "members_can_create_private_repositories": false,
    "members_can_delete_repositories": false,
    "members_can_change_repo_visibility": false,
    "members_can_fork_private_repositories": false,
    "members_can_delete_issues": false,
    "members_can_invite_outside_collaborators": true,
    "members_can_create_teams": false,
    "members_can_create_pages": true,
    "readers_can_create_discussions": false,
    "web_commit_signoff_required": false,
    "has_organization_projects": true,
    "has_repository_projects": true,
    "dependabot_alerts_enabled_for_new_repositories": true,
    "dependabot_security_updates_enabled_for_new_repositories": true,
    "dependency_graph_enabled_for_new_repositories": true
  },
  "teams": [
    {
      "name": "Core Team",
      "privacy": "closed",
      "description": "Core maintainers with admin access",
      "repos": {
        "example-repo": "admin",
        "example-fork": "write"
      }
    },
    {
      "name": "Contributors",
      "privacy": "closed",
      "description": "External contributors with write access",
      "repos": {
        "example-repo": "write",
        "example-fork": "read"
      }
    }
  ],
  "defaults": {
    "visibility": "private",
    "default_branch": "main",
    "init_readme": true
  },
  "repo_defaults": {
    "has_issues": true,
    "has_projects": true,
    "has_wiki": false,
    "has_discussions": false,
    "allow_squash_merge": true,
    "allow_merge_commit": false,
    "allow_rebase_merge": true,
    "allow_auto_merge": false,
    "delete_branch_on_merge": true,
    "allow_update_branch": true,
    "allow_forking": true,
    "web_commit_signoff_required": false,
    "archived": false,
    "has_downloads": true,
    "is_template": false
  },
  "pages_defaults": {
    "enabled": false,
    "build_type": "workflow",
    "source": {
      "branch": "main",
      "path": "/docs"
    },
    "https_enforced": true
  },
  "repos": [
    {
      "name": "example-fork",
      "description": "Fork of upstream/example-repo for experimentation",
      "upstream": "upstream-owner/example-repo",
      "sync_branch": "main",
      "visibility": "public",
      "topics": ["fork", "example"],
      "has_discussions": true
    },
    {
      "name": "example-repo",
      "description": "Example repository created by bootstrap script",
      "visibility": "private",
      "default_branch": "main",
      "init_readme": true,
      "topics": ["example", "infrastructure"],
      "has_issues": true,
      "has_projects": false,
      "has_wiki": false,
      "has_discussions": false,
      "members": [
        {
          "user": "example-user",
          "permission": "push"
        }
      ]
    },
    {
      "name": "docs-site",
      "description": "Documentation site with GitHub Pages",
      "visibility": "public",
      "init_readme": true,
      "topics": ["documentation", "github-pages"],
      "has_wiki": false,
      "pages": {
        "enabled": true,
        "build_type": "workflow",
        "source": {
          "branch": "main",
          "path": "/docs"
        },
        "https_enforced": true,
        "cname": "docs.example.com"
      }
    }
  ]
}

pyproject.example.toml

[project]
name = "gh-org-iac-bootstrap"
version = "0.1.0"
description = "Infrastructure-as-Code tooling for managing GitHub organizations"
readme = "!README.md"
requires-python = ">=3.12"
dependencies = [
    "typer>=0.9.0",
    "rich>=13.0.0",
]

[dependency-groups]
dev = [
    "pytest>=7.0.0",
    "ruff>=0.1.0",
]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.hatch.build.targets.wheel]
packages = ["."]
only-include = ["bootstrap.py"]

[tool.ruff]
line-length = 120
target-version = "py312"

[tool.ruff.lint]
select = ["E", "F", "W", "I", "N", "UP"]
ignore = []