Visualize composer dependencies

Project Utility Scripts

This project contains two primary Python scripts designed to assist with dependency management and TYPO3 upgrade planning.

1. Dependency Visualizer (visualize_dependencies.py)

This script analyzes your composer.json and composer.lock files to generate comprehensive dependency reports and diagrams. It is specifically optimized for TYPO3 projects with multiple local packages.

Features

  • Multi-Format Output: Generates DOT (Graphviz), SVG, PlantUML, and Markdown reports.
  • Smart Grouping: Automatically collapses the largest group of site packages (e.g., btv/btv-ws-*) with identical dependencies into a single node; outliers with unique dependencies are kept separate to highlight deviations (see the sketch after this list).
  • Filtering: Focuses on relevant namespaces (typo3/, btv/, saitho/) to avoid noise from common vendor packages.
  • Direct Dependencies Mode: Option to only show immediate requirements of your own packages.
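
At its core, Smart Grouping hashes each site package's (filtered) dependency set and merges the largest bucket. A minimal sketch of that idea with hypothetical package names (the real logic lives in _group_site_packages in the script below):

import json
from collections import defaultdict

packages = {
    "btv/btv-ws-alpha": {"typo3/cms-core": "^12", "btv/base": "^1"},
    "btv/btv-ws-beta": {"typo3/cms-core": "^12", "btv/base": "^1"},
    "btv/btv-ws-gamma": {"typo3/cms-core": "^12", "btv/special": "^2"},
}

groups = defaultdict(list)
for name, deps in packages.items():
    # Identical dependency sets serialize to identical keys
    groups[json.dumps(deps, sort_keys=True)].append(name)

largest = max(groups.values(), key=len)
print(largest)  # ['btv/btv-ws-alpha', 'btv/btv-ws-beta'] -> merged into one node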

Configuration (Top of file)

  • MAX_LEVELS: Limit the depth of the dependency tree traversal.
  • ONLY_DIRECT_DEPS_OF_OWN: Set to True to stop exploration after the first level of external dependencies.
  • OWN_PREFIXES: Namespaces considered "internal" to your organization.
  • SITE_PACKAGE_PREFIX: The prefix used to identify site packages for grouping.
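
For reference, these constants sit at the top of visualize_dependencies.py and currently default to:

MAX_LEVELS = 3
ONLY_DIRECT_DEPS_OF_OWN = True
OWN_PREFIXES = ('btv/', 'saitho/')
SITE_PACKAGE_PREFIX = 'btv/btv-ws-'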

Usage

python3 visualize_dependencies.py

Output Files:

  • dependencies.dot: Source for Graphviz.
  • dependencies.svg: Scalable vector graphic of the dependency tree.
  • dependencies.puml: PlantUML component diagram of the same graph; view with any PlantUML renderer.
  • dependencies.md: A human-readable text report showing reverse dependencies (who depends on what).
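
The SVG is rendered automatically when the Graphviz dot binary is on your PATH; the equivalent manual commands, should you want to re-render later, are roughly (assuming Graphviz and a PlantUML CLI are installed):

dot -Tsvg dependencies.dot -o dependencies.svg
plantuml dependencies.puml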

2. Upgrade Simulator (simulate_upgrade.py)

This script simulates a major TYPO3 upgrade (e.g., from v12 to v13). It identifies blockers and calculates an optimal step-by-step upgrade path.

Features

  • Compatibility Check: Checks if your current version constraints allow the TARGET_MAJOR version of TYPO3.
  • Packagist Integration: Automatically queries Packagist to see if newer, compatible versions of third-party extensions are available.
  • Phase-Based Planning:
    • Phase 1 (Preparation): Identifies extensions that must be removed because they are incompatible.
    • Phase 2 (Upgrade): Lists extensions that can be upgraded alongside the TYPO3 Core.
    • Phase 3 (Restoration): Calculates the order in which internal/incompatible extensions should be re-added, based on their own inter-dependencies (see the ordering sketch below this list).
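
Phase 3 ordering is effectively a layered topological sort: in each step, every extension whose dependencies are already restored can be re-added in parallel. A minimal sketch with hypothetical packages (the script's calculate_restoration_steps adds site-package grouping on top of this):

from collections import defaultdict

deps = defaultdict(set, {"btv/site": {"btv/base"}})  # site depends on base
remaining = {"btv/site", "btv/base"}
steps = []
while remaining:
    ready = sorted(p for p in remaining if not (deps[p] & remaining))
    if not ready:  # dependency cycle: restore whatever is left together
        steps.append(sorted(remaining))
        break
    steps.append(ready)
    remaining -= set(ready)
print(steps)  # [['btv/base'], ['btv/site']]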

Configuration (Top of file)

  • TARGET_MAJOR: The TYPO3 version you are targeting (e.g., 13).
  • INTERNAL_PREFIXES: Your organization's namespaces.
  • SITE_PACKAGE_PREFIX: Prefix for site packages to be grouped in the simulation.
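
These correspond to the constants at the top of simulate_upgrade.py:

TARGET_MAJOR = 13
INTERNAL_PREFIXES = ['btv/', 'saitho/']
SITE_PACKAGE_PREFIX = 'btv/btv-ws-'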

Usage

python3 simulate_upgrade.py

Output Files:

  • upgrade_path.mermaid: A flowchart showing the transitions through Phases 1-3.
  • upgrade_path.puml: A PlantUML activity diagram of the upgrade process.
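
The Mermaid file can be pasted into any Mermaid-aware viewer (e.g. the Mermaid Live Editor). To render a standalone image locally, an invocation along these lines should work, assuming the @mermaid-js/mermaid-cli npm package (which provides the mmdc binary) is installed:

mmdc -i upgrade_path.mermaid -o upgrade_path.svg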

Requirements

  • Python 3.6+
  • requests library (for simulate_upgrade.py)
    pip install requests
  • A valid composer.json and composer.lock in the same directory as the scripts.
simulate_upgrade.py

#!/usr/bin/env python3
import os
import re
from collections import defaultdict

import requests

from visualize_dependencies import DependencyVisualizer

# Configuration
TARGET_MAJOR = 13
INTERNAL_PREFIXES = ['btv/', 'saitho/']
SITE_PACKAGE_PREFIX = 'btv/btv-ws-'


class UpgradeSimulator:
    def __init__(self, visualizer, target_major):
        self.visualizer = visualizer
        self.target_major = target_major
        self.blockers = defaultdict(list)  # { package_name: [ (req_name, constraint), ... ] }
        self.available_fixes = {}          # { package_name: (new_version, compatible_constraint) }
        self.cache = {}
        self.restoration_deps = {}         # Dependencies between Phase 3 packages
        self.site_package_members = []

    def is_compatible(self, constraint, target_major):
        """Heuristic to check if a composer constraint explicitly allows the target major version."""
        if not constraint:
            return True
        supported_majors = re.findall(r'(?:\^|~|>=|>|==|\|\||^|\s)(\d+)', constraint)
        if str(target_major) in supported_majors:
            return True
        if any(x in constraint for x in ['dev-master', 'dev-main', '*']):
            return True
        return False

    def is_internal(self, package_name):
        """Checks if a package is considered internal based on configured prefixes."""
        if package_name == "Site Packages":
            return True
        return any(package_name.startswith(prefix) for prefix in INTERNAL_PREFIXES)

    def fetch_compatible_version_from_packagist(self, package_name):
        """Queries Packagist to find a version of the package compatible with the target major."""
        if package_name.startswith('typo3/cms-') or package_name == 'typo3/minimal':
            return f"v{self.target_major}.0.0", f"^{self.target_major}"
        if self.is_internal(package_name):
            return None
        if package_name in self.cache:
            return self.cache[package_name]
        print(f"  Searching Packagist for {package_name}...")
        url = f"https://packagist.org/packages/{package_name}.json"
        try:
            response = requests.get(url, timeout=10)
            if response.status_code == 200:
                versions = response.json().get('package', {}).get('versions', {})
                for version, v_data in versions.items():
                    # Skip unstable releases
                    if any(x in version.lower() for x in ['dev', 'alpha', 'beta']):
                        continue
                    reqs = v_data.get('require', {})
                    for r_name, r_constr in reqs.items():
                        if r_name == 'typo3/cms-core' or r_name.startswith('typo3/cms-'):
                            if self.is_compatible(r_constr, self.target_major):
                                self.cache[package_name] = (version, r_constr)
                                return version, r_constr
        except Exception:
            pass
        self.cache[package_name] = None
        return None

    def calculate_restoration_steps(self, edges, restore_targets):
        """Calculates dependency-based steps for Phase 3 restoration and returns the dep map."""
        # We need to map edges to handle the "Site Packages" group
        mapped_deps = defaultdict(set)

        def get_display_name(pkg):
            if pkg.startswith(SITE_PACKAGE_PREFIX):
                return "Site Packages"
            return pkg

        for parent, child, _, _ in edges:
            p_display = get_display_name(parent)
            c_display = get_display_name(child)
            if p_display in restore_targets and c_display in restore_targets:
                if p_display != c_display:
                    mapped_deps[p_display].add(c_display)
        self.restoration_deps = mapped_deps
        steps = []
        remaining = set(restore_targets)
        while remaining:
            # A package is ready once none of its dependencies are still waiting
            current_step = sorted([p for p in remaining if not (mapped_deps[p] & remaining)])
            if not current_step:
                # Dependency cycle: restore everything that is left in one step
                steps.append(sorted(list(remaining)))
                break
            steps.append(current_step)
            remaining -= set(current_step)
        return steps

    def simulate(self):
        print(f"--- TYPO3 v{self.target_major} Upgrade Simulation ---")
        levels, edges, visited = self.visualizer.traverse_dependencies()
        for parent, child, is_missing, constraint in edges:
            if child.startswith('typo3/cms-') or child == 'typo3/minimal':
                if not self.is_compatible(constraint, self.target_major):
                    self.blockers[parent].append((child, constraint))
        if self.blockers:
            print(f"Checking for compatible updates for {len(self.blockers)} blockers...")
            for pkg_name in sorted(self.blockers.keys()):
                if pkg_name == self.visualizer.main_root:
                    continue
                fix = self.fetch_compatible_version_from_packagist(pkg_name)
                if fix:
                    self.available_fixes[pkg_name] = fix
        # Categorize blockers and group site packages
        upgradable = []
        incompatible = []
        has_site_packages = False
        for pkg in sorted(self.blockers.keys()):
            if pkg == self.visualizer.main_root or pkg.startswith('typo3/'):
                continue
            if pkg.startswith(SITE_PACKAGE_PREFIX):
                self.site_package_members.append(pkg)
                has_site_packages = True
                continue
            if pkg in self.available_fixes:
                upgradable.append(pkg)
            else:
                incompatible.append(pkg)
        if has_site_packages:
            incompatible.append("Site Packages")
        restoration_steps = self.calculate_restoration_steps(edges, incompatible)
        self.generate_upgrade_path_mermaid(upgradable, incompatible, restoration_steps)
        self.generate_upgrade_path_puml(upgradable, incompatible, restoration_steps)

    def generate_upgrade_path_mermaid(self, upgradable, incompatible, restoration_steps):
        """Generates an optimal upgrade path as a Mermaid diagram."""
        lines = ["graph TD", f'    title[Optimal Upgrade Path to TYPO3 v{self.target_major}]']
        # Phase 1: Preparation
        lines.append('    subgraph Phase_1 ["Phase 1: Preparation (Cleanup)"]')
        removals = []
        for pkg in sorted(incompatible):
            safe_id = self.visualizer.get_safe_id(pkg) + "_rem"
            label = f"Remove {pkg}"
            if pkg == "Site Packages":
                label = f"Remove Site Packages ({len(self.site_package_members)} extensions)"
            lines.append(f'        {safe_id}["{label}"]')
            removals.append(safe_id)
        if not removals:
            lines.append('        no_removals["No extensions need removal"]')
            removals = ["no_removals"]
        lines.append('    end')
        # Phase 2: Upgrade
        lines.append('    subgraph Phase_2 ["Phase 2: Upgrade"]')
        core_upgrade = "core_up_node"
        lines.append(f'        {core_upgrade}["Upgrade TYPO3 Core to v{self.target_major}"]')
        for pkg in sorted(upgradable):
            safe_id = self.visualizer.get_safe_id(pkg) + "_up"
            new_v, _ = self.available_fixes[pkg]
            lines.append(f'        {safe_id}["Upgrade {pkg} to {new_v}"]')
            lines.append(f'        {safe_id} --> {core_upgrade}')
        lines.append('    end')
        # Phase 3: Restoration
        first_step_ids = []
        for i, step_pkgs in enumerate(restoration_steps):
            step_num = i + 1
            lines.append(f'    subgraph Phase_3_Step_{step_num} ["Phase 3: Restoration - Step {step_num}"]')
            for pkg in step_pkgs:
                safe_id = self.visualizer.get_safe_id(pkg) + "_rest"
                if pkg == "Site Packages":
                    label = "Re-add Site Packages (Manual fix required)"
                else:
                    label = f"Re-add {pkg} (Manual fix required)" if self.is_internal(pkg) else f"Re-add {pkg} (Wait for release)"
                lines.append(f'        {safe_id}["{label}"]')
                if i == 0:
                    first_step_ids.append(safe_id)
            lines.append('    end')
        for f_id in first_step_ids:
            lines.append(f'    {core_upgrade} --> {f_id}')
        for pkg, target_deps in self.restoration_deps.items():
            parent_id = self.visualizer.get_safe_id(pkg) + "_rest"
            for dep_pkg in target_deps:
                child_id = self.visualizer.get_safe_id(dep_pkg) + "_rest"
                lines.append(f'    {child_id} -- "dependency" --> {parent_id}')
        for r in removals:
            lines.append(f'    {r} --> {core_upgrade}')
        # Styling
        lines.append('    style Phase_1 fill:#fff1f1,stroke:#d9534f')
        lines.append('    style Phase_2 fill:#f1f9ff,stroke:#0275d8,stroke-width:4px')
        for pkg in sorted(upgradable):
            lines.append(f'    style {self.visualizer.get_safe_id(pkg)}_up fill:#3399ff,color:white')
        for pkg in sorted(incompatible):
            if self.is_internal(pkg):
                lines.append(f'    style {self.visualizer.get_safe_id(pkg)}_rest fill:#ff9900')
        output_path = os.path.join(self.visualizer.base_path, 'upgrade_path.mermaid')
        with open(output_path, 'w', encoding='utf-8') as f:
            f.write("\n".join(lines))
        print(f"\nMermaid Upgrade Path generated: {output_path}")

    def generate_upgrade_path_puml(self, upgradable, incompatible, restoration_steps):
        """Generates an optimal upgrade path as a PlantUML activity diagram."""
        lines = ["@startuml", f"title Optimal Upgrade Path to TYPO3 v{self.target_major}", ""]
        lines.append('partition "Phase 1: Preparation (Cleanup)" {')
        if not incompatible:
            lines.append('    :No extensions need removal;')
        for pkg in sorted(incompatible):
            if pkg == "Site Packages":
                lines.append(f'    :Remove Site Packages ({len(self.site_package_members)} extensions);')
            else:
                lines.append(f'    :Remove {pkg};')
        lines.append("}")
        lines.append('partition "Phase 2: Upgrade" {')
        if upgradable:
            lines.append('    split')
            for idx, pkg in enumerate(sorted(upgradable)):
                if idx > 0:
                    lines.append('    split again')
                new_v, _ = self.available_fixes[pkg]
                lines.append(f'    #lightskyblue:Upgrade {pkg} to {new_v};')
            lines.append('    split again')
            lines.append(f'    #lightblue:Upgrade TYPO3 Core to v{self.target_major};')
            lines.append('    endsplit')
        else:
            lines.append(f'    #lightblue:Upgrade TYPO3 Core to v{self.target_major};')
        lines.append("}")
        for i, step_pkgs in enumerate(restoration_steps):
            lines.append(f'partition "Phase 3: Restoration - Step {i + 1}" {{')
            if len(step_pkgs) > 1:
                lines.append('    split')
                for idx, pkg in enumerate(step_pkgs):
                    if idx > 0:
                        lines.append('    split again')
                    self._add_puml_restoration_line(lines, pkg)
                lines.append('    endsplit')
            else:
                self._add_puml_restoration_line(lines, step_pkgs[0])
            lines.append("}")
        lines.append("@enduml")
        output_path = os.path.join(self.visualizer.base_path, 'upgrade_path.puml')
        with open(output_path, 'w', encoding='utf-8') as f:
            f.write("\n".join(lines))
        print(f"PlantUML Upgrade Path generated: {output_path}")

    def _add_puml_restoration_line(self, lines, pkg):
        if pkg == "Site Packages":
            lines.append('    #orange:Re-add Site Packages (Manual fix required);')
        elif self.is_internal(pkg):
            lines.append(f'    #orange:Re-add {pkg} (Manual fix required);')
        else:
            lines.append(f'    :Re-add {pkg} (Wait for release);')


if __name__ == "__main__":
    visualizer = DependencyVisualizer(os.path.dirname(os.path.abspath(__file__)))
    UpgradeSimulator(visualizer, TARGET_MAJOR).simulate()
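
The is_compatible heuristic above deliberately errs on the permissive side: it extracts major version numbers that follow common composer operators and waves wildcard/dev constraints through. A quick sanity-check sketch (the visualizer argument is unused by this method, so None suffices):

from simulate_upgrade import UpgradeSimulator

sim = UpgradeSimulator(visualizer=None, target_major=13)
assert sim.is_compatible("^12 || ^13", 13)  # 13 explicitly allowed
assert not sim.is_compatible("^12", 13)     # pinned to v12 -> counted as a blocker
assert sim.is_compatible("dev-main", 13)    # dev branches are assumed compatible
assert sim.is_compatible(None, 13)          # no constraint -> no objection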
visualize_dependencies.py

#!/usr/bin/env python3
import glob
import json
import os
import re
import subprocess
import sys
from collections import defaultdict
from datetime import datetime

# Configuration: Set to an integer to limit depth, or None for unlimited
MAX_LEVELS = 3
# If True, only show dependencies that are directly required by our own packages (btv/* or local)
ONLY_DIRECT_DEPS_OF_OWN = True
# Prefix configuration for package filtering
OWN_PREFIXES = ('btv/', 'saitho/')
# Prefix for site packages that should be grouped if they have identical dependencies
SITE_PACKAGE_PREFIX = 'btv/btv-ws-'


class DependencyVisualizer:
    def __init__(self, base_path, max_levels=None, only_direct_own=False):
        self.base_path = base_path
        self.max_levels = max_levels
        self.only_direct_own = only_direct_own
        self.packages_dir = os.path.join(base_path, 'packages')
        self.lock_data = self._load_json('composer.lock')
        if not self.lock_data:
            print("Error: composer.lock not found. Please run 'composer install' first.")
            sys.exit(1)
        # Map package name to its version and dependency requirements from lock file
        all_locked_packages = self.lock_data.get('packages', []) + self.lock_data.get('packages-dev', [])
        self.package_map = {pkg['name']: pkg for pkg in all_locked_packages}
        self.version_map = {pkg['name']: pkg['version'] for pkg in all_locked_packages}
        # Track missing dependencies, their dependents and the required version constraints
        self.missing_report = defaultdict(dict)
        # Identify local packages in the packages/ directory
        self.root_packages = self._find_local_packages()
        # Include the main project composer.json as a root
        main_composer = self._load_json('composer.json')
        root_name = main_composer.get('name', 'root')
        self.root_packages[root_name] = main_composer.get('require', {})
        self.version_map[root_name] = 'local'
        self.main_root = root_name
        self._group_site_packages()

    def _group_site_packages(self):
        """Combines packages with the same SITE_PACKAGE_PREFIX and identical dependencies into one node."""
        if not SITE_PACKAGE_PREFIX:
            return
        # Identify all packages matching the prefix
        matching_packages = {}
        all_package_names = set(list(self.root_packages.keys()) + list(self.package_map.keys()))
        for name in all_package_names:
            if name.startswith(SITE_PACKAGE_PREFIX):
                # Get dependencies (excluding PHP/ext-, as they usually don't affect site package structure)
                deps = {}
                if name in self.root_packages:
                    deps = self.root_packages[name]
                elif name in self.package_map:
                    deps = self.package_map[name].get('require', {})
                # Filter out PHP and extensions for comparison
                filtered_deps = {k: v for k, v in deps.items() if k != 'php' and not k.startswith('ext-')}
                # Create a stable key from sorted dependencies
                dep_key = json.dumps(filtered_deps, sort_keys=True)
                matching_packages.setdefault(dep_key, []).append(name)
        if not matching_packages:
            return
        # Find the majority dependency set (the one that most packages share).
        # We only want to create ONE merged group for the "standard" case.
        majority_dep_key = max(matching_packages, key=lambda k: len(matching_packages[k]))
        names = matching_packages[majority_dep_key]
        if len(names) <= 1:
            return
        name_map = {}  # Maps old package name to new aggregated name
        # Create an aggregated name, e.g. "btv/btv-ws-* (12 packages)"
        new_name = f"{SITE_PACKAGE_PREFIX}* ({len(names)} packages)"
        is_root = False
        # Merge dependencies (take from first package as they are identical)
        first_name = names[0]
        if first_name in self.root_packages:
            merged_deps = self.root_packages[first_name].copy()
        else:
            merged_deps = self.package_map[first_name].get('require', {}).copy()
        for name in names:
            name_map[name] = new_name
            if name in self.root_packages:
                is_root = True
            # Cleanup original entries
            if name in self.root_packages:
                del self.root_packages[name]
            if name in self.package_map:
                del self.package_map[name]
            if name in self.version_map:
                del self.version_map[name]
            # Update main root reference if it was one of these packages
            if name == self.main_root:
                self.main_root = new_name
        # Register the new combined package
        self.version_map[new_name] = 'merged'
        if is_root:
            self.root_packages[new_name] = merged_deps
        else:
            self.package_map[new_name] = {'name': new_name, 'version': 'merged', 'require': merged_deps}
        # Update all dependency references to point to the new aggregated names
        self._update_all_dependency_references(name_map)

    def _update_all_dependency_references(self, name_map):
        """Updates all 'require' dictionaries to use the new mapped names."""
        if not name_map:
            return

        def update_deps(deps):
            new_deps = {}
            for dep_name, constraint in deps.items():
                target_name = name_map.get(dep_name, dep_name)
                new_deps[target_name] = constraint
            return new_deps

        # Update root packages
        for name in list(self.root_packages.keys()):
            self.root_packages[name] = update_deps(self.root_packages[name])
        # Update packages in package_map
        for name in list(self.package_map.keys()):
            if 'require' in self.package_map[name]:
                self.package_map[name]['require'] = update_deps(self.package_map[name]['require'])

    def _load_json(self, filename):
        path = os.path.join(self.base_path, filename)
        if not os.path.exists(path):
            return {}
        with open(path, 'r', encoding='utf-8') as f:
            try:
                return json.load(f)
            except json.JSONDecodeError:
                return {}

    def _find_local_packages(self):
        """Finds all composer.json files in the packages directory and extracts their info."""
        local_roots = {}
        pattern = os.path.join(self.packages_dir, '*', 'composer.json')
        for composer_path in glob.glob(pattern):
            try:
                with open(composer_path, 'r', encoding='utf-8') as f:
                    data = json.load(f)
                name = data.get('name')
                if name:
                    local_roots[name] = data.get('require', {})
                    self.version_map[name] = 'local'
            except Exception as e:
                print(f"Warning: Could not read {composer_path}: {e}")
        return local_roots

    def is_relevant(self, name):
        """Filters for TYPO3 or project-specific packages to keep graphs clean."""
        return name.startswith('typo3/') or name.startswith(OWN_PREFIXES) or name in self.root_packages

    def is_own_package(self, name):
        """Checks if a package is considered 'ours' (local or matching OWN_PREFIXES)."""
        return name.startswith(OWN_PREFIXES) or name in self.root_packages

    def get_safe_id(self, name):
        """Generates a safe identifier for a package name across different diagram types."""
        # Collapse every non-alphanumeric character (slashes, dashes, dots, spaces,
        # and the '*'/parentheses of merged group names) into underscores
        return re.sub(r'[^0-9A-Za-z_]', '_', name)

    def traverse_dependencies(self):
        """Standard BFS traversal starting from root packages."""
        levels = {0: list(self.root_packages.keys())}
        visited = set(self.root_packages.keys())
        queue = []
        edges = []  # List of tuples: (parent, child, is_missing, constraint)
        # Initialize queue with root packages
        for name, deps in self.root_packages.items():
            queue.append((name, deps, 1))
        while queue:
            parent_name, deps, level = queue.pop(0)
            if self.max_levels is not None and level > self.max_levels:
                continue
            for dep_name, constraint in deps.items():
                if dep_name == 'php' or dep_name.startswith('ext-'):
                    continue
                if not self.is_relevant(dep_name) and not self.is_relevant(parent_name):
                    continue
                is_missing = dep_name not in self.version_map
                if is_missing:
                    self.missing_report[dep_name][parent_name] = constraint
                edge = (parent_name, dep_name, is_missing, constraint)
                if edge not in edges:
                    edges.append(edge)
                if dep_name not in visited:
                    # Determine if we should explore this dependency's own dependencies
                    can_explore = True
                    if self.only_direct_own and not self.is_own_package(dep_name):
                        can_explore = False
                    visited.add(dep_name)
                    levels.setdefault(level, []).append(dep_name)
                    if not is_missing and can_explore:
                        pkg_data = self.package_map.get(dep_name)
                        if pkg_data and 'require' in pkg_data:
                            queue.append((dep_name, pkg_data['require'], level + 1))
        return levels, edges, visited

    def generate_dot(self, levels, edges):
        lines = ["digraph dependencies {", "  rankdir=LR;", "  node [shape=box, fontname=\"Arial\"];"]
        for lv in sorted(levels.keys()):
            lines.append(f'  subgraph cluster_level_{lv} {{')
            lines.append(f'    label = "Level {lv}"; style = dashed; color = grey;')
            for name in levels[lv]:
                version = self.version_map.get(name)
                attrs = []
                if name in self.root_packages:
                    attrs.append('style=filled, fillcolor=yellow')
                elif version is None:
                    attrs.append('style=filled, fillcolor="#ffcccc", color="#cc0000"')
                label_text = f"{name}\\n({version if version else 'MISSING'})"
                if version is None:
                    for parent, constraint in sorted(self.missing_report[name].items()):
                        label_text += f"\\nReq by {parent}: {constraint}"
                attrs.append(f'label="{label_text}"')
                lines.append(f'    "{name}" [{", ".join(attrs)}];')
            lines.append('  }')
        for parent, child, is_missing, constraint in edges:
            if is_missing:
                attr = ' [color=red]'
            else:
                attr = f' [label="{constraint}"]'
            lines.append(f'  "{parent}" -> "{child}"{attr};')
        lines.append("}")
        return "\n".join(lines)

    def generate_puml(self, levels, edges, visited):
        """Generates a PlantUML component diagram."""
        lines = ["@startuml", "skinparam componentStyle uml2", ""]
        # Style definitions
        lines.append("skinparam component {")
        lines.append("  BackgroundColor<<ROOT>> Yellow")
        lines.append("  BackgroundColor<<MISSING>> #ffcccc")
        lines.append("  BorderColor<<MISSING>> Red")
        lines.append("}")
        lines.append("")
        # Components
        for name in sorted(visited):
            safe_id = self.get_safe_id(name)
            version = self.version_map.get(name)
            stereotype = ""
            if name in self.root_packages:
                stereotype = "<<ROOT>>"
            elif version is None:
                stereotype = "<<MISSING>>"
            label = name
            if version:
                label += f" ({version})"
            elif name in self.missing_report:
                label += " (MISSING)"
            lines.append(f'[{label}] as {safe_id} {stereotype}')
        lines.append("")
        # Relations
        for parent, child, is_missing, constraint in edges:
            p_id = self.get_safe_id(parent)
            c_id = self.get_safe_id(child)
            arrow = "-[#red]->" if is_missing else "-->"
            lines.append(f'{p_id} {arrow} {c_id} : "{constraint}"')
        lines.append("")
        lines.append("@enduml")
        return "\n".join(lines)

    def generate_markdown(self, levels, edges):
        """Generates a Markdown report of the dependencies."""
        now = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
        lines = [
            "# Dependency Report",
            "",
            "> [!NOTE]",
            f"> This file was automatically generated on {now}.",
            "",
            "## Explanation",
            "This report visualizes the dependency tree of the project's composer packages.",
            "",
            "- **Levels**: Level 0 represents the root project and local packages. Higher levels represent deeper dependencies.",
            "- **Filtering**: The report is filtered to show only relevant packages (TYPO3 core or project-specific namespaces).",
            f"- **Grouping**: Packages starting with `{SITE_PACKAGE_PREFIX}` that share identical dependencies are merged into single nodes to keep the report concise. Outliers with unique dependencies are kept separate.",
            "- **Direct Dependencies**: If configured, only direct dependencies of our own packages are explored.",
            ""
        ]
        lines.append("## Packages by Level")
        for lv in sorted(levels.keys()):
            lines.append(f"### Level {lv}")
            for name in sorted(levels[lv]):
                version = self.version_map.get(name, "MISSING").replace('|', '\\|')
                lines.append(f"- **{name}** (`{version}`)")
            lines.append("")
        lines.append("## Reverse Dependencies (Dependents)")
        lines.append("| Package | Required By | Constraint |")
        lines.append("| --- | --- | --- |")
        # Sort by child (the package being depended upon)
        for parent, child, is_missing, constraint in sorted(edges, key=lambda x: (x[1], x[0])):
            safe_constraint = constraint.replace('|', '\\|')
            lines.append(f"| {child} | {parent} | `{safe_constraint}` |")
        return "\n".join(lines)

    def run(self):
        levels, edges, visited = self.traverse_dependencies()
        # DOT
        dot_path = os.path.join(self.base_path, 'dependencies.dot')
        with open(dot_path, 'w', encoding='utf-8') as f:
            f.write(self.generate_dot(levels, edges))
        # SVG (via DOT)
        svg_path = os.path.join(self.base_path, 'dependencies.svg')
        try:
            subprocess.run(['dot', '-Tsvg', dot_path, '-o', svg_path], check=True)
        except Exception as e:
            print(f"Warning: Could not generate SVG via 'dot' command: {e}")
        # PlantUML
        with open(os.path.join(self.base_path, 'dependencies.puml'), 'w', encoding='utf-8') as f:
            f.write(self.generate_puml(levels, edges, visited))
        # Markdown
        with open(os.path.join(self.base_path, 'dependencies.md'), 'w', encoding='utf-8') as f:
            f.write(self.generate_markdown(levels, edges))
        print(f"\nSuccess! Analyzed {len(self.root_packages)} root package(s).")
        if self.missing_report:
            print("\n!!! MISSING DEPENDENCIES DETECTED !!!")
            for pkg, dependents in sorted(self.missing_report.items()):
                print(f"  - {pkg}")
                for dep, constraint in sorted(dependents.items()):
                    print(f"    Required by: {dep} ({constraint})")
        print(f"\nFiles generated in {self.base_path}:")
        print("1. dependencies.dot (Source for Graphviz)")
        print("2. dependencies.svg (Vector graphic)")
        print("3. dependencies.puml (View with PlantUML)")
        print("4. dependencies.md (Markdown report)")


if __name__ == "__main__":
    visualizer = DependencyVisualizer(
        os.path.dirname(os.path.abspath(__file__)),
        MAX_LEVELS,
        ONLY_DIRECT_DEPS_OF_OWN
    )
    visualizer.run()
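
DependencyVisualizer can also be driven programmatically instead of via the __main__ block. A minimal sketch (the path is a placeholder; the directory must contain composer.json and composer.lock):

from visualize_dependencies import DependencyVisualizer

viz = DependencyVisualizer("/path/to/project", max_levels=2, only_direct_own=True)
levels, edges, visited = viz.traverse_dependencies()  # inspect the graph in memory
viz.run()  # or write all four report files in one go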