Initial setup: Claude Code + n8n MCP integration

- CLAUDE.md with project docs and architecture
- n8n-mcp config example (.mcp.json.example)
- 7 gandalf skills (jump, hai, docker, status, remote-env, orchestrate, skills)
- hai-infra + hai-tasks MCP server connections
- .gitignore for secrets

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Author: hofmann
Date: 2026-03-01 20:59:10 +00:00
Parent: c4e93800de
Commit: 4654f87bb8
10 changed files with 428 additions and 0 deletions

@@ -0,0 +1,51 @@
---
description: Docker container management (local or remote)
argument-hint: [ps|logs|restart] [container] [@host]
---
# Docker Management
Manage Docker containers locally or on remote hosts.
## Commands
| Command | Description |
|---------|-------------|
| `ps` | List running containers |
| `logs <name>` | Show container logs |
| `restart <name>` | Restart container |
| `stop <name>` | Stop container |
| `start <name>` | Start container |
## Remote Hosts
Append `@host` to run on a remote host:
- `@hai` → hofmanns.ai (100.64.0.1)
- `@git` → git.hofmanns.tech (100.64.0.6)
- `@rtx` → local RTX machine
## Instructions
1. Parse: $ARGUMENTS
2. If `@host` suffix, run via SSH:
```bash
ssh user@host 'docker <command>'
```
3. Otherwise run locally:
```bash
docker <command>
```
4. Format output:
```
🐳 DOCKER STATUS [@hai]
┌────────────────┬─────────────┬────────┐
│ Container │ Status │ Ports │
├────────────────┼─────────────┼────────┤
│ caddy │ 🟢 Up 5d │ 80,443 │
│ synapse │ 🟢 Up 5d │ 8008 │
│ authentik │ 🟢 Up 5d │ 9000 │
└────────────────┴─────────────┴────────┘
```
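Steps 1–3 above can be sketched as a small POSIX-sh helper (`docker_skill` is a hypothetical name; it echoes the command it would run instead of executing it):

```shell
# Sketch of the /docker dispatch: split an optional @host suffix off the
# arguments, then build either a local or an SSH-wrapped docker command.
docker_skill() {
  host=""
  args=""
  for a in "$@"; do
    case "$a" in
      @*) host="${a#@}" ;;       # e.g. @hai -> remote host "hai"
      *)  args="$args $a" ;;
    esac
  done
  if [ -n "$host" ]; then
    echo "ssh $host 'docker$args'"   # remote: run via SSH
  else
    echo "docker$args"               # local
  fi
}
```

For example, `docker_skill logs caddy @hai` yields `ssh hai 'docker logs caddy'`.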

@@ -0,0 +1,18 @@
# SSH to hofmanns.ai
Connect to the hofmanns.ai server via SSH.
## Usage
Run: `ssh hai` or `ssh hofmanns.ai`
## Server Details
- Host: hofmanns.ai
- User: ubuntu
- Key: ~/.ssh/hofmanns_ai
## Instructions
When this skill is invoked, SSH into the server and help the user with their task on the remote machine.
```bash
ssh hai
```

@@ -0,0 +1,43 @@
---
description: SSH jump to remote hosts (hai, git, slate, rtx, jet)
argument-hint: <target> [command]
---
# Jump to Remote Host
Quick SSH access to Gandalf infrastructure nodes.
## Targets
| Target | Host | IP |
|--------|------|-----|
| `hai` | ubuntu@hofmanns.ai | 100.64.0.1 |
| `git` | ubuntu@git.hofmanns.tech | 100.64.0.6 |
| `slate` | root@192.168.10.1 | 100.64.0.7 |
| `rtx` | d@192.168.10.236 | local |
| `jet` | d@jet | offline |
| `cloud` | d@cloud | offline |
## Instructions
1. Parse the target from argument: $ARGUMENTS
2. Map target to SSH command:
- `hai`, `hofmanns` → `ssh ubuntu@100.64.0.1`
- `git`, `gitea` → `ssh ubuntu@100.64.0.6`
- `slate` → `ssh root@100.64.0.7`
- `rtx` → `ssh d@192.168.10.236`
- `jet`, `jetson` → `ssh d@jet`
- `cloud` → `ssh d@cloud`
3. If second argument provided, run as remote command:
```bash
ssh user@host 'command'
```
4. Otherwise open interactive session or show host info
5. Display result with status icons:
- 🟢 connected
- 🔴 failed
- ⚪ offline/unreachable
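The mapping in step 2 can be sketched as a case statement (hypothetical `jump_target` helper; it prints the SSH command rather than executing it, hosts taken from the table above):

```shell
# Sketch of the /jump target-to-SSH-command mapping.
jump_target() {
  case "$1" in
    hai|hofmanns) echo "ssh ubuntu@100.64.0.1" ;;
    git|gitea)    echo "ssh ubuntu@100.64.0.6" ;;
    slate)        echo "ssh root@100.64.0.7" ;;
    rtx)          echo "ssh d@192.168.10.236" ;;
    jet|jetson)   echo "ssh d@jet" ;;
    cloud)        echo "ssh d@cloud" ;;
    *)            echo "unknown target: $1" >&2; return 1 ;;
  esac
}
```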

@@ -0,0 +1,66 @@
---
description: Orchestrate tasks via n8n workflows and Slack notifications
argument-hint: <workflow|notify|status> [message]
---
# Orchestration Hub
Trigger n8n workflows and send Slack notifications.
## Endpoints
- **n8n**: https://n8n.hofmanns.app (also n8n.hofmanns.ai)
- **Slack webhook**: https://slack.hofmanns.tech
## Instructions
### If argument is `notify <message>`:
Send a Slack notification:
```bash
curl -X POST https://slack.hofmanns.tech/webhook \
-H "Content-Type: application/json" \
-d '{"text": "<message>", "channel": "#claude"}'
```
### If argument is `status`:
Check n8n and slack health:
```bash
# Check n8n
curl -s https://n8n.hofmanns.app/healthz
# Check slack webhook
curl -s -o /dev/null -w "%{http_code}" https://slack.hofmanns.tech/health
```
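The status branch can be wrapped in a small helper that turns each probe into a dashboard-style status line (a sketch, assuming both endpoints return HTTP 200 when healthy; `check_endpoint` is a hypothetical name):

```shell
# Probe a health URL and print a status line for the dashboard.
check_endpoint() {
  url="$1"; label="$2"
  code=$(curl -s -o /dev/null -w "%{http_code}" --max-time 5 "$url")
  if [ "$code" = "200" ]; then
    echo "$label 🟢 healthy"
  else
    echo "$label 🔴 unreachable (HTTP $code)"
  fi
}
# Examples:
#   check_endpoint https://n8n.hofmanns.app/healthz "📡 n8n.hofmanns.app"
#   check_endpoint https://slack.hofmanns.tech/health "💬 slack.hofmanns.tech"
```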
### If argument is `workflow <name>`:
Trigger an n8n workflow by webhook:
```bash
curl -X POST "https://n8n.hofmanns.app/webhook/<workflow-name>" \
-H "Content-Type: application/json" \
-d '{"triggered_by": "claude", "timestamp": "'$(date -Iseconds)'"}'
```
### If argument is `list`:
List available workflows (if the n8n API is accessible).
### Display format:
```
┌─────────────────────────────────────────────────────────┐
│ 🎭 ORCHESTRATION │
├─────────────────────────────────────────────────────────┤
│ 📡 n8n.hofmanns.app 🟢 healthy │
│ 💬 slack.hofmanns.tech 🟢 connected │
├─────────────────────────────────────────────────────────┤
│ Available workflows: │
│ • backup-trigger Daily backup orchestration │
│ • deploy-hai Deploy to hai server │
│ • notify-status Send status to all channels │
└─────────────────────────────────────────────────────────┘
```
## Integration Examples
Combine with other skills:
- `/remote-env` → check status → `/orchestrate notify "hai is down!"`
- Deploy workflow → `/orchestrate workflow deploy-hai`

@@ -0,0 +1,66 @@
# Remote Environment
## All Devices
### Headscale (self-hosted)
| Host | IP | Type | Status |
|------|-----|------|--------|
| hai | 100.64.0.1 | Server | ✅ |
| rtx | 100.64.0.2 | RTX 4090 | ❌ migrate |
| git | 100.64.0.6 | Forgejo | ✅ |
| slate | 100.64.0.7 | Tablet | ❌ offline |
### Tailscale (to migrate)
| Device | IP | Type |
|--------|-----|------|
| d-game-win11 | 100.124.77.127 | Windows RTX |
| gl-be3600 | 100.85.97.9 | Router |
| glkvm | 100.72.138.24 | KVM |
| chromecast | 100.89.150.57 | Chromecast |
| lenovo-dee | 100.109.245.3 | Laptop |
| lenovo-tb330fu | 100.70.237.126 | Tablet |
| lenovo-yt-x705f | 100.83.86.102 | Tablet |
| nothing-a063 | 100.93.194.91 | Phone |
| nothing-a063-1 | 100.76.147.31 | Phone |
## SSH
```bash
ssh hai # ubuntu@hofmanns.ai
ssh rtx # d@100.64.0.2
ssh git # git@100.64.0.6
ssh router # root@100.85.97.9
ssh kvm # d@100.72.138.24
ssh jetson # d@100.72.246.11
```
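The aliases above assume matching entries in `~/.ssh/config`. A minimal sketch of the first three (the `IdentityFile` beyond `hai` and the exact option set are illustrative, not confirmed):

```shell
# Sketch: generate the ~/.ssh/config entries the aliases above assume.
gen_ssh_config() {
  cat <<'EOF'
Host hai
    HostName hofmanns.ai
    User ubuntu
    IdentityFile ~/.ssh/hofmanns_ai

Host rtx
    HostName 100.64.0.2
    User d

Host git
    HostName 100.64.0.6
    User git
EOF
}
# Append once:  gen_ssh_config >> ~/.ssh/config
```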
## Services
### hai (public)
| URL | Service |
|-----|---------|
| git.hofmanns.ai | Forgejo |
| auth.hofmanns.ai | Authentik |
| headscale.hofmanns.app | Headscale |
| n8n.hofmanns.app | n8n |
### rtx (tunneled)
| URL | Service |
|-----|---------|
| ewa.hofmanns.ltd | EWA |
| rgb.hofmanns.ltd | RGB Chaos |
| rtx.hofmanns.ltd | Frigate |
## Docker on hai
```bash
ssh hai "docker ps --format 'table {{.Names}} {{.Status}}'"
```
| Container | Purpose |
|-----------|---------|
| element | Matrix client |
| synapse | Matrix server |
| gitea | Forgejo git |
| headscale | VPN control |
| n8n | Automation |
| authentik | Auth |
| mytube-* | YouTube tools |

@@ -0,0 +1,42 @@
---
description: List all available Claude Code skills and commands
argument-hint: [skill-name]
---
# Skills Directory
List and describe available custom skills.
## Instructions
1. Scan `~/.claude/commands/*.md` for skills
2. Display skills:
```
┌─────────────────────────────────────────────────────────────┐
│ 🎯 GANDALF SKILLS │
├─────────────────────────────────────────────────────────────┤
│ 🔗 REMOTE ACCESS │
│ ├─ /jump <host> SSH to hai, git, slate, rtx │
│ ├─ /hai [cmd] Quick commands for hofmanns.ai │
│ ├─ /remote-env Show all remote host status │
│ └─ /status Full infrastructure dashboard │
├─────────────────────────────────────────────────────────────┤
│ 🐳 SERVICES │
│ ├─ /docker [cmd] Container management (local/remote)│
│ ├─ /ollama [model] LLM inference on RTX 4090 │
│ └─ /orchestrate n8n workflows & Slack │
├─────────────────────────────────────────────────────────────┤
│ 🎮 DEVICES │
│ ├─ /tv [cmd] Chromecast TV control │
│ └─ /rgb [profile] RGB lighting control │
├─────────────────────────────────────────────────────────────┤
│ ⚙️ META │
│ ├─ /skills This list │
│ ├─ /permissions Manage Claude permissions │
│ └─ /agents Show available Task agents │
└─────────────────────────────────────────────────────────────┘
```
3. If a skill name is provided, read that skill's `.md` file and show detailed help
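Step 1 can be sketched as a loop over the command files, pulling each skill's `description:` line out of its front matter (`list_skills` is a hypothetical helper; the directory defaults to the path given above):

```shell
# Sketch: list each skill file with its front-matter description.
list_skills() {
  dir="${1:-$HOME/.claude/commands}"
  for f in "$dir"/*.md; do
    [ -e "$f" ] || continue
    name=$(basename "$f" .md)
    desc=$(sed -n 's/^description: *//p' "$f" | head -n 1)
    printf '/%s\t%s\n' "$name" "$desc"
  done
}
```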

@@ -0,0 +1,31 @@
# Status
When the user says "status", "check", or "what's running", RUN these commands:
```bash
# Run all at once
echo "=== HEADSCALE ===" && ssh hai "docker exec headscale headscale nodes list" && echo "" \
  && echo "=== DOCKER ===" && ssh hai "docker ps --format 'table {{.Names}} {{.Status}}'" && echo "" \
  && echo "=== TUNNEL ===" && ssh hai "ss -tlnp | grep -E '8042|5000'" && echo "" \
  && echo "=== LOCAL ===" && tailscale status
```
## Expected Results
### Headscale
| Host | IP | Expected |
|------|-----|----------|
| hai | 100.64.0.1 | online |
| rtx | 100.64.0.2 | migrate! |
| git | 100.64.0.6 | online |
| slate | 100.64.0.7 | offline |
### Docker
| Container | Expected |
|-----------|----------|
| element | healthy |
| synapse | healthy |
| gitea | up |
| headscale | up |
| n8n | up |
| authentik | healthy |
### Tunnel
Ports 5000 (Frigate) and 8042 (EWA) should be in LISTEN state.
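That last check can be sketched as a helper fed the `ss -tlnp` output from hai (`check_ports` is a hypothetical name):

```shell
# Sketch: verify the tunnel ports from the table above are in LISTEN state,
# given the output of `ss -tlnp` on hai.
check_ports() {
  ss_output="$1"
  for p in 5000 8042; do
    if echo "$ss_output" | grep -q ":$p "; then
      echo "port $p 🟢 LISTEN"
    else
      echo "port $p 🔴 missing"
    fi
  done
}
# Usage: check_ports "$(ssh hai 'ss -tlnp')"
```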

.gitignore (vendored, new file)

@@ -0,0 +1,6 @@
node_modules/
.env
*.log
.DS_Store
.mcp.json
.claude/mcp.json

.mcp.json.example (new file)

@@ -0,0 +1,24 @@
{
"mcpServers": {
"n8n-mcp": {
"command": "npx",
"args": ["n8n-mcp"],
"env": {
"MCP_MODE": "stdio",
"LOG_LEVEL": "error",
"DISABLE_CONSOLE_OUTPUT": "true",
"N8N_API_URL": "http://localhost:5678",
"N8N_API_KEY": "YOUR_N8N_API_KEY_HERE",
"N8N_MCP_TELEMETRY_DISABLED": "true"
}
},
"hai-infra": {
"type": "sse",
"url": "https://infra.hofmanns.ai/sse"
},
"hai-tasks": {
"type": "sse",
"url": "https://mcp.hofmanns.ai/sse"
}
}
}

CLAUDE.md (new file)

@@ -0,0 +1,81 @@
# code.hofmanns.ai — Claude Code + n8n MCP Integration
> AI-powered workflow builder for hofmanns.ai infrastructure.
> Based on [Nate Herk's tutorial](https://youtu.be/B6k_vAjndMo) — Claude Code + n8n MCP integration.
## What This Project Does
Claude Code connects to **n8n** via the **n8n-mcp** server (Model Context Protocol) to:
- Search 1,200+ n8n nodes and their documentation
- Build, validate, and deploy n8n workflows
- Access workflow templates (2,700+)
- Manage workflows via the n8n REST API
Additionally connects to **hofmanns.ai infrastructure** via custom MCP servers:
- **hai-infra** (infra.hofmanns.ai) — shell, Docker, Caddy, filesystem on hai
- **hai-tasks** (mcp.hofmanns.ai) — task management
- **jetson-mcp** — GPU status, Docker on Jetson Orin
## Architecture
```
Claude Code (this machine)
├── n8n-mcp (npx) → localhost:5678 (n8n API)
├── hai-infra MCP → infra.hofmanns.ai:3100
├── hai-tasks MCP → mcp.hofmanns.ai:3847
└── jetson-mcp → jetson-mcp.hofmanns.ai:3100
```
## n8n Skills (7 installed)
| Skill | Purpose |
|-------|---------|
| n8n-mcp-tools-expert | Node search, workflow management via MCP |
| n8n-expression-syntax | {{}} patterns, $json, $node variables |
| n8n-workflow-patterns | 5 architectural patterns, 2,700+ templates |
| n8n-validation-expert | Error interpretation, auto-sanitization |
| n8n-node-configuration | Property dependencies, required fields |
| n8n-code-javascript | Code nodes, $input/$helpers, DateTime |
| n8n-code-python | Python in n8n, stdlib, limitations |
## Quick Reference
### n8n API
- **URL:** http://localhost:5678
- **API Key:** See credentials
- **Key Expires:** 2026-03-08 (renew!)
### Forgejo (git.hofmanns.ai)
- **API Key:** See credentials
- **SSH:** `ssh -p 2222 git@git.hofmanns.ai`
- **This repo:** `hofmann/code.hofmanns.ai`
### Key Conventions
- Always copy workflows before editing (never modify production directly)
- Test in development first
- Export backups before changes
- Webhook data is at `$json.body`, NOT root level
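The last point trips up most webhook workflows: n8n's Webhook node wraps the incoming request, so the POSTed payload sits under `body` next to `headers`, `params`, and `query`. A minimal illustration of the item shape (the payload itself is a made-up example):

```shell
# Shape of the item a Webhook node emits for POST {"user": "alice"}:
# the payload is nested under "body", so expressions must use
# {{ $json.body.user }}, not {{ $json.user }}.
item='{"headers": {}, "params": {}, "query": {}, "body": {"user": "alice"}}'
echo "$item" | grep -o '"body": {[^}]*}'
# -> "body": {"user": "alice"}
```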
## Installed Components
```
~/.claude/.mcp.json # MCP server config (n8n-mcp + hai servers)
~/.claude/skills/ # 7 n8n skills
n8n-mcp-tools-expert/
n8n-expression-syntax/
n8n-workflow-patterns/
n8n-validation-expert/
n8n-node-configuration/
n8n-code-javascript/
n8n-code-python/
```
## Usage Examples
```
# Ask Claude Code to build workflows:
"Build a webhook workflow that receives JSON, processes it, and sends to Slack"
"Create an n8n workflow that monitors a URL and sends Telegram alerts on changes"
"Find me the Slack node and show its configuration options"
"Validate my workflow for errors"
```