n8n Automation Mastery Series
    Part 5 of 6

    Multi-Service Integration

    Connect GitHub, Slack, email, databases, and APIs into cross-platform workflows — the kind that cost $100+/month on Zapier.

    Prerequisites

    Completed Part 4, running n8n instance

    Time to Complete

    ~45 minutes

    What You'll Build

    4 cross-platform integration workflows

    Where Self-Hosted Automation Pays for Itself

    The real power of n8n isn't any single workflow — it's connecting multiple services into pipelines that move data across your entire stack. A GitHub issue triggers a Slack notification, which updates a database record, which sends an email to the assignee, which creates a calendar event.

    On Zapier, each of those connections is a "step" in a multi-step zap, and every step execution counts against your task quota. The Team plan includes 2,000 tasks per month, and a 5-step zap consumes 5 tasks per run. If this pipeline fires 20 times a day, that's 100 tasks daily — you burn through the monthly allocation in 20 days. On n8n running on your RamNode VPS, the same pipeline runs unlimited times at zero marginal cost.
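    The arithmetic is worth making explicit. A quick sketch (the plan numbers are illustrative — check your actual quota):

```javascript
// How long a monthly task quota lasts for a multi-step zap.
// Each run of an N-step zap consumes N tasks.
function daysUntilQuotaExhausted(monthlyTasks, runsPerDay, stepsPerRun) {
  const tasksPerDay = runsPerDay * stepsPerRun;
  return Math.floor(monthlyTasks / tasksPerDay);
}

// 2,000 tasks, 20 runs/day, 5 steps per run
console.log(daysUntilQuotaExhausted(2000, 20, 5)); // → 20 days
```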

    Setting Up Credentials

    Before building workflows, configure credentials for each service. n8n stores them encrypted using the N8N_ENCRYPTION_KEY you set in Part 1.

    GitHub

    1. In GitHub: Settings → Developer Settings → Personal Access Tokens → Fine-grained tokens
    2. Create a token with permissions for Issues, Pull Requests, Webhooks
    3. In n8n: Add a GitHub API credential with your token

    Slack

    1. Go to api.slack.com/apps → Create New App → From scratch
    2. Under OAuth & Permissions, add scopes: chat:write, channels:read, channels:history, files:write
    3. Install to workspace and copy the Bot User OAuth Token
    4. In n8n: Add a Slack API credential

    PostgreSQL (or MySQL)

    1. In n8n: Add a Postgres credential
    2. Fill in host, port, database name, user, and password
    3. If the database is on the same VPS, use host.docker.internal as the host (on Linux, Docker only resolves this name if it's mapped to the host gateway)
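    On Linux hosts, Docker doesn't define host.docker.internal automatically. Assuming the Docker Compose setup from Part 1, one way to make the name resolve is an extra_hosts mapping (requires Docker 20.10 or newer):

```yaml
services:
  n8n:
    # ...image, ports, environment from Part 1 unchanged...
    extra_hosts:
      - "host.docker.internal:host-gateway"  # host-gateway resolves to the Docker host's IP
```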

    SMTP Email

    1. In n8n: Add an SMTP credential
    2. Configure with your mail server, or use SendGrid, Mailgun, or Amazon SES
    3. For Gmail: use an App Password with smtp.gmail.com:587

    Workflow 1: GitHub Issue → Slack → Database → Email

    When someone opens a GitHub issue, the pipeline receives the event, stores the issue in your database, posts to Slack, and emails the assignee.

    GitHub Trigger

    1. Add a GitHub Trigger node
    2. Select your GitHub credential
    3. Set Repository Owner and Repository Name
    4. Set Events: issues

    Extract and Format

    Code Node
    const issue = $input.first().json;
    
    // Only process newly opened issues
    if (issue.action !== 'opened') return [];
    
    return [{
      json: {
        issueNumber: issue.issue.number,
        title: issue.issue.title,
        body: issue.issue.body?.substring(0, 500) || 'No description',
        author: issue.issue.user.login,
        authorAvatar: issue.issue.user.avatar_url,
        url: issue.issue.html_url,
        labels: (issue.issue.labels || []).map(l => l.name).join(', '),
        assignee: issue.issue.assignee?.login || 'unassigned',
        repository: issue.repository.full_name,
        createdAt: issue.issue.created_at
      }
    }];

    Store in PostgreSQL

    Create the table first:

    SQL — Create Table
    CREATE TABLE github_issues (
      id SERIAL PRIMARY KEY,
      issue_number INTEGER NOT NULL,
      title TEXT NOT NULL,
      body TEXT,
      author VARCHAR(255),
      url TEXT,
      labels TEXT,
      assignee VARCHAR(255),
      repository VARCHAR(255),
      status VARCHAR(50) DEFAULT 'open',
      created_at TIMESTAMP,
      processed_at TIMESTAMP DEFAULT NOW()
    );
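    The Postgres node then inserts each formatted item. With the node's query-parameters option, the statement might look like this — mapping the Code node's output fields to columns in this order is an assumption you should match to your own configuration:

```sql
INSERT INTO github_issues
  (issue_number, title, body, author, url, labels, assignee, repository, created_at)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9);
```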

    Notify via Slack

    Slack Message
    🐛 *New Issue #{{ $json.issueNumber }}*
    *{{ $json.title }}*
    
    {{ $json.body }}
    
    Author: {{ $json.author }}
    Labels: {{ $json.labels || 'none' }}
    Assignee: {{ $json.assignee }}
    <{{ $json.url }}|View on GitHub>

    Email the Assignee

    Add an IF node checking {{ $json.assignee !== 'unassigned' }}, then a Send Email node on the true path with the issue details and a direct link.
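    One way to prepare the email fields is a small Code node on the true path before Send Email. This is a sketch using the field names produced by the Extract and Format step above; the wording is an example, not a fixed template:

```javascript
// Build the assignee notification from the formatted issue item.
// In n8n this runs in a Code node; shown here as a plain function for clarity.
function buildAssigneeEmail(issue) {
  return {
    subject: `[${issue.repository}] Issue #${issue.issueNumber} assigned to you: ${issue.title}`,
    text: [
      `Hi ${issue.assignee},`,
      '',
      `A new issue was opened by ${issue.author}:`,
      '',
      issue.body,
      '',
      `View it here: ${issue.url}`
    ].join('\n')
  };
}
```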

    Workflow 2: Daily Digest — Aggregate and Report

    Instead of flooding Slack with individual notifications, aggregate events and send a daily summary.

    Architecture
    Schedule Trigger (daily at 9 AM)
      → PostgreSQL (query recent issues/events)
        → Code (format digest)
          → Slack (post digest)
          → Send Email (to team lead)

    Query Recent Issues

    PostgreSQL Query
    SELECT
      issue_number, title, author, assignee, labels, status, url
    FROM github_issues
    WHERE processed_at >= NOW() - INTERVAL '24 hours'
    ORDER BY created_at DESC;

    Format the Digest

    Code Node
    const issues = $input.all().map(i => i.json);
    
    if (issues.length === 0) {
      return [{ json: { hasIssues: false, digest: 'No new issues in the last 24 hours. 🎉' } }];
    }
    
    const byStatus = {
      open: issues.filter(i => i.status === 'open'),
      closed: issues.filter(i => i.status === 'closed')
    };
    
    let digest = `📊 *Daily Issue Digest — ${new Date().toLocaleDateString()}*\n\n`;
    digest += `Total: ${issues.length} | Open: ${byStatus.open.length} | Closed: ${byStatus.closed.length}\n\n`;
    
    if (byStatus.open.length > 0) {
      digest += `*Open Issues:*\n`;
      byStatus.open.forEach(i => {
        digest += `• #${i.issue_number}: ${i.title} (${i.assignee}) — <${i.url}|view>\n`;
      });
    }
    
    return [{ json: { hasIssues: true, digest, issueCount: issues.length } }];

    Workflow 3: Inbound Email → Database → API

    Process inbound emails and sync them to external systems — useful for support inboxes, form submissions, or any email-driven workflow.

    Parse Email

    Code Node
    const email = $input.first().json;
    return [{
      json: {
        from: email.from?.text || email.from,
        subject: email.subject,
        body: email.text?.substring(0, 2000) || '',
        receivedAt: email.date,
        messageId: email.messageId,
        hasAttachments: (email.attachments || []).length > 0,
        attachmentCount: (email.attachments || []).length
      }
    }];

    Log to Database

    PostgreSQL Insert
    INSERT INTO inbound_emails (from_address, subject, body, received_at, message_id, has_attachments)
    VALUES ($1, $2, $3, $4, $5, $6)
    RETURNING id;

    Forward to External API

    HTTP Request
    Method: POST
    URL: https://api.yourservice.com/tickets
    Headers:
      Authorization: Bearer {{ $credentials.apiToken }}
      Content-Type: application/json
    Body:
    {
      "source": "email",
      "from": "{{ $json.from }}",
      "subject": "{{ $json.subject }}",
      "body": "{{ $json.body }}",
      "external_id": "{{ $json.messageId }}"
    }

    Auto-Acknowledge

    Reply to the sender confirming receipt with their reference number.
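    A sketch of how a Code node might build that reply, using the ticket id returned by the insert's RETURNING id. The TICKET- prefix and zero-padding are an invented convention — use whatever reference format your team recognizes:

```javascript
// Format an auto-acknowledgment for the original sender.
function buildAcknowledgment(ticketId, originalSubject) {
  const reference = `TICKET-${String(ticketId).padStart(5, '0')}`;
  return {
    subject: `Re: ${originalSubject} [${reference}]`,
    text: `We received your message and will get back to you shortly.\n` +
          `Your reference number is ${reference} — please keep it in any replies.`
  };
}
```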

    Workflow 4: Database Change → Multi-Channel Notification

    Monitor a database table for changes and push notifications to multiple channels simultaneously.

    Schedule Trigger: Every minute

    Query for New Records

    PostgreSQL
    SELECT * FROM orders
    WHERE status = 'completed'
      AND notified = false
    ORDER BY created_at ASC
    LIMIT 10;

    Split into Parallel Paths

    • Path 1: Slack — Post to #sales channel
    • Path 2: Send Email — Notify the account manager
    • Path 3: HTTP Request — Update your analytics/CRM system
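    One way to keep the three paths consistent is to shape every channel's payload in a single Code node before the split. This is a sketch — the column names (customer_name, total) are assumed fields on the orders table, and the channel/endpoint details are placeholders:

```javascript
// Shape one completed order row into per-channel payloads.
function buildNotifications(order) {
  const summary = `Order #${order.id} completed — ${order.customer_name}, $${order.total}`;
  return {
    slack: { channel: '#sales', text: `🎉 ${summary}` },
    email: {
      subject: `Order #${order.id} completed`,
      text: `${summary}\nPlease follow up with the customer's account manager.`
    },
    crm: { event: 'order_completed', order_id: order.id, amount: order.total }
  };
}
```

    Each downstream node then reads only its own branch of the item, so message formatting lives in one place.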

    Mark as Notified

    PostgreSQL Update
    UPDATE orders SET notified = true WHERE id = $1;

    Pass {{ $json.id }} as a query parameter rather than interpolating it into the SQL string — the same placeholder style as the insert in Workflow 3, and it protects you from malformed or malicious values.

    Merge all three paths back together before the update, ensuring all notifications are sent before marking as processed.

    Working with REST APIs

    Many integrations don't have dedicated n8n nodes. The HTTP Request node is your universal adapter for any service with a REST API.

    Pagination

    Code Node — Paginated Requests
    const baseUrl = 'https://api.example.com/records';
    const perPage = 100;
    let page = 1;
    let allRecords = [];
    let hasMore = true;
    
    while (hasMore) {
      const response = await fetch(`${baseUrl}?page=${page}&per_page=${perPage}`, {
        // In a real workflow, pull the token from a credential or environment
        // variable — don't hardcode secrets in Code nodes
        headers: { 'Authorization': 'Bearer YOUR_TOKEN' }
      });
      if (!response.ok) throw new Error(`Request failed: ${response.status}`);
      const data = await response.json();
      allRecords = allRecords.concat(data.results ?? []);
      // A short final page means we've reached the end
      hasMore = (data.results ?? []).length === perPage;
      page++;
    }
    
    return allRecords.map(record => ({ json: record }));

    Authentication Patterns

    API Key in Header: X-API-Key: {{ $credentials.apiKey }}

    Bearer Token: Authorization: Bearer {{ $credentials.accessToken }}

    OAuth2: Use n8n's built-in OAuth2 credential type, which handles token refresh automatically.
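    The three patterns above can be summarized in one helper. This is an illustration only — in n8n you'd reference the secret through a credential, never a literal:

```javascript
// Build request headers for the common auth patterns.
function authHeaders(kind, secret) {
  switch (kind) {
    case 'apiKey': return { 'X-API-Key': secret };
    case 'bearer': return { 'Authorization': `Bearer ${secret}` };
    // OAuth2 access tokens are also sent as a Bearer header once obtained;
    // n8n's OAuth2 credential handles acquiring and refreshing the token.
    case 'oauth2': return { 'Authorization': `Bearer ${secret}` };
    default: throw new Error(`Unknown auth kind: ${kind}`);
  }
}
```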

    Rate Limiting

    Code Node — Rate Limit Handler
    // Assumes the previous HTTP Request node is set to return the full
    // response, so rate-limit headers are available on the item
    const remaining = parseInt($input.first().json.headers['x-ratelimit-remaining'], 10);
    if (remaining < 5) {
      // Flag the item so a downstream Wait node can pause before continuing
      return [{ json: { ...$input.first().json, shouldWait: true, waitMs: 60000 } }];
    }
    return $input.all();
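    If the limit is already exhausted when a request fires, a fixed one-minute pause may not be enough. A retry-with-exponential-backoff sketch (the delays and retry count are illustrative, and requestFn stands in for whatever call your workflow makes):

```javascript
// Retry a request function with exponential backoff on HTTP 429.
async function withBackoff(requestFn, maxRetries = 4, baseDelayMs = 1000) {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const res = await requestFn();
    if (res.status !== 429) return res;
    if (attempt === maxRetries) throw new Error('Rate limit: retries exhausted');
    const delay = baseDelayMs * 2 ** attempt; // 1s, 2s, 4s, 8s, ...
    await new Promise(resolve => setTimeout(resolve, delay));
  }
}
```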

    Credential Security Best Practices

    Use n8n's credential system — never hardcode API keys in Code nodes. Store them as credentials and reference them through node configuration.

    Principle of least privilege — create API tokens with the minimum permissions each workflow needs. Don't use an admin-level GitHub token for a workflow that only reads issues.

    Rotate regularly — set a calendar reminder to rotate API keys quarterly. Update the credential in one place and all workflows that reference it automatically use the new key.

    Audit access — periodically review which workflows use which credentials. n8n shows credential usage in the credential detail view.

    What's Next?

    In Part 6, we're taking everything production-grade — scaling n8n with queue workers, database-backed execution storage, automated backups, high availability configurations, and monitoring n8n itself.

    Your automation platform becomes as reliable as the infrastructure it manages.