Pipeline Daemon with Live Events & HTTP API

Trigger pipelines remotely, stream logs in real-time with SSE, and run isolated Docker containers

$ pin apply --daemon
Starting daemon... ✓
Server running ✓
Port: 8081
Waiting for events...

PIN DAEMON :8081 | SSE Monitor - Real-time Events
[INFO] Pipeline triggered
[INFO] Container started
[LOG] Running build...
[SUCCESS] Job completed
[INFO] Pipeline complete ✓

Why Pin?

🚀

Daemon Mode

Run as a long-running service with HTTP API endpoints for remote pipeline execution

Real-time Events

Monitor pipeline execution via Server-Sent Events from multiple clients simultaneously

🐳

Docker-based

Isolated execution environments with full Docker and custom Dockerfile support

🔄

Smart Retry

Automatic retries with exponential backoff for handling transient failures

⚙️

Flexible Config

YAML-based configuration with validation, environment variables, and conditional execution

🔗

Parallel Jobs

Run multiple jobs in parallel for faster pipeline execution

Daemon Mode & HTTP API

Keep Pin running as a service and trigger pipelines remotely

HTTP Endpoints

GET /events Real-time SSE stream
POST /trigger Execute pipeline
GET /health Health check
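Scripts can call POST /trigger directly, sending the pipeline YAML as the request body (the same call the curl example later on this page makes). A minimal sketch; the `buildTriggerRequest` helper is illustrative, and 8081 is the default port shown above:

```javascript
// Build fetch options for POST /trigger. The helper name is illustrative,
// not part of Pin itself; the endpoint and content type come from the docs.
function buildTriggerRequest(pipelineYaml) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/yaml' },
    body: pipelineYaml,
  };
}

// Usage: send a pipeline definition to a daemon running on the default port.
const pipeline = 'workflow:\n  - build\n';
fetch('http://localhost:8081/trigger', buildTriggerRequest(pipeline))
  .then((res) => console.log('status', res.status))
  .catch((err) => console.error('daemon not reachable:', err.message));
```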

Event Types

  • daemon_start - Service started
  • pipeline_trigger - New execution
  • job_container_start - Container started
  • log - Real-time logs
  • job_completed - Job finished
  • pipeline_complete - Pipeline done
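Each SSE message arrives as a `data: {...}` line. A small parser turns a raw stream chunk into event objects; this is a sketch that assumes a JSON payload with the `level` and `message` fields used in the JavaScript example below plus a `type` from the list above (Pin's actual payload shape may differ):

```javascript
// Parse raw SSE text into an array of event payload objects.
// Assumes each event is a single "data: <json>" line, per the SSE wire format.
function parseSseChunk(chunk) {
  return chunk
    .split('\n')
    .filter((line) => line.startsWith('data: '))
    .map((line) => JSON.parse(line.slice('data: '.length)));
}

// Example with a hypothetical payload; field names are assumptions.
const raw = 'data: {"type":"pipeline_trigger","level":"INFO","message":"New execution"}\n\n';
for (const evt of parseSseChunk(raw)) {
  console.log(`[${evt.level}] ${evt.message}`);
}
```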

JavaScript Example

const eventSource = new EventSource('http://localhost:8081/events');

eventSource.onmessage = (event) => {
  const data = JSON.parse(event.data);
  console.log(`[${data.level}] ${data.message}`);
};

Quick Start

1. Download

Get the latest release for your platform

Download from GitHub
2. Create pipeline.yaml

workflow:
  - build

build:
  image: golang:alpine
  copyFiles: true
  script:
    - go build -o app .

3. Run

pin apply -f pipeline.yaml

Or start daemon mode:

pin apply --daemon

Real-World Examples

See how Pin handles different scenarios

🔄 Parallel Microservices

Run multiple services simultaneously with isolated ports

workflow:
  - user-service
  - auth-service
  - api-gateway

user-service:
  image: node:18-alpine
  copyFiles: true
  port:
    - "127.0.0.1:3001:3000"
  parallel: true
  script:
    - cd services/user-service
    - npm install && npm start

auth-service:
  image: node:18-alpine
  copyFiles: true
  port:
    - "127.0.0.1:3002:3000"
  parallel: true
  script:
    - cd services/auth-service
    - npm install && npm start

🎯 Conditional Deployment

Deploy to different environments based on branch

workflow:
  - build
  - deploy-staging
  - deploy-production

deploy-staging:
  image: alpine:latest
  condition: $BRANCH == "develop"
  script:
    - echo "Deploying to staging..."

deploy-production:
  image: alpine:latest
  condition: $BRANCH == "main"
  script:
    - echo "Deploying to production..."

BRANCH=main pin apply -f deploy.yaml
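To see how a condition like `$BRANCH == "main"` resolves at run time, here is an illustrative evaluator: substitute the environment variable, then compare strings. This is not Pin's actual implementation, and Pin's condition syntax may support more than simple equality:

```javascript
// Illustrative only: evaluate a '$VAR == "value"' condition against an env map.
// An unset variable is treated as the empty string, so the condition is false.
function evalCondition(condition, env) {
  const match = condition.match(/^\$(\w+)\s*==\s*"([^"]*)"$/);
  if (!match) throw new Error(`unsupported condition: ${condition}`);
  const [, name, expected] = match;
  return (env[name] ?? '') === expected;
}

console.log(evalCondition('$BRANCH == "main"', { BRANCH: 'main' }));    // true
console.log(evalCondition('$BRANCH == "main"', { BRANCH: 'develop' })); // false
```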

🔁 Smart Retry Logic

Automatic retries with exponential backoff for flaky operations

network-operation:
  image: alpine:latest
  retry:
    attempts: 5
    delay: 1
    backoff: 2.0
  script:
    - wget https://api.example.com/data
    - cat data

Retry delays: 1s → 2s → 4s → 8s → 16s
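The sequence above follows delay × backoff^(n−1) seconds before attempt n. A quick sketch to confirm the arithmetic, assuming that schedule:

```javascript
// Compute the wait before each attempt, assuming delay * backoff^(n-1) seconds.
function retryDelays(attempts, delay, backoff) {
  return Array.from({ length: attempts }, (_, i) => delay * backoff ** i);
}

console.log(retryDelays(5, 1, 2.0)); // [ 1, 2, 4, 8, 16 ]
```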

🐳 Custom Dockerfile

Build and use your own development environment

workflow:
  - setup-env
  - run-dev

setup-env:
  dockerfile: "./dev.Dockerfile"
  copyFiles: true
  script:
    - echo "Environment ready"

run-dev:
  image: setup-env-custom:latest
  port:
    - "8080:8080"
  env:
    - NODE_ENV=development
  script:
    - npm run dev

📡 HTTP-Triggered Pipeline

Trigger pipelines remotely and monitor in real-time

# Start daemon
pin apply --daemon

# Trigger from anywhere
curl -X POST \
  -H "Content-Type: application/yaml" \
  --data-binary @pipeline.yaml \
  http://localhost:8081/trigger

# Watch live events
curl -N http://localhost:8081/events

Documentation