Manual video editing is a bottleneck. To ensure continuous viral expansion for the Idol Protocol, we have fully automated the content generation pipeline using a new microservice: srv-clipper-drone.
1. The Trigger & Capture (OBS WebSockets)
Simulating keystrokes with pyautogui to trigger the OBS Replay Buffer was too fragile, so we abandoned it. The Native Bridge now talks to OBS directly over its WebSocket server.
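As a sketch of what "talking to OBS directly" involves: the obs-websocket v5 protocol authenticates with a double-SHA256/base64 challenge handshake, then accepts JSON requests such as SaveReplayBuffer. The helpers below build those payloads; the actual socket transport (and any real host/password) is omitted and assumed to live in the Native Bridge.

```python
import base64
import hashlib
import json

def obs_auth_token(password: str, salt: str, challenge: str) -> str:
    """obs-websocket v5 auth string:
    base64(sha256(base64(sha256(password + salt)) + challenge))."""
    secret = base64.b64encode(
        hashlib.sha256((password + salt).encode()).digest()
    ).decode()
    return base64.b64encode(
        hashlib.sha256((secret + challenge).encode()).digest()
    ).decode()

def identify_payload(auth: str) -> str:
    # op 1 = Identify, sent in response to the server's Hello (op 0)
    return json.dumps({"op": 1, "d": {"rpcVersion": 1, "authentication": auth}})

def save_replay_request(request_id: str) -> str:
    # op 6 = Request; SaveReplayBuffer flushes the replay buffer to disk
    return json.dumps({
        "op": 6,
        "d": {"requestType": "SaveReplayBuffer", "requestId": request_id},
    })
```

Send `identify_payload` once after connecting, then fire `save_replay_request` whenever the Kafka event arrives.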
- Input: The srv-ingest-twitch container detects a viral moment (e.g., chat spamming "LUL" or someone typing "!clip").
- Execution: A Kafka event is fired. The Native Bridge intercepts it and silently commands OBS to save the last 60 seconds to a shared Docker volume.
2. The Factory (Python + FFmpeg + Whisper)
Once the .mp4 hits the volume, the Clipper Drone wakes up:
```
# Pipeline Overview:
# 1. CROP:       FFmpeg resizes the 1920x1080 source to a 1080x1920
#                vertical layout (Cam Top / Screen Bottom).
# 2. TRANSCRIBE: Audio is fed into OpenAI's Whisper (Small model
#                running on local CPU).
# 3. BURN SUBS:  Subtitles are hardcoded in JetBrains Mono font,
#                Neon Green with black outlines.
# 4. BRANDING:   The GlitchPoint outro and static sound effect are appended.
```
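Steps 1 and 3 can be collapsed into a single FFmpeg invocation: crop the webcam region, stack it over the gameplay, and burn the subtitles with an ASS style override. This is a sketch, not the drone's actual code; `cam_box` (the webcam's position in the 1080p source) and the exact style values are assumptions.

```python
def build_clip_cmd(src: str, srt: str, dst: str,
                   cam_box: tuple = (1280, 0, 640, 360)) -> list:
    """Build an FFmpeg command that produces the 1080x1920 vertical clip.
    cam_box is a hypothetical (x, y, w, h) webcam overlay rectangle."""
    x, y, w, h = cam_box
    # ASS colours are &HAABBGGRR, so &H0000FF00 is opaque green
    style = ("FontName=JetBrains Mono,Fontsize=18,"
             "PrimaryColour=&H0000FF00,OutlineColour=&H00000000,Outline=2")
    fc = (
        # Webcam region -> top half (1080x960)
        f"[0:v]crop={w}:{h}:{x}:{y},scale=1080:960[cam];"
        # Full frame scaled and centre-cropped -> bottom half (1080x960)
        f"[0:v]scale=1080:960:force_original_aspect_ratio=increase,"
        f"crop=1080:960[scr];"
        # Stack vertically, then burn the subtitles
        f"[cam][scr]vstack=inputs=2,subtitles={srt}:force_style='{style}'[v]"
    )
    return ["ffmpeg", "-y", "-i", src,
            "-filter_complex", fc,
            "-map", "[v]", "-map", "0:a?",
            "-c:v", "libx264", "-c:a", "aac", dst]
```

The returned list is meant for subprocess.run, so no shell quoting is needed; the single quotes inside the filter graph protect the commas in force_style from FFmpeg's own parser.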
The final file is dropped into the /ready_to_upload directory, waiting for a final human approval tap before being deployed to TikTok and YouTube Shorts via API.
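The approval step above could be as simple as file moves on the shared volume; here is a hedged sketch where `pending_clips` feeds the review UI and `approve` is the "human approval tap" (the directory names and a separate uploader watching the approved directory are assumptions).

```python
from pathlib import Path

def pending_clips(ready_dir: str) -> list:
    """List finished clips awaiting approval, newest first."""
    clips = sorted(Path(ready_dir).glob("*.mp4"),
                   key=lambda p: p.stat().st_mtime, reverse=True)
    return [p.name for p in clips]

def approve(ready_dir: str, name: str, approved_dir: str) -> Path:
    """Move a clip out of the review queue. A hypothetical uploader
    container watching approved_dir would then call the TikTok and
    YouTube Shorts APIs."""
    dst = Path(approved_dir) / name
    dst.parent.mkdir(parents=True, exist_ok=True)
    (Path(ready_dir) / name).rename(dst)
    return dst
```

Using a rename as the hand-off keeps the approval atomic on a single volume: the uploader never sees a half-copied file.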