Syphon and NDI both move real-time video between apps, but they solve different problems. Syphon is a Mac-only, GPU-direct texture-sharing protocol with zero compression and sub-frame latency — ideal for single-machine VJ setups (RenderWave → MadMapper, OBS, Resolume). NDI is a cross-platform network video protocol with compression and multi-machine support — ideal for distributing video across multiple computers in festival or broadcast rigs.
If you only need to move pixels between two apps on the same MacBook, the answer is almost always Syphon. The moment a second machine enters the rig, the answer is almost always NDI. The rest of this post is the long version of why.
What is Syphon?
Syphon is an open-source framework for sharing real-time video between applications on macOS. It was created by Tom Butterworth and Anton Marini (the team behind Vidvox/VDMX) and has been the de facto Mac visual interop standard since 2010. The project lives at syphon.github.io.
What makes Syphon different from a cable or a screen capture:
- Mac-only. The Windows equivalent is Spout. They are conceptually similar but not interoperable without a bridge.
- GPU-to-GPU texture sharing. Syphon passes an IOSurface-backed texture handle between processes, so the receiver reads the same GPU memory the sender wrote — there is no CPU copy and no encode/decode. The modern implementation is Metal-backed via Syphon-Framework on GitHub.
- Zero compression. Frames are not encoded. What the sender renders is what the receiver sees, bit-for-bit, full RGBA.
- Sub-frame latency. Because there is no encode and no transport, latency is effectively the time it takes the receiver to bind the shared texture in its next draw call — typically under a millisecond.
- Local machine only. Syphon does not cross machines. It does not go over the network. It is process-to-process on one Mac.
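To see why Syphon shares a texture handle instead of copying or streaming pixels, it helps to put numbers on what uncompressed frames actually cost. A quick sketch in Python — plain arithmetic, with `raw_stream_gbps` as an illustrative helper, not a real API:

```python
# Back-of-the-envelope cost of pushing Syphon-style uncompressed RGBA
# frames around, if they had to move as bytes instead of a shared handle.

def raw_stream_gbps(width: int, height: int, fps: int, bytes_per_pixel: int = 4) -> float:
    """Bits per second for an uncompressed frame stream, in Gbps."""
    bits_per_frame = width * height * bytes_per_pixel * 8
    return bits_per_frame * fps / 1e9

print(f"1080p60 RGBA: {raw_stream_gbps(1920, 1080, 60):.2f} Gbps")  # ~3.98 Gbps
print(f"4K60 RGBA:    {raw_stream_gbps(3840, 2160, 60):.2f} Gbps")  # ~15.93 Gbps
```

Nearly 4 Gbps for plain 1080p60 is why a zero-copy shared IOSurface is the right tool on one Mac, and why anything crossing a network (like NDI, below) has to compress.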
Apps that send or receive Syphon natively include RenderWave, VDMX, Resolume, MadMapper, OBS Studio (via plugin or built-in depending on version), CoGe, Modul8, After Effects (via plugin), and most of the Mac visual ecosystem. The supported app list is maintained at syphon.github.io/apps.
What is NDI?
NDI (“Network Device Interface”) is a network video protocol developed by NewTek and now maintained by Vizrt after the 2019 acquisition. The protocol page and developer docs live at ndi.video.
What NDI is and is not:
- Cross-platform. Mac, Windows, Linux, iOS, Android, plus a growing hardware ecosystem (IP cameras, switchers, monitors). The SDK is at ndi.video/sdk.
- Network transport over IP. NDI sends frames over standard Gigabit (or 10 GbE) Ethernet, or Wi-Fi when you must. Discovery is mDNS/Bonjour — receivers find senders automatically on the same LAN.
- Compressed. Full NDI uses an internal codec (NewTek SpeedHQ family); NDI HX uses H.264/HEVC for lower bandwidth. Per NDI’s own bandwidth docs, full-bandwidth 1080p60 NDI runs around 125–150 Mbps.
- Latency is real. A typical NDI hop on a wired LAN is 80–200 ms depending on resolution, frame rate, and receiver buffering.
- Multi-machine by design. You can fan a single source out to dozens of receivers on the network.
- Alpha is supported but heavy. NDI carries RGBA 4:4:4:4 in its full-bandwidth modes. NDI HX is more aggressive about compression and may drop or degrade alpha depending on the encoder. See NDI’s video format docs.
NDI on Mac requires the NDI Tools install (or the SDK if you are building against it). Most pro VJ apps that support NDI bundle the runtime or detect the system install.
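The bandwidth figures above translate directly into capacity planning. A rough sketch, using the ~125–150 Mbps-per-1080p60-stream number quoted earlier; the 0.7 headroom factor is my own assumption (leave room for audio, discovery traffic, and everything else on the switch):

```python
# Rough capacity planning for full-bandwidth NDI on a wired LAN.
# per-stream bitrate from NDI's published 1080p60 figure (~125-150 Mbps);
# the headroom factor is an assumption, not part of any NDI spec.

def max_streams(link_mbps: float, stream_mbps: float, headroom: float = 0.7) -> int:
    """How many streams fit on a link while keeping some headroom free."""
    return int(link_mbps * headroom // stream_mbps)

print(max_streams(1000, 150))   # Gigabit, worst-case full NDI -> 4 streams
print(max_streams(10000, 150))  # 10 GbE -> plenty
```

Four-ish full-bandwidth 1080p60 streams is about what a single Gigabit link comfortably carries — a useful sanity check before you promise a festival six feeds off one switch.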
Latency and capability comparison
| Protocol | Typical latency | Resolution ceiling | Compression | Network required | Mac-native | Alpha |
|---|---|---|---|---|---|---|
| Syphon | < 1 ms (local GPU) | Whatever your GPU can hold (8K+ feasible) | None | No | Yes | Full RGBA |
| NDI (full) | 80–200 ms | 4K is standard; 8K supported in NDI 5+ | Yes (SpeedHQ) | Yes | Yes (via NDI Tools) | RGBA in full bandwidth |
| NDI HX 3 | ~120–250 ms | 4K@60p | Yes (H.264/HEVC) | Yes | Yes | Limited |
| Spout | < 1 ms | GPU-bound | None | No | No (Windows) | Full RGBA |
The “ceiling” numbers above are practical, not theoretical — Syphon imposes no resolution limit of its own; it inherits whatever your GPU and the receiving app’s render target can handle. NDI’s ceilings are spec-defined and bandwidth-bound.
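Milliseconds are abstract; frames are what you see on stage. The table's latency figures convert like this (simple arithmetic, hypothetical helper name):

```python
# Express the latency figures from the table as frames at a given
# refresh rate -- i.e. how far the projected image trails the source.

def latency_in_frames(latency_ms: float, fps: float) -> float:
    return latency_ms / (1000.0 / fps)

print(f"{latency_in_frames(1, 60):.2f}")    # Syphon-class hop: a small fraction of one frame
print(f"{latency_in_frames(80, 60):.1f}")   # best-case NDI at 60 fps
print(f"{latency_in_frames(200, 60):.1f}")  # worst-case NDI: a full 12 frames behind
```

At 60 fps, 200 ms is twelve frames — visibly behind the beat, which is exactly why the latency-critical local hop stays on Syphon.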
Alpha channel support
This is the question that decides a lot of real shows.
- Syphon: full RGBA, always. Because the underlying object is an IOSurface-backed Metal texture, alpha is preserved by definition. Send a transparent shader output from RenderWave into MadMapper and the alpha is there for compositing. The Syphon-Framework header explicitly carries `MTLPixelFormatBGRA8Unorm` and friends — see the Syphon Metal server source.
- NDI: alpha is supported but path-dependent. Full-bandwidth NDI carries 4:4:4:4 with alpha. NDI HX, which most people actually use over real-world networks because of bandwidth, varies — encoders may strip alpha or carry it inefficiently. Always test alpha end-to-end on the specific NDI version and codec path you ship with, not just on the spec sheet.
If your show depends on transparent overlays (lower-thirds, alpha-masked logos, key-and-fill into a switcher), confirm the receiver actually got an alpha channel by checking with NDI Studio Monitor’s pixel inspector before show night.
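If you can grab the received frame as a raw RGBA buffer (most receivers let you dump one), a crude programmatic version of that check is possible. This is a sketch under that assumption — `alpha_survived` is a hypothetical helper, and "alpha stripped" is inferred from a constant alpha plane, which an encoder that dropped alpha typically leaves behind:

```python
# Crude end-to-end alpha check on a received frame dumped as raw RGBA bytes.
# If the encoder stripped alpha, the alpha plane usually comes back constant
# (every pixel fully opaque). A varying alpha plane means transparency survived.

def alpha_survived(rgba_bytes: bytes) -> bool:
    """True if the alpha plane contains more than one distinct value."""
    alphas = rgba_bytes[3::4]  # every 4th byte is alpha in RGBA order
    return len(set(alphas)) > 1

opaque = bytes([10, 20, 30, 255] * 4)          # alpha flattened to 255 everywhere
keyed  = bytes([10, 20, 30, 255, 0, 0, 0, 0])  # real transparency present
print(alpha_survived(opaque))  # False -- suspect the codec path dropped alpha
print(alpha_survived(keyed))   # True
```

A fully opaque frame can also legitimately have constant alpha, so run the check against test content that you know contains transparency.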
Multi-machine setups
This is where the choice is binary.
- Syphon does not cross machines. It uses shared GPU memory on one Mac. There is no Syphon-over-network mode. If you have two Macs and you want one to send video to the other, Syphon cannot help you — convert to NDI (or run a Syphon-to-NDI bridge).
- NDI is built for multi-machine. Discovery happens automatically via mDNS, every machine on the LAN sees every NDI source, and you can fan one source out to many receivers without re-encoding per receiver. Festival rigs with separate “graphics” and “playback” machines, or stages with a Mac VJ source feeding a Windows vMix front-of-house switcher, are the canonical NDI use cases.
Concrete pattern from a real rig: RenderWave runs on a MacBook Pro M4 Max in the booth, outputs Syphon locally to MadMapper for projection-mapping the stage, AND outputs NDI over a wired LAN to a vMix PC at front-of-house that handles the IMAG and streaming bus. Same source, two transports, two destinations, one show.
When to use Syphon
- Mac-only rig. One machine doing everything.
- RenderWave → MadMapper for projection mapping on stage geometry.
- RenderWave → OBS for live streaming the visual feed.
- RenderWave → Resolume when you want clip-launch on top of a generative shader source (hybrid rigs are real).
- Latency-critical live VJing at clubs — you do not want a 150 ms hop between the shader and the projector.
- Alpha-critical compositing — Syphon’s RGBA is non-negotiable, NDI’s depends on the path.
When to use NDI
- Multi-machine festival rigs where graphics, playback, and switching live on separate computers.
- Mac source → PC switcher (vMix, ATEM Software Control via NDI input, OBS on Windows).
- Distributing one visual source to multiple separate output computers driving separate screens or projector arrays.
- Streaming to IP-based broadcast workflows or hardware NDI monitors / decoders.
- Tour rigs where Mac and Windows machines need to share video without a capture card.
- Remote production scenarios where the source machine is not in the same room as the output.
RenderWave’s support for both
RenderWave outputs Syphon natively at any resolution your GPU can drive, including transparent RGBA for compositing into MadMapper or layering with another Syphon source. The Studio tier adds NDI input and output for multi-machine and broadcast workflows. See renderwave.io/features for the current capability list.
The practical takeaway: you do not have to choose. A Mac VJ in 2026 with a serious rig will use both — Syphon for the local instrument-to-mapper hop, NDI for the cross-machine and cross-platform hops.
Minimal Syphon setup with RenderWave → MadMapper
The simplest working chain on a single Mac:
- In RenderWave, open Output and enable Syphon Output. Give the server a recognizable name (e.g. `RenderWave Main`).
- Open MadMapper. In the Media Library, switch to the Syphon tab. `RenderWave Main` will appear in the discovered server list. Drag it into your media bin.
- Drag the Syphon media onto a surface. Now MadMapper is mapping a live shader output, not a pre-rendered file.
No latency tuning. No codec choice. No network. If both apps are running on the same Mac, it works.
Minimal NDI setup with RenderWave → vMix on a second machine
- Install NDI Tools on both machines. Both must be on the same wired LAN (Gigabit minimum, 10 GbE recommended for 4K).
- In RenderWave Studio, open Output and enable NDI Output. Name the source (e.g. `RW-Booth-1`).
- On the second machine (Mac or Windows), open vMix → Add Input → NDI/Desktop Capture → pick `RW-Booth-1` from the discovered list.
- Verify the latency in vMix’s source preview before the show. If it is over ~200 ms, check the LAN — Wi-Fi or a hop through a saturated switch will hurt you.
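The two numbers worth checking before doors can be written down as a checklist in code. A sketch using the thresholds from this post (~150 Mbps per full-NDI 1080p60 stream, ~200 ms acceptable latency, ~30% link headroom) — all three are my working assumptions, so adjust them for your rig:

```python
# Pre-show sanity checks: does the link carry the stream count with headroom,
# and is the measured hop latency inside budget? Thresholds are assumptions
# taken from the figures used in this post, not NDI requirements.

def lan_ok(link_mbps: float, streams: int, per_stream_mbps: float = 150) -> bool:
    """True if all streams fit while leaving ~30% of the link free."""
    return streams * per_stream_mbps <= link_mbps * 0.7

def latency_ok(measured_ms: float, budget_ms: float = 200) -> bool:
    return measured_ms <= budget_ms

print(lan_ok(1000, 2))   # two full-NDI streams on Gigabit: fine
print(lan_ok(1000, 8))   # eight streams: you need 10 GbE
print(latency_ok(120))   # measured 120 ms hop: inside budget
```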
FAQ
What is the latency of Syphon?
Sub-millisecond on a local Mac. Syphon shares GPU memory directly via IOSurface, so there is no encode, decode, or network step — the receiver binds the texture the sender just wrote on its next frame. Practical end-to-end latency is bounded by your monitor’s refresh rate, not by Syphon itself.
Does NDI support alpha channel?
Yes, but conditionally. Full-bandwidth NDI carries RGBA 4:4:4:4 and preserves alpha cleanly. NDI HX (the H.264/HEVC variant most real-world networks use) varies — encoders may strip alpha or compress it heavily. Test alpha end-to-end on the version and encoder path you are actually shipping, not on the spec sheet. See the NDI SDK docs.
Can I use Syphon over the network?
No. Syphon is local-only, by design. It uses macOS IOSurface shared GPU memory, which does not cross machines. If you need to move video between machines, use NDI, or run a Syphon-to-NDI bridge on the source Mac to translate.
Is NDI free?
The NDI runtime, NDI Tools, and the NDI SDK are free for download and use from ndi.video. You do not pay per stream or per machine. Commercial integrations (building NDI into a paid product) are governed by the NDI SDK license.
Which is better for projection mapping?
Syphon, almost always — if the mapper and the source are on the same Mac. The combination of zero latency, full RGBA, and zero compression means the mapped output looks exactly like the source. If your projection-mapping machine is separate from your visual-source machine, NDI is the only realistic option, and you accept the latency.
Can I run Syphon and NDI from RenderWave at the same time?
Yes. RenderWave Studio can publish Syphon and NDI from the same render output simultaneously. Local apps (MadMapper, OBS) connect over Syphon; remote machines connect over NDI. Same source, two transports.
What about Spout?
Spout is the Windows equivalent of Syphon — same idea, same zero-copy GPU texture sharing, different platform. Spout and Syphon do not directly interoperate. Cross-platform bridges exist (Spout-to-NDI and similar) but the modern pattern is “Syphon on the Mac side, Spout on the Windows side, NDI between them.”
By Wesley Walz, founder of RenderWave. I run Syphon to MadMapper for local mapping and NDI to a second machine when the show needs a switcher I do not own.