The Composable Runtime
Dagger is an open-source runtime for composable workflows, designed for complex systems requiring repeatability, modularity, observability, and cross-platform support. It’s great for building AI agents and CI/CD pipelines.
Key Features
Containerized Workflow Execution
Transform code into containerized operations. Build reproducible workflows in any language with custom environments, parallel processing, and seamless chaining.
Universal Type System
Mix and match components from any language with type-safe connections. Use the best tools from each ecosystem without translation headaches.
Automatic Artifact Caching
Operations produce cacheable, immutable artifacts — even for LLMs and API calls. Your workflows run faster and cost less.
Built-in Observability
Full visibility into operations with tracing, logs, and metrics. Debug complex workflows and know exactly what's happening.
Open Platform
Works with any compute platform and tech stack — today and tomorrow. Ship faster, experiment freely, and don’t get locked into someone else's choices.
LLM Augmentation
Native integration of any LLM that automatically discovers and uses available functions in your workflow. Ship mind-blowing agents in just a few dozen lines of code.
Interactive Terminal
Directly interact with your workflow or agents in real-time through your terminal. Prototype, test, debug, and ship even faster.
Dagger in Action

Dockerfile Optimizer
A Dagger module that helps optimize your Dockerfiles using AI assistance.

Multi Agent Demo
This is a demo of a function using multiple LLMs to complete a (silly) task.
“A-ha!” moments

Jordan Parker
When @solomonstre drops an open-source Claude Code... ya run with it. "Toy Programmer" + added MCP & cursor-rules + an updated version of @zbeyens phenom "dotai" repo


Kyle Penfound
This is so awesome to play with. I just made a new frontend for my main demo repo without any hand-holding, writing integrations, or anything. Just core Dagger + bring your own LLM. In a 2-minute video, most of which is for the sake of the recording.


Benjie D.
So I'm finally coming around on this whole "AI is the future" thing... Solomon Hykes' demo of an agent writing a curl clone in 5 minutes using Dagger is still lingering in my mind... if any of you are curmudgeons like me you might want to watch...


Yves
If you don’t know where to start with creating AI agents, you should have a look at Dagger agents. They make it so easy to build tools that benefit from AI, but as a real developer tool.


Steeve Morin
Few understand the implications of this. This will be absolutely defining


Alberto Fuentes
I'm feeling a "docker moment" in real time. Even though I didn't know the CI tool, it seems so promising for multiple areas.


Matias Pan
@dagger_io experimental AI tooling is really interesting. I just leveraged two separate modules (Strava and Notify). It took me 20 minutes to build the prompt and less than 5 to write the code.


Ankur Duggal
This is crazy cool! If you have worked with LLMs this is amazing!


Peter Jausovec
All your Dagger functions are now available for agents to use. Built-in function registry.


Madhav Jivrajani
This is nuts. @dagger_io has always had amazing dev exp, but the fact that it lends itself so seamlessly to something like this is endlessly impressive, massive props @solomonstre and the team!