Successfully created native binaries for both CLI and MCP server using GraalVM native-image. Both binaries work correctly for their respective use cases:
Prerequisites:
- GRAALVM_HOME environment variable set, e.g. GRAALVM_HOME=/Library/Java/JavaVirtualMachines/graalvm-25.jdk/Contents/Home

CLI Binary:
# Build CLI uberjar (includes AOT compilation with dynaload)
clj -T:build jar-cli
# Build native binary (requires GRAALVM_HOME)
GRAALVM_HOME=/path/to/graalvm clj -T:build native-cli
Output: target/mcp-tasks-<platform>-<arch> (~39 MB native executable)
Server Binary:
# Build server uberjar (includes AOT compilation with dynaload)
clj -T:build jar-server
# Build native binary (requires GRAALVM_HOME)
GRAALVM_HOME=/path/to/graalvm clj -T:build native-server
# Or use Babashka task (builds both)
bb build-native-server
Output: target/mcp-tasks-server-<platform>-<arch> (~40 MB native executable)
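Both builds require a GraalVM installation that provides the native-image tool. A quick sanity check of the toolchain before building (a sketch; it only inspects whatever GRAALVM_HOME points at on your machine):

```shell
# Sanity-check the GraalVM toolchain before building.
# Prints the native-image version if GRAALVM_HOME is set correctly,
# otherwise prints a hint on stderr.
if [ -x "${GRAALVM_HOME:-}/bin/native-image" ]; then
  "${GRAALVM_HOME}/bin/native-image" --version
else
  echo "GRAALVM_HOME does not point at a GraalVM install with native-image" >&2
fi
```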
Status: Fully compatible with AOT compilation
Approach: Uses borkdude/dynaload for lazy loading with AOT support:
build.clj passes -Dborkdude.dynaload.aot=true during uberjar compilation.

Implementation details (in src/mcp_tasks/schema.cljc and src/mcp_tasks/execution_state.cljc):
(def malli-validator (dynaload 'malli.core/validator {:default (constantly (fn [_] true))}))
(def malli-explainer (dynaload 'malli.core/explainer {:default (constantly (fn [_] nil))}))
Benefits over previous requiring-resolve approach:
- Previous approach: requiring-resolve with fallback defaults
- Current approach: borkdude/dynaload with :default fallbacks and compile-time (AOT) resolution

Entry Points (src/mcp_tasks/native_init.clj and src/mcp_tasks/native_server_init.clj):
- CLI: mcp-tasks.native-init/-main → mcp-tasks.cli/-main
- Server: mcp-tasks.native-server-init/-main → mcp-tasks.main/-main

The build-uberjar function in dev/build.clj configures dynaload AOT mode:
(b/compile-clj {:basis basis
                :src-dirs ["src"]
                :class-dir class-dir
                :java-opts ["-Dborkdude.dynaload.aot=true"]})
This JVM option tells dynaload to resolve all lazy references at compile time, enabling GraalVM to include the resolved code in the native binary.
["native-image"
"-jar" jar-file
"--no-fallback" ;; No fallback to JVM
"-H:+ReportExceptionStackTraces" ;; Better error messages
"--initialize-at-build-time" ;; Initialize all classes at build time
"-o" output-binary]
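For reference, the argument vector above expands to a command line like the following. The jar and binary names here are placeholders, not the project's real paths, and the command is only echoed so it can be previewed without GraalVM installed:

```shell
# Hypothetical expansion of the native-image argument vector above;
# jar_file and output_binary are illustrative placeholders.
jar_file="target/example.jar"
output_binary="target/example-binary"
echo native-image -jar "$jar_file" \
  --no-fallback \
  -H:+ReportExceptionStackTraces \
  --initialize-at-build-time \
  -o "$output_binary"
```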
CLI Binary:
# Help command
./target/mcp-tasks-<platform>-<arch> --help
# List tasks (no Malli warnings)
./target/mcp-tasks-<platform>-<arch> list --status open --format human
# All CLI commands tested and working
# Schema validation functioning correctly
Server Binary:
# Start MCP server (stdio transport)
./target/mcp-tasks-server-<platform>-<arch>
# Server starts and accepts MCP protocol messages
# Tested via integration tests with :native-binary metadata
# Smoke tests verify startup on all platforms
# Comprehensive tests validate full MCP protocol on Linux
Previous behavior showed:
Warning: Malformed EDN at line N: Could not locate malli/core__init.class...
Current behavior: No warnings. Malli is properly loaded at compile time via dynaload AOT.
Native binaries require explicit resource configuration to embed markdown files:
;; In dev/build.clj native-image invocation
["-H:IncludeResources=prompts/.*\\.md,category-prompts/.*\\.md"]
This ensures all prompt markdown files are embedded in the binary and accessible via io/resource.
Problem: GraalVM native images don't support directory listing via io/resource + file-seq. This pattern works in JARs but fails in native binaries because resources are embedded directly in the binary without a traditional filesystem structure.
Solution: Generate a manifest file at build time listing all workflow prompts.
Implementation:
Manifest Generation (dev/build.clj):
(defn generate-prompt-manifest
  "Generate manifest file listing all workflow prompts.

  Scans resources/prompts/ directory and creates resources/prompts-manifest.edn
  containing a vector of workflow prompt names (without .md extension).

  This enables prompt discovery in GraalVM native images where directory
  listing via io/resource is not supported."
  []
  (let [prompts-dir (io/file "resources/prompts")
        workflow-files (->> (file-seq prompts-dir)
                            (filter #(and (.isFile %)
                                          ;; only top-level files; skip subdirectories
                                          (= prompts-dir (.getParentFile %))
                                          (str/ends-with? (.getName %) ".md")))
                            (map #(str/replace (.getName %) #"\.md$" ""))
                            sort
                            vec)]
    (spit (io/file "resources/prompts-manifest.edn") (pr-str workflow-files))))
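The same discovery logic can be sketched in shell to see what the manifest will contain. This stand-alone demo uses a temp directory with hypothetical prompt names, not the project's real resources/prompts/ tree:

```shell
# Stand-alone sketch of the manifest-generation logic: list top-level *.md
# files, strip the extension, and sort. Demo files live in a temp dir.
prompts_dir="$(mktemp -d)"
touch "$prompts_dir/complete-story.md" "$prompts_dir/create-story-pr.md"
mkdir -p "$prompts_dir/nested" && touch "$prompts_dir/nested/ignored.md"
find "$prompts_dir" -maxdepth 1 -type f -name '*.md' -exec basename {} .md \; | sort
# prints:
# complete-story
# create-story-pr
```

Note that the -maxdepth 1 mirrors the top-level-files-only filter in the Clojure version: prompts in subdirectories are ignored.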
Manifest Reading (src/mcp_tasks/prompts.clj):
(defn list-builtin-workflows
  "List all built-in workflow prompts.

  Reads from generated manifest file (resources/prompts-manifest.edn) which is
  created at build time. This approach works in both JAR and GraalVM native
  images, avoiding the limitation that directory listing via io/resource +
  file-seq doesn't work in native binaries."
  []
  (if-let [manifest-resource (io/resource "prompts-manifest.edn")]
    (try
      (read-string (slurp manifest-resource))
      (catch Exception e
        (log/error :failed-to-read-manifest {:error (.getMessage e)})
        []))
    []))
Manifest Format: Simple EDN vector of strings (prompt names without .md extension)
["complete-story" "create-story-pr" "create-story-tasks" ...]
Build Integration: The manifest is generated during build-uberjar before resources are copied, ensuring it's included in both JAR and native binaries. The manifest file is committed to git for reproducibility.
Development Workflow: When adding or removing workflow prompts:
1. Add or remove the .md file in resources/prompts/
2. Run clojure -T:build jar-cli or jar-server to regenerate the manifest

Category Prompt Discovery: Category prompts use the existing discover-categories mechanism, which reads from .mcp-tasks/category-prompts/ in the filesystem (not embedded resources), so they don't require manifest-based discovery.
Reflection Configuration: Not required for the current implementation; the build succeeds without custom reflection configuration.
Current measurements (with dynaload AOT): CLI ~39 MB, server ~40 MB.
Previous measurements (with requiring-resolve): roughly 0.7-2 MB smaller per binary.
Analysis: The slight size increase (~0.7-2 MB) is due to Malli being properly included in the binary. Previously, Malli was effectively excluded because requiring-resolve couldn't load it at compile time. The trade-off is a slightly larger binary in exchange for working schema validation.
Future binary size reduction options:
- --gc=G1 - Different garbage collector
- -O3 - Higher optimization level
- Profile-guided optimization (--pgo)

The native server binary can be configured in MCP clients:
Claude Code:
claude mcp add mcp-tasks -- /usr/local/bin/mcp-tasks-server
Claude Desktop:
{
  "mcpServers": {
    "mcp-tasks": {
      "command": "/usr/local/bin/mcp-tasks-server"
    }
  }
}
The server binary uses stdio transport and requires no additional arguments or configuration.
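For a quick manual smoke test you can pipe a JSON-RPC initialize request into the binary on stdin. The exact fields depend on the MCP protocol version your client speaks, so treat this message as a sketch rather than the canonical handshake:

```json
{"jsonrpc": "2.0", "id": 1, "method": "initialize",
 "params": {"protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "smoke-test", "version": "0.0.0"}}}
```

A conforming server should answer with an initialize result describing its capabilities before any tool calls are made.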
Both native binaries (CLI and server) are production-ready with full schema validation support. The migration from requiring-resolve to borkdude/dynaload with AOT compilation eliminates the runtime Malli warnings and enables full schema validation in the native binaries, at the cost of a slightly larger binary (~0.7-2 MB).