Thoryn
  • engineering
  • cli
  • graalvm
  • kotlin

Single-binary CLI — GraalVM, picocli, and the Kotlin reflection trap

thoryn ships as a 30-MB native binary that starts in 30 ms. Getting picocli + Kotlin + GraalVM to cooperate took one fix for a trap the picocli docs do not warn you about.

24 Jun 2026 · Mark Bakker

thoryn is a single binary. About 30 MB. No JVM on the host, no JAVA_HOME to keep in sync. brew install thoryn-io/tap/thoryn and you can run it. Cold start is roughly 30 ms — fast enough that thoryn login feels like any other CLI, not like a JVM tool dressed up as one.

Getting there meant exactly one gotcha that the picocli documentation does not warn you about: the recommended annotation-processor wiring is Java-source-only. It silently emits nothing for Kotlin sources, the native build still succeeds, and then the first command throws MissingResourceException at runtime because the reflection metadata picocli needs is not in the image. We didn't notice until the CI native-build matrix turned red on --version.

This post is the engineer's-eye view: why we picked GraalVM, where the reflection trap lives, how we worked around it, and what the cross-platform pipeline looks like.

Why a CLI at all

Every developer platform that respects its users ships one. Auth0 has auth0, Okta has okta-cli, Stripe and Vercel and Cloudflare all have theirs. Ours is thoryn, and it lives in the customer plane: list OAuth clients, rotate a secret, enrol a federation member, tail an audit log. The user-facing surface is on the /cli/ page; this post is for engineers building similar tooling.

The shape we wanted: a single command, no install dance, scripts cleanly, exits with a meaningful code. Anything that needs java -jar does not meet that bar. Anything that needs a JRE on the host does not meet that bar.

Why GraalVM-native

A small Spring Boot CLI takes 1–2 seconds to bootstrap the JVM and class-load before it does anything. That is fine for a server but actively user-hostile in a CLI; people notice the gap between hitting Enter and seeing output. Native binaries skip the JVM entirely — thoryn --version returns in roughly 30 ms on a 2024 MacBook. Shipping a JVM CLI also means either bundling a 200 MB JRE per platform or telling users "first install JDK 21." Both lose. A native binary is self-contained.

The binary is ~30 MB compressed per platform. Large for a CLI, small enough to fit on a Homebrew tap or an apt repo. We accept the trade. The downsides — slow native-image build times (3–5 minutes per platform), reflection that has to be declared up front, no dynamic class loading — are real but don't bite a CLI: the code path is bounded, entry points are static, and the build runs once per release, not once per request.

The picocli reflection trap

picocli is a small annotation-driven CLI library. You write @Command on a class, @Option on a field, and at runtime picocli wires the argv into your fields by reflection.
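A minimal sketch of that model in Kotlin — the class, option, and output here are illustrative, not thoryn's actual source:

```kotlin
// Illustrative picocli subcommand. picocli parses argv and assigns
// the annotated field by reflection before call() runs.
import picocli.CommandLine
import picocli.CommandLine.Command
import picocli.CommandLine.Option
import java.util.concurrent.Callable
import kotlin.system.exitProcess

@Command(name = "clients", description = ["List OAuth clients"])
class ListClients : Callable<Int> {

    @Option(names = ["--json"], description = ["Emit JSON instead of a table"])
    var json: Boolean = false

    override fun call(): Int {
        println(if (json) "[]" else "(no clients)")
        return 0
    }
}

fun main(args: Array<String>) {
    exitProcess(CommandLine(ListClients()).execute(*args))
}
```

On the JVM that reflective assignment just works; in a native image it only works if the metadata survives the build, which is the subject of the rest of this section.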

GraalVM, by default, strips reflection metadata. Annotations aren't enough — you have to tell native-image which classes need their methods, fields, and constructors preserved at runtime. That information goes into a reflect-config.json file under META-INF/native-image/.
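For a single command class the file looks roughly like this (class name illustrative; picocli's generator emits a more detailed per-member listing, but this is the shape):

```json
[
  {
    "name": "org.thoryn.cli.ThorynMain",
    "allDeclaredConstructors": true,
    "allDeclaredMethods": true,
    "allDeclaredFields": true
  }
]
```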

picocli ships a code generator that walks your @Command graph and emits exactly that file. The picocli docs say to wire it through Maven's annotationProcessorPaths:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <annotationProcessorPaths>
      <path>
        <groupId>info.picocli</groupId>
        <artifactId>picocli-codegen</artifactId>
        <version>${picocli.version}</version>
      </path>
    </annotationProcessorPaths>
  </configuration>
</plugin>

This is Java-source-only. The annotation-processor pipeline runs as part of javac. Our sources are Kotlin, compiled by kotlinc first and then handed to javac only as already-compiled .class files. By the time javac looks at the codegen processor's ServiceLoader, there are no source elements left for it to walk. The processor runs, finds nothing, and writes an empty reflect-config.json. No warnings, no errors, the build is green.
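The ordering is baked into the standard Kotlin-on-Maven wiring — a sketch of the typical setup, which is what makes javac a bystander:

```xml
<!-- Typical kotlin-maven-plugin wiring (sketch). kotlinc runs first in
     the compile phase; by the time maven-compiler-plugin invokes javac,
     the Kotlin sources are already .class files, so any annotation
     processor hosted by javac has no source elements to visit. -->
<plugin>
  <groupId>org.jetbrains.kotlin</groupId>
  <artifactId>kotlin-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>compile</id>
      <phase>compile</phase>
      <goals><goal>compile</goal></goals>
    </execution>
  </executions>
</plugin>
```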

The native image then builds successfully — there is nothing in reflect-config.json to fail on. The binary launches. The first time picocli tries to introspect a @Command class for its options, the reflection metadata is missing, and you get a stack trace pointing at the picocli internals with no obvious link back to the build configuration that caused it. The picocli FAQ touches on Kotlin support but does not call out that the annotation-processor path silently produces nothing — the AOT-config doc reads as if it just works.

The fix

picocli also publishes the same code generator as a runnable main class: picocli.codegen.aot.graalvm.ReflectionConfigGenerator. It takes a list of fully-qualified @Command class names on the command line, walks them by reflection against the compiled classpath, and writes the same reflect-config.json you would have got from the annotation processor.

We invoke it directly via exec-maven-plugin against target/classes/, after the Kotlin compile step:

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <executions>
    <execution>
      <phase>process-classes</phase>
      <goals><goal>java</goal></goals>
      <configuration>
        <mainClass>picocli.codegen.aot.graalvm.ReflectionConfigGenerator</mainClass>
        <arguments>
          <argument>--output</argument>
          <argument>${project.build.outputDirectory}/META-INF/native-image/picocli-generated/reflect-config.json</argument>
          <argument>org.thoryn.cli.ThorynMain</argument>
        </arguments>
      </configuration>
    </execution>
  </executions>
</plugin>

By the time process-classes runs, everything is bytecode. Kotlin or Java, the generator does not care. The output has all four @Command classes wired up. The shaded jar carries the file under META-INF/native-image/, where native-image discovers it automatically during the binary build. Same shape as the annotation-processor path, different dispatcher.

There's a smaller version of the same trap one layer down: the facade class Kotlin generates for a file containing a top-level main function carries a Kt suffix, but the @Command-annotated class does not. The codegen tool wants the latter. We keep both names as Maven properties so a future rename touches one place.
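Concretely — assuming the command class and the top-level entry point share a file, with names that are illustrative rather than thoryn's actual source:

```kotlin
// File: ThorynMain.kt (names illustrative)
import picocli.CommandLine
import picocli.CommandLine.Command

@Command(name = "thoryn")
class ThorynMain                      // bytecode name: org.thoryn.cli.ThorynMain
                                      //   -> the name ReflectionConfigGenerator wants

fun main(args: Array<String>) {       // lives in the generated file facade class
    CommandLine(ThorynMain()).execute(*args)
}                                     // bytecode name: org.thoryn.cli.ThorynMainKt
                                      //   -> the name a JVM-style entry point wants
```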

[Figure: GraalVM-native CLI build pipeline — Kotlin sources compile to bytecode, ReflectionConfigGenerator emits reflect-config.json, native-image links the platform binary, cosign signs each artefact]

The CI matrix and distribution

A native binary is per-platform. We build four:

Runner           Arch     -march
macos-14         arm64    default (GraalVM rejects compatibility here)
macos-13         x64      -march=compatibility
ubuntu-latest    x64      -march=compatibility
windows-latest   x64      -march=compatibility

-march=compatibility produces a portable x86_64 binary that runs on older CPUs at a small perf cost. AArch64 GraalVM rejects compatibility — it errors out — so macOS arm64 leaves the flag unset and inherits the default armv8-a. We model the difference as a per-OS Maven sub-profile activated by a -Dnative-march=compatibility system property the workflow sets per matrix entry, rather than putting OS-detection logic inside the POM. Each job smoke-tests --version and --help before uploading the artefact.
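The POM side of that can be sketched like this — the profile id and the GraalVM Build Tools coordinates are our assumption of a typical setup, not thoryn's actual POM:

```xml
<!-- Sketch: only pass -march when the workflow sets -Dnative-march.
     The arm64 matrix entry simply omits the property, so the profile
     never activates and the GraalVM default (armv8-a) applies. -->
<profiles>
  <profile>
    <id>native-march</id>
    <activation>
      <property><name>native-march</name></property>
    </activation>
    <build>
      <plugins>
        <plugin>
          <groupId>org.graalvm.buildtools</groupId>
          <artifactId>native-maven-plugin</artifactId>
          <configuration>
            <buildArgs>
              <buildArg>-march=${native-march}</buildArg>
            </buildArgs>
          </configuration>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
```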

Distribution is the other half of the job. We support six channels: a Homebrew tap, signed apt and rpm repos, a Scoop bucket, a winget manifest, and direct download from GitHub Releases with cosign verify-blob. Every binary and the SHA256SUMS file are signed by cosign keyless via the GitHub OIDC identity. Each publisher gates on its secret being present — if the Homebrew tap deploy key isn't configured, the Homebrew step skips and the rest of the release continues. Same shape for apt, rpm, Scoop, winget. This was small but load-bearing: it lets us turn the workflow on with one channel configured, prove the pipeline end-to-end, and add more channels without touching the workflow file.
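For the direct-download channel, verification looks roughly like this — the file names and the identity regexp are illustrative assumptions, not the published instructions:

```shell
# Keyless verification of a release artefact (sketch). The certificate
# and signature files sit next to the binary on the GitHub release;
# the identity regexp below is our guess at the workflow's OIDC subject.
cosign verify-blob \
  --certificate thoryn-linux-x64.pem \
  --signature   thoryn-linux-x64.sig \
  --certificate-oidc-issuer https://token.actions.githubusercontent.com \
  --certificate-identity-regexp 'github.com/thoryn-io/' \
  thoryn-linux-x64
```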

What we'd do again

Picking GraalVM was the right call. The 30-ms cold start is the difference between "feels like a CLI" and "feels like a JVM tool wearing a CLI costume," and that perception bleeds into how much trust users put in the binary. Picking picocli was the right call too — the @Command / @Option model is clean and the reflection generator is a real first-class option. The trap is the documentation gap, not the design.

If you are building something similar in Kotlin, the one piece of advice is: do not assume the picocli annotation-processor wiring works for Kotlin sources. Test the native binary against your real commands, not just --version. Look at the reflect-config.json that lands inside your shaded jar before you run native-image. If it is empty, the rest of the build is lying to you.
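That pre-flight check is one command — jar name illustrative, the path matching whatever --output you gave the generator:

```shell
# Dump the reflection config straight out of the shaded jar before
# native-image runs. An empty array ([]) reproduces the trap; a healthy
# file lists one entry per @Command class.
unzip -p target/thoryn-cli.jar \
  META-INF/native-image/picocli-generated/reflect-config.json
```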

Try it: brew install thoryn-io/tap/thoryn, then thoryn login and thoryn clients list. If it feels fast, that's the point.