
Kotest

A guide for generating Trunk-compatible test reports for Kotest

You can automatically detect and manage flaky tests in your Kotest projects by integrating with Trunk. This document explains how to configure Kotest to output JUnit XML reports that can be uploaded to Trunk for analysis.

Checklist

By the end of this guide, you should achieve the following before proceeding to the next steps to configure your CI provider:

  • your Kotest runs generate Trunk-compatible JUnit XML reports
  • you can validate those reports locally with the Trunk CLI

After correctly generating reports following the above steps, you'll be ready to move on to the next steps to configure uploads in CI.

Generating Reports

Steps for generating JUnit XML reports for Kotest depend on the build system you use for your project:

Tests run with Gradle will generate Trunk-compatible JUnit XML reports by default. You can further configure reporting behavior in your build.gradle.kts or build.gradle.
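
As a reference point, here is a minimal sketch of the standard Gradle test-task settings this relies on; these options come from Gradle itself rather than Kotest, and JUnit XML output is already enabled by default:

build.gradle.kts
tasks.withType<Test>().configureEach {
    // Kotest runs on the JUnit Platform
    useJUnitPlatform()
    reports {
        // JUnit XML output is on by default; set explicitly here for clarity
        junitXml.required.set(true)
    }
}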

Kotest projects using Maven require the following to be added to a project's pom.xml so JUnit XML reports can be generated:

  • the maven-surefire-plugin must be added to the plugins section of pom.xml

pom.xml
<project>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>3.2.2</version>
            </plugin>
            
            <!-- other plugins -->
        </plugins>
    </build>
</project>
  • the kotest-extensions-junitxml dependency must be added to the dependencies section of pom.xml

pom.xml
<dependencies>
    <dependency>
        <groupId>io.kotest</groupId>
        <artifactId>kotest-extensions-junitxml-jvm</artifactId>
        <version>5.9.0</version>
        <scope>test</scope>
    </dependency>
    
    <!-- other dependencies -->
</dependencies>
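
If you want Kotest's JUnit XML extension to write the reports itself (rather than relying only on Surefire's output), it generally needs to be registered in a project-level config. The following is a minimal sketch assuming Kotest 5.x's standard extension mechanism; the class name and output directory are illustrative:

ProjectConfig.kt
import io.kotest.core.config.AbstractProjectConfig
import io.kotest.core.extensions.Extension
import io.kotest.extensions.junitxml.JunitXmlReporter

class ProjectConfig : AbstractProjectConfig() {
    override fun extensions(): List<Extension> = listOf(
        JunitXmlReporter(
            includeContainers = false,       // report only leaf tests
            useTestPathAsName = true,        // use the full test path as the test name
            outputDir = "test-results/junit" // resolved relative to the build directory
        )
    )
}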

Report File Path

You can configure the path for generated JUnit XML files:

By default, Kotlin projects will produce a directory with JUnit XML reports under ./app/build/test-results/test. You can locate these files with the glob "./app/build/test-results/test/*.xml".

If you wish to override the default test result path, you can do so in the build.gradle.kts or build.gradle files:

build.gradle.kts (Kotlin) or build.gradle (Groovy)
java.testResultsDir = layout.buildDirectory.dir("junit-reports")

You can change the report file path by configuring reportsDirectory for the maven-surefire-plugin in your pom.xml file:

pom.xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>3.2.2</version>
    <configuration>
        <reportsDirectory>${project.build.directory}/junit/</reportsDirectory>
    </configuration>
</plugin>

The example above will output JUnit XML reports that can be located with the /target/junit/*.xml glob.

Disable Retries

You need to disable automatic retries if you previously enabled them. Retries compromise the accurate detection of flaky tests. You should disable retries for accurate detection and use the Quarantining feature to stop flaky tests from failing your CI jobs.

If you've enabled retries in Gradle using a plugin like the test-retry-gradle-plugin, disable it when running tests for Trunk flaky tests.

Maven uses the maven-surefire-plugin to run tests, which allows you to control the test retry behavior. You can disable retries by specifying 0 retries:

mvn -Dsurefire.rerunFailingTestsCount=0 test
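
For the Gradle case mentioned above, here is a rough sketch of forcing retries off when the test-retry plugin is applied; the retry block and maxRetries property come from that plugin, not from Gradle itself:

build.gradle.kts
tasks.test {
    retry {
        // run each test once so flaky failures show up in the report
        maxRetries.set(0)
    }
}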

Try It Locally

The Validate Command

You can validate your test reports using the Trunk CLI. If you don't have it installed already, you can install and run the validate command like this:

curl -fsSLO --retry 3 https://trunk.io/releases/trunk && chmod +x trunk
./trunk flakytests validate --junit-paths "./app/junit-reports/*.xml"

Make sure to specify the path to your JUnit XML test reports.
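
For example, if your reports land in the default Gradle location described earlier, the invocation would look something like this:

./trunk flakytests validate --junit-paths "./app/build/test-results/test/*.xml"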

This will not upload anything to Trunk. To improve detection accuracy, you should address all errors and warnings before proceeding to the next steps.

Test Upload

Before modifying your CI jobs to automatically upload test results to Trunk, try uploading a single test run manually.

You can make an upload to Trunk using the following command:

curl -fsSLO --retry 3 https://trunk.io/releases/trunk && chmod +x trunk
./trunk flakytests upload --junit-paths "./app/junit-reports/*.xml" \
    --org-url-slug <TRUNK_ORG_SLUG> \
    --token <TRUNK_ORG_TOKEN>

You can find your Trunk organization slug and token in the settings or by following these instructions. After your upload, you can verify that Trunk has received and processed it successfully in the Uploads tab. Warnings will be displayed if the report has issues.

Next Steps

Configure your CI to upload test runs to Trunk. Find the guides for your CI framework below:

  • Azure DevOps Pipelines
  • BitBucket Pipelines
  • Buildkite
  • CircleCI
  • Drone CI
  • GitHub Actions
  • GitLab
  • Jenkins
  • Semaphore
  • TeamCity
  • Travis CI
  • Other CI Providers