Tutorial: JavaFX Mill project

Using Mill to set up an OpenJFX (formerly JavaFX) Java or Scala application

Hugo Ferreira

Introduction

In this article I am going to show you how to use Mill to set up a Java and a Scala application that both use OpenJFX (previously JavaFX). The motivation behind this article is that, although support is planned, Scala does not yet support JPMS modules. This is also true of other JVM languages. Java does provide backward compatibility by allowing "legacy developers" to use standard Jar libraries as unnamed modules in JDK 9 and above. However, this is fraught with difficulties. In particular, OpenJFX requires some hacks that don't always seem to work. This article also serves as a tutorial for those of you who are unfamiliar with Mill. I will explain and demonstrate the use of Mill step by step.

All the code and scripts are available in this GitHub repository. This Mill project supports both Scala 2 and Scala 3 and has been tested with JDK 11 and JDK 17. You need only change the build script to use the desired versions of Scala and Java. Before we get into the nitty-gritty of things, I will first explain why Java modules were introduced and some issues I have found. Then I will give a brief introduction to Mill and include some pointers to indispensable resources on how to use it. I will then explain how to use Mill to download, compile and execute an OpenJFX application. I will also describe some of Mill's utility commands to facilitate your work. And finally, I will conclude this presentation with a short summary.

Java Module System

The Java Platform Module System (JPMS, formerly referred to as Jigsaw) was introduced with JDK 9. Its advantages include a means to explicitly define dependencies between libraries so that we can produce smaller and more efficient runtime systems, allow application developers to define service consumers and providers (for example, patching at compile time), and restrict access to the internals of a module (Java reflection won't work). Several guides and tutorials exist that can help you get started.

When I tried to set up a Scala application to use java.net.http, I had problems. I later found that the backward-compatibility mode works correctly in that specific case. However, when trying to follow the OpenJFX instructions for the compilation and execution of a "non-modular application", I ran into a number of issues.

Note that the final application may also be made operating-system independent if all of the required native libraries are downloaded and packaged into the final Jar archive. In this article we will focus on supporting a single operating system.

The compilation and packaging steps described above can be done manually. However, this is cumbersome, time-consuming and error-prone. A developer would first have to install the OpenJFX modules or download them to a specific directory for a given JDK version. All the compilation and execution commands would then have to be painstakingly changed or parameterized to point to the directory containing these modules. This would pose a significant barrier for someone wanting to quickly clone, use and even contribute to your project.

In addition to the issues listed above, I have found that several libraries require the use of module-specific command-line arguments for the Java compiler and runtime because (a sketch of passing such flags via Mill appears after this list):

  • Some modules must be opened for access via reflection (for example ControlsFX and Chart-FX);
  • Some modules must be "patched" to use a specific provider of a given service (for example ControlsFX and TestFX Monocle).

The OpenJFX installation instructions, however, also show that both Maven and Gradle can be used to download, compile and package non-modular Java applications. So I set out to replicate part of that functionality in Mill so that you can easily set up your own project that uses JPMS in a non-modular Scala or Java application.

Mill

When I started my journey learning Scala, the reference build tool was (and still is) SBT. There are quite a few build tools out there that can be used to build Scala projects (see references [15, 18]). In fact, while researching this article I also found SBuild. I, like many other Scala newbies, found SBT somewhat difficult to use due to its peculiar domain-specific language (DSL). I also had trouble trying to quickly customize my small projects. The coding, publishing and use of SBT plugins required too much "ceremony". Of course, this is my opinion and your experience with SBT may not be the same, especially with the improvements that have been made since version 0.13. The advantages of SBT are its very good documentation and an extensive set of plugins.

Some time later, I chanced upon a presentation by Jan Vogt that described the CBT build tool. I liked the ideas he presented - a build tool that was simply a Scala library providing build functionality that could be easily extended and adapted. A build specification was nothing more than a Scala script. That led me to look for build tools with a similar philosophy, and I discovered Mill. At the time, Mill seemed the more promising option due to its scripting capabilities and its existing documentation. I decided to investigate it and to date I am satisfied with this tool.

Brief history and background

The tool was originally designed, developed and used by Li Haoyi (see Li Haoyi's GitHub page). He has developed several other useful open source projects and is well regarded within the Scala community. I would say that currently, circa 2022, the principal maintainer and contributor is Tobias Roeser. This is not surprising, since he also worked on SBuild. Besides other work, he maintains additional Mill-related projects such as MillW and Mill-IntegrationTest. I find the former useful because it allows Windows users to use Mill without prior installation (as is the case with Linux) and, on several occasions, it allowed the use of snapshot versions that the original Mill script failed to fetch. The latter is an "integration test plugin for Mill plugins", which allows anyone contemplating reusing and distributing their Mill scripts to test them adequately. I tinkered with it some time ago to see how it works.

There are enough tutorials and documentation to get you up and running in no time. You can find an early introductory text from Li Haoyi, which is still relevant. He also has video presentations (for example, see (26)) and a book on Scala that describes how to set up and take advantage of the Scala ecosystem (disclaimer - I have not read it). And of course, with Tobias's updates, the official documentation is an indispensable resource.

Mill consists of three main components:

  • A Scala scripting engine (Ammonite). This is another of Li Haoyi's open source projects. For a brief introduction to its scripting capabilities see (30);
  • A dependency resolver and artifact manager. Mill uses Coursier (Pure Scala Artifact Fetching) as its backend. This open source project is a marvel in and of itself that also allows anyone to download, install and execute publicly available Java virtual machine (JVM) based applications and tools (including Java, Scala and build tools such as SBT);
  • Build utilities that allow one to implement and execute build tasks, define dependencies among tasks, and automatically cache and re-execute those tasks only when strictly necessary. This is Mill's core per se and is provided as a set of Scala 2 classes that can be extended as required to fit your every need.

Before we move on to describing Mill and how it is used, it is important to describe what the Ammonite scripting engine can do. Note that Ammonite also provides an interactive REPL (Read-Eval-Print Loop), but we focus solely on its scripting capabilities. In essence, Ammonite allows you to code and execute Scala 2 applications as if you were working in a REPL. All you need to do is install Ammonite (amm) and execute the script:

$ amm MyScript.sc

Ammonite will compile and execute that code for you. As with the standard Scala installation, Ammonite also allows you to import from the Scala standard library and use the classes and objects you need. Here is an example copied verbatim from Ammonite's documentation:

// MyScript.sc
// print banner
println("Hello World!!")

// common imports
import sys.process._
import collection.mutable

// common initialization code
val x = 123
println("x is " + 123)

And here is the output:

Hello World!!
x is 123

What is even more interesting is that we are not limited to the Scala standard library. We can use magic import instructions to import other script files (via $file) and third-party libraries from Maven repositories (via $ivy).

Ammonite will take care of downloading, caching, compiling and executing your script. You can organize the various scripts to any depth and complexity you require. Ammonite provides many other goodies that we will not take advantage of, which include for example (a small script sketch follows this list):

  • Staging the imports during script runtime, so you only download what is needed;
  • Having and using multiple entry points (main functions);
  • Defining and using command line arguments defined as main function parameters (automatically parsed and converted);
  • Documenting the entry point parameters that are automatically shown in the usage message.
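
As a small illustration of these capabilities, the following hypothetical script combines an $ivy magic import with an @main entry point whose parameters have default values (the library and names are only examples and are not used later in this article):

// Greeter.sc
import $ivy.`org.typelevel::cats-core:2.7.0`
import cats.syntax.all._

@main
def greet(name: String = "World", times: Int = 1): Unit =
  // intercalate comes from the cats library fetched by the $ivy import above
  println(List.fill(times)(s"Hello $name!").intercalate("\n"))

It can then be run with, for example, amm Greeter.sc --name Scala --times 2, and Ammonite will parse and convert the command-line arguments for you.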

In essence, the Mill entry point is an Ammonite script named build.sc. You can use all of the above capabilities to your advantage. In addition to this, Mill provides bundled libraries that allow you to quickly and easily set up your Java and Scala projects.

Basic Scala build

I now describe how Mill is used to set up the example javaFXMill project. For those of you who are new to Mill, I will provide ample information so that you can understand and follow along with no problems. You will then be able to create your own projects using this script as a template and use the Mill command line to build and execute them. Note that you can start your own project from a giter8 template using, for example, the Mill Scala Seed project. For those that usually work with SBT, be warned that the directory structure is not the same. However, Mill does provide a compatibility module if you wish to keep that structure.
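
If you do want to keep the SBT directory layout, a minimal sketch (assuming a module named foo, which is not part of the example project) would be to extend SbtModule instead of ScalaModule:

import mill._, scalalib._

// Reads main sources from foo/src/main/scala instead of foo/src
object foo extends SbtModule {
  override def scalaVersion = "3.1.1"
}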

The javaFXMill project root contains both the mill binary and the build.sc script. All build paths are relative to this build script. I have also provided a .mill-version configuration file that indicates the version of Mill to be used. The first time you run mill, it will check if the correct version of the Mill binary is available. If not, it first downloads the correct version and then delegates the desired command to that version. To check the version you can execute:

$ ./mill --version

And you should get an output similar to this:

Mill Build Tool version 0.10.2
Java version: 11.0.14.1, vendor: Ubuntu, runtime: /usr/lib/jvm/java-11-openjdk-amd64
Default locale: en_US, platform encoding: UTF-8
OS name: "Linux", version: 5.13.0-39-generic, arch: amd64

At the start of the build.sc script we have the required imports:

import mill._
import mill.api.Loose
import mill.define.{Target, Task}
import scalalib._
import coursier.core.Resolution
import java.io.File

We are only using Mill's bundled libraries, so there is no need to use Ammonite's $ivy magic import command. Note that the Coursier API is also available to us if required.

The project consists of the following three modules:

  • javafx: an example of compiling and running a Java-based OpenJFX application using automated dependency management;
  • managed: an example of compiling and running a Scala 2/3-based OpenJFX application using automated dependency management;
  • unmanaged: an example of compiling and running a Scala 2/3-based OpenJFX application using manual dependency management.

In Mill, each module represents a compilation unit that is defined by a core Mill object. A compilation unit is defined by extending one or more Mill objects in the build script. The name of the script's object is the name of both the compilation unit and the directory of its sources. Each object assumes a common layout and provides a set of parameters and commands that can be accessed via the mill command line. Several Mill modules are provided out of the box to support common project types and configurations. These include:

  • JavaModule: standard Java projects;
  • ScalaModule: standard Scala projects;
  • CrossScalaModule: Scala projects that target several binary incompatible Scala versions;
  • ScalaJSModule: Scala.js projects that allow you to develop front-end Web applications using Scala;
  • ScalaNativeModule: Scala Native projects that target native applications ("compiled ahead-of-time via LLVM");
  • SbtModule: supports Scala projects using the SBT layout;
  • CrossSbtModule: same as the CrossScalaModule but uses the SBT layout;
  • PublishModule: a module that can be used as a "mixin" to provide commands for publishing the module as a Maven artifact;
  • Custom Module: allows us, for example, to organize modules in a hierarchical fashion (see the sketch after this list);
  • ExternalModule: modules which are shared between several builds.
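
To illustrate the custom hierarchical organization mentioned in the list above, here is a small sketch that is not part of the example project; the nested objects would compile sources under lib/core/src and lib/cli/src respectively:

import mill._, scalalib._

object lib extends Module {
  object core extends ScalaModule {
    override def scalaVersion = "3.1.1"
  }
  object cli extends ScalaModule {
    override def scalaVersion = "3.1.1"
    // cli depends on core, so core is compiled first and placed on cli's class path
    override def moduleDeps = Seq(core)
  }
}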

I urge those new to Mill to later take a look at its extensive and detailed documentation. It provides additional information and usage examples, including among others:

  • Defining and using common configurations;
  • Defining and using global configurations (for example to publish artifacts);
  • Using Scala compiler plugins;
  • Using the ScalafmtModule to automatically format your code;
  • Defining a default main class;
  • Using foreign Modules, which allow build scripts to load other Mill projects from external folders using Ammonite’s $file magic import.
  • Contributing to and using third party plugins;

All of Mill's modules share a common set of variables and methods that represent, for example, JVM compilation flags, JVM fork flags, environment variables, and Mill compilation and execution tasks. Each Mill module also has additional variables and methods specific to its functionality. To define a compilation module in the build script, we need only extend the appropriate Mill module objects and override their methods with the functionality we require. So for the managed compilation module we have:

val ScalaVersion      = "3.1.1"
val javaFXVersion     = "16"
val mUnitVersion      = "1.0.0-M3"
val controlsFXVersion = "11.1.0"


object managed extends OpenJFX with ScalaModule {
  override def scalaVersion = T{ ScalaVersion }

  override def mainClass: T[Option[String]] = Some("helloworld.HelloWorld")
  override def ivyDeps = Agg(
                              ivy"$CONTROLS",
                              ivy"$CONTROLSFX"
                             )

    object test extends Tests {
      def ivyDeps = Agg(ivyMunit)
      def testFramework = ivyMunitInterface
    }
}

It's that simple. Let's review the details. First and foremost, the managed compilation unit extends ScalaModule because we want to build a Scala application. Next, we override the scalaVersion task to indicate that we want to use Scala version 3.1.1. Mill will, via Coursier, make sure the correct Scala compiler is available. Next, we set the default main class that Mill will execute with the run utility command (described in the next section). If only one class or object in the source code has a main method, this configuration is not required because Mill will use that single main method. However, this compilation unit has two such methods, so we set the mainClass configuration task to the desired application's main class - helloworld.HelloWorld.

Usually we need to import and use libraries. Mill uses Coursier to manage these libraries. The ivyDeps task allows us to list the Maven artifacts that need to be downloaded and automatically included in both the compilation and execution class paths. These artifacts are cached using the Ivy dependency manager to avoid repeated downloads. Note that the local user's cache is accessible to any other build tool that uses Ivy or Maven. Mill also provides the runIvyDeps and compileIvyDeps configuration tasks that allow one to override and add libraries specific to either the compilation or execution phase. For example, compileIvyDeps dependencies will not appear in the transitive dependencies used to construct the run-time class path. Mill also allows you to configure the repositories used by Coursier to download the artifacts.
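
As a hedged sketch of these options (the artifacts and repository below are purely illustrative and are not used by the example project):

import mill._, scalalib._
import coursier.maven.MavenRepository

object example extends ScalaModule {
  override def scalaVersion = "3.1.1"

  // Needed to compile against, but not added to the run class path
  override def compileIvyDeps = Agg(ivy"com.google.code.findbugs:jsr305:3.0.2")

  // Needed only at run time (for example, a logging backend)
  override def runIvyDeps = Agg(ivy"ch.qos.logback:logback-classic:1.2.11")

  // An extra repository consulted by Coursier in addition to the defaults
  override def repositoriesTask = T.task {
    super.repositoriesTask() ++
      Seq(MavenRepository("https://oss.sonatype.org/content/repositories/releases"))
  }
}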

The libraries are defined using Mill's ivy string interpolator. The strings used by the managed compilation unit are defined in the OpenJFX module, which extends JavaModule. We won't go into the details just yet; suffice it to say we simply inherit and use these values as shown above. Below is an excerpt of the OpenJFX module that holds the String names of the libraries. The naming convention is similar to the standard convention used by Apache Maven, with slight modifications to support explicit versioning and selector usage. The separator : is used for Java artifacts, :: for Scala artifacts and ::: for artifacts cross-published against the full Scala version. Test libraries can be selected by adding ;classifier=tests to the end of the library name.

trait OpenJFX extends JavaModule {

  // Modules 

  val BASE_       = s"base"
  val CONTROLS_   = s"controls"
  val FXML_       = s"fxml"
  val GRAPHICS_   = s"graphics"
  val MEDIA_      = s"media"
  val SWING_      = s"swing"
  val WEB_        = s"web"
  val CONTROLSFX_ = s"controlsfx"

  // Extra modules
  // Note that the module name and the library name are not the same
  val controlsFXModule = "org.controlsfx.controls"

  // Module libraries 
  val BASE       = s"org.openjfx:javafx-$BASE_:$javaFXVersion"
  val CONTROLS   = s"org.openjfx:javafx-$CONTROLS_:$javaFXVersion"
  val FXML       = s"org.openjfx:javafx-$FXML_:$javaFXVersion"
  val GRAPHICS   = s"org.openjfx:javafx-$GRAPHICS_:$javaFXVersion"
  val MEDIA      = s"org.openjfx:javafx-$MEDIA_:$javaFXVersion"
  val SWING      = s"org.openjfx:javafx-$SWING_:$javaFXVersion"
  val WEB        = s"org.openjfx:javafx-$WEB_:$javaFXVersion"
  val CONTROLSFX = s"org.controlsfx:$CONTROLSFX_:$controlsFXVersion"

  val ivyMunit = ivy"org.scalameta::munit::$mUnitVersion"
  val ivyMunitInterface = "munit.Framework"
...
}
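
To make the separator and selector rules concrete, here are a few examples of the ivy interpolator, assuming the scalalib import already present at the top of the build script; the artifacts marked as hypothetical do not exist and only show the syntax:

// Java artifact: group:artifact:version
val javaDep  = ivy"org.openjfx:javafx-controls:16"
// Scala artifact: the binary Scala version suffix (_2.13, _3, ...) is appended for you
val scalaDep = ivy"org.scalameta::munit:1.0.0-M3"
// Cross-published against the full Scala version (hypothetical artifact)
val fullDep  = ivy"com.example:::some-compiler-plugin:0.1.0"
// Selecting the test classifier (hypothetical artifact)
val testDep  = ivy"com.example::some-lib:0.1.0;classifier=tests"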

We defined an additional compilation unit, test, within the managed module. The Tests module provides the specialized functionality required to automate the execution of unit tests. The test module inherits the configuration from its outer module. In particular, all of the Ivy artifact dependencies of the managed module are also automatically added to the test module. However, we still need to add the configuration related to the unit-testing library. The script above imports the MUnit library using the following line:

      def ivyDeps = Agg(ivyMunit)

In addition to the library, Mill requires an interface between itself and the testing framework. This is the interface that allows it to search for tests, execute them and collect the results to display to the user. The following line of the script adds the appropriate interface required by the MUnit library:

      def testFramework = ivyMunitInterface

Each test library has its own interface, so it is necessary to know the name of that interface. In the case of MUnit it is "munit.Framework". To make things easier, Mill already provides a set of predefined test-framework mixins that you need only inherit from, so the test module could also be defined as:

    object test extends Tests with TestModule.Munit {
      def ivyDeps = Agg(ivyMunit)
    }

The source code, as per the layout rules, is the following:

├── build.sc
├── javafx
│   └── src
│       ├── button
│       │   ├── ButtonApp.java
│       │   └── Main.java
│       └── helloworld
│           └── HelloWorld.java
├── managed
│   ├── src
│   │   ├── button
│   │   │   ├── ButtonApp.scala
│   │   │   └── Main.scala
│   │   └── helloworld
│   │       └── HelloWorld.scala
│   └── test
│       └── src
│           └── ExampleSpec.scala
├── mill
├── millw
├── millw.bat
├── out
│   ├── ...
│   ...
├── README.md
└── unmanaged
    ├── src
    │   ├── button
    │   │   ├── ButtonApp.scala
    │   │   └── Main.scala
    │   └── helloworld
    │       └── HelloWorld.scala
    └── test
        └── src
            └── ExampleSpec.scala

For each top-level compilation unit, we have a corresponding directory at the root of the project. The names of these directories are the same as the modules' names. The inner test modules appear under their parent modules' directories, and the structure within the inner modules is the same. You are free to create structures with as many levels as required. I should point out that the outer modules need not contain source code; you can create hierarchical structures simply as a means to organize your code.

Scala is a JVM language that provides full interoperability with Java. This means that not only do you have access to Java's ecosystem of libraries, but you can also add your own Java source code to your projects. You need only place the Java sources where you place the Scala code; the Scala compiler takes care of the rest. I usually place the Java code in separate packages, but only to ease maintenance and coding. Scala also provides collection-conversion libraries to facilitate the use of Java's collections and vice versa. Note that the older converters are deprecated, and new Scala 3 versions of the packages are available. Scala 3's introductory text has additional details on interoperability. There are some issues that you need to be aware of, but these can be dealt with easily. Here are the Scala imports for the Java collection converters:

import scala.jdk.CollectionConverters.*
import scala.jdk.StreamConverters.*
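
Here is a tiny usage sketch of these converters (the values are arbitrary):

import java.util.{ArrayList => JArrayList}
import scala.jdk.CollectionConverters.*

val javaList = new JArrayList[String]()
javaList.add("JavaFX")
javaList.add("Mill")

// Java -> Scala
val asScalaList: List[String] = javaList.asScala.toList
// Scala -> Java
val asJavaList: java.util.List[String] = List("a", "b").asJava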

One note to add on this: both Java and Scala support the use of variable-length arguments. To call Scala variable-argument methods from Java, you need to annotate the Scala code. When using Java's variable-argument methods from Scala, use Scala's _* operator.
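
The following sketch, with illustrative names, shows both directions: the @varargs annotation on the Scala side and the _* adaptor when calling a Java varargs method:

import scala.annotation.varargs

object Greetings {
  // @varargs makes this callable from Java as greetAll(String... names)
  @varargs
  def greetAll(names: String*): Unit = names.foreach(n => println(s"Hello $n"))
}

// Calling a Java varargs method (java.util.List.of) from Scala with _*
val names = Seq("Ada", "Grace")
val javaNames: java.util.List[String] = java.util.List.of(names: _*)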

Mill Commands and Utilities

Resolve

Let's execute the Mill commands for this compilation unit. We start off with the resolve utility command, which lists all the targets (tasks) that are available:

$ ./mill resolve _

The command above produces the following result:

Compiling /home/user/VSCodeProjects/javaFXMill/build.sc
[1/1] resolve 
all
clean
inspect
javafx
managed
par
path
plan
resolve
show
showNamed
shutdown
shutdown
unmanaged
version
version
visualize
visualizePlan

We obtain a list of Mill utilities that are always available, irrespective of the compilation units we define. These include, for example, version, clean, resolve and show. We also see our compilation units javafx, managed and unmanaged listed, which represent our top-level targets. In this case they are JavaFX applications, but they could, for example, be shared code or libraries. Note that the underscore used in the resolve command above is a wildcard; if we used a double underscore __, the listing would be recursive. If you have cloned the example repository, give it a try.

Each object or method we define in the script makes a number of targets or tasks available to the Mill command line. Some of these tasks are inherited from the Mill module objects that we extend as-is or override. We are also free to define and use new tasks in the build script. We can also extend several Mill modules as mixins when they provide very specific tasks, such as publishing artifacts (PublishModule) or executing unit tests (Tests). So let's take a look at what tasks are available in the managed module by executing the following command:

$ ./mill resolve managed._


And here is the output of the resolve command:

[1/1] resolve 
managed.allIvyDeps
managed.allScalacOptions
managed.allSourceFiles
managed.allSources
managed.ammoniteReplClasspath
managed.ammoniteVersion
managed.artifactId
managed.artifactName
managed.artifactScalaVersion
managed.artifactSuffix
managed.assembly
managed.compile
managed.compileClasspath
managed.compileIvyDeps
managed.console
managed.crossFullScalaVersion
managed.docJar
managed.docResources
managed.docSources
managed.finalMainClass
managed.finalMainClassOpt
managed.forkArgs
managed.forkEnv
managed.forkWorkingDir
managed.generatedSources
managed.ideaCompileOutput
managed.ideaConfigFiles
managed.ideaJavaModuleFacets
managed.ivyDeps
managed.ivyDepsTree
managed.jar
managed.javacOptions
managed.javadocOptions
managed.launcher
managed.localClasspath
managed.mainClass
managed.mandatoryIvyDeps
managed.mandatoryScalacOptions
managed.manifest
managed.platformSuffix
managed.prepareOffline
managed.prependShellScript
managed.repl
managed.resolvedAmmoniteReplIvyDeps
managed.resolvedIvyDeps
managed.resolvedRunIvyDeps
managed.resources
managed.run
managed.runBackground
managed.runClasspath
managed.runIvyDeps
managed.runLocal
managed.runMain
managed.runMainBackground
managed.runMainLocal
managed.runUseArgsFile
managed.scalaCompilerClasspath
managed.scalaDocClasspath
managed.scalaDocOptions
managed.scalaDocPluginClasspath
managed.scalaDocPluginIvyDeps
managed.scalaLibraryIvyDeps
managed.scalaOrganization
managed.scalaVersion
managed.scalacOptions
managed.scalacPluginClasspath
managed.scalacPluginIvyDeps
managed.showModuleDeps
managed.sourceJar
managed.sources
managed.test
managed.transitiveCompileIvyDeps
managed.transitiveIvyDeps
managed.transitiveLocalClasspath
managed.unmanagedClasspath
managed.upstreamAssembly
managed.upstreamAssemblyClasspath
managed.upstreamCompileOutput


Run and test

Each of the managed module's tasks listed above can be executed via the Mill command line. These tasks have dependencies among each other and form an execution graph. For example, to run the module's main class (managed.run), the target managed.compile must be up to date. This means that the managed.compile task will always be executed before the managed.run task if it is not up to date. Mill uses caching extensively to avoid repeating tasks needlessly. Let's see what happens when we issue the command to execute the managed module's main class:

$ ./mill managed.run

The output, after pressing the GUI button thrice is:

[32/45] managed.compile 
[info] compiling 3 Scala sources to /home/user/VSCodeProjects/javaFXMill/out/managed/compile.dest/classes ...
[info] done compiling
[45/45] managed.run 
Hello Managed Scala World!
Hello Managed Scala World!
Hello Managed Scala World!

and the following JavaFX (OpenJFX) GUI should appear:

GUI from managed.run

Mill, by default, executes the application by forking a new JVM. However, one can also execute the application in Mill's JVM with the following command:

$ ./mill managed.runLocal

and the results should be the same.

We can also select the particular main method to execute. To do this we need only use the runMain target. The example code has two JavaFX applications. To execute either of these applications use one of these commands:

$ ./mill -i managed.runMain helloworld.HelloWorld
$ ./mill -i managed.runMain button.Main

The first command above produces the same result as described for the run target - the default main class is helloworld.HelloWorld. The button.Main application opens this dialogue box:

Managed runMain Screenshot

To test the code, we must select the unit-test framework to use, prepare the unit tests and then execute them. We have already seen how to set up the MUnit framework. Here we show how to execute these tests via the Mill command line. We won't go into much detail here; the goal is for the example to serve as a template for your future work. The managed module has its test source code placed in its inner test module. I have provided a single test example in the following file:

managed/test/src/ExampleSpec.scala

and here is the sample test:

class ExampleSpec extends munit.FunSuite {

  test("test_ok") {
    val obtained = 42
    val expected = 42
    assertEquals(obtained, expected)
  }

  test("test_fails") {
    val obtained = 42
    val expected = 43
    assertEquals(obtained, expected)
  }
}

I have made sure one test fails so that you can see how easily you can trace a failed test back to its source code. To execute all of the tests under the managed compilation unit, execute:

$ ./mill -i managed.test

This produces the following result:

[60/70] managed.test.compile 
[info] compiling 1 Scala source to /home/user/VSCodeProjects/javaFXMill/out/managed/test/compile.dest/classes ...
[info] done compiling
[70/70] managed.test.test 
managed.ExampleSpec:
  + test_ok 0.008s
==> X managed.ExampleSpec.test_fails  0.02s munit.ComparisonFailException: /home/user/VSCodeProjects/javaFXMill/managed/test/src/ExampleSpec.scala:32
31:    val expected = 43
32:    assertEquals(obtained, expected)
33:  }
values are not the same
=> Obtained
42
=> Diff (- obtained, + expected)
-42
+43
    at munit.FunSuite.assertEquals(FunSuite.scala:11)
    at managed.ExampleSpec.$init$$$anonfun$2(ExampleSpec.scala:32)
1 targets failed
managed.test.test 1 tests failed: 
  managed.ExampleSpec.test_fails managed.ExampleSpec.test_fails

Mill searches for all tests and executes these MUnit tests. For each unit test suite that is found, MUnit will print out the test suite's name as follows:

managed.ExampleSpec:

For each test in the test suite, the test name is shown in green if it passes:

managed.ExampleSpec:
+ test_ok 0.008s

However, if it fails, the test name is shown in red together with the highlighted source code location of the failed test:

=> X managed.ExampleSpec.test_fails 0.02s munit.ComparisonFailException: /home/user/VSCodeProjects/javaFXMill/managed/test/src/ExampleSpec.scala:32
31: val expected = 43
32: assertEquals(obtained, expected)
33: }

And when possible, the values used for the test are also shown:

values are not the same
=> Obtained
42
=> Diff ( - obtained , + expected )
-42
+43

Mill, by default, executes your applications and tests by forking a new JVM. You can also execute them within the same JVM as Mill:

$ ./mill -i managed.test.testLocal

and the results should be the same.

We can use as many test suites as we wish. However, repeatedly executing all the tests may take too much time. We can use the following command to select a single test suite. This command executes all of the tests in the managed.ExampleSpec test suite:

$ ./mill -i managed.test managed.ExampleSpec.*

The results will be the same as above. It is important to add the wildcard *, otherwise no tests are executed (a ** will also work). Mill will only list the tests that were identified and executed. It will fail silently if no tests were found, for example if you use an incorrect name.

We can also select a single test:

$ ./mill -i managed.test managed.ExampleSpec.test_ok

or multiple tests using partial matching. The commands below are equivalent:

$ ./mill -i managed.test "managed.ExampleSpec.test_*"
$ ./mill -i managed.test managed.ExampleSpec.test_*

I have yet to find a way to execute named tests separately. For example, the following line will incorrectly execute all the tests when none should be executed:

$ ./mill -i "managed.test managed.ExampleSpec.test_okX" + "managed.test managed.ExampleSpec.test_fail"

In fact, all tests are executed if we put quotes around the test target (even if we indicate a non-existent test):

$ ./mill -i "managed.test managed.ExampleSpec.test_fail"

which seems like an error.

Show and inspect

Mill's run command will automatically look for a class or object with a main method and execute it, passing it any arguments you place on the command line. If you analyse the example code, you will see that the module has two main classes:

  • button.Main
  • helloworld.HelloWorld

Under these circumstances, if the ScalaModule's mainClass member is not overridden, then the following command:

$ ./mill managed.run

will result in an error such as this:

Compiling /home/user/VSCodeProjects/javaFXMill/build.sc
[34/45] managed.finalMainClass 
1 targets failed
managed.finalMainClass Multiple main classes found (button.Main,helloworld.HelloWorld) please explicitly specify which one to use by overriding mainClass

I have been using the words "target" and "task" interchangeably, but not all tasks are targets. The official documentation contains details on the types of tasks and their various properties. Task types include targets, sources and commands. These properties indicate, for example, whether a task is runnable from the command line, whether its results are cached, and whether it can take arguments.

Metadata is associated with each task. This includes configuration data that is either set by us in the build script, set by defaults, set by the Mill modules, or generated during task execution (for example, for caching). This metadata is stored in JSON format and placed in the out directory at the project's root. The structure of the modules within this output directory is the same as that defined in the script file. So, after executing the managed commands above, we can list the contents of this directory:

$ ls -1 ./out/

and we can see the metadata of the utility commands and the managed module:

clean.json
inspect.json
inspect.log
managed
mill
mill-profile.json
mill-worker-laf+3l
show.json
show.log

And if we look deeper into the managed module:

$ ls -1 ./out/managed


we will see additional metadata related to each of the module's tasks:

allScalacOptions.json
allSourceFiles.json
allSources.json
compileClasspath.json
compile.dest
compileIvyDeps.json
compile.json
compile.log
enablePluginScalacOptions.json
finalMainClassOpt.json
forkArgs.json
forkEnv.json
forkEnv.overridden
forkWorkingDir.json
generatedSources.json
ivyDeps.json
javacOptions.json
localClasspath.json
mainClass.json
mandatoryIvyDeps.json
mandatoryIvyDeps.overridden
mandatoryScalacOptions.json
platformSuffix.json
resolvedIvyDeps.json
resolvedRunIvyDeps.json
resources.json
runClasspath.json
runIvyDeps.json
run.json
run.log
runUseArgsFile.json
scalaCompilerClasspath.json
scalacOptions.json
scalacPluginClasspath.json
scalacPluginIvyDeps.json
scalaLibraryIvyDeps.json
scalaOrganization.json
scalaVersion.json
sources.json
transitiveCompileIvyDeps.json
transitiveIvyDeps.json
transitiveLocalClasspath.json
unmanagedClasspath.json
upstreamAssemblyClasspath.json
upstreamCompileOutput.json


We could look into those metadata files to learn more about the compiled build script and debug it. However, Mill provides an easier way to query your build scripts. In many cases we want to print out the value of a configuration task. To do this, we use the show utility command. For example, we can query the managed compilation module to find out what the main class is:

$ ./mill show managed.mainClass

Output:

[
  "helloworld.HelloWorld"
]

or determine what the Scala version was set to:

$ ./mill show managed.scalaVersion

Output:

[1/1] show 
"3.1.1"

The show command is very useful, for example, when debugging your class path. In particular, it allows us to see what Mill is doing "under the hood". Some tasks related to class paths include:

$ ./mill resolve managed._ | grep -i path

Output:

[1/1] resolve 
managed.ammoniteReplClasspath
managed.compileClasspath
managed.localClasspath
managed.runClasspath
managed.scalaCompilerClasspath
managed.scalaDocClasspath
managed.scalaDocPluginClasspath
managed.scalacPluginClasspath
managed.transitiveLocalClasspath
managed.unmanagedClasspath
managed.upstreamAssemblyClasspath

For example, we can see what the compile-time class path is, by executing the following command:

$ ./mill show managed.compileClasspath


Output:

[1/1] show 
[1/1] show > [15/15] managed.compileClasspath 
[
  "ref:c984eca8:/home/user/VSCodeProjects/javaFXMill/managed/resources",
  "qref:0f4aa102:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16.jar",
  "qref:347bea21:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/controlsfx/controlsfx/11.1.0/controlsfx-11.1.0.jar",
  "qref:81c212a8:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala3-library_3/3.1.1/scala3-library_3-3.1.1.jar",
  "qref:d8c3eec4:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16-linux.jar",
  "qref:f52f10d0:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16.jar",
  "qref:4df2d3aa:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.6/scala-library-2.13.6.jar",
  "qref:a80bfcce:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16-linux.jar",
  "qref:8f336a78:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16.jar",
  "qref:24e66df9:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16-linux.jar"
]


Note that the managed module uses Mill's managed class-path functionality, which means all Maven dependencies are automatically pulled in. In addition, Mill identifies the current operating system and only downloads the required native libraries. The list above shows several libraries that we did not explicitly include in the ivyDeps configuration task, including the Scala core libraries. If you execute ./mill show managed.runClasspath, you will also see that it is the same as the compileClasspath configuration. As I have pointed out, you can use the runIvyDeps and compileIvyDeps configuration tasks to set up different libraries for the compilation and runtime phases.

To explore the transitive libraries set up by Mill, the following tasks can be used:

$ ./mill resolve managed._ | grep -i transitive

Output:

[1/1] resolve 
managed.transitiveCompileIvyDeps
managed.transitiveIvyDeps
managed.transitiveLocalClasspath

In particular, transitiveCompileIvyDeps is of interest because it indicates which libraries are managed. It also provides additional information, such as the publication classifiers used and the cross-compilation or operating-system platform tags that are set. Later we will see how to manually set up libraries that are not managed by Mill.

The test compilation unit we define within the managed compilation unit inherits the setup from its outer object. This means that the dependencies declared in the managed module will also be used in the inner module. The test module uses the MUnit framework to execute the unit tests, so this library is only required by the test compilation unit. To confirm this, execute the following command to list its managed libraries:

$ ./mill show managed.test.compileClasspath


Output:

[1/1] show 
[1/1] show > [47/47] managed.test.compileClasspath 
[
  "ref:c984eca8:/home/user/VSCodeProjects/javaFXMill/managed/resources",
  "ref:cd75580a:/home/user/VSCodeProjects/javaFXMill/out/managed/compile.dest/classes",
  "ref:c984eca8:/home/user/VSCodeProjects/javaFXMill/managed/test/resources",
  "qref:45efcd0d:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scalameta/munit_3/1.0.0-M3/munit_3-1.0.0-M3.jar",
  "qref:81c212a8:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala3-library_3/3.1.1/scala3-library_3-3.1.1.jar",
  "qref:0f4aa102:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16.jar",
  "qref:347bea21:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/controlsfx/controlsfx/11.1.0/controlsfx-11.1.0.jar",
  "qref:0440c0f1:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scalameta/junit-interface/1.0.0-M3/junit-interface-1.0.0-M3.jar",
  "qref:26e95212:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/junit/junit/4.13.2/junit-4.13.2.jar",
  "qref:4df2d3aa:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.6/scala-library-2.13.6.jar",
  "qref:d8c3eec4:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16-linux.jar",
  "qref:f52f10d0:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16.jar",
  "qref:9e7bca5a:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-sbt/test-interface/1.0/test-interface-1.0.jar",
  "qref:6f3db795:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/hamcrest/hamcrest-core/1.3/hamcrest-core-1.3.jar",
  "qref:a80bfcce:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16-linux.jar",
  "qref:8f336a78:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16.jar",
  "qref:24e66df9:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16-linux.jar"
]


The above list shows the MUnit library and its dependencies. These libraries do not appear in the managed compilation unit, but are available in the test compilation unit.

The show command can also be used to print out the metadata of more than one task. For example, the following command shows the metadata of the sources and compileClasspath tasks:

$ ./mill show "managed.{sources,compileClasspath}"


Output:

[1/1] show 
[1/1] show > [3/16] managed.resources 
[
  [
    "ref:f01978bc:/home/user/VSCodeProjects/javaFXMill/managed/src"
  ],
  [
    "ref:c984eca8:/home/user/VSCodeProjects/javaFXMill/managed/resources",
    "qref:0f4aa102:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16.jar",
    "qref:347bea21:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/controlsfx/controlsfx/11.1.0/controlsfx-11.1.0.jar",
    "qref:81c212a8:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala3-library_3/3.1.1/scala3-library_3-3.1.1.jar",
    "qref:d8c3eec4:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16-linux.jar",
    "qref:f52f10d0:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16.jar",
    "qref:4df2d3aa:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.6/scala-library-2.13.6.jar",
    "qref:a80bfcce:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16-linux.jar",
    "qref:8f336a78:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16.jar",
    "qref:24e66df9:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16-linux.jar"
  ]
]


The showNamed utility command is the same as the show command, but each of the output elements is now indexed with the task name. This comes in handy when you view more than one task at the same time, as shown below:

$ ./mill showNamed "managed.{sources,compileClasspath}"


Output:

[1/1] showNamed 
[1/1] showNamed > [3/16] managed.resources 
{
  "managed.sources": [
    "ref:f01978bc:/home/user/VSCodeProjects/javaFXMill/managed/src"
  ],
  "managed.compileClasspath": [
    "ref:c984eca8:/home/user/VSCodeProjects/javaFXMill/managed/resources",
    "qref:0f4aa102:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16.jar",
    "qref:347bea21:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/controlsfx/controlsfx/11.1.0/controlsfx-11.1.0.jar",
    "qref:81c212a8:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala3-library_3/3.1.1/scala3-library_3-3.1.1.jar",
    "qref:d8c3eec4:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16-linux.jar",
    "qref:f52f10d0:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16.jar",
    "qref:4df2d3aa:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.6/scala-library-2.13.6.jar",
    "qref:a80bfcce:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16-linux.jar",
    "qref:8f336a78:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16.jar",
    "qref:24e66df9:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16-linux.jar"
  ]
}

The inspect command is a more verbose version of the show command. It is used in the same way as show and also allows the use of wildcards. Besides the information that show provides, inspect also includes a description of each task and a list of its inputs. This provides a technique for fine-grained exploration and debugging of your Mill scripts. Here is an example for the run target:

$ ./mill inspect managed.run

Output:

[1/1] inspect 
managed.run(JavaModule.scala:611)
    Runs this module's code in a subprocess and waits for it to finish

Inputs:
    managed.finalMainClass
    managed.runClasspath
    managed.forkArgs
    managed.forkEnv
    managed.forkWorkingDir
    managed.runUseArgsFile

Note that, surprisingly, we do not see the mainClass target we set in the managed module. But if we probe finalMainClass further we get:

$ ./mill inspect managed.finalMainClass

Output:

[1/1] inspect 
managed.finalMainClass(JavaModule.scala:74)

Inputs:
    managed.finalMainClassOpt

And if we look at the Mill source code we find the following code snippet:

  /**
   * Allows you to specify an explicit main class to use for the `run` command.
   * If none is specified, the classpath is searched for an appropriate main
   * class to use if one exists
   */
  def mainClass: T[Option[String]] = None

  def finalMainClassOpt: T[Either[String, String]] = T {
    mainClass() match {
      case Some(m) => Right(m)
      case None =>
        zincWorker.worker().discoverMainClasses(compile()) match {
          case Seq() => Left("No main class specified or found")
          case Seq(main) => Right(main)
          case mains =>
            Left(
              s"Multiple main classes found (${mains.mkString(",")}) " +
                "please explicitly specify which one to use by overriding mainClass"
            )
        }
    }
  }

  def finalMainClass: T[String] = T {
    finalMainClassOpt() match {
      case Right(main) => Result.Success(main)
      case Left(msg) => Result.Failure(msg)
    }
  }

Sure enough, the target uses the mainClass task that we override in our build script. Personally, I prefer to use the IntelliJ IDE to debug some of my more complex Mill scripts. This IDE supports the parsing and analysis of Ammonite scripts, so with a simple Ctrl + left mouse click we can explore our script efficiently.

Analysing the dependency graph

I have already shown Mill commands that you can use to find dependencies between tasks. Recall that Mill tasks form a directed acyclic graph (DAG) that determines which tasks must be executed and the order in which they are executed. These dependencies are set internally by Mill, by explicit calls between tasks that we define ourselves, or by overriding moduleDeps, which sets dependencies between whole compilation units.
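
For reference, here is a hedged sketch of both mechanisms, using two hypothetical modules: a moduleDeps dependency between compilation units, and a custom cached target whose dependencies are created simply by calling other tasks inside its body:

import mill._, scalalib._

object core extends ScalaModule {
  override def scalaVersion = "3.1.1"
}

object app extends ScalaModule {
  override def scalaVersion = "3.1.1"
  // app depends on core, so core.compile always runs before app.compile
  override def moduleDeps = Seq(core)

  // A custom target: calling allSourceFiles() makes lineCount depend on it,
  // and the result is cached until the sources change
  def lineCount: T[Int] = T {
    allSourceFiles().map(ref => os.read.lines(ref.path).size).sum
  }
}

Running ./mill show app.lineCount would then print the cached value and recompute it only when the sources change.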

Mill provides the path task that allows us to explore and check the dependencies between two tasks. The first argument is the final target of the path. The official documentation states that if more than one path exists between two targets, one will be selected arbitrarily. I am assuming that this path is selected deterministically and therefore represents the true execution path of Mill. If we execute the following command:

$ ./mill path managed.assembly managed.sources

we get:

Compiling /home/user/VSCodeProjects/javaFXMill/build.sc
[1/1] path 
managed.sources
managed.allSources
managed.allSourceFiles
managed.compile
managed.localClasspath
managed.assembly

You can now use the show and inspect commands to obtain more information on each of the targets listed above. For example, to identify all the source files used to generate the compilation unit's Jar library we can use:

$ ./mill inspect managed.allSourceFiles

to check if that target will provide the intended information. From this output, it seems so:

[1/1] inspect 
managed.allSourceFiles(ScalaModule.scala:51)
    All individual source files fed into the Zinc compiler.

    All individual source files fed into the Java compiler

Inputs:
    managed.allSources

To get a list of the actual source files, we use:

$ ./mill show managed.allSourceFiles

and get:

[1/1] show 
[1/1] show > [4/4] managed.allSourceFiles 
[
  "ref:6db4a3f8:/home/user/VSCodeProjects/javaFXMill/managed/src/helloworld/HelloWorld.scala",
  "ref:2636cdfc:/home/user/VSCodeProjects/javaFXMill/managed/src/button/Main.scala",
  "ref:9e0369bc:/home/user/VSCodeProjects/javaFXMill/managed/src/button/ButtonApp.scala"
]

and indeed those are all the sources we have in our managed module, which has no dependencies on any other module. It is important to point out that the path command does not show all the targets that will be executed. For example, the managed.compile target, which compiles all the code, has additional dependencies. If we use the command:

$ ./mill inspect managed.compile

we see in the following output that the managed.allSourceFiles target is but one of the inputs of the managed.compile target:

[1/1] inspect 
managed.compile(ScalaModule.scala:195)
    Compiles the current module to generate compiled classfiles/bytecode.
    
    When you override this, you probably also want to override [[bspCompileClassesPath]].

Inputs:
    managed.scalaVersion
    managed.upstreamCompileOutput
    managed.allSourceFiles
    managed.compileClasspath
    managed.javacOptions
    managed.scalaOrganization
    managed.allScalacOptions
    managed.scalaCompilerClasspath
    managed.scalacPluginClasspath

Of particular interest is the managed.upstreamAssemblyClasspath target, which can show us the dependencies used to compile and run the module. The following command can be used to explore those dependencies:

$ ./mill inspect managed.upstreamAssemblyClasspath

which allows us to further identify and explore targets of interest, as shown in the command's output below:

[1/1] inspect 
managed.upstreamAssemblyClasspath(JavaModule.scala:356)
    All upstream classfiles and resources necessary to build and executable
    assembly, but without this module's contribution

Inputs:
    managed.transitiveLocalClasspath
    managed.unmanagedClasspath
    managed.resolvedRunIvyDeps

For example, we can determine all the third-party libraries that are required to compile and run the managed module using the following command:

$ ./mill show managed.resolvedRunIvyDeps

And those libraries are listed in the output below:

[1/1] show 
[1/1] show > [10/10] managed.resolvedRunIvyDeps 
[
  "qref:0f4aa102:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16.jar",
  "qref:347bea21:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/controlsfx/controlsfx/11.1.0/controlsfx-11.1.0.jar",
  "qref:5276bca2:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala3-library_3/3.1.2/scala3-library_3-3.1.2.jar",
  "qref:d8c3eec4:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16-linux.jar",
  "qref:f52f10d0:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16.jar",
  "qref:815c539d:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.8/scala-library-2.13.8.jar",
  "qref:a80bfcce:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16-linux.jar",
  "qref:8f336a78:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16.jar",
  "qref:24e66df9:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16-linux.jar"
]

I suggest that when you define your own tasks, you use Ammonite's usage doc annotations to provide useful information that will be shown by the inspect command. This will allow users of your build script to explore your targets and tasks as if they were standard Mill tasks.

Usually we don't know, or cannot recall, the names of the targets that are dependencies, nor the order in which they should be given. For example, if we use the incorrect order, Mill will complain:

[1/1] path 
1 targets failed
path No path found between managed.sources and managed.assembly

The plan utility command is useful in that we need only provide the final target; it will determine all the targets and tasks that need to be executed. These are resolved as a dry run and shown in the order in which Mill would have executed them. The command below can be used to show the (often long) list of tasks that must be executed to produce the desired output of the target:

$ ./mill plan managed.assembly

Output:

[1/1] plan 
managed.resources
managed.scalaVersion
mill.scalalib.ZincWorkerModule.classpath
mill.scalalib.ZincWorkerModule.worker
managed.upstreamCompileOutput
managed.sources
managed.generatedSources
managed.allSources
managed.allSourceFiles
managed.transitiveLocalClasspath
managed.unmanagedClasspath
managed.platformSuffix
managed.compileIvyDeps
managed.transitiveCompileIvyDeps
managed.ivyDeps
managed.mandatoryIvyDeps.overridden.mill.scalalib.JavaModule.mandatoryIvyDeps
managed.scalaOrganization
managed.scalaLibraryIvyDeps
managed.mandatoryIvyDeps
managed.transitiveIvyDeps
managed.resolvedIvyDeps
managed.compileClasspath
managed.javacOptions
managed.mandatoryScalacOptions
managed.scalacPluginIvyDeps
managed.enablePluginScalacOptions
managed.scalacOptions
managed.allScalacOptions
managed.scalaCompilerClasspath
managed.scalacPluginClasspath
managed.compile
managed.localClasspath
managed.mainClass
managed.finalMainClassOpt
managed.manifest.overridden.mill.scalalib.JavaModule.manifest
managed.manifest
managed.runIvyDeps
managed.resolvedRunIvyDeps
managed.upstreamAssemblyClasspath
managed.runClasspath
managed.forkArgs
managed.prependShellScript
managed.upstreamAssembly
managed.assembly


Note that just because the tasks are listed for the dry run does not mean they will actually be executed. If cached results can be reused, they will serve as input to their dependent tasks. It is also important to point out that not all Mill task types use caching; those tasks will always be executed.

It would also be interesting to see the unordered set of tasks in a tree-like structure. Mill does not seem to have this, but it does have a visualize utility command. This command determines and plots the dependency DAG of a target. It seems to require GraphViz to generate the plots, so before proceeding, install it. I suspect GraphViz is used to produce the png and svg plots from the dot source. For my Linux distribution I used these commands:

$ sudo apt-get install -y graphviz
$ sudo apt-get install -y graphviz-dev

The following instruction:

$ ./mill show visualize managed._

shows what files are generated by the visualize utility command, and it also generates that output:

[1/1] show 
[1/1] show > [3/3] visualize 
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
[
  "ref:1c1b3ff8:/home/user/VSCodeProjects/javaFXMill/out/visualize.dest/out.txt",
  "ref:520f50af:/home/user/VSCodeProjects/javaFXMill/out/visualize.dest/out.dot",
  "ref:1cb6580f:/home/user/VSCodeProjects/javaFXMill/out/visualize.dest/out.json",
  "ref:bef28556:/home/user/VSCodeProjects/javaFXMill/out/visualize.dest/out.png",
  "ref:31845709:/home/user/VSCodeProjects/javaFXMill/out/visualize.dest/out.svg"
]

The visualize command generates both the source description of the DAG (txt, dot and json files) and the graphic output of the DAG (png and svg files). You can, for example, use the following command at the Linux prompt to view the DAG:

$ shotwell /home/user/VSCodeProjects/javaFXMill/out/visualize.dest/out.png

The svg output is shown below:

./mill show visualize managed._

There is also a similar visualizePlan utility command. Unlike visualize, it shows all targets, irrespective of whether or not they are resolved by the current target query. These non-resolved tasks, which are not generated by visualize, are shown with dotted borders. The command below:

$ ./mill show visualizePlan managed._

generates a similar output:

[1/1] show 
[1/1] show > [3/3] visualizePlan 
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
[
  "ref:ea2a153b:/home/user/VSCodeProjects/javaFXMill/out/visualizePlan.dest/out.txt",
  "ref:0d064b81:/home/user/VSCodeProjects/javaFXMill/out/visualizePlan.dest/out.dot",
  "ref:280e3f4d:/home/user/VSCodeProjects/javaFXMill/out/visualizePlan.dest/out.json",
  "ref:c47c7d39:/home/user/VSCodeProjects/javaFXMill/out/visualizePlan.dest/out.png",
  "ref:f526ae0b:/home/user/VSCodeProjects/javaFXMill/out/visualizePlan.dest/out.svg"
]

The DAG, shown below, is also very similar, but if you look closely the "dotted" tasks do not appear in the previous DAG:

./mill show visualizePlan managed._

I wanted to give you a sense of the complexity these DAGs can have. This is a very small project, but the DAG is already substantial. For medium-sized projects with dependencies between modules (see the moduleDeps configuration task), the DAGs become very large, so be selective in choosing your query targets.

You may also find it easier to view the SVG files, which allow zooming and panning without loss of image quality, unlike raster images such as PNG. Any web browser should allow you to view these files. In addition, I also use InkScape to view the files and possibly alter them for publishing. On my Linux distribution I used the following commands to install InkScape:

$ sudo add-apt-repository ppa:inkscape.dev/stable
$ sudo apt update
$ sudo apt install inkscape

And to view the file 4, I use the following command:

$ inkscape /home/user/VSCodeProjects/javaFXMill/out/visualizePlan.dest/out.svg &

At this point you should be able to:

  • Create your own Scala project
  • Setup a Mill project that uses Maven artifacts
  • Compile and run your code
  • Execute unit tests

In addition to what I have described, two additional commands (managed.assembly and managed.publish) may be of interest for deploying your code. We won't describe these here; the official documentation provides the necessary details. I will just add that using the publish target and making your artifacts available via Maven allows a user to easily install your application with Coursier.

Using your IDE

One way to develop your system is to alter your code and then execute the Mill targets to compile the source code, run the main method or execute the unit tests. However, it is tedious and cumbersome to switch between your editor and the command line. Mill provides us with the --watch flag that can be used, for example, as follows:

$ ./mill -i --watch managed.compile
$ ./mill -i --watch managed.runMain helloworld.HelloWorld
$ ./mill -i --watch managed.run
$ ./mill -i --watch managed.test

This flag ensures that Mill continuously scans the target's sources and if any change is detected, the command is automatically executed again. This means that we can edit, compile and execute our code without leaving the editor. It also has the added advantage of avoiding the wasted time of launching a new JVM for Mill.

A note on the -i flag: without it, a Mill server is launched in the background, which keeps a JVM warm so that execution is faster. However, the Mill server does not seem to work perfectly on Windows. It is therefore advisable, at least on Windows, to use the interactive flag so that no background session is started.

For those of us who use IDEs, we can still use Mill in a terminal together with the -i and --watch flags. However, native Mill support in an IDE allows us to take full advantage of all its functionality. IntelliJ is one of my favourite IDEs. It currently does not support Mill build files, but Mill provides the following command to generate native IntelliJ projects:

$ mill mill.scalalib.GenIdea/idea

Once executed, you can load the project. I have found that this works well with Scala 2 projects. For Scala 3 this is not the case (circa April 2022), although with some tweaking you may get it to work. I find that Scala 3's "braceless" syntax is not recognized, which makes coding difficult. However, native IntelliJ projects created from the IDE itself do work correctly for Scala 3.

Note that every time you add or remove dependencies or change your Mill script, you need to rerun the command above. You can then edit and execute your code and even run your unit tests within the IDE. IntelliJ also has the advantage of allowing you to code and analyze Ammonite scripts, which is a boon when you are debugging more complex scripts. It also allows you to explore the Mill project's scripts with a simple ctrl+left click on a script's methods and values. Remember that you will need to reload the Mill scripts after every change, as I have already explained. The Mill targets and your custom tasks will not be accessible through the IDE, but you are free to use Mill simultaneously in another terminal.

Note that IntelliJ also supports Bloop. However, Mill is not automatically supported; integration via Bloop is made available through a Mill Bloop plugin. As with the previous GenIdea target, you must generate the Bloop configuration files with the following command (more information):

./mill --import ivy:com.lihaoyi::mill-contrib-bloop:  mill.contrib.bloop.Bloop/install

This seems to work well. I am able to compile, execute and debug my Scala 3 code with no problems, and Scala 3's "braceless" syntax is correctly recognized. Note that after every change to the Mill script, you must manually repeat the above command to export the Bloop build and reload the project in the IntelliJ IDE. Once again, you do not have direct access to Mill's targets and commands via the IDE.

I also use VSCode. Scala 2 and 3 are supported by the Metals VSCode plugin, and integration is done via Bloop. Unlike IntelliJ, there is no need to install Bloop: this is done automatically as soon as the project is opened and the Mill build script is detected. A Bloop server will be launched and the Bloop project files generated and used. With this setup you have a fully functioning IDE that can compile and execute your code and unit tests. As with IntelliJ, you don't have direct access to the script's tasks, but you can use Mill simultaneously in a separate terminal.

With the suggestions I provide here, you can now productively work on your Scala 2 and 3 projects. Note that these are only suggestions; you are free to use other editors (such as Sublime, Vim, Emacs, Eclipse and some online IDEs), many of which are also supported by Metals, albeit with varying degrees of maturity. As a final comment, I have found that IntelliJ performs better in two areas: debugging and working with Java code. The latter is important for mixed-language projects and for projects that require debugging or exploring Java libraries. In contrast, I find VSCode's support for editing and proofreading Markdown files better.

Searching for library updates

We finish off with the showUpdates utility command, which may be useful for larger and more mature projects. It looks for dependencies that have newer versions available and lists them. Below is the command:

$ ./mill mill.scalalib.Dependency/showUpdates

and for the example project the output was:

[2/2] mill.scalalib.Dependency.showUpdates 
Found 2 dependency update for javafx
  org.openjfx:javafx-controls : 16 -> 17 -> 17.0.0.1 -> 17.0.1 -> 17.0.2 -> 18
  org.controlsfx:controlsfx : 11.1.0 -> 11.1.1
Found 2 dependency update for managed
  org.openjfx:javafx-controls : 16 -> 17 -> 17.0.0.1 -> 17.0.1 -> 17.0.2 -> 18
  org.controlsfx:controlsfx : 11.1.0 -> 11.1.1
No dependency updates found for managed.test
No dependency updates found for unmanaged
No dependency updates found for unmanaged.test

I know that the Scala Steward project can also be used to check for updates and generate pull requests automatically. I have not experimented with this, but it works on Mill projects too. It can also be used to update the Mill version itself (in the .mill-version file). This may be something of interest to explore and report on in the future.

Basic Java build

I have set up an equivalent Java compilation unit named javafx. Take a look at the source code, and you will see the same directory structure and the same application code. The differences are that we use only Java source code and the unit tests are not included, so no test directory is provided. So what does a Java module look like in the build script? Here is the Mill module in that script:

object javafx extends OpenJFX {
  override def mainClass: T[Option[String]] = Some("helloworld.HelloWorld")

  override def ivyDeps = Agg(
                              ivy"$CONTROLS",
                              ivy"$CONTROLSFX"
                             )

}

That is it. You will notice that the main differences between this module and the managed one are:

  • The use of the OpenJFX module which is in essence a JavaModule (more details on this later);
  • A missing override def scalaVersion = T{ ScalaVersion } method

Note that Java projects extend JavaModule and Scala projects extend ScalaModule, but Mill's ScalaModule also inherits from JavaModule. So all the configuration tasks that are available to JavaModule are also available to ScalaModule; more concretely, Scala projects may also override JVM-related parameters, as sketched after the two examples below. The inverse, however, is not true. The following is a minimal Java project:

object foo extends JavaModule {
  override def javacOptions = T{ Seq("-source", "11", "-target", "11", "-Xlint") }
  override def forkArgs = Seq("-Xmx4g")
  override def forkEnv = Map("HELLO_MY_ENV_VAR" -> "WORLD")
}

And this is a minimal example of a Scala project:

object foo extends ScalaModule {
  override def scalaVersion = "2.13.8"
  override def scalacOptions = Seq("-Ydelambdafy:inline")
  override def ammoniteVersion = "2.4.0"
}
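
As a small, hedged illustration of this inheritance (the bar module and the option values are mine, not part of the example project), a Scala module may also override the JVM-related tasks it inherits from JavaModule:

object bar extends ScalaModule {
  override def scalaVersion = "2.13.8"
  // Inherited from JavaModule: Java compiler options, JVM fork arguments and environment
  override def javacOptions = T{ Seq("-source", "11", "-target", "11") }
  override def forkArgs = Seq("-Xmx2g")
  override def forkEnv = Map("HELLO_MY_ENV_VAR" -> "WORLD")
}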

As was already pointed out, one can inherit from (mix in) several Mill modules to provide additional functionality (for example the PublishModule or any of the other third-party contributions that are already available). In the example project I opted to define an OpenJFX module that inherits from JavaModule, because it allows me to override the forkArgs task. In this way the forkArgs configuration task can be made available to both Java and Scala projects simply by extending the OpenJFX module. In the case of Java projects, no other modules are required.

Defining and using JavaFX Dependencies

Both the managed and javafx modules that were described above inherit and use the OpenJFX module. The OpenJFX module is a trait that inherits from JavaModule and allows me to set up a common configuration that can be quickly and easily used to set up either a Java or Scala project. Here is the definition of the Module:

trait OpenJFX extends JavaModule {

  // Modules 

  val BASE_       = s"base"
  val CONTROLS_   = s"controls"
  val FXML_       = s"fxml"
  val GRAPHICS_   = s"graphics"
  val MEDIA_      = s"media"
  val SWING_      = s"swing"
  val WEB_        = s"web"
  val CONTROLSFX_ = s"controlsfx"

  // Extra modules
  // Note that the module name and the library name are not the same
  val controlsFXModule = "org.controlsfx.controls"

  // Module libraries 
  val BASE       = s"org.openjfx:javafx-$BASE_:$javaFXVersion"
  val CONTROLS   = s"org.openjfx:javafx-$CONTROLS_:$javaFXVersion"
  val FXML       = s"org.openjfx:javafx-$FXML_:$javaFXVersion"
  val GRAPHICS   = s"org.openjfx:javafx-$GRAPHICS_:$javaFXVersion"
  val MEDIA      = s"org.openjfx:javafx-$MEDIA_:$javaFXVersion"
  val SWING      = s"org.openjfx:javafx-$SWING_:$javaFXVersion"
  val WEB        = s"org.openjfx:javafx-$WEB_:$javaFXVersion"
  val CONTROLSFX = s"org.controlsfx:$CONTROLSFX_:$controlsFXVersion"

  // OpenFX/JavaFX libraries
  val javaFXModuleNames = Seq(BASE_, CONTROLS_, FXML_, GRAPHICS_, MEDIA_, SWING_, WEB_)

  val ivyMunit = ivy"org.scalameta::munit::$mUnitVersion"
  val ivyMunitInterface = "munit.Framework"

  val pathSeparator = File.pathSeparator

  override def forkArgs: Target[Seq[String]] = T {
    // get the managed libraries
    val allLibs: Loose.Agg[PathRef] = runClasspath()
    // get the OpenJFX and related managed libraries
    val s: Loose.Agg[String] = allLibs.map(_.path.toString())
                                      .filter{
                                         s =>
                                           val t= s.toLowerCase()
                                           t.contains("javafx") || t.contains("controlsfx")
                                        }

    // Create the JavaFX module names (convention is amenable to automation)
    import scala.util.matching.Regex

    // First get the javaFX only libraries
    val javaFXLibs = raw".*javafx-(.+?)-.*".r
    val javaFXModules = s.iterator.map(m => javaFXLibs.findFirstMatchIn(m).map(_.group(1)) )
                      .toSet
                      .filter(_.isDefined)
                      .map(_.get)
    // Now generate the module names
    val modulesNames = javaFXModules.map( m => s"javafx.$m") ++
                          Seq(controlsFXModule) // no standard convention, so add it manually

    // Add to the modules list
    Seq(
        "--module-path", s.iterator.mkString( pathSeparator ), 
        "--add-modules", modulesNames.iterator.mkString(","),
        "--add-exports=javafx.controls/com.sun.javafx.scene.control.behavior=org.controlsfx.controls",
        "--add-exports=javafx.controls/com.sun.javafx.scene.control.inputmap=org.controlsfx.controls",
        "--add-exports=javafx.graphics/com.sun.javafx.scene.traversal=org.controlsfx.controls"
    ) ++
      // add standard parameters
      Seq("-Dprism.verbose = true", "-ea")
  }

}

The module above consists of two parts. The first is the set of artifact names of the JavaFX (or OpenJFX) and related libraries. Anyone who needs these libraries can extend this module, reference the required names and use them in the ivyDeps definition. Note that we need only add the main library; Mill will determine what other dependencies are required and download them accordingly. Here is an example we have already seen from the managed and javafx modules:

  override def ivyDeps = Agg(
                              ivy"$CONTROLS",
                              ivy"$CONTROLSFX"
                             )

The second part overrides the forkArgs target task, which allows us to set up the JVM's command line arguments. We need to add the JPMS-specific arguments so that the JVM can find and load the necessary modules. We may also need to tweak these parameters to open or export some packages within these modules. To automatically identify and add the modules, the script above first uses runClasspath() to extract the existing class path, which is determined by ivyDeps (we cannot use ivyDeps directly because it does not contain all the resolved dependencies). It then filters this class path, which contains the full paths to all the necessary artifacts, to get the corresponding artifact names (variable s). In the example above it looks for artifacts that contain the "javafx" and "controlsfx" substrings. From these names it then extracts the OpenJFX module names using a regular expression (the javaFXModules variable) and adds any other module names that are also required (the modulesNames variable).
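
To make the extraction step concrete, here is a small, hedged illustration that can be run in an Ammonite or Scala REPL session; the sample path is one of the Coursier cache entries shown in the listings further below:

// Same pattern as used in forkArgs above
val javaFXLibs = raw".*javafx-(.+?)-.*".r

// One of the artifact paths from the class path listings below
val sample = "/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16-linux.jar"

// Extract the module suffix and build the JPMS module name
val suffix = javaFXLibs.findFirstMatchIn(sample).map(_.group(1)) // Some("controls")
val module = suffix.map(m => s"javafx.$m")                       // Some("javafx.controls")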

It is important to point out that the module naming conventions used by various library authors are not the same. This means that, in some cases, the module name cannot be inferred and extracted from the artifact's name. An example is the ControlsFX library used in the script above. In this case, as shown below, the module name must be defined (first line) and added explicitly (last line):

  val controlsFXModule = "org.controlsfx.controls"
  val modulesNames = javaFXModules.map( m => s"javafx.$m") ++
                        Seq(controlsFXModule) // no standard convention, so add it manually

The next step is to add the module names (--add-modules) and the paths (--module-path) of their Jar archives, to the JVM's command line arguments. The code snippet below shows how the JVM's command line arguments are constructed:

    Seq(
        "--module-path", s.iterator.mkString( pathSeparator ), 
        "--add-modules", modulesNames.iterator.mkString(",")
    )

If you are setting up a vanilla OpenJFX project, then that should be enough. You can now import the required packages in your Java or Scala source code and execute the application. However, you may get both compile-time and run-time errors and must therefore export a module's package so that all its public types and members are available to one or more of the other packages. For example, the following line:

            "--add-exports=javafx.controls/javafx.scene.control.skin=ALL-UNNAMED",

will let any package (ALL-UNNAMED) access the javafx.scene.control.skin package. If you use ControlsFX, then you not only provide access to the public members of some of the JavaFX packages (--add-exports), but you must also allow it to access private members (usually done via reflection) using --add-opens. The following is an example of what you may need to use ControlsFX:

            "--add-exports=javafx.controls/javafx.scene.control.skin=ALL-UNNAMED",
            "--add-exports=javafx.graphics/com.sun.javafx.scene=org.controlsfx.controls",
            "--add-exports=javafx.graphics/com.sun.javafx.scene.traversal=org.controlsfx.controls",

            "--add-opens=javafx.controls/com.sun.javafx.scene.control.inputmap=org.controlsfx.controls", 
            "--add-opens=javafx.base/com.sun.javafx.runtime=org.controlsfx.controls",
            "--add-opens=javafx.base/com.sun.javafx.collections=org.controlsfx.controls",
            "--add-opens=javafx.graphics/com.sun.javafx.css=org.controlsfx.controls",
            "--add-opens=javafx.graphics/com.sun.javafx.scene=org.controlsfx.controls",
            "--add-opens=javafx.graphics/com.sun.javafx.scene.traversal=org.controlsfx.controls",
            "--add-opens=javafx.graphics/javafx.scene=org.controlsfx.controls",
            "--add-opens=javafx.controls/com.sun.javafx.scene.control=org.controlsfx.controls",
            "--add-opens=javafx.controls/com.sun.javafx.scene.control.behavior=org.controlsfx.controls",
            "--add-opens=javafx.controls/javafx.scene.control.skin=org.controlsfx.controls",
            "--add-opens=javafx.controls/com.sun.javafx.scene.control.behavior=org.controlsfx.controls"

Every Java or Scala library that is added to the project's dependencies may require additional packages to be exported or opened due to chained dependencies. For example, when using ChartFX, execution failed with the following error because it also accesses private members of a ControlsFX package:

java.lang.IllegalAccessError: superclass access check failed: class de.gsi.chart.ui.ProfilerInfoBox$CustomBreadCrumbButton (in module de.gsi.chartfx.chart) cannot access class impl.org.controlsfx.skin.BreadCrumbBarSkin$BreadCrumbButton (in module org.controlsfx.controls) because module org.controlsfx.controls does not export impl.org.controlsfx.skin to module de.gsi.chartfx.chart

In this case you can add the following JVM arguments:

            "--add-opens=org.controlsfx.controls/impl.org.controlsfx.skin=de.gsi.chartfx.chart",
            "--add-opens=javafx.graphics/javafx.scene=de.gsi.chartfx.chart",

Many library authors will provide instructions on which packages need to be exported or opened. Note that any package that is opened (private and public members) will automatically also be exported (public members only).

A small note for those of you who need to use Mill to download and use OS-native library versions, as is the case with JavaFX. To correctly download the OS-native artifacts, one must either inherit from CoursierModule or add the following code snippet to every module that uses OS-native dependencies (not required in the submodules):

  override def resolutionCustomizer: Task[Option[Resolution => Resolution]] = T.task {
    Some((_: coursier.core.Resolution).withOsInfo(coursier.core.Activation.Os.fromProperties(sys.props.toMap)))
  }

Unmanaged libraries

The use of Maven artifacts and the Ivy cache to pull in and use those required artifacts is essential for the maintenance of a Java and Scala project. But what happens when the required artifact is only available via manual download? How do we add these libraries to the Mill compilation unit? The answer is to override the unmanagedClasspath task. The following is an example adapted from the official documentation:

import mill._, scalalib._

object foo extends ScalaModule {
  def scalaVersion = "3.1.1"
  def unmanagedClasspath = T {
    if (!os.exists(millSourcePath / "lib")) Agg()
    else Agg.from(os.list(millSourcePath / "lib").map(PathRef(_)))
  }
}

In the script above, we add a lib directory to the root path of the foo module. You can download and place the required Jar archives into this directory. The next step is to configure Mill to include these files in the class path used to compile and execute the module. To do this, we need only override the unmanagedClasspath configuration task. The goal of this task is to return the list of paths to the libraries in the lib directory. Once unmanagedClasspath has been defined, it will be included together with the managed libraries. The paths of the unmanaged libraries are also automatically propagated to any dependent modules and submodules.

To construct the path, we first need to determine the foo module's root path. The module's millSourcePath variable points to the module's root (not the project's root). Note that this variable is not a (configuration) task and therefore cannot be listed via Mill's resolve utility task, nor can it be viewed using Mill's show utility task. Mill already includes the OS-Lib library, and we use it to first check whether a foo/lib directory exists. If it does not, we simply return an empty list of paths. If it does, we collect the paths of all the files contained in the foo/lib directory and return these. Note that by default the unmanagedClasspath is empty, but you can always use super.unmanagedClasspath() to access a parent's existing list of artifacts if it needs to be complemented or changed (see the sketch after the next command). Once we have overridden this task, we can use the show command, as exemplified below, to check that the path is set up correctly:

./mill show foo.unmanagedClasspath
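
And if you want to complement a parent's class path rather than replace it, a minimal sketch could look as follows (the extra-lib directory name is an assumption of mine):

  override def unmanagedClasspath = T {
    val extra = millSourcePath / "extra-lib"
    // Keep whatever the parent module already provides and append our own Jars, if any
    super.unmanagedClasspath() ++
      (if (os.exists(extra)) Agg.from(os.list(extra).map(PathRef(_))) else Agg())
  }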

We are free to place the libraries wherever we want. You could, for example, place the libraries in the project's root. In this case the following code snippet can be used to obtain the project's root path:

val baseDir = build.millSourcePath

and by substituting millSourcePath with baseDir in the code snippet above, we can place and reference a set of libraries at the project level, as shown in the sketch below.
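
A hedged sketch of this variant, assuming the Jar archives are placed in a lib directory at the project's root:

import mill._, scalalib._

object foo extends ScalaModule {
  def scalaVersion = "3.1.1"
  def unmanagedClasspath = T {
    // build.millSourcePath points to the project's root, not the module's root
    val baseDir = build.millSourcePath
    if (!os.exists(baseDir / "lib")) Agg()
    else Agg.from(os.list(baseDir / "lib").map(PathRef(_)))
  }
}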

You can always include the unmanagedClasspath task defined above in any project, irrespective of whether or not unmanaged libraries are required. Because it checks for the lib directory before attempting to load the artifacts, you need only create and populate the directory if and when it is required.

Manually downloading the libraries is not your only option. In the example project I have also included an unmanaged module that is defined as follows:

object unmanaged extends OpenJFX with ScalaModule {
  def scalaVersion = T{ ScalaVersion }

  override def mainClass: T[Option[String]] = Some("helloworld.HelloWorld")

  override def unmanagedClasspath: Target[Loose.Agg[PathRef]] = T{
    import coursier._
    import coursier.parse.DependencyParser

    val controlsFXModule = dep"org.controlsfx:controlsfx:11.1.0"

    // Generate the dependencies
    val javaFXModules = javaFXModuleNames.map(
      m => Dependency(Module(org"org.openjfx", ModuleName(s"javafx-$m")), javaFXVersion)
    ) ++
      Seq(controlsFXModule)

    // Check if the libraries exist and download if they don't
    val files = Fetch().addDependencies(javaFXModules: _*).run()

    // Return the list of libraries
    val pathRefs = files.map(f => PathRef(os.Path(f)))
    Agg(pathRefs : _*)
  }

  object test extends Tests {
    def ivyDeps = Agg(ivyMunit)
    def testFramework = ivyMunitInterface
  }

}

The module definition above is the same as the managed module, except that the unmanagedClasspath task is used instead of the ivyDeps task. For demonstration purposes I used Coursier to download all the required libraries. We could have used any other means, such as an FTP or HTTP client, to download the libraries. However, Coursier allows us to automatically download all the required dependencies and cache them for future use, using the Ivy dependency manager. Using Coursier will also make it easier to describe the next example (the allOS module). We can now use the following command to show exactly which artifacts have been downloaded and where they have been placed:

$ ./mill show unmanaged.unmanagedClasspath
List of unmanaged libraries
[1/1] show 
[1/1] show > [1/1] unmanaged.unmanagedClasspath 
[
  "ref:ddef4c4b:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16.jar",
  "ref:d150f068:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16.jar",
  "ref:c6d45ef4:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-fxml/16/javafx-fxml-16.jar",
  "ref:19dc6267:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16.jar",
  "ref:e42f9c9c:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-media/16/javafx-media-16.jar",
  "ref:945374cf:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-swing/16/javafx-swing-16.jar",
  "ref:406062e3:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-web/16/javafx-web-16.jar",
  "ref:5fa6aaf7:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/controlsfx/controlsfx/11.1.0/controlsfx-11.1.0.jar",
  "ref:3df5678c:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16-linux.jar",
  "ref:0bc3d56f:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16-linux.jar",
  "ref:2b46ca60:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-fxml/16/javafx-fxml-16-linux.jar",
  "ref:e0c0bcbc:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16-linux.jar",
  "ref:c7238720:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-media/16/javafx-media-16-linux.jar",
  "ref:a9caa1a4:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-swing/16/javafx-swing-16-linux.jar",
  "ref:8b0370d7:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-web/16/javafx-web-16-linux.jar"
]


It is important to point out that this technique can be used in conjunction with managed libraries by defining both the ivyDeps and unmanagedClasspath tasks; Mill will take care of combining the required libraries and constructing the correct class paths. In fact, the module above still requires Scala's core libraries to compile and run. The following Mill commands show that those libraries are indeed added to the class path (in this example both the runtime and compile class paths are the same, which is not always true):

$ ./mill show unmanaged.runClasspath
$ ./mill show unmanaged.compileClasspath
List of managed and unmanaged libraries
[1/1] show 
[1/1] show > [36/36] unmanaged.runClasspath 
[
  "ref:c984eca8:/home/user/VSCodeProjects/javaFXMill/unmanaged/resources",
  "ref:04bff8b4:/home/user/VSCodeProjects/javaFXMill/out/unmanaged/compile.dest/classes",
  "ref:ddef4c4b:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16.jar",
  "ref:d150f068:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16.jar",
  "ref:c6d45ef4:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-fxml/16/javafx-fxml-16.jar",
  "ref:19dc6267:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16.jar",
  "ref:e42f9c9c:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-media/16/javafx-media-16.jar",
  "ref:945374cf:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-swing/16/javafx-swing-16.jar",
  "ref:406062e3:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-web/16/javafx-web-16.jar",
  "ref:5fa6aaf7:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/controlsfx/controlsfx/11.1.0/controlsfx-11.1.0.jar",
  "ref:3df5678c:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16-linux.jar",
  "ref:0bc3d56f:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16-linux.jar",
  "ref:2b46ca60:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-fxml/16/javafx-fxml-16-linux.jar",
  "ref:e0c0bcbc:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16-linux.jar",
  "ref:c7238720:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-media/16/javafx-media-16-linux.jar",
  "ref:a9caa1a4:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-swing/16/javafx-swing-16-linux.jar",
  "ref:8b0370d7:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-web/16/javafx-web-16-linux.jar",
  "qref:81c212a8:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala3-library_3/3.1.1/scala3-library_3-3.1.1.jar",
  "qref:4df2d3aa:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.6/scala-library-2.13.6.jar"
]


As was previously mentioned, to be able to provide a "fat" Jar that gives us a truly cross-platform application, we need to download the native binary libraries for all supported operating systems. The attentive reader will have noticed that I have run these examples on Linux, because the Linux native binary libraries are included in the class path (in the list above, the file names contain the -linux substring). I have created an allOS module that allows a developer to define the operating systems that should be supported and download their OS-specific dependencies automatically. To show how to download these native operating system dependencies via Coursier, I have defined the osClasspath target, which one can execute as follows:

$ ./mill -i show allOS.osClasspath
And we get the following list of URLs for the artifacts:
[1/1] show 
[1/1] show > [1/1] allOS.osClasspath 
[
  "Current = Os(Some(amd64), HashSet(unix), Some(linux), Some(5.13.0-40-generic))",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16-linux.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16-mac.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-swing/16/javafx-swing-16-mac.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-fxml/16/javafx-fxml-16.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-web/16/javafx-web-16-mac.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-media/16/javafx-media-16-linux.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16-mac.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-media/16/javafx-media-16-mac.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-fxml/16/javafx-fxml-16-mac.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16-linux.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16-linux.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16-win.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-fxml/16/javafx-fxml-16-win.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-web/16/javafx-web-16-linux.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16-mac.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-fxml/16/javafx-fxml-16-linux.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-swing/16/javafx-swing-16-linux.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-swing/16/javafx-swing-16.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-swing/16/javafx-swing-16-win.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-media/16/javafx-media-16-win.jar",
  "https://repo1.maven.org/maven2/org/controlsfx/controlsfx/11.1.0/controlsfx-11.1.0.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-media/16/javafx-media-16.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-web/16/javafx-web-16.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16-win.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-web/16/javafx-web-16-win.jar",
  "https://repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16-win.jar"
]


In the list above we can see the artifacts for both Windows (win) and MacOS (mac). Here is the code snippet of the target's Mill script:

  def osClasspath: Target[Seq[String]] = T{
    implicit val ec: scala.concurrent.ExecutionContext = scala.concurrent.ExecutionContext.global

    // Extra OpenFX library
    val controlsFXModule = dep"org.controlsfx:controlsfx:11.1.0"

    val current = coursier.core.Activation.Os.fromProperties(sys.props.toMap)
    // Generate the dependencies
    val javaFXModules = javaFXModuleNames.map(
      m => Dependency(Module(org"org.openjfx", ModuleName(s"javafx-$m")), javaFXVersion)
    ) ++
      Seq(controlsFXModule)

    val deps = javaFXModules
    val resWin: Future[Resolution] =
                          resolveWin
                              .addDependencies(deps: _*)
                              .future()
    val resMac: Future[Resolution] =
                          resolveMac
                              .addDependencies(deps: _*)
                              .future()
   val resLinux: Future[Resolution] =
                          resolveLinux
                            .addDependencies(deps: _*)
                            .future()

   val res = Future.sequence( List(resWin, resMac, resLinux) )
   val result = Await.result(res, Duration.Inf)
   val urls = result.map(_.dependencyArtifacts().map(_._3.url).toSet).reduceLeft((acc,s) => acc ++ s)

   s"Current = $current" :: urls.toList
  }

In the code above, I first obtain the host operating system's (OS) name, which is stored in the current variable. This is placed at the start of the target's output and helps a developer debug the Mill scripts; in the example above we see that the script was executed on Linux running on an AMD64 architecture. We then use the full list of JavaFX module names (javaFXModuleNames) to generate the dependency definitions of these modules. Note that we also add the controlsFXModule to the final list of modules; we really only needed to add this module explicitly, because Coursier automatically determines which JavaFX libraries are strictly necessary. We then create 3 Coursier Resolution objects (resWin, resMac and resLinux) to resolve the same dependencies (deps), each for a different OS (Windows, MacOS and Linux respectively). We initiate the asynchronous downloads (via Scala Futures) of the dependencies for each OS using the corresponding resolution object and wait indefinitely for these to finish. After the downloads finish, we extract the resolved URLs and return them as the task's result. From the output above we see that we have the native OS libraries for Windows, MacOS and Linux.

The code snippet below shows how we define a Coursier Resolution object for Windows (see the example code for the Linux and MacOS cases):

  val winX64 = Activation.Os(
    Some("x86_64"),
    Set("windows"),
    Some("windows"),
    None
  )

  val resolveWin = Resolve()
    .withResolutionParams(
      ResolutionParams()
        .withOsInfo {
          winX64
        }
    )

To generate a "fat" Jar we need only change the unmanagedClasspath task to download the native OS libraries. The artifacts for the host OS can be downloaded via the standard ivyDeps method shown below:

  override def ivyDeps = Agg( ivy"$CONTROLS", ivy"$CONTROLSFX" )

This means that we need only set up the Coursier Fetch objects to use ResolutionParams with the missing Activation.Os objects. The code snippet below shows the unmanagedClasspath task.

  override def unmanagedClasspath: Target[Loose.Agg[PathRef]] = T{

    import coursier.params.ResolutionParams

    // Get the name of the current (host) OS
    val osName = coursier.core.Activation.Os.fromProperties(sys.props.toMap).name.get

    // Extra OpenFX library
    // Coursier: only a single String literal is allowed here, so cannot decouple version
    val controlsFXModule = dep"org.controlsfx:controlsfx:11.1.0"

    // Generate the dependencies
    val javaFXModuleNames = Seq( CONTROLS_ )
    val javaFXModules = javaFXModuleNames.map(
      m => Dependency(Module(org"org.openjfx", ModuleName(s"javafx-$m")), javaFXVersion)
    ) ++
      Seq(controlsFXModule)

    // Setup resolution Windows downloads (if not current OS)
    val filesWin =
      if (osName != winX64.name.get) {
        Fetch()
        .addDependencies(javaFXModules: _*)
        .withResolutionParams( ResolutionParams().withOsInfo{ winX64 })
        .addArtifactTypes(Type.all)
        .run()
        .toSet
      } else Set[File]()

    // Setup resolution MacOS downloads (if not current OS)
    val filesMac =
      if (osName != macOSx64.name.get) {
        Fetch()
        .addDependencies(javaFXModules: _*)
        .withResolutionParams(ResolutionParams().withOsInfo { macOSx64 })
        .addArtifactTypes(Type.all)
        .run()
        .toSet
      } else Set[File]()

    // Setup resolution Linux downloads (if not current OS)
    val filesLinux =
      if (osName != linuxX64.name.get) {
        Fetch()
        .addDependencies(javaFXModules: _*)
        .withResolutionParams(ResolutionParams().withOsInfo { linuxX64 })
        .addArtifactTypes(Type.all)
        .run()
        .toSet
      } else Set[File]()

    val allOS = filesWin ++ filesMac ++ filesLinux
    val files = allOS.toSeq

    // Return the list of libraries
    val pathRefs = files.map(f => PathRef(os.Path(f)))
    Agg(pathRefs : _*)
  }

The target above uses the same list of dependencies as the managed ivyDeps task. It creates Fetch objects for the non-host operating systems and synchronously downloads the resolved artifacts. At the end of the method, the lists of all the downloaded artifacts are merged and returned as output. The resolution parameters use the same Activation.Os values that were used in the osClasspath target, so only Windows, MacOS and Linux are supported. To run the above target, execute the following command:

$ ./mill -i show allOS.unmanagedClasspath
The following list of artifacts is downloaded:
Compiling /home/user/VSCodeProjects/javaFXMill/build.sc
[1/1] show 
[1/1] show > [1/1] allOS.unmanagedClasspath 
[
  "ref:27b03bb6:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16-mac.jar",
  "ref:9cc4c083:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16-mac.jar",
  "ref:ddef4c4b:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16.jar",
  "ref:5fa6aaf7:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/controlsfx/controlsfx/11.1.0/controlsfx-11.1.0.jar",
  "ref:19dc6267:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16.jar",
  "ref:352e8fdb:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16-mac.jar",
  "ref:b70cffea:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16-win.jar",
  "ref:6087dd8c:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16-win.jar",
  "ref:d150f068:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16.jar",
  "ref:2ec9c6a2:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16-win.jar"
]


One can see that not only are the ControlsFX and JavaFX Controls artifacts downloaded, but so are the JavaFX dependencies graphics and base. The OS-native artifacts for MacOS and Windows are also downloaded (file names that end in -mac and -win). To see the full list of downloaded artifacts, execute the following command:

$ ./mill -i show allOS.runClasspath
Full runtime class path:
[1/1] show > [36/36] allOS.runClasspath 
[
  "ref:c984eca8:/home/user/VSCodeProjects/javaFXMill/allOS/resources",
  "ref:65a4706e:/home/user/VSCodeProjects/javaFXMill/out/allOS/compile.dest/classes",
  "ref:27b03bb6:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16-mac.jar",
  "ref:9cc4c083:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16-mac.jar",
  "ref:ddef4c4b:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16.jar",
  "ref:5fa6aaf7:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/controlsfx/controlsfx/11.1.0/controlsfx-11.1.0.jar",
  "ref:19dc6267:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16.jar",
  "ref:352e8fdb:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16-mac.jar",
  "ref:b70cffea:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16-win.jar",
  "ref:6087dd8c:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16-win.jar",
  "ref:d150f068:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16.jar",
  "ref:2ec9c6a2:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16-win.jar",
  "qref:0f4aa102:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16.jar",
  "qref:347bea21:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/controlsfx/controlsfx/11.1.0/controlsfx-11.1.0.jar",
  "qref:81c212a8:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala3-library_3/3.1.1/scala3-library_3-3.1.1.jar",
  "qref:d8c3eec4:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-controls/16/javafx-controls-16-linux.jar",
  "qref:f52f10d0:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16.jar",
  "qref:4df2d3aa:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.6/scala-library-2.13.6.jar",
  "qref:a80bfcce:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-graphics/16/javafx-graphics-16-linux.jar",
  "qref:8f336a78:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16.jar",
  "qref:24e66df9:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/openjfx/javafx-base/16/javafx-base-16-linux.jar"
]


The astute reader will observe that Mill (circa April 2022), when combining the ivyDeps and unmanagedClasspath artifacts, does not remove duplicates. This does not in any way prevent the applications from running correctly, but it may result in very large class paths. We could filter the unwanted files in the unmanagedClasspath target and return only those Jars that contain native OS content. I purposely did not select only the OS-native artifacts, because the dependency chain may include other Jar files that do not use the same naming convention, or may include different Jars depending on the OS. One way to "solve" this issue is to filter out the duplicates that are resolved from ivyDeps. To do this we simply use the resolvedIvyDeps target to download and filter the duplicate files, as shown below:

    val deps = resolvedIvyDeps().map(_.path.toIO).iterator.toSet
    val allOS = filesWin ++ filesMac ++ filesLinux -- deps
    val files = allOS.toSeq

Note that because these targets are cached, the resolution and downloading of files is only executed when necessary; otherwise the cached values are used. One might now expect to be able to execute the applications with the following commands, with the class path containing the dependencies for all operating systems:

$ ./mill -i allOS.runMain helloworld.HelloWorld
$ ./mill -i allOS.runMain button.Main

This, however, will not work. It is necessary to invoke the JVM with a class path that only includes the correct OS-native artifacts; if multiple versions exist, the JavaFX application will fail to load. To "solve" this issue, the forkArgs target of the OpenJFX module was changed slightly so that only the current host OS's artifacts are included in the class path. The following code snippet was added to remove the non-host OS artifacts:

  // The osAll module downloads these OS native versions of the libraries
  val supported = Set("mac", "linux", "win")
  // Get the name of the current (host) OS
  val osName = coursier.core.Activation.Os.fromProperties(sys.props.toMap).name.get.toLowerCase
  // Filter for removing incompatible native OS libraries
  // If we have several native libraries for different OS, JavaFX cannot select the correct one
  val tag = osName match {
    case "linux" => "linux"
    case "mac os x" => "mac"
    case "windows" => "win"
  }
  val remove = supported - tag

  def validOS(artifact: String): Boolean = {
    if (supported.exists(s => artifact.contains(s))) {
      // A native OS Jar: retain it only if it targets the host OS
      artifact.contains( tag )
    } else {
      // Not a native OS Jar: always retain it
      true
    }
  }

  ...

  override def forkArgs: Target[Seq[String]] = T {

    ...

    // Add to the modules list
    val t = Seq(
        "--module-path", s.filter( validOS ).iterator.mkString( pathSeparator ),
        "--add-modules", modulesNames.iterator.mkString(","), // "javafx.controls,javafx.graphics,javafx.base,org.controlsfx.controls",
        "--add-exports=javafx.controls/com.sun.javafx.scene.control.behavior=org.controlsfx.controls",
        "--add-exports=javafx.controls/com.sun.javafx.scene.control.inputmap=org.controlsfx.controls",
        "--add-exports=javafx.graphics/com.sun.javafx.scene.traversal=org.controlsfx.controls"
    ) ++
      // add standard parameters
      Seq("-Dprism.verbose = true", "-ea")
    t
  }

In the code snippet above, the validOS method checks whether an artifact is an OS-dependent archive and, if it is, only retains it when it targets the host OS running the Mill script. If you will be running these applications without Mill, then you must write your own OS-specific scripts to identify and use the correct class path.
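
As a hedged illustration of the filter's effect when the script runs on a Linux host (so that tag is "linux"), using artifact names taken from the listings above:

validOS("javafx-base-16-linux.jar")   // true:  a native Jar matching the host OS, kept
validOS("javafx-base-16-mac.jar")     // false: a native Jar for another OS, removed
validOS("controlsfx-11.1.0.jar")      // true:  not a native OS Jar, always kept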

Conclusion

In this article I have endeavoured to provide enough information for anyone to set up and use a project with several pure or mixed Scala and Java modules using the Mill build tool. I have also focused on the use of the Java Platform Module System (JPMS), because many libraries are now provided as Java modules, but not all JVM languages, such as Scala, support these natively. I used JavaFX (or OpenFX) as an example because it represents a use case of modular libraries that requires some care to use. Finally, I have also detailed the use of unmanaged libraries. This allows us to set up projects that require Jar libraries that are not available as Maven artifacts. It also allows us to download and generate a "fat" Jar that contains native libraries for various operating systems. Such an application can be distributed as a single Jar and run on various operating systems. The example I provide is an OpenFX application that can be executed on several operating systems (MacOS, Windows and Linux). All the source code for this article is available on Github.

The Mill build tool has been described in detail. I have included a brief history of this tool and shown how to set up and execute simple projects. I have exemplified many of Mill's utility commands that allow us to compile and run the applications. I have also demonstrated how to set up and run unit tests using MUnit. Mill also provides a set of utility commands that allow us to inspect, analyse and plot module dependencies and project variables (such as source files and dependencies), and even search for library updates. With this information you should now be able to develop, debug and use your own Mill scripts.

The use of an IDE is essential for developer productivity. I have shown how Mill can be used in conjunction with an editor or IDE to automate unit testing and compilation. In order to take full advantage of the IDEs, I have also shown how to import your Mill projects into the IntelliJ and VSCode IDEs using Bloop. You should now be able to develop your projects taking full advantage of the IDE.

You can use the example project as a template for your Scala or Java OpenFX projects. After cloning or copying the repository, you should be able to run all the examples that were described in this article. If you find any issues or have any suggestions, feel free to open a ticket in Github's issue tracker.

References

  1. Understanding Java 9 Modules
  2. A Guide to Java 9 Modularity
  3. Jakob Jenkov's Java Modules tutorial
  4. Scala compatibility with modules
  5. jlink tool
  6. Java 9+ modularity: The difficulties and pitfalls of migrating from Java 8 to Java 9+
  7. OpenJFX module setup
  8. Apache Maven
  9. Gradle
  10. TestFX Monocle
  11. ControlsFX
  12. Chart-FX
  13. Mill build tool
  14. Scala Build Tool - SBT
  15. The Best Build Tool For Scala language
  16. Scala Book - The most used scala build tool (sbt)
  17. SBuild - the magic-free yet powerful build tool
  18. A quick tour of build tools in Scala
  19. Jan Christopher Vogt
  20. Chris' Build Tool (CBT) for Scala
  21. Li Haoyi GitHub
  22. Li Haoyi
  23. Tobias Roeser Github
  24. Mill Documentation/Introduction to Mill
  25. Mill: Better Scala Builds
  26. Mill: a Build Tool based on Pure Functional Programming
  27. Hands-on Scala Programming
  28. Ammonite
  29. Ammonite Github
  30. Ammonite Scripting
  31. Coursier - Pure Scala Artifact Fetching
  32. Coursier Github
  33. Mill Scala Seed
  34. Example javaFXMill project
  35. Ivy - The agile dependency manager
  36. MUnit - Scala testing library with actionable errors and extensible APIs
  37. JetBrain's IntelliJ IDEA
  38. GraphViz
  39. InkScape
  40. Metals VSCode plugin
  41. Visual Studio Code
  42. Bloop build server
  43. ChartFX

1. Strangely, there does not seem to be an official tutorial from Oracle. Some links to the OpenJDK site are available here.

  2. I have not researched nor do I know how to implement such a solution.

  3. I am not sure if the dev package is required, but it won't consume too much space.

  4. The inkview command could be used, but it does not seem to support panning and zooming.