Configuring your Project
You can configure your Mill build in a number of ways:
Compilation & Execution Flags
build.sc
import mill._, scalalib._
object foo extends ScalaModule {
def scalaVersion = "2.13.12"
def scalacOptions = Seq("-Ydelambdafy:inline")
def forkArgs = Seq("-Xmx4g")
def forkEnv = Map("HELLO_MY_ENV_VAR" -> "WORLD")
}
You can pass flags to the Scala compiler via scalacOptions. By default, foo.run runs the compiled code in a subprocess, and you can pass in JVM flags via forkArgs or environment variables via forkEnv.
You can also run your code via
mill foo.runLocal
which runs it in-process within an isolated classloader. This may be faster since you avoid the JVM startup, but does not support forkArgs or forkEnv.
If you want to pass main-method arguments to run or runLocal, simply pass them after foo.run/foo.runLocal:
mill foo.run arg1 arg2 arg3
mill foo.runLocal arg1 arg2 arg3
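To illustrate how these settings surface at runtime, here is a hypothetical foo/src/Example.scala (not part of the build above) whose main method reads both the forkEnv variable and the command-line arguments:
// foo/src/Example.scala (hypothetical illustration)
object Example {
  def main(args: Array[String]): Unit = {
    // HELLO_MY_ENV_VAR is the variable set via forkEnv above
    println("env:  " + sys.env.getOrElse("HELLO_MY_ENV_VAR", "<unset>"))
    println("args: " + args.mkString(" "))
  }
}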
Adding Ivy Dependencies
build.sc
import mill._, scalalib._
object foo extends ScalaModule {
def scalaVersion = "2.12.4"
def ivyDeps = Agg(
ivy"com.lihaoyi::upickle:0.5.1",
ivy"com.lihaoyi::pprint:0.5.2",
ivy"com.lihaoyi::fansi:0.2.4",
ivy"${scalaOrganization()}:scala-reflect:${scalaVersion()}"
)
}
You can define the ivyDeps field to add ivy dependencies to your module. The ivy"com.lihaoyi::upickle:0.5.1" syntax (with ::) represents Scala dependencies; for Java dependencies you would use a single :, e.g. ivy"com.lihaoyi:upickle:0.5.1". If you have dependencies cross-published against the full Scala version (e.g. 2.12.4 instead of just 2.12), you can use ::: as in ivy"org.scalamacros:::paradise:2.1.1".
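Because the ivy"…" strings are ordinary interpolated strings (as the ${scalaVersion()} usage above shows), you can also factor shared version numbers into vals. A minimal sketch, using a hypothetical upickleVersion val:
import mill._, scalalib._
object foo extends ScalaModule {
  def scalaVersion = "2.12.4"
  // hypothetical: keep related artifact versions in one place
  val upickleVersion = "0.5.1"
  def ivyDeps = Agg(
    ivy"com.lihaoyi::upickle:$upickleVersion",
    ivy"com.lihaoyi::pprint:0.5.2"
  )
}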
To select the test-jars from a dependency, use the following syntax:
ivy"org.apache.spark::spark-sql:2.4.0;classifier=tests".
Please consult the Library Dependencies in Mill section for even more details.
Repository configuration
By default, these are resolved from Maven Central, but you can add your own resolvers by overriding the repositoriesTask task in the module:
build.sc
import coursier.maven.MavenRepository
def repositoriesTask = T.task { super.repositoriesTask() ++ Seq(
MavenRepository("https://oss.sonatype.org/content/repositories/releases")
) }
To add custom resolvers to the initial bootstrap of the build, you can create a custom ZincWorkerModule and override the zincWorker method in your ScalaModule by pointing it to that custom object:
build.sc
import coursier.maven.MavenRepository
object CustomZincWorkerModule extends ZincWorkerModule with CoursierModule {
def repositoriesTask = T.task { super.repositoriesTask() ++ Seq(
MavenRepository("https://oss.sonatype.org/content/repositories/releases")
) }
}
object YourBuild extends ScalaModule {
def zincWorker = CustomZincWorkerModule
// ... rest of your build definitions
}
Runtime and compile-time dependencies
If you want to use additional dependencies at runtime or override dependencies and their versions at runtime, you can do so with runIvyDeps.
build.sc: Declaring runtime dependencies
import mill._, scalalib._
object foo extends JavaModule {
def runIvyDeps = Agg(
ivy"ch.qos.logback:logback-classic:1.2.7",
ivy"com.h2database:h2:2.0.202"
)
}
Mill has no test-scoped dependencies! You might be used to test-scoped dependencies from other build tools like Maven, Gradle or sbt. As test modules in Mill are just regular modules, there is no need for a dedicated test scope: you can use ivyDeps and runIvyDeps in test modules as usual.
You can also declare compile-time-only dependencies with compileIvyDeps. These are present in the compile classpath, but will not be propagated to the transitive dependencies.
build.sc: Declaring compile-time-only dependencies
import mill._, scalalib._
object foo extends JavaModule {
def compileIvyDeps = Agg(
ivy"javax.servlet:sevlet-api:2.5"
)
}
Compile-time dependencies are translated to provided-scoped dependencies when published to Maven or Ivy repositories.
Keeping up-to-date with Scala Steward
It’s always a good idea to keep your dependencies up-to-date.
If your project is hosted on GitHub, GitLab, or Bitbucket, you can use Scala Steward to automatically open a pull request to update your dependencies whenever there is a newer version available.
Scala Steward can also keep your Mill version up-to-date.
Adding a Test Suite
build.sc
import mill._, scalalib._
object foo extends ScalaModule {
def scalaVersion = "2.13.12"
object test extends Tests {
def ivyDeps = Agg(ivy"com.lihaoyi::utest:0.8.2")
def testFramework = "utest.runner.Framework"
}
}
For convenience, you can also use one of the predefined test frameworks:
- TestModule.Junit4
- TestModule.Junit5
- TestModule.TestNg
- TestModule.Munit
- TestModule.ScalaTest
- TestModule.Specs2
- TestModule.Utest
build.sc: ScalaModule with uTest tests using the predefined TestModule.Utest
import mill._, scalalib._
object foo extends ScalaModule {
def scalaVersion = "2.13.12"
object test extends ScalaTests with TestModule.Utest {
def ivyDeps = Agg(ivy"com.lihaoyi::utest:0.8.2")
}
}
You can define a test suite by creating a nested module extending ScalaTests, and specifying the ivy coordinates and name of your test framework. This expects the tests to be laid out as follows:
build.sc
foo/
    src/
        Example.scala
    resources/
        ...
    test/
        src/
            ExampleTest.scala
        resources/
            ...
out/
    foo/
        ...
        test/
            ...
The above example can be run via
mill foo.test
By default, tests are run in a subprocess, and forkArgs and forkEnv can be overridden to pass JVM flags & environment variables. You can also use
mill foo.test.testLocal
to run tests in-process in an isolated classloader.
If you want to pass any arguments to the test framework, simply put them after foo.test in the command line, e.g. uTest lets you pass in a selector to decide which test to run, which in Mill would be:
mill foo.test foo.MyTestSuite.testCaseName
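For reference, a minimal uTest suite matching that selector might look like this (a hypothetical foo/test/src/ExampleTest.scala; the suite and test names are placeholders):
package foo
import utest._

object MyTestSuite extends TestSuite {
  val tests = Tests {
    test("testCaseName") {
      // selected by `mill foo.test foo.MyTestSuite.testCaseName`
      assert(1 + 1 == 2)
    }
  }
}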
You can define multiple test suites if you want, e.g.:
build.sc
import mill._, scalalib._
object foo extends ScalaModule {
def scalaVersion = "2.13.12"
object test extends ScalaTests with TestModule.Utest {
def ivyDeps = Agg(ivy"com.lihaoyi::utest:0.8.2")
}
object integration extends ScalaTests with TestModule.Utest {
def ivyDeps = Agg(ivy"com.lihaoyi::utest:0.8.2")
}
}
Each of which will expect their sources to be in their respective foo/test and foo/integration folders. ScalaTests modules are ScalaModules like any other, and all the same configuration options apply.
Custom Test Frameworks
Integrating with test frameworks like ScalaTest or specs2 is simply a matter of adding them to ivyDeps and specifying the testFramework you want to use.
Scalatest example:
build.sc
import mill._, scalalib._
object foo extends ScalaModule {
def scalaVersion = "2.13.12"
object test extends Tests with TestModule.ScalaTest {
def ivyDeps = Agg(ivy"org.scalatest::scalatest:3.0.4")
}
}
Specs2 example:
build.sc
import mill._, scalalib._
object foo extends ScalaModule {
def scalaVersion = "2.13.12"
object test extends Tests with TestModule.Specs2 {
def ivyDeps = Agg(ivy"org.specs2::specs2-core:4.6.0")
}
}
After that, you can follow the instructions in Adding a Test Suite, and use mill foo.test as usual, or pass args to the test suite via mill foo.test arg1 arg2 arg3.
Scala Compiler Plugins
build.sc
import mill._, scalalib._
object foo extends ScalaModule {
def scalaVersion = "2.13.12"
def compileIvyDeps = Agg(ivy"com.lihaoyi:::acyclic:0.1.7")
def scalacOptions = Seq("-P:acyclic:force")
def scalacPluginIvyDeps = Agg(ivy"com.lihaoyi:::acyclic:0.1.7")
}
You can use Scala compiler plugins by setting scalacPluginIvyDeps. The above example also adds the plugin to compileIvyDeps, since that plugin’s artifact is needed on the compilation classpath (though not at runtime).
Remember that compiler plugins are published against the full Scala version (e.g. 2.13.8 instead of just 2.13), so when including them make sure to use the ::: syntax shown in the example above.
Generating API Documentation
To generate API documentation you can use the docJar task on the module you’d like to create the documentation for. For example, given a module called example you could do:
mill example.docJar
This will result in your javadoc being created in out/example/docJar.dest/javadoc.
For both Scala and Java modules there may be extra options that you’d like to pass specifically to either javadoc or scaladoc. You can pass these with javadocOptions and scalaDocOptions respectively.
build.sc
import mill._, scalalib._
object example extends ScalaModule {
def scalaVersion = "3.1.3"
def scalaDocOptions = Seq(
"-siteroot",
"mydocs",
"-no-link-warnings"
)
}
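For a Java module the equivalent field is javadocOptions. A minimal sketch, assuming a plain JavaModule and the standard javadoc -quiet flag:
import mill._, scalalib._
object example extends JavaModule {
  // hypothetical: suppress javadoc’s informational output
  def javadocOptions = Seq("-quiet")
}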
Scaladoc 3 Site Generation
When using Scala 3 you’re also able to use Scaladoc to generate a full static site next to your API documentation. This can include general documentation for your project and even a blog. While you can find the full documentation for this in the Scala 3 docs, below you’ll find some useful information to help you generate this with Mill.
By default, Mill will consider the site root, as it’s called in the Scala 3 docs, to be the value of docResources(). It will look there for your _docs/ and _blog/ directories, if they exist. Let’s pretend we have a project called example defined like this:
build.sc
import mill._, scalalib._
object example extends ScalaModule {
def scalaVersion = "3.1.3"
}
Your project structure for this would look something like this:
.
├── build.sc
├── example
│   ├── docs
│   │   ├── _blog
│   │   │   ├── _posts
│   │   │   │   └── 2022-08-14-hello-world.md
│   │   │   └── index.md
│   │   └── _docs
│   │       ├── getting-started.md
│   │       ├── index.html
│   │       └── index.md
│   └── src
│       └── example
│           └── Hello.scala
After generating your docs with mill example.docJar, you’ll find that opening out/example/docJar.dest/javadoc/index.html locally in your browser gives you a full static site including your API docs, your blog, and your documentation!
Reformatting your code
Mill supports code formatting via scalafmt out of the box.
To enable formatting per-module, make your module extend mill.scalalib.scalafmt.ScalafmtModule:
build.sc
import mill._, scalalib._, scalafmt._
object foo extends ScalaModule with ScalafmtModule {
def scalaVersion = "2.13.12"
}
Now you can reformat code with the mill foo.reformat command, or only check for misformatted files with mill foo.checkFormat.
You can also reformat your project’s code globally with the mill mill.scalalib.scalafmt.ScalafmtModule/reformatAll __.sources command, or only check the code’s format with mill mill.scalalib.scalafmt.ScalafmtModule/checkFormatAll __.sources. This will reformat (or check) all sources that match the __.sources query.
If you add a .scalafmt.conf file at the root of your project, it will be used to configure formatting. It can contain a version key to specify the scalafmt version used to format your code. See the scalafmt configuration documentation for details.
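A minimal .scalafmt.conf might contain just the version plus a setting or two; the values below are illustrative, not prescriptive:
version = "3.7.15"
maxColumn = 100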
Common Configuration
build.sc
import mill._, scalalib._
trait CommonModule extends ScalaModule {
def scalaVersion = "2.13.12"
}
object foo extends CommonModule
object bar extends CommonModule {
def moduleDeps = Seq(foo)
}
You can extract out configuration common to multiple modules into a trait that those modules extend. This is useful for providing convenience & ensuring consistent configuration: every module often has the same Scala version, uses the same testing framework, etc., and all that can be extracted out into the trait.
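As a hedged sketch of taking this further (the CommonTests name is illustrative), the shared trait can also fix the test framework for every module’s test suite:
import mill._, scalalib._
trait CommonModule extends ScalaModule {
  def scalaVersion = "2.13.12"
  // inner trait shared by every module’s `test` object
  trait CommonTests extends ScalaTests with TestModule.Utest {
    def ivyDeps = Agg(ivy"com.lihaoyi::utest:0.8.2")
  }
}
object foo extends CommonModule {
  object test extends CommonTests
}
object bar extends CommonModule {
  def moduleDeps = Seq(foo)
  object test extends CommonTests
}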
Global configuration
Mill builds on Ammonite, which allows you to define global configuration. Depending on how you start Mill, one of two files will be loaded. For the build REPL (--repl or -i without specifying a target), ~/.mill/ammonite/predef.sc will be loaded, and for builds from the command line the file ~/.mill/ammonite/predefScript.sc will be included. You might want to create a symlink from one to the other to avoid duplication.
Example ~/.mill/ammonite/predef.sc
~/.mill/ammonite/predef.sc
val nexusUser = "myuser"
val nexusPassword = "mysecret"
Everything declared in the above file will be available to any build you run. For example, your build.sc can reference nexusUser and nexusPassword to configure an authenticated repository:
import coursier.maven.MavenRepository

def repositories = super.repositories ++ Seq(
  // login and password are globally configured in ~/.mill/ammonite/predef.sc
  MavenRepository(
    "https://nexus.mycompany.com/repository/maven-releases",
    authentication = Some(coursier.core.Authentication(nexusUser, nexusPassword))
  )
)
Custom Tasks
build.sc
import mill._, scalalib._
object foo extends ScalaModule {
def scalaVersion = "2.13.12"
}
def lineCount = T {
foo.sources().flatMap(ref => os.walk(ref.path)).filter(os.isFile).flatMap(os.read.lines).size
}
def printLineCount() = T.command {
println(lineCount())
}
You can define new cached Targets using the T {…} syntax, depending on existing Targets, e.g. foo.sources via the foo.sources() syntax to extract their current value, as shown in lineCount above. The return-type of a Target has to be JSON-serializable (using uPickle) and the Target is cached when first run until its inputs change (in this case, if someone edits the foo.sources files which live in foo/src). Cached Targets cannot take parameters.
You can print the value of your custom target using show, e.g.
mill show lineCount
You can define new un-cached Commands using the T.command {…} syntax. These are un-cached and re-evaluate every time you run them, but can take parameters. Their return type needs to be JSON-writable as well, or (): Unit if you want to return nothing.
Your custom targets can depend on each other using the def bar = T {… foo() …} syntax, and you can create arbitrarily long chains of dependent targets. Mill will handle the re-evaluation and caching of the targets’ output for you, and will provide you a T.dest folder for you to use as scratch space or to store files you want to return.
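For instance, here is a hedged sketch of a second target (lineCountReport is a made-up name) that depends on lineCount above and writes its result into its T.dest folder:
def lineCountReport = T {
  // re-evaluates only when lineCount’s result changes
  val report = T.dest / "line-count.txt"
  os.write(report, lineCount().toString)
  PathRef(report)
}
You could then inspect its output with mill show lineCountReport.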
Custom targets and commands can contain arbitrary code. Whether you want to download files (e.g. using mill.modules.Util.download), shell out to Webpack to compile some Javascript, generate sources to feed into a compiler, or create some custom jar/zip assembly with the files you want (e.g. using mill.modules.Jvm.createJar), all of these can simply be custom targets with your code running in the T {…} block.
Custom Modules
build.sc
import mill._, scalalib._
object qux extends Module {
object foo extends ScalaModule {
def scalaVersion = "2.13.12"
}
object bar extends ScalaModule {
def moduleDeps = Seq(foo)
def scalaVersion = "2.13.12"
}
}
Not every Module needs to be a ScalaModule; sometimes you just want to group things together for neatness. In the above example, you can run foo and bar namespaced inside qux:
mill qux.foo.compile
mill qux.bar.run
You can also define your own module traits, with their own set of custom tasks, to represent other things, e.g. Javascript bundles, docker image building, etc.:
build.sc
trait MySpecialModule extends Module {
...
}
object foo extends MySpecialModule
object bar extends MySpecialModule
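As a hedged sketch of what such a trait might contain (the bundle target and its contents are purely illustrative):
trait MySpecialModule extends Module {
  // a custom target that packages something into a file under T.dest
  def bundle = T {
    val out = T.dest / "bundle.txt"
    os.write(out, "bundle contents go here")
    PathRef(out)
  }
}
Each module extending it (foo and bar above) then gets its own foo.bundle and bar.bundle targets.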
Module/Task Names
build.sc
import mill._
import mill.scalalib._
object `hyphenated-module` extends Module {
def `hyphenated-target` = T{
println("This is a hyphenated target in a hyphenated module.")
}
}
object unhyphenatedModule extends Module {
def unhyphenated_target = T{
println("This is an unhyphenated target in an unhyphenated module.")
}
def unhyphenated_target2 = T{
println("This is the second unhyphenated target in an unhyphenated module.")
}
}
Mill modules and tasks may be composed of the following character types:
- Alphanumeric (A-Z, a-z, and 0-9)
- Underscore (_)
- Hyphen (-)
Due to Scala naming restrictions, module and task names with hyphens must be surrounded by back-ticks (`). Using hyphenated names at the command line is unaffected by these restrictions.
mill hyphenated-module.hyphenated-target
mill unhyphenatedModule.unhyphenated_target
mill unhyphenatedModule.unhyphenated_target2
Overriding Tasks
build.sc
import mill._, scalalib._
object foo extends ScalaModule {
def scalaVersion = "2.13.12"
def compile = T {
println("Compiling...")
super.compile()
}
def run(args: String*) = T.command {
println("Running..." + args.mkString(" "))
super.run(args:_*)
}
}
You can re-define targets and commands to override them, and use super if you want to refer to the originally defined task. The above example shows how to override compile and run to add additional logging messages, but you can also override ScalaModule#generatedSources to feed generated code to your compiler, ScalaModule#prependShellScript to make your assemblies executable, or ScalaModule#console to use the Ammonite REPL instead of the normal Scala REPL.
In Mill builds the override keyword is optional.
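For example, a hedged sketch of overriding generatedSources to feed a generated file to the compiler (the Version object is a made-up example):
import mill._, scalalib._
object foo extends ScalaModule {
  def scalaVersion = "2.13.12"
  def generatedSources = T {
    // write a generated source file under T.dest and add that folder to the compile sources
    os.write(
      T.dest / "Version.scala",
      """package foo
        |object Version { val value = "0.1.0" }""".stripMargin
    )
    super.generatedSources() ++ Seq(PathRef(T.dest))
  }
}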
Unmanaged Jars
build.sc
import mill._, scalalib._
object foo extends ScalaModule {
def scalaVersion = "2.13.12"
def unmanagedClasspath = T {
if (!os.exists(millSourcePath / "lib")) Agg()
else Agg.from(os.list(millSourcePath / "lib").map(PathRef(_)))
}
}
You can override unmanagedClasspath to point it at any jars you place on the filesystem, e.g. in the above snippet any jars that happen to live in the foo/lib/ folder.
Defining a Main Class
build.sc
import mill._, scalalib._
object foo extends ScalaModule {
def scalaVersion = "2.13.12"
def mainClass = Some("foo.bar.Baz")
}
Mill’s foo.run by default will discover which main class to run from your compilation output, but if there is more than one or the main class comes from some library you can explicitly specify which one to use. This also adds the main class to your foo.jar and foo.assembly jars.
Merge/exclude/relocate files from assembly
When you make a runnable jar of your project with the assembly command, you may want to exclude some files from the final jar (like signature files and manifest files from library jars), and merge duplicated files (for instance reference.conf files from library dependencies).
By default Mill excludes all *.sf, *.dsa, *.rsa, and META-INF/MANIFEST.MF files from the assembly, and concatenates all reference.conf files. You can also define your own merge/exclude rules.
build.sc
import mill._, scalalib._
import mill.modules.Assembly._
object foo extends ScalaModule {
def scalaVersion = "2.13.12"
def assemblyRules = Seq(
Rule.Append("application.conf"), // all application.conf files will be concatenated into single file
Rule.AppendPattern(".*\\.conf"), // all *.conf files will be concatenated into single file
Rule.ExcludePattern(".*\\.temp"), // all *.temp files will be excluded from a final jar
Rule.Relocate("shapeless.**", "shade.shapeless.@1") // the `shapeless` package will be shaded under the `shade` package
)
}
Downloading Non-Maven Jars
build.sc
import mill._, scalalib._
object foo extends ScalaModule {
def scalaVersion = "2.13.12"
def unmanagedClasspath = T {
  Agg(
    mill.modules.Util.download(
      "https://github.com/williamfiset/FastJavaIO/releases/download/v1.0/fastjavaio.jar",
      os.rel / "fastjavaio.jar"
    )
  )
}
}
You can also override unmanagedClasspath to point it at jars that you want to download from arbitrary URLs. Note that targets like unmanagedClasspath are cached, so your jar is downloaded only once and re-used indefinitely after that.
Using the Ammonite Repl / Scala console
All ScalaModules have a console and a repl target, to start a Scala console or an Ammonite REPL. To use the latter, you can (and sometimes need to) customize the Ammonite version to work with your selected Scala version. The default Ammonite version is the one used by Mill internally (Mill’s build.sc is an Ammonite script, after all), but depending on the Scala version you are using, there may be no matching Ammonite release available. In that case, to start the REPL you have to specify a different, available Ammonite version.
Using ammoniteVersion to select a release compatible with the scalaVersion:
import mill._, scalalib._
object foo extends ScalaModule {
def scalaVersion = "2.12.6"
def ammoniteVersion = "2.4.0"
}
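Both targets take over the terminal, so they need to be run in interactive mode; assuming the foo module above, something like:
mill -i foo.console
mill -i foo.repl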
Why is Ammonite tied to the exact Scala version? This is because Ammonite depends on the Scala compiler. In contrast to the Scala library, compiler releases do not guarantee any binary compatibility between releases. As a consequence, Ammonite needs releases for each specific full Scala version. The older your Mill version, or the newer the Scala version you want to use, the higher the risk that the default Ammonite version will not match.