Scala Module Configuration
This page goes into more detail about the various configuration options for `ScalaModule`. Many of the APIs covered here are listed in the Scaladoc.
Common Configuration Overrides
This example shows some of the common tasks you may want to override on a `ScalaModule`: specifying the `mainClass`, adding additional sources/resources, generating resources, and setting compilation/run options. We also define it as a `RootModule`, so its sources live in the top-level `src/` folder and its tasks can be called directly via `compile` or `run` without needing a `foo.` module prefix.
package build
import mill._, scalalib._

object `package` extends RootModule with ScalaModule {
  def scalaVersion = "2.13.8"

  // You can have arbitrary numbers of third-party dependencies
  def ivyDeps = Agg(
    ivy"com.lihaoyi::scalatags:0.8.2",
    ivy"com.lihaoyi::os-lib:0.10.7"
  )

  // Choose a main class to use for `.run` if there are multiple present
  def mainClass: T[Option[String]] = Some("foo.Foo2")

  // Add (or replace) source folders for the module to use
  def sources = Task.Sources {
    super.sources() ++ Seq(PathRef(millSourcePath / "custom-src"))
  }

  // Add (or replace) resource folders for the module to use
  def resources = Task.Sources {
    super.resources() ++ Seq(PathRef(millSourcePath / "custom-resources"))
  }

  // Generate sources at build time
  def generatedSources: T[Seq[PathRef]] = Task {
    for (name <- Seq("A", "B", "C")) os.write(
      Task.dest / s"Foo$name.scala",
      s"""|package foo
          |object Foo$name {
          |  val value = "hello $name"
          |}
          |""".stripMargin
    )
    Seq(PathRef(Task.dest))
  }

  // Pass additional JVM flags when `.run` is called or in the executable
  // generated by `.assembly`
  def forkArgs: T[Seq[String]] = Seq("-Dmy.custom.property=my-prop-value")

  // Pass additional environment variables when `.run` is called. Note that
  // this does not apply to running externally via `.assembly`
  def forkEnv: T[Map[String, String]] = Map("MY_CUSTOM_ENV" -> "my-env-value")

  // Additional Scala compiler options, e.g. to turn warnings into errors
  def scalacOptions: T[Seq[String]] = Seq("-deprecation", "-Xfatal-warnings")
}
If you want to better understand how the various upstream tasks feed into a task of interest, such as `run`, you can visualize their relationships via:

> mill visualizePlan run
Note the use of `millSourcePath`, `Task.dest`, and `PathRef` when performing various filesystem operations:

- `millSourcePath` refers to the base path of the module. For the root module, this is the root of the repo; for inner modules it is the module path, e.g. for module `foo.bar.qux` the `millSourcePath` would be `foo/bar/qux`. This can also be overridden if necessary.
- `Task.dest` refers to the destination folder for a task in the `out/` folder. This is unique to each task, and can act both as a scratch space for temporary computations and as a place to put "output" files, without worrying about filesystem conflicts with other tasks.
- `PathRef` is a way to return the contents of a file or folder, rather than just its path as a string. This ensures that downstream tasks properly invalidate when the contents change, even when the path stays the same.
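As a minimal sketch of how these three fit together (the `copyright.txt` file name and `copyrightFile` task are hypothetical, not part of the example above), a task might copy a file from the module's source tree into its own `Task.dest` and return it as a `PathRef`:

```scala
package build
import mill._, scalalib._

object `package` extends RootModule with ScalaModule {
  def scalaVersion = "2.13.8"

  // Hypothetical task: stamp a copyright notice file into the build output.
  // millSourcePath locates the file relative to the module root; Task.dest
  // gives this task its own folder under out/; PathRef makes downstream
  // tasks invalidate when the file's *contents* change, not just its path.
  def copyrightFile: T[PathRef] = Task {
    val src = millSourcePath / "copyright.txt"
    os.copy(src, Task.dest / "copyright.txt")
    PathRef(Task.dest / "copyright.txt")
  }
}
```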
> mill run
Foo2.value: <h1>hello2</h1>
Foo.value: <h1>hello</h1>
FooA.value: hello A
FooB.value: hello B
FooC.value: hello C
MyResource: My Resource Contents
MyOtherResource: My Other Resource Contents
my.custom.property: my-prop-value
MY_CUSTOM_ENV: my-env-value
> mill show assembly
".../out/assembly.dest/out.jar"
> ./out/assembly.dest/out.jar # mac/linux
Foo2.value: <h1>hello2</h1>
Foo.value: <h1>hello</h1>
FooA.value: hello A
FooB.value: hello B
FooC.value: hello C
MyResource: My Resource Contents
MyOtherResource: My Other Resource Contents
my.custom.property: my-prop-value
> sed -i.bak 's/Foo2 {/Foo2 { println(this + "hello")/g' custom-src/Foo2.scala
> mill compile # demonstrate -deprecation/-Xfatal-warnings flags
error: object Foo2 { println(this + "hello")
error: ^
error: ...Implicit injection of + is deprecated. Convert to String to call +...
Compilation & Execution Flags
package build
import mill._, scalalib._

object `package` extends RootModule with ScalaModule {
  def scalaVersion = "2.13.8"
  def scalacOptions = Seq("-Ydelambdafy:inline")
  def forkArgs = Seq("-Xmx4g", "-Dmy.jvm.property=hello")
  def forkEnv = Map("MY_ENV_VAR" -> "WORLD")
}
You can pass flags to the Scala compiler via `scalacOptions`.
> ./mill run
hello WORLD
By default, `run` runs the compiled code in a subprocess, and you can pass in JVM flags via `forkArgs` or environment variables via `forkEnv`.
You can also run your code via `mill foo.runLocal`, which runs it in-process within an isolated classloader. This may be faster since you avoid the JVM startup, but does not support `forkArgs` or `forkEnv`.
If you want to pass main-method arguments to `run` or `runLocal`, simply pass them after the `foo.run`/`foo.runLocal`:

mill foo.run arg1 arg2 arg3
mill foo.runLocal arg1 arg2 arg3
Classpath and Filesystem Resources
package build
import mill._, scalalib._

object foo extends ScalaModule {
  def scalaVersion = "2.13.8"
  def ivyDeps = Agg(
    ivy"com.lihaoyi::os-lib:0.10.7"
  )

  object test extends ScalaTests {
    def ivyDeps = Agg(ivy"com.lihaoyi::utest:0.8.4")
    def testFramework = "utest.runner.Framework"
    def otherFiles = Task.Source(millSourcePath / "other-files")
    def forkEnv = super.forkEnv() ++ Map(
      "OTHER_FILES_DIR" -> otherFiles().path.toString
    )
  }
}
> ./mill foo.test
... foo.FooTests...simple ...
...
This section discusses how tests can depend on resources locally on disk. Mill provides two ways to do this: via JVM classpath resources, and via the resource folder, which is made available as the environment variable `MILL_TEST_RESOURCE_DIR`:
- Classpath resources are useful when you want to fetch individual files, and are bundled with the application by the `.assembly` step when constructing an assembly jar for deployment. However, they do not allow you to list folders or perform other filesystem operations.
- The resource folder, available via `MILL_TEST_RESOURCE_DIR`, gives you access to the folder path of the resources on disk. This is useful for listing and otherwise manipulating the filesystem, which you cannot do with classpath resources. However, `MILL_TEST_RESOURCE_DIR` only exists when running tests using Mill, and is not available when executing applications packaged for deployment via `.assembly`.
- Apart from `resources/`, you can provide additional folders to your test suite by defining a `Task.Source` (`otherFiles` above) and passing it to `forkEnv`. This provides the folder path as an environment variable that the test can make use of.
Example application code demonstrating the techniques above can be seen below:
Hello World Resource File
Test Hello World Resource File A
Test Hello World Resource File B
Other Hello World File
package foo

object Foo {
  // Read `file.txt` from classpath
  def classpathResourceText = os.read(os.resource / "file.txt")
}
package foo
import utest._

object FooTests extends TestSuite {
  def tests = Tests {
    test("simple") {
      // Reference app module's `Foo` class which reads `file.txt` from classpath
      val appClasspathResourceText = Foo.classpathResourceText
      assert(appClasspathResourceText == "Hello World Resource File")

      // Read `test-file-a.txt` from classpath
      val testClasspathResourceText = os.read(os.resource / "test-file-a.txt")
      assert(testClasspathResourceText == "Test Hello World Resource File A")

      // Use `MILL_TEST_RESOURCE_DIR` to read `test-file-b.txt` from filesystem
      val testFileResourceDir = os.Path(sys.env("MILL_TEST_RESOURCE_DIR"))
      val testFileResourceText = os.read(testFileResourceDir / "test-file-b.txt")
      assert(testFileResourceText == "Test Hello World Resource File B")

      // Use `MILL_TEST_RESOURCE_DIR` to list files available in resource folder
      assert(
        os.list(testFileResourceDir).sorted ==
          Seq(testFileResourceDir / "test-file-a.txt", testFileResourceDir / "test-file-b.txt")
      )

      // Use the `OTHER_FILES_DIR` configured in your build to access the
      // files in `foo/test/other-files/`.
      val otherFileText = os.read(os.Path(sys.env("OTHER_FILES_DIR")) / "other-file.txt")
      assert(otherFileText == "Other Hello World File")
    }
  }
}
Note that tests require you to pass in any files that they depend on explicitly. This is necessary so that Mill knows when a test needs to be re-run and when a previous result can be cached. It also ensures that tests reading and writing to the current working directory do not accidentally interfere with each other's files, especially when running in parallel.
Mill runs test processes in a `sandbox/` folder, not in your project root folder, to prevent you from accidentally accessing files without explicitly passing them. Thus you cannot just read resources off disk via `new FileInputStream("foo/resources/test-file-a.txt")`.

If you have legacy tests that need to run in the project root folder to work, you can configure your test suite with `def testSandboxWorkingDir = false` to disable the sandbox and make the tests run in the project root.
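For example, a hypothetical legacy test module, following the `ScalaTests` pattern shown earlier (the `legacy` module name is illustrative):

```scala
object legacy extends ScalaModule {
  def scalaVersion = "2.13.8"
  object test extends ScalaTests {
    def ivyDeps = Agg(ivy"com.lihaoyi::utest:0.8.4")
    def testFramework = "utest.runner.Framework"
    // These tests read files relative to the project root, so disable
    // the sandbox working directory for this suite only
    def testSandboxWorkingDir = false
  }
}
```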
Scala Compiler Plugins
package build
import mill._, scalalib._

object `package` extends RootModule with ScalaModule {
  def scalaVersion = "2.13.8"
  def compileIvyDeps = Agg(ivy"com.lihaoyi:::acyclic:0.3.6")
  def scalacOptions = Seq("-P:acyclic:force")
  def scalacPluginIvyDeps = Agg(ivy"com.lihaoyi:::acyclic:0.3.6")
}
You can use Scala compiler plugins by setting `scalacPluginIvyDeps`. The above example also adds the plugin to `compileIvyDeps`, since that plugin's artifact is needed on the compilation classpath (though not at runtime).

Remember that compiler plugins are published against the full Scala version (e.g. 2.13.8 instead of just 2.13), so when including them make sure to use the `:::` syntax shown in the example above.
> ./mill compile
...
error: Unwanted cyclic dependency
error: ...src/Foo.scala...
error: def y = Bar.z
error: ...src/Bar.scala...
error: def x = Foo.y
Scaladoc Config
To generate API documentation you can use the `docJar` task on the module you'd like to create the documentation for, configured via `scalaDocOptions` or `javadocOptions`:
package build
import mill._, scalalib._

object foo extends ScalaModule {
  def scalaVersion = "3.1.3"
  def scalaDocOptions = Seq("-siteroot", "mydocs", "-no-link-warnings")
}
> ./mill show foo.docJar
> unzip -p out/foo/docJar.dest/out.jar foo/Foo.html
...
...My Awesome Docs for class Foo...
When using Scala 3 you're also able to use Scaladoc to generate a full static site next to your API documentation. This can include general documentation for your project and even a blog. While you can find the full documentation for this in the Scala 3 docs, below you'll find some useful information to help you generate this with Mill.
By default, Mill will consider the site root, as it's called in the Scala 3 docs, to be the value of `docResources()`. It will look there for your `_docs/` and your `_blog/` directory, if either exists. Given a project called `bar`:
object bar extends ScalaModule {
  def scalaVersion = "3.1.3"
}
Your project structure for this would look something like this:
.
├── build.mill
├── bar
│   ├── docs
│   │   ├── _blog
│   │   │   ├── _posts
│   │   │   │   └── 2022-08-14-hello-world.md
│   │   │   └── index.md
│   │   └── _docs
│   │       ├── getting-started.md
│   │       ├── index.html
│   │       └── index.md
│   └── src
│       └── example
│           └── Hello.scala
After generating your docs with `mill bar.docJar`, opening `out/bar/docJar.dest/javadoc/index.html` locally in your browser gives you a full static site including your API docs, your blog, and your documentation.
> ./mill show bar.docJar
> unzip -p out/bar/docJar.dest/out.jar bar/Bar.html
...
...<p>My Awesome Docs for class Bar</p>...
Specifying the Main Class
package build
import mill._, scalalib._

object `package` extends RootModule with ScalaModule {
  def scalaVersion = "2.13.8"
  def mainClass = Some("foo.Qux")
}
Mill's `foo.run` by default will discover which main class to run from your compilation output, but if there is more than one, or the main class comes from some library, you can explicitly specify which one to use. This also adds the main class to your `foo.jar` and `foo.assembly` jars.
> ./mill run
Hello Qux
Customizing the Assembly
package build
import mill._, scalalib._
import mill.scalalib.Assembly._

object foo extends ScalaModule {
  def moduleDeps = Seq(bar)
  def scalaVersion = "2.13.8"
  def ivyDeps = Agg(ivy"com.lihaoyi::os-lib:0.10.7")
  def assemblyRules = Seq(
    // all application.conf files will be concatenated into a single file
    Rule.Append("application.conf"),
    // all *.conf files will be concatenated into a single file
    Rule.AppendPattern(".*\\.conf"),
    // all *.temp files will be excluded from the final jar
    Rule.ExcludePattern(".*\\.temp"),
    // the `shapeless` package will be relocated under the `shade` package
    Rule.Relocate("shapeless.**", "shade.shapeless.@1")
  )
}

object bar extends ScalaModule {
  def scalaVersion = "2.13.8"
}
When you make a runnable jar of your project with the `assembly` command, you may want to exclude some files from the final jar (like signature files and manifest files from library jars), and merge duplicated files (for instance, `reference.conf` files from library dependencies).
By default, Mill excludes all `*.sf`, `*.dsa`, `*.rsa`, and `META-INF/MANIFEST.MF` files from the assembly, and concatenates all `reference.conf` files. You can also define your own merge/exclude rules.
> ./mill foo.assembly
> unzip -p ./out/foo/assembly.dest/out.jar application.conf || true
Bar Application Conf
Foo Application Conf
> java -jar ./out/foo/assembly.dest/out.jar
Loaded application.conf from resources:...
...Foo Application Conf
...Bar Application Conf
Cross-Scala-Version Modules
package build
import mill._, scalalib._

val scalaVersions = Seq("2.12.17", "2.13.8")

object foo extends Cross[FooModule](scalaVersions)
trait FooModule extends CrossScalaModule {
  def moduleDeps = Seq(bar())
}

object bar extends Cross[BarModule](scalaVersions)
trait BarModule extends CrossScalaModule
This is an example of cross-building a module across multiple Scala versions. Each module is replaced by a `Cross` module, which is given the list of strings you want the cross-module to be replicated for. You can then specify the cross-modules with square brackets when you want to run tasks on them.
`CrossScalaModule` supports both shared sources within `src/` as well as version-specific sources in `src-x/`, `src-x.y/`, or `src-x.y.z/` that apply to the cross-module with that version prefix.
> mill resolve __.run
foo[2.12.17].run
foo[2.13.8].run
bar[2.12.17].run
bar[2.13.8].run
> mill foo[2.12.17].run
Foo.value: Hello World Scala library version 2.12.17...
Bar.value: bar-value
Specific code for Scala 2.x
Specific code for Scala 2.12.x
> mill foo[2.13.8].run
Foo.value: Hello World Scala library version 2.13.8...
Bar.value: bar-value
Specific code for Scala 2.x
Specific code for Scala 2.13.x
> mill bar[2.13.8].run
Bar.value: bar-value
`CrossScalaModule`s can depend on each other using `moduleDeps`, but require the `()` suffix in `moduleDeps` to select the appropriate instance of the cross-module to depend on. You can also pass the `crossScalaVersion` explicitly to select the right version of the cross-module:
object foo2 extends Cross[Foo2Module](scalaVersions)
trait Foo2Module extends CrossScalaModule {
  def moduleDeps = Seq(bar(crossScalaVersion))
}

object bar2 extends Cross[Bar2Module](scalaVersions)
trait Bar2Module extends CrossScalaModule
Unidoc
package build
import mill._, scalalib._

object foo extends ScalaModule with UnidocModule {
  def scalaVersion = "2.13.8"
  def moduleDeps = Seq(bar, qux)

  object bar extends ScalaModule {
    def scalaVersion = "2.13.8"
  }

  object qux extends ScalaModule {
    def scalaVersion = "2.13.8"
    def moduleDeps = Seq(bar)
  }

  def unidocVersion = Some("0.1.0")
  def unidocSourceUrl = Some("https://github.com/lihaoyi/test/blob/master")
}
This example demonstrates the use of `mill.scalalib.UnidocModule`. This can be mixed into any `ScalaModule`, and generates a combined Scaladoc for the module and all its transitive dependencies. Two tasks are provided:
- `.unidocLocal`: generates a site suitable for local browsing. If `unidocSourceUrl` is provided, the Scaladoc provides links back to the local sources.
- `.unidocSite`: generates a site suitable for hosting online. If `unidocSourceUrl` is provided, the Scaladoc provides links back to the sources as browsable from the `unidocSourceUrl` base (e.g. on GitHub).
> ./mill show foo.unidocLocal
".../out/foo/unidocLocal.dest"
> cat out/foo/unidocLocal.dest/foo/Foo.html
...
...My Eloquent Scaladoc for Foo...
> cat out/foo/unidocLocal.dest/foo/qux/Qux.html
...
...My Excellent Scaladoc for Qux...
> cat out/foo/unidocLocal.dest/foo/bar/Bar.html
...
...My Lucid Scaladoc for Bar...
> ./mill show foo.unidocSite
Custom Tasks
This example shows how to define tasks that depend on other tasks:
- For `generatedSources`, we override the task and make it depend directly on `ivyDeps` to generate its source files — in this example, to include the list of dependencies as tuples within a static `object`.
- For `lineCount`, we define a brand-new task that depends on `sources`, and then override `forkArgs` to use it. That lets us access the line count at runtime using `sys.props` and print it when the program runs.
package build
import mill._, scalalib._

object `package` extends RootModule with ScalaModule {
  def scalaVersion = "2.13.8"
  def ivyDeps = Agg(ivy"com.lihaoyi::mainargs:0.4.0")

  def generatedSources: T[Seq[PathRef]] = Task {
    val prettyIvyDeps = for (ivyDep <- ivyDeps()) yield {
      val org = ivyDep.dep.module.organization.value
      val name = ivyDep.dep.module.name.value
      val version = ivyDep.dep.version
      s"""("$org", "$name", "$version")"""
    }
    os.write(
      Task.dest / "MyDeps.scala",
      s"""|package foo
          |object MyDeps {
          |  val value = List(
          |    ${prettyIvyDeps.mkString(",\n")}
          |  )
          |}
          |""".stripMargin
    )
    Seq(PathRef(Task.dest))
  }

  def lineCount: T[Int] = Task {
    sources()
      .flatMap(pathRef => os.walk(pathRef.path))
      .filter(_.ext == "scala")
      .map(os.read.lines(_).size)
      .sum
  }

  def forkArgs: T[Seq[String]] = Seq(s"-Dmy.line.count=${lineCount()}")

  def printLineCount() = Task.Command { println(lineCount()) }
}
The above build defines the customizations to the Mill task graph shown below, with the boxes representing tasks defined or overridden above and the un-boxed labels representing existing Mill tasks:
Mill lets you define new cached Tasks using the `Task {…}` syntax, depending on existing Tasks, e.g. `foo.sources`, via the `foo.sources()` syntax to extract their current value, as shown in `lineCount` above. The return type of a Task has to be JSON-serializable (using uPickle, one of Mill's Bundled Libraries) and the Task is cached when first run until its inputs change (in this case, if someone edits the `foo.sources` files which live in `foo/src`). Cached Tasks cannot take parameters.
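When you do need a parameterized entry point, a `Task.Command` (like `printLineCount` above) is the usual escape hatch, since commands may take arguments. This hypothetical `greet` command is a sketch of that pattern, reading the cached `lineCount` while accepting a CLI parameter:

```scala
// Hypothetical command: commands may take parameters, cached Tasks may not
def greet(name: String = "world") = Task.Command {
  println(s"hello $name, lineCount = ${lineCount()}")
}
```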
Note that depending on a task requires the use of parentheses after the task name, e.g. `ivyDeps()`, `sources()` and `lineCount()`. This converts the task of type `T[V]` into a value of type `V` that you can make use of in your task implementation.
This example can be run as follows:
> mill run --text hello
text: hello
MyDeps.value: List((com.lihaoyi,mainargs,0.4.0))
my.line.count: 14
> mill show lineCount
14
> mill printLineCount
14
Custom tasks can contain arbitrary code. Whether you want to download files using `requests.get`, shell out to Webpack to compile some JavaScript, generate sources to feed into a compiler, or create some custom jar/zip assembly with the files you want, all of these can simply be custom tasks with your code running in the `Task {…}` block. You can also import arbitrary Java or Scala libraries from Maven Central via `import $ivy` to use in your build.
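For instance, a sketch of a task that downloads a file at build time using the requests library pulled in via `import $ivy` (the URL, version, and `downloadedSpec` name are placeholders, not part of the example above):

```scala
// In build.mill: pull in a third-party library for use by the build itself
import $ivy.`com.lihaoyi::requests:0.8.0`

def downloadedSpec = Task {
  // download a file at build time and return it as a PathRef
  val bytes = requests.get("https://example.com/spec.json").bytes
  os.write(Task.dest / "spec.json", bytes)
  PathRef(Task.dest / "spec.json")
}
```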
You can create arbitrarily long chains of dependent tasks, and Mill will
handle the re-evaluation and caching of the tasks' output for you.
Mill also provides a `Task.dest` folder for each task to use as scratch space or to store files it wants to return:

- Any files a task creates should live within `Task.dest`.
- Any files a task modifies should be copied into `Task.dest` before being modified.
- Any files that a task returns should be returned as a `PathRef` to a path within `Task.dest`.

That ensures that the files belonging to a particular task all live in one place, avoiding file-name conflicts, preventing race conditions when tasks evaluate in parallel, and letting Mill automatically invalidate the files when the task's inputs change.
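A sketch of a task body following all three conventions, inside some module (the `notes.txt` input and `annotatedNotes` name are hypothetical):

```scala
def annotatedNotes: T[PathRef] = Task {
  // 1. copy the input into Task.dest before modifying it
  val dest = Task.dest / "notes.txt"
  os.copy(millSourcePath / "notes.txt", dest)
  // 2. any files the task creates or modifies live within Task.dest
  os.write.append(dest, "\n(annotated at build time)")
  // 3. return the result as a PathRef to a path within Task.dest
  PathRef(dest)
}
```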
Overriding Tasks
package build
import mill._, scalalib._

object foo extends ScalaModule {
  def scalaVersion = "2.13.8"

  def sources = Task {
    os.write(
      Task.dest / "Foo.scala",
      """|package foo
         |object Foo {
         |  def main(args: Array[String]): Unit = {
         |    println("Hello World")
         |  }
         |}
         |""".stripMargin
    )
    Seq(PathRef(Task.dest))
  }

  def compile = Task {
    println("Compiling...")
    super.compile()
  }

  def run(args: Task[Args] = Task.Anon(Args())) = Task.Command {
    println("Running..." + args().value.mkString(" "))
    super.run(args)()
  }
}
You can re-define tasks to override them, and use `super` if you want to refer to the originally defined task. The above example shows how to override `compile` and `run` to add additional logging messages, and we override `sources`, which was a `Task.Sources` for the `src/` folder, with a plain `Task {…}` that generates the necessary source files on-the-fly.
Note that this example replaces your `src/` folder with the generated sources, as we are overriding the `def sources` task. If you want to add generated sources, you can either override `generatedSources`, or you can override `sources` and use `super` to include the original source folder:
object foo2 extends ScalaModule {
  def scalaVersion = "2.13.8"
  def generatedSources = Task {
    os.write(Task.dest / "Foo.scala", """...""")
    Seq(PathRef(Task.dest))
  }
}

object foo3 extends ScalaModule {
  def scalaVersion = "2.13.8"
  def sources = Task {
    os.write(Task.dest / "Foo.scala", """...""")
    super.sources() ++ Seq(PathRef(Task.dest))
  }
}
In Mill builds the `override` keyword is optional.
> mill foo.run
Compiling...
Running...
Hello World
Using the Ammonite Repl / Scala console
All `ScalaModule`s have a `console` and a `repl` task, to start a Scala console or an Ammonite REPL.

When using the `console`, you can configure its scalac options using the `consoleScalacOptions` task. For example, you may want to inherit all of your regular `scalacOptions` but disable `-Xfatal-warnings`:
Using `consoleScalacOptions` to disable fatal warnings:

import mill._, scalalib._

object foo extends ScalaModule {
  def consoleScalacOptions = scalacOptions().filterNot(o => o == "-Xfatal-warnings")
}
To use the `repl`, you can (and sometimes need to) customize the Ammonite version to work with your selected Scala version. Mill provides a default Ammonite version, but depending on the Scala version you are using, there may be no matching Ammonite release available. In order to start the REPL, you may have to specify a different available Ammonite version.

Using `ammoniteVersion` to select a release compatible with the `scalaVersion`:
import mill._, scalalib._

object foo extends ScalaModule {
  def scalaVersion = "2.12.6"
  def ammoniteVersion = "2.4.0"
}
Why is Ammonite tied to the exact Scala version? Because Ammonite depends on the Scala compiler. In contrast to the Scala library, compiler releases do not guarantee any binary compatibility between releases. As a consequence, Ammonite needs full Scala-version-specific releases. The older your Mill version, or the newer the Scala version you want to use, the higher the risk that the default Ammonite version will not match.
Disabling incremental compilation with Zinc
By default all `ScalaModule`s use incremental compilation via Zinc to only recompile sources that have changed since the last compile, or ones that have been invalidated by changes to upstream sources.

If for any reason you want to disable incremental compilation for a module, you can override `zincIncrementalCompilation` and set it to `false`:
build.mill
import mill._, scalalib._

object foo extends ScalaModule {
  def scalaVersion = "2.13.8"
  def zincIncrementalCompilation = false
}