
dbuild's Introduction

dbuild

...is a build and debugging tool based on sbt.

It is used to coordinate the development of multiple, independent projects that evolve in parallel: dbuild uses a multi-project definition file to build all the requested projects, and makes sure that they all work together, even though each of them may evolve independently.

You can find the complete dbuild documentation at the dbuild web site.

Maintenance status

dbuild is now only used by the Scala 2 community build. It receives only light maintenance, as needed, from the Scala team at Lightbend.

Release Notes

See CHANGELOG.md.

dbuild's People

Contributors

adriaanm, dwijnand, jsuereth, ktoso, pvlugter, sethtisue


dbuild's Issues

support sbt 0.13

We're using sbt 0.13 for the new modules in Scala 2.11, and would like to use dbuild to build them.

This doesn't work, for example for https://github.com/scala/scala-partest/blob/master/build.sbt:

build.sbt:25: error: not found: value Def
[error] resourceGenerators in Compile <+= Def.task {
[error]                                   ^
[info] [error] Type error in expression
[info] Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? 
java.lang.RuntimeException: Failure to run sbt!  Error code: 1
    at scala.sys.package$.error(package.scala:27)
    at distributed.support.sbt.SbtRunner.run(SbtRunner.scala:46)

http://localhost:8088/artifactory/toni-maven is down

Ha ha, it's not down. When one types "sbt package" in a clean checkout of a project, even a private one, it should probably not produce dozens of errors like these:

[info] Resolving org.sonatype.sisu.inject#guice-bean;2.3.0 ...
[error] Server access Error: Connection refused url=http://localhost:8088/artifactory/toni-maven/org/sonatype/sisu/inject/guice-bean/2.3.0/guice-bean-2.3.0.pom
[info] Resolving org.sonatype.sisu.inject#containers;2.3.0 ...
[error] Server access Error: Connection refused url=http://localhost:8088/artifactory/toni-maven/org/sonatype/sisu/inject/containers/2.3.0/containers-2.3.0.pom
[info] Resolving org.sonatype.sisu#sisu-inject;2.3.0 ...
[error] Server access Error: Connection refused url=http://localhost:8088/artifactory/toni-maven/org/sonatype/sisu/sisu-inject/2.3.0/sisu-inject-2.3.0.pom
[info] Resolving org.sonatype.sisu#sisu-parent;2.3.0 ...
[error] Server access Error: Connection refused url=http://localhost:8088/artifactory/toni-maven/org/sonatype/sisu/sisu-parent/2.3.0/sisu-parent-2.3.0.pom
[info] Resolving org.sonatype.forge#forge-parent;10 ...
[error] Server access Error: Connection refused url=http://localhost:8088/artifactory/toni-maven/org/sonatype/sisu/sisu-parent/2.3.0/sisu-parent-2.3.0.jar
[error] Server access Error: Connection refused url=http://localhost:8088/artifactory/toni-maven/org/sonatype/sisu/sisu-inject/2.3.0/sisu-inject-2.3.0.jar
[error] Server access Error: Connection refused url=http://localhost:8088/artifactory/toni-maven/org/sonatype/sisu/inject/containers/2.3.0/containers-2.3.0.jar
[error] Server access Error: Connection refused url=http://localhost:8088/artifactory/toni-maven/org/sonatype/sisu/inject/guice-bean/2.3.0/guice-bean-2.3.0.jar
[error] Server access Error: Connection refused url=http://localhost:8088/artifactory/toni-maven/org/sonatype/sisu/inject/guice-plexus/2.3.0/guice-plexus-2.3.0.jar
[error] Server access Error: Connection refused url=http://localhost:8088/artifactory/toni-maven/org/sonatype/sisu/sisu-inject-plexus/2.3.0/sisu-inject-plexus-2.3.0.jar

Even if it has to fail for everyone who isn't toni, it should not have errors involving attempts to connect to toni's local maven repository.

global extra sbt commands

It would be great to be able to say:

global.extra.commands: ["set logLevel in update := Level.Warn"]

This would avoid repeating the setting in each sbt project.

Print the name of the config file used by dbuild

For example, if you check this log you'll see:

>> Building Zinc using dbuild
>> Fetching update for git://github.com/typesafehub/sbt-builds-for-ide
>> Checking out master
Starting dbuild...
[warn] Credentials will be ignored while deploying to file:///localhome/jenkins/b/workspace/pr-scala-integrate-ide/target/m2repo
[info] --== Extracting dependencies for scala-xml ==--

It doesn't tell you which config file was actually used from the sbt-builds-for-ide git repo.

Allow global variables in configuration files

As configuration files grow, it may be very handy to be able to define variables, like scalaVersion. Currently, the configuration follows a strict schema, so unknown variables trigger errors.

One possible solution is to have a separate section for variables, and enforce the schema only for the build and options subtrees. For instance:

{
  vars: {
    scalaVersion: "2.11.0-SNAPSHOT"
  }
  build: {
    projects: [
      {
        name:   "scala-lib",
        system: "ivy",
        uri:    "ivy:org.scala-lang#scala-library;"${scalaVersion}
      },
      ...
    ]
  }
}

Complexity problem when serializing/deserializing RepeatableProjectBuilds

The definition of a RepeatableProjectBuild includes dependencies: Seq[RepeatableProjectBuild], so each project maintains a list of its dependencies. As long as these structures are kept in memory, all is well; however, when the data is written to the cache repository, all of the nested definitions are recursively expanded, leading to a dramatic explosion in size and in JSON parsing time. With 17 projects, I have seen 3MB metadata files containing nothing but the same data over and over, taking tens of seconds to parse.
It is necessary to convert the RepeatableProjectBuild into a non-recursive structure for JSON serialization (and possibly also for inter-actor communication).
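A minimal sketch of one possible fix: flatten the recursive structure into a UUID-keyed table before serialization, so each build is stored exactly once. The names here (Build, FlatBuild, flatten) are illustrative stand-ins, not dbuild's actual classes.

```scala
// Recursive in-memory form (stand-in for RepeatableProjectBuild)
case class Build(uuid: String, name: String, dependencies: Seq[Build])
// Non-recursive serialized form: dependencies become UUID references
case class FlatBuild(uuid: String, name: String, dependencyUuids: Seq[String])

def flatten(root: Build): Map[String, FlatBuild] = {
  def go(b: Build, acc: Map[String, FlatBuild]): Map[String, FlatBuild] =
    if (acc.contains(b.uuid)) acc // already visited: shared deps stored once
    else b.dependencies.foldLeft(
      acc + (b.uuid -> FlatBuild(b.uuid, b.name, b.dependencies.map(_.uuid)))
    )((m, d) => go(d, m))
  go(root, Map.empty)
}
```

With this shape, 17 projects serialize as 17 table entries instead of a fully expanded tree, so metadata size and parse time grow linearly with the number of projects.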

dbuild should abort when running with JDK7 or later

Otherwise the Scala build fails. In that case it is necessary to delete the Scala build directory; at least, I did not manage to find out how to make it build the swing artifacts once it has skipped them.
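A minimal sketch of such a startup check, parsing the java.version system property; the object name, cutoff, and message are illustrative, not dbuild's actual code.

```scala
// Hypothetical startup guard: abort early on JDK 7+.
object JdkCheck {
  // "1.6.0_45" -> 6, "1.7.0_80" -> 7; post-JDK9 strings like "11.0.2" -> 11
  def majorVersion(javaVersion: String): Int = {
    val parts = javaVersion.split("\\.")
    if (parts(0) == "1") parts(1).toInt else parts(0).toInt
  }

  def checkOrAbort(javaVersion: String = sys.props("java.version")): Unit =
    if (majorVersion(javaVersion) >= 7)
      sys.error(s"the Scala build requires JDK 6; detected Java $javaVersion")
}
```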

Stack-wide MIMA

MiMa should run across the whole stack, to ensure binary compatibility with previous versions.

system properties are not visible

The Java system properties are not currently available for replacement within a dbuild configuration file. The relevant support should be added.

Repeatable build is no longer repeatable build config

It seems I can't just dump the repeatable build output from the log file into dbuild and reproduce a failed Jenkins build locally (exactly what Jenkins saw).

We should ensure this use case still works, so we can debug issues with weird environment parameters and other fun things.

Build path cannot contain '@'

Attempting a dsbt-build on a path that contains '@' results in an error. This is problematic when building under Jenkins, since an '@' is added to the path when concurrent builds are enabled.

Catch time out conditions, and notify

If a build gets stuck, dbuild will return with an "Unexpected" outcome, skipping notifications. It should intercept that instead, and send notifications as usual.

Specify repositories in the build configuration file

Repositories are specified in dbuild.properties, but it would be very useful to control them from the usual configuration file.

For instance, during PR validation, Scala artifacts are only published locally. dbuild should pick them up, but currently the only way to do that is to manually edit the property file. Having additional repositories in the build configuration would allow one to use variables, and stay robust in the face of new dbuild releases.

Load variables from properties file

The fix for #40 improves the situation (note that josh-hack-3 seems to have a solution that didn't make it into 0.6.5: it uses 'globals' rather than 'vars'), but it would be even more convenient to be able to load a properties file, making it easier to import many variables at once.

The scala build serializes all version numbers in the versions.properties file, and our bash scripts need to parse them and export them as bash variables. Simply specifying the properties file would be less prone to typos.

Happy to implement this myself, if you point me in the right direction. It's not clear to me how the implementation of #40 works.
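A sketch of how loading could work, assuming vars end up as a flat String-to-String map that gets merged into the dbuild configuration; the loadVars helper is hypothetical.

```scala
import java.io.Reader
import java.util.Properties

// Hypothetical helper: read a Java properties file into a vars map.
def loadVars(reader: Reader): Map[String, String] = {
  val props = new Properties()
  props.load(reader)
  var vars = Map.empty[String, String]
  val names = props.stringPropertyNames().iterator()
  while (names.hasNext) {
    val key = names.next()
    vars += key -> props.getProperty(key)
  }
  vars
}
```

The Scala build's versions.properties could then be referenced directly, removing the error-prone bash parsing step.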

incomplete dependency extraction

Found incorrect behavior in dependency extraction and rewriting. In one test case, the package "scala-io" contains a bunch of subprojects, including "file" and "core", where file depends on core.

The dbuild file for "scala-io" contains projects: ["file"]. During extraction, the list of dependencies of "file" are extracted.

During build, "file" depends on "core", which is in the same project. So "core" is compiled first. However, "core" depends on "scala-arm". Even though "scala-arm" is in the dbuild file, the dependency is not detected. The dependency is not rewritten, and compilation fails.

Possible approaches:

  1. A crude solution would be to disable the inter-project resolver. That way, dbuild would stop asking for the dependent subproject in the same project (in this case "core"); once "core" is added to the list of subprojects, all would work as expected.
  2. Better: track inter-project dependencies during extraction, and calculate dependencies including those internal ones, possibly printing a warning that some non-requested subprojects will be included anyway, as they are required by at least one of the subprojects explicitly requested in the dbuild file.
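Approach 2 amounts to a transitive closure of the requested subprojects over the intra-project dependency edges recorded during extraction; a sketch, with illustrative names:

```scala
// `deps` maps a subproject to the subprojects it directly depends on
// within the same project.
def requiredSubprojects(requested: Set[String],
                        deps: Map[String, Set[String]]): Set[String] = {
  @annotation.tailrec
  def go(frontier: Set[String], seen: Set[String]): Set[String] =
    if (frontier.isEmpty) seen
    else {
      val next = frontier.flatMap(p => deps.getOrElse(p, Set.empty[String])) -- seen
      go(next, seen ++ next)
    }
  go(requested, requested)
}
```

In the scala-io example, requesting only "file" would automatically pull in "core", and extraction over the full set would then correctly detect core's dependency on "scala-arm".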

Publishing twice to a remote repository may result in an error.

Building the same dsbt build project twice, when using a remote repository, may result in an attempt to overwrite files that already exist in the repository. Artifactory may deny that, as an additional DELETE privilege is required to overwrite. dbuild should not attempt to overwrite the metadata of an already existing project/build.

Ivy build system should be able to share local cache with sbt

This means we should use the same mechanisms as sbt to prevent "which repository did this come from" information from corrupting the cache. However, we should be able to share the resolution of Ivy artifacts, so that resolution doesn't take a long time, and results can be shared between builds for "stable" artifacts (things published to Maven/Ivy).

Incorrect repeatable build deserialization affects drepo

When serializing after extraction, a RepeatableProjectBuild contains List() elements as part of its nested elements. When deserializing from within drepo, the RepeatableProjectBuild ends up containing WrappedArray()s instead. This affects the UUIDs, and consequently drepo. The issue seems to be caused by the auto-list config facility using "SeqString", and in particular its SeqStringDeserializer.

Auto-clean build and extraction directories

The build directory of dbuild currently grows uncontrollably; during development, I end up generating 50-100GB of temporary files each week, which are never reclaimed.

Instead, when launched, dbuild should take care of automatically removing build and extraction files older than a configurable threshold (for instance, 3 days for successful builds and 10 days for failed builds; if we are rebuilding the same uuids, the timestamp should just be updated).

In addition, dbuild should ideally also emit a warning when target directories and caches of previous versions are found, as those directories are versioned and would live indefinitely unless removed manually.
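A sketch of the selection step for such a cleanup pass, assuming each build/extraction directory's age can be judged by its last-modified time; the staleDirs helper and the layout it assumes are illustrative, not dbuild's implementation.

```scala
import java.io.File

// Hypothetical helper: list immediate subdirectories older than maxAgeMillis.
def staleDirs(root: File, maxAgeMillis: Long,
              now: Long = System.currentTimeMillis): Seq[File] =
  Option(root.listFiles).getOrElse(Array.empty[File])
    .filter(d => d.isDirectory && now - d.lastModified > maxAgeMillis)
    .toSeq
```

A real implementation would also distinguish successful from failed builds, and refresh the timestamp of directories whose uuids are being rebuilt, as described above.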

dbuild-setup lacks needed meta

The meta information corresponding to a single project build is published only after the project has been built successfully. This means that the 'dbuild-setup' command cannot be used to re-test a failed build, as the meta file needed to reload and re-wire the dependencies is unavailable. Instead, the metadata should be published as soon as the build is first attempted (that is, when all of its dependencies are already available).

"One click" release and integration test

The build should have an integration test task that builds both the sbt 0.12 and sbt 0.13 plugins and all build system support, then runs a dbuild, ensuring things work and information is passed around. We can use dummy builds if needed.

See the sbt-remote-control project for an example of how to build multiple sbt plugins at the same time from one sbt build.

Support for sbt 0.12.x is limited due to dispatch version clash

When building spray-json I got:

[info] [info] Done updating.
[info] [error] java.lang.NoClassDefFoundError: dispatch/ImplicitHandlerVerbs
[info] [error] Use 'last' for the full log.
[info] Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? 
[error] java.lang.RuntimeException: 
[error]     at scala.sys.package$.error(package.scala:27)
[error]     at distributed.support.sbt.SbtRunner$$anonfun$run$1.apply(SbtRunner.scala:60)
[error]     at distributed.support.sbt.SbtRunner$$anonfun$run$1.apply(SbtRunner.scala:34)
[error]     at sbt.IO$.withTemporaryFile(IO.scala:311)

The issue is that the ls plugin and the dbuild plugin use different versions of dispatch. sbt doesn't offer classpath isolation for plugins, so we are in a bad situation.

I tried upgrading to sbt 0.13, where all plugins seem to agree on the dispatch version, but I ran into a regression in the boilerplate plugin that spray-json uses. Therefore, for now, we are stuck.

sbt-republish may fail in 0.7.0

There seems to be a mismatch concerning compiler-interface, in certain cases. For instance, in one test case the sbt jar published after the compilation of sbt is:
org.scala-sbt/compiler-interface_2.11.0-dbuildx1317a1280278e3f43e6d797a110a8f1a71fe410e/0.13.1-dbuildx7efb24a03fb92a1fec9a58adf4d9f9ad17ff018e/jars/compiler-interface-src_2.11.0-dbuildx1317a1280278e3f43e6d797a110a8f1a71fe410e.jar
but the file that sbt republish looks for is:
org.scala-sbt/compiler-interface_2.11.0-dbuildx1317a1280278e3f43e6d797a110a8f1a71fe410e/0.13.1-dbuildx7efb24a03fb92a1fec9a58adf4d9f9ad17ff018e/jars/compiler-interface-src.jar
That has to be a mismatch arising from one of the refactorings that intervened between 0.6.5 and 0.7.0; I'm investigating now.

Add option to reduce dbuild verbosity

@retronym suggests the following filtering:

NOISE='\[info\] (Resolving|downloading| \[SUCCESSFUL \]|Checking file|Including)'
./dbuild-runner.sh "community-2.11.x.dbuild" "0.7.1-M1" | egrep -v "$NOISE" | tee log.txt
egrep -q "The dbuild result is.*SUCCESS" log.txt

Similar functionality could be implemented directly within dbuild (possibly selecting which messages to include or omit), in order to reduce the output verbosity.

Add play build

Base it on this branch for now with fixes: https://github.com/jroper/Play20/tree/sbt-refactor

We need to filter out any Scala 2.9.2 projects (sbt-related) and just use the 'production' ones.

Later, we can make an SBT-based 'dbuild' script that builds SBT's nightlies and all sbt-related plugins, which would build the other portion of play.

Ivy build system should allow excludes

The Ivy build system should be able to exclude dependencies. This is a twofold problem:

  1. It should be able to exclude dependencies from showing up in dbuild metadata, thereby breaking cycles.
  2. It should also be able to exclude dependencies from being resolved when "building". This could help speed things up.

hard timeouts are being reached

object Timeouts {
  // overall timeout for the entire dbuild to complete;
  // should never be reached, unless something truly
  // unexpected occurs. One of the subsequent other
  // timeouts would rather be encountered beforehand.
  val dbuildTimeout: Timeout = 5.hours + 30.minutes
  // ...
}

Maybe the "truly unexpected" thing was us getting all of these projects to work in one build?

https://jenkins-dbuild.typesafe.com:8499/job/Community-2.11.x-retronym/buildTimeTrend

In any case, we're hitting the limit now, and need to make it configurable.

I would also love to see per-project timings in the summary, so we could know where the time is being spent without piecing that together from the log file. It would also be useful to know the breakdown between compilation, doc generation, and test execution.

/cc @adriaanm @gkossakowski

dbuild-setup may get confused if the clone dir has the same name of a subproject

The dbuild logic uses heuristics aimed at recognizing when an sbt subproject name really corresponds to the "default project" (this is a source of recurring trouble in sbt as well).
Among the possible variations on the name, a subproject with the same name as the containing directory is currently also interpreted as the default project (0.13 only); that assumption breaks with some projects (for instance specs2) that use the same string for one of their own (really defined) subprojects. This is only a problem when using dbuild-setup, since during compilation dbuild deliberately uses directories based on hashes, to avoid this specific issue.
A workaround is to rename the clone dir prior to using dbuild-setup; however, this should be documented as a caveat, and a more sophisticated heuristic should eventually be put in place.

support using a pre-built scala

PR validation should be as fast as possible, so we only want to build scala once, and test various projects against it using dbuild or other build tools. Thus, it would be good to be able to just use a scala build and compile other projects using that.

Long term, this isn't really necessary, assuming our full validation will be one dbuild job, that does parallel builds etc, but that seems like a longer term thing.

Add an optional parameter to dbuild, to specify a desired target project

Sometimes it is useful to use an existing dbuild configuration file to repeatedly test a single project; however, dbuild currently tries to build the full list of projects. An optional parameter should be added to the dbuild command so that, after extraction, only the specified target and its dependencies are rebuilt; the other, unrelated projects should receive the status "DID NOT RUN (skipped)", while the build file as a whole (project ".") should be "FAILED (incomplete)", as it was only partially run.

Support scp in deploy

The IDE uberscript uploads files, at the end of the build, using scp. The same functionality needs to be added as an additional scheme to the dbuild deploy stage.

Restrict which dependency versions get replaced

We need something like a 'provides' mechanism when rewiring dependencies.

For example, the Play project would need the scala dependency rewired for some projects (that currently depend on scala 2.10.0-M7), but not all. Some projects would stay on scala 2.9.2, due to lining up with sbt 0.12. We still want to build the latest version of the scala 2.9.2 projects as these will be depended on downstream.
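One way to model such a 'provides'-style opt-in is a per-project set of dependency organizations eligible for rewiring; a sketch, where all names (shouldRewire, rewireFor, the project names in the usage) are hypothetical:

```scala
// Rewire a dependency organization for a project only if that project
// has explicitly opted in for it.
def shouldRewire(project: String, depOrg: String,
                 rewireFor: Map[String, Set[String]]): Boolean =
  rewireFor.getOrElse(project, Set.empty[String]).contains(depOrg)
```

Projects staying on Scala 2.9.2 would simply not be listed for "org.scala-lang", so their Scala dependency would be left untouched while they are still rebuilt and published for downstream consumers.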

Remote repository cleanup tool

Now that we're running nightlies and have lots of artifacts, we may want to run a cleanup tool.

  • Remove all artifacts/metadata older than X days
  • Preserve specific 'tagged' metadata/artifacts

Scala home generation in sbt-build is fragile and now wrong

The current mechanism of generating scala-home assumes a few incorrect things:

  1. All necessary artifacts are in the org.scala-lang namespace
  2. Artifacts needed for scala-home are not cross-versioned.

The migration to modules in Scala 2.11 has booted scala-xml and scala-parser-combinators out of the main repository. These are both cross-versioned, and have different versions and module ids than Scala itself. They are also both needed to run scaladoc, so they must be in the ScalaInstance in sbt.

Also, I'm unaware of whether or not resolving scala via Ivy (in 2.11) will work correctly, since the dependency information may be off. I have not tried it.

We have a few options here:

  1. Continue to use my hacky workaround (pending in a PR). Here we just look for random jars and make good guesses as to which should be included.
  2. Try to resolve the scala-compiler.jar artifact using ivy and all dependencies into the ScalaInstance.
  3. Try to embed inter-artifact dependencies in dbuild metadata, and use the scala-compiler artifact to resolve itself and all dependencies.

I think we should aim for option 2.

Project names cannot be too short

If one of the projects in a .dsbt file has a very short name, like "a", the build will fail:

java.lang.IllegalArgumentException: Prefix string too short
    at java.io.File.createTempFile0(File.java:1728)
    at java.io.File.createTempFile(File.java:1850)
    at sbt.IO$.withTemporaryFile(IO.scala:310)

It's an easy fix, but I'm writing it down so that it's not forgotten.
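The likely fix is to pad short names before using them as temp-file prefixes, since java.io.File.createTempFile throws IllegalArgumentException for prefixes shorter than three characters; a sketch, where tempFilePrefix and the padding character are hypothetical:

```scala
// Pad project names that are too short to serve as createTempFile prefixes.
def tempFilePrefix(projectName: String): String =
  if (projectName.length >= 3) projectName
  else (projectName + "___").take(3)
```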

dsbt-build may leak artifacts to ~/.ivy2

dsbt-build should, in theory, confine all of the artifacts loaded during its operation to the directory containing the project. However, at present some artifacts may be loaded into the ~/.ivy2 cache directory, specifically during the loading stages of the nested sbt invocations. Although not a major issue, it should be easy to fix by changing the ivy.root used while starting sbt.

Consolidate settings fixup routines

At this time, dsbt-build and dsbt-setup use two similar, but unrelated, sets of routines to modify the sbt state. dsbt-build should be modified to use the newer routines.
