
Conan Package Tools. Helps with massive package creation and CI integration (Travis CI, Appveyor...)

License: MIT License


This project is no longer recommended or maintained 🛑

This project is no longer maintained and will not get any fixes or support. It will soon be fully archived. Modern Conan 2.0 extensions can be found at https://github.com/conan-io/conan-extensions

Conan 2.0 support ⚠️

The Conan Package Tools project does not support Conan 2.x and there is no planned support.

If you need such support, please open an issue explaining your use case in more detail.

Conan Package Tools

Introduction

This package allows you to automate the creation of Conan packages for different configurations.

It eases the integration with CI servers like Travis CI and Appveyor, so you can use the cloud to generate different binary packages for your Conan recipe.

It also supports Docker to create packages for different GCC and Clang versions.

Installation

$ pip install conan_package_tools

Or you can clone this repository and add its location to PYTHONPATH.

How it works

Using only the Conan C/C++ package manager (without Conan Package Tools), you can use the conan create command to generate, for the same recipe, different binary packages for different configurations. The easiest way to do that is using profiles:

$ conan create myuser/channel --profile win32
$ conan create myuser/channel --profile raspi
$ ...

The profiles can contain settings, options, environment variables and build_requires. Take a look at the Conan docs to learn more.

Conan Package Tools allows you to declare (or autogenerate) a set of different configurations (different profiles). It will call conan create for each one, upload the generated packages to a remote (if needed), and optionally use Docker images to ease the creation of binaries for different compiler versions (gcc and clang supported).

Basic, but not very practical, example

Create a build.py file in your recipe repository, and add the following lines:

from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    builder = ConanMultiPackager(username="myusername")
    builder.add(settings={"arch": "x86", "build_type": "Debug"},
                options={}, env_vars={}, build_requires={})
    builder.add(settings={"arch": "x86_64", "build_type": "Debug"},
                options={}, env_vars={}, build_requires={})
    builder.run()

Now we can run the Python script; ConanMultiPackager will run the conan create command twice, once to generate the x86 Debug package and once for x86_64 Debug.

> python build.py

############## CONAN PACKAGE TOOLS ######################

INFO: ******** RUNNING BUILD **********
conan create myuser/testing --profile /var/folders/y1/9qybgph50sjg_3sm2_ztlm6dr56zsd/T/tmpz83xXmconan_package_tools_profiles/profile

[build_requires]
[settings]
arch=x86
build_type=Debug
[options]
[scopes]
[env]

...


############## CONAN PACKAGE TOOLS ######################

INFO: ******** RUNNING BUILD **********
conan create myuser/testing --profile /var/folders/y1/9qybgph50sjg_3sm2_ztlm6dr56zsd/T/tmpMiqSZUconan_package_tools_profiles/profile

[build_requires]
[settings]
arch=x86_64
build_type=Debug
[options]
[scopes]
[env]


#########################################################

...

If we inspect the local cache we can see that there are two binaries generated for our recipe, in this case the zlib recipe:

$ conan search zlib/1.2.11@myuser/testing

Existing packages for recipe zlib/1.2.11@myuser/testing:

Package_ID: a792eaa8ec188d30441564f5ba593ed5b0136807
    [options]
        shared: False
    [settings]
        arch: x86
        build_type: Debug
        compiler: apple-clang
        compiler.version: 9.0
        os: Macos
    outdated from recipe: False

Package_ID: e68b263f26a4d7513e28c9cae1673aa0466af777
    [options]
        shared: False
    [settings]
        arch: x86_64
        build_type: Debug
        compiler: apple-clang
        compiler.version: 9.0
        os: Macos
    outdated from recipe: False

Now we could add new build configurations, but in this case we only want to add Visual Studio configurations (including the runtime), and, of course, only if we are on Windows:

import platform
from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    builder = ConanMultiPackager(username="myusername")
    if platform.system() == "Windows":
        builder.add(settings={"arch": "x86", "build_type": "Debug", "compiler": "Visual Studio", "compiler.version": 14, "compiler.runtime": "MTd"},
                    options={}, env_vars={}, build_requires={})
        builder.add(settings={"arch": "x86_64", "build_type": "Release", "compiler": "Visual Studio", "compiler.version": 14, "compiler.runtime": "MT"},
                    options={}, env_vars={}, build_requires={})
    else:
        builder.add(settings={"arch": "x86", "build_type": "Debug"},
                    options={}, env_vars={}, build_requires={})
        builder.add(settings={"arch": "x86_64", "build_type": "Debug"},
                    options={}, env_vars={}, build_requires={})
    builder.run()

In the previous example, when we are on Windows, we are adding two build configurations:

- "Visual Studio 14, Debug, MTd runtime"
- "Visual Studio 14, Release, MT runtime"

We can also adjust the options, environment variables and build_requires:

from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    builder = ConanMultiPackager(username="myuser")
    builder.add({"arch": "x86", "build_type": "Release"},
                {"mypackage:option1": "ON"},
                {"PATH": "/path/to/custom"},
                {"*": ["MyBuildPackage/1.0@lasote/testing"]})
    builder.add({"arch": "x86_64", "build_type": "Release"}, {"mypackage:option1": "ON"})
    builder.add({"arch": "x86", "build_type": "Debug"}, {"mypackage:option2": "OFF", "mypackage:shared": True})
    builder.run()

We could continue adding configurations, but you have probably realized that it would be a tedious task if you wanted to generate many different configurations on different operating systems, with different compilers, different compiler versions, etc.

Generating the build configurations automatically

Conan Package Tools can automatically generate a matrix of build configurations combining architecture, compiler, compiler.version, compiler.runtime, compiler.libcxx, build_type and shared/static options.

from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    builder = ConanMultiPackager()
    builder.add_common_builds()
    builder.run()

If you run the python build.py command, for instance, on Mac OSX, it will add the following configurations automatically:

{'compiler.version': '7.3', 'arch': 'x86', 'build_type': 'Release', 'compiler': 'apple-clang'}
{'compiler.version': '7.3', 'arch': 'x86', 'build_type': 'Debug', 'compiler': 'apple-clang'}
{'compiler.version': '7.3', 'arch': 'x86_64', 'build_type': 'Release', 'compiler': 'apple-clang'}
{'compiler.version': '7.3', 'arch': 'x86_64', 'build_type': 'Debug', 'compiler': 'apple-clang'}
{'compiler.version': '8.0', 'arch': 'x86', 'build_type': 'Release', 'compiler': 'apple-clang'}
{'compiler.version': '8.0', 'arch': 'x86', 'build_type': 'Debug', 'compiler': 'apple-clang'}
{'compiler.version': '8.0', 'arch': 'x86_64', 'build_type': 'Release', 'compiler': 'apple-clang'}
{'compiler.version': '8.0', 'arch': 'x86_64', 'build_type': 'Debug', 'compiler': 'apple-clang'}
{'compiler.version': '8.1', 'arch': 'x86', 'build_type': 'Release', 'compiler': 'apple-clang'}
{'compiler.version': '8.1', 'arch': 'x86', 'build_type': 'Debug', 'compiler': 'apple-clang'}
{'compiler.version': '8.1', 'arch': 'x86_64', 'build_type': 'Release', 'compiler': 'apple-clang'}
{'compiler.version': '8.1', 'arch': 'x86_64', 'build_type': 'Debug', 'compiler': 'apple-clang'}

These are all the combinations of arch=x86/x86_64, build_type=Release/Debug for different compiler versions.

But having different apple-clang compiler versions installed on the same machine is not common at all. We can adjust the compiler versions using a parameter or an environment variable, which is especially useful in a CI environment:

from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    builder = ConanMultiPackager(apple_clang_versions=["9.0"]) # or declare env var CONAN_APPLE_CLANG_VERSIONS=9.0
    builder.add_common_builds()
    builder.run()

In this case, it will call conan create with only these configurations:

{'compiler.version': '9.0', 'arch': 'x86', 'build_type': 'Release', 'compiler': 'apple-clang'}
{'compiler.version': '9.0', 'arch': 'x86', 'build_type': 'Debug', 'compiler': 'apple-clang'}
{'compiler.version': '9.0', 'arch': 'x86_64', 'build_type': 'Release', 'compiler': 'apple-clang'}
{'compiler.version': '9.0', 'arch': 'x86_64', 'build_type': 'Debug', 'compiler': 'apple-clang'}

You can adjust other constructor parameters to control the build configurations that will be generated:

  • gcc_versions: Generate only build configurations for the specified gcc versions (ignored if the current machine is not Linux)
  • visual_versions: Generate only build configurations for the specified Visual Studio versions (ignored if the current machine is not Windows)
  • visual_runtimes: Generate only build configurations for the specified runtimes (only for Visual Studio)
  • visual_toolsets: Specify the toolsets for each specified Visual Studio version (only for Visual Studio)
  • msvc_versions: Generate only build configurations for the specified msvc versions (ignored if the current machine is not Windows)
  • msvc_runtimes: Generate only build configurations for the specified runtimes (only for msvc)
  • msvc_runtime_types: Specify the runtime types for each specified msvc version (only for msvc)
  • apple_clang_versions: Generate only build configurations for the specified apple-clang versions (ignored if the current machine is not OSX)
  • archs: Generate build configurations for the specified architectures, by default ["x86", "x86_64"].
  • build_types: Generate build configurations for the specified build_types, by default ["Debug", "Release"].

Or you can adjust environment variables:

  • CONAN_GCC_VERSIONS
  • CONAN_VISUAL_VERSIONS
  • CONAN_VISUAL_RUNTIMES
  • CONAN_VISUAL_TOOLSETS
  • CONAN_MSVC_VERSIONS
  • CONAN_MSVC_RUNTIMES
  • CONAN_MSVC_RUNTIME_TYPES
  • CONAN_APPLE_CLANG_VERSIONS
  • CONAN_CLANG_VERSIONS
  • CONAN_ARCHS
  • CONAN_BUILD_TYPES

Check the REFERENCE section to see all the parameters and ENVIRONMENT VARIABLES available.


IMPORTANT! Both the constructor parameters and the corresponding environment variables of the previous list ONLY have effect when using builder.add_common_builds().


So, if we want to generate packages for x86_64 and armv8 but only for Debug and apple-clang 9.0:

$ export CONAN_ARCHS=x86_64,armv8
$ export CONAN_APPLE_CLANG_VERSIONS=9.0
$ export CONAN_BUILD_TYPES=Debug

$ python build.py

There are also some additional parameters to add_common_builds:

  • pure_c: (Default True) If your project is C++, pass pure_c=False to add the combinations using libstdc++ and libstdc++11 for the compiler.libcxx setting. When True, the default profile value of libcxx will be applied. If you don't want the libcxx value to apply to your binary packages you have to use the configure method to remove it:
    def configure(self):
        del self.settings.compiler.libcxx
  • shared_option_name: If your conanfile.py has a shared option, the generated builds will automatically contain the "True/False" combination for that option. Pass False to deactivate it, or "lib_name:option_name" to specify a custom option name, e.g. boost:my_shared.
  • dll_with_static_runtime: Will also add the combination of the MT runtime with shared libraries.
  • header_only: If your conanfile.py has a header_only option, the generated builds will automatically contain the "True/False" combination for that option #454.
  • build_all_options_values: Includes all possible values for the listed options #457.

from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    builder = ConanMultiPackager()
    builder.add_common_builds(shared_option_name="mypackagename:shared", pure_c=False)
    builder.run()

Filtering or modifying the configurations

Use the remove_build_if helper with a lambda function to filter configurations:

from cpt.packager import ConanMultiPackager

builder = ConanMultiPackager(username="myuser")
builder.add_common_builds()
builder.remove_build_if(lambda build: build.settings["compiler.version"] == "4.6" and build.settings["build_type"] == "Debug")

Use the update_build_if helper with a lambda function to alter configurations:

from cpt.packager import ConanMultiPackager

builder = ConanMultiPackager(username="myuser")
builder.add_common_builds()
builder.update_build_if(lambda build: build.settings["os"] == "Windows",
                        new_build_requires={"*": ["7zip_installer/0.1.0@conan/stable"]})
# Also available parameters:
#    new_settings, new_options, new_env_vars, new_build_requires, new_reference

Or you can iterate the builds directly to make any change, e.g. removing the GCC 4.6 packages with build_type=Debug:

from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    builder = ConanMultiPackager(username="myuser")
    builder.add_common_builds()
    filtered_builds = []
    for settings, options, env_vars, build_requires, reference in builder.items:
        if not (settings["compiler.version"] == "4.6" and settings["build_type"] == "Debug"):
             filtered_builds.append([settings, options, env_vars, build_requires, reference])
    builder.builds = filtered_builds
    builder.run()

Package Version based on Commit Checksum

Sometimes you want to use Conan in-source and you do not need to specify a version in the recipe; it could be configured by your build environment. Usually you could use the branch name as the package version, but if you want to create unique packages for each new build, uploading them without overriding previous ones on your remote, you will need a new version for each build. In this case the branch name will not be enough, so a possible approach is to use your current commit checksum as the version:

from cpt.packager import ConanMultiPackager
from cpt.ci_manager import CIManager
from cpt.printer import Printer


if __name__ == "__main__":
    printer = Printer()
    ci_manager = CIManager(printer)
    builder = ConanMultiPackager(reference="mypackage/{}".format(ci_manager.get_commit_id()[:7]))
    builder.add_common_builds()
    builder.run()

As a SHA-1 hash is 40 characters long, you can format the result to a shorter size (seven characters in the example above).

Save created packages summary

If you want to integrate CPT with other tools, for example to run build logic after creating packages, you can save a JSON report about all configurations and packages.

Examples:

from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    builder = ConanMultiPackager()
    builder.add_common_builds()
    builder.run(summary_file='cpt_summary_file.json')


from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    builder = ConanMultiPackager()
    builder.add_common_builds()
    builder.run()
    builder.save_packages_summary('cpt_summary_file.json')

Alternatively, you can use the CPT_SUMMARY_FILE environment variable to set the summary file path.
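
As a minimal sketch of consuming that report (the exact JSON schema is not described here, so this just loads and prints whatever the file contains):

import json

# Hypothetical post-processing step: load the summary report written by
# builder.run(summary_file=...) or pointed to by CPT_SUMMARY_FILE.
with open("cpt_summary_file.json") as f:
    summary = json.load(f)

# The report structure is not documented here, so just dump it for
# inspection before adding any real build logic on top of it.
print(json.dumps(summary, indent=2))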

Using all values for custom options

Sometimes you want to include more options in your matrix, covering all possible combinations; for that you can use build_all_options_values:

from cpt.packager import ConanMultiPackager


if __name__ == "__main__":
    builder = ConanMultiPackager(reference="mypackage/0.1.0")
    builder.add_common_builds(build_all_options_values=["mypackage:foo", "mypackage:bar"])
    builder.run()

Now let's say mypackage's recipe contains the following options: shared, fPIC, foo and bar. Both foo and bar accept True or False. The add_common_builds method will generate a matrix including both foo and bar with all possible combinations.

Using Docker

Instantiate ConanMultiPackager with the parameter use_docker=True, or declare the environment variable CONAN_USE_DOCKER: it will launch, when needed, a container for the build configuration that is being built (only for Linux builds).

There are docker images available for different gcc versions: 4.6, 4.8, 4.9, 5, 6, 7 and clang versions: 3.8, 3.9, 4.0.

The containers will share the conan storage directory, so the packages will be generated in your conan directory.

Example:

from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    builder = ConanMultiPackager()
    builder.add_common_builds()
    builder.run()

And run the build.py:

$ export CONAN_USERNAME=myuser
$ export CONAN_GCC_VERSIONS=4.9
$ export CONAN_DOCKER_IMAGE=conanio/gcc49
$ export CONAN_USE_DOCKER=1
$ python build.py

It will generate a set of build configurations (profiles) for gcc 4.9 and will run them inside a container of the conanio/gcc49 image.

If you want to run the arch="x86" builds inside a 32-bit Docker container, you can set the docker_32_images parameter in the ConanMultiPackager constructor or set the environment variable CONAN_DOCKER_32_IMAGES. In this case, the Docker image name to use will have -i386 appended.
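
For instance, a minimal sketch combining both options (conanio/gcc49 is the image from the example above; with docker_32_images enabled, CPT would look for conanio/gcc49-i386 for the x86 builds):

from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    # Build only 32-bit configurations and run them in the -i386 variant
    # of the selected Docker image (conanio/gcc49-i386 in this case).
    builder = ConanMultiPackager(use_docker=True,
                                 docker_image="conanio/gcc49",
                                 docker_32_images=True,
                                 gcc_versions=["4.9"],
                                 archs=["x86"])
    builder.add_common_builds()
    builder.run()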

The Docker images used by default, both for 64 and 32 bits, are pushed to Docker Hub and their Dockerfiles are available in the conan-docker-tools repository.

Running scripts and executing commands before building on Docker

When Conan Package Tools uses Docker to build your packages, sometimes you need to execute a "before build" step. If you need to install packages, change files or set something up, there is an option for that: docker_entry_script.

Example:

This example shows how to install the tzdata package with apt-get before building the Conan package.

from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    command = "sudo apt-get -qq update && sudo apt-get -qq install -y tzdata"
    builder = ConanMultiPackager(use_docker=True, docker_image='conanio/gcc7', docker_entry_script=command)
    builder.add_common_builds()
    builder.run()

It is also possible to run an internal script before building the package:

from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    command = "python bootstrap.py"
    builder = ConanMultiPackager(use_docker=True, docker_image='conanio/gcc7', docker_entry_script=command)
    builder.add_common_builds()
    builder.run()

Using with your own Docker images

The default location inside the Docker container is /home/conan on Linux and C:\Users\ContainerAdministrator on Windows. This is fine if you use the conan Docker images but if you are using your own image, these locations probably won't exist.

To use a different location, you can use the option docker_conan_home or the environment variable CONAN_DOCKER_HOME.
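
A minimal sketch, where myorg/builder-image and /opt/conan are hypothetical placeholders for your own image and a directory that exists inside it:

from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    # myorg/builder-image and /opt/conan are placeholder values for a
    # custom Docker image and the working directory inside that image.
    builder = ConanMultiPackager(use_docker=True,
                                 docker_image="myorg/builder-image",
                                 docker_conan_home="/opt/conan")
    builder.add_common_builds()
    builder.run()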

Installing extra Python packages before building

Maybe you need to install some Python packages using pip before building your Conan package. To solve this, you can use pip_install:

Example:

This example installs bincrafters-package-tools and conan-promote before building:

from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    builder = ConanMultiPackager(pip_install=["bincrafters-package-tools==0.17.0", "conan-promote==0.1.2"])
    builder.add_common_builds()
    builder.run()

But if you prefer to use environment variables:

export CONAN_PIP_INSTALL="bincrafters-package-tools==0.17.0,conan-promote==0.1.2"

Passing additional Docker parameters during build

When running the conan create step in Docker, you might want to run the container with a different Docker network, for example. For this you can use the docker_run_options parameter (or the CONAN_DOCKER_RUN_OPTIONS environment variable):

builder = ConanMultiPackager(
    docker_run_options='--network bridge --privileged',
    ...)

When run, this will translate to something like this:

sudo -E docker run ... --network bridge --privileged conanio/gcc6 /bin/sh -c "cd project &&  run_create_in_docker"

Installing custom Conan config

To manage custom profiles and remotes, Conan provides the config feature, where it is possible to edit the conan.conf or install configuration files.

If you need to run conan config install <url> before building, there is the config_url argument in CPT:

from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    config_url = "https://github.com/bincrafters/conan-config.git"
    builder = ConanMultiPackager(config_url=config_url)
    builder.add_common_builds()
    builder.run()

But if you prefer not to update your build.py script, it's possible to use environment variables instead:

export CONAN_CONFIG_URL=https://github.com/bincrafters/conan-config.git

Specifying a different base profile

The options, settings and environment variables that the add_common_builds() method generates are applied on top of the default profile of the Conan installation. If you want to use a different profile, you can pass the name of the profile to the run() method.

Example:

from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    builder = ConanMultiPackager(clang_versions=["3.8", "3.9"])
    builder.add_common_builds()
    builder.run("myclang")

Alternatively you can use the CONAN_BASE_PROFILE environment variable to choose a different base profile:

CONAN_BASE_PROFILE=myclang

Specifying build context for cross building

Since Conan 1.24, you can pass an additional profile for the build context, so you can specify both the host and build profiles:

from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    builder = ConanMultiPackager(gcc_versions=["10", "11"])
    builder.add_common_builds()
    builder.run(base_profile_name="raspberrypi", base_profile_build_name="default")

The base_profile_name is equivalent to the host profile (where your libraries and executables will run), and the base_profile_build_name is the profile for the machine where the artifacts are built.

Also, you can use environment variables instead:

CONAN_BASE_PROFILE=raspberrypi
CONAN_BASE_PROFILE_BUILD=default

Read more about the build context here

The CI integration

If you are going to use a CI server to generate different binary packages for your recipe, the best approach is to control the build configurations with environment variables.

So, having a generic build.py should be enough for almost all the cases:

from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    builder = ConanMultiPackager()
    builder.add_common_builds(shared_option_name="mypackagename:shared", pure_c=False)
    builder.run()

Then, in your CI configuration, you can declare different environment variables to limit the build configurations to a specific compiler version, use a specific Docker image, etc.

For example, if you declare the following environment variables:

CONAN_GCC_VERSIONS=4.9
CONAN_DOCKER_IMAGE=conanio/gcc49

the add_common_builds() method will only add build configurations for GCC 4.9 and will run them in a Docker container.

You can see working integrations with Travis and Appveyor in the zlib repository here.

Travis integration

Travis CI can generate a build with multiple jobs by defining a matrix of environment variables. We can configure the builds to be executed in those jobs by defining some environment variables.

The following is a real example of a .travis.yml file that will generate packages for Linux gcc (4.9, 5, 6 and 7), Linux Clang (3.9 and 4.0) and OSX with apple-clang (7.3, 8.0, 8.1 and 9.0).

Remember, you can use the conan new command to generate the base files for Appveyor, Travis, etc. Check conan new --help.

.travis.yml example:

env:
   global:
     - CONAN_USERNAME: "conan" # ADJUST WITH YOUR REFERENCE USERNAME!
     - CONAN_LOGIN_USERNAME: "lasote" # ADJUST WITH YOUR LOGIN USERNAME!
     - CONAN_CHANNEL: "testing" # ADJUST WITH YOUR CHANNEL!
     - CONAN_UPLOAD: "https://api.bintray.com/conan/conan-community/conan" # ADJUST WITH YOUR REMOTE!
     - CONAN_STABLE_BRANCH_PATTERN: "release/*"
     - CONAN_UPLOAD_ONLY_WHEN_STABLE: 1 # Will only upload when the branch matches "release/*"

linux: &linux
   os: linux
   sudo: required
   language: python
   python: "3.6"
   services:
     - docker
osx: &osx
   os: osx
   language: generic
matrix:
   include:

      - <<: *linux
        env: CONAN_GCC_VERSIONS=4.9 CONAN_DOCKER_IMAGE=conanio/gcc49
      - <<: *linux
        env: CONAN_GCC_VERSIONS=5 CONAN_DOCKER_IMAGE=conanio/gcc5
      - <<: *linux
        env: CONAN_GCC_VERSIONS=6 CONAN_DOCKER_IMAGE=conanio/gcc6
      - <<: *linux
        env: CONAN_GCC_VERSIONS=7 CONAN_DOCKER_IMAGE=conanio/gcc7
      - <<: *linux
        env: CONAN_CLANG_VERSIONS=3.9 CONAN_DOCKER_IMAGE=conanio/clang39
      - <<: *linux
        env: CONAN_CLANG_VERSIONS=4.0 CONAN_DOCKER_IMAGE=conanio/clang40
      - <<: *osx
        osx_image: xcode7.3
        env: CONAN_APPLE_CLANG_VERSIONS=7.3
      - <<: *osx
        osx_image: xcode8.2
        env: CONAN_APPLE_CLANG_VERSIONS=8.0
      - <<: *osx
        osx_image: xcode8.3
        env: CONAN_APPLE_CLANG_VERSIONS=8.1
      - <<: *osx
        osx_image: xcode9
        env: CONAN_APPLE_CLANG_VERSIONS=9.0

install:
  - chmod +x .travis/install.sh
  - ./.travis/install.sh

script:
  - chmod +x .travis/run.sh
  - ./.travis/run.sh

You can also use multiple "pages" to split the builds into different jobs (check the Pagination section first to understand how it works):

.travis.yml

env:
   global:
     - CONAN_USERNAME: "conan" # ADJUST WITH YOUR REFERENCE USERNAME!
     - CONAN_LOGIN_USERNAME: "lasote" # ADJUST WITH YOUR LOGIN USERNAME!
     - CONAN_CHANNEL: "testing" # ADJUST WITH YOUR CHANNEL!
     - CONAN_UPLOAD: "https://api.bintray.com/conan/conan-community/conan" # ADJUST WITH YOUR REMOTE!
     - CONAN_STABLE_BRANCH_PATTERN: "release/*"
     - CONAN_UPLOAD_ONLY_WHEN_STABLE: 1 # Will only upload when the branch matches "release/*"

linux: &linux
   os: linux
   sudo: required
   language: python
   python: "3.6"
   services:
     - docker
osx: &osx
   os: osx
   language: generic
matrix:
   include:

      - <<: *linux
        env: CONAN_GCC_VERSIONS=4.9 CONAN_DOCKER_IMAGE=conanio/gcc49 CONAN_CURRENT_PAGE=1

      - <<: *linux
        env: CONAN_GCC_VERSIONS=4.9 CONAN_DOCKER_IMAGE=conanio/gcc49 CONAN_CURRENT_PAGE=2

      - <<: *linux
        env: CONAN_GCC_VERSIONS=5 CONAN_DOCKER_IMAGE=conanio/gcc5 CONAN_CURRENT_PAGE=1

      - <<: *linux
        env: CONAN_GCC_VERSIONS=5 CONAN_DOCKER_IMAGE=conanio/gcc5 CONAN_CURRENT_PAGE=2

      - <<: *linux
        env: CONAN_GCC_VERSIONS=6 CONAN_DOCKER_IMAGE=conanio/gcc6 CONAN_CURRENT_PAGE=1

      - <<: *linux
        env: CONAN_GCC_VERSIONS=6 CONAN_DOCKER_IMAGE=conanio/gcc6 CONAN_CURRENT_PAGE=2

      - <<: *linux
        env: CONAN_CLANG_VERSIONS=3.9 CONAN_DOCKER_IMAGE=conanio/clang39 CONAN_CURRENT_PAGE=1

      - <<: *linux
        env: CONAN_CLANG_VERSIONS=3.9 CONAN_DOCKER_IMAGE=conanio/clang39 CONAN_CURRENT_PAGE=2

      - <<: *linux
        env: CONAN_CLANG_VERSIONS=4.0 CONAN_DOCKER_IMAGE=conanio/clang40 CONAN_CURRENT_PAGE=1

      - <<: *linux
        env: CONAN_CLANG_VERSIONS=4.0 CONAN_DOCKER_IMAGE=conanio/clang40 CONAN_CURRENT_PAGE=2

      - <<: *osx
        osx_image: xcode7.3
        env: CONAN_APPLE_CLANG_VERSIONS=7.3 CONAN_CURRENT_PAGE=1

      - <<: *osx
        osx_image: xcode7.3
        env: CONAN_APPLE_CLANG_VERSIONS=7.3 CONAN_CURRENT_PAGE=2

      - <<: *osx
        osx_image: xcode8.2
        env: CONAN_APPLE_CLANG_VERSIONS=8.0 CONAN_CURRENT_PAGE=1

      - <<: *osx
        osx_image: xcode8.2
        env: CONAN_APPLE_CLANG_VERSIONS=8.0 CONAN_CURRENT_PAGE=2

      - <<: *osx
        osx_image: xcode8.3
        env: CONAN_APPLE_CLANG_VERSIONS=8.1 CONAN_CURRENT_PAGE=1

      - <<: *osx
        osx_image: xcode8.3
        env: CONAN_APPLE_CLANG_VERSIONS=8.1 CONAN_CURRENT_PAGE=2

install:
  - chmod +x .travis/install.sh
  - ./.travis/install.sh

script:
  - chmod +x .travis/run.sh
  - ./.travis/run.sh

.travis/install.sh

#!/bin/bash

set -e
set -x

if [[ "$(uname -s)" == 'Darwin' ]]; then
    brew update || brew update
    brew outdated pyenv || brew upgrade pyenv
    brew install pyenv-virtualenv
    brew install cmake || true

    if which pyenv > /dev/null; then
        eval "$(pyenv init -)"
    fi

    pyenv install 3.7.11
    pyenv virtualenv 3.7.11 conan
    pyenv rehash
    pyenv activate conan
fi

pip install conan --upgrade
pip install conan_package_tools
conan user

If you want to "pin" a conan_package_tools version use:

pip install conan_package_tools==0.37.0

That version will also be used in the Docker images.

.travis/run.sh

#!/bin/bash

set -e
set -x

if [[ "$(uname -s)" == 'Darwin' ]]; then
    if which pyenv > /dev/null; then
        eval "$(pyenv init -)"
    fi
    pyenv activate conan
fi

python build.py

Remember to set the CONAN_PASSWORD variable in the travis build control panel!

Appveyor integration

This is very similar to Travis CI. With the same build.py script we have the following appveyor.yml file:

build: false

environment:
    PYTHON: "C:\\Python37"
    PYTHON_VERSION: "3.7.9"
    PYTHON_ARCH: "32"

    CONAN_USERNAME: "lasote"
    CONAN_LOGIN_USERNAME: "lasote"
    CONAN_CHANNEL: "stable"
    VS150COMNTOOLS: "C:\\Program Files (x86)\\Microsoft Visual Studio\\2017\\Community\\Common7\\Tools\\"
    CONAN_UPLOAD: "https://api.bintray.com/conan/luisconanorg/fakeconancenter"
    CONAN_REMOTES: "https://api.bintray.com/conan/luisconanorg/conan-testing"

    matrix:
        - APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015
          CONAN_VISUAL_VERSIONS: 12
        - APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015
          CONAN_VISUAL_VERSIONS: 14
        - APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2017
          CONAN_VISUAL_VERSIONS: 15


install:
  - set PATH=%PATH%;%PYTHON%/Scripts/
  - pip.exe install conan_package_tools --upgrade
  - conan user # It creates the conan data directory

test_script:
  - python build.py
Remember to set the CONAN_PASSWORD variable in the Appveyor build control panel!

Bamboo CI integration

Bamboo is a commercial CI tool developed by Atlassian. When building from bamboo, several environment variables get set during builds.

If the env var bamboo_buildNumber is set and the branch name (bamboo_planRepository_branch env var) matches stable_branch_pattern, then the channel name gets set to stable.

Jenkins CI integration

Jenkins is an open source CI tool that was originally forked from hudson. When building on jenkins, several environment variables get set during builds.

If the env var JENKINS_URL is set and the branch name (BRANCH_NAME env var) matches stable_branch_pattern, then the channel name gets set to stable.

Currently, only the pipeline builds set the BRANCH_NAME env var automatically.

GitLab CI integration

GitLab CI is a commercial CI tool developed by GitLab. When building on gitlab-ci, several environment variables get set during builds.

If the env var GITLAB_CI is set and the branch name (CI_BUILD_REF_NAME env var) matches stable_branch_pattern, then the channel name gets set to stable.

Upload packages

You can upload the generated packages automatically to a Conan server using the following environment variables (the equivalent constructor parameters are also available; a combined build.py example follows this list):

  • Remote url:

      CONAN_UPLOAD: "https://api.bintray.com/conan/conan-community/conan"
    
  • User to login in the remote:

      CONAN_LOGIN_USERNAME: "lasote"
    
  • User (to generate the packages in that user namespace, e.g. zlib/1.2.11@conan/stable):

      CONAN_USERNAME: "conan"
    
  • Channel (to generate the packages in that channel namespace, e.g. zlib/1.2.11@conan/testing):

      CONAN_CHANNEL: "testing"
    
  • If the detected branch in the CI matches the pattern, declare the CONAN_CHANNEL as stable:

      CONAN_STABLE_BRANCH_PATTERN: "release/*"
    
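The same configuration can also be expressed with constructor parameters instead of environment variables; a sketch using the placeholder values from the list above:

from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    # Equivalent to the environment variables listed above, passed as
    # constructor parameters. The values are the same placeholders.
    builder = ConanMultiPackager(username="conan",
                                 login_username="lasote",
                                 channel="testing",
                                 upload="https://api.bintray.com/conan/conan-community/conan",
                                 stable_branch_pattern="release/*",
                                 upload_only_when_stable=True)
    builder.add_common_builds()
    builder.run()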

Upload dependencies (#237)

Sometimes your dependencies are not available in remotes and you need to pass --build=missing to build them. The problem is that you would need to fix them one by one, updating the CI script, instead of just uploading all the built packages.

Now you can upload ALL of your dependencies together, in addition to your package, to the same remote. To do this, you need to define:

CONAN_UPLOAD_DEPENDENCIES="all"

Or, set it in ConanMultiPackager arguments:

ConanMultiPackager(upload_dependencies="all")

However, maybe you want to upload ONLY specific packages, selected by name:

CONAN_UPLOAD_DEPENDENCIES="foo/0.1@user/channel,bar/1.2@bar/channel"

Or,

ConanMultiPackager(upload_dependencies=["foo/0.1@user/channel", "bar/1.2@bar/channel"])

Pagination

Sometimes, if your library is big or complex enough in terms of compilation time, the CI server could reach its maximum execution time because it is building, for example, 20 different binary packages for your library on the same machine.

You can split the different build configurations into "pages", and configure your CI to run more "worker" machines, one per page.

There are two approaches:

Sequential distribution

Simply pass the two pagination parameters, curpage and total_pages, or the corresponding environment variables (a constructor-based sketch follows the list below):

$ export CONAN_TOTAL_PAGES=3
$ export CONAN_CURRENT_PAGE=1

$ python build.py

If you added 10 different build configurations to the builder:

  • With CONAN_CURRENT_PAGE=1 it runs only 1,4,7,10
  • With CONAN_CURRENT_PAGE=2 it runs only 2,5,8
  • With CONAN_CURRENT_PAGE=3 it runs only 3,6,9
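
The equivalent using constructor parameters, as a minimal sketch:

from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    # Equivalent to CONAN_TOTAL_PAGES=3 and CONAN_CURRENT_PAGE=1:
    # this worker only builds every third configuration (1, 4, 7, ...).
    builder = ConanMultiPackager(curpage=1, total_pages=3)
    builder.add_common_builds()
    builder.run()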

In your CI server you can configure a matrix with different "virtual machines", "jobs" or "workers": in each machine you specify a different CONAN_CURRENT_PAGE environment variable, so your configurations will be distributed across the different machines.

Named pages

By adding builds to the named_builds dictionary, and passing curpage with the page name:

from cpt.packager import ConanMultiPackager
from collections import defaultdict

if __name__ == '__main__':
    builder = ConanMultiPackager(curpage="x86", total_pages=2)
    named_builds = defaultdict(list)
    builder.add_common_builds(shared_option_name="bzip2:shared", pure_c=True)
    for settings, options, env_vars, build_requires, reference in builder.items:
        named_builds[settings['arch']].append([settings, options, env_vars, build_requires, reference])
    builder.named_builds = named_builds
    builder.run()

named_builds now has a dictionary entry for x86 and another for x86_64:

  • for CONAN_CURRENT_PAGE="x86" it would do all x86 builds
  • for CONAN_CURRENT_PAGE="x86_64" it would do all x86_64 builds

Generating multiple references for the same recipe

You can add a different reference for each build; for example, if your recipe has no "version" field, you could generate several versions in the same build script. Conan Package Tools will export the recipe using the different references automatically:

from cpt.packager import ConanMultiPackager

if __name__ == '__main__':
    builder = ConanMultiPackager()
    builder.add_common_builds(reference="mylib/1.0@conan/stable")
    builder.add_common_builds(reference="mylib/2.0@conan/stable")
    builder.run()

Working with Bintray: Configuring repositories

Use the upload argument or the CONAN_UPLOAD environment variable to set the URL of the repository where you want to upload your packages. It will also be used to read from it.

Use the CONAN_PASSWORD environment variable to set the API key from Bintray. If your Bintray username doesn't match the specified CONAN_USERNAME, specify the CONAN_LOGIN_USERNAME variable or the login_username parameter of ConanMultiPackager.

If you are using Travis or Appveyor you can use a hidden (secure) environment variable configured in the repository settings.

To get an API key in Bintray, go to "Edit profile" / "API key".

Use the argument remotes or environment variable CONAN_REMOTES to configure additional repositories containing needed requirements.

Example: add your personal Bintray repository to retrieve and upload your packages, and also some other repositories to read requirements from.

In your .travis.yml or appveyor.yml files declare the environment variables:

CONAN_UPLOAD="https://api.bintray.com/mybintrayuser/conan_repository"
CONAN_REMOTES="https://api.bintray.com/other_bintray_user/conan-repo, https://api.bintray.com/other_bintray_user2/conan-repo"

Or in your build.py:

from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    builder = ConanMultiPackager(username="myuser",
                                 upload="https://api.bintray.com/mybintrayuser/conan_repository",
                                 remotes="https://api.bintray.com/other_bintray_user/conan-repo, https://api.bintray.com/other_bintray_user2/conan-repo")
    builder.add_common_builds()
    builder.run()

Visual Studio auto-configuration

When the builder detects a Visual Studio compiler and its version, it will automatically configure the execution environment for the "conan test" command with the vcvarsall.bat script (provided by all Microsoft Visual Studio versions). So you can compile your project with the right compiler automatically, even without CMake.

MinGW builds

MinGW compiler builds are also supported. You can use this feature with Appveyor.

You can choose different MinGW compiler configurations:

  • Version: 4.8 and 4.9 are supported
  • Architecture: x86 and x86_64 are supported
  • Exceptions: seh and sjlj are supported
  • Threads: posix and win32 are supported

Using MINGW_CONFIGURATIONS env variable in Appveyor:

MINGW_CONFIGURATIONS: '4.9@x86_64@seh@posix, 4.9@x86_64@seh@win32'

Check an example here

Clang builds

Clang compiler builds are also supported. You can use this feature with TravisCI.

You can choose different Clang compiler configurations:

  • Version: 3.8, 3.9 and 4.0 are supported
  • Architecture: x86 and x86_64 are supported

Using the CONAN_CLANG_VERSIONS env variable in Travis CI or Appveyor:

CONAN_CLANG_VERSIONS = "3.8,3.9,4.0"

FULL REFERENCE

ConanMultiPackager parameters reference

  • username: The username (part of the package reference, not the login_username)

  • login_username: The login username. Could take two possible values:

    • String with the login:

      login_username = "my_user"
      
    • Dict containing the remote name and the login for that remote. Use together with the "remotes" variable to specify remote names, e.g.:

      login_username = {"remote1": "my_user", "my_artifactory": "other_user"}
      
  • password: Password to authenticate with the remotes. Can take two possible values:

    • String with the password:

      password = "my_pass"
      
    • Dict containing the remote name and the password for that remote. Use together with the "remotes" variable to specify remote names, e.g.:

      password = {"remote1": "my_pass", "my_artifactory": "my_pass2"}
      
  • remotes: Could take two values:

    • String of URLs separated by ",":

      remotes = "https://api.bintray.com/conan/conan-community/conan,https://api.bintray.com/conan/other/conan2"
      
    • List of tuples containing the "url", "use_ssl" flag and "name", e.g.:

      remotes = [("https://api.bintray.com/conan/conan-community/conan", True, "remote1"),
                 ("https://api.bintray.com/conan/other/conan2", False, "remote2")]
      
  • options: Options used on package build:

    • List of options:
      options = ["foobar:with_qux=True", "foobar:with_bar=False"]
      
      
  • gcc_versions: List with a subset of gcc_versions. Default ["4.9", "5", "6", "7"]

  • clang_versions: List with a subset of clang_versions. Default ["3.8", "3.9", "4.0"]

  • apple_clang_versions: List with a subset of apple-clang versions. Default ["6.1", "7.3", "8.0"]

  • visual_versions: List with a subset of Visual Studio versions. Default [10, 12, 14]

  • visual_runtimes: List containing Visual Studio runtimes to use in builds. Default ["MT", "MD", "MTd", "MDd"]

  • mingw_configurations: Configurations for MinGW

  • archs: List containing specific architectures to build for. Default ["x86", "x86_64"]

  • use_docker: Use docker for package creation in Linux systems.

  • docker_run_options: Pass additional parameters for docker when running the create step.

  • docker_conan_home: Location where package source files will be copied to inside the Docker container

  • docker_image_skip_update: If defined, it will skip the initialization update of "conan package tools" and "conan" in the docker image. By default it is False.

  • docker_image_skip_pull: If defined, it will skip the "docker pull" command, enabling a local image to be used without being overwritten.

  • always_update_conan_in_docker: If True, "conan package tools" and "conan" will be installed and upgraded in the docker image on every build execution, and the container won't be committed with the modifications.

  • docker_entry_script: Command to be executed before building when running Docker.

  • pip_install: Package list to be installed by pip before building, e.g. ["foo", "bar"]

  • docker_32_images: If defined, and the current build is arch="x86", the docker image name will be appended with "-i386", e.g. "conanio/gcc63-i386"

  • docker_shell: Shell command to be executed by Docker, e.g. "/bin/bash -c" (Linux), "cmd /C" (Windows)

  • curpage: Current page of packages to create

  • total_pages: Total number of pages

  • vs10_x86_64_enabled: Flag indicating whether or not to build for VS10 64bits. Default [False]

  • upload_retry: Number of retries for the upload in case of failure.

  • upload_only_when_stable: Will try to upload only if the channel is the stable channel. Default [False]

  • upload_only_when_tag: Will try to upload only if the branch is a tag. Default [False]

  • upload_only_recipe: If defined, will try to upload only the recipes. The built packages will not be uploaded. Default [False]

  • upload_dependencies: Will try to upload dependencies to your remote. Default [False]

  • upload_force: Will force uploading all packages. Default [True]

  • build_types: List containing specific build types. Default ["Release", "Debug"]

  • cppstds: List containing specific cpp standards. Default None

  • skip_check_credentials: Conan will skip checking the user credentials before building the packages. If no user/remote is specified, it will try to upload with the credentials already stored in the local cache. Default [False]

  • allow_gcc_minors: Declare this parameter if you want to allow gcc >=5 versions with the minor (5.1, 6.3, etc.).

  • exclude_vcvars_precommand: For Visual Studio builds, it excludes the vcvars call that sets up the environment.

  • build_policy: Can be None, single value or a list. Default None.

    • None: Only Build current package. Equivalent to --build current_package_ref
    • "never": No build from sources, only download packages. Equivalent to --build never
    • "missing": Build only missing packages. Equivalent to --build missing
    • "outdated": Build only missing or if the available package is not built with the current recipe. Useful to upload new configurations, e.j packages for a new compiler without rebuild all packages. Equivalent to --build outdated
    • "all": Build all requirements. Equivalent to --build
    • "cascade": Build from code all the nodes with some dependency being built (for any reason). Equivalent to --build cascade
    • "some_package" : Equivalent to --build some_package
    • "pattern*": will build only the packages with the reference starting with pattern*. Equivalent to --build pattern*
    • ["pattern*", "another_pattern*"]: will build only the packages with the reference matching these patterns. Equivalent to --build pattern* --build another_pattern*
    • ["pattern*", "outdated"]: Equivalent to --build pattern* --build outdated

    Check Conan Build policies for more details.

  • require_overrides: List containing requirement overrides for the consumer conanfile, e.g. ["foo/0.1@user/channel", "bar/1.2@bar/channel"]

  • test_folder: Custom test folder consumed by conan create, e.g. .conan/test_package

  • lockfile: Custom conan lockfile to be used, e.g. conan.lock. Default [None]

  • conanfile: Custom conanfile consumed by conan create, e.g. conanfile.py

  • config_url: Conan config URL to be installed before building, e.g. https://github.com/bincrafters/conan-config.git

  • config_args: Conan config arguments used when installing conan config

  • force_selinux: Force docker to relabel file objects on the shared volumes

  • skip_recipe_export: If True, the package recipe will only be exported on the first build. Default [False]

  • update_dependencies: Update all dependencies before building, e.g. conan create -u

  • global_conf: A list with values to be added to global.conf file

Upload related parameters:

  • upload: Could take two values:

    • String with a URL.

      upload = "https://api.bintray.com/conan/conan-community/conan"
      
    • Tuple containing the "url", "use_ssl" flag and "name".

      upload = ("https://api.bintray.com/conan/conan-community/conan", True, "remote1")
      
  • reference: Reference of the package to upload. Ex: "zlib/1.2.8". If not specified it will be read from the conanfile.py.

  • remote: Alternative remote name. Default "default"

  • stable_branch_pattern: Regular expression, if current git branch matches this pattern, the packages will be uploaded to stable channel. By default it will check the following patterns: ["master$", "release*", "stable*"]

  • stable_channel: Stable channel, default "stable".

  • channel: Channel where your packages will be uploaded if the previous parameter doesn't match

Commit messages reference

The current commit message can contain special messages:

  • [skip ci]: Will skip the building of any package (unless CONAN_IGNORE_SKIP_CI is set)
  • [build=XXX]: Where XXX is a build policy (see the build_policy parameter reference)
  • [build=XXX] [build=YYY]: Where XXX and YYY are the two build policies to use (see the build_policy parameter reference)

Complete ConanMultiPackager methods reference:

  • add_common_builds(shared_option_name=None, pure_c=True, dll_with_static_runtime=False, reference=None, header_only=True, build_all_options_values=None): Generate a set of package configurations and add them to the list of packages that will be created.

    • shared_option_name: If given, ConanMultiPackager will add different configurations for -o shared=True and -o shared=False.
    • pure_c: ConanMultiPackager won't generate different builds for the libstdc++ c++ standard library, because it is a pure C library.
    • dll_with_static_runtime: generate also build for "MT" runtime when the library is shared.
    • reference: Custom package reference
    • header_only: Generate new builds following header-only options #454
    • build_all_options_values: Include all values for the listed options #457
  • login(remote_name): Performs a conan user command in the specified remote.

  • add(settings=None, options=None, env_vars=None, build_requires=None): Add a new build configuration, so a new binary package will be built for the specified configuration.

  • run(): Run the builds (Will invoke conan create for every specified configuration)

Environment configuration

You can also use environment variables to change the behavior of ConanMultiPackager, so that you don't have to pass parameters to the ConanMultiPackager constructor.

This is especially useful for CI integration.

  • CONAN_USERNAME: Your conan username (for the package reference)

  • CONAN_REFERENCE: Reference of the package to upload, e.g. "zlib/1.2.8". Otherwise it will be read from the conanfile.py

  • CONAN_LOGIN_USERNAME: Unique login username for all remotes. Will use "CONAN_USERNAME" when not present.

  • CONAN_LOGIN_USERNAME_XXX: Specify a login for a remote name:

    • CONAN_LOGIN_USERNAME_MYREPO=my_username
  • CONAN_PASSWORD: Conan Password, or API key if you are using Bintray.

  • CONAN_PASSWORD_XXX: Specify a password for a remote name:

    • CONAN_PASSWORD_MYREPO=mypassword
  • CONAN_REMOTES: List of URLs separated by "," for the additional remotes (read). You can specify the SSL verify flag and the remote name using the "@" separator, e.g.:

    • CONAN_REMOTES=url1@True@remote_name, url2@False@remote_name2

    The remote name is useful in case you want to specify custom credentials for different remotes. See CONAN_LOGIN_USERNAME_XXX and CONAN_PASSWORD_XXX

  • CONAN_UPLOAD: URL of the repository we want to use to upload the packages. The value can contain the URL, the SSL validation flag and the remote name (the last two are optional) separated by "@", e.g.:

    • CONAN_UPLOAD=https://yourcompany.jfrog.io/artifactory/api/conan/local
    • CONAN_UPLOAD=https://yourcompany.jfrog.io/artifactory/api/conan/local@True
    • CONAN_UPLOAD=https://yourcompany.jfrog.io/artifactory/api/conan/local@True@other_repo_name

    If a remote name is not specified, upload_repo will be used as a remote name. If the SSL validation configuration is not specified, it will use True by default.

  • CONAN_UPLOAD_RETRY: If defined, in case of failure the upload will be retried the specified number of times

  • CONAN_UPLOAD_ONLY_WHEN_STABLE: If defined, will try to upload the packages only when the current channel is the stable one.

  • CONAN_UPLOAD_ONLY_WHEN_TAG: If defined, will try to upload the packages only when the current branch is a tag.

  • CONAN_UPLOAD_ONLY_RECIPE: If defined, will try to upload only the recipes. The built packages will not be uploaded.

  • CONAN_UPLOAD_DEPENDENCIES: If defined, will try to upload the listed package dependencies to your remote.

  • CONAN_UPLOAD_FORCE: If defined, will try to force upload all packages. Default is True.

  • CONAN_SKIP_CHECK_CREDENTIALS: Conan will skip checking the user credentials before building the packages. If no user/remote is specified, it will try to upload with the credentials already stored in the local cache. Default [False]

  • CONAN_DOCKER_ENTRY_SCRIPT: Command to be executed before building when running Docker.

  • CONAN_PIP_INSTALL: Package list to be installed by pip before building, comma separated, e.g. "pkg-foo==0.1.0,pkg-bar"

  • CONAN_GCC_VERSIONS: Gcc versions, comma separated, e.g. "4.6,4.8,5,6"

  • CONAN_CLANG_VERSIONS: Clang versions, comma separated, e.g. "3.8,3.9,4.0"

  • CONAN_APPLE_CLANG_VERSIONS: Apple clang versions, comma separated, e.g. "6.1,8.0"

  • CONAN_ARCHS: Architectures to build for, comma separated, e.g. "x86,x86_64"

  • CONAN_OPTIONS: Conan build options, comma separated, e.g. "foobar:with_bar=True,foobar:with_qux=False"

  • CONAN_SHARED_OPTION_NAME: Set shared_option_name by environment variable, e.g. "mypackagename:shared"

  • CONAN_BUILD_ALL_OPTIONS_VALUES: Set build_all_options_values by environment variable, e.g. "mypackagename:foo,mypackagename:bar"

  • CONAN_BUILD_TYPES: Build types to build for, comma separated, e.g. "Release,Debug"

  • CONAN_CPPSTDS: List containing values for compiler.cppstd. Default None

  • CONAN_VISUAL_VERSIONS: Visual versions, comma separated, e.g. "12,14"

  • CONAN_VISUAL_RUNTIMES: Visual runtimes, comma separated, e.g. "MT,MD"

  • CONAN_VISUAL_TOOLSETS: Map Visual versions to toolsets, e.g. 14=v140;v140_xp,12=v120_xp

  • CONAN_MSVC_VERSIONS: msvc versions, comma separated, e.g. "19.29,193"

  • CONAN_MSVC_RUNTIMES: msvc runtimes, comma separated, e.g. "static,dynamic"

  • CONAN_MSVC_RUNTIME_TYPES: msvc runtime types, comma separated, e.g. "Debug,Release"

  • CONAN_USE_DOCKER: If defined will use docker

  • CONAN_CURRENT_PAGE: Current page of packages to create

  • CONAN_TOTAL_PAGES: Total number of pages

  • CONAN_DOCKER_IMAGE: If defined and docker is being used, it will use this docker image instead of the default images, e.g. "conanio/gcc63"

  • CONAN_DOCKER_HOME: Location where package source files will be copied to inside the Docker container

  • CONAN_DOCKER_RUN_OPTIONS: Pass additional parameters for docker when running the create step

  • CONAN_DOCKER_IMAGE_SKIP_UPDATE: If defined, it will skip the initialization update of "conan package tools" and "conan" in the docker image. By default it is False.

  • CONAN_DOCKER_IMAGE_SKIP_PULL: If defined, it will skip the "docker pull" command, enabling a local image to be used without being overwritten.

  • CONAN_ALWAYS_UPDATE_CONAN_DOCKER: If defined, "conan package tools" and "conan" will be installed and upgraded in the docker image on every build execution, and the container won't be committed with the modifications.

  • CONAN_DOCKER_32_IMAGES: If defined, and the current build is arch="x86", the docker image name will be appended with "-i386", e.g. "conanio/gcc63-i386"

  • CONAN_DOCKER_SHELL: Shell command to be executed by Docker, e.g. "/bin/bash -c" (Linux), "cmd /C" (Windows)

  • CONAN_STABLE_BRANCH_PATTERN: Regular expression; if the current git branch matches this pattern, the packages will be uploaded to the CONAN_STABLE_CHANNEL channel. Default "master". E.g. "release/*"

  • CONAN_STABLE_CHANNEL: Stable channel name, default "stable"

  • CONAN_CHANNEL: Channel where your packages will be uploaded if the previous parameter doesn't match

  • CONAN_PIP_PACKAGE: Specify a conan package to install (by default, installs the latest), e.g. conan==0.0.1rc7

  • MINGW_CONFIGURATIONS: Specify a list of MinGW builds. See MinGW builds section.

  • CONAN_BASH_PATH: Path to a bash executable. Used only in Windows to help the tools.run_in_windows_bash() function to locate our Cygwin/MSYS2 bash. Set it with the bash executable path if it's not in the PATH or you want to use a different one.

  • CONAN_PIP_USE_SUDO: Use "sudo" when invoking pip; by default it will use sudo when not on Windows and not running a "conanio/" docker image. Set "False" to deactivate.

  • CONAN_PIP_COMMAND: Run a custom pip command when updating Conan, e.g. "/usr/bin/pip2"

  • CONAN_DOCKER_PIP_COMMAND: Run a custom pip command when updating Conan and CPT in the Docker container, e.g. "/usr/bin/pip2"

  • CONAN_DOCKER_USE_SUDO: Use "sudo" when invoking docker; by default it will use sudo when not on Windows. Set "False" to deactivate.

  • CONAN_ALLOW_GCC_MINORS: Declare this variable if you want to allow gcc >=5 versions with the minor (5.1, 6.3, etc.).

  • CONAN_EXCLUDE_VCVARS_PRECOMMAND: For Visual Studio builds, it excludes the vcvars call that sets up the environment.

  • CONAN_BUILD_REQUIRES: You can specify additional build requires for the generated profile with an environment variable following the same profile syntax and separated by ",", i.e. CONAN_BUILD_REQUIRES: mingw-installer/7.1@conan/stable, pattern: other/1.0@conan/stable

  • CONAN_BUILD_POLICY: Comma separated list of build policies. Default None.

    • None: Only Build current package. Equivalent to --build current_package_ref
    • "never": No build from sources, only download packages. Equivalent to --build never
    • "missing": Build only missing packages. Equivalent to --build missing
    • "outdated": Build only missing or if the available package is not built with the current recipe. Useful to upload new configurations, e.j packages for a new compiler without rebuild all packages. Equivalent to --build outdated
    • "all": Build all requirements. Equivalent to --build
    • "cascade": Build from code all the nodes with some dependency being built (for any reason). Equivalent to --build cascade
    • "some_package" : Equivalent to --build some_package
    • "pattern*": will build only the packages with the reference starting with pattern*. Equivalent to --build pattern*
    • "pattern*,another_pattern*": will build only the packages with the reference matching these patterns. Equivalent to --build pattern* --build another_pattern*
    • "pattern*,outdated": Equivalent to --build pattern* --build outdated Check Conan Build policies for more details.
  • CONAN_REQUIRE_OVERRIDES: Comma separated list of requirement overrides.

  • CONAN_CONFIG_URL: Conan config URL to be installed before building, e.g. https://github.com/bincrafters/conan-config.git

  • CONAN_CONFIG_ARGS: Conan config arguments used when installing conan config

  • CONAN_BASE_PROFILE: Apply options, settings, etc. to this profile instead of default.

  • CONAN_BASE_PROFILE_BUILD: Apply the specified profile to the build machine. Default is None

  • CONAN_IGNORE_SKIP_CI: Ignore [skip ci] in commit message.

  • CONAN_CONANFILE: Custom conanfile consumed by conan create, e.g. conanfile.py

  • CONAN_LOCKFILE: Custom conan lockfile to be used, e.g. conan.lock.

  • CPT_TEST_FOLDER: Custom test_package path, e.g. .conan/test_package

  • CONAN_FORCE_SELINUX: Force docker to relabel file objects on the shared volumes

  • CONAN_SKIP_RECIPE_EXPORT: If defined, the package recipe will only be exported on the first build.

  • CPT_UPDATE_DEPENDENCIES: Update all dependencies before building, e.g. conan create -u

  • CONAN_PURE_C: Set pure_c by environment variable, default True

  • CONAN_GLOBAL_CONF: Add a global.conf file with the listed values, e.g. '*:tools.cmake.cmaketoolchain:generator=Ninja,tools.system.package_manager:mode=install'
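
Most of these variables are read when the ConanMultiPackager instance is created, so they can also be set from build.py itself before the builder is constructed. A minimal sketch, using purely illustrative values for two of the variables listed above:

import os
from cpt.packager import ConanMultiPackager

if __name__ == "__main__":
    # Illustrative values only: build missing binaries and promote packages
    # to the stable channel when the branch looks like release/*
    os.environ.setdefault("CONAN_BUILD_POLICY", "missing")
    os.environ.setdefault("CONAN_STABLE_BRANCH_PATTERN", "release/*")

    builder = ConanMultiPackager(username="myusername")
    builder.add_common_builds()
    builder.run()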

Full example

You can see the full zlib example here

conan-package-tools's People

Contributors

anton-matosov, artalus, bilke, boussaffawalid, chaosteil, croydon, czoido, danimtb, dmitry-zakablukov, dvirtz, ericlemanissier, fedterzi, intelligide, jimaarons, kasunch, lasote, marache, mathieu, memsharded, nunojpg, raulbocanegra, samsonbox, sbannier, sixten-hilborn, solvingj, sourcedelica, theirix, uilianries, xaltotun, yasn77


conan-package-tools's Issues

Appveyor freezes and times out?

This is more a question about the use of the tools with appveyor rather than an issue with the tools themselves. I have this build:

https://ci.appveyor.com/project/coding3d/conan-ogg

It freezes after setting up conaninfo.txt and then times out. Is this a known issue with appveyor or may there be something that I am doing wrong? It has happened to me before and then it just stopped happening without me doing anything special, at least as far as I understood.

python3 support

The zlib example fails for python3 with

CONAN_USERNAME=ph03 python3 build.py                                                                                                      Mi 05/18/16 21:14
zlib/1.2.8@ph03/testing export: Copied 1 '.cmake' files: FindZLIB.cmake
zlib/1.2.8@ph03/testing export: Copied 1 '.txt' files: CMakeLists.txt
zlib/1.2.8@ph03/testing: The stored package has not changed
zlib/1.2.8@PROJECT
    URL: http://github.com/lasote/conan-zlib
    License: http://www.zlib.net/zlib_license.html
Traceback (most recent call last):
  File "build.py", line 7, in <module>
    builder.run()
  File "/home/janickm/.local/lib/python3.4/site-packages/conan/packager.py", line 244, in run
    self._pack()
  File "/home/janickm/.local/lib/python3.4/site-packages/conan/packager.py", line 267, in _pack
    self._execute_build(build)
  File "/home/janickm/.local/lib/python3.4/site-packages/conan/packager.py", line 376, in _execute_build
    self._execute_test(None, settings, options)
  File "/home/janickm/.local/lib/python3.4/site-packages/conan/packager.py", line 81, in _execute_test
    conan_compiler, conan_compiler_version = self.conan_compiler_info()
  File "/home/janickm/.local/lib/python3.4/site-packages/conan/packager.py", line 388, in conan_compiler_info
    from ConfigParser import ConfigParser
ImportError: No module named 'ConfigParser'

whereas with python2 it works great

env CONAN_USERNAME=ph03 python build.py                                                                                                   Mi 05/18/16 21:14
zlib/1.2.8@ph03/testing export: Copied 1 '.cmake' files: FindZLIB.cmake
zlib/1.2.8@ph03/testing export: Copied 1 '.txt' files: CMakeLists.txt
zlib/1.2.8@ph03/testing: The stored package has not changed
zlib/1.2.8@PROJECT
    URL: http://github.com/lasote/conan-zlib
    License: http://www.zlib.net/zlib_license.html

############## CONAN PACKAGE TOOLS ######################

DEBUG: - Skipped build, compiler mismatch: {'compiler.version': '4.6', 'arch': 'x86', 'build_type': 'Debug', 'compiler': 'gcc'}

#########################################################


############## CONAN PACKAGE TOOLS ######################

DEBUG: - Skipped build, compiler mismatch: {'compiler.version': '4.6', 'arch': 'x86', 'build_type': 'Release', 'compiler': 'gcc'}

#########################################################

Is it a problem on my side, or should I be able to run the tool with python3 as well?

Appveyor integration broken since yesterday

https://ci.appveyor.com/project/nunojpg/conan-botan/build/job/3ckemv2apg4if2rn

    Running setup.py install for conan-package-tools: finished with status 'done'
Successfully installed PyJWT-1.4.2 PyYAML-3.12 asn1crypto-0.22.0 astroid-1.4.9 backports.functools-lru-cache-1.3 bottle-0.12.13 cffi-1.10.0 colorama-0.3.9 conan-0.22.3 conan-package-tools-0.3.2 configparser-3.5.0 cryptography-1.8.1 distro-1.0.4 enum34-1.1.6 fasteners-0.14.1 future-0.16.0 idna-2.5 ipaddress-1.0.18 isort-4.2.5 lazy-object-proxy-1.3.1 mccabe-0.6.1 monotonic-1.3 node-semver-0.1.1 packaging-16.8 patch-1.16 pluginbase-0.5 pyOpenSSL-17.0.0 pycparser-2.17 pylint-1.6.5 pyparsing-2.2.0 requests-2.14.1 six-1.10.0 wrapt-1.10.10
conan user
Traceback (most recent call last):
  File "C:\Python27\Scripts\conan-script.py", line 6, in <module>
    from pkg_resources import load_entry_point
  File "c:\python27\lib\site-packages\pkg_resources\__init__.py", line 3017, in <module>
    @_call_aside
  File "c:\python27\lib\site-packages\pkg_resources\__init__.py", line 3003, in _call_aside
    f(*args, **kwargs)
  File "c:\python27\lib\site-packages\pkg_resources\__init__.py", line 3030, in _initialize_master_working_set
    working_set = WorkingSet._build_master()
  File "c:\python27\lib\site-packages\pkg_resources\__init__.py", line 661, in _build_master
    return cls._build_from_requirements(__requires__)
  File "c:\python27\lib\site-packages\pkg_resources\__init__.py", line 674, in _build_from_requirements
    dists = ws.resolve(reqs, Environment())
  File "c:\python27\lib\site-packages\pkg_resources\__init__.py", line 853, in resolve
    raise DistributionNotFound(req, requirers)
pkg_resources.DistributionNotFound: The 'requests<2.14.0,>=2.7.0' distribution was not found and is required by conan
Command exited with code 1

builder.builds = filtered_builds not filtering?

I'm using a script similar to the example to filter out x86 builds, but the x86 builds were still present, so I added some debug prints:

    builder = ConanMultiPackager(username="nunojpg")
    builder.add_common_builds(shared_option_name="restbed:shared", pure_c=False)
    filtered_builds = []
    for settings, options, env_vars, build_requires in builder.builds:
        if settings["arch"] != "x86":
            filtered_builds.append([settings, dict(options.items() + [('restbed:ssl', False)]), env_vars, build_requires])
            filtered_builds.append([settings, dict(options.items() + [('restbed:ssl', True)]), env_vars, build_requires])
    print "filtered_builds="
    print filtered_builds
    builder.builds = filtered_builds
    print "builder.builds="
    print builder.builds
    builder.run()

The output is like this:

filtered_builds=
[[{'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Debug', 'compiler': 'gcc'}, {'restbed:shared': True, 'restbed:ssl': False}, {}, {}], [{'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Debug', 'compiler': 'gcc'}, {'restbed:shared': True, 'restbed:ssl': True}, {}, {}], [{'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Release', 'compiler': 'gcc'}, {'restbed:shared': True, 'restbed:ssl': False}, {}, {}], [{'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Release', 'compiler': 'gcc'}, {'restbed:shared': True, 'restbed:ssl': True}, {}, {}], [{'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Debug', 'compiler': 'gcc'}, {'restbed:shared': False, 'restbed:ssl': False}, {}, {}], [{'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Debug', 'compiler': 'gcc'}, {'restbed:shared': False, 'restbed:ssl': True}, {}, {}], [{'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Release', 'compiler': 'gcc'}, {'restbed:shared': False, 'restbed:ssl': False}, {}, {}], [{'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Release', 'compiler': 'gcc'}, {'restbed:shared': False, 'restbed:ssl': True}, {}, {}]]
builder.builds=
[BuildConf(settings={'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86', 'build_type': 'Debug', 'compiler': 'gcc'}, options={'restbed:shared': True}, env_vars={}, build_requires={}), BuildConf(settings={'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86', 'build_type': 'Release', 'compiler': 'gcc'}, options={'restbed:shared': True}, env_vars={}, build_requires={}), BuildConf(settings={'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86', 'build_type': 'Debug', 'compiler': 'gcc'}, options={'restbed:shared': False}, env_vars={}, build_requires={}), BuildConf(settings={'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86', 'build_type': 'Release', 'compiler': 'gcc'}, options={'restbed:shared': False}, env_vars={}, build_requires={}), BuildConf(settings={'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Debug', 'compiler': 'gcc'}, options={'restbed:shared': True}, env_vars={}, build_requires={}), BuildConf(settings={'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Release', 'compiler': 'gcc'}, options={'restbed:shared': True}, env_vars={}, build_requires={}), BuildConf(settings={'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Debug', 'compiler': 'gcc'}, options={'restbed:shared': False}, env_vars={}, build_requires={}), BuildConf(settings={'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Release', 'compiler': 'gcc'}, options={'restbed:shared': False}, env_vars={}, build_requires={}), BuildConf(settings={'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Debug', 'compiler': 'gcc'}, options={'restbed:shared': True, 'restbed:ssl': False}, env_vars={}, build_requires={}), BuildConf(settings={'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Debug', 'compiler': 'gcc'}, options={'restbed:shared': True, 'restbed:ssl': True}, env_vars={}, build_requires={}), BuildConf(settings={'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Release', 'compiler': 'gcc'}, options={'restbed:shared': True, 'restbed:ssl': False}, env_vars={}, build_requires={}), BuildConf(settings={'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Release', 'compiler': 'gcc'}, options={'restbed:shared': True, 'restbed:ssl': True}, env_vars={}, build_requires={}), BuildConf(settings={'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Debug', 'compiler': 'gcc'}, options={'restbed:shared': False, 'restbed:ssl': False}, env_vars={}, build_requires={}), BuildConf(settings={'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Debug', 'compiler': 'gcc'}, options={'restbed:shared': False, 'restbed:ssl': True}, env_vars={}, build_requires={}), BuildConf(settings={'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Release', 'compiler': 'gcc'}, options={'restbed:shared': False, 'restbed:ssl': False}, env_vars={}, build_requires={}), BuildConf(settings={'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Release', 'compiler': 'gcc'}, options={'restbed:shared': False, 'restbed:ssl': True}, env_vars={}, build_requires={})]

So the filtered_builds array doesn't have any x86 as expected, but the builder.builds has.

Am I doing something wrong?

Actual execution at: https://travis-ci.org/nunojpg/conan-restbed/jobs/224929589

AppVeyor integration working?

I'm getting this failure: https://ci.appveyor.com/project/nunojpg/conan-restbed/build/1.0.3

WARN: Remotes registry file missing, creating default one in C:\Users\appveyor.conan\registry.txt
Current 'conan.io' user: None (anonymous)
C:\Python27-x64\python build.py
Traceback (most recent call last):
File "build.py", line 1, in
from conan.packager import ConanMultiPackager
ImportError: No module named conan.packager
Command exited with code 1

I've searched for any recent builds succeeding on AppVeyor, but don't know of any repositories with CI going.

tools.vcvars_command empty when running with package tools

My package build fails in appveyor because tools.vcvars_command() returns an empty string. I found the problem occurs when the build is started with the build.py script. On my machine, if I build the package normally (with test_package for example), everything is fine. If I call python build.py, the build fails with the same error as in appveyor:

&& was unexpected at this time.

A failing build is accessible here: https://ci.appveyor.com/project/osechet/conan-qt-base/build/job/mvkk4vc67rfsarow

Allow for custom package root path

Currently, ConanMultiPackager._docker_pack method has path to "project" directory hard-coded:

command = "sudo docker run --rm -v %s:/home/conan/project -v " \
          "~/.conan/data:/home/conan/.conan/data -it %s %s /bin/sh -c \"" \
          "cd project && sudo pip install conan_package_tools --upgrade %s && " \
          "conan_json_packager\"" % (curdir, env_vars, image_name, specific_conan_package)

I would like to be able to use custom path (e.g. subdirectory of "project"), maybe via another CONAN_... environment variable. This would allow for multiple packages to be stored within a single repository.

Channel & upload configuration in appveyor

Hello,

I have a couple of questions with regard to the following commit of my package:

https://github.com/coding3d/conan-ogg/tree/a7d075f530520c28544a17eb39405681d4e37e01

The commit has produced the following build on appveyor:

https://ci.appveyor.com/project/coding3d/conan-ogg/build/1.0.11

My questions may be due to my own misunderstanding but I am mentioning them here, in case they are actually issues that have to do with the package tools:

  • Should this build have led to an upload to conan.io, given the fact that I have set CONAN_UPLOAD to 0 in appveyor.yml? Does CONAN_UPLOAD control whether or not the packages will be uploaded, or does it do something else?
  • Since I have set CONAN_CHANNEL to "ci" in appveyor.yml, why is the channel of uploaded packages "stable"? In general, can we control what the channel of uploaded packages is? (If the "stable" channel is a conan.io best practice, it sounds good, but it's still good to know).

Thanks!

Why are DLLs never built with a static runtime?

In file conan/packager.py, method _add_visual_builds():

        if shared_option_name:
            if "MT" in self.visual_runtimes:
                sets.append([{"build_type": "Release", "compiler.runtime": "MT"},
                             {shared_option_name: False}])
            if "MTd" in self.visual_runtimes:
                sets.append([{"build_type": "Debug", "compiler.runtime": "MTd"},
                             {shared_option_name: False}])

Why are DLLs never built with a static runtime? It is very useful and I can't find any pros for the current behavior.

Unable to locate "vcvarsall.bat" during Visual Studio 2017 builds

When attempting to build with Visual Studio 2017, the following error is produced:

The system cannot find the path specified.
Error while executing:
call "%vs150comntools%../../VC/Auxiliary/Build/vcvarsall.bat" amd64 && conan test_package . -s arch="x86_64" -s build_type="Release" -s compiler="Visual Studio" -s compiler.runtime="MT" -s compiler.version="15" -o thelib:shared="False"

I noted that there was a commit to add support for it:
c4cace8

But Microsoft has removed the %vsXXXcomntools% variables starting with VS2017 and forward:
https://blogs.msdn.microsoft.com/vcblog/2017/03/06/finding-the-visual-c-compiler-tools-in-visual-studio-2017/

Currently we are working around this by manually creating a vs150comntools variable.

Wrong password when using special characters

Short explanation:
When I try to upload some package, by conan_package_tools, I receive:

ERROR: Wrong user or password.

Full explanation:
My user and password work well when I execute by conan's client:

$ conan user -r conan.io -p '<password-with-special-characters>' uilianries
Change 'conan.io' user from None (anonymous) to uilianries

However, when I use conan package tools, the authentication fails:

CONAN_USERNAME=uilianries CONAN_CHANNEL=ci CONAN_REFERENCE=libusb/1.0.21 CONAN_PASSWORD='<password-with-special-characters>' CONAN_UPLOAD=1 python build.py

At the end, all my builds pass, but the upload fails:

############## CONAN PACKAGE TOOLS ######################

INFO: ******** RUNNING UPLOAD COMMAND **********
conan upload libusb/1.0.21@uilianries/ci --all --force

#########################################################

ERROR: Wrong user or password. [Remote: conan.io]
Traceback (most recent call last):
File "build.py", line 13, in
builder.run()
File "/usr/local/lib/python2.7/dist-packages/conan/packager.py", line 273, in run
self._upload_packages()
File "/usr/local/lib/python2.7/dist-packages/conan/packager.py", line 375, in _upload_packages
raise Exception("Error with user credentials")
Exception: Error with user credentials

I tried escaping the special characters with '\' (e.g. '1234\&\$foobar'), but it didn't work.

I read packager.py and only found a replace for double quotes. Another point is that the password is forwarded to the conan user command via Python's os.system.

To simplify the analysis, I wrote the following code:

from os import getenv
from os import system
 
if __name__ == "__main__":
    env = getenv("CONAN_PASSWORD")
    env =  env.replace('"', '\\"')
    system("echo %s" % env)
    system("conan user -c")
    system("conan user -r conan.io -p %s uilianries" % env)

This piece works well when I export my password in quotes.

I'm using conan at version 0.20.2 and conan_package_tools 0.2.35 (both are latest).

I have run out of ideas for how to use the automatic upload with my password.
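
The symptom is consistent with the password being interpolated, unquoted, into a shell command: characters like & and $ are expanded by the shell before conan ever sees them. A minimal sketch of the difference, reusing the echo test above (shlex.quote requires Python 3; the password value is made up):

import os
import shlex

password = "1234&$foobar"  # made-up value containing shell metacharacters

# Unquoted: a POSIX shell expands & and $, so the value arrives mangled
os.system("echo %s" % password)

# Quoted: the value is passed through verbatim
os.system("echo %s" % shlex.quote(password))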

Windows builds end up in C:\

Hi

I'm using conan-package-tools on Windows and when I run the python build.py command, all the packages are built in the C:\.conan folder instead of C:\Users\user\.conan.
There's an interesting effect too: in C:\Users\user\.conan\MyPackage\ there are .conan_link files pointing to the C:\.conan folder.

How can I force the packages to be built in the user folder instead of the C:\ root folder?

Update recommended travis.yml

README.md suggests slightly outdated versions of Xcode.
Travis stopped supporting xcode7.1 and xcode6.2 recently, so they are silently dispatched to Xcode 7.3 instead:

This job was configured to run on an OS X image that was retired on November 28, 2016. It was routed to our Xcode 7.3.1 infrastructure.

Xcode 8.1 and 8.2 should be added.

There is a more general question about the lifecycle of a package's .travis.yml: how should it be updated? Automatically, or should the repository owner be notified about the change?

Priority of visual_versions ConanMultiPackager's parameter vs CONAN_VISUAL_VERSIONS

Disclaimer: I am a newbie using Conan and GitHub, so probably what I am proposing here is completely wrong, or the style is not correct... Sorry for that.

I am creating a package for a library that, under Windows, is only compilable using Visual Studio 2013 (12) and 2015 (14). I am trying to show this restriction in the build.py file, so looks like this:

from conan.packager import ConanMultiPackager
if __name__ == "__main__":
    builder = ConanMultiPackager(username="demo")
    builder.add_common_builds(visual_versions = ["12", "14"])
    builder.run()

On my machine, I only have VS 2013 installed so, in order to build the packages, I tried this:

set CONAN_VISUAL_VERSIONS="12" 
python build.py

However, the script tried to build the package using both VS 2013 and 2015 which caused a build error. Checking the code of packager.py I found:

if visual_versions is not None:
    self.visual_versions = visual_versions
else:
    env_visual_versions = list(filter(None, os.getenv("CONAN_VISUAL_VERSIONS", "").split(",")))
    self.visual_versions = env_visual_versions or self.default_visual_versions

That is, the ConanMultiPackager parameter has priority over the environment variable. However, in my opinion, the environment variable should have higher priority, as it would allow covering cases like this (a machine with only one VS version installed) without having to modify the script.

Thanks in advance.
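
One possible workaround, sketched under the assumption that add_common_builds keeps accepting the visual_versions argument shown above, is to read the environment variable in build.py itself so that it wins whenever it is set:

import os
from conan.packager import ConanMultiPackager

if __name__ == "__main__":
    # Use CONAN_VISUAL_VERSIONS when defined, otherwise fall back to the
    # recipe's restriction to VS 2013 (12) and VS 2015 (14)
    env_versions = [v for v in os.getenv("CONAN_VISUAL_VERSIONS", "").split(",") if v]
    builder = ConanMultiPackager(username="demo")
    builder.add_common_builds(visual_versions=env_versions or ["12", "14"])
    builder.run()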

Wildcard upload requires confirmation

Conan will ask for confirmation (yes/no) if the upload reference is a pattern.
The packager should probably pass --confirm to conan upload since its purpose is mostly CI/automation. Without --confirm my CI job creates an infinitely large log when it hangs at:

Are you sure you want to upload '...'? (yes/no): ERROR:  is not a valid answer
Are you sure you want to upload '...'? (yes/no): ERROR:  is not a valid answer
...

Temporary directory "~/" created on build

Hi!

Conan package tools 0.4.31 creates a directory "~" in the current project directory when the build starts.

Steps to reproduce:

  1. Start a docker container
    $ docker run --rm -ti lasote/conangcc63

  2. Update Conan and Package Tools
    $ sudo pip install -U conan conan_package_tools

  3. Create an empty project
    $ conan new -s -t foobar/0.1.0@qux/baz

  4. Create default build.py

from conan.packager import ConanMultiPackager

if __name__ == "__main__":
    builder = ConanMultiPackager(username="qux")
    builder.add_common_builds()
    builder.run()
  1. Just run it
    $ python build.py

As a result, conan + conan_package_tools will produce a tested package.
However, a list command will show:

conan@e93746325111:~/$ ls
build.py  conanfile.py  src  test_package  ~

And finally we have:
$ file \~/.conan/conan.conf

I didn't see that behavior before.

Compressing large files fails on Appveyor before upload

I have a large library (VTK) which I build on Appveyor but in debug it fails to compress all the files due to running out of memory. Can I do something about it? See also the Appveyor-build.

Compressing package...                                                Traceback (most recent call last):
  File "C:\Python27\Scripts\conan-script.py", line 9, in <module>
    load_entry_point('conan==0.8.4', 'console_scripts', 'conan')()
  File "c:\python27\lib\site-packages\conans\conan.py", line 6, in run
    main(sys.argv[1:])
  File "c:\python27\lib\site-packages\conans\client\command.py", line 615, in main
    error = command.run(args)
  File "c:\python27\lib\site-packages\conans\client\command.py", line 529, in run
    method(args[0][1:])
  File "c:\python27\lib\site-packages\conans\client\command.py", line 437, in upload
    args.remote, all_packages=args.all, force=args.force)
  File "c:\python27\lib\site-packages\conans\client\manager.py", line 292, in upload
    uploader.upload_conan(conan_reference, all_packages=all_packages, force=force)
  File "c:\python27\lib\site-packages\conans\client\uploader.py", line 26, in upload_conan
    self.upload_package(PackageReference(conan_ref, package_id), index + 1, total)
  File "c:\python27\lib\site-packages\conans\client\uploader.py", line 35, in upload_package
    self._remote_proxy.upload_package(package_ref)
  File "c:\python27\lib\site-packages\conans\client\proxy.py", line 205, in upload_package
    result = self._remote_manager.upload_package(package_reference, remote)
  File "c:\python27\lib\site-packages\conans\client\remote_manager.py", line 38, in upload_package
    the_files = compress_package_files(the_files)
  File "c:\python27\lib\site-packages\conans\client\remote_manager.py", line 122, in compress_package_files
    return compress_files(files, PACKAGE_TGZ_NAME, excluded=(CONANINFO, CONAN_MANIFEST))
  File "c:\python27\lib\site-packages\conans\client\remote_manager.py", line 144, in compress_files
    addfile(the_file, content, tgz)
  File "c:\python27\lib\site-packages\conans\client\remote_manager.py", line 140, in addfile
    tar.addfile(tarinfo=info, fileobj=string)
  File "c:\python27\lib\tarfile.py", line 2051, in addfile
    copyfileobj(fileobj, self.fileobj, tarinfo.size)
  File "c:\python27\lib\tarfile.py", line 275, in copyfileobj
    dst.write(buf)
  File "c:\python27\lib\gzip.py", line 241, in write
    self.fileobj.write(self.compress.compress(data))
MemoryError: out of memory
Traceback (most recent call last):
  File "build.py", line 8, in <module>
    builder.run()
  File "C:\Python27\lib\site-packages\conan\packager.py", line 244, in run
    self._upload_packages()
  File "C:\Python27\lib\site-packages\conan\packager.py", line 347, in _upload_packages
    raise Exception("Error uploading")
Exception: Error uploading

Cannot upload to specified remote

When using a different remote, only the conan upload call had the remote added; the conan user call needs it too. Merge request #4 resolves this.

Easier way to use pages?

My Qt package takes a very long time to build and I need to be sure that only one build is executed per CI executor. This can be achieved using pages, but it can be quite complicated to work out how many pages to set and which executor will build what.
For example, to build with 2 versions of msvc, in debug and release, I have to configure appveyor as follow:

        - APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015
          CONAN_VISUAL_VERSIONS: 12
          CONAN_TOTAL_PAGES: 4
          CONAN_CURRENT_PAGE: 1
        - APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015
          CONAN_VISUAL_VERSIONS: 12
          CONAN_TOTAL_PAGES: 4
          CONAN_CURRENT_PAGE: 2
        - APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015
          CONAN_VISUAL_VERSIONS: 12
          CONAN_TOTAL_PAGES: 4
          CONAN_CURRENT_PAGE: 3
        - APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015
          CONAN_VISUAL_VERSIONS: 12
          CONAN_TOTAL_PAGES: 4
          CONAN_CURRENT_PAGE: 4
        - APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015
          CONAN_VISUAL_VERSIONS: 14
          CONAN_TOTAL_PAGES: 4
          CONAN_CURRENT_PAGE: 1
        - APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015
          CONAN_VISUAL_VERSIONS: 14
          CONAN_TOTAL_PAGES: 4
          CONAN_CURRENT_PAGE: 2
        - APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015
          CONAN_VISUAL_VERSIONS: 14
          CONAN_TOTAL_PAGES: 4
          CONAN_CURRENT_PAGE: 3
        - APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015
          CONAN_VISUAL_VERSIONS: 14
          CONAN_TOTAL_PAGES: 4
          CONAN_CURRENT_PAGE: 4

But to reach this result I had to understand how the package tools work internally, and the next time I read the script there is a high chance I will need time to understand what is done. And if I want to add MinGW configurations, it becomes even more unclear.

Is there an easier way to work with pages? Or maybe is there a way to directly say what should be build by the executor?

Can't build on non-default version of gcc

We're trying to build a package with gcc 4.9. The system has versions 4.8, 4.8, 6.3 (default).

if __name__ == "__main__":
    builder = ConanMultiPackager(
    reference="switcher/0.4",
    upload=True,
    remote="barbarian")
    builder.add({"os": "Linux", "compiler": "gcc", "compiler.libcxx": "libstdc++", "compiler.version": "4.9", "build_type": "Debug", "arch": "x86"})
    builder.add({"os": "Linux", "compiler": "gcc", "compiler.libcxx": "libstdc++", "compiler.version": "4.9", "build_type": "Debug", "arch": "x86_64"})
    builder.run()
It seems to be the first time you've ran conan
Auto detecting your dev setup to initialize conan.conf
Found gcc 6.3
Default conan.conf settings
    os=Linux
    arch=x86_64
    compiler=gcc
    compiler.version=6.3
    compiler.libcxx=libstdc++
    build_type=Release
*** You can change them in ~/.conan/conan.conf ***
*** Or override with -s compiler='other' -s ...s***


WARN: Conanfile doesn't have 'url'.
It is recommended to add it as attribute
WARN: Conanfile doesn't have 'license'.
It is recommended to add it as attribute
WARN: Conanfile doesn't have 'description'.
It is recommended to add it as attribute
switcher/0.4@paulius/switcher_program: A new conanfile.py version was exported
switcher/0.4@paulius/switcher_program: Folder: /home/parallels/.conan/data/switcher/0.4/paulius/switcher_program/export
PROJECT: WARN: config() has been deprecated. Use config_options and configure
switcher/0.4@PROJECT
WARN: Remotes registry file missing, creating default one in /home/parallels/.conan/registry.txt

############## CONAN PACKAGE TOOLS ######################

DEBUG: - Skipped build, compiler mismatch: {'os': 'Linux', 'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86', 'build_type': 'Debug', 'compiler': 'gcc'}

#########################################################


############## CONAN PACKAGE TOOLS ######################

DEBUG: - Skipped build, compiler mismatch: {'os': 'Linux', 'compiler.version': '4.9', 'compiler.libcxx': 'libstdc++', 'arch': 'x86_64', 'build_type': 'Debug', 'compiler': 'gcc'}

#########################################################

It looks like conan found the default compiler and is sticking with it. How can we make it recognise other versions? We could define the CXX variable or change the settings, but we're going to be doing multi-compiler builds pretty soon, and ConanMultiPackager seems like our best bet here.
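
One thing to try, sketched under the assumptions that gcc-4.9/g++-4.9 are installed under /usr/bin and that Conan's compiler auto-detection honours the CC/CXX environment variables, is to export CC/CXX from build.py before the builder runs:

import os
from conan.packager import ConanMultiPackager

if __name__ == "__main__":
    # Point both Conan's auto-detection and the actual builds at gcc 4.9
    # instead of the system default 6.3 (paths are assumptions)
    os.environ["CC"] = "/usr/bin/gcc-4.9"
    os.environ["CXX"] = "/usr/bin/g++-4.9"

    builder = ConanMultiPackager(reference="switcher/0.4", upload=True, remote="barbarian")
    builder.add({"os": "Linux", "compiler": "gcc", "compiler.libcxx": "libstdc++",
                 "compiler.version": "4.9", "build_type": "Debug", "arch": "x86"})
    builder.add({"os": "Linux", "compiler": "gcc", "compiler.libcxx": "libstdc++",
                 "compiler.version": "4.9", "build_type": "Debug", "arch": "x86_64"})
    builder.run()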

More flexible handling of `upload_repo`

I have some problems with running the new upload handling in ConanMultiPackager using our private conan registry.

First problem, I already have a conan remote entry with the same address as the one specified in the upload parameter. Which then gives the following error:

ERROR: Remote 'foo' already exists with same URL

Maybe the upload parameter should be able to specify an existing remote entry if the build environment already has this set up?

If I remove this entry before running the builder, I can upload the package. But if I run the build script a second time, packager.py tries to add the upload_repo again, even though it already exists.

ERROR: Remote 'upload_repo' already exists in remotes (use update to modify)

I do not see where this is removed from the list of remotes?

Better control on visual studio default builds

When using Visual Studio builds and default builds it is not possible to have Visual Studio 2010 64-bit builds, and one cannot select which runtime to compile with. Merge requests #5 and #6 resolve this without changing the previous behaviour.

Automatic test requires does not work with CI

Official documentation and conan new -t suggest using a conanfile in the test package without a requires directive. I tried to use this.

Travis uses CONAN_USERNAME, CONAN_PASSWORD and maybe CONAN_REFERENCE and others that should be passed to conan-package-tools and then to the conan API. But it says "Please specify user and channel" when running build.py on Travis. Running test_package with a reference on the command line works fine. Running test_package without a reference obviously fails.

I tried to debug the problem: the user variable is lost between c-p-t and the conan API.

conan (0.25.1)
conan-package-tools (0.4.35)

Cannot install conan-package-tools on lasote/conangcc53

The latest docker image does not contain the OpenSSL development headers, so cryptography cannot be installed.

See the build https://travis-ci.org/theirix/conan-jsoncpp/jobs/205219355
Also reproducible by launching sudo pip install conan_package_tools --upgrade with the latest lasote/conangcc53 image.

Installing collected packages: conan-package-tools, idna, pyasn1, enum34, ipaddress, pycparser, cffi, cryptography, pyOpenSSL
Running setup.py install for conan-package-tools ... done
Running setup.py install for pycparser ... done
Running setup.py install for cryptography ... error
Complete output from command /usr/bin/python -u -c "import setuptools, tokenize;file='/tmp/pip-build-OvhSK2/cryptography/setup.py';f=getattr(tokenize, 'open', open)(file);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, file, 'exec'))" install --record /tmp/pip-M7vs4Z-record/install-record.txt --single-version-externally-managed --compile:
running install
running build
running build_py
creating build
creating build/lib.linux-x86_64-2.7
creating build/lib.linux-x86_64-2.7/cryptography
copying src/cryptography/fernet.py -> build/lib.linux-x86_64-2.7/cryptography
copying src/cryptography/init.py -> build/lib.linux-x86_64-2.7/cryptography
copying src/cryptography/about.py -> build/lib.linux-x86_64-2.7/cryptography
copying src/cryptography/exceptions.py -> build/lib.linux-x86_64-2.7/cryptography
copying src/cryptography/utils.py -> build/lib.linux-x86_64-2.7/cryptography
creating build/lib.linux-x86_64-2.7/cryptography/x509
copying src/cryptography/x509/extensions.py -> build/lib.linux-x86_64-2.7/cryptography/x509
copying src/cryptography/x509/base.py -> build/lib.linux-x86_64-2.7/cryptography/x509
copying src/cryptography/x509/init.py -> build/lib.linux-x86_64-2.7/cryptography/x509
copying src/cryptography/x509/oid.py -> build/lib.linux-x86_64-2.7/cryptography/x509
copying src/cryptography/x509/general_name.py -> build/lib.linux-x86_64-2.7/cryptography/x509
copying src/cryptography/x509/name.py -> build/lib.linux-x86_64-2.7/cryptography/x509
creating build/lib.linux-x86_64-2.7/cryptography/hazmat
copying src/cryptography/hazmat/init.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat
creating build/lib.linux-x86_64-2.7/cryptography/hazmat/backends
copying src/cryptography/hazmat/backends/init.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends
copying src/cryptography/hazmat/backends/interfaces.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends
copying src/cryptography/hazmat/backends/multibackend.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends
creating build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives
copying src/cryptography/hazmat/primitives/constant_time.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives
copying src/cryptography/hazmat/primitives/serialization.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives
copying src/cryptography/hazmat/primitives/init.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives
copying src/cryptography/hazmat/primitives/hmac.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives
copying src/cryptography/hazmat/primitives/keywrap.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives
copying src/cryptography/hazmat/primitives/padding.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives
copying src/cryptography/hazmat/primitives/cmac.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives
copying src/cryptography/hazmat/primitives/hashes.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives
creating build/lib.linux-x86_64-2.7/cryptography/hazmat/bindings
copying src/cryptography/hazmat/bindings/init.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/bindings
creating build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/openssl
copying src/cryptography/hazmat/backends/openssl/encode_asn1.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/openssl
copying src/cryptography/hazmat/backends/openssl/rsa.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/openssl
copying src/cryptography/hazmat/backends/openssl/init.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/openssl
copying src/cryptography/hazmat/backends/openssl/dsa.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/openssl
copying src/cryptography/hazmat/backends/openssl/hmac.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/openssl
copying src/cryptography/hazmat/backends/openssl/ec.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/openssl
copying src/cryptography/hazmat/backends/openssl/decode_asn1.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/openssl
copying src/cryptography/hazmat/backends/openssl/dh.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/openssl
copying src/cryptography/hazmat/backends/openssl/backend.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/openssl
copying src/cryptography/hazmat/backends/openssl/cmac.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/openssl
copying src/cryptography/hazmat/backends/openssl/hashes.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/openssl
copying src/cryptography/hazmat/backends/openssl/utils.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/openssl
copying src/cryptography/hazmat/backends/openssl/ciphers.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/openssl
copying src/cryptography/hazmat/backends/openssl/x509.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/openssl
creating build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/commoncrypto
copying src/cryptography/hazmat/backends/commoncrypto/init.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/commoncrypto
copying src/cryptography/hazmat/backends/commoncrypto/hmac.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/commoncrypto
copying src/cryptography/hazmat/backends/commoncrypto/backend.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/commoncrypto
copying src/cryptography/hazmat/backends/commoncrypto/hashes.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/commoncrypto
copying src/cryptography/hazmat/backends/commoncrypto/ciphers.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/backends/commoncrypto
creating build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/ciphers
copying src/cryptography/hazmat/primitives/ciphers/base.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/ciphers
copying src/cryptography/hazmat/primitives/ciphers/init.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/ciphers
copying src/cryptography/hazmat/primitives/ciphers/algorithms.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/ciphers
copying src/cryptography/hazmat/primitives/ciphers/modes.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/ciphers
creating build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/kdf
copying src/cryptography/hazmat/primitives/kdf/kbkdf.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/kdf
copying src/cryptography/hazmat/primitives/kdf/hkdf.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/kdf
copying src/cryptography/hazmat/primitives/kdf/init.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/kdf
copying src/cryptography/hazmat/primitives/kdf/pbkdf2.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/kdf
copying src/cryptography/hazmat/primitives/kdf/scrypt.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/kdf
copying src/cryptography/hazmat/primitives/kdf/concatkdf.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/kdf
copying src/cryptography/hazmat/primitives/kdf/x963kdf.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/kdf
creating build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/twofactor
copying src/cryptography/hazmat/primitives/twofactor/init.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/twofactor
copying src/cryptography/hazmat/primitives/twofactor/totp.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/twofactor
copying src/cryptography/hazmat/primitives/twofactor/hotp.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/twofactor
copying src/cryptography/hazmat/primitives/twofactor/utils.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/twofactor
creating build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/interfaces
copying src/cryptography/hazmat/primitives/interfaces/init.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/interfaces
creating build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/asymmetric
copying src/cryptography/hazmat/primitives/asymmetric/rsa.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/asymmetric
copying src/cryptography/hazmat/primitives/asymmetric/init.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/asymmetric
copying src/cryptography/hazmat/primitives/asymmetric/dsa.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/asymmetric
copying src/cryptography/hazmat/primitives/asymmetric/ec.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/asymmetric
copying src/cryptography/hazmat/primitives/asymmetric/padding.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/asymmetric
copying src/cryptography/hazmat/primitives/asymmetric/dh.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/asymmetric
copying src/cryptography/hazmat/primitives/asymmetric/utils.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/primitives/asymmetric
creating build/lib.linux-x86_64-2.7/cryptography/hazmat/bindings/openssl
copying src/cryptography/hazmat/bindings/openssl/init.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/bindings/openssl
copying src/cryptography/hazmat/bindings/openssl/_conditional.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/bindings/openssl
copying src/cryptography/hazmat/bindings/openssl/binding.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/bindings/openssl
creating build/lib.linux-x86_64-2.7/cryptography/hazmat/bindings/commoncrypto
copying src/cryptography/hazmat/bindings/commoncrypto/init.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/bindings/commoncrypto
copying src/cryptography/hazmat/bindings/commoncrypto/binding.py -> build/lib.linux-x86_64-2.7/cryptography/hazmat/bindings/commoncrypto
running egg_info
writing requirements to src/cryptography.egg-info/requires.txt
writing src/cryptography.egg-info/PKG-INFO
writing top-level names to src/cryptography.egg-info/top_level.txt
writing dependency_links to src/cryptography.egg-info/dependency_links.txt
writing entry points to src/cryptography.egg-info/entry_points.txt
reading manifest file 'src/cryptography.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
no previously-included directories found matching 'docs/_build'
warning: no previously-included files matching '*' found under directory 'vectors'
writing manifest file 'src/cryptography.egg-info/SOURCES.txt'
running build_ext
generating cffi module 'build/temp.linux-x86_64-2.7/_padding.c'
creating build/temp.linux-x86_64-2.7
generating cffi module 'build/temp.linux-x86_64-2.7/_constant_time.c'
generating cffi module 'build/temp.linux-x86_64-2.7/_openssl.c'
building '_openssl' extension
creating build/temp.linux-x86_64-2.7/build
creating build/temp.linux-x86_64-2.7/build/temp.linux-x86_64-2.7
x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fno-strict-aliasing -Wdate-time -D_FORTIFY_SOURCE=2 -g -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/usr/include/python2.7 -c build/temp.linux-x86_64-2.7/_openssl.c -o build/temp.linux-x86_64-2.7/build/temp.linux-x86_64-2.7/_openssl.o
build/temp.linux-x86_64-2.7/_openssl.c:434:30: fatal error: openssl/opensslv.h: No such file or directory
compilation terminated.
error: command 'x86_64-linux-gnu-gcc' failed with exit status 1

----------------------------------------

Command "/usr/bin/python -u -c "import setuptools, tokenize;file='/tmp/pip-build-OvhSK2/cryptography/setup.py';f=getattr(tokenize, 'open', open)(file);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, file, 'exec'))" install --record /tmp/pip-M7vs4Z-record/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /tmp/pip-build-OvhSK2/cryptography/

Verbose option for print build list

Conan Package Tools introduced the build-list printout that helps to identify all builds to be performed.

# packager.py:225
print("Page       : ", curpage)
print("Builds list:")
for p in builds_in_current_page: print(list(p._asdict().items()))

Today I'm working with GCC and Clang, so I get a bunch of builds in my CI log for each job. It's not a problem, but it could be gated behind a verbose option:

# packager.py:205
def run_builds(self, curpage=None, total_pages=None, verbose=False):
  ...
    if verbose:
        print("Page       : ", curpage)
        print("Builds list:")
        for p in builds_in_current_page: print(list(p._asdict().items()))

Another approach could be to use Python logging, putting this message at DEBUG level.

Travis GCC 5.2, 5.3 builds not scheduled correctly

In my package I set up two pages in the Travis config so that for every gcc version there is one build job for debug and one for release builds (I am only building for x86_64). This works fine for all gcc versions except 5.2 and 5.3, where each of the two Travis builds builds debug and release and therefore hits the build timeout limit.

Is there anything I am doing wrong? Thanks a lot!

Plan for Profile breaking change in develop

Current conan develop branch has changed, so it produces a failure in conan-package-tools:

File "C:\Users\memsharded\conanws\conan-package-tools\conan\packager.py", line 335, in _get_profile
    return Profile.loads(tmp % (settings, options, env_vars, br_lines))
AttributeError: type object 'Profile' has no attribute 'loads'

Remember to update it for next release

Stable username option

Hi!

The conventional behaviour with conan is that each user provides a channel with their packages. For example, I created the libusb package on conan.io, so whoever wants this package will look at uilianries/stable. Okay, this works and everybody goes home happy.

However, if you use conan at your company and want to create a main stable channel, this approach does not fit very well. For example, the developers Foo, Qux and Baz use the same package in some project, but no one wants to keep track of who the maintainer of the package is and where the real stable package lives, so the solution is to create a main user name: Company. After this point, all developers know the stable channel will be at Company/version. But a new problem appears: who maintains the Company user and who submits to its channel? Anyone could submit, as long as the package has been validated (build, tests ...).

Conan package tools switches to the stable channel when it finds a matching pattern in the branch name. But consider the situation where my package will be the main package for all developers, so I'll need to bring this package to the Company user. To do this, I need to change the user name and password to upload to the stable channel. This works, but it is quite manual.

To make this process more automatic, the user name could be switched by the stable pattern, something like this:

# packager.py:490

if channel:
    self.logger.warning("Redefined channel by CI branch matching with '%s', "
                        "setting CONAN_CHANNEL to '%s'" % (pattern, channel))
    self.username = getenv("CONAN_STABLE_USERNAME", self.username)
    self.password = getenv("CONAN_STABLE_PASSWORD", self.password)

This piece changes the current user name to a stable user name when the stable channel is used. If a stable user name is not defined, the current user name is kept as before. The same behaviour applies to the password.

The stable switch for the user name allows an automatic switch, not just for the channel. So, when I need some package, by default I'll search under the Company user instead of asking who the real maintainer is.

Pages behaviour

It is somehow not specified on the documentation how the builds are split by pages.

I've tested it and it seems that builds are assigned to pages round-robin by array index, for example for 4 pages:

builder.builds[0] - > Page 1
builder.builds[1] - > Page 2
builder.builds[2] - > Page 3
builder.builds[3] - > Page 4
builder.builds[4] - > Page 1
builder.builds[5] - > Page 2

Please confirm and I can pull request a docs clarification.

Also, is there a way to have any control over this? For example, if I have 2 pages, I might want to split the builds in Page 1 for x86 and Page 2 for x86_64 so they take about the same time, as I also might want to keep variations of Debug or Shared in the same page, to reuse builds that just do different packaging.

Would I have to construct my builds array with x86 on even indexes and x86_64 on odd indexes?

Thanks,
Nuno
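
If the split really is the round-robin by index described above, one way to balance two pages by architecture is to interleave the builds before assigning them back, for example (a sketch, assuming builder.builds entries expose a settings dict as in the output shown in an earlier issue):

from conan.packager import ConanMultiPackager

if __name__ == "__main__":
    builder = ConanMultiPackager(username="nunojpg")
    builder.add_common_builds()

    # Interleave x86 and x86_64 builds so that with CONAN_TOTAL_PAGES=2 the
    # round-robin split puts one architecture on each page
    x86 = [b for b in builder.builds if b.settings["arch"] == "x86"]
    x86_64 = [b for b in builder.builds if b.settings["arch"] == "x86_64"]
    interleaved = [b for pair in zip(x86, x86_64) for b in pair]
    # append any leftovers if the two groups are not the same size
    interleaved += x86[len(x86_64):] + x86_64[len(x86):]

    builder.builds = interleaved
    builder.run()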

os setting is missing

Hi,

I would expect the ConanMultiPackager's builds to have the os setting since this setting is available in most other contexts.
I currently want to disable Linux 32 bit builds like this:

builder = ConanMultiPackager()
builder.add_common_builds(...)
builder.builds = [
    [settings, options]
    for settings, options in builder.builds
    if not (settings["os"] == "Linux" and settings["arch"] == "x86")
]

However it is obvious from the source that os is never defined:

Traceback (most recent call last):
  File "build.py", line 11, in <module>
    if not (settings["os"] == "Linux" and settings["arch"] == "x86")
KeyError: 'os'

I suppose os may also be valuable in the future if cross compilation is supported.
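
Until an os setting is exposed, a workaround sketch, assuming the builds are generated on a Linux CI worker (so every generated build targets Linux anyway) and that the first element of each build entry is the settings dict:

from conan.packager import ConanMultiPackager

if __name__ == "__main__":
    builder = ConanMultiPackager()
    builder.add_common_builds()
    # Filtering on arch alone is enough here, since "os" is not generated
    builder.builds = [b for b in builder.builds if b[0]["arch"] != "x86"]
    builder.run()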

Update readme

Hi,

afaik the command test was changed from "test" to "test-package"?

Greets

Add Bamboo CI

I'd like to see Bamboo get recognized the same way as Appveyor and Travis. The proposal is in PR #11.

MinGW build fails in Appveyor

This MinGW build failed in Appveyor but I don't know why. Everything works fine until the package creation; then it complains about the virtual environment. I have never seen this problem until now. I guess it's related to the fact that conan-package-tools uses mingw_installer (in a virtual environment), but I don't know what to do to fix it.
Can you help me?

MinGW builds not working with appveyor

When running a mingw build in appveyor, I get the following error:
ERROR: Conanfile: 'settings.compiler.libcxx' value not defined

When conan user is called, conan sets the default settings for Visual Studio and does not set the libcxx setting. Then, when conan-package-tools starts the MinGW build, it does not provide the libcxx setting either.

Is there a way for me to set this setting in the appveyor configuration?

A build log is available here: https://ci.appveyor.com/project/osechet/conan-icu/build/1.0-6/job/x5efdfydi1pq1q06

build_policy missunderstanding

Hi!

I'm building a package (here) that downloads the sources from a git repo. I want to maintain a version, master, that always grabs the latest revision from that repository. I supposed this would be the behaviour of build_policy='always': the client gets the sources and compiles them each time conan install is invoked... but

ERROR: Conanfile has build_policy='always', no packages can be uploaded
Traceback (most recent call last):
  File "build.py", line 16, in <module>
    builder.run()
  File "C:\Python27-x64\lib\site-packages\conan\packager.py", line 261, in run
    self._upload_packages()
  File "C:\Python27-x64\lib\site-packages\conan\packager.py", line 367, in _upload_packages
    raise Exception("Error uploading")
Exception: Error uploading
Command exited with code 1

It's OK, it makes no sense to upload the binaries, but I want to upload the recipe... is this behaviour possible?

Thanks!

Python3 doesn't work on OSX

https://travis-ci.org/nunojpg/conan-botan/jobs/232949741

Traceback (most recent call last):
  File "/Users/travis/.pyenv/versions/3.6.1/envs/conan/lib/python3.6/site-packages/pkg_resources/__init__.py", line 659, in _build_master
    ws.require(__requires__)
  File "/Users/travis/.pyenv/versions/3.6.1/envs/conan/lib/python3.6/site-packages/pkg_resources/__init__.py", line 967, in require
    needed = self.resolve(parse_requirements(requirements))
  File "/Users/travis/.pyenv/versions/3.6.1/envs/conan/lib/python3.6/site-packages/pkg_resources/__init__.py", line 858, in resolve
    raise VersionConflict(dist, req).with_context(dependent_req)
pkg_resources.ContextualVersionConflict: (pyOpenSSL 17.0.0 (/Users/travis/.pyenv/versions/3.6.1/envs/conan/lib/python3.6/site-packages), Requirement.parse('pyOpenSSL<16.1.0,>=16.0.0'), {'conan'})

Tested it failing for Python 3.5 and 3.6.

In Linux it works fine.

Error when specifying remote for uploading packages

The documentation states that a remote can be specified, and shows an example on how to do it; however, the example gives me an error:

Traceback (most recent call last):
File "example.py", line 6, in <module>
    builder.upload_packages("mypackage/1.2.3@user/testing", "myconanserverpassword", remote="mycustomserver")
TypeError: upload_packages() got an unexpected keyword argument 'remote'

Example:

from conan.packager import ConanMultiPackager

if __name__ == "__main__":
    builder = ConanMultiPackager([], "user", "testing")
    builder.upload_packages("mypackage/1.2.3@user/testing", "myconanserverpassword", remote="mycustomserver")

Versions:

  • conan: 0.7.4
  • conan-package-tools: 0.1.2
  • Python: 2.7.11

add_common_builds for Windows assumes mingw

After upgrading to 0.2.20, I'm getting:

Traceback (most recent call last):
  File "build.py", line 18, in <module>
     builder.add_common_builds(pure_c=False)
   File "C:\Users\Jenkins\AppData\Roaming\Python\Python27\site-packages\conan\packager.py", line 124, in add_common_builds
     if self.mingw_builds(pure_c):
   File "C:\Users\Jenkins\AppData\Roaming\Python\Python27\site-packages\conan\packager.py", line 95, in mingw_builds
    version, arch, exception, thread = config
    ValueError: need more than 1 value to unpack

Looks like mingw_builds needs to check if mingw_configurations is empty?

        for config in self.mingw_configurations:
            version, arch, exception, thread = config

Libraries incompatibility when running build.py on linux

I've created a package for dlib and it worked in appveyor, but on local linux and travis I get an error at the test_package step

home/conan/.conan/data/dlib/19.1.0/.../lib/libdlib.so when searching for -ldlib
/usr/bin/ld: skipping incompatible /home/conan/.conan/data/dlib/19.1.0/.../lib/libdlib.a when searching for -ldlib

I've checked on local Linux and the library and its dependencies are built with the same arch. Running just test_package works fine. What could the problem be?

Regression on 0.3.1: ValueError: too many values to unpack

This build now fails:

https://travis-ci.org/nunojpg/conan-restbed/jobs/222866566

+python build.py
############## CONAN PACKAGE TOOLS ######################
WARNING: Redefined channel by CI branch matching with 'master', setting CONAN_CHANNEL to 'stable'
#########################################################
Traceback (most recent call last):
File "build.py", line 7, in
for settings, options in builder.builds:
ValueError: too many values to unpack
The command "./.travis/run.sh" exited with 1.

Stable channel support for Gitlab

Hi!

At packager.py:470, we have:

def _get_channel(self, default_channel, stable_channel)

This function detects the CI runner through environment variables, but GitLab is not listed. As a result, CONAN_STABLE_BRANCH_PATTERN doesn't work on GitLab Runner.

To support this feature, you could use GitLab variables.

I think this piece solves the case:

gitlab = os.getenv("GITLAB_CI" , False)  # Mark that job is executed in GitLab CI environment
gitlab_branch = os.getenv("CI_BUILD_REF_NAME", False)  # The branch or tag name for which project is built

Saludos.
