
conan-extensions's Issues

[feature] Dependabot like tool to automate updating lockfiles

What is your suggestion?

Unfortunately, Dependabot has stopped accepting feature requests to support other package managers; see this GitHub thread here.

A tool like this would greatly supplement Conan lockfiles.

For our internal components, we're going to ask teams to start checking in the lockfile beside their conanfile. However, teams are hesitant to adopt this, as it is another process that they need to maintain and not something they're familiar with. If this process were automated, I'm sure adoption would be greater.

Here are a few key Dependabot features that would be useful:

  • automatically creating a PR with an updated lockfile
  • ignoring updates for a subset of dependencies

Conan-specific features/considerations:

  • Invoke conan config install
  • Find the latest version that matches the version ranges from the conanfile
  • Support checking dependencies for multiple configurations (Windows/Linux dependencies can differ), and save the dependencies in the same lockfile
  • Make sure updates still resolve to a viable conan graph (no conflicts with other dependencies)
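For the multiple-configuration point, Conan 2 can extend an existing lockfile by feeding it back into conan lock create. A hypothetical updater could build one invocation per profile; this sketch only constructs the command lines (the function name and profile names are made up):

```python
def lockfile_update_commands(conanfile, profiles, lockfile="conan.lock"):
    """Build one `conan lock create` invocation per configuration,
    each run extending the same lockfile so that e.g. Windows and
    Linux dependencies end up in a single file."""
    commands = []
    for i, profile in enumerate(profiles):
        cmd = ["conan", "lock", "create", conanfile,
               "--profile:host", profile,
               "--lockfile-out", lockfile]
        if i > 0:
            # Later runs start from the lockfile produced so far.
            cmd += ["--lockfile", lockfile]
        commands.append(cmd)
    return commands
```

A CI job could run these commands in order and open a PR if the resulting lockfile differs from the checked-in one.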

Have you read the CONTRIBUTING guide?

  • I've read the CONTRIBUTING guide

Support for specifying the build url on the command line

Currently we are manually patching the build_info.json created by conan art:build-info create to include the build URL, like:

{
    "url": "https://foo.com/bar",
    "version": "1.0.1",
    "name": "build-info-test",
    "number": "666",
...
}

I was wondering whether it is planned to support this out of the box, perhaps by adding a parameter similar to the existing name and number parameters?

Or maybe it already works somehow and I just don't know how?

[question] Build info in Artifactory doesn't link to the conan packages in Artifactory

For our builds we use the Artifactory plugin for Jenkins in combination with Conan. Additionally, we push the build information to Artifactory after uploading the Conan package. The problem is that there are no artifacts in the build information. This makes it especially difficult for developers: you cannot jump directly from the build info in Artifactory to the respective artifact in the Artifactory tree.

We use the following code to upload the conan package and publish the build info to Artifactory.

String cmd = "upload TestProj/*@user/develop --all -r " + artifactoryServerName + " --confirm "
def b = artifactoryConanClient.run(command: cmd, buildInfo: buildInfo)
artifactoryServer.publishBuildInfo b

Artifactory shows that there are three artifacts:
Artifactory Build Info

But if I open the package there is no link to the artifacts:

Artifactory build info - published modules

Does anyone else have similar problems or see a bug in my code?

Versions:
Artifactory: 6.17.0
artifactoryPluginVersion: 3.6.1
Conan: 1.23.0

cannot import name 'api_request' from 'utils'

After installing with the following command, the custom commands fail to load as shown below. How can I solve this? Thanks!

❯ conan config install https://github.com/conan-io/conan-extensions.git
Trying to clone repo: https://github.com/conan-io/conan-extensions.git
Repo cloned!
Copying file runtime_zip_deploy.py to /Users/robin/.conan2/extensions/deployers
Copying file licenses.py to /Users/robin/.conan2/extensions/deployers
Copying file cmd_cyclonedx.py to /Users/robin/.conan2/extensions/commands/sbom
Copying file readme_build_info.md to /Users/robin/.conan2/extensions/commands/art
Copying file cmd_property.py to /Users/robin/.conan2/extensions/commands/art
Copying file utils.py to /Users/robin/.conan2/extensions/commands/art
Copying file cmd_server.py to /Users/robin/.conan2/extensions/commands/art
Copying file readme_server.md to /Users/robin/.conan2/extensions/commands/art
Copying file readme_property.md to /Users/robin/.conan2/extensions/commands/art
Copying file cmd_build_info.py to /Users/robin/.conan2/extensions/commands/art
Copying file cmd_bump_deps.py to /Users/robin/.conan2/extensions/commands/recipe
Copying file cmd_convert_txt.py to /Users/robin/.conan2/extensions/commands/migrate
Copying file cmd_export_all_versions.py to /Users/robin/.conan2/extensions/commands/cci
Copying file cmd_upgrade_qt_recipe.py to /Users/robin/.conan2/extensions/commands/cci
Copying file cmd_list_v2_ready.py to /Users/robin/.conan2/extensions/commands/cci

❯ conan art:server --help
ERROR: Error loading custom command art.cmd_build_info: cannot import name 'api_request' from 'utils' (/Users/robin/miniconda3/lib/python3.11/site-packages/utils/__init__.py)
ERROR: Error loading custom command art.cmd_property: cannot import name 'api_request' from 'utils' (/Users/robin/miniconda3/lib/python3.11/site-packages/utils/__init__.py)
ERROR: Error loading custom command art.cmd_server: cannot import name 'api_request' from 'utils' (/Users/robin/miniconda3/lib/python3.11/site-packages/utils/__init__.py)
'art:server' is not a Conan command. See 'conan --help'.

ERROR: Unknown command 'art:server'
ERROR: Error loading custom command art.cmd_build_info: cannot import name 'api_request' from 'utils'
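The paths in the traceback show the cause: a third-party utils package installed in miniconda's site-packages shadows the extension's own utils.py next to the commands. A quick diagnostic (not part of the extension) to confirm which module a bare import resolves to:

```python
import importlib.util

def resolve_module_origin(name):
    """Return the file a bare `import <name>` would load, or None
    if no module of that name is importable."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

# If this prints a path under site-packages rather than
# ~/.conan2/extensions/commands/art/utils.py, the import is shadowed;
# uninstalling or renaming the conflicting `utils` package fixes it.
print(resolve_module_origin("utils"))
```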

[question] test_requires in lockfile/SBOM

What is your question?

test_requires packages (e.g. gtest) are entered as "requires" in the lockfiles and are therefore also part of the SBOM. I think such requires should not be included in the SBOM. Or is there any reason for this?
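As a workaround until the command filters them, the graph JSON could be pruned before SBOM generation. Conan 2 graph nodes pulled in via test_requires are typically flagged with a test field; treat the exact field name as an assumption and check your own graph output:

```python
def drop_test_requires(nodes):
    """Filter out graph nodes flagged as test requirements (e.g. gtest)
    before feeding the node dict into SBOM generation.
    `nodes` is the "nodes" mapping from a Conan 2 graph JSON."""
    return {nid: node for nid, node in nodes.items()
            if not node.get("test", False)}
```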

Have you read the CONTRIBUTING guide?

  • I've read the CONTRIBUTING guide

Set the Artifact.properties using the CI/CD building process (TeamCity)

Hi team,

I have not found a solution for my request.

  1. I have a task: I need to save some properties (git commit, CI/CD build number, HW version) in Artifactory.
    I found the artifact.properties file in the .conan cache folder, which can only be edited locally. After editing it, I can upload packages manually into the Artifactory repository; that works fine.
    But what about CI/CD builds, where packages are uploaded automatically? How can this file be edited in that case (e.g. on the server side)? Are there any other ways to save these properties in a package revision?

  2. One more question: how can these properties be read? I mean not only in the Artifactory GUI, but on the command line. We first need to check which package to install based on these property values.
    For example, to see all recipe revisions, package IDs, or package revisions I can use the conan search command. But what about properties? Is there a command to show them in the terminal?
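Regarding question 2, Artifactory's REST API can both set and read item properties, which also covers the CI/CD part of question 1: a pipeline can attach properties after conan upload without touching any cache file. A sketch of the URL construction against the documented api/storage endpoint (the helper names are made up):

```python
from urllib.parse import quote

def get_properties_url(base_url, repo, path):
    """URL to read item properties: GET <base>/api/storage/<repo>/<path>?properties"""
    return f"{base_url}/api/storage/{repo}/{quote(path)}?properties"

def set_properties_url(base_url, repo, path, props):
    """URL to attach properties: PUT <base>/api/storage/<repo>/<path>?properties=k1=v1;k2=v2"""
    encoded = ";".join(f"{k}={v}" for k, v in props.items())
    return f"{base_url}/api/storage/{repo}/{quote(path)}?properties={encoded}"
```

The GET variant answers the command-line question: the response is JSON, so the property values can be inspected in the terminal with curl or a small script.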

"conan art:build-info upload" is using wrong file path and fails

When using conan art:build-info upload, according to the documentation we should provide the repository where the packages were uploaded in Artifactory (which is not the same as the Conan remote name, which points to a virtual repository).

Once we run the conan art:build-info upload command, it fails because the package is not found: it is not stored in the virtual repository location (note that adding properties is supported only for local and local-cache repositories). This happens because, during the build upload process, the command uses the path of the virtual repository rather than the local repository where the packages were initially uploaded (see the attached screenshot).

iOS/OSX universal binaries handling

Universal (fat) binaries are pretty common on Apple platforms: they are simply binaries for several architectures combined together.

Sorry for the long text, but I am trying to describe my use case in as much detail as possible :)
For example, on OSX it's common to have x86 + x64 universal binaries, while for iOS it's usually armv7 + armv7s + arm64/armv8 combined, plus optionally x86 and x64 if the iOS simulator is required (so from 3 to 5 architectures). The same applies to other Apple platforms, such as watchOS and tvOS.
Usually, universal binaries are produced by running multiple builds and then running the lipo utility on the resulting shared/static libraries. An alternative approach is to run a single universal build, but that tends to be complicated and error-prone with configure (autotools) style projects; e.g. such configure scripts need to detect the size of a pointer, which is ambiguous for universal binaries (sizeof(void*): 4 or 8?).

I need some advice on how to proceed with universal binaries in conan. There are several approaches:

  1. The conan recipe invokes N builds for N architectures and runs lipo to combine them as post-processing.
    This is the way I am currently using. A typical conanfile may look like:
if self.settings.os == "Windows":
    self.build_with_msvc()
else:
    self.build_with_configure()

and then for OSX/iOS it will become much more complicated, like:

if self.settings.os == "Windows":
    self.build_with_msvc()
elif self.settings.os == "iOS":
    arches = ["armv7", "armv7s", "arm64", "i386", "x86_64"]
    for arch in arches:
        self.build_with_configure(arch, "iphoneos")
    self.lipo(arches)
elif self.settings.os == "Macos":
    arches = ["i386", "x86_64"]
    for arch in arches:
        self.build_with_configure(arch, "macos")
    self.lipo(arches)
else:
    self.build_with_configure()

There are a few disadvantages to this approach:

  • the conanfile has a very special path for iOS and Macos, although the build process is the same as for other *NIX platforms
  • lots of code is copy-pasted from package to package for universal binaries handling
  • the package "arch" setting no longer tells the truth; moreover, it's a confusing field in the case of universal binaries

  2. Build N conan packages, one per architecture, and then combine them somehow.
    The idea is to build conan packages for each architecture (armv7, arm64, etc.) as usual and then combine them into an aggregated multi-architecture conan package. I am not sure whether it is even possible for a conan package to support multiple settings at the same time, but there are probably more use cases for such combined packages (e.g. Windows tools which may run on both x86 and x64, but not on ARM, might have a combined x86+x64 indication). This approach probably ends up in a new conan command line, like "conan combine" or "conan lipo", which would run lipo on all libraries from the packages to be combined.

Some unclear points about this approach:

  • while binaries are in general easy to handle, headers might be a tricky part if they differ between arches (in my practice I just copy headers from sources, but some headers might be generated during the build and contain definitions like SIZE_OF_VOIDP)
  • either way, conan somehow needs to know which binaries to process and where they are located (in the worst case, conan can still wildcard *.so *.a *.dylib)
  • some convention on how to identify and describe combined packages would have to be developed
  • a new conan command would be very OS-specific, which might not be good (or useless for other platforms?)
  • conflict resolution issues may appear, e.g. if the x86 arch is requested, which package should be installed: x86 or x86+x64?
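The lipo post-processing step from approach 1 can be isolated into a small reusable helper, which would reduce the copy-paste between recipes. This is only a sketch with made-up folder-layout assumptions; lipo itself is the standard Apple tool:

```python
import subprocess
from pathlib import Path

def lipo_command(lib_name, arch_dirs, out_dir):
    """Build the `lipo -create` command line that merges the per-arch
    copies of `lib_name` (one per directory in `arch_dirs`) into a
    single universal binary in `out_dir`."""
    inputs = [str(Path(d) / lib_name) for d in arch_dirs]
    output = str(Path(out_dir) / lib_name)
    return ["lipo", "-create", *inputs, "-output", output]

def run_lipo(lib_name, arch_dirs, out_dir):
    # Only meaningful on macOS, where the lipo tool is available.
    subprocess.run(lipo_command(lib_name, arch_dirs, out_dir), check=True)
```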

Delete permissions needed for package promotion?

We are running into problems when trying to promote packages from one Artifactory repository to another.

To be precise, we are using the following scenario:
First, packages are built and uploaded in the classic way (via conan upload) to a "Quality" repository.
If everything is ready and some tests are executed, the packages shall be promoted to a "Production" repository.
We use the conan art:promote command from the bottom of this page:
https://github.com/conan-io/conan-extensions/tree/main/extensions/commands/art
to promote the packages.

But when doing so, we get this error message:


ERROR: Error requesting api/copy/Quality/user/channel/package-name/version/channel/recipe-rev/package/package-id/package-rev/?to=/Production/user/package-name/version/channel/recipe-rev/package/package-id/package-rev&suppressLayouts=0: {
	"messages" : [ {
		"level" : "ERROR",
		"message" : "User doesn't have permissions to override 'Production:user/package-name/version/channel/recipe-rev/package/package-id/package-rev/.timestamp'. Needs delete permissions."
	} ]
}

The user permissions for the "Production" repository indeed only include read and write permissions, but not delete or overwrite permissions.
However, this is by intention: In fact, deleting and overwriting in the "Production" repo in general is usually forbidden in our organization, because we always want to be able to reproduce the build, and therefore we want to prevent packages from being (accidentally) deleted or overwritten.

So my questions are:

  1. Is this behaviour intended?
  2. Is it really necessary to have delete permissions, or is there a way to implement the promotion without them?

BTW: It seems that only the .timestamp file prevents the promotion. I don't quite understand why it already exists in the destination folder, or why it has to be modified at all…

[feature] Add token-based authentication to art:server command

In our company, we have an Artifactory server with SSO. Somehow, it is incompatible with the way art:server add tries to get a token, because the URL it tries to access doesn't exist. However, I can manually create an identity token and I'd like a way to specify it, e.g.:

conan art:server add my_art https://artifactory.company.com --user=fschoenm --token=<token>

There may be some corner cases but essentially, the subcommand to add a server should just use the specified token instead of trying to generate it.
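For illustration, requests could then authenticate with the user-supplied identity token as a Bearer credential instead of calling the token-generation endpoint. The helper names below are hypothetical, not the extension's actual API:

```python
import json
import urllib.request

def auth_headers(token):
    """Artifactory accepts identity/access tokens as Bearer credentials."""
    return {"Authorization": f"Bearer {token}"}

def artifactory_get(url, token):
    """GET an Artifactory REST endpoint using the stored token,
    skipping any token-generation round trip."""
    req = urllib.request.Request(url, headers=auth_headers(token))
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```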

[feature request] Add meta-data (label, tags, properties) to recipe and published packages for better search

Hi @memsharded and @lasote

This is a new feature request which would make packages easier to find.

Basics

As I have seen, one can set properties for meta-data in Artifactory. Currently this is only available in the Pro version, and I'm not sure whether conan can already set them automatically. I'm also not sure whether these properties can be used in a conan search command (e.g. in a query). I didn't find any reference for that, so I assume it's not possible.

Idea

So the idea/feature request is that conan should provide a way to define meta-data in a recipe and/or on the command line, which would automatically set properties in Artifactory. The user would then have additional filters when searching for packages.

Use case/example

The user is looking for a package (library) which can process XML data, but does not know the (exact) name of the library. And there is probably more than one library which can do this (there are a lot of libs that can). So the user wants to search for all stable packages which have the label/tag/property (no idea what the best term would be) "xml" or "xml*".
The command could look like this:

conan search * [-r my-remote] -q "property=xml*"

which would list all (remote-)packages where the property is "xml" or everything else starting with "xml".

Best Aalmann

art:build-info upload fails to upload build info when a (python) dependency is located on a different remote

We are using a python dependency (base class) in our conanfiles, which is located on a generic publicly available conan repo (conan-remote-a).
The package to be created will be uploaded to a different remote (conan-remote-b).

When trying to upload the build info, I get a '400' error complaining that the properties for conan_base cannot be set correctly. That is understandable, as (according to the error message) it searches in the wrong remote (conan-remote-b).

Interestingly, art:build-info create will always add the dependency on the python base class to the build info json file, even if I don't specify --with-dependencies. Maybe a bug there as well?

[question] How to upload & promote a package that was exported to the local cache

Hello,

I am trying to follow artifactory best practices by having 3 different conan repos: one for dev, one for tests and one for prod.

If I understood correctly, having a pro license, I can promote a build from one repo to the other.

However, after conan build and conan export-pkg have been run, there is no direct and easy way to upload a package to a repo, so that it can later be promoted with the JFROG CLI (unless uploaded with the dedicated CLI command, rather than conan upload).

Now, the conan art:promote was made for that purpose, but I don't really understand how it works.

I tried creating a graph when running conan build and passing it to conan art:build-info. That gives a very small file; I expected it to list all my dependencies (they are properly listed inside graph.json), but none of them are present. Could you please give me a hint how all of that works? I found no proper documentation for a full flow with these commands (from a conan build to an upload & promotion). Needless to say, the JFrog CLI equivalents are not very clear either.

Thanks in advance!

[bug] License Zipper is skipping packages

There seem to be situations where the license zipper (and maybe other deployers) cannot collect files from all dependencies. This seems to happen if Conan decides that it can skip certain dependencies (e.g. header-only libs that have already been compiled into a static lib).

However, the license of a skipped package might still require me to ship it along, which is entirely unrelated to which packages have to be available for the build process.

Is there even a way to handle this situation correctly (or maybe I'm operating it wrong)? Otherwise Conan might be inadequate to correctly handle license obligations.

Steps to reproduce

I tried as example with the fast-dds package that depends on asio:

  1. Install conan-extensions with conan config install.
  2. Make sure all packages exist in your package cache:
$ conan install --requires fast-dds/2.10.1 -b missing
[...]
  3. Call conan install again with the license deployer:
$ conan install --requires fast-dds/2.10.1 --deployer=licenses
[...]
======== Computing dependency graph ========
Graph root
    cli
Requirements
    asio/1.28.0#83b60467ba4e3487d61a29a722166a0b - Cache
    fast-cdr/1.0.27#8dbda822b54bd311d8765cd3ea9b3381 - Cache
    fast-dds/2.10.1#41c9359faf3149a48652b3d049a2883a - Cache
    foonathan-memory/0.7.3#e15d6872869ad2a42dc4e8b95e624772 - Cache
    tinyxml2/9.0.0#f53e08b723411bc90730f934c6c2f511 - Cache
Build requirements
    cmake/3.27.0#1916e6ab33353145b93244d065a2c868 - Cache
Resolved version ranges
    cmake/[>=3.16.3 <4]: cmake/3.27.0

======== Computing necessary packages ========
Requirements
    asio/1.28.0#83b60467ba4e3487d61a29a722166a0b:da39a3ee5e6b4b0d3255bfef95601890afd80709#a99d859c17cb5264f9ecec84095602ae - Skip
    fast-cdr/1.0.27#8dbda822b54bd311d8765cd3ea9b3381:d841a431cd09fff8a601e5cb5d88ba4c9a0ac886#118448c3308155aa9521fbbab814f2f7 - Cache
    fast-dds/2.10.1#41c9359faf3149a48652b3d049a2883a:e10bb7c35eee6d3b2521d8cc27ed8664095ea09e#a3be3e7f2ecdbcbe29dc57ce732cb7bc - Cache
    foonathan-memory/0.7.3#e15d6872869ad2a42dc4e8b95e624772:a7502e7e8a977fc7b55a68c566cea99e85523872#587a800f6de0077c6e37fce471c27920 - Cache
    tinyxml2/9.0.0#f53e08b723411bc90730f934c6c2f511:d841a431cd09fff8a601e5cb5d88ba4c9a0ac886#1f0cab4b10005ca32472a4f43e15fa19 - Cache
Build requirements
    cmake/3.27.0#1916e6ab33353145b93244d065a2c868:63fead0844576fc02943e16909f08fcdddd6f44b#ca1493fa25e1944ebc4de1c49ac0fad5 - Skip

======== Installing packages ========
fast-cdr/1.0.27: Already installed! (1 of 4)
foonathan-memory/0.7.3: Already installed! (2 of 4)
foonathan-memory/0.7.3: Appending PATH env var with : /home/fschoenm/devel/conan-license-test/conan/p/b/foona04b81a8e3ccb8/p/bin
tinyxml2/9.0.0: Already installed! (3 of 4)
fast-dds/2.10.1: Already installed! (4 of 4)
WARN: deprecated: Usage of deprecated Conan 1.X features that will be removed in Conan 2.X:
WARN: deprecated:     'cpp_info.names' used in: fast-dds/2.10.1, fast-cdr/1.0.27, foonathan-memory/0.7.3
WARN: deprecated:     'cpp_info.build_modules' used in: fast-dds/2.10.1, fast-cdr/1.0.27, foonathan-memory/0.7.3
WARN: deprecated:     'env_info' used in: foonathan-memory/0.7.3

======== Finalizing install (deploy, generators) ========
deployer(licenses): /home/fschoenm/devel/conan-license-test/conan/p/b/fast-8bd3ddd4b92b9/p/licenses
deployer(licenses): /home/fschoenm/devel/conan-license-test/licenses/fast-dds/2.10.1
deployer(licenses): /home/fschoenm/devel/conan-license-test/conan/p/b/tinyx15bb29e242aad/p/licenses
deployer(licenses): /home/fschoenm/devel/conan-license-test/licenses/tinyxml2/9.0.0
deployer(licenses): /home/fschoenm/devel/conan-license-test/conan/p/b/fast-480486eeb23cb/p/licenses
deployer(licenses): /home/fschoenm/devel/conan-license-test/licenses/fast-cdr/1.0.27
deployer(licenses): /home/fschoenm/devel/conan-license-test/conan/p/b/foona04b81a8e3ccb8/p/licenses
deployer(licenses): /home/fschoenm/devel/conan-license-test/licenses/foonathan-memory/0.7.3
deployer(licenses): ['fast-dds/2.10.1/LICENSE', 'tinyxml2/9.0.0/LICENSE.txt', 'fast-cdr/1.0.27/LICENSE', 'foonathan-memory/0.7.3/LICENSE']

[...]
  4. Check the licenses.zip:
$ unzip -l licenses.zip
Archive:  licenses.zip
  Length      Date    Time    Name
---------  ---------- -----   ----
    11358  04-04-2023 14:53   fast-dds/2.10.1/LICENSE
      808  06-07-2021 02:10   tinyxml2/9.0.0/LICENSE.txt
    11358  03-22-2023 07:29   fast-cdr/1.0.27/LICENSE
      902  01-11-2023 12:44   foonathan-memory/0.7.3/LICENSE
---------                     -------
    24426                     4 files

As you can see, the asio license is missing.

[Question] Unable to upload build info

Hi!
I'm migrating to conan 2.0 from conan 1.x and conan_build_info --v2, which works fine on our prod Artifactory. But the new conan art:build-info upload command returns

ERROR: 403: Request for 'conan:my_company/bzip2/1.0.8/stable/d290a2bca88be37304ea12f0db2a6149/export/conanfile.py' is forbidden for user: 'gitlab', You must have annotate permission on this path

which means I need to extend the permissions for the gitlab user. Can anyone explain why? What is the difference between these two approaches? I ask because I need to justify to the security team why this is necessary.

I've tried to check the difference on my own trial Artifactory account.
According to the "How to manage Build Info's in Artifactory" guide, I repeated all the steps:

conan create . --name bzip2 --version=1.0.8 --user=user --channel=testing -r test-conan -f json > out.json
conan upload bzip2/1.0.8@user/testing -r test-conan -c
conan art:build-info create out.json $(BUILD_NAME) $(BUILD_ID) test-conan --url=https://conan2bi.jfrog.io/artifactory --user=$(ADMIN_NAME) --password=$(ADMIN_PASS) --with-dependencies > bi.json
conan art:build-info upload bi.json --url=https://conan2bi.jfrog.io/artifactory --user=$(ADMIN_NAME) --password=$(ADMIN_PASS)

First 3 commands work without issue and produce correct json files.
But the last command output:

ERROR: 400: Failed to set properties on test-conan:user/bzip2/1.0.8/testing/89da1be3ae77a6034b86f0298f4f2b33/export/conanfile.py

What can be wrong there?

[feature] Publish buildinfo in a project specific artifactory repo

Provide an option to the conan_build_info --v2 publish subcommand to be able to specify an artifactory project with a dedicated buildinfo repository where the buildinfo should be stored.

Currently one can only perform this with the jfrog CLI build upload command (see https://www.jfrog.com/confluence/display/JFROG/Artifactory+REST+API#ArtifactoryRESTAPI-BuildUpload) but this requires an additional tool in the CI pipeline (including setup and authentication) though the principal functionality is already contained in the conan_build_info tool (see also czoido/conan-build-info-example#1).

[question] repository VS server VS url (vs remote)

Why are there repository, --server and --url? What's the difference, and why do I need multiple?

Also I get the following error:

  File "/home/hannes/.conan2/extensions/commands/art/cmd_build_info.py", line 167, in _get_remote_artifacts
    assert self._url and self._repository, "Missing information in the Conan local cache, " \
AssertionError: Missing information in the Conan local cache, please provide the --url and --repository arguments to retrieve the information from Artifactory.

ERROR: Missing information in the Conan local cache, please provide the --url and --repository arguments to retrieve the information from Artifactory.

Note that it says --repository, although "repository" is a positional argument and not specified via --.

It should also be noted that normal conan tools use the term "remote", accessible via -r and --remote. Why wasn't this used?

Error when installing conan-extensions

I had the following error when trying to install conan-extensions:

$ conan config install https://github.com/conan-io/conan-extensions.git
ERROR: Error loading custom command qt.cmd_upgrade_qt_recipe: No module named 'ruamel'
Trying to clone repo: https://github.com/conan-io/conan-extensions.git
Repo cloned!
Copying file cmd_property.py to /Users/martindelille/.conan2/extensions/commands/art
Copying file cmd_server.py to /Users/martindelille/.conan2/extensions/commands/art
Copying file cmd_build_info.py to /Users/martindelille/.conan2/extensions/commands/art
Copying file cmd_bump_deps.py to /Users/martindelille/.conan2/extensions/commands/recipe
Copying file cmd_convert_txt.py to /Users/martindelille/.conan2/extensions/commands/migrate
Copying file cmd_export_all_versions.py to /Users/martindelille/.conan2/extensions/commands/cci
Copying file cmd_upgrade_qt_recipe.py to /Users/martindelille/.conan2/extensions/commands/cci
Copying file cmd_list_v2_ready.py to /Users/martindelille/.conan2/extensions/commands/cci

I installed the missing package using pip install ruamel.yaml.

After installing extensions: "Error loading custom command cci.cmd_upgrade_qt_recipe" in each conan call

Hi! I work with conan installed by the standalone installer (not by pip). After installing the latest conan-extensions, every conan command call starts with this error:

ERROR: Error loading custom command cci.cmd_upgrade_qt_recipe: No module named 'xml.etree'

Steps to reproduce:

  1. Download and install https://github.com/conan-io/conan/releases/download/2.0.14/conan-win-64.exe
  2. Run: conan config install https://github.com/conan-io/conan-extensions.git
  3. Run any other conan command (e.g. conan --version)

[docs] Add documentation for all build-info subcommands to the readme

> LGTM, the only thing missing is adding the documentation to the README

I took a look at the readme but can't see a good place for it. Also, my thought was that this new --project flag is optional, and I didn't want to add more things to the commands of the example. If you think we should add it somewhere, please open an issue and I'll be happy to contribute it in a PR 😃

Maybe now that the command has almost all the functionality, it's a good opportunity to document all subcommands in the README?

Originally posted by @czoido in #56 (comment)

ERROR: There are no artifacts ...

I am experimenting with these commands, but build-info always fails for me:

% conan art:build-info create install_log.json mybuildname_release 1 ARTIFACTORY --with-dependencies  --url ARTIFACTORY_URL -vv            
ERROR: There are no artifacts for the openssl/3.1.2#ec69d7153001734ad8772d6689d84ac6 recipe. Probably the package was not uploaded before creating the Build Info.Please upload the package to the server and try again.

The respective part of install_log.json reads:

            "11": {
                "ref": "openssl/3.1.2#ec69d7153001734ad8772d6689d84ac6",
                "id": "11",
                "recipe": "Downloaded",
                "package_id": "8c90e1d6069bb1e38287dda3adc2164a219fc8ae",
                "prev": "0e7f358e9d3b86deb9b942d6eb87997a",
                "rrev": "ec69d7153001734ad8772d6689d84ac6",
                "rrev_timestamp": 1704471030.13,
                "prev_timestamp": 1704471030.82,
                "remote": "ARTIFACTORY",
                "binary_remote": "ARTIFACTORY",
                "build_id": null,
                "binary": "Download",
                "invalid_build": false,
                "info_invalid": null,
                "name": "openssl",
                "user": null,
                "channel": null,

So, clearly, it downloaded the openssl artifact with the respective hash from the artifactory.

The Artifactory WebUI also shows the package (screenshot omitted).

What am I doing wrong?

Publish properties in the conan art:build-info upload command.

This is something that would simplify the process of working with the Build Info; instead of doing:

conan art:property build-info-add build_info.json ...
conan art:build-info upload build_info.json ...

To do so, it would be good to have shared modules between the two commands, which apparently is not possible at the moment and needs to be solved in the Conan client.

Support for searching in multiple Artifactory repositories for the artifacts of dependencies

The conan art:build-info create command currently only searches for artifacts in a single Artifactory repository. This can be specified by the repository positional argument. However, this is not sufficient as the dependencies of a package are frequently published to a different repository than the consumer package.

In our case some of the dependencies of the package we develop come from a different repository in Artifactory, hence the conan art:build-info create fails with the following message.

There are no artifacts for the recipe. Probably the package was not uploaded before creating the Build Info. Please upload the package to the server and try again.

Would it be possible to update the handling of the repository argument to support searching in multiple Artifactory repositories, preferably by providing the repositories as a comma-separated list?
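The requested behaviour could be sketched as a fallback loop over candidate repositories; fetch_artifacts stands in for whatever per-repository query the command performs (all names here are hypothetical):

```python
def find_artifacts(repositories, fetch_artifacts):
    """Try each Artifactory repository in order and return the first
    non-empty artifact list together with the repository it came from.
    `fetch_artifacts(repo)` is a callable returning a (possibly empty)
    list of artifacts for that repository."""
    for repo in repositories:
        artifacts = fetch_artifacts(repo)
        if artifacts:
            return repo, artifacts
    raise RuntimeError("no artifacts found in: " + ", ".join(repositories))
```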

Build info upload failing in case the recipe upload was skipped, but a new package has been uploaded

I tried the art:build-info commands in our project to upload the build info to Artifactory for our new Conan 2 builds, like:

    conan create . --format json --channel $channel --user $user --version $version --profile:build $profile --profile:host $profile > create.json
    conan upload xtea/$version@$user/$channel --remote $remote --confirm

    conan config install https://git-url.com/conan-extensions.git --source-folder=extensions/commands/art --target-folder=extensions/commands/art
    conan art:build-info create create.json ${JOB_NAME:-local} ${BUILD_NUMBER:-666} $remote --url https://articatory-url.com/artifactory > build_info.json
    sed -ie "2i \ \ \ \ \"url\": \"${BUILD_URL:-https://foo}\"," $(pwd)/build_info.json
    conan art:build-info upload build_info.json --url https://articatory-url.com/artifactory --user $CONAN_LOGIN_USERNAME --password $CONAN_PASSWORD

If I do not use --force with conan upload, subsequent calls will not re-upload the recipe but will upload a new package revision. The build info upload then fails with the following error message:

Upload summary:
conan-repo
  package/1.2.3@user/channel
    revisions
      a258c816cd0fe01b0e3f1b7b23150160 (Skipped, already in server)
        packages
          a7977d43447e716bc7861543ecdf642b7cc66708
            revisions
              df1ca62616cc0c83ac397c4e21dbbf86 (Uploaded)

....

ERROR: There are no artifacts for the package/1.2.3@user/channel#a258c816cd0fe01b0e3f1b7b23150160 recipe. Probably the package was not uploaded before creating the Build Info.Please upload the package to the server and try again.

[bug] conan art:server add results in ERROR: 400: Couldn't perform replication, check server logs for more information.

Hello,

I am using conan 2.3.0. I only checked on a linux machine, running python 3.9.18.

Running conan art:server add my_artifactory --user 'admin' --password 'password' 'http://artifactory:8082/artifactory/api/conan/conan-dev-local' -vvv

Gives me:

Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/conan/cli/cli.py", line 193, in run
    command.run(self._conan_api, args[0][1:])
  File "/usr/local/lib/python3.9/site-packages/conan/cli/command.py", line 180, in run
    sub.run(conan_api, parser, *args)
  File "/usr/local/lib/python3.9/site-packages/conan/cli/command.py", line 197, in run
    info = self._method(conan_api, parent_parser, self._parser, *args)
  File "/home/carl//.conan2/extensions/commands/art/cmd_server.py", line 123, in server_add
    token = api_request("get", f"{url}/api/security/encryptedPassword", user, password)
  File "/home/carl//.conan2/extensions/commands/art/utils.py", line 80, in api_request
    raise BadRequestException(response_to_str(response))
utils.BadRequestException: 400: Couldn't perform replication, check server logs for more information.

ERROR: 400: Couldn't perform replication, check server logs for more information.

I first thought the error was caused by the targeted server being a virtual repository, but changing it to a local one gave the same result. I also could not find anything interesting in the server's logs (though I may not have looked in the right place).

Could you please have a look?

Thanks in advance.

[question] Use art:build-info for consumer of conan packages

I'm not sure if this is even possible with the Artifactory extension. I have a project that acts only as a consumer of conan packages and does not create a conan package itself. I'm only using conan install . to get the dependencies.

I tried to follow the steps in https://github.com/conan-io/conan-extensions/blob/main/extensions/commands/art/README.md, but I only got to step 2a of the description. After conan install I have a JSON file that looks fine, but I cannot conan upload because my consumer is not a conan package itself. conan art:build-info create then doesn't generate anything useful, just an almost empty JSON struct:

{
    "version": "1.0.1",
    "name": "application-name",
    "number": "129",
    "agent": {},
    "started": "2024-01-04T10:24:14.908+0100",
    "buildAgent": {
        "name": "conan",
        "version": "2.0.16"
    },
    "modules": []
}

Nevertheless, I'd like to store the dependencies in the build info. My conan packages are stored on the same Artifactory (if that's a requirement) but just having the names for now would also be fine.

Is what I want even possible?
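One workaround sketch (not an official feature of the extension): extract the dependency references directly from the conan install --format=json graph and assemble the modules list yourself. The graph shape below is an assumption based on the documented JSON output:

```python
import json

def dependency_refs(graph_json):
    """Collect the package references of all dependency nodes,
    skipping the consumer root node ("0")."""
    nodes = graph_json["graph"]["nodes"]
    return [node["ref"] for node_id, node in nodes.items()
            if node_id != "0" and node.get("ref")]

# Example with the kind of data `conan install . --format=json` emits
# (refs are made up):
sample = {"graph": {"nodes": {
    "0": {"ref": "conanfile", "id": "0"},
    "1": {"ref": "fmt/10.1.1#abc123", "id": "1"},
}}}
print(dependency_refs(sample))  # → ['fmt/10.1.1#abc123']
```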

[question] python_requires and build infos

What is your question?

Hi!
The dependency graph of a python_requires package does not include any modules. There is only node "0".

{
    "graph": {
        "nodes": {
            "0": {
                "ref": "conanfile",
                "id": "0",
                "recipe": "Cli",
                ...
            }
        },
        "root": {
            "0": "None"
        },
        "overrides": {},
        "resolved_ranges": {}
    }
}

This results in a build info (created with "conan art:build-info create") not containing any modules.

When uploading the build info to Artifactory, the build is not linked to the package files in the repository and it is not possible to promote such a build.

How can I promote builds of python_requires packages in Artifactory?

Have you read the CONTRIBUTING guide?

  • I've read the CONTRIBUTING guide

SBOM semantics and missing properties

We currently use a custom generator to build SBOMs but are looking into switching to this extension.

I noticed that several fields are missing (e.g., the cpe, the license texts, etc.) and some have different semantics. For example, as far as I understand, the "author" of a component (or, in newer SBOM spec versions, "authors"; maybe also "manufacturer"?) should be the person/organization who wrote the source code, created the model, etc., while in conan the "author" is the author of the recipe. Maybe the conan recipe author would fit the "supplier" field instead. The author of the BOM itself, however, should always be Conan (and not the author of the recipe).

I have to admit that I am still unsure about the difference between author and manufacturer (especially since "author" is for "manual" processes and "manufacturer" for automated processes...), but that's maybe not that important right now.

  • Are there any plans to streamline this (and maybe support the specs 1.5 and/or 1.6)?
  • Is there a good way within conanfiles, outside of conan_data, to specify other relevant fields? (That's what we do right now: we use conan_data with an sbom hash that contains copyright, cpe, and a couple of other entries we need; plus, we always extract the license texts into a LICENSE file next to the recipe so we can include that in our SBOM generation as well.)
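For illustration, a sketch of the workaround described above: keep the extra fields under an sbom key in conandata.yml and overlay them onto the generated CycloneDX component dict. The sbom key and the field selection are our own convention, not part of conan or this extension:

```python
def enrich_component(component, conan_data):
    """Overlay extra SBOM fields (cpe, copyright, ...) stored in the
    recipe's conan_data onto a CycloneDX component dict."""
    extra = (conan_data or {}).get("sbom", {})
    for field in ("cpe", "copyright", "supplier"):
        if field in extra:
            component[field] = extra[field]
    return component

# Example data mirroring a conandata.yml with an "sbom" section:
component = {"name": "zlib", "version": "1.3"}
conan_data = {"sbom": {
    "cpe": "cpe:2.3:a:zlib:zlib:1.3:*:*:*:*:*:*:*",
    "copyright": "(C) 1995-2023 Jean-loup Gailly and Mark Adler"}}
enrich_component(component, conan_data)
```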

hook_copy_pdbs_to_package.py fails for ffmpeg on Windows MSVC with MSYS

I assume it is failing because conanfile.run() executes in the MSYS environment (i.e., the environment ffmpeg is building inside).

This is what it printed out:

INSTALL m/cocache4/cache/b/ffmpebf64c855994b7/b/src/libavutil/tea.h
INSTALL m/cocache4/cache/b/ffmpebf64c855994b7/b/src/libavutil/tx.h
INSTALL m/cocache4/cache/b/ffmpebf64c855994b7/b/src/libavutil/film_grain_params.h
INSTALL m/cocache4/cache/b/ffmpebf64c855994b7/b/src/libavutil/video_hint.h
INSTALL libavutil/avconfig.h
INSTALL libavutil/ffversion.h

ffmpeg/6.1: [HOOK - hook_copy_pdbs_to_package.py] post_package(): PDBs post package hook running
ffmpeg/6.1: [HOOK - hook_copy_pdbs_to_package.py] post_package(): RUN: "%ProgramFiles(x86)%\Microsoft Visual Studio\Installer\vswhere.exe" -find "**\dumpbin.exe" -format json
mkdir: cannot create directory ‘/dev/shm’: Read-only file system
mkdir: cannot create directory ‘/dev/mqueue’: Read-only file system
/usr/bin/bash: line 1: fg: no job control

ERROR: [HOOK - hook_copy_pdbs_to_package.py] post_package(): Failed to locate dumpbin.exe which is needed to locate the PDBs and copy them to package folder.

[question] Package lists, promotion and virtual remotes

What is your question?

I'm trying to implement package promotion using package lists, but I'm having trouble with using a virtual remote as the source remote.

I create the package list like this:

conan install --require=top_package/version@company -r conan-virtual -f json > create_dev.json
conan list --graph=create_dev.json --format=json > pkglist_dev.json

My extension uses the REST API, with URL's such as:

 /artifactory/api/copy/conan-virtual/company/package/2.0.0/_/a9e11b8366f22f0e50ac3dcff74da3c4/export?to=/conan-prod/company/package/2.0.0/_/a9e11b8366f22f0e50ac3dcff74da3c4/export&suppressLayouts=1&dry=1

This leads to a 409 error. If the source remote is not virtual, there is no problem.

This is a fairly severe limitation, and I need some help dealing with this. Either of the following would be an acceptable solution to me:

  1. A way to determine which non-virtual remote the package is actually in.
    or, if necessary:
  2. Check if a remote is virtual, so I can give a sensible error message
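For option 2, Artifactory's repository configuration endpoint (GET /artifactory/api/repositories/<repo_key>) returns an rclass field and, for virtual repos, the list of aggregated repositories. A sketch of the pure check (the HTTP call itself is omitted, and the endpoint behaviour should be verified against your Artifactory version):

```python
def is_virtual(repo_config):
    """repo_config is the JSON dict returned by
    GET /artifactory/api/repositories/<repo_key>."""
    return repo_config.get("rclass") == "virtual"

def member_repositories(repo_config):
    """For a virtual repo, the aggregated member repositories
    (candidates for option 1: locating the real source repo)."""
    if not is_virtual(repo_config):
        return []
    return repo_config.get("repositories", [])
```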

Have you read the CONTRIBUTING guide?

  • I've read the CONTRIBUTING guide

jfrog xray report generation on consumed packages

What is your question?

I have created a conan package from examples2/tutorial/creating_packages/add_requires. When I upload the package to a conan-type repository on the JFrog server, the Xray report does not show any security/vulnerability issues for the consumed package (fmt in this case). But an Xray scan does report issues if I push the fmt package individually. Is the Xray report not generated for consumed packages or libs? If it is, what am I missing?

conan_pkg_info fmt hello_pkg

Have you read the CONTRIBUTING guide?

  • I've read the CONTRIBUTING guide

SBOM XML namespace cannot be parsed by dependencytrack

Uploading an SBOM in XML 1.4 format results in parsing errors in dependencytrack. The namespace ns0 cannot be parsed properly.

The image shows two SBOMs in XML format. The one created by this extension uses xmlns:ns0 and the parser crashes:

2023-11-06 07:46:03,118 WARN [BomUploadProcessingTask] The BOM uploaded is not in a supported format. Supported formats include CycloneDX XML and JSON

The other example, created with another tool, does not use the namespace prefix, and dependencytrack handles it fine.

Can we deactivate the ns0?
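The ns0 prefix comes from Python's xml.etree.ElementTree serializer; registering an empty default namespace before serializing removes it. A minimal reproduction of the fix (the CycloneDX 1.4 namespace URI is the real one; the element content is made up):

```python
import xml.etree.ElementTree as ET

NS = "http://cyclonedx.org/schema/bom/1.4"

# Without this call, ElementTree serializes the namespace as xmlns:ns0
# and prefixes every tag with ns0:, which dependencytrack rejects.
ET.register_namespace("", NS)

bom = ET.Element(f"{{{NS}}}bom", version="1")
xml = ET.tostring(bom, encoding="unicode")
print(xml)
```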

[question] How to package an application with its dependencies libraries?

What is your question?

Hello,

This is to get more clarification regarding conan-io/conan#14067

I ran your code @RubenRBS:

from conan.tools.files import copy

def deploy(graph, output_folder, **kwargs):
    for node in graph.nodes:
        libdirs = node.conanfile.cpp_info.libdirs
        if libdirs:
            for libdir in libdirs: 
                copy(node.conanfile, "*.so*", src=libdir, dst="your/final/path")

Two things had to be fixed. The first was the import (files instead of file). The second was changing the copy pattern from *.so to *.so* to prevent some links from being broken, as with the boost libraries.
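To see why the pattern change matters, a standalone sketch with fnmatch (roughly the glob semantics conan's copy uses for matching; the file names are examples):

```python
from fnmatch import fnmatch

files = ["libboost_system.so", "libboost_system.so.1.82.0", "libfoo.a"]

# "*.so" misses versioned shared objects (often the real files that
# the unversioned symlinks point to):
assert [f for f in files if fnmatch(f, "*.so")] == ["libboost_system.so"]

# "*.so*" picks up both the symlink name and the versioned file:
assert [f for f in files if fnmatch(f, "*.so*")] == [
    "libboost_system.so", "libboost_system.so.1.82.0"]
```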

I have a few questions:

  • Is it possible to run a cmake install at the beginning of the deploy() method? I already do so inside my package() method:
def package(self):
	cmake = CMake(self)
	cmake.install()

but I do not know if I can call it directly there.

  • Is it possible to remove some of the unrequired libs? Taking the example of boost again, one does not necessarily need all the libs, and shipping them only makes the deployment folder bigger.

  • Is it possible to add system libs? I need to add python to my folder, but since the recipe of cpython is not Conan 2.x ready just yet on conancenter, I am forced to link against the system Python in my CMakeLists.txt.

All of these questions serve one purpose: I want to run a single conan command to package (the build having been done beforehand) and create a zip file of my project and all its dependencies. I then want to distribute it to clients (they will have the proper environment to run it, but won't have conan installed). Do you think you could give me a hint?

Thanks in advance.

Have you read the CONTRIBUTING guide?

  • I've read the CONTRIBUTING guide

[bug] SBOM generation with sbom:cyclonedx using "--requires" argument produces broken sbom

I am trying to generate SBOM for conan packages that are used in a project using the sbom:cyclonedx conan extension as specified in the docs.

In my case it is necessary to use the "--requires" option to provide a reference to the recipe for which I want to create the SBOM. When doing so, the generated SBOM looks broken: an unknown component is introduced in the output.
If the SBOM is generated by passing a path to the conan recipe of the project, then the SBOM looks fine.

The same can be seen in the output provided in the README file for the extension.

The problem seems to be that the extension sets the metadata component from the dependency graph root, which, when "--requires" is used, is always "cli" (I don't know exactly what that means). Since "cli" is not a valid reference or package, I always get an UNKNOWN component in the produced SBOM.
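A sketch of the kind of guard that would avoid this (the node layout mirrors the conan graph JSON; treating the "cli"/"conanfile" ref as a synthetic root is an assumption about the extension's input, not its actual code):

```python
def metadata_component_ref(nodes):
    """Pick the ref for the SBOM metadata component, skipping the
    synthetic "cli"/"conanfile" root conan emits for --requires."""
    root = nodes.get("0", {})
    ref = root.get("ref")
    if ref in (None, "cli", "conanfile"):
        # Fall back to the first real dependency node, if any.
        for node_id, node in nodes.items():
            if node_id != "0" and node.get("ref"):
                return node["ref"]
        return None
    return ref

# With --requires the root is "cli", so the real package is used:
metadata_component_ref({"0": {"ref": "cli"}, "1": {"ref": "zlib/1.3"}})
```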

Using CPack deployer

I'm developing a CMake Qt application using 11 conan dependencies and I would like to use CPack for deployment.

I read here that it should be a conan extension and I wondered how this would work. Shall I write my own deployer?

I currently call cpack from the build method.

I read here that @gmeeker has a quite similar setup, so maybe you could share your thoughts.

Certificate verification is failing when using conan art:server add

I have installed the conan extensions and tried to use the following command to add the JFROG server.
conan art:server add artifactory <server_name> --user=<user_name> --password=
It generates an error: SSL: CERTIFICATE_VERIFY_FAILED
I tried to resolve it in two ways.

  1. I tried to pass ssl-verify False to the above command, but could not make it work. Can you mention the proper command with SSL verification disabled?
  2. I fetched the base64-encoded ASCII single certificate for the JFrog server and added its content to the cacert.pem file. Then I used the following command to install the cacert, but the issue persists. Can you guide me?
    conan config install cacert.pem

art:build-info create skips packages with binary: "Download"

I am trying to use art:build-info create as part of a GitHub action.
Since the job runs on a clean node every time, when I run conan install a lot of packages end up having binary: "Download" in the generated intermediate .json file.

Then, when I run art:build-info create on that file, because of this line, those packages are not included in the output.

Is that intentional?
I was able to work around this by running conan install twice, which caused the second run to have binary: "Cache", and then using the --add-cached-deps option for art:build-info create. However, I believe there should be an option to include downloaded packages, like the one that exists for cached ones.
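A sketch of the relaxed filter being asked for (the binary values mirror what conan puts in the graph JSON; the option name is hypothetical):

```python
INCLUDED_BINARIES = {"Build", "Cache", "Download"}  # "Download" added

def nodes_for_build_info(nodes, include_downloaded=True):
    """Filter graph nodes that should contribute modules to the
    build info, optionally keeping freshly downloaded binaries."""
    allowed = set(INCLUDED_BINARIES)
    if not include_downloaded:
        allowed.discard("Download")
    return [n for n in nodes if n.get("binary") in allowed]

sample = [{"ref": "a/1.0", "binary": "Cache"},
          {"ref": "b/1.0", "binary": "Download"},
          {"ref": "c/1.0", "binary": "Skip"}]
```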

Fix dependencies IDs in build-info

The id should be

{
      "sha1": "aa299733a58ceb75834cc8aa294dc8de790b02a2",
      "md5": "eef32b59e65106d8c1834f5226bd3c09",
     "id": "liba/1.0@user/channel#0b1a78f5925f62480ed2d97002e926fa :: conan_sources.tgz"
},
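A sketch of composing that id from its parts (the " :: " separator is taken from the JSON above; the helper name is made up):

```python
def dependency_id(ref, rrev, artifact):
    """Build the Artifactory build-info dependency id:
    <ref>#<recipe_revision> :: <artifact_file_name>"""
    return f"{ref}#{rrev} :: {artifact}"

dep_id = dependency_id("liba/1.0@user/channel",
                       "0b1a78f5925f62480ed2d97002e926fa",
                       "conan_sources.tgz")
```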
