
CK repository with components and automation actions to enable portable workflows across diverse platforms including Linux, Windows, MacOS and Android. It includes software detection plugins and meta packages (code, data sets, models, scripts, etc) with the possibility of multiple versions to co-exist in a user or system environment.

Home Page: https://github.com/mlcommons/ck

License: BSD 3-Clause "New" or "Revised" License


ck-env's Introduction

Note that this repository is outdated: we are now using the next generation of the MLCommons CK workflow automation meta-framework (Collective Mind, aka CM), developed by the open working group. Feel free to join this community effort to learn how to modularize ML systems and automate their benchmarking, optimization and deployment in the real world!

Fighting the software and hardware chaos

All CK components for AI and ML are now collected in one repository!

This project is hosted by the cTuning foundation (non-profit R&D organization).


This is a Collective Knowledge repository with functionality to support portable, customizable, reusable and automated workflows. It lets users automatically detect the target platform and the software, data and models already installed on it (via CK software detection plugins), and install missing packages in a unified way. Multiple versions of code, data and models can co-exist in a user or system environment, similar to Python virtualenv.

Further info:

Author

Contributors

Shared CK modules with actions

Installation

First install the CK framework as described here.

Then install this CK repository as follows:

 $ ck pull repo:ck-env

 $ ck list soft
 $ ck list package

Usage

You can detect all installed instances of GCC and LLVM and register them in CK as follows:

 $ ck detect soft:compiler.gcc
 $ ck detect soft:compiler.llvm

You can now see the multiple versions of the detected software registered in CK:

 $ ck show env

You can then compile and run unified CK benchmarks shared by the community using any of the above compiler instances (GCC, LLVM, ICC ...) and their versions simply as follows:

 $ ck pull repo:ck-autotuning
 $ ck pull repo:ctuning-programs

 $ ck compile program:cbench-automotive-susan --speed
 $ ck run program:cbench-automotive-susan

If you have the Android NDK and SDK installed, CK can automatically detect them together with their compiler versions (GCC, LLVM), register them, and let you compile and run benchmarks on Android simply via:

 $ ck compile program:cbench-automotive-susan --speed --target_os=android21-arm-v7a
 $ ck run program:cbench-automotive-susan --target_os=android21-arm-v7a

You can find further details about our customizable and cross-platform package/environment manager here.

Questions and comments

Please feel free to get in touch with the CK community if you have any questions, suggestions or comments!

ck-env's People

Contributors

bellycat77, chunosov, ctuning-admin, dsavenko, ens-lg4, g4v, gfursin, lhartfield-arm, maria-18-git, psyhtest, rhymmor, sztaylor, taskset


ck-env's Issues

Add NDK version to tags

I have several NDKs installed:

android-ndk-r13b/ android-ndk-r17b/ android-ndk-r18b/

and some compilers detected:

anton@velociti:~$ ck show env --tags=compiler,android
Env UID:         Target OS:      Bits: Name:                     Version: Tags:

1c70f43a4910bf0c android28-arm64    64 Android NDK LLVM compiler 7.0.2    64bits,android,compiler,host-os-linux-64,lang-c,lang-cpp,llvm,ndk,target-os-android28-arm64,v7,v7.0,v7.0.2
fbe5a0899e47cda8 android28-arm64    64 Android NDK GCC compiler  clang    64bits,android,compiler,gcc,host-os-linux-64,lang-c,lang-cpp,ndk,target-os-android28-arm64,v0
f58acb20767b3ac2 android24-arm64    64 Android NDK LLVM compiler 6.0.2    64bits,android,compiler,host-os-linux-64,lang-c,lang-cpp,llvm,ndk,target-os-android24-arm64,v6,v6.0,v6.0.2
9a7394e1429c8ed0 android24-arm64    64 Android NDK GCC compiler  4.9.x    64bits,android,compiler,gcc,host-os-linux-64,lang-c,lang-cpp,ndk,target-os-android24-arm64,v4,v4.9,v4.9.0

It's not easy, however, to see which NDKs the compilers come from:

$ ck cat env --tags=compiler,android | grep CK_ENV_COMPILER.*BIN | grep -v PATH
export CK_ENV_COMPILER_GCC_BIN=/home/anton/data/android-ndk-r17b/toolchains/aarch64-linux-android-4.9/prebuilt/linux-x86_64/bin
export CK_ENV_COMPILER_LLVM_BIN=/home/anton/data/android-ndk-r17b/toolchains/llvm/prebuilt/linux-x86_64/bin
export CK_ENV_COMPILER_GCC_BIN=/home/anton/data/android-ndk-r18b/toolchains/aarch64-linux-android-4.9/prebuilt/linux-x86_64/bin
export CK_ENV_COMPILER_LLVM_BIN=/home/anton/data/android-ndk-r18b/toolchains/llvm/prebuilt/linux-x86_64/bin

I guess that LLVM 6.0.2 and GCC 4.9.x come from NDK r17b, while LLVM 7.0.2 (and a weird "GCC clang") comes from NDK r18b, but it would be better to include the NDK version in the tags.
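Since the NDK release is visible in the compiler's BIN path, a tag could be derived from it. A minimal sketch (the helper name and regex are hypothetical, assuming the usual android-ndk-&lt;release&gt; directory naming seen in the paths above):

```python
import re

def ndk_version_from_path(bin_path):
    """Extract the NDK release (e.g. 'r17b') from a toolchain path.

    Hypothetical helper: assumes the NDK lives in a directory named
    'android-ndk-<release>', as in the exported BIN paths above.
    """
    m = re.search(r'android-ndk-(r\d+[a-z]?)', bin_path)
    return m.group(1) if m else None

path = '/home/anton/data/android-ndk-r17b/toolchains/llvm/prebuilt/linux-x86_64/bin'
ver = ndk_version_from_path(path)
print(ver)  # r17b
```

The returned string could then be appended to the env entry's tags (e.g. ndk-r17b) at registration time.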

parse_version() is called twice (instead of once)

When detecting soft entries, their version-detection code seems to run twice with the same input parameters (and thus the same results). Here is the output of one full detection run:

$ ck detect soft --tags=lib,pyquil

  Searching for Rigetti pyQuil library (pyquil/__init__.py) to automatically register in the CK - it may take some time, please wait ...

    * Searching in /usr ...
    * Searching in /opt ...
    * Searching in /usr/local/Cellar ...
    * Searching in /Users/lg4 ...
    * Searching in /Users/lg4/Library/Python ...

  Search completed in 57.9 secs. Found 1 target files (may be pruned) ...

  Detecting and sorting versions (ignore some work output) ...


  Prepared CMD to detect version: PYTHONPATH=/Users/lg4/Library/Python/3.6/lib/python/site-packages python3 -c 'import pyquil ; print(pyquil.__version__)' >/var/folders/26/xh863tq54n59h5rnrk3c7mtw0000gn/T/B7mJmI ...

Executing "bash -c "chmod 755 /var/folders/26/xh863tq54n59h5rnrk3c7mtw0000gn/T/tmp-GTVbSa.sh; . /var/folders/26/xh863tq54n59h5rnrk3c7mtw0000gn/T/tmp-GTVbSa.sh"" ...

Version detected: 1.9.0
    * /Users/lg4/Library/Python/3.6/lib/python/site-packages/pyquil/__init__.py   (Version 1.9.0)


  Registering in the CK (/Users/lg4/Library/Python/3.6/lib/python/site-packages/pyquil/__init__.py) ...

  Software entry found: lib.pyquil (7a688556e84d7c47)

  Attempting to detect version automatically (if supported) ...

  Prepared CMD to detect version: PYTHONPATH=/Users/lg4/Library/Python/3.6/lib/python/site-packages python3 -c 'import pyquil ; print(pyquil.__version__)' >/var/folders/26/xh863tq54n59h5rnrk3c7mtw0000gn/T/07h7jr ...

Executing "bash -c "chmod 755 /var/folders/26/xh863tq54n59h5rnrk3c7mtw0000gn/T/tmp-oyMCqQ.sh; . /var/folders/26/xh863tq54n59h5rnrk3c7mtw0000gn/T/tmp-oyMCqQ.sh"" ...

Version detected: 1.9.0

  Detected version: 1.9.0

Searching if environment already exists using:
  * Tags: lib,quantum,rigetti,pyquil,host-os-macos-64,target-os-macos-64,64bits,v1,v1.9,v1.9.0

    Environment with above tags is not yet registered in CK ...

Environment entry added (58b160288a164bc2)!
  Successfully registered with UID: 58b160288a164bc2
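A sketch of memoizing version detection per prepared command, so the second invocation with identical parameters is served from a cache (the cache and the `run` callback are hypothetical, not CK's internal API):

```python
# Cache keyed by the prepared detection command, assuming the CMD string
# uniquely identifies the probe (a sketch, not CK's internals).
_version_cache = {}

def detect_version(cmd, run):
    if cmd not in _version_cache:
        _version_cache[cmd] = run(cmd)   # expensive: spawns a shell
    return _version_cache[cmd]

calls = []
def fake_probe(cmd):
    calls.append(cmd)
    return '1.9.0'

v1 = detect_version('probe-pyquil', fake_probe)
v2 = detect_version('probe-pyquil', fake_probe)   # served from the cache
print(v1, v2, len(calls))  # 1.9.0 1.9.0 1
```

With such a cache, the "Attempting to detect version automatically" step above would reuse the result of the first probe instead of spawning a second shell.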

Create package:lib-glog-0.3.5

At the moment we only have package:lib-glog-trunk. The latest release is 0.3.5. While the GLog project is moving slowly (the previous 0.3.4 release happened over two years ago), we should still have a stable package.

print all symlinks when detecting soft

When there are symlinks, we currently show the real path. It may be useful to show or even select symlinks, but it's not clear what the best way is. I noticed this with colleagues from LLNL. We should think about it. Not urgent!
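One low-effort option is to report both forms, so the user sees the symlink they actually have on PATH as well as the resolved target. A sketch only; the real soft-detection plugins do not currently expose this:

```python
import os, tempfile

def describe_path(p):
    """Return (given_path, real_path, is_symlink) so a detection plugin
    could show both instead of silently resolving (hypothetical helper)."""
    return (p, os.path.realpath(p), os.path.islink(p))

# demo with a throwaway symlink
d = tempfile.mkdtemp()
target = os.path.join(d, 'gcc-8')
open(target, 'w').close()
link = os.path.join(d, 'gcc')
os.symlink(target, link)
given, real, is_link = describe_path(link)
print(is_link, os.path.basename(real))  # True gcc-8
```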

create target device description

I started working on a module 'device' that should be used to describe available target platforms where we can remotely run workloads (and possibly compile programs, if supported). For now, the device description will be added manually, including target_os and target_device_id (for Android devices).

Importantly, we should also agree on how to describe remote access (i.e. whether it's via ADB, SSH, or even CK for Windows-based devices).

Then we will need to update the 'program' module from ck-autotuning to take '--target={device UOA}' as an input and prepare all the necessary steps to compile and run a program for this device ...

I expect we will need to discuss implementation details more ...

Bazel compile fails on x86 Ubuntu 18.04

Running ck install package:lib-tensorflow-1.10.1-src-cuda-xla inside the ctuning/ck-ubuntu-18.04 docker container fails on Bazel compile with:

> ** PATCH **
> ** COMPILE **
> Building Bazel from scratch......
> Building Bazel with Bazel.
> .WARNING: An illegal reflective access operation has occurred
> WARNING: Illegal reflective access by com.google.protobuf.UnsafeUtil (file:/tmp/bazel_KjG9DrmJ/archive/libblaze.jar) to field java.nio.Buffer.address
> WARNING: Please consider reporting this to the maintainers of com.google.protobuf.UnsafeUtil
> WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
> WARNING: All illegal access operations will be denied in a future release
> WARNING: --batch mode is deprecated. Please instead explicitly shut down your Bazel server using the command "bazel shutdown".
> INFO: Analysed target //src:bazel (213 packages loaded).
> INFO: Found 1 target...
> INFO: From Compiling src/main/cpp/blaze_util_posix.cc:
> src/main/cpp/blaze_util_posix.cc: In function 'void blaze::Daemonize(const char*, bool)':
> src/main/cpp/blaze_util_posix.cc:217:3: warning: ignoring return value of 'int dup(int)', declared with attribute warn_unused_result [-Wunused-result]
>    (void) dup(STDOUT_FILENO);  // stderr (2>&1)
>    ^~~~~~~~~~~~~~~~~~~~~~~~~
> src/main/cpp/blaze_util_posix.cc: In function 'void blaze::DieAfterFork(const char*)':
> src/main/cpp/blaze_util_posix.cc:264:8: warning: ignoring return value of 'ssize_t write(int, const void*, size_t)', declared with attribute warn_unused_result [-Wunused-result]
>    write(STDERR_FILENO, message, strlen(message));  // strlen should be OK
>    ~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> src/main/cpp/blaze_util_posix.cc:265:8: warning: ignoring return value of 'ssize_t write(int, const void*, size_t)', declared with attribute warn_unused_result [-Wunused-result]
>    write(STDERR_FILENO, ": ", 2);
>    ~~~~~^~~~~~~~~~~~~~~~~~~~~~~~
> src/main/cpp/blaze_util_posix.cc:266:8: warning: ignoring return value of 'ssize_t write(int, const void*, size_t)', declared with attribute warn_unused_result [-Wunused-result]
>    write(STDERR_FILENO, error_string, strlen(error_string));
>    ~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> src/main/cpp/blaze_util_posix.cc:267:8: warning: ignoring return value of 'ssize_t write(int, const void*, size_t)', declared with attribute warn_unused_result [-Wunused-result]
>    write(STDERR_FILENO, "\n", 1);
>    ~~~~~^~~~~~~~~~~~~~~~~~~~~~~~
> src/main/cpp/blaze_util_posix.cc: In function 'uint64_t blaze::AcquireLock(const string&, bool, bool, blaze::BlazeLock*)':
> src/main/cpp/blaze_util_posix.cc:629:3: warning: ignoring return value of 'int ftruncate(int, __off_t)', declared with attribute warn_unused_result [-Wunused-result]
>    (void) ftruncate(lockfd, 0);
>    ^~~~~~~~~~~~~~~~~~~~~~~~~~~
> INFO: From JavacBootstrap src/main/java/com/google/devtools/build/lib/shell/libshell-skylark.jar [for host]:
> warning: [options] bootstrap class path not set in conjunction with -source 8
> warning: Implicitly compiled files were not subject to annotation processing.
>   Use -proc:none to disable annotation processing or -implicit to specify a policy for implicit compilation.
> 2 warnings
> INFO: From JavacBootstrap src/java_tools/singlejar/java/com/google/devtools/build/singlejar/libbootstrap.jar [for host]:
> warning: [options] bootstrap class path not set in conjunction with -source 8
> warning: Implicitly compiled files were not subject to annotation processing.
>   Use -proc:none to disable annotation processing or -implicit to specify a policy for implicit compilation.
> 2 warnings
> INFO: From JavacBootstrap src/java_tools/buildjar/java/com/google/devtools/build/buildjar/libskylark-deps.jar [for host]:
> warning: [options] bootstrap class path not set in conjunction with -source 8
> warning: Implicitly compiled files were not subject to annotation processing.
>   Use -proc:none to disable annotation processing or -implicit to specify a policy for implicit compilation.
> Note: Some input files use or override a deprecated API.
> Note: Recompile with -Xlint:deprecation for details.
> 2 warnings
> INFO: From JavacBootstrap src/java_tools/buildjar/java/com/google/devtools/build/buildjar/jarhelper/libbootstrap_jarhelper.jar [for host]:
> warning: [options] bootstrap class path not set in conjunction with -source 8
> 1 warning
> INFO: From JavacBootstrap src/java_tools/buildjar/java/com/google/devtools/build/buildjar/libbootstrap_VanillaJavaBuilder.jar [for host]:
> warning: [options] bootstrap class path not set in conjunction with -source 8
> warning: Implicitly compiled files were not subject to annotation processing.
>   Use -proc:none to disable annotation processing or -implicit to specify a policy for implicit compilation.
> Note: src/java_tools/buildjar/java/com/google/devtools/build/buildjar/VanillaJavaBuilder.java uses or overrides a deprecated API.
> Note: Recompile with -Xlint:deprecation for details.
> 2 warnings
> INFO: From Generating Java (Immutable) proto_library @googleapis//:google_devtools_build_v1_build_events_proto:
> google/devtools/build/v1/build_events.proto: warning: Import google/rpc/status.proto but not used.
> INFO: From Generating Java (Immutable) proto_library @googleapis//:google_watch_v1_proto:
> google/watcher/v1/watch.proto: warning: Import google/protobuf/empty.proto but not used.
> INFO: From Generating Java (Immutable) proto_library @googleapis//:google_bytestream_bytestream_proto:
> google/bytestream/bytestream.proto: warning: Import google/protobuf/wrappers.proto but not used.
> INFO: From SkylarkAction external/googleapis/google_bytestream_bytestream_java_grpc_srcs.jar:
> google/bytestream/bytestream.proto: warning: Import google/protobuf/wrappers.proto but not used.
> INFO: From SkylarkAction external/googleapis/google_watch_v1_java_grpc_srcs.jar:
> google/watcher/v1/watch.proto: warning: Import google/protobuf/empty.proto but not used.
> ERROR: /CK-TOOLS/tool-bazel-0.15.2-linux-64/src/src/main/java/com/google/devtools/build/lib/BUILD:1336:1: Building src/main/java/com/google/devtools/build/lib/libbazel-class.jar () failed: Worker process returned an unparseable WorkResponse!
> 
> Did you try to print something to stdout? Workers aren't allowed to do this, as it breaks the protocol between Bazel and the worker process.
> 
> ---8<---8<--- Exception details ---8<---8<---
> com.google.protobuf.InvalidProtocolBufferException$InvalidWireTypeException: Protocol message tag had invalid wire type.
> 	at com.google.protobuf.InvalidProtocolBufferException.invalidWireType(InvalidProtocolBufferException.java:115)
> 	at com.google.protobuf.CodedInputStream$StreamDecoder.skipField(CodedInputStream.java:2100)
> 	at com.google.protobuf.GeneratedMessageV3.parseUnknownFieldProto3(GeneratedMessageV3.java:303)
> 	at com.google.devtools.build.lib.worker.WorkerProtocol$WorkResponse.<init>(WorkerProtocol.java:1866)
> 	at com.google.devtools.build.lib.worker.WorkerProtocol$WorkResponse.<init>(WorkerProtocol.java:1830)
> 	at com.google.devtools.build.lib.worker.WorkerProtocol$WorkResponse$1.parsePartialFrom(WorkerProtocol.java:2420)
> 	at com.google.devtools.build.lib.worker.WorkerProtocol$WorkResponse$1.parsePartialFrom(WorkerProtocol.java:2415)
> 	at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:221)
> 	at com.google.protobuf.AbstractParser.parsePartialDelimitedFrom(AbstractParser.java:262)
> 	at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:275)
> 	at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:280)
> 	at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:49)
> 	at com.google.protobuf.GeneratedMessageV3.parseDelimitedWithIOException(GeneratedMessageV3.java:347)
> 	at com.google.devtools.build.lib.worker.WorkerProtocol$WorkResponse.parseDelimitedFrom(WorkerProtocol.java:2082)
> 	at com.google.devtools.build.lib.worker.WorkerSpawnRunner.execInWorker(WorkerSpawnRunner.java:313)
> 	at com.google.devtools.build.lib.worker.WorkerSpawnRunner.actuallyExec(WorkerSpawnRunner.java:154)
> 	at com.google.devtools.build.lib.worker.WorkerSpawnRunner.exec(WorkerSpawnRunner.java:112)
> 	at com.google.devtools.build.lib.exec.AbstractSpawnStrategy.exec(AbstractSpawnStrategy.java:95)
> 	at com.google.devtools.build.lib.exec.AbstractSpawnStrategy.exec(AbstractSpawnStrategy.java:63)
> 	at com.google.devtools.build.lib.exec.SpawnActionContextMaps$ProxySpawnActionContext.exec(SpawnActionContextMaps.java:362)
> 	at com.google.devtools.build.lib.analysis.actions.SpawnAction.internalExecute(SpawnAction.java:287)
> 	at com.google.devtools.build.lib.analysis.actions.SpawnAction.execute(SpawnAction.java:294)
> 	at com.google.devtools.build.lib.skyframe.SkyframeActionExecutor.executeActionTask(SkyframeActionExecutor.java:960)
> 	at com.google.devtools.build.lib.skyframe.SkyframeActionExecutor.prepareScheduleExecuteAndCompleteAction(SkyframeActionExecutor.java:891)
> 	at com.google.devtools.build.lib.skyframe.SkyframeActionExecutor.access$900(SkyframeActionExecutor.java:115)
> 	at com.google.devtools.build.lib.skyframe.SkyframeActionExecutor$ActionRunner.call(SkyframeActionExecutor.java:746)
> 	at com.google.devtools.build.lib.skyframe.SkyframeActionExecutor$ActionRunner.call(SkyframeActionExecutor.java:700)
> 	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
> 	at com.google.devtools.build.lib.skyframe.SkyframeActionExecutor.executeAction(SkyframeActionExecutor.java:442)
> 	at com.google.devtools.build.lib.skyframe.ActionExecutionFunction.checkCacheAndExecuteIfNeeded(ActionExecutionFunction.java:503)
> 	at com.google.devtools.build.lib.skyframe.ActionExecutionFunction.compute(ActionExecutionFunction.java:224)
> 	at com.google.devtools.build.skyframe.AbstractParallelEvaluator$Evaluate.run(AbstractParallelEvaluator.java:382)
> 	at com.google.devtools.build.lib.concurrent.AbstractQueueVisitor$WrappedRunnable.run(AbstractQueueVisitor.java:355)
> 	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
> 	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
> 	at java.base/java.lang.Thread.run(Thread.java:834)
> ---8<---8<--- End of exception details ---8<---8<---
> 
> ---8<---8<--- Start of log ---8<---8<---
> -Xbootclasspath/p is no longer a supported option.
> ion.
> ---8<---8<--- End of log ---8<---8<---
> Target //src:bazel failed to build
> INFO: Elapsed time: 175.594s, Critical Path: 43.26s
> INFO: 1234 processes: 1234 local.
> FAILED: Build did NOT complete successfully
> 
> ERROR: Could not build Bazel

avoid downloading packages again during installation

We should provide an option to avoid downloading packages again during package reinstallation.

For now we can do it via an environment key such as SKIP_DOWNLOAD that can be set as "ck install package:... -DSKIP_DOWNLOAD=yes".

Then install.sh or install.bat can detect this option and skip the download ...

In the future, we should add a smart MD5SUM check to make sure that the package didn't change ...
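The MD5 check could look something like this (a hypothetical helper sketching the idea; CK's real install scripts may organise this differently):

```python
import hashlib, os, tempfile

def needs_download(path, expected_md5):
    """Return True when the cached archive is missing or its MD5 differs,
    i.e. when the download should not be skipped (hypothetical helper)."""
    if not os.path.exists(path):
        return True
    h = hashlib.md5()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(65536), b''):
            h.update(chunk)
    return h.hexdigest() != expected_md5

# demo with a throwaway 'archive'
fd, path = tempfile.mkstemp()
os.write(fd, b'package payload')
os.close(fd)
good = hashlib.md5(b'package payload').hexdigest()
cached_ok = not needs_download(path, good)   # matching checksum: skip download
changed = needs_download(path, '0' * 32)     # mismatch: re-download
print(cached_ok, changed)  # True True
```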

Target_device_id arg check fails when there's multiple devices connected to host machine

Hi there,

We found a bug with the setup function in ck-env/module/soft/module.py line 351.

If you have multiple devices connected to a host machine and run the following command line:
$ ck install package:xxxxxxxxxxxxxxxxxxxxxxx --target_os=android24-arm64 --target_device_id=XXXXXXXXX

the --target_device_id argument isn't recognised and the following error pops up, asking us to set the device_id:

CK error: [package] more than one remote device - specify via --device_id!

When we changed line 351 from --target_device_id to device_id and reran the first command, it completed successfully.
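A minimal sketch of the fix: accept the user-facing option name and map it to the key expected downstream (the two key names come from the report; the surrounding module API is hypothetical):

```python
def normalize_target_args(i):
    """Map the user-facing --target_device_id option to the internal
    device_id key expected by the package module (sketch of the fix
    described above, not the actual module.py code)."""
    if 'target_device_id' in i and 'device_id' not in i:
        i['device_id'] = i['target_device_id']
    return i

args = normalize_target_args({'target_device_id': 'XXXXXXXXX'})
print(args['device_id'])
```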

Load env

Is there some command to load the environment (i.e., add the LD_LIBRARY_PATHs of the dependencies) for a specific package/program? It could be useful when debugging a program.

Thanks J

UnboundLocalError: local variable 'c' referenced before assignment

1, browse http://cknowledge.org/repo/web.php?template=ck-ai-basic&action=index&module_uoa=wfe&native_action=show&native_module_uoa=advice

2, select "use machine learning to predict compiler flags"

3, input Program source code like this:
/* Sample C kernel to demonstrate MILEPOST optimization prediction */

#include <stdio.h>

void test()
{
  int i, j;

  j = 0;
  for (i = 0; i < 1000; i++)
    j += i;
}

int main(void)
{
  int i, j;
  float x[100];

  for (j = 0; j < 16; j++)
    for (i = 0; i < 16; i++)
      x[i+j] = i*j*1.2;

  printf("x[5]=%f", x[5]);

  return 0;
}

4, then click "Extract features". We will find this error:

Traceback (most recent call last):
File "/home/fursin/fggwork/ck/ck/kernel.py", line 9611, in
r=access(sys.argv[1:])
File "/home/fursin/fggwork/ck/ck/kernel.py", line 9567, in access
rr=perform_action(i)
File "/home/fursin/fggwork/ck/ck/kernel.py", line 3457, in perform_action
return a(i)
File "/home/fursin/CK/ck-web/module/wfe/module.py", line 417, in index
rx=ck.access(ii)
File "/home/fursin/fggwork/ck/ck/kernel.py", line 9567, in access
rr=perform_action(i)
File "/home/fursin/fggwork/ck/ck/kernel.py", line 3457, in perform_action
return a(i)
File "/home/fursin/CK/ck-analytics/module/advice/module.py", line 161, in show
r=ck.access(ii)
File "/home/fursin/fggwork/ck/ck/kernel.py", line 9567, in access
rr=perform_action(i)
File "/home/fursin/fggwork/ck/ck/kernel.py", line 3457, in perform_action
return a(i)
File "/home/fursin/CK/reproduce-milepost-project/module/milepost/module.py", line 623, in ask_ai_web
return show(i)
File "/home/fursin/CK/reproduce-milepost-project/module/milepost/module.py", line 270, in show
'dict':dprog})
File "/home/fursin/fggwork/ck/ck/kernel.py", line 9567, in access
rr=perform_action(i)
File "/home/fursin/fggwork/ck/ck/kernel.py", line 3457, in perform_action
return a(i)
File "/home/fursin/CK/ck-autotuning/module/program/module.py", line 7096, in add
r=ck.access(i)
File "/home/fursin/fggwork/ck/ck/kernel.py", line 9567, in access
rr=perform_action(i)
File "/home/fursin/fggwork/ck/ck/kernel.py", line 3457, in perform_action
return a(i)
File "/home/fursin/CK/ck-env/module/misc/module.py", line 409, in prepare_entry_template
tuoa=c['data_uid']
UnboundLocalError: local variable 'c' referenced before assignment

install python libraries on Fedora

When installing the libraries as per the readme

$ ck install package --tags=lib,python-package,numpy

on the Fedora system, the linking to the build folder is not correct and the libraries are not detected by ck env.

specification of the OS:

$ ck detect platform
***************************************************************************************
Detecting OS and CPU features ...

OS CK UOA:            linux-64 (4258b5fe54828a50)

OS name:              Fedora 28 (Workstation Edition)
Short OS name:        Linux 5.0.9
Long OS name:         Linux-5.0.9-100.fc28.x86_64-x86_64-with-fedora-28-Twenty_Eight
OS bits:              64
OS ABI:               x86_64

Platform init UOA:    42818da3a0789331

output of install is:

*** Installation path used: /home/emanuele/CK-TOOLS/lib-python-numpy-compiler.python-3.6.8-linux-64
  -----------------------------------
  Resolving software dependencies ...

*** Dependency 1 = python (Python interpreter):

    Resolved. CK environment UID = cf74dffbfce54916 (detected version 3.6.8)
  -----------------------------------

Installing to /home/emanuele/CK-TOOLS/lib-python-numpy-compiler.python-3.6.8-linux-64

**************************************************************

Cleanup: removing /home/emanuele/CK-TOOLS/lib-python-numpy-compiler.python-3.6.8-linux-64/python_deps_site
Installing 'numpy' and its dependencies to '/home/emanuele/CK-TOOLS/lib-python-numpy-compiler.python-3.6.8-linux-64/python_deps_site/lib/python3.6/site-packages' ...
Collecting numpy
  Using cached https://files.pythonhosted.org/packages/c1/e2/4db8df8f6cddc98e7d7c537245ef2f4e41a1ed17bf0c3177ab3cc6beac7f/numpy-1.16.3-cp36-cp36m-manylinux1_x86_64.whl
Installing collected packages: numpy
Successfully installed numpy-1.16.3

Setting up environment for installed package ...
  (full path = /home/emanuele/CK-TOOLS/lib-python-numpy-compiler.python-3.6.8-linux-64/build/numpy/__init__.py)

  Software entry found: lib.python.numpy (6a10047b1bcd16fc)

  -----------------------------------
  Resolving software dependencies ...

*** Dependency 1 = python (Python interpreter):

    Resolved. CK environment UID = cf74dffbfce54916 (detected version 3.6.8)
  -----------------------------------
CK error: [package] software not found in a specified path (/home/emanuele/CK-TOOLS/lib-python-numpy-compiler.python-3.6.8-linux-64/build/numpy/__init__.py)!

The problem is the link to the build folder inside the CK-TOOLS folder: the automatically created link is build -> /home/emanuele/CK-TOOLS/lib-python-numpy-compiler.python-3.6.8-linux-64/python_deps_site/lib/python3.6/site-packages
However, the correct location of the installed library is:
/home/emanuele/CK-TOOLS/lib-python-numpy-compiler.python-3.6.8-linux-64/python_deps_site/lib64/python3.6/site-packages

This happened to all the dependencies required to run MLPerf Inference - Object Detection - SSD-MobileNet that were installed with ck install.

A simple workaround is to fix the link manually and re-run the installation to register the packages.
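One way to avoid hard-coding lib/ is to ask the running interpreter for its own install layout, which is lib64/-aware on Fedora. A sketch (the prefix is the one from the report; this is not the actual CK installer code):

```python
import sysconfig

# On Fedora the platform-specific site-packages tree lives under lib64/,
# so a hard-coded 'lib/pythonX.Y/site-packages' link can dangle. Querying
# sysconfig yields the layout of the interpreter actually running pip.
prefix = ('/home/emanuele/CK-TOOLS/'
          'lib-python-numpy-compiler.python-3.6.8-linux-64/python_deps_site')
purelib = sysconfig.get_path('purelib', vars={'base': prefix})
platlib = sysconfig.get_path('platlib', vars={'platbase': prefix})
print(purelib)
print(platlib)
```

Using platlib for compiled packages like numpy would point the build link at the directory pip actually installs into.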

Cleanup src package dir even if set "PACKAGE_SKIP_CLEAN_SRC_DIR": "YES"

Steps to reproduce:

  • Set the meta to skip cleanup and clone src, for example at
    ~/CK/ck-caffe/package/lib-caffe-bvlc-opencl-libdnn-clblast-universal/.cm/meta.json
  • Compile some program with dependent packages, for example:
    $ ck compile program:caffe-time-opencl

  • From the log you can see the src removed and the git clone process applied.
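For reference, the skip-clean flag mentioned above would sit in the package meta roughly like this (the "customize"/"install_env" nesting is my assumption; only the PACKAGE_SKIP_CLEAN_SRC_DIR key comes from the report):

```json
{
  "customize": {
    "install_env": {
      "PACKAGE_SKIP_CLEAN_SRC_DIR": "YES"
    }
  }
}
```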

detecting if wget and pip are installed on MacOS

Hi @ens-lg4 .

It would be nice to brainstorm how to automatically detect wget and pip on MacOS (or maybe even on other systems too) when they are required for some packages, and either report to users that they need to install these dependencies, or maybe even install them automatically.

We can also detect if pip is old (i.e. if it gives an issue with the user/global installation flag) and suggest upgrading it, or do it automatically ... These are very common issues which I saw at SC18 ...

Thanks!

Update lib-nccl to master

After a recent update of NCCL to 2.3.5-5 (25 September 2018), make install no longer works (the install target has been deprecated?).

For now, I've fixed the Git commit to the one just before this update (29 November 2017) in the meta of package:lib-nccl, but we should eventually update the installation script.

/cc @fvella @gfursin

Can't detect GPU on Raspberry Pi 3

Hi,

On Raspberry Pi 3, the command ck detect platform.gpu prints the following:

$ ck detect platform.gpu

Executing: lspci > /tmp/tmp-ck-HIVpbV
pcilib: Cannot open /proc/bus/pci
lspci: Cannot find any working access method.

and returns 0. Therefore, the GPU is not detected correctly (even if there's no GPU, this command should still work).

There is no SLEEP command on Windows

The Windows script download-and-install-package/install.bat contains calls like sleep 2, but Windows reports:

'sleep' is not recognized as an internal or external command,
operable program or batch file.

The sleep command is found in the c:\MinGW\msys\1.0\bin directory. It should be mentioned that we have to add this directory to the PATH variable or use the msys console.

Create packages for OpenCL headers

On some OpenCL platforms, no OpenCL headers are installed, even when a driver is available (e.g. libOpenCL.so). The headers can be fetched e.g. as follows:

# mkdir /usr/include/CL
# cd /usr/include/CL
# wget https://raw.githubusercontent.com/KhronosGroup/OpenCL-Headers/master/opencl12/CL/opencl.h
# wget https://raw.githubusercontent.com/KhronosGroup/OpenCL-Headers/master/opencl12/CL/cl.h
# wget https://raw.githubusercontent.com/KhronosGroup/OpenCL-Headers/master/opencl12/CL/cl_platform.h
# wget https://raw.githubusercontent.com/KhronosGroup/OpenCL-Headers/master/opencl12/CL/cl_ext.h
# wget https://raw.githubusercontent.com/KhronosGroup/OpenCL-Headers/master/opencl12/CL/cl_gl.h
# wget https://raw.githubusercontent.com/KhronosGroup/OpenCL-Headers/master/opencl12/CL/cl_gl_ext.h
# wget https://github.com/KhronosGroup/OpenCL-CLHPP/releases/download/v2.0.10/cl2.hpp

What's worse, building a package using OpenCL may fail due to missing headers half way or further down the build. This wastes time.

CK should detect the driver version and download the appropriate headers when none are found.
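The seven wget calls above could be generated from one template, so a package only has to name the OpenCL version. A sketch (file names taken from the commands above; the version-selection logic is hypothetical):

```python
# Build the list of header URLs for a given OpenCL headers version,
# instead of hard-coding one wget line per file (a sketch, not a CK package).
BASE = ('https://raw.githubusercontent.com/KhronosGroup/'
        'OpenCL-Headers/master/{ver}/CL/{name}')
HEADERS = ['opencl.h', 'cl.h', 'cl_platform.h',
           'cl_ext.h', 'cl_gl.h', 'cl_gl_ext.h']

def header_urls(ver='opencl12'):
    return [BASE.format(ver=ver, name=n) for n in HEADERS]

urls = header_urls()
print(len(urls))  # 6
```

A CK package could pick `ver` from the detected driver version and then fetch each URL into the include directory.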

Environment settings may get overwritten during package installation

I encountered a peculiar bug when installing a package. On a system with two compilers installed (GCC and Clang), I chose to build a package with GCC, but to my surprise CK attempted to build it with Clang!

It turned out that the package had a dependency on a library built with Clang. In addition, the library dependency was specified with a higher "sort" value than the compiler dependency. As a result, CK first set up the compiler environment with CK_CXX=gcc (as per my request); then CK overwrote the compiler environment with CK_CXX=clang (as per the environment settings of the library).

A workaround in this case was to lower the "sort" value of the library. But what can be done in general to prevent this counter-intuitive behaviour?
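The overwriting behaviour is easy to reproduce in a few lines, and a conflict check at merge time would at least surface the problem. A sketch, not CK's actual resolver; the dict shapes are hypothetical:

```python
# Dependency environments are applied in increasing 'sort' order, so a
# later dependency silently overwrites CK_CXX. Collecting conflicts lets
# the tool warn instead of clobbering quietly (hypothetical sketch).
def merge_envs(deps):
    env, conflicts = {}, []
    for dep in sorted(deps, key=lambda d: d['sort']):
        for k, v in dep['env'].items():
            if k in env and env[k] != v:
                conflicts.append((k, env[k], v))
            env[k] = v
    return env, conflicts

deps = [{'sort': 10, 'env': {'CK_CXX': 'g++'}},      # the chosen compiler
        {'sort': 20, 'env': {'CK_CXX': 'clang++'}}]  # the library's env
env, conflicts = merge_envs(deps)
print(env['CK_CXX'], conflicts)  # clang++ [('CK_CXX', 'g++', 'clang++')]
```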

JAVA_HOME is not set properly on Linux

Hi,

ck detect soft:compiler.java may not set JAVA_HOME properly. It searches for javac and may find a symlink /usr/lib/javac. If this option is chosen, JAVA_HOME is set to /usr. In fact, it should be set to something like /usr/lib/jvm/java-8-openjdk-amd64.

One way to fix it is to search for rt.jar instead of javac. It's not symlinked anywhere.

But, I guess, the best way would be to add symlink detection/resolution, so that symlinks are not considered at all.
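A sketch of the symlink-resolution idea, assuming javac lives under `<JAVA_HOME>/bin/javac` (the function name is hypothetical):

```python
import os

def java_home_from(javac_path):
    """Derive JAVA_HOME from a javac path, resolving symlinks first.

    e.g. /usr/bin/javac -> /usr/lib/jvm/java-8-openjdk-amd64/bin/javac,
    so JAVA_HOME becomes /usr/lib/jvm/java-8-openjdk-amd64 rather than /usr.
    """
    real = os.path.realpath(javac_path)            # follow the symlink chain
    return os.path.dirname(os.path.dirname(real))  # strip the trailing /bin/javac
```

With this, even if the search finds a symlink first, the derived JAVA_HOME points at the real JDK directory.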

Detecting the CPU platform fails in containers

By running ck detect platform.cpu
I got the following error:

OS CK UOA:            linux-64 (4258b5fe54828a50)

OS name:              Ubuntu 18.04.2 LTS
Short OS name:        Linux 4.15.0
Long OS name:         Linux-4.15.0-45-generic-x86_64-with-Ubuntu-18.04-bionic
OS bits:              64
OS ABI:               x86_64

Platform init UOA:    42818da3a0789331
CK error: [platform.cpu] problem opening text file=/proc/cpuinfo ([Errno 107] Transport endpoint is not connected: '/proc/cpuinfo')!

The problem is related to the VMware container we use, but it can affect some CK workflows.

Can't detect platform name/vendor/model of Raspberry Pi 3

Hi,

On Raspberry Pi 3, the command ck detect platform prints the following:

...
***************************************************************************************
Detecting system features ...

Platform name:   
Platform vendor: 
Platform model:  

Adding a timeout when detecting soft versions during soft detection

It may still happen that, during version detection, some unusual software waits for user input. In such cases, we should add a timeout (say, 10 seconds) when invoking the command that detects the software version. It's not urgent. We need to check that it works on Linux, Windows and macOS ...
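A minimal sketch of such a timeout, using Python's standard subprocess module (the function name is hypothetical):

```python
import subprocess

def detect_version(cmd, timeout_sec=10):
    """Run a version-detection command, giving up after timeout_sec.

    stdin is closed so that interactive tools fail immediately
    instead of silently waiting for user input.
    """
    try:
        r = subprocess.run(cmd, capture_output=True, text=True,
                           stdin=subprocess.DEVNULL, timeout=timeout_sec)
    except (subprocess.TimeoutExpired, OSError):
        return None
    out = (r.stdout or r.stderr).strip()
    return out or None
```

`subprocess.run(..., timeout=...)` works the same way on Linux, Windows and macOS, which covers the portability concern above.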

Adding "ck test package" to test all packages

CC @ens-lg4 , @psyhtest .

Since we use official and often non-permanent URLs to download and install packages (websites, GitHub, etc.), URLs sometimes change or even disappear. There are two solutions:

  • long-term: we should add permanent storage for packages (which we already do in some cases via cKnowledge.org). I've also heard that the Spack developers plan to add such a permanent repository for different packages, and maybe we will be able to use it. However, some packages, such as Java, do not allow distribution outside the Oracle website and often change location - that is a big issue.

  • current: we should add a "ck test package:*" command to test all packages - this should be done not at commit time (i.e. via Travis), but on our own machine with plenty of disk space. In that case, it can delete all env entries and installations, and attempt to install all packages from scratch. One tricky part is making this non-interactive when there are multiple versions of sub-dependencies or possible compilers - we need to think about that. Maybe we can create "test" scenarios for each package to reduce the number of testing combinations for now ...

However, I think such functionality is of high priority now to keep all packages and workflows working!

Adding repository name for a dependency

If a software dependency is not resolved and no packages are found, check whether a repository name is specified. If it is, and that repository is not present, automatically download it and try to resolve the dependency again ...

ar and ranlib problems with Caffe installation

I hit a couple of snags when installing Caffe (package:lib-caffe-bvlc-master-cpu-universal) on a CentOS 7 development board.

The original compiler (gcc-4.8.5) was quite old which is possibly why I was having segfaults when running program:caffe-classification. I upgraded to gcc-5.3.1 as follows:

sudo yum install centos-release-scl
sudo yum install devtoolset-4-gcc*
scl enable devtoolset-4 bash

The new compiler was placed into /opt/rh/devtoolset-4/root/usr/bin/. Initially, CK wouldn't pick it up, but I linked this directory from $CK_TOOLS. So far, so good:

$ ck show env --tags=gcc                                                                                                                                        
Env UID:         Target OS: Bits: Name:          Version: Tags:

f2e1b60b46c044ad   linux-64    64 GNU C compiler 5.3.1    64bits,compiler,gcc,host-os-linux-64,lang-c,lang-cpp,target-os-linux-64,v5,v5.3,v5.3.1
b1aaa7b21081bcea   linux-64    64 GNU C compiler 4.8.5    64bits,compiler,gcc,host-os-linux-64,lang-c,lang-cpp,target-os-linux-64,v4,v4.8,v4.8.5

When installing ck-caffe:package:lib-caffe-bvlc-master-universal-cpu, however, the build failed due to a linking problem. I inspected the contents of CMakeFiles/proto.dir/link.txt both in the successful installation (with gcc-4.8.5):

# $CK_TOOLS/lib-caffe-bvlc-master-cpu-trunk-gcc-4.8.5-linux-64/obj/src/caffe/CMakeFiles/proto.dir/link.txt
/usr/bin/ar qc ../../lib/libproto.a  CMakeFiles/proto.dir/__/__/include/caffe/proto/caffe.pb.cc.o
/usr/bin/ranlib ../../lib/libproto.a

and in the failed installation:

# $CK_TOOLS/lib-caffe-bvlc-master-cpu-trunk-gcc-5.3.1-linux-64/obj/src/caffe/CMakeFiles/proto.dir/link.txt
"" qc ../../lib/libproto.a  CMakeFiles/proto.dir/__/__/include/caffe/proto/caffe.pb.cc.o
/usr/bin/ranlib ../../lib/libproto.a

As can be seen, CK didn't provide anything for ar and used the default path to ranlib. Once I manually changed $CK_TOOLS/lib-caffe-bvlc-master-cpu-trunk-gcc-5.3.1-linux-64/obj/src/caffe/CMakeFiles/proto.dir/link.txt as follows:

"$CK_TOOLS/opt-rh-devtoolset-4-root-usr-bin/gcc-ar" qc ../../lib/libproto.a  CMakeFiles/proto.dir/__/__/include/caffe/proto/caffe.pb.cc.o
"$CK_TOOLS/opt-rh-devtoolset-4-root-usr-bin/gcc-ranlib" ../../lib/libproto.a

I was able to rerun make install manually.

One additional snag was that CK wouldn't register Caffe because the installation dir looked as follows:

drwxrwxr-x. 3 4096 May 14 01:49 include
drwxrwxr-x. 2 4096 May 14 01:49 lib64
drwxrwxr-x. 3 4096 May 14 01:49 python

while CK was expecting to find libcaffe.so under lib. I temporarily resolved this by making a symbolic link from install/lib64 to install/lib and rerunning ck detect soft:lib.caffe manually.

Python detection hung

Clipped output after running ./bin/ck run program:seissol-proxy:

  Package found: lib-seissol-scc18-proxy (228f87b1866a18d0)                               
                                                                         
                            
  -----------------------------------                 
  Resolving software dependencies ... 
                                                                                       
*** Dependency 1 = python (Python < 3):                                
                                                                         
 ********                                                                                  
 WARNING: no registered CK environment was found for "Python < 3" dependency with tags="compiler,python" and setup={"target_os_bits": "64", "host_os_uoa": "linux-64", "target_os_uoa": "linux-64"} and version constraints ([] <= v <= [3])
                                                                                                          
  Trying to automatically detect required software ...         
                                                                              
  1) Checking if "python" (compiler.python / 45d68512df22fde7) is installed ...
                                                                                                                                                                                                                                                                                         
  Searching for python (python*) to automatically register in the CK - it may take some time, please wait ...
                                                      
    * Searching in /usr ...                                                                      
    * Searching in /opt ...                                                                     
    * Searching in /g/g0/herbein1 ...                                      
                                                                                                                                      
  Search completed in 21.4 secs. Found 95 target files (may be pruned) ...                             
                           
  Detecting and sorting versions (ignore some work output) ...
                                     
    * /usr/lib64/pkgconfig/python3.pc                                                           
        WARNING: version was not detected
    * /usr/share/grace/bin/python3.4m-x86_64-config
        WARNING: version was not detected
    * /usr/share/grace/bin/python3.4   (Version 3.4.9) - skipped because of version constraints!
    * /usr/share/grace/bin/python3.4-config
        WARNING: version was not detected
    * /usr/share/grace/bin/python2   (Version 2.7.5)
    * /usr/share/systemtap/tapset/python3.stp
        WARNING: version was not detected
    * /usr/share/man/man1/python3.1.gz
        WARNING: version was not detected
    * /usr/share/gtksourceview-3.0/language-specs/python3.lang
        WARNING: version was not detected
    * /usr/apps/python-3.5.1/bin/python3   (Version 3.5.1) - skipped because of version constraints!
    * /usr/apps/python-3.5.1/bin/python3.5m-config
        WARNING: version was not detected
    * /usr/apps/python-2.7.11/bin/python2   (Version 2.7.11)
    * /usr/apps/python-2.7.13/bin/python2   (Version 2.7.13)
    * /usr/apps/python-3.6.0/bin/python3   (Version 3.6.0) - skipped because of version constraints!
    * /usr/apps/python-3.6.0/bin/python3.6m-config
        WARNING: version was not detected
    * /usr/apps/python-2.7.14/bin/python2   (Version 2.7.14)
    * /usr/apps/python-3.6.4/bin/python3   (Version 3.6.4) - skipped because of version constraints!
    * /usr/apps/python-3.6.4/bin/python3.6m-config
        WARNING: version was not detected
    * /usr/gapps/atsb/Python-2.7.1/python   (Version 2.7.1)
    * /usr/tce/bin/python3-3.5.1   (Version 3.5.1) - skipped because of version constraints!
    * /usr/tce/bin/python   (Version 2.7.14)
    * /usr/tce/bin/python2   (Version 2.7.14)
    * /usr/tce/bin/python3-3.6.0   (Version 3.6.0) - skipped because of version constraints!
    * /usr/tce/bin/python3.6   (Version 3.6.4) - skipped because of version constraints!

At which point the CPU utilization dropped to 0% and ck just hung.

new platform.init entries do not have .cm ...

I see new platform.init entries, but they do not have .cm, i.e.
platform.init/.cm ...
and
platform.init/mediatek-android/.cm ...

Do not forget to add all (hidden) files/directories when adding CK entries to git!

Recognise symlinks when detecting software installations

For now, CK doesn't recognise symlinks. So, when detecting the installation of some library (e.g. libhdf5.so for ck-caffe), it asks the user to choose from three variants, two of which are symlinks to the third. It would be better if CK recognised that they are symlinks and didn't show them as separate installations.
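A sketch of how the found candidates could be collapsed by their resolved target (the function name is hypothetical):

```python
import os
from collections import defaultdict

def dedupe_by_target(paths):
    """Group candidate files by their resolved path and keep one entry each.

    Two symlinks and the file they point to all resolve to the same
    real path, so they count as a single installation.
    """
    groups = defaultdict(list)
    for p in paths:
        groups[os.path.realpath(p)].append(p)
    # prefer the real file itself when it was among the candidates
    return [real if real in alts else alts[0]
            for real, alts in groups.items()]
```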

Cache platform detection

I think we should cache most of the static platform info in a local repository once it's detected and then just preload it (except dynamic info such as frequency) to avoid wasting time in CK compile/run/autotuning pipelines. We can also add a flag to refresh it if necessary. We need to think about how to implement this while keeping backwards compatibility ...
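One possible shape for such a cache, sketched under the assumption that the static info is JSON-serialisable (the function and parameter names are hypothetical):

```python
import json
import os

def load_platform_info(detect_fn, cache_path, refresh=False):
    """Return cached static platform info, re-running detection on demand.

    detect_fn performs the expensive full detection; dynamic values
    such as the current CPU frequency should be queried separately.
    """
    if not refresh and os.path.isfile(cache_path):
        with open(cache_path) as f:
            return json.load(f)
    info = detect_fn()
    with open(cache_path, "w") as f:
        json.dump(info, f)
    return info
```

The `refresh` flag maps naturally onto a command-line option, and storing the cache inside a local CK repository entry would keep the behaviour backwards compatible.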

Caffe installation offers two choices for HDF5

It's not an error but could be quite annoying (especially if you don't know which one you need).

*** Dependency 7 = lib-hdf5 (HDF5 library):

More than one environment found for "HDF5 library" with tags="lib,hdf5" and setup={"target_os_bits": "64", "host_os_uoa": "linux-64", "target_os_uoa": "linux-64"}:

0) HDF5 library (static) - v1.8.16 (64bits,hdf5,host-os-linux-64,lib,target-os-linux-64,v1,v1.8,v1.8.16,vstatic (263445893b9b9656))
1) HDF5 library - v1.8.16 (64bits,hdf5,host-os-linux-64,lib,target-os-linux-64,v1,v1.8,v1.8.16 (818659942168fb91))

I believe the static version was added to support CK-Cluster.

ck install package:imagenet-2012-val fails

The error message:

   ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
   CK detected a PROBLEM in the third-party CK package:

   CK package:           imagenet-2012-val
   CK repo:              ck-env
   CK repo URL:          https://github.com/ctuning/ck-env
   CK package URL:       https://github.com/ctuning/ck-env/tree/master/package/imagenet-2012-val
   Issues URL:           https://github.com/ctuning/ck-env/issues
   ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
   Please, submit the log to the authors of this external CK package at "https://github.com/ctuning/ck-env/issues" to collaboratively fix this problem!

CK error: [package] package installation failed!

adding support to install stable or dev packages

As @psyhtest suggested, it may be useful to provide a flag when installing packages, compiling programs and resolving dependencies to pick up stable versions of packages. This can be done by detecting the flags --stable / --dev and then providing extra keys in the dependency JSON that specify the stable/dev package, i.e.
something like:

"compile_deps": {
"lib-tensorflow": {
"name": "TensorFlow library",
"sort": 10,
"tags": "lib,tensorflow",
"extra_tags":{"stable":"v1.0","dev":""}

}
},

We can discuss it later ...

Unable to set gpu_governor

In module/platform.gpu/module.py, line 620 should use 'CK_GPU' instead of 'CK_CPU'.
This issue prevents setting the CK_GPU_FREQ_GOVERNOR variable.

package failed building but env is created

Hi,

I was trying to install LLVM under CK but ran into a shortage of disk space.

LLVM didn't install successfully, yet the environment entry was still created by CK.

Is that OK?

-- Installing: /home/abdul/damm02/data/repos/autotuning/ck/tools/llvm-trunk-linux-64/bin/llvm-cxxdump
-- Installing: /home/abdul/damm02/data/repos/autotuning/ck/tools/llvm-trunk-linux-64/bin/llvm-dis
-- Installing: /home/abdul/damm02/data/repos/autotuning/ck/tools/llvm-trunk-linux-64/bin/llvm-dwp
CMake Error at tools/llvm-dwp/cmake_install.cmake:36 (file):
file INSTALL cannot copy file
"/home/abdul/damm02/data/repos/autotuning/ck/tools/llvm-trunk-linux-64/obj/bin/llvm-dwp"
to
"/home/abdul/damm02/data/repos/autotuning/ck/tools/llvm-trunk-linux-64/bin/llvm-dwp".
Call Stack (most recent call first):
tools/cmake_install.cmake:70 (include)
cmake_install.cmake:61 (include)

Error: failed installing ...
Press any key to continue!

Setting up environment for installed package ...
(full path = /home/abdul/damm02/data/repos/autotuning/ck/tools/llvm-trunk-linux-64/bin/clang)

Software entry found: compiler.llvm (1c2eb494b8ae3bc4)

Resolving software dependencies ...

*** Dependency 1 = tool-cmake (cmake):

Resolved. CK environment UID = 25ccb854dc1a7ee6

Environment entry added (a49609e11eb5f1ce)!
Installation time: 10537.1109741 sec.

Regards

Abdul

lib-protobuf-host installation issues

On a 64-bit ARM platform with Yocto Linux, I've had the following issues with installing lib-protobuf-host via CK.

Installing the development version

$ ck install package:lib-protobuf-trunk-host
  1. Because the library was installed under $INSTALL_DIR/install/lib64, auto detection failed:
CK error: [package] software not found in a specified path (/home/root/CK_TOOLS/lib-protobuf-host-development-linux-64/install/lib/libprotobuf.a)!
I provided a symbolic link:
$ ln -s $INSTALL_DIR/install/lib64 $INSTALL_DIR/install/lib
  2. Because only libprotobuf.a was created, manual detection failed too:
$ ck detect soft:lib.protobuf                                                                                                                                                                 

  Searching for ProtoBuf library (libprotobuf.so) to automatically register in the CK - it may take some time, please wait ...

    * Searching in /usr ...
    * Searching in /home/root/CK_TOOLS ...
    * Searching in /home/root ...

  Search completed in 2.1 secs. Found 0 target files (may be pruned) ...
  3. I fooled CK by creating an empty libprotobuf.so under $INSTALL_DIR/install/lib:
$ touch $INSTALL_DIR/install/lib/libprotobuf.so
$ ck detect soft:lib.protobuf

(Fingers crossed, only the static library is required.)

Installing the released versions

CK-Caffe actually tried to use the 3.1.0 version but failed to untar it:

    Resolved. CK environment UID = 4a2b88bf4031b57c (detected version 5.2.1)
  -----------------------------------

Installing to /home/root/CK_TOOLS/lib-protobuf-host-3.1.0-linux-64


Downloading package from 'https://github.com/google/protobuf/archive/v3.1.0.tar.gz' ...
Connecting to github.com (192.30.253.112:443)
Connecting to codeload.github.com (192.30.253.120:443)
v3.1.0.tar.gz        100% |**********************************************************************************************|  3956k  0:00:00 ETA

Ungzipping archive ...

Untarring archive ...
protobuf-3.1.0/
...
protobuf-3.1.0/objectivec/Tests/CocoaPods/OSXCocoaPodsTester/OSXCocoaPodsTester.xcodeproj/
protobuf-3.1.0/objectivec/Tests/CocoaPods/OSXCocoaPodsTester/OSXCocoaPodsTester.xcodeproj/project.pb
protobuf-3.1.0/objectivec/Tests/CocoaPods/OSXCocoaPodsTester/OSXCocoaPodsTester.xcodeproj/project.xc
protobuf-3.1.0/objectivec/Tests/CocoaPods/OSXCocoaPodsTester/OSXCocoaPodsTester.xcodeproj/project.xc
tar: can't remove old file protobuf-3.1.0/objectivec/Tests/CocoaPods/OSXCocoaPodsTester/OSXCocoaPodsTester.xcodeproj/project.xc: Is a directory
Error: untaring package failed!
CK error: [package] package installation failed!

Exactly the same failure happened when trying to manually install other releases of lib-protobuf-host:

$ ck install package:lib-protobuf-3.0.0-host
$ ck install package:lib-protobuf-3.2.0-host

It's weird that the last file (project.xc) is mentioned twice. This could be an issue with the archives or the tar utility on this platform (coming from BusyBox):

$ tar        
BusyBox v1.23.2 (2017-05-25 10:24:12 BST) multi-call binary.

Accidental vars redefinition when installing deps for a package

While investigating the reasons for issue ctuning/ck-request-asplos18-caffe-intel#14, I found that the caffe package has similar vars in its meta:

      "PACKAGE_URL_ANDROID": "https://github.com/BVLC/caffe",
      "PACKAGE_URL_LINUX": "https://github.com/BVLC/caffe",
      "PACKAGE_URL_WINDOWS": "https://github.com/BVLC/caffe", 

But if some dep needs to be installed during the caffe installation, these vars can override the PACKAGE_URL var from the dep's meta. This is because the download-and-install-package script contains code like this:

if [ "$PACKAGE_URL_LINUX" != "" ] ; then
    PACKAGE_URL=$PACKAGE_URL_LINUX
fi

I'm not sure how to fix it properly.
I'd suggest modifying the mentioned code as follows:

if [ "$PACKAGE_URL_LINUX" != "" ] ; then
    if [ "$PACKAGE_URL" == "" ] ; then
        PACKAGE_URL=$PACKAGE_URL_LINUX
    fi
fi

i.e. we only redefine the var if it is not set yet. But I'm not sure what side effects this could have. There are also a lot of other similar code fragments in that script, with different vars.
@gfursin what do you think?
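The same "only set when unset" guard, expressed in Python for clarity (the dict stands in for the shell environment; the variable names come from the issue, the function name is hypothetical):

```python
def apply_platform_url(env, platform_key="PACKAGE_URL_LINUX"):
    """Fall back to the platform-specific URL only when PACKAGE_URL is unset,
    mirroring the guarded shell fix suggested above."""
    if env.get(platform_key) and not env.get("PACKAGE_URL"):
        env["PACKAGE_URL"] = env[platform_key]
    return env
```

In shell, the same effect can also be had with parameter expansion, e.g. `PACKAGE_URL=${PACKAGE_URL:-$PACKAGE_URL_LINUX}`, which avoids nested if blocks.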

CK detection of some software is extremely slow on NFS (noticed at LLNL)

When preparing a demo workflow for SCC18 (https://github.com/ctuning/ck-scc18) with #CollectiveKnowledge, @spack and @flux-framework during my visit to @LLNL, we noticed that CK software plugins (http://cknowledge.org/shared-soft-detection-plugins.html) work extremely slowly on NFS, i.e. 3-4 minutes to detect one software dependency.

This is expected, since by default we perform a full recursive search of all directories. However, maybe we should provide a flag like --quick to speed up the search in such cases by looking only in a few of the most commonly used locations. We could then let the user specify the full path to a given soft if it's not found.

CC @SteVwonder @trws @tgamblin
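A sketch of what such a --quick mode could look like: a depth-limited walk over a handful of common locations (the directory list and function name are assumptions, not CK's actual behaviour):

```python
import os

# Assumed "most common" roots for a quick scan; CK's real list may differ.
COMMON_DIRS = ["/usr/bin", "/usr/local", "/opt"]

def quick_find(filename, dirs=COMMON_DIRS, max_depth=2):
    """Shallow search: stop descending past max_depth below each root."""
    hits = []
    for root_dir in dirs:
        root_depth = root_dir.rstrip("/").count("/")
        for cur, subdirs, files in os.walk(root_dir):
            if cur.count("/") - root_depth >= max_depth:
                subdirs[:] = []  # prune: do not descend further
            if filename in files:
                hits.append(os.path.join(cur, filename))
    return hits
```

On NFS this avoids the pathological deep recursion while still catching typical installations; the full recursive search would remain the default.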
