
SageMaker Notebook Instance Lifecycle Config Samples

Overview

A collection of sample scripts to customize Amazon SageMaker Notebook Instances using Lifecycle Configurations.

Lifecycle Configurations provide a mechanism to customize Notebook Instances via shell scripts that are executed during the lifecycle of a Notebook Instance.
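For orientation, an on-start script is just a shell script. A minimal, hypothetical example (not one of the samples below) that records each start in a log file:

```shell
#!/bin/bash
# Minimal hypothetical on-start script: append a timestamp to a log
# each time the Notebook Instance starts. Lifecycle Configuration
# scripts run as root with the working directory set to /.
set -e

LOG_FILE=/tmp/lifecycle-demo.log   # placeholder path for illustration
echo "Notebook Instance started at $(date -u +%Y-%m-%dT%H:%M:%SZ)" >> "$LOG_FILE"
```

On a real instance you would typically log somewhere durable instead of /tmp; the point is only that any shell script body can serve as the OnStart or OnCreate hook.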

Sample Scripts

  • add-pypi-repository - This script adds a private PyPI repository in addition to, or instead of, pypi.org.
  • auto-stop-idle - This script stops a SageMaker Notebook Instance once it has been idle for more than 1 hour (the default idle time).
  • connect-emr-cluster - This script connects an EMR cluster to the Notebook Instance using SparkMagic.
  • disable-uninstall-ssm-agent - This script disables and uninstalls the SSM Agent at startup.
  • enable-fips-openssl-provider - This script enables the OpenSSL FIPS provider in each conda environment.
  • execute-notebook-on-startup - This script executes a Notebook file on the instance during startup.
  • export-to-pdf-enable - This script enables Jupyter to export a notebook directly to PDF.
  • install-conda-package-all-environments - This script installs a single conda package in all SageMaker conda environments, apart from JupyterSystemEnv, which is a system environment reserved for Jupyter.
  • install-conda-package-single-environment - This script installs a single conda package in a single SageMaker conda environment.
  • install-lab-extension - This script installs a JupyterLab extension package on a SageMaker Notebook Instance.
  • install-nb-extension - This script installs a single Jupyter notebook extension package on a SageMaker Notebook Instance.
  • install-pip-package-all-environments - This script installs a single pip package in all SageMaker conda environments, apart from JupyterSystemEnv, which is a system environment reserved for Jupyter.
  • install-pip-package-single-environment - This script installs a single pip package in a single SageMaker conda environment.
  • install-r-package - This script installs a single R package in the SageMaker R environment.
  • install-server-extension - This script installs a single Jupyter notebook server extension package on a SageMaker Notebook Instance.
  • migrate-ebs-data-backup - This script backs up content in /home/ec2-user/SageMaker/ to an S3 bucket specified in a tag on the notebook instance.
  • migrate-ebs-data-sync - This script downloads a snapshot created by migrate-ebs-data-backup to /home/ec2-user/SageMaker/ on a new notebook instance. You specify the snapshot using tags on the notebook instance.
  • mount-efs-file-system - This script mounts an EFS file system to the Notebook Instance at the ~/SageMaker/efs directory, based on the DNS name.
  • mount-fsx-lustre-file-system - This script mounts an FSx for Lustre file system to the Notebook Instance at the /fsx directory, based on the DNS and mount name parameters.
  • notebook-history-s3 - This script persists the underlying SQLite database of executed commands and cells to S3.
  • persistent-conda-ebs - This script installs a custom, persistent installation of conda on the Notebook Instance's EBS volume, and ensures that these custom environments are available as kernels in Jupyter.
  • proxy-for-jupyter - This script configures proxy settings for your Jupyter notebooks and the SageMaker Notebook Instance.
  • publish-instance-metrics - This script publishes system-level metrics from the Notebook Instance to CloudWatch.
  • set-codecommit-cross-account-access - This script sets up cross-account CodeCommit access, so you can work on repositories hosted in another account.
  • set-env-variable - This script gets a value from the Notebook Instance's tags and sets it as an environment variable for all processes, including Jupyter.
  • set-git-config - This script sets the username and email address in Git config.
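The two install-*-all-environments scripts above share one pattern: iterate over the conda environments and skip JupyterSystemEnv. A self-contained sketch of that loop, run here against a throwaway directory so it executes anywhere (on a real instance the environments live under /home/ec2-user/anaconda3/envs and the echo would be a real pip or conda install):

```shell
#!/bin/bash
set -e

# Throwaway stand-in for /home/ec2-user/anaconda3/envs so the sketch
# runs anywhere; the environment names below are illustrative.
ENVS_DIR=$(mktemp -d)
mkdir -p "$ENVS_DIR/python3" "$ENVS_DIR/R" "$ENVS_DIR/JupyterSystemEnv"

PACKAGE=scipy   # hypothetical package name

for env in "$ENVS_DIR"/*; do
    env_name=$(basename "$env")
    # JupyterSystemEnv is reserved for Jupyter itself -- leave it alone.
    if [ "$env_name" = "JupyterSystemEnv" ]; then
        continue
    fi
    # On a real instance this line would be:
    #   "$env/bin/pip" install --upgrade "$PACKAGE"
    echo "would install $PACKAGE into $env_name"
done
```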

Development

Scripts are best developed directly on a SageMaker Notebook Instance, since that is the environment in which they run. Lifecycle Configuration scripts run as root, and the working directory is /. To simulate the execution environment, you may use

sudo su
export PATH=/usr/local/sbin:/usr/local/bin:/usr/bin:/usr/sbin:/sbin:/bin
cd /

Edit the script in a file such as my-script-on-start.sh and execute it as

sh my-script-on-start.sh

The directory structure followed is:

scripts/
    my-script-name/
        on-start.sh
        on-create.sh
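To scaffold that layout for a new sample (my-script-name is a placeholder; in the repository you would run this from the repo root rather than a temporary directory):

```shell
#!/bin/bash
set -e

# Work in a temporary directory for illustration.
BASE=$(mktemp -d)
cd "$BASE"

# Create the expected directory layout with stub on-start/on-create scripts.
mkdir -p scripts/my-script-name
printf '%s\n' '#!/bin/bash' 'set -e' > scripts/my-script-name/on-start.sh
printf '%s\n' '#!/bin/bash' 'set -e' > scripts/my-script-name/on-create.sh
```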

Testing

To test the script end-to-end:

  • Create a Lifecycle Configuration with the script content, and a Notebook Instance that uses that Lifecycle Configuration.

# If the scripts are in a directory "scripts/my-script-name/*"
SCRIPT_NAME=my-script-name
ROLE_ARN=my-role-arn

RESOURCE_NAME="$SCRIPT_NAME-$RANDOM"

# Add any script specific options such as subnet-id
aws sagemaker create-notebook-instance-lifecycle-config \
    --notebook-instance-lifecycle-config-name "$RESOURCE_NAME" \
    --on-start Content=$( (cat scripts/$SCRIPT_NAME/on-start.sh || echo "") | base64) \
    --on-create Content=$( (cat scripts/$SCRIPT_NAME/on-create.sh || echo "") | base64)
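The Content parameters above must carry the base64-encoded script body. A quick local sanity check of that encoding (the temporary file here is a stand-in for scripts/$SCRIPT_NAME/on-start.sh):

```shell
#!/bin/bash
set -e

# Stand-in script body for illustration.
SCRIPT=$(mktemp)
printf '%s\n' '#!/bin/bash' 'echo hello from on-start' > "$SCRIPT"

# Encode the way the create-notebook-instance-lifecycle-config call does,
# then confirm the payload decodes back to the original script.
ENCODED=$( (cat "$SCRIPT" || echo "") | base64)
echo "$ENCODED" | base64 --decode
```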

aws sagemaker create-notebook-instance \
    --notebook-instance-name "$RESOURCE_NAME" \
    --instance-type ml.t2.medium \
    --role-arn "$ROLE_ARN" \
    --lifecycle-config-name "$RESOURCE_NAME"

aws sagemaker wait \
    notebook-instance-in-service \
    --notebook-instance-name "$RESOURCE_NAME"

  • Access the Notebook Instance and perform any validation specific to the script.

aws sagemaker create-presigned-notebook-instance-url \
    --notebook-instance-name "$RESOURCE_NAME"

  • Stop and start the Notebook Instance, so that the on-start script runs again.

aws sagemaker stop-notebook-instance \
    --notebook-instance-name "$RESOURCE_NAME"

aws sagemaker wait \
    notebook-instance-stopped \
    --notebook-instance-name "$RESOURCE_NAME"

aws sagemaker start-notebook-instance \
    --notebook-instance-name "$RESOURCE_NAME"

aws sagemaker wait \
    notebook-instance-in-service \
    --notebook-instance-name "$RESOURCE_NAME"

  • Access the Notebook Instance again and perform any validation specific to the script.

aws sagemaker create-presigned-notebook-instance-url \
    --notebook-instance-name "$RESOURCE_NAME"
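When you are done, you may want to delete the test resources. A sketch of the cleanup (the instance must be stopped before it can be deleted):

```shell
aws sagemaker stop-notebook-instance \
    --notebook-instance-name "$RESOURCE_NAME"

aws sagemaker wait \
    notebook-instance-stopped \
    --notebook-instance-name "$RESOURCE_NAME"

aws sagemaker delete-notebook-instance \
    --notebook-instance-name "$RESOURCE_NAME"

aws sagemaker delete-notebook-instance-lifecycle-config \
    --notebook-instance-lifecycle-config-name "$RESOURCE_NAME"
```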

File a Pull Request following the instructions in the Contribution Guidelines.

License Summary

This sample code is made available under the MIT-0 license. See the LICENSE file.


amazon-sagemaker-notebook-instance-lifecycle-config-samples's Issues

studio lifecycle

How do I attach a Lifecycle Configuration to SageMaker Studio notebooks (not Notebook Instances)?

execute-notebook-on-startup -

Cannot execute a notebook with R and some dependencies; the LCC gets stuck in a loop.

LCC:

#!/bin/bash
set -e
nohup sudo -b -u ec2-user -i <<'EOF'
sudo yum-config-manager --enable epel
sudo yum -y install htop make automake gcc gcc-c++ clang libcurl-devel openssl-devel proj-devel geos-devel || echo "FAIL LINUX DEPS"

conda activate R
conda install 'r-devtools=2.4.3' --name 'R' -y
conda install 'r-raster=3.4-10' --name 'R' -y
conda install 'r-sf=1.0_5' --name 'R' -y
conda install 'r-desctools=0.99.41' --name 'R' -y
conda install 'r-xgboost' --name 'R' -y
conda install 'r-remotes' --name 'R' -y
conda install 'r-spData' --name 'R' -y
conda install 'r-spdep' --name 'R' -y
conda install 'r-reshape2=1.4.4' --name 'R' -y
conda install 'r-dplyr' --name 'R' -y
conda install 'r-readr' --name 'R' -y
conda install 'r-readxl' --name 'R' -y
conda install 'r-paws' --name 'R' -y
conda install 'r-botor' --name 'R' -y
conda install 'r-geojsonio' --name 'R' -y
conda install 'r-outForest' --name 'R' -y
conda install 'r-bestNormalize' --name 'R' -y
conda install 'r-arrow=6.0.1' --name 'R' -y

FILE0="/home/ec2-user/SageMaker/00_setup.ipynb"
FILE1="/home/ec2-user/SageMaker/01_oultiers_oi.ipynb"
FILE2="/home/ec2-user/SageMaker/02_outliers_avaluo.ipynb"
FILE3="/home/ec2-user/SageMaker/03_outliers_cbr_final.ipynb"
FILE4="/home/ec2-user/SageMaker/04_generacion_capa_ufm2.ipynb"

conda activate python3
nohup jupyter nbconvert "$FILE0" --ExecutePreprocessor.kernel_name=python3 --ExecutePreprocessor.timeout=2400 --to notebook --execute

conda activate R
#nohup jupyter nbconvert "$FILE1" "$FILE2" "$FILE3" --ExecutePreprocessor.kernel_name=ir --ExecutePreprocessor.timeout=2400 --to notebook --execute
nohup jupyter nbconvert "$FILE1" --ExecutePreprocessor.timeout=2400 --to notebook --execute
#nohup jupyter nbconvert "$FILE4" --ExecutePreprocessor.kernel_name=ir --ExecutePreprocessor.timeout=2400 --to notebook --execute
conda deactivate

# PARAMETERS
IDLE_TIME=2400

echo "Fetching the autostop script"
wget https://raw.githubusercontent.com/aws-samples/amazon-sagemaker-notebook-instance-lifecycle-config-samples/master/scripts/auto-stop-idle/autostop.py

echo "Starting the SageMaker autostop script in cron"
(crontab -l 2>/dev/null; echo "*/1 * * * * /usr/bin/python $PWD/autostop.py --time $IDLE_TIME --ignore-connections") | crontab -
EOF

[NbConvertApp] Writing 49201 bytes to /home/ec2-user/SageMaker/00_setup.nbconvert.ipynb
[NbConvertApp] Converting notebook /home/ec2-user/SageMaker/01_oultiers_oi.ipynb to notebook
[NbConvertApp] Executing notebook with kernel: ir
trying URL 'https://cran.r-project.org/src/contrib/RcppEigen_0.3.3.9.1.tar.gz'
Content type 'application/x-gzip' length 1633360 bytes (1.6 MB)
downloaded 1.6 MB
trying URL 'https://cran.r-project.org/src/contrib/Rcpp_1.0.8.tar.gz'
Content type 'application/x-gzip' length 3036631 bytes (2.9 MB)
downloaded 2.9 MB
trying URL 'https://cran.r-project.org/src/contrib/FNN_1.1.3.tar.gz'
Content type 'application/x-gzip' length 78492 bytes (76 KB)
downloaded 76 KB
trying URL 'https://cran.r-project.org/src/contrib/ranger_0.13.1.tar.gz'
Content type 'application/x-gzip' length 193819 bytes (189 KB)
downloaded 189 KB
trying URL 'https://cran.r-project.org/src/contrib/missRanger_2.1.3.tar.gz'
Content type 'application/x-gzip' length 44173 bytes (43 KB)
downloaded 43 KB
* installing *source* package ‘Rcpp’ ...
** package ‘Rcpp’ successfully unpacked and MD5 sums checked
** using staged installation
** libs
x86_64-conda-linux-gnu-c++ -std=gnu++14 -I"/home/ec2-user/anaconda3/envs/R/lib/R/include" -DNDEBUG -I../inst/include/  -DNDEBUG -D_FORTIFY_SOURCE=2 -O2 -isystem /home/ec2-user/anaconda3/envs/R/include -I/home/ec2-user/anaconda3/envs/R/include -Wl,-rpath-link,/home/ec2-user/anaconda3/envs/R/lib   -fpic  -fvisibility-inlines-hidden  -fmessage-length=0 -march=nocona -mtune=haswell -ftree-vectorize -fPIC -fstack-protector-strong -fno-plt -O2 -ffunction-sections -pipe -isystem /home/ec2-user/anaconda3/envs/R/include -fdebug-prefix-map=/home/conda/feedstock_root/build_artifacts/r-base-split_1639563404388/work=/usr/local/src/conda/r-base-4.1.2 -fdebug-prefix-map=/home/ec2-user/anaconda3/envs/R=/usr/local/src/conda-prefix  -c api.cpp -o api.o
[identical compiler invocations for attributes.cpp, barrier.cpp, date.cpp, module.cpp, and rcpp_init.cpp omitted]
x86_64-conda-linux-gnu-c++ -std=gnu++14 -shared -L/home/ec2-user/anaconda3/envs/R/lib/R/lib -Wl,-O2 -Wl,--sort-common -Wl,--as-needed -Wl,-z,relro -Wl,-z,now -Wl,--disable-new-dtags -Wl,--gc-sections -Wl,-rpath,/home/ec2-user/anaconda3/envs/R/lib -Wl,-rpath-link,/home/ec2-user/anaconda3/envs/R/lib -L/home/ec2-user/anaconda3/envs/R/lib -o Rcpp.so api.o attributes.o barrier.o date.o module.o rcpp_init.o -L/home/ec2-user/anaconda3/envs/R/lib/R/lib -lR
installing to /home/ec2-user/anaconda3/envs/R/lib/R/library/00LOCK-Rcpp/00new/Rcpp/libs
** R
** inst
** byte-compile and prepare package for lazy loading
** help
*** installing help indices
** building package indices
** installing vignettes
** testing if installed package can be loaded from temporary location
** checking absolute paths in shared objects and dynamic libraries
** testing if installed package can be loaded from final location
** testing if installed package keeps a record of temporary installation path
* DONE (Rcpp)
* installing *source* package ‘FNN’ ...
** package ‘FNN’ successfully unpacked and MD5 sums checked
** using staged installation
** libs
x86_64-conda-linux-gnu-c++ -std=gnu++14 -I"/home/ec2-user/anaconda3/envs/R/lib/R/include" -DNDEBUG -Iinclude -DUSING_R -DUSING_RPRINT  -DNDEBUG -D_FORTIFY_SOURCE=2 -O2 -isystem /home/ec2-user/anaconda3/envs/R/include -I/home/ec2-user/anaconda3/envs/R/include -Wl,-rpath-link,/home/ec2-user/anaconda3/envs/R/lib   -fpic  -fvisibility-inlines-hidden  -fmessage-length=0 -march=nocona -mtune=haswell -ftree-vectorize -fPIC -fstack-protector-strong -fno-plt -O2 -ffunction-sections -pipe -isystem /home/ec2-user/anaconda3/envs/R/include -fdebug-prefix-map=/home/conda/feedstock_root/build_artifacts/r-base-split_1639563404388/work=/usr/local/src/conda/r-base-4.1.2 -fdebug-prefix-map=/home/ec2-user/anaconda3/envs/R=/usr/local/src/conda-prefix  -c ANN.cpp -o ANN.o
[identical compiler invocations for KNN_ANN.cpp, KNN_correlation_distance.cpp, KNN_cover_test.cpp, KNN_cover_tree.cpp, KNN_mutual_information.cpp, bd_fix_rad_search.cpp, bd_pr_search.cpp, bd_search.cpp, bd_tree.cpp, brute.cpp, init_FNN.c, kd_dump.cpp, kd_fix_rad_search.cpp, kd_pr_search.cpp, kd_search.cpp, kd_split.cpp, kd_tree.cpp, kd_util.cpp, and label_point.cpp omitted; the pasted log ends here]
            x86_64-conda-linux-gnu-c++ -std=gnu++14 -I"/home/ec2-user/anaconda3/envs/R/lib/R/include" -DNDEBUG -Iinclude -DUSING_R -DUSING_RPRINT  -DNDEBUG -D_FORTIFY_SOURCE=2 -O2 -isystem /home/ec2-user/anaconda3/envs/R/include -I/home/ec2-user/anaconda3/envs/R/include -Wl,-rpath-link,/home/ec2-user/anaconda3/envs/R/lib   -fpic  -fvisibility-inlines-hidden  -fmessage-length=0 -march=nocona -mtune=haswell -ftree-vectorize -fPIC -fstack-protector-strong -fno-plt -O2 -ffunction-sections -pipe -isystem /home/ec2-user/anaconda3/envs/R/include -fdebug-prefix-map=/home/conda/feedstock_root/build_artifacts/r-base-split_1639563404388/work=/usr/local/src/conda/r-base-4.1.2 -fdebug-prefix-map=/home/ec2-user/anaconda3/envs/R=/usr/local/src/conda-prefix  -c perf.cpp -o perf.o
            x86_64-conda-linux-gnu-c++ -std=gnu++14 -shared -L/home/ec2-user/anaconda3/envs/R/lib/R/lib -Wl,-O2 -Wl,--sort-common -Wl,--as-needed -Wl,-z,relro -Wl,-z,now -Wl,--disable-new-dtags -Wl,--gc-sections -Wl,-rpath,/home/ec2-user/anaconda3/envs/R/lib -Wl,-rpath-link,/home/ec2-user/anaconda3/envs/R/lib -L/home/ec2-user/anaconda3/envs/R/lib -o FNN.so ANN.o KNN_ANN.o KNN_correlation_distance.o KNN_cover_test.o KNN_cover_tree.o KNN_mutual_information.o bd_fix_rad_search.o bd_pr_search.o bd_search.o bd_tree.o brute.o init_FNN.o kd_dump.o kd_fix_rad_search.o kd_pr_search.o kd_search.o kd_split.o kd_tree.o kd_util.o label_point.o perf.o -L/home/ec2-user/anaconda3/envs/R/lib/R/lib -lR
            rm -f *.o core
            installing to /home/ec2-user/anaconda3/envs/R/lib/R/library/00LOCK-FNN/00new/FNN/libs
            ** R
            ** inst
            ** byte-compile and prepare package for lazy loading
            ** help
            *** installing help indices
            ** building package indices
            ** testing if installed package can be loaded from temporary location
            ** checking absolute paths in shared objects and dynamic libraries
            ** testing if installed package can be loaded from final location
            ** testing if installed package keeps a record of temporary installation path
            * DONE (FNN)
            * installing *source* package ‘RcppEigen’ ...
            ** package ‘RcppEigen’ successfully unpacked and MD5 sums checked
            ** using staged installation
            ** libs
            x86_64-conda-linux-gnu-c++ -std=gnu++14 -I"/home/ec2-user/anaconda3/envs/R/lib/R/include" -DNDEBUG  -I'/home/ec2-user/anaconda3/envs/R/lib/R/library/Rcpp/include' -DNDEBUG -D_FORTIFY_SOURCE=2 -O2 -isystem /home/ec2-user/anaconda3/envs/R/include -I/home/ec2-user/anaconda3/envs/R/include -Wl,-rpath-link,/home/ec2-user/anaconda3/envs/R/lib  -I../inst/include -fpic  -fvisibility-inlines-hidden  -fmessage-length=0 -march=nocona -mtune=haswell -ftree-vectorize -fPIC -fstack-protector-strong -fno-plt -O2 -ffunction-sections -pipe -isystem /home/ec2-user/anaconda3/envs/R/include -fdebug-prefix-map=/home/conda/feedstock_root/build_artifacts/r-base-split_1639563404388/work=/usr/local/src/conda/r-base-4.1.2 -fdebug-prefix-map=/home/ec2-user/anaconda3/envs/R=/usr/local/src/conda-prefix  -c RcppEigen.cpp -o RcppEigen.o
            In file included from ../inst/include/Eigen/Core:397,                 from ../inst/include/Eigen/Dense:1,                 from ../inst/include/RcppEigenForward.h:30,                 from ../inst/include/RcppEigen.h:25,                 from RcppEigen.cpp:22:
            ../inst/include/Eigen/src/Core/arch/SSE/PacketMath.h:60:39: warning: ignoring attributes on template argument '__m128' {aka '__vector(4) float'} [-Wignored-attributes]   60 \| template<> struct is_arithmetic<__m128>  { enum { value = true }; };      \|                                       ^
            ../inst/include/Eigen/src/Core/arch/SSE/PacketMath.h:61:40: warning: ignoring attributes on template argument '__m128i' {aka '__vector(2) long long int'} [-Wignored-attributes]   61 \| template<> struct is_arithmetic<__m128i> { enum { value = true }; };      \|                                        ^
            ../inst/include/Eigen/src/Core/arch/SSE/PacketMath.h:62:40: warning: ignoring attributes on template argument '__m128d' {aka '__vector(2) double'} [-Wignored-attributes]   62 \| template<> struct is_arithmetic<__m128d> { enum { value = true }; };      \|                                        ^
            ../inst/include/Eigen/src/Core/arch/SSE/PacketMath.h:161:43: warning: ignoring attributes on template argument 'Eigen::internal::Packet4f' {aka '__vector(4) float'} [-Wignored-attributes]  161 \| template<> struct unpacket_traits<Packet4f> { typedef float  type; enum {size=4, alignment=Aligned16}; typedef Packet4f half; };      \|                                           ^
            ../inst/include/Eigen/src/Core/arch/SSE/PacketMath.h:162:43: warning: ignoring attributes on template argument 'Eigen::internal::Packet2d' {aka '__vector(2) double'} [-Wignored-attributes]  162 \| template<> struct unpacket_traits<Packet2d> { typedef double type; enum {size=2, alignment=Aligned16}; typedef Packet2d half; };      \|                                           ^
            ../inst/include/Eigen/src/Core/arch/SSE/PacketMath.h:163:43: warning: ignoring attributes on template argument 'Eigen::internal::Packet4i' {aka '__vector(2) long long int'} [-Wignored-attributes]  163 \| template<> struct unpacket_traits<Packet4i> { typedef int    type; enum {size=4, alignment=Aligned16}; typedef Packet4i half; };      \|                                           ^
            ../inst/include/Eigen/src/Core/arch/SSE/PacketMath.h:718:35: warning: ignoring attributes on template argument 'Eigen::internal::Packet4f' {aka '__vector(4) float'} [-Wignored-attributes]  718 \| struct palign_impl<Offset,Packet4f>      \|                                   ^
            ../inst/include/Eigen/src/Core/arch/SSE/PacketMath.h:741:35: warning: ignoring attributes on template argument 'Eigen::internal::Packet4i' {aka '__vector(2) long long int'} [-Wignored-attributes]  741 \| struct palign_impl<Offset,Packet4i>      \|                                   ^
            ../inst/include/Eigen/src/Core/arch/SSE/PacketMath.h:764:35: warning: ignoring attributes on template argument 'Eigen::internal::Packet2d' {aka '__vector(2) double'} [-Wignored-attributes]  764 \| struct palign_impl<Offset,Packet2d>      \|                                   ^
            ../inst/include/Eigen/src/Core/arch/SSE/PacketMath.h:778:34: warning: ignoring attributes on template argument 'Eigen::internal::Packet4f' {aka '__vector(4) float'} [-Wignored-attributes]  778 \| ptranspose(PacketBlock<Packet4f,4>& kernel) {      \|                                  ^
            ../inst/include/Eigen/src/Core/arch/SSE/PacketMath.h:783:34: warning: ignoring attributes on template argument 'Eigen::internal::Packet2d' {aka '__vector(2) double'} [-Wignored-attributes]  783 \| ptranspose(PacketBlock<Packet2d,2>& kernel) {      \|                                  ^
            ../inst/include/Eigen/src/Core/arch/SSE/PacketMath.h:790:34: warning: ignoring attributes on template argument 'Eigen::internal::Packet4i' {aka '__vector(2) long long int'} [-Wignored-attributes]  790 \| ptranspose(PacketBlock<Packet4i,4>& kernel) {      \|                                  ^
            In file included from ../inst/include/Eigen/Core:377,                 from ../inst/include/Eigen/Dense:1,                 from ../inst/include/RcppEigenForward.h:30,                 from ../inst/include/RcppEigen.h:25,                 from RcppEigen.cpp:22:
            ../inst/include/Eigen/src/Core/arch/Default/ConjHelper.h:15:70: warning: ignoring attributes on template argument 'Eigen::internal::Packet4f' {aka '__vector(4) float'} [-Wignored-attributes]   15 \|   template<> struct conj_helper<PACKET_REAL, PACKET_CPLX, false,false> {                                          \      \|                                                                      ^
            ../inst/include/Eigen/src/Core/arch/SSE/Complex.h:232:1: note: in expansion of macro 'EIGEN_MAKE_CONJ_HELPER_CPLX_REAL'  232 \| EIGEN_MAKE_CONJ_HELPER_CPLX_REAL(Packet2cf,Packet4f)      \| ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
            ../inst/include/Eigen/src/Core/arch/Default/ConjHelper.h:22:70: warning: ignoring attributes on template argument 'Eigen::internal::Packet4f' {aka '__vector(4) float'} [-Wignored-attributes]   22 \|   template<> struct conj_helper<PACKET_CPLX, PACKET_REAL, false,false> {                                          \      \|                                                                      ^
            ../inst/include/Eigen/src/Core/arch/SSE/Complex.h:232:1: note: in expansion of macro 'EIGEN_MAKE_CONJ_HELPER_CPLX_REAL'  232 \| EIGEN_MAKE_CONJ_HELPER_CPLX_REAL(Packet2cf,Packet4f)      \| ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
            ../inst/include/Eigen/src/Core/arch/Default/ConjHelper.h:15:70: warning: ignoring attributes on template argument 'Eigen::internal::Packet2d' {aka '__vector(2) double'} [-Wignored-attributes]   15 \|   template<> struct conj_helper<PACKET_REAL, PACKET_CPLX, false,false> {                                          \      \|                                                                      ^
            ../inst/include/Eigen/src/Core/arch/SSE/Complex.h:417:1: note: in expansion of macro 'EIGEN_MAKE_CONJ_HELPER_CPLX_REAL'  417 \| EIGEN_MAKE_CONJ_HELPER_CPLX_REAL(Packet1cd,Packet2d)      \| ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
            ../inst/include/Eigen/src/Core/arch/Default/ConjHelper.h:22:70: warning: ignoring attributes on template argument 'Eigen::internal::Packet2d' {aka '__vector(2) double'} [-Wignored-attributes]   22 \|   template<> struct conj_helper<PACKET_CPLX, PACKET_REAL, false,false> {                                          \      \|                                                                      ^
            ../inst/include/Eigen/src/Core/arch/SSE/Complex.h:417:1: note: in expansion of macro 'EIGEN_MAKE_CONJ_HELPER_CPLX_REAL'  417 \| EIGEN_MAKE_CONJ_HELPER_CPLX_REAL(Packet1cd,Packet2d)      \| ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
            In file included from ../inst/include/Eigen/Core:370,                 from ../inst/include/Eigen/Dense:1,                 from ../inst/include/RcppEigenForward.h:30,                 from ../inst/include/RcppEigen.h:25,                 from RcppEigen.cpp:22:
            ../inst/include/Eigen/src/Core/util/XprHelper.h: In instantiation of 'struct Eigen::internal::find_best_packet<float, 4>':
            ../inst/include/Eigen/src/Core/Matrix.h:22:57:   required from 'struct Eigen::internal::traits<Eigen::Matrix<float, 4, 1> >'
            ../inst/include/Eigen/src/Geometry/Quaternion.h:242:7:   required from 'struct Eigen::internal::traits<Eigen::Quaternion<float> >'
            ../inst/include/Eigen/src/Geometry/arch/Geometry_SSE.h:24:46:   required from here
            ../inst/include/Eigen/src/Core/util/XprHelper.h:187:44: warning: ignoring attributes on template argument 'Eigen::internal::packet_traits<float>::type' {aka '__vector(4) float'} [-Wignored-attributes]  187 \|          bool Stop = Size==Dynamic \|\| (Size%unpacket_traits<PacketType>::size)==0 \|\| is_same<PacketType,typename unpacket_traits<PacketType>::half>::value>      \|                                       ~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
            ../inst/include/Eigen/src/Core/util/XprHelper.h:187:83: warning: ignoring attributes on template argument 'Eigen::internal::packet_traits<float>::type' {aka '__vector(4) float'} [-Wignored-attributes]  187 \|          bool Stop = Size==Dynamic \|\| (Size%unpacket_traits<PacketType>::size)==0 \|\| is_same<PacketType,typename unpacket_traits<PacketType>::half>::value>      \|                      ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
            ../inst/include/Eigen/src/Core/util/XprHelper.h:187:83: warning: ignoring attributes on template argument 'Eigen::internal::packet_traits<float>::type' {aka '__vector(4) float'} [-Wignored-attributes]
            ../inst/include/Eigen/src/Core/util/XprHelper.h:187:83: warning: ignoring attributes on template argument 'Eigen::internal::unpacket_traits<__vector(4) float>::half' {aka '__vector(4) float'} [-Wignored-attributes]
            ../inst/include/Eigen/src/Core/util/XprHelper.h:205:88: warning: ignoring attributes on template argument 'Eigen::internal::packet_traits<float>::type' {aka '__vector(4) float'} [-Wignored-attributes]  205 \|   typedef typename find_best_packet_helper<Size,typename packet_traits<T>::type>::type type;      \|                                                                                        ^~~~
            In file included from ../inst/include/Eigen/Core:439,                 from ../inst/include/Eigen/Dense:1,                 from ../inst/include/RcppEigenForward.h:30,                 from ../inst/include/RcppEigen.h:25,                 from RcppEigen.cpp:22:
            ../inst/include/Eigen/src/Core/DenseCoeffsBase.h: In instantiation of 'class Eigen::DenseCoeffsBase<Eigen::Matrix<float, 4, 1>, 0>':
            ../inst/include/Eigen/src/Core/DenseCoeffsBase.h:300:7:   required from 'class Eigen::DenseCoeffsBase<Eigen::Matrix<float, 4, 1>, 1>'
            ../inst/include/Eigen/src/Core/DenseCoeffsBase.h:551:7:   required from 'class Eigen::DenseCoeffsBase<Eigen::Matrix<float, 4, 1>, 3>'
            ../inst/include/Eigen/src/Core/DenseBase.h:41:34:   required from 'class Eigen::DenseBase<Eigen::Matrix<float, 4, 1> >'
            ../inst/include/Eigen/src/Core/MatrixBase.h:48:34:   required from 'class Eigen::MatrixBase<Eigen::Matrix<float, 4, 1> >'
            ../inst/include/Eigen/src/Core/PlainObjectBase.h:98:7:   required from 'class Eigen::PlainObjectBase<Eigen::Matrix<float, 4, 1> >'
            ../inst/include/Eigen/src/Core/Matrix.h:178:7:   required from 'class Eigen::Matrix<float, 4, 1>'
            ../inst/include/Eigen/src/Geometry/Quaternion.h:46:50:   required from 'class Eigen::QuaternionBase<Eigen::Quaternion<float> >'
            ../inst/include/Eigen/src/Geometry/Quaternion.h:250:7:   required from 'class Eigen::Quaternion<float>'
            ../inst/include/Eigen/src/Geometry/arch/Geometry_SSE.h:27:3:   required from here
            
            <!--EndFragment-->
            </body>
            </html>

`

auto-stop-idle/autostop.py ignores open terminal sessions

Bug:
When JupyterLab is open but terminal sessions are used instead of notebook sessions, the instance may be shut down incorrectly despite being actively used.

Debug:

  1. autostop.py uses the sessions API: https://github.com/aws-samples/amazon-sagemaker-notebook-instance-lifecycle-config-samples/blob/master/scripts/auto-stop-idle/autostop.py#L88
  2. Jupyter exposes information about notebooks and terminals under separate APIs (/api/sessions and /api/terminals); autostop.py only queries the former.
  3. Activity in a terminal therefore never gets picked up by autostop.py, which may result in a premature shutdown.

Caveats:

  • Newer Jupyter versions expose a last-activity timestamp in the terminals API
  • The Jupyter version currently available on AWS does not expose this information

From my manual check 5 minutes ago:

sh-4.2$ curl -k https://localhost:8443/api/terminals/1
{"name": "1"}sh-4.2$

Python kernels are not available after running the lifecycle config

I am using the following lifecycle config:
ENVIRONMENT=python3
NOTEBOOK_FILE=/home/ec2-user/SageMaker/data_prep.ipynb

source /home/ec2-user/anaconda3/bin/activate "$ENVIRONMENT"

nohup jupyter nbconvert "$NOTEBOOK_FILE" --ExecutePreprocessor.kernel_name=python --ExecutePreprocessor.timeout=1500 --execute &

source /home/ec2-user/anaconda3/bin/deactivate
It runs smoothly; however, when I open the notebook only the R and Sparkmagic kernels are available, so I cannot edit my data_prep.ipynb. It started about a week ago. Has anyone experienced similar behaviour?
Cheers, Mark

execute-notebook-on-startup [Script doesn't work]

Hi,
I tried to execute the script given at https://github.com/aws-samples/amazon-sagemaker-notebook-instance-lifecycle-config-samples/blob/master/scripts/execute-notebook-on-startup/on-start.sh

I found two errors while running it:
  1. environment not found
  2. the jupyter nbconvert command didn't run as expected due to a missing argument

Attaching the correct script.



#!/bin/bash

set -e

# OVERVIEW
# This script executes an existing Notebook file on the instance during start using nbconvert(https://github.com/jupyter/nbconvert)

# PARAMETERS

ENVIRONMENT=python3
NOTEBOOK_FILE=/home/ec2-user/SageMaker/MyNotebook.ipynb

source /home/ec2-user/anaconda3/bin/activate "$ENVIRONMENT"

jupyter nbconvert  --to notebook --ExecutePreprocessor.kernel_name=python3 --execute "$NOTEBOOK_FILE"

source /home/ec2-user/anaconda3/bin/deactivate

persistent-conda-ebs/on-start.sh does not add multiple conda environments

When I create multiple conda environments in the custom-miniconda folder the script only correctly adds one as a Jupyter kernel.

I traced the problem to line 23. After the first iteration of the for loop the log prints "Could not find conda environment: [my environment name]"

Changing line 23 from source activate "$BASENAME" to conda activate $BASENAME solves the problem for me.

Add IMDS service IP to NO_PROXY list in proxy-for-jupyter

The IP address 169.254.169.254 is a link-local address that's assigned to EC2 Instance Metadata Service (IMDS) interface. If 169.254.169.254 is excluded from the NO_PROXY list, then services that rely on IMDS will fail. For example, Boto3 will be unable to obtain temporary credentials for a role assigned to a notebook instance, resulting in the ambiguous error "Unable to locate credentials".

To resolve this issue, add 169.254.169.254 to the NO_PROXY variables in these two locations:

echo "export no_proxy='s3.amazonaws.com,127.0.0.1,localhost'" | tee -a /home/ec2-user/.profile >/dev/null

echo "os.environ['NO_PROXY']=\"s3.amazonaws.com,127.0.0.1,localhost\"" | tee -a /home/ec2-user/.ipython/profile_default/startup/00-startup.py >/dev/null
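For illustration, here are the two lines with 169.254.169.254 appended. This sketch writes to scratch files in the working directory (stand-ins for the real /home/ec2-user paths) so the effect is easy to inspect:

```shell
#!/bin/bash
# Stand-ins for /home/ec2-user/.profile and
# /home/ec2-user/.ipython/profile_default/startup/00-startup.py
PROFILE=./profile-snippet
STARTUP=./00-startup-snippet.py

# Same two lines as the sample, with the IMDS address added to the list
echo "export no_proxy='s3.amazonaws.com,127.0.0.1,localhost,169.254.169.254'" | tee -a "$PROFILE" >/dev/null
echo "os.environ['NO_PROXY']=\"s3.amazonaws.com,127.0.0.1,localhost,169.254.169.254\"" | tee -a "$STARTUP" >/dev/null
```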

Additional information:
As described in the EC2 docs and this Wikipedia article, link-local IPs are only valid and accessible from the instance itself, so it makes no sense to forward such traffic to a proxy server. For security reasons, proxies are typically configured to drop or reject traffic destined for a link-local address, which may further confuse a person trying to troubleshoot this issue.

tensorflow_p36 environment running notebook fails through configuration

I am trying to run a SageMaker notebook that essentially runs inference for a Keras language model.
The notebook works well standalone but doesn't work with the notebook instance lifecycle configuration.

The notebook should run in the tensorflow_p36 virtual environment and the conda_tensorflow_p36 kernel.

My file looks like this:

set -e

NOTEBOOK_FILE="/home/ec2-user/SageMaker/notebook.ipynb"
ENVIRONMENT="tensorflow_p36"
AUTO_STOP_FILE="/home/ec2-user/SageMaker/auto-stop.py"

source /home/ec2-user/anaconda3/bin/activate "$ENVIRONMENT"

nohup jupyter nbconvert --ExecutePreprocessor.kernel_name=conda_tensorflow_p36 --execute "$NOTEBOOK_FILE" &

source /home/ec2-user/anaconda3/bin/deactivate
IDLE_TIME=60

echo "Fetching the autostop script"
wget https://raw.githubusercontent.com/aws-samples/amazon-sagemaker-notebook-instance-lifecycle-config-samples/master/scripts/auto-stop-idle/autostop.py

echo "Starting the SageMaker autostop script in cron"
(crontab -l 2>/dev/null; echo "*/1 * * * * /usr/bin/python $PWD/autostop.py --time $IDLE_TIME --ignore-connections") | crontab -

The error message from CloudWatch logs is:
cp: cannot stat '/root/.keras/keras_tensorflow.json': No such file or directory

Any fix?

persisting environment only supports Python 3.7.11?

I followed the steps by first running on-create.sh and then setting up on-start.sh in the lifecycle configuration.

I changed these lines in on-create.sh:
wget https://repo.anaconda.com/miniconda/Miniconda3-4.10.3-Linux-x86_64.sh -O "$WORKING_DIR/miniconda.sh"
KERNEL_NAME="custom_python"
PYTHON="3.6"

After the run completes, when we run the notebook using the custom_python environment, the Python version is still 3.7.11. Any idea what is causing this?

notebook-history-s3 cronjob not saving data to S3

The on-start.sh script is downloading a file named "notebook-history-s3.py" using the following command:
wget https://raw.githubusercontent.com/aws-samples/amazon-sagemaker-notebook-instance-lifecycle-config-samples/master/scripts/notebook-history-s3/notebook-history-s3.py

But, when it creates a cronjob, it tries to execute a python file named "notebook_history_s3.py" which doesn't exist.
(crontab -l 2>/dev/null; echo "0 * * * * /usr/bin/python3 $PWD/notebook_history_s3.py") | crontab -

Suggested change to fix the issue:
(crontab -l 2>/dev/null; echo "0 * * * * /usr/bin/python3 $PWD/notebook-history-s3.py") | crontab -

update use-persisted-conda

AFAICT https://github.com/aws-samples/amazon-sagemaker-notebook-instance-lifecycle-config-samples/tree/master/scripts/persistent-conda-ebs is now included as a dropdown option when creating a notebook instance

start notebook script

#!/usr/bin/env bash

set -e

PERSISTED_CONDA_DIR="${PERSISTED_CONDA_DIR:-/home/ec2-user/SageMaker/.persisted_conda}"

echo "Setting up persisted conda environments..."
mkdir -p ${PERSISTED_CONDA_DIR} && chown ec2-user:ec2-user ${PERSISTED_CONDA_DIR}

envdirs_clean=$(grep "envs_dirs:" /home/ec2-user/.condarc || echo "clean")
if [[ "${envdirs_clean}" != "clean" ]]; then
    echo 'envs_dirs config already exists in /home/ec2-user/.condarc. No idea what to do. Exiting!'
    exit 1
fi

echo "Adding ${PERSISTED_CONDA_DIR} to list of conda env locations"
cat << EOF >> /home/ec2-user/.condarc
envs_dirs:
  - ${PERSISTED_CONDA_DIR}
  - /home/ec2-user/anaconda3/envs
EOF

set-env-variable does not evaluate YOUR_ENV_VARIABLE_NAME properly

I have tried the set-env-variable script, but it returns an empty string for YOUR_ENV_VARIABLE_NAME when evaluating this line:

DOES NOT WORK
YOUR_ENV_VARIABLE_NAME=tagkey
TAG=$(aws sagemaker list-tags --resource-arn $NOTEBOOK_ARN | jq .'Tags[] | select(.Key == "$YOUR_ENV_VARIABLE_NAME").Value' --raw-output)

It seems that $YOUR_ENV_VARIABLE_NAME is not substituted with its value because it sits inside the single-quoted part of the jq program, where the shell performs no expansion.

If I replace YOUR_ENV_VARIABLE_NAME with the actual string, it works, but I would prefer not to hardcode the key in the script.

WORKS
TAG=$(aws sagemaker list-tags --resource-arn $NOTEBOOK_ARN | jq .'Tags[] | select(.Key == "tagkey").Value' --raw-output)
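One way around the hardcoding is jq's --arg option, which injects the shell value as a jq variable so nothing needs to be expanded inside the single-quoted program. A sketch, using a canned JSON document in place of the aws sagemaker list-tags call:

```shell
#!/bin/bash
YOUR_ENV_VARIABLE_NAME=tagkey
# Stand-in for: aws sagemaker list-tags --resource-arn $NOTEBOOK_ARN
TAGS_JSON='{"Tags":[{"Key":"tagkey","Value":"tagvalue"},{"Key":"other","Value":"x"}]}'

# --arg makes $key a jq variable; the shell never has to expand
# anything inside the single-quoted filter
TAG=$(echo "$TAGS_JSON" | jq --raw-output --arg key "$YOUR_ENV_VARIABLE_NAME" \
    '.Tags[] | select(.Key == $key).Value')
echo "$TAG"
```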

[Request] sample for installing packages to Python3+ environments only

Many packages today don't provide backward compatibility with Python 2, so samples like install-pip-package-all-environments are useful but can require a non-trivial workaround to skip the Py2 environments.

I think I have a working example of this already (with smdebug and sagemaker-experiments), so will raise a PR to accompany this issue.

Auto Stop idle script giving error

Hi,

I have tried using the auto-stop-idle script; however, after following all the steps it gives me the error below:

Error:
/bin/bash^M: bad interpreter: No such file or directory
/bin/bash: /tmp/OnStart_2021-12-21-07-55j53iytbj: /bin/bash^M: bad interpreter: No such file or directory

Even after removing /bin/bash it gives another error:

/tmp/OnStart_2021-12-21-08-36jg8sy0tu: line 1: set: -: invalid option

set: usage: set [-abefhkmnptuvxBCHP] [-o option-name] [--] [arg ...]
/tmp/OnStart_2021-12-21-08-36jg8sy0tu: line 2:

\r': command not found
/tmp/OnStart_2021-12-21-08-36jg8sy0tu: line 13: \r': command not found
/tmp/OnStart_2021-12-21-08-36jg8sy0tu: line 16: \r': command not found
Fetching the autostop script
--2021-12-21 08:36:11-- https://raw.githubusercontent.com/aws-samples/amazon-sagemaker-notebook-instance-lifecycle-config-samples/master/scripts/auto-stop-idle/autostop.py%0D
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.109.133, 185.199.110.133, 185.199.111.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.109.133|:443... connected.
HTTP request sent, awaiting response... 404 Not Found
2021-12-21 08:36:11 ERROR 404: Not Found.
/tmp/OnStart_2021-12-21-08-36jg8sy0tu: line 19: \r': command not found
Starting the SageMaker autostop script in cron
/tmp/OnStart_2021-12-21-08-36jg8sy0tu: line 21: \r': command not found
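The ^M in "/bin/bash^M: bad interpreter" and the repeated \r': command not found lines indicate the script was saved with Windows (CRLF) line endings. Stripping the carriage returns before pasting the script into the lifecycle configuration should fix it; a sketch on a scratch file:

```shell
#!/bin/bash
# Reproduce the problem: a script saved with CRLF line endings
printf '#!/bin/bash\r\necho hello\r\n' > onstart-crlf.sh

# Strip the trailing \r from every line (dos2unix does the same)
sed -i 's/\r$//' onstart-crlf.sh

bash onstart-crlf.sh   # now runs cleanly
```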

lifecycle config R libraries from both CRAN and Bioconductor

Any suggestion/tips about how to have Bioconductor libraries pre-installed in SageMaker?
The lifecycle configuration script for pre-installing R libraries only works for me with CRAN R libraries:

#!/bin/bash
set -e
nohup sudo -b -u ec2-user -i <<'EOF'
conda install "r-ggplot2" --name "R" --yes
conda install "r-rentrez" --name "R" --yes
EOF

But when I add R libraries from Bioconductor, they are not installed. For example, the next script successfully installs rentrez and ggplot2 (CRAN libraries) but not oligo or annotate (Bioconductor libraries):

#!/bin/bash
set -e
nohup sudo -b -u ec2-user -i <<'EOF'
conda install "r-rentrez" --name "R" --yes
conda install "r-oligo" --name "R" --yes
conda install "r-annotate" --name "R" --yes
conda install "r-ggplot2" --name "R" --yes
EOF

Any idea how to make it work?

notebook-instance can't find persistent environment

I followed the sample code in persistent-conda-ebs for installing an external package, but the startup script errors out with: line 12: /home/ec2-user/SageMaker/custom-miniconda/miniconda/bin/activate: No such file or directory

#!/bin/bash

set -e
# OVERVIEW
# This script executes an existing Notebook file on the instance during start using nbconvert(https://github.com/jupyter/nbconvert)

# PARAMETERS

ENVIRONMENT=custom_python
NOTEBOOK_FILE="/home/ec2-user/SageMaker/forecast-notebook.ipynb"

source /home/ec2-user/SageMaker/custom-miniconda/miniconda/bin/activate "$ENVIRONMENT"

jupyter nbconvert "$NOTEBOOK_FILE" --ExecutePreprocessor.kernel_name=custom_python --execute

source /home/ec2-user/anaconda3/bin/deactivate

why use pip

Why use pip in a conda environment rather than conda install?

Another problem: if you use conda install, it may take more than 5 minutes. Can nohup be used to solve the time-ran-out issue? If so, how should we do that?
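The comment in the install-pip-package-all-environments sample itself suggests nohup for installs that may exceed the 5-minute limit. A hedged sketch of the pattern: the slow work runs in a detached background shell while the lifecycle script returns immediately (a sleep plus a marker file stands in for the real conda install command):

```shell
#!/bin/bash
set -e
# Background the slow work so the lifecycle script finishes in time.
# The quoted command is a placeholder for something like:
#   conda install --name python3 --yes <package>
nohup bash -c 'sleep 1; touch /tmp/conda-install-done.marker' \
    >> /tmp/conda-install.log 2>&1 &

echo "lifecycle script exits now; the install keeps running in the background"
```

Note the trade-off: the notebook becomes available before the package is actually installed, so the environment may be incomplete for a while.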

How to add a custom private SSH key?

Hello, this isn't regarding anything existing, but I was wondering if anyone knows how to add a custom private SSH key for Git interactions. I tried to do the following via lifecycle configs; however, it keeps saying the authentication agent is not found. I think it has something to do with the shell the lifecycle scripts run in?

echo "Activating ssh agent"
eval $(ssh-agent -s)
ssh-add "${USER_HOME}/.ssh/sagemaker-bitbucket"

I know there's a Git Repositories service as part of notebooks but I think this way offers more control and also it would be easier to work with private repos. Thank you!
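As an alternative to an ssh-agent (which does not outlive the lifecycle shell), ssh can be pointed at the key directly through a config entry, so no agent is needed at git time. A sketch under assumptions: the key name sagemaker-bitbucket comes from the snippet above, while the bitbucket.org Host block and the scratch directory are illustrative:

```shell
#!/bin/bash
SSH_DIR=./ssh-demo   # stand-in for ${USER_HOME}/.ssh
mkdir -p "$SSH_DIR"

# Tell ssh which identity to use for this host; no agent required.
cat >> "$SSH_DIR/config" <<'EOF'
Host bitbucket.org
    User git
    IdentityFile ~/.ssh/sagemaker-bitbucket
EOF
chmod 600 "$SSH_DIR/config"
```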

Extension install scripts fail with empty variables

The sudo -u ec2-user -i <<EOF pattern used in the install-lab-extension sample fails because the outer shell expands the variable $EXTENSION_NAME (to an empty string) before the heredoc is passed to the subshell.

As noted here, we need to escape $ when using this pattern.

I have a PR ready to fix this for both the install-lab-extension and install-nb-extension samples - will submit after recording the issue.

...but I think this might affect other samples too?
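The expansion problem is easy to reproduce without sudo: with an unquoted heredoc delimiter the outer shell substitutes $EXTENSION_NAME (unset there, so blank) before the inner shell runs; quoting the delimiter, like escaping the $, keeps the variable intact. A minimal demonstration (@jupyterlab/toc is just an illustrative extension name):

```shell
#!/bin/bash
unset EXTENSION_NAME   # not set in the outer shell, as in the sample

# Unquoted delimiter: the OUTER shell expands $EXTENSION_NAME (blank)
# before the inner bash ever runs
broken=$(bash <<EOF
EXTENSION_NAME=@jupyterlab/toc
echo "installing:$EXTENSION_NAME"
EOF
)

# Quoted delimiter: the heredoc is passed through literally
fixed=$(bash <<'EOF'
EXTENSION_NAME=@jupyterlab/toc
echo "installing:$EXTENSION_NAME"
EOF
)

echo "broken -> $broken"
echo "fixed  -> $fixed"
```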

Not able to launch a default kernel

Hi All,

I'm programmatically creating the SageMaker notebook instance and want any new notebook to open by default with a python3 kernel. However, when I open a notebook, "Could not find a kernel matching Python 3" occurs, giving a prompt to choose from the available kernels.

I searched online and tried out making additions to the jupyter_notebook_config.py in lifecycle scripts inside /home/ec2-user/.jupyter as

c.MultiKernelManager.default_kernel_name = 'python3', and also tried
c.MultiKernelManager.default_kernel_name = 'conda-env-python3-py'

None of which seem to work.
How can I solve this, i.e. load the notebook directly with the said kernel instead of showing a prompt? I also noticed that this happens particularly with newly created instances; an instance that is stopped and restarted doesn't face this. Peculiar?

Thanks

restart jupyter-server : command not found

The on-start.sh script attempts to restart the Jupyter server, but I got an error that the restart command is not found. How can I fix that?

2021-12-22T15:06:26.697-08:00  Restarting the Jupyter server..
2021-12-22T15:06:32.236-08:00  /tmp/OnStart_2021-12-22-23-068l8lm3x9: line 24: restart: command not found
/tmp/OnStart_2021-12-22-23-068l8lm3x9: line 24: restart: command not found

Source of the restart command:
https://github.com/aws-samples/amazon-sagemaker-notebook-instance-lifecycle-config-samples/blob/master/scripts/persistent-conda-ebs/on-start.sh

Autostop Cron job is recognized, but not stopping the instance.

I have my notebook set up to run on start, and it runs my code fine. However, it does not stop the instance when the notebook is idle. For example, the code finished running at 9 PM last night and the system recognized it. The auto-stop script should shut the instance off within 1 hour of idle; it has been ~14 hours and the instance is still "InService".
Does anyone have a solution to this problem?

More Examples

Hi,
Can you send an example of creating a new environment and activating it?
I am getting an "invalid choice" error while doing the following in the script:

set -e
sudo -u ec2-user -i <<'EOF'
cd /home/ec2-user/SageMaker
conda create -n rele python=3.6
source activate rele
EOF
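A sketch of a variant that may work non-interactively. Untested assumption: conda create without --yes waits for a confirmation prompt, which can make a lifecycle script appear to fail; registering the env with ipykernel is an optional extra step that makes it selectable in Jupyter.

```shell
#!/bin/bash
set -e
sudo -u ec2-user -i <<'EOF'
cd /home/ec2-user/SageMaker
conda create --yes -n rele python=3.6   # --yes skips the interactive prompt
source activate rele
# optional: expose the new env as a notebook kernel
pip install ipykernel
python -m ipykernel install --user --name rele
EOF
```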

Sample script doesn't work

I get the following logs:

15:25:59
/bin/bash: /tmp/OnStart_2019-07-25-15-25eklvadz0: /bin/bash^M: bad interpreter: No such file or directory

15:41:03
/bin/bash: /tmp/OnStart_2019-07-25-15-41dneui_mc: /bin/bash^M: bad interpreter: No such file or directory

This is just a pure copy-paste of your sample script:


#!/bin/bash

set -e

# OVERVIEW
# This script installs a single pip package in all SageMaker conda environments, apart from the JupyterSystemEnv which is a 
# system environment reserved for Jupyter.
# Note this may timeout if the package installations in all environments take longer than 5 mins, consider using "nohup" to run this 
# as a background process in that case.

sudo -u ec2-user -i <<'EOF'

# PARAMETERS
PACKAGE=scipy

# Note that "base" is special environment name, include it there as well.
for env in base /home/ec2-user/anaconda3/envs/*; do
    source /home/ec2-user/anaconda3/bin/activate $(basename "$env")

    if [ $env = 'JupyterSystemEnv' ]; then
      continue
    fi

    pip install --upgrade "$PACKAGE"

    source /home/ec2-user/anaconda3/bin/deactivate
done

EOF
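The "/bin/bash^M: bad interpreter" message in the logs usually means the script text contains Windows (CRLF) line endings rather than a problem with the script itself; converting to Unix (LF) endings before pasting it into the lifecycle configuration typically clears it. A minimal, runnable illustration:

```shell
# "^M" is a carriage return: bash sees the shebang as "/bin/bash\r".
# Simulate a CRLF-damaged script, then strip the trailing \r in place.
FILE=$(mktemp)
printf '#!/bin/bash\r\necho hi\r\n' > "$FILE"
sed -i 's/\r$//' "$FILE"
bash "$FILE"    # now runs cleanly and prints: hi
rm -f "$FILE"
```

dos2unix, where installed, does the same conversion in one command.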

Increase sagemaker notebook instance session duration

I have a large training job; I am training with Keras on an ml.p2.xlarge instance using the conda_amazonei_tensorflow_p36 notebook. I've learned that the Jupyter notebook session lasts for 12 hours, but my training job will take longer than that. Can I increase the duration? If so, how?
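If the 12-hour limit being hit is the browser/presigned-URL session rather than the instance itself, one workaround is to run the job detached from the browser session so it survives the session expiring. A hedged sketch (env name and notebook path are assumptions):

```shell
# execute the training notebook headless; nohup + & detach it from the
# terminal/browser session so it keeps running after the session expires
source /home/ec2-user/anaconda3/bin/activate amazonei_tensorflow_p36
nohup jupyter nbconvert --to notebook --execute \
    --ExecutePreprocessor.timeout=-1 \
    "/home/ec2-user/SageMaker/train.ipynb" \
    > /home/ec2-user/SageMaker/train.log 2>&1 &
```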

persistent-conda-ebs doesn't allow installing more libraries?

persistent-conda-ebs:

I successfully created a new env on my notebook instance called custom_persistent_python using the OnCreate and OnStart scripts from /persistent-conda-ebs. I've installed some of the common libraries via the OnCreate script, but what happens if I need a new one?
I've tried:
I've tried:

  • Installing it in a shell (after source activate /home/ec2-user/SageMaker/custom-miniconda/miniconda/envs/custom_persistent_python).
  • Via a Notebook (!command) in the custom env.

In both cases, the library seems to install OK, but it can't be used inside the notebook (No module named xxxx), not even after restarting the kernel.
I've checked $CONDA_DEFAULT_ENV inside the notebook and it gives JupyterSystemEnv. Does this mean I'm not inside my custom environment?
How to install new libraries, not included in the OnCreate script, that are also persistent in my new env?
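Seeing JupyterSystemEnv in $CONDA_DEFAULT_ENV from a !command is expected: shell-outs run in the Jupyter server's environment, not the kernel's. A hedged sketch for installing into the persistent env by full prefix (paths follow the persistent-conda-ebs sample; numpy is a placeholder package):

```shell
# installing by --prefix targets the persistent env directly, regardless
# of which env the calling shell happens to have active
/home/ec2-user/SageMaker/custom-miniconda/miniconda/bin/conda install --yes \
    --prefix /home/ec2-user/SageMaker/custom-miniconda/miniconda/envs/custom_persistent_python \
    numpy   # numpy stands in for whatever library you need
```

From a notebook cell, the %pip and %conda magics (as opposed to !pip / !conda) are also documented to target the running kernel's environment.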

LifeCycle Configurations for R kernel: install R packages

Good afternoon
I have a question regarding the problem posted here

The standard R kernel when making a SageMaker notebook instance from Jupyter does not have some packages I would like installed. Manually installing them (R: install.packages("rjson")) takes a very long time. Some of the packages I want to install are 'rstan', 'survival', 'tidyverse'.

I would like to create a Lifecycle configurations script that automatically installs these packages when creating a notebook instance.

My current configurations file is:
#!/bin/bash
sudo -u ec2-user -i <<'EOF'
conda install --yes --name R --channel r r-essentials=3.6.0
EOF
After creating the notebook instance and starting a new notebook (R kernel), I run
("rjson" %in% installed.packages())
("r-rjson" %in% installed.packages())
to test whether, at the very least, rjson is installed. However, it is not.

Any help in getting my configuration script working would be appreciated!

Edit: I want functionality like this
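A hedged sketch of a lifecycle script that may speed this up, assuming the desired packages exist as prebuilt r-* builds on the r / conda-forge channels (conda binaries avoid the slow from-source compilation that install.packages() performs):

```shell
#!/bin/bash
sudo -u ec2-user -i <<'EOF'
# install prebuilt R packages into the "R" conda environment; nohup keeps
# the install going if it outlives the lifecycle-config time window
nohup conda install --yes --name R --channel r --channel conda-forge \
    r-rjson r-rstan r-survival r-tidyverse &
EOF
```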

Nothing happened when using SageMaker Lifecycle configuration to execute a jupyter notebook on start

I want to set up some automatic schedule for running my SageMaker Notebook.
Currently I found link like this:
https://towardsdatascience.com/automating-aws-sagemaker-notebooks-2dec62bc2c84

I followed the steps to set up the Lambda, CloudWatch, and the lifecycle configuration.
Across different experiments, sometimes the on_start lifecycle configuration can execute the Jupyter notebook (in the notebook I just install some packages, load them, and save the loading status to an S3 bucket). However, it failed because it couldn't stop the notebook.

Then I added permission to my IAM role for SageMaker autostop. Now the notebook instance can be turned on and off, but I don't see anything uploaded to my S3 bucket any more. I am wondering if on_start started the auto-stop too early, before the notebook finished its steps?

Below is my script for the current lifecycle configuration

set -e

ENVIRONMENT=python3
NOTEBOOK_FILE="/home/ec2-user/SageMaker/Test Notebook.ipynb"
AUTO_STOP_FILE="/home/ec2-user/SageMaker/auto-stop.py"

source /home/ec2-user/anaconda3/bin/activate "$ENVIRONMENT"

nohup jupyter nbconvert --ExecutePreprocessor.timeout=-1 --ExecutePreprocessor.kernel_name=python3 --execute "$NOTEBOOK_FILE" &

echo "Finishing running the jupyter notebook"

source /home/ec2-user/anaconda3/bin/deactivate

# PARAMETERS
IDLE_TIME=60  # 1 minute

echo "Fetching the autostop script"
wget -O autostop.py https://raw.githubusercontent.com/mariokostelac/sagemaker-setup/master/scripts/auto-stop-idle/autostop.py

echo "Starting the SageMaker autostop script in cron"
(crontab -l 2>/dev/null; echo "*/1 * * * * /bin/bash -c '/usr/bin/python3 $DIR/autostop.py --time ${IDLE_TIME} | tee -a /home/ec2-user/SageMaker/auto-stop-idle.log'") | crontab -

Note that I do see the echo "Finishing running the jupyter notebook" in the CloudWatch log. But it's usually the first thing I see, and it shows up immediately, faster than I'd expect the notebook run to take.

Also, currently the notebook is only running a fake task; the real task may take more than an hour.

Any suggestions would help! Thank you for taking the time to read my questions.
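One likely explanation, offered as a hedged sketch rather than a confirmed diagnosis: nohup jupyter nbconvert ... & backgrounds the command and returns immediately, so the echo fires before nbconvert has done any work, and the idle timer can then stop the instance mid-run. Backgrounding a wrapper that waits for nbconvert keeps the lifecycle config under its 5-minute limit while logging completion only when the notebook actually finishes:

```shell
# the wrapper (not just nbconvert) is backgrounded: the lifecycle config
# returns immediately, but "Finished..." is logged only after nbconvert exits
nohup bash -c '
    jupyter nbconvert --ExecutePreprocessor.timeout=-1 \
        --ExecutePreprocessor.kernel_name=python3 \
        --to notebook --execute "$0"
    echo "Finished running the jupyter notebook"
' "$NOTEBOOK_FILE" >> /home/ec2-user/SageMaker/nbconvert.log 2>&1 &
```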

Example script fails "auto_stop_idle.sh" - crontab

Hello,
I am launching a Sagemaker instance using Terraform. I attached it a lifecycle configuration including the file auto_stop_idle.sh so that it stops when idle.

I have managed to attach it but I think the crontab line in the bash file is incorrect:
(crontab -l 2>/dev/null; echo "5 * * * * /usr/bin/python $PWD/autostop.py --time $IDLE_TIME --ignore-connections") | crontab - (does nothing)

should be:
(crontab -l 2>/dev/null; echo "*/5 * * * * /usr/bin/python $PWD/autostop.py --time $IDLE_TIME --ignore-connections") | crontab - (works)

Should I consider making a PR?

Example script fails - bash cannot find miniconda

I tried running the default on-start and on-create scripts, i.e. your example. The on-create script does not install miniconda successfully. Namely, I am referring to this line:
bash "$WORKING_DIR/miniconda.sh" -b -u -p "$WORKING_DIR/miniconda"
I kindly ask you to fix this issue. Thanks!

auto-stop-idle/on-start.sh notebooks built on aws2 can't "find" Python

Running the idle auto-stop on a few machines, I found that the Python check (lines 30-42) fails when used on notebooks based on AL2; AL1-based notebooks work fine. It looks like the location of Python changed between versions, but I need to do a bit more digging.
Image of log when booting: [screenshot omitted]

I'm thinking of using the Linux "which python" or "which python3" to check for the presence of a Python install and running the boto3 import after that. This could help future-proof against any other changes in the Python path.

Something like

PYTHON_DIR=$(which python || which python3)
if [ -z "$PYTHON_DIR" ]; then
	echo "no Python installed"
	exit 1
else
	if ! "$PYTHON_DIR" -c "import boto3" 2>/dev/null; then
		"$PYTHON_DIR" -m pip install boto3
	fi
	echo "Found Python with boto3 at $PYTHON_DIR"
	echo "Starting the SageMaker autostop script in cron"
	(crontab -l 2>/dev/null; echo "*/5 * * * * $PYTHON_DIR $PWD/autostop.py --time $IDLE_TIME --ignore-connections >> /var/log/jupyter.log") | crontab -
fi

Lifecycle config cannot run notebook that imports sagemaker, boto3 libraries

I have the following lifecycle config:

#!/bin/bash
sudo -u ec2-user -i <<'EOF'

source activate python3
pip install requests==2.21.0
echo "Installing boto3"
pip install boto3
echo "Starting nohup command"
nohup jupyter nbconvert /home/ec2-user/SageMaker/TestNotebook.ipynb --ExecutePreprocessor.kernel_name=python3 --to notebook --execute /home/ec2-user/SageMaker/TestNotebook.ipynb
echo "Finished nohup command"
conda deactivate
EOF

which I'm trying to use to run the cells in a notebook called TestNotebook.ipynb.

The contents of TestNotebook.ipynb are as follows:

# Cell 1
import pandas as pd
df = pd.DataFrame([1,2,3,7])
df.to_csv('random_junk.csv')

# Cell 2
import boto3

From the error logs and the random_junk.csv file generated, it's clear that cell 1 works fine, but when I uncomment and allow cell 2 in the notebook, I get a complicated error log that includes a lot of badly formatted output. The one legible piece of output in the log that I believe may be relevant is this:

[screenshot omitted]

Any help on how to resolve or work around this would be great, thanks.

Following script makes the python kernels missing

I forked the following sample script and changed the package from scipy to plotly, and for some reason when the notebook is started it doesn't show any Python kernels. Only R and SparkMagic?

#!/bin/bash

set -e

# OVERVIEW
# This script installs a single conda package in all SageMaker conda environments, apart from the JupyterSystemEnv which is a 
# system environment reserved for Jupyter.
# Note this may timeout if the package installations in all environments take longer than 5 mins, consider using "nohup" to run this 
# as a background process in that case.

sudo -u ec2-user -i <<'EOF'
# PARAMETERS
PACKAGE=plotly
# Note that "base" is special environment name, include it there as well.
conda install "$PACKAGE" --name base --yes
for env in /home/ec2-user/anaconda3/envs/*; do
    env_name=$(basename "$env")
    if [ $env_name = 'JupyterSystemEnv' ]; then
      continue
    fi
    nohup conda install "$PACKAGE" --name "$env_name" --yes
done

EOF

SageMaker notebook instance doesn't work with ipywebrtc extension

This extension doesn't work on SageMaker: https://github.com/maartenbreddels/ipywebrtc
I suspect this could be due to how the proxy is set up. I get this error in the inspector:

The resource from “https://XXXXXXXX.notebook.us-east-2.sagemaker.aws/static/jupyter-webrtc.js?v=20191125191846” was blocked due to MIME type (“text/html”) mismatch (X-Content-Type-Options: nosniff)

Any idea how to make it work?

pip install ipywebrtc

from ipywebrtc import CameraStream, ImageRecorder
camera = CameraStream(constraints=
                      {'facing_mode': 'user',
                       'audio': False,
                       'video': { 'width': 640, 'height': 480 }
                       })
camera
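A possible cause, offered as a guess: if the widget's nbextension isn't registered, Jupyter can return an HTML 404 page for jupyter-webrtc.js, which the browser then blocks on MIME-type grounds. The ipywebrtc README suggests explicitly enabling the extension (needed for notebook < 5.3), run in the environment that serves Jupyter:

```shell
# install the package and register its frontend extension with Jupyter
pip install ipywebrtc
jupyter nbextension enable --py --sys-prefix ipywebrtc
```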

documentation: Readme.md file for every script

Instead of opening every on-start.sh or on-create.sh bash file to read the documentation on what a script does, I believe it would be better to have a README.md file for every script under the "scripts/" path to improve readability.

Additionally, in the root README file we could have a list of all the scripts along with a short description of each.

source activate

Hey there,

it took me a lot of time to figure out that using
source activate ... (e.g. pytorch_p36)
works and
source /home/ec2-user/anaconda3/bin/activate ... (e.g. pytorch_p36)
does not.
Maybe change the code in "install-pip-package-single-environment" accordingly.

Notebook instance keeps refreshing the page without opening Jupyter(Lab)

I have a SageMaker notebook instance created with TypeScript CDK, and I also created a lifecycle rule for it. I used the auto-stop-idle example to create the script and ensured it is working. In the on-start.sh script I fetch a specific version of the example's Python script to make sure we know which version we're using on start.

Stopping the instance after it had been idle for an hour worked just fine, but then I started experiencing a problem: once I started the instance again and tried to open Jupyter / JupyterLab, it would get stuck and keep refreshing the page. I contacted AWS support about the issue, and they were able to reproduce the problem. They noticed that it doesn't happen every time, though.

They suggested following things:

  1. Retry starting your Notebook instances, and wait for a few minutes even though the Jupyter interface seems stuck in refreshing the loading page.

  2. If the issue persists, try to clear your browser cache or try on a different browser.

  3. Again, if Jupyter interface isn’t opening, please stop the notebook instance, remove its auto-stop-idle lifecycle configuration, re-try to start and open it, and let me know if the issue still happens on the instance without lifecycle policy.

  4. Kindly share the jupyter.log file with me:

    1. Go to SageMaker console and choose notebook instances.
    2. Open the notebook instance in which you are facing this issue.
    3. Under ‘Monitor’ section, select ‘View logs’. It will redirect you to CloudWatch logs.
    4. In the CloudWatch console, choose the log stream for your notebook instance. Its name is in the form /jupyter.log.
    5. You can download the logs from CloudWatch by selecting ‘Actions’ -> ‘Download as .CSV file’.

and the second one helped me. Clearing cookies / using incognito mode in Chrome resolved the issue. We never had this problem before we started using the lifecycle rule mentioned above.

If you need any additional details from me, please let me know.

Life cycle config took longer than 5 minutes

Hi all,

I'm trying to migrate my notebook instance which is using AL1 to AL2. I found this help doc: https://aws.amazon.com/blogs/machine-learning/migrate-your-work-to-amazon-sagemaker-notebook-instance-with-amazon-linux-2/.

As per the article, I created a lifecycle config and updated on-start.sh. Then I started the instance, and it failed to start. Below is the error:

Notebook Instance Lifecycle Config 'arn:aws:sagemaker:ap-south-1:account-no:notebook-instance-lifecycle-config/backup-ebs' for Notebook Instance 'arn:aws:sagemaker:ap-south-1:account-no:notebook-instance/test' took longer than 5 minutes. Please check your CloudWatch logs for more details if your Notebook Instance has Internet access.

CloudWatch logs say "/bin/bash: /tmp/OnStart_2022-05-05-21-43705z8lmx: /bin/bash^M: bad interpreter: No such file or directory"

Please help!

Best,
Sarat

How to debug auto-stop-idle

Hello,

The auto-stop-idle script is not working for me. I added it to the lifecycle config script for start notebook, set IDLE_TIME=60, and started an instance with that lifecycle config.
I can see in the logs that the lifecycle config ran, including the line "Starting the SageMaker autostop script in cron", which comes from on-start.sh.
The execution role has the AmazonSageMakerFullAccess policy, so it should be allowed to perform sagemaker:StopNotebookInstance.

How can I check on the running Python script?

update 1:
Running "/usr/bin/python $PWD/autostop.py --time $IDLE_TIME --ignore-connections" in a terminal in JupyterLab does stop the instance.
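Since the manual run stops the instance, the remaining suspects are on the cron side. A few hedged places to look (log locations are typical for Amazon Linux, not guaranteed):

```shell
# was the cron entry actually installed?
crontab -l 2>/dev/null || echo "no crontab for this user"
# did cron fire the job? (Amazon Linux writes cron activity here)
sudo tail -n 50 /var/log/cron
# does the interpreter path used in the cron line exist?
ls -l /usr/bin/python
```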
