oshinko-s2i

This is a place to put s2i images and utilities for Apache Spark application builders for OpenShift.

Building the s2i images

The easiest way to build the s2i images is to use the makefiles provided:

# To build all images
$ make

# To build images individually
$ make -f Makefile.pyspark
$ make -f Makefile.java
$ make -f Makefile.scala
$ make -f Makefile.sparklyr

The default repository for the image can be overridden with the LOCAL_IMAGE variable:

$ LOCAL_IMAGE=myimage make -f Makefile.pyspark

Modifying dependencies in the image yaml files

The cekit tool generates the image context directories based on the content of the image.*.yaml files.

A script has been provided to make altering the image.*.yaml files simpler. It handles changing the versions specified for oshinko, Spark, Scala, and sbt. Run this for more details:

$ change-yaml.sh -h
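
As a sketch only, a version bump might look something like the line below; the flag shown here is hypothetical, so check the output of change-yaml.sh -h for the actual options:

$ change-yaml.sh -s 2.4.5    # hypothetical flag for the Spark version; run change-yaml.sh -h for the real options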

Remaking image context directories when things change

The image context directories are generated with the cekit tool and contain the artifacts needed to build the images. They are:

* pyspark-build
* java-build
* scala-build
* sparklyr-build

If the yaml files used by cekit change (i.e. image.*.yaml) or the content included in an image changes (essentially anything under modules/), the image context directories need to be rebuilt.
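
If cekit is not already installed, one common way to get it is via pip (the exact version the project expects may differ, so check the cekit documentation first):

$ pip install cekit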

Rebuilding context directories for an upstream pull request

If the changes being made are part of a PR to github.com/radanalyticsio/oshinko-s2i, then all of the build directories should be generated from scratch. The best way to do this is with the make-build-dirs.sh script:

$ make-build-dirs.sh

This will recreate the context directories starting from a clean environment, make sure any tarballs are truncated for GitHub, and add all of the changes to the commit.
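
If you want to double-check by hand before committing, a quick look like the one below (an illustrative command, not part of the repository's tooling; the tarball extensions are assumptions) lists any non-empty tarballs left in the build directories:

$ find *-build -type f \( -name '*.tgz' -o -name '*.tar.gz' \) -size +0c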

Rebuilding a particular context directory for testing/development

To regenerate a particular context directory, like pyspark-build, do this:

$ make -f Makefile.pyspark clean-context context

To regenerate the context directory and also build the image, do this:

$ make -f Makefile.pyspark clean build

Git pre-commit hook

The hooks/pre-commit hook can be installed in a local repo to prevent commits that contain non-zero length tarballs in the image build directories, or to warn when changes have been made to yaml files or scripts but the image build directories have not changed. To install the hook locally, do something like this:

$ cd .git/hooks
$ ln -s ../../hooks/pre-commit pre-commit

This is recommended, since the CI tests will reject a pull request containing non-zero length tarballs anyway. Save yourself some time and install the hook.
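
As an alternative to the symlink, recent versions of git can be pointed at the hooks/ directory directly, which amounts to the same thing:

$ git config core.hooksPath hooks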

Using release-templates.sh

The templates included in this repository always reference the latest s2i images. Those images may change during the normal course of development.

The release-templates.sh script can be used to create local versions of the templates that reference s2i images from a particular oshinko release. You may want to use this script to guarantee that you are using a stable image. For example:

$ ./release-templates.sh v0.2.5

Successfully wrote templates to release_templates/ with version tag v0.2.5

$ grep radanalyticsio/radanalytics-.*spark:v0.2.5 *

release_templates/javabuilddc.json:            "name": "radanalyticsio/radanalytics-java-spark:v0.2.5"
release_templates/javabuild.json:              "name": "radanalyticsio/radanalytics-java-spark:v0.2.5"
release_templates/pysparkbuilddc.json:         "name": "radanalyticsio/radanalytics-pyspark:v0.2.5"
release_templates/pysparkbuild.json:           "name": "radanalyticsio/radanalytics-pyspark:v0.2.5"
release_templates/scalabuilddc.json:           "name": "radanalyticsio/radanalytics-scala-spark:v0.2.5"
release_templates/scalabuild.json:             "name": "radanalyticsio/radanalytics-scala-spark:v0.2.5"
release_templates/sparklyrbuild.json:          "name": "radanalyticsio/radanalytics-sparklyr-spark:v0.2.5"

$ oc create -f release_templates/pysparkbuilddc.json
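
Creating the template object does not start a build by itself; the template still has to be instantiated. A rough sketch is shown below, where the template name and parameters are illustrative, so list the real ones with oc process --parameters first:

$ oc process pyspark-build-dc -p APPLICATION_NAME=myapp -p GIT_URI=https://github.com/myorg/myapp.git | oc create -f -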

macOS Tips

On macOS you will also need two additional tools: gsed (GNU sed) and truncate. You can install them with Homebrew:

$ brew install truncate
$ brew install gnu-sed
