
dataflow.spring.io's Introduction

Spring Data Flow Dashboard


Introduction

This project contains the Markdown files from which the documentation and guides for the Spring Cloud Data Flow Microsite are automatically generated.

Building

You'll need Node.js and Yarn installed globally. Note that Node version 10 is required; the latest version is not supported.
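If you use a Node version manager, pinning the version avoids accidental upgrades. For example, with nvm, a `.nvmrc` file at the repository root (shown as a suggestion — it is not necessarily present in this repo) would contain just the major version, and `nvm use` would then pick it up automatically:

```
10
```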

# Init
yarn install        # Install dependencies

# Linter / Prettier
yarn run lint       # Linter
yarn run fix        # Fix linting errors

# Dev
yarn start          # Run dev

# Prod
yarn build          # Build the production site
yarn serve          # Serve the prod build

Contributing

We welcome contributions! All documentation for this project is written using Markdown. An example segment from our Stream Processing Getting Started Guide is shown below:

# Getting Started with Stream Processing

Spring Cloud Data Flow provides over 70 prebuilt streaming applications that you can use right away to implement common streaming use cases.
In this guide we will use two of these applications to construct a simple data pipeline that produces data sent from an external HTTP request and consumes that data by logging the payload to the terminal.

Instructions for registering these prebuilt applications with Data Flow are provided in the [Installation guide](%currentPath%/installation/).
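For orientation, the pipeline that excerpt describes corresponds to a single stream definition in the Data Flow shell — an illustrative sketch using the prebuilt http and log applications (the stream name is made up):

```
dataflow:>stream create --name http-ingest --definition "http | log" --deploy
```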

Q&A and issue tracking

If you have any feedback, additions, or changes for the documentation or guides, don't hesitate to open an issue.

Code of Conduct

This project is governed by the Spring Code of Conduct. By participating, you are expected to uphold this code of conduct. Please report unacceptable behavior to [email protected].

License

This project is released under version 2.0 of the Apache License.

dataflow.spring.io's People

Contributors

abhinavrau, boojongmin, cbo-indeed, chrisjs, corneil, cppwfs, cwjohnpark, dependabot[bot], derrick-anderson, dturanski, ghillert, ilayaperumalg, jvalkeal, kinjelom, klopfdreh, log-info, markpollack, mcnichol, michaeldavissolace, mminella, nbendafi-yseop, onobc, oodamien, pavlosav, publinchi, sayapeg, sobychacko, spipnl, trisberg, tzolov


dataflow.spring.io's Issues

Registering prebuilt applications documentation is confusing; does not produce expected results

This page https://dataflow.spring.io/docs/installation/local/docker/ does not have the section "Registering Pre-Built Applications" that the other sections of the guide have.

And when a user moves on to another section after following the manual docker-compose install instructions here, the provided dataflow commands import no applications:

dataflow:>app import --uri https://dataflow.spring.io/rabbitmq-maven-latest Successfully registered 0 applications from [....long list, see below.....]

dataflow:>app import --uri https://dataflow.spring.io/kafka-maven-latest Successfully registered 0 applications from [source.sftp, source.mqtt.metadata, sink.mqtt.metadata, source.file.metadata, processor.grpc.metadata, processor.tcp-client, source.s3.metadata, source.jms, source.ftp, processor.transform.metadata, source.time, sink.mqtt, sink.s3.metadata, processor.scriptable-transform, sink.log, source.load-generator, processor.transform, source.syslog, sink.websocket.metadata, source.loggregator.metadata, source.s3, source.sftp-dataflow, source.load-generator.metadata, processor.pmml.metadata, source.loggregator, source.tcp.metadata, processor.httpclient.metadata, sink.file.metadata, processor.object-detection.metadata, source.triggertask, source.twitterstream, source.gemfire-cq.metadata, processor.aggregator.metadata, sink.task-launcher-dataflow.metadata, source.mongodb, source.time.metadata, source.gemfire-cq, sink.counter.metadata, source.http, sink.tcp.metadata, sink.pgcopy.metadata, source.rabbit, source.jms.metadata, sink.gemfire.metadata, sink.cassandra.metadata, processor.tcp-client.metadata, processor.header-enricher, sink.throughput, processor.python-http, sink.mongodb, processor.twitter-sentiment, sink.log.metadata, processor.splitter, source.tcp, processor.python-jython.metadata, processor.image-recognition, source.trigger, source.mongodb.metadata, source.sftp-dataflow.metadata, processor.bridge, source.http.metadata, sink.ftp, source.rabbit.metadata, sink.jdbc, source.jdbc.metadata, source.mqtt, processor.pmml, sink.rabbit.metadata, processor.python-jython, sink.router.metadata, sink.cassandra, processor.filter.metadata, source.tcp-client.metadata, processor.header-enricher.metadata, processor.groovy-transform, source.ftp.metadata, sink.router, sink.redis-pubsub, source.tcp-client, processor.httpclient, sink.file, sink.websocket, source.syslog.metadata, sink.s3, sink.counter, sink.rabbit, processor.pose-estimation, processor.filter, 
source.trigger.metadata, source.mail.metadata, sink.pgcopy, processor.python-http.metadata, sink.jdbc.metadata, sink.ftp.metadata, processor.splitter.metadata, sink.sftp, processor.grpc, processor.groovy-filter.metadata, processor.twitter-sentiment.metadata, source.triggertask.metadata, sink.hdfs, sink.task-launcher-dataflow, processor.groovy-filter, sink.redis-pubsub.metadata, source.sftp.metadata, processor.image-recognition.metadata, processor.bridge.metadata, processor.groovy-transform.metadata, processor.aggregator, sink.sftp.metadata, processor.tensorflow.metadata, sink.throughput.metadata, sink.tcp, source.mail, source.gemfire.metadata, processor.tensorflow, processor.counter, source.jdbc, processor.counter.metadata, processor.pose-estimation.metadata, sink.gemfire, source.gemfire, source.twitterstream.metadata, sink.hdfs.metadata, processor.tasklaunchrequest-transform, source.file, sink.mongodb.metadata, processor.tasklaunchrequest-transform.metadata, processor.scriptable-transform.metadata, processor.object-detection]

`kubectl` Deployment instructions wrong

Currently, the 2.2.1 Kubernetes documentation for installing with kubectl references deploying the RSocket Prometheus Proxy.

This proxy is part of the 2.3.0 milestone. Per the instructions, you should be working with the v2.2.1.RELEASE tag while deploying. This makes following these instructions impossible, as the Prometheus RSocket Proxy was not added until 2.3.

Update troubleshooting guides to point to installation links

As a developer, I'd like to update the troubleshooting guides to point to the installation guides in the Microsite as opposed to going to the reference guide.

Also, in Local installation with Docker Compose, we are missing the section on how to gain access to the logs. Specifically, this section from the reference guide wasn't ported over to installation.

See this SO thread for background on how I uncovered it. Because of this disconnect, I couldn't simply point to the troubleshooting guide.

Update the CI plan to include "linkinator" as a task

We have had reports of broken links.

#158 appears to fix the Initializr breaking changes, but there could be others lurking in the site.

Chores:

  • Fix the broken links
  • Run linkinator against each of the supported versions
  • Configure/fix the CI plan
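As a sketch of the second chore, linkinator can be run from the command line against a deployed version of the site (the URL and skip pattern are illustrative; flags per linkinator's CLI):

```
npx linkinator https://dataflow.spring.io/docs --recurse --skip "github\.com"
```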

More wrong references to maximum-concurrent-tasks property

As a user reviewing the SFTP -> JDBC sample, I noticed incorrect references to the maximum-concurrent-tasks property in the following YAML and in other places. Review from this point onwards.

apiVersion: v1
data:
  application.yaml: |-
    spring:
      cloud:
        dataflow:
          task:
            platform:
              kubernetes:
                accounts:
                  default:
                    maximum-concurrent-tasks: 3
                    limits:
                      memory: 1024Mi
                      cpu: 500m

Search in a specific version

As a user, I want to search in a specific version of the documentation.
Right now, the search is performed only on the current documentation.

Microsite has issues with direct access to webpages

Hi,

When navigating to the local machine instruction from the home page, the page is available.
https://dataflow.spring.io/
-> 'getting started button' (https://dataflow.spring.io/getting-started/)
-> 'local machine button' (https://dataflow.spring.io/docs/installation/local/)

Yet when you refresh, or go to the url directly, the page is 403 forbidden.
https://dataflow.spring.io/docs/installation/local/ -> 403 forbidden

I assume it has something to do with the content being loaded dynamically when navigating from the homepage, rather than being requested as a standalone HTML page.

Can't execute bill-setup task

From @LyndonZhao:

Description:
Can't execute bill-setup task in the guide document.

dataflow version:
export DATAFLOW_VERSION=2.2.0.RELEASE
export SKIPPER_VERSION=2.1.2.RELEASE

cat stdout.log

  .   ____          _            __ _ _
 /\\ / ___'_ __ _ _(_)_ __  __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
 \\/  ___)| |_)| | | | | || (_| |  ) ) ) )
  '  |____| .__|_| |_|_| |_\__, | / / / /
 =========|_|==============|___/=/_/_/_/
 :: Spring Boot ::        (v2.1.6.RELEASE)

2019-11-27 16:01:59.699 INFO 119 --- [ main] i.s.b.BillsetuptaskApplication : Starting BillsetuptaskApplication v0.0.1-SNAPSHOT on 4ffbcccc6e3a with PID 119 (/usr/local/spring-cloud-dataflow/register-app/billsetuptask-0.0.1-SNAPSHOT.jar started by root in /tmp/bill-setup-task1316090555936135858/1215216204882/bill-setup-task-79c84cad-6a13-4f39-9b2d-f33090ec9940)
2019-11-27 16:01:59.705 INFO 119 --- [ main] i.s.b.BillsetuptaskApplication : No active profile set, falling back to default profiles: default
2019-11-27 16:02:01.710 INFO 119 --- [ main] c.a.d.s.b.a.DruidDataSourceAutoConfigure : Init DruidDataSource
2019-11-27 16:02:02.608 INFO 119 --- [ main] com.alibaba.druid.pool.DruidDataSource : {dataSource-1} inited
2019-11-27 16:02:02.745 DEBUG 119 --- [ main] o.s.c.t.c.SimpleTaskAutoConfiguration : Using org.springframework.cloud.task.configuration.DefaultTaskConfigurer TaskConfigurer
2019-11-27 16:02:02.747 DEBUG 119 --- [ main] o.s.c.t.c.DefaultTaskConfigurer : No EntityManager was found, using DataSourceTransactionManager
2019-11-27 16:02:03.476 DEBUG 119 --- [ main] o.s.c.t.r.s.TaskRepositoryInitializer : Initializing task schema for mysql database
2019-11-27 16:02:04.305 ERROR 119 --- [ main] o.s.c.t.listener.TaskLifecycleListener : An event to end a task has been received for a task that has not yet started.
2019-11-27 16:02:04.306 WARN 119 --- [ main] s.c.a.AnnotationConfigApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.context.ApplicationContextException: Failed to start bean 'taskLifecycleListener'; nested exception is java.lang.IllegalArgumentException: Invalid TaskExecution, ID 10 task is already complete
2019-11-27 16:02:04.376 INFO 119 --- [ main] com.alibaba.druid.pool.DruidDataSource : {dataSource-1} closed
2019-11-27 16:02:04.395 ERROR 119 --- [ main] o.s.c.t.listener.TaskLifecycleListener : An event to end a task has been received for a task that has not yet started.
2019-11-27 16:02:04.408 INFO 119 --- [ main] ConditionEvaluationReportLoggingListener :

Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
2019-11-27 16:02:04.429 ERROR 119 --- [ main] o.s.boot.SpringApplication : Application run failed

org.springframework.context.ApplicationContextException: Failed to start bean 'taskLifecycleListener'; nested exception is java.lang.IllegalArgumentException: Invalid TaskExecution, ID 10 task is already complete
at org.springframework.context.support.DefaultLifecycleProcessor.doStart(DefaultLifecycleProcessor.java:185) ~[spring-context-5.1.8.RELEASE.jar!/:5.1.8.RELEASE]
at org.springframework.context.support.DefaultLifecycleProcessor.access$200(DefaultLifecycleProcessor.java:53) ~[spring-context-5.1.8.RELEASE.jar!/:5.1.8.RELEASE]
at org.springframework.context.support.DefaultLifecycleProcessor$LifecycleGroup.start(DefaultLifecycleProcessor.java:360) ~[spring-context-5.1.8.RELEASE.jar!/:5.1.8.RELEASE]
at org.springframework.context.support.DefaultLifecycleProcessor.startBeans(DefaultLifecycleProcessor.java:158) ~[spring-context-5.1.8.RELEASE.jar!/:5.1.8.RELEASE]
at org.springframework.context.support.DefaultLifecycleProcessor.onRefresh(DefaultLifecycleProcessor.java:122) ~[spring-context-5.1.8.RELEASE.jar!/:5.1.8.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:893) ~[spring-context-5.1.8.RELEASE.jar!/:5.1.8.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:552) ~[spring-context-5.1.8.RELEASE.jar!/:5.1.8.RELEASE]
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:742) [spring-boot-2.1.6.RELEASE.jar!/:2.1.6.RELEASE]
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:389) [spring-boot-2.1.6.RELEASE.jar!/:2.1.6.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:311) [spring-boot-2.1.6.RELEASE.jar!/:2.1.6.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1213) [spring-boot-2.1.6.RELEASE.jar!/:2.1.6.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1202) [spring-boot-2.1.6.RELEASE.jar!/:2.1.6.RELEASE]
at io.spring.billsetuptask.BillsetuptaskApplication.main(BillsetuptaskApplication.java:10) [classes!/:0.0.1-SNAPSHOT]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_192]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_192]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_192]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_192]
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:47) [billsetuptask-0.0.1-SNAPSHOT.jar:0.0.1-SNAPSHOT]
at org.springframework.boot.loader.Launcher.launch(Launcher.java:86) [billsetuptask-0.0.1-SNAPSHOT.jar:0.0.1-SNAPSHOT]
at org.springframework.boot.loader.Launcher.launch(Launcher.java:50) [billsetuptask-0.0.1-SNAPSHOT.jar:0.0.1-SNAPSHOT]
at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51) [billsetuptask-0.0.1-SNAPSHOT.jar:0.0.1-SNAPSHOT]
Caused by: java.lang.IllegalArgumentException: Invalid TaskExecution, ID 10 task is already complete
at org.springframework.util.Assert.isNull(Assert.java:159) ~[spring-core-5.1.8.RELEASE.jar!/:5.1.8.RELEASE]
at org.springframework.cloud.task.listener.TaskLifecycleListener.doTaskStart(TaskLifecycleListener.java:261) ~[spring-cloud-task-core-2.1.3.RELEASE.jar!/:2.1.3.RELEASE]
at org.springframework.cloud.task.listener.TaskLifecycleListener.start(TaskLifecycleListener.java:390) ~[spring-cloud-task-core-2.1.3.RELEASE.jar!/:2.1.3.RELEASE]
at org.springframework.context.support.DefaultLifecycleProcessor.doStart(DefaultLifecycleProcessor.java:182) ~[spring-context-5.1.8.RELEASE.jar!/:5.1.8.RELEASE]
... 20 common frames omitted

Embedded Code with a variable

Error on compilation:

Error embed https://raw.githubusercontent.com/spring-cloud/spring-cloud-dataflow/%github-tag%/spring-cloud-dataflow-autoconfigure/src/main/java/org/springframework/cloud/dataflow/autoconfigure/local/ProfileApplicationListener.java

Test and document SCDF + Prometheus for Streams/Tasks on Cloud Foundry

In continuation with what we have done for Prometheus + RSocket on K8s, we would want to document the prereqs and the configuration setup to integrate with Prometheus + RSocket setup on Cloud Foundry.

Acceptance:

  • Include prereqs
  • Include configuration steps
  • Use the same stream and task demo, but run it on CF instead against Prometheus

"Delegate listener must not be null" following getting start guide

Following the getting started guide on Java 12, I ran into an issue running the "timestamp" batch job. The examples specify version 1.3.0.RELEASE for the "timestamp" batch task; however, "java.lang.IllegalArgumentException: Delegate listener must not be null" is thrown when the task is executed.

Switching to 2.1.0.RELEASE task version fixes the issue.

I would change the documentation to specify version 2.1.0.RELEASE for the task, to avoid confusion for people who are just starting out. I've created a PR for the change.
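For reference, re-registering the task at the newer version would look like this in the Data Flow shell (the Maven coordinates are assumed from the standard task-app starters):

```
dataflow:>app register --name timestamp --type task --uri maven://org.springframework.cloud.task.app:timestamp-task:2.1.0.RELEASE
```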

Support for multiple versions of Data Flow

A user should be able to select the version of Data Flow they are using to get the appropriate documentation. The suggestion to handle this is to use branches for the major releases and have the master branch be the area for new development.

For example, the current GA release is 2.2.1. This would be the default version shown on the web site. There would be two branches as we move towards the 2.3 GA release:

master   <- this is where all work for 2.3 goes before the GA release.
2.2.x       <- release 2.2.x

which would correspond to a combo box

>2.2.1<
master

where >2.2.1< indicates the combo box default selection.

After 2.3 release

master  <- this is where 2.4 feature work would go
2.3.x   <- updates to 2.3.x branch
2.2.x   <- updates to 2.2.x branch

which would correspond to a combo box

>2.3.0<
2.2.1
master

The Gatsby code that drives the web site functionality is always taken from the master branch.
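The branch-to-combo-box mapping described above can be sketched as a small function — a minimal illustration, not the site's actual code (names and shapes are made up):

```javascript
// Build the version drop-down entries: the current GA release first
// (preselected), then older release lines, then master. Illustrative only.
function versionMenu(gaVersion, olderVersions) {
  return [
    { label: gaVersion, selected: true },
    ...olderVersions.map((v) => ({ label: v, selected: false })),
    { label: 'master', selected: false },
  ];
}

// After the 2.3 release, per the example above:
// versionMenu('2.3.0', ['2.2.1']) → 2.3.0 (selected), 2.2.1, master
```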

Search not working

As a user, when attempting to use the search component in the Microsite, I don't get any results in return. I don't see any errors in the browser console, either.

It seems the Algolia queries are POSTed, but nothing comes back; the client-side REST response handler may need a review/update.


Helm Configuration Does not include Grafana or Prometheus

The Helm deployment does not include Grafana or Prometheus and requires a significant amount of manual setup that is not necessary in the Docker Compose start up.

Can we update the Helm Stable Chart to include these? If so, would that come from here or should I open an Issue on the Helm Repo?

Demonstrate Autoscaling of streaming apps using Prometheus metrics

As a developer, I'd like to draft an end-to-end recipe to demonstrate the autoscaling capability in SCDF using Prometheus.

This would be an extension of what was presented at S1P 2019, but instead of using a K8s-native hack, we will attempt to use the newly added scale() API to scale applications that belong to a streaming data pipeline.

Document custom SCDF server setup

We can add a section that describes how to set up a custom SCDF server using the @EnableDataFlowServer annotation and its configuration.

Configure Gatsby to perform slash-less redirects

As a developer, I'd like to configure Gatsby to redirect slash-less URLs to the HTTPS version. In general, though, we should probably avoid using slash-less URLs in the first place.

Specifically, look at Functional Composition link in the FAQ.

The `Accessing the Host File System` guide is incomplete

In the Accessing the Host File System guide, the source host folders must be mounted to both the dataflow-server and the skipper-server service definitions!

The same host and container folder paths used for dataflow-server must be mounted to the skipper-server as well:

skipper-server:
    image: springcloud/spring-cloud-skipper-server:${SKIPPER_VERSION:?SKIPPER_VERSION is not set!}
    container_name: skipper
    ports:
      - "7577:7577"
      - "9000-9010:9000-9010"
      - "20000-20105:20000-20105"
    environment:
      - SPRING_CLOUD_SKIPPER_SERVER_PLATFORM_LOCAL_ACCOUNTS_DEFAULT_PORTRANGE_LOW=20000
      - SPRING_CLOUD_SKIPPER_SERVER_PLATFORM_LOCAL_ACCOUNTS_DEFAULT_PORTRANGE_HIGH=20100
      - SPRING_DATASOURCE_URL=jdbc:mysql://mysql:3306/dataflow
      - SPRING_DATASOURCE_USERNAME=root
      - SPRING_DATASOURCE_PASSWORD=rootpw
      - SPRING_DATASOURCE_DRIVER_CLASS_NAME=org.mariadb.jdbc.Driver
    volumes:
      - /tmp/apps:/root/apps
    entrypoint: "./wait-for-it.sh mysql:3306 -- java -Djava.security.egd=file:/dev/./urandom -jar /spring-cloud-skipper-server.jar"
or, with a custom host path:

    volumes:
      - /thing1/thing2/apps:/root/apps
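The matching mount on the dataflow-server side would then use the same host and container paths — a sketch based on the snippet above, showing only the volumes stanza:

```yaml
dataflow-server:
    # ...rest of the service definition unchanged...
    volumes:
      - /tmp/apps:/root/apps
```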

Broken links + h1 error on doc pages

Error: Broken internal link docs/batch-developer-guides/batch/simple-task/ [/docs/batch-developer-guides/batch/spring-task/]
Error: Broken internal link docs/batch-developer-guides/batch/simple-task/ [/docs/batch-developer-guides/batch/spring-batch/]
Error: Broken internal link docs/batch-developer-guides/batch/simple-task/ [/docs/batch-developer-guides/batch/spring-batch/]
Error: Broken external link https://kubernetes.io/docs/setup/pick-right-solution/ [/docs/installation/kubernetes/creating-a-cluster/][status:404]
Error: Broken external link https://docs.spring.io/spring-cloud-stream-app-starters/docs/%25streaming-apps-latest%25/reference/htmlsingle/#spring-cloud-stream-modules-time-source [/docs/recipes/polyglot/app/][status:404]
Error: Broken external link https://docs.spring.io/spring-cloud-dataflow/docs/%25scdf-version-latest%25/reference/htmlsingle/#spring-cloud-dataflow-stream-app-dsl [/docs/recipes/polyglot/app/][status:404]
Error: Broken external link https://docs.spring.io/spring-cloud-stream-app-starters/docs/%25streaming-apps-latest%25/reference/htmlsingle/#spring-cloud-stream-modules-log-sink [/docs/recipes/polyglot/app/][status:404]
Error: Broken external link https://docs.spring.io/spring-batch/docs/current/reference/html/domain.html#step [/docs/feature-guides/batch/partitioning/][status:404]
Error: Broken external link https://docs.spring.io/spring-batch/docs/current/reference/html/domain.html#job [/docs/feature-guides/batch/partitioning/][status:404]
Error: Broken external link https://docs.spring.io/spring-batch/docs/current/reference/html/index-single.html#partitionHandler [/docs/feature-guides/batch/partitioning/][status:404]
Error: Broken external link https://docs.spring.io/spring-batch/docs/current/reference/html/index-single.html#step [/docs/feature-guides/batch/partitioning/][status:404]
Error: Broken external link http://docs.spring.io/spring-cloud-dataflow/docs/%25scdf-version-latest%25/reference/htmlsingle/##_composed_tasks_dsl [/docs/concepts/architecture/][status:404]
Error: Broken external link https://docs.spring.io/spring-batch/docs/current/reference/html/index-single.html#taskletStep [/docs/feature-guides/batch/partitioning/][status:404]
Error: Invalid h1 on page /docs/stream-developer-guides/troubleshooting/debugging-scdf-streams/
Error: Broken external link https://docs.spring.io/spring-batch/docs/current/reference/html/index-single.html#job [/docs/feature-guides/batch/partitioning/][status:404]
Error: Broken external link https://docs.spring.io/spring-batch/docs/current/reference/html/index-single.html#stepExecutionSplitter [/docs/feature-guides/batch/partitioning/][status:404]
Error: Broken external link https://docs.spring.io/spring-batch/docs/current/reference/html/index-single.html#step [/docs/feature-guides/batch/partitioning/][status:404]

Explain "spring.cloud.task.closecontext-enabled" in an FAQ

As a user, I'm trying to re-run a task (also via CTR); however, I notice that even after the task run completes, I still do not get a clean exit signal. The task execution continues to report that it is running, which creates a downstream bottleneck when a concurrent-task-launch limit is set.

Let's capture spring.cloud.task.closecontext-enabled in an FAQ. Specifically, let's also capture the steps to clean up the context when the task is a batch job. (see: spring-cloud/spring-cloud-dataflow#3444 (comment))
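For the FAQ entry, the property itself is a one-liner in the task application's configuration — shown here as an application.properties fragment; whether enabling it is appropriate depends on the task, as the linked issue discusses:

```properties
# Close the application context when the task completes, so the process
# exits cleanly and does not hold a concurrent-task-launch slot.
spring.cloud.task.closecontext-enabled=true
```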
