
scio.g8's Introduction

Scio giter8 template

A Giter8 template for Scio that includes a simple WordCount job to help you get started.


Running

  1. Download and install the Java Development Kit (JDK), version 8 or 11.
  2. Install sbt.
  3. sbt new spotify/scio.g8
  4. sbt stage
  5. target/universal/stage/bin/word-count --output=wc

⚠️ Check your project README.md for further details ⚠️
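For orientation, the generated WordCount job is the classic split-and-count transformation (the Flink translation log in the issue below shows the same stages: textFile → map → flatMap → countByValue → saveAsTextFile). A dependency-free sketch in plain Java; the exact tokenization is an assumption, and the real job runs these steps on a Beam pipeline, reading --input and writing --output rather than operating on local collections:

```java
import java.util.*;
import java.util.stream.*;

// Sketch of the transformation the generated Scio WordCount performs.
// (Illustrative only: the template's job applies the equivalent map /
// flatMap / countByValue steps to a distributed SCollection.)
public class WordCountSketch {
    public static Map<String, Long> countWords(List<String> lines) {
        return lines.stream()
            .map(String::trim)                                   // map: normalize each line
            .flatMap(line -> Arrays.stream(line.split("\\s+")))  // flatMap: split into words
            .filter(w -> !w.isEmpty())
            .collect(Collectors.groupingBy(w -> w,               // countByValue: word -> count
                                           Collectors.counting()));
    }

    public static void main(String[] args) {
        System.out.println(countWords(List.of("a b", "b c")));
    }
}
```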

scio.g8's People

Contributors

alisonnnnn, andrewsmartin, andrisnoko, clairemcginty, dependabot-preview[bot], dependabot[bot], fallonchen, jbx, joar, kellen, kyprifog, mikaelstaldal, mrkm4ntr, nevillelyh, ravwojdyla, regadas, rustedbones, scala-steward, shnapz, stormy-ua, syodage


scio.g8's Issues

Unknown exception: Directory scio-job could not be created

Hello,

Whenever I try to use the template (with the default arguments or any others), I get a "Directory could not be created" error. I noticed that a file gets created in the current directory with the same name as the directory; its content is the same as /src/main/g8/$if(DataflowFlexTemplate.truthy)$metadata.json$endif$ in this repo. So I forked the repo, deleted that file, and the problem was gone.

I am using macOS Catalina and haven't tried other operating systems.

Can't run on Flink

After running

sbt stage

and

target/universal/stage/bin/scio-job --runner=FlinkRunner --flinkMaster=localhost:8081 --input=README.md --output=wc

on a Flink server v1.10.2, the following error occurred.
I tried following previous issues and removing .sbt and .g8, but that didn't work.

[main] INFO org.apache.beam.runners.flink.FlinkRunner - Executing pipeline using FlinkRunner.
[main] INFO org.apache.beam.runners.flink.FlinkRunner - Translating pipeline to Flink program.
[main] INFO org.apache.beam.runners.flink.FlinkExecutionEnvironments - Creating a Batch Execution Environment.
[main] INFO org.apache.beam.runners.flink.FlinkExecutionEnvironments - Using Flink Master URL localhost:8081.
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator -  enterCompositeTransform- 
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |    enterCompositeTransform- textFile@{WordCount.scala:20}:1
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |    visitPrimitiveTransform- textFile@{WordCount.scala:20}:1/Read
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |    leaveCompositeTransform- textFile@{WordCount.scala:20}:1
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |    enterCompositeTransform- map@{WordCount.scala:21}:2
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |    visitPrimitiveTransform- map@{WordCount.scala:21}:2/ParMultiDo(Anonymous)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |    leaveCompositeTransform- map@{WordCount.scala:21}:2
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |    enterCompositeTransform- flatMap@{WordCount.scala:22}:2
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |    visitPrimitiveTransform- flatMap@{WordCount.scala:22}:2/ParMultiDo(Anonymous)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |    leaveCompositeTransform- flatMap@{WordCount.scala:22}:2
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |    enterCompositeTransform- countByValue@{WordCount.scala:23}:1
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |    enterCompositeTransform- countByValue@{WordCount.scala:23}:1/pApply:1
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |    enterCompositeTransform- countByValue@{WordCount.scala:23}:1/pApply:1/Init
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |    enterCompositeTransform- countByValue@{WordCount.scala:23}:1/pApply:1/Init/Map
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |    visitPrimitiveTransform- countByValue@{WordCount.scala:23}:1/pApply:1/Init/Map/ParMultiDo(Anonymous)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |    leaveCompositeTransform- countByValue@{WordCount.scala:23}:1/pApply:1/Init/Map
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |    leaveCompositeTransform- countByValue@{WordCount.scala:23}:1/pApply:1/Init
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |    enterCompositeTransform- countByValue@{WordCount.scala:23}:1/pApply:1/Combine.perKey(Count)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |    translated- countByValue@{WordCount.scala:23}:1/pApply:1/Combine.perKey(Count)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |    leaveCompositeTransform- countByValue@{WordCount.scala:23}:1/pApply:1/Combine.perKey(Count)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |    leaveCompositeTransform- countByValue@{WordCount.scala:23}:1/pApply:1
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |    enterCompositeTransform- countByValue@{WordCount.scala:23}:1/map:2
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |    visitPrimitiveTransform- countByValue@{WordCount.scala:23}:1/map:2/ParMultiDo(Anonymous)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |    leaveCompositeTransform- countByValue@{WordCount.scala:23}:1/map:2
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |    leaveCompositeTransform- countByValue@{WordCount.scala:23}:1
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |    enterCompositeTransform- map@{WordCount.scala:24}:2
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |    visitPrimitiveTransform- map@{WordCount.scala:24}:2/ParMultiDo(Anonymous)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |    leaveCompositeTransform- map@{WordCount.scala:24}:2
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/RewindowIntoGlobal
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |    visitPrimitiveTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/RewindowIntoGlobal/Window.Assign
[main] INFO org.apache.flink.api.java.typeutils.TypeExtractor - No fields were detected for class org.apache.beam.sdk.util.WindowedValue so it cannot be used as a POJO type and must be processed as GenericType. Please read the Flink documentation on "Data Types & Serialization" for details of the effect on performance.
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/RewindowIntoGlobal
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/WriteUnshardedBundlesToTempFiles
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |    visitPrimitiveTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |    visitPrimitiveTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |    visitPrimitiveTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten/ParMultiDo(WriteShardsIntoTempFiles)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |    visitPrimitiveTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum/ParMultiDo(Anonymous)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |    visitPrimitiveTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/WriteUnshardedBundlesToTempFiles/Flatten.PCollections
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/WriteUnshardedBundlesToTempFiles
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/View.AsList
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |   |   |    visitPrimitiveTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/View.AsList/View.VoidKeyToMultimapMaterialization
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |    visitPrimitiveTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/View.AsList/View.CreatePCollectionView
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/View.AsList
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |   |    visitPrimitiveTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |   |   |    visitPrimitiveTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)/ParMultiDo(Anonymous)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |   |   |   |    visitPrimitiveTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map/ParMultiDo(Anonymous)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/GatherTempFileResults
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/FinalizeTempFileBundles
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/FinalizeTempFileBundles/Finalize
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |    visitPrimitiveTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/FinalizeTempFileBundles/Finalize
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |   |    visitPrimitiveTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key/ParMultiDo(AssignShard)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |   |    translated- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |   |   |    enterCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |   |   |   |    visitPrimitiveTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous)
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles/FinalizeTempFileBundles
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |   |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1/WriteFiles
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator - |    leaveCompositeTransform- saveAsTextFile@{WordCount.scala:25}:1
[main] INFO org.apache.beam.runners.flink.FlinkBatchPipelineTranslator -  leaveCompositeTransform- 
[main] INFO org.apache.beam.runners.flink.FlinkRunner - Starting execution of Flink program.
[main] INFO org.apache.flink.api.java.ExecutionEnvironment - The job has 0 registered types and 0 default Kryo serializers
[main] INFO org.apache.beam.sdk.io.FileBasedSource - Filepattern README.md matched 1 files with total size 899
[Flink-RestClusterClient-IO-thread-4] WARN org.apache.flink.util.ExecutorUtils - ExecutorService did not terminate in time. Shutting it down now.
[main] ERROR org.apache.beam.runners.flink.FlinkRunner - Pipeline execution failed
java.lang.RuntimeException: java.util.concurrent.ExecutionException: org.apache.flink.runtime.client.JobSubmissionException: Failed to submit JobGraph.
        at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:290)
        at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:970)
        at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:878)
        at org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.executePipeline(FlinkPipelineExecutionEnvironment.java:150)
        at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:87)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:317)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:303)
        at com.spotify.scio.ScioContext.execute(ScioContext.scala:639)
        at com.spotify.scio.ScioContext$$anonfun$run$1.apply(ScioContext.scala:627)
        at com.spotify.scio.ScioContext$$anonfun$run$1.apply(ScioContext.scala:615)
        at com.spotify.scio.ScioContext.requireNotClosed(ScioContext.scala:705)
        at com.spotify.scio.ScioContext.run(ScioContext.scala:615)
        at example.WordCount$.main(WordCount.scala:27)
        at example.WordCount.main(WordCount.scala)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.client.JobSubmissionException: Failed to submit JobGraph.
        at java.base/java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:395)
        at java.base/java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1999)
        at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:965)
        ... 12 more
Caused by: org.apache.flink.runtime.client.JobSubmissionException: Failed to submit JobGraph.
        at org.apache.flink.client.program.rest.RestClusterClient.lambda$submitJob$7(RestClusterClient.java:359)
        at java.base/java.util.concurrent.CompletableFuture.uniExceptionally(CompletableFuture.java:986)
        at java.base/java.util.concurrent.CompletableFuture$UniExceptionally.tryFire(CompletableFuture.java:970)
        at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
        at java.base/java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:2088)
        at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$8(FutureUtils.java:274)
        at java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859)
        at java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837)
        at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
        at java.base/java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:610)
        at java.base/java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:1085)
        at java.base/java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:478)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: org.apache.flink.runtime.rest.util.RestClientException: [Internal server error., <Exception on server side:
org.apache.flink.runtime.client.JobSubmissionException: Failed to submit job.
        at org.apache.flink.runtime.dispatcher.Dispatcher.lambda$internalSubmitJob$3(Dispatcher.java:339)
        at java.base/java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:930)
        at java.base/java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:907)
        at java.base/java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:478)
        at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:44)
        at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.RuntimeException: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
        at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:36)
        at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700)
        ... 6 more
Caused by: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
        at org.apache.flink.runtime.jobmaster.JobManagerRunnerImpl.<init>(JobManagerRunnerImpl.java:152)
        at org.apache.flink.runtime.dispatcher.DefaultJobManagerRunnerFactory.createJobManagerRunner(DefaultJobManagerRunnerFactory.java:84)
        at org.apache.flink.runtime.dispatcher.Dispatcher.lambda$createJobManagerRunner$6(Dispatcher.java:382)
        at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:34)
        ... 7 more
Caused by: org.apache.flink.runtime.client.JobExecutionException: Cannot initialize task 'CHAIN DataSource (at textFile@{WordCount.scala:20}:1/Read (org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat)) -> FlatMap (FlatMap at map@{WordCount.scala:21}:2/ParMultiDo(Anonymous)) -> FlatMap (FlatMap at map@{WordCount.scala:21}:2/ParMultiDo(Anonymous).output) -> FlatMap (FlatMap at flatMap@{WordCount.scala:22}:2/ParMultiDo(Anonymous)) -> FlatMap (FlatMap at flatMap@{WordCount.scala:22}:2/ParMultiDo(Anonymous).output) -> FlatMap (FlatMap at countByValue@{WordCount.scala:23}:1/pApply:1/Init/Map/ParMultiDo(Anonymous)) -> FlatMap (FlatMap at countByValue@{WordCount.scala:23}:1/pApply:1/Init/Map/ParMultiDo(Anonymous).output) -> FlatMap (FlatMap at ExplodeWindows: countByValue@{WordCount.scala:23}:1/pApply:1/Combine.perKey(Count)) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: countByValue@{WordCount.scala:23}:1/pApply:1/Combine.perKey(Count)) -> Map (Key Extractor)': Loading the input/output formats failed: org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat@1d944fc0
        at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:216)
        at org.apache.flink.runtime.scheduler.SchedulerBase.createExecutionGraph(SchedulerBase.java:256)
        at org.apache.flink.runtime.scheduler.SchedulerBase.createAndRestoreExecutionGraph(SchedulerBase.java:228)
        at org.apache.flink.runtime.scheduler.SchedulerBase.<init>(SchedulerBase.java:216)
        at org.apache.flink.runtime.scheduler.DefaultScheduler.<init>(DefaultScheduler.java:120)
        at org.apache.flink.runtime.scheduler.DefaultSchedulerFactory.createInstance(DefaultSchedulerFactory.java:105)
        at org.apache.flink.runtime.jobmaster.JobMaster.createScheduler(JobMaster.java:278)
        at org.apache.flink.runtime.jobmaster.JobMaster.<init>(JobMaster.java:266)
        at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:98)
        at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:40)
        at org.apache.flink.runtime.jobmaster.JobManagerRunnerImpl.<init>(JobManagerRunnerImpl.java:146)
        ... 10 more
Caused by: java.lang.Exception: Loading the input/output formats failed: org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat@1d944fc0
        at org.apache.flink.runtime.jobgraph.InputOutputFormatVertex.initInputOutputformatContainer(InputOutputFormatVertex.java:156)
        at org.apache.flink.runtime.jobgraph.InputOutputFormatVertex.initializeOnMaster(InputOutputFormatVertex.java:60)
        at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:212)
        ... 20 more
Caused by: java.lang.RuntimeException: Deserializing the input/output formats failed: Could not read the user code wrapper: org.apache.beam.runners.core.construction.SerializablePipelineOptions; local class incompatible: stream classdesc serialVersionUID = -5067588320286657615, local class serialVersionUID = 8599785494114821479
        at org.apache.flink.runtime.jobgraph.InputOutputFormatContainer.<init>(InputOutputFormatContainer.java:68)
        at org.apache.flink.runtime.jobgraph.InputOutputFormatVertex.initInputOutputformatContainer(InputOutputFormatVertex.java:153)
        ... 22 more
Caused by: org.apache.flink.runtime.operators.util.CorruptConfigurationException: Could not read the user code wrapper: org.apache.beam.runners.core.construction.SerializablePipelineOptions; local class incompatible: stream classdesc serialVersionUID = -5067588320286657615, local class serialVersionUID = 8599785494114821479
        at org.apache.flink.runtime.operators.util.TaskConfig.getStubWrapper(TaskConfig.java:290)
        at org.apache.flink.runtime.jobgraph.InputOutputFormatContainer.<init>(InputOutputFormatContainer.java:66)
        ... 23 more
Caused by: java.io.InvalidClassException: org.apache.beam.runners.core.construction.SerializablePipelineOptions; local class incompatible: stream classdesc serialVersionUID = -5067588320286657615, local class serialVersionUID = 8599785494114821479
        at java.base/java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:689)
        at java.base/java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1982)
        at java.base/java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1851)
        at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2139)
        at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1668)
        at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2434)
        at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2328)
        at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2166)
        at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1668)
        at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2434)
        at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2328)
        at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2166)
        at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1668)
        at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:482)
        at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:440)
        at java.base/java.util.HashMap.readObject(HashMap.java:1460)
        at java.base/jdk.internal.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
        at java.base/java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1175)
        at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2295)
        at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2166)
        at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1668)
        at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2434)
        at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2328)
        at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2166)
        at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1668)
        at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2434)
        at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2328)
        at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2166)
        at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1668)
        at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:482)
        at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:440)
        at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:576)
        at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:562)
        at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:550)
        at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:511)
        at org.apache.flink.runtime.operators.util.TaskConfig.getStubWrapper(TaskConfig.java:288)
        ... 24 more

End of exception on server side>]
        at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:390)
        at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:374)
        at java.base/java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:1072)
        ... 4 more
Exception in thread "main" java.lang.RuntimeException: Pipeline execution failed
        at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:90)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:317)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:303)
        at com.spotify.scio.ScioContext.execute(ScioContext.scala:639)
        at com.spotify.scio.ScioContext$$anonfun$run$1.apply(ScioContext.scala:627)
        at com.spotify.scio.ScioContext$$anonfun$run$1.apply(ScioContext.scala:615)
        at com.spotify.scio.ScioContext.requireNotClosed(ScioContext.scala:705)
        at com.spotify.scio.ScioContext.run(ScioContext.scala:615)
        at example.WordCount$.main(WordCount.scala:27)
        at example.WordCount.main(WordCount.scala)
Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: org.apache.flink.runtime.client.JobSubmissionException: Failed to submit JobGraph.
        at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:290)
        at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:970)
        at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:878)
        at org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.executePipeline(FlinkPipelineExecutionEnvironment.java:150)
        at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:87)
        ... 9 more
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.client.JobSubmissionException: Failed to submit JobGraph.
        at java.base/java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:395)
        at java.base/java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1999)
        at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:965)
        ... 12 more
Caused by: org.apache.flink.runtime.client.JobSubmissionException: Failed to submit JobGraph.
        at org.apache.flink.client.program.rest.RestClusterClient.lambda$submitJob$7(RestClusterClient.java:359)
        at java.base/java.util.concurrent.CompletableFuture.uniExceptionally(CompletableFuture.java:986)
        at java.base/java.util.concurrent.CompletableFuture$UniExceptionally.tryFire(CompletableFuture.java:970)
        at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
        at java.base/java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:2088)
        at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$8(FutureUtils.java:274)
        at java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859)
        at java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837)
        at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
        at java.base/java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:610)
        at java.base/java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:1085)
        at java.base/java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:478)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: org.apache.flink.runtime.rest.util.RestClientException: [Internal server error., <Exception on server side:
org.apache.flink.runtime.client.JobSubmissionException: Failed to submit job.
        at org.apache.flink.runtime.dispatcher.Dispatcher.lambda$internalSubmitJob$3(Dispatcher.java:339)
        at java.base/java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:930)
        at java.base/java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:907)
        at java.base/java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:478)
        at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:44)
        at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.RuntimeException: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
        at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:36)
        at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700)
        ... 6 more
Caused by: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
        at org.apache.flink.runtime.jobmaster.JobManagerRunnerImpl.<init>(JobManagerRunnerImpl.java:152)
        at org.apache.flink.runtime.dispatcher.DefaultJobManagerRunnerFactory.createJobManagerRunner(DefaultJobManagerRunnerFactory.java:84)
        at org.apache.flink.runtime.dispatcher.Dispatcher.lambda$createJobManagerRunner$6(Dispatcher.java:382)
        at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:34)
        ... 7 more
Caused by: org.apache.flink.runtime.client.JobExecutionException: Cannot initialize task 'CHAIN DataSource (at textFile@{WordCount.scala:20}:1/Read (org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat)) -> FlatMap (FlatMap at map@{WordCount.scala:21}:2/ParMultiDo(Anonymous)) -> FlatMap (FlatMap at map@{WordCount.scala:21}:2/ParMultiDo(Anonymous).output) -> FlatMap (FlatMap at flatMap@{WordCount.scala:22}:2/ParMultiDo(Anonymous)) -> FlatMap (FlatMap at flatMap@{WordCount.scala:22}:2/ParMultiDo(Anonymous).output) -> FlatMap (FlatMap at countByValue@{WordCount.scala:23}:1/pApply:1/Init/Map/ParMultiDo(Anonymous)) -> FlatMap (FlatMap at countByValue@{WordCount.scala:23}:1/pApply:1/Init/Map/ParMultiDo(Anonymous).output) -> FlatMap (FlatMap at ExplodeWindows: countByValue@{WordCount.scala:23}:1/pApply:1/Combine.perKey(Count)) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: countByValue@{WordCount.scala:23}:1/pApply:1/Combine.perKey(Count)) -> Map (Key Extractor)': Loading the input/output formats failed: org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat@1d944fc0
        at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:216)
        at org.apache.flink.runtime.scheduler.SchedulerBase.createExecutionGraph(SchedulerBase.java:256)
        at org.apache.flink.runtime.scheduler.SchedulerBase.createAndRestoreExecutionGraph(SchedulerBase.java:228)
        at org.apache.flink.runtime.scheduler.SchedulerBase.<init>(SchedulerBase.java:216)
        at org.apache.flink.runtime.scheduler.DefaultScheduler.<init>(DefaultScheduler.java:120)
        at org.apache.flink.runtime.scheduler.DefaultSchedulerFactory.createInstance(DefaultSchedulerFactory.java:105)
        at org.apache.flink.runtime.jobmaster.JobMaster.createScheduler(JobMaster.java:278)
        at org.apache.flink.runtime.jobmaster.JobMaster.<init>(JobMaster.java:266)
        at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:98)
        at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:40)
        at org.apache.flink.runtime.jobmaster.JobManagerRunnerImpl.<init>(JobManagerRunnerImpl.java:146)
        ... 10 more
Caused by: java.lang.Exception: Loading the input/output formats failed: org.apache.beam.runners.flink.translation.wrappers.SourceInputFormat@1d944fc0
        at org.apache.flink.runtime.jobgraph.InputOutputFormatVertex.initInputOutputformatContainer(InputOutputFormatVertex.java:156)
        at org.apache.flink.runtime.jobgraph.InputOutputFormatVertex.initializeOnMaster(InputOutputFormatVertex.java:60)
        at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:212)
        ... 20 more
Caused by: java.lang.RuntimeException: Deserializing the input/output formats failed: Could not read the user code wrapper: org.apache.beam.runners.core.construction.SerializablePipelineOptions; local class incompatible: stream classdesc serialVersionUID = -5067588320286657615, local class serialVersionUID = 8599785494114821479
        at org.apache.flink.runtime.jobgraph.InputOutputFormatContainer.<init>(InputOutputFormatContainer.java:68)
        at org.apache.flink.runtime.jobgraph.InputOutputFormatVertex.initInputOutputformatContainer(InputOutputFormatVertex.java:153)
        ... 22 more
Caused by: org.apache.flink.runtime.operators.util.CorruptConfigurationException: Could not read the user code wrapper: org.apache.beam.runners.core.construction.SerializablePipelineOptions; local class incompatible: stream classdesc serialVersionUID = -5067588320286657615, local class serialVersionUID = 8599785494114821479
        at org.apache.flink.runtime.operators.util.TaskConfig.getStubWrapper(TaskConfig.java:290)
        at org.apache.flink.runtime.jobgraph.InputOutputFormatContainer.<init>(InputOutputFormatContainer.java:66)
        ... 23 more
Caused by: java.io.InvalidClassException: org.apache.beam.runners.core.construction.SerializablePipelineOptions; local class incompatible: stream classdesc serialVersionUID = -5067588320286657615, local class serialVersionUID = 8599785494114821479
        at java.base/java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:689)
        at java.base/java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1982)
        at java.base/java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1851)
        at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2139)
        at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1668)
        at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2434)
        at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2328)
        at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2166)
        at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1668)
        at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2434)
        at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2328)
        at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2166)
        at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1668)
        at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:482)
        at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:440)
        at java.base/java.util.HashMap.readObject(HashMap.java:1460)
        at java.base/jdk.internal.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
        at java.base/java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1175)
        at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2295)
        at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2166)
        at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1668)
        at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2434)
        at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2328)
        at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2166)
        at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1668)
        at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2434)
        at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2328)
        at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2166)
        at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1668)
        at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:482)
        at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:440)
        at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:576)
        at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:562)
        at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:550)
        at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:511)
        at org.apache.flink.runtime.operators.util.TaskConfig.getStubWrapper(TaskConfig.java:288)
        ... 24 more

End of exception on server side>]
        at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:390)
        at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:374)
        at java.base/java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:1072)
        ... 4 more   
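The root cause in the trace above is a Java serialization version skew: the Beam classes in the submitted jar differ from the ones on the Flink cluster, so `SerializablePipelineOptions` deserializes with a mismatched `serialVersionUID`. A minimal sketch of how that failure surfaces, using a hypothetical `Payload` class and corrupting the stream's `serialVersionUID` by hand to simulate the skew:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InvalidClassException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerialUidDemo {
    // Hypothetical stand-in for SerializablePipelineOptions.
    static class Payload implements Serializable {
        private static final long serialVersionUID = 1L;
        int value = 42;
    }

    public static void main(String[] args) throws Exception {
        // Serialize with the "client-side" class version.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(new Payload());
        }
        byte[] bytes = bos.toByteArray();

        // In the serialization stream format, the 8-byte serialVersionUID
        // immediately follows the UTF-encoded class name; flip a bit to
        // simulate a cluster whose classes differ from the submitted jar.
        byte[] name = Payload.class.getName().getBytes("UTF-8");
        int uidStart = indexOf(bytes, name) + name.length;
        bytes[uidStart + 7] ^= 0x01;

        try (ObjectInputStream ois =
                 new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            ois.readObject();
            System.out.println("deserialized ok");
        } catch (InvalidClassException e) {
            System.out.println("InvalidClassException: " + e.getMessage());
        }
    }

    // Naive byte-array substring search.
    private static int indexOf(byte[] haystack, byte[] needle) {
        outer:
        for (int i = 0; i <= haystack.length - needle.length; i++) {
            for (int j = 0; j < needle.length; j++) {
                if (haystack[i + j] != needle[j]) continue outer;
            }
            return i;
        }
        return -1;
    }
}
```

The practical fix is usually to make the Beam/Flink versions on the cluster match the ones the job was built against, rather than to touch the serialized form.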

Flink runner not supported for scala 2.13

[error] sbt.librarymanagement.ResolveException: Error downloading org.apache.flink:flink-runtime_2.13:1.9.1
[error]   Not found
[error]   Not found
[error]   not found: /Users/kristapsveveris/.ivy2/local/org.apache.flink/flink-runtime_2.13/1.9.1/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/flink/flink-runtime_2.13/1.9.1/flink-runtime_2.13-1.9.1.pom
[error] Error downloading org.apache.flink:flink-streaming-java_2.13:1.9.1
[error]   Not found
[error]   Not found
[error]   not found: /Users/kristapsveveris/.ivy2/local/org.apache.flink/flink-streaming-java_2.13/1.9.1/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/flink/flink-streaming-java_2.13/1.9.1/flink-streaming-java_2.13-1.9.1.pom
[error] Error downloading org.apache.flink:flink-clients_2.13:1.9.1
[error]   Not found
[error]   Not found
[error]   not found: /Users/kristapsveveris/.ivy2/local/org.apache.flink/flink-clients_2.13/1.9.1/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/flink/flink-clients_2.13/1.9.1/flink-clients_2.13-1.9.1.pom
[error] 	at lmcoursier.CoursierDependencyResolution.unresolvedWarningOrThrow(CoursierDependencyResolution.scala:249)
[error] 	at lmcoursier.CoursierDependencyResolution.$anonfun$update$35(CoursierDependencyResolution.scala:218)
[error] 	at scala.util.Either$LeftProjection.map(Either.scala:573)
[error] 	at lmcoursier.CoursierDependencyResolution.update(CoursierDependencyResolution.scala:218)
[error] 	at sbt.librarymanagement.DependencyResolution.update(DependencyResolution.scala:60)
[error] 	at sbt.internal.LibraryManagement$.resolve$1(LibraryManagement.scala:52)
[error] 	at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$12(LibraryManagement.scala:102)
[error] 	at sbt.util.Tracked$.$anonfun$lastOutput$1(Tracked.scala:69)
[error] 	at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$20(LibraryManagement.scala:115)
[error] 	at scala.util.control.Exception$Catch.apply(Exception.scala:228)
[error] 	at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11(LibraryManagement.scala:115)
[error] 	at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11$adapted(LibraryManagement.scala:96)
[error] 	at sbt.util.Tracked$.$anonfun$inputChanged$1(Tracked.scala:150)
[error] 	at sbt.internal.LibraryManagement$.cachedUpdate(LibraryManagement.scala:129)
[error] 	at sbt.Classpaths$.$anonfun$updateTask0$5(Defaults.scala:2950)
[error] 	at scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error] 	at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:62)
[error] 	at sbt.std.Transform$$anon$4.work(Transform.scala:67)
[error] 	at sbt.Execute.$anonfun$submit$2(Execute.scala:281)
[error] 	at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:19)
[error] 	at sbt.Execute.work(Execute.scala:290)
[error] 	at sbt.Execute.$anonfun$submit$1(Execute.scala:281)
[error] 	at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:178)
[error] 	at sbt.CompletionService$$anon$2.call(CompletionService.scala:37)
[error] 	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[error] 	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
[error] 	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[error] 	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
[error] 	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
[error] 	at java.base/java.lang.Thread.run(Thread.java:830)
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.flink:flink-runtime_2.13:1.9.1
[error]   Not found
[error]   Not found
[error]   not found: /Users/kristapsveveris/.ivy2/local/org.apache.flink/flink-runtime_2.13/1.9.1/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/flink/flink-runtime_2.13/1.9.1/flink-runtime_2.13-1.9.1.pom
[error] Error downloading org.apache.flink:flink-streaming-java_2.13:1.9.1
[error]   Not found
[error]   Not found
[error]   not found: /Users/kristapsveveris/.ivy2/local/org.apache.flink/flink-streaming-java_2.13/1.9.1/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/flink/flink-streaming-java_2.13/1.9.1/flink-streaming-java_2.13-1.9.1.pom
[error] Error downloading org.apache.flink:flink-clients_2.13:1.9.1
[error]   Not found
[error]   Not found
[error]   not found: /Users/kristapsveveris/.ivy2/local/org.apache.flink/flink-clients_2.13/1.9.1/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/flink/flink-clients_2.13/1.9.1/flink-clients_2.13-1.9.1.pom
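The resolution errors above indicate that the Flink runner artifacts are not published for Scala 2.13 (at Flink 1.9.1 they appear to exist only for 2.11/2.12). A workaround sketch, assuming 2.12 artifacts are available, is to pin the generated project's Scala version in `build.sbt`:

```scala
// build.sbt (excerpt) — assumption: flink-runtime, flink-clients and
// flink-streaming-java 1.9.1 are only published for Scala 2.11/2.12.
scalaVersion := "2.12.10"
```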

The template can't create the target directory

Using the template and accepting all default options doesn't work: instead of the target directory, a file with the same name is created, which prevents directory creation.

$ sbt new spotify/scio.g8
[info] Loading settings for project global-plugins from plugins.sbt ...
[info] Loading global plugins from /Users/cebriant/.sbt/1.0/plugins
[info] Set current project to borralodedentro (in build file:/Users/cebriant/borralodedentro/)
[info] Set current project to borralodedentro (in build file:/Users/cebriant/borralodedentro/)
name [scio job]:
organization [example]:
DataflowRunner [yes/NO]:
DataflowFlexTemplate [yes/NO]:
FlinkRunner [yes/NO]:
SparkRunner [yes/NO]:

Unknown exception: Directory '/Users/cebriant/borralodedentro/./scio-job' could not be created

$ cat scio-job
{
  "name": "WordCount Example",
  "description": "",
  "parameters": [
    {
      "regexes": [
        "^gs:\/\/[^\n\r]+$"
      ],
      "name": "input",
      "label": "GCS input text file",
      "helpText": "GCS input text file"
    },
    {
      "regexes": [
        "^gs:\/\/[^\n\r]+$"
      ],
      "name": "output",
      "label": "GCS output text file",
      "helpText": "GCS output text file"
    }
  ]
}

Use of deprecated sbt-assembly key breaks build of freshly created project

Summary

As far as I can tell, sbt-assembly removed the deprecated jarName key in release 1.0.0.
This template uses its replacement assemblyJarName almost everywhere, except in build.sbt's L125.

When creating a new project using

sbt new spotify/scio.g8
// redacted
name [scio job]: example
organization [example]: com.example
DataflowRunner [yes/NO]: yes
DataflowFlexTemplate [yes/NO]: yes
FlinkRunner [yes/NO]: no
SparkRunner [yes/NO]: no

the deprecated key prevents sbt from successfully building the fresh project.

Suggestion

Change build.sbt's L125 to

scriptClasspath := Seq((assembly / assemblyJarName).value),

to reflect sbt-assembly's API changes.

Contribution

I couldn't find contributor guidelines on a first look.

Will you accept a PR for this?

Template is broken

After issuing sbt new spotify/scio.g8 and accepting all default arguments (it also fails if I pick the values I want), I get an error:

Exiting due to error in the template: /var/folders/55/4x3lxfld3kg_km9m4rf72f000000gq/T/giter8-32356141594912
File: /var/folders/55/4x3lxfld3kg_km9m4rf72f000000gq/T/giter8-32356141594912/src/main/g8/$if(DataflowFlexTemplate.truthy)$metadata.json$endif$, 9:7: 'name' came as a complete surprise to me

Invalid ssh privatekey when using `sbt new spotify/scio.g8`

Hello 👋

I'm having problems using sbt new spotify/scio.g8.

> sbt new spotify/scio.g8
[info] welcome to sbt 1.4.0 (AdoptOpenJDK Java 1.8.0_222)
[info] set current project to scala-ft (in build file:/Users/felix/Projects/home/scala-ft/)
[info] set current project to scala-ft (in build file:/Users/felix/Projects/home/scala-ft/)

ssh://[email protected]/spotify/scio.g8.git: invalid privatekey: <removed, I don't know if this is part of my private key>

I'm not sure how to debug this. My normal ssh keys work fine with git. I can do git clone ssh://[email protected]/spotify/scio.g8.git just fine.

I'm not very familiar with sbt, so I'm not sure where to look. Do you have some pointers on where the error could be occurring?
