Using the release 0.4 JAR gives the same errors.
from jgribx.
@spidru Here is the GRIB2 file I'm using: https://we.tl/t-l3xbpVrYUv
Thanks
Hi @Timelessprod, thanks for sending the file. The issue is that the GRIB file seems to use Simple Packing in its DRS section, whereas at the moment only Complex Packing and Spatial Differencing is supported for GRIB2. I'll try to find some time to implement Simple Packing, shouldn't be too much work.
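For context, grid-point simple packing (Data Representation Template 5.0) expands each packed integer X into a real value using the reference value R, binary scale factor E and decimal scale factor D from the DRS. A minimal sketch of that decode step in Scala (the helper name is hypothetical, not JGribX's API):

```scala
// Sketch of GRIB2 grid-point simple packing (Template 5.0):
//   Y = (R + X * 2^E) / 10^D
// with R (reference value), E (binary scale factor) and D (decimal
// scale factor) read from the Data Representation Section.
// `unpackSimple` is a hypothetical helper, not part of JGribX.
def unpackSimple(packed: Seq[Long], ref: Double, binScale: Int, decScale: Int): Seq[Double] = {
  val twoPowE = math.pow(2.0, binScale)
  val tenPowD = math.pow(10.0, decScale)
  packed.map(x => (ref + x * twoPowE) / tenPowD)
}
```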
@spidru Thanks for your reply and for adapting your library. Do you have an idea of when it will be ready, so I can tell my manager when my product will be available?
Thank you very much.
Hi @Timelessprod, I'm also seeing that the file you shared uses bitmap data, which is another thing that isn't supported at the moment. I'll try to get this done in a week or so, but it will depend on my availability.
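For context, the GRIB bitmap (Section 6) is just one bit per grid point saying whether a packed value exists for it. A sketch of expanding packed data against a bitmap (the helper name is mine, not the library's):

```scala
// Sketch of applying a GRIB bitmap (Section 6): bit i indicates whether
// grid point i has a packed value (true) or is missing (false). Packed
// values are consumed in order for the set bits; missing points get a
// sentinel value. `applyBitmap` is a hypothetical helper, not JGribX API.
def applyBitmap(bitmap: Seq[Boolean], values: Seq[Double], missing: Double): Seq[Double] = {
  val it = values.iterator
  bitmap.map(present => if (present) it.next() else missing)
}
```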
Hi @Timelessprod, I've just pushed an initial version which includes support for GRIB files specifying a bitmap, as well as files using grid-point simple packing. Can you try it out?
https://83-92336207-gh.circle-artifacts.com/0/build/libs/JGribX.jar
Hi @spidru. I'm not at work now but I'll try it tomorrow. Thanks for your time and fast reaction!
Hi @spidru, I've updated my JAR with your new one, but it uses a more recent Java version than the one I can use at work (JDK 8 for Scala 2.12). The code thus crashes with this error:
java.lang.UnsupportedClassVersionError: mt/edu/um/cf2/jgribx/GribFile has been compiled by a more recent version of the Java Runtime (class file version 53.0), this version of the Java Runtime only recognizes class file versions up to 52.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:757)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
at com.databricks.backend.daemon.driver.ClassLoaders$LibraryClassLoader.loadClass(ClassLoaders.scala:151)
at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
at com.databricks.backend.daemon.driver.ClassLoaders$ReplWrappingClassLoader.loadClass(ClassLoaders.scala:65)
at java.lang.ClassLoader.loadClass(ClassLoader.java:406)
at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
Hi @Timelessprod, the project is currently being built with JDK 9 by default. Here's a version compiled with JDK 8 that should work for you.
https://93-92336207-gh.circle-artifacts.com/0/build/libs/JGribX.jar
Thank you! I'll try it now.
After trying it yesterday and this morning, I'm getting errors when using Spark dataframes: it seems that some classes in the library are not serializable. When I call the show() method on my dataframe after filling it with GRIB data, I get this stack trace:
org.apache.spark.SparkException: Task not serializable
at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:416)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:406)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)
at org.apache.spark.SparkContext.clean(SparkContext.scala:2604)
at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndex$1(RDD.scala:893)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:165)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:125)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:395)
at org.apache.spark.rdd.RDD.mapPartitionsWithIndex(RDD.scala:892)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:727)
at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:200)
at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$3(SparkPlan.scala:252)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:165)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:248)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:192)
at org.apache.spark.sql.execution.UnionExec.$anonfun$doExecute$5(basicPhysicalOperators.scala:684)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at scala.collection.TraversableLike.map(TraversableLike.scala:238)
at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
at scala.collection.AbstractTraversable.map(Traversable.scala:108)
at org.apache.spark.sql.execution.UnionExec.doExecute(basicPhysicalOperators.scala:684)
at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:200)
at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$3(SparkPlan.scala:252)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:165)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:248)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:192)
at org.apache.spark.sql.execution.collect.Collector$.collect(Collector.scala:79)
at org.apache.spark.sql.execution.collect.Collector$.collect(Collector.scala:88)
at org.apache.spark.sql.execution.ResultCacheManager.getOrComputeResult(ResultCacheManager.scala:508)
at org.apache.spark.sql.execution.CollectLimitExec.executeCollectResult(limit.scala:58)
at org.apache.spark.sql.Dataset.collectResult(Dataset.scala:2994)
at org.apache.spark.sql.Dataset.collectFromPlan(Dataset.scala:3717)
at org.apache.spark.sql.Dataset.$anonfun$head$1(Dataset.scala:2718)
at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3709)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$5(SQLExecution.scala:116)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:249)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:101)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:845)
at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:77)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:199)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3707)
at org.apache.spark.sql.Dataset.head(Dataset.scala:2718)
at org.apache.spark.sql.Dataset.take(Dataset.scala:2925)
at org.apache.spark.sql.Dataset.getRows(Dataset.scala:307)
at org.apache.spark.sql.Dataset.showString(Dataset.scala:344)
at org.apache.spark.sql.Dataset.show(Dataset.scala:840)
at org.apache.spark.sql.Dataset.show(Dataset.scala:799)
If I don't print the table, the dataframe won't be filled with data, since Spark uses lazy evaluation. Fixing this would probably require refactoring most of the library, which I don't want to ask of you: you've helped me enough already, and I don't want you to have to rewrite your whole library just for my use case.
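For what it's worth, a common workaround for Spark's "Task not serializable" (assuming the problem is a non-serializable library object, such as a GRIB reader, being captured in a closure) is a serializable wrapper that rebuilds the heavy object lazily on each executor instead of shipping it with the task. A sketch using a stand-in class, since I haven't checked JGribX's actual classes:

```scala
import java.io._

// Stand-in for a non-serializable library class (e.g. a GRIB reader);
// hypothetical, not a JGribX class.
class HeavyParser(val path: String)

// Serializable wrapper: only `path` travels with the Spark task; the
// parser itself is @transient and rebuilt lazily on first access after
// deserialization, i.e. once per executor.
class LazyParser(val path: String) extends Serializable {
  @transient lazy val parser: HeavyParser = new HeavyParser(path)
}
```

In a Spark job you would then capture the `LazyParser` in the closure and call `wrapper.parser` inside the task, so the non-serializable object never crosses the driver/executor boundary.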
However, I also get a NoValidGribException on some GRIB files, such as this one: https://we.tl/t-exk4aytqYs. Such files give me this stack trace:
mt.edu.um.cf2.jgribx.NoValidGribException: No valid GRIB records found.
at mt.edu.um.cf2.jgribx.GribFile.<init>(GribFile.java:138)
at mt.edu.um.cf2.jgribx.GribFile.<init>(GribFile.java:80)
at mt.edu.um.cf2.jgribx.GribFile.<init>(GribFile.java:63)
Maybe that's something worth fixing? Thanks for all the support you've given me so far.
Hmm, I'm not quite sure how I would go about that serialization issue. If it's a blocking issue, let me know and I'll look into it in some detail.
In the meantime, I'll find some time this week to have a look at your new GRIB file. Most probably there's something which is not yet supported. You can actually check that using the JGribX CLI as follows: java -jar JGribX.jar -i inputfile.grb2 -l 4
Hi @Timelessprod, apologies for the long delay. Is this issue still relevant? If so, could you please re-share the GRIB file giving you a NoValidGribException?
Hi @spidru, I'm sorry to tell you that I no longer have access to this file: it was on my work computer, and my internship ended 5 months ago. I'm sorry for the inconvenience.
No problem at all. In that case, I'll close this issue since there is no further action to be done.