tminglei / slick-pg
Slick extensions for PostgreSQL
License: BSD 2-Clause "Simplified" License
One common question that PostGIS is well suited to answer is "how do I find the N nearest things to this point?", as described in http://blog.opengeo.org/2011/09/28/indexed-nearest-neighbour-search-in-postgis/
So how do I express the following query (from the post above)
SELECT name, gid
FROM geonames
ORDER BY geom <-> st_setsrid(st_makepoint(-90,40),4326)
LIMIT 10;
in slick-pg?
I have tried:
def pointsWithin(point: Geometry): List[OsmNode] = {
OsmNodes.where(r => (r.geom <-> point.bind) < 10d ).map(t => t)
}
but I get
polymorphic expression cannot be instantiated to expected type;
[error] found : [G, T]scala.slick.lifted.Query[G,T]
[error] required: List[net.snips.pogistan.OsmNode]
[error] OsmNodes.where(r => (r.geom <-> point.bind) < 10d ).map(t => t)
[error] ^
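For reference, the error above comes from returning a lifted Query where a List is expected. A nearest-neighbour query in slick-pg is usually expressed with sortBy plus a final .list to materialize the results; the sketch below assumes the custom driver's PostGIS implicits are in scope and reuses the table and names from the question:

```scala
import com.vividsolutions.jts.geom.Geometry

// sketch: order by the <-> distance operator, limit with take,
// and call .list to turn the Query into a List
def nearestNodes(point: Geometry)(implicit session: Session): List[OsmNode] =
  OsmNodes
    .sortBy(r => r.geom <-> point.bind)
    .take(10)
    .list
```

The missing .list is what produced the "polymorphic expression cannot be instantiated" error: where(...) builds a Query, not a List.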
Getting the following error when attempting to create the DDL for the newly migrated schema:
scala.slick.SlickException: JdbcProfile has no TypeInfo for type scala.slick.driver.JdbcTypesComponent$MappedJdbcType$$anon$1/BIGINT
The config is basically the same as on the front page, apart from ArrayImplicits, which has been extended in the following way so that custom array types can be used; details are here:
https://gist.github.com/mdedetrich/8b1b2067e39736b4afbf
This is what the custom driver looks like:
https://gist.github.com/mdedetrich/808f325b1a81dd1db508
(Is there any documentation on how to create custom array types of type List[T]? In my case, both Institution and MarketFinancialProduct have implementations based on MappedColumnType.base, but the array type seems unable to use/see them.)
Hi,
I have a column whose type is an array of my custom type:
bom bom_entry[]
where bom_entry is
Composite type "public.bom_entry"
Column | Type | Modifiers | Storage | Description
----------+------------------+-----------+----------+-------------
ipn | text | | extended |
quantity | double precision | | plain |
seq_num | integer | | plain |
If the ipn contains text with spaces in it, I get the following error (see at the bottom).
If I change the value to ABC_ABC, everything works. It seems slick-pg has a problem deserializing this representation of the array (as presented in psql). Is the problem in the escaping?
bom
---------------------------------------------------------
{"(\"ABC ABC\",1,0)","(\"DEF DEF\",1,0)"}
(1 row)
I think the problem is in escaping the quotes, and quotes are added by PostgreSQL only if there are spaces. With the ABC_ABC value, the output is as follows:
bom
-------------------------------------------------
{"(ABC_ABC,1,0)","(DEF_DEF,1,0)"}
(1 row)
I created a custom type mapper as per your samples.
Can this be fixed somehow? Thanks!
Exception in thread "main" java.lang.IllegalArgumentException: unsupported token CTString(List(SingleQuote, Chunk(ABC ABC), SingleQuote))
at com.github.tminglei.slickpg.utils.PGObjectTokenizer$PGTokenReducer$.com$github$tminglei$slickpg$utils$PGObjectTokenizer$PGTokenReducer$$mergeString$1(PGObjectTokenizer.scala:58)
at com.github.tminglei.slickpg.utils.PGObjectTokenizer$PGTokenReducer$$anonfun$1.applyOrElse(PGObjectTokenizer.scala:87)
at com.github.tminglei.slickpg.utils.PGObjectTokenizer$PGTokenReducer$$anonfun$1.applyOrElse(PGObjectTokenizer.scala:84)
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
at scala.collection.TraversableLike$$anonfun$collect$1.apply(TraversableLike.scala:278)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.TraversableLike$class.collect(TraversableLike.scala:278)
at scala.collection.AbstractTraversable.collect(Traversable.scala:105)
at com.github.tminglei.slickpg.utils.PGObjectTokenizer$PGTokenReducer$.com$github$tminglei$slickpg$utils$PGObjectTokenizer$PGTokenReducer$$mergeComposite$1(PGObjectTokenizer.scala:84)
at com.github.tminglei.slickpg.utils.PGObjectTokenizer$PGTokenReducer$.compose(PGObjectTokenizer.scala:100)
at com.github.tminglei.slickpg.utils.PGObjectTokenizer$$anonfun$tokenParser$2.apply(PGObjectTokenizer.scala:333)
at com.github.tminglei.slickpg.utils.PGObjectTokenizer$$anonfun$tokenParser$2.apply(PGObjectTokenizer.scala:333)
at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$flatMap$1.apply(Parsers.scala:239)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$flatMap$1.apply(Parsers.scala:239)
at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
at scala.util.parsing.combinator.RegexParsers$class.parse(RegexParsers.scala:144)
at com.github.tminglei.slickpg.utils.PGObjectTokenizer.parse(PGObjectTokenizer.scala:9)
at scala.util.parsing.combinator.RegexParsers$class.parseAll(RegexParsers.scala:156)
at com.github.tminglei.slickpg.utils.PGObjectTokenizer.parseAll(PGObjectTokenizer.scala:9)
at com.github.tminglei.slickpg.utils.PGObjectTokenizer.tokenize(PGObjectTokenizer.scala:337)
at com.github.tminglei.slickpg.utils.PGObjectTokenizer$.tokenize(PGObjectTokenizer.scala:356)
at com.github.tminglei.slickpg.utils.TypeConverters$Util$$anon$4.apply(TypeConverters.scala:123)
at com.github.tminglei.slickpg.utils.TypeConverters$Util$$anon$4.apply(TypeConverters.scala:117)
at scala.Option.map(Option.scala:145)
at com.github.tminglei.slickpg.array.PgArrayJavaTypes$ArrayListJavaType.nextValue(PgArrayJavaTypes.scala:29)
at com.github.tminglei.slickpg.array.PgArrayJavaTypes$ArrayListJavaType.nextValue(PgArrayJavaTypes.scala:12)
at scala.slick.driver.JdbcTypesComponent$JdbcType$class.nextValueOrElse(JdbcTypesComponent.scala:30)
at com.github.tminglei.slickpg.array.PgArrayJavaTypes$ArrayListJavaType.nextValueOrElse(PgArrayJavaTypes.scala:12)
at scala.slick.jdbc.JdbcMappingCompilerComponent$MappingCompiler$$anon$1.read(JdbcMappingCompilerComponent.scala:23)
at scala.slick.jdbc.JdbcMappingCompilerComponent$MappingCompiler$$anon$1.read(JdbcMappingCompilerComponent.scala:20)
at scala.slick.profile.RelationalMappingCompilerComponent$ProductResultConverter$$anonfun$read$1.apply(RelationalProfile.scala:244)
at scala.slick.profile.RelationalMappingCompilerComponent$ProductResultConverter$$anonfun$read$1.apply(RelationalProfile.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at scala.slick.profile.RelationalMappingCompilerComponent$ProductResultConverter.read(RelationalProfile.scala:244)
at scala.slick.profile.RelationalMappingCompilerComponent$ProductResultConverter.read(RelationalProfile.scala:243)
at scala.slick.profile.RelationalMappingCompilerComponent$TypeMappingResultConverter.read(RelationalProfile.scala:262)
at scala.slick.driver.JdbcInvokerComponent$QueryInvoker.extractValue(JdbcInvokerComponent.scala:44)
at scala.slick.jdbc.StatementInvoker$$anon$1.extractValue(StatementInvoker.scala:36)
at scala.slick.jdbc.PositionedResultIterator.foreach(PositionedResult.scala:211)
at scala.slick.jdbc.Invoker$class.foreach(Invoker.scala:98)
at scala.slick.jdbc.StatementInvoker.foreach(StatementInvoker.scala:9)
at scala.slick.jdbc.Invoker$class.build(Invoker.scala:69)
at scala.slick.jdbc.StatementInvoker.build(StatementInvoker.scala:9)
at scala.slick.jdbc.Invoker$class.list(Invoker.scala:59)
at scala.slick.jdbc.StatementInvoker.list(StatementInvoker.scala:9)
at scala.slick.jdbc.UnitInvoker$class.list(Invoker.scala:157)
at scala.slick.driver.JdbcInvokerComponent$UnitQueryInvoker.list(JdbcInvokerComponent.scala:50)
Hi, I'm trying to get a bounding box around the geometries in a table (the geometry field is called geom) with something like:
t.map(r =>
(r.geom.xmin.min, r.geom.ymin.min, r.geom.xmax.max, r.geom.ymax.max)
)
However, this fails with errors like: value min is not a member of scala.slick.lifted.Column[Float]. Are some implicit conversions not being applied? Or am I doing something wrong? Thanks for your help!
Firstly, thanks a lot for this excellent library that helps to leverage the full power of pg with slick.
Is it possible to have an updated version based on slick 2.0-M3? I see that there is a branch, but it seems a bit old.
Thanks again for this great stuff.
First of all, thank you for this excellent project and all your hard work. I'd like to understand how to use play json in my non-Play app. I know I can add the dependency to use the play json lib in a standalone way, but I'm not clear on how to use it as a total solution to have some column[JsValue] and then do queries using postgres json operators.
You've provided an example of your formatters here:
The thing I'm not clear on is how to do this without using Play directly. In my project, I don't have access to the play.api.data package. So how do I provide formatters? An example of how to do this outside of a Play app would be really helpful.
Thanks again.
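For what it's worth, play-json is usable standalone: only the play-json artifact needs to be on the classpath, no Play application required. A minimal sketch (the Addr case class is hypothetical):

```scala
import play.api.libs.json._

// standalone play-json: just depend on "com.typesafe.play" %% "play-json"
case class Addr(street: String, city: String)

// the Json.format macro derives a Format (Reads + Writes) for the case class
implicit val addrFormat: Format[Addr] = Json.format[Addr]

val js: JsValue = Json.toJson(Addr("Main St", "Springfield"))
```

The resulting implicit Format is what slick-pg's play-json addon needs in scope when reading and writing the JsValue columns.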
I probably just missed it, but is there a way to use the converters with a static SQL query? The documentation shows using a GetResult[T] that reads the values from the result using methods like nextInt, nextString, etc.
http://slick.typesafe.com/doc/2.0.1/sql.html
I have several rather nasty, handwritten queries we use with PostGIS that return some geometry or JSON.
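For context, plain-SQL queries bypass the lifted type mappers entirely, so a GetResult has to read each column itself. A sketch under the assumption that the geometry is returned as WKT text via ST_AsText (the GeomRow name and the geonames table come from the earlier question and are illustrative):

```scala
import scala.slick.jdbc.{GetResult, StaticQuery => Q}

case class GeomRow(name: String, wkt: String)

// read the columns positionally from the ResultSet
implicit val getGeomRow = GetResult(r => GeomRow(r.nextString, r.nextString))

val q = Q.queryNA[GeomRow]("select name, ST_AsText(geom) from geonames limit 10")
// q.list  (inside a session) returns List[GeomRow]
```

Casting the geometry to text (or to JSON with row_to_json) in the SQL itself keeps the GetResult simple; parsing the WKT back into a JTS Geometry would then happen in Scala.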
Hello. I've got a problem with slick-pg.
I've defined a schema with a Bool subtype of a composite type.
When I fetch it, I see this error:
java.lang.IllegalArgumentException: For input string: "t"
at scala.collection.immutable.StringLike$class.parseBoolean(StringLike.scala:238)
at scala.collection.immutable.StringLike$class.toBoolean(StringLike.scal
In PG the row looks good: {"(21,ff,t)"}
So I think some extra conversion from pg's boolean representation ("t"/"f") is needed.
I'm receiving this error on Postgres 9.2.1, for an hstore column mapped to Map[String, String].
The exception is thrown from PgHStoreSupport, line 97.
hello,
thanks for this contribution.
I wanted to use your plugin by adding
"com.github.tminglei" % "slick-pg_2.10.1" % "0.1.1",
to my dependencies, but I get the following error:
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: postgresql#postgresql;9.2-1002.jdbc4: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
Since I want to use it on Postgres 9.1, I decided to fork your project to change the dependencies. But then I get the following error:
error sbt.ResolveException: unresolved dependency: com.github.vallettea#slick-pg_2.10;0.1.1: not found
Any help would be appreciated.
hello,
when deploying on Ubuntu I got:
postgresql#postgresql;9.2-1002.jdbc4: not found
I don't think this dependency exists (at least not in Sonatype or Maven Central).
Maybe you should change the project's dependency to:
"org.jumpmind.symmetric.jdbc" % "postgresql" % "9.2-1002-jdbc4",
It seems there is no easy way to use pg-enum types with Slick. See http://stackoverflow.com/questions/22945485/how-to-map-postgresql-custom-enum-column-with-slick2-0-1
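For reference, slick-pg does ship a PgEnumSupport trait for exactly this; a sketch of wiring it into a custom driver (the enum name and values here are placeholders, and the pg enum type "weekday" is assumed to exist in the database):

```scala
import scala.slick.driver.PostgresDriver
import com.github.tminglei.slickpg._

object Weekday extends Enumeration {
  val Mon, Tue, Wed = Value
}

trait MyPostgresDriver extends PostgresDriver with PgEnumSupport {
  trait MyEnumImplicits {
    // map the pg enum type "weekday" to the Scala Enumeration
    implicit val weekdayTypeMapper = createEnumJdbcType("weekday", Weekday)
    implicit val weekdayListTypeMapper = createEnumListJdbcType("weekday", Weekday)
  }
}
```

With the implicits in scope, a column[Weekday.Value] round-trips through the native pg enum type instead of text.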
Hi and thanks for your great work! I need to use the ST_XMin and ST_YMin PostGIS functions, which are currently missing from slick-pg's geom addon. What would be an easy way to add them (and other functions if they are required)?
Thanks,
Anton
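Until such functions land in slick-pg itself, missing PostGIS accessors can be declared directly with Slick's SimpleFunction by their SQL names; a sketch, assuming Float return columns and the driver's simple._ import in scope for the implicit TypedType:

```scala
import scala.slick.lifted.SimpleFunction
import com.vividsolutions.jts.geom.Geometry

// declare the missing unary PostGIS accessors by SQL name
val stXMin = SimpleFunction.unary[Geometry, Float]("ST_XMin")
val stYMin = SimpleFunction.unary[Geometry, Float]("ST_YMin")

// usage sketch, combining with aggregate min/max for a bounding box:
// table.map(r => (stXMin(r.geom).min, stYMin(r.geom).min))
```

This also answers the earlier bounding-box question: .min is an aggregate on a numeric column, so the geometry first has to be projected to numbers via functions like these.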
hello,
I would like to have the geom field (in a PostGIS database) rendered as text (lon, lat). I understand one should use asText or asLatLonText, but I get:
not found: value asText
[error] val test = OsmNodes.map(n => (n.id, asText(n.geom))).list().head
[error] ^
with this statement:
val test = OsmNodes.map(n => (n.id, asText(n.geom))).list()
What am I doing wrong?
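A likely cause: in slick-pg, asText is an extension method on the lifted geometry column, not a free-standing function, so it is called on n.geom once the custom driver's PostGIS implicits are imported. A sketch (MyPostgresDriver stands in for whatever your driver object is called):

```scala
import MyPostgresDriver.simple._  // hypothetical driver mixing in PostGISImplicits

// asText becomes available on the column itself
val test = OsmNodes.map(n => (n.id, n.geom.asText)).list().headOption
```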
Examples:
scala> {val d= LocalDateTime.now; (d, localDateTime2sqlTimestamp(d))}
returns:
res12: (java.time.LocalDateTime, java.sql.Timestamp) = (2014-06-14T18:30:41.818Z,2014-06-14 18:44:19.0)
One of the major features in Slick 2.x is the code generator, which can introspect the database schema and generate all the Slick table boilerplate. This code generation can also be customized.
It would be great if slick-pg could offer a customized version of this generator that generates all the right data types like RANGE, POINT, etc.
hello,
is there a reason why common operations on geometries do not work on Point, which is a subclass of Geometry?
Example:
Cannot perform option-mapped operation
[error] with type: (com.vividsolutions.jts.geom.Geometry, com.vividsolutions.jts.geom.Point) => R
[error] for base type: (com.vividsolutions.jts.geom.Geometry, com.vividsolutions.jts.geom.Geometry) => Double
[error] MeteoStationTable.map(r => r).sortBy(r => (r.geom <-> point.bind)).map(r => (r.stationId, r.geom.distance(point.bind))).list().toMap
[error] ^
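One workaround is to widen the bound value to Geometry before binding, since the option-mapped operators are defined over (Geometry, Geometry); a sketch:

```scala
import com.vividsolutions.jts.geom.{Geometry, Point}

val point: Point = ???            // the query point
val geomPoint: Geometry = point   // upcast so both operand types are Geometry

// sketch: the <-> operator now resolves against (Geometry, Geometry)
// MeteoStationTable.sortBy(r => r.geom <-> geomPoint.bind)
```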
I'm pretty new to the Postgres JSON operators and to slick-pg (which is awesome, btw). I'm wondering if there's a way to do a query based on some nested JSON object. For example, given:
{"a":101,"b":"aaa","foo":{"bar":true},"c":[3,4,5,9]}
is there a way to filter for all rows where foo.bar == true?
Thanks a lot!
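A sketch of what this might look like with slick-pg's JSON extension methods, where +> corresponds to Postgres's -> (get object field as JSON) and +>> to ->> (get field as text); this assumes the custom driver mixes in the JSON implicits and that the column is called json:

```scala
// navigate into the nested object with +>, extract the leaf as
// text with +>>, and compare against the literal "true"
rows.filter(r => r.json.+>("foo").+>>("bar") === "true").list
```

The exact operator aliases depend on the slick-pg version in use, so check the JSON support trait for the current names.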
Hi,
Is it possible to have support for an array of hstores mapping to List[Map[String, String]]?
Can I add this as a custom mapping right now?
Thanks!
Hi,
When I declare enums or arrays and give them default values, slick-pg doesn't quote them in the DDL. I didn't test inserts and updates, but it's probably the same.
Here's an example:
// -------------------------------------------------------------------------------
object Gender extends Enumeration {
  val Male, Female = Value
}
type Gender = Gender.Value

case class T(id: Long, tags: List[String], gender: Gender)

class Ts(tag: Tag) extends Table[T](tag, "tees") {
  val id = column[Long]("id", O.AutoInc, O.NotNull)
  val tags = column[List[String]]("tags", O.NotNull, O.Default(List()))
  val gender = column[Gender]("gender", O.NotNull, O.Default(Gender.Male))
  def * = (id, tags, gender) <> (T.tupled, T.unapply)
}
val Ts = TableQuery[Ts]

trait MyPostgresDriver extends PostgresDriver
    with PgArraySupport
    with PgEnumSupport {

  override lazy val Implicit = new ImplicitsPlus {}
  override val simple = new SimpleQLPlus {}

  trait ImplicitsPlus extends Implicits
    with ArrayImplicits
    with EnumImplicits

  trait SimpleQLPlus extends SimpleQL
    with ImplicitsPlus
    with EnumImplicits

  trait EnumImplicits {
    implicit val genderTypeMapper = createEnumJdbcType("gender", Gender)
    implicit val genderListTypeMapper = createEnumListJdbcType("gender", Gender)
    implicit val genderColumnExtensionMethodsBuilder = createEnumColumnExtensionMethodsBuilder(Gender)
  }
}
object MyPostgresDriver extends MyPostgresDriver

import MyPostgresDriver.simple._

PgEnumSupportUtils.buildCreateSql("gender", Gender) // create type gender as enum ('Male', 'Female');
Ts.ddl.createStatements.foreach(s => println(s + ";"))
Gives:
create table "tees" ("id" SERIAL NOT NULL,"tags" text ARRAY DEFAULT {} NOT NULL,"gender" gender DEFAULT Male NOT NULL);
The gender and tags default values should be quoted:
create table "tees" ("id" SERIAL NOT NULL,"tags" text ARRAY DEFAULT '{}' NOT NULL,"gender" gender DEFAULT 'Male' NOT NULL);
Thanks.
I just copied the first block and get this:
class file needed by PgArraySupport is missing.
Do you know where this can come from?
Months in java.util.Calendar are 0-based, whereas in joda-time months are 1-based.
http://docs.oracle.com/javase/7/docs/api/java/util/Calendar.html#set(int, int, int, int, int, int)
month - the value used to set the MONTH calendar field. Month value is 0-based. e.g., 0 for January.
http://joda-time.sourceforge.net/apidocs/org/joda/time/LocalDate.html#LocalDate(int, int, int)
monthOfYear - the month of the year, from 1 to 12
This can be seen by reading a row that contains an existing March 30th (2014-03-30) date, i.e. one that was not inserted using the joda-time mapper.
It fails with:
org.joda.time.IllegalFieldValueException: Value 30 for dayOfMonth must be in the range [1,28]
at org.joda.time.field.FieldUtils.verifyValueBounds(FieldUtils.java:236)
at org.joda.time.chrono.BasicChronology.getDateMidnightMillis(BasicChronology.java:614)
at org.joda.time.chrono.BasicChronology.getDateTimeMillis(BasicChronology.java:159)
at org.joda.time.chrono.AssembledChronology.getDateTimeMillis(AssembledChronology.java:120)
at org.joda.time.LocalDate.<init>(LocalDate.java:457)
at org.joda.time.LocalDate.<init>(LocalDate.java:436)
at com.github.tminglei.slickpg.PgDateSupportJoda$class.com$github$tminglei$slickpg$PgDateSupportJoda$$sqlDate2jodaDate(PgDateSupportJoda.scala:53)
at com.github.tminglei.slickpg.PgDateSupportJoda$DateTimeImplicits$$anonfun$1.apply(PgDateSupportJoda.scala:23)
at com.github.tminglei.slickpg.PgDateSupportJoda$DateTimeImplicits$$anonfun$1.apply(PgDateSupportJoda.scala:23)
at scala.Option.map(Option.scala:145)
at com.github.tminglei.slickpg.date.PgDateJavaTypes$DateJdbcType.nextValue(PgDateJavaTypes.scala:33)
at scala.slick.driver.JdbcTypesComponent$JdbcType$class.nextValueOrElse(JdbcTypesComponent.scala:30)
at com.github.tminglei.slickpg.date.PgDateJavaTypes$DateJdbcType.nextValueOrElse(PgDateJavaTypes.scala:17)
at scala.slick.jdbc.JdbcMappingCompilerComponent$MappingCompiler$$anon$1.read(JdbcMappingCompilerComponent.scala:23)
at scala.slick.jdbc.JdbcMappingCompilerComponent$MappingCompiler$$anon$1.read(JdbcMappingCompilerComponent.scala:20)
at scala.slick.profile.RelationalMappingCompilerComponent$ProductResultConverter$$anonfun$read$1.apply(RelationalProfile.scala:244)
at scala.slick.profile.RelationalMappingCompilerComponent$ProductResultConverter$$anonfun$read$1.apply(RelationalProfile.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at scala.slick.profile.RelationalMappingCompilerComponent$ProductResultConverter.read(RelationalProfile.scala:244)
at scala.slick.profile.RelationalMappingCompilerComponent$ProductResultConverter.read(RelationalProfile.scala:243)
at scala.slick.driver.JdbcInvokerComponent$QueryInvoker.extractValue(JdbcInvokerComponent.scala:44)
at scala.slick.jdbc.StatementInvoker$$anon$1.extractValue(StatementInvoker.scala:36)
at scala.slick.jdbc.PositionedResultIterator.foreach(PositionedResult.scala:211)
at scala.slick.jdbc.Invoker$class.foreach(Invoker.scala:98)
at scala.slick.jdbc.StatementInvoker.foreach(StatementInvoker.scala:9)
at scala.slick.jdbc.Invoker$class.build(Invoker.scala:69)
at scala.slick.jdbc.StatementInvoker.build(StatementInvoker.scala:9)
at scala.slick.jdbc.Invoker$class.list(Invoker.scala:59)
at scala.slick.jdbc.StatementInvoker.list(StatementInvoker.scala:9)
at scala.slick.jdbc.UnitInvoker$class.list(Invoker.scala:157)
at scala.slick.driver.JdbcInvokerComponent$UnitQueryInvoker.list(JdbcInvokerComponent.scala:50)
at Foo$$anonfun$run$1.apply(<console>:21)
For example, the mapping from a Calendar to a LocalDate is incorrect
(https://github.com/tminglei/slick-pg/blob/master/addons/joda-time/src/main/scala/com/github/tminglei/slickpg/PgDateSupportJoda.scala#L55), as it does not add 1 to the MONTH.
The opposite direction is also affected: https://github.com/tminglei/slick-pg/blob/master/addons/joda-time/src/main/scala/com/github/tminglei/slickpg/PgDateSupportJoda.scala#L55 does not subtract 1 from the month.
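The off-by-one can be demonstrated with plain java.util.Calendar: its MONTH field is 0-based (0 = January), so any conversion to a 1-based API like joda-time must add 1, and the reverse conversion must subtract 1:

```scala
import java.util.Calendar

val cal = Calendar.getInstance()
cal.clear()
cal.set(2014, Calendar.MARCH, 30)             // Calendar.MARCH == 2 (0-based)

// converting to a 1-based monthOfYear (joda-time, java.time) needs +1
val monthOfYear = cal.get(Calendar.MONTH) + 1 // 3, i.e. March

// converting back from 1-based to Calendar needs -1
val calendarMonth = monthOfYear - 1           // 2
```

Forgetting the +1 silently shifts every date back one month, which is exactly why a March 30th row fails joda's dayOfMonth validation: month 2 (February in 1-based terms) cannot hold day 30.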
Hello,
After the Scala 2.11 release, Play Framework has released 2.3-RC1 and Slick has released 2.1-M1 with builds for Scala 2.11.
It would be great if slick-pg could also offer a build for Scala 2.11. I am planning to create a Typesafe Activator template with this combination.
Thanks again for the great library.
I accidentally stumbled on https://github.com/tototoshi/slick-joda-mapper, which raised some questions about the JodaTime support. The main reason for these questions is that we seem to have multiple people solving exactly the same problem in regard to Slick, which isn't exactly a good thing.
For a very large table (2.5 million nodes), the following query
val all_nodes = OsmNodes.map(p => p).list
fails with:
error java.lang.OutOfMemoryError: GC overhead limit exceeded
java.lang.OutOfMemoryError: GC overhead limit exceeded
at scala.slick.util.ProductLinearizer.getResult(ValueLinearizer.scala:62)
at scala.slick.util.ProductLinearizer.getResult(ValueLinearizer.scala:45)
at scala.slick.driver.BasicInvokerComponent$QueryInvoker.extractValue(BasicInvokerComponent.scala:26)
at scala.slick.jdbc.StatementInvoker$$anon$1.extractValue(StatementInvoker.scala:37)
at scala.slick.session.PositionedResultIterator.foreach(PositionedResult.scala:212)
at scala.slick.jdbc.Invoker$class.foreach(Invoker.scala:91)
at scala.slick.jdbc.StatementInvoker.foreach(StatementInvoker.scala:10)
at scala.slick.jdbc.Invoker$class.build(Invoker.scala:66)
at scala.slick.jdbc.StatementInvoker.build(StatementInvoker.scala:10)
at scala.slick.jdbc.Invoker$class.toMap(Invoker.scala:59)
at scala.slick.jdbc.StatementInvoker.toMap(StatementInvoker.scala:10)
at scala.slick.jdbc.UnitInvoker$class.toMap(Invoker.scala:153)
at scala.slick.driver.BasicInvokerComponent$QueryInvoker.toMap(BasicInvokerComponent.scala:19)
at net.snips.pogistan.PostGisQuery$$anonfun$fetchval$1.apply(PostGisQuery.scala:40)
at net.snips.pogistan.PostGisQuery$$anonfun$fetchval$1.apply(PostGisQuery.scala:28)
at scala.slick.session.Database.withSession(Database.scala:38)
at net.snips.pogistan.PostGisQuery$.fetchval(PostGisQuery.scala:28)
at net.snips.pogistan.Main$delayedInit$body.apply(Main.scala:8)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
at scala.App$class.main(App.scala:71)
at net.snips.pogistan.Main$.main(Main.scala:5)
at net.snips.pogistan.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
Is there a way to query in chunks or to create a stream?
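One option is to page through the table with drop/take instead of materializing everything with a single .list; a sketch (chunk size arbitrary, and a stable sort key is needed so pages don't overlap):

```scala
// sketch: fetch the table in pages so only one chunk is in memory at a time
val chunkSize = 50000

def processAll()(implicit session: Session): Unit =
  Iterator.from(0)
    .map(i => OsmNodes.sortBy(_.id).drop(i * chunkSize).take(chunkSize).list)
    .takeWhile(_.nonEmpty)
    .foreach { chunk =>
      // handle one chunk; previous chunks can be garbage collected
    }
```

Alternatively, Slick's invokers expose an elements iterator backed by the open ResultSet, which streams rows without building a List, at the cost of keeping the session open while iterating.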
To use the latest slick-pg 0.2.2.1 I had to bump my version of json4s.
"org.json4s" %% "json4s-native" % "3.2.6"
https://github.com/json4s/json4s - sbt installation.
I don't know why exactly, since the 3.2.5 build for 2.10 shows up fine in Maven.
http://repo1.maven.org/maven2/org/json4s/json4s-native_2.10/
The json4s native package wasn't being picked up with the 3.2.5 version, but works fine with 3.2.6.
hello,
I could not find the proper way to query all OsmNodes whose tags are not {}.
The following filter works:
OsmNodes.map(s => s).filter(_.tags != Map())
but I guess it fetches all nodes at some point.
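Note that in Slick's lifted embedding, plain Scala != compares the Column objects on the client and never translates to SQL; the lifted inequality operator is =!=. A sketch, assuming the hstore implicits from the custom driver are in scope:

```scala
// =!= is translated into a SQL WHERE clause, so the filtering
// happens in the database rather than after fetching every row
OsmNodes.filter(_.tags =!= Map.empty[String, String].bind).list
```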
This isn't really an issue, just stating that I am using slick-pg with Slick 2.0.2 and I am not having any issues.
hello,
now that I have finally understood slick-pg (and Scala more generally), I have scaled up my code and encountered performance issues.
For example, in my psql shell:
osm=# explain analyze select x2."id", x2."geom" from "nodes" x2 where x2."id" in (196101, 2121445348, 196100, 196331);
QUERY PLAN
-----------------------------------------------------------------------------------------------------------------
Bitmap Heap Scan on nodes x2 (cost=17.31..33.24 rows=4 width=108) (actual time=0.000..0.000 rows=4 loops=1)
Recheck Cond: (id = ANY ('{196101,2121445348,196100,196331}'::bigint[]))
-> Bitmap Index Scan on id_node (cost=0.00..17.31 rows=4 width=0) (actual time=0.000..0.000 rows=4 loops=1)
Index Cond: (id = ANY ('{196101,2121445348,196100,196331}'::bigint[]))
Total runtime: 0.000 ms
(5 rows)
so very fast. The equivalent query in slick-pg:
scala> db withSession { implicit session: Session =>
| val highways = OsmWays.getWaysOfType(session, "highway")
| val h1 = highways.next()
| val h2 = highways.next()
| val query = OsmNodes.where(_.id inSetBind h2.nodes).map(p => (p.id -> p.geom))
| println(s"query = ${query.selectStatement}")
| timeit(query.toMap)
| }
query = select x2."id", x2."geom" from "nodes" x2 where x2."id" in (?, ?, ?, ?)
Evaluation took: 21 ms
Do you know where this huge difference comes from?
thanks
Hello tminglei,
I tried your new release of slick-pg and I have a compilation problem when I go from "slick-pg_2.10.1" % "0.1.1" to "slick-pg_2.10.1" % "0.2.2". It seems that PostGIS is no longer supported:
[error] /Users/vallette/projects/pogistan/pogistan-postgis/src/main/scala/net/snips/pogistan/postgis/SimplePostgresDriver.scala:27: not found: type PostGISSupport
[error] with PostGISSupport {
[error] ^
[info] Resolving commons-digester#commons-digester;1.8 ...
[error] /Users/vallette/projects/pogistan/pogistan-postgis/src/main/scala/net/snips/pogistan/postgis/SimplePostgresDriver.scala:42: not found: type PostGISAssistants
[error] with PostGISAssistants
[error] ^
[info] Resolving com.tinkerpop.rexster#rexster-protocol;2.3.0 ...
[error] /Users/vallette/projects/pogistan/pogistan-postgis/src/main/scala/net/snips/pogistan/postgis/SimplePostgresDriver.scala:37: not found: type PostGISImplicits
[error] with PostGISImplicits
[error] ^
Just an FYI,
I hacked together a way of doing records and arrays of records. I started working on it from this Stackoverflow question.
The adapted mappers are here.
It's based on regular expressions and pattern matching. Rather nasty. I hope you can come up with a nicer way of doing it and add it to the project <3. There should be a way of generating the matchers and regular expressions from the case classes, but my Scala metaprogramming skills aren't up to it :D I need records in hstore in the next few days. I'm dreading having to alter hstore, as the operator support looks tasty and I would want those features.
P.S. Sorry if this is not the right place to put this; I don't know where feature requests go.
Regards
Hassan
hello,
I'm trying to get all Lines whose centroid is within an area:
Lines.map(r => r).where(r => (r.geom.centroid within area.geom.bind)).map(r => r.uid).list()
but I get:
Cannot perform option-mapped operation
[error] with type: (com.vividsolutions.jts.geom.Point, com.vividsolutions.jts.geom.Geometry) => R
[error] for base type: (com.vividsolutions.jts.geom.Geometry, com.vividsolutions.jts.geom.Geometry) => Boolean
[error] val sectionsInside = Sections.map(r => r).where(r => (r.geom.centroid within boroughDb.geom.bind)).map(r => r.uid).list()
[error] ^
but normally jts.geom.Point inherits within from jts.geom.Geometry?
Thanks,
Could you do the following for me:
Array[ and Row( formatting, for 3-4 levels. I ask for 1 because I don't have much experience with JDBC, and I ask for 2 because I was developing the parser code in a messy project with messy tests. It would be easier if I opened the slick-pg project in the IDE directly, and I don't have any experience with how to write unit tests in Scala :D
What do I need to import to be able to delete with a given query? I have imported MyPostgresDriver.profile.simple._ but I'm getting "value delete is not a member of scala.slick.lifted.Query".
Am I missing a required import?
Sorry again.
I have trouble figuring out how I should use your code. I have attached two files, and it does not seem to work because I get the following error:
[error] /home/ubuntu/projects/pogistan/src/net/snips/pogistan/test.scala:55: could not find implicit value for parameter tm: scala.slick.ast.TypedType[Map[String,String]]
[error] def tags = columnMap[String, String]
[error] ^
[error] /home/ubuntu/projects/pogistan/src/net/snips/pogistan/test.scala:56: could not find implicit value for parameter tm: scala.slick.ast.TypedType[List[Int]]
[error] def nodes = columnList[Int]
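Those "could not find implicit value for parameter tm" errors usually mean the custom driver's implicits are not in scope at the table definition; importing them is the fix. A sketch (MyPostgresDriver stands in for whatever your driver object is called, and it is assumed to mix in PgArraySupport and PgHStoreSupport):

```scala
import MyPostgresDriver.simple._  // brings the Map/List TypedTypes into scope

class Nodes(tag: Tag) extends Table[(Map[String, String], List[Int])](tag, "nodes") {
  // both column types resolve only via the driver's implicits,
  // not via the stock PostgresDriver import
  def tags  = column[Map[String, String]]("tags")
  def nodes = column[List[Int]]("nodes")
  def * = (tags, nodes)
}
```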
hi @hsyed, I added some unit tests for PGObjectTokenizer; you can check them here.
But some of them fail:
...
///
val input1 = """(111,test,"test desc",)"""
val expected1 =
CompositeE(List(
ValueE("111"),
ValueE("test"),
ValueE("test desc"),
null
))
assertEquals(expected1, PGObjectTokenizer(input1))
/**
java.lang.AssertionError:
Expected :CompositeE(List(ValueE(111), ValueE(test), ValueE(test desc), null))
Actual :CompositeE(List(ValueE(111), ValueE(test), ValueE(test desc)))
*/
///
val input2 = """(111,test,,)"""
val expected2 =
CompositeE(List(
ValueE("111"),
ValueE("test"),
null,
null
))
assertEquals(expected2, PGObjectTokenizer(input2))
/**
java.lang.AssertionError:
Expected :CompositeE(List(ValueE(111), ValueE(test), null, null))
Actual :CompositeE(List(ValueE(111), ValueE(test)))
*/
//
val input3 = """(111,,"test desc",)"""
val expected3 =
CompositeE(List(
ValueE("111"),
null,
ValueE("test desc"),
null
))
assertEquals(expected3, PGObjectTokenizer(input3))
...
///
val input1 = """(115,"(111,test,""test dd"",hi)",)"""
val expected1 =
CompositeE(List(
ValueE("115"),
CompositeE(List(
ValueE("111"),
ValueE("test"),
ValueE("test dd"),
ValueE("hi"))),
null
))
assertEquals(expected1, PGObjectTokenizer(input1))
///
val input2 = """(115,,{157})"""
val expected2 =
CompositeE(List(
ValueE("115"),
null,
ArrayE(List(
ValueE("157")
))
))
assertEquals(expected2, PGObjectTokenizer(input2))
...
It seems all the failing cases are related to null value parsing.
Can you help take a look at it?
P.S.: maybe we also need a Token/Element type for null values.
It would be great to have a play application in the examples.
hello, I guess there is a small bug in the joda implementation, because I'm able to write to the database but retrieval produces this error:
[error] (run-main) org.joda.time.IllegalFieldValueException: Value 0 for monthOfYear must be in the range [1,12]
org.joda.time.IllegalFieldValueException: Value 0 for monthOfYear must be in the range [1,12]
at org.joda.time.field.FieldUtils.verifyValueBounds(FieldUtils.java:236)
at org.joda.time.chrono.BasicChronology.getDateMidnightMillis(BasicChronology.java:613)
at org.joda.time.chrono.BasicChronology.getDateTimeMillis(BasicChronology.java:177)
at org.joda.time.chrono.AssembledChronology.getDateTimeMillis(AssembledChronology.java:133)
at org.joda.time.LocalDateTime.<init>(LocalDateTime.java:511)
at org.joda.time.LocalDateTime.<init>(LocalDateTime.java:481)
at com.github.tminglei.slickpg.PgDateSupportJoda$class.com$github$tminglei$slickpg$PgDateSupportJoda$$sqlTimestamp2jodaDateTime(PgDateSupportJoda.scala:78)
at com.github.tminglei.slickpg.PgDateSupportJoda$DateTimeImplicits$$anonfun$5.apply(PgDateSupportJoda.scala:20)
at com.github.tminglei.slickpg.PgDateSupportJoda$DateTimeImplicits$$anonfun$5.apply(PgDateSupportJoda.scala:20)
at scala.Option.map(Option.scala:145)
at com.github.tminglei.slickpg.date.TimestampTypeMapper.nextValue(TimestampTypeMapper.scala:26)
at scala.slick.lifted.TypeMapperDelegate$class.nextValueOrElse(TypeMapper.scala:158)
at com.github.tminglei.slickpg.date.TimestampTypeMapper.nextValueOrElse(TimestampTypeMapper.scala:9)
at scala.slick.lifted.Column.getResult(ColumnBase.scala:28)
at scala.slick.lifted.Projection4.getResult(Projection.scala:177)
at scala.slick.lifted.Projection4.getResult(Projection.scala:164)
at scala.slick.lifted.MappedProjection.getResult(Projection.scala:65)
at scala.slick.lifted.AbstractTable.getResult(AbstractTable.scala:54)
at scala.slick.driver.BasicInvokerComponent$QueryInvoker.extractValue(BasicInvokerComponent.scala:26)
at scala.slick.jdbc.StatementInvoker$$anon$1.extractValue(StatementInvoker.scala:37)
at scala.slick.session.PositionedResultIterator.foreach(PositionedResult.scala:212)
at scala.slick.jdbc.Invoker$class.foreach(Invoker.scala:91)
at scala.slick.jdbc.StatementInvoker.foreach(StatementInvoker.scala:10)
at scala.slick.jdbc.Invoker$class.build(Invoker.scala:66)
at scala.slick.jdbc.StatementInvoker.build(StatementInvoker.scala:10)
at scala.slick.jdbc.Invoker$class.list(Invoker.scala:56)
at scala.slick.jdbc.StatementInvoker.list(StatementInvoker.scala:10)
at scala.slick.jdbc.UnitInvoker$class.list(Invoker.scala:150)
at scala.slick.driver.BasicInvokerComponent$QueryInvoker.list(BasicInvokerComponent.scala:19)
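One plausible cause (an assumption, not confirmed from the slick-pg source): `java.util.Calendar` months are 0-based while Joda-Time's `monthOfYear` is 1-based, so feeding a raw `Calendar.MONTH` value into a Joda constructor fails for January dates with exactly this message. A minimal demonstration of the mismatch:

```scala
import java.util.{Calendar, GregorianCalendar}

// java.util.Calendar months are 0-based: January is 0, December is 11.
// Joda-Time's monthOfYear is 1-based, so a raw Calendar.MONTH value passed
// straight into a Joda constructor fails for January with
// "Value 0 for monthOfYear must be in the range [1,12]".
val cal       = new GregorianCalendar(2014, Calendar.JANUARY, 15)
val rawMonth  = cal.get(Calendar.MONTH) // 0 -- would break a 1-based API
val jodaMonth = rawMonth + 1            // 1 -- what Joda-Time expects
```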
Hi tminglei,
Great work!
I tried the Play-Json example here https://github.com/tminglei/slick-pg/blob/master/addons/play-json/src/test/scala/com/github/tminglei/slickpg/MyPostgresDriver.scala
When I compiled it, the compiler complained:
bad symbolic reference. A signature in PgArrayJdbcTypes.class refers to type JdbcType
[error] in trait scala.slick.driver.JdbcTypesComponent which is not available.
The Slick version I'm using is 2.1.0-M2. Is it not compatible with the current slick-pg?
Thanks.
Hi,
I created the following custom slick pg driver support:
import slick.driver.PostgresDriver
import com.github.tminglei.slickpg._
trait MyPostgresDriver extends PostgresDriver
with PgDateSupportJoda
with PgArraySupport {
override val Implicit = new Implicits with DateTimeImplicits with ArrayImplicits
override val simple = new Implicits with SimpleQL with DateTimeImplicits with ArrayImplicits
}
object MyPostgresDriver extends MyPostgresDriver
Usage:
import java.util.UUID
import persistence.MyPostgresDriver.simple._
class Users(tag: Tag) extends Table[User](tag, "users") {
def id = column[UUID]("id", O.PrimaryKey)
def firstName = column[String]("first_name")
def lastName = column[String]("last_name")
def email = column[String]("email")
def * = (id, firstName, lastName, email) <> (User.tupled, User.unapply)
}
And now my mappings stopped working, with the following error:
service scala.slick.SlickException: JdbcProfile has no TypeInfo for type UUID/UUID
service at scala.slick.driver.JdbcTypesComponent$class.typeInfoFor(JdbcTypesComponent.scala:127)
service at scala.slick.driver.PostgresDriver$.typeInfoFor(PostgresDriver.scala:126)
service at scala.slick.jdbc.JdbcMappingCompilerComponent$MappingCompiler$class.createColumnConverter(JdbcMappingCompilerComponent.scala:18)
service at scala.slick.jdbc.JdbcMappingCompilerComponent$JdbcCodeGen.createColumnConverter(JdbcMappingCompilerComponent.scala:36)
service at scala.slick.profile.RelationalMappingCompilerComponent$MappingCompiler$class.compileMapping(RelationalProfile.scala:215)
service at scala.slick.jdbc.JdbcMappingCompilerComponent$JdbcCodeGen.compileMapping(JdbcMappingCompilerComponent.scala:36)
service at scala.slick.profile.RelationalMappingCompilerComponent$MappingCompiler$$anonfun$compileMapping$1.apply(RelationalProfile.scala:218)
service at scala.slick.profile.RelationalMappingCompilerComponent$MappingCompiler$$anonfun$compileMapping$1.apply(RelationalProfile.scala:218)
Thank you!
My current setup uses slick-pg with play-slick and the latest Play Framework (2.2.1).
It would be great if the play-json library were supported. It even uses the JsValue (Js) prefix, so it wouldn't conflict with the JValue from json4s.
http://www.playframework.com/documentation/2.2.x/ScalaJson
http://repo.typesafe.com/typesafe/releases/com/typesafe/play/play-json_2.10/
Hello,
Which feature in Java 8 is required by slick-pg?
Thanks
Now that Java 8 has been released, how can we use the new Date and Time types with slick-pg?
Is it possible to use the aggregate functions supported in PostgreSQL 9.x+?
The list of functions is at: http://www.postgresql.org/docs/current/static/functions-aggregate.html#FUNCTIONS-AGGREGATE-TABLE
I'm particularly interested in string_agg.
Cheers!
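For reference, `string_agg(expr, delimiter)` joins a group's non-null values with the delimiter and returns null when the group has no non-null rows. In Slick one would typically expose it via `SimpleFunction` or a custom aggregate; the pure-Scala sketch below only illustrates the semantics, not slick-pg's API:

```scala
// string_agg(value, delimiter) over a group: join the group's non-null
// values with the delimiter; null (None) when nothing is present.
def stringAgg(values: Seq[Option[String]], delim: String): Option[String] = {
  val present = values.flatten // drop SQL NULLs, as string_agg does
  if (present.isEmpty) None else Some(present.mkString(delim))
}
```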
If I have a column of type text[], and a text element of that array contains any of the characters
{}[]()
then I get an error. The specific error changed between 0.5.2.1 and 0.5.2.2.
0.5.2.1: java.lang.IllegalArgumentException: unsupported token CTRecord(List(ParenOpen(,0), Chunk(s), ParenClose(,0)))
0.5.2.2: java.util.NoSuchElementException: None.get
In the first example, the actual text value was '(s)'.
Thanks for your good work!
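For context, PostgreSQL's array output only quotes elements containing characters that are special in the array syntax (braces, the comma delimiter, double quotes, backslashes, whitespace) or empty elements; parentheses are not special, so the value `(s)` arrives unquoted as `{(s)}` and a tokenizer must not interpret the bare `(` as composite nesting. A sketch of that quoting rule (illustrative only, not slick-pg's implementation):

```scala
// Render a text[] literal the way PostgreSQL's array_out does: quote an
// element only if it is empty or contains a character special to array
// syntax. Parentheses are NOT special, so "(s)" stays unquoted -- which is
// why a shared composite/array tokenizer must not treat them as nesting.
def renderTextArray(elems: Seq[String]): String = {
  val special = Set('{', '}', ',', '"', '\\', ' ')
  val rendered = elems.map { e =>
    if (e.isEmpty || e.exists(special.contains))
      "\"" + e.replace("\\", "\\\\").replace("\"", "\\\"") + "\""
    else e
  }
  rendered.mkString("{", ",", "}")
}
```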
Hi,
thank you for the project!
I have a small issue with the default DateTime mapping:
When I save/load data from PG mapped TIMESTAMPTZ -> joda.DateTime with the latest version, everything works as expected. But if the SQL field has a default value specified as
created_at TIMESTAMPTZ NOT NULL DEFAULT CURRENT_TIMESTAMP
I cannot read the value properly and get this exception:
Invalid format: "2014-03-14 17:12:53.039271+01" is malformed at ".039271+01"
java.lang.IllegalArgumentException: Invalid format: "2014-03-14 17:12:53.039271+01" is malformed at ".039271+01"
at org.joda.time.format.DateTimeFormatter.parseDateTime(DateTimeFormatter.java:873)
at org.joda.time.DateTime.parse(DateTime.java:144)
at com.github.tminglei.slickpg.PgDateSupportJoda$DateTimeImplicits$$anonfun$8.apply(PgDateSupportJoda.scala:27)
at com.github.tminglei.slickpg.PgDateSupportJoda$DateTimeImplicits$$anonfun$8.apply(PgDateSupportJoda.scala:27)
at com.github.tminglei.slickpg.utils.PgCommonJdbcTypes$GenericJdbcType.nextValue(PgCommonJdbcTypes.scala:26)
at com.github.tminglei.slickpg.utils.PgCommonJdbcTypes$GenericJdbcType.nextValueOrElse(PgCommonJdbcTypes.scala:12)
The problem is probably not related to your lib; the CURRENT_TIMESTAMP value is just not compatible with the Joda converter. Anyway, your help would be greatly appreciated.
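The string PostgreSQL returns here uses a microsecond fraction and an hour-only UTC offset (`+01`), which a fixed millisecond pattern rejects. A parser that accepts this exact shape can be built with a suitable pattern; the sketch below uses `java.time` purely to illustrate the format (it is not slick-pg's actual converter):

```scala
import java.time.OffsetDateTime
import java.time.format.DateTimeFormatter

// PostgreSQL's timestamptz text output: six fraction digits plus an
// hour-only offset ("+01"). 'SSSSSS' matches the microsecond fraction
// and 'X' accepts the hour-only offset form.
val fmt    = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSSSSSX")
val parsed = OffsetDateTime.parse("2014-03-14 17:12:53.039271+01", fmt)
```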
The current Maven distribution does not include sources. The Eclipse Maven plugin downloads sources automatically when they are available, which is very handy for getting insight into slick-pg internals.
slick-pg implements sql.Timestamp, but this kind of object is rather poor for working with dates and times in Scala. The most widely adopted type for this purpose is DateTime:
http://joda-time.sourceforge.net/apidocs/org/joda/time/DateTime.html
I wonder how hard it would be to add this kind of object as a column type?
My current workaround is to use a sql.Timestamp:
case class MyObject (
dateTime: Timestamp
)
and convert my DateTime at the last moment:
insert(MyObject (new Timestamp(dateTime.getMillis())))
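A column mapping would just apply that same conversion on write and read: DateTime is epoch millis under the hood, and the Timestamp round trip is lossless at millisecond precision. The sketch below shows only the conversion pair (the Joda read side would be `new DateTime(ts.getTime)`); the names are illustrative:

```scala
import java.sql.Timestamp

// The two halves a mapped column would apply: DateTime -> Timestamp on
// write (via getMillis) and Timestamp -> DateTime on read (via getTime).
def toTimestamp(millis: Long): Timestamp = new Timestamp(millis)
def toMillis(ts: Timestamp): Long        = ts.getTime

val millis       = 1400000000000L
val roundTripped = toMillis(toTimestamp(millis)) // lossless at ms precision
```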
Play has a nice ability to automatically evolve the database with the click of a button.
If you want to use ENUMs, you can't use this feature, since the evolution fails with an error about the nonexistent type.
Would it be possible to generate enum evolutions along with all the others?
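For reference, what a generated evolution would need is the `CREATE TYPE` DDL emitted before any table that references the enum (all names below are illustrative, not produced by slick-pg):

```sql
-- Ups: the enum type must exist before any table references it.
CREATE TYPE weekday AS ENUM ('Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat', 'Sun');

CREATE TABLE task (
  id  BIGSERIAL PRIMARY KEY,
  day weekday NOT NULL
);

-- Downs: drop in the reverse order, table first.
-- DROP TABLE task;
-- DROP TYPE weekday;
```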
I can create a 3D point with code like
val coordinate = new Coordinate(-129.9990786, 49.3555567, 101)
val geometryFactory = new GeometryFactory(new PrecisionModel(PrecisionModel.FLOATING), 4326)
val geometry = geometryFactory.createPoint(coordinate)
but when I go to write it to a database column with a dimension constraint, I get org.postgresql.util.PSQLException: ERROR: new row for relation "course_photo" violates check constraint "enforce_dims_override_geometry".
My investigation shows that the issue is in PgPostGisSupport.scala:83 (and similar)
if (wkbWriterHolder.get == null) wkbWriterHolder.set(new WKBWriter(2, true))
The 2 parameter to WKBWriter forces everything to 2 dimensions.
Is there a nice way to deal with this? Should I try fixing it myself? And what interface would match the style of the rest of the code? (Sadly, I'm not an experienced enough Scala programmer to be confident about this.)
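One possible direction (a sketch only, not the project's actual API): derive the WKBWriter output dimension from the geometry itself instead of hard-coding 2. In JTS, a 2-D `Coordinate` carries `z == NaN`, so the choice can be made per write; the `Coordinate` below is a simplified stand-in for the JTS class:

```scala
// Simplified stand-in for JTS's Coordinate: z is NaN when the point is 2-D.
case class Coordinate(x: Double, y: Double, z: Double = Double.NaN)

// Pick the WKB output dimension from the coordinate instead of always
// passing 2 to WKBWriter, so 3-D points survive the round trip.
def outputDimension(c: Coordinate): Int = if (c.z.isNaN) 2 else 3
```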