
kaizen-solutions / virgil


A purely functional Cassandra client built on top of the DataStax Java Driver, supporting a variety of effect systems (ZIO, Cats Effect) on both Scala 2 and Scala 3

License: Mozilla Public License 2.0

Scala 100.00%
cassandra cats-effect fs2 scala scala2 scala3 zio zio-streams

virgil's People

Contributors

calvinlfer · mihaisoloi · narma · samgj18 · scala-steward · serhiip


virgil's Issues

Unify Mutations and Queries

Initial exploration

import com.datastax.oss.driver.api.core.cql.PagingState
import io.kaizensolutions.virgil.codecs.Reader
import io.kaizensolutions.virgil.configuration.ExecutionAttributes
import io.kaizensolutions.virgil.dsl.{Assignment, Relation}
import zio._
import zio.stream._

final case class Interaction[+Result](
  interactionType: InteractionType[Result],
  executionAttributes: ExecutionAttributes
)

sealed trait InteractionType[+Result]
object InteractionType {
  final case class Mutation(mutationType: MutationType) extends InteractionType[ExecutionResult]

  final case class BatchMutation(mutations: NonEmptyChunk[Mutation]) extends InteractionType[ExecutionResult]

  final case class SingleElementQuery[Result](queryType: QueryType2, reader: Reader[Result])
      extends InteractionType[Task[Option[Result]]]

  final case class StreamingQuery[Result](queryType: QueryType2, reader: Reader[Result])
      extends InteractionType[Stream[Throwable, Result]]

  final case class PagedQuery[Result](queryType: QueryType2, reader: Reader[Result])
      extends InteractionType[Task[(Chunk[Result], Option[PagingState])]]
}

final case class ExecutionResult(result: Boolean) extends AnyVal

sealed trait MutationType
object MutationType {
  final case class Insert(tableName: String, columns: BindMarkers) extends MutationType

  final case class Update(tableName: String, assignments: NonEmptyChunk[Assignment], relations: NonEmptyChunk[Relation])
      extends MutationType

  final case class Delete(tableName: String, relations: NonEmptyChunk[Relation]) extends MutationType

  final case class Truncate(tableName: String) extends MutationType

  final case class RawCql private (queryString: String, bindMarkers: BindMarkers) extends MutationType
}

sealed trait QueryType2
object QueryType2 {
  final case class Select[FromCassandra](
    tableName: String,
    columnNames: NonEmptyChunk[BindMarkerName],
    relations: Chunk[Relation]
  ) extends QueryType2

  private[virgil] final case class RawCql[FromCassandra] private (
    query: String,
    columns: BindMarkers
  ) extends QueryType2
}

Add `values` to InsertBuilder

Rather than doing

InsertBuilder(tableName)
  .value(col1, data.col1)
  .value(col2, data.col2)
  .value(col3, data.col3)
  .build

We would like to have

InsertBuilder(tableName)
  .values(
    col1 -> data.col1,
    col2 -> data.col2,
    col3 -> data.col3
  )
  .build
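A minimal sketch of what the varargs `values` overload could look like, using a simplified builder (this is not virgil's actual `InsertBuilder`, which tracks bound markers and codecs rather than `Any` values):

```scala
// Simplified builder: `pairs` stands in for the real bound-marker bookkeeping.
final case class InsertBuilder(tableName: String, pairs: Vector[(String, Any)] = Vector.empty) {
  def value(column: String, data: Any): InsertBuilder =
    copy(pairs = pairs :+ (column -> data))

  // The requested API: accept many column/value pairs in one call.
  def values(more: (String, Any)*): InsertBuilder =
    copy(pairs = pairs ++ more)

  def build: String =
    s"INSERT INTO $tableName (${pairs.map(_._1).mkString(", ")}) " +
      s"VALUES (${pairs.map(_ => "?").mkString(", ")})"
}
```

Since `values` just folds the pairs into the same accumulator as `value`, the two styles can be mixed freely.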

Investigate issue with Coveralls

Adding Coveralls to the build causes issues: the build fails on JDK 11.

      ...,
      WorkflowStep.Sbt(
        name = Option("Coverage"),
        commands = List("coverageAggregate"),
        cond = None,
        env = Map.empty
      ),
      WorkflowStep.Sbt(
        name = Option("Coveralls"),
        commands = List("coveralls"),
        cond = None,
        env = Map(
          "COVERALLS_REPO_TOKEN" -> "${{ secrets.GITHUB_TOKEN }}",
          "COVERALLS_FLAG_NAME"  -> "Scala ${{ matrix.scala }}"
        )
      )
[error] java.util.NoSuchElementException: key not found: 2/io/kaizensolutions/virgil/codecs/UdtValueEncoderMagnoliaDerivation.scala
[error]         at scala.collection.MapLike.default(MapLike.scala:236)
[error]         at scala.collection.MapLike.default$(MapLike.scala:235)
[error]         at scala.collection.AbstractMap.default(Map.scala:65)
[error]         at scala.collection.MapLike.apply(MapLike.scala:144)
[error]         at scala.collection.MapLike.apply$(MapLike.scala:143)
[error]         at scala.collection.AbstractMap.apply(Map.scala:65)
[error]         at org.scoverage.coveralls.CoberturaMultiSourceReader.lineCoverage(CoberturaMultiSourceReader.scala:119)
[error]         at org.scoverage.coveralls.CoberturaMultiSourceReader.reportForSource(CoberturaMultiSourceReader.scala:130)
[error]         at org.scoverage.coveralls.CoverallsPlugin$.$anonfun$coverallsTask$6(CoverallsPlugin.scala:128)
[error]         at scala.collection.parallel.AugmentedIterableIterator.map2combiner(RemainsIterator.scala:116)
[error]         at scala.collection.parallel.AugmentedIterableIterator.map2combiner$(RemainsIterator.scala:113)
[error]         at scala.collection.parallel.immutable.ParHashSet$ParHashSetIterator.map2combiner(ParHashSet.scala:81)
[error]         at scala.collection.parallel.ParIterableLike$Map.leaf(ParIterableLike.scala:1064)
[error]         at scala.collection.parallel.Task.$anonfun$tryLeaf$1(Tasks.scala:53)
[error]         at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[error]         at scala.util.control.Breaks$$anon$1.catchBreak(Breaks.scala:67)
[error]         at scala.collection.parallel.Task.tryLeaf(Tasks.scala:56)
[error]         at scala.collection.parallel.Task.tryLeaf$(Tasks.scala:50)
[error]         at scala.collection.parallel.ParIterableLike$Map.tryLeaf(ParIterableLike.scala:1061)
[error]         at scala.collection.parallel.AdaptiveWorkStealingTasks$WrappedTask.internal(Tasks.scala:170)
[error]         at scala.collection.parallel.AdaptiveWorkStealingTasks$WrappedTask.internal$(Tasks.scala:157)
[error]         at scala.collection.parallel.AdaptiveWorkStealingForkJoinTasks$WrappedTask.internal(Tasks.scala:440)
[error]         at scala.collection.parallel.AdaptiveWorkStealingTasks$WrappedTask.compute(Tasks.scala:150)
[error]         at scala.collection.parallel.AdaptiveWorkStealingTasks$WrappedTask.compute$(Tasks.scala:149)
[error]         at scala.collection.parallel.AdaptiveWorkStealingForkJoinTasks$WrappedTask.compute(Tasks.scala:440)
[error]         at java.base/java.util.concurrent.RecursiveAction.exec(RecursiveAction.java:189)
[error]         at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
[error]         at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
[error]         at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
[error]         at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
[error]         at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
[error] (coveralls) java.util.NoSuchElementException: key not found: 2/io/kaizensolutions/virgil/codecs/UdtValueEncoderMagnoliaDerivation.scala

Implement automatic derivation for Scala 3

We currently implement semi-automatic derivation of codecs in Scala 3 and fully automatic derivation of codecs in Scala 2. The intent is to bring Scala 3 to parity with Scala 2 with regard to fully automatic derivation.
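As a sketch of what the Scala 3 side could use, here is a minimal fully automatic derivation built on `scala.deriving.Mirror`. `Describe` is a hypothetical stand-in for the real codec typeclasses; none of this is virgil's actual code:

```scala
import scala.compiletime.{erasedValue, summonInline}
import scala.deriving.Mirror

trait Describe[A]:
  def describe: String

object Describe:
  given Describe[Int] with
    def describe = "Int"
  given Describe[String] with
    def describe = "String"

  // Recursively summon an instance for each field type of the product.
  private inline def describeAll[T <: Tuple]: List[String] =
    inline erasedValue[T] match
      case _: EmptyTuple => Nil
      case _: (h *: t)   => summonInline[Describe[h]].describe :: describeAll[t]

  // Automatic: any case class with a Mirror gets an instance on demand,
  // with no explicit `derives` clause or semi-automatic `derived` call.
  inline given derived[A](using m: Mirror.ProductOf[A]): Describe[A] =
    val fields = describeAll[m.MirroredElemTypes]
    new Describe[A]:
      def describe: String = fields.mkString("(", ", ", ")")

final case class Person(name: String, age: Int)
```

Because `derived` is an `inline given` in the companion, `summon[Describe[Person]]` resolves without any user-side ceremony, which is what parity with the Scala 2 behaviour would look like.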

Add stripMargin and triple-quotes string support

Hi,
Thanks for a great library!
If possible, I'd like to request a feature with triple-quotes cql-interpolated strings support & stripMargin method for cql-interpolated strings.

Why?
Because it would be really helpful when creating tables via virgil; right now this requires a lot of string concatenation.
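A sketch of the requested API using a simplified interpolator. `SimpleCql` and this `cql` implementation are hypothetical: virgil's real interpolator produces a CQL value with bound markers, not a plain `String`:

```scala
object CqlInterpolatorSketch {
  // Hypothetical result type standing in for virgil's CQL value.
  final case class SimpleCql(query: String) {
    // Mirror String#stripMargin so multi-line interpolations stay readable.
    def stripMargin: SimpleCql = SimpleCql(query.stripMargin)
  }

  implicit class CqlHelper(private val sc: StringContext) extends AnyVal {
    // Triple-quoted strings work automatically because the interpolator
    // receives the raw parts from StringContext.
    def cql(args: Any*): SimpleCql = SimpleCql(sc.s(args: _*))
  }

  val createTable: SimpleCql =
    cql"""CREATE TABLE info (
         |  favorite BOOLEAN,
         |  lots_of_comments frozen<list<TEXT>>
         |)""".stripMargin
}
```

With a `stripMargin` on the interpolated result, a `CREATE TABLE` statement can be written as one readable multi-line literal instead of concatenated strings.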

Document the implicit resolution chain for codecs

For example:
Given the following Cassandra Row

final case class CassandraRow(a: Int, b: Long, c: OuterData)
final case class OuterData(x: String, y: String, z: InnerData)
final case class InnerData(i: Int)

We can see that CassandraRow contains nested data (OuterData) that will be stored as a UDT value, and that OuterData in turn contains InnerData, also stored as a UDT value: so we have a UDT value in a Row and a UDT value within a UDT value.

How do the implicits resolve to build a CqlDecoder?

  • For a: Int -> we have a CqlPrimitiveDecoder[Int]; this gets summoned during derivation (before going through the implicit conversion from CqlPrimitiveDecoder -> CqlDecoder)

  • For b: Long -> we have a CqlPrimitiveDecoder[Long]; this gets summoned during derivation (before going through the implicit conversion from CqlPrimitiveDecoder -> CqlDecoder)

  • For c: OuterData -> this is more interesting: it ends up using a CqlPrimitiveDecoder[OuterData], but how does that get materialized?
    We get it through CqlUdtValueDecoder[OuterData].

If we step through the derivation of CqlUdtValueDecoder[OuterData], we'll find that it uses CqlPrimitiveDecoder to summon instances for String (x and y). You'll then notice that InnerData is also a UDT value, so CqlPrimitiveDecoder[InnerData] is used, which calls out to CqlUdtValueDecoder[InnerData], which in turn calls back to CqlPrimitiveDecoder[Int] (the i: Int in InnerData).

You can see that CqlPrimitiveDecoder is the bridge that links all of these mechanisms together.
Perhaps this linking mechanism is unnecessary, but this is how it currently works.
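A toy model of the chain described above (these are not virgil's real types): the "primitive" decoder is the bridge, and UDT decoders plug into it so nesting resolves through the same path at every level:

```scala
object ImplicitChainSketch {
  trait PrimitiveDecoder[A] { def decode(raw: Any): A }
  trait UdtDecoder[A]       { def decode(fields: Map[String, Any]): A }

  implicit val intPrim: PrimitiveDecoder[Int]       = _.asInstanceOf[Int]
  implicit val stringPrim: PrimitiveDecoder[String] = _.asInstanceOf[String]

  // The bridge: any UDT decoder can be used wherever a primitive decoder is
  // expected, which is how a UDT within a UDT resolves.
  implicit def udtAsPrimitive[A](implicit udt: UdtDecoder[A]): PrimitiveDecoder[A] =
    raw => udt.decode(raw.asInstanceOf[Map[String, Any]])

  final case class InnerData(i: Int)
  implicit val innerUdt: UdtDecoder[InnerData] =
    fields => InnerData(implicitly[PrimitiveDecoder[Int]].decode(fields("i")))

  final case class OuterData(x: String, z: InnerData)
  implicit val outerUdt: UdtDecoder[OuterData] =
    fields =>
      OuterData(
        implicitly[PrimitiveDecoder[String]].decode(fields("x")),
        // InnerData resolves as a PrimitiveDecoder via the bridge above.
        implicitly[PrimitiveDecoder[InnerData]].decode(fields("z"))
      )
}
```

Decoding OuterData walks exactly the chain in the prose: primitive for x, bridge to the UDT decoder for z, then primitive again for i.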

[image: implicit call graph]

Implement CQL.unbatched

Let's say you have a set of mutations in a single batch but you decide that you no longer want them in a batch:

val batchMutation = update1 + insert1 + update2 
val individualMutations: NonEmptyChunk[CQL[MutationResult]] = CQL.unbatched { batchMutation }

Currently, we track each individual mutation in a NonEmptyChunk. However, if we wanted to provide the user with each of the individual mutations in an API like so:

val batchMutation = update1 + insert1 + update2 
val u1 :*: i1 :*: u2 :*: Finish = CQL.unbatched { batchMutation }

Then we would need to use an HList and keep track of each insertion.
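A minimal sketch of the HList-style encoding this would need, with hypothetical `:*:` and `Finish` types (`Mutation` stands in for `CQL[MutationResult]`; virgil currently tracks the mutations in a NonEmptyChunk):

```scala
object UnbatchedSketch {
  sealed trait MutationList
  case object Finish extends MutationList
  final case class :*:[+H, +T <: MutationList](head: H, tail: T) extends MutationList

  implicit class ConsOps[T <: MutationList](private val t: T) extends AnyVal {
    // Right-associative cons, so `a :*: b :*: Finish` builds head-first.
    def :*:[H](h: H): H :*: T = new :*:(h, t)
  }

  // Hypothetical stand-in for the real mutation type.
  final case class Mutation(name: String)

  val update1 = Mutation("update1")
  val insert1 = Mutation("insert1")
  val update2 = Mutation("update2")

  // Each element keeps its position (and, in the real design, its type),
  // so the batch can be destructured back into individual mutations.
  val unbatched = update1 :*: insert1 :*: update2 :*: Finish
  val u1 :*: i1 :*: u2 :*: Finish = unbatched
}
```

Unlike a NonEmptyChunk, the cons-list type records how many mutations were added and in what order, which is what makes the `u1 :*: i1 :*: u2 :*: Finish` destructuring safe.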

Add examples

  • Low level API
    • cql
  • High level API
    • Insert
    • Update
    • Delete
    • Truncate
  • Paging

Issues with primitive decoders

Hi!

I'm facing the following error when trying to create a query with a primitive type as the type parameter, e.g.:

selectUserState.query[String]

and receiving

could not find implicit value for parameter evidence: io.kaizensolutions.virgil.codecs.CqlRowDecoder.Object[String]

even with a direct import of the concrete decoder:

import io.kaizensolutions.virgil.codecs.CqlPrimitiveDecoder.stringPrimitiveDecoder

version: 0.7.0 for zio-1

Implement a Tagless Final interpreter

I've received a lot of internal demand for a Tagless Final interpreter leveraging the Cats Effect typeclasses and FS2 to perform the streaming operations. This is definitely possible (we were able to create a proof of concept earlier), and we'll do it as a separate sbt sub-project.

Allow the column names to be different from the case class fields

For example, given a table with the following schema

CREATE TABLE info (
  favorite BOOLEAN,
  lots_of_comments frozen<list<TEXT>>
);

We should be able to do something like this in Scala

final case class Info(
  @CqlColumn("favorite") fav: Boolean,
  @CqlColumn("lots_of_comments") lotsOfComments: List[String]
)

Add tests

  • cql tests
  • lower level Interpreter tests
  • DSL tests
  • UDT tests
  • Nested Collection tests

Note: we most likely need to pull in testcontainers so we can check against Cassandra itself. (Done)

Restrict Relational Operators to the appropriate DML

I still need to do some more research on this, but it seems that certain relations in WHERE clauses are allowed in SELECT statements but not in UPDATE statements. Initially, I was thinking of modeling it like the following:

final case class Relation[A: Writer, Operator <: RelationOperator](
  columnName: ColumnName,
  operator: Operator,
  value: A
)

sealed trait RelationOperator
object RelationOperator {
  sealed trait DeleteRelationOperator extends RelationOperator
  sealed trait UpdateRelationOperator extends RelationOperator

  case object Equal              extends RelationOperator with UpdateRelationOperator with DeleteRelationOperator
  case object NotEqual           extends RelationOperator with UpdateRelationOperator
  case object GreaterThan        extends RelationOperator with UpdateRelationOperator
  case object GreaterThanOrEqual extends RelationOperator with UpdateRelationOperator
  case object LessThan           extends RelationOperator with UpdateRelationOperator
  case object LessThanOrEqual    extends RelationOperator with UpdateRelationOperator
  case object Like               extends RelationOperator
  case object In                 extends RelationOperator with DeleteRelationOperator
}

Then in the Update query builder, we can restrict inputs to use Relation[ScalaType, UpdateRelationOperator]
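A compile-checked sketch of that restriction (simplified: String column names, no Writer context bound), showing how the phantom `Operator` type parameter lets the Update builder reject SELECT-only relations at compile time:

```scala
object RelationRestrictionSketch {
  sealed trait RelationOperator
  sealed trait DeleteRelationOperator extends RelationOperator
  sealed trait UpdateRelationOperator extends RelationOperator

  case object Equal extends UpdateRelationOperator with DeleteRelationOperator
  case object Like  extends RelationOperator // SELECT-only in this sketch

  // Covariant in Operator so Relation[Int, Equal.type] is usable wherever
  // Relation[Int, UpdateRelationOperator] is expected.
  final case class Relation[A, +Operator <: RelationOperator](
    columnName: String,
    operator: Operator,
    value: A
  )

  // The Update builder accepts only update-capable relations:
  // updateWhere(Relation("name", Like, "a%")) would not compile.
  def updateWhere[A](rel: Relation[A, UpdateRelationOperator]): String =
    s"UPDATE ... WHERE ${rel.columnName} ${rel.operator} ?"
}
```

The key design choice is that the restriction lives entirely in the types: no runtime validation is needed, because an ill-suited operator simply fails to typecheck.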

Add Support for java.time.LocalDateTime & timestamp

Timestamp columns using java.time.LocalDateTime are not supported; the call .value(Timestamp, in.timestamp) produces the error below:

could not find implicit value for parameter ev: io.kaizensolutions.virgil.codecs.CqlRowComponentEncoder[java.time.LocalDateTime]

To reproduce, first create the CQL table with a timestamp column:

CREATE TABLE timestampspec
(
    id        INT PRIMARY KEY,
    timestamp timestamp
);

The spec data types:

object TimestampSpecDatatypes {
  final case class ObjectWithTimestamp(
    id: Int,
    timestamp: LocalDateTime
  )

  object ObjectWithTimestamp {
    val Id   = "id"
    val Timestamp  = "timestamp"
    val table: String = "timestampspec"
    val truncate: CQL[MutationResult] = s"TRUNCATE TABLE $table".asCql.mutation

    def insert(in: ObjectWithTimestamp): CQL[MutationResult] =
      InsertBuilder(table)
        .value(Id, in.id)
        .value(Timestamp, in.timestamp)
        .build

    def find(id: Int): CQL[ObjectWithTimestamp] =
      SelectBuilder
        .from(table)
        .columns(Id, Timestamp)
        .where(Id === id)
        .build[ObjectWithTimestamp]
  }
}

And the test case

object TimestampSpec {
  def localDateTimeSpec =
    suite("Timestamp Operators Specification") {
      test("isNull") {
        check(insertTimeStampSpecGen) { timestampObject =>
          val update =
            UpdateBuilder(table)
              .set(Timestamp := timestampObject.timestamp)
              .where(Id === timestampObject.id)
              .ifCondition(Timestamp.isNull)
              .build
              .execute
              .runDrain

          val find = ObjectWithTimestamp.find(timestampObject.id).execute.runHead.some

          truncate.execute.runDrain *>
            update *>
            find.map(actual => assertTrue(actual == timestampObject))
        }
      }
    } @@ sequential @@ samples(4)

  def insertTimeStampSpecGen: Gen[Random, ObjectWithTimestamp] =
    for {
      id        <- Gen.int(1, 1000)
      timestamp <- Gen.localDateTime.map(_ => LocalDateTime.now())
    } yield ObjectWithTimestamp(id, timestamp)

}

Allow paging via selectPage

Rudimentary design:

  def select[Output](
    input: Query[Output],
    pagingState: Option[PagingState] = None,
    config: ExecutionAttributes = ExecutionAttributes.default,
  ): Task[(Chunk[Output], PagingState)] = ???

Rework and unify Reader/UdtReader and Writer/UdtWriter abstraction

I was under the assumption that the Cassandra Row abstraction did not share any similarities with the UdtValue abstraction, despite the two sharing the same API. That was a big miss on my part: both extend GettableByName, and this can be used to significantly reduce the complexity of shifting back and forth between TypeMapper, UdtReader/Writer, and Reader/Writer. The current abstraction does not feel right at all.
