depot / kysely-planetscale


A Kysely dialect for PlanetScale Serverless

Home Page: https://depot.dev/blog/kysely-dialect-planetscale

License: MIT License

Language: TypeScript (100.0%)
Topics: kysely, mysql, planetscale, vitess, kysely-dialect

kysely-planetscale's Introduction

Banner Image

kysely-planetscale

CI npm Powered by TypeScript

A Kysely dialect for PlanetScale, using the PlanetScale serverless driver for JavaScript.

Installation

You should install both kysely and @planetscale/database with kysely-planetscale, as they are both required peer dependencies. You can install them with your favorite package manager:

# with pnpm
pnpm add kysely-planetscale kysely @planetscale/database

# with yarn
yarn add kysely-planetscale kysely @planetscale/database

# with npm
npm install kysely-planetscale kysely @planetscale/database

Usage

You can pass a new instance of PlanetScaleDialect as the dialect option when creating a new Kysely instance:

import {Kysely} from 'kysely'
import {PlanetScaleDialect} from 'kysely-planetscale'

const db = new Kysely<Database>({
  dialect: new PlanetScaleDialect({
    host: '<host>',
    username: '<user>',
    password: '<password>',
  }),
})

PlanetScaleDialect accepts the same options as connect({...}) from @planetscale/database. For instance, if you are running on Node.js and need to provide a fetch implementation:

import {Kysely} from 'kysely'
import {PlanetScaleDialect} from 'kysely-planetscale'
import {fetch} from 'undici'

// Connect using a DATABASE_URL, provide a fetch implementation
const db = new Kysely<Database>({
  dialect: new PlanetScaleDialect({
    url: process.env.DATABASE_URL,
    fetch,
  }),
})

Type Conversion

PlanetScaleDialect provides built-in support for converting JavaScript Dates to and from DATETIME and TIMESTAMP columns, as Kysely's generated types expect. However, you can override or extend this behavior by providing custom format or cast functions.

Custom format function

PlanetScaleDialect passes all parameters to @planetscale/database unmodified, except for JavaScript Dates, which are converted to MySQL strings. If you set a format function, you can override this behavior:

import {Kysely} from 'kysely'
import {PlanetScaleDialect} from 'kysely-planetscale'
import SqlString from 'sqlstring'

const db = new Kysely<Database>({
  dialect: new PlanetScaleDialect({
    url: process.env.DATABASE_URL,
    format: SqlString.format,
  }),
})

Custom cast function

PlanetScaleDialect automatically type-casts DATETIME and TIMESTAMP to JavaScript Dates. If you'd prefer to customize this behavior, you can pass a custom cast function:

import {cast} from '@planetscale/database'
import {Kysely} from 'kysely'
import {PlanetScaleDialect} from 'kysely-planetscale'

const db = new Kysely<Database>({
  dialect: new PlanetScaleDialect({
    url: process.env.DATABASE_URL,
    cast: inflate,
  }),
})

function inflate(field, value) {
  if (field.type === 'INT64' || field.type === 'UINT64') {
    return BigInt(value)
  }
  return cast(field, value)
}

Experimental: useSharedConnection

As of version 1.3.0, PlanetScaleDialect supports using a shared @planetscale/database connection for all non-transaction queries to improve query performance. This is disabled by default; enable it by setting the useSharedConnection option to true. Transaction queries always run on their own connection.

This is an experimental feature, and may be removed in a future version.

import {Kysely} from 'kysely'
import {PlanetScaleDialect} from 'kysely-planetscale'

const db = new Kysely<Database>({
  dialect: new PlanetScaleDialect({
    url: process.env.DATABASE_URL,
    useSharedConnection: true,
  }),
})

License

MIT License, see LICENSE.

kysely-planetscale's People

Contributors

canrau, charlie-hadden, dependabot[bot], esthevann, igalklebanov, jacobwgillespie, jahirfiquitiva, naterminch


kysely-planetscale's Issues

Throwing an error during a transaction doesn't allow the lock to roll back

I am using the latest version.

Here is my code:

export const createOauthUser = async (
  providerUser: OauthUser,
  databaseConfig: Config['database']
) => {
  const db = getDBClient(databaseConfig)
  try {
    await db.transaction().execute(async (trx) => {
      const userId = await trx
        .insertInto('user')
        .values({
          name: providerUser.name,
          email: providerUser.email,
          is_email_verified: true,
          password: null,
          role: 'user'
        })
        .executeTakeFirstOrThrow()
      console.log(userId)
      await trx
        .insertInto('authorisations')
        .values({
          user_id: Number(userId.insertId),
          provider_type: providerUser.providerType,
          provider_user_id: providerUser.id.toString()
        })
        .executeTakeFirstOrThrow()
      console.log('ended')
      return userId
    })
  } catch (error) {
    try {
      await db.deleteFrom('user').where('user.email', '=', providerUser.email).execute()
    } catch (err) {
      console.log(err)
    }
    console.log('here')
    throw new ApiError(
      httpStatus.FORBIDDEN,
      `Cannot signup with ${providerUser.providerType}, user already exists with that email`
    )
  }
  const user = await getUserByProviderIdType(
    providerUser.id.toString(), providerUser.providerType, databaseConfig
  )
  return User.convert(user)
}

I added the delete statement in the catch block purely to work around an error I am having. Throwing an error during the transaction should roll back the transaction and release its locks, but instead I intermittently get this error (the other times it completes in under 20 seconds):

DatabaseError: target: dictionary-api.-.primary: vttablet: rpc error: code = DeadlineExceeded desc = Lock wait timeout exceeded; try restarting transaction (errno 1205) (sqlstate HY000) (CallerID: ihkeoohzfw2oodprmc73): Sql: "delete from `user` where `user`.email = :user_email", BindVars: {REDACTED}
        at Connection.execute (/backend/node_modules/@planetscale/database/dist/index.js:78:19)
        at processTicksAndRejections (node:internal/process/task_queues:95:5)
        at _PlanetScaleConnection.executeQuery (/backend/node_modules/kysely-planetscale/dist/index.js:92:21)
        at /backend/node_modules/kysely/dist/esm/query-executor/query-executor-base.js:35:28
        at DefaultConnectionProvider.provideConnection (/backend/node_modules/kysely/dist/esm/driver/default-connection-provider.js:10:20)
        at DefaultQueryExecutor.executeQuery (/backend/node_modules/kysely/dist/esm/query-executor/query-executor-base.js:34:16)
        at DeleteQueryBuilder.execute (/backend/node_modules/kysely/dist/esm/query-builder/delete-query-builder.js:392:24)
        at Module.createOauthUser (/backend/src/services/user.service.ts:65:7)
        at Module.loginOrCreateUserWithOauth (/backend/src/services/auth.service.ts:93:19)
        at oauthCallback (/backend/src/controllers/auth/oauth/oauth.controller.ts:27:16) {
      status: 400,
      body: {
        message: 'target: dictionary-api.-.primary: vttablet: rpc error: code = DeadlineExceeded desc = Lock wait timeout exceeded; try restarting transaction (errno 1205) (sqlstate HY000) (CallerID: ihkeoohzfw2oodprmc73): Sql: "delete from `user` where `user`.email = :user_email", BindVars: {REDACTED}',
        code: 'UNKNOWN'
      }
    }

This is the sole test I am running and the sole code executing.

Running into a race condition where a newly created object cannot be immediately fetched outside of a database transaction

Using this code:

    const transactionId = await db.transaction().execute(async (trx) => {
      const insertId = await insertTransaction(data, trx)
      await doSomething(trx)
      await doSomething2(trx)
      await doSomething3(trx)
      return insertId
    })
    const transaction = await getTransactionById(transactionId, db)
    if (!transaction) {
      throw new ApiError(httpStatus.BAD_REQUEST, 'Created transaction not found')
    }
    return transaction

I have an 80% success rate returning the transaction. The other 20% of the time, the transaction is created, but the follow-up fetch errors because it cannot find it.

Obviously, I could fix this by putting the fetch inside the database transaction, but then my test becomes flaky instead.

Surely, if I await the database transaction, I should be able to fetch the row afterwards 100% of the time?

Error isn't handled

const results = await this.#conn.execute(compiledQuery.sql, parameters)
return {
  rows: results.rows as O[],
  numUpdatedOrDeletedRows:
    results.rowsAffected == null ? undefined : BigInt(results.rowsAffected),
}

The error returned by execute is not handled inside the executeQuery method. execute resolves to an object with the following shape:

export interface ExecutedQuery {
    headers: string[];
    types: Types;
    rows: Row[];
    size: number;
    statement: string;
    insertId: string | null;
    rowsAffected: number | null;
    error: VitessError | null;
    time: number;
}

A simple fix is to add:

if (results.error) {
    throw results.error;
}
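Combined with the snippet above, the patched section of executeQuery might read roughly like this (a sketch of where the check would go, not the exact source):

const results = await this.#conn.execute(compiledQuery.sql, parameters)
// Surface Vitess errors instead of silently returning an empty result set.
if (results.error) {
  throw results.error
}
return {
  rows: results.rows as O[],
  numUpdatedOrDeletedRows:
    results.rowsAffected == null ? undefined : BigInt(results.rowsAffected),
}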

Boolean values with kysely mysql default for boolean being tinyint

Has anyone run into trouble reading numbers instead of boolean values? This is my current cast override:

export function inflateDates(field: Field, value: string | null) {
  if (field.type === 'DATETIME' && value) return parseJSON(value)
  if (field.type === 'TIMESTAMP' && value) return parseJSON(value)
  return cast(field, value)
}

I added the following override to get boolean values out, but it only works because I know there is no other use of TINYINT within the schema (a combined version is sketched at the end of this issue):

if (field.type === "INT8" && value !== null && value !== undefined)
          return parseInt(value) === 1 ? true : false;

The Kysely default for boolean in MySQL is tinyint:

https://github.com/koskimas/kysely/blob/b2b3d2e3afcb6722c4b67c9c419dbf354c60ced6/test/node/src/schema.test.ts#L1854-L1862
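For reference, a combined cast override along those lines might look like the following. This is only a sketch: it assumes INT8 is used exclusively for boolean columns, and parseJSON stands in for whatever date parser the snippet above uses (date-fns exports one with this name).

import {cast, type Field} from '@planetscale/database'
import {parseJSON} from 'date-fns'

// Sketch: map TINYINT (INT8) to boolean and DATETIME/TIMESTAMP to Date,
// falling back to the driver's default cast for everything else.
// Assumes no non-boolean TINYINT columns exist in the schema.
export function inflate(field: Field, value: string | null) {
  if (field.type === 'INT8' && value !== null) return value === '1'
  if ((field.type === 'DATETIME' || field.type === 'TIMESTAMP') && value) return parseJSON(value)
  return cast(field, value)
}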

Sending a migration to planetscale?

Stupid question: how do I push a migration to PlanetScale? I have followed the Kysely docs for migrateToLatest(), but I am unable to get it working with the PlanetScale dialect.
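For anyone else hitting this: the usual pattern is to run Kysely's Migrator against a Kysely instance built with PlanetScaleDialect; nothing PlanetScale-specific is needed beyond the dialect. A minimal sketch, assuming your migration files live in ./migrations and Database is your schema interface:

import {promises as fs} from 'node:fs'
import path from 'node:path'
import {FileMigrationProvider, Kysely, Migrator} from 'kysely'
import {PlanetScaleDialect} from 'kysely-planetscale'

const db = new Kysely<Database>({
  dialect: new PlanetScaleDialect({url: process.env.DATABASE_URL}),
})

const migrator = new Migrator({
  db,
  provider: new FileMigrationProvider({
    fs,
    path,
    // Assumed location of your migration files.
    migrationFolder: path.join(process.cwd(), 'migrations'),
  }),
})

// Run inside an ESM module (top-level await) or wrap in an async function.
const {error, results} = await migrator.migrateToLatest()
if (error) {
  console.error('Migration failed', error)
  process.exit(1)
}
console.log(results)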

Numbers returned as strings

In the following test suite, when using PlanetScaleDialect and receiving more than one row, values that should be numbers are returned as strings instead:

import { Kysely, MysqlDialect } from "kysely";
import { PlanetScaleDialect } from "kysely-planetscale";
import { createPool } from "mysql2";
import { strict as assert } from "node:assert";
import { describe, test } from "node:test";
import { fetch } from "undici";

describe("mysql", function () {
  const db = new Kysely({
    dialect: new MysqlDialect({
      pool: createPool({
        database: "mysql",
        host: "localhost",
        user: "root",
        password: "",
      }),
    }),
  });

  test("one row", async function () {
    const q = await db
      .selectFrom([
        db.selectNoFrom((eb) => eb.val(0).as("num_column")).as("foo"),
      ])
      .selectAll()
      .execute();

    assert.deepStrictEqual(q, [{ num_column: 0 }]); // ✅
  });

  test("two rows", async function () {
    const q = await db
      .selectFrom([
        db
          .selectNoFrom((eb) => eb.val(0).as("num_column"))
          .unionAll(db.selectNoFrom((eb) => eb.val(0).as("num_column")))
          .as("foo"),
      ])
      .selectAll()
      .execute();

    assert.deepStrictEqual(q, [{ num_column: 0 }, { num_column: 0 }]); // ✅
  });
});

describe("planetscale", function () {
  const db = new Kysely({
    dialect: new PlanetScaleDialect({
      url: process.env.PLANETSCALE_URL,
      fetch,
    }),
  });

  test("one row", async function () {
    const q = await db
      .selectFrom([
        db.selectNoFrom((eb) => eb.val(0).as("num_column")).as("foo"),
      ])
      .selectAll()
      .execute();

    assert.deepStrictEqual(q, [{ num_column: 0 }]); // ✅
  });

  test("two rows", async function () {
    const q = await db
      .selectFrom([
        db
          .selectNoFrom((eb) => eb.val(0).as("num_column"))
          .unionAll(db.selectNoFrom((eb) => eb.val(0).as("num_column")))
          .as("foo"),
      ])
      .selectAll()
      .execute();

    assert.deepStrictEqual(q, [{ num_column: 0 }, { num_column: 0 }]); // ❌ actual: [ { num_column: '0' }, { num_column: '0' } ]
  });
});

Gist for reproducing: https://gist.github.com/twiddler/8f98b48ec61c75eaa5422249c1edebc9. Please set PLANETSCALE_URL in .env and use Node 21 or later.
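I have not verified which field.type Vitess reports for the unioned column, but if it turns out to be one of the integer types, a custom cast can coerce those back to JS numbers as a stopgap (a workaround sketch only, not a fix for the underlying driver behaviour):

import {cast, type Field} from '@planetscale/database'

// Workaround sketch: force 8- to 32-bit integer Vitess types back to numbers.
// Adjust the set to whatever types your query actually reports.
const INT_TYPES = new Set(['INT8', 'INT16', 'INT24', 'INT32', 'UINT8', 'UINT16', 'UINT24', 'UINT32'])

function castInts(field: Field, value: string | null) {
  if (value !== null && INT_TYPES.has(field.type)) return Number(value)
  return cast(field, value)
}

// new PlanetScaleDialect({url: process.env.PLANETSCALE_URL, fetch, cast: castInts})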

"Boolean" from prisma becomes "number" on types.ts

I have this schema with Prisma

model AuthUser {
  id           String        @id @unique
  auth_session AuthSession[]
  auth_key     AuthKey[]

  email          String
  name           String
  email_verified Boolean @default(false)
  test           Boolean

  @@map("auth_user")
}

When I auto-generate types, email_verified becomes a number:

export type AuthUser = {
  id: string;
  email: string;
  name: string;
  email_verified: Generated<number>;
  test: number;
};
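This is expected from the generator's point of view, since MySQL stores Prisma's Boolean as TINYINT(1), which the type generator sees as a numeric column. One workaround (a sketch, not a built-in generator feature) is to hand-adjust the generated type and convert at the driver boundary with a cast override like the one in the TINYINT issue above:

import {Generated} from 'kysely'

// Hand-edited version of the generated type; values still arrive from MySQL
// as 0/1 unless a custom cast function converts them to booleans.
export type AuthUser = {
  id: string
  email: string
  name: string
  email_verified: Generated<boolean>
  test: boolean
}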

CommonJS version

The fact that kysely-planetscale is published as ESM makes it hard for CommonJS projects to use it.
Could you publish a CommonJS version?

need to use planetscale driver if using mysql docker image for development

I have a PlanetScale database instance that I use in production with Prisma. Now I'm trying to migrate to Kysely, and I'm unsure whether I need a separate connection through the regular MySQL driver for development, since I use a MySQL Docker image locally.

I tried using the URL of my Docker image to connect with the PlanetScale driver, but it gave me a connection error. So I guess I'm doing something wrong, or this is simply not possible.
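The PlanetScale serverless driver talks to PlanetScale over HTTPS, so it cannot connect to a plain MySQL container. A common approach (a sketch, assuming DATABASE_URL is set in production and hypothetical local credentials for the Docker image) is to pick the dialect per environment:

import {Kysely, MysqlDialect} from 'kysely'
import {PlanetScaleDialect} from 'kysely-planetscale'
import {createPool} from 'mysql2'

// Sketch: serverless driver against PlanetScale in production,
// regular mysql2 pool against the local Docker container in development.
const dialect =
  process.env.NODE_ENV === 'production'
    ? new PlanetScaleDialect({url: process.env.DATABASE_URL})
    : new MysqlDialect({
        pool: createPool({
          host: 'localhost',
          user: 'root',
          password: '',
          database: 'mydb', // hypothetical local database name
        }),
      })

export const db = new Kysely<Database>({dialect})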

kysely-planetscale causes fetch to fail due to SSL version

Using this template: https://github.com/vercel/nextjs-planetscale-nextauth-tailwindcss-template

Fetch fails (this example is from a Canary version of Next.js but I tried numerous versions):

  ▲ Next.js 13.5.5-canary.2
  - Local:        http://localhost:3000
  - Environments: .env.local

 ✓ Ready in 3.7s
 ○ Compiling /page ...
 ✓ Compiled /page in 8.8s (1196 modules)
 ⨯ Internal error: TypeError: fetch failed
    at Object.fetch (node:internal/deps/undici/undici:11457:11)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
 ⨯ Internal error: TypeError: fetch failed
    at Object.fetch (node:internal/deps/undici/undici:11457:11)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
digest: "2026204822"

It seems the underlying issue is to do with how Kysely handles SSL...

  ] {
    library: 'SSL routines',
    reason: 'wrong version number',
    code: 'ERR_SSL_WRONG_VERSION_NUMBER'
  }

If I use mysql2 instead, it works; however, I wanted to use your lib the way the template intends. Any idea what's going on or how to resolve it? It happens both locally and on Vercel.

This is how the library is being used:


import 'server-only';
import { Generated, Kysely } from 'kysely';
import { PlanetScaleDialect } from 'kysely-planetscale';

interface User {
  id: Generated<number>;
  name: string;
  username: string;
  email: string;
}

interface Database {
  users: User;
  // https://github.com/nextauthjs/next-auth/issues/4922
}

export const queryBuilder = new Kysely<Database>({
  dialect: new PlanetScaleDialect({
    url: process.env.DATABASE_URL
  })
});


Incorrect type for numDeletedRows being returned from deleteFrom query

👋

I recently opened an issue in the kysely repo and was redirected here.

const result = await db
    .deleteFrom("Log")
    .where("expiresAt", "<=", date)
    .executeTakeFirst();

const count = result.numDeletedRows ? Number(result.numDeletedRows) : 0;

Running the example above with the PlanetScale dialect, result is always an object of type DeleteResult, and the type of result.numDeletedRows is always bigint according to TypeScript.

result is never undefined, but the issue I am experiencing is that if 0 rows are deleted, result.numDeletedRows is undefined at runtime even though TypeScript says it is always bigint; if 1 or more rows are deleted, the correct type is returned.

Looking further into the Kysely implementation, it seems as though the type for result.numDeletedRows is supposed to be bigint | undefined.

I just wanted to ask and see if this is a bug that could be fixed or if I am simply overlooking something.

{
  "kysely": "^0.23.4",
  "kysely-planetscale": "^1.3.0",
  "kysely-codegen": "^0.9.0",
  "@planetscale/database": "^1.5.0"
}

export const db = new Kysely<DB>({
  dialect: new PlanetScaleDialect({
    host: "aws.connect.psdb.cloud",
    username: env.DATABASE_USERNAME,
    password: env.DATABASE_PASSWORD,
  }),
});

Thanks!

request failed, reason: connect ETIMEDOUT

Hello ... I'm trying to use the library with a PlanetScale database ... but any request to it just fails with an ETIMEDOUT error.

This is my configuration

import { Kysely } from 'kysely';
import { PlanetScaleDialect } from 'kysely-planetscale';

export type ReactionName = 'likes' | 'loves' | 'awards' | 'bookmarks';

export type CountersReactions = { [Key in ReactionName]?: number };

interface CountersTable extends CountersReactions {
  slug: string;
  views?: number;
}

interface Database {
  counters: CountersTable;
}

export const queryBuilder = new Kysely<Database>({
  dialect: new PlanetScaleDialect({
    url: process.env.DATABASE_URL,
  }),
});

and I'm trying to fetch the data like this:

const slug = 'uses';
const data = await queryBuilder
      .selectFrom('counters')
      .where('slug', '=', slug)
      .select(['slug', 'likes', 'loves', 'awards', 'bookmarks'])
      .execute();

but the query never succeeds and it just shows the error:
request to https://p9hawx83bcfm.us-west-2.psdb.cloud/psdb.v1alpha1.Database/Execute failed, reason: connect ETIMEDOUT 54.189.191.130:443

This is the database schema (screenshot not reproduced here).

Any way to force certain JS types for different MySQL types?

I would like, for example, DECIMAL columns to be returned from PlanetScale as strings to maintain precision. I'm not too sure how to do this. I saw in the Kysely docs that you can do a conversion via cast, but at that point it is too late, no?

Any way to maintain precision? Or do I have to perform a cast when retrieving it?
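One option is the driver-level cast hook described above, which runs while the response is decoded, before values reach your application code. A sketch, assuming the Vitess field type for these columns is reported as DECIMAL:

import {cast, type Field} from '@planetscale/database'

// Sketch: keep DECIMAL values as the raw strings PlanetScale sends,
// deferring to the driver's default cast for every other type.
function castDecimalsAsStrings(field: Field, value: string | null) {
  if (field.type === 'DECIMAL') return value
  return cast(field, value)
}

// new PlanetScaleDialect({url: process.env.DATABASE_URL, cast: castDecimalsAsStrings})

On the Kysely side, you would then type those columns as string so the generated types agree.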

Publish ESM version

Hello there. The version of kysely-planetscale currently on npm is a CJS-only version, and as a result cannot be used in my ESM-based project. I think you should publish versions for both ESM and CJS.

Serverless driver performance

I don't think this is an issue with this repo but wanted to ask this question here as you're likely to have experience. I did a small performance comparison with the standard MySQL connection and the new driver in a lambda function.

I found that when using the serverless driver, both cold-start times and execution times in a warm Lambda are at least twice as long, often longer. For example, a cold start with MySQL is 3-4 s, while with the HTTP driver it's more like 10 s. Warm invocations are 1 s compared to 3-4 s.

I haven't yet added any logging to see where the additional time goes, but before I dig in further I was curious what others' experiences have been in terms of performance.

Inserting with a JSON Column

I have a table with a JSON column, something like this:

db
  .schema
  .createTable('my_table')
  .addColumn('my_json', 'json')
  .execute();

When inserting values I get the following error (shortened to the relevant part):

vttablet: rpc error: code = Unknown desc = Invalid JSON text: "Invalid value." at position 1 in value for column 'my_table.my_json'.

When I cast the values to strings (JSON.stringify), everything works fine. The problem is with the typings, though, since Kysely expects the my_json property to be an object.
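One way to reconcile the typings (a sketch using Kysely's ColumnType utility; the table and column names come from the example above, and the document shape is hypothetical) is to declare the column so that selects yield the parsed object while inserts and updates accept the pre-stringified JSON:

import {ColumnType} from 'kysely'

// Hypothetical shape of the stored JSON document.
interface MyJson {
  foo: string
  bar: number
}

interface MyTable {
  // Select as the parsed object (adjust if your setup returns raw strings);
  // insert and update as a string produced by JSON.stringify.
  my_json: ColumnType<MyJson, string, string>
}

// When inserting:
//   db.insertInto('my_table')
//     .values({my_json: JSON.stringify({foo: 'a', bar: 1})})
//     .execute()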
