
forbeslindesay / atdatabases

586 stars · 6 watchers · 46 forks · 2.98 MB

TypeScript clients for databases that prevent SQL Injection

Home Page: https://www.atdatabases.org

License: MIT License

Languages: TypeScript 97.26%, JavaScript 2.68%, CSS 0.05%
Topics: sql, postgres, sqlite, expo, nodejs, javascript, mysql, typescript, databases, promises


atdatabases's Issues

Improved Queryable?

I wanted to get your thoughts on this change before I open a PR. The current Queryable signatures are:

query(query: SQLQuery): Promise<any[]>;
query(query: SQLQuery[]): Promise<any[][]>;
queryStream(
query: SQLQuery,
{batchSize, signal}: {batchSize?: number; signal?: AbortSignal},
): AsyncIterable<any>;

I'd like to change them to accept a generic type parameter:

  query<T = any>(query: SQLQuery): Promise<T[]>;
  query<T = any>(query: SQLQuery[]): Promise<T[][]>;
  queryStream<T = any>(query: SQLQuery, { batchSize, signal }: {
      batchSize?: number;
      signal?: AbortSignal;
  }): AsyncIterable<T>;
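For illustration, a hypothetical call site showing what the generic parameter buys (User is an assumed row type, not part of the proposal):

interface User {
  id: number;
  email: string;
}

// With the proposed signature, rows is typed as User[] instead of any[]
const rows = await db.query<User>(sql`SELECT id, email FROM users`);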

@databases/pg-typed insert improvement proposal

Currently, when we do a tables()["TableName"].insert({}) we end up with:

INSERT INTO "TableName" () VALUES () RETURNING *

Which doesn't work.

How about we default to INSERT INTO "TableName" DEFAULT VALUES RETURNING * instead?

CLI for pg-test and mysql-test

Currently we only have the raw API and the jest helpers. I would like us to add:

  • start - start a DB and optionally "set" the connection string on an env var somewhere
  • stop - stop a DB that was started by the start command.
  • with-db - run a given CLI command with a database running and the connection string in an env var

This will become considerably more useful when combined with the work in #23

Expose table name publicly inside pg-typed Table

Currently the Table instances from pg-typed don't expose the actual tableName publicly: https://github.com/ForbesLindesay/atdatabases/blob/master/packages/pg-typed/src/index.ts#L220

In certain cases, especially when you need to build a complex SQL query using JOINs, you might want to access the table names from the Table instances themselves. Something like this:

sql`
  SELECT *
  FROM ${Foo.tableName}
  JOIN ${Bar.tableName} ON ${Bar.tableName}.id = ${Foo.tableName}.id
`

I can submit a PR for this change.

syntax error at or near "$1"

I am getting syntax errors when running select queries with this library. Can you please look into this?

const { tableName } = req.query;

const tableData = await req.app
  .get('db')
  .query(sql`select * from ${tableName}`);

I am encountering the following error:
{"type":"error","message":"syntax error at or near \"$1\"\n\n> 1 | select * from $1\n | ^^\n","error":{"length":91,"name":"error","severity":"ERROR","code":"42601","position":"15","file":"scan.l","line":"1150","routine":"scanner_yyerror"}}

Document db.stream and make it consistent - MAYBE BREAKING CHANGE

Currently db.stream is implemented in:

  • postgres - returns a node.js readable stream of row objects
  • SQLite - returns an async iterable of row objects

I think we should rename the method on SQLite to something like queryIterable (I'm not wedded to this name), and then we should implement both methods on both libraries. Once this is done, we can add documentation for how to use the APIs.
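For concreteness, the proposed async-iterable form would be consumed like this (queryIterable is the tentative name above, not an existing API):

for await (const row of db.queryIterable(sql`SELECT * FROM users`)) {
  console.log(row);
}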

sql.join - BREAKING CHANGE

Currently sql.join is the one place where we take a string and assume it's safe SQL (other than __dangerous__rawValue).

Example Vulnerable Code

function getMatchingItems(conditions: SQLQuery[], mode: string) {
  return sql`SELECT * FROM items WHERE ${sql.join(conditions, mode)}`;
}
// expected usage: user is able to pass some conditions and choose between "or" and "and"
// actual usage: if user can pass arbitrary "mode" they can run any arbitrary SQL

I'd like to propose that we move to one of these syntaxes...

Option 1

sql.join is a template literal tag that returns a function which you pass an array of query fragments to:

const arrayOfSqlFields = ["a", "b", "c", "d"].map(
  n => sql.identifier(n),
);

sql`SELECT ${sql.join`, `(arrayOfSqlFields)}`;
// => {text: 'SELECT "a", "b", "c", "d";', values: []}

const arrayOfSqlConditions = [
  sql.query`a = ${1}`,
  sql.query`b = ${2}`,
  sql.query`c = ${3}`
];
sql`WHERE (${sql.join`) AND (`(arrayOfSqlConditions)})`;
// => {text: 'WHERE (a = $1) AND (b = $2) AND (c = $3)', values: [1, 2, 3]}

Pros:

  • Simple
  • Specifying the separator up front is kind of nice
  • Overall expression is immediately an SQLQuery object

Cons:

  • Hard to see how you could do default separators with this
  • Bracket grouping feels off

Option 2

sql.join takes an array of SQLQueries and returns a tag for a template literal that results in the final SQLQuery

const arrayOfSqlFields = ["a", "b", "c", "d"].map(
  n => sql.identifier(n),
);

sql`SELECT ${sql.join(arrayOfSqlFields)`, `}`;
// => {text: 'SELECT "a", "b", "c", "d";', values: []}

const arrayOfSqlConditions = [
  sql.query`a = ${1}`,
  sql.query`b = ${2}`,
  sql.query`c = ${3}`
];
sql`WHERE (${sql.join(arrayOfSqlConditions)`) AND (`})`;
// => {text: 'WHERE (a = $1) AND (b = $2) AND (c = $3)', values: [1, 2, 3]}

Pros:

  • Maintains current ordering
  • Could support omitting the template literal for a default separator

Cons:

  • If we do the default thing, there is an awkward new type for a join with default separator to deal with
  • Bracketing still feels wrong

Option 3

sql.join is a template literal tag where the template should have just one value that is the array to join, immediately preceded or followed by the string used to join the fields

const arrayOfSqlFields = ["a", "b", "c", "d"].map(
  n => sql.identifier(n),
);

sql`SELECT ${sql.join`, ${arrayOfSqlFields}`}`;
// => {text: 'SELECT "a", "b", "c", "d";', values: []}

const arrayOfSqlConditions = [
  sql.query`a = ${1}`,
  sql.query`b = ${2}`,
  sql.query`c = ${3}`
];
sql`WHERE (${sql.join`) AND (${arrayOfSqlConditions}`})`;
// => {text: 'WHERE (a = $1) AND (b = $2) AND (c = $3)', values: [1, 2, 3]}

Pros:

  • Single expression with correct bracketing
  • Default of empty string happens naturally
  • Immediately returns type of SQLQuery

Cons:

  • Super odd syntax that nobody will be used to seeing anywhere else

Option 4

Whitelist a few common strings, with some tolerance:

  • /^\s*(or|and|\,)\s*$/i - allow ",", "or", "and" surrounded by optional whitespace, and be case insensitive

And then allow specifying SQLQuery objects as the specifiers for more advanced cases

const arrayOfSqlFields = ["a", "b", "c", "d"].map(
  n => sql.identifier(n),
);

sql`SELECT ${sql.join(arrayOfSqlFields, ', ')}`;
// => {text: 'SELECT "a", "b", "c", "d";', values: []}

const arrayOfSqlConditions = [
  sql.query`a = ${1}`,
  sql.query`b = ${2}`,
  sql.query`c = ${3}`
];
sql`WHERE (${sql.join(arrayOfSqlConditions, sql`) AND (`)})`;
// => {text: 'WHERE (a = $1) AND (b = $2) AND (c = $3)', values: [1, 2, 3]}

Pros:

  • Smallest change for current users - whitelist may even mean no change for many people
  • Clean grouping for syntax

Cons:

  • The most verbose option when you need a separator that doesn't match the whitelist

Option 5

Leave everything as it is, but add a lint rule that requires you only pass string literals to sql.join.

Pros:

  • Simplest API

Cons:

  • People who omit the lint rule may get insecure code - I don't think this is acceptable

Conclusion

Regardless of the chosen syntax, there would need to be a transition period where both the old and new syntax were supported, with a decent deprecation warning for the old syntax. I think I favour option 4, as it seems easiest to explain to newcomers to the library. I wonder if we should even support the whitelist?

MySQL backticks

First of all, thanks for this excellent library. It's a refreshing alternative to ORMs.

There is however one issue which seems to come up very often with MySQL, and that's escaping backticks. As I'm sure you know, MySQL uses them around field names to avoid clashing with reserved words. However, escaping them for use in tagged template literals is just plain ugly.

eg.

 db.query(sql`
    CREATE TABLE \`settings\` (
      \`id\` int(11) unsigned NOT NULL AUTO_INCREMENT,
      \`key\` varchar(64) NOT NULL DEFAULT '',
      \`value\` text,
      PRIMARY KEY (\`id\`),
      UNIQUE KEY \`key\` (\`key\`)
    ) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4;
  `)

Any ideas on how to improve this?

For my queries I've been aliasing the table to a single letter and prefixing my fields, since a field reference with a dot in it doesn't need backticks, but that isn't always possible.
eg.

 db.query(sql`
    Select * 
    from settings as s
    where
      s.key = "sitename"
  `)
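A possible mitigation, assuming @databases/sql exposes an identifier helper that escapes with backticks on MySQL (sql.ident; some of the proposals above write it as sql.identifier):

 db.query(sql`
    SELECT ${sql.ident('s', 'key')}, ${sql.ident('s', 'value')}
    FROM settings AS s
    WHERE ${sql.ident('s', 'key')} = ${'sitename'}
  `)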

MySQL queryNodeStream

Just a quick question: does queryNodeStream kill the query if the connection is closed?

If not, is there a way to get the thread id, so I can run db.query(sql`KILL ${thread_id}`)?

Automatically Generate Types from Database queries

I would like to be able to automatically generate types from the database queries. To do this:

  • Build an SQL parser that is flexible in how it handles placeholders that could be replaced with other SQL, and which supports a reasonable subset of the postgres/mysql/sqlite syntax - being worked on in #20
  • Write util to extract queries from typescript files
  • Write util to extract type from parsed SQL query and database schema
  • Tie everything together in an awesome CLI

sql.file does not support multiple statements for sqlite

Hey, thanks for a great library. I haven't tested this with any other db besides sqlite, but it doesn't seem to support multiple statements via sql.file: https://gist.github.com/petehunt/6a889fc8d01e14fb1667a1e7f9cb0ffd

This is pretty important, as most of the time I have a separate .sql file to set up the schema. Right now I work around it by reading the file directly and splitting on semicolons, which is probably not the most optimal way to do it :)
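A sketch of that workaround, assuming the file is trusted (it relies on the __dangerous__rawValue escape hatch mentioned in the sql.join issue above) and that no statement contains a semicolon inside a string literal; 'schema.sql' is an illustrative path:

import { readFileSync } from 'fs';

async function runSchemaFile() {
  const statements = readFileSync('schema.sql', 'utf8')
    .split(';')
    .map((s) => s.trim())
    .filter((s) => s.length > 0);
  for (const statement of statements) {
    // Raw is OK here because the text comes from a locally-authored file,
    // not from user input
    await db.query(sql.__dangerous__rawValue(statement));
  }
}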

Documentation references files (globalSetup.js & globalTeardown.js) that don't exist

The issue

I'm going through the Postgres Testing with Node.js docs. In the Jest section it says to add this to the Jest config:

globalSetup: "<rootDir>/node_modules/@databases/pg-test/jest/globalSetup.js",
globalTeardown: "<rootDir>/node_modules/@databases/pg-test/jest/globalTeardown.js",

However there is no globalSetup.js or globalTeardown.js file in that directory.


If you have an answer I'd be happy to do a PR against the docs!

Multiple Instances of Dockerized Postgres databases

First of all, this is a great library you've developed; it's made us a lot more productive.

I have a use-case where I have two REST API projects, each with its own repo and set of database tables. We developed them separately because they each cover a different domain. Now we've come to a point in development where we'd like to do some local integration testing, and both projects use @databases/pg-test/.../globalSetup; every time we try to run the second instance of pg-test locally, it kills the first container. Fortunately both projects use different Postgres schemas, so I'm not terribly concerned. I changed the Postgres driver port numbers on one project, but I'm still having no luck.

I was wondering what my options were for running two projects at the same time which each use pg-test? I suppose I could:

  • Copy & paste the migrations & seeders file so that I only need to run pg-start on one project and still be able to run separate REST APIs without issue
  • Define a Kubernetes configuration file that would EITHER
    • run one database in one Docker container and run both sets of migrations/seeders in that one container OR
    • run two databases in two different Docker containers and define a Kubernetes bridge network
  • Maybe I'm missing something in @databases that would allow me to run two containers at the same time?

Build Standalone CLI for pg-migrations

pg-migrations isn't really usable at the moment as it lacks a CLI. It is currently a work in progress, ported over from @moped/db-pg-migrations, which was run via the global moped CLI. The moped CLI assumes the package will be bundled via webpack. This is an option, but I suspect we will be better off with something like ts-node.

I think the tasks are approximately:

  • Document the API for people building non-CLI migrations runners. The closest we have to existing documentation are the tests. The new docs should go in the docs folder so they create pages on the website.
  • Swap camelCase in table/field names for snake_case, using camelCase is odd and makes things look like they're not really native to postgres.
  • Build a CLI. This can be loosely based on the mongo one, but should use a combination of CLI flags, env vars and pg-config for any configuration options. It should prompt for any inputs it needs that are not configured properly (unless run in CI). It should support up all, down all, up one, down one, down last, up a specific migration, down a specific migration, and toggling the state of a specific migration without actually running it.
  • Document the CLI tool

Once this is complete, we'll want to investigate how we port this to MySQL and ideally share code with MySQL.

Property 'query' does not exist on type 'SQL' when trying to use sql.query

Hi there,

Trying out this example from the docs:

const arrayOfSqlConditions = [
  sql.query`a = ${1}`,
  sql.query`b = ${2}`,
  sql.query`c = ${3}`
];
sql`WHERE (${sql.join(arrayOfSqlConditions, sql`) AND (`)})`;
// => {text: 'WHERE (a = $1) AND (b = $2) AND (c = $3)', values: [1, 2, 3]}

gives me:

error TS2339: Property 'query' does not exist on type 'SQL'.
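This looks like the docs lagging behind a rename. Assuming current versions build nested fragments with the plain sql tag rather than sql.query (an assumption; check your installed @databases/sql), the example becomes:

const arrayOfSqlConditions = [
  sql`a = ${1}`,
  sql`b = ${2}`,
  sql`c = ${3}`
];
sql`WHERE (${sql.join(arrayOfSqlConditions, sql`) AND (`)})`;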

Connection does not seem to be terminated

When running the example code connecting to a local MySQL database (https://www.atdatabases.org/docs/mysql), it connects and queries fine, but the node process doesn't terminate automatically, so I have to use ctrl-c. In other libraries like mysql you need to call connection.end(); I can't see an equivalent in this library, but I tried db.dispose() and that failed, saying there were no connections.

I may be doing something wrong here, but I compared the behaviour against the mysql module and that terminates fine. I prefer the interface of this library and would be interested in using it, but at the moment I don't have the confidence to do so.
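For reference, a minimal sketch of the shutdown pattern I would expect, based on the mysql docs linked above (whether dispose is the right call here is exactly the question):

import connect, { sql } from '@databases/mysql';

const db = connect();

async function main() {
  const rows = await db.query(sql`SELECT 1 + 1 AS two`);
  console.log(rows);
}

main()
  .catch((err) => console.error(err))
  // Releases the pool's connections so the node process can exit
  .finally(() => db.dispose());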

AWS data api support?

For AWS serverless Aurora accessed via lambda, the recommended way to talk to a db is via the data api.

There's already a client which is very close to the atdatabases API. It'd be very useful to be able to use atdatabases as a layer so we can call code in the same way locally in tests (by using standard db pooling) vs in production from a lambda (via aws data api).

INSERT complex types into a PostgreSQL table.

I am able to select and insert text and numbers into a PostgreSQL table, but anything more complex fails.

For example, this template literal produces completely legit SQL which I can copy into the command line for execution.

    db.query(sql`
        
    INSERT INTO dev.jack_wolfskin (id, city, store, lat, lng, geom)
    VALUES (
        ${parseInt(store.id)},
        '${store.city}',
        '${JSON.stringify(store)}',
        ${parseFloat(store.lat)},
        ${parseFloat(store.lng)},
        ${'ST_SetSRID(ST_Point(' + parseFloat(store.lng) + ',' + parseFloat(store.lat) + '),4326)'}
    )
    
    `)

The table schema is as follows:

create table if not exists dev.my_table
(
	id integer,
	city text,
	store json,
	lat double precision,
	lng double precision,
	geom geometry(Point, 4326)

);

I have tried to break it down, but I have no clue how to insert a json type or a geometry type generated from a function.

This works pretty well in node-postgres, so there must be something in the atdatabases wrapper which I don't understand.

It would be great if you could provide an example, as I would like to do more testing with this product.
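Two things stand out in the snippet above (an outside diagnosis, not a confirmed answer): values interpolated with ${...} are already sent as bound parameters, so wrapping them in single quotes produces nested quoting, and the ST_SetSRID expression is bound as a plain string rather than executed as SQL. A sketch of the likely fix keeps the function call in the SQL text and binds only the scalar values:

    db.query(sql`
    INSERT INTO dev.jack_wolfskin (id, city, store, lat, lng, geom)
    VALUES (
        ${parseInt(store.id, 10)},
        ${store.city},
        ${JSON.stringify(store)},
        ${parseFloat(store.lat)},
        ${parseFloat(store.lng)},
        ST_SetSRID(ST_Point(${parseFloat(store.lng)}, ${parseFloat(store.lat)}), 4326)
    )
    `)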

pg-test does not wait for postgres to be ready

I'm using pg-test in conjunction with the existing node-pg library (partially because I only found this project recently). However, it seems like it's not waiting for the database to actually be ready before returning.

  const { databaseURL, kill } = await getDatabase();
  await new Promise((accept) => setTimeout(() => accept(), 1000));
  const client = new pg.Client(databaseURL);

  // Without the sleep (setTimeout) above, this throws an error because the database
  // can't be connected to yet
  await client.connect();

An option might be just to poll the database connection until it connects successfully and/or run a sanity check query (e.g., verify that SELECT 1 returns 1).
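A sketch of that polling approach, using the same pg client as above (the retry count and delay are arbitrary):

async function waitForDb(databaseURL: string, attempts = 20): Promise<void> {
  for (let i = 0; i < attempts; i++) {
    const client = new pg.Client(databaseURL);
    try {
      await client.connect();
      // Sanity check: the server accepts queries, not just TCP connections
      await client.query('SELECT 1');
      return;
    } catch {
      await new Promise((resolve) => setTimeout(resolve, 250));
    } finally {
      await client.end().catch(() => {});
    }
  }
  throw new Error('Database did not become ready in time');
}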

migrationsScript property not working with a string

Hi,

When I run my tests, I receive this error:

Error: MySqlConfig.test.migrationsScript should be array

package.json config:

"mysql": {
        "test": {
            "migrationsScript": "npm run db:migrate"
        }
    }

If I change it to an array:

"migrationsScript": [ "npm run db:migrate" ]

I receive this:

Error: spawn npm run db:migrate ENOENT
    at Process.ChildProcess._handle.onexit (internal/child_process.js:267:19)
    at onErrorNT (internal/child_process.js:469:16)
    at processTicksAndRejections (internal/process/task_queues.js:84:21)
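The ENOENT suggests the single array entry is being treated as the executable name. Until the validator accepts a plain string, splitting the command into separate argv entries may work (an assumption based on the error above, not a documented format):

"migrationsScript": ["npm", "run", "db:migrate"]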

The file node_modules/@databases/mysql-config/lib/MySqlConfig.validator.js has this validation:

migrationsScript: {
                    description: 'Optional script to run after the database\nhas been started but before running tests',
                    items: {
                        type: 'string'
                    },
                    type: 'array'
                },

But the @databases/[email protected] tag has this:

migrationsScript: {
          anyOf: [
            {
              items: {
                type: 'string',
              },
              type: 'array',
            },
            {
              type: 'string',
            },
          ],
          description:
            'Optional script to run after the database\nhas been started but before running tests',
        }

"@databases/mysql-test": "^2.0.0"
Node: v12.18.4
NPM: 6.14.6

Unhandled 'error' event

I am using this library to stream data from the database; however, after running for a while it gives me the following error:

throw er; // Unhandled 'error' event
^
Error: Connection terminated unexpectedly
    at Connection.<anonymous> (/var/app/current/node_modules/pg/lib/client.js:132:73)
    at Object.onceWrapper (events.js:482:28)
    at Connection.emit (events.js:376:20)
    at Connection.emit (domain.js:470:12)
    at TLSSocket.<anonymous> (/var/app/current/node_modules/pg/lib/connection.js:108:12)

Until now I supposed this was an issue in the pg npm package, but I am not using that directly in my project. How can I handle the error? Any solution to take care of it would be helpful. Please reply as soon as you can.
version - ^4.0.0

postgres error: bind message supplies 1 parameters, but prepared statement "" requires 0 when selecting to_regclass

With postgres in the latest version, when selecting to_regclass to check if a table exists, the prepared statement fails:

/**
 * Check if a table exists in the database
 * @param {string} table - Name of table
 * @returns {boolean}
 */
export async function checkIfTableExists(table: string): Promise<boolean> {
  console.log(`Checking if table ${table} exists..`);
  const rows = await (db as ConnectionPool).query(
    sql`SELECT to_regclass('public.${table}')`,
  );
  return rows[0].to_regclass === table;
}

output:

error: bind message supplies 1 parameters, but prepared statement "" requires 0
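The placeholder ends up inside the string literal, so the statement text contains no parameters while the client still binds one. Building the argument in JavaScript keeps it a single bound value (a sketch of a likely fix, not a confirmed answer):

export async function checkIfTableExists(table: string): Promise<boolean> {
  const rows = await (db as ConnectionPool).query(
    // 'public.' + table is passed as one ordinary string parameter
    sql`SELECT to_regclass(${'public.' + table}) AS to_regclass`,
  );
  return rows[0].to_regclass === table;
}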

SQL library code

Hi @calebmer - I can't keep track of a discussion in a large commit, so if you want to discuss it, let's talk here.

Regarding the source of the code, the initial version is loosely based on the pg-sql implementation. I'd be happy to provide some sort of attribution if you'd like? I put this implementation into this code base for a few reasons though:

  • I wanted to use TypeScript for it natively, to match the rest of the code base
  • I wanted to also support generating slightly different output for different SQL dialects (not all of them support the explicit $1 style placeholder syntax)
  • The various db libraries here are very tightly integrated with that package, so I need to be able to manage version updates carefully.
  • I felt it was important that sql.raw had a value that was more obviously dangerous.

Regarding the symbols in pg-sql2, this library uses instanceof, which is equivalently secure - you can't create an instance of SQLQuery via JSON.parse.

Let me know if you have any other concerns/if you want specific attribution.

MySQLConnection.tx

Ideally it should be possible to do a transaction inside a task for MySQL, similar to what you can do with postgres. e.g.

db.task(async task => {
  await task.query(sql`SQL THAT RUNS OUTSIDE TRANSACTION`);
  await task.tx(async tx => {
    await tx.query(sql`SQL THAT RUNS INSIDE TRANSACTION`);
  });
});

sql.file

I'd like to add a helper called sql.file that takes a file name and reads the file and returns that as an instance of SQLQuery. This would allow for putting large queries in separate .sql files. I think this wouldn't need to allow for passing parameters to the queries as the primary use case I envision is for migrations scripts.
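A sketch of the intended usage (the path is illustrative, and this assumes the helper lands as described):

import connect, { sql } from '@databases/pg';

const db = connect();

async function migrate() {
  // Reads the file and runs its contents as a single SQLQuery
  await db.query(sql.file('migrations/001-create-users.sql'));
}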

MySQL Dates

Could we have a way of getting dates returned as strings?

@databases/pg-schema-cli config file?

I've been using the TypeScript type generator @databases/pg-schema-cli, and noticed it converts my PG enums to union types by default. While wishing there was a way to generate enums, I found this promising snippet of code in the repo:

enumTypeMode: 'enum' | 'union_alias' | 'union_alias_with_object' | 'inline';

Is there a way to specify a config file for @databases/pg-schema-cli? Or some other way to specify the enum type used? I didn't see it mentioned in the docs anywhere.

Thanks!

Configuring globalSetup to only run once in jest watch mode

Greetings, thanks for the great @databases packages.

I'm currently experimenting with jest and PostgreSQL, and the test setup detailed in the documentation works great (https://www.atdatabases.org/docs/pg-test).

At the moment, jest runs the globalSetup script every time the tests are executed in watch mode (see: jestjs/jest#6800), rather than once on start. This is a bit costly and slows down the iteration/feedback loop when developing. Depending on the case, this may or may not be desirable behavior.

According to the above jest issue, this setting could be configured at the jest level. It appears that the solution currently needs to be implemented in the script which does perform the globalSetup, by checking the watch and watchAll properties.

This feature is not yet supported by jest, though it could/should (?).

What would you think about adding this to @databases/pg-test? I'm unsure about the API (environment variable?) but I may have some time to work on a PR.

Conditional query

Is there a way to do a conditional query, i.e. have the query be different based on a condition? An example (which I'm pretty sure doesn't work):

await db.query(sql`
SELECT * FROM some_table
${someFilter != null ? sql`WHERE some_field = ${someFilter}` : ''}
`)

My only option now, given that I couldn't find a way to do this, is to have two query statements: one with the "WHERE" and one without.
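Assuming the sql tag accepts empty nested fragments (sql`` instead of ''), both branches of the ternary become SQLQuery values and the pattern above should work; a sketch:

await db.query(sql`
SELECT * FROM some_table
${someFilter != null ? sql`WHERE some_field = ${someFilter}` : sql``}
`)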

pg-test in Docker-in-Docker

Hi,

First, thank you for this useful library. I think I may not be able to use it, though, because our CI environment executes jobs inside Docker. When the jest tests start, the process locks up waiting for the DB to start (in my local environment it works perfectly); I've increased the DB wait timeout to 200 seconds without any success. I have the feeling that because this runs in a docker-in-docker context, the port binding is performed against the Docker host rather than my container. Is this correct? Is there any workaround?

I was thinking that an option to point directly to the container IP, instead of mapping the port, would be very useful for this use case.

Thanks!

@databases eats async stack traces

Here's a repro. The line numbers still aren't right, but I think that's an issue with ts-node and my repro (it works in my main project). I think the underlying issue is that then-queue is not using native promises.

import connect, { sql } from "@databases/sqlite";

async function broken() {
  const db = await connect();
  await db.query(sql`this is an invalid sql statement`);
}

async function works() {
  const db = await connect();
  try {
    await db.query(sql`this is an invalid sql statement`);
  } catch (e) {
    throw new Error(e);
  }
}

async function main() {
  // if you uncomment broken(), the following is printed to stderr:
  // Error: SQLITE_ERROR: near "this": syntax error
  //await broken();

  // if you uncomment works(), the following is printed to stderr:
  await works();
  // Error: Error: SQLITE_ERROR: near "this": syntax error
  //   at /mnt/d/Repos/rowrm/src/repro.ts:13:11
  //   at step (/mnt/d/Repos/rowrm/src/repro.ts:37:23)
  //   at Object.throw (/mnt/d/Repos/rowrm/src/repro.ts:18:53)
  //   at rejected (/mnt/d/Repos/rowrm/src/repro.ts:10:65)
}

main().catch((e) => {
  console.error(e.stack);
  process.exit(1);
});

Create @databases/pg-typed, @databases/mysql-typed etc. based on rowrm

I was recently shown rowrm by @petehunt and it looks like an excellent middle ground between the raw (largely not type-safe) SQL interface currently provided by this project and the massively ambitious, complex ORMs like Prisma.

I'd like to start a small collection of type-safe helpers for databases, starting with Postgres. The ideal interface I want to aim for, for each table, is:

interface TableAPI<TRow, TRequiredColumns extends keyof TRow> {
  insert(
    ...rows: (Pick<TRow, TRequiredColumns> & Partial<Omit<TRow, TRequiredColumns>>)[]
  ): Promise<TRow[]>;
  insert(
    columns: SQLQuery,
    ...rows: SQLQuery[]
  ): Promise<TRow[]>;

  upsert(
    ...rows: (Pick<TRow, TRequiredColumns> & Partial<Omit<TRow, TRequiredColumns>>)[]
  ): Promise<TRow[]>;

  update(
    whereValues: Partial<TRow> | SQLQuery,
    updateValues: Partial<TRow> | SQLQuery,
  ): Promise<TRow[]>;

  delete(
    whereValues: Partial<TRow> | SQLQuery
  ): Promise<void>;

  select(
    whereValues: Partial<TRow> | SQLQuery,
    options?: {
      orderBy?: [keyof TRow, 'asc' | 'desc'] | [keyof TRow, 'asc' | 'desc'][],
      limit?: number,
      offset?: number,
    } | SQLQuery,
  ): Promise<TRow[]>;

  // throws if > 1 row matches
  selectOne(whereValues: Partial<TRow> | SQLQuery): Promise<TRow | null>;

  selectFirst(
    whereValues: Partial<TRow> | SQLQuery,
    orderBy: [keyof TRow, 'asc' | 'desc'] | [keyof TRow, 'asc' | 'desc'][] | SQLQuery,
  ): Promise<TRow | null>;


  untypedQuery(query: SQLQuery): Promise<TRow[]>;
}


// connection is only taken right at the end in order to make this easy to use inside & outside of transactions.
export function define<TTables>(): {
  [TTableName in keyof TTables]: (connectionOrTransaction: Queryable) => TableAPI<TTables[TTableName]['table'], TTables[TTableName]['required']>;
};

You could then generate types from a schema like:

create table users (
  user_id integer primary key,
  screen_name varchar(128) unique not null,
  bio text,
  age integer
);

create table photos (
  photo_id integer primary key,
  owner_user_id integer not null,
  cdn_url varchar(128) not null,
  caption text
);

You can then use it like:

const connection = connect();
const { users, photos } = define<DbTables>();

// the rows below are typechecked based on the table schema
await users(connection).insert(
  { user_id: 1, screen_name: "@alice", bio: "my name is alice", age: 100 },
  { user_id: 2, screen_name: "@bob", bio: null, age: 99 }
);

// aliceByPkey is of type DbTables["users"] | null
const aliceByPkey = await users(connection).selectOne({ user_id: 1 });

// photosByAlice is of type DbTables["photos"][]
const photosByAlice = await photos(connection).select({ owner_user_id: 1 });

// you can add sorting criteria as well. all typesafe!
// you can provide an array of `orderBy` columns if you want to sort by multiple columns
const oldestUser = await users(connection).select(
  {},
  { orderBy: [["age", "desc"]], limit: 1 }
);

// drop into raw SQL for WHERE clauses
const ELDERLY_AGE = 100;
// elderlyUsers is of type DbTables["users"][]
const elderlyUsers = await users(connection).select(sql`age >= ${ELDERLY_AGE}`, sql`ORDER BY age DESC`);

// issue optimistically typed raw queries
// unsafeUsers is of type DbTables["users"][] but your query could return anything
const unsafeUsers = await users(connection).untypedQuery(sql`SELECT * FROM users`)

// issue untyped raw queries
// maxAge is of type any
const [{ maxAge }] = await connection.query(sql`select max(age) as maxAge from users`)

Steps to make this work (Postgres)

  • Make pg-schema usable to parse a schema and generate the required interfaces (#101)
  • Build a library that generates the required SQL queries and types based on rowrm but adapted for Postgres (#105)

Future nice to haves

  • selectOne should use a DataLoader with caching disabled to automatically batch and dedupe queries (at least if only one field is included in the where clause)
  • Extend pg-schema to support printing the schema as "CREATE TABLE" statements
  • Build CLI to generate required interfaces by parsing SQL CREATE TABLE statements.

unterminated dollar-quoted string at or near "$$

I'm using @databases/pg. I recently upgraded to 5.1.0 from 2.3.0, and started getting this error on function creation.

Here's my setup (simplified):

export default class Migration {
  static dbConnection: ConnectionPool

  runSingleUpMigration = async () => {
    const migration = sql.file('<path_to_sql_file>')
    await Migration.dbConnection.query(migration)
  }
}

SQL file:

CREATE OR REPLACE FUNCTION update_updated_at_column()
RETURNS TRIGGER AS $$
BEGIN
    NEW.updated_at = now();
    RETURN NEW;
END;
$$ language 'plpgsql';

CREATE TRIGGER <trigger_name> BEFORE UPDATE ON <table_name> FOR EACH ROW EXECUTE PROCEDURE update_updated_at_column();

The error:

unterminated dollar-quoted string at or near "$$
BEGIN
    NEW.updated_at = now()"

  1 | CREATE OR REPLACE FUNCTION update_updated_at_column()
> 2 | RETURNS TRIGGER AS $$
    |                    ^
  3 | BEGIN
  4 |     NEW.updated_at = now()

Would appreciate any help here!

Update: replacing $$ with single quotes is a workaround, but it still looks like a bug: $$ is valid SQL syntax and should be supported.

Question: how to build queries with template literals?

Hi, I'm having trouble with complex queries.

I need a human-friendly SQL query built as a template literal (a chainable builder is no good for me).

Something like this:

const buildQuery = (requiredParam: number, optionalParam?: string) => databasesPool.query(sql`
  SELECT
    * from a
  WHERE
    required = ${requiredParam}
    ${optionalParam ? sql`AND optionalParam = ${optionalParam}` : ''}
`)

But the sql tag can't do that...

In pg.query I could do it via template literals and arguments:

const buildQuery = (requiredParam: number, optionalParam?: string) => pgDriverPool.query(`
  SELECT
    * from a
  WHERE
    required = $1
    ${optionalParam ? 'AND optionalParam = $2' : ''}
`, [requiredParam, optionalParam])

but @databases doesn't allow me to use that with the pool instance.

I'm kind of frustrated with it. I like the TypeScript stuff but I need a template builder as well.
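The same empty-fragment trick from the "Conditional query" issue above should apply here, assuming sql`` is supported as an empty SQLQuery:

const buildQuery = (requiredParam: number, optionalParam?: string) => databasesPool.query(sql`
  SELECT
    * from a
  WHERE
    required = ${requiredParam}
    ${optionalParam ? sql`AND optionalParam = ${optionalParam}` : sql``}
`)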

Add ability to exclude certain tables during `pg-schema` bindings generation

Currently pg-schema scans the whole DB and prints every non-system table, including meta tables like atdatabases_migrations_applied and atdatabases_migrations_version. I would like some mechanism to exclude them from the generated code; a simple --exclude-table table_name1,table_name2 option may be sufficient.

expo requires "fs" module

Currently the sql module requires fs (so that it can read SQL queries from disk) and os (as part of pg-minify). I need to create a cut-down version of this package for use in websql and expo, as it is likely that these modules will not be available in those environments.

Automate tests for expo

Currently the process for testing expo changes is:

  1. cd into expo-test-project
  2. run yarn
  3. run yarn start
  4. scan the QR code on an iPhone or android device with the expo app installed.
  5. check you see "passed" in big green writing.

It would be nice to use something like Detox (perhaps via the react-native orb on CircleCI) to automatically load the app and check that the "passed" text appears.

Using @databases/pg-schema-cli with unix socket connection

I tried using the pg-schema-cli package with a db connection via unix socket. For this, I configured a config.json file (implementing this interface) and passed the --config option with a path to said file, but the command outputs that a connection string is required.

As far as I can tell, connection strings don't support unix socket paths yet - leaving me no real options here.

My current workaround is to temporarily bind the unix socket to a TCP port via socat using this code:

#!/usr/bin/env bash

echo "Binding db unix socket to localhost"
socat 'TCP-LISTEN:1337,reuseaddr,fork' 'UNIX-CONNECT:/var/run/postgresql/.s.PGSQL.5432' &
SOCAT_PID="$!"

npx '@databases/pg-schema-cli'                                                                                                                              \
  --database "postgres://$POSTGRES_USER:$POSTGRES_PASSWORD@localhost:1337/$POSTGRES_DB"  \
  --directory "$TS_FOLDER";

echo "GENERATED TS TYPES IN $TS_FOLDER";

echo "Unbinding db unix socket from localhost (killing pid $SOCAT_PID)"
kill -9 "$SOCAT_PID"

However, I think a case could be made to allow this behavior without the workaround. I imagine the easiest fix that works for all packages would be to extend the connection string schema to allow unix sockets as hosts. If the maintenance team agrees, I would be willing to provide an implementation (scoped to postgres).

Error: Jest: Got error running globalSetup

Hi, it looks like the documented @databases/pg-test setup is not stable.

From time to time I get the following error:

Error: Jest: Got error running globalSetup - .../api-server/node_modules/@databases/pg-test/lib/jest/globalSetup.js, reason: npm run migrate exited with code 1:
could not connect to postgres: Error: Connection terminated unexpectedly
    at Connection.<anonymous> (.../api-server/node_modules/pg/lib/client.js:132:73)

It looks like the connection attempt starts before the container is actually up.

As a workaround we pause the migrations a bit, but it seems the situation should be solved at the library level, with some sort of observer that pings the database port until it opens or a timeout fires.

Unable to use Postgres Anonymous Code Block syntax

The @databases/pg packages do not support parsing file or SQLQuery text containing PG Anonymous Code Block syntax (i.e. $<optional_tag>$ ... $<optional_tag>$): https://www.postgresql.org/docs/12/plpython-do.html.

This syntax is common with structures like DO and PROCEDURE for defining the internal code. Lacking this support makes it impossible to deploy PROCEDUREs, FUNCTIONs, DO blocks, etc., since each of these requires internal commands delimited with ";". Please consider adding this functionality, or advise if it is already supported.

[MySQL] Error: Too many connections

Hi everybody, I need some help.

I have the classic "too many connections" error. I know how to release connections using the mysql2 library, but here I don't know how.

Can you tell me how I can release connections from the pool?

Thanks!

Postgres not working on node 14

I have an issue where transactions never start on node 14. It looks like it's probably a problem with the underlying pg-promise.
