forbeslindesay / atdatabases
TypeScript clients for databases that prevent SQL Injection
Home Page: https://www.atdatabases.org
License: MIT License
I wanted to get your thoughts on this change before I open a PR.
atdatabases/packages/pg/src/types/Queryable.ts
Lines 15 to 20 in 2830a3e
query<T = any>(query: SQLQuery): Promise<T[]>;
query<T = any>(query: SQLQuery[]): Promise<T[][]>;
queryStream<T = any>(query: SQLQuery, { batchSize, signal }: {
batchSize?: number;
signal?: AbortSignal;
}): AsyncIterable<T>;
Currently, when we do a tables()["TableName"].insert({})
we end up with:
INSERT INTO "TableName" () VALUES () RETURNING *
Which doesn't work.
How about we default to INSERT INTO "TableName" DEFAULT VALUES RETURNING * instead?
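The fallback could be as simple as branching on an empty column list. A minimal sketch of the desired SQL text (a hypothetical helper, not the actual pg-typed implementation):

```typescript
// Hypothetical sketch -- not the actual pg-typed code. When the inserted
// row has no columns, emit DEFAULT VALUES instead of the invalid empty
// "() VALUES ()" form.
function buildInsertStatement(table: string, row: Record<string, unknown>): string {
  const columns = Object.keys(row);
  if (columns.length === 0) {
    return `INSERT INTO "${table}" DEFAULT VALUES RETURNING *`;
  }
  const cols = columns.map((c) => `"${c}"`).join(', ');
  const placeholders = columns.map((_, i) => `$${i + 1}`).join(', ');
  return `INSERT INTO "${table}" (${cols}) VALUES (${placeholders}) RETURNING *`;
}
```

(Identifier escaping is omitted here for brevity; the real implementation would escape quotes in table and column names.)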
Unfortunately I need to pivot a project to use IPC sockets instead of a URL connection string. It looks like this isn't possible with this library.
Add an alternative connection object parameter which allows easier configuration of connection and the use of IPC sockets.
See TypeORM's connection options for an example https://github.com/typeorm/typeorm/blob/master/docs/connection-options.md
Currently we only have the raw API and the jest helpers. I would like us to add:
- start - start a DB and optionally "set" the connection string on an env var somewhere
- stop - stop a DB that was started by the start command
- with-db - run a given CLI command with a database running and the connection string in an env var

This will become considerably more useful when combined with the work in #23
Currently the Table instances from pg-typed don't expose the actual tableName publicly: https://github.com/ForbesLindesay/atdatabases/blob/master/packages/pg-typed/src/index.ts#L220
In certain cases, especially when you need to build a complex SQL query using JOINs, you might want to access the table names from the Table instances themselves. Something like this:
sql`
SELECT *
FROM ${Foo.tableName}
JOIN ${Bar.tableName} ON ${Bar.tableName}.id = ${Foo.tableName}.id
`
I can submit a PR for this change.
I am getting syntax errors when using select queries with this library. Can you please look into this?
const { tableName } = req.query;
const tableData = await req.app
  .get('db')
  .query(sql`select * from ${tableName}`);
I am encountering the following error:
{"type":"error","message":"syntax error at or near \"$1\"\n\n> 1 | select * from $1\n | ^^\n","error":{"length":91,"name":"error","severity":"ERROR","code":"42601","position":"15","file":"scan.l","line":"1150","routine":"scanner_yyerror"}}
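The error happens because every value interpolated into the sql tag becomes a bound parameter, and PostgreSQL cannot accept a table name as a parameter. A minimal sketch of the mechanics (fakeSql is a stand-in, not the real library):

```typescript
// Stand-in for the real sql tag, showing how interpolations are turned
// into $1, $2, ... placeholders with the values bound separately.
function fakeSql(strings: TemplateStringsArray, ...values: unknown[]) {
  let text = strings[0];
  for (let i = 0; i < values.length; i++) {
    text += `$${i + 1}` + strings[i + 1];
  }
  return { text, values };
}

const tableName = 'users';
const q = fakeSql`select * from ${tableName}`;
// q.text is 'select * from $1' -- hence the "syntax error at or near $1"
```

With @databases, identifiers are spliced in with sql.ident (e.g. sql`select * from ${sql.ident(tableName)}` - check the docs for the exact helper), though accepting arbitrary table names from req.query is risky even with escaping.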
Currently db.stream is implemented in:
I think we should rename the method on SQLite to something like queryIterable (I'm not wedded to this name), and then we should implement both methods on both libraries. Once this is done, we can add documentation for how to use the APIs.
Currently sql.join is the one place where we take a string and assume it's safe SQL (other than __dangerous__rawValue).
function getMatchingItems(conditions: SQLQuery[], mode: string) {
  return sql`SELECT * FROM items WHERE ${sql.join(conditions, mode)}`;
}
// expected usage: user is able to pass some conditions and choose between "or" and "and"
// actual usage: if user can pass arbitrary "mode" they can run any arbitrary SQL
I'd like to propose that we move to one of these syntaxes...

Option 1: sql.join is a template literal tag that returns a function which you pass an array of query fragments to:
const arrayOfSqlFields = ["a", "b", "c", "d"].map(
n => sql.identifier(n),
);
sql`SELECT ${sql.join`, `(arrayOfSqlFields)}`;
// => {text: 'SELECT "a", "b", "c", "d";', values: []}
const arrayOfSqlConditions = [
sql.query`a = ${1}`,
sql.query`b = ${2}`,
sql.query`c = ${3}`
];
sql`WHERE (${sql.join`) AND (`(arrayOfSqlConditions)})`;
// => {text: 'WHERE (a = $1) AND (b = $2) AND (c = $3)', values: [1, 2, 3]}
Pros:
Cons:
Option 2: sql.join takes an array of SQLQueries and returns a tag for a template literal that results in the final SQLQuery:
const arrayOfSqlFields = ["a", "b", "c", "d"].map(
n => sql.identifier(n),
);
sql`SELECT ${sql.join(arrayOfSqlFields)`, `}`;
// => {text: 'SELECT "a", "b", "c", "d";', values: []}
const arrayOfSqlConditions = [
sql.query`a = ${1}`,
sql.query`b = ${2}`,
sql.query`c = ${3}`
];
sql`WHERE (${sql.join(arrayOfSqlConditions)`) AND (`})`;
// => {text: 'WHERE (a = $1) AND (b = $2) AND (c = $3)', values: [1, 2, 3]}
Pros:
Cons:
Option 3: sql.join is a template literal tag where the template should have just one value, the array to join, immediately preceded or followed by the string used to join the fields:
const arrayOfSqlFields = ["a", "b", "c", "d"].map(
n => sql.identifier(n),
);
sql`SELECT ${sql.join`, ${arrayOfSqlFields}`}`;
// => {text: 'SELECT "a", "b", "c", "d";', values: []}
const arrayOfSqlConditions = [
sql.query`a = ${1}`,
sql.query`b = ${2}`,
sql.query`c = ${3}`
];
sql`WHERE (${sql.join`) AND (${arrayOfSqlConditions}`})`;
// => {text: 'WHERE (a = $1) AND (b = $2) AND (c = $3)', values: [1, 2, 3]}
Pros:
Cons:
Option 4: Whitelist a few common strings, with some tolerance:
/^\s*(or|and|\,)\s*$/i
- allow ",", "or", and "and" surrounded by optional whitespace, case-insensitively

Then allow specifying SQLQuery objects as the separators for more advanced cases:
const arrayOfSqlFields = ["a", "b", "c", "d"].map(
n => sql.identifier(n),
);
sql`SELECT ${sql.join(arrayOfSqlFields, ', ')}`;
// => {text: 'SELECT "a", "b", "c", "d";', values: []}
const arrayOfSqlConditions = [
sql.query`a = ${1}`,
sql.query`b = ${2}`,
sql.query`c = ${3}`
];
sql`WHERE (${sql.join(arrayOfSqlConditions, sql`) AND (`)})`;
// => {text: 'WHERE (a = $1) AND (b = $2) AND (c = $3)', values: [1, 2, 3]}
Pros:
Cons:
Option 5: Leave everything as it is, but add a lint rule that requires you only pass string literals to sql.join.
Pros:
Cons:
Regardless of the chosen syntax, there would need to be a transition period where both old and new syntax was supported, with a decent deprecation warning for the old syntax. I think I favour option 4, as it seems easiest to explain to newcomers of the library. I wonder if we should even support the whitelist?
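For reference, the option 4 whitelist check is tiny. A sketch, assuming a plain-string separator is only accepted when it matches the regex above:

```typescript
// Accept only ",", "or", "and" (case-insensitive, optional surrounding
// whitespace) when sql.join is given a plain string separator; anything
// else would have to be passed as a SQLQuery.
const SAFE_SEPARATOR = /^\s*(or|and|,)\s*$/i;

function isSafeSeparator(mode: string): boolean {
  return SAFE_SEPARATOR.test(mode);
}
```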
First of all, thanks for this excellent library. It's a refreshing alternative to ORMs.
There is however one issue which seems to come up very often with MySQL, and that's escaping backticks. As I'm sure you know, MySQL uses them around field names to avoid clashing with reserved words. However, escaping them for use in tagged template literals is just plain ugly.
eg.
db.query(sql`
CREATE TABLE \`settings\` (
\`id\` int(11) unsigned NOT NULL AUTO_INCREMENT,
\`key\` varchar(64) NOT NULL DEFAULT '',
\`value\` text,
PRIMARY KEY (\`id\`),
UNIQUE KEY \`key\` (\`key\`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4;
`)
Any ideas on how to improve this?
For my queries I've been aliasing the table to a single letter, then prefixing my fields. If a field name has a dot in it, you don't need backticks, but that isn't always possible.
eg.
db.query(sql`
Select *
from settings as s
where
s.key = "sitename"
`)
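One direction (an assumption about the API shape, not a current feature) is to splice identifiers in through a helper so the literal backticks disappear from the template. The escaping such a helper would perform for MySQL is simple:

```typescript
// MySQL identifier quoting: wrap in backticks and double any embedded
// backtick. A helper like this would let queries avoid literal \` escapes.
function mysqlIdent(name: string): string {
  return '`' + name.replace(/`/g, '``') + '`';
}

const ddl = `CREATE TABLE ${mysqlIdent('settings')} (
  ${mysqlIdent('id')} int(11) unsigned NOT NULL AUTO_INCREMENT,
  ${mysqlIdent('key')} varchar(64) NOT NULL DEFAULT '',
  PRIMARY KEY (${mysqlIdent('id')})
)`;
```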
Just a quick question. Does queryNodeStream kill the query if the connection is closed? If not, is there a way to get the thread id, so I can run db.query(sql`KILL ${thread_id}`)?
I would like to be able to automatically generate types from the database queries. To do this:
Hey - thanks for a great library. I haven't tested this with any other DB besides SQLite, but it doesn't seem to support multiple statements via sql.file: https://gist.github.com/petehunt/6a889fc8d01e14fb1667a1e7f9cb0ffd
This is pretty important, as most of the time I have a separate .sql file to set up the schema. Right now I work around it by reading the file directly and splitting on semicolons -- not the most optimal way to do it, I think :)
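For reference, the split-on-semicolons workaround is roughly this (a naive sketch; it breaks on semicolons inside string literals or dollar-quoted bodies, so it only suits simple schema files):

```typescript
// Naive statement splitter: split on ";", trim, and drop empty chunks.
// Not safe for semicolons inside strings, comments, or $$ ... $$ bodies.
function splitStatements(sqlText: string): string[] {
  return sqlText
    .split(';')
    .map((s) => s.trim())
    .filter((s) => s.length > 0);
}
```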
I'm going through the Postgres Testing with Node.js docs. In the Jest section, it says to add this to the jest config:
globalSetup: "<rootDir>/node_modules/@databases/pg-test/jest/globalSetup.js",
globalTeardown: "<rootDir>/node_modules/@databases/pg-test/jest/globalTeardown.js",
However, there is no globalSetup.js or globalTeardown.js file in that directory.
If you have an answer I'd be happy to do a PR against the docs!
First of all, this is a great library you've developed--it's made us a lot more productive.
I have this use-case where I have two REST API projects, each having their own repo and set of database tables. We developed them separately because they each cover a different domain. Now we've come to a point in development where we'd like to do some local integration testing, and both projects use @databases/pg-test/.../globalSetup, and every time we try to run the second instance of pg-test locally, it kills the first container. Fortunately both projects use different Postgres schemas, so I'm not terribly concerned. I changed the Postgres driver port numbers on one project, but I'm still having no luck.
I was wondering what my options were for running two projects at the same time which each use pg-test. I suppose I could run pg-start on one project and still be able to run separate REST APIs without issue. Is there a flag in @databases that would allow me to run two containers at the same time?

pg-migrations isn't really useable at the moment as it lacks a CLI. It is currently a work in progress, ported over from @moped/db-pg-migrations, which was run via the global moped CLI. The moped CLI assumes the package will be bundled via webpack. This is an option, but I suspect we will be better off with something like ts-node.
I think the tasks are approximately:
- swap camelCase in table/field names for snake_case; using camelCase is odd and makes things look like they're not really native to postgres

Once this is complete, we'll want to investigate how we port this to MySQL and ideally share code with MySQL.
Hi there,
Trying out this example from the docs:
const arrayOfSqlConditions = [
sql.query`a = ${1}`,
sql.query`b = ${2}`,
sql.query`c = ${3}`
];
sql`WHERE (${sql.join(arrayOfSqlConditions, sql`) AND (`)})`;
// => {text: 'WHERE (a = $1) AND (b = $2) AND (c = $3)', values: [1, 2, 3]}
gives me:
error TS2339: Property 'query' does not exist on type 'SQL'.
When running the example code connecting to a local MySQL database (https://www.atdatabases.org/docs/mysql), it connects and queries fine, but the node process doesn't automatically terminate, so I have to use ctrl-c to stop it. In other libraries like mysql you need to call connection.end(); I can't see that in this library, but I tried db.dispose() and that failed, saying there were no connections.
I may be doing something wrong here, but I compared the behaviour against the mysql module and that terminates fine. I prefer the interface of this library, so I would be interested in using it, but at the moment I don't have the confidence to do so.
For AWS serverless Aurora accessed via lambda, the recommended way to talk to a db is via the data api.
There's already a client which is very close to the atdatabases API. It'd be very useful to be able to use atdatabases as a layer so we can call code in the same way locally in tests (by using standard db pooling) vs in production from a lambda (via aws data api).
I am able to select and insert text and numbers into a PostgreSQL table, but anything more complex fails.
For example, this template literal produces completely legit SQL which I can copy into the command line for execution.
db.query(sql`
INSERT INTO dev.jack_wolfskin (id, city, store, lat, lng, geom)
VALUES (
${parseInt(store.id)},
'${store.city}',
'${JSON.stringify(store)}',
${parseFloat(store.lat)},
${parseFloat(store.lng)},
${'ST_SetSRID(ST_Point(' + parseFloat(store.lng) + ',' + parseFloat(store.lat) + '),4326)'}
)
`)
The table schema is as follows:
create table if not exists dev.my_table
(
id integer,
city text,
store json,
lat double precision,
lng double precision,
geom geometry(Point, 4326)
);
I have tried to break it down, but I have no clue how to insert a json type or a geometry type generated from a function.
This works pretty well in node-postgres, so there must be something in the atdatabases wrapper which I don't understand.
It would be great if you can provide an example as I would like to do more testing with this product.
I'm using pg-test in conjunction with the existing node-pg library (partially since I only found this recently). However, it seems like it's not waiting for the database to actually be ready before returning.
const { databaseURL, kill } = await getDatabase();
await new Promise((accept) => setTimeout(() => accept(), 1000));
const client = new pg.Client(databaseURL);
// Without the sleep (setTimeout) above, this throws an error because the database
// can't be connected to yet
await client.connect();
An option might be just to poll the database connection until it connects successfully and/or run a sanity check query (e.g., verify that SELECT 1 returns 1).
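The polling idea can be a small generic helper; the connect call itself is whatever client you use (the pg usage below is illustrative):

```typescript
// Retry an async "is the DB ready?" probe until it succeeds or the
// deadline passes, re-throwing the last error on timeout.
async function waitForReady<T>(
  attempt: () => Promise<T>,
  { timeoutMs = 10_000, intervalMs = 250 } = {},
): Promise<T> {
  const deadline = Date.now() + timeoutMs;
  for (;;) {
    try {
      return await attempt();
    } catch (err) {
      if (Date.now() >= deadline) throw err;
      await new Promise((resolve) => setTimeout(resolve, intervalMs));
    }
  }
}
```

Usage with the snippet above would be something like `const client = await waitForReady(async () => { const c = new pg.Client(databaseURL); await c.connect(); return c; });` instead of the fixed sleep.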
Hi,
When I run my tests, I receive this error:
Error: MySqlConfig.test.migrationsScript should be array
package.json config:
"mysql": {
"test": {
"migrationsScript": "npm run db:migrate"
}
}
If I change it to an array:
"migrationsScript": [ "npm run db:migrate" ]
I receive this:
Error: spawn npm run db:migrate ENOENT
at Process.ChildProcess._handle.onexit (internal/child_process.js:267:19)
at onErrorNT (internal/child_process.js:469:16)
at processTicksAndRejections (internal/process/task_queues.js:84:21)
The file node_modules/@databases/mysql-config/lib/MySqlConfig.validator.js has this validation:
migrationsScript: {
description: 'Optional script to run after the database\nhas been started but before running tests',
items: {
type: 'string'
},
type: 'array'
},
But the @databases/[email protected] tag has this:
migrationsScript: {
anyOf: [
{
items: {
type: 'string',
},
type: 'array',
},
{
type: 'string',
},
],
description:
'Optional script to run after the database\nhas been started but before running tests',
}
"@databases/mysql-test": "^2.0.0"
Node: v12.18.4
NPM: 6.14.6
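The ENOENT suggests the whole string is being treated as the executable name, so each argv token likely needs to be its own array element (an assumption based on how child_process.spawn works; worth confirming against the mysql-test source):

```json
"mysql": {
  "test": {
    "migrationsScript": ["npm", "run", "db:migrate"]
  }
}
```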
I am using this library to stream data from the database; however, after running for a while it gives me the following error:
throw er; // Unhandled 'error' event ^ Error: Connection terminated unexpectedly at Connection.<anonymous> (/var/app/current/node_modules/pg/lib/client.js:132:73) at Object.onceWrapper (events.js:482:28) at Connection.emit (events.js:376:20) at Connection.emit (domain.js:470:12) TLSSocket.<anonymous> (/var/app/current/node_modules/pg/lib/connection.js:108:12)
Until now I supposed it was an issue in the pg npm package, but I am not using that directly in my project. How can I handle this error? Any solution to take care of it would be helpful. Please reply as soon as you can.
version - ^4.0.0
With Postgres in the latest version, when selecting to_regclass to check if a table exists, the prepared statement fails:
/**
* Check if a table exists in the database
* @param {string} table - Name of table
* @returns {boolean}
*/
export async function checkIfTableExists(table: string): Promise<boolean> {
console.log(`Checking if table ${table} exists..`);
const rows = await (db as ConnectionPool).query(
sql`SELECT to_regclass('public.${table}')`,
);
return rows[0].to_regclass === table;
}
output:
error: bind message supplies 1 parameters, but prepared statement "" requires 0
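The mismatch comes from the interpolation sitting inside single quotes: the generated $1 lands inside a string literal, so the statement itself declares no parameters while one value is still bound. A stand-in tag (renderText, not the real library) shows the generated text, and that passing the whole name as the value avoids the problem:

```typescript
// Stand-in that renders the statement text the way a sql-style tag would,
// replacing each interpolation with $1, $2, ...
function renderText(strings: TemplateStringsArray, ...values: unknown[]): string {
  let text = strings[0];
  for (let i = 0; i < values.length; i++) {
    text += `$${i + 1}` + strings[i + 1];
  }
  return text;
}

const table = 'users';
// Broken: the placeholder ends up inside the quotes, as literal text.
const broken = renderText`SELECT to_regclass('public.${table}')`;
// broken === "SELECT to_regclass('public.$1')"

// Fixed: to_regclass takes text, so pass the whole name as the parameter.
const fixed = renderText`SELECT to_regclass(${'public.' + table})`;
// fixed === "SELECT to_regclass($1)"
```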
MySQL booleans on the database are returned as 1 or 0 rather than true or false.
Hi @calebmer - I can't keep track of a discussion in a large commit, so if you want to discuss it, let's talk here.
Regarding the source of the code, the initial version is loosely based on the pg-sql implementation. I'd be happy to provide some sort of attribution if you'd like? I put this implementation into this code base for a few reasons though:
- ($1 style placeholder syntax)
- sql.raw had a value that was more obviously dangerous

Regarding the symbols in pg-sql2, this library uses instanceof, which is equivalently secure - you can't create an instance of SQLQuery via JSON.parse.
Let me know if you have any other concerns/if you want specific attribution.
Ideally it should be possible to do a transaction inside a task for MySQL, similar to what you can do with postgres. e.g.
db.task(async task => {
await task.query(sql`SQL THAT RUNS OUTSIDE TRANSACTION`);
await task.tx(async tx => {
await tx.query(sql`SQL THAT RUNS INSIDE TRANSACTION`);
});
});
I'd like to add a helper called sql.file that takes a file name, reads the file, and returns it as an instance of SQLQuery. This would allow for putting large queries in separate .sql files. I think this wouldn't need to allow for passing parameters to the queries, as the primary use case I envision is migration scripts.
Could we have a way of getting dates returned as strings.
Been using the TypeScript type generator @databases/pg-schema-cli, and noticed it converted my PG enums to union types by default. While wishing there was a way to generate enums, I found this promising snippet of code in the repo:
Is there a way to specify a config file for @databases/pg-schema-cli? Or some other way to specify the enum type used? I didn't see it mentioned in the docs anywhere.
Thanks!
Greetings, thanks for the great @databases packages.
I'm currently experimenting with jest and PostgreSQL, and the test setup detailed in the documentation works great (https://www.atdatabases.org/docs/pg-test).
At the moment, jest runs the globalSetup script every time the tests are executed in watch mode (see: jestjs/jest#6800), rather than once on start. This is a bit costly and slows down the iteration/feedback loop when developing. Depending on the case, this may or may not be desirable behavior.
According to the above jest issue, this setting could be configured at the jest level. It appears that the solution currently needs to be implemented in the script which performs the globalSetup, by checking the watch and watchAll properties.
This feature is not yet supported by jest, though it could/should (?).
What would you think about adding this to @databases/pg-test? I'm unsure about the API (environment variable?) but I may have some time to work on a PR.
Is there a way to do a conditional query, i.e. have the query be different based on a condition. An example (which I'm pretty sure doesn't work):
await db.query(sql`
SELECT * FROM some_table
${someFilter != null ? sql`WHERE some_field = ${someFilter}` : ''}
`)
My only option now, given that I couldn't find a way to do this, is to have two query statements: one with the "WHERE" and one without.
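With libraries in this style, the else branch usually needs to be an empty query fragment rather than a plain string (with @databases that would be sql``, though check the docs). The composition idea, modeled with a minimal {text, values} shape instead of the real SQLQuery type:

```typescript
// Minimal stand-in for a query fragment: statement text plus bound values.
type Fragment = { text: string; values: unknown[] };

const empty: Fragment = { text: '', values: [] };

function whereEq(field: string, value: unknown): Fragment {
  return { text: `WHERE ${field} = ?`, values: [value] };
}

// Compose the optional filter fragment into the full statement.
function selectFrom(table: string, filter: Fragment): Fragment {
  return {
    text: `SELECT * FROM ${table} ${filter.text}`.trim(),
    values: filter.values,
  };
}

const someFilter: string | null = 'abc';
const q = selectFrom('some_table', someFilter != null ? whereEq('some_field', someFilter) : empty);
// q.text === 'SELECT * FROM some_table WHERE some_field = ?'
```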
Hi,
First, thank you for this useful library. Though I think I would not be able to use it, because our CI environment executes jobs inside docker. When the jest tests are starting, the process gets locked waiting for the DB to start (in my local environment it works perfectly). I've increased the timeout for waiting on the DB up to 200 seconds without any success. I have the feeling that as this is running in a docker-in-docker context, the port binding would not be performed against my container itself but against the docker host. Is this correct? Is there any workaround?
I was thinking that an option to point directly to the container IP, instead of mapping the port, would be very useful for this use case.
Thanks!
Hi @ForbesLindesay,
I'd like to be able to pass in a constructor for how to create the pg client/pool. This is in order to support tracing via https://github.com/openzipkin/zipkin-js/tree/master/packages/zipkin-instrumentation-postgres
Would you be open to accepting this as a PR?
Thanks for the fantastic lib!
Here's a repro. The line numbers still aren't right, but that's an issue with ts-node and my repro, I think (it works in my main project). I think the issue is that then-queue is not using native promises.
import connect, { sql } from "@databases/sqlite";
async function broken() {
const db = await connect();
await db.query(sql`this is an invalid sql statement`);
}
async function works() {
const db = await connect();
try {
await db.query(sql`this is an invalid sql statement`);
} catch (e) {
throw new Error(e);
}
}
async function main() {
// if you uncomment broken(), the following is printed to stderr:
// Error: SQLITE_ERROR: near "this": syntax error
//await broken();
// if you uncomment works(), the following is printed to stderr:
await works();
// Error: Error: SQLITE_ERROR: near "this": syntax error
// at /mnt/d/Repos/rowrm/src/repro.ts:13:11
// at step (/mnt/d/Repos/rowrm/src/repro.ts:37:23)
// at Object.throw (/mnt/d/Repos/rowrm/src/repro.ts:18:53)
// at rejected (/mnt/d/Repos/rowrm/src/repro.ts:10:65)
}
main().catch((e) => {
console.error(e.stack);
process.exit(1);
});
Depends on #34
I was recently shown rowrm by @petehunt and it looks like an excellent middle ground between the raw (largely not type safe) SQL interface currently provided by this project, and the massively ambitious and complex ORMs like Prisma.
I'd like to start a small collection of type safe helpers for databases, starting with Postgres. The ideal interface I want to aim for, for each table is:
interface TableAPI<TRow, TRequiredColumns> {
insert(
...rows: (Pick<TRow, TRequiredColumns> & Partial<Omit<TRow, TRequiredColumns>>)[]
): Promise<TRow[]>;
insert(
columns: SQLQuery,
...rows: SQLQuery[]
): Promise<TRow[]>;
upsert(
...rows: (Pick<TRow, TRequiredColumns> & Partial<Omit<TRow, TRequiredColumns>>)[]
): Promise<TRow[]>;
update(
whereValues: Partial<TRow> | SQLQuery,
updateValues: Partial<TRow> | SQLQuery,
): Promise<TRow[]>;
delete(
whereValues: Partial<TRow> | SQLQuery
): Promise<void>;
select(
whereValues: Partial<TRow> | SQLQuery,
options?: {
orderBy?: [keyof TRow, 'asc' | 'desc'] | [keyof TRow, 'asc' | 'desc'][],
limit?: number,
offset?: number,
} | SQLQuery,
): Promise<TRow[]>;
// throws if > 1 row matches
selectOne(whereValues: Partial<TRow> | SQLQuery): Promise<TRow | null>;
selectFirst(
whereValues: Partial<TRow> | SQLQuery,
orderBy: [keyof TRow, 'asc' | 'desc'] | [keyof TRow, 'asc' | 'desc'][] | SQLQuery,
): Promise<TRow | null>;
untypedQuery(query: SQLQuery): Promise<TRow[]>;
}
// connection is only taken right at the end in order to make this easy to use inside & outside of transactions.
export function define<TTables>(): {
[TTableName in keyof TTables]: (connectionOrTransaction: Queryable) => Table<TTables[TTableName]['table'], TTables[TTableName]['required']>;
};
You could then generate types from a schema like:
create table users (
user_id integer primary key,
screen_name varchar(128) unique not null,
bio text,
age integer
);
create table photos (
photo_id integer primary key,
owner_user_id integer not null,
cdn_url varchar(128) not null,
caption text
);
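For that schema, the generated argument to define<...> might look something like this (illustrative, hand-written types, not actual generator output; nullable columns become | null and the not-null columns without defaults form the required set):

```typescript
// Illustrative version of what a generator could emit for the schema above.
interface UsersRow {
  user_id: number;
  screen_name: string;
  bio: string | null;
  age: number | null;
}

interface PhotosRow {
  photo_id: number;
  owner_user_id: number;
  cdn_url: string;
  caption: string | null;
}

// "required" lists the columns that must be supplied on insert.
interface DbTables {
  users: { table: UsersRow; required: 'user_id' | 'screen_name' };
  photos: { table: PhotosRow; required: 'photo_id' | 'owner_user_id' | 'cdn_url' };
}
```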
You can then use it like:
const connection = connect();
const { users, photos } = define<DbTables>();
// the rows below are typechecked based on the table schema
await users(connection).insert(
{ user_id: 1, screen_name: "@alice", bio: "my name is alice", age: 100 },
{ user_id: 2, screen_name: "@bob", bio: null, age: 99 }
);
// aliceByPkey is of type DbTables["users"] | null
const aliceByPkey = await users(connection).selectOne({ user_id: 1 });
// photosByAlice is of type DbTables["photos"][]
const photosByAlice = await photos(connection).select({ owner_user_id: 1 });
// you can add sorting criteria as well. all typesafe!
// you can provide an array of `orderBy` columns if you want to sort by multiple columns
const oldestUser = await users(connection).select(
{},
{ orderBy: [["age", "desc"]], limit: 1 }
);
// drop into raw SQL for WHERE clauses
const ELDERLY_AGE = 100;
// elderlyUsers is of type DbTables["users"][]
const elderlyUsers = await users(connection).select(sql`age >= ${ELDERLY_AGE}`, sql`ORDER BY age DESC`);
// issue optimistically typed raw queries
// unsafeUsers is of type DbTables["users"][] but your query could return anything
const unsafeUsers = await users(connection).untypedQuery(sql`SELECT * FROM users`)
// issue untyped raw queries
// maxAge is of type any
const [{ maxAge }] = await connection.query(sql`select max(age) as maxAge from users`)
selectOne should use a DataLoader with caching disabled to automatically batch and dedupe queries (at least if only one field is included in the where clause).

I'm using @databases/pg. I recently upgraded to 5.1.0 from 2.3.0, and started getting this error on function creation.
Here's my setup (simplified):
export default class Migration {
static dbConnection: ConnectionPool
runSingleUpMigration = async () => {
const migration = sql.file('<path_to_sql_file>')
await Migration.dbConnection.query(migration)
}
}
SQL file:
CREATE OR REPLACE FUNCTION update_updated_at_column()
RETURNS TRIGGER AS $$
BEGIN
NEW.updated_at = now();
RETURN NEW;
END;
$$ language 'plpgsql';
CREATE TRIGGER <trigger_name> BEFORE UPDATE ON <table_name> FOR EACH ROW EXECUTE PROCEDURE update_updated_at_column();
The error:
unterminated dollar-quoted string at or near "$$
BEGIN
NEW.updated_at = now()"
1 | CREATE OR REPLACE FUNCTION update_updated_at_column()
> 2 | RETURNS TRIGGER AS $$
| ^
3 | BEGIN
4 | NEW.updated_at = now()
Would appreciate any help here!
Update: replacing $$ with single quotes is a workaround, but it still looks like a bug: $$ is valid SQL syntax and should be supported.
Hi, I'm having trouble with complex queries.
I need a human-friendly SQL query built as a template literal (a chainable builder is no good for me).
Something like this:
const buildQuery = (requiredParam: number, optionalParam?: string) => databasesPool.query(sql`
SELECT
* from a
WHERE
required = ${requiredParam}
${optionalParam ? sql`AND optionalParam = ${optionalParam}` : ''}
`)
But the sql tag can't do that...
In pg.query I could do it via template literals and arguments:
const buildQuery = (requiredParam: number, optionalParam?: string) => pgDriverPool.query(`
SELECT
* from a
WHERE
required = $1
${optionalParam ? 'AND optionalParam = $2' : ''}
`, [requiredParam, optionalParam])
but @databases doesn't allow me to use it with the pool instance.
I'm kind of frustrated with it. I like the TypeScript stuff, but I need a template builder as well.
Currently pg-schema scans the whole DB and prints any non-system table, including meta tables like atdatabases_migrations_applied and atdatabases_migrations_version. I would like to see some mechanism to exclude them from the generated code; a simple --exclude-table table_name1,table_name2 option may be sufficient.
Currently the sql module requires fs so that it can read SQL queries from disk, and os as part of pg-minify. I need to create a cut-down version of this package for use in WebSQL and Expo, as it is likely that these modules will not be available in those environments.
Currently the process for testing expo changes is:
yarn
yarn start
It would be nice to use something like Detox (perhaps via the react-native orb on CircleCI) to automatically load the app and check that the "passed" text appears.
I tried using the package pg-schema-cli with a DB connection via unix socket. For this, I configured a config.json file (implementing this interface) and the --config option with a path to said config.json file, but the command outputs that a connection string is required.
As far as I can tell, connection strings don't support unix socket paths yet - leaving me no real options here.
My current workaround is to temporarily bind the unix socket to a TCP port via socat
using this code:
#!/usr/bin/env bash
echo "Binding db unix socket to localhost"
socat 'TCP-LISTEN:1337,reuseaddr,fork' 'UNIX-CONNECT:/var/run/postgresql/.s.PGSQL.5432' &
SOCAT_PID="$!"
npx '@databases/pg-schema-cli' \
--database "postgres://$POSTGRES_USER:$POSTGRES_PASSWORD@localhost:1337/$POSTGRES_DB" \
--directory "$TS_FOLDER";
echo "GENERATED TS TYPES IN $TS_FOLDER";
echo "Unbinding db unix socket from localhost (killing pid $SOCAT_PID)"
kill -9 "$SOCAT_PID"
However, I think a case could be made to allow this behavior without the workaround. I could imagine the easiest fix that works for all packages could be to extend the schema of the connection string to allow unix sockets as hosts. If the maintenance team agrees with me, I would be willing to provide an implementation (scoped to postgres).
Hi, it looks like the documented @databases/pg-test setup is not stable.
From time to time I get the following error:
Error: Jest: Got error running globalSetup - .../api-server/node_modules/@databases/pg-test/lib/jest/globalSetup.js, reason: npm run migrate exited with code 1:
could not connect to postgres: Error: Connection terminated unexpectedly
at Connection.<anonymous> (.../api-server/node_modules/pg/lib/client.js:132:73)
It looks like the connection attempt starts before the container is actually up.
As a workaround we pause the migrations a bit, but it seems the situation should be solved at the library level, with some sort of observer that pings the database port until it opens or a timeout fires.
I'd like to use @databases to connect to an Amazon RDS proxy. The proxy uses IAM authentication, which requires the use of the mysql_clear_password
authentication plugin. I don't see anything in the docs about authentication plugins. Can you help?
See https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/UsingWithRDS.IAMDBAuth.Connecting.AWSCLI.html for an example of using the MySQL CLI to connect to an RDS instance with IAM authentication.
The @databases/pg packages do not support parsing file or SQLQuery text containing PG Anonymous Code Block syntax (i.e. $<optional_tag>$ ... $<optional_tag>$
): https://www.postgresql.org/docs/12/plpython-do.html.
This syntax is common with structures like DO and PROCEDURE for defining the internal code definition. Lack of this support makes it impossible to deploy PROCEDURES, FUNCTIONS, DO blocks, etc since each of these require internal commands delimited with ";". Please consider adding this functionality. Please advise if this functionality is already supported.
For escaping identifiers, loops are used which are not that pretty/optimised and can easily be exchanged for a RegExp replace.
So here:
https://github.com/ForbesLindesay/atdatabases/blob/master/packages/sql/src/SQLQuery.ts#L402-L411
Use:
return '"' + str.replace(/"/g, '""') + '"'
Similar also for the MySQL identifier escaping here: https://github.com/ForbesLindesay/atdatabases/blob/master/packages/sql/src/SQLQuery.ts#L437-L446
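For the PostgreSQL case, the suggested replacement is a one-liner; doubling embedded double quotes is the quoting rule:

```typescript
// Escape a PostgreSQL identifier: wrap in double quotes and double any
// embedded double quote, replacing the character-by-character loop.
function escapePostgresIdentifier(str: string): string {
  return '"' + str.replace(/"/g, '""') + '"';
}
```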
Hi everybody, I need some help.
I have the classic "too many connections" error. I know how to release connections using the mysql2 library, but here I don't know how.
Can you tell me how I can release connections from the pool?
Thanks!
I have an issue where transactions never start on node 14. It looks like it's probably a problem with the underlying pg-promise