
loopback-connector-postgresql

PostgreSQL is a popular open-source object-relational database. The loopback-connector-postgresql module is the PostgreSQL connector for the LoopBack framework.

The PostgreSQL connector supports both LoopBack 3 and LoopBack 4. For more information, see LoopBack 4 documentation, LoopBack 3 documentation and Module Long Term Support Policy below.

NOTE: The PostgreSQL connector requires PostgreSQL 8.x or 9.x.

Installation

In your application root directory, enter this command to install the connector:

$ npm install loopback-connector-postgresql --save

This installs the module from npm and adds it as a dependency to the application's package.json file.

If you create a PostgreSQL data source using the data source generator as described below, you don't have to do this, since the generator will run npm install for you.

Creating a data source

For LoopBack 4 users, use the LoopBack 4 command-line interface to generate a DataSource with the PostgreSQL connector for your LB4 application. Run lb4 datasource; it will prompt for configuration values such as host, port, etc. that are required to connect to a PostgreSQL database.

After setting it up, the configuration can be found under src/datasources/<DataSourceName>.datasource.ts, which would look like this:

const config = {
  name: 'db',
  connector: 'postgresql',
  url: '',
  host:'localhost',
  port: 5432,
  user: 'user',
  password: 'pass',
  database: 'testdb',
};
For LoopBack 3 users

Use the Data source generator to add a PostgreSQL data source to your application. The generator will prompt for the database server hostname, port, and other settings required to connect to a PostgreSQL database. It will also run the npm install command above for you.

The entry in the application's /server/datasources.json will look like this:

{% include code-caption.html content="/server/datasources.json" %}

"mydb": {
  "name": "mydb",
  "connector": "postgresql",
  "host": "mydbhost",
  "port": 5432,
  "url": "postgres://admin:admin@mydbhost:5432/db1?ssl=false",
  "database": "db1",
  "password": "admin",
  "user": "admin",
  "ssl": false
}

Edit datasources.json to add other properties that enable you to connect the data source to a PostgreSQL database.
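The discrete settings and the url property describe the same connection, and when both are present the connector typically uses url. As an illustration, a url equivalent to the discrete fields above can be assembled like this (toConnectionUrl is a hypothetical helper for illustration, not part of the connector):

```typescript
// Hypothetical helper: assembles a postgres:// URL from the discrete
// settings shown in datasources.json above. Illustrative only; the
// connector itself accepts either form.
interface PgSettings {
  host: string;
  port: number;
  user: string;
  password: string;
  database: string;
}

function toConnectionUrl(s: PgSettings): string {
  return `postgres://${s.user}:${s.password}@${s.host}:${s.port}/${s.database}`;
}

// Yields 'postgres://admin:admin@mydbhost:5432/db1'; options such as
// ssl=false can be appended as a query string.
const url = toConnectionUrl({
  host: 'mydbhost',
  port: 5432,
  user: 'admin',
  password: 'admin',
  database: 'db1',
});
```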

Connection Pool Settings

You can also specify connection pool settings in <DataSourceName>.datasource.ts (or datasources.json for LB3 users). For instance, you can specify the minimum and maximum pool size, and the maximum time a pooled client may stay idle before it is closed.

Example of db.datasource.ts:

const config = {
  name: 'db',
  connector: 'postgresql',
  url: '',
  host: 'localhost',
  port: 5432,
  user: 'user',
  password: 'pass',
  database: 'testdb',
  min: 5,
  max: 200,
  idleTimeoutMillis: 60000,
  ssl: false
};

Check out node-pg-pool and node postgres pooling example for more information.

Configuration options

Property Type Description
connector String Connector name, either "loopback-connector-postgresql" or "postgresql"
database String Database name
debug Boolean If true, turn on verbose mode to debug database queries and lifecycle.
host String Database host name
password String Password to connect to database
port Number Database TCP port
url String Use instead of the host, port, user, password, and database properties. For example: 'postgres://test:mypassword@localhost:5432/dev'.
username String Username to connect to database
min Integer Minimum number of clients in the connection pool
max Integer Maximum number of clients in the connection pool
idleTimeoutMillis Integer Maximum time a client in the pool has to stay idle before closing it
ssl Boolean Whether to try SSL/TLS to connect to server
defaultIdSort Boolean/String Set to false to disable default sorting on id column(s). Set to numericIdOnly to only apply to IDs with a number type id.
allowExtendedOperators Boolean Set to true to enable PostgreSQL-specific operators such as contains. Learn more in Extended operators below.

NOTE: By default, the 'public' schema is used for all tables.

The PostgreSQL connector uses node-postgres as the driver. For more information about configuration parameters, see node-postgres documentation.

Connecting to UNIX domain socket

A common PostgreSQL configuration is to connect to the UNIX domain socket /var/run/postgresql/.s.PGSQL.5432 instead of using the TCP/IP port. For example:

const config = {
  name: 'db',
  connector: 'postgresql',
  url: '',
  host: '/var/run/postgresql/',
  port: 5432,
  user: 'user',
  password: 'pass',
  database: 'testdb',
  debug: true
};
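PostgreSQL names its socket file `.s.PGSQL.<port>` inside the socket directory, which is why both the directory (as host) and the port are still needed. A sketch of how the full socket path is formed (socketPath is an illustrative helper, not connector API):

```typescript
// PostgreSQL listens on a socket file named `.s.PGSQL.<port>` inside the
// directory passed as `host`. Illustrative helper only.
function socketPath(hostDir: string, port: number): string {
  const dir = hostDir.endsWith('/') ? hostDir : hostDir + '/';
  return `${dir}.s.PGSQL.${port}`;
}
```

With the configuration above, the driver ends up connecting to /var/run/postgresql/.s.PGSQL.5432.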

Defining models

LoopBack allows you to specify some database settings through the model definition and/or the property definition. These definitions would be mapped to the database. Please check out the CLI lb4 model for generating LB4 models. The following is a typical LoopBack 4 model that specifies the schema, table and column details through model definition and property definitions:

@model({
  settings: { postgresql: { schema: 'public', table: 'inventory'} },
})
export class Inventory extends Entity {
  @property({
    type: 'number',
    required: true,
    scale: 0,
    id: 1,
    postgresql: {
      columnName: 'id',
      dataType: 'integer',
      dataLength: null,
      dataPrecision: null,
      dataScale: 0,
      nullable: 'NO',
    },
  })
  id: number;

  @property({
    type: 'string',
    postgresql: {
      columnName: 'name',
      dataType: 'text',
      dataLength: null,
      dataPrecision: null,
      dataScale: null,
      nullable: 'YES',
    },
  })
  name?: string;

  @property({
    type: 'boolean',
    required: true,
    postgresql: {
      columnName: 'available',
      dataType: 'boolean',
      dataLength: null,
      dataPrecision: null,
      dataScale: null,
      nullable: 'NO',
    },
  })
  available: boolean;

  constructor(data?: Partial<Inventory>) {
    super(data);
  }
}
For LoopBack 3 users

The model definition consists of the following properties.

Property Default Description
name Camel-case of the database table name Name of the model.
options N/A Model level operations and mapping to PostgreSQL schema/table
properties N/A Property definitions, including mapping to PostgreSQL column

For example:

{% include code-caption.html content="/common/models/model.json" %}

{
  "name": "Inventory",
  "options": {
    "idInjection": false,
    "postgresql": {
      "schema": "strongloop",
      "table": "inventory"
    }
  },
  "properties": {
    "id": {
      "type": "String",
      "required": false,
      "length": 64,
      "precision": null,
      "scale": null,
      "postgresql": {
        "columnName": "id",
        "dataType": "character varying",
        "dataLength": 64,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "NO"
      }
    },
    "productId": {
      "type": "String",
      "required": false,
      "length": 20,
      "precision": null,
      "scale": null,
      "id": 1,
      "postgresql": {
        "columnName": "product_id",
        "dataType": "character varying",
        "dataLength": 20,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "YES"
      }
    },
    "locationId": {
      "type": "String",
      "required": false,
      "length": 20,
      "precision": null,
      "scale": null,
      "id": 1,
      "postgresql": {
        "columnName": "location_id",
        "dataType": "character varying",
        "dataLength": 20,
        "dataPrecision": null,
        "dataScale": null,
        "nullable": "YES"
      }
    },
    "available": {
      "type": "Number",
      "required": false,
      "length": null,
      "precision": 32,
      "scale": 0,
      "postgresql": {
        "columnName": "available",
        "dataType": "integer",
        "dataLength": null,
        "dataPrecision": 32,
        "dataScale": 0,
        "nullable": "YES"
      }
    },
    "total": {
      "type": "Number",
      "required": false,
      "length": null,
      "precision": 32,
      "scale": 0,
      "postgresql": {
        "columnName": "total",
        "dataType": "integer",
        "dataLength": null,
        "dataPrecision": 32,
        "dataScale": 0,
        "nullable": "YES"
      }
    }
  }
}

To learn more about specifying database settings, please check the section Data Mapping Properties.

Type mapping

See LoopBack 4 types (or LoopBack 3 types) for details on LoopBack's data types.

LoopBack to PostgreSQL types

LoopBack Type                   PostgreSQL Type
String / JSON / Text / Default  VARCHAR2 (default length is 1024)
String[]                        VARCHAR2[]
Number                          INTEGER
Date                            TIMESTAMP WITH TIME ZONE
Boolean                         BOOLEAN

Besides the basic LoopBack types introduced above, you can also specify the database type for model properties; it will be mapped to the database (see Data Mapping Properties). For example, if we would like the property price to have the database type double precision in the corresponding table, we specify it as follows:

  @property({
    type: 'number',
    postgresql: {
      dataType: 'double precision',
    },
  })
  price?: number;
For LoopBack 3 users
"properties": {
    // ..
    "price": {
      "type": "Number",
      "postgresql": {
        "dataType": "double precision"
      }
    },

{% include warning.html content=" Not all database types are supported for CRUD operations and queries with filters. For example, type Array cannot be filtered correctly; see GitHub issues #441 and #342. " %}

PostgreSQL types to LoopBack

PostgreSQL Type                                      LoopBack Type
BOOLEAN                                              Boolean
VARCHAR, CHARACTER VARYING, CHARACTER, CHAR, TEXT    String
BYTEA                                                Node.js Buffer object
SMALLINT, INTEGER, BIGINT, DECIMAL, NUMERIC,
REAL, DOUBLE PRECISION, FLOAT, SERIAL, BIGSERIAL     Number
DATE, TIMESTAMP, TIMESTAMP WITH TIME ZONE,
TIMESTAMP WITHOUT TIME ZONE, TIME,
TIME WITH TIME ZONE, TIME WITHOUT TIME ZONE          Date
POINT                                                GeoPoint
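The mapping above can be read as a simple lookup from an upper-cased PostgreSQL type name to a LoopBack type. A sketch mirroring the table (the table itself, not connector internals):

```typescript
// Lookup mirroring the PostgreSQL-to-LoopBack table above.
// Illustrative only; the connector performs this mapping internally.
const pgToLoopback: Record<string, string> = {
  BOOLEAN: 'Boolean',
  VARCHAR: 'String', 'CHARACTER VARYING': 'String', CHARACTER: 'String',
  CHAR: 'String', TEXT: 'String',
  BYTEA: 'Buffer',
  SMALLINT: 'Number', INTEGER: 'Number', BIGINT: 'Number',
  DECIMAL: 'Number', NUMERIC: 'Number', REAL: 'Number',
  'DOUBLE PRECISION': 'Number', FLOAT: 'Number',
  SERIAL: 'Number', BIGSERIAL: 'Number',
  DATE: 'Date', TIMESTAMP: 'Date',
  'TIMESTAMP WITH TIME ZONE': 'Date', 'TIMESTAMP WITHOUT TIME ZONE': 'Date',
  TIME: 'Date', 'TIME WITH TIME ZONE': 'Date', 'TIME WITHOUT TIME ZONE': 'Date',
  POINT: 'GeoPoint',
};

function loopbackType(pgType: string): string | undefined {
  return pgToLoopback[pgType.toUpperCase()];
}
```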

Numeric Data Type

Note: By default, the Node.js driver for PostgreSQL returns the Numeric type as a string on GET operations. This avoids precision loss, since Numeric values in PostgreSQL cannot always be safely converted to a JavaScript Number.

For details, see the corresponding driver issue.
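The precision problem is easy to demonstrate in plain JavaScript: any integer beyond Number.MAX_SAFE_INTEGER (2^53 - 1) cannot be represented exactly as a Number, which is why the driver keeps NUMERIC values as strings:

```typescript
// A NUMERIC value as the driver would deliver it: a string of exact digits.
const raw = '9007199254740993'; // 2^53 + 1

// Converting to a JavaScript Number silently rounds to the nearest
// representable double, losing the final digit.
const lossy = Number(raw); // becomes 9007199254740992

// The string form preserves the exact digits; the Number form does not.
const preserved = String(lossy) === raw; // false
```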

Querying JSON fields

Note: The fields you are querying should be set up to use the JSON PostgreSQL data type; see Defining models.

Assuming a model such as this:

  @property({
    type: 'object',
    postgresql: {
      dataType: 'json',
    },
  })
  address?: object;

You can query the nested fields with dot notation:

CustomerRepository.find({
  where: {
    'address.state': 'California',
  },
  order: 'address.city',
});

Extended operators

The connector supports the following PostgreSQL-specific operators.

Please note that extended operators are disabled by default; you must enable them at the datasource level or the model level by setting allowExtendedOperators to true.

Operator contains

The contains operator allows you to query array properties and return only rows where the stored value contains all of the items specified in the query.

The operator is implemented using PostgreSQL array operator @>.

Note: The fields you are querying must be set up to use the PostgreSQL array data type; see Defining models above.

Assuming a model such as this:

@model({
  settings: {
    allowExtendedOperators: true,
  }
})
class Post {
  @property({
    type: ['string'],
    postgresql: {
      dataType: 'varchar[]',
    },
  })
  categories?: string[];
}

You can query the categories field as follows:

const posts = await postRepository.find({
  where: {
    categories: {contains: ['AA']},
  },
});

Operator containedBy

The inverse of the contains operator, the containedBy operator allows you to query array properties and return only rows where all of the items in the stored value are contained in the query.

The operator is implemented using PostgreSQL array operator <@.

Note: The fields you are querying must be set up to use the PostgreSQL array data type; see Defining models above.

Assuming a model such as this:

@model({
  settings: {
    allowExtendedOperators: true,
  }
})
class Post {
  @property({
    type: ['string'],
    postgresql: {
      dataType: 'varchar[]',
    },
  })
  categories?: string[];
}

You can query the categories field as follows:

const posts = await postRepository.find({
  where: {
    categories: {containedBy: ['AA']},
  },
});

Operator containsAnyOf

The containsAnyOf operator allows you to query array properties and return only rows where any of the items in the stored value matches any of the items in the query.

The operator is implemented using PostgreSQL array overlap operator &&.

Note: The fields you are querying must be set up to use the PostgreSQL array data type; see Defining models above.

Assuming a model such as this:

@model({
  settings: {
    allowExtendedOperators: true,
  }
})
class Post {
  @property({
    type: ['string'],
    postgresql: {
      dataType: 'varchar[]',
    },
  })
  categories?: string[];
}

You can query the categories field as follows:

const posts = await postRepository.find({
  where: {
    categories: {containsAnyOf: ['AA']},
  },
});
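The three array operators express set relations between the stored array and the query array. Their semantics can be sketched in plain TypeScript (illustrative predicates only; the real filtering happens in SQL via @>, <@ and &&):

```typescript
// contains (@>): the stored array holds every queried item.
const contains = (stored: string[], query: string[]) =>
  query.every(q => stored.includes(q));

// containedBy (<@): every stored item appears in the query array.
const containedBy = (stored: string[], query: string[]) =>
  stored.every(s => query.includes(s));

// containsAnyOf (&&): the two arrays overlap in at least one item.
const containsAnyOf = (stored: string[], query: string[]) =>
  stored.some(s => query.includes(s));
```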

Operator match

The match operator allows you to perform a full text search using the @@ operator in PostgreSQL.

Assuming a model such as this:

@model({
  settings: {
    allowExtendedOperators: true,
  }
})
class Post {
  @property({
    type: 'string',
  })
  content: string;
}

You can query the content field as follows:

const posts = await postRepository.find({
  where: {
    content: {match: 'someString'},
  },
});

Discovery and auto-migration

Model discovery

The PostgreSQL connector supports model discovery, which enables you to create LoopBack models based on an existing database schema, once you have defined your data source.

(See database discovery API for related API information.)

Auto-migration

The PostgreSQL connector also supports auto-migration, which enables you to create a database schema from LoopBack models.

For example, based on the following model, the auto-migration method would create or alter the existing customer table under the public schema in the database. The customer table would have two columns, name and id, where id is the primary key with a SERIAL default value because it is defined with type: 'Number' and generated: true:

@model()
export class Customer extends Entity {
  @property({
    id: true,
    type: 'Number',
    generated: true
  })
  id: number;

  @property({
    type: 'string'
  })
  name: string;
}

By default, tables generated by the auto-migration are under public schema and named in lowercase.

Besides the basic model metadata, LoopBack allows you to specify part of the database schema definition via the property definition, which would be mapped to the database.

For example, based on the following model, after running the auto-migration script, a table named CUSTOMER under schema market will be created. Moreover, you can also have different names for your property and the corresponding column. In the example, by specifying the column name, the property name will be mapped to the customer_name column. This is useful when your database has a different naming convention than LoopBack (camelCase).

@model({
  settings: {
    postgresql: {schema: 'market', table: 'CUSTOMER'},
  },
})
export class Customer extends Entity {
  @property({
    id: true,
    type: 'Number',
    generated: true
  })
  id: number;

  @property({
    type: 'string',
    postgresql: {
      columnName: 'customer_name'
    }
  })
  name: string;
}

For how to run the script and more details:

(See LoopBack auto-migrate method for related API information.)

Here are some limitations and tips:

  • If you defined generated: true in the id property, it generates integers by default. For auto-generated uuid, see Auto-generated id property
  • Only the id property supports the auto-generation setting generated: true for now
  • Auto-migration doesn't create foreign key constraints by default. But they can be defined through the model definition. See Auto-migrate with foreign keys
  • Destroying models may result in errors due to foreign key integrity. First delete any related models by calling delete on models with relationships.

Auto-migrate/Auto-update models with foreign keys

Foreign key constraints can be defined in the model definition.

Note: The order of table creation is important. A referenced table must exist before creating a foreign key constraint.

Define your models and the foreign key constraints as follows:

customer.model.ts:

@model()
export class Customer extends Entity {
  @property({
    id: true,
    type: 'Number',
    generated: true
  })
  id: number;

  @property({
    type: 'string'
  })
  name: string;
}

order.model.ts:

@model({
  settings: {
    foreignKeys: {
      fk_order_customerId: {
        name: 'fk_order_customerId',
        entity: 'Customer',
        entityKey: 'id',
        foreignKey: 'customerId',
        onDelete: 'CASCADE',
        onUpdate: 'SET NULL'
      },
    },
  },
})
export class Order extends Entity {
  @property({
    id: true,
    type: 'Number',
    generated: true
  })
  id: number;

  @property({
    type: 'string'
  })
  name: string;

  @property({
    type: 'Number'
  })
  customerId: number;
}
For LoopBack 3 users
({
  "name": "Customer",
  "options": {
    "idInjection": false
  },
  "properties": {
    "id": {
      "type": "Number",
      "id": 1
    },
    "name": {
      "type": "String",
      "required": false
    }
  }
},
{
  "name": "Order",
  "options": {
    "idInjection": false,
    "foreignKeys": {
      "fk_order_customerId": {
        "name": "fk_order_customerId",
        "entity": "Customer",
        "entityKey": "id",
        "foreignKey": "customerId",
        "onDelete": "CASCADE",
        "onUpdate": "SET NULL"
      }
    }
  },
  "properties": {
    "id": {
      "type": "Number",
      "id": 1
    },
    "customerId": {
      "type": "Number"
    },
    "description": {
      "type": "String",
      "required": false
    }
  }
})

{% include tip.html content=" Removing or updating the value of `foreignKeys` will delete or update the constraints in the database tables. If there is a reference to an object being deleted, the `DELETE` will fail. Likewise, a create with an invalid foreign key id will cause the `POST` to fail. The `onDelete` and `onUpdate` properties are optional and default to `NO ACTION`. " %}

Auto-generated ids

Auto-migrate supports the automatic generation of property values for the id property. For PostgreSQL, the default id type is integer. Thus if you have generated: true in the id property, it generates integers by default:

{
  id: true,
  type: 'Number',
  required: false,
  generated: true // enables auto-generation
}

It is common to use UUIDs as the primary key in PostgreSQL instead of integers. You can enable this in either of the following ways:

  • use a uuid generated by your LoopBack application by setting defaultFn: 'uuid':

  @property({
    id: true,
    type: 'string',
    defaultFn: 'uuid',
    // generated: true,  -> not needed
  })
  id: string;
  • use PostgreSQL built-in (extension and) uuid functions:

  @property({
    id: true,
    type: 'String',
    required: false,
    // settings below are needed
    generated: true,
    useDefaultIdType: false,
    postgresql: {
      dataType: 'uuid',
    },
  })
  id: string;

This setting uses the uuid-ossp extension and the uuid_generate_v4() function by default.

If you'd like to use other extensions and functions, you can do:

  @property({
    id: true,
    type: 'String',
    required: false,
    // settings below are needed
    generated: true,
    useDefaultIdType: false,
    postgresql: {
      dataType: 'uuid',
      extension: 'myExtension',
      defaultFn: 'myuuid'
    },
  })
  id: string;

WARNING: It is the user's responsibility to make sure the provided extension and function are valid.
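With defaultFn: 'uuid' the id is generated in the application layer rather than by the database. Conceptually it produces an RFC 4122 version-4 value; a minimal generator sketch follows (illustrative only; real applications should use a vetted library or Node's crypto.randomUUID):

```typescript
// Minimal RFC 4122 v4 UUID sketch. Math.random() is NOT cryptographically
// secure; this only illustrates the shape of the generated ids.
function uuidV4(): string {
  return 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, c => {
    const r = (Math.random() * 16) | 0;
    const v = c === 'x' ? r : (r & 0x3) | 0x8; // variant bits 10xx
    return v.toString(16);
  });
}
```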

Module Long Term Support Policy

This module adopts the Module Long Term Support (LTS) policy, with the following End Of Life (EOL) dates:

Version Status Published EOL
5.x Current Apr 2020 Apr 2023 (minimum)
3.x Active LTS Mar 2017 Apr 2022

Learn more about our LTS plan in docs.

Running tests

Own instance

If you have a local or remote PostgreSQL instance and would like to use that to run the test suite, use the following command:

  • Linux
POSTGRESQL_HOST=<HOST> POSTGRESQL_PORT=<PORT> POSTGRESQL_USER=<USER> POSTGRESQL_PASSWORD=<PASSWORD> POSTGRESQL_DATABASE=<DATABASE> CI=true npm test
  • Windows
SET POSTGRESQL_HOST=<HOST>
SET POSTGRESQL_PORT=<PORT>
SET POSTGRESQL_USER=<USER>
SET POSTGRESQL_PASSWORD=<PASSWORD>
SET POSTGRESQL_DATABASE=<DATABASE>
SET CI=true
npm test

Docker

If you do not have a local PostgreSQL instance, you can also run the test suite with very minimal requirements.

  • Assuming you have Docker installed, run the following script, which will spawn a PostgreSQL instance on your local machine:
source setup.sh <HOST> <PORT> <USER> <PASSWORD> <DATABASE>

where <HOST>, <PORT>, <USER>, <PASSWORD> and <DATABASE> are optional parameters. The default values are localhost, 5432, root, pass and testdb respectively.

  • Run the test:
npm test


loopback-connector-postgresql's Issues

Underscored DB names

It would be nice if, instead of simply lowercasing model names and properties (AaaBbb -> aaabbb), they were converted to underscored form (AaaBbb -> aaa_bbb) when creating the DB in Postgres. That would make DB creation consistent with the reverse operation (discover and build models).
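The conversion this issue asks for (camelCase or PascalCase to snake_case) can be sketched as follows; toSnakeCase is an illustrative helper, not something the connector provides:

```typescript
// Converts AaaBbb -> aaa_bbb and locationId -> location_id.
// Illustrative only; the connector currently just lowercases names.
function toSnakeCase(name: string): string {
  return name.replace(/([a-z0-9])([A-Z])/g, '$1_$2').toLowerCase();
}
```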

DateTime returning incorrect time in 2.0.0 & 2.1.0 for get request

Sample app for testing this is here. It's just a base install, a simple model, and branches for the different versions of loopback-connector-postgresql:

https://github.com/vespertilian/loopback_datetime

2.1.0 & 2.0.0

Data:

{
  "testDate1": "2015-06-10T01:11:06.774Z"
}

Response: correct

{
  "testDate1": "2015-06-10T01:11:06.774Z",
  "id": 5
}

Get /DatePostgres/:id (5)
Response: incorrect

{
  "testDate1": "2015-06-09T15:11:06.774Z",
  "testDate2": null,
  "testDate3": null,
  "id": 6
}

1.7.1

Data:

{
  "testDate1": "2015-06-10T01:11:06.774Z"
}

Response: correct

{
  "testDate1": "2015-06-10T01:11:06.774Z",
  "id": 5
}

Get /DatePostgres/:id (5)
Response: correct

{
  "testDate1": "2015-06-10T01:11:06.774Z",
  "testDate2": null,
  "testDate3": null,
  "id": 5
}

DB table view with PG-Commander

Both dates are saved in my local timezone (Sydney). It would be good if they respected the timezone they were submitted in. I am keen to store everything in UTC and convert it on the front end.

testdate1: 2015-06-10 11:11:06.774+10 id: 5
testdate1: 2015-06-10 01:11:06.774+10 id: 6

Thanks
Cameron
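The behaviour the reporter expects is a lossless UTC round trip: a timestamp submitted with a Z offset should read back as the same instant. In plain JavaScript the ISO serialization is timezone-safe, so any shift observed through the connector points at the column type or the driver's timezone handling (for example TIMESTAMP WITHOUT TIME ZONE dropping the offset):

```typescript
// A UTC timestamp as submitted in the issue's POST body.
const submitted = '2015-06-10T01:11:06.774Z';

// Parsing and re-serializing preserves the instant exactly:
// Date stores milliseconds since the epoch, and toISOString() always emits UTC.
const roundTripped = new Date(submitted).toISOString();
```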

Error: connect ETIMEDOUT

If I simply run the command slc run . on a brand new app, I get the output below. The only thing I have done is install the connector from this repo, follow the LoopBack tutorial for creating a connector, then create a model using that connector.

Once I run the command, after about a minute of either doing nothing or trying to use the app, I get this output:

$ slc run .
supervisor running without clustering (unsupervised)
2014-08-16T09:12:49.615Z pid:45122 worker:supervisor INFO strong-agent dashboard is at https://strongops.strongloop.com
2014-08-16T09:12:49.618Z pid:45122 worker:supervisor INFO strong-agent using collector https://collector.strongloop.com:443
2014-08-16T09:12:49.618Z pid:45122 worker:supervisor INFO strong-agent v0.4.12 profiling app 'shinyredrobot.com' pid '45122'
2014-08-16T09:12:49.618Z pid:45122 worker:supervisor INFO strong-agent[45122] started profiling agent
Browse your REST API at http://localhost:3000/explorer
Web server listening at: http://localhost:3000/
2014-08-16T09:12:49.952Z pid:45122 worker:supervisor INFO strong-agent[45122] connected to collector
{ [Error: connect ETIMEDOUT] code: 'ETIMEDOUT', errno: 'ETIMEDOUT', syscall: 'connect' }

/www/projects/shinyredrobot/shinyredrobot.com/node_modules/loopback-connector-postgresql/lib/postgresql.js:105
      throw err;
            ^
Error: connect ETIMEDOUT
    at errnoException (net.js:904:11)
    at Object.afterConnect [as oncomplete] (net.js:895:19)

here is my datasources.json:

{
  "db"          : {
    "name"     : "db",
    "connector": "memory"
  },
  "dev_redrobot": {
    "name"     : "dev_redrobot",
    "connector": "postgresql",
    "database" : "dev_redrobot",
    "username" : "myuser",
    "password" : "mypass"
  }
}

I followed the articles for getting started with LoopBack and didn't really have any issues, so I am not sure what I may be doing wrong. But I can't stay connected to the db. I also want to note that before this timeout, the connection and the example models work fine; I can post data to the db using Postman.

I am running Postgres 9.3, OSX 10.9, and node v0.10.28 and the slc command was just installed today. If there are any other logs or info I can provide, please let me know.

Running `autoupdate` is not idempotent

I have noticed that autoupdate exhibits some very odd behavior. On base models (like User) where I have defined additional properties, those properties are removed if they exist, and added if they do not exist. This means the following can happen:

  1. Add first custom field to a model, A
  2. Run autoupdate. A is present.
  3. Add another custom field, B.
  4. Run autoupdate. B is present, A is not.
  5. Run autoupdate. A is present, B is not.

automigrate does work as expected, however.

Error for the latest postgresql connector: method 'notifyObserversAround' does not exist (bug)

this.notifyObserversAround('execute', context, function(context, done) {
^
TypeError: Object # has no method 'notifyObserversAround'
    at PostgreSQL.SQLConnector.execute (D:\Workspace\RockieData\Sandbox\MK\alert\node_modules\loopback-connector-postgresql\node_modules\loopback-connector\lib\sql.js:407:8)
    at DataSource. (D:\Workspace\RockieData\Sandbox\MK\alert\node_modules\loopback-connector-postgresql\node_modules\loopback-connector\lib\sql.js:397:12)
    at DataSource.g (events.js:180:16)
    at DataSource.EventEmitter.emit (events.js:92:17)
    at DataSource.postInit (D:\Workspace\RockieData\Sandbox\MK\alert\node_modules\loopback-datasource-juggler\lib\datasource.js:310:14)
    at D:\Workspace\RockieData\Sandbox\MK\alert\node_modules\loopback-connector-postgresql\lib\postgresql.js:91:17
    at D:\Workspace\RockieData\Sandbox\MK\alert\node_modules\loopback-connector-postgresql\node_modules\pg\lib\pool.js:77:9
    at D:\Workspace\RockieData\Sandbox\MK\alert\node_modules\loopback-connector-postgresql\node_modules\pg\node_modules\generic-pool\lib\generic-pool.js:281:11
    at D:\Workspace\RockieData\Sandbox\MK\alert\node_modules\loopback-connector-postgresql\node_modules\pg\lib\pool.js:51:18
    at null. (D:\Workspace\RockieData\Sandbox\MK\alert\node_modules\loopback-connector-postgresql\node_modules\pg\lib\client.js:144:7)

Queries with dates are not working

The 2.x version of loopback-connector-postgresql is failing for simple queries too.

I have the following model:

{
  "name": "histrollup",
  "base": "PersistedModel",
  "idInjection": false,
  "options": {
    "validateUpsert": true
  },
  "postgresql": {
    "table": "v_histrollup"
  },
  "properties": {
    "property": {
      "type": "string",
      "id": true,
      "required": true,
      "postgresql": {
        "dataType": "uuid"
      }
    },
    "staydate": {
      "type": "date",
      "required": true
    }
  }
}

When I do simple queries on this model things are not working:

app.models.Histrollup.find({where: {property: inputs.property, staydate:new Date('2015-05-01')}, limit:10})

The query that is getting generated for this is:
SQL: SELECT "property","staydate","rolluptype","totnights","totrevenue","bymarketcd","bychannel","byroomtype" FROM "castle"."v_histrollup" WHERE "property"=$1 AND "staydate"$2 ORDER BY "property" LIMIT 10
Parameters: ["c5ba1990-c458-11e4-a8e5-0b887e7b0d20",null]

The date parameter is passed as null and I get the following error:

error: syntax error at or near "$2"

Also the following query is not working:

app.models.Histrollup.find({where: {property: inputs.property, staydate: {between: [inputs.startStaydate, inputs.endStaydate]}}, limit:10})

SQL: SELECT "property","staydate","rolluptype","totnights","totrevenue","bymarketcd","bychannel","byroomtype" FROM "castle"."v_histrollup" WHERE "property"=$1 AND "staydate" BETWEEN $2 AND $3 ORDER BY "property" LIMIT 10
Parameters: ["c5ba1990-c458-11e4-a8e5-0b887e7b0d20",{"sql":"to_timestamp(?,'yyyy-mm-dd hh24:mi:ss.ms')","params":["2014-04-28T10:00:00.000Z"]},{"sql":"to_timestamp(?,'yyyy-mm-dd hh24:mi:ss.ms')","params":["2014-10-01T09:59:59.999Z"]}]

error: invalid input syntax for type date: "{"sql":"to_timestamp(?,'yyyy-mm-dd hh24:mi:ss.ms')","params":["2014-04-28T10:00:00.000Z"]}"

Regards,
-Kesav

updateOrCreate return value is broken

updateOrCreate should return the updated data but instead returns only the id value. This is not what Juggler's DAO expects, and it breaks DataAccessObject's updateOrCreate/upsert method.

Getting error while creating new role over REST interface.

I am getting the following error while trying to create a new role over the REST interface if I do not pass an id manually. Manually passing an id saves the data, which should not be necessary; the id should be auto-generated, unless I am missing something here.
{ -error: { name: "error" status: 500 message: "null value in column " id " violates not-null constraint" length: 246 severity: "ERROR" code: "23502" detail: "Failing row contains (null, adminjk, nj, 2014-05-14 00:18:20+05:30, 2014-05-14 00:18:20+05:30)." file: "src\backend\executor\execMain.c" line: "1611" routine: "ExecConstraints" stack: "error: null value in column " id " violates not-null constraint at Connection.parseE (E:\projects\Git\SourceTree\ConverBiz\SecondarySales\apiss\node_modules\loopback-connector-postgresql\node_modules\pg\lib\connection.js:521:11) at Connection.parseMessage (E:\projects\Git\SourceTree\ConverBiz\SecondarySales\apiss\node_modules\loopback-connector-postgresql\node_modules\pg\lib\connection.js:351:17) at Socket.<anonymous> (E:\projects\Git\SourceTree\ConverBiz\SecondarySales\apiss\node_modules\loopback-connector-postgresql\node_modules\pg\lib\connection.js:101:22) at Socket.EventEmitter.emit (events.js:95:17) at Socket.<anonymous> (_stream_readable.js:746:14) at Socket.EventEmitter.emit (events.js:92:17) at emitReadable_ (_stream_readable.js:408:10) at emitReadable (_stream_readable.js:404:5) at readableAddChunk (_stream_readable.js:165:9) at Socket.Readable.push (_stream_readable.js:127:10)"

Recreate the database on `npm test` to allow parallel CI test execution

As discussed with @rmg in Slack, current builds can't be run in parallel because we don't recreate the database when running npm test. It was suggested that running tables.sql before each npm test would be enough. Therefore the tasks are as follows:

  1. Create a script to run tables.sql
  2. Add a pretest entry to the scripts section of package.json that runs the script from step 1 when npm test is executed.

discoverPrimaryKeys doesn't return primary keys

Hi, I'm trying to use the discovery APIs to connect to an existing Postgres database and build the models automatically. I found that the discoverPrimaryKeys function returns the list of foreign keys, but it doesn't return any of the primary keys. Do you have any idea how to solve this problem? Thanks.

Error with dates

I've a database with column type date

   Column   |  Type   | Modifiers
------------+---------+-----------
 property   | uuid    |
 staydate   | date    |
 totrevenue | numeric |

I created model to reflect this table:

"properties": {
        "property": {
            "type": "string",
            "id": true,
            "required": true,
            "postgresql": {
                "dataType": "uuid"
            }
        },
        "staydate": {
            "type": "date",
            "required": true
        }
}

When I do query for staydate:

app.models.Histrollup.find({where: {property: '992ab64e-e15b-11e4-9319-c39ff206d9ab', staydate: {between: [Date.create('2015-05-01'), Date.create('2015-05-10')]}}}, function(err,data){})

Here is the debug log:

loopback:connector:postgresql SQL: SELECT "property","staydate","rolluptype","totnights","totrevenue","bymarketcd","bychannel","byroomtype" FROM "castle"."v_histrollup" WHERE "property"=$1 AND "staydate" BETWEEN $2 AND $3 ORDER BY "property"
Parameters: ["992ab64e-e15b-11e4-9319-c39ff206d9ab",{"sql":"to_timestamp(?,'yyyy-mm-dd hh24:mi:ss.ms')","params":["2015-05-01T10:00:00.000Z"]},{"sql":"to_timestamp(?,'yyyy-mm-dd hh24:mi:ss.ms')","params":["2015-05-10T10:00:00.000Z"]}] +1m
undefined
>   loopback:connector:postgresql error: invalid input syntax for type date: "{"sql":"to_timestamp(?,'yyyy-mm-dd hh24:mi:ss.ms')","params":["2015-05-01T10:00:00.000Z"]}"
    at Connection.parseE (/Users/kesav/Programming/projects/hotelsoft/hotelsoftrzr/node_modules/loopback-connector-postgresql/node_modules/pg/lib/connection.js:534:11)
    at Connection.parseMessage (/Users/kesav/Programming/projects/hotelsoft/hotelsoftrzr/node_modules/loopback-connector-postgresql/node_modules/pg/lib/connection.js:361:17)
    at Socket.<anonymous> (/Users/kesav/Programming/projects/hotelsoft/hotelsoftrzr/node_modules/loopback-connector-postgresql/node_modules/pg/lib/connection.js:105:22)
    at emitOne (events.js:77:13)
    at Socket.emit (events.js:169:7)
    at readableAddChunk (_stream_readable.js:145:16)
    at Socket.Readable.push (_stream_readable.js:109:10)
    at TCP.onread (net.js:521:20) +10ms

The parameter {"sql":"to_timestamp(?,'yyyy-mm-dd hh24:mi:ss.ms')",...} is not valid for date columns.

Numeric precision and scale not being set with Automigrate and Autoupdate

Sample .json format for numeric.

"distance": {
      "type": "number",
      "required": false,
      "postgresql": {
        "columnName": "distance",
        "dataType": "numeric",
        "dataLength": null,
        "dataPrecision": 10,
        "dataScale": 5,
        "nullable": "NO"
      }
}

After running auto-migrate and viewing the table in pgAdmin, the precision and scale are not set, so Postgres uses the default.

What the column looks like in PG Admin

-- Column: distance

-- ALTER TABLE trip DROP COLUMN distance;

ALTER TABLE trip ADD COLUMN distance numeric;

What it should look like in PG Admin

-- Column: "averageSpeed"

-- ALTER TABLE trip DROP COLUMN "averageSpeed";

ALTER TABLE trip ADD COLUMN "averageSpeed" numeric(5,2);

Large arrays of integers

I tried storing large arrays but I received this error:

error: value too long for type character varying(1024)

For now, I have manually set the column's type to text. Is this a safe workaround, or will that information be lost whenever I migrate?
What is the suggested way around this issue?

Thanks
Rubén

Fatal Error: connector crashes loopback on [where][date]='invalid date format' api query

This is a show stopper: if the connector doesn't like the date format, it crashes the whole API. It can't be used in production like this.

/var/www/html/api/node_modules/loopback-connector-postgresql/lib/postgresql.js:542
var iso = escape(val.toISOString());
^
RangeError: Invalid time value
at Date.toISOString (native)
at PostgreSQL.toDatabase (/var/www/html/api/node_modules/loopback-connector-postgresql/lib/postgresql.js:542:26)
at /var/www/html/api/node_modules/loopback-connector-postgresql/lib/postgresql.js:797:32
at Array.forEach (native)
at PostgreSQL._buildWhere (/var/www/html/api/node_modules/loopback-connector-postgresql/lib/postgresql.js:769:24)
at PostgreSQL.buildWhere (/var/www/html/api/node_modules/loopback-connector-postgresql/lib/postgresql.js:745:20)
at PostgreSQL.toFilter (/var/www/html/api/node_modules/loopback-connector-postgresql/lib/postgresql.js:872:20)
at PostgreSQL.all (/var/www/html/api/node_modules/loopback-connector-postgresql/lib/postgresql.js:707:12)
at Function.find (/var/www/html/api/node_modules/loopback-datasource-juggler/lib/dao.js:790:34)
at SharedMethod.invoke (/var/www/html/api/node_modules/loopback/node_modules/strong-remoting/lib/shared-method.js:224:17)

SQL syntax identifiers

Hello.

I create tables into database using this code:
https://gist.github.com/paulomcnally/9eef6e3fa4f622a0dc67

On my custom models I define options.postgresql.table, for example:

Model name: UserEventType
Table name: user_event_types

If I don't define options.postgresql.table then my table name is: usereventtype

SQL syntax identifiers
http://www.postgresql.org/docs/current/interactive/sql-syntax-lexical.html#SQL-SYNTAX-IDENTIFIERS

Bad

Model: User
Table: user

Good

Model: User
Table: users

Bad

Model: AccessToken
Table: accesstoken

Good

Model: AccessToken
Table: access_token

The error is in the function escapeName on this line of code: https://github.com/strongloop/loopback-connector-postgresql/blob/master/lib/postgresql.js#L640

ReferencesMany and "inq" filter on id result in error

Hey,
I set up a referencesMany relation between objects. Now when I run something like this:

http://localhost:3010/api/Entries?filter={"where":{"tagIds":{"inq":[2]}}}

I get the following error

{"error":{"name":"error","status":500,"message":"operator does not exist: character varying = integer","length":208,"severity":"ERROR","code":"42883","hint":"No operator matches the given name and argument type(s). You might need to add explicit type casts.","position":"196","file":"parse_oper.c","line":"722","routine":"op_error","stack":"error: operator does not exist: character varying = integer\n    
at Connection.parseE (.../node_modules/pg/lib/connection.js:534:11)\n    
at Connection.parseMessage (.../node_modules/pg/lib/connection.js:361:17)\n    
at Socket.<anonymous> (.../node_modules/pg/lib/connection.js:105:22)\n    
at emitOne (events.js:75:13)\n    
at Socket.emit (events.js:150:7)\n    
at readableAddChunk (_stream_readable.js:146:16)\n    
at Socket.Readable.push (_stream_readable.js:109:10)\n    
at TCP.onread (net.js:510:20)"}}
Entry.json
{
  "name": "Entry",
  "plural": "Entries",
  "base": "PersistedModel",
  "properties": {
    "name": {
      "type": "string",
      "required": true
    }
  },
  "relations": {
    "tags": {
      "type": "referencesMany",
      "model": "Tag"
    }
  }
}
Tag.json
{
  "name": "Tag",
  "plural": "Tags",
  "base": "PersistedModel",
  "properties": {
    "name": {
      "type": "string",
      "required": true
    }
  },
  "relations": {
    "entries": {
      "type": "hasAndBelongsToMany",
      "model": "Entry"
    }
  }
}

Am I missing something or is this a bug?

Enabling RegExp as a valid condition

Hi,

Background: I'm trying to run queries that do some very lightweight search, or that at least don't cause case-sensitivity problems.

In postgresql.js I can see that RegExp queries appear to be parsed, but I can't get them to work:

PostgreSQL.prototype.buildExpression =
  function (columnName, operator, columnValue, propertyValue) {
    console.log('Building expression from the PGSQL client');

    if (propertyValue instanceof RegExp) {
      columnValue = "'" + propertyValue.source + "'";
      if (propertyValue.ignoreCase) {
        return new ParameterizedSQL(columnName + ' ~* ?', [columnValue]);
      } else {
        return new ParameterizedSQL(columnName + ' ~ ?', [columnValue]);
      }
    }
    // ... (excerpt truncated)
  };

It seems the loopback-connector package isn't totally ready for it either, and the query doesn't run correctly in the end.

Is that something normal? Is there a known workaround? (couldn't find it).

Cannot set 0 on a number column

Our project uses LoopBack with PostgreSQL. I have a number property on a model and try to set it to 0, but the value is not persisted to the database. I traced the source code:

PostgreSQL.prototype.save = function (model, data, callback) {
  var self = this;
  var props = self._categorizeProperties(model, data);

  var sql = [];
  sql.push('UPDATE ', self.tableEscaped(model), ' SET ', self.toFields(model, data));
  sql.push(' WHERE ');
  props.ids.forEach(function (id, i) {
    sql.push((i > 0) ? ' AND ' : ' ', self.idColumnEscaped(model), ' = $',
      (props.nonIdsInData.length + i + 1));
  });

  var queryParams = [];
  props.nonIdsInData.forEach(function(key) {
    queryParams.push(data[key] || null); // 0 is falsy, so it is replaced with null here
  });
  props.ids.forEach(function(key) {
    queryParams.push(data[key] || null);
  });

  self.query(sql.join(''), queryParams, function (err) {
    callback(err);
  });
};

Use pgcrypto for user password.

I'd like to use Postgres' built-in pgcrypto to encrypt/hash/check user credentials. Is this possible? As far as I can tell, StrongLoop always hashes passwords before looking them up in the DB. I would like Postgres to handle that.

Is there maybe a method I could override where I can write my own SQL query?

Edit: I realize that this question is very much related to the default User model setup. I've skipped this "problem" for now and used the default methods provided by Strongloop. Will be modifying that in the future.

I would like to know/find out how I can write my own SQL queries in the future, or at least, use some of Postgres' specific features. The connection via jugglingdb seems very generic. I'm sure it's possible to find the DB connection via app.datasources['postgres-db'], but I'm not sure if it's a good idea.

Numeric type doesn't handle floating point

The mapping of a floating-point number is to Postgres's NUMERIC type. This causes a runtime error.
What should be done about this? I see LoopBack wants to expose this interface (http://docs.strongloop.com/display/LB/LoopBack+types), stating that a floating-point number can be handled by that numeric type. This connector must then detect whether the number passed in is a floating-point value or an integer and create the appropriate pg type, correct?

Getting time zone not recognized error

I'm following the instructions for defining an admin role in the documentation, but I'm receiving the following error at the Role.create({name: 'admin'}, cb) step:

loopback:connector:postgresql error: time zone "gmt-0700" not recognized

Any help would be appreciated!

Support for json data

My postgresql table has a column whose type is json. When I try to update/create, the code fails. The json data is not stringified properly when the query is constructed.

error: syntax error at or near "bcb1b9c6c4f70e9890e1c5075984acec5ee02"

When creating a new model in explorer, this error is produced. Model has trackChanges: true. If this is removed then creating models works as expected.

Using
loopback 2.4.0
loopback-connector-postgresql 1.1.4

/Users/berkeley/node/loopback/gems/node_modules/loopback/lib/models/change.js:453
    throw err;
          ^
error: syntax error at or near "bcb1b9c6c4f70e9890e1c5075984acec5ee02"
    at Connection.parseE (/Users/berkeley/node/loopback/gems/node_modules/loopback-connector-postgresql/node_modules/pg.js/lib/connection.js:534:11)
    at Connection.parseMessage (/Users/berkeley/node/loopback/gems/node_modules/loopback-connector-postgresql/node_modules/pg.js/lib/connection.js:361:17)
    at Socket.<anonymous> (/Users/berkeley/node/loopback/gems/node_modules/loopback-connector-postgresql/node_modules/pg.js/lib/connection.js:105:22)
    at Socket.emit (events.js:95:17)
    at Socket.<anonymous> (_stream_readable.js:764:14)
    at Socket.emit (events.js:92:17)
    at emitReadable_ (_stream_readable.js:426:10)
    at emitReadable (_stream_readable.js:422:5)
    at readableAddChunk (_stream_readable.js:165:9)
    at Socket.Readable.push (_stream_readable.js:127:10)

Numeric is converted to String

With this in my model.json:

{
  "amount": {
    "type": "number",
    "human": "Amount",
    "postgresql": {
      "dataType": "numeric",
      "dataPrecision": 16,
      "dataScale": 4
    }
  }
}

When I store the value 50000.5413 I get back '50000.5413'.

How to ONLY insert given properties and not send NULL to the database

I have a TIMESTAMP column named created in my users table. The database knows the default value for it, and I want the database to handle this.

How can I make sure that a Model.save() creates an SQL query that only inserts the properties I tell it to?

For example:

Access.create([{ // Extends User
  uuid: rID.uuid, // Some uuid I got from some other Model
  username: 'aaron',
  password: 'test',
  email: '[email protected]'
}]);

Ends up trying to insert every column of the Model/Table.

INSERT INTO "public"."access" ("uuid","name","username","password","role","acl","created","expired","realm","credentials","challenges","email","emailverified","verificationtoken","status","lastupdated")

Failing row contains (42, 0b81bb8b-1962-4b01-9f4c-87a0938bf94b, owner, null, aaron, $2a$10$GtfE7X4BR.eyf/AI7qwHQegYdqEaEzIFnLgMy5mqkc6JQ8bB6j3e2, {"all": "rw"}, null, null, null, null, null, test@test.test, null, null, null, null).

How can I prevent all these null values? I want (need) my database to handle default values. I know I can set the default values ALSO in my model definition, but I don't see why I should have to. This would also mean that a simple misconfigured timezone on an application server will mess up these Date fields, since the Default Value was decided on by my NodeJS Application and not in the database.

discoverSchema problems

Using the discoverSchema api on a postgres database has a couple of issues

  1. relations: true does not work in the options object. You have to use associations: true instead
  2. association discovery only seems to work with belongsTo relations. The associated hasMany is not generated.

if my db has

Artist has a pk of ArtistID
Album has a fk of ArtistID

then I get

"relations": { "artist": { "model": "Artist", "type": "belongsTo", "foreignKey": "artistid" }

in my Album model, but

`"relations": {}` in my Artist model

GeoPoint create ERROR

Hi,
when I try to create a model instance that includes the GeoPoint type, I get the following error:
{
"error": {
"name": "error",
"status": 500,
"message": "invalid input syntax for type point: "{"lat":59.6150515,"lng":36.2876827}"",
"length": 122,
"severity": "ERROR",
"code": "22P02",
"file": "geo_ops.c",
"line": "1820",
"routine": "point_in",
"stack": "error: invalid input syntax for type point: "{"lat":59.6150515,"lng":36.2876827}"\n at Connection.parseE (/home/zaker/proj/markmap/node_modules/loopback-connector-postgresql/node_modules/pg.js/lib/connection.js:526:11)\n at Connection.parseMessage (/home/zaker/proj/markmap/node_modules/loopback-connector-postgresql/node_modules/pg.js/lib/connection.js:356:17)\n at Socket. (/home/zaker/proj/markmap/node_modules/loopback-connector-postgresql/node_modules/pg.js/lib/connection.js:105:22)\n at Socket.emit (events.js:95:17)\n at Socket. (stream_readable.js:764:14)\n at Socket.emit (events.js:92:17)\n at emitReadable (_stream_readable.js:426:10)\n at emitReadable (_stream_readable.js:422:5)\n at readableAddChunk (_stream_readable.js:165:9)\n at Socket.Readable.push (_stream_readable.js:127:10)"
}
}
What is the problem?

GeoPoint is saved to PostgreSQL incorrectly

GeoPoint data is saved to PostgreSQL using the POINT type in the format POINT(39.908701 116.397482), i.e. latitude first, but PostgreSQL's POINT type actually uses the format POINT(116.397482 39.908701), i.e. longitude first.

Because of the wrong format, the following SQL query raises an error:
SELECT st_distance(ST_Transform(ST_GeomFromText('POINT(39.908701 116.397482)',4326),26986), ST_Transform(ST_GeomFromText('POINT(39.908701 116.397482)',4326),26986));

ERROR: transform: couldn't project point (39.9087 116.397 0): latitude or longitude exceeded limits (-14)

error: duplicate key value violates unique constraint "mat-change_pkey"

Using loopback 2.4.1
loopback-connector-postgresql 1.1.4

Start with an empty database set up with automigrate, and a model with change tracking enabled. Create a new model instance. The instance is created successfully, then the server crashes with the following error. It also seems to create the change model in the database.

Error: error: duplicate key value violates unique constraint "mat-change_pkey"
    at cb (/Users/berkeley/node/loopback/gems/node_modules/loopback/lib/models/change.js:170:19)
    at /Users/berkeley/node/loopback/gems/node_modules/loopback-datasource-juggler/lib/dao.js:173:20
    at /Users/berkeley/node/loopback/gems/node_modules/loopback-connector-postgresql/lib/postgresql.js:280:14
    at null.callback (/Users/berkeley/node/loopback/gems/node_modules/loopback-connector-postgresql/lib/postgresql.js:143:5)
    at Query.handleError (/Users/berkeley/node/loopback/gems/node_modules/loopback-connector-postgresql/node_modules/pg.js/lib/query.js:99:17)
    at null.<anonymous> (/Users/berkeley/node/loopback/gems/node_modules/loopback-connector-postgresql/node_modules/pg.js/lib/client.js:167:26)
    at emit (events.js:95:17)
    at Socket.<anonymous> (/Users/berkeley/node/loopback/gems/node_modules/loopback-connector-postgresql/node_modules/pg.js/lib/connection.js:109:12)
    at Socket.emit (events.js:95:17)
    at Socket.<anonymous> (_stream_readable.js:764:14)

Add an example for connecting to UNIX domain socket

A common PostgreSQL configuration is to connect to the UNIX domain socket /var/run/postgresql/.s.PGSQL.5432 instead of using a TCP/IP port. I read the code and made some attempts at configuring datasources.json until I found the solution:

{
  "postgres": {
    "host": "/var/run/postgresql/",
    "port": "5432",
    "database": "dbname",
    "username": "dbuser",
    "password": "dbpassword",
    "name": "postgres",
    "debug": true,
    "connector": "postgresql"
  }
}

I think this could be added to the README.

Feature request: ability to set connection settings

I have a requirement to change session settings. The use case: I need to set the timezone on the pg client so that all timestamp with time zone data works properly. Since this connector takes a new connection from the pool for every query, I can't just set the session properties at boot time.

Basically I've to set the following session properties on connection before query executes.

SET SESSION TIME ZONE 'America/Los_Angeles'
SET SEARCH_PATH schema1, public

I think it would be better to add a connection callback so the application can decide what to do with the connection.

Suggestion default PG timezone to UTC

Thanks for fixing #87 @raymondfeng

Following on from that topic, I would like to suggest that loopback-connector-postgresql should default the Postgres DB timezone to 'UTC' on startup. That way, when using a tool like pgAdmin or PG Commander, the times are returned as UTC.

Currently you can run this command

SET TIME ZONE 'UTC';

To manually set it.

I feel UTC times are easier to reason about, and setting UTC as the default would improve developer ergonomics.

What do you think? Is this handled differently for other connectors?

Custom property types or how to map model property to table column?

Is there any way to define a complex mapping between a model property and a datasource table column, and vice versa? PostGIS (the PostgreSQL GIS extension library) has sophisticated data types, geometry and geography, which are stored in table columns as packed strings of digits and so are not usable directly. To work with them in a REST API, these types need to be converted to and from GeoJSON (a specific flavor of JSON) by PostGIS's own functions ST_AsGeoJSON(geom) and ST_GeomFromGeoJSON(geojson). So to get data from a geometry-type column we need to run the query

SELECT ST_AsGeoJSON(geom) AS geom FROM table WHERE ...

and to update geom column the query would be

UPDATE table SET geom = ST_GeomFromGeoJSON(geojson) WHERE ...

In the node-orm2 project this issue is solved by defining a custom property type with datastoreGet and datastoreSet methods. What about the LoopBack project?

trackChanges does not work with PostgreSQL

When trackChanges is true, I get this error:

error: relation "public.program-change" does not exist
    at [object Object].Connection.parseE (.../node_modules/loopback-connector-postgresql/node_modules/pg/lib/connection.js:534:11)
    at [object Object].Connection.parseMessage (.../node_modules/loopback-connector-postgresql/node_modules/pg/lib/connection.js:361:17)
    at Socket.<anonymous> (.../node_modules/loopback-connector-postgresql/node_modules/pg/lib/connection.js:105:22)
    at Socket.emit (events.js:95:17)
    at Socket.<anonymous> (_stream_readable.js:764:14)
    at Socket.emit (events.js:92:17)
    at emitReadable_ (_stream_readable.js:426:10)
    at emitReadable (_stream_readable.js:422:5)
    at readableAddChunk (_stream_readable.js:165:9)
    at Socket.Readable.push (_stream_readable.js:127:10)

When it is false, or with the memory connector, everything works well.

article.json:

{
  "name": "Program",
  "base": "PersistedModel",
  "trackChanges": true,
  "options": {
    "postgresql": {
      "schema": "myschema",
      "table": "programs"
    }
  },
  "properties": {
    "id": {
      "id": true,
      "type": "string",
      "defaultFn": "guid",
      "postgresql": {
        "columnName": "id",
        "dataType": "text"
      }
    }
  }
}

Is there something that I missed?

Function `now` is not executed [BUG]

The now default function is not executed:

"properties": {
  "created": {
      "type": "date",
      "default": "now"
    }
}

On line 450, the variable val should already hold the current timestamp; instead it holds an Invalid Date, which triggers the error below.

/Users/vladmiller/Projects/kiidu/kiidu/node_modules/loopback-connector-postgresql/lib/postgresql.js:450
    var iso = val.toISOString();
                  ^
RangeError: Invalid time value
    at Date.toISOString (native)
    at PostgreSQL.toColumnValue (/Users/vladmiller/Projects/kiidu/kiidu/node_modules/loopback-connector-postgresql/lib/postgresql.js:453:19)
    at PostgreSQL.SQLConnector.buildFields (/Users/vladmiller/Projects/kiidu/kiidu/node_modules/loopback-connector-postgresql/node_modules/loopback-connector/lib/sql.js:888:18)
    at PostgreSQL.SQLConnector.buildInsert (/Users/vladmiller/Projects/kiidu/kiidu/node_modules/loopback-connector-postgresql/node_modules/loopback-connector/lib/sql.js:349:21)
    at PostgreSQL.SQLConnector.create (/Users/vladmiller/Projects/kiidu/kiidu/node_modules/loopback-connector-postgresql/node_modules/loopback-connector/lib/sql.js:428:19)
    at ModelConstructor.<anonymous> (/Users/vladmiller/Projects/kiidu/kiidu/node_modules/loopback-datasource-juggler/lib/dao.js:308:21)
    at ModelConstructor.trigger (/Users/vladmiller/Projects/kiidu/kiidu/node_modules/loopback-datasource-juggler/lib/hooks.js:65:12)
    at ModelConstructor.<anonymous> (/Users/vladmiller/Projects/kiidu/kiidu/node_modules/loopback-datasource-juggler/lib/dao.js:271:11)
    at ModelConstructor.trigger (/Users/vladmiller/Projects/kiidu/kiidu/node_modules/loopback-datasource-juggler/lib/hooks.js:65:12)
    at create (/Users/vladmiller/Projects/kiidu/kiidu/node_modules/loopback-datasource-juggler/lib/dao.js:270:9)

Connector attempts to connect to a database named after my Windows username

Recently decided to give StrongLoop development a shot on my Windows machine. I understand that being NodeJS doesn't necessarily imply Windows support in reality, nor is Windows generally a high compatibility target. Still thought I'd drop a line here before perhaps switching back to Linux for development.

When I attempt to run a server, I get this output:

Connection fails:  { [error: database "Grant" does not exist]
  name: 'error',
  length: 83,
  severity: 'FATAL',
  code: '3D000',
  detail: undefined,
  hint: undefined,
  position: undefined,
  internalPosition: undefined,
  internalQuery: undefined,
  where: undefined,
  file: 'postinit.c',
  line: '775',
  routine: 'InitPostgres' }
It will be retried for the next request.

events.js:72
        throw er; // Unhandled 'error' event
              ^
error: database "Grant" does not exist
    at Connection.parseE (C:\Users\Grant\Documents\Git\current\node_modules\loopback-connector-postgresql\node_modules\pg\lib\connection.js:534:11)
    at Connection.parseMessage (C:\Users\Grant\Documents\Git\current\node_modules\loopback-connector-postgresql\node_modules\pg\lib\connection.js:361:17)
    at Socket.<anonymous> (C:\Users\Grant\Documents\Git\current\node_modules\loopback-connector-postgresql\node_modules\pg\lib\connection.js:105:22)
    at Socket.emit (events.js:95:17)
    at Socket.<anonymous> (_stream_readable.js:764:14)
    at Socket.emit (events.js:92:17)
    at emitReadable_ (_stream_readable.js:426:10)
    at emitReadable (_stream_readable.js:422:5)
    at readableAddChunk (_stream_readable.js:165:9)
    at Socket.Readable.push (_stream_readable.js:127:10)

My name is Grant, as is my username on this Windows box. To test further, I cloned the project on another machine where my username is a-type, and sure enough, the connector tries to reach a database named a-type.

I am pretty confident my datasources.json is correct. Here's a redacted version:

{
  "db": {
    "name": "db",
    "connector": "memory"
  },
  "CurrentDB": {
    "host": "<db-endpoint>.us-east-1.rds.amazonaws.com",
    "port": 5432,
    "username": "postgres",
    "password": "<db-password>",
    "name": "CurrentDB",
    "connector": "loopback-connector-postgresql"
 }
}

Having some trouble tracking this down, and I haven't seen anyone else mention it. Kind of hoping someone spots something off in my config, but it could be an underlying compatibility issue with Windows.

Error: Timeout in connecting after 5000 ms for pg

g0v/ppt#6

It's weird: the tables and all the rows are already created, but when I send a request (using the explorer) I get this error, even on localhost:
{
"error": {
"name": "Error",
"status": 500,
"message": "Timeout in connecting after 5000 ms",
"stack": "Error: Timeout in connecting after 5000 ms\n at [object Object]. (/Users/mac/GitHub/edward_ptt/node_modules/loopback-datasource-juggler/lib/datasource.js:2029:10)\n at [object Object]._onTimeout (/Users/mac/GitHub/edward_ptt/node_modules/loopback/node_modules/continuation-local-storage/node_modules/async-listener/glue.js:188:31)\n at Timer.listOnTimeout (timers.js:89:15)"
}
}

Any possible reason or suggestion?

find / findAll throws error -> has no method 'notifyObserversAround'

Through the API explorer, if I choose 'find' or 'findAll' I see the following error. By the way, if I choose the [email protected] it works fine.

Package version

Response Body

{
  "error": {
    "name": "TypeError",
    "status": 500,
    "message": "Object #<PostgreSQL> has no method 'notifyObserversAround'",
    "stack": "TypeError: Object #<PostgreSQL> has no method 'notifyObserversAround'\n    at PostgreSQL.SQLConnector.execute (/home/ec2-user/pocketnews/node_modules/loopback-connector-postgresql/node_modules/loopback-connector/lib/sql.js:407:8)\n    at PostgreSQL.find [as all] (/home/ec2-user/pocketnews/node_modules/loopback-connector-postgresql/node_modules/loopback-connector/lib/sql.js:1050:8)\n    at /home/ec2-user/pocketnews/node_modules/loopback-datasource-juggler/lib/dao.js:1362:19\n    at doNotify (/home/ec2-user/pocketnews/node_modules/loopback-datasource-juggler/lib/observer.js:74:49)\n    at doNotify (/home/ec2-user/pocketnews/node_modules/loopback-datasource-juggler/lib/observer.js:74:49)\n    at doNotify (/home/ec2-user/pocketnews/node_modules/loopback-datasource-juggler/lib/observer.js:74:49)\n    at doNotify (/home/ec2-user/pocketnews/node_modules/loopback-datasource-juggler/lib/observer.js:74:49)\n    at Function.ObserverMixin._notifyBaseObservers (/home/ec2-user/pocketnews/node_modules/loopback-datasource-juggler/lib/observer.js:97:5)\n    at Function.ObserverMixin.notifyObserversOf (/home/ec2-user/pocketnews/node_modules/loopback-datasource-juggler/lib/observer.js:72:8)\n    at Function.ObserverMixin._notifyBaseObservers (/home/ec2-user/pocketnews/node_modules/loopback-datasource-juggler/lib/observer.js:95:15)"
  }
}

In Console Error

/home/app/node_modules/loopback-connector-postgresql/node_modules/loopback-connector/lib/sql.js:407
  this.notifyObserversAround('execute', context, function(context, done) {
       ^
TypeError: Object #<PostgreSQL> has no method 'notifyObserversAround'
    at PostgreSQL.SQLConnector.execute (/home/app/node_modules/loopback-connector-postgresql/node_modules/loopback-connector/lib/sql.js:407:8)
    at PostgreSQL.find [as all] (/home/app/node_modules/loopback-connector-postgresql/node_modules/loopback-connector/lib/sql.js:1050:8)
    at /home/app/node_modules/loopback-datasource-juggler/lib/dao.js:1362:19
    at doNotify (/home/app/node_modules/loopback-datasource-juggler/lib/observer.js:74:49)
    at doNotify (/home/app/node_modules/loopback-datasource-juggler/lib/observer.js:74:49)
    at doNotify (/home/app/node_modules/loopback-datasource-juggler/lib/observer.js:74:49)
    at doNotify (/home/app/node_modules/loopback-datasource-juggler/lib/observer.js:74:49)
    at Function.ObserverMixin._notifyBaseObservers (/home/app/node_modules/loopback-datasource-juggler/lib/observer.js:97:5)
    at Function.ObserverMixin.notifyObserversOf (/home/app/node_modules/loopback-datasource-juggler/lib/observer.js:72:8)
    at Function.ObserverMixin._notifyBaseObservers (/home/app/node_modules/loopback-datasource-juggler/lib/observer.js:95:15)

wrong schema discovered

If a schema is given in the datasource definition, discoverSchema still returns {"schema": "public", ...} in the discovered model.
