
dynogels's Issues

Add a CHANGELOG.md

I've finally accepted reality and moved over to using dynogels going forward. However, seeing as this module is more actively maintained (with several major version bumps), it would be super helpful for those migrating from vogels to be able to see what changes they may need to make to their existing vogels-based code without going through each commit since the fork. This has improved in the more recent releases, but many of the older releases just contain the version-number bump along with dozens of commits, and there isn't really a single place to see what's changed since vogels.

This may not be sustainable indefinitely, but until a majority of vogels users transition to dynogels, I think it might help speed up adoption. I've started going through some of the new commits/releases myself, but thought this would be more accurate, and probably easier, if done by one of the contributors/maintainers.

Thanks for all the work you've done @clarkie, et al in the past year!

BatchGetItems multiple requests don't work

This documented feature of Model.getItems doesn't seem to work:
DynamoDB limits the number of items you can get to 100 or 1MB of data for a single request. Vogels automatically handles splitting up into multiple requests to load all items.

The reason is that when it detects that the request was capped (UnprocessedKeys is not null/empty), it doesn't construct the subsequent request properly, resulting in:
Missing required key 'RequestItems' in params

I believe this is a related Vogels issue:
ryanfitz/vogels#166

I've tracked down the source of the issue here:
https://github.com/clarkie/dynogels/blob/master/lib/batch.js#L46

After we get UnprocessedKeys (due to the 1 MB limit), it tries to request further results automatically. However, it doesn't build the subsequent request properly, as you can see here:
https://github.com/clarkie/dynogels/blob/master/lib/batch.js#L13
It is supposed to set RequestItems as the base key, but it doesn't.

This results in:
Original Request:

{
  "name": "dynogels",
  "hostname": "blah",
  "pid": 3652,
  "model": "Model_name",
  "level": 30,
  "params": {
    "RequestItems": {
      "table_name": {
        "Keys": [ { keys_go_here: ''}  ]
      }
    }
  },
  "msg": "dynogels BATCHGET request",
  "time": "2017-03-22T03:55:35.890Z",
  "v": 0
}

Subsequent requests:

{
  "name": "dynogels",
  "hostname": "blah",
  "pid": 3652,
  "model": "Model_name",
  "level": 30,
  "params": {
      "table_name": {
        "Keys": [ { keys_go_here: ''}  ]
      }
  },
  "msg": "dynogels BATCHGET request",
  "time": "2017-03-22T03:55:35.890Z",
  "v": 0
}

Note the missing RequestItems in the second one.
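
For reference, a minimal sketch of what the follow-up request presumably needs to look like (the helper name is hypothetical, not the actual batch.js code). The UnprocessedKeys map returned by DynamoDB has exactly the shape expected under RequestItems, so it just needs to be wrapped again:

// Hypothetical helper: wrap the UnprocessedKeys map returned by DynamoDB
// back under RequestItems so the follow-up BatchGetItem call is well-formed.
var paramsForRetry = function (unprocessedKeys) {
  return { RequestItems: unprocessedKeys };
};

// e.g. given a capped response `resp`:
// dynamodb.batchGetItem(paramsForRetry(resp.UnprocessedKeys), callback);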

update method: errorMessage does not contain a string

I got this error object when using the update method:


{
    "errorMessage": "[object Object]",
    "errorType": "Error",
    "stackTrace": [
        "Table.sendRequest.Table.get.internals.callBeforeHooks.Table.create.internals.createItem.internals.callBeforeHooks.internals.updateExpressions.internals.validateItemFragment.Table.update.internals.callBeforeHooks [as update] (/var/task/node_modules/dynogels/lib/table.js:300:21)",
        "Function.wrapper [as update] (/var/task/node_modules/lodash/lodash.js:4968:19)",
        "Function.UserOrder.addOrder (/var/task/node_modules/order_repository/models/user_order.js:47:16)",
        "exports.updateOrders (/var/task/functions/handler.js:54:47)"
    ]
}

As you can see, there is no usable string in errorMessage. I confirmed this by trying JSON.stringify(err.message).

How to reproduce it (userId is a uuid type):

Call the update method like this, with userId set to a non-uuid value:

let userOrdersData = {
    userId: userId,
    orders: { $add: 1 }
};

UserOrder.update(userOrdersData, function (err, theUserOrders) {
    if (err) {
        return cb(err, false);
    }
    return cb(err, true);
});

Update: currently it is very difficult to diagnose errors when dynogels does not surface them.

Model loggers cause Invalid Schema error

Using the example

const accountLogger = require('winston');
accountLogger.level = 'info';

var Account = dynogels.define('Account', {
  hashKey: 'email',
  log: accountLogger
}); // INFO level on account table

causes dynogels to complain about an invalid configuration schema, as internals.configSchema in schema.js doesn't allow the log attribute (as optional, or at all).

Changes in v7

Hi,

Is there a list of the changes for version 7 and also the breaking changes from v6?

Thanks

dynogels.types.uuid() returning an Object

Reinstalled my node_modules folder for the first time in 17 days and suddenly dynogels.types.uuid() is not returning a UUID, but returning an Object for some strange reason.

I kept receiving the error:

error: 'child "id" fails because ["id" must be a string] on ...

and on further investigation, id looked like:

     id: 
      { '0': 46,
        '1': 122,
        '2': 34,
        '3': 227,
        '4': 36,
        '5': 212,
        '6': 68,
        '7': 24,
        '8': 164,
        '9': 166,
        '10': 226,
        '11': 139,
        '12': 86,
        '13': 61,
        '14': 108,
        '15': 195,
        abortEarly: false,
        convert: true,
        allowUnknown: false,
        skipFunctions: false,
        stripUnknown: false,
        language: {},
        presence: 'optional',
        strip: false,
        noDefaults: false },

and my schema file looks like:

var schema = {
  id: dynogels.types.uuid(),

Not sure where this has come from; nothing else has changed, and it was all previously working very well.

Thanks for the help! 👍

Logic gates on chained functions (query)

Hi there, sorry if this is a silly question, but I'm looking for a way to apply if statements to query function chains. I currently have it like this:

if (paginateKey) {
  Post.query(hashKey)
    .descending()
    .limit(10)
    .startKey(parseInt(paginateKey))
    .exec(fn);
} else {
  Post.query(hashKey)
    .descending()
    .limit(10)
    .exec(fn);
}

Wondering if there's some way to simplify it to something like:

Post.query(hashKey)
            .descending()
            .limit(10)
            paginateKey ? .startKey(parseInt(paginateKey)) : ()
            .exec(fn);

My use case is a lot more complex than this; I just put the above together as a simple example.
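
For what it's worth, since each chained call returns the query builder, one way to express this without duplicating the chain is to hold the builder in a variable (a sketch, not a new dynogels API):

let query = Post.query(hashKey)
  .descending()
  .limit(10);

if (paginateKey) {
  // only add the start key when paginating
  query = query.startKey(parseInt(paginateKey, 10));
}

query.exec(fn);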

Thanks :)

.update() will create new items

This was brought up in #39, but is separate from the main issue there, which is schema enforcement. Basically, you can create a new item in DynamoDB with an update call. The expected behavior would be for the update to fail and for new items to be created only with a call to .create().
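
A possible workaround until that changes (a sketch, assuming the ConditionExpression options documented for create/update are passed straight through to DynamoDB; the attribute name and example values are illustrative): require the hash key to already exist, so an update on a non-existent item fails instead of creating a record.

Account.update({ email: 'foo@example.com', name: 'New Name' }, {
  ConditionExpression: 'attribute_exists(#hk)',      // only update existing items
  ExpressionAttributeNames: { '#hk': 'email' }
}, function (err, acc) {
  if (err && err.code === 'ConditionalCheckFailedException') {
    // the item did not exist, so nothing was created
  }
});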

Tips for running locally against dynamodb local?

This is more of a problem with aws-sdk, I suppose, but I've tried everything I can think of to get dynogels to run locally against the local DynamoDB server, and I continue to get the "Missing region in config" AWS error.

I've tried environment variables, explicitly setting the region on the global AWS config, on dynogels.AWS, a combination of some or all of the above, hitting it with a hammer. Nothing works.

Even doing the following combination of everything still fails with "Missing region in config" in spite of the fact that dynamodb.config.region is correctly set to "us-east-1".

process.env.AWS_DEFAULT_REGION = 'us-east-1';
process.env.AWS_ACCESS_KEY_ID = 'AKID';
process.env.AWS_SECRET_ACCESS_KEY = 'SECRET';
var config = {accessKeyId: 'AKID', secretAccessKey: 'SECRET', region: 'us-east-1'};
AWS.config.update(config);
dynogels.AWS.config.update(config);
var dynamodb = new AWS.DynamoDB({ endpoint: 'http://localhost:8000', region: config.region });
dynogels.dynamoDriver(dynamodb);
SomeModel.config({ service: dynamodb })
SomeModel.get(...) // "Missing region in config"

Nothing works. 😭 Anyone have any tips?

I cloned dynogels and npm test passes.

Edit: Oh, and I should add: I have a script that runs dynogels.createTables without a problem, so I'm completely confused.
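
For anyone hitting the same wall, a minimal local setup that has generally worked (a sketch; the key point is configuring dynogels' own AWS instance before any model operations):

var dynogels = require('dynogels');

// Configure the AWS instance that dynogels itself uses, then point the
// driver at DynamoDB Local. Dummy credentials are fine for the local server.
dynogels.AWS.config.update({
  region: 'us-east-1',
  accessKeyId: 'AKID',
  secretAccessKey: 'SECRET'
});

dynogels.dynamoDriver(new dynogels.AWS.DynamoDB({
  endpoint: 'http://localhost:8000'
}));

If the models were defined (or had .config() called on them) before the driver was configured, re-checking that initialization order may also help.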

Produce dynogels-specific errors

There are multiple places in dynogels where a new Error object is thrown or passed to a callback in error position. We should use a dynogels-specific error instead of the built-in Error type so that dynogels errors can be distinguished from other errors in error-handling code.

I am particularly singling this line out. It makes it very difficult to automatically and reliably determine that this error is the result of a schema check. In our application, we detect Joi errors and automatically react by responding with an HTTP error code of 422 (unprocessable entity) but we are completely unable to do that with these errors since the Joi validation error is masked behind an untyped error.
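
As a sketch of what this could look like (the class name is hypothetical, not an existing dynogels API): a dedicated error type that preserves the underlying Joi error, so callers can branch on the type rather than parse messages.

// Hypothetical dynogels-specific validation error that keeps the original
// Joi error available for inspection.
class DynogelsValidationError extends Error {
  constructor(joiError) {
    super(joiError.message);
    this.name = 'DynogelsValidationError';
    this.cause = joiError;
  }
}

// Error-handling code could then respond with 422 only for schema failures:
// if (err instanceof DynogelsValidationError) { respondWith(422, err.cause.details); }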

limit()

Sample code

  Address
    .scan()
    .where('active').ne(false)
    .where('user').eq(req.user.get('id'))
    .where('type').eq('pickup')
    .limit(1)

Items matching these conditions exist in the DB but aren't returned because of the .limit(1) call. Is this correct?

Disable Schema

I'm trying to disable schema validation. I tried not setting the schema attribute and setting it to false, but neither worked. I also tried setting the schema to Joi.any(), which doesn't work either.

Is there a way to achieve this?

Create only Object

How do I know whether this object was already created?

When I run this code with the same values, I never get an error.
The hash key is username.
It doesn't create another item in DynamoDB, but I can't tell whether it was previously created or is a new record.

User.create({username: event.username, password: event.password}, {overwrite : false}, function (err, acc) {
    if (err == null) {
        console.log('created account in DynamoDB', acc);
        var response = {
            status: "true",
            data: {
                username: event.username,
                password: event.password
            },
            error: []
        };
        return cb(null, response);
    } else {
        console.log(err);
        var response = {
            status: "false",
            data: {
                username: event.username,
                password: event.password
            },
            error: []
        };
        return cb(null, response);
    }
});
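
If { overwrite: false } is doing its job, the second create should fail with a ConditionalCheckFailedException; a sketch of telling the two cases apart by inspecting the error code:

User.create({ username: event.username, password: event.password },
  { overwrite: false }, function (err, acc) {
    if (err && err.code === 'ConditionalCheckFailedException') {
      // an item with this username already existed; nothing was written
      return cb(null, { status: 'false', error: ['username already exists'] });
    }
    if (err) {
      return cb(err);
    }
    // acc is the newly created record
    return cb(null, { status: 'true', data: { username: acc.get('username') } });
  });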

Pure-JS logging package?

I'm a bit frustrated with bunyan as it uses native add-ons. At the moment the pain this causes me is the frequent need to run npm rebuild when I switch between Node v4 (for AWS Lambda) and Node v6 (where debugging works well). I also find it annoying when native modules are used without a really compelling reason, and I don't think logging records to stdout justifies native add-ons. I have no need for the DTrace baggage bunyan brings along with it.

Just starting this issue to see if anyone else is motivated to maybe swap in bole.

Add etag support

It would be nice to automatically create an etag field (_etag) similar to the timestamps, based on the contents of the document (https://www.npmjs.com/package/etag). Might be a bit complicated for updates as it would require a read first though.
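
Until something like that exists in the library, a rough approximation using the documented before hooks and the etag package (the _etag field name is just the suggestion above, not an existing dynogels feature):

var etag = require('etag');

// Compute an _etag from the document contents before it is written.
Account.before('create', function (data, next) {
  data._etag = etag(JSON.stringify(data));
  return next(null, data);
});

// Note: on update the hook only sees the attributes being updated,
// so this is only an approximation of a true content-based etag.
Account.before('update', function (data, next) {
  data._etag = etag(JSON.stringify(data));
  return next(null, data);
});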

allow index scan

This feature was added to DynamoDB a few months ago. Can you add it to the scan module? Currently, I manually add it to the scan prototype:

require('dynogels/lib/scan').prototype.usingIndex = function (name) {
  this.request.IndexName = name;
  return this;
};

Allow more than 5 concurrent index creation

I noticed that we limit index creation to 5 concurrent operations at a given time:

// http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/GSI.OnlineOps.html#GSI.OnlineOps.Creating

There's also a ticket on Vogels ryanfitz/vogels#125 for this issue

It used to be a limitation of AWS DynamoDB, but not anymore!
Would love some help to remove this limitation.

Suggestion - automatically query to fill up a limit on filtered results

Hi! First of all, love your work keeping this project alive - Vogels is great but it sucks that it's not so active anymore.

I have a suggestion based on this behaviour:

Batch Get Items

Model.getItems allows you to load multiple models with a single request to DynamoDB.

DynamoDB limits the number of items you can get to 100 or 1MB of data for a single request. Vogels automatically handles splitting up into multiple requests to load all items.

Given that we do something like that, perhaps we can automatically do something like this:

If I query with a 'limit' and then filter the results, the way DynamoDB works is that it queries up to the limit and then applies the filter. Take this example:

    .query('partitionValue')
    .usingIndex('index')
    .where('sortKey')
    .gte('sortValue')
    .limit(pageSize)
    .filter('un_indexed_field')
    .equals('un_indexed_field_value');

Let's say I have 100 items in that table with that partition key, and the first 50 don't have un_indexed_field set to un_indexed_field_value, so they won't show up in the results. That means this query will return 0 items, but it will have an indicator that there are more pages.

Basically, what we could do is automatically detect that we:

  • Got less than the limit value of results
  • Got a pointer that there are more pages to request

and then we could automatically request results of limit - amount_returned length, and continue until we get limit results or no pointer.

This could be an optional behaviour in the config. It's a fairly minor thing, but thought I'd suggest it!
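
For anyone wanting this behaviour today, a rough userland sketch: it just re-issues the query with startKey until the page is full or there are no more pages (buildQuery is a hypothetical function that returns a fresh query chain, not part of dynogels):

// Keep querying until we have `pageSize` filtered items or run out of pages.
function queryUntilFull(buildQuery, pageSize, callback, acc, startKey) {
  acc = acc || [];
  var q = buildQuery().limit(pageSize - acc.length);
  if (startKey) {
    q = q.startKey(startKey);
  }
  q.exec(function (err, resp) {
    if (err) return callback(err);
    var items = acc.concat(resp.Items);
    if (items.length >= pageSize || !resp.LastEvaluatedKey) {
      return callback(null, items.slice(0, pageSize));
    }
    queryUntilFull(buildQuery, pageSize, callback, items, resp.LastEvaluatedKey);
  });
}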

prevent overwriting based on global secondary index

Hello,
My understanding is that {overwrite : false} prevents overwriting only based on the partition key.

When I create an object with a generated partition key (uuid), how can I prevent overwriting based on another attribute (for instance, an email address)? Note that this attribute could be a global secondary index.

Thanks!

Missing some key conditions

Here we can see the list of supported key conditions in dynogels. There are more conditions supported by DynamoDB itself: 'EQ | NE | IN | LE | LT | GE | GT | BETWEEN | NOT_NULL | NULL | CONTAINS | NOT_CONTAINS | BEGINS_WITH'. In this particular case, I need the IN operator.

However, this may be more troublesome as I'm not sure if the query() function will work without an argument. In my case, I have a set of values and I want to get all records where the hash key is equal to one of those values. It doesn't appear that dynogels supports this, which means I would have to use scan() instead.

Dynamic Key Object

I am trying to create this object:

Lol = _dynogels2.default.define('Lol', {
  hashKey: 'userId',
  timestamps: true,
  tableName: 'Lol',
  schema: _joi2.default.object({
    id: _dynogels2.default.types.uuid(),
    isVerified: _joi2.default.boolean(),
    messages: _joi2.default.object({
      arg: _joi2.default.string(),
      value: _joi2.default.boolean()
    })
  }).unknown(true).options({ stripUnknown: true })
});

My JSON:

{
    "id": "223da927-4547-47a4-9675-1a2934dbde9d",
    "isVerified": true,
    "messages": {
      "30_level": true
    }
  }

But the messages object saved in DynamoDB is:

{
    "id": "223da927-4547-47a4-9675-1a2934dbde9d",
    "isVerified": true,
    "messages": {}
  }
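
With stripUnknown enabled, Joi drops keys it doesn't recognise inside the nested messages object, and "30_level" isn't declared anywhere. One way to allow dynamic keys (a sketch using Joi's pattern matcher, written in the same transpiled style as the code above) is:

schema: _joi2.default.object({
  id: _dynogels2.default.types.uuid(),
  isVerified: _joi2.default.boolean(),
  // accept arbitrary keys (e.g. "30_level") whose values are booleans
  messages: _joi2.default.object().pattern(/^\w+$/, _joi2.default.boolean())
}).unknown(true).options({ stripUnknown: true })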

Update Item

Please help me update nested values in my complex item with this.MapModel.updateAsync(mapData).
Is it possible?

{
  "map-url": "http://go.com/map",
  "access": 1,
  "price": 35.58,
  "folder": "some folder",
  "valid-from": "32/23/3222",
  "valid-until": "25/5/8545",
  "mapId": "2da491f6-65e7-478b-8231-da6baba9779c",
  "icon-url": "http://some.com/icon",
  "createdAt": "2016-07-25T15:55:20.308Z",
  "name": "test",
  "pois": [
    {
      "id": "43443",
      "phone": "+38055645",
      "category": "cat",
      "details": "long text long textlong textlong textlong textlong textlong textlong text",
      "location": {
        "long": 53.33222,
        "lat": 43.434333
      },
      "address": {
        "zip": "04050",
        "street": "pimonenko",
        "state": "jit",
        "city": "kiev"
      },
      "valid-from": "32/23/3222",
      "name": "fffdfd",
      "valid-until": "25/5/8545"
    },
    {
      "id": "43443",
      "phone": "+38055645",
      "category": "cat",
      "details": "long text long textlong textlong textlong textlong textlong textlong text",
      "location": {
        "long": 53.33222,
        "lat": 43.434333
      },
      "address": {
        "zip": "04050",
        "street": "pimonenko",
        "state": "jit",
        "city": "kiev"
      },
      "valid-from": "32/23/3222",
      "name": "fffdfd",
      "valid-until": "25/5/8545"
    }
  ],
  "area-coordinates": {
    "to": {
      "long": 53.33222,
      "lat": 43.434333
    },
    "from": {
      "long": 53.33222,
      "lat": 43.434333
    }
  }
}
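
If the built-in $-operators don't reach into nested attributes, one option (a sketch, relying on the documented ability to pass an UpdateExpression through the update options; the attribute names and values here are illustrative) is to write the expression yourself:

// Sketch: update the name of the first poi in place via an UpdateExpression.
MapModel.update({ mapId: '2da491f6-65e7-478b-8231-da6baba9779c' }, {
  UpdateExpression: 'SET #pois[0].#name = :name',
  ExpressionAttributeNames: { '#pois': 'pois', '#name': 'name' },
  ExpressionAttributeValues: { ':name': 'updated poi name' }
}, function (err, map) {
  if (err) return console.error(err);
  console.log('updated', map.get('mapId'));
});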

Joi.date in Joi.array.items issue

Strange issue:
when I post a poi object like

{
  ...
  "valid-from": "2016-09-23",
  "valid-until": "2016-09-17"
}

to a schema with an array of pois with date fields, NO values of valid-from and valid-until were saved.

Schema is
import { types } from 'dynogels';
import Joi from 'joi';

const location = {
  'lat': Joi.number(),
  'long': Joi.number()
};

const address = {
  'street': Joi.string(),
  'city': Joi.string(),
  'state': Joi.string(),
  'zip': Joi.string()
};

const poi = {
  'id': types.uuid(),
  'name': Joi.string(),
  'location': location,
  'address': address,
  'phone': Joi.string(),
  'category': Joi.string(),
  'details': Joi.string().max(500),
  'valid-from': Joi.date(),
  'valid-until': Joi.date()
};

export default {
  hashKey: 'mapId',

  // add the timestamp attributes (updatedAt, createdAt)
  timestamps: true,

  schema: {
    'mapId': types.uuid(),
    'name': Joi.string().required(),
    'folder': Joi.string().required(),
    'icon-url': Joi.string().uri(),
    'map-url': Joi.string().uri(),
    'map-size': Joi.number().positive(),
    'area-coordinates': {
      'from': location,
      'to': location
    },
    'pois': Joi.array().items(poi).sparse(true),
    'price': Joi.number(),
    'access': Joi.any().valid([0, 1, 2, 3]).required(),
    'valid-from': Joi.date(),
    'valid-until': Joi.date()
  }
};

query() with limit() does not return LastEvaluatedKey

If you use scan() with startKey(), you'll receive LastEvaluatedKey, but if you use query() instead, LastEvaluatedKey is empty.

CashierHistory
  .query(path['userId'])
  .limit(2)
  .startKey(queryString)
  .exec(function (err, acc) {
    if (err == null) {
      console.log(acc);
      return cb(null, Api.response(acc));
    } else {
      console.log(err['message']);
      return cb(null, Api.errors(200, { 5: err['message'] }));
    }
  });

As described in AWS docs "LastEvaluatedKey is only provided if you have used the Limit parameter, or if the result set exceeds 1 MB (prior to applying a filter)."

Create tables function call every time.

I have used the dynogels.createTables() function inside every model, and I have found that createTables is executed on every DB call, so responses from DynamoDB are slow, around 1 second per request.

So if I need 5 tables in one request, it takes 5-6 seconds, which is not good.

So please tell me if I am doing something wrong in my implementation, or if there is another way to call the createTables function only once, when a table does not yet exist in the database.
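
createTables is meant to be run once, at application startup or as a deployment step, rather than on every request; a minimal sketch (the ./models path is illustrative):

var dynogels = require('dynogels');

// Load all model definitions first so dynogels knows about them,
// then create any missing tables a single time at startup.
require('./models');

dynogels.createTables(function (err) {
  if (err) {
    console.error('Error creating tables:', err);
  } else {
    console.log('Tables are ready');
  }
});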

On table.create, the ConditionExpression isn't merged correctly

Assume that it is necessary to specify a ConditionExpression for create and also the overwrite option:

var params = {};
params.ConditionExpression = '#n <> :x';
params.ExpressionAttributeNames = {'#n' : 'name'};
params.ExpressionAttributeValues = {':x' : 'Kurt Warner'};
params.overwrite = false;

User.create({id : 123, name : 'Kurt Warner' }, params, function (error, acc) { ... });

Then, ConditionExpression is not set correctly because of the following line in file table.js:

params = _.merge({}, params, options); // table.js, line 189

The merge overwrites the ConditionExpression, and instead of "(id <> :id) AND #n <> :x" you get "#n <> :x". The lodash merge doesn't correctly merge the ConditionExpression.

And the following error is issued by the AWS SDK:

'Error: ValidationException: Value provided in ExpressionAttributeNames unused in expressions: keys: {#id}'.

Drop joi dependency in favor of json schema/ajv. PR accepted?

As someone who is using dynogels heavily in Lambda: Joi, and anything hapijs, is the bane of my existence.

Joi alone turns my 850 KB Lambda bundle.zip into 3 MB. And because their community strongly opposes bundlers, they've stated time and time again that they don't care about making it better.

(I also question the security when this stuff goes unfixed even after reporting. Everything in a few hapijs projects is globally +x for some unknown reason.)

Is there any interest in a PR that drops Joi and replaces it with json schema and something like ajv?

JSON Schema is not without its quirks, but it's a standard that works everywhere. Also, AWS seems to be choosing it internally for their API Gateway Swagger support, among other places.

omitNulls why?

Is there any particular reason you are removing null/empty arrays from attrs while creating? My schema has array values, and I'd like to store empty arrays when there are none, for consistency, or at least get back an object shaped per the schema when reading the item from the DB.

Question: ordering by timestamp fields

Hi,
I'm wondering how I can sort using the timestamp fields createdAt and updatedAt, because they are strings, and DynamoDB does not have a date type, so numbers would normally be used.
Thank you in advance!

Create if not exists

Sorry, I just found this project. This question was already posted in the vogels repository, but I'm duplicating it here since this project is more actively updated.

Hi,

I have a simple table:

const Query = vogels.define('Query', {
    hashKey: 'queryId',

    schema: {
        queryId: vogels.types.uuid(),
        name: Joi.string().required()
    },

    tableName: 'query'
});

How can I avoid duplicate queries that contain the same 'name'?

I've tried the following code:

function saveQuery(query, cb) {
    var params = {};
    params.ConditionExpression = '#n <> :x';
    params.ExpressionAttributeNames = { '#n': 'name' };
    params.ExpressionAttributeValues = { ':x': query.name };


    queryModel.Query.create(query, params, function(err, res) {
        if (err) {
            console.log(err);
            cb(err);
        } else {
            cb(null, res);
        }
    });
}

But duplicate queries are still being created.
Is there a way to do it using 'create'?

Thanks

.destroy(x) does not throw error if [x] does not exist

As per the title: I don't know if this is intended, but I was expecting an error when deleting on a hash key that does not exist; instead it fails silently.

Example of bug:

const Account = dynogels.define('account-email', {
  hashKey: 'email',
  schema: {
    email: Joi.string().email(),
  },
})

Account.create({
  email: '[email protected]',
})

Account.destroy({
  email: '[email protected]',
}, (err, data) => {
  console.log(err, data)
  // returns null, null
})

A workaround would be to pass ReturnValues: 'ALL_OLD' and check whether data existed, then manually throw an error.

High-level example:

Account.destroy({
  email: '[email protected]',
}, {
  ReturnValues: 'ALL_OLD',
}, (err, data) => {
  if(!data){
    throw 'No email exists'
  } else {
    return 'email deleted'
  }
})

Cheers

Migration from vogels: undefined fields make update operation to fail

I migrated from vogels to dynogels and I've noticed that one of my unit tests is failing with this message:
"Uncaught AssertionError: expected [ValidationException: Invalid UpdateExpression: An expression attribute value used in expression is not defined; attribute value: :hashtag] to not exist"

The problem is that updating a model with MyModel.update(obj, function ...) fails because obj.hashtag is undefined.
With vogels it didn't fail.

The solution seems to be to set the field to null.

Is this an expected behavior? If so, what should I do in order to manage 'undefined' fields?
Thank you in advance
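
A workaround that should cover this case (a sketch using lodash, which dynogels already depends on): strip undefined keys from the object before calling update, so they never end up in the UpdateExpression.

var _ = require('lodash');

// Drop keys whose value is undefined before handing the object to dynogels.
var cleaned = _.omitBy(obj, _.isUndefined);

MyModel.update(cleaned, function (err, item) {
  // ...
});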

Provide way to validate object/model without storing

It can be helpful, when performing writes across multiple tables, to validate all of the documents being saved against the tables' Joi schemas prior to beginning any of the writes. This way, if any of the objects aren't valid according to the respective schema, none of the writes are attempted.

This functionality doesn't currently seem to be exposed; the only way to trigger a schema validation is to save the document.
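
As a stopgap, the same Joi schema passed to dynogels.define() can be validated directly before any write is started (a sketch; it won't apply dynogels' own additions such as timestamps or generated uuids, and the accountSchema/document names are illustrative):

var Joi = require('joi');

// `accountSchema` is the schema object also passed to dynogels.define().
var result = Joi.validate(document, accountSchema, { abortEarly: false });
if (result.error) {
  // at least one document is invalid; don't start any of the writes
  return callback(result.error);
}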

Array of Objects as type

Thanks for dynogels. It's simply amazing. Impressed with the kind of coverage that it gives.

Is there a way to define a JSON data type, or should we use the string() type instead?
This is my scenario:

I need to build a DB with a schema (or rather, an example) that looks something like this:

	"email":"[email protected]" - Hash Key
        "timestamp" : ----- Range Key
	"name":"Sat"
	"phone":"9383883223"
	"version":"version"
	"additionalSigninData":{
		"key1":"value1",
		"key2":"value2"
	}

	"markedLocations":[
		{
			"latitude":"77.33",
			"longitude":"12.33",
			"name":"Home",
			"description":"my new home",
			"placeid":"55334433",
			"image":"url/abc.png"
		}
	]
  1. So I basically need to know: how do I hold values in the markedLocations array? The array has to hold objects in it.

  2. Updating an object's attributes (say, latitude and longitude) and adding more objects to the array have to be possible.

Can you please suggest how I can achieve this?
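
A sketch of how this structure can be expressed with Joi (the model name is illustrative; field names mirror the example above):

var Joi = require('joi');
var dynogels = require('dynogels');

var UserProfile = dynogels.define('UserProfile', {
  hashKey: 'email',
  rangeKey: 'timestamp',
  schema: {
    email: Joi.string().email(),
    timestamp: Joi.number(),
    name: Joi.string(),
    phone: Joi.string(),
    version: Joi.string(),
    // free-form key/value map
    additionalSigninData: Joi.object(),
    // array of location objects
    markedLocations: Joi.array().items(Joi.object({
      latitude: Joi.string(),
      longitude: Joi.string(),
      name: Joi.string(),
      description: Joi.string(),
      placeid: Joi.string(),
      image: Joi.string()
    }))
  }
});

Individual elements can then be changed with an UpdateExpression (e.g. SET markedLocations[0].latitude = :lat) or appended with list_append, along the lines of the nested-update sketch earlier on this page.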

Table name is wrong when it is pluralized.

Table names are not pluralized properly when the model name ends with "y": category becomes categorys, which is wrong; it should become categories. So please consider this a bug and fix it in the next release.

Storage of dates

Joi allows specification of date types, which are represented as Date objects. Unfortunately, it appears the DynamoDB document client strips out types it doesn't understand, which includes dates. For schema entries that are dates, it would be nice if dynogels would convert them to and from strings automatically.

Using joi.string().isoDate() is a workaround, though it means that dates must be manually converted into Date objects if they need to be inspected.
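
For reference, the workaround in schema form (a sketch; converting back to Date objects on read is still manual):

var schema = {
  id: dynogels.types.uuid(),
  // store dates as ISO-8601 strings, e.g. new Date().toISOString()
  createdOn: Joi.string().isoDate()
};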

v7 validation and null properties

dynogels treats null essentially as a special $delete command for properties. I can live with that, but v7 is now even more strict with regard to how nulls are handled and I'm stuck at 6.2.

Repro:

  1. Create a schema that allows null. e.g.: something: Joi.string().allow(null)
  2. Model.update({ something: null }, console.log)
  3. Validation error complaining that something is required

I have all my objects in JSON Schema. These objects have nearly everything required and allow nulls, so that I can guarantee the shape of the model the consumer gets; ajv handles this.

I use enjoi to turn that json schema into a joi schema.

However, with v7, null values are stripped prior to validation, and this leads to the problem.

Would you ever consider accepting a PR that supports reading/writing null to DynamoDB?

confusing doc in delete items

Hi,

You delete items in DynamoDB using the hashkey of model If your model uses both a hash and range key, than both need to be provided

"than"? I think it should be "then", right?

You delete items in DynamoDB using the hashkey of model. If your model uses both a hash and range key, then both need to be provided

Is there any plan to add promisified versions?

Are there any plans to add promise support in future releases? There are packages that help with that, allowing dual support for the Node callback pattern and promises.

Account.create([item1, item2, item3], function (err, accounts) {
  console.log('created 3 accounts in DynamoDB', accounts);
});
// or
Account.create([item1, item2, item3]).then(function (accounts) {
  console.log('created 3 accounts in DynamoDB', accounts);
}).catch(function (err) {
  console.error(err);
});

BlogPost
  .query('[email protected]')
  .where('title').beginsWith('Expanding')
  .exec(callback);
// or
BlogPost
  .query('[email protected]')
  .where('title').beginsWith('Expanding')
  .exec()
  .then(thenCallback, catchCallback);

Some of those packages are hybridify and polygoat, and packages like any-promise would allow flexibility of implementation choice.
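
In the meantime the callback API can be wrapped externally; for example, a sketch with Bluebird's promisifyAll, which adds *Async variants (createAsync, getAsync, updateAsync, ...) alongside the existing callback methods:

var Promise = require('bluebird');

// Promisify the model's static methods; existing callback usage keeps working.
Promise.promisifyAll(Account);

Account.createAsync([item1, item2, item3])
  .then(function (accounts) {
    console.log('created 3 accounts in DynamoDB', accounts);
  })
  .catch(function (err) {
    console.error(err);
  });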

.update() does not enforce schema

Don't know if intended or a bug but:

(copying format from update example)

const Account = dynogels.define('example-update', {
  hashKey: 'email',
  timestamps: true,
  schema: {
    email: Joi.string(),
    name: Joi.string()
  }
});
Account.update({ email: '[email protected]', whatever: true }, (err, acc) => {
    console.log('ran');
});

Produces, on a DynamoDB scan:

0: {
  "email": {
    "S": "[email protected]"
  },
  "whatever": {
    "BOOL": true
  }
}

Should I be able to pass through attributes not defined in the schema in the update?

I was expecting them to error.

Question about this project

This is an awesome project and I like it, but may I ask why you forked instead of opening pull requests against @ryanfitz/vogels? What's the difference?
Is that repository (@ryanfitz/vogels) no longer maintained?

Thanks,
