thedodd / wither
An ODM for MongoDB built on the official MongoDB Rust driver.
Home Page: https://docs.rs/wither
License: Other
Hi There
First of all, thanks very much for putting this driver together; I'm really enjoying working with it.
I noticed from the docs that logging is available, but if I understand the code correctly it's only used in migrations:
https://github.com/thedodd/wither/search?q=log&type=Code
Is my understanding correct?
FYI, I have also put together a small sample app using this lib with tide and handlebars:
https://github.com/No9/tide-morth-example/
Thanks again for your work.
Per some feedback that I was given, it would be beneficial to add some additional docs on how to use the model: find, find all, updates, saving, &c.
Also, add some docs on what sync is for and how to use it.
This is needed for keeping DB in a pristine state between integration tests. Among other things.
There's a bug related to using geoHaystack indexes in the mongodb driver #289. Once it is fixed, it should simply work as expected here.
Model.save is good and works as needed, but update is needed for atomic operations so that concurrent updates to a document do not result in data loss or data races.
This should be significantly more straightforward to use.
I originally coded it this way so that users could sync their models at boot time. In retrospect, this is pretty inflexible. It should return a standard Result and the user can panic if they want to.
This will also allow for APIs to boot, attempt a sync, and even if it fails, the service can continue to stay online and service calls. It could attempt another sync operation in the background as part of a "manager" type which controls access to the database connection.
Update Model::sync to no longer panic. It should return the default Result type defined in this package, probably Result<()>. In particular, the example in the docs should show that this is defined in your Model impl.
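The boot-and-stay-online behavior described above can be sketched with std-only types. This is just an illustrative pattern, not a wither API; the function name, error type, and retry policy are all hypothetical:

```rust
/// Hypothetical retry helper: run a fallible sync operation up to
/// `max_attempts` times; the caller decides what to do on final failure
/// (e.g. stay online and retry later) instead of panicking.
fn sync_with_retries<F>(mut sync: F, max_attempts: u32) -> Result<(), String>
where
    F: FnMut() -> Result<(), String>,
{
    let mut last_err = String::from("no attempts made");
    for _ in 0..max_attempts {
        match sync() {
            Ok(()) => return Ok(()),
            Err(e) => last_err = e, // keep the service alive; remember the error
        }
    }
    Err(last_err)
}

fn main() {
    // Simulated sync that fails twice, then succeeds.
    let mut calls = 0;
    let result = sync_with_retries(
        || {
            calls += 1;
            if calls < 3 { Err(format!("attempt {} failed", calls)) } else { Ok(()) }
        },
        5,
    );
    assert_eq!(result, Ok(()));
    assert_eq!(calls, 3);
    println!("synced after {} attempts", calls);
}
```

A real "manager" would run this on a timer in the background, but the control flow is the same: a failed sync surfaces as an Err for the caller to log and retry, rather than a panic.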
I currently have to do the following in order to find documents without raising an error:
let count = Submission::count(db.clone(), None, None).unwrap();
let submissions = if count > 0 { Submission::find(db.clone(), None, None).unwrap() } else { Vec::new() };
Json(json!({ "submissions": submissions }))
Shouldn't Submission::find() return an empty Vec on an empty collection rather than raising a DecoderError? I am using MongoDB 4.0.3; should I try downgrading to an earlier version?
Superfluous comma in Model.update's docs: "This operation targets the model, instance by the instance's ID."
First of all, thanks for the lib, I am testing it in a new project and so far I like it!
I came across the use case of deleting many documents. Are there any plans to support delete_many functionality, or should I just use the native driver for this case?
I seem to be having an issue bringing this derived trait into scope. I'm using the standard example available in the readme, and the latest stable build of Rust, which at this time is 1.47.0. I'm still rather green with the language, so apologies in advance if this is some silly issue.
I compiled the readme code against the following toml dependencies and got a bunch of errors.
Code
// First, we add import statements for the crates that we need.
// In Rust 2018, `extern crate` declarations will no longer be needed.
#[macro_use]
extern crate mongodb;
extern crate serde;
#[macro_use(Serialize, Deserialize)]
extern crate serde_derive;
extern crate wither;
#[macro_use(Model)]
extern crate wither_derive;

// Next we bring a few types into scope for our example.
use mongodb::{
    Client, ThreadedClient,
    db::{Database, ThreadedDatabase},
    coll::options::IndexModel,
    oid::ObjectId,
};
use wither::prelude::*;

// Now we define our model. Simple as deriving a few traits.
#[derive(Model, Serialize, Deserialize)]
struct User {
    /// The ID of the model.
    #[serde(rename="_id", skip_serializing_if="Option::is_none")]
    pub id: Option<ObjectId>,
    /// This field has a unique index on it.
    #[model(index(index="dsc", unique="true"))]
    pub email: String,
}

fn main() {
    // Create a user.
    let db = mongodb::Client::with_uri("mongodb://localhost:27017/").unwrap().db("mydb");
    let mut me = User{id: None, email: "[email protected]".to_string()};
    me.save(db.clone(), None);

    // Update user's email address.
    me.update(db.clone(), None, doc!{"$set": doc!{"email": "[email protected]"}}, None).unwrap();

    // Fetch all users.
    let all_users = User::find(db.clone(), None, None).unwrap();
}
toml dependencies
[dependencies]
futures = "0.3.5"
serde = "1.0.114"
serde_derive = "1.0.114"
wither = "0.8.0"
wither_derive = "0.8.0"
[dependencies.mongodb]
version = "1.0.0"
default-features = false
features = ["sync"]
Errors
error[E0432]: unresolved imports `mongodb::ThreadedClient`, `mongodb::db::ThreadedDatabase`, `mongodb::coll::options::IndexModel`, `mongodb::oid`
--> src/main.rs:14:13
|
14 | Client, ThreadedClient,
| ^^^^^^^^^^^^^^ no `ThreadedClient` in the root
15 | db::{Database, ThreadedDatabase},
| ^^^^^^^^^^^^^^^^ no `ThreadedDatabase` in `db`
16 | coll::options::IndexModel,
| ^^^^^^^^^^^^^^^^^^^^^^^^^ no `IndexModel` in `coll::options`
17 | oid::ObjectId,
| ^^^ help: a similar path exists: `wither::mongodb::oid`
error[E0432]: unresolved import `mongodb`
--> src/main.rs:22:10
|
22 | #[derive(Model, Serialize, Deserialize)]
| ^^^^^ no `IndexModel` in `coll::options`
|
= note: this error originates in a derive macro (in Nightly builds, run with -Z macro-backtrace for more info)
error: cannot find macro `doc` in this scope
--> src/main.rs:22:10
|
22 | #[derive(Model, Serialize, Deserialize)]
| ^^^^^
|
= note: this error originates in a derive macro (in Nightly builds, run with -Z macro-backtrace for more info)
error: cannot find macro `doc` in this scope
--> src/main.rs:40:33
|
40 | me.update(db.clone(), None, doc!{"$set": doc!{"email": "[email protected]"}}, None).unwrap();
| ^^^
error[E0433]: failed to resolve: could not find `oid` in `mongodb`
--> src/main.rs:22:10
|
22 | #[derive(Model, Serialize, Deserialize)]
| ^^^^^ could not find `oid` in `mongodb`
|
= note: this error originates in a derive macro (in Nightly builds, run with -Z macro-backtrace for more info)
error[E0603]: struct `Client` is private
--> src/main.rs:14:5
|
14 | Client, ThreadedClient,
| ^^^^^^ private struct
|
note: the struct `Client` is defined here
--> /home/bob/.cargo/registry/src/github.com-1ecc6299db9ec823/mongodb-1.0.0/src/lib.rs:141:9
|
141 | client::Client,
| ^^^^^^^^^^^^^^
error[E0603]: module `db` is private
--> src/main.rs:15:5
|
15 | db::{Database, ThreadedDatabase},
| ^^ private module
|
note: the module `db` is defined here
--> /home/bob/.cargo/registry/src/github.com-1ecc6299db9ec823/mongodb-1.0.0/src/lib.rs:111:5
|
111 | mod db;
| ^^^^^^^
error[E0603]: module `coll` is private
--> src/main.rs:16:5
|
16 | coll::options::IndexModel,
| ^^^^ private module
|
note: the module `coll` is defined here
--> /home/bob/.cargo/registry/src/github.com-1ecc6299db9ec823/mongodb-1.0.0/src/lib.rs:107:5
|
107 | mod coll;
| ^^^^^^^^^
error[E0603]: module `coll` is private
--> src/main.rs:22:10
|
22 | #[derive(Model, Serialize, Deserialize)]
| ^^^^^ private module
|
note: the module `coll` is defined here
--> /home/bob/.cargo/registry/src/github.com-1ecc6299db9ec823/mongodb-1.0.0/src/lib.rs:107:5
|
107 | mod coll;
| ^^^^^^^^^
error[E0603]: struct `Client` is private
--> src/main.rs:35:23
|
35 | let db = mongodb::Client::with_uri("mongodb://localhost:27017/").unwrap().db("mydb");
| ^^^^^^ private struct
|
note: the struct `Client` is defined here
--> /home/bob/.cargo/registry/src/github.com-1ecc6299db9ec823/mongodb-1.0.0/src/lib.rs:141:9
|
141 | client::Client,
| ^^^^^^^^^^^^^^
warning: unused `#[macro_use]` import
--> src/main.rs:3:1
|
3 | #[macro_use]
| ^^^^^^^^^^^^
|
= note: `#[warn(unused_imports)]` on by default
error[E0599]: no function or associated item named `with_uri` found for struct `mongodb::client::Client` in the current scope
--> src/main.rs:35:31
|
35 | let db = mongodb::Client::with_uri("mongodb://localhost:27017/").unwrap().db("mydb");
| ^^^^^^^^ function or associated item not found in `mongodb::client::Client`
error: aborting due to 11 previous errors; 1 warning emitted
Some errors have detailed explanations: E0432, E0433, E0599, E0603.
For more information about an error, try `rustc --explain E0432`.
error: could not compile `wither_eg`.
To learn more, run the command again with --verbose.
There should be no extra cost associated with simply adding any use statements for types which the derive system depends upon. If the crates do not exist, cargo will notify the user pretty clearly as to the issue.
This is just a usability enhancement. It is annoying to have to declare use statements for types which derives require.
Hi, I am trying to run the example in the readme, but this error comes up when cargo builds the project. I'm using the latest versions:
[dependencies]
serde = "1.0.101"
serde_json = "1.0.41"
serde_derive = "1.0.101"
mongodb = "0.4.0"
wither = "0.8.0"
wither_derive = "0.6.1"
Env:
rustup --version
# rustup 1.19.0 (2af131cf9 2019-09-08)
cargo --version
# cargo 1.40.0-nightly (8b0561d68 2019-09-30)
Error:
error: proc-macro derive panicked
--> src/main.rs:22:10
|
22 | #[derive(Model, Serialize, Deserialize)]
| ^^^^^
|
= help: message: Unrecognized `#[model(index(...))]` attribute 'index'.
error: aborting due to previous error
I'm an extreme newbie with Rust, so it's probably an error on my end, but maybe you can point me in the right direction.
There's a bug related to storage engine specs for model(index(storage_engine(...))). Opened #282 to have the bug fixed.
Once that fix lands, we can update the code here to accept this index parameter.
Hey there, we are going to see a new version of https://github.com/mongodb/mongo-rust-driver soon along with the upgrade to MongoDB 4.4 (up from 4.2) on Atlas.
An alpha of the new MongoDB driver has just recently landed. Time to start cutting over.
Hopefully some of the indexing issues opened in this repo are resolved by the new driver.
Transactions and Change Streams are still not supported, but should be coming soon.
I am trying to experiment with branch 42-new-driver with the model below.
use bson::oid::ObjectId;
use serde::{Deserialize, Serialize};
use wither::Model;
#[derive(Serialize, Deserialize, Model)]
struct Foo {
    #[serde(rename = "_id", skip_serializing_if = "Option::is_some")]
    id: Option<ObjectId>,
}
The serde(rename ...) annotation generated this error:
failed to parse serde rename attr
help: Unexpected literal type
string
rustc
...
The error disappears when I remove Model from the derive list. Do you know why this is happening?
The custom derive system will be landing soon HAS LANDED (woot woot!), and one of the last outstanding challenges is to get a pattern in place which will work well for indexing model subdocument fields.
Put a plan of attack together on supporting indexes on nested models. Should be pretty straightforward. As a current workaround, users will simply have to implement the Model trait on their model manually.
This will do away with the lifetime bounds which are currently associated with model instances.
I have this model
use wither::prelude::*;
#[derive(Debug, Clone, Model, PartialEq, Deserialize, Serialize, TypedBuilder)]
#[model(collection_name = "accounts")]
#[model(index(
    keys = r#"doc!{"email": 1}"#,
    options = r#"doc!{"name": "unique-email", "unique": true, "background": true}"#
))]
pub struct Account {
    #[serde(rename = "_id", skip_serializing_if = "Option::is_none")]
    pub(crate) id: Option<ObjectId>,
    pub(crate) email: Option<String>,
    pub(crate) created: Option<String>,
    pub(crate) updated: Option<String>,
}
I have noticed the call below always returns None although the value exists. Does anyone know why this is happening?
let filter = Some(bson::doc! {
    "email": String::from("[email protected]")
});
let select_option = Account::find_one(&db, filter, None).await.map_err(|err| {
    anyhow!("DB Error: {:?}", err)
})?;
println!("Verify: {:?}", select_option); // <-- Prints None
Compiling actix_demo v0.1.0 (D:\backCode\actix_demo)
error: cannot find derive macro `Model` in this scope
--> src\common\structs.rs:6:17
|
6 | #[derive(Debug, Model, Serialize, Deserialize, Clone)]
| ^^^^^
error: aborting due to previous error
error: could not compile `actix_demo`.
To learn more, run the command again with --verbose.
The Code:
structs.rs:
use serde_derive::{Deserialize, Serialize};
use mongodb::{
    oid::ObjectId,
};

#[derive(Debug, Model, Serialize, Deserialize, Clone)]
pub struct MyObj {
    /// The ID of the model.
    #[serde(rename = "_id", skip_serializing_if = "Option::is_none")]
    pub id: Option<ObjectId>,
    pub name: String,
    pub number: i32,
}
...
main.rs:
#[macro_use]
extern crate json;
#[macro_use]
extern crate bson;
#[macro_use]
extern crate mongodb;
extern crate serde;
#[macro_use(Serialize, Deserialize)]
extern crate serde_derive;
extern crate wither;
#[macro_use(Model)]
extern crate wither_derive;
#[macro_use]
extern crate log;
extern crate hex;
extern crate actix_demo;
extern crate chrono;
use actix_web::{
    error, middleware, web, App, Error, HttpRequest, HttpResponse, HttpServer, guard
};
use actix_web::http::{StatusCode};
use bytes::BytesMut;
use json::JsonValue;
use serde_derive::{Deserialize, Serialize};
use mongodb::{
    ThreadedClient,
    db::{Database, ThreadedDatabase},
    coll::options::IndexModel,
    oid::ObjectId,
};
use wither::prelude::*;
use bson::Bson;
use serde_json::{Value, Map};
use r2d2_mongodb::{MongodbConnectionManager, ConnectionOptions};
use r2d2::Pool;
//use actix_demo::middlewareLocal::state::AppState;
use actix_demo::middlewareLocal::auth::Auth;
use actix_web::client::ClientRequest;
use sha2::{Sha256, Digest};
use rand::prelude::random;
use rand::Rng;
use actix_service::ServiceExt;
use reqwest::r#async::{Client, Response};
use futures::Future;
use std::collections::HashMap;
use reqwest::header::{USER_AGENT, CONTENT_TYPE, ACCESS_CONTROL_ALLOW_HEADERS};
//::header::{Headers, UserAgent, ContentType};
use actix_session::{CookieSession, Session};
use actix_identity::{Identity, CookieIdentityPolicy, IdentityService};
use qstring::QString;
use actix_files as fs;
use std::marker::PhantomData;
use serde_json::value::Value::Object;
use actix_service::Service;
use actix_demo::common::structs::*;
...
I've been trying to fix this for hours and have failed. Please help.
#37 is a solution for modeling a structure which uses a lifetime parameter, like this:
#[derive(Debug, Serialize, Deserialize, Model)]
struct Person<'a> {
    #[serde(rename = "_id", skip_serializing_if = "Option::is_none")]
    id: Option<ObjectId>,
    name: Cow<'a, str>
}
But it still does not support a structure whose serde::de::Deserialize<'de> impl is provided not by the macro, but manually.
For example:
#[derive(Debug, Serialize, Model)]
struct Person<'a> {
    #[serde(rename = "_id", skip_serializing_if = "Option::is_none")]
    id: Option<ObjectId>,
    name: Cow<'a, str>,
}

struct StringVisitor;

impl<'de> serde::de::Visitor<'de> for StringVisitor {
    type Value = Person<'de>;

    fn expecting(&self, formatter: &mut std::fmt::Formatter) -> std::fmt::Result {
        formatter.write_str("a name")
    }

    fn visit_borrowed_str<E>(self, v: &'de str) -> Result<Self::Value, E> where E: serde::de::Error {
        Ok(Person {
            id: None,
            name: Cow::Borrowed(v),
        })
    }

    fn visit_string<E>(self, v: String) -> Result<Self::Value, E> where E: serde::de::Error {
        Ok(Person {
            id: None,
            name: Cow::Owned(v),
        })
    }
}

impl<'de> serde::Deserialize<'de> for Person<'de> {
    fn deserialize<D>(deserializer: D) -> Result<Self, D::Error> where D: serde::Deserializer<'de> {
        deserializer.deserialize_str(StringVisitor)
    }
}
To make the above code work, we need a new attribute meta to assign the lifetime that is used for 'de, as in the following code:
#[derive(Debug, Serialize, Model)]
#[model(de = "'a")]
struct Person<'a> {
    #[serde(rename = "_id", skip_serializing_if = "Option::is_none")]
    id: Option<ObjectId>,
    name: Cow<'a, str>,
}
#[derive(Debug, Serialize, Deserialize, Clone, Model)]
pub struct User {
    /// The ID of the model.
    #[serde(rename = "_id", skip_serializing_if = "Option::is_none")]
    pub id: Option<ObjectId>,
    pub nick: String,
}
I have a very simple structure; after saving it to the db and returning it in an API call, there is a $oid wrapper in the _id field. How can I serialize it with just the _id value?
{
    "code": 0,
    "message": null,
    "data": {
        "_id": {
            "$oid": "5f5ae5b400f0368a00715eca"
        },
        "nick": "Tm999y"
    }
}
Thanks.
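For what it's worth, the extended-JSON {"$oid": ...} map is just how an ObjectId serializes by default, so the usual route is a custom serializer on the id field. With the bson crate this would typically be #[serde(serialize_with = "...")] calling ObjectId::to_hex(); the underlying conversion is plain hex-encoding of the 12 raw bytes, sketched here with std alone (object_id_hex is a hypothetical helper, not a bson API):

```rust
/// Hypothetical helper: render the 12 raw ObjectId bytes as the
/// 24-character lowercase hex string that clients usually expect,
/// rather than the extended-JSON {"$oid": "..."} map.
fn object_id_hex(bytes: &[u8; 12]) -> String {
    bytes.iter().map(|b| format!("{:02x}", b)).collect()
}

fn main() {
    // The same ID shown in the JSON response above.
    let bytes: [u8; 12] = [
        0x5f, 0x5a, 0xe5, 0xb4, 0x00, 0xf0, 0x36, 0x8a, 0x00, 0x71, 0x5e, 0xca,
    ];
    let hex = object_id_hex(&bytes);
    assert_eq!(hex, "5f5ae5b400f0368a00715eca");
    println!("{}", hex);
}
```

Wired up via serde's serialize_with attribute on the id field, the response would then contain "_id": "5f5ae5b400f0368a00715eca" directly.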
I see in the examples that db.clone() should be given to each function. Isn't that a performance issue? Is there a nicer API? And why should one use wither rather than the official driver?
There were a few things mentally blocking me on this at first.
Now that I've been able to think about it a bit, I'm thinking that a solid path forward will be to use a custom derive for all of the core components, and then allow users to optionally implement a new Migrate trait on their models, which is where the model's migrations will be defined.
Deriving Model on your structs will give you a default implementation of sync, so that you can sync your indices with the database. If you choose to also impl Migrate on your struct, you will get a default implementation of sync_and_migrate, which will call the default sync first and then execute the migrations.
According to the MongoDB docs:
To add or change index options other than collation, you must drop the index using the dropIndex() method and issue another db.collection.createIndex() operation with the new options.
Currently, Model::sync only diffs the keys of the index itself, but does not take into account the possibility that the options of the index may have changed. It definitely should take this into account and drop an index first if needed.
The code resides in wither::model::sync_model_indexes, a private function of the module which is called from Model::sync.
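The decision rule implied by the quoted MongoDB docs can be sketched with plain std types. The IndexSpec stand-in and the function names below are hypothetical, not wither's actual internals:

```rust
use std::collections::BTreeMap;

/// Stand-in for a BSON index document: a map of option name -> value.
type IndexSpec = BTreeMap<String, String>;

#[derive(Debug, PartialEq)]
enum SyncAction {
    Keep,          // existing index matches the desired spec exactly
    Create,        // no existing index for these keys
    DropAndCreate, // keys match, but options changed: must drop first
}

/// Hypothetical diffing rule: compare the existing index (if any) for a
/// given key set against the desired spec, options included.
fn plan_sync(existing: Option<&IndexSpec>, desired: &IndexSpec) -> SyncAction {
    match existing {
        None => SyncAction::Create,
        Some(current) if current == desired => SyncAction::Keep,
        Some(_) => SyncAction::DropAndCreate,
    }
}

fn main() {
    let mut current = IndexSpec::new();
    current.insert("unique".into(), "true".into());

    let mut desired = current.clone();
    assert_eq!(plan_sync(Some(&current), &desired), SyncAction::Keep);

    // Changing an option (here, adding `background`) forces a drop+create.
    desired.insert("background".into(), "true".into());
    assert_eq!(plan_sync(Some(&current), &desired), SyncAction::DropAndCreate);

    assert_eq!(plan_sync(None, &desired), SyncAction::Create);
    println!("plan: {:?}", plan_sync(Some(&current), &desired));
}
```

The key point is the third arm: because index options other than collation cannot be modified in place, an options-only difference still requires dropIndex() followed by createIndex().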
In order to be able to query on the target document more precisely, we need to update the Model.update method to also take an optional filter document. The document will be unpacked and the _id field will be forced to be the ObjectId of the current model. This will ensure consistent behavior.
This is needed when you need to conditionally perform an update on the target document and you want to use the DB as your mechanism of staving off race conditions. EG: you want to update a field on the model, but only if the field is currently null. If it is not null, then someone else beat you to the update and the operation should fail.
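The proposed filter handling can be sketched with a std-only stand-in for a BSON document (names hypothetical): whatever the caller passes, _id is overwritten with the model's own ID, so a conditional update can never target a different document.

```rust
use std::collections::BTreeMap;

/// Stand-in for a BSON filter document.
type Filter = BTreeMap<String, String>;

/// Hypothetical sketch of the proposed behavior: unpack the caller's
/// optional filter and force `_id` to the current model's ID.
fn merge_update_filter(user_filter: Option<Filter>, model_id: &str) -> Filter {
    let mut filter = user_filter.unwrap_or_default();
    // The model's ID always wins, even if the caller set `_id` themselves.
    filter.insert("_id".to_string(), model_id.to_string());
    filter
}

fn main() {
    // Conditional update: only match if `locked_by` is unset.
    let mut user = Filter::new();
    user.insert("locked_by".into(), "null".into());
    user.insert("_id".into(), "someone-elses-id".into());

    let merged = merge_update_filter(Some(user), "my-model-id");
    assert_eq!(merged.get("_id").map(String::as_str), Some("my-model-id"));
    assert_eq!(merged.get("locked_by").map(String::as_str), Some("null"));

    // No filter at all still targets the model by ID.
    let merged = merge_update_filter(None, "my-model-id");
    assert_eq!(merged.get("_id").map(String::as_str), Some("my-model-id"));
}
```

With this shape, the "update only if the field is null" race described above becomes a matched-count check: if the merged filter matches zero documents, someone else won the race and the operation fails cleanly.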
Hello,
I am trying to use Model::find to implement a find_all function, but when I get the ModelCursor (wither::ModelCursor<models::role::Role>) as the result, I don't know how to iterate it. I can neither use the example code calling next(), nor the official Mongo example, since cursor is a private member of ModelCursor.
let mut cursor = User::find(mongodb, None, None).await?;
let mut v: Vec<User> = vec![];
while let Some(user) = cursor.next().await {
    v.push(user);
}

let cursor = coll.find(Some(doc! { "x": 1 }), None).await?;
let results: Vec<Result<Document>> = cursor.collect().await;
It will be appreciated if there is any examples, thanks.
This lib uses the standard logging facade. Add some notes in the documentation & readme on how to utilize this from something like slog.
Hey man,
I'm somewhat new to rust and really appreciate your work.
I wanted to set up a new project and just copied your example code from the README, but it doesn't seem to work for me. Am I doing something wrong?
error[E0433]: failed to resolve: use of undeclared type or module `futures`
--> src/main.rs:1:5
|
1 | use futures::stream::StreamExt;
| ^^^^^^^ use of undeclared type or module `futures`
error[E0433]: failed to resolve: use of undeclared type or module `async_trait`
--> src/main.rs:8:17
|
8 | #[derive(Debug, Model, Serialize, Deserialize)]
| ^^^^^ use of undeclared type or module `async_trait`
|
= note: this error originates in a derive macro (in Nightly builds, run with -Z macro-backtrace for more info)
error[E0433]: failed to resolve: use of undeclared type or module `tokio`
--> src/main.rs:18:3
|
18 | #[tokio::main]
| ^^^^^ use of undeclared type or module `tokio`
warning: use of deprecated item 'wither::model::Model::sync': Index management is currently missing in the underlying driver, so this method no longer does anything. We are hoping to re-enable this in a future release.
--> src/main.rs:22:3
|
22 | User::sync(db.clone()).await?;
| ^^^^^^^^^^
|
= note: `#[warn(deprecated)]` on by default
error[E0599]: no method named `next` found for struct `wither::cursor::ModelCursor<User>` in the current scope
--> src/main.rs:33:33
|
33 | while let Some(user) = cursor.next().await {
| ^^^^ method not found in `wither::cursor::ModelCursor<User>`
|
::: /Users/chris/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-util-0.3.5/src/stream/stream/mod.rs:222:8
|
222 | fn next(&mut self) -> Next<'_, Self>
| ----
| |
| the method is available for `std::boxed::Box<wither::cursor::ModelCursor<User>>` here
| the method is available for `std::sync::Arc<wither::cursor::ModelCursor<User>>` here
| the method is available for `std::rc::Rc<wither::cursor::ModelCursor<User>>` here
|
= help: items from traits can only be used if the trait is in scope
help: the following trait is implemented but not in scope; perhaps add a `use` for it:
|
1 | use futures_util::stream::stream::StreamExt;
|
error[E0277]: `main` has invalid return type `impl std::future::Future`
--> src/main.rs:19:20
|
19 | async fn main() -> Result<()> {
| ^^^^^^^^^^ `main` can only return types that implement `std::process::Termination`
|
= help: consider using `()`, or a `Result`
error[E0752]: `main` function is not allowed to be `async`
--> src/main.rs:19:1
|
19 | async fn main() -> Result<()> {
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ `main` function is not allowed to be `async`
error: aborting due to 6 previous errors; 1 warning emitted
Some errors have detailed explanations: E0277, E0433, E0599, E0752.
For more information about an error, try `rustc --explain E0277`.
Sometimes we don't want the id field to be named id, and don't want it to be an ObjectId (precisely, Option<mongodb::oid::ObjectId>).
Perhaps wither can allow the following code in the future:
#[derive(Serialize, Deserialize, Model)]
struct Document {
    #[model(id)]
    #[serde(rename = "_id")]
    uid: u64
}
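One shape this request could take (purely a sketch; the ModelSketch trait and its associated type are hypothetical, not wither's API) is an associated Id type on the trait instead of a hard-coded Option<ObjectId>:

```rust
/// Hypothetical trait sketch: the ID becomes an associated type, so a
/// model may use u64, String, ObjectId, or anything else as its `_id`.
trait ModelSketch {
    type Id: PartialEq + std::fmt::Debug;

    /// Return the model's ID, if it has been assigned.
    fn id(&self) -> Option<&Self::Id>;
}

struct Document {
    uid: u64,
}

impl ModelSketch for Document {
    type Id = u64;

    fn id(&self) -> Option<&u64> {
        Some(&self.uid)
    }
}

fn main() {
    let doc = Document { uid: 42 };
    assert_eq!(doc.id(), Some(&42));
    println!("id = {:?}", doc.id());
}
```

The #[model(id)] attribute proposed above would then be the derive macro's way of picking which field backs the associated type.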
I have the following code:
use futures::stream::StreamExt;
use serde::{Serialize, Deserialize};
use wither::{prelude::*, Result};
use wither::bson::{doc, oid::ObjectId};
use wither::mongodb::Client;
use chrono::{DateTime, Duration, Utc};
use uuid::Uuid;
#[derive(Debug, Model, Serialize, Deserialize)]
#[model(index(keys=r#"doc!{"uid": 1}"#, options=r#"doc!{"unique": true}"#))]
struct ToDo {
    /// The ID of the model.
    #[serde(rename="_id", skip_serializing_if="Option::is_none")]
    pub id: Option<ObjectId>,
    /// The to-do's unique ID.
    pub uid: String,
    pub task: String,
    pub completed: bool,
    pub created_at: DateTime<Utc>,
    pub updated_at: DateTime<Utc>,
    pub deleted_at: DateTime<Utc>
}

#[tokio::main]
async fn main() -> Result<()> {
    // Connect & sync indexes.
    let db = Client::with_uri_str("mongodb://localhost:27017/").await?.database("mydb");
    //User::sync(db.clone()).await?;

    // Create a to-do.
    let mut me = ToDo {
        id: None, uid: Uuid::new_v4().to_string(),
        task: String::from("Task 1"), completed: false,
        created_at: Utc::now(), updated_at: Utc::now(),
        deleted_at: Utc::now()
    };
    me.save(db.clone(), None).await?;

    // Update the task.
    me.update(db.clone(), None, doc!{"$set": doc!{"task": "New task1", "updated_at": Utc::now()}}, None).await?;

    // Fetch all to-dos.
    let mut cursor = ToDo::find(db.clone(), None, None).await?;
    while let Some(to_do) = cursor.next().await {
        println!("{:?}", to_do);
    }
    Ok(())
}
toml:
[dependencies]
chrono = { version = "0.4", features = ["serde"] }
futures = "0.3.5"
global = "0.4.3"
serde = "1.0.114"
serde_derive = "1.0.114"
wither = { version = "0.9.0-alpha.1", default-features = false, features = ["async-std-runtime"] }
#tokio = { version = "0.2.21", features = ["full"] }
tokio = { version = "0.2.21", features = ["macros"] }
uuid = { version = "0.8", features = ["serde", "v4"] }
This code fails at ToDo::find and gives a runtime error:
Err(BsonDe(DeserializationError { message: "invalid type: map, expected a formatted date and time string or a unix timestamp" }))
While inserting a record it stores the datetime value in string format in Mongo, but at update it saves updated_at as a proper datetime in Mongo. Then fetching breaks; if you do not pass updated_at while updating the record, it works well without an error.
If deleted_at is made optional as "pub deleted_at: Option<DateTime<Utc>>" and None is passed while inserting the record, then retrieval also fails with the same error:
Err(BsonDe(DeserializationError { message: "invalid type: map, expected a formatted date and time string or a unix timestamp" }))
The expectation is that DateTime-marked data is stored in datetime format in Mongo at insertion as well as at update, and that if None is provided for an optional field, the program handles it without failing while retrieving records.
The value of the datetime in Mongo after insert is "2020-07-05T11:39:39.352802472Z" and after the update it is ISODate("2020-07-05T11:39:39.360Z").
I see that via an IntervalMigration I can set/unset fields in documents that pass a filter. Is there yet a way to transform the data in a field from one format to another?
I think this may get tricky due to the fact that the change to the struct would cause a deserialisation error on the old version. If there's a good solution I'd be happy to work on it.
What is the intended method to run an aggregation query with wither?
Currently I am doing something like this:
let path_regex = wither::bson::Regex { pattern: format!("{}.*", path), options: String::new() };
let pipeline = vec![
    doc!{
        "$match": {
            "path": path_regex
        }
    },
    doc!{
        "$sample": {
            "size": 1
        }
    }
];
let mut cursor: Cursor<_> = DataBlockMongo::collection(&db).aggregate(pipeline, AggregateOptions::default()).await.unwrap();
if let Some(chosen_item) = cursor.next().await {
    let chosen_item = DataBlockMongo::instance_from_document(chosen_item.unwrap()).unwrap();
}
Which works, but requires converting to and from a document.
Is this the intended method?
If so, I can't seem to find this documented anywhere.
A manager object is needed to resolve the following difficulties:
- Model::sync works for indices and such. Not from a separate CLI system.
- #[serde(default)] | #[serde(default = "path")] could not be used.
- Use Option<T> for the field type. Serde will deserialize the record from the database as None. Then deal with that condition in your model code if you don't want it to be None.
- Model::migrate, which will receive its execution orders from Model::migrations, will run whatever mutations against the database are needed according to the migration specs (these should always be coded to be idempotent).
- Option<T> fields may then be coded as simple T (not wrapped in an Option) once the old data has been fully updated by the migrations.
- Leverage Rust nightly plugins/custom attributes/&c to define a system which will use the document __version field to automatically handle document updates from version to version: removing fields, changing field types, adding new fields &c, from lower versions up to the latest version of the Model.
This would be shooting for the stars ... and I don't currently have time to explore this. But it would be fucking awesome. And better than anything else out there in any other language, including interpreted languages (Python & Ruby have solid stories in this realm).
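The __version idea can be sketched with std alone (all names here are hypothetical, not a wither API): each stored document carries a version number, and a chain of idempotent upgrade steps walks any document up to the latest shape.

```rust
/// Hypothetical document stand-in: a version number plus a payload.
#[derive(Debug, Clone, PartialEq)]
struct VersionedDoc {
    version: u32,
    payload: String,
}

/// Walk a document up the migration chain until it reaches `target`.
/// Each step must be idempotent and bump `version` by exactly one.
fn migrate_to(mut doc: VersionedDoc, target: u32, steps: &[fn(VersionedDoc) -> VersionedDoc]) -> VersionedDoc {
    while doc.version < target {
        let step = steps[doc.version as usize];
        doc = step(doc);
    }
    doc
}

fn v0_to_v1(mut doc: VersionedDoc) -> VersionedDoc {
    doc.payload = format!("{} +field_a", doc.payload); // e.g. add a new field
    doc.version = 1;
    doc
}

fn v1_to_v2(mut doc: VersionedDoc) -> VersionedDoc {
    doc.payload = format!("{} +field_b", doc.payload);
    doc.version = 2;
    doc
}

fn main() {
    let steps: Vec<fn(VersionedDoc) -> VersionedDoc> = vec![v0_to_v1, v1_to_v2];
    let old = VersionedDoc { version: 0, payload: "base".into() };
    let latest = migrate_to(old, 2, &steps);
    assert_eq!(latest.version, 2);
    assert_eq!(latest.payload, "base +field_a +field_b");

    // Already-current documents pass through untouched.
    let current = VersionedDoc { version: 2, payload: "x".into() };
    assert_eq!(migrate_to(current.clone(), 2, &steps), current);
}
```

A derive-based system would generate the step chain from per-version attribute specs; the runtime walk would stay this simple.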
Wither started out with application-level automatic index synchronization capabilities via the Model::sync method. As of [email protected], index management has not yet been implemented in the driver, and thus, as of the [email protected] release, the index syncing features have been disabled.
If this is a feature you enjoyed, used, or would otherwise like to have once again, please upvote this issue.
If this is something you don't particularly care for, and you have other potentially better approaches to handling these sorts of tasks, please share your thoughts and what tools you would recommend.
Thank you in advance for any participation!
Much in the way Mongoose deals with schemas, I think it would be cool to see Wither handle document schemas in a way that's more natural with how documents are formatted within MongoDB itself, through either Diesel-style migrations using RON files in a migrations/[migration].ron
kind of folder structure, or through macros.
An example RON file:
Users (
    _id: {
        type: String, // maybe as an enum of all supported types
    },
    username: {
        type: String,
        options: (
            // some miscellaneous options for this field
            required: true,
            unique: true,
        ),
    },
    // can also declare objects
    stats: {
        blogs: { type: Number, options: (required: false, default: 0)},
    },
    createdAt: {
        type: Date,
    },
)
(issue migrated from #47 so as not to clutter that one)
It seems like Wither has the potential to be essentially the equivalent of what mongoose is to Node, but the lack of recent development makes me somewhat concerned when considering switching off of NodeJS/mongoose and onto Rust/Wither for a production app. I get the feeling that a large reason for this is simply that, for this project to move forward, there had to be progress with the MongoDB driver. Now that there's a new officially supported driver released, is Wither going to resume active development? Do you have a roadmap?
Just started to use the crate and can't get it to work, as I'm using r2d2-mongodb to create a pooled connection. This works with the pure Rust driver.
this works
#[get("/hello")]
pub fn hello_world(connection: DbConn) -> JsonValue {
    let db = &connection;
    db.collection("hello")
        .insert_one(doc! { "name": "John" }, None)
        .unwrap();
    json!({ "status": "ok"})
}
this doesn't work
pub fn all_customer_country_query(
    connection: DbConn,
) -> Result<Vec<RestaurantCountry>, MongoDBError> {
    let db = &connection;
    RestaurantCountry::find(db.clone(), None, None)
}
error
error[E0308]: mismatched types
--> src/restaurants/repository.rs:11:29
|
11 | RestaurantCountry::find(db.clone(), None, None)
| ^^^^^^^^^^ expected struct `std::sync::Arc`, found reference
|
= note: expected type `std::sync::Arc<mongodb::db::DatabaseInner>`
found type `&connection::DbConn`
As of 0.9.0-alpha.0 the synchronous code has been disabled. This is due to a few difficulties with maintaining both the sync and async code in the same crate; specifically, the difficulties exist around ensuring that docs are built properly to expose both interfaces. This difficulty comes about due to some types being made private in the mongodb crate based on the existence of some feature flags.
All in all, the easiest path forward may be to create a new wither_sync crate which exposes the sync code, if folks find themselves in need of it.
If you are one of those individuals, please let me know!
I am using Mongoose with Node.js and find it handy to only have to connect to the database once:
mongoose.connect('mongodb://localhost/dbname');
and then add, modify, and delete without passing database parameters:
User.updateOne({_id: ''}, {name: 'Your name'});
It would be great if you added this to wither.
Alternatively, you can manage pools of connections to mongodb using r2d2_mongodb:
https://docs.rs/r2d2-mongodb/0.2.2/r2d2_mongodb/
Thank you!
Let's go ahead and add the three find_one_and_* methods.
Add a code block to the readme on how to use the underlying driver in a case where this lib doesn't quite do what you need it to do.