elixir-mongo / mongodb_ecto
MongoDB adapter for Ecto
License: Apache License 2.0
We should think about how embedding will work.
I think there are two cases to consider: embedding a full-fledged model, and having a field that would be basically a map without any imposed structure.
With both cases (models/maps) we should support embedding one object or an array of them.
For the unstructured case I think it's a matter of providing a new Ecto type to express this.
For embedded models I imagine the high-level interface to be embeds_one and embeds_many in the schema definition, similar to has_one and has_many.
What are your thoughts?
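To make the proposal concrete, here is a rough sketch of how the two cases could look in a schema, assuming Ecto-style macros; the module and field names are made up for illustration:

```elixir
defmodule Address do
  use Ecto.Model

  # A full-fledged embedded model with its own imposed structure.
  embedded_schema do
    field :street, :string
    field :city, :string
  end
end

defmodule User do
  use Ecto.Model

  schema "users" do
    field :name, :string
    embeds_one :address, Address          # a single embedded document
    embeds_many :past_addresses, Address  # an array of embedded documents
    field :preferences, :map              # the unstructured case: a free-form map
  end
end
```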
Running 'simple' example:
** (Mix) Could not start application simple: Simple.App.start(:normal, []) returned an error: shutdown: failed to start child: Simple.Repo
** (EXIT) an exception was raised:
** (UndefinedFunctionError) undefined function: Simple.Repo.pool/0
(simple) Simple.Repo.pool()
(mongodb_ecto) lib/mongo_ecto.ex:333: Mongo.Ecto.start_link/2
(stdlib) supervisor.erl:314: :supervisor.do_start_child/2
(stdlib) supervisor.erl:297: :supervisor.start_children/3
(stdlib) supervisor.erl:263: :supervisor.init_children/2
(stdlib) gen_server.erl:306: :gen_server.init_it/6
(stdlib) proc_lib.erl:237: :proc_lib.init_p_do_apply/3
I'm trying to create a project with both Postgres Phoenix (PgRepo) and Mongo (MongoRepo), but I was getting the same exception.
With PgRepo there's no problem, but when I try MongoRepo.start_link it gives the same error as when I run the 'Simple' example.
I checked the commit 'Update ecto, correct handling pool' and thought it was resolved.
Sorry if you're already on it.
Use case is for basing an aggregation query on an ecto query. I would be willing to work on this.
I'm curious if anyone has an example of how to pass the :slave_ok option for queries mentioned in the source? I've tried variations on MyRepo.one(query, {:slave_ok}) with no luck. I'm sure I'm just missing something simple.
Even better would be this ability to just force all queries to use this option. Maybe during the connection config or something?
Thanks. I really appreciate any help.
We should think of a better way of loading models. Currently Ecto supports only passing a tuple, which seems wasteful, as we already have a map with the fields.
Currently we need to use something like this:
row =
  model.__schema__(:fields)
  |> Enum.map(&Map.get(document, Atom.to_string(&1)))
  |> List.to_tuple()

model.__schema__(:load, coll, 0, row, id_types)
We need to compare which operations are supported by both Mongo and Postgres and discuss which ones should be added to Ecto.
First, thanks for your hard work on this! Second, I ran into a problem getting things set up in a new Phoenix project. I created a model and navigated in Chrome to the route I set up for the endpoint. I then got this error:
[error] #PID<0.1553.0> running CommonStandardsProject.Endpoint terminated
Server: localhost:4000 (http)
Request: GET /api/jurisdictions
** (exit) an exception was raised:
** (UndefinedFunctionError) undefined function: Mongo.Ecto.all/5
(mongodb_ecto) Mongo.Ecto.all(CommonStandardsProject.Repo, #Ecto.Query<from j in CommonStandardsProject.Jurisdiction, select: j>, [], #Function<0.119612849/2 in Ecto.Repo.Queryable.preprocess/2>, [])
(ecto) lib/ecto/repo/queryable.ex:21: Ecto.Repo.Queryable.all/4
web/controllers/jurisdiction_controller.ex:9: CommonStandardsProject.JurisdictionController.index/2
web/controllers/jurisdiction_controller.ex:1: CommonStandardsProject.JurisdictionController.phoenix_controller_pipeline/2
(common_standards_project) lib/phoenix/router.ex:281: CommonStandardsProject.Router.dispatch/2
(common_standards_project) web/router.ex:1: CommonStandardsProject.Router.do_call/2
(common_standards_project) lib/common_standards_project/endpoint.ex:1: CommonStandardsProject.Endpoint.phoenix_endpoint_pipeline/1
(common_standards_project) lib/plug/debugger.ex:90: CommonStandardsProject.Endpoint."call (overridable 3)"/2
Is this something I've done or a bug in the library? Here's the repo: https://github.com/scottmessinger/elixir-api-spike
Currently the adapter supports only the most basic queries, that simply retrieve documents from the database.
But there is much more we can do using aggregates - counts, groups, more fine-tuned matching or ordering, and document transformation.
What I'm not sure about is whether the normal adapter callbacks should support only simple queries, and more advanced ones should be allowed through some other mechanism, or maybe the adapter should detect if an advanced query is needed and prepare it - all with the normal functions.
This would be pretty cool. They also have a nice guide for it.
I spun up a temp. test db on Compose.io and am having issues connecting. I can connect with the shell but I must be messing up the config somehow? It's also a replica set so I used the same hostname that the shell used. It wasn't clear how the mongo URI should be used in the config.
I can’t figure out why it’s crashing… Any tips? It works great with a local mongo. I can login using the shell:
mongo candidate.16.mongolayer.com:11213/chatty -u testu -p testp
config :chatty, Chatty.Repo,
database: "chatty",
username: "testu",
password: "testp",
#hostname: "candidate.16.mongolayer.com:11213,candidate.57.mongolayer.com"
hostname: "candidate.16.mongolayer.com",
port: "11213"
and the error on startup:
State: %{auth: [{"testu", "testp"}], database: "chatty", opts: [backoff: 1000, pool_name: Chatty.Repo.Pool, pool: Ecto.Pools.Poolboy, otp_app: :chatty, repo: Chatty.Repo, hostname: 'candidate.16.mongolayer.com', port: "11213"], queue: %{}, request_id: 0, socket: nil, tail: nil, timeout: 5000, wire_version: nil, write_concern: [w: 1]}
[error] GenServer #PID<0.1716.0> terminating
** (stop) an exception was raised:
** (FunctionClauseError) no function clause matching in :inet_tcp.getserv/1
(kernel) inet_tcp.erl:36: :inet_tcp.getserv("11213")
(kernel) gen_tcp.erl:157: :gen_tcp.connect1/4
(kernel) gen_tcp.erl:144: :gen_tcp.connect/4
(mongodb) lib/mongo/connection.ex:187: Mongo.Connection.connect/2
(connection) lib/connection.ex:608: Connection.enter_connect/5
(stdlib) proc_lib.erl:240: :proc_lib.init_p_do_apply/3
(kernel) gen_tcp.erl:149: :gen_tcp.connect/4
(mongodb) lib/mongo/connection.ex:187: Mongo.Connection.connect/2
(connection) lib/connection.ex:608: Connection.enter_connect/5
(stdlib) proc_lib.erl:240: :proc_lib.init_p_do_apply/3
The URI compose gives me is:
mongodb://<user>:<password>@candidate.16.mongolayer.com:11213,candidate.57.mongolayer.com:10138/chatty?replicaSet=set-56489fdcbab49bb86e000bfb
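The FunctionClauseError in :inet_tcp.getserv("11213") suggests the port is being passed through as a string, while Erlang's TCP stack expects an integer. A likely fix, keeping the rest of the reported config unchanged:

```elixir
config :chatty, Chatty.Repo,
  database: "chatty",
  username: "testu",
  password: "testp",
  hostname: "candidate.16.mongolayer.com",
  # An integer, not the string "11213": :gen_tcp.connect/4 has no
  # clause matching a string port, which is exactly the crash above.
  port: 11213
```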
Mongo is schemaless, so it doesn't enforce any particular fields to be defined or their types to be known. In fact, the same key within the same collection can hold a different type in each record. This is one of Mongo's strengths, and Ecto should be careful not to take that away. Currently an Ecto model is strict about which fields it has (defined in the schema macro block). However, there are times when it is necessary to say that a particular model will not define its schema and will be open to "dynamic attributes". This is an extremely powerful feature of Mongo.
Here's an example from Mongoid:
class User
  include Mongoid::Document
  embeds_one :preferences, class_name: 'Preference'
  field :name
end

class Preference
  include Mongoid::Document
  include Mongoid::Attributes::Dynamic
  embedded_in :user
end

User.create(name: 'User', preferences: {notify_on_something: true, notify_on_something_else: false})
# MOPED: 127.0.0.1:27017 INSERT database=myapp_development collection=users documents=[{"_id"=>BSON::ObjectId('558ac0114d617453740d0000'), "name"=>"User", "preferences"=>{"_id"=>BSON::ObjectId('558ac0114d617453740e0000'), "notify_on_something"=>true, "notify_on_something_else"=>false}}] flags=[]
This is more than just saying that the preferences field is of type Map. We have full support for adding validations, callbacks, etc. For example, if we only wanted to allow boolean flags, we could validate that all fields are of type boolean while still allowing any number of keys. Or we could write a custom validation to limit the number of keys within the preferences model. A lot of power is obtainable here by sometimes being less strict. Sometimes the data structure is less known at development time, and this is exactly where Mongo shines.
As a side note, Mongoid also does not require that the type of any particular field be described in the schema block. While less useful, it is still an ability Mongo has that we wouldn't want to restrict via Ecto. As an example, the User model above has a field :name which doesn't have a type. We could store a string (for just one name) or a map (for splitting out first/last names, etc., which can happen as the code evolves) in this field if we chose to do so.
In the Data Types section of the docs, it says
symbol ???
What does this mean?
I upgraded to the latest version and the other bug was fixed! I can now see output on my browser. However, I'm running into another problem. Is there a method I need to implement?
Repo: https://github.com/scottmessinger/elixir-api-spike
[info] GET /api/jurisdictions
[debug] Processing by CommonStandardsProject.JurisdictionController.index/2
Parameters: %{"format" => "json"}
Pipelines: [:api]
[debug] FIND coll="jurisdictions" query=%{} projection=%{_id: true, inserted_at: true, title: true, type: true, updated_at: true} [] OK query=2.4ms queue=0.2ms
[info] Sent 200 in 7ms
[error] GenServer #PID<0.978.0> terminating
Last message: {:tcp, #Port<0.166602>, <<176, 19, 0, 0, 167, 90, 1, 0, 166, 90, 1, 0, 1, 0, 0, 0, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 101, 0, 0, 0, 58, 0, 0, 0, 85, 0, 0, 0, 2, 95, 105, 100, 0, 33, 0, 0, ...>>}
State: %{auth: [], database: "common-standards-project", opts: [backoff: 1000, port: 27017, otp_app: :common_standards_project, repo: CommonStandardsProject.Repo, adapter: Mongo.Ecto, hostname: 'localhost'], queue: %{}, request_id: 1, socket: #Port<0.166602>, tail: "", timeout: 5000, wire_version: nil, write_concern: [w: 1]}
** (exit) an exception was raised:
** (FunctionClauseError) no function clause matching in Mongo.Connection.message/4
(mongodb) lib/mongo/connection.ex:282: Mongo.Connection.message(nil, 88742, {:op_reply, [:await_capable], 0, 101, 58, [%{"_id" => "B1339AB05F0347E79200FCA63240F3B2", "title" => "California", "type" => "state"}, %{"_id" => "78D8430CBE88474FAB3CDAAD9094C961", "title" => "Iowa", "type" => "state"}, # < MORE ... >
I get the following error when running mix ecto.migrate. I think the migration works as it should, but the command exits with an error code. This is a problem for me when I do automated builds.
mix ecto.migrate
19:16:03.206 [info] == Running MyApp.Repo.Migrations.CreatePosts.change/0 forward
19:16:03.208 [info] == Migrated in 0.0s
** (UndefinedFunctionError) undefined function: Mongo.Ecto.stop/2
(mongodb_ecto) Mongo.Ecto.stop(#PID<0.151.0>, 5000)
(ecto) lib/mix/ecto.ex:75: Mix.Ecto.ensure_stopped/2
(mix) lib/mix/cli.ex:55: Mix.CLI.run_task/2
(elixir) lib/code.ex:363: Code.require_file/2
The prefixes should change the database, once it's possible from the driver.
They should be supported in all queries and in migrations: elixir-ecto/ecto#1004
Ecto master now supports the query above, we need to ensure Mongo.Ecto handles it.
While ideally we would use $set, we may not be able to emulate this operation on SQL databases. So we need to find a good middle ground. We may also have different behaviour depending on map or embed. So I have opened this for discussion.
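For reference, this is the shape of a raw MongoDB partial update using $set, written here as Elixir maps; the field names are hypothetical:

```elixir
# $set modifies only the listed paths and leaves the rest of the
# document untouched; dot notation reaches into embedded documents.
selector = %{"_id" => "some-id"}
update = %{"$set" => %{"settings.theme" => "dark"}}
```

Emulating this on a SQL database would mean reading, merging and writing back the whole map column, which is where the middle ground mentioned above comes in.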
They are currently decoded as binaries: https://github.com/michalmuskala/mongodb_ecto/blob/cde237fd347ed5e1c8570f4b6a7be26e1698fef5/lib/mongo_ecto/decoder.ex#L27-L28.
It seems like all these decodings are incorrect if we don't have a schema type, because we don't know how to encode them later when we are going to persist them. For example, a BSON.Binary is decoded to a simple binary (https://github.com/michalmuskala/mongodb_ecto/blob/cde237fd347ed5e1c8570f4b6a7be26e1698fef5/lib/mongo_ecto/decoder.ex#L25-L26), but when encoding it is not encoded back to a BSON.Binary (https://github.com/michalmuskala/mongodb_ecto/blob/cde237fd347ed5e1c8570f4b6a7be26e1698fef5/lib/mongo_ecto/encoder.ex#L30-L31).
I have a model that goes something like this (obviously contrived) and I'm connecting to a legacy database. I'm having some issues with the {:array, subtype} field, though:
defmodule Foo do
  schema "foo" do
    field :some_val, :integer
    field :value_history, {:array, :integer}, default: []
  end
end
Everything works perfectly fine on reads, but when I create a new instance of the model and try to assign the value_history attribute, it explodes on insert if any values in the array/list are nil. I understand why, since I said the subtype is integer, but I'm really hoping there is another way to handle this.
# fails upon insert
foo = %Foo{some_val: 5, value_history: [5, 1, 2, nil, 6, 2, 2]}

# works fine upon insert
foo = %Foo{some_val: 5, value_history: [5, 1, 2, 6, 2, 2]}
Is there a way to opt out of the type-check here or at least say, "the array subtype is integer and nil is allowed"?
Unfortunately I have two years of mongo docs across three massive replica sets (20+ servers total) that follow this paradigm, so I'm really hoping for a clean work around.
The SQL adapter has extensive logging. Currently we don't have any.
It's quite easy with SQL to log the query string, with MongoDB we should think about an appropriate format for the logs.
For legacy reasons, I have to use a customized string for my Mongo _id.
This error happened after that:
** (exit) an exception was raised:
** (FunctionClauseError) no function clause matching in Mongo.Ecto.update/7
(mongodb_ecto) lib/mongo_ecto.ex:483: Mongo.Ecto.update(App.Repo, %{context: nil, model: App.Tracking, source: {nil, "trackings"}, state: :loaded}, [current_status: "picked_up", external_carrier: "VTP", external_tracking: "SNPE37043100000186", updated_at: {{2015, 12, 8}, {21, 3, 49, 0}}], [_id: "E37043100000186"], nil, [], [])
(ecto) lib/ecto/repo/model.ex:253: Ecto.Repo.Model.apply/4
(ecto) lib/ecto/repo/model.ex:152: anonymous fn/10 in Ecto.Repo.Model.do_update/4
(ecto) lib/ecto/repo/model.ex:25: Ecto.Repo.Model.update!/4
My schema:
defmodule App.Tracking do
  use App.Web, :model

  @primary_key {:_id, :string, []}
  @derive {Phoenix.Param, key: :_id}
  @derive {Poison.Encoder, only: [
    :_id, :is_external, :external_carrier, :external_tracking, :business_id,
    :current_status, :balance_adjustment, :inserted_at, :updated_at, :modifications
  ]}

  schema "trackings" do
    field :is_external, :boolean, default: false
    field :external_carrier, :string
    field :external_tracking, :string
    field :current_status, :string
    field :balance_adjustment, :integer
    field :business_id, :binary_id
    embeds_many :modifications, App.PackageModification
    timestamps
  end

  @required_fields ~w(_id is_external current_status balance_adjustment)
  @optional_fields ~w(external_carrier external_tracking business_id modifications)

  def changeset(model, params \\ :empty) do
    model
    |> cast(params, @required_fields, @optional_fields)
  end
end
As discussed with @josevalim we could support some migrations, especially for adding and removing indexes.
I'm getting this error while trying to update a single record with mongodb_ecto:
iex(1)> rec = Simple.Repo.get(Weather, "1d4197db0398eb193c5315cd")
iex(2)> Simple.Repo.update!(Ecto.Changeset.change(rec, temp_hi: 50))
** (FunctionClauseError) no function clause matching in Mongo.Ecto.update/7
(mongodb_ecto) lib/mongo_ecto.ex:459: Mongo.Ecto.update(Simple.Repo, {nil, "weather", Weather}, [temp_hi: 50, updated_at: {{2015, 7, 26}, {9, 53, 9, 0}}], [id: %Ecto.Query.Tagged{tag: nil, type: :binary_id, value: <<29, 65, 151, 219, 3, 152, 235, 25, 60, 83, 21, 205>>}], nil, [], [])
(ecto) lib/ecto/repo/model.ex:116: anonymous fn/11 in Ecto.Repo.Model.update/4
(ecto) lib/ecto/repo/model.ex:25: Ecto.Repo.Model.update!/4
Using Ecto with PostgreSQL it works:
iex(1)> rec = Simple.Repo.get(Weather, 1)
iex(2)> Simple.Repo.update!(Ecto.Changeset.change(rec, temp_hi: 50))
%Weather{__meta__: %Ecto.Schema.Metadata{source: {nil, "weather"},
state: :loaded},
city: #Ecto.Association.NotLoaded<association :city is not loaded>,
city_id: nil, id: 1, inserted_at: #Ecto.DateTime<2015-07-26T09:39:45Z>,
prcp: 0.0, temp_hi: 50, temp_lo: 23,
updated_at: #Ecto.DateTime<2015-07-26T09:52:40Z>}
This is with master mongodb_ecto (commit 9b07403) and ecto (commit a19072ba9ca1ffa0b440578a7e345bf9a582f344).
I'm using Mongo version 3.0.4
We have just merged it into master.
MongoDB supports querying text fields with regexes.
Both Elixir and Mongo use PCRE, so we could use Elixir's literal syntax, but I'm not sure that this is a good idea (there might be some unforeseen incompatibilities, and the regexes would be compiled on both sides).
Another way to support these is to provide a regex function that would return the expected data, similar to the javascript function we're considering in #1.
After #31 we need to add a Mongo guide for Phoenix, similar to http://www.phoenixframework.org/docs/using-mysql
It will traverse all collections and remove them. This will be used both by the Mongo tests and exposed as a user API.
When making a projection (selecting the document parts we want), we can either select which fields we want - that's what I currently do with select clauses - or unselect certain fields, i.e. request all fields except those. The two cannot be combined. It's related to #3.
How should we go about this? Something similar to select? I know we shouldn't add another field to the Query struct. Maybe then some kind of a modifier? I was thinking about something similar to this:
from u in User,
  select: not u.name
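For background, a raw MongoDB projection is either inclusive or exclusive, and the two cannot be mixed (excluding _id alongside inclusions is the one exception). Sketched as the Elixir maps the driver would send:

```elixir
# Inclusion projection: return only title (plus _id by default).
include = %{"title" => true}

# Exclusion projection: return every field except name. Mixing
# true and false values for regular fields in a single projection
# is rejected by the server.
exclude = %{"name" => false}
```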
This is a list of MongoDB operators. I don't think it would be reasonable to support all of them, especially the more exotic ones like $geoWithin. An approach similar to SQL's fragments could probably work quite well here. The ones that are marked are used somewhere as of now.
There are special cases for not value in array to use the $nin operator, and for not is_nil(value) to use $ne with null.
The $query and $orderby operators are used to implement order_by clauses.
These allow for some really advanced queries, basically a mapReduce but without javascript. Using these it would be possible to simulate group_by or having SQL clauses. I won't list them, as I think they are way beyond the current scope.
This should be handled by the adapter so we need to figure out which information to pass in the callbacks.
MongoDB supports queries that ask for the presence of a field in a document. I think this would be good as a function, similar to inline javascript. The null value and the existence of a field are two separate issues, so is_nil won't be a good solution.
I'm thinking of something like this:
from u in User,
  where: exists(u.name)
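Underneath, such a function would translate to MongoDB's $exists operator. A sketch of the query document it would produce, as an Elixir map:

```elixir
# Matches documents where the name field is present, even when its
# value is null - which is precisely why is_nil/1 is not a good fit.
query = %{"name" => %{"$exists" => true}}
```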
I've been reading Plataformatec's guide on adding embedded models. However, I am getting a constraint error when I try to insert the new embeds via changesets using Mongo.Ecto.
This is what I followed in the guide:
defmodule MyApp.Permalink do
  use Ecto.Model

  embedded_schema do
    field :url
    timestamps
  end
end

defmodule MyApp.Post do
  use Ecto.Model

  schema "posts" do
    field :title
    field :body
    has_many :comments, MyApp.Comment
    embeds_many :permalinks, MyApp.Permalink
    timestamps
  end
end

# Generate a changeset for the post
changeset = Ecto.Changeset.change(post)

# Let's track the new permalinks
changeset = Ecto.Changeset.put_change(changeset, :permalinks,
  [%Permalink{url: "example.com/thebest"},
   %Permalink{url: "another.com/mostaccessed"}])

# Now let's insert the post with permalinks at once!
post = Repo.insert!(changeset)
I get the following error when I run post = Repo.insert!(changeset):
iex(8)> post = MyApp.Repo.insert!(changeset)
[debug] INSERT coll="posts" document=[body: "This is the body", _id: #BSON.ObjectId<1dfafe19149351241a15b1b1>, inserted_at: #BSON.DateTime<2015-12-09T14:58:33Z>, permalinks: [[_id: #BSON.ObjectId<1dfb04191493512880ee927d>, inserted_at: #BSON.DateTime<2015-12-09T15:24:09Z>, updated_at: #BSON.DateTime<2015-12-09T15:24:09Z>, url: "example.com/thebest"], [_id: #BSON.ObjectId<1dfb04191493512880ee927e>, inserted_at: #BSON.DateTime<2015-12-09T15:24:09Z>, updated_at: #BSON.DateTime<2015-12-09T15:24:09Z>, url: "another.com/mostaccessed"]], title: "Test Post", updated_at: #BSON.DateTime<2015-12-09T14:58:33Z>] [] ERROR query=0.7ms
** (Ecto.ConstraintError) constraint error when attempting to insert model:
* unique: _id_
If you would like to convert this constraint into an error, please
call unique_constraint/3 in your changeset and define the proper
constraint name. The changeset has not defined any constraint.
(ecto) lib/ecto/repo/model.ex:274: anonymous fn/5 in Ecto.Repo.Model.constraints_to_errors/3
(elixir) lib/enum.ex:1387: Enum."-reduce/3-lists^foldl/2-0-"/3
(ecto) lib/ecto/repo/model.ex:91: anonymous fn/10 in Ecto.Repo.Model.do_insert/4
(ecto) lib/ecto/repo/model.ex:14: Ecto.Repo.Model.insert!/4
I was unsure whether this was a bug or something I have done wrong.
Thanks!
We are creating this here but the changes go to the driver. /cc @ericmj
Hi,
(I am new to Elixir.) I'm experiencing a problem when inserting an Ecto.Model containing a map with end_date and start_date Ecto.DateTime fields (see below an extracted, simplified version of the map). I get the following message when inserting into MongoDB with the insert! function:
%ArgumentError{message: "Invalid expression for MongoDB adapter in insert command"}
As soon as I replace the end_date and start_date Ecto.DateTime fields with :integer fields containing timestamps, the insert works, and the created_at field of type Ecto.DateTime located in the %Event map is correctly persisted as an ISODate field in Mongo.
Could you give some advice about what I should do to solve this problem? Thanks for your work.
%Event{__meta__: #Ecto.Schema.Metadata<:built>, the_id: 135,
created_at: #Ecto.DateTime<2010-10-19T18:30:05Z>,
event_data: %{area_id: 230, currency_code: "EUR",
end_date: #Ecto.DateTime<2013-09-04T00:50:03Z>,
is_closed: true,
start_date: #Ecto.DateTime<2013-08-28T00:50:03Z>,
stock: 1, type_id: 1, year: 2009}, event_type: "the_created_event"}
Below is the %Event module used to persist data :
defmodule Event do
  use Ecto.Model

  @primary_key {:id, :binary_id, autogenerate: true}

  schema "my_events" do
    field :event_type, :string
    field :event_data, :map
    field :the_id, :integer
    timestamps([inserted_at: :created_at, updated_at: false])
  end
end
Currently it's not possible to pass a variable to limit or offset, which is often required to implement some form of pagination. This is not allowed:
iex(1)> limit = 1; Repo.all from(w in Weather, limit: ^limit)
** (FunctionClauseError) no function clause matching in Mongo.Ecto.NormalizedQuery.offset_limit/1
(mongodb_ecto) lib/mongo_ecto/normalized_query.ex:272: Mongo.Ecto.NormalizedQuery.offset_limit(%Ecto.Query.QueryExpr{expr: {:^, [], [0]}, file: "iex", line: 6, params: nil})
(mongodb_ecto) lib/mongo_ecto/normalized_query.ex:220: Mongo.Ecto.NormalizedQuery.limit_skip/1
(mongodb_ecto) lib/mongo_ecto/normalized_query.ex:65: Mongo.Ecto.NormalizedQuery.find_all/6
(mongodb_ecto) lib/mongo_ecto.ex:419: Mongo.Ecto.execute/6
(ecto) lib/ecto/repo/queryable.ex:95: Ecto.Repo.Queryable.execute/5
(ecto) lib/ecto/repo/queryable.ex:15: Ecto.Repo.Queryable.all/4
This way we traverse all the fields just once. Then we can simply work with the normalised query instructions.
MongoDB makes it possible to query using javascript. It is somewhat similar to fragment. Is there a possibility to use custom functions in queries, or only the predefined ones? Is the function approach good here? It would look something like this:
from u in User,
  where: javascript("this.name == this.username")
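On the wire this would map to MongoDB's $where operator, which evaluates a JavaScript expression against each candidate document. A sketch of the resulting query document as an Elixir map:

```elixir
# $where runs the predicate server-side for every candidate document;
# it cannot use indexes, so it pairs best with regular selectors that
# narrow the scan first.
query = %{"$where" => "this.name == this.username"}
```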
What should we do about them? Maybe something to mark a field as optional? I think this is one of the main selling points of MongoDB, so it would be a pity not to support it. I'm not sure how big an impact that would have on Ecto. I was thinking about something like this:
schema "users" do
  field :name, :string, optional: true
end
Then, when the field doesn't exist, we would get a struct without that key.
We probably need to support commands for database management. Currently there's Mongo.Ecto.command/2 to send a direct command to the database. I also implemented functions that use it for particular commands: create_collection, list_collections and drop_collection for managing collections (as I needed those for the truncate function).
There are a lot of commands, and some should have a specialized function, but majority shouldn't - I'm not sure where to draw the line.
The list of all the commands is here: http://docs.mongodb.org/manual/reference/command/
I think we should definitely have specialized functions for collections and indexes, about others I have no idea.
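For illustration, a minimal sketch of driving such commands through Mongo.Ecto.command/2 as described above; the command documents follow the MongoDB command reference, and the repo and collection names are hypothetical:

```elixir
# A server health check via a raw database command.
Mongo.Ecto.command(MyApp.Repo, ping: 1)

# The specialized collection helpers mentioned above would boil
# down to commands like these.
Mongo.Ecto.command(MyApp.Repo, create: "events")
Mongo.Ecto.command(MyApp.Repo, drop: "events")
```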
Basically the equivalent of Ecto.Adapters.SQL.query. We need to discuss what the API should be like, because the driver exposes multiple functions where the SQL drivers only have one.
I would be willing to work on this.
Not sure whether it is a bug or not.
This works:
Query.from(
  c in CodeChallengeEx.Campaign,
  select: c,
  limit: 10
)
|> Repo.all
This doesn't:
n = 10

Query.from(
  c in CodeChallengeEx.Campaign,
  select: c,
  limit: ^n
)
|> Repo.all
** (FunctionClauseError) no function clause matching in Mongo.Ecto.NormalizedQuery.offset_limit/1
(mongodb_ecto) lib/mongo_ecto/normalized_query.ex:272: Mongo.Ecto.NormalizedQuery.offset_limit(%Ecto.Query.QueryExpr{expr: {:^, [], [0]}, file: "/Users/leikind/projects/code_challenge_ex/lib/code_challenge_ex/campaign.ex", line: 20, params: nil})
(mongodb_ecto) lib/mongo_ecto/normalized_query.ex:220: Mongo.Ecto.NormalizedQuery.limit_skip/1
(mongodb_ecto) lib/mongo_ecto/normalized_query.ex:65: Mongo.Ecto.NormalizedQuery.find_all/6
(mongodb_ecto) lib/mongo_ecto.ex:419: Mongo.Ecto.execute/6
(ecto) lib/ecto/repo/queryable.ex:95: Ecto.Repo.Queryable.execute/5
(ecto) lib/ecto/repo/queryable.ex:15: Ecto.Repo.Queryable.all/4
(code_challenge_ex) lib/code_challenge_ex/clone_service.ex:4: CodeChallengeEx.CloneService.clone/1
I am sorry if it is not a bug; in that case, what am I doing wrong?
The problem is described here:
https://gist.github.com/fcruxen/9ad456039eee91632cd8
José Valim suggested in the Google Groups topic using the timeout attribute when configuring the Repo - same thing, queries still fail after 5000ms.
Increasing the pool size had no effect either.
Thanks in advance,
Felipe Cruxen
Hi,
I don't see any docs on adding indexes, can you please help?
Today's syntax:
from p in Post, update: [set: ["field.nested.really": bar]]
Does not work due to the type checker.
Hopefully this is just my own lack of understanding, but I'm trying to use mongodb_ecto (0.1.2 or HEAD) in a Phoenix (1.1.2) app with Phoenix Ecto (2.0.0).
I'm trying to use javascript in the where clause as documented, but it complains:
** (Ecto.QueryError) web/models/post.ex:15: value `%Mongo.Ecto.JavaScript{code: "this.value == count", scope: [count: 1]}` in `where` cannot be cast to type :boolean in query:
The model looks something like this:
defmodule Project.Post do
  use Project.Web, :model
  import Ecto.Query, only: [from: 2]
  import Mongo.Ecto.Helpers
  alias Project.Repo

  schema "posts" do
    field :value, :integer
  end

  def test do
    j = Mongo.Ecto.Helpers.javascript("this.value == count", count: 1)
    q = from p in Project.Post, where: ^j
    Repo.all(q)
  end
end
The stacktrace suggests that mongodb_ecto isn't even getting a look in, so this quite possibly is a phoenix_ecto bug?
(elixir) lib/enum.ex:1387: Enum."-reduce/3-lists^foldl/2-0-"/3
(elixir) lib/enum.ex:1102: Enum."-map_reduce/3-lists^mapfoldl/2-0-"/3
(ecto) lib/ecto/repo/queryable.ex:91: Ecto.Repo.Queryable.execute/5
(ecto) lib/ecto/repo/queryable.ex:15: Ecto.Repo.Queryable.all/4
(stdlib) erl_eval.erl:669: :erl_eval.do_apply/6
(iex) lib/iex/evaluator.ex:117: IEx.Evaluator.handle_eval/5
(iex) lib/iex/evaluator.ex:110: IEx.Evaluator.do_eval/3
(iex) lib/iex/evaluator.ex:90: IEx.Evaluator.eval/3
Could you shed any light?