mimetis / dotmim.sync

A brand new database synchronization framework, multi-platform and multi-database, developed on top of .NET Standard 2.0. https://dotmimsync.readthedocs.io/
License: MIT License
Happy to help!
Hello,
I have tried to sync a database of about 250 MB; the initial "Ensuring" phase takes some time, but the progress remains at 0 until the inserts start.
Would it be possible to provide feedback during this phase?
Be able to deprovision the datastore from the sync provider.
The idea is to create a CLI (Command Line Interface) to be able to sync through a command line.
Please post a comment to discuss what you think :)
The CLI will be added as a package to the dotnet command.
The command-line executable will be called (by convention) dotnet-sync.exe, to respect the naming convention of the dotnet runtime.
Once installed, the command line is then called like this: dotnet sync [command] [--arguments]
The CLI will create what we can call a CLI Project.
The CLI Project represents a sync process between a server and a client.
No sync process can be launched if no CLI Project is loaded (actually, a default project is always created when launching the CLI).
Here is the most straightforward sample to launch a Sync process:
$ dotnet sync provider -p SqlSyncProvider -c "Data Source=(localdb)...." -s Server
Server provider of type SqlSyncProvider added.
$ dotnet sync provider -p SqliteSyncProvider -cs "adWorks.db" -s Client
Client provider of type SqliteSyncProvider added.
$ dotnet sync table --add ProductCategory, Product
Table ProductCategory added.
$ dotnet sync -s
Sync Start
Begin Session.
Ensure Scopes
Ensure Configuration
Configuration readed. 2 table(s) involved.
Selecting changes...
Changes selected : 12230
Applying changes...
Changes applied : 12230
Writing Scopes.
436250e7-316e-45e5-ad9e-bae089e528ff synced at 06/11/2017 10:46:37.
59a439cf-73e5-4cfb-8e19-092560554495 synced at 06/11/2017 10:46:37.
End Session.
Using a YAML file can be easier (see the YAML section below)
$ dotnet sync yaml -f "projectsync.yml"
Loading yaml file
Project "projectsync01" loaded
$ dotnet sync -s
Sync Start
Begin Session.
Ensure Scopes
Ensure Configuration
Configuration readed. 2 table(s) involved.
Selecting changes...
Changes selected : 10
Applying changes...
Changes applied : 10
Writing Scopes.
436250e7-316e-45e5-ad9e-bae089e528ff synced at 06/11/2017 10:46:37.
59a439cf-73e5-4cfb-8e19-092560554495 synced at 06/11/2017 10:46:37.
End Session.
After a first sync, your last project is always loaded when you launch the CLI.
You can then directly launch the next sync process:
$ dotnet sync -s
Sync Start
Begin Session.
Ensure Scopes
Ensure Configuration
Configuration readed. 2 table(s) involved.
Selecting changes...
Changes selected : 0
Applying changes...
Changes applied : 0
Writing Scopes.
436250e7-316e-45e5-ad9e-bae089e528ff synced at 06/11/2017 10:46:37.
59a439cf-73e5-4cfb-8e19-092560554495 synced at 06/11/2017 10:46:37.
End Session.
The CLI will create a CLI Project stored in a datastore. Actually, a SQLite database is used to store CLI projects.
A CLI Project is mandatory to be able to launch a synchronization between two databases. When launching the CLI, a default project is loaded (called __sync_default_project).
A CLI Project can be represented by a YAML file that describes it (see the section on YAML below).
A CLI Project is defined by:

TODO: How to install the CLI?
References:
- dotnet command: abusing the dotnet core CLI extensibility model: https://docs.microsoft.com/fr-fr/dotnet/core/tools/extensibility
- dotnet ef CLI: https://docs.microsoft.com/en-us/ef/core/miscellaneous/cli/dotnet : great, but requires a .csproj file.

Since we are called from the dotnet command, and to be compliant with the dotnet command extensibility, each command will begin like this:
$ dotnet sync [command] [--arguments]
Some useful requests don't need a [command] and are called directly with their [arguments]:

-v or --version : Get the current CLI & Dotmim.Sync version.
-h or --help : Get the help information.
-s or --sync : Launch the sync process on the currently loaded project.
--verbose : Enable verbose output.

All project commands are prefixed with [command] project.
The project command is the default command; you can omit it.
Every sync process is associated with a project. When you launch the CLI for the first time, a default project is created (called __sync_default_project).
Arguments available within the project command:
-n or --new : Create a new project with a unique name.
-l or --load : Load an existing project by its unique name.
-d or --delete : Delete an existing project by its unique name.
-ls or --list : List all projects created and saved within the CLI.

Creating a CLI project called "syncproject01":
$ dotnet sync project -n "syncproject01"
Creating new project "syncproject01"
Project "syncproject01" loaded
Getting an existing CLI Project :
$ dotnet sync project -l "syncproject01"
Project "syncproject01" loaded
Deleting an existing CLI Project :
$ dotnet sync project -d "syncproject01"
Project "syncproject01" deleted
Once you have loaded your CLI project, you can add providers.
You must add one server and one client provider.
All provider commands are available through the [command] provider:
$ dotnet sync provider [arguments]
Arguments available:

-p or --providerType : Add a provider type, like SqlSyncProvider, SqliteSyncProvider or MySqlSyncProvider.
-c or --connectionString : Add the provider connection string.
-s or --syncType : Add the provider sync type: can be Server or Client.

Adding providers of type SqlSyncProvider as server side and SqliteSyncProvider as client side:
$ dotnet sync provider -p SqlSyncProvider -c "Data Source=(localdb)...." -s Server;
Server provider of type SqlSyncProvider added to project syncproject01
$ dotnet sync provider -p SqliteSyncProvider -cs "adWorks.db" -s Client;
Client provider of type SqliteSyncProvider added to project syncproject01
Calling dotnet sync provider again will replace the current provider:
$ dotnet sync provider -p MySqlSyncProvider -c "...." -s Server;
Server provider of type MySqlSyncProvider replaced in the project syncproject01
Once you have loaded your CLI Project, you can add tables.
At least one table is mandatory to be able to launch the sync process.
All table commands are available through the [command] table:
$ dotnet sync table [arguments]
Arguments available:

-a or --add : Add a table identified by its name.
-s or --schema : Set the schema name for the current table. Only used with the SqlSyncProvider.
-r or --remove : Remove the specified table from the sync process.
-d or --direction : Set the sync direction for the current table. Can be Bidirectional, UploadOnly or DownloadOnly.

Adding tables to the current project:
$ dotnet sync table --add ProductCategory
Table ProductCategory added to the project syncproject01
$ dotnet sync table -a Product -d downloadonly
Table Product [DownloadOnly] added to the project syncproject01
$ dotnet sync table -a Employee -d downloadonly -s Sales
Table Sales.Employee [DownloadOnly] added to the project syncproject01
You can specify several options for your current CLI Project through the configuration command.
All configuration options are available through the [command] conf:
$ dotnet sync conf [arguments]
Arguments available:

-c or --conflict : Can be ServerWins or ClientWins. Default is ServerWins.
-s or --batchSize : Set the batch size. Default is 1000.
-d or --batchDirectory : Set the batch directory. Default is your environment temp folder.
-f or --format : Set the serialization format. Can be json or dm. Default is json.
-o or --bulkOperations : Set whether to use bulk operations when the SqlSyncProvider is used. Default is true.

Adding configuration to the currently loaded CLI Project:
$ dotnet sync conf -c ClientWins -f dm -o true
Working with a YAML file could be easier.
All YAML options are available through the [command] yaml:
$ dotnet sync yaml [arguments]
Arguments available:

-f or --file : Set the file name to load. If not set, the default directory is used.
-d or --directory : Set the directory where the YAML file is stored.

Loading a CLI Project stored in a YAML file:
$ dotnet sync yaml -f "projectsync.yml"
Example of what such a file can contain:
project: projectsync01
providers:
- providerType: SqlSyncProvider
connectionString : "Data Source=(localdb)\\MSSQLLocalDB; Initial Catalog=AdventureWorks;Integrated Security=true;"
syncType: Server
- providerType: SqliteSyncProvider
connectionString : "adWorks.db"
syncType: Client
tables:
- name : ProductCategory
schema : dbo
syncDirection : bidirectional
- name : Product
schema : dbo
syncDirection : bidirectional
configuration:
- conflictResolution : ServerWins
- downloadBatchSizeInKB : 1000
- batchDirectory : "C:\\tmp"
- serializationFormat : json
- useBulkOperations : true
Available on github : https://github.com/aaubry/YamlDotNet
I tested database synchronization with 50 tables, with a synchronization interval of 30 seconds. I noticed that changes made at the moment of synchronization (1-3 seconds) do not register as changes and are then not synchronized.
I made two videos to better understand the situation.
https://drive.google.com/open?id=1Gb4e9qjVwm5yzu_kv_n1oAFSv24QTN0m
https://drive.google.com/open?id=1yZ7dP10c968cXyucr5tCteRTB4ti0ksX
This is the code I use
private async Task DbSync()
{
    SqlSyncProvider serverProvider = new SqlSyncProvider(db1.ServerConnectionString);
    SqlSyncProvider clientProvider = new SqlSyncProvider(db1.ClientConnectionString);
    string[] tables = new DbAccess().GetTablesForSync(db1.ServerConnectionString);
    SyncAgent agent = new SyncAgent(clientProvider, serverProvider, tables);
    var s = await agent.SynchronizeAsync();
    GlobalVariables.syncLog += $"Total Changes DO : {s.TotalChangesDownloaded}, UP : {s.TotalChangesUploaded}, conflicts : {s.TotalSyncConflicts}, errors : {s.TotalSyncErrors}, Start: {s.StartTime} Duration : {s.CompleteTime - s.StartTime} \n";
}
Hi! Your project is very interesting to me. I also see demo entities referencing XAF core tables... this is the framework that I use for business, so working with Dotmim in sync scenarios would be very promising.
I wonder if there is a recommended approach, not to make a complete sync, but to use Dotmim's trigger capabilities and agent logic, subscribing to the event or method that tries to sync data to the client provider. My idea is to make light use of it and do a custom insert, update or delete on the client side myself, because it is not a standard RDBMS (I plan on using Couchbase, syncing data with MySQL).
Regards,
José Enrique
Hello Sébastien,
I have done some tests with my databases and discovered that field names like
[File], [Read], [From], [To], [Document] are not supported.
Is this by design, or will you add support for these field names too?
Thanks
Reinhold
Hi!
I know that Dotmim.Sync focuses on client-server synchronization.
For one customer, we have the requirement to sync from client to client as well.
Do you have an idea whether this would be possible to implement in Dotmim.Sync at all?
And as I am sure you had quite serious reasons not to implement it, what are those reasons?
Thank you for your insights!
Create the PostgreSQL provider.
Based on the MySqlSyncProvider, this new provider should not be too complicated to implement.
I wanted to use Dotmim.Sync in my project, but I ran into an error. When I examined the source code, the InsertOrUpdateScopeInfo(ScopeInfo scopeInfo) method in ...ScopeInfoBuilder.cs has scopeInfo.Id = (Guid) reader ["sync_scope_Id"];
This line throws an error because 'I' and 'i' are different characters under the Turkish uppercase/lowercase rules.
Hi,
I have used the client I posted in the other thread to test sync with MariaDB, and I get the following result:
Master: http://localhost:56782/api/values
Slave1: Server=127.0.0.1; Port=3306; Database=DotminXAF1; Uid=root; Pwd=geheim;
Slave2: Server=127.0.0.1; Port=3306; Database=DotminXAF2; Uid=root; Pwd=geheim;
Sync Start
Begin Session.
Ensure Scopes
Ensure Configuration
Configuration readed. 8 table(s) involved.
End Session.
Error occured during EnsureDatabase of type DataStore: You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near 'inFile VARCHAR(36),
inOptimisticLockField INT,
inGCRecord INT
)
BEGIN
IF ((SEL' at line 4
Sync Ended. Press a key to start again, or Escapte to end
Hello,
after updating to latest packages, the MySqlCommandBuilder seems to be substituted with a new class.
Can you have a look?
Hi
Yesterday, I was working on a piece of software based on SyncWinRT, and I remembered that it has a quite significant issue:
It requires us to run SQL scripts at setup time so that the synchronization can actually work.
As we are using EntityFramework Code First, we had to include these scripts in the EF migrations.
That, however, had a quite negative impact: We could not use Effort - an in-memory provider for EF - anymore.
This significantly slowed down our test performance.
So my question:
What is the best way to use Dotmim.Sync in combination with EF Migrations?
Did you think about this? :-)
Be able to set a direction on each table:
Since we are adding tables as a string array, we have to change the way we add tables to the configuration to be able to add this option.
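As an illustration only, this option could surface through the existing per-table configuration indexer (the same one used elsewhere in this document for the Schema property). This is a hypothetical sketch: the SyncDirection property and its values are assumed names, not a confirmed API, and the providers are assumed to be already constructed.

```csharp
// Hypothetical sketch only: SyncDirection and its values are assumed names.
// Tables are still added as a plain string array...
var agent = new SyncAgent(clientProvider, serverProvider,
    new[] { "ProductCategory", "Product" });

// ...and the direction could then be set per table afterwards,
// mirroring the existing agent.Configuration["Address"].Schema = "SalesLT";
agent.Configuration["Product"].SyncDirection = SyncDirection.DownloadOnly;
agent.Configuration["ProductCategory"].SyncDirection = SyncDirection.Bidirectional;
```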
Hello!
I was wondering how you are planning to solve the problem that client and server schemas are not always quite the same.
For example: In our production system, we have a tasks table.
This table looks different on the server and client. I.e. server and client table share a set of columns, but both, server and client, add additional columns to their own tasks tables.
This also leads me to the following question:
Are you planning to solve the problem of the old sync framework, which is that those schemas must be equal and must never change?
(e.g. a schema update would be necessary)
You could:
add an (optional) middle layer between client and server that can process all rows and convert them from one schema to another. That way you could allow downward compatibility in case the server schema changes.
Please have a look at this library, as I think it could save you a lot of work: DatabaseSchemaReader.
Not only does it allow you to inspect the schema of all the databases you want to support, it even allows you to create schema-migration SQL and the like.
So - very powerful stuff here.
Also, it runs on .NET Core.
Create the Oracle provider, based on the SqlSyncProvider
Is there an option to put a prefix on the tracking tables and stored procedures ?
Need to change :
MySqlParameter sqlParameter4 = new MySqlParameter("update_timestamp", MySqlDbType.Int64);
To
MySqlParameter sqlParameter4 = new MySqlParameter();
sqlParameter4.ParameterName = "update_timestamp";
sqlParameter4.MySqlDbType = MySqlDbType.Int64;
See mysql-net/MySqlConnector#402 for more information about this issue
Breaking Changes : Need to delete all stored procedures and re-generate them.
Hello Sebastien,
I have successfully synchronized a huge database (which I uploaded), but starting the same with syncConsole and WebServer gives an object ref not set error.
How can I extract a better error message to pass this one to you?
Hello Sebastien,
I'm back in "test mode" :-).
I have a DB with about 2GB and local sync works fine (MSSQL-MSSQL).
But as soon as I use the WebProxy (debug mode in VS) I get an error "Object Reference not set".
I suppose this is just a timeout error.
How can I increase timeout when using WebProxy?
Thanks
Reinhold
Would be great if supports authentication, especially when dealing with Web API
Hi,
Currently, adding for example a ";" at the end of the SQLite connection string produces a database file named: TestDB.db;
Could this be handled by parsing "Data Source=TestDB.db; Journal Mode=WAL;"?
It is actually possible to activate Journal Mode=WAL by adding this line in SQLiteSyncProvider.cs:
builder.JournalMode = SQLiteJournalModeEnum.Wal;
Maybe it is possible to read this setting from the connection string.
Hello Sebastien,
I'm not sure if you want to exclude such a table structure, but in any case I would like to inform you that a table with the following structure causes sync to stop:
CREATE TABLE [dbo].[Log](
[Oid] [uniqueidentifier] ROWGUIDCOL NOT NULL,
[TimeStamp] [datetime] NULL,
[Operation] nvarchar NULL,
[ErrorDescription] nvarchar NULL,
[OptimisticLockField] [int] NULL,
[GCRecord] [int] NULL,
CONSTRAINT [PK_Log] PRIMARY KEY CLUSTERED
(
[Oid] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
Hello Sebastien,
I have recorded a sample video to demonstrate the problem. In this video I use 3 MSSQL databases (DotminXAF as master and DotminXAF1, DotminXAF2 as slaves).
First I start the web server from your samples, slightly modified, and then the client console, which syncs both slaves.
Now, on the first attempt the sync works fine, but when I modify the same field on both slaves, the changes from the first slave are synced to the master, but the second slave does not get the changes! Please check out the video.
I don't know if there is a solution for this,
but if I make any column unique (not the primary key) and try to sync it (server wins, client wins, filters, no filters), it just goes crazy with exceptions and cancels the sync process.
I think that if it inserts a new row with a unique column value that already exists, it should try to replace/update the row depending on whether server or client wins, or at least give me the option to do so.
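For what it's worth, the conflict hook shown later in this collection could be a starting point. A minimal sketch, assuming the ApplyChangedFailed event and ApplyAction.RetryWithForceWrite from the project's conflict documentation, and an already-configured SyncAgent named agent; whether this hook actually fires for non-primary-key unique constraint violations is exactly what this issue questions:

```csharp
// Sketch, not a confirmed fix: ask the framework to re-apply the winning
// row with force, replacing the conflicting one, instead of cancelling
// the sync process with an exception.
agent.ApplyChangedFailed += (sender, e) =>
{
    e.Action = ApplyAction.RetryWithForceWrite;
};
```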
Hi!
I have had a look at Dotmim.Sync and compared its current capabilities with the stuff that we currently have in production. (We are using SyncWinRT - I guess you know that library ;-))
What we definitively need is:
1) a way to specify a filter on a table that is being synced.
Currently, when a filter is active, the Microsoft Sync Framework regards all filtered records as "out of the sync scope". However, when rows change in a way that should re-include them in that sync scope, the old Sync Framework was not able to detect this.
We made that possible by updating the "_tracking" tables manually, using our server-side logic.
1b) we even need to be able to filter based on a table join!
We currently achieve that by modifying the SQL scripts (stored procedures) generated by the framework so that they add a join expression. The complexity in that approach is that the join filter also needs to take into account that the joined table might have changes as well. (So while the current row should not have been synced to the client before, it now should be, based on some changes in the joined table.)
The only part where it got too complicated was with many-to-many joins. (I might be able to assist you with that feature ;-))
2) Another part that was kinda cumbersome is that SyncWinRT identified the sync master by its fully qualified domain name.
That was a problem when users had to use a different URL while in their corporate network.
A quite easy fix would be, that the master database generates some guid, which is in turn used to identify it. In that case, the URL does not matter anymore.
3) allow to change serialization format
I have been playing around with protobuf-net recently and I am amazed by its performance.
While it normally uses reflection just like any other formatter, it also allows you to compile a DLL with a "hard-coded" serializer. Dude, that one is sooooo fast!!!
So I would really like to use that instead of JSON or the like. (Again: I could help you with that.)
4) Support one-way sync.
That one is interesting, because we are thinking about syncing an Event Stream (NEventStore) from server to client, thereby applying some filters (as in (1) and (1b)).
However, as the event stream must only be edited on the server, there is no need for two-way syncing.
So a one-way sync option for a specific table would be great!
5) Use dotmim.sync for a file syncing api
A colleague of mine created a GitHub library that allows you to split up your binary files into parts, send them to a Web API server, and fuse them back together.
That allows you to send large files over weak internet connections.
Now it would be great to have that combined with dotmim sync as a file sync framework:
That way, on the client side you know which files are already uploaded, and which are pending.
You can also send the list of files a client requires (using sync) and let him/her decide which files to actually download.
Sure, this is far beyond the scope of Dotmim.Sync, but I wanted to share this idea with you nevertheless, as you might have some interesting ideas for it too. ;-)
So what do you think about my ideas and feature requests?
Hello, I'm trying to sync a database on MSSQL 2016 and get, for example, this error:
Web sync start
Error occured during TableChangesSelecting of type DataStore: The stored procedure "dbo.XEvent_selectchanges" has not been found.
Sync Ended. Press a key to start again, or Escapte to end
However, the table and XEvent_tracking are created in the target database.
Shall I provide a source and database sample?
I have 3 databases:
Master, Slave1, Slave2.
Configuration is set to: ConflictResolutionPolicy.ServerWins;
a) change the same field in Master and Slave1 -> Slave1 is aligned to Master
b) change the same field in Slave1 and Slave2 -> the fields on Slave1 and Slave2 retain their values. The value from Slave1 is passed to Master, but not forwarded to Slave2.
Hi!
I recently stumbled across a shortcoming of Microsoft Sync Framework again and wanted to know if this has been fixed in Dotmim.Sync.
With Sync Framework, you had to run an initial sync before editing data.
If you added data before the initial sync with the server, you'd get a "serverBlob is null" exception.
But sometimes a user has some app which is not yet connected to a server, and (s)he adds data anyway. When online again, (s)he naturally wants to push those changes to the server nonetheless.
Is this possible with Dotmim.Sync, or do you still have to run an initial sync before editing data?
We have the following scenario in our application:
Would it be possible to add a "Transmitted" bit column onto the tracking tables?
This would need to be updated in the same transaction that selects the changes to transmit to the client during a sync session.
It seems to me that this would already do the trick on the server side.
As far as I understand, dotmim.sync creates a new scope id for each user/device.
That scope id can then be used (by application logic outside of the sync framework) to correlate a user and his/her device(s).
What do you think?
Hello Sebastien,
do you see any chance to introduce a switch (maybe by recompiling the code on my own) to store GUIDs in SQLite as char?
My clients do not support GUID blobs out of the box, and it would be a lot of work to change.
Thanks
Reinhold
Is it possible to include System.Data.SQLite as the SqliteSyncProvider? So that we can use SQLite database version 3 and a password.
Thanks
I'm not sure how merging rows works when a conflict is detected, so I have a question regarding this.
Suppose that on the server side a user modifies only one column value, and on the client replicated database side the same object/row is modified by another user who only changes a different column. The obvious question is: what will happen? Will DotmimSync merge the row, preserving the changes from both sides?
Regards,
José Enrique
Today, only the dbo schema is allowed.
Make a modification to the SqlSyncProvider project to handle any schema.
Hello Mimetis,
It would be better if we could provide a reference column from the table to be synced, on which synchronization would work. For example, we are generating a unique column in addition to the auto-generated primary key in the table.
Let me explain the scenario of why we need this:
Let's say we have three offline databases A, B and C, which need to be synchronized to a central database X.
When we add data to a table, it will auto-generate data in the primary key column. So offline databases A, B and C will each generate their own data in a table, and this data needs to be unioned and stored in database X, which generates its own primary key data in the column.
So when we sync the data from A, B or C to database X, it overwrites the data in X with the same primary key. Let's say there is a table City in all the databases, and the first row in each database is as follows:
ID 1 in A: NYC
ID 1 in B: Sydney
ID 1 in C: Delhi
Now, every time I sync, the data from the last database to sync will overwrite the data at ID 1 in X.
What I need is:
ID 1 in X: NYC
ID 2 in X: Sydney
ID 3 in X: Delhi
Kindly suggest a workaround or an already implemented feature in your marvellous project.
Thanks,
Nischal
Hi,
I'm trying to run this sync sample.
It works when I put only one table in the tables array, but when I add more or all of the tables of my database, it does not. What am I doing wrong here?
Thanks
Reinhold
Hi Sébastien,
I think there is a small issue when synchronizing via the webserver. I have added the tables to sync in the webserver, and have modified the SyncServer project to connect to the webserver.
While the included database syncs fine when using direct sync, it produces the following error via webserver: System.String cannot be converted to System.Byte.
Please restore the database, adjust the connection strings, and run the web server + SyncServer.
Regards
Reinhold
DotminSyncWeb.zip
Hello Sebastien,
as sync with local databases seems to work quite well, I would like to do some deeper tests using "Dotmim.Sync.SampleConsole" and "UWPSyncSampleWebServer".
But I get some errors when starting the "UWPSyncSampleWebServer". Could you do a quick check of this combination and let me know when it's ready to check out?
Thanks
Reinhold
Hello,
I know this is actually not supported, but IMO it should be part of the solution:
Sample: table "Folders" with fields Id [guid], ParentId [guid] and FolderName [string].
Currently I need to set this up without a foreign key, as otherwise sync stops.
It's not a showstopper, but if there is a solution to support this scenario, that would be great.
thx
reinhold
AutoIncrement fails if the row data type is Int64.
The bug happens when a conflict occurs on a row whose primary key is an Int64 auto-increment.
Hello!
I recently came up with multiple sync scenarios that cause me some headache.
In both cases, there might not be an actual sync conflict (by two users editing the same row), but constraint violations.
#Scenario 1:
We have a list of tasks in our system. Those tasks (somebody has to do something until the date xy) can be created via our web UI. But additionally, they can come from an external system which utilizes our REST API.
Now for those systems to identify their tasks, we provide a special column "reference" which is an arbitrary 120 character string. The only requirement is, that it is unique.
We ensure this - obviously - using a database constraint.
However, it can now happen that two users add a new task with the same value for "reference". So although both tasks have unique primary keys, there will be a constraint violation error.
My question: Are there any sync tactics to overcome these problems? What would you do to solve this kind of problem with Dotmim.Sync?
#Scenario 2
Say you have some masterdata (list of equipments) that you store in a historized fashion.
Meaning: Each row identifying an equipment has a ValidFrom and ValidTo column.
Now whenever an equipment is modified, a new row is added to the table with [ValidFrom = getutcdate()] and validTo = infinity. And the "old" record is updated to [ValidTo = getutcdate()]
Now say you sync this table to your clients.
Both clients A and B update an existing equipment, thereby creating a new row each.
How would you solve this sync conflict?
I am really banging my head on this one...
The last version (called 0.1.5) integrates a major update of the SqliteSyncProvider.
Actually, I have switched the base SQLite provider from System.Data.Sqlite to Microsoft.Data.Sqlite.
For reference, you will find useful information about those two SQLite providers here:
System.Data.Sqlite is not (yet) .NET Standard compliant. It works, thanks to the .NET Standard shim system, but in many ways it is also painful in some situations.
Microsoft.Data.Sqlite is a lightweight version, developed by @bricelam from the Entity Framework team.
One of the most impactful issues comes from the Guid behavior in SQLite: it is stored as a blob by default.
System.Data.Sqlite allows you to specify a BinaryGUID directive to be able to store a Guid as a String.
This is not a SQLite behavior, just a system flag managed by System.Data.Sqlite.
Microsoft.Data.Sqlite does not allow us to add any parameter on the connection string.
See why here :
So, every SQLite database will now be generated with the blob data type for every Guid type.
If you want to use the last version (and you have to!), you have to regenerate your SQLite databases.
The breaking change from a string to a blob type for any Guid is not compatible.
If you are inside a .NET application using the Microsoft.Data.Sqlite framework, you have to use a SqliteParameter to be able to interact with a Guid stored as a blob:
var insertRowScript =
    $@"INSERT INTO [ServiceTickets] ([ServiceTicketID], [Title], [Description], [StatusValue], [EscalationLevel], [Opened], [Closed], [CustomerID])
       VALUES (@id, 'Insert One Row in Sqlite client', 'Description Insert One Row', 1, 0, datetime('now'), NULL, 1)";

int nbRowsInserted = 0;
using (var sqlConnection = new SqliteConnection(fixture.ClientSqliteConnectionString))
{
    using (var sqlCmd = new SqliteCommand(insertRowScript, sqlConnection))
    {
        sqlCmd.Parameters.AddWithValue("@id", newId);
        sqlConnection.Open();
        nbRowsInserted = sqlCmd.ExecuteNonQuery();
        sqlConnection.Close();
    }
}
If you want to make a SQLite SELECT statement on any table with a blob data type containing a Guid, you can make a call like this:
select quote(sync_scope_id), sync_scope_name from scope_info
If you want to use a PRAGMA directive, you can't use a connection string parameter anymore. Fortunately, you can send a PRAGMA request once your connection is opened:
SqliteConnection c = new SqliteConnection($"Data Source=fabrikam");
c.StateChange += (_, e) =>
{
    if (e.CurrentState == ConnectionState.Open)
    {
        var cmd = c.CreateCommand();
        cmd.CommandText = "PRAGMA journal_mode=WAL;";
        var i = cmd.ExecuteScalar();
    }
};
Hi!
Seems you are making great progress here! It really looks awesome!
I went through the wiki, and saw the schema option:
https://github.com/Mimetis/Dotmim.Sync/wiki/04-Sql-Server-provider-:-Schema
I know a CMS called "Orchard" (Orchard Core is now the vNext, in beta) which also supports a wide range of SQL databases and needs something like a schema even where it is not supported (MySQL, SQLite, etc.).
Here, they simply use the schema name as a prefix (e.g. with an underscore).
So in the example
agent.Configuration["Address"].Schema = "SalesLT";
On SQLite this could become something like
SalesLT_Address
This could also be configurable.
This would be a nice addition :-)
Please try the following steps:
Hello,
What do you think about a Filter for post request execution to execute the sync ?
public class SyncAttribute : IResultFilter
{
    public string[] OnMethods { get; set; } = new[] { "POST" };

    public void OnResultExecuted(ResultExecutedContext context)
    {
        if (OnMethods.Contains(context.HttpContext.Request.Method))
        {
            WebProxyServerProvider _webProxyService = (WebProxyServerProvider)context.HttpContext.RequestServices.GetService(typeof(WebProxyServerProvider));
            // HandleRequestAsync returns a Task; block on it here, since
            // IResultFilter offers no async counterpart in this sketch
            _webProxyService.HandleRequestAsync(context.HttpContext).GetAwaiter().GetResult();
        }
    }

    public void OnResultExecuting(ResultExecutingContext context)
    {
    }
}
Hi Sébastien,
I have discovered a bug and will try to describe the steps to reproduce it.
a) Master database on MSSQL
b) sync to SQLite database Slave1
c) sync to SQLite database Slave2
d) Now insert a new record in one of the slave databases and sync. The record is uploaded to master.
e) Sync on second slave: no record is downloaded to slave2
f) update the record on slave1 and sync: update is synched to master
g) sync on slave2, record is now downloaded
If this is not enough information to reproduce, please let me know and I will setup a small reduced sample.
Reinhold
Hello,
I have now tested sync with the sample web server and sync client I posted. Both (all three) ends are MSSQL.
Adding 1 file works; adding more files results in:
Begin Session.
Ensure Scopes
Ensure Configuration
Configuration readed. 8 table(s) involved.
Selecting changes...
Changes selected : 56
End Session.
UNKNOW EXCEPTION : Not Found
Sync Ended. Press a key to start again, or Escapte to end
Hello!
Just had a look at the conflict resolution documentation here
https://github.com/Mimetis/Dotmim.Sync/wiki/06-Sync-conflict
My question:
What if I wanted to manually merge the information from client and server (combine them)?
Would that be possible by editing the client rows like so?
var dmRowServer = e.Conflict.RemoteChanges.Rows[0];
var dmRowClient = e.Conflict.LocalChanges.Rows[0];

// Example 1 : Resolution based on rows values
if ((int)dmRowServer["ClientID"] == 100 && (int)dmRowClient["ClientID"] == 0)
{
    dmRowClient["ClientID"] = 100;
    e.Action = ApplyAction.RetryWithForceWrite;
}
Would that already work?
But in that case, it would overwrite all the columns of the server row.
Would it also be possible to modify the server row (so I can edit only the one or two columns from the client that I care about, without risking to overwrite server columns that should not be affected) with client data and store that one?