capgemini / xrm-datamigration

Export and import data for Microsoft Dataverse. Supports JSON and CSV.

License: MIT License

Language: C# 100.00%
Topics: power-apps, power-platform, dynamics-365, dynamics-crm, dynamics, common-data-service, cds, microsoft, powerplatform

xrm-datamigration's Introduction

Capgemini CRM Data Migration


Description

This Data Migration project provides a flexible, powerful engine based on the XRM SDK that allows Dynamics CRM configuration, reference, and seed data to be extracted, stored in version control, and loaded into target instances. The engine supports two file formats, JSON and CSV, and handles everything from simple reference data entities (e.g. Titles, Countries) to more complex scenarios around Security Roles and Teams. This allows data to be managed in the same way as code, so a release can be created that loads the data required to support the released functionality.

Table Of Contents

  1. Installation
  2. Usage
    1. Obfuscation
  3. Contributing
  4. Credits
  5. License

Installation

Visual Studio Instructions

Clone the Git Repository

Open the example project (Capgemini.Xrm.Datamigration.Examples) and edit the configuration file (App.config):

  <applicationSettings>
    <Capgemini.Xrm.Datamigration.Examples.Properties.Settings>
      <setting name="DemoScenarioName" serializeAs="String">
        <value>Contacts</value>
      </setting>
      <setting name="CrmExportConnectionString" serializeAs="String">
        <value>AuthType=OAuth; Username=name@example.onmicrosoft.com; Password=*********; Url=https://contosotest.crm.dynamics.com; AppId=51f81489-12ee-4a9e-aaae-a2591f45987d; RedirectUri=app://58145B91-0C36-4500-8554-080854F2AC97; LoginPrompt=Auto; RequireNewInstance=True;</value>
      </setting>
      <setting name="CrmImportConnectionString" serializeAs="String">
        <value>AuthType=OAuth; Username=name@example.onmicrosoft.com; Password=*********; Url=https://contosotest.crm.dynamics.com; AppId=51f81489-12ee-4a9e-aaae-a2591f45987d; RedirectUri=app://58145B91-0C36-4500-8554-080854F2AC97; LoginPrompt=Auto; RequireNewInstance=True;</value>
      </setting>
      <setting name="UseCsvImport" serializeAs="String">
        <value>False</value>
      </setting>
    </Capgemini.Xrm.Datamigration.Examples.Properties.Settings>
  </applicationSettings>
  • DemoScenarioName - Scenario name from the scenarios in the DemoScenarios folder
  • CrmExportConnectionString - Connection string for the source Dynamics 365 instance - used by export
  • CrmImportConnectionString - Connection string for the target Dynamics 365 instance - used by import
  • UseCsvImport - True to use the CSV format, False to use JSON files

The example project will extract data from one CRM instance (Export Instance) to a folder and then load the data into a second CRM instance (Import Instance).

Initially, set up some data in the Export CRM instance

Run the console application and follow the messages

In the bin folder an output folder and files containing the exported data will be created, e.g. when running the Contacts scenario

In the Import CRM instance you can check that all the data has been created as expected.

Command Line Instructions

Download the Latest Release of Capgemini.Xrm.DataMigration.Engine.zip

Unblock the zip and extract the contents

As part of the package there are a number of examples in the Demo Scenarios folder. Please visit the Examples wiki page for more details.

To execute a scenario, navigate to the Capgemini.Xrm.Datamigration.Examples.exe.config file and edit it. You can create your own scenarios by adding additional DemoScenarios folders and creating your own configuration files.

Edit configuration file:

  <applicationSettings>
    <Capgemini.Xrm.Datamigration.Examples.Properties.Settings>
      <setting name="DemoScenarioName" serializeAs="String">
        <value>Contacts</value>
      </setting>
      <setting name="CrmExportConnectionString" serializeAs="String">
        <value>AuthType=OAuth; Username=name@example.onmicrosoft.com; Password=*********; Url=https://contosotest.crm.dynamics.com; AppId=51f81489-12ee-4a9e-aaae-a2591f45987d; RedirectUri=app://58145B91-0C36-4500-8554-080854F2AC97; LoginPrompt=Auto; RequireNewInstance=True;</value>
      </setting>
      <setting name="CrmImportConnectionString" serializeAs="String">
        <value>AuthType=OAuth; Username=name@example.onmicrosoft.com; Password=*********; Url=https://contosotest.crm.dynamics.com; AppId=51f81489-12ee-4a9e-aaae-a2591f45987d; RedirectUri=app://58145B91-0C36-4500-8554-080854F2AC97; LoginPrompt=Auto; RequireNewInstance=True;</value>
      </setting>
      <setting name="UseCsvImport" serializeAs="String">
        <value>False</value>
      </setting>
    </Capgemini.Xrm.Datamigration.Examples.Properties.Settings>
  </applicationSettings>
  • DemoScenarioName - Scenario name from the scenarios in the DemoScenarios folder
  • CrmExportConnectionString - Connection string for the source Dynamics 365 instance - used by export
  • CrmImportConnectionString - Connection string for the target Dynamics 365 instance - used by import
  • UseCsvImport - True to use the CSV format, False to use JSON files

Initially, set up some data in the Export CRM instance

To execute, run Capgemini.Xrm.Datamigration.Examples.exe

When prompted, confirm the scenario being executed

Prior to export, you will be prompted to confirm the folder the data will be exported to

Once the data export part of the process completes, you will be asked to confirm whether you wish to continue and import the data into the Import CRM instance. At this point it is possible to verify the exported data in the ExportedData folder, which is created in the specific demo folder you are executing, for example C:\...\DemoScenarios\Contacts\ExportedData

Following import, you will receive confirmation and can verify the data in the target CRM instance

Usage

Create a new console app and add the Capgemini.Xrm.DataMigration NuGet package

The Xrm DataMigration Engine classes are available to be used in any custom scenario, e.g.:

Export Example

        // Requires the Capgemini.Xrm.DataMigration engine namespaces, plus
        // Microsoft.Xrm.Tooling.Connector (for CrmServiceClient), System.IO,
        // System.Threading and System.Collections.Generic.
        static void ExportData(string connectionString, string schemaPath, string exportFolderPath)
        {
            if (!Directory.Exists(exportFolderPath))
                Directory.CreateDirectory(exportFolderPath);

            var tokenSource = new CancellationTokenSource();
            var serviceClient = new CrmServiceClient(connectionString);
            var entityRepo = new EntityRepository(serviceClient, new ServiceRetryExecutor());
            var logger = new ConsoleLogger();
            var exportConfig = new CrmExporterConfig()
            {
                BatchSize = 1000,
                PageSize = 500,
                FilePrefix = "EX0.1",
                JsonFolderPath = exportFolderPath,
                OneEntityPerBatch = true,
                SeperateFilesPerEntity = true,
                TopCount = 10000,
                CrmMigrationToolSchemaPaths = new List<string>() {schemaPath}
            };

            // Json Export
            var fileExporterJson = new CrmFileDataExporter(logger, entityRepo, exportConfig, tokenSource.Token);
            fileExporterJson.MigrateData();

            // Csv Export
            var schema = CrmSchemaConfiguration.ReadFromFile(schemaPath);
            var fileExporterCsv = new CrmFileDataExporterCsv(logger, entityRepo, exportConfig, tokenSource.Token, schema);
            fileExporterCsv.MigrateData();
        }

Import Example

        public static void ImportData(string connectionString, string schemaPath, string exportFolderPath)
        {
            var tokenSource = new CancellationTokenSource();
            var serviceClient = new CrmServiceClient(connectionString);
            var entityRepo = new EntityRepository(serviceClient, new ServiceRetryExecutor());
            var logger = new ConsoleLogger();

            var importConfig = new CrmImportConfig()
            {
                FilePrefix = "EX0.1",
                JsonFolderPath = exportFolderPath,
                SaveBatchSize = 20
            };

            // Json Import
            var fileImporterJson = new CrmFileDataImporter(logger, entityRepo, importConfig, tokenSource.Token);
            fileImporterJson.MigrateData();

            // Csv Import
            var schema = CrmSchemaConfiguration.ReadFromFile(schemaPath);
            var fileImporterCsv = new CrmFileDataImporterCsv(logger, entityRepo, importConfig, schema, tokenSource.Token);
            fileImporterCsv.MigrateData();
        }
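
A minimal console entry point tying the two methods above together might look like this sketch; the connection strings and paths are placeholders to adapt to your own environments.

        static void Main(string[] args)
        {
            // Placeholder values - substitute your own instances and paths.
            var exportConnection = "AuthType=OAuth; Username=name@example.onmicrosoft.com; ...";
            var importConnection = "AuthType=OAuth; Username=name@example.onmicrosoft.com; ...";
            var schemaPath = @"C:\Migration\DataSchema.xml";
            var dataFolder = @"C:\Migration\ExportedData";

            // Extract from the source instance, then load into the target.
            ExportData(exportConnection, schemaPath, dataFolder);
            ImportData(importConnection, schemaPath, dataFolder);
        }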

The engine has been used in a number of scenarios across a number of projects. See the wiki for a list of examples.

Features of the engine include support for many-to-many relationships, application of filters, building relations via composite keys, and GUID mappings.

The engine is controlled by three configuration files; a fuller explanation of the values can be found in the wiki.

DataSchema.xml - Defines the entities and attributes that are to be extracted.

DataExport.json - Holds details of the schema to use, filters to be applied and other run controls.

DataImport.json - Holds details of the location and prefix of the exported files that are to be loaded.
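
For illustration, minimal versions of the two JSON files might look as follows; the property names mirror those used by CrmExporterConfig and CrmImportConfig in the code examples above, and the values are placeholders.

DataExport.json:

  {
    "CrmMigrationToolSchemaPaths": [ "DataSchema.xml" ],
    "PageSize": 500,
    "BatchSize": 1000,
    "TopCount": 10000,
    "OneEntityPerBatch": true,
    "SeperateFilesPerEntity": true,
    "FilePrefix": "EX0.1",
    "JsonFolderPath": "ExportedData"
  }

DataImport.json:

  {
    "FilePrefix": "EX0.1",
    "JsonFolderPath": "ExportedData",
    "SaveBatchSize": 20
  }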

Obfuscation

What is data obfuscation?

Data obfuscation, also referred to as data masking or scrambling, is the process of manipulating data so that it bears no resemblance to the original value, whilst still following any patterns required for a system's testing and validation processes to function. For example, if postcode validation is a feature of the system, an obfuscated postcode must be substituted with another real postcode.

Obfuscation is a highly important process when moving sensitive data, such as Personally Identifiable Information or sensitive data relating to a person or commercial entity, between Dynamics organisations. Failing to obfuscate sensitive data during a data migration could expose the information to unauthorised personnel, which would potentially be a breach of the General Data Protection Regulation (GDPR).

Usage

The sample project in the repository contains a DemoScenario called Obfuscation that applies obfuscation to some of the fields in a contact record. This demo scenario does not cover all fields in the entity and should not be used in production.

When running the sample console application with Obfuscation set as the DemoScenarioName value, as below, an ExportConfig.json file will be generated by the application at runtime. The file is located in the bin folder at bin\Debug\DemoScenarios\Obfuscation.

<setting name="DemoScenarioName" serializeAs="String">
    <value>Obfuscation</value>
</setting>

Configuration

There are two methods of obfuscating data. The default method takes the original value and replaces it using a scrambling algorithm suited to the data type of the attribute. The other option is to apply a format in the same way you would use String.Format in C#: each format item is replaced by a format argument, which can have a FormatType of RandomString, RandomNumber or Lookup.

Below is the schema of an obfuscation item.

{
  "FieldsToObfuscate": [
    {
      "EntityName": "Required",
      "FieldsToBeObfuscated": [
        {
          "FieldName": "Required",
          "ObfuscationFormat": "Optional",
          "ObfuscationFormatArgs": [
            {
              "FormatType": "Optional",
              "Arguments": {Optional}
            }
          ]
        }
      ]
    }
  ]
}
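
As an illustrative (not production) example, the configuration below scrambles a contact's first name using the default algorithm (no format supplied) and rebuilds the email address from a format string. Note that the "length" argument key is an assumption for illustration; see the wiki for the exact argument names.

  {
    "FieldsToObfuscate": [
      {
        "EntityName": "contact",
        "FieldsToBeObfuscated": [
          {
            "FieldName": "firstname"
          },
          {
            "FieldName": "emailaddress1",
            "ObfuscationFormat": "{0}@example.com",
            "ObfuscationFormatArgs": [
              {
                "FormatType": "RandomString",
                "Arguments": { "length": 10 }
              }
            ]
          }
        ]
      }
    ]
  }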

For more information about the obfuscation feature please take a look at the Wiki page.

Contributing

To contribute to this project, you can report bugs, submit new feature requests, submit changes via pull requests, or even join in the overall design of the tool.

Credits

Special thanks to the entire Capgemini community for their support in developing this tool.

License

Xrm Data Migration is released under the MIT license.

xrm-datamigration's People

Contributors

bancey, davidjpowell80, dependabot[bot], ewingjm, kevlee1401, ksulikow, markcunninghamuk, realekhor, shraines, tdashworth


xrm-datamigration's Issues

Versions 3.1x are not packed correctly - missing dependencies

Describe the bug
Versions 3.1x are not packed correctly - missing dependencies

To Reproduce
Install the NuGet package and see that only the main DLL is included

Expected behavior
All referenced DLLs should be added as well

Additional context
Issue with the NuGet pack step

Import fails when root business unit is included in the data

Describe the bug
Import fails when root business unit is included in the data

To Reproduce
Prepare a package with the root BU included, export it, and import it to the target environment; you will receive an error.

Expected behavior
The root BU should be mapped and updated.

Code Smell: Code Formatting

Raised by Sonar Cloud:

Opening brace should be preceded by a space

https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ASA1012&severities=MAJOR&id=xrm-datamigration&open=AXwn37q4146gT_OOOYq-

Using directive for 'System.Collections.Generic' should appear before directive for 'Newtonsoft.Json'

https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ASA1208&severities=MAJOR&id=xrm-datamigration&open=AYHIVG6NIQuHv600XTv6
https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ASA1208&severities=MAJOR&id=xrm-datamigration&open=AYHIVG6NIQuHv600XTv7

Braces for multi-line statements should not share line

https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ASA1500&severities=MAJOR&id=xrm-datamigration&open=AXwn37q4146gT_OOOYrA
https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ASA1500&severities=MAJOR&id=xrm-datamigration&open=AXwn37q4146gT_OOOYrB

Elements should be separated by blank line

https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ASA1516&severities=MAJOR&id=xrm-datamigration&open=AYHImH0sknSGSKrfpo6s
https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ASA1516&severities=MAJOR&id=xrm-datamigration&open=AYHImH0sknSGSKrfpo6t

Closing generic bracket should be followed by a space

https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ASA1015&severities=MAJOR&id=xrm-datamigration&open=AXwn37q4146gT_OOOYq_

Generic type constraints should be on their own line

https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ASA1127&severities=MAJOR&id=xrm-datamigration&open=AYHImHxNknSGSKrfpo6r

CSV Mapping fields aren't wrapped with quotes

Describe the bug
My export config has two lookup mappings defined which target string fields. When I extract as CSV (via the XRM ToolBox), these fields are not wrapped with quotes ("...") like the other string fields. I'm not sure if this is because the lookup field is not defined in the Schema file?

The result of this is a broken CSV field if any mapping field value contains a comma (,).

To Reproduce
Within the Lookup Mappings, specify a lookup to a text field.

{
  "LookupMapping": {
    "teamroles": {
      "roleid": [
        "name",
        "businessunitid"
      ],
      "teamid": [
        "name",
        "businessunitid"
      ]
    }
  }
}

Expected behavior
Mapped text values should also be wrapped with quotes.
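
For example (values invented), an unquoted mapped value containing a comma shifts every subsequent column:

  contactid,fullname,mappedteam
  "guid-1","Smith, John","Team A, North"    <- mapped value quoted: parses as three columns
  "guid-2","Smith, John",Team B, South      <- mapped value unquoted: parses as four columns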

Entity Image fails for CSV Export

Describe the bug
I have a schema with the field <field displayname="Image" name="new_image" type="image" primaryKey="false" customfield="false" /> which is failing with the error Error:Not supported attribute type System.Byte[] only for CSV exports (JSON worked fine).

To Reproduce

  1. Generate a schema which includes an entity with an entityimage and include that field like above.
  2. Run this through the CSV data exporter and the above error should occur.

Expected behavior
This error should not be thrown; instead, the base64 should be exported, or the field type should be safely ignored.

Update authentication method to support OAuth

Is your feature request related to a problem? Please describe.
The Office365 authentication type is deprecated and insecure by modern standards

Describe the solution you'd like
Support OAuth by updating Microsoft.CrmSdk.CoreTools to at least version 9.1.0.13; if using https://github.com/seanmcne/Microsoft.Xrm.Data.PowerShell, that will also require updating

Additional context
https://docs.microsoft.com/en-us/power-platform/important-changes-coming#deprecation-of-office365-authentication-type-and-organizationserviceproxy-class-for-connecting-to-dataverse

Limit of 100,000 records export

Describe the bug
There appears to be a limit of 100,000 records that can be extracted. I'm not sure if this is a Dynamics limit, an SDK limit, or this tool's limit, but the Fast Record Counter XrmToolbox tool can count past it.

To Reproduce
Export all records in a table with more than 100,000 records

Expected behavior
All records are to be extracted.

Refactor to use ILogger

Is your feature request related to a problem? Please describe.
This isn't problem-related, but while refactoring powerapps-packagedeployertemplate, I switched the services to use Microsoft's Logging Extensions. While integrating with this library, I wrote a wrapper for a very similar ILogger interface.

Describe the solution you'd like
My question and proposal is to update this library to use the same logging abstraction provided by Microsoft for better integration with other tooling. For example, the respective XrmToolbox plugin could also be updated with a custom ILogger implementation.

Additional context
https://docs.microsoft.com/en-us/dotnet/core/extensions/logging

I've opened this as a discussion before we consider investing time in the refactor. 😄
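
For discussion, a sketch of the kind of adapter being proposed, assuming a hypothetical engine logger interface with the Info/Verbose/Error levels seen in the engine's log output:

        using Microsoft.Extensions.Logging;

        // Hypothetical adapter: forwards engine logger calls to
        // Microsoft.Extensions.Logging so hosts control sinks and filtering.
        public class ExtensionsLoggerAdapter
        {
            private readonly ILogger logger;

            public ExtensionsLoggerAdapter(ILogger logger)
            {
                this.logger = logger;
            }

            public void LogInfo(string message) => logger.LogInformation(message);

            public void LogVerbose(string message) => logger.LogDebug(message);

            public void LogError(string message) => logger.LogError(message);
        }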

Code Smell: Remove the underscores from member name

Release the CLI tool

Is your feature request related to a problem? Please describe.
The Samples project, XrmToolbox plugin, and CLI tool are three ways to use the engine. Of course, we could use the nuget package in our own projects as well. Personally, I like the CLI since it's up to date with the latest engine and I could use it outside the context of the Samples project. Unfortunately, it means pulling the latest codebase and building manually.

Describe the solution you'd like
Build and publish the CLI tool as a dotnet global tool with the latest version of the engine.

It is also worth adding a small section on the tool to the README, although it should be self-documenting (via -help).

Make the File Store generic (not tied to Dynamics)


Consistent naming/branding/terminology

Is your feature request related to a problem? Please describe.

The terminology and branding of this repository are a bit inconsistent. For example, we have:

  • xrm-datamigration
  • CRM Data Migration
  • Xrm Data Migration

And if we include the XrmToolBox plug-in repository:

  • CDS Data Migrator
  • CRM environments (as opposed to CDS/Dataverse environments)

Describe the solution you'd like

It would be great if we could update all of the terminology and naming to be consistent. At the time of writing, we should probably be referring to Dataverse where we've got XRM/CDS/CRM. We're starting to see a suite of tools emerging for Power Apps with similar branding on our NuGet organisation.

It's debatable whether a more appropriate namespace/package title for this would be Capgemini.PowerApps.DataMigration (to align with what we've already got) or Capgemini.Dataverse.DataMigration.

Additional context

The repository name would probably also need to be updated as part of this.

No warning/error if data file can't be found

I was trying to import JSON files using the Dynamics 365: Data Importer Release task. My file names started with the prefix "ExtractedData". The task neither imported the files nor gave an error message or warning.

When I tried to import the same files with the ExtractedData prefix using the CDS Data Migrator import XrmToolbox plugin, it worked. Later, I changed the filename prefix to "ExportedData" and it worked from the Dynamics 365: Data Importer Release task. We should handle this and give an appropriate error message or warning, such as "No file found".

Test Documentation

Today I tried to run the full test suite locally before submitting a PR. This included the integration tests.

Capgemini.Xrm.DataMigration.Core.IntegrationTests worked perfectly after supplying some connection strings in the app.config.

Capgemini.Xrm.DataMigration.Tests.Integration did not work, even after supplying connection strings in the app.config.

I think we need to:

  1. add steps on configuring the integration tests to CONTRIBUTING.md - even if they're simple.
  2. ensure the tests can be run in any environment without any manual pre-configuration.

Export filter doesn't work

Description
I'm trying to export data from several tables, and for one of the tables I added a FetchXML filter.
If the flag "OnlyActiveRecords" is set to true, the export process ignores my filter and exports all the active records from all the tables described in the schema file.
If I set that flag to false, the tool fails when it tries to export data for the filtered entity. Log:

23-Jun-2022 00:01:57 - Verbose:CrmFileDataExporter GetProcessors started
23-Jun-2022 00:01:57 - Verbose:CrmFileDataImporter GetProcessors finished
23-Jun-2022 00:01:57 - Info:GenericDataMigrator MigrateData started
23-Jun-2022 00:01:57 - Info:GenericDataMigrator PerformMigratePass started, passNo:1
23-Jun-2022 00:01:57 - Info:DataFileStoreWriter Reset performed
23-Jun-2022 00:01:57 - Verbose:DataCrmStoreReader Reset performed
23-Jun-2022 00:01:57 - Verbose:DataCrmStoreReader ReadBatchDataFromStore started, queryIndex:0, page0
23-Jun-2022 00:01:58 - Verbose:DataCrmStoreReader retrieved entity:msdyn_cannedmessage, page:1, query:0, retrievedCount:10, totalEntityCount:10
23-Jun-2022 00:01:58 - Verbose:DataCrmStoreReader ReadBatchDataFromStore finished, queryIndex:1, page:0, totalCount:10
23-Jun-2022 00:01:58 - Info:GenericDataMigrator PerformMigratePass retrieved data, batchNo:1 entities:10 FirstEntity:msdyn_cannedmessage
23-Jun-2022 00:01:58 - Verbose:DataFileStoreWriter SaveBatchDataToStore started, records:10, batchNo:1
23-Jun-2022 00:01:58 - Verbose:DataFileStoreWriter SaveBatchDataToStore finished
23-Jun-2022 00:01:58 - Info:GenericDataMigrator PerformMigratePass saved data, batchNo:1 entities:10
23-Jun-2022 00:01:58 - Verbose:DataCrmStoreReader ReadBatchDataFromStore started, queryIndex:1, page0
23-Jun-2022 00:01:58 - Verbose:DataCrmStoreReader retrieved entity:msdyn_msdyn_cannedmessage_liveworkstream, page:1, query:1, retrievedCount:10, totalEntityCount:10
23-Jun-2022 00:01:58 - Verbose:DataCrmStoreReader ReadBatchDataFromStore finished, queryIndex:2, page:0, totalCount:20
23-Jun-2022 00:01:58 - Info:GenericDataMigrator PerformMigratePass retrieved data, batchNo:2 entities:10 msdyn_msdyn_cannedmessage_liveworkstream
23-Jun-2022 00:01:58 - Verbose:DataFileStoreWriter SaveBatchDataToStore started, records:10, batchNo:2
23-Jun-2022 00:01:58 - Verbose:DataFileStoreWriter SaveBatchDataToStore finished
23-Jun-2022 00:01:58 - Info:GenericDataMigrator PerformMigratePass saved data, batchNo:2 entities:10
23-Jun-2022 00:01:58 - Verbose:DataCrmStoreReader ReadBatchDataFromStore started, queryIndex:2, page0
23-Jun-2022 00:01:58 - Verbose:DataCrmStoreReader retrieved entity:msdyn_msdyn_cannedmessage_msdyn_octag, page:1, query:2, retrievedCount:4, totalEntityCount:4
23-Jun-2022 00:01:58 - Verbose:DataCrmStoreReader ReadBatchDataFromStore finished, queryIndex:3, page:0, totalCount:24
23-Jun-2022 00:01:58 - Info:GenericDataMigrator PerformMigratePass retrieved data, batchNo:3 entities:4 msdyn_msdyn_cannedmessage_msdyn_octag
23-Jun-2022 00:01:58 - Verbose:DataFileStoreWriter SaveBatchDataToStore started, records:4, batchNo:3
23-Jun-2022 00:01:59 - Verbose:DataFileStoreWriter SaveBatchDataToStore finished
23-Jun-2022 00:01:59 - Info:GenericDataMigrator PerformMigratePass saved data, batchNo:3 entities:4
23-Jun-2022 00:01:59 - Verbose:DataCrmStoreReader ReadBatchDataFromStore started, queryIndex:3, page0
23-Jun-2022 00:01:59 - Verbose:DataCrmStoreReader retrieved entity:msdyn_liveworkstream, page:1, query:3, retrievedCount:10, totalEntityCount:10
23-Jun-2022 00:01:59 - Verbose:DataCrmStoreReader ReadBatchDataFromStore finished, queryIndex:4, page:0, totalCount:34
23-Jun-2022 00:01:59 - Info:GenericDataMigrator PerformMigratePass retrieved data, batchNo:4 entities:10 msdyn_liveworkstream
23-Jun-2022 00:01:59 - Verbose:DataFileStoreWriter SaveBatchDataToStore started, records:10, batchNo:4
23-Jun-2022 00:01:59 - Verbose:DataFileStoreWriter SaveBatchDataToStore finished
23-Jun-2022 00:01:59 - Info:GenericDataMigrator PerformMigratePass saved data, batchNo:4 entities:10
23-Jun-2022 00:01:59 - Verbose:DataCrmStoreReader ReadBatchDataFromStore started, queryIndex:4, page0
23-Jun-2022 00:01:59 - Verbose:DataCrmStoreReader retrieved entity:msdyn_octag, page:1, query:4, retrievedCount:3, totalEntityCount:3
23-Jun-2022 00:01:59 - Verbose:DataCrmStoreReader ReadBatchDataFromStore finished, queryIndex:5, page:0, totalCount:37
23-Jun-2022 00:01:59 - Info:GenericDataMigrator PerformMigratePass retrieved data, batchNo:5 entities:3 msdyn_octag
23-Jun-2022 00:01:59 - Verbose:DataFileStoreWriter SaveBatchDataToStore started, records:3, batchNo:5
23-Jun-2022 00:01:59 - Verbose:DataFileStoreWriter SaveBatchDataToStore finished
23-Jun-2022 00:01:59 - Info:GenericDataMigrator PerformMigratePass saved data, batchNo:5 entities:3
23-Jun-2022 00:01:59 - Verbose:DataCrmStoreReader ReadBatchDataFromStore started, queryIndex:5, page0
23-Jun-2022 00:01:59 - Verbose:DataCrmStoreReader retrieved entity:msdyn_livechatconfig, page:1, query:5, retrievedCount:2, totalEntityCount:2
23-Jun-2022 00:01:59 - Verbose:DataCrmStoreReader ReadBatchDataFromStore finished, queryIndex:6, page:0, totalCount:39
23-Jun-2022 00:01:59 - Info:GenericDataMigrator PerformMigratePass retrieved data, batchNo:6 entities:2 msdyn_livechatconfig
23-Jun-2022 00:01:59 - Verbose:DataFileStoreWriter SaveBatchDataToStore started, records:2, batchNo:6
23-Jun-2022 00:02:00 - Verbose:DataFileStoreWriter SaveBatchDataToStore finished
23-Jun-2022 00:02:00 - Info:GenericDataMigrator PerformMigratePass saved data, batchNo:6 entities:2
23-Jun-2022 00:02:00 - Verbose:DataCrmStoreReader ReadBatchDataFromStore started, queryIndex:6, page0
23-Jun-2022 00:02:54 - Error: Sql error: Generic SQL error. CRM ErrorCode: -2147204784 Sql ErrorCode: -2146232060 Sql Number: 156

Fetch XML is tested in the builder and works fine.

ExportConfig:

{
  "ExcludedFields": [],
  "CrmMigrationToolSchemaPaths": [
    "D:\xx\ConfigMigration\schema.xml"
  ],
  "CrmMigrationToolSchemaFilters": {
    "msdyn_oclocalizationdata": "<fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="false">\n <entity name="msdyn_oclocalizationdata">\n <attribute name="msdyn_localizedtext" />\n <attribute name="msdyn_customerlanguageid" />\n <attribute name="statecode" />\n <attribute name="msdyn_oclocalizationdataid" />\n <order attribute="msdyn_localizedtext" descending="false" />\n \n <condition attribute="msdyn_oclocalizationdataid" operator="in" uitype="msdyn_oclocalizationdata">\n 44ff8f8e-5ded-ec11-bb3d-000d3af4d379\n a7527efe-73ed-ec11-bb3d-000d3af4d379\n \n \n \n"
  },
  "PageSize": 1000,
  "BatchSize": 1000,
  "TopCount": 10000,
  "OnlyActiveRecords": true,
  "JsonFolderPath": "D:\xx\ConfigMigration\ExtractedData",
  "OneEntityPerBatch": true,
  "FilePrefix": "ExportedData",
  "SeperateFilesPerEntity": true,
  "FieldsToObfuscate": [],
  "LookupMapping": {}
}

Please correct me if I'm doing something wrong.
Thank you.

Filter config is ignored for many to many relations

Describe the bug
When a filter is set up for a many-to-many relationship in the export config, it is not used and all records are exported

To Reproduce
...

Expected behaviour
Filter config to be considered when exporting many-to-many relationships.

Additional context
Migrated from internal Azure DevOps.

Code Smell: 'ProcessZeroEntitiesFirst' calls 'All' but does not use the value the method returns. Linq methods are known to not have side effects. Use the result in a conditional statement, assign the result to a variable, or pass it as an argument to another method


https://sonarcloud.io/project/issues?resolved=false&rules=external_roslyn%3ACA1806&severities=MAJOR&id=xrm-datamigration&open=AXlmWBvHKjdVQvrIoNHe

Excessive login prompts

Describe the bug
When importing a lot of data (> 1000 records) using lookup maps and an OAuth connection string, I am being asked to log in multiple times. Sometimes the login fails after too many correct attempts.

I'm not really sure why this is - are we creating new CrmServices?

To Reproduce
I'm not sure...

Expected behavior
Non-interactive login to work. (I am using the CLI)


Move Build Pipelines to GitHub Support project

Both the PR build and CI build/release Azure DevOps pipelines are located in our internal Capgemini Reusable IP project.

These should both be removed from Capgemini Reusable IP and recreated in GitHub Support so that they are publicly available.

Add BypassAllCustomLogicForEntities config to Import.json

Is your feature request related to a problem? Please describe.
Microsoft have documented functionality that allows you to bypass custom logic whilst performing API requests: https://docs.microsoft.com/en-us/powerapps/developer/data-platform/bypass-custom-business-logic. Being able to apply this during data import would be fantastic in data migration scenarios, which often require plug-ins and workflows to be deactivated manually.

Describe the solution you'd like
Supply an array of entity names for which the upsert requests will have this setting enabled, allowing control at quite a granular level.
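
A sketch of what the requested option might look like in the import config (the BypassAllCustomLogicForEntities property does not exist yet; the name is taken from this issue's title):

  {
    "FilePrefix": "EX0.1",
    "JsonFolderPath": "ExportedData",
    "SaveBatchSize": 20,
    "BypassAllCustomLogicForEntities": [ "account", "contact" ]
  }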

Code Smell: Loops should be simplified with "LINQ" expressions

Improve performance for complex mappings

Is your feature request related to a problem? Please describe.
Requests are made per record. This can exhaust the API and is extremely slow for large data sets.

Describe the solution you'd like
Use an in-memory map and make fewer requests (maybe batch) to increase performance.

Additional context
Migrated from internal Azure DevOps.
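
As a sketch of the batching idea, the standard Dataverse SDK already provides ExecuteMultipleRequest for sending many requests in one round trip; the wiring below is illustrative, not the engine's implementation:

        using System.Collections.Generic;
        using Microsoft.Xrm.Sdk;
        using Microsoft.Xrm.Sdk.Messages;

        // Send a batch of upserts in a single round trip instead of
        // issuing one request per record.
        static void UpsertBatch(IOrganizationService service, IEnumerable<Entity> records)
        {
            var batch = new ExecuteMultipleRequest
            {
                Settings = new ExecuteMultipleSettings
                {
                    ContinueOnError = true,
                    ReturnResponses = true
                },
                Requests = new OrganizationRequestCollection()
            };

            foreach (var record in records)
            {
                batch.Requests.Add(new UpsertRequest { Target = record });
            }

            var response = (ExecuteMultipleResponse)service.Execute(batch);
        }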

README needs refining

Is your feature request related to a problem? Please describe.

I've noticed a couple of issues with the README that make it appear a little clumsy -

  • Installation section is talking about using the sample projects rather than installing the package - the section's content is also formatted more like Wiki than README content
  • The description header is superfluous
  • The table of contents has only one sub-section under Usage (Obfuscation)
  • Obfuscation is actually a header on the same level as Usage, rather than a sub-header as the ToC suggests
  • Usage instructs people to create a new console app - this is overly prescriptive and not necessary
  • Obfuscation has its own sub-headers that aren't on the ToC
  • Contributing section should point to the CONTRIBUTING.md file
  • Credits section should probably be removed if we're just crediting ourselves

Describe the solution you'd like

  • Correct the installation section to be about installing the package (probably move the detail about the samples to the Wiki as it bloats the README)
  • Remove the description header and put the descriptive text under the title
  • Move obfuscation under usage and shift a considerable amount of this to the Wiki to avoid bloating the README with only specific features
  • Ensure the ToC is reflective of the actual headings
  • Replace the text in the Credits section with something along the lines of "Please refer to the Contributing guide." (with a link)
  • Remove the credits section

Unify all nuget dependencies

Remove NuGet dependencies that are not required and ensure the same versions are used by all projects, to prevent DLL version mapping issues for the consuming client

Migration Report/Summary

Is your feature request related to a problem? Please describe.
The logging we currently have is extensive and detailed. This is great for understanding exactly what happened but is too complicated for most of our user base.

Describe the solution you'd like
It would be nice if the engine returned a summary of what happened.

  • How many rows were read/written per entity?
  • How many failures? What were they?
  • How many warnings? What were they?
  • How long did it take per entity?

This should be a typed object that the various interfaces can surface accordingly (e.g. the XrmToolBox visually displays).

Error when original guid does not exist in the source file for mapped reference field

Describe the bug
When the export config has field mappings defined but the reference field is excluded from export, the original GUID will not be included in the exported data. This causes an error and the import fails.

To Reproduce
Create an export config file, configure a lookup mapping, and add the mapped field to the excluded fields:

"ExcludedFields": [
  "systemuserid",
  "roleid",
  "systemuserrolesid",
  "businessunitid"
],
"LookupMapping": {
  "systemuser": {
    "systemuserid": [
      "domainname"
    ],
    "businessunitid": [
      "name"
    ]
  },
  "systemuserroles": {
    "systemuserid": [
      "domainname"
    ],
    "roleid": [
      "name",
      "businessunitid"
    ]
  }
}

Expected behavior
In this situation, if the lookup value can be found in the target system it should be used; if not, it should be left empty

Additional context
Fatal error during import

Explore `PowerPlatform-DataverseServiceClient`

Is your feature request related to a problem? Please describe.

Use the newer Client API for newer features.

Describe the solution you'd like

Use: https://github.com/microsoft/PowerPlatform-DataverseServiceClient

