anatolyuss / nmig

NMIG is a database migration tool, written in Node.js and highly inspired by FromMySqlToPostgreSql.

License: GNU General Public License v3.0

Languages: TypeScript 99.38%, JavaScript 0.45%, Dockerfile 0.16%

nmig's People

Contributors

adrianparsons, anatolyfromperion, anatolyuss, breart, chamby, dependabot[bot], fjfalcon, freezy, jhere1, lsei, moominpappa, noah1989, reloflex, vinerich


nmig's Issues

Migrating autoincrement/sequence values

I noticed when migrating my database that even though NMIG was able to create the correct sequences for each table, it didn't set the values for any of them... is there some reason they can't be set (or is there a way to set them)? Getting MySQL's auto_increment value for a table should be a simple enough query:

SELECT `AUTO_INCREMENT` FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_SCHEMA = 'DatabaseName' AND TABLE_NAME = 'TableName';
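
For what it's worth, the value fetched above could then be applied on the PostgreSQL side with setval(). A minimal sketch — the helper name and the "<table>_<column>_seq" sequence-naming convention are my assumptions, not necessarily NMIG's actual code:

```javascript
// Hypothetical helper: build the statement that aligns a PostgreSQL sequence
// with MySQL's AUTO_INCREMENT counter.
function buildSetvalSql(schema, table, column, autoIncrement) {
    // AUTO_INCREMENT is the *next* value MySQL would hand out,
    // so the current sequence value is one less (but never below 1).
    const current = Math.max(autoIncrement - 1, 1);
    return `SELECT setval('"${schema}"."${table}_${column}_seq"', ${current});`;
}

// Example: AUTO_INCREMENT = 101 means ids 1..100 are taken.
// buildSetvalSql('public', 'users', 'id', 101)
// → SELECT setval('"public"."users_id_seq"', 100);
```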

Thanks

Hello,
Thanks for posting your work on this. Great to see a fresh take using Node.js!

I just wanted to bring this to your attention, in case you weren't already aware.

I updated the config and ran node nmig.js. The log scrolled by, and I think the app worked. There were also a lot of the warnings below; I think they're OK to ignore, but I wanted you to know:
(node:16263) DeprecationWarning: Calling an asynchronous function without callback is deprecated.

I think it's related to node 7.0, which I recently updated to.

Config option to skip schema migration?

I used this to migrate a database created with Rails' ActiveRecord, which tracks the schema for me. ActiveRecord did a better job of setting up the schema in Postgres, due to its better knowledge of the context. It would be great to be able to skip NMIG's schema migration and just migrate data into a preset schema. (Is there any way to do that?)

Can I use this to keep two DBs in sync?

Hi, just one question.

After the initial migration, can I run this on a set interval to keep two databases in sync?
Will it automatically detect changes and update rows / insert new rows / delete deleted rows?

Run as a CLI and upload to npmjs.com

This tool is one of a kind; thank you so much for this work! What's missing is an easier, more straightforward install and execution process. I think the steps would be making it a CLI tool and publishing it to npmjs.com.

Empty strings interpreted as NULL

We have a MySQL database where some tables have the empty string as a value in columns with a NOT NULL constraint. nmig generates CSV files that don't quote the empty string, so PostgreSQL (by default) interprets these fields as NULL. Adding the NOT NULL constraint later then fails.
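
To illustrate the distinction: in PostgreSQL's CSV COPY format an unquoted empty field is read as NULL, while a quoted one is read as the empty string. A minimal sketch of a field serializer that preserves that difference (a hypothetical helper, not nmig's actual CSV code):

```javascript
// Serialize one value for PostgreSQL's COPY ... CSV format.
// null -> unquoted empty field (read back as NULL)
// ''   -> quoted empty field   (read back as the empty string)
function toCsvField(value) {
    if (value === null) {
        return '';            // unquoted: COPY interprets this as NULL
    }
    const s = String(value);
    // Quote empty strings and anything containing delimiters, quotes, or newlines.
    if (s === '' || /[",\n\r]/.test(s)) {
        return '"' + s.replace(/"/g, '""') + '"';
    }
    return s;
}

// toCsvField(null) → ''   (NULL)
// toCsvField('')   → '""' (empty string, survives NOT NULL)
```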

--[populateTableWorker] Error loading table data:

What looks a little suspicious to me is the 873021 in prepareDataChunks vs. the 87303 down in the LIMIT. If I paste the SQL into the MySQL database directly, it works as one would expect.

  --[TableProcessor::createTable] Table "v25tmp"."jos_comprofiler" is created...
  --[prepareDataChunks] Total rows to insert into "v25tmp"."jos_comprofiler": 873021
  --[populateTableWorker] Error loading table data:
   SELECT `id` AS `id`,`user_id` AS `user_id`,`firstname` AS `firstname`,`middlename` AS `middlename`,`lastname` AS `lastname`,`hits` AS `hits`,IF(`message_last_sent` IN('0000-00-00', '0000-00-00 00:00:00'), '-INFINITY', CAST(`message_last_sent` AS CHAR)) AS `message_last_sent`,`message_number_sent` AS `message_number_sent`,`avatar` AS `avatar`,`avatarapproved` AS `avatarapproved`,`approved` AS `approved`,`confirmed` AS `confirmed`,IF(`lastupdatedate` IN('0000-00-00', '0000-00-00 00:00:00'), '-INFINITY', CAST(`lastupdatedate` AS CHAR)) AS `lastupdatedate`,`registeripaddr` AS `registeripaddr`,`cbactivation` AS `cbactivation`,`banned` AS `banned`,IF(`banneddate` IN('0000-00-00', '0000-00-00 00:00:00'), '-INFINITY', CAST(`banneddate` AS CHAR)) AS `banneddate`,IF(`unbanneddate` IN('0000-00-00', '0000-00-00 00:00:00'), '-INFINITY', CAST(`unbanneddate` AS CHAR)) AS `unbanneddate`,`bannedby` AS `bannedby`,`unbannedby` AS `unbannedby`,`bannedreason` AS `bannedreason`,`acceptedterms` AS `acceptedterms`,`website` AS `website`,`location` AS `location`,`occupation` AS `occupation`,`interests` AS `interests`,`company` AS `company`,`address` AS `address`,`city` AS `city`,`state` AS `state`,`zipcode` AS `zipcode`,`country` AS `country`,`phone` AS `phone`,`fax` AS `fax`,`cb_assub` AS `cb_assub`,`cb_ossub` AS `cb_ossub` FROM `jos_comprofiler` LIMIT 0,87303;

Invalid string length

Greetings, I received this stack trace when trying to import a database.

/var/lib/postgresql/nmig/migration/fmtp/csvStringifyModified.js:64
       return callback(null, chunks.join(''));
                                    ^

RangeError: Invalid string length
    at Array.join (native)
    at null.<anonymous> (/var/lib/postgresql/nmig/migration/fmtp/csvStringifyModified.js:64:37)
    at emitNone (events.js:85:20)
    at emit (events.js:179:7)
    at endReadableNT (_stream_readable.js:913:12)
    at _combinedTickCallback (node.js:377:13)
    at process._tickCallback (node.js:401:11)

Nodejs version: 5.7.0

There is no errors_only.log file in the logs directory, and grepping for "error" there turns up nothing either.

Thank you.

Table names are created with double quotes

After migration, tables are created with double quotation marks in the name; for example, the MySQL table foobar is created in Postgres as "foobar".

postgres> select * from foobar;
postgres> ERROR:  relation "foobar" does not exist
postgres> select * from "foobar"; 
postgres> -- that's OK

load too slow

My MySQL database is 33 GB, and the migration took 9 hours. How can I speed it up? I need your help, thanks!

awesome

thanks A LOT for this great tool

Use of COPY only works when PostgreSQL is on the same host as MySQL

The way COPY is being used is not entirely optimal:

  1. It requires that the PostgreSQL connection user has access to the CSV file
  2. It assumes that the CSV file is on the same host as the PostgreSQL server.

I think it would be better to use the STDIN variation of COPY, which doesn't have these limitations.
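
A rough sketch of the STDIN variation, using the pg and pg-copy-streams packages (the helper and table names here are illustrative; the streaming part needs a live client connection, so it is shown in comments only):

```javascript
// Build the COPY statement targeting STDIN instead of a server-side file,
// so neither the file's location nor its permissions matter to the server.
function buildCopyFromStdin(schema, table) {
    return `COPY "${schema}"."${table}" FROM STDIN DELIMITER ',' CSV;`;
}

// Streaming the CSV over the client connection:
//   const fs = require('fs');
//   const { from } = require('pg-copy-streams');
//   const copyStream = client.query(from(buildCopyFromStdin('public', 'foo')));
//   fs.createReadStream('/path/to/foo.csv').pipe(copyStream);
```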

Convert MySQL BLOB field

In MySQL I have a table column (BLOB) that keeps a PHP-serialized object.
Using nmig I migrated that table into PostgreSQL, and later I found that I cannot unserialize that column (bytea) using PHP's unserialize().

I feel that something is not right in the BLOB-to-bytea conversion.

module.exports = (arrTableColumns, mysqlVersion) => {
    let strRetVal = '';
    const arrTableColumnsLength = arrTableColumns.length;
    const wkbFunc = mysqlVersion >= 5.76 ? 'ST_AsWKB' : 'AsWKB';

    for (let i = 0; i < arrTableColumnsLength; ++i) {
        const field = arrTableColumns[i].Field;
        const type  = arrTableColumns[i].Type;

        if (isSpacial(type)) {
            strRetVal += 'HEX(' + wkbFunc + '(`' + field + '`)) AS `' + field + '`,';
        } else if (isBinary(type)) {
            strRetVal += 'HEX(`' + field + '`) AS `' + field + '`,';
        } else if (isBit(type)) {
            strRetVal += 'BIN(`' + field + '`) AS `' + field + '`,';
        } else if (isDateTime(type)) {
            strRetVal += 'IF(`' + field + '` IN(\'0000-00-00\', \'0000-00-00 00:00:00\'), \'-INFINITY\', CAST(`'
                + field + '` AS CHAR)) AS `' + field + '`,';
        } else {
            strRetVal += '`' + field + '` AS `' + field + '`,';
        }
    }

    return strRetVal.slice(0, -1);
};
I would really appreciate your help with this.

Error while migrating only data

--[populateTableWorker] error: extra data after last expected column

SQL: COPY "public"."job_tasks" FROM STDIN DELIMITER ',' CSV;

Is this error message a sign that the data got corrupted?
Thank you for any response.

Convert tinyint (MySQL) to boolean (PostgreSQL)

Greetings.
I returned to attempt migrating a production database from MySQL to PostgreSQL.
I found one issue that should maybe be solved in NMIG.

In MySQL I use the type tinyint with only two values: 0 for false, 1 for true.
But with PostgreSQL I will need to convert all my tinyint columns to boolean...
Maybe NMIG could convert them when migrating databases?
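
For what it's worth, tinyint(1) is the conventional MySQL stand-in for a boolean, so a type-mapping step could special-case it. A hypothetical sketch (not NMIG's actual data-types map):

```javascript
// Map a MySQL column type to a PostgreSQL type, special-casing tinyint(1).
function mapMySqlType(mysqlType) {
    const t = mysqlType.toLowerCase().trim();
    if (t.startsWith('tinyint(1)')) {
        return 'boolean';   // 0/1 flags become true/false
    }
    if (t.startsWith('tinyint')) {
        return 'smallint';  // wider tinyints stay numeric
    }
    return null;            // defer to the regular data-types map
}
```

Note that tinyint(1) can legitimately hold values other than 0 and 1, so such a conversion would probably need to be opt-in via config.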

P.S. I started my first project in Node.js (mysql_camelcase_renamer); I hope you don't mind if I take some of your templates? Thank you for your software.

Infinite loop "loading the data"

Cool tool. Just gave it a try. It seems to work, but appears to be stuck in an infinite loop.
The data looks to have transferred successfully in a few minutes, but for some hours now it has continued printing to the screen:
--[loadData] Loading the data...
--[loadData] Loading the data...

script tries to load no columns

The script tries to convert this MySQL table:

CREATE TABLE `category_company` ( 
	`id` Int( 10 ) UNSIGNED AUTO_INCREMENT NOT NULL,
	`company_id` Int( 10 ) UNSIGNED NOT NULL,
	`category_id` Int( 10 ) UNSIGNED NOT NULL,
	PRIMARY KEY ( `id` ) )
CHARACTER SET = utf8mb4
COLLATE = utf8mb4_unicode_ci
ENGINE = InnoDB;

with this SELECT query:

--[DataLoader::populateTableWorker] Error: ER_PARSE_ERROR: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'FROM `category_company` LIMIT 0,325585' at line 1

        SQL: SELECT  FROM `category_company` LIMIT 0,325585;

(notice there is no list of columns)

The script goes on but raises these warnings:

(node:13548) UnhandledPromiseRejectionWarning: #<Object>
(node:13548) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 1)
(node:13548) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.

So maybe it's something in ColumnsDataArranger, plus a stray unhandled promise.
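
A defensive check in the SELECT builder would at least surface the problem instead of emitting invalid SQL. A hypothetical sketch (the helper name is mine, not NMIG's):

```javascript
// Refuse to build "SELECT  FROM ..." when the column arranger
// produced no expressions for a table.
function buildChunkSelect(columnExpressions, tableName, offset, limit) {
    if (!columnExpressions || columnExpressions.length === 0) {
        throw new Error(`No columns arranged for table \`${tableName}\``);
    }
    return `SELECT ${columnExpressions.join(',')} FROM \`${tableName}\` LIMIT ${offset},${limit};`;
}
```

With an empty list, the thrown error would name the offending table and reject the chunk cleanly instead of surfacing as an UnhandledPromiseRejectionWarning.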

Out of memory when importing a moderately sized database

I am running on a desktop with 16GB RAM and no swap.

Here's the output:

[nix-shell:~/work/nmig]$ node main.js 

    NMIG - the database migration tool
    Copyright 2016 Anatoly Khaytovich <[email protected]>

    --[boot] Boot...
    --[readDataTypesMap] Data Types Map is loaded...
    --[boot] Boot is accomplished...
    --[createLogsDirectory] Creating logs directory...
    --[createLogsDirectory] Logs directory already exists...
    --[createTemporaryDirectory] Creating temporary directory...
    --[createTemporaryDirectory] Temporary directory is created...
    --[loadStructureToMigrate] Source DB structure is loaded...
    --[loadStructureToMigrate] Tables to migrate: 71
    --[loadStructureToMigrate] Views to migrate: 3
    --[createTable] Currently creating table: `accessControlList`
    --[createTable] Currently creating table: `assemblyPricing`
    --[createTable] Currently creating table: `assemblySnapshot`
    --[createTable] Currently creating table: `assemblyTimeTrack`
    --[createTable] Currently creating table: `assemblyTimeTrackTask`
    --[createTable] Currently creating table: `assemblyTiming`
    --[createTable] Currently creating table: `board`
    --[createTable] Currently creating table: `boardImage`
    --[createTable] Currently creating table: `boardLayer`
    --[createTable] Currently creating table: `boardOrder`
    --[createTable] Currently creating table: `boardPricing`
    --[createTable] Currently creating table: `boardQuote`
    --[createTable] Currently creating table: `boardSnapshot`
    --[createTable] Currently creating table: `bomSnapshot`
    --[createTable] Currently creating table: `category`
    --[createTable] Currently creating table: `categoryAttribute`
    --[createTable] Currently creating table: `categorySchema`
    --[createTable] Currently creating table: `categoryValue`
    --[createTable] Currently creating table: `comment`
    --[createTable] Currently creating table: `commentAttachment`
    --[createTable] Currently creating table: `component`
    --[createTable] Currently creating table: `document`
    --[createTable] Currently creating table: `domainBody`
    --[createTable] Currently creating table: `domainFootprint`
    --[createTable] Currently creating table: `domainSymbol`
    --[createTable] Currently creating table: `domainTemplate`
    --[createTable] Currently creating table: `edaLibrary`
    --[createTable] Currently creating table: `event`
    --[createTable] Currently creating table: `eventTransaction`
    --[createTable] Currently creating table: `firmware`
    --[createTable] Currently creating table: `foo`
    --[createTable] Currently creating table: `footprint`
    --[createTable] Currently creating table: `image`
    --[createTable] Currently creating table: `inst`
    --[createTable] Currently creating table: `instName`
    --[createTable] Currently creating table: `installed_migrations`
    --[createTable] Currently creating table: `issue`
    --[createTable] Currently creating table: `issueComment`
    --[createTable] Currently creating table: `octoV3PartCache`
    --[createTable] Currently creating table: `orderAttachment`
    --[createTable] Currently creating table: `overageCostTrack`
    --[createTable] Currently creating table: `overageCostTrackType`
    --[createTable] Currently creating table: `panel`
    --[createTable] Currently creating table: `part`
    --[createTable] Currently creating table: `partCache`
    --[createTable] Currently creating table: `partDocument`
    --[createTable] Currently creating table: `partRelation`
    --[createTable] Currently creating table: `partSpecValue`
    --[createTable] Currently creating table: `progress`
    --[createTable] Currently creating table: `project`
    --[createTable] Currently creating table: `projectBom`
    --[createTable] Currently creating table: `projectBomRef`
    --[createTable] Currently creating table: `projectBomSubstitute`
    --[createTable] Currently creating table: `projectOrder`
    --[createTable] Currently creating table: `projectOrderPcb`
    --[createTable] Currently creating table: `projectOrderStencil`
    --[createTable] Currently creating table: `projectQuote`
    --[createTable] Currently creating table: `projectRevision`
    --[createTable] Currently creating table: `projectRevisionFiles`
    --[createTable] Currently creating table: `projectStar`
    --[createTable] Currently creating table: `schematic`
    --[createTable] Currently creating table: `schematicPage`
    --[createTable] Currently creating table: `source`
    --[createTable] Currently creating table: `sourceAvailability`
    --[createTable] Currently creating table: `sourcePricing`
    --[createTable] Currently creating table: `stripeSource`
    --[createTable] Currently creating table: `user`
    --[createTable] Currently creating table: `userAddress`
    --[createTable] Currently creating table: `userAuthProvider`
    --[createTable] Currently creating table: `userComponent`
    --[createTable] Currently creating table: `userOrganization`
    --[createTable] Table "uniplex"."assemblyTimeTrackTask" is created...
    --[populateTable] Currently populating table: `assemblyTimeTrackTask`
    --[createTable] Table "uniplex"."assemblyTimeTrack" is created...
    --[populateTable] Currently populating table: `assemblyTimeTrack`
    --[createTable] Table "uniplex"."assemblySnapshot" is created...
    --[populateTable] Currently populating table: `assemblySnapshot`
    --[createTable] Table "uniplex"."boardOrder" is created...
    --[populateTable] Currently populating table: `boardOrder`
    --[createTable] Table "uniplex"."assemblyTiming" is created...
    --[populateTable] Currently populating table: `assemblyTiming`
    --[createTable] Table "uniplex"."boardQuote" is created...
    --[populateTable] Currently populating table: `boardQuote`
    --[createTable] Table "uniplex"."accessControlList" is created...
    --[populateTable] Currently populating table: `accessControlList`
    --[createTable] Table "uniplex"."board" is created...
    --[populateTable] Currently populating table: `board`
    --[createTable] Table "uniplex"."boardSnapshot" is created...
    --[populateTable] Currently populating table: `boardSnapshot`
    --[createTable] Table "uniplex"."categorySchema" is created...
    --[populateTable] Currently populating table: `categorySchema`
    --[createTable] Table "uniplex"."categoryValue" is created...
    --[populateTable] Currently populating table: `categoryValue`
    --[createTable] Table "uniplex"."bomSnapshot" is created...
    --[populateTable] Currently populating table: `bomSnapshot`
    --[createTable] Table "uniplex"."categoryAttribute" is created...
    --[populateTable] Currently populating table: `categoryAttribute`
    --[createTable] Table "uniplex"."component" is created...
    --[populateTable] Currently populating table: `component`
    --[createTable] Table "uniplex"."commentAttachment" is created...
    --[populateTable] Currently populating table: `commentAttachment`
    --[createTable] Table "uniplex"."comment" is created...
    --[populateTable] Currently populating table: `comment`
    --[createTable] Table "uniplex"."document" is created...
    --[populateTable] Currently populating table: `document`
    --[createTable] Table "uniplex"."domainBody" is created...
    --[populateTable] Currently populating table: `domainBody`
    --[createTable] Table "uniplex"."category" is created...
    --[populateTable] Currently populating table: `category`
    --[createTable] Table "uniplex"."boardPricing" is created...
    --[populateTable] Currently populating table: `boardPricing`
    --[createTable] Table "uniplex"."foo" is created...
    --[populateTable] Currently populating table: `foo`
    --[createTable] Table "uniplex"."boardImage" is created...
    --[populateTable] Currently populating table: `boardImage`
    --[createTable] Table "uniplex"."domainTemplate" is created...
    --[populateTable] Currently populating table: `domainTemplate`
    --[createTable] Table "uniplex"."event" is created...
    --[populateTable] Currently populating table: `event`
    --[createTable] Table "uniplex"."edaLibrary" is created...
    --[populateTable] Currently populating table: `edaLibrary`
    --[createTable] Table "uniplex"."firmware" is created...
    --[populateTable] Currently populating table: `firmware`
    --[createTable] Table "uniplex"."boardLayer" is created...
    --[populateTable] Currently populating table: `boardLayer`
    --[createTable] Table "uniplex"."image" is created...
    --[populateTable] Currently populating table: `image`
    --[createTable] Table "uniplex"."domainFootprint" is created...
    --[populateTable] Currently populating table: `domainFootprint`
    --[createTable] Table "uniplex"."issueComment" is created...
    --[populateTable] Currently populating table: `issueComment`
    --[createTable] Table "uniplex"."installed_migrations" is created...
    --[populateTable] Currently populating table: `installed_migrations`
    --[createTable] Table "uniplex"."issue" is created...
    --[populateTable] Currently populating table: `issue`
    --[createTable] Table "uniplex"."instName" is created...
    --[populateTable] Currently populating table: `instName`
    --[createTable] Table "uniplex"."orderAttachment" is created...
    --[populateTable] Currently populating table: `orderAttachment`
    --[createTable] Table "uniplex"."panel" is created...
    --[populateTable] Currently populating table: `panel`
    --[createTable] Table "uniplex"."footprint" is created...
    --[populateTable] Currently populating table: `footprint`
    --[createTable] Table "uniplex"."inst" is created...
    --[populateTable] Currently populating table: `inst`
    --[createTable] Table "uniplex"."domainSymbol" is created...
    --[populateTable] Currently populating table: `domainSymbol`
    --[createTable] Table "uniplex"."overageCostTrackType" is created...
    --[populateTable] Currently populating table: `overageCostTrackType`
    --[createTable] Table "uniplex"."overageCostTrack" is created...
    --[populateTable] Currently populating table: `overageCostTrack`
    --[createTable] Table "uniplex"."partDocument" is created...
    --[populateTable] Currently populating table: `partDocument`
    --[createTable] Table "uniplex"."partRelation" is created...
    --[populateTable] Currently populating table: `partRelation`
    --[createTable] Table "uniplex"."partCache" is created...
    --[populateTable] Currently populating table: `partCache`
    --[createTable] Table "uniplex"."assemblyPricing" is created...
    --[populateTable] Currently populating table: `assemblyPricing`
    --[createTable] Table "uniplex"."eventTransaction" is created...
    --[populateTable] Currently populating table: `eventTransaction`
    --[createTable] Table "uniplex"."project" is created...
    --[populateTable] Currently populating table: `project`
    --[createTable] Table "uniplex"."projectOrderPcb" is created...
    --[populateTable] Currently populating table: `projectOrderPcb`
    --[createTable] Table "uniplex"."progress" is created...
    --[populateTable] Currently populating table: `progress`
    --[createTable] Table "uniplex"."projectOrder" is created...
    --[populateTable] Currently populating table: `projectOrder`
    --[createTable] Table "uniplex"."projectOrderStencil" is created...
    --[populateTable] Currently populating table: `projectOrderStencil`
    --[createTable] Table "uniplex"."octoV3PartCache" is created...
    --[populateTable] Currently populating table: `octoV3PartCache`
    --[createTable] Table "uniplex"."partSpecValue" is created...
    --[populateTable] Currently populating table: `partSpecValue`
    --[createTable] Table "uniplex"."projectStar" is created...
    --[populateTable] Currently populating table: `projectStar`
    --[createTable] Table "uniplex"."schematic" is created...
    --[populateTable] Currently populating table: `schematic`
    --[createTable] Table "uniplex"."schematicPage" is created...
    --[populateTable] Currently populating table: `schematicPage`
    --[createTable] Table "uniplex"."projectRevision" is created...
    --[populateTable] Currently populating table: `projectRevision`
    --[createTable] Table "uniplex"."stripeSource" is created...
    --[populateTable] Currently populating table: `stripeSource`
    --[createTable] Table "uniplex"."projectRevisionFiles" is created...
    --[populateTable] Currently populating table: `projectRevisionFiles`
    --[createTable] Table "uniplex"."projectBom" is created...
    --[populateTable] Currently populating table: `projectBom`
    --[createTable] Table "uniplex"."projectBomRef" is created...
    --[populateTable] Currently populating table: `projectBomRef`
    --[createTable] Table "uniplex"."userOrganization" is created...
    --[populateTable] Currently populating table: `userOrganization`
    --[createTable] Table "uniplex"."userComponent" is created...
    --[populateTable] Currently populating table: `userComponent`
    --[populateTable] Total rows to insert into "uniplex"."assemblySnapshot": 48
    --[populateTable] Total rows to insert into "uniplex"."assemblyTimeTrackTask": 18
    --[populateTable] Total rows to insert into "uniplex"."boardOrder": 47
    --[populateTable] Total rows to insert into "uniplex"."assemblyTimeTrack": 65
    --[populateTable] Total rows to insert into "uniplex"."assemblyTiming": 39
    --[populateTable] Total rows to insert into "uniplex"."boardQuote": 3
    --[populateTable] Total rows to insert into "uniplex"."accessControlList": 179
    --[populateTable] Total rows to insert into "uniplex"."boardSnapshot": 44
    --[populateTable] Total rows to insert into "uniplex"."categorySchema": 0
    --[processEnum] Defines "ENUMs" for table "uniplex"."categorySchema"
    --[processNull] Defines "NULLs" for table: "uniplex"."categorySchema"
    --[populateTable] Total rows to insert into "uniplex"."board": 772
    --[populateTable] Total rows to insert into "uniplex"."categoryValue": 0
    --[processEnum] Defines "ENUMs" for table "uniplex"."categoryValue"
    --[processNull] Defines "NULLs" for table: "uniplex"."categoryValue"
    --[populateTable] Total rows to insert into "uniplex"."bomSnapshot": 587
    --[populateTable] Total rows to insert into "uniplex"."categoryAttribute": 0
    --[processEnum] Defines "ENUMs" for table "uniplex"."categoryAttribute"
    --[processNull] Defines "NULLs" for table: "uniplex"."categoryAttribute"
    --[populateTable] Total rows to insert into "uniplex"."component": 1719
    --[populateTable] Total rows to insert into "uniplex"."comment": 70
    --[populateTable] Total rows to insert into "uniplex"."document": 313
    --[populateTable] Total rows to insert into "uniplex"."commentAttachment": 11
    --[populateTable] Total rows to insert into "uniplex"."domainBody": 133
    --[populateTable] Total rows to insert into "uniplex"."foo": 0
    --[processEnum] Defines "ENUMs" for table "uniplex"."foo"
    --[processNull] Defines "NULLs" for table: "uniplex"."foo"
    --[processDefault] Defines default values for table: "uniplex"."foo"
    --[processNull] Set "ENUM" for table "uniplex"."categorySchema" column: "categoryID"
    --[processNull] Set "ENUM" for table "uniplex"."categoryValue" column: "objectID"
    --[createTable] Table "uniplex"."part" is created...
    --[populateTable] Currently populating table: `part`
    --[populateTable] Total rows to insert into "uniplex"."category": 633
    --[populateTable] Total rows to insert into "uniplex"."domainTemplate": 45
    --[populateTable] Total rows to insert into "uniplex"."firmware": 15
    --[populateTable] Total rows to insert into "uniplex"."edaLibrary": 1016
    --[populateTable] Total rows to insert into "uniplex"."image": 0
    --[processEnum] Defines "ENUMs" for table "uniplex"."image"
    --[processNull] Defines "NULLs" for table: "uniplex"."image"
    --[processNull] Set "ENUM" for table "uniplex"."categorySchema" column: "attributeID"
    --[populateTable] Total rows to insert into "uniplex"."event": 2561
    --[populateTable] Total rows to insert into "uniplex"."issueComment": 70
    --[populateTable] Total rows to insert into "uniplex"."installed_migrations": 1
    --[processNull] Set "ENUM" for table "uniplex"."categoryValue" column: "objectType"
    --[createTable] Table "uniplex"."projectQuote" is created...
    --[populateTable] Currently populating table: `projectQuote`
    --[populateTable] Total rows to insert into "uniplex"."issue": 47
    --[populateTable] Total rows to insert into "uniplex"."panel": 12
    --[populateTable] Total rows to insert into "uniplex"."footprint": 395
    --[populateTable] Total rows to insert into "uniplex"."orderAttachment": 26
    --[populateTable] Total rows to insert into "uniplex"."instName": 2769
    --[populateTable] Total rows to insert into "uniplex"."inst": 2105
    --[processNull] Set "ENUM" for table "uniplex"."categorySchema" column: "id"
    --[populateTable] Total rows to insert into "uniplex"."overageCostTrackType": 4
    --[populateTable] Total rows to insert into "uniplex"."overageCostTrack": 4
    --[populateTable] Total rows to insert into "uniplex"."partRelation": 0
    --[processEnum] Defines "ENUMs" for table "uniplex"."partRelation"
    --[processNull] Defines "NULLs" for table: "uniplex"."partRelation"
    --[populateTable] Total rows to insert into "uniplex"."partDocument": 313
    --[populateTable] Total rows to insert into "uniplex"."partCache": 0
    --[processEnum] Defines "ENUMs" for table "uniplex"."partCache"
    --[processNull] Defines "NULLs" for table: "uniplex"."partCache"
    --[populateTable] Total rows to insert into "uniplex"."domainSymbol": 1197
    --[populateTable] Total rows to insert into "uniplex"."project": 467
    --[populateTable] Total rows to insert into "uniplex"."projectOrderPcb": 1
    --[processNull] Set "ENUM" for table "uniplex"."categoryAttribute" column: "name"
    --[processNull] Set "ENUM" for table "uniplex"."categoryValue" column: "categoryID"
    --[populateTable] Total rows to insert into "uniplex"."progress": 1752
    --[populateTable] Total rows to insert into "uniplex"."projectOrderStencil": 45
    --[populateTable] Total rows to insert into "uniplex"."projectOrder": 62
    --[populateTable] Total rows to insert into "uniplex"."projectStar": 3
    --[populateTable] Total rows to insert into "uniplex"."schematic": 773
    --[populateTable] Total rows to insert into "uniplex"."schematicPage": 1356
    --[populateTable] Total rows to insert into "uniplex"."projectRevision": 826
    --[populateTable] Total rows to insert into "uniplex"."stripeSource": 10
    --[populateTable] Total rows to insert into "uniplex"."partSpecValue": 250
    --[populateTable] Total rows to insert into "uniplex"."boardImage": 3265
    --[populateTable] Total rows to insert into "uniplex"."userOrganization": 62
    --[populateTable] Total rows to insert into "uniplex"."userComponent": 1165
    --[processNull] Set "ENUM" for table "uniplex"."categorySchema" column: "createdAt"
    --[populateTable] Total rows to insert into "uniplex"."boardLayer": 7437
    --[processNull] Set "ENUM" for table "uniplex"."categoryAttribute" column: "valueType"
    --[processNull] Set "ENUM" for table "uniplex"."categoryValue" column: "attributeID"
    --[processNull] Set "ENUM" for table "uniplex"."categorySchema" column: "updatedAt"
    --[processDefault] Defines default values for table: "uniplex"."categorySchema"
    --[createSequence] Trying to create sequence : "uniplex"."categorySchema_id_seq"
    --[populateTable] Total rows to insert into "uniplex"."projectRevisionFiles": 5501
    --[processNull] Set "ENUM" for table "uniplex"."categoryValue" column: "id"
    --[processNull] Set "ENUM" for table "uniplex"."categoryAttribute" column: "id"
    --[createTable] Table "uniplex"."sourceAvailability" is created...
    --[populateTable] Currently populating table: `sourceAvailability`
    --[createTable] Table "uniplex"."sourcePricing" is created...
    --[populateTable] Currently populating table: `sourcePricing`
    --[processNull] Set "ENUM" for table "uniplex"."categoryValue" column: "createdAt"
    --[processNull] Set "ENUM" for table "uniplex"."categoryAttribute" column: "createdAt"
    --[processNull] Set "ENUM" for table "uniplex"."image" column: "consumerID"
    --[createTable] Table "uniplex"."user" is created...
    --[populateTable] Currently populating table: `user`
    --[processNull] Set "ENUM" for table "uniplex"."categoryValue" column: "updatedAt"
    --[processDefault] Defines default values for table: "uniplex"."categoryValue"
    --[createSequence] Trying to create sequence : "uniplex"."categoryValue_id_seq"
    --[processNull] Set "ENUM" for table "uniplex"."categoryAttribute" column: "updatedAt"
    --[processDefault] Defines default values for table: "uniplex"."categoryAttribute"
    --[createSequence] Trying to create sequence : "uniplex"."categoryAttribute_id_seq"
    --[createTable] Table "uniplex"."userAuthProvider" is created...
    --[populateTable] Currently populating table: `userAuthProvider`
    --[processNull] Set "ENUM" for table "uniplex"."image" column: "consumerType"
    --[createTable] Table "uniplex"."userAddress" is created...
    --[populateTable] Currently populating table: `userAddress`
    --[populateTable] Total rows to insert into "uniplex"."projectBomRef": 68023
    --[populateTable] Total rows to insert into "uniplex"."assemblyPricing": 98472
    --[processNull] Set "ENUM" for table "uniplex"."image" column: "id"
    --[processIndexAndKey] "uniplex"."foo": PK/indices are successfully set...
    --[processNull] Set "ENUM" for table "uniplex"."partRelation" column: "type"
    --[createTable] Table "uniplex"."projectBomSubstitute" is created...
    --[populateTable] Currently populating table: `projectBomSubstitute`
    --[processNull] Set "ENUM" for table "uniplex"."image" column: "createdAt"
    --[processNull] Set "ENUM" for table "uniplex"."partRelation" column: "superiorID"
    --[processNull] Set "ENUM" for table "uniplex"."image" column: "updatedAt"
    --[processDefault] Defines default values for table: "uniplex"."image"
    --[createSequence] Trying to create sequence : "uniplex"."image_id_seq"
    --[processNull] Set "ENUM" for table "uniplex"."partRelation" column: "inferiorID"
    --[processNull] Set "ENUM" for table "uniplex"."partRelation" column: "creatorID"
    --[processNull] Set "ENUM" for table "uniplex"."partRelation" column: "id"
    --[processNull] Set "ENUM" for table "uniplex"."partRelation" column: "createdAt"
    --[processNull] Set "ENUM" for table "uniplex"."partCache" column: "id"
    --[processNull] Set "ENUM" for table "uniplex"."partRelation" column: "updatedAt"
    --[processDefault] Defines default values for table: "uniplex"."partRelation"
    --[processNull] Set "ENUM" for table "uniplex"."partCache" column: "createdAt"
    --[createTable] Table "uniplex"."source" is created...
    --[populateTable] Currently populating table: `source`
    --[processNull] Set "ENUM" for table "uniplex"."partCache" column: "updatedAt"
    --[processDefault] Defines default values for table: "uniplex"."partCache"
    --[createSequence] Trying to create sequence : "uniplex"."partCache_id_seq"
    --[populateTable] Total rows to insert into "uniplex"."eventTransaction": 31288
    --[populateTable] Total rows to insert into "uniplex"."domainFootprint": 716
    --[populateTable] Total rows to insert into "uniplex"."part": 19115
    --[populateTable] Total rows to insert into "uniplex"."projectQuote": 6203
    --[populateTable] Total rows to insert into "uniplex"."boardPricing": 136255
    --[createSequence] Sequence "uniplex"."categoryValue_id_seq" is created...
    --[createSequence] Sequence "uniplex"."categoryAttribute_id_seq" is created...
    --[createSequence] Sequence "uniplex"."categorySchema_id_seq" is created...
    --[createSequence] Sequence "uniplex"."image_id_seq" is created...
    --[populateTable] Total rows to insert into "uniplex"."projectBom": 24200
    --[populateTable] Total rows to insert into "uniplex"."user": 2058
    --[populateTable] Total rows to insert into "uniplex"."userAuthProvider": 2050
    --[populateTable] Total rows to insert into "uniplex"."userAddress": 44
    --[populateTable] Total rows to insert into "uniplex"."sourceAvailability": 247835
    --[populateTableWorker]  For now inserted: 48 rows, Total rows to insert into "uniplex"."assemblySnapshot": 48
    --[populateTable] Total rows to insert into "uniplex"."octoV3PartCache": 4640
    --[populateTable] Total rows to insert into "uniplex"."sourcePricing": 567961
    --[processEnum] Defines "ENUMs" for table "uniplex"."assemblySnapshot"
    --[processNull] Defines "NULLs" for table: "uniplex"."assemblySnapshot"
    --[populateTable] Total rows to insert into "uniplex"."projectBomSubstitute": 13226
    --[populateTable] Total rows to insert into "uniplex"."source": 269925
    --[populateTableWorker]  For now inserted: 18 rows, Total rows to insert into "uniplex"."assemblyTimeTrackTask": 18
    --[processEnum] Defines "ENUMs" for table "uniplex"."assemblyTimeTrackTask"
    --[processNull] Defines "NULLs" for table: "uniplex"."assemblyTimeTrackTask"
    --[processDefault] Set default value for table "uniplex"."partRelation" column: "type"
    --[populateTableWorker]  For now inserted: 65 rows, Total rows to insert into "uniplex"."assemblyTimeTrack": 65
    --[processDefault] Set default value for table "uniplex"."partRelation" column: "moderated"
    --[createSequence] Trying to create sequence : "uniplex"."partRelation_id_seq"
    --[processEnum] Defines "ENUMs" for table "uniplex"."assemblyTimeTrack"
    --[processNull] Defines "NULLs" for table: "uniplex"."assemblyTimeTrack"
    --[populateTableWorker]  For now inserted: 39 rows, Total rows to insert into "uniplex"."assemblyTiming": 39
    --[populateTableWorker]  For now inserted: 3 rows, Total rows to insert into "uniplex"."boardQuote": 3
    --[processEnum] Defines "ENUMs" for table "uniplex"."boardQuote"
    --[processNull] Defines "NULLs" for table: "uniplex"."boardQuote"
    --[processEnum] Defines "ENUMs" for table "uniplex"."assemblyTiming"
    --[processNull] Defines "NULLs" for table: "uniplex"."assemblyTiming"
    --[populateTableWorker]  For now inserted: 179 rows, Total rows to insert into "uniplex"."accessControlList": 179
    --[processEnum] Defines "ENUMs" for table "uniplex"."accessControlList"
    --[processNull] Defines "NULLs" for table: "uniplex"."accessControlList"
    --[populateTableWorker]  For now inserted: 44 rows, Total rows to insert into "uniplex"."boardSnapshot": 44
    --[processEnum] Defines "ENUMs" for table "uniplex"."boardSnapshot"
    --[processNull] Defines "NULLs" for table: "uniplex"."boardSnapshot"
    --[populateTableWorker]  For now inserted: 47 rows, Total rows to insert into "uniplex"."boardOrder": 47
    --[processEnum] Defines "ENUMs" for table "uniplex"."boardOrder"
    --[processNull] Defines "NULLs" for table: "uniplex"."boardOrder"
    --[createSequence] Sequence "uniplex"."partCache_id_seq" is created...
    --[populateTableWorker]  For now inserted: 772 rows, Total rows to insert into "uniplex"."board": 772
    --[processEnum] Defines "ENUMs" for table "uniplex"."board"
    --[processNull] Defines "NULLs" for table: "uniplex"."board"
    --[populateTableWorker]  For now inserted: 587 rows, Total rows to insert into "uniplex"."bomSnapshot": 587
    --[processEnum] Defines "ENUMs" for table "uniplex"."bomSnapshot"
    --[processNull] Defines "NULLs" for table: "uniplex"."bomSnapshot"
    --[populateTableWorker]  For now inserted: 1719 rows, Total rows to insert into "uniplex"."component": 1719
    --[processEnum] Defines "ENUMs" for table "uniplex"."component"
    --[processNull] Defines "NULLs" for table: "uniplex"."component"
    --[populateTableWorker]  For now inserted: 70 rows, Total rows to insert into "uniplex"."comment": 70
    --[processEnum] Defines "ENUMs" for table "uniplex"."comment"
    --[processNull] Defines "NULLs" for table: "uniplex"."comment"
    --[populateTableWorker]  For now inserted: 313 rows, Total rows to insert into "uniplex"."document": 313
    --[processEnum] Defines "ENUMs" for table "uniplex"."document"
    --[processNull] Defines "NULLs" for table: "uniplex"."document"
    --[populateTableWorker]  For now inserted: 11 rows, Total rows to insert into "uniplex"."commentAttachment": 11
    --[processEnum] Defines "ENUMs" for table "uniplex"."commentAttachment"
    --[processNull] Defines "NULLs" for table: "uniplex"."commentAttachment"
    --[populateTableWorker]  For now inserted: 133 rows, Total rows to insert into "uniplex"."domainBody": 133
    --[processEnum] Defines "ENUMs" for table "uniplex"."domainBody"
    --[processNull] Defines "NULLs" for table: "uniplex"."domainBody"
    --[populateTableWorker]  For now inserted: 633 rows, Total rows to insert into "uniplex"."category": 633
    --[processEnum] Defines "ENUMs" for table "uniplex"."category"
    --[processNull] Defines "NULLs" for table: "uniplex"."category"
    --[processNull] Set "ENUM" for table "uniplex"."assemblySnapshot" column: "createdAt"
    --[processNull] Set "ENUM" for table "uniplex"."assemblySnapshot" column: "updatedAt"
    --[processNull] Set "ENUM" for table "uniplex"."assemblySnapshot" column: "projectOrderId"
    --[processNull] Set "ENUM" for table "uniplex"."assemblySnapshot" column: "projectRevisionId"
    --[processNull] Set "ENUM" for table "uniplex"."assemblySnapshot" column: "boardAreaMM"
    --[processNull] Set "ENUM" for table "uniplex"."assemblySnapshot" column: "bothSides"
    --[processNull] Set "ENUM" for table "uniplex"."assemblySnapshot" column: "leadlessFits"
    --[processNull] Set "ENUM" for table "uniplex"."assemblySnapshot" column: "id"
    --[processNull] Set "ENUM" for table "uniplex"."assemblySnapshot" column: "leadlessPads"
    --[processNull] Set "ENUM" for table "uniplex"."assemblySnapshot" column: "lineItems"
    --[processNull] Set "ENUM" for table "uniplex"."assemblySnapshot" column: "smtFits"
    --[processNull] Set "ENUM" for table "uniplex"."assemblySnapshot" column: "smtPads"
    --[processNull] Set "ENUM" for table "uniplex"."assemblySnapshot" column: "solderJoints"
    --[processNull] Set "ENUM" for table "uniplex"."assemblySnapshot" column: "thFits"
    --[processNull] Set "ENUM" for table "uniplex"."assemblySnapshot" column: "thPads"
    --[processNull] Set "ENUM" for table "uniplex"."assemblySnapshot" column: "minimumPadPitchMM"
    --[processDefault] Defines default values for table: "uniplex"."assemblySnapshot"
    --[createSequence] Trying to create sequence : "uniplex"."assemblySnapshot_id_seq"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTimeTrackTask" column: "id"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTimeTrackTask" column: "taskName"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTimeTrackTask" column: "taskRate"
    --[processDefault] Defines default values for table: "uniplex"."assemblyTimeTrackTask"
    --[createSequence] Trying to create sequence : "uniplex"."assemblyTimeTrackTask_id_seq"
    --[populateTableWorker]  For now inserted: 15 rows, Total rows to insert into "uniplex"."firmware": 15
    --[processEnum] Defines "ENUMs" for table "uniplex"."firmware"
    --[processNull] Defines "NULLs" for table: "uniplex"."firmware"
    --[populateTableWorker]  For now inserted: 45 rows, Total rows to insert into "uniplex"."domainTemplate": 45
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTimeTrack" column: "id"
    --[processEnum] Defines "ENUMs" for table "uniplex"."domainTemplate"
    --[processNull] Defines "NULLs" for table: "uniplex"."domainTemplate"
    --[processNull] Set "ENUM" for table "uniplex"."boardQuote" column: "id"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTimeTrack" column: "createdAt"
    --[createSequence] Sequence "uniplex"."partRelation_id_seq" is created...
    --[processNull] Set "ENUM" for table "uniplex"."boardQuote" column: "createdAt"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTimeTrack" column: "updatedAt"
    --[processNull] Set "ENUM" for table "uniplex"."boardQuote" column: "boardID"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTimeTrack" column: "projectOrderId"
    --[processNull] Set "ENUM" for table "uniplex"."boardQuote" column: "instID"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTimeTrack" column: "userId"
    --[processNull] Set "ENUM" for table "uniplex"."boardQuote" column: "boardsPerPanel"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTimeTrack" column: "assemblyTimeTrackTaskId"
    --[processNull] Set "ENUM" for table "uniplex"."boardQuote" column: "panelQuantity"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTiming" column: "id"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTimeTrack" column: "duration"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTiming" column: "updatedAt"
    --[processNull] Set "ENUM" for table "uniplex"."boardQuote" column: "setupPrice"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTimeTrack" column: "hasTiming"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTiming" column: "timeTrackId"
    --[processNull] Set "ENUM" for table "uniplex"."boardQuote" column: "unitPrice"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTimeTrack" column: "timingOverriden"
    --[processDefault] Defines default values for table: "uniplex"."assemblyTimeTrack"
    --[processNull] Set "ENUM" for table "uniplex"."boardQuote" column: "shippingPrice"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTiming" column: "startedAt"
    --[processNull] Set "ENUM" for table "uniplex"."boardQuote" column: "leadTime"
    --[processDefault] Defines default values for table: "uniplex"."boardQuote"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTiming" column: "operatorId"
    --[processNull] Set "ENUM" for table "uniplex"."accessControlList" column: "id"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTiming" column: "taskId"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTiming" column: "orderId"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTiming" column: "pauseDuration"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTiming" column: "pauseMax"
    --[processNull] Set "ENUM" for table "uniplex"."assemblyTiming" column: "pauseCount"
    --[processDefault] Defines default values for table: "uniplex"."assemblyTiming"
    --[processNull] Set "ENUM" for table "uniplex"."accessControlList" column: "createdAt"
    --[processNull] Set "ENUM" for table "uniplex"."boardSnapshot" column: "id"
    --[processNull] Set "ENUM" for table "uniplex"."accessControlList" column: "updatedAt"
    --[processNull] Set "ENUM" for table "uniplex"."boardSnapshot" column: "createdAt"
    --[processNull] Set "ENUM" for table "uniplex"."accessControlList" column: "organizationID"
    --[processNull] Set "ENUM" for table "uniplex"."boardSnapshot" column: "updatedAt"
    --[processNull] Set "ENUM" for table "uniplex"."accessControlList" column: "role"
    --[processNull] Set "ENUM" for table "uniplex"."boardSnapshot" column: "projectOrderID"
    --[processNull] Set "ENUM" for table "uniplex"."accessControlList" column: "resource"
    --[processNull] Set "ENUM" for table "uniplex"."boardSnapshot" column: "projectRevisionID"
    --[processNull] Set "ENUM" for table "uniplex"."accessControlList" column: "resourceID"
    --[processNull] Set "ENUM" for table "uniplex"."boardSnapshot" column: "quantity"
    --[processNull] Set "ENUM" for table "uniplex"."accessControlList" column: "access"
    --[processDefault] Defines default values for table: "uniplex"."accessControlList"
    --[processNull] Set "ENUM" for table "uniplex"."boardSnapshot" column: "blindBuriedHoles"
    --[processNull] Set "ENUM" for table "uniplex"."boardSnapshot" column: "numLayers"
    --[processNull] Set "ENUM" for table "uniplex"."boardSnapshot" column: "silkScreenBothSides"
    --[processNull] Set "ENUM" for table "uniplex"."boardSnapshot" column: "hasSMDBothSides"
    --[processDefault] Defines default values for table: "uniplex"."boardSnapshot"
    --[populateTableWorker]  For now inserted: 1016 rows, Total rows to insert into "uniplex"."edaLibrary": 1016
    --[processEnum] Defines "ENUMs" for table "uniplex"."edaLibrary"
    --[processNull] Defines "NULLs" for table: "uniplex"."edaLibrary"
    --[populateTableWorker]  For now inserted: 2561 rows, Total rows to insert into "uniplex"."event": 2561
    --[processEnum] Defines "ENUMs" for table "uniplex"."event"
    --[processNull] Defines "NULLs" for table: "uniplex"."event"
    --[processNull] Set "ENUM" for table "uniplex"."boardOrder" column: "id"
    --[processNull] Set "ENUM" for table "uniplex"."boardOrder" column: "createdAt"
    --[processNull] Set "ENUM" for table "uniplex"."boardOrder" column: "updatedAt"
    --[processNull] Set "ENUM" for table "uniplex"."board" column: "id"
    --[processNull] Set "ENUM" for table "uniplex"."boardOrder" column: "projectOrderId"
    --[processNull] Set "ENUM" for table "uniplex"."board" column: "createdAt"
    --[processNull] Set "ENUM" for table "uniplex"."boardOrder" column: "name"
    --[processNull] Set "ENUM" for table "uniplex"."board" column: "updatedAt"
    --[processNull] Set "ENUM" for table "uniplex"."boardOrder" column: "quantity"
    --[processNull] Set "ENUM" for table "uniplex"."board" column: "projectRevisionID"
    --[processNull] Set "ENUM" for table "uniplex"."boardOrder" column: "leadTime"
    --[processNull] Set "ENUM" for table "uniplex"."board" column: "blindBuriedHoles"
    --[processNull] Set "ENUM" for table "uniplex"."bomSnapshot" column: "id"
    --[processNull] Set "ENUM" for table "uniplex"."boardOrder" column: "calculatedCost"
    --[processNull] Set "ENUM" for table "uniplex"."board" column: "numLayers"
    --[processNull] Set "ENUM" for table "uniplex"."bomSnapshot" column: "createdAt"
    --[processNull] Set "ENUM" for table "uniplex"."boardOrder" column: "userPrice"
    --[processNull] Set "ENUM" for table "uniplex"."board" column: "silkScreenBothSides"
    --[processNull] Set "ENUM" for table "uniplex"."boardOrder" column: "repeat"
    --[processDefault] Defines default values for table: "uniplex"."boardOrder"
    --[processNull] Set "ENUM" for table "uniplex"."bomSnapshot" column: "projectOrderID"
    --[processNull] Set "ENUM" for table "uniplex"."board" column: "hasSMDBothSides"
    --[processNull] Set "ENUM" for table "uniplex"."bomSnapshot" column: "bomId"
    --[processNull] Set "ENUM" for table "uniplex"."board" column: "selectedSolderMaskColor"
    --[processNull] Set "ENUM" for table "uniplex"."bomSnapshot" column: "partId"
    --[processNull] Set "ENUM" for table "uniplex"."board" column: "selectedSilkscreenColor"
    --[processDefault] Defines default values for table: "uniplex"."board"
    --[processNull] Set "ENUM" for table "uniplex"."bomSnapshot" column: "partUrn"
    --[processNull] Set "ENUM" for table "uniplex"."component" column: "partID"
    --[processNull] Set "ENUM" for table "uniplex"."bomSnapshot" column: "manufacturer"
    --[processNull] Set "ENUM" for table "uniplex"."component" column: "ownerID"
    --[processNull] Set "ENUM" for table "uniplex"."bomSnapshot" column: "supplier"
    --[processNull] Set "ENUM" for table "uniplex"."component" column: "id"
    --[processNull] Set "ENUM" for table "uniplex"."bomSnapshot" column: "spn"
    --[processNull] Set "ENUM" for table "uniplex"."comment" column: "id"
    --[processNull] Set "ENUM" for table "uniplex"."component" column: "createdAt"
    --[processNull] Set "ENUM" for table "uniplex"."bomSnapshot" column: "referenceDesignators"
    --[processNull] Set "ENUM" for table "uniplex"."comment" column: "createdAt"
    --[processNull] Set "ENUM" for table "uniplex"."component" column: "updatedAt"
    --[processDefault] Defines default values for table: "uniplex"."component"
    --[createSequence] Trying to create sequence : "uniplex"."component_id_seq"
    --[processNull] Set "ENUM" for table "uniplex"."bomSnapshot" column: "lineQuantity"
    --[processNull] Set "ENUM" for table "uniplex"."document" column: "id"
    --[processNull] Set "ENUM" for table "uniplex"."comment" column: "updatedAt"
    --[processNull] Set "ENUM" for table "uniplex"."document" column: "createdAt"
    --[processNull] Set "ENUM" for table "uniplex"."commentAttachment" column: "id"
    --[processNull] Set "ENUM" for table "uniplex"."comment" column: "userId"
    --[processNull] Set "ENUM" for table "uniplex"."bomSnapshot" column: "orderQuantity"
    --[processNull] Set "ENUM" for table "uniplex"."document" column: "updatedAt"
    --[processDefault] Defines default values for table: "uniplex"."document"
    --[processNull] Set "ENUM" for table "uniplex"."comment" column: "body"
    --[processDefault] Defines default values for table: "uniplex"."comment"
    --[createSequence] Trying to create sequence : "uniplex"."comment_id_seq"
    --[processNull] Set "ENUM" for table "uniplex"."commentAttachment" column: "commentId"
    --[processNull] Set "ENUM" for table "uniplex"."bomSnapshot" column: "unitPrice"
    --[processNull] Set "ENUM" for table "uniplex"."commentAttachment" column: "name"
    --[processNull] Set "ENUM" for table "uniplex"."bomSnapshot" column: "orderPrice"
    --[processDefault] Defines default values for table: "uniplex"."bomSnapshot"
    --[processNull] Set "ENUM" for table "uniplex"."commentAttachment" column: "url"
    --[processDefault] Defines default values for table: "uniplex"."commentAttachment"
    --[createSequence] Trying to create sequence : "uniplex"."commentAttachment_id_seq"
    --[processNull] Set "ENUM" for table "uniplex"."domainBody" column: "ownerID"
    --[populateTableWorker]  For now inserted: 70 rows, Total rows to insert into "uniplex"."issueComment": 70
    --[processEnum] Defines "ENUMs" for table "uniplex"."issueComment"
    --[processNull] Defines "NULLs" for table: "uniplex"."issueComment"
    --[populateTableWorker]  For now inserted: 1 rows, Total rows to insert into "uniplex"."installed_migrations": 1
    --[processEnum] Defines "ENUMs" for table "uniplex"."installed_migrations"
    --[processNull] Defines "NULLs" for table: "uniplex"."installed_migrations"
    --[processDefault] Defines default values for table: "uniplex"."installed_migrations"
    --[processNull] Set "ENUM" for table "uniplex"."domainBody" column: "templateID"
    --[processNull] Set "ENUM" for table "uniplex"."domainBody" column: "id"
    --[processNull] Set "ENUM" for table "uniplex"."domainBody" column: "createdAt"
    --[processNull] Set "ENUM" for table "uniplex"."domainBody" column: "updatedAt"
    --[processDefault] Defines default values for table: "uniplex"."domainBody"
    --[createSequence] Trying to create sequence : "uniplex"."domainBody_id_seq"
    --[populateTableWorker]  For now inserted: 47 rows, Total rows to insert into "uniplex"."issue": 47
    --[processEnum] Defines "ENUMs" for table "uniplex"."issue"
    --[processNull] Defines "NULLs" for table: "uniplex"."issue"
    --[populateTableWorker]  For now inserted: 12 rows, Total rows to insert into "uniplex"."panel": 12
    --[processEnum] Defines "ENUMs" for table "uniplex"."panel"
    --[processNull] Defines "NULLs" for table: "uniplex"."panel"
    --[populateTableWorker]  For now inserted: 26 rows, Total rows to insert into "uniplex"."orderAttachment": 26
    --[processEnum] Defines "ENUMs" for table "uniplex"."orderAttachment"
    --[processNull] Defines "NULLs" for table: "uniplex"."orderAttachment"
    --[populateTableWorker]  For now inserted: 2105 rows, Total rows to insert into "uniplex"."inst": 2105
    --[processEnum] Defines "ENUMs" for table "uniplex"."inst"
    --[processNull] Defines "NULLs" for table: "uniplex"."inst"
    --[populateTableWorker]  For now inserted: 2769 rows, Total rows to insert into "uniplex"."instName": 2769
    --[processEnum] Defines "ENUMs" for table "uniplex"."instName"
    --[processNull] Defines "NULLs" for table: "uniplex"."instName"
    --[populateTableWorker]  For now inserted: 395 rows, Total rows to insert into "uniplex"."footprint": 395
    --[processEnum] Defines "ENUMs" for table "uniplex"."footprint"
    --[processNull] Defines "NULLs" for table: "uniplex"."footprint"
    --[populateTableWorker]  For now inserted: 4 rows, Total rows to insert into "uniplex"."overageCostTrack": 4
    --[processEnum] Defines "ENUMs" for table "uniplex"."overageCostTrack"
    --[processNull] Defines "NULLs" for table: "uniplex"."overageCostTrack"
    --[populateTableWorker]  For now inserted: 313 rows, Total rows to insert into "uniplex"."partDocument": 313
    --[processEnum] Defines "ENUMs" for table "uniplex"."partDocument"
    --[processNull] Defines "NULLs" for table: "uniplex"."partDocument"
    --[populateTableWorker]  For now inserted: 467 rows, Total rows to insert into "uniplex"."project": 467
    --[populateTableWorker]  For now inserted: 4 rows, Total rows to insert into "uniplex"."overageCostTrackType": 4
    --[processEnum] Defines "ENUMs" for table "uniplex"."overageCostTrackType"
    --[processNull] Defines "NULLs" for table: "uniplex"."overageCostTrackType"
    --[processEnum] Defines "ENUMs" for table "uniplex"."project"
    --[processNull] Defines "NULLs" for table: "uniplex"."project"
    --[populateTableWorker]  For now inserted: 1 rows, Total rows to insert into "uniplex"."projectOrderPcb": 1
    --[populateTableWorker]  For now inserted: 207 rows, Total rows to insert into "uniplex"."domainSymbol": 1197
    --[processEnum] Defines "ENUMs" for table "uniplex"."projectOrderPcb"
    --[processNull] Defines "NULLs" for table: "uniplex"."projectOrderPcb"
    --[populateTableWorker]  For now inserted: 1752 rows, Total rows to insert into "uniplex"."progress": 1752
    --[processEnum] Defines "ENUMs" for table "uniplex"."progress"
    --[processNull] Defines "NULLs" for table: "uniplex"."progress"
    --[populateTableWorker]  For now inserted: 414 rows, Total rows to insert into "uniplex"."domainSymbol": 1197
    --[populateTableWorker]  For now inserted: 621 rows, Total rows to insert into "uniplex"."domainSymbol": 1197
    --[populateTableWorker]  For now inserted: 45 rows, Total rows to insert into "uniplex"."projectOrderStencil": 45
    --[processEnum] Defines "ENUMs" for table "uniplex"."projectOrderStencil"
    --[processNull] Defines "NULLs" for table: "uniplex"."projectOrderStencil"
    --[populateTableWorker]  For now inserted: 834 rows, Total rows to insert into "uniplex"."domainSymbol": 1197
    --[populateTableWorker]  For now inserted: 1038 rows, Total rows to insert into "uniplex"."domainSymbol": 1197
    --[populateTableWorker]  For now inserted: 1197 rows, Total rows to insert into "uniplex"."domainSymbol": 1197
    --[processEnum] Defines "ENUMs" for table "uniplex"."domainSymbol"
    --[processNull] Defines "NULLs" for table: "uniplex"."domainSymbol"
    --[populateTableWorker]  For now inserted: 773 rows, Total rows to insert into "uniplex"."schematic": 773
    --[processEnum] Defines "ENUMs" for table "uniplex"."schematic"
    --[processNull] Defines "NULLs" for table: "uniplex"."schematic"
    --[populateTableWorker]  For now inserted: 10 rows, Total rows to insert into "uniplex"."stripeSource": 10
    --[processEnum] Defines "ENUMs" for table "uniplex"."stripeSource"
    --[processNull] Defines "NULLs" for table: "uniplex"."stripeSource"
    --[populateTableWorker]  For now inserted: 1356 rows, Total rows to insert into "uniplex"."schematicPage": 1356
    --[processEnum] Defines "ENUMs" for table "uniplex"."schematicPage"
    --[processNull] Defines "NULLs" for table: "uniplex"."schematicPage"
    --[populateTableWorker]  For now inserted: 826 rows, Total rows to insert into "uniplex"."projectRevision": 826
    --[processEnum] Defines "ENUMs" for table "uniplex"."projectRevision"
    --[processNull] Defines "NULLs" for table: "uniplex"."projectRevision"
    --[populateTableWorker]  For now inserted: 250 rows, Total rows to insert into "uniplex"."partSpecValue": 250
    --[populateTableWorker]  For now inserted: 3 rows, Total rows to insert into "uniplex"."projectStar": 3
    --[processEnum] Defines "ENUMs" for table "uniplex"."projectStar"
    --[processNull] Defines "NULLs" for table: "uniplex"."projectStar"
    --[processEnum] Defines "ENUMs" for table "uniplex"."partSpecValue"
    --[processNull] Defines "NULLs" for table: "uniplex"."partSpecValue"
    --[populateTableWorker]  For now inserted: 62 rows, Total rows to insert into "uniplex"."projectOrder": 62
    --[processEnum] Defines "ENUMs" for table "uniplex"."projectOrder"
    --[processNull] Defines "NULLs" for table: "uniplex"."projectOrder"
    --[populateTableWorker]  For now inserted: 3265 rows, Total rows to insert into "uniplex"."boardImage": 3265
    --[processEnum] Defines "ENUMs" for table "uniplex"."boardImage"
    --[processNull] Defines "NULLs" for table: "uniplex"."boardImage"
    --[populateTableWorker]  For now inserted: 1165 rows, Total rows to insert into "uniplex"."userComponent": 1165
    --[processEnum] Defines "ENUMs" for table "uniplex"."userComponent"
    --[processNull] Defines "NULLs" for table: "uniplex"."userComponent"
    --[populateTableWorker]  For now inserted: 5501 rows, Total rows to insert into "uniplex"."projectRevisionFiles": 5501
    --[processEnum] Defines "ENUMs" for table "uniplex"."projectRevisionFiles"
    --[processNull] Defines "NULLs" for table: "uniplex"."projectRevisionFiles"
    --[populateTableWorker]  For now inserted: 7437 rows, Total rows to insert into "uniplex"."boardLayer": 7437
    --[processEnum] Defines "ENUMs" for table "uniplex"."boardLayer"
    --[processNull] Defines "NULLs" for table: "uniplex"."boardLayer"
    --[populateTableWorker]  For now inserted: 16940 rows, Total rows to insert into "uniplex"."projectBomRef": 68023
    --[populateTableWorker]  For now inserted: 33880 rows, Total rows to insert into "uniplex"."projectBomRef": 68023
    --[processNull] Set "ENUM" for table "uniplex"."category" column: "id"
    --[processNull] Set "ENUM" for table "uniplex"."category" column: "createdAt"
    --[processNull] Set "ENUM" for table "uniplex"."category" column: "updatedAt"
    --[processDefault] Defines default values for table: "uniplex"."category"
    --[createSequence] Trying to create sequence : "uniplex"."category_id_seq"
    --[populateTableWorker]  For now inserted: 50820 rows, Total rows to insert into "uniplex"."projectBomRef": 68023
    --[populateTableWorker]  For now inserted: 12668 rows, Total rows to insert into "uniplex"."assemblyPricing": 98472
    --[populateTableWorker]  For now inserted: 62 rows, Total rows to insert into "uniplex"."userOrganization": 62
    --[processEnum] Defines "ENUMs" for table "uniplex"."userOrganization"
    --[processNull] Defines "NULLs" for table: "uniplex"."userOrganization"
    --[populateTableWorker]  For now inserted: 51083 rows, Total rows to insert into "uniplex"."projectBomRef": 68023
    --[populateTableWorker]  For now inserted: 25336 rows, Total rows to insert into "uniplex"."assemblyPricing": 98472

<--- Last few GCs --->

  135023 ms: Scavenge 1398.0 (1457.1) -> 1398.0 (1457.1) MB, 3.9 / 0 ms (+ 1.4 ms in 1 steps since last GC) [allocation failure] [incremental marking delaying mark-sweep].
  135348 ms: Mark-sweep 1398.0 (1457.1) -> 1390.0 (1457.1) MB, 324.8 / 0 ms (+ 2.2 ms in 2 steps since start of marking, biggest step 1.4 ms) [last resort gc].
  135661 ms: Mark-sweep 1390.0 (1457.1) -> 1394.1 (1457.1) MB, 312.3 / 0 ms [last resort gc].


<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x381660f36ae1 <JS Object>
    1: stringify [/home/ollie/work/nmig/migration/fmtp/csvStringifyModified.js:~209] [pc=0x22322be38cbf] (this=0x189ff01f3989 <JS Object>,line=0x23d3941af399 <an Object with map 0x15692138d431>)
    2: write [/home/ollie/work/nmig/migration/fmtp/csvStringifyModified.js:184] [pc=0x22322be4779f] (this=0x189ff01f3989 <JS Object>,chunk=0x23d3941af399 <an Object with map 0x15692138d431>,encoding=0x3...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - process out of memory
Aborted
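The allocation failure above is V8 hitting its old-space ceiling (roughly 1.5 GB by default on Node versions of that era). A common workaround, not specific to nmig, is to raise the limit when launching, e.g. `node --max-old-space-size=4096 nmig.js`. To see how close a run is getting to the limit, a minimal instrumentation sketch (log this periodically during the migration):

```javascript
// Minimal heap-usage probe; process.memoryUsage() reports bytes,
// converted to MB here. Watching this during a large migration shows
// whether the run is approaching V8's old-space limit before it aborts.
const mb = (bytes) => (bytes / 1024 / 1024).toFixed(1);
const { heapUsed, heapTotal } = process.memoryUsage();
console.log(`heap: ${mb(heapUsed)} MB used of ${mb(heapTotal)} MB`);
```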

Cannot read property 'mySqlVarLenPgSqlFixedLen' of undefined

Hello, I'm unsure what is going on, but trying a migration of an existing MySQL 5.5 database gives:

/Users/joel/gitstuff/nmig/migration/fmtp/FromMySQL2PostgreSQL.js:198
        } else if ('decimal(19,2)' === mySqlDataType || objDataTypesMap[strDataType].mySqlVarLenPgSqlFixedLen) {
                                                                                    ^

TypeError: Cannot read property 'mySqlVarLenPgSqlFixedLen' of undefined
    at mapDataTypes (/Users/joel/gitstuff/nmig/migration/fmtp/FromMySQL2PostgreSQL.js:198:85)
    at /Users/joel/gitstuff/nmig/migration/fmtp/FromMySQL2PostgreSQL.js:911:69
    at null.<anonymous> (/Users/joel/gitstuff/nmig/node_modules/pg/node_modules/pg-pool/index.js:77:9)
    at Pool.dispense [as _dispense] (/Users/joel/gitstuff/nmig/node_modules/pg/node_modules/pg-pool/node_modules/generic-pool/lib/generic-pool.js:310:14)
    at Pool.release (/Users/joel/gitstuff/nmig/node_modules/pg/node_modules/pg-pool/node_modules/generic-pool/lib/generic-pool.js:429:8)
    at null.<anonymous> (/Users/joel/gitstuff/nmig/node_modules/pg/node_modules/pg-pool/index.js:72:21)
    at null.callback (/Users/joel/gitstuff/nmig/migration/fmtp/FromMySQL2PostgreSQL.js:918:45)
    at Query.handleReadyForQuery (/Users/joel/gitstuff/nmig/node_modules/pg/lib/query.js:114:10)
    at null.<anonymous> (/Users/joel/gitstuff/nmig/node_modules/pg/lib/client.js:172:19)
    at emitOne (events.js:82:20)
    at emit (events.js:169:7)

Error "const fs = require('fs');"

I'm on Ubuntu Server 14.04 and attempting to use NMIG. I have config.json properly configured, but when I run

nodejs nmig.js (I should be using this instead of 'node' because of a package conflict on Ubuntu, right?)

I get the following output:

/home/ryan/nmig/nmig.js:23
const fs   = require('fs');
^^^^^
SyntaxError: Use of const in strict mode.
    at Module._compile (module.js:439:25)
    at Object.Module._extensions..js (module.js:474:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)
    at Function.Module.runMain (module.js:497:10)
    at startup (node.js:119:16)
    at node.js:902:3

Any idea on how to resolve this issue?

How to migrate DATETIME with Timezone

Hi,

I'm migrating a large database from MySQL to PostgreSQL. I built the DB with Django, and I have a problem: nmig imports datetime columns as timestamp without time zone, which causes a lot of errors for me afterwards.

Large datasets not converting

I'm getting HUGE error logs consisting of entries like this:

        SQL: COPY "public"."carets_media_prop_media" FROM '/Users/mike/Developer/nmig/temporary_directory/carets_media_prop_media3923640.csv' DELIMITER ',' CSV;

        --[populateTableWorker] error: could not open file "/Users/mike/Developer/nmig/temporary_directory/carets_media_prop_media3933720.csv" for reading: No such file or directory


        SQL: COPY "public"."carets_media_prop_media" FROM '/Users/mike/Developer/nmig/temporary_directory/carets_media_prop_media3933720.csv' DELIMITER ',' CSV;

        --[populateTableWorker] error: could not open file "/Users/mike/Developer/nmig/temporary_directory/carets_media_prop_media3936240.csv" for reading: No such file or directory

Any idea how to fix this? I have attempted 3 times now to convert a database from MySQL to Postgres and always ran into these errors, which eventually caused the conversion process to grind to a halt.

Multiple COPY errors

Hi,

While migrating a database (on Windows), some tables output errors:
--[populateTableWorker] error: could not open file "path\to\csvfile.csv" for reading: Permission denied

Migrating views

Only one view was migrated: the one without back-ticks (`), even though back-ticks are standard MySQL practice.

For example:

select packages.shipment_id AS shipment_id,group_concat(packages.tracking_number separator ', ') AS tracking_numbers,sum(packages.insurance_amount) AS insurance_amount,sum(packages.weight) AS weight,sum(packages.cod_amount) AS cod_amount from packages group by packages.shipment_id

Also notice the function replacement that needs to be done as well...
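A minimal sketch of the kind of rewriting needed (an illustration only, not NMIG's actual view-conversion logic): back-ticks become double-quoted identifiers, and MySQL's group_concat(... separator '...') maps to PostgreSQL's string_agg(..., '...'):

```javascript
// Illustration: convert MySQL view SQL fragments toward PostgreSQL syntax.
// Back-ticks become double-quoted identifiers, and
// group_concat(expr separator 'sep') becomes string_agg(expr, 'sep').
function mysqlViewToPg(sql) {
    return sql
        .replace(/`/g, '"')
        .replace(/group_concat\(([^)]*?)\s+separator\s+'([^']*)'\)/gi,
                 "string_agg($1, '$2')");
}

console.log(mysqlViewToPg("group_concat(`packages`.`tracking_number` separator ', ')"));
// string_agg("packages"."tracking_number", ', ')
```

A real converter would also need to handle other MySQL-only functions (sum/group by are fine as-is), but this shows the two substitutions the example above requires.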

Error converting Views

In the MySQL DB I have a set of views, and conversion fails on them with a syntax error.

In MySQL the view is:
select tracks.id AS id,tracks.name AS name,best_laps.laptime AS laptime from (best_laps join tracks on((best_laps.id = tracks.id)))

nmig throws the following error:

--[ViewGenerator] error: syntax error at or near "("

SQL: CREATE OR REPLACE VIEW "public"."fastestlaps" AS select "tracks"."id" AS "id","tracks"."name" AS "name","best_laps"."laptime" AS "laptime" from "public".("best_laps" join "public"."tracks" on(("best_laps"."id" = "tracks"."id")));

The generated SQL has the opening bracket misplaced; it should come before "public", so the correct statement is:
CREATE OR REPLACE VIEW "public"."fastestlaps" AS select "tracks"."id" AS "id","tracks"."name" AS "name","best_laps"."laptime" AS "laptime" from ("public"."best_laps" join "public"."tracks" on(("best_laps"."id" = "tracks"."id")));

Error loading table data

Hello.
The following error appears during migration:

--[populateTableWorker] Error loading table data:
SELECT `id`,`journalized_id`,`journalized_type`,`user_id`,`notes`,IF(`created_on` IN('0000-00-00', '0000-00-00 00:00:00'), '-INFINITY', CAST(`created_on` AS CHAR)),`private_notes` FROM `journals` LIMIT 11683125,31155;

Thank you for great tool.

Migration differences

I'll use this as a catch-all for my observations, I can break them out if you'd rather.

Using NMIG off d88d364
PostgreSQL Server 9.5.4 (I know the main page says 9.3 is supported)
MySQL 5.6.16-1~exp1

If the issue is the PostgreSQL version I can retry, I'm hoping that it isn't...

I dumped the MySQL schema, I dumped a native PostgreSQL schema, and I dumped the schema from a MySQL -> PostgreSQL migrated result. I then compared the native PostgreSQL schema with the migrated schema with apgdiff. The following differences exist:

Indices are renamed

E.g.
In MySQL: KEY associations_origin_id_idx (origin_id),
In native PostgreSQL: associations_origin_id_idx
In migrated schema: public_associations_origin_id0_idx

Tinyint not converted to boolean

Addressed in #13 - I made the changes you suggested in the discussion (after verifying all my tinyints were booleans) and it worked and preserved the values. Would be marginally nicer to have a more accessible configuration, but that's getting picky. A toggle that would allow assuming all tinyint(1) are boolean would work for us.

Boolean default values are not preserved

E.g.
In MySQL: is_time_defined tinyint(1) DEFAULT '0'
In native PostgreSQL: is_time_defined boolean DEFAULT false
In migrated schema: is_time_defined boolean

String default values are not preserved

E.g.
In MySQL: `channel_code` varchar(3) DEFAULT ''
In native PostgreSQL: channel_code character varying(3) DEFAULT ''::character varying
In migrated schema: channel_code character varying(3)

Other than that, the migration went smoothly and the output is relatively easily corrected. Any configuration options I might be missing to address the above mentioned discrepancies? The only thing (besides #13) that I changed was the config.json. Let me know if I can provide any more information to help flesh this out.

Thanks a lot for this tool!

error on execution

/home/dapingwing/Downloads/nmig/main.js:26
fs.readFile(__dirname + '/config.json', (error, data) => {
^
SyntaxError: Unexpected token >
at Module._compile (module.js:439:25)
at Object.Module._extensions..js (module.js:474:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Function.Module.runMain (module.js:497:10)
at startup (node.js:119:16)
at node.js:902:3

TypeError: done is not a function

ubuntu 12.04, node 6.3.1:

node nmig.js

NMIG - the database migration tool
Copyright 2016 Anatoly Khaytovich <[email protected]>
 Boot...
--[readDataTypesMap] Data Types Map is loaded...
--[createLogsDirectory] Creating logs directory...
--[createLogsDirectory] Logs directory already exists...
--[createTemporaryDirectory] Creating temporary directory...

/root/nmig/migration/fmtp/FromMySQL2PostgreSQL.js:272
done();
^

TypeError: done is not a function
at pg.connect (/root/nmig/migration/fmtp/FromMySQL2PostgreSQL.js:272:17)
at adjustCallback (/root/nmig/node_modules/pg/node_modules/generic-pool/lib/generic-pool.js:187:7)
at /root/nmig/node_modules/pg/node_modules/generic-pool/lib/generic-pool.js:227:13
at connectError (/root/nmig/node_modules/pg/lib/index.js:54:9)
at g (events.js:286:16)
at emitOne (events.js:96:13)
at emit (events.js:188:7)
at . (/root/nmig/node_modules/pg/lib/client.js:121:14)
at emitOne (events.js:96:13)
at emit (events.js:188:7)

Timestamp import problem

When I ran this, it refused to transfer any tables with timestamp data.

I found out that the IF() function used to fetch dates from MySQL prevents the node mysql module from returning timestamps as dates. Instead of a Date, it returns a Buffer, which will not be imported from the CSV.
E.g. the query
select IF(cs_timestamp IN('0000-00-00', '0000-00-00 00:00:00'), '-INFINITY', cs_timestamp) from cs where id=1
returns a binary (buffer), but the query
select cs_timestamp from cs where id=1
returns a Date object.
I simply commented out the timestamp clause in ColumnsDataArranger.js, and it worked fine.
Maybe there is a better way to handle zero dates?
This was with version 2.11.1 of mysql.

Great tool, BTW!
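For context, the Buffer the driver returns in this case is just the ASCII bytes of the timestamp text, so a hypothetical workaround (a sketch, not NMIG code) is to decode it back to a string before CSV serialization:

```javascript
// Hypothetical helper: when the mysql driver hands back a Buffer instead
// of a Date (as happens with the IF(...) wrapper), the Buffer contains
// the plain ASCII bytes of the timestamp and can simply be decoded.
function timestampToString(value) {
    if (Buffer.isBuffer(value)) {
        return value.toString('utf8');
    }
    return value instanceof Date ? value.toISOString() : value;
}

const fromDriver = Buffer.from('2016-01-02 03:04:05', 'utf8');
console.log(timestampToString(fromDriver)); // 2016-01-02 03:04:05
```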

Make error messages clearer

I've got this error when trying to replicate Zabbix database from MySQL to PostgreSQL:

        --[populateTableWorker] Error loading table data:
SELECT `itemid` AS `itemid`,`type` AS `type`,`snmp_community` AS `snmp_community`,`snmp_oid` AS `snmp_oid`,`hostid` AS `hostid`,`name` AS `name`,`key_` AS `key_`,`delay` AS `delay`,`history` AS `history`,`trends` AS `trends`,`status` AS `status`,`value_type` AS `value_type`,`trapper_hosts` AS `trapper_hosts`,`units` AS `units`,`snmpv3_securityname` AS `snmpv3_securityname`,`snmpv3_securitylevel` AS `snmpv3_securitylevel`,`snmpv3_authpassphrase` AS `snmpv3_authpassphrase`,`snmpv3_privpassphrase` AS `snmpv3_privpassphrase`,`formula` AS `formula`,`error` AS `error`,`lastlogsize` AS `lastlogsize`,`logtimefmt` AS `logtimefmt`,`templateid` AS `templateid`,`valuemapid` AS `valuemapid`,`params` AS `params`,`ipmi_sensor` AS `ipmi_sensor`,`authtype` AS `authtype`,`username` AS `username`,`password` AS `password`,`publickey` AS `publickey`,`privatekey` AS `privatekey`,`mtime` AS `mtime`,`flags` AS `flags`,`interfaceid` AS `interfaceid`,`port` AS `port`,`description` AS `description`,`inventory_link` AS `inventory_link`,`lifetime` AS `lifetime`,`snmpv3_authprotocol` AS `snmpv3_authprotocol`,`snmpv3_privprotocol` AS `snmpv3_privprotocol`,`state` AS `state`,`snmpv3_contextname` AS `snmpv3_contextname`,`evaltype` AS `evaltype`,`jmx_endpoint` AS `jmx_endpoint`,`master_itemid` AS `master_itemid` FROM `items` LIMIT 88767,29589;

        --[loadData] Loading the data...
        --[populateTableWorker] Error loading table data:
SELECT `itemid` AS `itemid`,`type` AS `type`,`snmp_community` AS `snmp_community`,`snmp_oid` AS `snmp_oid`,`hostid` AS `hostid`,`name` AS `name`,`key_` AS `key_`,`delay` AS `delay`,`history` AS `history`,`trends` AS `trends`,`status` AS `status`,`value_type` AS `value_type`,`trapper_hosts` AS `trapper_hosts`,`units` AS `units`,`snmpv3_securityname` AS `snmpv3_securityname`,`snmpv3_securitylevel` AS `snmpv3_securitylevel`,`snmpv3_authpassphrase` AS `snmpv3_authpassphrase`,`snmpv3_privpassphrase` AS `snmpv3_privpassphrase`,`formula` AS `formula`,`error` AS `error`,`lastlogsize` AS `lastlogsize`,`logtimefmt` AS `logtimefmt`,`templateid` AS `templateid`,`valuemapid` AS `valuemapid`,`params` AS `params`,`ipmi_sensor` AS `ipmi_sensor`,`authtype` AS `authtype`,`username` AS `username`,`password` AS `password`,`publickey` AS `publickey`,`privatekey` AS `privatekey`,`mtime` AS `mtime`,`flags` AS `flags`,`interfaceid` AS `interfaceid`,`port` AS `port`,`description` AS `description`,`inventory_link` AS `inventory_link`,`lifetime` AS `lifetime`,`snmpv3_authprotocol` AS `snmpv3_authprotocol`,`snmpv3_privprotocol` AS `snmpv3_privprotocol`,`state` AS `state`,`snmpv3_contextname` AS `snmpv3_contextname`,`evaltype` AS `evaltype`,`jmx_endpoint` AS `jmx_endpoint`,`master_itemid` AS `master_itemid` FROM `items` LIMIT 0,29589;

It looks like an error, but the error message does not contain any useful information, so I can't fix anything to avoid this problem, because I don't know what to fix or where.
Please add the error text returned by the SQL server to your app's error messages, so users can understand what's going on without resorting to tricks like tcpdump'ing or reading the code.

Thank you!

Good project, but useless in production

Hello, I was glad to try your app, but I think I'm unable to migrate big tables: migrating a table with 70M rows takes ages.
Hint: do not use OFFSET; use the primary key if one exists.
Thanks.
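The hint above describes keyset pagination: instead of LIMIT offset,size, which makes MySQL rescan all skipped rows for every chunk, remember the last primary-key value seen and continue from it. A sketch, assuming an integer primary-key column named id (an assumption for illustration; NMIG currently pages with OFFSET):

```javascript
// Keyset pagination sketch: each chunk continues after the last id seen,
// so MySQL can seek via the primary-key index instead of scanning past
// `offset` rows. Assumes an integer PK column named `id`.
function nextChunkSql(table, lastId, chunkSize) {
    return 'SELECT * FROM `' + table + '` WHERE `id` > ' + lastId +
           ' ORDER BY `id` LIMIT ' + chunkSize + ';';
}

console.log(nextChunkSql('big_table', 70000000, 30000));
// SELECT * FROM `big_table` WHERE `id` > 70000000 ORDER BY `id` LIMIT 30000;
```

The caller would track the largest id of each fetched chunk and feed it back as lastId for the next one.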

empty string interpreted as NULL

Using the latest rev today (de78b97).

I have a NOT NULL column in a table, but an empty string in one place. I see this error during the translation:

    --[processNull] Error while setting NULL for "test_schema"."location"."address"...
error: column "address" contains null values
mysql> select * from location where address is null;
Empty set (0.00 sec)
mysql> select * from location where address = '';
+-----+------+---------+---------------------+
| id  | name | address | last_updated        |
+-----+------+---------+---------------------+
| 241 | me   |         | 2015-04-06 11:51:09 |
+-----+------+---------+---------------------+
1 row in set (0.00 sec)

Errors during data migration, columns aren't passed in the right order

While data is being copied between databases, it seems that for some tables data isn't passed in the right order from the csv files. For example, I get this message during data transfer :

error: invalid input syntax for integer: "Validated Data".

"Validated Data" is the string content of the first column of the table, named STATUS, followed by three integers. In the corresponding CSV file, the three integers are passed first, followed by the string from STATUS.

TypeError: Cannot read property 'rowCount' of undefined

I started getting this error after a large table with 1.7 million rows completed transferring. Any ideas what might be causing this?

    --[readDataPool] Data-Pool is loaded...
    --[loadData] Loading the data...
/home/forge/nmig/migration/fmtp/DataLoader.js:173
                                                                        if (isIntNumeric(result.rowCount)) {
                                                                                               ^

TypeError: Cannot read property 'rowCount' of undefined
    at /home/forge/nmig/migration/fmtp/DataLoader.js:173:96
    at null.callback (/home/forge/nmig/migration/fmtp/DataLoader.js:89:16)
    at Query.handleError (/home/forge/nmig/node_modules/pg/lib/query.js:131:17)
    at null.<anonymous> (/home/forge/nmig/node_modules/pg/lib/client.js:180:26)
    at emitOne (events.js:90:13)
    at emit (events.js:182:7)
    at Socket.<anonymous> (/home/forge/nmig/node_modules/pg/lib/connection.js:121:12)
    at emitOne (events.js:90:13)
    at Socket.emit (events.js:182:7)
    at readableAddChunk (_stream_readable.js:153:18)

Update: It was trying to read a table that I had renamed after interrupting the migration. I couldn't figure out how to remove the old table name from the migration list. Is there a way to do that?

From dump file

Is it possible, or is it planned, to allow migrating from a dump file (e.g. an .sql file)?

Thanks! Great tool!

Convert redmine database fails

I am trying to convert a Redmine database using the nmig tool.

It seems that datetime columns cannot be handled. For example, I've got:

    --[populateTableByInsert] INSERT failed...

error: incorrect binary data format in bind parameter 6

    SQL: INSERT INTO "redmine_production"."custom_values" VALUES($1,$2,$3,$4,$5,$6);

This is one line from custom_values214008.csv:
478524,Issue,27462,4,Val1,"{""type"":""Buffer"",""data"":[50,48,49,54,45,48,53,45,48,52,32,48,57,58,51,53,58,48,56]}"

This is mysql table:
mysql> desc custom_values;
+-----------------+-------------+------+-----+---------+----------------+
| Field           | Type        | Null | Key | Default | Extra          |
+-----------------+-------------+------+-----+---------+----------------+
| id              | int(11)     | NO   | PRI | NULL    | auto_increment |
| customized_type | varchar(30) | NO   | MUL |         |                |
| customized_id   | int(11)     | NO   |     | 0       |                |
| custom_field_id | int(11)     | NO   | MUL | 0       |                |
| value           | text        | YES  |     | NULL    |                |
| updated_on      | datetime    | YES  |     | NULL    |                |
+-----------------+-------------+------+-----+---------+----------------+
6 rows in set (0.00 sec)

mysql> select * from custom_values where id=478524;
+--------+-----------------+---------------+-----------------+-------+---------------------+
| id     | customized_type | customized_id | custom_field_id | value | updated_on          |
+--------+-----------------+---------------+-----------------+-------+---------------------+
| 478524 | Issue           | 27462         | 4               | Val1  | 2016-05-04 09:35:08 |
+--------+-----------------+---------------+-----------------+-------+---------------------+
1 row in set (0.00 sec)

I also tried setting a different datestyle in the new Postgres database:
ALTER DATABASE redmine_production SET datestyle = 'ISO, YMD';
or
ALTER DATABASE redmine_production SET datestyle = 'ISO, DMY';
but had no success: the same error occurs.
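The quoted value in the CSV line above is Node's JSON serialization of a Buffer holding the ASCII bytes of the datetime; decoding it shows the data itself survived and only its serialization went wrong:

```javascript
// Decode the {"type":"Buffer","data":[...]} JSON that ended up in the CSV:
// the byte array is just the ASCII text of the original datetime value.
const raw = '{"type":"Buffer","data":[50,48,49,54,45,48,53,45,48,52,32,48,57,58,51,53,58,48,56]}';
const decoded = Buffer.from(JSON.parse(raw).data).toString('utf8');
console.log(decoded); // 2016-05-04 09:35:08
```

This matches the updated_on value of row 478524, so changing the PostgreSQL datestyle cannot help; the Buffer needs to be decoded before the CSV is written.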

Tables containing a field of type `point` are not copied

I'm attempting to migrate a database from MySQL 5.5 to Postgres 9.5. Everything works great except for tables that include spatial data in the form of a point type field.

I checked logs_directory/errors-only.log and found several errors similar to this:

--[populateTableWorker] Error: ER_SP_DOES_NOT_EXIST: FUNCTION database_name.ST_AsWKB does not exist

SQL: SELECT `id`,`lr_id`,`parcel`,`street_num`,`street_dir`,`street_name`,`city`,`zip`,HEX(ST_AsWKB(`coords`)),`created_at`,`updated_at` FROM `lr_address` LIMIT 0,486;

The query works fine if I input it directly into MySQL, but for some reason it does not work in the nmig script.

No permissions

I'm getting this error:

SQL: COPY "test_schema"."FiscalNCM" FROM '/root/nmig/temporary_directory/FiscalNCM7248.csv' DELIMITER ',' CSV;

        --[populateTableWorker] error: could not open file "/root/nmig/temporary_directory/FiscalTipoProduto0.csv" for reading: Permission denied

But I'm running the script as root so it should not be a problem.

process out of memory

Looks like a memory leak as usage steadily climbs until bust.
Large mysql table ( 8,000,000 rows, 63 columns but none are TEXT). Machine has 4GB RAM
"data_chunk_size" : 32,

<--- Last few GCs --->

  131383 ms: Scavenge 1396.3 (1460.3) -> 1396.3 (1460.3) MB, 0.6 / 0 ms (+ 3.1 ms in 1 steps since last GC) [allocation failure] [incremental marking delaying mark-sweep].
  132672 ms: Mark-sweep 1396.3 (1460.3) -> 1395.5 (1460.3) MB, 1288.7 / 0 ms (+ 4.8 ms in 2 steps since start of marking, biggest step 3.1 ms) [last resort gc].
  133959 ms: Mark-sweep 1395.5 (1460.3) -> 1396.3 (1460.3) MB, 1287.7 / 0 ms [last resort gc].


<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x2b0326fe3ac1 <JS Object>
    1: stringify [/home/xxxx/src/nmig/migration/fmtp/csvStringifyModified.js:~209] [pc=0xcf9371cbcae] (this=0x6191bedd9a9 <JS Object>,line=0x8e7ee691059 <an Object with map 0x21cd8a610441>)
    2: write [/home/xxxx/src/nmig/migration/fmtp/csvStringifyModified.js:184] [pc=0xcf937106fd6] (this=0x6191bedd9a9 <JS Object>,chunk=0x8e7ee691059 <an Object with map 0x21cd8a610441>,encoding=0x2b032...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - process out of memory
Aborted

node --version

v5.7.1

uname -a

Linux xxx 2.6.32-431.el6.x86_64 #1 SMP Fri Nov 22 03:15:09 UTC 2013 x86_64 x86_64 x86_64 GNU/Linux

Delimiter configuration

Not an actual "issue", but it would be really useful to be able to customize the delimiter used for the internal CSV and COPY. In my case the original database had many fields containing commas, so using a comma as the delimiter was not an option. It took me a while to realize where the issue was; hardcoding another delimiter (';') did the job, but it would have been nice to be able to set the delimiter in config.json. Just a suggestion.

Thanks for the amazing job with this script :)
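A hypothetical config.json fragment to illustrate the suggestion (the "delimiter" key is invented here and does not exist in NMIG; "data_chunk_size" is an existing option):

```json
{
    "data_chunk_size": 10,
    "delimiter": ";"
}
```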

Problem with datetime

I'm trying to migrate from mysql to postgresql
I use windows 10
MySQL 5.0.24 on a remote server
PostgreSQL 9.5 on my machine

I get two kind of errors:

  1. In mySQL database I have columns of datetime type that give the following import error:
    --[populateTableWorker] error: invalid input syntax for type timestamp: "{"type":"Buffer","data":[50,48,49,49,45,48,57,45,48,55,32,49,55,58,48,48,58,48,48]}"
  2. I also get the error:
    --[populateTableByInsert] INSERT failed...
    error: incorrect binary data format in bind parameter 57
    SQL: INSERT INTO "test_schema"."tinfra_a1x2" VALUES($1,$2,$3,$4,$5,$6,$7,$8,$9,$10,$11,$12,$13,$14,$15,$16,$17,$18,$19,$20,$21,$22,$23,$24,$25,$26,$27,$28,$29,$30,$31,$32,$33,$34,$35,$36,$37,$38,$39,$40,$41,$42,$43,$44,$45,$46,$47,$48,$49,$50,$51,$52,$53,$54,$55,$56,$57);

errors-only.zip

Issues with blob columns not getting converted properly

I have an application with encrypted data stored in blob columns. When converting to postgres, it put the data in bytea columns, but doesn't seem to have converted the data properly.

I noticed that if I used the --hex-blob option when dumping with mysqldump and then use decode(..., 'hex') when importing the value, it worked.

For example:

mysqldump -u root -p --skip-quote-names --hex-blob --skip-triggers --compact --no-create-info <mydb> <my table> | sed "s/0x\([0-9A-F]*\)/decode('\1','hex')/g"
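The same hex round-trip idea in Node terms (an illustration of why --hex-blob plus decode(..., 'hex') preserves the bytes, not NMIG code):

```javascript
// Binary data survives a text pipeline when carried as hex: this mirrors
// mysqldump's --hex-blob on the way out and PostgreSQL's decode(...,'hex')
// on the way in.
const original = Buffer.from([0xde, 0xad, 0xbe, 0xef]);
const hex = original.toString('hex');        // 'deadbeef'
const restored = Buffer.from(hex, 'hex');
console.log(hex, original.equals(restored)); // deadbeef true
```

Writing blob columns as hex (or PostgreSQL's own \x... bytea escape) instead of raw bytes would avoid the corruption without going through mysqldump.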
