microsoft / dacfx

SQL Server database schema validation, deployment, and upgrade runtime. Enables declarative database development and database portability across SQL Server versions and environments.

Home Page: https://aka.ms/sqlpackage-ref

License: MIT License

C# 95.81% TSQL 4.19%
sql sql-server microsoft azure-sql dacfx dotnet dotnet-core


DacFx and Related Components

Components

• SqlPackage (📦 NuGet): Microsoft.SqlPackage is a cross-platform command-line utility for creating and deploying .dacpac and .bacpac packages. SqlPackage can be installed as a dotnet tool.
• DacFx (📦 NuGet): The Microsoft SQL Server Data-Tier Application Framework (Microsoft.SqlServer.DacFx) is a .NET library that provides application lifecycle services for database development and management for Microsoft SQL Server and Microsoft Azure SQL databases. Preview versions of DacFx are frequently released to NuGet.
• Dacpacs.Master and Dacpacs.Msdb (📦 Master, 📦 Msdb): Microsoft.SqlServer.Dacpacs.Master and Microsoft.SqlServer.Dacpacs.Msdb are NuGet packages containing .dacpac files for the Microsoft SQL Server system databases (master, msdb), with versions spanning SQL Server 2008 (100) through SQL Server 2022 (160).
• Dacpacs.Azure.Master (📦 NuGet): Microsoft.SqlServer.Dacpacs.Azure.Master is a NuGet package containing a .dacpac file for the Azure SQL Database master database.
• Dacpacs.Synapse.Master (📦 NuGet): Microsoft.SqlServer.Dacpacs.Synapse.Master is a NuGet package containing a .dacpac file for the Azure Synapse Analytics master database.
• Dacpacs.SynapseServerless.Master (📦 NuGet): Microsoft.SqlServer.Dacpacs.SynapseServerless.Master is a NuGet package containing a .dacpac file for the Azure Synapse Analytics serverless SQL pools master database.
• ScriptDom (📦 NuGet, 🛠️ Code): Microsoft.SqlServer.TransactSql.ScriptDom is a NuGet package containing the Transact-SQL parser ScriptDOM. The source code is licensed under MIT.
• Microsoft.Build.Sql (📦 NuGet, 🛠️ Code): Microsoft.Build.Sql (preview) is a .NET project SDK for SQL projects that compiles T-SQL code into a data-tier application package (.dacpac). While in preview, its source code is in this repository.
• Project Templates (📦 NuGet, 🛠️ Code): Microsoft.Build.Sql.Templates (preview) is a set of .NET project templates for SQL projects. While in preview, its source code is in this repository.
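As an illustration of the DacFx library described above, here is a minimal, hedged sketch of deploying a .dacpac from .NET code. It assumes the Microsoft.SqlServer.DacFx NuGet package is referenced; the file name, database name, and connection string are placeholders.

```csharp
// Minimal sketch: deploy a .dacpac with the DacFx API.
// Assumes the Microsoft.SqlServer.DacFx NuGet package is referenced;
// "ProductsTutorial.dacpac" and the connection string are placeholders.
using Microsoft.SqlServer.Dac;

var services = new DacServices("Server=localhost;Integrated Security=true;");
using var package = DacPackage.Load("ProductsTutorial.dacpac");

var options = new DacDeployOptions { BlockOnPossibleDataLoss = true };

// Creates the target database if it does not exist, or upgrades it in place.
services.Deploy(package, "ProductsTutorial", upgradeExisting: true, options);
```

SqlPackage wraps this same API, so the command-line examples in the Quickstart below perform the equivalent operations.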

Microsoft.Build.Sql projects documentation

Quickstart

🛠️ Install SqlPackage

SqlPackage is a command-line interface to DacFx and is available for Windows, macOS, and Linux. For more about SqlPackage, check out the reference page on Microsoft Docs. To create and deploy .dacpac and .bacpac packages from the command line, install SqlPackage as a dotnet tool:

dotnet tool install -g microsoft.sqlpackage

Alternatively, SqlPackage can be downloaded as a zip file from the SqlPackage documentation.

📁 Create a SQL project

Install the Microsoft.Build.Sql.Templates NuGet package to get started with a new SQL project.

dotnet new -i Microsoft.Build.Sql.Templates

Create a new SQL project using the sqlproj template.

dotnet new sqlproj -n ProductsTutorial

Add a new table dbo.Product in a .sql file alongside the project file.

CREATE TABLE [dbo].[Product](
    [ProductID] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY,
    [ProductName] [nvarchar](200) NOT NULL
);

Build the project to create a .dacpac file.

dotnet build

🛳️ Publish a SQL project

Publish a SQL project to a database using the SqlPackage publish command. Learn more about the publish command in the SqlPackage documentation, where additional examples and details on the parameters are available.

# example publish to Azure SQL Database using SQL authentication and a connection string
sqlpackage /Action:Publish /SourceFile:"bin/Debug/ProductsTutorial.dacpac" \
    /TargetConnectionString:"Server=tcp:{yourserver}.database.windows.net,1433;Initial Catalog=ProductsTutorial;User ID=sqladmin;Password={your_password};Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;"

Repository Focus

Feedback

This repository exists for transparently triaging and addressing feedback on DacFx, including the NuGet package and the cross-platform CLI SqlPackage. We welcome community interaction and suggestions! For more information on contributing feedback by interacting with issues, see Contributing.

Related Open Source Projects

This repository is available to make related open source components accessible even from their early stages. Feedback and contributions are welcome!

Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct. For more information see Code of Conduct.

Contributors

benjin, bjss-colin, chris904apps, dependabot[bot], dzsquared, erikej, j-dc, kisantia, llali, marijnz0r, microsoft-github-operations[bot], microsoftopensource, seenaaugusty, zijchen


dacfx's Issues

DacServices.GenerateCreateScript ignores DoNotEvaluateSqlCmdVariables

  • SqlPackage or DacFx Version: 150.5164.1
  • .NET Framework (Windows-only) or .NET Core: net5.0
  • Environment (local platform and source/target platforms): Windows 10 latest patch level

Steps to Reproduce:

  1. Create an instance of DacDeployOptions with DoNotEvaluateSqlCmdVariables set to true.
  2. Create an instance of DacPackage that contains a SqlCmdVariable.
  3. Run DacServices.GenerateCreateScript with the previously created instances as parameters.
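The steps above can be sketched as follows (a hedged repro, assuming the Microsoft.SqlServer.DacFx package; "MyPackage.dacpac" is a hypothetical package containing a SqlCmd variable named MyVariable):

```csharp
using Microsoft.SqlServer.Dac;

// DoNotEvaluateSqlCmdVariables = true should leave SqlCmd variables unevaluated.
var options = new DacDeployOptions { DoNotEvaluateSqlCmdVariables = true };

// Hypothetical package containing a SqlCmd variable named MyVariable.
using var package = DacPackage.Load("MyPackage.dacpac");

// Throws DeploymentFailedException complaining about missing SqlCmd values,
// even though the option above should prevent their evaluation.
string script = DacServices.GenerateCreateScript(package, "MyDatabase", options);
```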

You will get the following exception:

DeploymentFailedException: Missing values for the following SqlCmd variables:MyVariable.
DacServicesException: An error occurred during deployment plan generation. Deployment cannot continue.

Thank you!

Readonly filegroup causes new database deployment to fail

When using SqlPackage.exe to deploy a new database with a read-only filegroup, the deployment fails. I understand that for an existing database you would not want to automatically unlock and relock the database, but for a new database it should be fine.

I think the deployment should set the read-only status at the end of the deployment.

Issues referencing sys.sysprocesses

Hi,

I am having issues resolving sys.sysprocesses, despite referencing the master database. In my solution (SQL Server Project created in VS2019), I have a view which returns the sys.sysprocesses [spid], [blocked], [open_tran] and [cmd] fields, and joins on the [context_info] field. This results in a SQL71561 unresolved reference build error.

I have also created an empty SQL project and added a single stored procedure that returns all columns from sys.sysprocesses. This time the build succeeds, but with a SQL71502 unresolved reference warning.

Replication Steps

  1. Using Visual Studio 2019, create a new SQL Server project
  2. Add a new Stored Procedure that returns all columns from sys.sysprocesses
  3. Right-click project references and select Add Database Reference... from the context menu
  4. From the Add Database Reference dialog, choose the System database option, and the master database from the drop-down
  5. OK the dialog and the master database should now be listed as a project reference
  6. Build solution and observe warning

Example Stored Procedure

CREATE PROCEDURE [dbo].[GetSysProcesses]
AS
	SELECT *
	FROM [sys].[sysprocesses];

RETURN 0

Further Information
See discussion at bottom of this article

Kaine

Get a list of DacFx operations?

I'm trying to document which differences found when comparing models cause which events to happen: for example, if you change the ANSI_NULLS property on a table you get a table rebuild, and if you drop a column you get an "alter table, drop column".

I've got lots of cases covered, but I'm stuck understanding what the "operations" are that are being passed around. I guess in the code they are constants, so they appear as an int in a decompiler such as Reflector.

An example is in

Microsoft.Data.Tools.Schema.Sql.Deployment.SqlDeploymentPlanGenerator+DeploymentScriptDomGenerator

GenerateFragment - the first parameter is operation and it is an int

Is it possible to get a list of all the operations that the DacFx generates when working out the actions to update a database?

Allow excluding Sensitivity Classifications from schema comparisons

One of our cloud engineers recently started adding sensitivity classifications to columns in our Azure SQL database, via the Azure portal. We're using DacFx for schema updates as part of our automated release process, and these are getting dropped:

DROP SENSITIVITY CLASSIFICATION FROM
    [dbo].[MyTable].[MyColumn];

This makes sense since these exist in the target and not the source, but ideally we'd like to just ignore them so that security/ops people are free to add them independently of our code release cycles. But I'm struggling to figure out how to do this with the options available. My understanding is that these are extended properties, but none of these work:

opts.DropExtendedPropertiesNotInSource = false;
opts.IgnoreExtendedProperties = true;
opts.ExcludeObjectTypes = new[] {ObjectType.ExtendedProperties , ...};

After much experimenting, it seems that excluding ObjectType.Tables is the only thing that works, but unfortunately that will also exclude just about everything we want included. Am I correct that DacFx simply doesn't support excluding these?

As a side note, where can I find the DacFx source? Or is it not open source?

Extensions discovery issue in DacPac's DacServices.Deploy()

Due to specifics of our application we load the assembly Microsoft.SqlServer.Dac.dll, as well as all the others it relies on, via Assembly.Load(byte[]). Then, once we call DacServices.Deploy(), it fails with an ArgumentException:

[screenshot of the ArgumentException in the original issue]

because its implementation (shown in a screenshot in the original issue) is not ready to handle the contract of Assembly.Location, which says: "... If the assembly is loaded from a byte array, such as when using the Load(Byte[]) method overload, the value returned is an empty string ("") ..."

So, the above ArgumentException occurs because Path.GetDirectoryName was supplied an empty string. The code should check the Assembly.Location property for an empty string prior to trying to use it.
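A minimal sketch of the suggested guard (a hypothetical helper, not the actual DacFx source):

```csharp
using System.IO;
using System.Reflection;

// Hypothetical guard: Assembly.Location is an empty string when the assembly
// was loaded from a byte array, so check it before Path.GetDirectoryName.
static string GetExtensionLoadPath(Assembly assembly)
{
    string location = assembly.Location;
    if (string.IsNullOrEmpty(location))
    {
        // Loaded via Assembly.Load(byte[]): there is no on-disk directory.
        return null;
    }
    return Path.GetDirectoryName(location);
}
```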

Unable to deploy dacpac with Active Directory Interactive Authentication

I am attempting to deploy a dacpac to an Azure SQL Database using dacfx.

This works fine for Active Directory Password and Active Directory Integrated, however it does not work for Active Directory Interactive.

When using this connection string:

Data Source=testval.database.windows.net;authentication=active directory interactive;user [email protected]

and configuring my DacServices object like so:

var dacServiceInstance = new DacServices(connectionString);

I get the following error:

Cannot find an authentication provider for 'ActiveDirectoryInteractive'

After looking at the documentation, it's clear that there is a constructor overload that accepts an IUniversalAuthProvider instance. Looking online, I see no classes that already implement this, so I decided to implement my own. It grabs the token and returns it from the GetValidAccessToken() method.
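Such an implementation could be sketched as follows (AzureADAuth is the issue author's own class, and the token-acquisition details here are hypothetical placeholders):

```csharp
using Microsoft.SqlServer.Dac;

// Hedged sketch of a custom IUniversalAuthProvider; the actual token
// acquisition (e.g. an MSAL interactive flow) is omitted as a placeholder.
public class AzureADAuth : IUniversalAuthProvider
{
    private readonly string _connectionString;

    public AzureADAuth(string connectionString)
    {
        _connectionString = connectionString;
    }

    public string GetValidAccessToken()
    {
        // Acquire and return an Azure AD access token for the scope
        // https://database.windows.net/ (implementation omitted).
        throw new System.NotImplementedException("token acquisition omitted");
    }
}
```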

Now my call looks like this:

var authProvider = new AzureADAuth(connectionString);
var dacServiceInstance = new DacServices(connectionString, authProvider);

When I run the deployment now, I am able to verify that my interface implementation is being called (by setting breakpoints), but after it returns the access token, it errors out with the following message:

InvalidOperationException: Cannot set the AccessToken property if 'UserID', 'UID', 'Password', or 'PWD' has been specified in connection string.

Ok so now I try and remove the username from the connection string before passing it in:

var authProvider = new AzureADAuth(connectionString);
var builder = new SqlConnectionStringBuilder(connectionString);
builder.Remove("User Id");
var dacServiceInstance = new DacServices(
    builder.ConnectionString.Replace("ActiveDirectoryInteractive", "Active Directory Interactive"),
    authProvider);

This time my interface does not even get called, and it fails with this error, demanding that I provide a user ID:

Cannot use 'Authentication=Active Directory Interactive' without 'User ID' or 'UID' connection string keywords.

So one step requires the User ID, and the other says it cannot be used with the User ID provided...

Please help provide a working example of how to deploy to an Active Directory Interactive endpoint using DacFx.

As a note, I am using the most up to date version of DacFX and also have the most recent version of System.Data.SqlClient installed as part of the project.

ACTION REQUIRED: Microsoft needs this private repository to complete compliance info

There are open compliance tasks that need to be reviewed for your DacFx repo.

Action required: 4 compliance tasks

To bring this repository to the standard required for 2021, we require administrators of this and all Microsoft GitHub repositories to complete a small set of tasks within the next 60 days. This is critical work to ensure the compliance and security of your Microsoft GitHub organization.

Please take a few minutes to complete the tasks at: https://repos.opensource.microsoft.com/orgs/microsoft/repos/DacFx/compliance

  • The GitHub AE (GitHub inside Microsoft) migration survey has not been completed for this private repository
  • No Service Tree mapping has been set for this repo. If this team does not use Service Tree, they can also opt-out of providing Service Tree data in the Compliance tab.
  • No repository maintainers are set. The Open Source Maintainers are the decision-makers and actionable owners of the repository, irrespective of administrator permission grants on GitHub.
  • Classification of the repository as production/non-production is missing in the Compliance tab.

You can close this work item once you have completed the compliance tasks, or it will automatically close within a day of taking action.

If you no longer need this repository, it might be quickest to delete the repo, too.

GitHub inside Microsoft program information

More information about GitHub inside Microsoft and the new GitHub AE product can be found at https://aka.ms/gim.

FYI: current admins at Microsoft include @MitchellSternke, @bergeron, @saurabh500, @jeffsaremi, @swells, @revachauhan, @Benjin, @kburtram, @pensivebrian, @yorek, @ramnov, @aguzev, @shueybubbles

Open internal utils for matching ColumnReferenceExpression to object in SchemaModel

We are trying to match column references from the ScriptDom with columns in the schema (for example to check for non-indexed columns during static code analysis). Sounds easy, but it is not trivial at all.

The factory rule NonIndexedColumnFromInPredicateRule in Dac.Extensions is similar to what we aim for, but the implementation goes really deep and gets complex when matching the column, and all the utils and tools in the implementation are internal.

It would be amazing to have access to these utils, as it would allow for more advanced custom code analysis rules to be written.
Even if the actual implementation remains mostly internal sealed as it is now, maybe expose a few methods that'll try to do the binding (even if it is ambiguous).

Database diagrams for Azure SQL DB import/export

Is your feature request related to a problem? Please describe.
When database diagrams are created on an Azure SQL database, they cannot be exported/imported to another instance.

Describe the solution you'd like
Database diagrams to be included in bacpac import/export processes for Azure SQL database.

Open source Microsoft.SqlServer.TransactSql.ScriptDom

Can you please please open source Microsoft.SqlServer.TransactSql.ScriptDom.dll?

The parser is useful, but there are still a lot of gaps that we'd like to contribute to.

This doesn't strike me as a sensitive library and we can already 'see it' in ILSpy or any other decompiler.

Could someone please raise it to the relevant people at Microsoft?

Deploy a specific schema instead of a full database

With the rise of Azure SQL Database (which is a single database), we more often move from multiple databases towards one database with multiple schemas.

We have an increased need to deploy specific schemas instead of a full database, for more granular deployments: more controllable impact, and less impact on the whole database.

We currently use https://github.com/GoEddie/DeploymentContributorFilterer in combination with version 150 to filter changes specific to a schema. It works up to a point, unless the compare errors out before then, and unnamed constraints, for example, need to be named.

It would be great if there were out-of-the-box support for deploying a specific schema instead of a full database. Hopefully, as a side effect, schema compares would also speed up, since only specific schemas need to be checked.

Unable to read data-tier application registration after Publish using SqlPackage

Here are the steps to reproduce the problem I encountered:

  1. Run SQL Server Container for Linux

    docker run -e "ACCEPT_EULA=Y" -e "SA_PASSWORD=Ver7CompleXPW" -p 1433:1433 --name sql1 -d mcr.microsoft.com/mssql/server:2019-latest
  2. Deploy ContosoUniversity.dacpac as data-tier application

    sqlpackage /Action:Publish /SourceFile:"ContosoUniversity.dacpac" /TargetDatabaseName:"ContosoUniversity" /TargetServerName:"." /TargetUser:sa /TargetPassword:Ver7CompleXPW /p:RegisterDataTierApplication=true /p:BlockWhenDriftDetected=true  /TargetTrustServerCertificate:true
  3. Deploy again using the exact same above command.

    sqlpackage /Action:Publish /SourceFile:"ContosoUniversity.dacpac" /TargetDatabaseName:"ContosoUniversity" /TargetServerName:"." /TargetUser:sa /TargetPassword:Ver7CompleXPW /p:RegisterDataTierApplication=true /p:BlockWhenDriftDetected=true  /TargetTrustServerCertificate:true

    Here is the output:

    Publishing to database 'ContosoUniversity' on server '.'.
    Initializing deployment (Start)
    Initializing deployment (Failed)
    *** Could not deploy package.
    Unable to read data-tier application registration.
    Time elapsed 0:00:10.68
    
  4. I'm trying to create a DriftReport

    sqlpackage /Action:DriftReport /TargetDatabaseName:"ContosoUniversity" /TargetServerName:"." /TargetUser:sa /TargetPassword:Ver7CompleXPW /OutputPath:DriftReport.xml

    Here is the output:

    Generating drift report for database 'ContosoUniversity' on server '.'.
    *** Could not generate drift report.
    Unable to read data-tier application registration.
    Time elapsed 0:00:00.64
    
  5. If I delete the data-tier application and register it from SSMS, then the above commands all work.

  6. If I use the following command to publish again.

    sqlpackage /Action:Publish /SourceFile:"ContosoUniversity.dacpac" /TargetDatabaseName:"ContosoUniversity" /TargetServerName:"." /TargetUser:sa /TargetPassword:Ver7CompleXPW /p:RegisterDataTierApplication=true /p:BlockWhenDriftDetected=false

    Note: /p:BlockWhenDriftDetected is false, which means this does not read the data-tier application registration.

  7. Then /Action:DriftReport still shows the "Unable to read data-tier application registration." error message.

    sqlpackage /Action:DriftReport /TargetDatabaseName:"ContosoUniversity" /TargetServerName:"." /TargetUser:sa /TargetPassword:Ver7CompleXPW /OutputPath:DriftReport.xml

    logs:

    Generating drift report for database 'ContosoUniversity' on server '.'.
    *** Could not generate drift report.
    Unable to read data-tier application registration.
    Time elapsed 0:00:00.64
    

Is this a bug?

The dependency of DacPacs DacServices.Deploy() on Assembly.Location is erroneous when DLLs are embedded into EXE

Hi!

We use "DACExtensions" in a desktop WPF application and embed all our dependencies (DLLs) into the resulting executable by the means of Costura.Fody.

We've stumbled across an issue with Microsoft.SqlServer.DacFx.x86 packages newer than "150.4384.2", because DacFx retrieves the Assembly.Location path and uses it to construct a directory path. The Assembly.Location property returns an empty string when the assembly is embedded into an EXE.

Needless to say, it leads to the following exception:

ERROR System.ArgumentException: The path is not of a legal form.
at System.IO.Path.NormalizePath(String path, Boolean fullCheck, Int32 maxPathLength, Boolean expandShortPaths)
at System.IO.Path.InternalGetDirectoryName(String path)
at Microsoft.SqlServer.Dac.DacServices.GetDeploymentContributorLoadPaths(IPackageSource packageSource, DacDeployOptions options) in f:\B\16847\6200\Sources\Product\Source\DeploymentApi\DacServices.cs:line 884
at Microsoft.SqlServer.Dac.DacServices.InternalDeploy(IPackageSource packageSource, Boolean isDacpac, String targetDatabaseName, DacDeployOptions options, CancellationToken cancellationToken, DacLoggingContext loggingContext, Action`3 reportPlanOperation, Boolean executePlan) in f:\B\16847\6200\Sources\Product\Source\DeploymentApi\DacServices.cs:line 809
at Microsoft.SqlServer.Dac.DacServices.Deploy(DacPackage package, String targetDatabaseName, Boolean upgradeExisting, DacDeployOptions options, Nullable`1 cancellationToken) in f:\B\16847\6200\Sources\Product\Source\DeploymentApi\DacServices.cs:line 743

Could such a use case with embedded libraries be considered for this package?

There is a very similar issue already reported:
#402 "Extensions discovery issue in DacPac's DacServices.Deploy()"

SqlPackage/SSDT table rebuild losing Change Tracking CHANGETABLE data

There are various scenarios where sqlpackage generates a table rebuild script e.g. adding a column in the middle of a table.

However, when this is done, all Change Tracking data for the table held in the CHANGETABLE is lost. This is dangerous in production environments where certain data that should be synchronised is also lost due to it never being recognised as changed.

It seems that SSDT/SqlPackage and Change Tracking are completely incompatible.

Steps to Reproduce:

  1. Enable Change Tracking on database + table
  2. Insert record into table
  3. In SSDT add column in middle of the table
  4. Publish database via SSDT/SqlPackage

Did this occur in prior versions? If not - which version(s) did it work in?

As far as I'm aware this has never worked.


Not actionable extension related errors in DacServices.Deploy()

I have a .NET Core 3.1 application (targeting win-x64; I also tried win10-x64 with the same results) referencing the netstandard 2.0 Microsoft.SqlServer.DacFx 150.4826.1, which fails with a non-actionable exception:

Microsoft.SqlServer.Dac.DacServicesException
  HResult=0x80131500
  Message=An error occurred during deployment plan generation. Deployment cannot continue.
Error 72002: The extension type Microsoft.SqlServer.Dac.Deployment.Internal.InternalDeploymentPlanExecutor could not be instantiated.
Error 72002: The extension type Microsoft.SqlServer.Dac.Deployment.Internal.InternalDeploymentPlanModifier could not be instantiated.

  Source=Microsoft.SqlServer.Dac
  StackTrace:
   at Microsoft.SqlServer.Dac.DacServices.CreateController(SqlDeployment deploymentEngine, ErrorManager errorManager)
   at Microsoft.SqlServer.Dac.DeployOperation.<>c__DisplayClass16_1.<CreatePlanInitializationOperation>b__1()
   at Microsoft.Data.Tools.Schema.Sql.Dac.OperationLogger.Capture(Action action)
   at Microsoft.SqlServer.Dac.DeployOperation.<>c__DisplayClass16_0.<CreatePlanInitializationOperation>b__0(Object operation, CancellationToken token)
   at Microsoft.SqlServer.Dac.Operation.Microsoft.SqlServer.Dac.IOperation.Run(OperationContext context)
   at Microsoft.SqlServer.Dac.ReportMessageOperation.Microsoft.SqlServer.Dac.IOperation.Run(OperationContext context)
   at Microsoft.SqlServer.Dac.OperationExtension.CompositeOperation.Microsoft.SqlServer.Dac.IOperation.Run(OperationContext context)
   at Microsoft.SqlServer.Dac.OperationExtension.CompositeOperation.Microsoft.SqlServer.Dac.IOperation.Run(OperationContext context)
   at Microsoft.SqlServer.Dac.DeployOperation.Microsoft.SqlServer.Dac.IOperation.Run(OperationContext context)
   at Microsoft.SqlServer.Dac.OperationExtension.Execute(IOperation operation, DacLoggingContext loggingContext, CancellationToken cancellationToken)
   at Microsoft.SqlServer.Dac.DacServices.InternalDeploy(IPackageSource packageSource, Boolean isDacpac, String targetDatabaseName, DacDeployOptions options, CancellationToken cancellationToken, DacLoggingContext loggingContext, Action`3 reportPlanOperation, Boolean executePlan)
   at Microsoft.SqlServer.Dac.DacServices.Deploy(DacPackage package, String targetDatabaseName, Boolean upgradeExisting, DacDeployOptions options, Nullable`1 cancellationToken)
   at Microsoft.Dynamics365.Commerce.Tools.DacValidator.DacServicesFacade.DeployPackage(String connectionString, String databaseName, DacPackage package, DacDeployOptions deployOptions, Boolean ignoreCustomVersioning, Nullable`1 cancellationToken, ILogger logger) in H:\Repos\Tools\src\Tools\Database\DacValidator\DacServicesFacade.cs:line 224

Not actionable, because there are no details explaining why those types could not be instantiated. The exception is thrown here:

[screenshot of the throwing code in the original issue]

And I was only able to figure out the details by attaching a debugger and disabling "Just My Code":

[debugger screenshot in the original issue]

The original, valuable exception is indeed passed to the CreateInvalidTypeException method, but it is only used there in the Exception base constructor and is not surfaced anywhere in the error reporting above, making real-life investigation without a debugger very difficult.

Can you please leverage the original exception so it is available in the final exception's message? Please also scan the code for all other similar occurrences and fix them as well.

By the way, it seems the DacFx library relies on MEF1 rather than MEF2, which works in .NET Core 3.1; are there plans to update MEF in DacFx?

Avoid TRUNCATE when restoring data extracted to a DACPAC from graph tables (TRUNCATE vs. edge constraints)

Originally posted https://feedback.azure.com/forums/908035-sql-server/suggestions/43813659-avoid-truncate-for-data-extraction-from-dacpac-gra

It is possible to extract a DACPAC containing table data. This also works for graph tables with edge constraints. But it is not possible to restore the data from the DACPAC, because the restore process uses TRUNCATE, which is not valid on tables with edge constraints.

How to reproduce:

  1. Create graph tables (nodes and edges)
  2. Define edge constraints
  3. Fill the tables with data
  4. Extract to a dacpac using /Action:Extract and /p:ExtractAllTableData=TRUE
  5. Try to restore the dacpac with data using /Action:Publish
  6. Observe the errors:

Updating database (Failed)
*** Could not deploy package.
Warning SQL72038: The object [readonly] already exists in database with a different definition and will not be altered.
Error SQL72014: .Net SqlClient Data Provider: Msg 13944, Level 16, State 1, Line 1 Cannot truncate table 'graph.RepoObject' because it is being referenced by an EDGE constraint.
Error SQL72045: Script execution error. The executed script:
TRUNCATE TABLE [graph].[RepoObject];

see a detailed description and workaround here:
https://datahandwerk.github.io/dhw/0.1.0/manual/backup-repo-db.html

Improve the default folder pathing to use ISO 8601 when using sqlpackage.exe to export data to Synapse

Currently, if a root folder path is not provided, the resulting folder is created as M-DD-YY_HH:MM_(AM|PM).

I think it would be better if the format used an ISO 8601-based layout with more consistent separators and casing throughout, possibly leveraging nested folders.

Here would be some samples for the container root (assuming export time is 2021-06-16T19:49:38Z)
Note: an AM/PM indicator is unneeded, as a 24-hour clock already resolves the ambiguity.

  1. file based: \2021-06-16T19:49-01\

  2. folder based: \2021\06\16\1949\01\

The trailing 01 would increment in case multiple exports were received within the same minute; alternatively, this could be some type of session or request ID from the dedicated pool.

Finally, each schema should be put into a separate subfolder, as opposed to having a single flat folder for all objects.
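The proposed timestamped folder name could be sketched as follows (a hypothetical helper; the format is an assumption based on the file-based sample above):

```csharp
using System;

// Hypothetical helper producing the file-based sample layout above,
// e.g. "2021-06-16T19:49-01" for the first export in that minute.
static string ExportFolderName(DateTime utcTime, int sequence)
{
    return $"{utcTime:yyyy-MM-ddTHH:mm}-{sequence:D2}";
}
```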

Hope this is useful

When using the DoNotDropObjectTypes argument, sqlpackage ignores DropIndexesNotInSource=false

We use the Azure SQL tuning advisor to apply indexes it identifies as beneficial. As such, our database structure in our solution isn't always in sync with what's in the target schema.

We attempted to use the "DropIndexesNotInSource" argument set to false to prevent the Azure-generated indexes from being dropped before we have time to analyze them and internalize them into our solution.

Based on the documentation we've found, it does not seem that Indexes is an accepted value for the "DoNotDropObjectTypes" argument.

Argument list:
/Action:Publish `
/SourceFile:$dacpacPath `
/TargetConnectionString:$connString `
/V:Configuration=$publishConfig `
/p:BlockOnPossibleDataLoss=false `
/p:NoAlterStatementsToChangeCLRTypes=true `
/p:DeployDatabaseInSingleUserMode=false `
/p:IncludeTransactionalScripts=$includeTransactionalScripts`
/p:CreateNewDatabase=false `
/p:BackupDatabaseBeforeChanges=false `
/p:DropObjectsNotInSource=true `
/P:DropIndexesNotInSource=false `
/p:ScriptDatabaseCompatibility=true `
/p:ScriptDatabaseOptions=true `
/p:IgnoreFileAndLogFilePath=true `
/p:IgnoreFillFactor=true `
/p:IgnoreUserSettingsObjects=true `
/p:AllowIncompatiblePlatform=true `
/p:DoNotDropObjectTypes=$objectTypesToKeep `
/p:IgnoreColumnOrder=true `
/p:CommandTimeout=1200

But what we end up seeing is the indexes being dropped anyway:

2020-06-15T05:00:48.7683713Z Dropping [account].[AccountHolder].[nci_wi_AccountHolder_B9027D8081748BB108F9F00BEB10DE87]...
2020-06-15T05:00:48.7984403Z Dropping [account].[Ledger].[nci_wi_Ledger_428E9D816F0FE79354607E484490B20A]...
2020-06-15T05:00:48.8257682Z Dropping [account].[Ledger].[nci_wi_Ledger_6D80D088BFFD5AE25F144BABD15B1A40]...
2020-06-15T05:00:48.8526572Z Dropping [account].[Ledger].[nci_wi_Ledger_6E25B19DFA6E3BB8967B5726B270DB17]...
2020-06-15T05:00:48.8791748Z Dropping [account].[Note].[nci_wi_Note_8B5E0B53408FB1FB6F82EF7A7EA270FE]...
2020-06-15T05:00:48.9059037Z Dropping [claim].[ClaimHistory].[nci_wi_ClaimHistory_CF1A71F321501BAD2004E62F3192F48F]...
2020-06-15T05:00:48.9322502Z Dropping [class].[VINMaster].[nci_wi_VINMaster_D3AFEE5A69AD073E60392DDEE8BDE6BC]...
2020-06-15T05:00:48.9589680Z Dropping [contract].[Contract].[nci_wi_Contract_50476B2C852EE5C7E3B3E7CF2EB6AB98]...
2020-06-15T05:00:48.9856878Z Dropping [contract].[Contract].[nci_wi_Contract_7E4E65F5E25E6513116DEB3EFCC250AA]...
2020-06-15T05:00:49.0124444Z Dropping [contract].[Contract].[nci_wi_Contract_8C185D570D064D556F7182700139EB3F]...
2020-06-15T05:00:49.0387907Z Dropping [contract].[Contract].[nci_wi_Contract_901FCDCDD84F918D585710F2AA5C827B]...
2020-06-15T05:00:49.0670467Z Dropping [contract].[ContractDisbursement].[nci_wi_ContractDisbursement_D5625CDF24A25C52627E9CCAFDCF12C9]...
2020-06-15T05:00:49.0933761Z Dropping [contract].[HoldDisbursement].[nci_wi_HoldDisbursement_5BA3E741687E9D908ED219534ADAD32B]...
2020-06-15T05:00:49.1188825Z Dropping [contract].[HoldDisbursement].[nci_wi_HoldDisbursement_707969C363B517A7E0D9E28839640333]...

Add support for a "WITH VALUES" deploy option when adding a nullable column with a default

I'm always frustrated when adding a nullable column with a default value.
I always have to modify the generated statement as below.

before

ALTER TABLE [dbo].[TableName]
    ADD [ColumnName] VARCHAR (1) DEFAULT ('0') NULL;

after

ALTER TABLE [dbo].[TableName]
    ADD [ColumnName] VARCHAR (1) NULL DEFAULT ('0') WITH VALUES;

Describe the solution you'd like
Add a deploy option that inserts 'WITH VALUES' into "ALTER TABLE ... ADD ..." statements.

Synapse Serverless SQL Pool support

At this moment there is no Synapse Serverless SQL Pool support, which effectively blocks any enterprise usage/deployment.

Even a schema compare just to check for differences errors out, because the Serverless version is not recognized.

DacFx DMA Assessment and SSMS DacFx Error - Ignore Error Codes in DMA Not Working

  • SqlPackage or DacFx Version: 5.4.5184.4
  • .NET Framework (Windows-only) or .NET Core:
  • Environment (local platform and source/target platforms): SQL Server 2014 on Prem to SQL Server 2019

Steps to Reproduce:

  1. Trying to create an assessment via DMA from a source SQL Server to SQL Server 2019; multiple databases fail with "Could not extract package from specified database."
  2. Attempted to add the SQL error code to the DMA.exe.config file, saved, and restarted DMA; still the same error.
  3. Tested via the SSMS "Extract Data-tier Application" option and got the same error.
  4. DMA seems to not be reading from the config file and is not ignoring any of the error codes specified.

Did this occur in prior versions? If not - which version(s) did it work in?
This is the first time I have tried it, and I always keep the version up to date.

(DacFx/SSMS)

How to get the default value of a specific field

I'm using a T4 C# script to create some SQL Server objects dynamically from predefined tables. Everything works except default values.
Example

intake.TabA(
id int not null,
Name varchar(10) not null default('')
)

I need to generate a persisted table as follows

persist.TabA(
id int not null,
Name varchar(10) not null **default('')**
DateEff datetime,
DateExp datetime
)

I can generate everything but not default('')
Here is the snippet

foreach (var col in table.GetReferenced(Table.Columns))
{
    string columnName = col.Name.Parts[2];

    // This attempts to limit output to only columns from the source.
    // There's gotta be a cleaner way.
    if (!skipColumns.Contains(columnName))
    {
        int length = col.GetProperty<int>(Column.Length);
        int precision = col.GetProperty<int>(Column.Precision);
        int scale = col.GetProperty<int>(Column.Scale);

        string suffix;
        if (length != 0)
        {
            suffix = String.Format("({0})", length);
        }
        else if (precision != 0)
        {
            suffix = String.Format("({0},{1})", precision, scale);
        }
        else if (scale != 0)
        {
            suffix = String.Format("({0})", scale);
        }
        else
        {
            suffix = "";
        }

        bool nullable = col.GetProperty<bool>(Column.Nullable);
        string nullText = nullable ? "NULL" : "NOT NULL";

        string dataType = col.GetReferenced(Column.DataType).FirstOrDefault().Name.ToString();

        // This is the part that does not work; both attempts below come back empty:
        //string defaultExpression = col.GetProperty<string>(Column.Expression).ToString();
        //string defaultExpression = col.GetReferenced(Column.Expression).FirstOrDefault().Name.ToString();
        string defaultExpression = "";

        string columnText = String.Format("[{0}] {1}{2} {3} {4}",
            columnName, dataType, suffix, nullText, defaultExpression);

        WriteLine("			" + columnText + ",");
    }
}
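For what it's worth, in the DacFx model a column's default is represented as a separate DefaultConstraint element that references the column, rather than as a property of the Column itself, which is why reading an expression off the column returns nothing. A sketch (untested, assuming the public Microsoft.SqlServer.Dac.Model API) of one way to retrieve it inside the loop above:

```csharp
// Sketch: find the DefaultConstraint that targets this column and read its
// Expression property. GetReferencing walks "incoming" relationships, so it
// returns the constraint element(s) pointing at this column.
string defaultExpression = "";
TSqlObject defaultConstraint = col
    .GetReferencing(DefaultConstraint.TargetColumn)
    .FirstOrDefault();
if (defaultConstraint != null)
{
    // Expression is the raw default text, e.g. ('')
    defaultExpression = "DEFAULT " +
        defaultConstraint.GetProperty<string>(DefaultConstraint.Expression);
}
```

The names DefaultConstraint.TargetColumn and DefaultConstraint.Expression are from the Microsoft.SqlServer.Dac.Model metadata classes; verify them against the DacFx extensions reference before relying on this.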

Add support for "Prevent saving changes that require table re-creation" option in SqlPackage / SSDT

In SQL Server Management Studio, there's a designer option called "Prevent saving changes that require table re-creation".

It would be extremely useful to have a similar option in SqlPackage.exe as well, as it could stop dangerous deployments that could potentially re-create very large tables and severely impact production environments while doing so.

NOTE: This is a re-submission of the Azure feedback item:

https://feedback.azure.com/forums/908035-sql-server/suggestions/43852278-add-support-for-prevent-saving-changes-that-requi

DacFx doesn't work on .NET 6

  • SqlPackage or DacFx Version: 150.5164.1
  • .NET Framework (Windows-only) or .NET Core: .NET 6
  • Environment (local platform and source/target platforms): macOS, Linux

Hi there, I'm the maintainer of MSBuild.Sdk.SqlProj and we've been working on upgrading that project to make it compatible with .NET 6. Unfortunately we've run into an issue in DacFx where it seems to depend on strong-name signing, which appears to be no longer supported on .NET 6. We are seeing the following stack trace:

System.PlatformNotSupportedException: Strong-name signing is not supported on this platform.
     at System.Reflection.AssemblyName.get_KeyPair()
     at Microsoft.Data.Tools.Schema.Extensibility.ExtensionTypeLoader.ExtensionAssemblies.CreateAssemblyName(String partialName, AssemblyName templateName)
     at Microsoft.Data.Tools.Schema.Extensibility.ExtensionTypeLoader.ExtensionAssemblies.CreateAssemblyLoadName(String partialName, Nullable`1 specificVersion, Nullable`1 isLazy)
     at Microsoft.Data.Tools.Schema.Extensibility.ExtensionTypeLoader.ExtensionAssemblies.GetNames(HashSet`1& allNames)
     at Microsoft.Data.Tools.Schema.Extensibility.ExtensionTypeLoader.LoadTypes()
     at Microsoft.Data.Tools.Schema.Extensibility.ExtensionManager..ctor(String databaseSchemaProviderType)
     at Microsoft.Data.Tools.Schema.Extensibility.ExtensionManager.<>c__DisplayClass20_0.<SetLazyExtensionManager>b__0()
     at System.Lazy`1.ViaFactory(LazyThreadSafetyMode mode)
     at System.Lazy`1.ExecutionAndPublication(LazyHelper executionAndPublication, Boolean useDefaultConstructor)
     at System.Lazy`1.CreateValue()
     at System.Lazy`1.get_Value()
     at Microsoft.Data.Tools.Schema.Extensibility.ExtensionManager.GetExtensionManager(String dsp)
     at Microsoft.Data.Tools.Schema.Sql.SchemaModel.ModelBuildingUtils.CreateEmptyModel(Type dspType, Action`1 ctorSetter)
     at Microsoft.Data.Tools.Schema.Sql.SchemaModel.ModelBuildingUtils.CreateEmptyModel(SqlPlatforms platform, Action`1 ctorSetter)
     at Microsoft.SqlServer.Dac.Model.SqlSchemaModelCreator.CreateEmptyModel(SqlServerVersion modelTargetVersion, TSqlModelOptions modelCreationOptions)
     at Microsoft.SqlServer.Dac.Model.TSqlModel..ctor(SqlServerVersion modelTargetVersion, TSqlModelOptions modelCreationOptions)

Steps to Reproduce:

  1. Create a new Console app on .NET 6 (dotnet new console)
  2. Instantiate a new TSqlModel using the constructor that takes a SqlServerVersion and TSqlModelOptions.
  3. Run the app and see the above exception

Did this occur in prior versions? If not - which version(s) did it work in?
It works fine on .NET 5.

(DacFx/SqlPackage/SSMS/Azure Data Studio)

sqlpackage.exe /a:script (and its equivalents) needs to have a consistent ordering to the output

Sorry in advance if this is the wrong place to log this issue. Kindly point me in the right direction if there is a better place to log dacpac-related issues and requests.

Consider I have production databases A and B, and a sqlproj with changes that will need to go to both.

If I generate an update script from the sqlproj's dacpac using VS, SqlPackage, or the DacServices API against both A and B, the results will be ordered differently. This makes it hard to determine whether I can use the same generated script against both A and B. EDIT: meaning I can't easily diff the scripts and have to resort to re-ordering the two in some way just to do a basic comparison.

As an example, if the changes are for stored procs SP1, SP2, and SP3, one script will have them as SP2,SP1,SP3 and the other will have them as SP1,SP2,SP3. For a trivial example like this the differences are easy to identify, but for scripts that update more than 20 objects, determining whether I can apply the same script to both (and to databases C-Z) is not easily possible.

The same is true for the deploy report action; it's never in a consistent order. Can these be made to produce consistently ordered outputs?
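Until ordering is consistent, one workaround (a sketch, not part of DacFx) is to normalize the generated scripts before diffing: split each script into its GO-delimited batches, sort the batches, and compare the normalized results.

```csharp
using System;
using System.Linq;
using System.Text.RegularExpressions;

static class ScriptNormalizer
{
    // Splits a generated deployment script on batch separator lines ("GO"),
    // trims and sorts the batches, and rejoins them so two scripts containing
    // the same batches in a different order compare equal.
    public static string Normalize(string script)
    {
        var batches = Regex.Split(script, @"^\s*GO\s*$",
                RegexOptions.Multiline | RegexOptions.IgnoreCase)
            .Select(b => b.Trim())
            .Where(b => b.Length > 0)
            .OrderBy(b => b, StringComparer.Ordinal);
        return string.Join(Environment.NewLine + "GO" + Environment.NewLine, batches);
    }
}
```

Note this only helps when the batch contents are byte-identical, and reordering matters when batches have dependencies, so use the normalized form only for comparison, never for execution.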

When using SchemaComparison with two dacpacs, DacFX takes a long time and throws lots of SocketExceptions

Issue

When comparing two dacpacs located on the local file system, DacFx tries to open a SQL connection to a nonexistent SQL Server database. It fails consistently with a SocketException, and this lasts for several minutes until it gives up and continues on to the comparison.

This is independent of the dacpacs provided. A simple dacpac with a single table compared with an empty dacpac surfaces this behavior.

The comparison is performed correctly, but the extra minutes spent trying to connect to a database that does not exist causes significant delays to the comparison process.

We use DacFx to run comparisons in our build system to verify for breaking changes, and this is adding delays to our build pipeline.

Version

<PackageReference Include="Microsoft.SqlServer.DacFx" Version="150.4573.2" />
[andre@scout ~]$ dotnet --version
3.1.108

Example

// sourceDacpacPath and targetDacpacPath are two local paths on the file system
var source = new SchemaCompareDacpacEndpoint(sourceDacpacPath);
var target = new SchemaCompareDacpacEndpoint(targetDacpacPath);
var comparison = new SchemaComparison(source, target);
comparison.Compare(); // several minutes of thrown and internally caught socket exceptions before this returns

Call Stack

System.Net.Sockets.dll!System.Net.Sockets.Socket.DoConnect(System.Net.EndPoint endPointSnapshot, System.Net.Internals.SocketAddress socketAddress) (Unknown Source:0)
System.Net.Sockets.dll!System.Net.Sockets.Socket.Connect(System.Net.EndPoint remoteEP) (Unknown Source:0)
System.Net.Sockets.dll!System.Net.Sockets.Socket.Connect(System.Net.IPAddress address, int port) (Unknown Source:0)
System.Data.SqlClient.dll!System.Data.SqlClient.SNI.SNITCPHandle.Connect(string serverName, int port, System.TimeSpan timeout) (Unknown Source:0)
System.Data.SqlClient.dll!System.Data.SqlClient.SNI.SNITCPHandle.SNITCPHandle(string serverName, int port, long timerExpire, object callbackObject, bool parallel) (Unknown Source:0)
System.Data.SqlClient.dll!System.Data.SqlClient.SNI.SNIProxy.CreateTcpHandle(System.Data.SqlClient.SNI.DataSource details, long timerExpire, object callbackObject, bool parallel) (Unknown Source:0)
System.Data.SqlClient.dll!System.Data.SqlClient.SNI.SNIProxy.CreateConnectionHandle(object callbackObject, string fullServerName, bool ignoreSniOpenTimeout, long timerExpire, out byte[] instanceName, ref byte[] spnBuffer, bool flushCache, bool async, bool parallel, bool isIntegratedSecurity) (Unknown Source:0)
System.Data.SqlClient.dll!System.Data.SqlClient.SNI.TdsParserStateObjectManaged.CreatePhysicalSNIHandle(string serverName, bool ignoreSniOpenTimeout, long timerExpire, out byte[] instanceName, ref byte[] spnBuffer, bool flushCache, bool async, bool parallel, bool isIntegratedSecurity) (Unknown Source:0)
System.Data.SqlClient.dll!System.Data.SqlClient.TdsParser.Connect(System.Data.SqlClient.ServerInfo serverInfo, System.Data.SqlClient.SqlInternalConnectionTds connHandler, bool ignoreSniOpenTimeout, long timerExpire, bool encrypt, bool trustServerCert, bool integratedSecurity, bool withFailover) (Unknown Source:0)
System.Data.SqlClient.dll!System.Data.SqlClient.SqlInternalConnectionTds.AttemptOneLogin(System.Data.SqlClient.ServerInfo serverInfo, string newPassword, System.Security.SecureString newSecurePassword, bool ignoreSniOpenTimeout, System.Data.ProviderBase.TimeoutTimer timeout, bool withFailover) (Unknown Source:0)
System.Data.SqlClient.dll!System.Data.SqlClient.SqlInternalConnectionTds.LoginNoFailover(System.Data.SqlClient.ServerInfo serverInfo, string newPassword, System.Security.SecureString newSecurePassword, bool redirectedUserInstance, System.Data.SqlClient.SqlConnectionString connectionOptions, System.Data.SqlClient.SqlCredential credential, System.Data.ProviderBase.TimeoutTimer timeout) (Unknown Source:0)
System.Data.SqlClient.dll!System.Data.SqlClient.SqlInternalConnectionTds.OpenLoginEnlist(System.Data.ProviderBase.TimeoutTimer timeout, System.Data.SqlClient.SqlConnectionString connectionOptions, System.Data.SqlClient.SqlCredential credential, string newPassword, System.Security.SecureString newSecurePassword, bool redirectedUserInstance) (Unknown Source:0)
System.Data.SqlClient.dll!System.Data.SqlClient.SqlInternalConnectionTds.SqlInternalConnectionTds(System.Data.ProviderBase.DbConnectionPoolIdentity identity, System.Data.SqlClient.SqlConnectionString connectionOptions, System.Data.SqlClient.SqlCredential credential, object providerInfo, string newPassword, System.Security.SecureString newSecurePassword, bool redirectedUserInstance, System.Data.SqlClient.SqlConnectionString userConnectionOptions, System.Data.SqlClient.SessionData reconnectSessionData, bool applyTransientFaultHandling, string accessToken) (Unknown Source:0)
System.Data.SqlClient.dll!System.Data.SqlClient.SqlConnectionFactory.CreateConnection(System.Data.Common.DbConnectionOptions options, System.Data.Common.DbConnectionPoolKey poolKey, object poolGroupProviderInfo, System.Data.ProviderBase.DbConnectionPool pool, System.Data.Common.DbConnection owningConnection, System.Data.Common.DbConnectionOptions userOptions) (Unknown Source:0)
System.Data.SqlClient.dll!System.Data.ProviderBase.DbConnectionFactory.CreateNonPooledConnection(System.Data.Common.DbConnection owningConnection, System.Data.ProviderBase.DbConnectionPoolGroup poolGroup, System.Data.Common.DbConnectionOptions userOptions) (Unknown Source:0)
System.Data.SqlClient.dll!System.Data.ProviderBase.DbConnectionFactory.TryGetConnection(System.Data.Common.DbConnection owningConnection, System.Threading.Tasks.TaskCompletionSource<System.Data.ProviderBase.DbConnectionInternal> retry, System.Data.Common.DbConnectionOptions userOptions, System.Data.ProviderBase.DbConnectionInternal oldConnection, out System.Data.ProviderBase.DbConnectionInternal connection) (Unknown Source:0)
System.Data.SqlClient.dll!System.Data.ProviderBase.DbConnectionInternal.TryOpenConnectionInternal(System.Data.Common.DbConnection outerConnection, System.Data.ProviderBase.DbConnectionFactory connectionFactory, System.Threading.Tasks.TaskCompletionSource<System.Data.ProviderBase.DbConnectionInternal> retry, System.Data.Common.DbConnectionOptions userOptions) (Unknown Source:0)
System.Data.SqlClient.dll!System.Data.ProviderBase.DbConnectionClosed.TryOpenConnection(System.Data.Common.DbConnection outerConnection, System.Data.ProviderBase.DbConnectionFactory connectionFactory, System.Threading.Tasks.TaskCompletionSource<System.Data.ProviderBase.DbConnectionInternal> retry, System.Data.Common.DbConnectionOptions userOptions) (Unknown Source:0)
System.Data.SqlClient.dll!System.Data.SqlClient.SqlConnection.TryOpen(System.Threading.Tasks.TaskCompletionSource<System.Data.ProviderBase.DbConnectionInternal> retry) (Unknown Source:0)
System.Data.SqlClient.dll!System.Data.SqlClient.SqlConnection.Open() (Unknown Source:0)
Microsoft.Data.Tools.Schema.Sql.dll!Microsoft.Data.Tools.Schema.Common.SqlClient.ReliableSqlConnection.OpenConnection.AnonymousMethod__48_0() (Unknown Source:0)
Microsoft.Data.Tools.Schema.Sql.dll!Microsoft.Data.Tools.Schema.Common.SqlClient.RetryPolicy.ExecuteAction.AnonymousMethod__0(Microsoft.Data.Tools.Schema.Common.SqlClient.RetryState _) (Unknown Source:0)
Microsoft.Data.Tools.Schema.Sql.dll!Microsoft.Data.Tools.Schema.Common.SqlClient.RetryPolicy.ExecuteAction.AnonymousMethod__0(Microsoft.Data.Tools.Schema.Common.SqlClient.RetryState retryState) (Unknown Source:0)
Microsoft.Data.Tools.Schema.Sql.dll!Microsoft.Data.Tools.Schema.Common.SqlClient.RetryPolicy.ExecuteAction<System.__Canon>(System.Func<Microsoft.Data.Tools.Schema.Common.SqlClient.RetryState, System.__Canon> func, System.Threading.CancellationToken? token) (Unknown Source:0)
Microsoft.Data.Tools.Schema.Sql.dll!Microsoft.Data.Tools.Schema.Common.SqlClient.RetryPolicy.ExecuteAction(System.Action<Microsoft.Data.Tools.Schema.Common.SqlClient.RetryState> action, System.Threading.CancellationToken? token) (Unknown Source:0)
Microsoft.Data.Tools.Schema.Sql.dll!Microsoft.Data.Tools.Schema.Common.SqlClient.RetryPolicy.ExecuteAction(System.Action action, System.Threading.CancellationToken? token) (Unknown Source:0)
Microsoft.Data.Tools.Schema.Sql.dll!Microsoft.Data.Tools.Schema.Common.SqlClient.ReliableSqlConnection.OpenConnection() (Unknown Source:0)
Microsoft.Data.Tools.Schema.Sql.dll!Microsoft.Data.Tools.Schema.Common.SqlClient.ReliableSqlConnection.Open() (Unknown Source:0)
Microsoft.Data.Tools.Schema.Sql.dll!Microsoft.Data.Tools.Schema.Common.SqlClient.ReliableConnectionHelper.OpenConnection(Microsoft.Data.Tools.Schema.Common.SqlClient.SqlConnectionFactory connectionFactory, bool useRetry) (Unknown Source:0)
Microsoft.Data.Tools.Schema.Sql.dll!Microsoft.Data.Tools.Schema.Sql.Deployment.SqlDeploymentPlanGenerator.OnInitialize(Microsoft.Data.Tools.Schema.Sql.Deployment.SqlDeployment engine) (Unknown Source:0)
Microsoft.Data.Tools.Schema.Sql.dll!Microsoft.Data.Tools.Schema.Sql.Deployment.SqlDeploymentPlanGenerator.Initialize(Microsoft.Data.Tools.Schema.Sql.Deployment.SqlDeployment engine) (Unknown Source:0)
Microsoft.Data.Tools.Schema.Sql.dll!Microsoft.Data.Tools.Schema.Sql.Deployment.SqlDeployment.InitializePlanGeneratator() (Unknown Source:0)
Microsoft.Data.Tools.Schema.Sql.dll!Microsoft.Data.Tools.Schema.Sql.Deployment.SqlDeployment.CreateController(System.Action<Microsoft.Data.Tools.Schema.DataSchemaError> msgHandler) (Unknown Source:0)
Microsoft.Data.Tools.Schema.Sql.dll!Microsoft.Data.Tools.Schema.Sql.Deployment.SqlDeployment.CreateController() (Unknown Source:0)
Microsoft.Data.Tools.Schema.Sql.dll!Microsoft.Data.Tools.Schema.Utilities.Sql.SchemaCompare.DeploymentWrapper.Compare(bool applyRefactorChanges, bool isForShowingComparisonResult) (Unknown Source:0)
Microsoft.Data.Tools.Schema.Sql.dll!Microsoft.Data.Tools.Schema.Utilities.Sql.SchemaCompare.DeploymentWrapper.CompareForComparisonResult() (Unknown Source:0)
Microsoft.Data.Tools.Schema.Sql.dll!Microsoft.Data.Tools.Schema.Utilities.Sql.SchemaCompare.DeploymentTargetComparer.Compare() (Unknown Source:0)
Microsoft.Data.Tools.Schema.Sql.dll!Microsoft.Data.Tools.Schema.Utilities.Sql.SchemaCompare.SchemaCompareController.Compare() (Unknown Source:0)
Microsoft.Data.Tools.Schema.Sql.dll!Microsoft.Data.Tools.Schema.Utilities.Sql.SchemaCompare.DataModel.SchemaCompareDataModel.Compare(System.Threading.CancellationToken token) (Unknown Source:0)
Microsoft.SqlServer.Dac.Extensions.dll!Microsoft.SqlServer.Dac.Compare.SchemaComparison.Compare(System.Threading.CancellationToken cancellationToken) (Unknown Source:0)
Microsoft.SqlServer.Dac.Extensions.dll!Microsoft.SqlServer.Dac.Compare.SchemaComparison.Compare() (Unknown Source:0)

Exception

{System.Net.Internals.SocketExceptionFactory+ExtendedSocketException (111): Connection refused 127.0.0.1:1433
   at System.Net.Sockets.Socket.DoConnect(EndPoint endPointSnapshot, SocketAddress socketAddress)}

Late initialization of ModelTypeClasses (at least), such as PrimaryKeyConstraint.TypeClass?

I'm experimenting with an SSDT DeploymentPlanModifier C# project that, for now, enumerates various constraints being dropped and re-added. It is loaded by the SqlPackage.exe command-line utility with the /Properties:AdditionalDeploymentContributors=Mine.MyContributor switch.

I'm repeatedly able to demonstrate a frustrating condition of what I guess is late initialization of some of the DacFx internals. PrimaryKeyConstraint.TypeClass is null, as is PrimaryKeyConstraint.Columns, when the following code is run at full speed:

[ExportDeploymentPlanModifier("Mine.MyContributor", "1.0.0.0")]
public class MyContributor: DeploymentPlanModifier
{
    protected override void OnExecute(DeploymentPlanContributorContext context)
    {
        // PrimaryKeyConstraint
        PublishMessage(new ExtensibilityError($"PrimaryKeyConstraints", Severity.Message));
        ProcessConstraints(context, PrimaryKeyConstraint.TypeClass, PrimaryKeyConstraint.Columns);
    }
}

If I set a breakpoint in Visual Studio before the ProcessConstraints() call, the debugger breaks there correctly and the Autos show that both PrimaryKeyConstraint.TypeClass and PrimaryKeyConstraint.Columns are null! Adding an explicit Watch shows the same. Eventually (a minute? though not necessarily time-based), and without allowing the execution to Continue, the Autos eventually change to the non-null, expected values! I'm kind of baffled by this - with the debugger paused, I don't think it's possible for some background thread responsible for populating this value to complete its work - so it must be based on some side-effect of me probing the various values through the IDE. If I wait until after these values are populated to Continue execution, the program runs correctly and gives the expected output. If I don't wait, I can step-into ProcessConstraints() with the debugger showing that null was passed in - and this gives unexpectedly blank behavior.

I'm not doing any threading or async operations in ProcessConstraints() (nor anywhere else). Is there some initialization of the DacFX, or Ready event that I must wait for?

Note: I am debugging by having deployed the latest assemblies to C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\130\Extensions\MyContributor and running the correlating version of SqlPackage C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\130\SqlPackage.exe and not using the Visual Studio Hosting process (not that I know that it's interfering with anything).

The code for ProcessConstraints() is uninteresting for this problem, I think. It goes wrong at the very beginning, since typeClass is clearly being passed in as null:

private void ProcessConstraints(DeploymentPlanContributorContext context, ModelTypeClass typeClass, ModelRelationshipClass relationshipClass)
{
    var droppedPKConstraints =
        context.ComparisonResult.ElementsToDrop
        .Where(ele => ele.ObjectType == typeClass)
        ...

Unable to specify TempDirectoryForTableData for Import action

TempDirectoryForTableData is available for the Export action and makes it possible to use a dedicated folder/drive to hold all the data, which is especially useful for very large databases to avoid running out of space on the OS drive.

The same option should be available for importing databases. Can you consider this as a candidate option to add?

Cannot deploy dacpac with Graph Table that has edges

Steps to reproduce:

  1. Back up a dacpac with DacService.Extract, with ExtractAllTableData = true
  2. Restore the backup with DacServices.Deploy. Iterations I've tried are:
    • Completely dropping the table and then creating it with Deploy
    • Using CreateNewDatabase = true in the options object, and specifying upgradeExisting = false param.
    • Specify options with DoNotDropObjectTypes = new ObjectType[] { ObjectType.Tables}

Error:
{"Could not deploy package.\r\nError SQL72014: Core Microsoft SqlClient Data Provider: Msg 13944, Level 16, State 1, Line 1 Cannot truncate table 'dbo.B2bCustomerNode' because it is being referenced by an EDGE constraint.\r\nError SQL72045: Script execution error. The executed script:\r\nTRUNCATE TABLE [dbo].[B2bCustomerNode];\r\n\r\n\r\n"}

B2bCustomerNode is a Sql Graph Node, and it does have an edge table constraint.

Microsoft.SqlServer.DacFx NuGet package has unsigned assemblies

  • SqlPackage or DacFx Version: 150.5164.1
  • .NET Framework (Windows-only) or .NET Core: .NET Standard 2.0
  • Environment (local platform and source/target platforms): Local Windows OS

Steps to Reproduce:

I am building a custom MSBuild task in C# targeting .NET Standard 2.0.

I am referencing the Microsoft.SqlServer.DacFx 150.5164.1 package and use Microsoft.SqlServer.TransactSql.ScriptDom.dll to do some SQL script parsing.

However, MSBuild cannot run my custom task because strong-name validation fails on Microsoft.SqlServer.TransactSql.ScriptDom.dll: it is marked as delay-signed. Why is this DLL not signed in the package?

I plan to build my custom MSBuild task into a package that I need to reference in a .NET 5.0 project to run the target in MSBuild.

How do we get a Microsoft.SqlServer.TransactSql.ScriptDom.dll that is signed and .NET Standard 2.0 compatible?

(DacFx/SqlPackage/SSMS/Azure Data Studio)

SqlPackage does not deploy referenced DAC package

Summary

Although IncludeCompositeObjects is set to true, SqlPackage deploys only the "main" DAC package.

Environment

OS: Windows 10 Pro x64
SQL Server version: 13.0.4001
SqlPackage version: 15.0.5176.1

Steps to reproduce

I have created a solution with two database projects: Dependent.sqlproj and Dependency.sqlproj. Dependent.sqlproj references Dependency.sqlproj:

<ItemGroup>
  <SqlCmdVariable Include="Dependency">
    <DefaultValue>Dependency</DefaultValue>
    <Value>$(SqlCmdVar__4)</Value>
  </SqlCmdVariable>
</ItemGroup>
<ItemGroup>
  <ProjectReference Include="..\Dependency\Dependency.sqlproj">
    <Name>Dependency</Name>
    <Project>{d635d16a-0d11-4c0a-b4c3-a646e442d8f4}</Project>
    <Private>True</Private>
    <SuppressMissingDependenciesErrors>False</SuppressMissingDependenciesErrors>
    <DatabaseSqlCmdVariable>Dependency</DatabaseSqlCmdVariable>
  </ProjectReference>
</ItemGroup>

After compilation of Dependent.sqlproj the bin folder looks fine, because it contains the following files:

  • Dependency.dacpac;
  • Dependency.dll;
  • Dependency.pdb;
  • Dependent.dacpac;
  • Dependent.dll;
  • Dependent.pdb.

To deploy the databases I use the following command script:

sqlpackage /a:Publish ^
  /sf:Dependent.dacpac ^
  /tsn:"(localdb)\MSSQLLocalDB" ^
  /tdn:Dependent ^
  /dsp:Deployment.sql ^
  /df:Deployment.log ^
  /p:IncludeCompositeObjects=true ^
  /v:Dependency=Dependency

Expected behaviour

It should deploy Dependent and Dependency databases to the target server.

Actual behaviour

It deploys only Dependent database. Deployment.sql also contains nothing about Dependency database objects.

Attachments

DB import fails on 'ASYNC_STATS_UPDATE_WAIT_AT_LOW_PRIORITY'

  • SqlPackage or DacFx Version:
  • .NET Framework (Windows-only) or .NET Core: .NET core
  • Environment (local platform and source/target platforms): local platform

Steps to Reproduce:

  1. Have an Azure SQL DB with the ASYNC_STATS_UPDATE_WAIT_AT_LOW_PRIORITY feature on
  2. Export the DB into a bacpac using SQL export
  3. Import it locally into an on-prem/standalone SQL Server

Expected behavior - if there are any Azure SQL features not applicable on-prem, SqlPackage should handle them and let the DB get imported successfully.

Actual behavior - the import fails with the following error:

Error output was: \r\n *** Error importing database:Could not import package. Error SQL72014: Core Microsoft SqlClient Data Provider: Msg 102, Level 15, State 1, Line 5 Incorrect syntax near 'ASYNC_STATS_UPDATE_WAIT_AT_LOW_PRIORITY'. Error SQL72045: Script execution error. The executed script: IF EXISTS (SELECT 1 FROM [master].[dbo].[sysdatabases] WHERE [name] = N'$(DatabaseName)') BEGIN ALTER DATABASE SCOPED CONFIGURATION SET ASYNC_STATS_UPDATE_WAIT_AT_LOW_PRIORITY = ON; ALTER DATABASE SCOPED CONFIGURATION SET GLOBAL_TEMPORARY_TABLE_AUTO_DROP = OFF; END

Suggested fix:

IF EXISTS (SELECT * FROM sys.database_scoped_configurations WHERE [name] = 'ASYNC_STATS_UPDATE_WAIT_AT_LOW_PRIORITY')
BEGIN

END

Did this occur in prior versions? If not - which version(s) did it work in?

(DacFx/SqlPackage/SSMS/Azure Data Studio) - it fails with all SqlPackage versions; it never worked.

Unable to start up application with Microsoft.SqlServer.DacFx installed on .NET 5

  • DacFx Version: 150.5164.1
  • .NET Framework (Windows-only) or .NET Core: .NET 5
  • Environment (local platform and source/target platforms): x64 Microsoft Windows Server 2019 Standard -version 1809 OS build 17763.1098

Steps to Reproduce:

  1. Starting a .NET 5 application with DacFx installed runs into an error on app start. Error: Could not load file or assembly: The assembly may have been tampered with, or it was delay signed but not fully signed with the correct private key
  2. Current workaround is to uninstall this library (and remove usages).

Did this occur in prior versions? If not - which version(s) did it work in?
When running on .NET Framework 4.7.2 things worked fine.

SqlPackage.exe plus _BIN2_UTF8 collation generates invalid BACPAC

Originally posted at https://feedback.azure.com/forums/908035-sql-server/suggestions/43730865-sqlpackage-exe-plus-bin2-utf8-collation-generates

When a database has a char column with a _BIN2_UTF8 collation, SqlPackage for Windows export generates a BACPAC file that SqlPackage .NET Core fails to import.

This problem can manifest as various error messages. The one that I have seen most often is, reformatted for clarity:

Exception System.FormatException: Unexpected end of stream encountered.
with Data:
    annotation_tableName=[core].[ObjectChangeType],
    DacLogContext_EntryId=bbd5810d-0795-4e0a-8a82-a34e58f1f72a
at Microsoft.Data.Tools.Schema.Sql.SqlClient.Bcp.ColumnSerializer.ReadBytes(BinaryReader reader, Int32 len, Byte[]& bytes)
in F:\B\16846\6200\Sources\Product\Source\SchemaSql\SqlClient\Bcp\ColumnSerializer.cs:line 100

The stack trace suggests a problem in BCP file (de)serialization. Indeed, there is a difference in the BCP files generated by SqlPackage for Windows vs. SqlPackage .NET Core. For a char(1) Latin1_General_100_BIN2_UTF8 column value 'X', the former tool writes the bytes 02 00 58 00, while the latter writes just 58 00.

Affected Versions:

  • SqlPackage for Windows 15.0.5164.1
  • SqlPackage .NET Core 15.0.5084.2

Steps to Reproduce:

  • Have a local SQL Server 2019 instance.
  • Install SqlPackage for Windows: https://go.microsoft.com/fwlink/?linkid=2165211
  • Install SqlPackage .NET Core: choco install sqlpackage
  • Run the following SQL script.
  • Run the following PowerShell commands.
CREATE DATABASE Repro COLLATE Latin1_General_100_CI_AI_SC_UTF8;
GO

USE Repro;
GO

CREATE TABLE dbo.Foo
(
    A char(1)
        COLLATE Latin1_General_100_BIN2_UTF8
        NOT NULL
        PRIMARY KEY
,
    B varchar(10)
        NOT NULL
);

INSERT dbo.Foo VALUES ('X', 'whatever');
$spwin = "${env:ProgramFiles}\Microsoft SQL Server\150\DAC\bin\SqlPackage.exe"

$spcore = "${env:ChocolateyInstall}\lib\sqlpackage\tools\sqlpackage.exe"

& $spwin  /a:Export /ssn:. /sdn:Repro  /tf:repro.bacpac

& $spcore /a:Import /tsn:. /tdn:Repro2 /sf:repro.bacpac

DACPAC Publish fails when altering column constraint to NOT NULL

We use the Azure SQL Database deployment task (version 1.171.4) to deploy to Azure SQL Database. When we tried to deploy a column constraint, we got the following error in the log.

2020-10-26T14:30:50.1206693Z *** The column MarketId on table [bdl_DW].[MarketLanguage] must be changed from NULL to NOT NULL. If the table contains data, the ALTER script may not work. To avoid this issue, you must add values to this column for all rows or mark it as allowing NULL values, or enable the generation of smart-defaults as a deployment option. ...
2020-10-26T14:30:52.0466530Z ##[error]The column LanguageId on table [bdl_DW].[MarketLanguage] must be changed from NULL to NOT NULL. If the table contains data, the ALTER script may not work. To avoid this issue, you must add values to this column for all rows or mark it as allowing NULL values, or enable the generation of smart-defaults as a deployment option.
2020-10-26T14:30:52.0468615Z ##[error]Warning SQL72016: The column MarketId on table [bdl_DW].[MarketLanguage] must be changed from NULL to NOT NULL. If the table contains data, the ALTER script may not work. To avoid this issue, you must add values to this column for all rows or mark it as allowing NULL values, or enable the generation of smart-defaults as a deployment option.
2020-10-26T14:30:52.0473247Z ##[error]Error SQL72014: .Net SqlClient Data Provider: Msg 50000, Level 16, State 127, Line 8 Rows were detected. The schema update is terminating because data loss might occur.
2020-10-26T14:30:52.0474749Z ##[error]Error SQL72045: Script execution error. The executed script: IF EXISTS (SELECT TOP 1 1 FROM [bdl_DW].[MarketLanguage]) RAISERROR (N'Rows were detected. The schema update is terminating because data loss might occur.', 16, 127) WITH NOWAIT;

The database we deploy to does NOT contain any NULL values in that column, but the deployment still fails. I know that it is possible to enable GenerateSmartDefaults, but we don't want to do this. From the log it looks like the generated script is designed to raise an error whenever the table contains rows, even though no rows conflict with the NOT NULL constraint.

I believe this to be a bug, as the informational message suggests that having no NULL values in the column would resolve it.
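The gap between the generated guard and the condition that actually matters can be sketched as follows (a hypothetical Python illustration, not DacFx code; the table and column names are taken from the log above):

```python
# A table that contains rows, but no NULLs in the column being altered,
# mirroring the reporter's [bdl_DW].[MarketLanguage] scenario.
rows = [{"MarketId": 1}, {"MarketId": 2}]

# What the generated script checks, per the log:
#   IF EXISTS (SELECT TOP 1 1 FROM [bdl_DW].[MarketLanguage]) RAISERROR(...)
# i.e. abort whenever ANY row exists.
generated_guard_fires = len(rows) > 0

# The check the reporter expects: abort only if a NULL value would
# actually violate the new NOT NULL constraint.
null_guard_fires = any(r["MarketId"] is None for r in rows)

print(generated_guard_fires)  # True  -> deployment aborts
print(null_guard_fires)       # False -> the ALTER would have been safe
```

This is why the deployment fails on a populated table even when every row already satisfies the new constraint.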

SQL46010: Incorrect Syntax near INCLUDE, for table variable indexes.

Originally posted at https://developercommunity.visualstudio.com/t/SQL46010:-Incorrect-Syntax-near-INCLUDE/1352571

In a SQL Server Database Project, VS2019 shows a syntax error when trying to create an index with included columns on a table variable:

CREATE PROCEDURE [dbo].[Procedure1]
	@param1 int = 0,
	@param2 int
AS

DECLARE @myTable TABLE (
  Field1 INT,
  Field2 INT,
  INDEX [IX_TMP_1] NONCLUSTERED ([Field1]) INCLUDE ([Field2])
)

	SELECT @param1, @param2
RETURN 0

It works perfectly in SSMS.
Azure SQL Database with Compatibility Level 150

DacFx - only return public types as properties

If you have a TSqlObject like a Column, you can get the Length using:

var length = column.GetProperty<int>(Column.Length);

Length is an int giving the actual length. If we want to know something more exotic, such as the expression on a computed column, we would do:

var expression = column.GetProperty<SqlScriptProperty>(Column.Expression);

but SqlScriptProperty is an internal class, so we can't use the typed version; instead we need to fall back to getting an object:

var expression = column.GetProperty(Column.Expression);

It happens to be a string, but we don't know at compile time what the type is, so we can only assume it ends up being a string.

We can't force a string, as in:

var expression = column.GetProperty<string>(Column.Expression);

because we get a cast failure (we can't cast SqlScriptProperty to string).

Please could you return only public classes from TSqlObject.GetProperty?

If there is somewhere better to raise DacFx issues, please say :)

Advice or strategy for merging multiple layered dacpacs into one

Hi @dzsquared. We are trying to solve a problem similar to a previously filed issue. Specifically, we have 15 layered and independently versioned dacpacs deployed in sequence to a target database. Each of these dacpacs primarily focuses on a specific database schema, and its SQL project is contained in the same repo as the api/app. During deployment, our CD pipeline checks whether the target database has already had this dacpac and version deployed. If it has, the pipeline skips to the next dacpac.

However, the pipeline in its current state could use some improvements. Specifically, we'd like to be able to preview changes (breaking or otherwise) that will be made against the production database before approving them. We thought of using sqlpackage's DeployReport action to generate an XML report which we can parse for breaking changes, preventing a deployment from going through without additional approval from DBAs. With 15 dacpacs to process, this would be difficult and time consuming. Hence, we looked for ways to merge all 15 dacpacs into one when building the "overall" package/release, and that's how we stumbled onto Ed Elliott's sample project for merging dacpacs into one. There are some limitations with that sample project, such as the inability to combine dacpac references, sqlcmd variables, and refactor logs, to name a few. I've forked Ed's repo in order to make it compile as 64-bit, migrate packages.config to PackageReference, and remove 2 sample dacpac projects I'm unable to find.

Each of our dacpacs can come with:

  • an optional pre-predeployment phase containing zero or more idempotent scripts for handling breaking changes before the model comparison process is initiated by sqlpackage. This is something we handle separately and can merge ourselves, since these scripts fall outside of a dacpac.
  • an optional post-postdeployment phase containing files to bcp data into the db if/when specific tables are empty. This is only used for really large pre-seeded tables. Bacpac is an option, but we've never tested whether it works idempotently when deploying against a database that already has data in the target tables.

We have made sure the database properties of all 15 dacpacs are identical. During a merge, we expect to inherit only the database properties of the first dacpac provided (out of the 15). Any advice you could provide to address some of these limitations would be appreciated.

Add support for Enterprise SQL Server features (e.g. ONLINE = ON for indexes)

When building a SQL Server Database project, if SQL files include Enterprise features such as online index builds, these options are ignored when outputting the SQL code or applying it to target databases.

How to reproduce: add an index to a database project that would get deployed. Test it with "WITH (ONLINE = ON)" and without. Looking at the output, you'll see that the "WITH (ONLINE = ON)" option is ignored.

Schema update unnecessarily drops/creates related objects when Sensitivity Classification changes are detected

Summary:

When schema compare (correctly) detects that a Sensitivity Classification should be created, dropped, or modified in the target, the generated script takes the unnecessary steps of dropping and recreating all indexes and constraints related to that column. Index rebuilding in particular is problematic for our ability to perform zero-downtime/zero-disruption production releases. Since sensitivity classifications currently cannot be ignored by schema compare, it seems our only recourse is to ensure these classifications are always in sync between the dacpac and target databases, without relying on DacFx to help with this. This is a little painful given that our security/ops people apply Azure-recommended classifications directly to the target Azure databases from time to time.

  • DacFx Version: 150.5282.3
  • .NET Framework (Windows-only) or .NET Core: .NET 5
  • Environment (local platform and source/target platforms):
    • Windows 10
    • Visual Studio 2019 (v16.11.5)
    • SQL project with DSP Microsoft.Data.Tools.Schema.Sql.Sql150DatabaseSchemaProvider or Microsoft.Data.Tools.Schema.Sql.SqlAzureV12DatabaseSchemaProvider
    • .NET 5 console app with Microsoft.SqlServer.DacFx v150.5282.3 installed
    • Target database SQL Server 2019 or Azure SQL

Steps to Reproduce:

(easiest if you already have a SQL project and a synced live database to experiment with)

  1. Define a table in a SQL project that contains an index and constraint on some column, build dacpac.
  2. Create a target database (SQL 2019 or Azure SQL) with the same table defined (manually or by deploying the dacpac).
  3. In the target database, add a sensitivity classification (with T-SQL or otherwise) to a column that has an index and constraint.
  4. Generate a schema update script with the following code (or similar):
using Microsoft.SqlServer.Dac.Compare;

var source = new SchemaCompareDacpacEndpoint(dacpacPath);
var target = new SchemaCompareDatabaseEndpoint(targetConnectionString);
var comp = new SchemaComparison(source, target);
var script = comp.Compare().GenerateScript(dbName).Script;
  5. Inspect the generated script. Notice that it correctly drops the classification, but it also unnecessarily drops and recreates any indexes and constraints defined on that column.

Did this occur in prior versions? If not - which version(s) did it work in?

Yes, same behavior in older versions of DacFx published to NuGet. I'm fairly certain this bug has existed since support for sensitivity classifications was first introduced.

Build DACPAC with references

At the moment it seems that it is not possible to build a dacpac that references objects in other dacpac packages, or I just can't figure out how to do it. Is this something that's planned to be added?

IncludeFileNamesCollector doesn't work cross-platform

I've been working on an open source project called MSBuild.Sdk.SqlProj which allows defining SSDT-like projects using the simplified SDK-style project files, while also enabling support for building .dacpacs cross-platform.

One of the things that has been requested is support for merging pre- and/or post-deployment scripts into a single file (see rr-wfm/MSBuild.Sdk.SqlProj#23), which can then be packaged into the resulting .dacpac. I've looked at how this works in Visual Studio and that seems to rely on the IncludeFileNamesCollector from the Microsoft.Data.Tools.Schema.Sql.Deployment namespace in the Microsoft.Data.Tools.Schema.Sql assembly. This type parses the SQL scripts and resolves imported scripts (using the :r OtherScript.sql syntax), which is exactly what I need.

Now I know that this is an internal type, but I'd rather not reinvent the wheel, so I'm using reflection to call it. However, I'm running into an issue where this type normalizes paths to all uppercase. That works just fine on Windows, of course, but not so much on operating systems that have case-sensitive file systems (like Linux).
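The failure mode can be illustrated with a small sketch (hypothetical; the collector's real normalization code is internal to Microsoft.Data.Tools.Schema.Sql, and the file names here are made up):

```python
# Files keyed by their exact names, the way a case-sensitive
# Linux file system stores them.
files_on_disk = {
    "PostDeploy.sql": ":r OtherScript.sql",
    "OtherScript.sql": "PRINT 'hello';",
}

def resolve(name):
    """Case-sensitive lookup, analogous to open() on ext4."""
    return files_on_disk.get(name)

# What the collector effectively does to every path it encounters:
normalized = "OtherScript.sql".upper()

print(resolve("OtherScript.sql") is not None)  # True: exact casing resolves
print(resolve(normalized) is not None)         # False: 'OTHERSCRIPT.SQL' not found
```

On NTFS both lookups would succeed because the file system ignores case, which is why the problem only surfaces on Linux.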

Not sure if this is the right place, but I'm wondering if there's something that can be done here so that I can unblock my users. The assembly I'm using is part of the Microsoft.SqlServer.DACFX NuGet package.
