I don't use the Visual Studio SSDT Publish

  • Posted on: 19 August 2015
  • By: Ed Elliott

I see questions pretty regularly on Stack Overflow and the SSDT MSDN forum that are some variation on "I am doing a publish through Visual Studio and it doesn't quite do what I want" or "I want to publish the database when I press F5, except when I don't want it to publish". The second variation I particularly enjoy because I have always thought that what I want is a computer that I can think something at and it will do it - I guess we are a little way off, but one day hopefully that will be possible.

It occurred to me pretty early on in my SSDT usage that publishing through visual studio wasn't really the way I wanted to do my development so what I do is:

  • 1. Write a PowerShell script that calls sqlpackage.exe to deploy a dacpac to my dev instance
  • 2. Write my code in Visual Studio / SSDT (new code and living the refactoring dream)
  • 3. Build the project and ensure there are no errors
  • 4. Alt+Tab to my PowerShell session and run the deploy script
  • 5. Watch the PowerShell script and, if it deploys successfully, continue my work
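A minimal sketch of that deploy script might look like this - the sqlpackage path, server name, database name and dacpac path are all placeholders for whatever your environment uses:

```powershell
# deploy-dev.ps1 - push the latest dacpac to the local dev instance.
# Paths, server and database names here are examples - change to suit.
$sqlPackage = "C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\sqlpackage.exe"
$dacpac     = ".\bin\Debug\MyDatabase.dacpac"

& $sqlPackage /Action:Publish `
    /SourceFile:$dacpac `
    /TargetServerName:"localhost" `
    /TargetDatabaseName:"MyDatabaseDev" `
    /p:BlockOnPossibleDataLoss=false

# A bit of console colouring so the result shouts at you
if ($LASTEXITCODE -ne 0) {
    Write-Host "Deploy failed" -ForegroundColor Red
} else {
    Write-Host "Deploy succeeded" -ForegroundColor Green
}
```

Because the script owns all the options, changing a parameter is a one-line edit rather than a trip through a dialog.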

This has a number of benefits:

Benefit #1 - Simplicity

I know exactly when a deploy is going to happen and I know exactly what options it will use. If I want to add a variable or change a parameter then I can do it and it is straightforward.

Benefit #2 - Errors + Warnings

I see any errors or warnings quickly and brightly. I don't really like the publish dialog in SSDT because it sort of sits at the bottom and doesn't shout when it is finished; I often sit waiting after it has finished and then have to click around for the errors. This isn't a moan about the UI, I am just more comfortable with a command line and a bit of console colouring!

Benefit #3 - Monitor redeployments

If there is an issue, like a default constraint with a definition in SSDT that SQL Server changes when it stores it (such as adding brackets around things), then it will be constantly re-deployed. By running in a command window I get to see exactly what changes are happening, and if I see the same things being re-deployed every time I can do something about it, rather than the changes being hidden unless I really go looking for them.

Benefit #4 - Testing the CI process

This one is the thing I get most excited about (it really, really is!). I use the same script to drive sqlpackage.exe for my local instance that I use on the CI server and other environments to deploy changes and generate change scripts. This means that the most well tested part of my development process is the bit that is often seen as the most risky (i.e. deploying T-SQL changes). It also gives me the chance to really build confidence that sqlpackage.exe will work and that I can trust it.

On trusting sqlpackage.exe: I have used it a lot now - I must have done local deploys numbering in the low thousands - and I haven't had any issues that weren't caused by me. The last few versions have been reliable and I am really excited about the future of SQL Server development with SSDT - no other RDBMS has anything anywhere near the capabilities of SSDT, and it is free!

Benefit #5 - Obvious where to put deployment contributors

If you are doing SSDT development seriously then you will likely want to use my deployment contributor (http://agilesqlclub.codeplex.com/), or another contributor, at some time or another. Deploying contributors for Visual Studio is machine wide, whereas with sqlpackage.exe I can gather all the files into a folder, add an extensions directory and add the filter to that. It makes life so much simpler!

Complexity #1 - Extra effort

It isn't all chips and gravy (incidentally I had chips and gravy for the first time recently, it was ok but I am sticking to salt, vinegar and a bit of ketchup).

I need to write the PowerShell script and run it. It could just as well be a batch file or an app that uses the DacFx API to deploy the dacpac, but I prefer good old sqlpackage.exe.
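For completeness, the DacFx alternative can be sketched from PowerShell too - the dll path, connection string and names here are assumptions for illustration, not part of my actual setup:

```powershell
# A sketch of deploying a dacpac via the DacFx API instead of sqlpackage.exe.
# The dll path, server and database names are examples - adjust for your machine.
Add-Type -Path "C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\Microsoft.SqlServer.Dac.dll"

$services = New-Object Microsoft.SqlServer.Dac.DacServices "Server=localhost;Integrated Security=true"
$package  = [Microsoft.SqlServer.Dac.DacPackage]::Load(".\bin\Debug\MyDatabase.dacpac")
$options  = New-Object Microsoft.SqlServer.Dac.DacDeployOptions
$options.BlockOnPossibleDataLoss = $false

# upgradeExisting = $true means alter the existing database rather than requiring a fresh create
$services.Deploy($package, "MyDatabaseDev", $true, $options)
```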

Complexity #2 - Context switching from Visual Studio

I need to Alt+Tab out of Visual Studio and run something else. It would be nice to run it all from Visual Studio, but after the build I am likely going to run the application or tSQLt tests, so I would probably be context switching anyway.

I have played around with getting this step into Visual Studio, I have tried:

  • Using the command window to call powershell - this gets boring quite quickly, the colours are just so dull
  • Using StudioShell - this was the most interesting but ultimately I went back to the simpler approach


Overall I am really pleased that I use sqlpackage.exe rather than Visual Studio; I can still use the XML publish profiles and it is a simple way to deploy changes or generate scripts.

Happy sqlpackage'ing!

Why does sqlpackage.exe not compare after the pre deployment script has run?

  • Posted on: 24 June 2015
  • By: Ed Elliott

It isn't a Pre compare script

When you use SSDT to deploy your dacpac, a pre deployment script lets you run some T-SQL before the script that brings your database up to date with the dacpac.

The important thing to note is that the process is:

  • sqlpackage compares the dacpac to the database
  • sqlpackage generates a list of changes to make
  • sqlpackage generates the T-SQL statements to make those changes
  • sqlpackage either creates a script for later execution or deploys the changes

The script that is created *or* executed contains: 0 or 1 pre deployment scripts, the statements to make the required changes, and then 0 or 1 post deployment scripts.

The pre deployment script is not actually executed before the compare - that would be a pre compare script. And if you were only generating a script to be manually reviewed and deployed later, would you want the pre compare script to be executed? I would say no.

If the script is not run before a compare when generating a script, should it be run before a compare which will then be automatically deployed? I would say probably yes but that would be confusing and I am sure it would cause lots of problems for people.

I need to run a script before the compare

Ok, well, there are two general approaches. The first is to have a script that you run before you call sqlpackage.exe. This is the simplest and, if you are sure your script is fine to run, then do it.
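As a sketch of that first approach - assuming sqlcmd is on the path, and with placeholder server, database and file names - the wrapper just runs your "pre compare" changes first and then lets sqlpackage do its compare and deploy:

```powershell
# Run the pre-compare changes before sqlpackage ever looks at the database.
# Server, database and file names here are examples only.
sqlcmd -S localhost -d MyDatabaseDev -i .\pre-compare.sql

sqlpackage.exe /Action:Publish `
    /SourceFile:.\bin\Debug\MyDatabase.dacpac `
    /TargetServerName:localhost `
    /TargetDatabaseName:MyDatabaseDev
```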

The second is to stage changes over releases: make the changes you would have put in the pre deploy script as part of a normal deployment, and then in a future deployment make the second set of changes. This doesn't work in all scenarios, but it can help if you are disciplined about making sure that the second part actually happens.

Overall, when using sqlpackage there is a lot of flexibility, but it isn't going to do everything for you. Use good testing processes to catch when things go wrong and make sure you understand the tooling, to work out what does and doesn't work for you.

Why does SqlPackage keep dropping and recreating constraints?

  • Posted on: 16 January 2015
  • By: Ed Elliott

I sometimes see this when deploying via sqlpackage.exe:

Analyzing deployment plan (Complete)
Updating database (Start)
Dropping DF_XXX...
Creating DF_XXX...
Update complete.
Updating database (Complete)
Successfully published database.

This happens every time I publish, even though there have not been any changes.

Typically this is caused by SQL Server taking the create statement from SSDT and changing it when it stores it - it does things like changing GETDATE to getdate and (0) to ((0)). To be honest, I am not 100% sure what it will and won't change, but my fix when this happens is to go to SSMS, script out the table, and copy and paste the constraint back into SSDT; then deploy and it should stop.

An example is the extra parentheses which get added, so if you take:

CREATE TABLE xxx ( id int not null, status int not null default (0));

When you deploy to SQL Server this ends up as:

CREATE TABLE xxx ( id int not null, status int not null default ((0)));

Who knows why it does this, but it does - just remember to script it back from SSMS and forget about it!
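As a sketch of the fix (the constraint name here is just an example), the SSDT source gets updated to the SSMS-scripted form, double parentheses and all, so the compare no longer sees a difference:

```sql
-- The SSDT definition pasted back from SSMS, matching what SQL Server stores.
-- Table, column and constraint names are illustrative.
CREATE TABLE xxx
(
    id     int NOT NULL,
    status int NOT NULL CONSTRAINT DF_xxx_status DEFAULT ((0))
);
```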

HOWTO: Filter Dacpac Deployments

  • Posted on: 14 January 2015
  • By: Ed Elliott

Updated 2015-02-02

I have changed the details about passing multiple filters at the bottom; see the codeplex site for full details.


Download the filter from: https://agilesqlclub.codeplex.com/releases/view/610727 put the dll into the same folder as sqlpackage.exe and add these command line parameters to your deployment:

/p:AdditionalDeploymentContributors=AgileSqlClub.DeploymentFilterContributor /p:AdditionalDeploymentContributorArguments="SqlPackageFilter=IgnoreSchema(BLAH)"

This will not deploy, drop or alter anything in the BLAH schema.

More Info

I recently discovered deployment contributors, which let you write .net code to intercept the dacpac deployment process and modify either the steps involved or the actual scripts themselves. They are documented here: http://msdn.microsoft.com/en-us/library/microsoft.sqlserver.dac.deployme...

Well, I cannot tell you how excited I was and indeed how excited I still am - very excited.

When you use the schema compare tool inside Visual Studio you can choose what you deploy and what you do not, but when you have a dacpac and try to deploy it outside of Visual Studio there isn't a way to do that, so I thought I would write a generic filter that can be used to filter what is deployed.

This also gave me a chance to test visual studio online with a real project and also codeplex as I was interested how it compares to github.

I have now deployed the code to codeplex and you can use it to see how the deployment contributors are written - they are very simple. The releases section also has the compiled code for anyone who doesn't want to use git clone:


How do I use this?

Ok, so the way the DacFx API works is that you need to put the dll that contains the contributor into the same folder as sqlpackage.exe. This obviously isn't ideal if it is a shared machine, so I would copy the DAC folder that contains sqlpackage.exe somewhere else, copy the dll into that folder and run the deployment from there.

Once the dll is in the same folder as sqlpackage.exe you need to tell it to load the contributor which you do using this argument:
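It is the first of the two parameters shown at the top of the post:

```powershell
/p:AdditionalDeploymentContributors=AgileSqlClub.DeploymentFilterContributor
```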


The next step is to decide which filters you would like to use.

There are two types of filters: Keep and Ignore.

Keep filters stop objects from being dropped when they are in the destination but not in the dacpac; if they are in the dacpac and not in the destination, *or are different*, then they will still be created or altered.

Ignore filters stop any sort of operation, create, alter or drop so there is some flexibility.

Keep filters are really only ever used in combination with /p:DropObjectsNotInSource=True, as otherwise the objects wouldn't be dropped anyway.

Once you know what type of filter you want, you need to decide what you will filter on. Your choices are: Name, SchemaName and object type (stored procedure, function, table, user, role, role membership etc).

Name filters work on an object's name - pretty straightforward.

Schema filters work on the name of the schema, so you can keep or ignore everything in a specific schema.

Object type filters work on the type of the object as the DacFx API sees it; these types are all documented as properties of the ModelSchema class:


The object types are all fields, so the list starts Aggregate, ApplicationRole etc etc.

Once you have decided how you will filter, you specify the filter itself, which is a regex - but don't be scared, it doesn't have to be complex.

Because of the way the arguments are passed to SqlPackage.exe, which then parses them and passes them on to the deployment contributor, the format is a little rigid, but essentially the filters look like this:

To keep everything in dbo:
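Following the pattern of the IgnoreSchema example at the top of the post, this would be something like:

```powershell
/p:AdditionalDeploymentContributorArguments="SqlPackageFilter=KeepSchema(dbo)"
```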


To ignore all Tables:
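Again following the same pattern (the type name Table is an assumption based on the ModelSchema field names mentioned above):

```powershell
/p:AdditionalDeploymentContributorArguments="SqlPackageFilter=IgnoreType(Table)"
```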


To keep a table called MyTable or MyExcellentFunnyTable:
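Because the filter is a regex, one name filter can match both names - the exact KeepName spelling here is an assumption following the same pattern as the other filters:

```powershell
/p:AdditionalDeploymentContributorArguments="SqlPackageFilter=KeepName(MyTable|MyExcellentFunnyTable)"
```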


When you have decided on the filter, you need to pass it to SqlPackage.exe using:
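That is the same pair of parameters as the example at the top of the post - YourFilter(YourRegex) here is a placeholder for whichever filter you chose:

```powershell
/p:AdditionalDeploymentContributors=AgileSqlClub.DeploymentFilterContributor /p:AdditionalDeploymentContributorArguments="SqlPackageFilter=YourFilter(YourRegex)"
```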


You can specify multiple filters by separating them with a semicolon and adding a uniqueifier to the end of each argument name:
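Based on that description, a sketch of two filters together - the argument names SqlPackageFilter0 and SqlPackageFilter1 are the uniqueified versions:

```powershell
/p:AdditionalDeploymentContributorArguments="SqlPackageFilter0=KeepSchema(dbo);SqlPackageFilter1=IgnoreType(Table)"
```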


(The reason for the uniqueifier is detailed: https://connect.microsoft.com/SQLServer/feedback/details/1112969)

That should be it - it is fairly straightforward, and if you get any issues then please shout, either via email or twitter @EdDebug, and I will help as quickly as I can. Or you can do a git clone, fix any issues and send a pull request!


Enforcing T-SQL quality with SSDT Analysis Extensions

  • Posted on: 30 December 2014
  • By: Ed Elliott


Simply put, it is possible to include checks in the SSDT (SQL Server Data Tools) build process. You come up with a set of rules, or start with these (https://tsqlsmellsssdt.codeplex.com/), and when you check in your code, as part of your normal build or CI build you run the code analysis - the code analysis verifies that the code is good and either continues the build or fails it if you have committed a crime against your rules.


Code quality is important. One part of a quality codebase is consistency and standards, and if you don't enforce them, how do you know the standards are being followed? The quick answer is that you don't. But it needn't be hard: just decide on the rules, enforce them, and then go on worrying about something else (like how to deploy data as part of an SSDT deployment!)


The first step is to work out what rules you want to enforce, and then you need to implement the code to enforce them - sounds easy, right?

There are a couple of approaches. The simplest would be to get the code smells extension from https://tsqlsmellsssdt.codeplex.com/ and either use it as it is or extend the rules; it is open source, so you can and should contribute back any additions you make.

If you want to write the rules yourself then you can start with the DacFx samples from Microsoft: https://dacsamples.codeplex.com/SourceControl/latest#RuleSamples/Capital...

Once you have the rules and have compiled them (they are C#), you end up with a dll that you need to copy into the Dac folder. If you are using Visual Studio 2013 and the November 2014 SSDT release then the extensions folder is:

C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120

I will put up a separate blog about where to find the extensions folder for each release as it does float around a little!

Once you have the dll in place, restart Visual Studio to pick up the new dll, then right click on the solution that contains the SSDT project (don't click on the project, click on the solution) and choose "Run Code Analysis On Solution":

To run the code analysis every time you build the project, set the project option "Enable Code Analysis on Build":

To run the code analysis from the command line on a specific project, run:

msbuild /t:StaticCodeAnalysis project_name.sqlproj


Hopefully this has piqued someone's interest enough to start checking the quality of their code as part of their build process. Writing the rules yourself is a little tricky, but there are enough samples around to get you started, and hopefully more people will start sharing what they do so we can all benefit in this extremely important development activity.