What is the best way to run scripts against MS-SQL?

Necropolis

Executive Member
Joined
Feb 26, 2007
Messages
8,401
The other option is to roll your own C# (or whatever language) console application that does exactly what you want.

Shouldn't be difficult to set up.
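
A minimal sketch of what such a runner could look like, assuming the Microsoft.Data.SqlClient NuGet package; the connection string, folder layout and GO handling below are illustrative, not a hardened implementation:

Code:
// Minimal console runner: executes every .sql file in a folder, in name order.
// All names here are illustrative; GO splitting is deliberately naive.
using System;
using System.IO;
using System.Linq;
using Microsoft.Data.SqlClient;

var connectionString = args[0];   // e.g. "Server=.;Database=MyDb;Integrated Security=true"
var scriptFolder     = args[1];   // e.g. @"C:\release\scripts"

using var conn = new SqlConnection(connectionString);
conn.Open();

foreach (var file in Directory.GetFiles(scriptFolder, "*.sql").OrderBy(f => f))
{
    Console.WriteLine($"Running {Path.GetFileName(file)}...");

    // GO is a client-side batch separator, not T-SQL, so split on it here
    // (assumes GO sits alone on its own line).
    var batches = File.ReadAllText(file)
        .Split(new[] { "\r\nGO\r\n", "\nGO\n" }, StringSplitOptions.RemoveEmptyEntries);

    foreach (var batch in batches)
    {
        if (string.IsNullOrWhiteSpace(batch)) continue;
        using var cmd = new SqlCommand(batch, conn);
        cmd.ExecuteNonQuery();
    }
}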
 

Urist

Expert Member
Joined
Mar 20, 2015
Messages
3,656
Not sure how @@trancount solves the problem?

I thought you wanted to make one big script out of a bunch of smaller ones with dependencies on each other. I didn't understand why you don't nest them through normal transaction processing, but your previous post explained it.
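
For anyone following along, the wrinkle with nesting is that BEGIN TRAN inside an open transaction does not start a second real transaction; it only bumps the counter. A quick T-SQL illustration:

Code:
BEGIN TRAN;         -- @@TRANCOUNT = 1
    BEGIN TRAN;     -- @@TRANCOUNT = 2 (no new "real" transaction)
    COMMIT;         -- @@TRANCOUNT back to 1, nothing durable yet
COMMIT;             -- @@TRANCOUNT = 0, the work actually commits
-- A ROLLBACK at any depth undoes the entire stack, not just the inner "transaction".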
 

Spacerat

Expert Member
Joined
Jul 29, 2015
Messages
1,328
OK let me rephrase my question. What do you guys use to:
- Write TSQL scripts
- Version control your database migrations
- Deploy your migration scripts to local and remote DBs in a controlled way.
- Do you create one big script or are they also broken up into smaller ones?

Red-gate has terrific tools for this, so maybe that is the way to go. But I cannot justify the cost just yet. The version control on the DB structure looks insane!
 

MielieSpoor

Expert Member
Joined
Dec 6, 2006
Messages
1,984
OK let me rephrase my question. What do you guys use to:
- Write TSQL scripts
- Version control your database migrations
- Deploy your migration scripts to local and remote DBs in a controlled way.
- Do you create one big script or are they also broken up into smaller ones?

Red-gate has terrific tools for this, so maybe that is the way to go. But I cannot justify the cost just yet. The version control on the DB structure looks insane!


SQL Server Data Tools?

Since you need this to happen in Azure, that is the route I would go.

Best of all, it's free!

To answer your questions with a quote:
Managing Schema
What about schema? Even if you don’t have logic in your database in the form of stored procedures, you’re bound to change the schema at some stage. Don’t do it using manual scripts. Start using SQL Server Data Tools (SSDT) for managing your schema. Would you change the code directly on a webserver to implement new features? Of course not – you want to have source control and testing etc. So why don’t we treat databases the same way? Most teams seem happy to “just fix it on the server” and hope they can somehow replicate changes made on the DEV databases to QA and PROD. If that’s you – stop it! Get your schema into an SSDT project and turn off DDL-write permissions so that the only way to change a database schema is to change the project, commit and let the pipeline make the change.

The advantage of this approach is that you get a full history (in source control) of changes made to your schema. Also, sqlpackage calculates the diff at deployment time between your model and the target database and only updates what it needs to in order to make the database match the model. Idempotent and completely uncaring as to the start state of the database. Which means hassle-free deployments.
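
As a concrete illustration (my addition, not from the quoted article): publishing a dacpac built from an SSDT project is a one-liner with sqlpackage, and it works the same against on-prem SQL Server and Azure SQL Database. Server and database names below are placeholders:

Code:
sqlpackage /Action:Publish ^
  /SourceFile:MyDb.dacpac ^
  /TargetServerName:myserver.database.windows.net ^
  /TargetDatabaseName:MyDb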
 
Last edited:

Spacerat

Expert Member
Joined
Jul 29, 2015
Messages
1,328
SQL Server Data Tools?

Since you need this to happen in Azure, that is the route I would go.

Best of all, it's free!

To answer your questions with a quote:


Well, this is the thing. The scripts I am talking about are schema (and data) changes required as part of a release. Say I need to add a column to a table: I create a script that adds the column. That's part of the dev process, like writing a piece of code. This script goes into version control as part of the release. Then if another table needs to drop a column, that goes into another script, also put into version control. So when the release needs to be propagated to test, I run only a single script that includes all the required ones to deploy the updates to the test DB. The binaries are deployed and testing can begin. Same process to deploy to live.

I never fix things (schemas) on live. That is just not done. It gets properly dev'ed and put into version control, then propagated to test. The automated tests run. Only after a clean run does everything get promoted to live. I don't use diffs at all. I know exactly what schema version each DB is at any point because EVERYTHING is scripted.

I might just add that the above method is how I have done it up to now with non-MSSQL DBs. The tools for MSSQL are obviously light years ahead of the tools I am used to. But the process I have used up to now will work equally well with MSSQL. But maybe it is time to change and have the DB managed from a repository like that from red-gate: http://www.red-gate.com/products/sql-development/sql-toolbelt/?gclid=CL7qp8yIqdICFQe4GwoddrgKhw

But eye-wateringly expensive for the whole suite
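
For what it's worth, the "one script that includes all the required ones" part of my process maps directly onto SQLCMD mode, where :r pulls other files in (file names below are made up):

Code:
-- master.sql: run in SQLCMD mode (SSMS: Query > SQLCMD Mode, or sqlcmd -i master.sql)
:r .\001_add_customer_email_column.sql
:r .\002_drop_legacy_flag_column.sql
:r .\003_reference_data_updates.sql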
 

Pho3nix

The Legend
Joined
Jul 31, 2009
Messages
30,589
Well, this is the thing. The scripts I am talking about are schema (and data) changes required as part of a release. Say I need to add a column to a table: I create a script that adds the column. That's part of the dev process, like writing a piece of code. This script goes into version control as part of the release. Then if another table needs to drop a column, that goes into another script, also put into version control. So when the release needs to be propagated to test, I run only a single script that includes all the required ones to deploy the updates to the test DB. The binaries are deployed and testing can begin. Same process to deploy to live.

I never fix things (schemas) on live. That is just not done. It gets properly dev'ed and put into version control, then propagated to test. The automated tests run. Only after a clean run does everything get promoted to live. I don't use diffs at all. I know exactly what schema version each DB is at any point because EVERYTHING is scripted.

I might just add that the above method is how I have done it up to now with non-MSSQL DBs. The tools for MSSQL are obviously light years ahead of the tools I am used to. But the process I have used up to now will work equally well with MSSQL. But maybe it is time to change and have the DB managed from a repository like that from red-gate: http://www.red-gate.com/products/sql-development/sql-toolbelt/?gclid=CL7qp8yIqdICFQe4GwoddrgKhw

But eye-wateringly expensive for the whole suite

Buy RedGate for 1 user and put it on a server?
Finally home and reading through your posts.

Wouldn't the easiest option then be to carry on your dev as is?
Take all your completed scripts and put them in a folder. Create an SSIS package that loops through the folder and runs the scripts. All you'd have to do is push the scripts to the folder and run the job.

All in SQL, and it would work with Azure.
 

[)roi(]

Executive Member
Joined
Apr 15, 2005
Messages
6,282
Well, this is the thing. The scripts I am talking about are schema (and data) changes required as part of a release. Say I need to add a column to a table: I create a script that adds the column. That's part of the dev process, like writing a piece of code. This script goes into version control as part of the release. Then if another table needs to drop a column, that goes into another script, also put into version control. So when the release needs to be propagated to test, I run only a single script that includes all the required ones to deploy the updates to the test DB. The binaries are deployed and testing can begin. Same process to deploy to live.

I never fix things (schemas) on live. That is just not done. It gets properly dev'ed and put into version control, then propagated to test. The automated tests run. Only after a clean run does everything get promoted to live. I don't use diffs at all. I know exactly what schema version each DB is at any point because EVERYTHING is scripted.

I might just add that the above method is how I have done it up to now with non-MSSQL DBs. The tools for MSSQL are obviously light years ahead of the tools I am used to. But the process I have used up to now will work equally well with MSSQL. But maybe it is time to change and have the DB managed from a repository like that from red-gate: http://www.red-gate.com/products/sql-development/sql-toolbelt/?gclid=CL7qp8yIqdICFQe4GwoddrgKhw

But eye-wateringly expensive for the whole suite
When dealing with scenarios like this I usually ask the following questions:
  • Can I assume that the team will be able to maintain custom scripts / code in my absence?
  • or... does it justify my cost to build something truly custom? (almost always NO)
  • versus... the cost to buy an off-the-shelf solution that some techie could be trained to use
You know your situation... personally I always opt for off the shelf for these "run-of-the-mill" type of things -- the last thing I want to be doing in 6 months is maintaining a "quickie scripting thing".

Naturally, if you're contracting services to "company x", it may be worth your while to provide this as a service for Rxx.xx/pm
 
Last edited:

Spacerat

Expert Member
Joined
Jul 29, 2015
Messages
1,328
[)roi(] said:
When dealing with scenarios like this I usually ask the following questions:
  • Can I assume that the team will be able to maintain custom scripts / code in my absence?
  • or... does it justify my cost to build something truly custom? (almost always NO)
  • versus... the cost to buy an off-the-shelf solution that some techie could be trained to use
You know your situation... personally I always opt for off the shelf for these "run-of-the-mill" type of things -- the last thing I want to be doing in 6 months is maintaining a "quickie scripting thing".

Naturally, if you're contracting services to "company x", it may be worth your while to provide this as a service for Rxx.xx/pm

Very true... my preference is a bought-out tool; developing a script-thingy usually costs you more in the long run. But I was also wondering what approach other people here take. Surprisingly little info to date...
 

prod

Executive Member
Joined
Nov 11, 2009
Messages
6,132
Very true... my preference is a bought-out tool; developing a script-thingy usually costs you more in the long run. But I was also wondering what approach other people here take. Surprisingly little info to date...

Well, there are a few approaches to take. Though it may not be relevant in your case, here is what I've used in the past:

OK let me rephrase my question. What do you guys use to:
- Write TSQL scripts
- Version control your database migrations
- Deploy your migration scripts to local and remote DBs in a controlled way.
- Do you create one big script or are they also broken up into smaller ones?

- SSMS
- All our .sql scripts are then committed to a code repository; if they need to be in a release then they are also committed to a Release folder of sorts. Bamboo will build and package the scripts under the release folder into a single NuGet file for the 'SQL project'
- Octopus Deploy will extract the NuGet file on the environment you're deploying to and https://dbexecutor.codeplex.com/ will execute the scripts; works like a charm. The C# implementation we had to write is literally like 20 lines of code (a rough sketch follows below)
- Doesn't really matter

Setting this up may have been time-consuming and a learning curve for us, but at the same time it's saving so much time when the release date comes - it's literally a click of a button on a dashboard. Also, we didn't have to spend a cent on extra tools since most of them were already used in-house and the rest are open-source.
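
A rough sketch of what that kind of executor can look like, with a version table bolted on so re-runs skip scripts that were already applied. The AppliedScripts table and all other names are an invented convention for illustration, not what DbExecutor actually does:

Code:
// Release-folder executor that records what it has already run.
// Assumes Microsoft.Data.SqlClient; scripts are assumed to contain no GO
// separators (GO is a client-side construct the server does not understand).
using System.IO;
using System.Linq;
using Microsoft.Data.SqlClient;

using var conn = new SqlConnection(args[0]);   // connection string
conn.Open();

new SqlCommand(
    @"IF OBJECT_ID('dbo.AppliedScripts') IS NULL
        CREATE TABLE dbo.AppliedScripts(
            Name      nvarchar(260) PRIMARY KEY,
            AppliedAt datetime2 NOT NULL DEFAULT SYSUTCDATETIME());", conn)
    .ExecuteNonQuery();

foreach (var file in Directory.GetFiles(args[1], "*.sql").OrderBy(f => f))
{
    var name = Path.GetFileName(file);

    var check = new SqlCommand(
        "SELECT COUNT(*) FROM dbo.AppliedScripts WHERE Name = @n;", conn);
    check.Parameters.AddWithValue("@n", name);
    if ((int)check.ExecuteScalar() > 0) continue;   // already applied, skip it

    new SqlCommand(File.ReadAllText(file), conn).ExecuteNonQuery();

    var mark = new SqlCommand(
        "INSERT INTO dbo.AppliedScripts(Name) VALUES (@n);", conn);
    mark.Parameters.AddWithValue("@n", name);
    mark.ExecuteNonQuery();
}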
 
Last edited:

Pho3nix

The Legend
Joined
Jul 31, 2009
Messages
30,589
Well, there are a few approaches to take. Though it may not be relevant in your case, here is what I've used in the past:



- SSMS
- All our .sql scripts are then committed to a code repository; if they need to be in a release then they are also committed to a Release folder of sorts. Bamboo will build and package the scripts under the release folder into a single NuGet file for the 'SQL project'
- Octopus Deploy will extract the NuGet file on the environment you're deploying to and https://dbexecutor.codeplex.com/ will execute the scripts; works like a charm. The C# implementation we had to write is literally like 20 lines of code
- Doesn't really matter

Setting this up may have been time-consuming and a learning curve for us, but at the same time it's saving so much time when the release date comes - it's literally a click of a button on a dashboard. Also, we didn't have to spend a cent on extra tools since most of them were already used in-house and the rest are open-source.

Nice implementation. :)
No Jenkins though?
 

Spacerat

Expert Member
Joined
Jul 29, 2015
Messages
1,328
Well, there are a few approaches to take. Though it may not be relevant in your case, here is what I've used in the past:



- SSMS
- All our .sql scripts are then committed to a code repository; if they need to be in a release then they are also committed to a Release folder of sorts. Bamboo will build and package the scripts under the release folder into a single NuGet file for the 'SQL project'
- Octopus Deploy will extract the NuGet file on the environment you're deploying to and https://dbexecutor.codeplex.com/ will execute the scripts; works like a charm. The C# implementation we had to write is literally like 20 lines of code
- Doesn't really matter

Setting this up may have been time-consuming and a learning curve for us, but at the same time it's saving so much time when the release date comes - it's literally a click of a button on a dashboard. Also, we didn't have to spend a cent on extra tools since most of them were already used in-house and the rest are open-source.

Thanks, interesting info.
 

MielieSpoor

Expert Member
Joined
Dec 6, 2006
Messages
1,984
Well, this is the thing. The scripts I am talking about are schema (and data) changes required as part of a release. Say I need to add a column to a table: I create a script that adds the column. That's part of the dev process, like writing a piece of code. This script goes into version control as part of the release. Then if another table needs to drop a column, that goes into another script, also put into version control. So when the release needs to be propagated to test, I run only a single script that includes all the required ones to deploy the updates to the test DB. The binaries are deployed and testing can begin. Same process to deploy to live.

I never fix things (schemas) on live. That is just not done. It gets properly dev'ed and put into version control, then propagated to test. The automated tests run. Only after a clean run does everything get promoted to live. I don't use diffs at all. I know exactly what schema version each DB is at any point because EVERYTHING is scripted.

I might just add that the above method is how I have done it up to now with non-MSSQL DBs. The tools for MSSQL are obviously light years ahead of the tools I am used to. But the process I have used up to now will work equally well with MSSQL. But maybe it is time to change and have the DB managed from a repository like that from red-gate: http://www.red-gate.com/products/sql-development/sql-toolbelt/?gclid=CL7qp8yIqdICFQe4GwoddrgKhw

But eye-wateringly expensive for the whole suite
Ok, but you are missing the point and purpose of the product. What you are trying to do manually there, SSDT does for you. You extract your DB schema to your file system, check it into your preferred source control system and build it whenever you need to start to test. The resulting schema from your build is then used to update the database in the target environment. Once you are ready to move from QA to staging, you use this exact same schema file to update the next test environment.

From everything you explained, SSDT will solve your problems. Yes, you can fork out the money for the Redgate tools, but install SSDT and give it a try - it won't cost you a cent, only time to trial it, but you will have to do that with any tool you are considering in any case.
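
To make the "extract your schema" step concrete: getting an existing database into a dacpac (which you can then import into an SSDT project) looks something like this, with server and database names as placeholders:

Code:
sqlpackage /Action:Extract /SourceServerName:. /SourceDatabaseName:MyDb /TargetFile:MyDb.dacpac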
 

Spacerat

Expert Member
Joined
Jul 29, 2015
Messages
1,328
Ok, but you are missing the point and purpose of the product. What you are trying to do manually there, SSDT does for you. You extract your DB schema to your file system, check it into your preferred source control system and build it whenever you need to start to test. The resulting schema from your build is then used to update the database in the target environment. Once you are ready to move from QA to staging, you use this exact same schema file to update the next test environment.

From everything you explained, SSDT will solve your problems. Yes, you can fork out the money for the Redgate tools, but install SSDT and give it a try - it won't cost you a cent, only time to trial it, but you will have to do that with any tool you are considering in any case.

Sure, I have never used SSDT. So does SSDT generate delta scripts? Because I need to deploy delta scripts to my DBs.
 

Solarion

Honorary Master
Joined
Nov 14, 2012
Messages
21,885
The problem is a script that includes data. If your DB is anything over a gig it becomes incredibly difficult to work with; I found I start running out of memory when trying to run it.
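
One common workaround, assuming it is the client tool rather than the server that is running out of memory, is to skip SSMS and feed the file to sqlcmd, which streams it instead of loading the whole script into an editor. Server and file names are placeholders:

Code:
sqlcmd -S myserver -d MyDb -E -i big_data_script.sql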
 