Best practices for working with SQL Server databases

Background

My group has 4 SQL Server databases:

  • Prod
  • UAT
  • Test
  • Dev

I work in the Dev environment. When the time comes to promote the objects I have worked on (tables, views, functions, stored procedures), I submit a request to my manager, who promotes them to Test. After testing, she submits a request to the administrator, who promotes them to UAT. After successful user acceptance testing, the same administrator promotes them to Prod.

Problem

The whole process is cumbersome for several reasons.

  • Everyone has to track their changes manually. If I update, add, or delete any objects, I need to keep track of them so that my emailed promotion request covers everything I did. In theory, anything I miss should be caught in testing or UAT, but that is not guaranteed, and at best it wastes the tester's time.
  • Many of the changes I make are iterative and done through the GUI, which means there is no record of what I changed, only the end result (at least as far as I know).
  • We are still at a fairly early stage of building out the data model, so most of the changes, at least by volume, are minor: changing a column's data type, renaming tables as we crystallize what they will be used for, tweaking functions and stored procedures, and so on.

Question

People have been doing this kind of work for decades, so I assume there must be a much better way to manage this process. What I would like is to diff two of the databases to see how their structures differ, use that diff to generate a change script, and submit that change script as my promotion request. Is this possible? If not, are there other good ways to manage this process?
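For instance, I imagine something along these lines (a rough sketch; the database names are made up) just to spot column-level differences between two environments:

    -- Columns (or column definitions) present in Dev but not in Prod.
    -- INFORMATION_SCHEMA views work across databases on the same server.
    SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, DATA_TYPE
    FROM DevDb.INFORMATION_SCHEMA.COLUMNS
    EXCEPT
    SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, DATA_TYPE
    FROM ProdDb.INFORMATION_SCHEMA.COLUMNS;

That only finds the differences, though; it doesn't give me a change script.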

For the record, we are a 100 percent Microsoft shop, currently upgrading everything to SQL Server 2008, so any tools available in that package are fair game.


I should clarify that I'm not necessarily looking for a diff tool. If that is the best way to keep our environments in sync, fine, but if there is a better way, I want to find it.

An example of what I really want is Ruby on Rails migrations: dead simple syntax, all changes are documented automatically and by default, and determining which migrations need to run is almost trivially easy. I would love something like that for SQL Server.

My ideal solution is 1) simple and 2) hard to mess up. Rails migrations are both; everything I have done so far in SQL Server is neither.

+6
sql database sql-server
8 answers

Version control and your database

The root of all evil is making changes through the UI. SSMS is a DBA tool, not a developer tool. Developers should use scripts for any change to the database model/schema. Versioning your metadata and having an upgrade script from each version N to version N+1 is the only approach proven to work reliably. It is also the solution SQL Server itself deploys to keep track of metadata changes (resource db changes).
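As a minimal sketch of the pattern (the table, column, and script names here are illustrative, not prescribed): record the schema version in the database itself, and write each upgrade script so that it moves the schema from exactly version N to version N+1.

    -- One-time setup: the database keeps track of its own schema version.
    CREATE TABLE dbo.DatabaseVersion (VersionNumber int NOT NULL);
    INSERT INTO dbo.DatabaseVersion (VersionNumber) VALUES (1);
    GO

    -- upgrade_001_to_002.sql: applies only if the db is at version 1.
    IF (SELECT VersionNumber FROM dbo.DatabaseVersion) = 1
    BEGIN
        ALTER TABLE dbo.Customer ADD Email nvarchar(256) NULL;
        UPDATE dbo.DatabaseVersion SET VersionNumber = 2;
    END
    GO

Each script is checked into source control, and a deployment is just running, in order, the scripts between the target database's current version and the latest one.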

Comparison tools such as SQL Compare, or vsdbcmd and .dbschema files from VS Database projects, are last-resort tools for shops that fail to manage versions properly. They work in simple scenarios, but I have seen them all fail to impress in serious deployments. You simply cannot trust a tool to alter a 5+ TB table when it may try to copy the data...

+3

On our team, we handle database changes as follows:

  • We (re)generate a script that creates the complete database and check it into version control together with the other changes. We use 4 files: tables, user-defined functions and views, stored procedures, and permissions. This is fully automated - generating the script takes just a double-click.
  • If a developer needs to change the database, she makes the change on her local db.
  • For each change, we write an update script. These are easy to create: the developer regenerates the db script for her local db, and all the changes are then easy to identify thanks to version control. Most changes (new tables, new views, etc.) can simply be copied into the update script; others (adding a column, for example) must be written by hand - see the sketch after this list.
  • The update script is tested either on our shared dev database or by restoring the local database to the last backup taken before the change work began. If it passes, it is time to commit the changes.
  • The update scripts follow a naming convention so that everyone knows in what order to execute them.
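To make that concrete, a hypothetical update script (the file naming convention and object names are invented for illustration) might look like:

    -- 2009-06-15_0042_widen_order_note.sql
    -- A change that has to be written by hand rather than copied from
    -- the regenerated create script: widening an existing column.
    ALTER TABLE dbo.[Order] ALTER COLUMN Note nvarchar(1000) NULL;
    GO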

This works quite well for us, though it still needs some coordination if several developers are making heavy changes to the same tables and views. In practice, that happens infrequently.

Important points:

  • The database structure is only ever changed by scripts, with the exception of the developer's local db. This is important.
  • The SQL scripts are versioned in source control - the db can be recreated as it existed at any point in the past
  • database backups are taken regularly - at the very least before making changes to the db
  • changes to the db can be executed quickly - because the scripts for those changes are relatively easy to write

However, if you have many long-lived development branches in your projects, this may not work well.

This is far from an ideal solution, and special precautions must be taken. For example, if an update may fail depending on the data present in the database, it should be tested on a copy of the production database.

Unlike Rails migrations, we do not create scripts to revert changes. But that is not always possible anyway, at least as far as the data is concerned (the contents of a dropped column are lost even if you recreate the column).

+3

RedGate sells SQL Compare, an excellent tool for generating change scripts.

Visual Studio also comes in editions with database support, previously called the Database Edition.

Where I work, we long ago abandoned the Dev/Test/UAT/Prod split in favor of a very fast release cycle. If we push something broken to production, we fix it quickly. Our customers are certainly happier for it, but in a risk-averse corporate enterprise it can be a hard sell.

+2

There are several tools available to you. One, from Red Gate, is called SQL Compare. Awesome and highly recommended. SQL Compare lets you diff the schemas of two databases and will even generate the SQL change scripts for you.

Note that they have also been working on a SQL Server source control product for some time now.

Another option (if you are a Visual Studio shop) is the schema and data compare feature that is part of Visual Studio (not sure which editions include it).

+2

Agree that SQL Compare is an awesome tool.

However, we never make changes to database structures or objects that are not scripted and saved in source control, just like all our other code. Then you know exactly what belongs to the version you are releasing, because you have the scripts for that particular version.

Making structural changes through the GUI is a bad idea anyway. If you have a lot of data, it is much slower than using ALTER TABLE, at least in SQL Server. You want only tested scripts making changes to prod.
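To illustrate the difference (the table and column names are hypothetical): for many kinds of changes, the table designer rebuilds the whole table behind the scenes, copying every row, while a hand-written script changes it in place.

    -- Roughly what the designer generates for some changes:
    --   CREATE TABLE dbo.Tmp_Customer (...);          -- new table
    --   INSERT INTO dbo.Tmp_Customer SELECT ...;      -- copy ALL the data
    --   DROP TABLE dbo.Customer;
    --   EXEC sp_rename 'dbo.Tmp_Customer', 'Customer';

    -- The scripted equivalent is a single in-place operation:
    ALTER TABLE dbo.Customer ALTER COLUMN Name nvarchar(200) NULL;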

+2

I agree with the points marapet made: every change should be scripted.
The problem you may run into, however, is creating, testing, and tracking these scripts.
Take a look at the patching mechanism used in DBSourceTools.
http://dbsourcetools.codeplex.com

It was specifically designed to help developers get SQL Server databases under source control.

This tool lets you baseline your database at a specific point and create a named version (v1).
Then you create a deployment target and upgrade the named version to version 2 (v2).
Patches for any schema or data changes go in the patch directory.
Finally, you check the database and all the patches into source control, to share them with the other developers.

This gives you a repeatable process for testing all the patches to be applied from v1 to v2.
DBSourceTools also has functionality to help you create these scripts, i.e. schema compare and scripting tools.

Once you're done, simply send all the files in the patch directory to your DBA to upgrade from v1 to v2.

Good luck.

+1
  • Store the database version in a version table
  • Store the file name of each script that was applied successfully
  • Store the MD5 sum of each SQL script that has been applied, ignoring whitespace when calculating the sum (see the sketch after this list)
  • Keep track of who applied each script and when it was applied
  • The database must be checked at application startup
  • New SQL scripts should be applied automatically
  • If the MD5 sum of an already-applied script has changed, an error must be raised (in production mode)
  • Once a script has been released, it cannot be changed; it must be immutable in a production environment
  • Scripts should be written so that they can be applied to different types of databases (see Liquibase)
  • Since most DDL statements are auto-committed on most databases, it is best to have exactly one DDL statement per SQL script
  • DDL statements should be written so that they can be executed several times without error. This really helps in dev mode, where you may edit a script several times: for example, create a new table only if it does not exist, or even drop the table before creating it. With a script that has not yet been released, you can change it, clear its MD5 sum, and run it again.
  • Each SQL script must run in its own transaction
  • Triggers/procedures should be dropped and recreated after each db update
  • SQL scripts are stored in a version control system such as svn
  • A script's name contains the date it was committed, the (Jira) issue number, and a short description
  • Avoid adding rollback functionality to the scripts (Liquibase allows it); it makes them difficult to write and maintain. If you use exactly one DDL statement per script and run DML statements within a transaction, even a failing script is not a big problem to sort out
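A minimal T-SQL sketch of the tracking table and an idempotent script (the names are invented for illustration; HASHBYTES is the built-in used here to compute the MD5 sum):

    -- One row per applied script: name, checksum, who and when.
    CREATE TABLE dbo.SchemaChangeLog (
        ScriptName nvarchar(260) NOT NULL PRIMARY KEY,
        Md5Sum     varbinary(16) NOT NULL,
        AppliedBy  sysname       NOT NULL DEFAULT SUSER_SNAME(),
        AppliedAt  datetime      NOT NULL DEFAULT GETDATE()
    );
    GO

    -- An idempotent DDL script: safe to run repeatedly in dev mode.
    IF OBJECT_ID(N'dbo.Customer', N'U') IS NULL
        CREATE TABLE dbo.Customer (Id int NOT NULL PRIMARY KEY);
    GO

    -- Recording an applied script; a runner would strip whitespace from
    -- the script text before hashing, so formatting edits don't change the sum.
    INSERT INTO dbo.SchemaChangeLog (ScriptName, Md5Sum)
    VALUES (N'2009-06-15_JIRA-123_create_customer.sql',
            HASHBYTES('MD5', N'IF OBJECT_ID(...) ...'));  -- full script text here
    GO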
0
