Running stored procedures in a different database context?

Due to security restrictions, each client's data must be stored in its own database. Sometimes clients require customizations that mean specific stored procedures within their database must change, sometimes significantly. This makes maintaining stored procedures across all of those databases a nightmare. It seems like this must be a common problem. Are there generally accepted standards for handling such situations?

Some approaches that I reviewed:

A central database containing the "standard" stored procedures. The calling process first checks whether the stored procedure exists in the client database; if it does not, it is executed from the "standard" database. Problem: I cannot figure out how to execute a stored procedure that lives in one database and have it reference tables in another database without generating dynamic SQL. This seems like the best option to me, since only a single copy of each stored procedure has to be maintained and customizations are easy to accommodate. I just can't figure out how to make it work, if it is even possible, because the procedure always executes in the context of the database that contains it.
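
A minimal sketch of that dispatch (the procedure and database names here are made up), which also shows the limitation: the fallback still runs against the central database's own tables:

    IF OBJECT_ID(N'dbo.usp_GetOrders', N'P') IS NOT NULL
        EXEC dbo.usp_GetOrders;                -- client-specific copy in this database
    ELSE
        EXEC StandardDB.dbo.usp_GetOrders;     -- "standard" copy, but it executes in StandardDB's context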

Any customizations are made to a copy of the standard stored procedure with a _Custom suffix. Every stored procedure call first tests whether a _Custom version exists and, if so, invokes it instead of the standard one. The customizations are more obvious, but every SQL call has to be prefixed with an extra query to work out which procedure name to use. In addition, any changes to the "standard" stored procedures must still be replicated to hundreds of databases.

Ditch stored procedures altogether and store the logic as T-SQL statements instead, either in files or in a table somewhere. Problem: this requires building an entirely new system for accessing, testing, and updating the T-SQL code (all of which is currently done in Management Studio).
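
If you went that route, the execution side could be as small as the following sketch (the snippet table and names are hypothetical); the hard part is everything around it, as noted above:

    DECLARE @sql nvarchar(max) =
        (SELECT Body FROM dbo.SqlSnippets WHERE Name = N'GetOrders');
    EXEC sys.sp_executesql @sql;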

I'm looking for guidance on a simple and elegant solution to this problem, or at least an improvement over the current situation, which is to update each stored procedure manually, watching out for conflicting customizations as I go.

+4
5 answers

There is an undocumented back door for this:

  • make sure the procedure name starts with sp_
  • put the procedure in master
  • mark the procedure as a system object by executing sp_MS_marksystemobject 'sp_...'

This way the procedure is "magically" available in every database and always acts on local objects. For instance, if the procedure runs select ... from dbo.Foo and is called in the context of the bar database, it will effectively select from bar.dbo.Foo, even though the procedure was created in master.
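
A hedged end-to-end sketch (the Foo table and the procedure name are made up; the marking call is the undocumented one discussed below):

    USE master;
    GO
    CREATE PROCEDURE dbo.sp_GetAllFoo
    AS
        SELECT * FROM dbo.Foo;          -- resolves against the calling database, not master
    GO
    EXEC sys.sp_MS_marksystemobject N'dbo.sp_GetAllFoo';
    GO
    USE bar;
    GO
    EXEC dbo.sp_GetAllFoo;              -- reads bar.dbo.Foo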

Of course, by relying on an undocumented procedure (sp_MS_marksystemobject) you expose yourself to all the usual problems of undocumented features: it may change without warning in a future version or even in an SP/CU update, product support may refuse to help you, and so on.

If I were in your position, I would deploy the procedures to each tenant's database. The real problem is change management; my favorite approach to that is described in "Version Control and Your Database".

+6

What if you moved away from stored procedures? This architecture seems very dependent on them.

When the smart goes wrong

A better approach would be to move this logic into your domain layer and use composition/polymorphism to handle the differences from client to client.

+2

Regarding your first approach: you can reference tables in a database other than the one the stored procedure executes in. You just need to fully qualify the name as [DatabaseName].[Schema].[TableName].[ColumnName]. There is no need to build a dynamic SQL string.
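
For example (the database, schema, and table names here are hypothetical):

    SELECT o.OrderId, o.Total
    FROM [ClientA].[dbo].[Orders] AS o   -- table lives in another database
    WHERE o.Status = 'Open';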

+1

I'm not sure whether you mean that each client has a separate database or a separate database server. If the former, the answer above covers it: reference tables using the DatabaseName.SchemaName.TableName syntax (or just DatabaseName..TableName if you always use the default schema). If the latter, you will need to Google for a more exotic way to reach stored procedure code on another server.

Either way, to me this screams for a better automated deployment solution rather than for trying to execute stored procedures across database boundaries. You could set up your stored procedure deployment so that new versions can be pushed quickly to all databases, or to just a subset. And what if a client asks for a working offline copy of their database? This seems like a much better solution to me.
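
A rough sketch of such a deployment step, assuming the client databases share a naming convention (the 'Client%' prefix and the procedure are made up; CREATE OR ALTER needs SQL Server 2016 SP1 or later):

    DECLARE @db sysname, @exec nvarchar(400);
    DECLARE @proc nvarchar(max) = N'
        CREATE OR ALTER PROCEDURE dbo.usp_GetOrders
        AS
            SELECT OrderId, Total FROM dbo.Orders;';

    DECLARE dbs CURSOR LOCAL FAST_FORWARD FOR
        SELECT name FROM sys.databases WHERE name LIKE N'Client%';
    OPEN dbs;
    FETCH NEXT FROM dbs INTO @db;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Calling <db>.sys.sp_executesql runs the batch in that database's context
        SET @exec = QUOTENAME(@db) + N'.sys.sp_executesql';
        EXEC @exec @proc;
        FETCH NEXT FROM dbs INTO @db;
    END
    CLOSE dbs; DEALLOCATE dbs;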

Duplicate code is not always bad.

+1

Super old post, but could you suffix your custom stored procs with a client identifier?

    dbo.sp_getOrders
    dbo.sp_getOrders_xyz123

Each standard proc then takes a ClientIdentifier parameter and tries to invoke the custom proc if one exists.
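
A minimal sketch of that dispatch, assuming the naming convention above (the procedure body and columns are illustrative):

    CREATE PROCEDURE dbo.sp_getOrders
        @ClientIdentifier sysname
    AS
    BEGIN
        DECLARE @custom nvarchar(300) = N'dbo.sp_getOrders_' + @ClientIdentifier;
        IF OBJECT_ID(@custom, N'P') IS NOT NULL
            EXEC @custom;                             -- client-specific override
        ELSE
            SELECT OrderId, Total FROM dbo.Orders;    -- standard behaviour
    END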

Then all your code is in one database, with a single set of source code; any customizations are obvious (and it's obvious which client they belong to); and deployment and source control stay simple.

Depending on whether clients can access their own databases, I would probably obfuscate the clientId...

0
