Due to security restrictions, each client's data must be stored in its own separate database. Occasionally a client requires customizations that mean specific stored procedures within their database have to be modified, sometimes significantly. This makes keeping the stored procedures in sync across all of the databases a maintenance nightmare. It seems like this must be a common problem. Are there generally accepted standards for handling this kind of situation?
Some approaches that I reviewed:
A central database containing the "standard" stored procedures. Calling code first checks whether the stored procedure exists in the client database; if it does not, the version in the "standard" database is executed instead. Problem: I cannot figure out how to execute a stored procedure that lives in one database but references tables in another database without generating dynamic SQL. This would otherwise be my preferred solution, since only a single copy of each stored procedure has to be maintained and customizations are easy to manage; I just cannot work out how to make it work, if it is even possible, because the procedure always executes in the context of the database that contains it.
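To make the sticking point concrete, here is a minimal sketch of the dispatch I have in mind (ClientDB, StandardDB, usp_GetOrders, and Orders are made-up names). The lookup itself is easy; the problem is what happens inside the fallback call:

    -- Prefer the client's own copy of the procedure, otherwise fall back
    -- to the central "standard" copy (all names here are hypothetical).
    IF EXISTS (SELECT 1 FROM ClientDB.sys.procedures WHERE name = N'usp_GetOrders')
        EXEC ClientDB.dbo.usp_GetOrders;
    ELSE
        -- This executes, but inside the procedure an unqualified reference
        -- such as SELECT ... FROM dbo.Orders resolves to StandardDB.dbo.Orders,
        -- not ClientDB.dbo.Orders, which is exactly the problem described above.
        EXEC StandardDB.dbo.usp_GetOrders;

As far as I can tell, the usual workaround is something like EXEC ClientDB.sys.sp_executesql @sql, which does run in the context of the client database, but that is dynamic SQL again, which is what I am trying to avoid.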
Any customizations are made to a copy of the standard stored procedure with a _Custom suffix. Every stored-procedure call first tests whether the _Custom version exists and, if it is found, invokes it instead of the standard one. The customizations are more obvious this way, but every call site needs an extra SQL prefix just to check which procedure name to use. In addition, any changes to the "standard" stored procedures still have to be replicated to hundreds of databases.
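The per-call prefix would look roughly like this (usp_GetOrders is again a made-up name), repeated at every call site:

    -- Wrapper that has to precede every call to the standard procedure.
    IF OBJECT_ID(N'dbo.usp_GetOrders_Custom', N'P') IS NOT NULL
        EXEC dbo.usp_GetOrders_Custom;   -- client-specific override exists
    ELSE
        EXEC dbo.usp_GetOrders;          -- fall back to the standard version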
Do away with stored procedures altogether. Store them instead as T-SQL statements, either in files or in a table somewhere. This would require building an entirely new management system for accessing, testing, and updating the T-SQL code (Management Studio currently covers all of that).
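A minimal sketch of the mechanics, assuming a hypothetical repository table (ScriptRepository, ScriptName, and ScriptText are made-up names); the real cost is the tooling around it rather than the execution itself:

    -- Hypothetical table holding the T-SQL body for each script.
    CREATE TABLE dbo.ScriptRepository
    (
        ScriptName sysname       NOT NULL PRIMARY KEY,
        ScriptText nvarchar(max) NOT NULL
    );

    -- "Calling a procedure" then becomes a lookup plus dynamic SQL.
    DECLARE @sql nvarchar(max);
    SELECT @sql = ScriptText FROM dbo.ScriptRepository WHERE ScriptName = N'GetOrders';
    EXEC sys.sp_executesql @sql;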
I'm looking for ideas on a simple and elegant solution to this problem, or at least a way to improve the current situation, which is manually updating each stored procedure and watching for conflicting customizations as I go.