There are not many absolutes in this debate, so the honest answer is "it depends." Some scenarios where different conditions affect the decision might be:
Client-server application
One example of a place where this might be advisable is an older-style 4GL or rich-client application where all database operations were performed through stored procedures - update, insert, and delete sprocs for each entity. In this architecture, the sprocs act as the main interface to the database, and all the business logic relating to a particular entity lives in one place.
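As a rough illustration of the "one sproc per operation per entity" pattern, here is a minimal sketch in SQL Server's T-SQL dialect; the table, column, and procedure names are invented for the example:

    -- Hypothetical example: the Customer entity is only ever modified
    -- through sprocs like this one, so validation rules live here.
    CREATE PROCEDURE UpdateCustomer
        @CustomerId  INT,
        @CreditLimit DECIMAL(12,2)
    AS
    BEGIN
        -- Business rule enforced at the single point of update
        IF @CreditLimit < 0
        BEGIN
            RAISERROR('Credit limit cannot be negative', 16, 1);
            RETURN;
        END

        UPDATE Customer
        SET    CreditLimit = @CreditLimit
        WHERE  CustomerId  = @CustomerId;
    END

The client never issues raw UPDATE statements; it only calls sprocs, so the rules cannot be bypassed.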
This type of architecture is somewhat unfashionable these days, but at one point it was considered the best way to do it. Many VB, Oracle Forms, Informix 4GL, and other client-server applications of the era were built this way, and it actually works fairly well.
It's not without its drawbacks, however - SQL is not particularly good at abstraction, so it is quite easy to end up with fairly opaque SQL code that becomes a maintenance problem: hard to understand and not as modular as one might like.
Is this still relevant today? Quite often a rich client is the appropriate platform for an application, and there is certainly plenty of new development going on with Winforms and Swing. We also have good open-source ORMs today, whereas a 1995-vintage Oracle Forms application would not have had the option of using that kind of technology. However, the decision to use an ORM is not black and white - Fowler's Patterns of Enterprise Application Architecture does a good job of walking through a range of data access strategies and discussing their relative merits.
Three-tier application with a rich object model
This type of application takes the opposite approach and places all of the business logic in the middle-tier object model, with a relatively thin database layer (or perhaps an off-the-shelf mechanism such as an ORM). Here you are attempting to put all the application logic in the middle tier. The data access layer has relatively little intelligence, except perhaps for a handful of stored procedures needed to work around limitations of the ORM.
In this case, SQL-based business logic is kept to a minimum, since the main repository of application logic is the middle tier.
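The few sprocs that survive in this architecture are typically set-based operations that would be painfully slow if done row by row through the ORM. A hedged sketch, again in T-SQL with invented names:

    -- Hypothetical example: one of the few sprocs kept alongside an ORM,
    -- because a bulk set operation inside the database is far cheaper
    -- than loading every affected row into the middle tier.
    CREATE PROCEDURE ApplyAnnualPriceIncrease
        @Percent DECIMAL(5,2)
    AS
    BEGIN
        UPDATE Product
        SET    ListPrice = ListPrice * (1 + @Percent / 100)
        WHERE  Discontinued = 0;
    END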
Overnight batch processing
If you have to do a periodic run to pick out records that meet some complex criteria and do something with them, it may be appropriate to implement this as a stored procedure. For anything that has to traverse a significant portion of a decent-sized database, a sproc-based approach is probably the only reasonably performant way to do this sort of thing.
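To make the set-based style concrete, here is a minimal sketch of such a job in T-SQL; the tables and the dormancy rule are invented for illustration:

    -- Hypothetical nightly job: flag accounts with no activity for a year.
    -- The whole scan and update happen inside the database engine, with no
    -- row-at-a-time round trips to application code.
    CREATE PROCEDURE FlagDormantAccounts
    AS
    BEGIN
        UPDATE a
        SET    a.Status = 'DORMANT'
        FROM   Account a
        WHERE  a.Status = 'ACTIVE'
          AND NOT EXISTS (SELECT 1
                          FROM   AccountTransaction t
                          WHERE  t.AccountId = a.AccountId
                            AND  t.PostedAt >= DATEADD(year, -1, GETDATE()));
    END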
In this case SQL may well be the appropriate tool, although traditional 3GLs (particularly COBOL) were designed specifically for this type of processing. At really high volumes (particularly in mainframe environments), doing this type of processing against flat or VSAM files outside the database can be the fastest way to do it. In addition, some jobs are inherently record-oriented and procedural, or may be much more transparent and maintainable if implemented that way.
To paraphrase Ed Post, "you can write COBOL in any language" - although you may not want to. If the data has to live in a database, use SQL, but it is certainly not the only game in town.
Reports
The nature of the reporting tool tends to dictate the means of encoding business logic. Most reporting tools are designed to work with SQL-based data sources, so the tool itself forces the choice on you.
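In practice this often means encoding the business rule once, as a view that any reporting tool can query directly. A small sketch with invented names, in T-SQL:

    -- Hypothetical example: the definition of a 'delinquent' account is
    -- encoded once in SQL, where every report sees the same rule.
    CREATE VIEW DelinquentAccounts AS
    SELECT AccountId,
           Balance,
           DueDate
    FROM   Account
    WHERE  Balance > 0
      AND  DueDate < GETDATE();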
Other domains
Some applications, such as ETL processing, can be a good fit for SQL. ETL tools start to get unwieldy when the transformation becomes too complex, so you may opt for a stored-procedure-based architecture instead. Mixing queries and transformations across extraction, ETL-tool processing, and stored-procedure processing can produce a transformation pipeline that is hard to test and troubleshoot; an example of keeping a transform wholly in SQL follows.
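As a hedged sketch of what "keeping the transform in SQL" looks like (staging and fact table names are invented, T-SQL syntax), a load step can be a single testable statement:

    -- Hypothetical transformation step: load a cleaned fact table from a
    -- staging table entirely in SQL, so the whole transform lives in one
    -- place instead of being split across an ETL tool and sprocs.
    INSERT INTO FactSales (SaleDate, ProductId, Quantity, NetAmount)
    SELECT CAST(s.SaleDate AS DATE),
           s.ProductId,
           s.Quantity,
           s.GrossAmount - COALESCE(s.Discount, 0)
    FROM   StagingSales s
    WHERE  s.Quantity > 0;           -- drop obviously bad rows here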
If you already have a significant portion of your logic in sprocs, it may be better to put all of the logic there, since that gives you a relatively homogeneous and modular code base. In fact, I have it on fairly good authority that around half of all data warehouse projects in the banking and insurance sectors are done this way as an explicit design decision - for precisely this reason.