Look at the ISOLATION LEVEL setting in SQL Server. To guard against data anomalies you want it set as high as possible, but the higher the level, the more locks are taken and the longer they are held, and running the operations in parallel makes that contention worse, not better.
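For reference, the isolation level is set per session in T-SQL; SERIALIZABLE is the strictest of the standard levels. A minimal sketch (the `dbo.Orders` table is hypothetical, standing in for your own data):

```sql
-- Strictest standard isolation level for the current session.
-- Stronger isolation means fewer anomalies but more, longer-held locks.
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE;

BEGIN TRANSACTION;
    -- Under SERIALIZABLE, this read takes range locks that are held
    -- until COMMIT, blocking concurrent writers to the same range.
    SELECT COUNT(*) FROM dbo.Orders WHERE CustomerId = 42;
COMMIT TRANSACTION;
```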
Depending on the code of your procedure, you may even get deadlocks: SQL Server picks a victim, rolls its transaction back, and the caller has to reissue it; those retries can trigger further deadlocks, and before you know it your database is thrashing.
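The retry-after-deadlock pattern mentioned above can be sketched in T-SQL as follows. SQL Server reports a deadlock victim with error 1205; `dbo.DoIntensiveWork` is a hypothetical procedure standing in for your own:

```sql
-- Retry a transaction a few times if it is chosen as a deadlock victim.
DECLARE @retries INT = 3;
WHILE @retries > 0
BEGIN
    BEGIN TRY
        BEGIN TRANSACTION;
        EXEC dbo.DoIntensiveWork;  -- hypothetical workload
        COMMIT TRANSACTION;
        BREAK;  -- success: leave the retry loop
    END TRY
    BEGIN CATCH
        IF XACT_STATE() <> 0 ROLLBACK TRANSACTION;
        IF ERROR_NUMBER() = 1205 AND @retries > 1
            SET @retries = @retries - 1;  -- deadlock victim: try again
        ELSE
            THROW;  -- other error, or retries exhausted: re-raise
    END CATCH;
END;
```

Note that each retry re-enters the same contention, which is exactly why serializing the calls (below) can be the simpler fix.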
What I would recommend for such intensive data operations is not to parallelize them at all; just run them one after another. If your server is a multi-core machine, it may even parallelize the work internally on its own.
Of course, this is broad hand-waving. To say anything specific, one would need to look at the data structures (along with any indexes) as well as the actual code being executed.