I agree with most of what has been said above about unit testing. However, I want to emphasize that using mock repositories and unit tests does not give you the same level of coverage as database integration tests.
For example, our databases often have cascading deletes built right into the schema, so deleting the root object of an aggregate automatically deletes all of its child objects. A mock repository is not backed by a physical database that enforces these rules, so the cascade will not happen automatically (unless you rebuild all of those rules in the mock). This matters because if someone changes my schema design, I need my tests to break so that I can adjust the code/schema accordingly. I appreciate that this is integration testing rather than unit testing, but I thought it was worth mentioning.
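To make the point concrete, here is a minimal sketch (Python with sqlite3 purely for illustration; the table and class names are hypothetical) of a schema-level cascade that a naive mock repository silently ignores.

```python
import sqlite3

# --- Real database: the schema enforces the cascade ---
db = sqlite3.connect(":memory:")
db.execute("PRAGMA foreign_keys = ON")          # SQLite needs this to honour foreign keys
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY)")
db.execute("""CREATE TABLE order_lines (
                  id INTEGER PRIMARY KEY,
                  order_id INTEGER REFERENCES orders(id) ON DELETE CASCADE)""")
db.execute("INSERT INTO orders (id) VALUES (1)")
db.execute("INSERT INTO order_lines (id, order_id) VALUES (10, 1)")
db.execute("DELETE FROM orders WHERE id = 1")
assert db.execute("SELECT COUNT(*) FROM order_lines").fetchone()[0] == 0   # child row cascaded away

# --- Mock repository: no schema, so no cascade unless you code it yourself ---
class MockOrderRepository:
    def __init__(self):
        self.orders, self.order_lines = {1: {}}, {10: {"order_id": 1}}

    def delete_order(self, order_id):
        del self.orders[order_id]               # child rows are silently left behind

mock = MockOrderRepository()
mock.delete_order(1)
assert len(mock.order_lines) == 1               # orphans the real database would have removed
```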
My preferred option is to maintain a master design-time database containing sample data (the same data you would otherwise build into your mocks). At the start of each test run, an automated script backs up MasterDB and restores it over "TestDB" (which all my tests use). That way I keep a repository of clean test data in the master database that is recreated for every test run, and my tests can manipulate the data and exercise whatever scenarios they need.
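A simplified sketch of that backup-and-restore step, assuming SQL Server and pyodbc; the connection string, backup path, and database names are placeholders rather than the actual script, and the target database is a parameter so the same helper can refresh TestDB or DevDB.

```python
import pyodbc

CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=localhost;Trusted_Connection=yes")   # placeholder connection string
BACKUP_PATH = r"C:\Backups\MasterDB.bak"                 # hypothetical path

def refresh_db(target_db: str) -> None:
    """Back up MasterDB and restore the copy over target_db (e.g. TestDB)."""
    # BACKUP/RESTORE cannot run inside a user transaction, hence autocommit=True.
    conn = pyodbc.connect(CONN_STR, autocommit=True)
    try:
        cur = conn.cursor()
        cur.execute(f"BACKUP DATABASE MasterDB TO DISK = '{BACKUP_PATH}' WITH INIT")
        # WITH REPLACE overwrites whatever is currently in the target database.
        # A restore under a different name usually also needs WITH MOVE clauses
        # for the data and log files, omitted here for brevity.
        cur.execute(f"RESTORE DATABASE {target_db} FROM DISK = '{BACKUP_PATH}' WITH REPLACE")
    finally:
        conn.close()
```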
When I am debugging the application, another script backs up the master database and restores it over the DEV database. I can play around with the data there without worrying about losing it. I do not usually run that script every session because of the wait while the database restores; I run it once a day and then play with/debug the application throughout the day. If, for example, I delete all the records from a table as part of my debugging, I just run the script to recreate DevDB when I am done.
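The DEV refresh is the same operation pointed at a different target, so a tiny command-line wrapper around the hypothetical refresh_db helper above (module name assumed) is enough to run it by hand once a day.

```python
import argparse
from db_refresh import refresh_db   # the helper sketched above; module name is assumed

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Restore a fresh copy of MasterDB")
    parser.add_argument("target", nargs="?", default="DevDB",
                        help="database to overwrite, e.g. DevDB or TestDB")
    refresh_db(parser.parse_args().target)   # e.g. run 'python refresh.py DevDB' after a destructive debug session
```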
These steps sound as though they would add a tremendous amount of time to the process, but in practice they do not. Our application currently has in the region of 3,500 tests, with about 3,000 of them hitting the database at some point. Backing up and restoring the database usually takes about 10-12 seconds at the start of each test run, and since the full suite is only executed on a TFS check-in, we do not mind waiting a little longer. On average, the entire test suite takes about 15-20 minutes to run each day.
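For the "once at the start of each test run" part, a session-scoped fixture is one way to wire the restore in; the original setup is driven by TFS check-ins, so this pytest version is only an illustration, again using the hypothetical refresh_db helper.

```python
import pytest
from db_refresh import refresh_db   # hypothetical helper from the sketch above

@pytest.fixture(scope="session", autouse=True)
def fresh_test_database():
    refresh_db("TestDB")   # ~10-12 seconds of backup/restore, once per test run
    yield                  # every test then starts from the same clean master data
```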
I appreciate and agree that integration testing is much slower than unit testing (because of the inherent need to hit a real database), but it represents the real-world application far more closely. For example, mock repositories never return DB error codes, never time out, never deadlock, never run out of disk space, and so on.
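As a hypothetical contrast (class and function names made up; the error behaviour in the comments refers to standard SQL Server/pyodbc failure modes), the mock can only fail in ways you explicitly program into it:

```python
class MockCustomerRepository:
    def get(self, customer_id):
        # Never times out, never deadlocks, never runs out of disk space.
        return {"id": customer_id, "name": "stub"}

def get_customer(cursor, customer_id):
    # Against a real database this same call can raise a driver exception such as
    # pyodbc.OperationalError on a query timeout, or an error carrying SQL Server
    # code 1205 (deadlock victim) - failure paths mock-backed unit tests never exercise.
    cursor.execute("SELECT id, name FROM customers WHERE id = ?", customer_id)
    return cursor.fetchone()
```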
Unit tests are great for simple calculations, basic business rules, and so on, and they are certainly the best choice for most operations that do not touch the database (or some other resource). But I do not think they are as valuable as integration tests; people talk a lot about unit tests and say very little about integration tests.
I expect the passionate unit testers to send me flames for this. That is fine; I am just trying to bring some balance and to remind people that projects full of passing unit tests can still fail the moment you deploy them into the real world.