Unit tests with a large set of test data

Recently, I wrote a set of unit tests that rely on a large set of test data. The set contained twelve elements, and although that does not sound like much, it turned out to be a lot when used in the tests.

Each element needed several properties set to unique values. The problem was that the factory method that created this dataset was huge.

What are the best practices regarding this issue? My application actually reads the data from a file, but for the tests I used mock data held in memory.
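To make the situation concrete, here is a simplified sketch (the class and member names are illustrative, not the real code):

    using System.Collections.Generic;

    public class Element
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public int Size { get; set; }
        // ...several more properties, each needing a unique value...
    }

    // Production code reads elements from a file; the tests swap in an
    // in-memory source instead.
    public interface IElementSource
    {
        IList<Element> ReadElements();
    }

    public class InMemoryElementSource : IElementSource
    {
        private readonly IList<Element> _elements;

        public InMemoryElementSource(IList<Element> elements)
        {
            _elements = elements;
        }

        public IList<Element> ReadElements()
        {
            return _elements;
        }
    }

    // The pain point: a factory method that spells out all twelve elements by hand.
    public static class TestData
    {
        public static IList<Element> CreateElements()
        {
            return new List<Element>
            {
                new Element { Id = 1, Name = "First", Size = 10 },
                new Element { Id = 2, Name = "Second", Size = 20 },
                // ...ten more initializers like this...
            };
        }
    }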

Any tips?

+5
6 answers

What do your tests look like?

What do they do with the data, and does every test really need all of it? A unit test should exercise one small unit of behaviour in isolation, with only as much data as that behaviour requires (and the rest stubbed out, e.g. by mocking).

Try splitting the data up per test. Most tests probably only care about a few of the twelve elements. If each test creates just the elements it actually uses, the one huge factory method disappears, and it becomes obvious which data each test depends on.

If the data is essentially tabular anyway, you could keep it in a CSV file instead of in code:

  • check the CSV file in alongside the tests
  • write a small loader that parses the CSV and builds the objects (a sketch follows below)
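A minimal sketch of such a loader, assuming a hypothetical Element class with Id, Name and Size properties and a simple "id,name,size" column layout (no quoting or escaping handled):

    using System.Collections.Generic;
    using System.IO;
    using System.Linq;

    public static class CsvTestData
    {
        // Builds one Element per data row of a simple "id,name,size" CSV file.
        public static List<Element> Load(string path)
        {
            return File.ReadAllLines(path)
                .Skip(1)                                // skip the header row
                .Where(line => line.Trim().Length > 0)  // ignore blank lines
                .Select(line => line.Split(','))
                .Select(fields => new Element
                {
                    Id = int.Parse(fields[0]),
                    Name = fields[1],
                    Size = int.Parse(fields[2])
                })
                .ToList();
        }
    }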

Alternatively, give every element sensible default values in a factory or builder, and let each test override only the properties that matter to it (name, size, date and so on). That way a test states only what it actually cares about, the setup noise disappears, and the factory method stops growing every time you add a property.
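One way to express that is a small test data builder; this is a sketch with illustrative defaults, reusing the hypothetical Element class from the question:

    public class ElementBuilder
    {
        // Sensible defaults; a test overrides only the values it cares about.
        private int _id = 1;
        private string _name = "default name";
        private int _size = 10;

        public ElementBuilder WithId(int id) { _id = id; return this; }
        public ElementBuilder WithName(string name) { _name = name; return this; }
        public ElementBuilder WithSize(int size) { _size = size; return this; }

        public Element Build()
        {
            return new Element { Id = _id, Name = _name, Size = _size };
        }
    }

    // In a test, the one value that matters stands out:
    // var oversized = new ElementBuilder().WithSize(9999).Build();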

+8

Could you generate the data programmatically? If the twelve elements follow some pattern, a simple loop or helper can build them, which shrinks the factory method to a few lines.

0

Separate the data from the tests.

Put the input data and the expected results into their own files (for example, test resources) and load them in the fixture setup, rather than building everything inline.

Extract the creation of individual elements into small, well-named helper methods (one helper per kind of object you need).

That keeps each individual test short and keeps the construction logic in one place (a sketch of the helper-method idea follows).
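The helper-method idea, sketched with illustrative names and values:

    // Well-named factory methods instead of one giant dataset method;
    // each test asks for exactly the kind of element it needs.
    public static class Elements
    {
        public static Element Typical()
        {
            return new Element { Id = 1, Name = "typical", Size = 10 };
        }

        public static Element Oversized()
        {
            return new Element { Id = 2, Name = "oversized", Size = 9999 };
        }

        public static Element Nameless()
        {
            return new Element { Id = 3, Name = "", Size = 10 };
        }
    }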

0

Get a copy of xUnit Test Patterns: Refactoring Test Code by Gerard Meszaros.

It deals with exactly this problem:

In short, you want a Minimal Fixture: each test sets up only the data it actually needs. Prefer a Fresh Fixture, built anew for every test, over a Shared Fixture; if you must use a Shared Fixture, patterns such as Make Resource Unique and Database Sandbox keep tests from interfering with each other. (A big anonymous dataset also leads to the "Obscure Test" smell: looking at a test, you cannot tell which parts of the data it actually depends on!)
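As an illustration of a fresh, minimal fixture, here is a test that builds only the two elements it asserts against (SizeFilter is a hypothetical class under test, and NUnit is assumed as the test framework):

    using System.Collections.Generic;
    using System.Linq;
    using NUnit.Framework;

    // Hypothetical class under test.
    public static class SizeFilter
    {
        public static List<Element> AboveThreshold(IEnumerable<Element> elements,
                                                   int threshold)
        {
            return elements.Where(e => e.Size > threshold).ToList();
        }
    }

    [TestFixture]
    public class SizeFilterTests
    {
        [Test]
        public void AboveThreshold_KeepsOnlyLargeElements()
        {
            // Fresh, minimal fixture: only the data this test actually needs.
            var small = new Element { Id = 1, Name = "small", Size = 1 };
            var large = new Element { Id = 2, Name = "large", Size = 100 };

            var result = SizeFilter.AboveThreshold(new[] { small, large }, 50);

            Assert.AreEqual(1, result.Count);
            Assert.AreEqual("large", result[0].Name);
        }
    }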

0

Why not use a real file?

Since the application reads the data from a file anyway, check a small, representative file into the test project and read it in the tests.

That keeps the tests close to the production code path, and changing the test data means editing a file rather than code.

0

If your question is about creating a large set of test data, you can use a library like NBuilder. We used NBuilder to generate large sets of test data. It offers a very fluent interface and is very easy to use. You can find a small demo of it here.
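A small sketch of what that can look like, assuming a hypothetical Element class with settable properties (NBuilder fills properties with sequential values automatically, and With overrides them):

    using System.Collections.Generic;
    using FizzWare.NBuilder;

    public class NBuilderDemo
    {
        public static IList<Element> CreateElements()
        {
            // Twelve elements with auto-generated unique property values;
            // only the deviations from the defaults are spelled out.
            return Builder<Element>.CreateListOfSize(12)
                .All()
                    .With(e => e.Size = 10)
                .TheFirst(1)
                    .With(e => e.Name = "special")
                .Build();
        }
    }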

0