Finding Failure Patterns in Unit Tests

I'm new to unit testing and have just gotten into the routine of creating test suites. I'm starting what will be a fairly large project, and I want to write tests from the very beginning.

I am trying to figure out common strategies and patterns for creating test suites. When you look at a class, many tests suggest themselves simply from the nature of the class. Say, for a "user account" class with basic CRUD operations against a database table, we want to check, well, CRUD:

  • create an object and verify that it exists
  • read its properties
  • change some properties
  • change some properties to invalid values
  • and delete it again.
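The checklist above could be sketched as plain-PHP assertions. `UserAccount` here is a hypothetical in-memory stand-in for a real database-backed class, and in a real suite each step would be a PHPUnit test method rather than a bare `assert()`:

```php
<?php
// Hypothetical in-memory stand-in for a database-backed "user account" class.
class UserAccount
{
    private static array $rows = [];
    private static int $nextId = 1;

    public static function create(string $name): int
    {
        $id = self::$nextId++;
        self::$rows[$id] = ['name' => $name];
        return $id;
    }

    public static function find(int $id): ?array
    {
        return self::$rows[$id] ?? null;
    }

    public static function update(int $id, string $name): void
    {
        if (!isset(self::$rows[$id])) {
            throw new InvalidArgumentException("no such account: $id");
        }
        if ($name === '') {
            throw new InvalidArgumentException('name must not be empty');
        }
        self::$rows[$id]['name'] = $name;
    }

    public static function delete(int $id): void
    {
        unset(self::$rows[$id]);
    }
}

// The five checks from the list above:
$id = UserAccount::create('alice');                  // create an object ...
assert(UserAccount::find($id) !== null);             // ... and verify it exists
assert(UserAccount::find($id)['name'] === 'alice');  // read its properties
UserAccount::update($id, 'bob');                     // change a property
assert(UserAccount::find($id)['name'] === 'bob');

$rejected = false;                                   // change to an invalid value
try {
    UserAccount::update($id, '');
} catch (InvalidArgumentException $e) {
    $rejected = true;
}
assert($rejected);

UserAccount::delete($id);                            // and delete it again
assert(UserAccount::find($id) === null);
```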

Regarding how to break things, there are error tests common to most CRUD classes, for example:

  • Invalid input types
  • A numeric identifier that exceeds the range of the selected data type
  • Incorrect character encoding
  • Input that is too long

And so on and so forth.
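A couple of these generic error cases, sketched as plain-PHP checks. The `validateId` guard is hypothetical, assuming an unsigned 32-bit INT identifier column:

```php
<?php
// Hypothetical guard for a numeric identifier column (unsigned 32-bit INT).
function validateId(mixed $id): bool
{
    if (!is_int($id)) {                  // invalid input type
        return false;
    }
    return $id > 0 && $id <= 4294967295; // range of an unsigned 32-bit INT
}

assert(validateId(42) === true);
assert(validateId('42') === false);        // string, not int: invalid type
assert(validateId(-1) === false);          // below range
assert(validateId(PHP_INT_MAX) === false); // exceeds the column's data type
```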

For a unit test around file operations, a list of "breaking things" might be:

  • Invalid characters in the file name
  • A file name that is too long
  • A file name that uses the wrong protocol or path
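These repetitive failure cases map naturally onto PHPUnit's data providers (`@dataProvider`), which also speaks to the automation question below. Here is a plain-PHP sketch of the same table-driven idea, with a hypothetical `validateFilename` helper:

```php
<?php
// Hypothetical validator covering the three failure cases above.
function validateFilename(string $name): bool
{
    if ($name === '' || strlen($name) > 255) {           // empty or too long
        return false;
    }
    if (preg_match('#[/\\\\:*?"<>|]#', $name)) {         // invalid characters
        return false;
    }
    if (preg_match('#^[a-z][a-z0-9+.-]*://#i', $name)) { // URL scheme, not a file
        return false;
    }
    return true;
}

// Table of cases, as a PHPUnit data provider would supply them.
$cases = [
    ['notes.txt',           true],
    ['bad/name.txt',        false],  // invalid character
    [str_repeat('a', 300),  false],  // name too long
    ['ftp://host/file.txt', false],  // wrong protocol
];
foreach ($cases as [$name, $expected]) {
    assert(validateFilename($name) === $expected);
}
```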

I am sure that similar patterns can be found for most units under test, beyond the working examples above.

Now my question is:

  • Do I see these "failure patterns" correctly? Or am I getting something completely wrong about unit testing, such that if I did it right this would not be an issue at all? Is unit testing essentially the process of finding as many conditions as possible under which a unit operates correctly?

  • If I am right: are there existing definitions, lists, or cheat sheets for such patterns?

  • Are there any facilities (mainly in PHPUnit, or in the framework I work with) to automate such patterns?

  • Is there any help, in the form of checklists or software, for writing complete tests?

+7
php unit-testing phpunit
1 answer

You are basically right. Looking for ways to break your code is a key part of the unit testing skill set. However, unit testing as used in TDD works a little differently. In TDD, you first write a test for a piece of new functionality, and only then write the code that makes this test pass. The emphasis is therefore different, although the end result is similar.

In TDD, you constantly "change hats": a little testing, a little coding. In this method, testing is thus not a mechanical afterthought but, one could say, a key part of the creative process. While writing tests, you also design the interface of your unit and think from the point of view of its (future) clients: what can they expect, and what must they provide? Then you switch hats and go inside the unit to fulfill those expectations.
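The hat-switching can be illustrated with a tiny red-green cycle. `slugify` is a made-up example function, not anything from the question:

```php
<?php
// "Red" hat: write the expectation first. At this point in a real TDD
// session the function does not exist yet, so this line would fail:
// assert(slugify('Hello World!') === 'hello-world');

// "Green" hat: write just enough code to satisfy the expectation.
function slugify(string $text): string
{
    $text = strtolower($text);
    $text = preg_replace('/[^a-z0-9]+/', '-', $text); // runs of non-alphanumerics -> '-'
    return trim($text, '-');
}

// Back to the testing hat: the expectation now passes.
assert(slugify('Hello World!') === 'hello-world');
```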

Therefore, I do not think this can be replaced simply by ticking off items on a list. Of course, once you run out of ideas for the unit at hand, it never hurts to consult such a checklist. However, such checklists by their nature can contain only generalizations, which may or may not apply to a specific project and a specific class under test. But you presumably have the experience and wits to find good test cases for your particular units :-)

+4
