The #30daysoftesting challenge is almost complete: today is day 29, and the challenge is:
Share how you manage test data in your automated checks
- First and foremost: test data should be separated from your test framework. As mentioned in yesterday's challenge, where a developer on my team reviewed my automation code, I have not followed that principle consistently myself, but there always has to be room for improvement, so no worries there!
- The test data is created automatically with a script and in most cases managed in external files, so that end-users can also manage it easily if needed. It also allows me to copy-paste new test data directly from e-mails more easily.
- The test data is cleaned up after use, so that tests always start in exactly the same state when I run them again.
- Masking test data that originally comes from a production environment is more important than ever now that the GDPR is in force.
- Refreshing test data, to ensure it stays relevant. For example: the system under test launched in the United Kingdom, but a year later it is also active in France and Germany. I want my test data to reflect this new situation and cover the added test cases this change introduces. (In my current project these possible changes are in the broadcasting world: refreshed data can be needed when the company launches a new television channel, or stops or sells an existing one.)
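To make the first few points a bit more concrete, here is a minimal sketch of what "external test data plus cleanup" can look like in Python. All the names and the JSON file layout are made up for illustration; `run_check` stands in for whatever your automated check actually does against the system under test.

```python
import json
import tempfile
from pathlib import Path


def load_test_data(path: Path) -> list:
    """Read test records from an external JSON file that end-users can edit."""
    return json.loads(path.read_text())


def run_check(record: dict, created: list) -> None:
    """Placeholder for a check that creates data in the system under test;
    here we only remember what was created so we can clean it up later."""
    created.append(record["id"])


def cleanup(created: list) -> None:
    """Delete everything the checks created, so the next run starts clean."""
    created.clear()


# Demo: write a data file the way an end-user might, then run and clean up.
data_file = Path(tempfile.mkdtemp()) / "customers.json"
data_file.write_text(json.dumps([
    {"id": 1, "country": "UK"},
    {"id": 2, "country": "FR"},
]))

created = []
try:
    for record in load_test_data(data_file):
        run_check(record, created)
finally:
    cleanup(created)  # runs even if a check fails, keeping the state reproducible
```

Keeping the data in a plain JSON (or CSV) file means non-developers can add a record without touching the framework, and the `try`/`finally` guarantees the cleanup runs even when a check blows up halfway.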
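The masking point can also be sketched briefly. One common approach (assumed here, not necessarily what your tooling does) is to replace personal fields with a stable pseudonym derived from a hash, so checks that compare or join on those fields still behave consistently while no real personal data leaves production:

```python
import hashlib


def mask_record(record: dict, sensitive=("name", "email")) -> dict:
    """Replace personal fields with a stable pseudonym.

    The same input always maps to the same pseudonym, so relationships
    between records survive masking, but the original value does not.
    """
    masked = dict(record)
    for field in sensitive:
        if field in masked:
            digest = hashlib.sha256(masked[field].encode()).hexdigest()[:8]
            masked[field] = f"masked-{digest}"
    return masked


prod_record = {"id": 42, "name": "Jane Doe", "email": "jane@example.com"}
print(mask_record(prod_record))
```

Note that truncated hashes like this are pseudonymisation, not anonymisation: under the GDPR that still counts as personal data if the original records exist somewhere, so it reduces risk rather than removing it.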
There are plenty more things related to test data management. If you have any suggestions, let me know!
Happy testing 🙂