How I automate tests (or try to) within the Sprint
Portuguese version on the Concrete Blog and on my LinkedIn. If you like it, come back and clap here or leave a like on LinkedIn! And comment too!
When attending Meetups and talking to QAs from other companies, the difficulty of implementing test automation within the Sprint always comes up. So I decided to tell how my experience with test automation at Concrete went (and is still going). Note that this is my recipe, and it worked for two projects I went through. Consider whether it fits your case, ok?
For the method to work, some background is needed, based on conversations and readings about test automation and on my experience at Concrete.
In the most successful cases cited in those conversations, the company (or at least some management positions) takes up the fight to pursue agility regardless of the framework (whether Scrum, Kanban or another). In these cases, if the team fails, it is encouraged to try again, not punished for the mistake.
This fight also extends to the stakeholder who wants to “waterfall” the process. If your company complies with the stakeholder’s decisions on how the team should work, so as not to lose the sale, the chances of failure are high.
Here we will consider a team to be a group of people working together toward a common goal that adds value to the product.
The team must share information, ideas, concerns, successes and failures, working as a single entity.
A team differs from a work group in the way people behave. In a work group, each person does their own job without thinking about how it can affect the others. Did the Sprint fail? “I delivered my work on time, I have nothing to do with it.” Or worse: “the one who didn’t get the job done was that person over there,” pointing the finger.
I’ll add one more category: the “team-ish”. The team-ish knows how to work as a group, but only does what it is assigned to do. They may even think of improvements, but if the goal was achieved, why do more?
If you’re in a work group, you’ll hardly be able to automate during the Sprint. In the other cases, the likelihood is high. And if you’re in a team-ish, why not try to turn it into a real team? Share your integration ideas in the retrospective, or invite the team to chat after a daily, since you will already be together.
Automating a feature that does not exist yet takes a little imagination. If you already have the screen designs, it becomes easier; if you don’t, imagine the flow. At first, your code will automate actions like filling in a field or clicking a button, regardless of its position on the screen, its color or its shape.
“But I don’t know the IDs!!!”, you say. Here are two solutions I use:
- Leave the field blank or with some indication to complete later, or
- Create a variable that allows key-value construction (a hash, hashmap or dict, for example). Fill in the field with variable_name[key]. Then just complete the second member of the equality with the values you discover along the way.
Using this method, I have already finished my code alongside the Dev! Then it was just a matter of mapping the IDs and making small adjustments, and the functionality was already automated.
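To make the second solution concrete, here is a minimal Ruby sketch of the key-value idea (the `ids` hash and its keys are my own illustration, not the original project’s code). The hash starts out returning placeholders, so the automation code can reference `ids[:email]` from day one and only the hash values are completed once the real screen exists:

```ruby
# Hypothetical sketch: selectors live in a hash before the screen exists,
# so the test code can be finished ahead of the implementation.
ids = Hash.new { |_, key| "TODO-#{key}" } # placeholder until the real ID arrives

# Automation code references ids[:email] etc. from the start...
email_selector    = ids[:email]     # => "TODO-email"
password_selector = ids[:password]  # => "TODO-password"

# ...and when Dev delivers the screen, only the hash is completed:
ids[:email]    = '#login_email'
ids[:password] = '#login_password'

puts ids[:email]    # => "#login_email"
puts ids[:button]   # => "TODO-button" (this ID is still unknown)
```

The default block means any unmapped key stays visibly marked as pending instead of breaking the code.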
It all starts at the planning meeting, where we decide the sprint backlog. Let’s assume that the goal of the sprint is to implement login (it’s always login as the example…).
Write down all the possible scenarios for a login. Before starting to automate them all, define which will be unit/instrumented tests, which will be service/API tests, and which will cover the front end.
Divide the responsibilities and automate all the “happy paths” first, those in which everything is done perfectly. There is no point in a feature that does not perform its function even when the user does everything right.
Going back to our example, write out all the possible successful login scenarios, for example:
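The original post showed this list as an image; as an illustration (the scenario names are my own, not the original backlog’s), a happy-path feature file might look like this:

```gherkin
Feature: Login

  Scenario: Login with e-mail and password
    Given I am on the login screen
    When I log in with valid "e-mail" credentials
    Then I should see the home screen

  Scenario: Login with Facebook
    Given I am on the login screen
    When I log in with valid "Facebook" credentials
    Then I should see the home screen

  Scenario: Login with Google
    Given I am on the login screen
    When I log in with valid "Google" credentials
    Then I should see the home screen
```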
Write down all the cases your system will support.
Implement the scenarios
Here is where the imagination comes in. The scenarios will be implemented with nothing but the login screen design.
My steps file, still without the implementation, looks like this:
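Since the original screenshot is not reproduced here, this is a minimal sketch of what such a steps file might look like. I include a tiny stand-in for the Cucumber DSL so the snippet runs on its own; in a real project, `Given`/`When`/`Then` come from the cucumber gem:

```ruby
# Stand-in for the Cucumber DSL, only so this sketch is self-contained.
STEPS = {}
def Given(pattern, &block)
  STEPS[pattern] = block
end

def When(pattern, &block)
  STEPS[pattern] = block
end

def Then(pattern, &block)
  STEPS[pattern] = block
end

# Three reusable steps, still without implementation:
Given(/^I am on the login screen$/) do
  pending # waiting for the screen design
end

When(/^I log in with valid "([^"]*)" credentials$/) do |provider|
  pending
end

Then(/^I should see the home screen$/) do
  pending
end

puts STEPS.size # => 3
```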
Note that even though my feature file has several scenarios, there are only three steps to implement. When writing anything in your code, it is very important to think about reuse. After implementing “with my imagination”, the steps will look like this:
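Again, a hedged sketch rather than the original screenshot: the three steps now delegate to a page object. Names like `login_page`, `fill_login_data` and `home_page` are my guesses at the structure, not the original code, and the same self-contained stand-in replaces the cucumber gem’s DSL:

```ruby
# Stand-in for the Cucumber DSL, only so this sketch is self-contained.
STEPS = {}
def Given(pattern, &block)
  STEPS[pattern] = block
end

def When(pattern, &block)
  STEPS[pattern] = block
end

def Then(pattern, &block)
  STEPS[pattern] = block
end

Given(/^I am on the login screen$/) do
  login_page.load                      # navigates to the (future) login URL
end

When(/^I log in with valid "([^"]*)" credentials$/) do |provider|
  login_page.fill_login_data(provider) # fills fields found via the ids hash
  login_page.login_button.click
end

Then(/^I should see the home screen$/) do
  expect(home_page).to be_displayed    # SitePrism/RSpec-style assertion
end

puts STEPS.size # => 3
```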
And the page object file, with its variables and methods, using SitePrism and Capybara:
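The page object was also an image in the original post; here is a plain-Ruby sketch of the same idea. In the real project the class would inherit from `SitePrism::Page` and declare elements with Capybara selectors (for example `element :email_field, ids[:email]`); the hash below holds placeholder values until the real IDs are mapped:

```ruby
# Plain-Ruby stand-in for a SitePrism page object. The IDS hash starts
# with placeholders and is completed as the real screen IDs are discovered.
class LoginPage
  IDS = {
    email:        '#TODO-email',        # placeholder: real ID not known yet
    password:     '#TODO-password',
    login_button: '#TODO-login-button'
  }

  # In SitePrism/Capybara this method would call fill_in on each element;
  # here it just returns which selector receives which value.
  def fill_login_data(email, password)
    { IDS[:email] => email, IDS[:password] => password }
  end

  def login_button
    IDS[:login_button]
  end
end

page = LoginPage.new
puts page.fill_login_data('user@test.com', 'secret')
puts page.login_button # => "#TODO-login-button"
```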
Now just wait for the login screen to be ready, complete the code, and the feature is automated! Said no one ever…
Even if you fill everything in correctly, the chances that you will make some mistake in the first (second, third…) run are huge. The example above serves only as a starting point. Fix the errors as they appear in the log. Even with these errors, it is easier and quicker to make adjustments than to wait for the functionality to be fully ready before writing any code. The errors will decrease over time.
Also, just because you’ve finished a feature doesn’t mean you get to sit and wait. In an agile team, there is always something to do. Talk with whoever is responsible for DevOps and see how the CI/CD pipeline is doing, check with the developers on how the application is coming along, discuss the backlog and usability. Interact with the team, suggest improvements, point out problems (ideally already with solutions). You are responsible for product quality, not just for automating tests.
Don't forget exploratory manual testing
Always take some time to use your product. Only then will you have full knowledge of how it is working. Don't automate only the requirements described in the stories. Many issues, and even candidates for new automation, will emerge as you use the application. Automated and manual tests are complementary, not mutually exclusive.
I would even dare to say that today, with the great demand for automation, QAs should “step back” a little and also do manual testing.
I hope you got the general idea of this text and can apply it in the context of your company.
And don't forget to come back and tell how the experience went! If you have a question or want to share your own story, leave a comment below!
References
- Samanta Cicília and Frederico Moreira’s podcasts (in Portuguese — highly recommended!)
- Book Scrum: The Art of Doing Twice the Work in Half the Time
- Book Agile Testing: A Practical Guide for Testers and Agile Teams
- Book More Agile Testing: Learning Journeys for the Whole Team