When new products are launched, the disconnect between business professionals and engineers often results in wasted time and resources. A strategy for improving communication can help prevent bottlenecks to the project’s progress.
When business managers understand the capabilities of the engineering team, and when engineers understand what the business requires from its software, both sides can work together to create applications with real business value. That’s where behavior-driven development (BDD) comes in.
You probably have many questions about how BDD can help your teams become more engaged and focused on business outcomes without wasting time on endless meetings or unmaintainable test scripts.
In our very first session of Voices of Community, we had special guest John Ferguson Smart, Founder at Serenity BDD, who teamed up with Manoj Kumar, VP of Developer Relations at LambdaTest, to discuss how to avoid common BDD pitfalls and three simple steps for embedding effective BDD practices into your teams.
If you missed this power-packed webinar, here’s a look at its major highlights.
About the Webinar
The webinar starts with John highlighting the red flags to watch out for when implementing BDD and how to avoid common pitfalls. He claims that around 95% of software teams use BDD the wrong way.
John then outlines the agenda for the webinar, which is as follows:
How you write User Stories can hold back your team (and the biggest single mistake teams make).
How almost everyone using Cucumber and BDD for automation testing is doing it WRONG.
Three steps to embedding effective BDD and test automation practices into your teams.
After this, John explains the major BDD pitfalls faced by software teams.
Pitfall #1 — User Story Writing
John explains that the BDD trap comes from misunderstanding what user stories are about. He urges developers to ask themselves why their user stories might be holding them back, and breaks this down into questions teams should address when using BDD:
Do you struggle to break down and organize your requirements?
Do stories take a long time to prepare?
Do “complete” user stories end up needing rework and fixes?
According to John, the most significant indicator of dysfunctional user stories is the use of “Given, When, and Then” in the initial story descriptions. Doing so breaks the agile development flow and indicates that the team treats user stories like old-school requirements documents. It also suggests that the team doesn’t really understand how “Given, When, and Then” is meant to be used when writing user stories.
John gives an example of a typical dysfunctional user story to prove his point. You can see it in the screenshot below.
With the help of this example, John explains that when teams write definitive acceptance criteria up front, they give the impression that the work is already done, but this practice cuts off the conversation.
John suggests that acceptance criteria should be expressed as bullet points or illustrated with examples, rather than stated in a definitive manner.
Pitfall #2 — Asking Product Owners or Business Analysts to Write User Stories in Gherkin
He says product owners and business analysts are generally not trained to write user stories in Gherkin. Asking them to express their business requirements in the Given-When-Then format puts an unnecessary burden on them and distracts them from capturing the actual requirements.
It also slows them down: they take longer to write requirements that end up being of much poorer quality. And it cuts out the conversation, so the team never gets a chance to discuss the requirements and flesh out the details.
John then explains a long-standing concept called Card, Conversation, and Confirmation, introduced by Ron Jeffries.
With the help of various examples, John teaches the audience how to write user stories in plain English.
John then highlights the most effective “Conversation” techniques that teams can implement. They are as follows:
We can record the conversation in a tabular format; writing tables on a physical or virtual whiteboard works well.
We can do example mapping, where we flesh out the business rules and come up with examples and counterexamples. Example mapping is an excellent technique because you quickly get a lot of breadth and surface the edge cases.
We can also do feature mapping. Feature mapping is all about understanding user journeys and user flows and mapping out variations of the flows.
John then explains how teams can handle the “Confirmation” aspect to arrive at correct executable specifications. He suggests four steps:
Formalize the acceptance criteria.
Ratify the acceptance criteria.
Automate the acceptance criteria (a sketch of this step follows the list).
Demonstrate the acceptance criteria.
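To make the automation step more concrete, here is a minimal sketch of what a formalized acceptance criterion can look like once it is wired up as an executable specification with Cucumber-JVM. The scenario, domain, and class names are our own hypothetical illustration, not taken from the webinar:

```java
// A hypothetical acceptance criterion, formalized as a Gherkin scenario:
//
//   Scenario: Earn standard points on a completed purchase
//     Given Priya is a standard loyalty member
//     When she completes a purchase of 100 euros
//     Then she should earn 10 loyalty points

import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;

import static org.junit.jupiter.api.Assertions.assertEquals;

public class LoyaltyPointsStepDefinitions {

    // Minimal in-memory stand-in for the application under test, so the
    // sketch is self-contained; a real project would call the real system.
    static class LoyaltyAccount {
        private int points;
        void recordPurchase(int euros) { points += euros / 10; }
        int points() { return points; }
    }

    private LoyaltyAccount account;

    @Given("Priya is a standard loyalty member")
    public void aStandardLoyaltyMember() {
        account = new LoyaltyAccount();
    }

    @When("she completes a purchase of {int} euros")
    public void sheCompletesAPurchaseOfEuros(int euros) {
        account.recordPurchase(euros);
    }

    @Then("she should earn {int} loyalty points")
    public void sheShouldEarnLoyaltyPoints(int expectedPoints) {
        assertEquals(expectedPoints, account.points());
    }
}
```

The value here is less in the code itself and more in the fact that the automated check and the agreed acceptance criterion are literally the same sentences.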
Pitfall #3 — BDD Test Scripts — Why isn’t your test automation delivering?
John goes on to explain why BDD test scripts often fail to deliver effectively. He lists the following reasons:
Test scripts might be flaky or brittle.
Teams struggle to finish automation within each sprint.
Tests don’t give a clear picture of progress or coverage.
John then explains what effective test automation looks like. According to him, it should do the following:
Give fast, actionable feedback when something goes wrong: you should know what went wrong, why, and how to fix it.
Report meaningful, business-related progress by showing what you have delivered in business terms.
Be stable and trustworthy.
Be completed within the sprint.
John states that “using Cucumber to automate test scripts misses the benefits of both BDD and Agile Test Automation.” He explains this statement with an example.
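John’s own example isn’t reproduced here, but a hedged illustration of the anti-pattern he is describing might look like the sketch below: Cucumber used purely as a scripting layer, where the steps (and the Gherkin behind them) talk about element IDs and clicks rather than business behaviour. The selectors and step text are hypothetical.

```java
// Anti-pattern sketch: the feature file reads like a UI script, e.g.
//   When I click the element with id "addToCart"
//   And I click the element with id "checkout"
// and the glue code below simply replays those clicks.

import io.cucumber.java.en.When;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class ScriptedClickSteps {

    private final WebDriver driver = new ChromeDriver();

    @When("I click the element with id {string}")
    public void iClickTheElementWithId(String id) {
        // Nobody on the business side can review this "specification",
        // and any UI change breaks every scenario that uses it, so you
        // get neither shared understanding nor maintainable automation.
        driver.findElement(By.id(id)).click();
    }
}
```

A more business-readable, layered alternative is sketched after the tips list further down.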
John then explains the BDD approach with the help of a flowchart.
Moving forward, John shares essential tips for teams to make BDD work for them effectively. These are as follows:
Find your BDD champions to grow your BDD practices organically and sustainably.
Discover your BDD dialect and learn how to express your requirements in a way that is tailored to your domain.
Build an automation framework that scales. Test automation should get easier as the suite grows (a rough sketch follows below).
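As a rough sketch of that last tip (our own illustration, with hypothetical class and step names, not code from the webinar): a framework tends to scale when thin step definitions delegate to a reusable layer of business-level actions, so each new scenario mostly recombines existing building blocks instead of adding more glue code.

```java
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;

import java.util.HashMap;
import java.util.Map;

import static org.junit.jupiter.api.Assertions.assertEquals;

public class OrderTrackingStepDefinitions {

    // Business-level action layer (hypothetical). The step definitions only
    // ever talk to this interface; whether it drives a browser, calls an
    // API, or (as here) uses an in-memory fake can change without touching
    // any scenario or glue code, which is what lets the suite keep growing.
    interface TrackOrders {
        void placeOrderFor(String item);
        String statusOf(String item);
    }

    static class InMemoryOrderTracking implements TrackOrders {
        private final Map<String, String> orders = new HashMap<>();
        public void placeOrderFor(String item) { orders.put(item, "CONFIRMED"); }
        public String statusOf(String item) { return orders.getOrDefault(item, "UNKNOWN"); }
    }

    private final TrackOrders orders = new InMemoryOrderTracking();

    @When("Priya orders {string}")
    public void priyaOrders(String item) {
        orders.placeOrderFor(item);
    }

    @Then("the order for {string} should be {string}")
    public void theOrderShouldBe(String item, String expectedStatus) {
        assertEquals(expectedStatus, orders.statusOf(item));
    }
}
```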
Q&A Session
Before wrapping up, John answered several questions the viewers raised. Here are some of the insightful questions from the session:
How can we use BDD in the performance testing phase? Is there any advantage of using it for performance testing?
For non-functional requirements, you can use the BDD discovery process to articulate them. The idea of finding concrete examples of non-functional requirements is essential. In the case of accessibility, for instance, you might come up with a concrete example of a user who is colorblind. Once you pin down the concrete use cases your accessibility requirement has to deal with, it suddenly becomes not non-functional but very functional indeed.
Many people use Cucumber for test automation to check bugs after the code has already been implemented. Is that BDD? What are your thoughts?
That’s not BDD; it has nothing to do with BDD. That’s simply using Cucumber as a (rather poor) test scripting tool, and it will cause you a lot of pain. Then again, writing test automation after the fact in that way isn’t great in any case, so it’s not the fault of Cucumber; that’s just not a great way to automate.
What do you recommend using for your parameters: literal strings or RegEx?
I don’t use RegEx very much anymore; I generally use Cucumber Expressions, as they are more powerful and customizable. Still, the RegEx concept underpins how your step definitions get triggered, so I’d recommend understanding it.
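To illustrate the difference with a hedged Cucumber-JVM sketch (the step wording below is hypothetical): the same kind of step can be bound either with a Cucumber Expression or with a regular expression, and the Cucumber Expression version is usually the more readable of the two.

```java
import io.cucumber.java.en.When;

public class ParameterStyleExamples {

    // Cucumber Expression: {int} and {string} are built-in parameter types,
    // and custom ones can be registered with @ParameterType if needed.
    @When("the customer orders {int} copies of {string}")
    public void ordersUsingACucumberExpression(int quantity, String title) {
        // left empty in this sketch
    }

    // Regular expression binding for a similar step: more flexible, but
    // noisier and harder for non-developers to read at a glance.
    @When("^the customer re-orders (\\d+) copies of \"([^\"]*)\"$")
    public void reordersUsingARegex(int quantity, String title) {
        // left empty in this sketch
    }
}
```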
How do we deal with negative scenarios that we need to cover? Do we need a separate automation suite for them, or do we keep these corner cases for exploratory testing during the release?
There are different approaches. My general approach is that you don’t need to include negative scenarios or edge cases unless the business is interested in them. If the business is interested in a negative scenario, it simply becomes another counterexample or example to capture.
Hope You Enjoyed The Webinar!
We hope you liked the webinar. In case you missed it, please find the webinar recording above. Make sure to share this webinar with anyone who wants to learn more about BDD pitfalls and how they can avoid them. Stay tuned for more exciting LambdaTest Webinars. You can also subscribe to our newsletter Coding Jag to stay on top of everything testing and more!
That’s all for now, happy testing!