We had an interesting session on the Screenplay Pattern this year at the Automation Guild Conference with John Ferguson Smart, creator of the Serenity framework. It created a lot of interest and conversation among Guild attendees and speakers alike; so much so that someone in the group recommended that I create a separate podcast dedicated to the topic … so here it is!
About John Smart
John Smart is an international speaker, consultant, author and trainer well known in the Agile community for his many books, articles and presentations, particularly in areas such as BDD, TDD, test automation, software craftsmanship and team collaboration. John is the author of the best-selling BDD in Action, as well as Jenkins: The Definitive Guide and Java Power Tools. He is also very active in the open-source community and leads development on the innovative Serenity BDD test automation library, described as the “best open source Selenium WebDriver framework”.
Quotes & Insights from this Test Talk
- The essence of what we're talking about here is not just writing automated tests that check various aspects of a website or an application. When we start to do something like Screenplay, we're actually changing our perspective on automated tests. Rather than just writing tests that verify whether the application works after the fact, which is the more traditional way of writing automated tests, we're moving towards a more proactive, more dynamic approach; we're using automated tests to help us focus on what is really important – to give feedback to the business, to give feedback to the team, and to enable us, empower us, to deliver applications and new features much faster. It is very much in line with behavior-driven development, DevOps, Lean Enterprise, and all of those good things. We're going to see how Screenplay helps you deliver on these promises.
- In a nutshell, what is Serenity BDD? It's an open-source test library with a very strong focus on – not so much test reporting, although obviously it does do test reporting – requirements reporting: what I like to call feature coverage, or release readiness. How do you know whether your application is actually ready to go into production? How do you know what features you've delivered, and how well they're tested? This is something that is not immediately obvious in a typical test report. We'll see what this means later on, and what it looks like.
- Another big issue we find is reusability. There's a huge maintenance cost in code that's hard to reuse, or is not reusable at all. In Serenity BDD, what we do is use reusable components, and not just at a technical level. We have page objects, but they're not the main focus. We take reusability to the next level up: we have reusable business components. So we compose tests using business concepts, not technical concepts. That makes a huge difference in your ability to scale, to write tests that will stand the test of time, to bring people on board and get them up to speed quickly, and to write tests for very large applications.
- Screenplay uses the idea of actors, tasks, and goals to express tests in business terms, rather than in terms of interactions with the system. In Screenplay, we describe tests in terms of an actor who has goals and performs tasks to achieve them.
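To make the actor/task idea concrete, here's a minimal, self-contained sketch of the structure John describes. This is not the Serenity BDD API; class and method names like Performable, PurchaseAnItem, and attemptsTo are invented purely for illustration.

```java
// Simplified sketch of the Screenplay structure: an actor attempts business-level
// tasks, and each task hides the lower-level steps needed to carry it out.
// None of these names come from Serenity BDD; they are illustrative only.
interface Performable {
    void performAs(Actor actor);
}

class Actor {
    private final String name;
    private Actor(String name) { this.name = name; }

    static Actor named(String name) { return new Actor(name); }

    // The actor attempts a sequence of business-level tasks.
    void attemptsTo(Performable... tasks) {
        for (Performable task : tasks) {
            task.performAs(this);
        }
    }

    @Override
    public String toString() { return name; }
}

// A reusable business-level task, composed of smaller steps rather than raw WebDriver calls.
class PurchaseAnItem implements Performable {
    private final String itemName;
    private PurchaseAnItem(String itemName) { this.itemName = itemName; }

    static PurchaseAnItem called(String itemName) { return new PurchaseAnItem(itemName); }

    @Override
    public void performAs(Actor actor) {
        // A real suite would delegate to lower-level tasks and interactions here
        // (search for the item, add it to the cart, check out).
        System.out.println(actor + " purchases " + itemName);
    }
}

public class ScreenplaySketch {
    public static void main(String[] args) {
        // The test reads in business terms: the actor's goal, not the clicks behind it.
        Actor dana = Actor.named("Dana");
        dana.attemptsTo(PurchaseAnItem.called("a red scarf"));
    }
}
```

The key point is that the test body reads as a business goal ("Dana purchases a red scarf"), while the browser-level details stay hidden inside reusable task classes.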
Resources
Connect with John Smart
- Twitter: @wakaleo
- Blog: johnfergusonsmart
May I Ask You For a Favor?
Thanks again for listening to the show. If it has helped you in any way, shape or form, please share it using the social media buttons you see on the page.
Additionally, reviews for the podcast on iTunes are extremely helpful and greatly appreciated! They do matter in the rankings of the show and I read each and every one of them.
Test Talks is sponsored by the fantastic folks at Sauce Labs. Try it for free today!
Seems like an interesting pattern. Something that builds on top of BDD to abstract things out even more. I always feel like the Page Object Pattern tests lack a good way to define the features and link to requirements. Therefore, I have always pondered whether moving to BDD would be more advantageous.
The Screenplay pattern definitely takes this to a whole new level and is something that I want to explore in detail. The reasoning behind why it exists makes sense. However, looking at the code presented by John Smart, right away I wonder if there is too much unnecessary complexity. This is just my thought process, not necessarily a conclusion. There are tons of classes strung together; it seems like every action is a class (a rough sketch of what that looks like follows below), so I almost wonder: where do all the objects live?
Furthermore, the barrier to entry just seems ginormous. Not only are we forced to learn all the tools required to run a test (an IDE, a test runner, Selenium…); now a user also needs to learn BDD, Serenity, and the Screenplay pattern.
I almost wonder if this is just unnecessary complexity introduced on top of BDD-style automation. I need to explore this further for sure, though.
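To picture what "every action is a class" means in practice, here's what a low-level interaction might look like, continuing the simplified (non-Serenity) sketch above; the ClickOn class and its methods are hypothetical.

```java
// Continuing the simplified sketch above: even a low-level interaction such as a
// click is its own small class, so it can be named, reused, and composed into
// larger business-level tasks. The names here are invented, not Serenity's API.
class ClickOn implements Performable {
    private final String target;  // a real suite would use a locator/Target object here
    private ClickOn(String target) { this.target = target; }

    static ClickOn the(String target) { return new ClickOn(target); }

    @Override
    public void performAs(Actor actor) {
        // A real implementation would drive the browser (e.g. via WebDriver).
        System.out.println(actor + " clicks on " + target);
    }
}
```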
I do love how easily the reports are created, though!
This is a very practical approach and definitely works. I have seen it work across 4 independent product teams where I was working as a lead. Though I have not used the Serenity framework, we adopted the same principles as enunciated here, using frameworks like SpecExplorer along with design concepts such as designing to interfaces and SOLID in the overall design.
The approach was delivering results right at the design discussion stage, as we involved the manual testers and business analysts in coming up with the DSL for each product; even before we started writing the automation implementation, the BAs and manual testers started using the DSL to write their product-specific scenarios. This gave us valuable feedback on the expressiveness of the DSL itself, and we refactored the keywords and their arguments accordingly.
The direct benefits of this approach were:
– Manual testers were able to write the automation scenarios themselves as they were involved in creating the DSL keywords
– We were able to seamlessly switch between multiple toolsets (Selenium, Coded UI and AutoItX) for the GUI-level implementation, as sketched after this list.
– As all the test scenarios were written based on actors and their operations, the automation DSL could be used by business analysts right at the discussion stage for new features, and once written, a scenario became immediately executable as an automated script. Gaps in new feature implementation could be filled by the framework team in much less time, without depending on the development team to deliver any working code.
– As the automation development progressed independently of the development team, automation requirements were incorporated as coding checklists for testability in the development code.
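Roughly, the designing-to-interfaces idea looked like this; the names below (GuiDriver, SeleniumGuiDriver, LogInAs) are just illustrative, not taken from any of the actual frameworks we used:

```java
// The DSL keywords talk only to this small interface, so the concrete GUI toolset
// (Selenium, Coded UI, AutoItX, ...) can be swapped out without touching the scenarios.
interface GuiDriver {
    void click(String elementId);
    void type(String elementId, String text);
}

// One possible implementation; another class could wrap a different toolset
// behind the same interface and the DSL keywords would not change.
class SeleniumGuiDriver implements GuiDriver {
    @Override
    public void click(String elementId) {
        // delegate to WebDriver here
        System.out.println("Selenium click on " + elementId);
    }

    @Override
    public void type(String elementId, String text) {
        System.out.println("Selenium type '" + text + "' into " + elementId);
    }
}

// A DSL keyword used by the scenarios; it only knows about the GuiDriver interface.
class LogInAs {
    private final GuiDriver driver;
    LogInAs(GuiDriver driver) { this.driver = driver; }

    void execute(String userName) {
        driver.type("userName", userName);
        driver.click("logInButton");
    }
}
```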