Want to share your experience & expertise at the upcoming Automation Guild '24?
The 8th Annual E2E Automation Testing Online Event Goes LIVE from Feb 5th to Feb 9th, 2024.
**HURRY! Call for Speakers ends Oct 6th, 2023.**
- AI in Testing/Automation: This topic appears multiple times, with mentions like "How AI could help", "AI-enabled tools", "AI in QA", "AI for test automation", "AI tooling", "AI in software automation testing", "AI role in automation", and more. It's evident that AI's role in testing and automation is a significant area of interest.
- Automation Frameworks/Tools: Many respondents mentioned specific tools or frameworks, such as "Selenium", "Appium", "Playwright", "pytest-playwright framework", "Robot Framework", "Cypress", and "Webdriverio framework".
- Performance Testing: This topic is mentioned in various contexts, such as "Performance Testing and Mobile Testing", "Performance testing", "Python, selenium, AI and ML", and "Performance and chaos testing".
- Advanced Testing Techniques: Topics like "A/B Testing", "Model Based Testing", "Contract Testing", and "Combinatorial Interactive Testing" indicate a desire to learn about more advanced testing methodologies.
- UI Automation Challenges: Many respondents mentioned issues related to UI automation, such as it taking too long, being unreliable, and the challenge of handling dynamic locator values. There's also mention of the flakiness of UI tests, especially in complex scenarios with multiple redirects.
- Maintenance and Test Data Management: Maintenance of existing automation and tests was a recurring theme. This includes the challenge of keeping scripts up-to-date with changing application features and dealing with framework updates. Managing and maintaining test data for various scenarios was also a significant concern.
- Integration and Mocking Challenges: Several respondents mentioned difficulties in mocking services in a distributed microservices environment, especially with tools like Wiremock. There were also mentions of challenges integrating various tools and services, such as Playwright with Azure Services and Appium with SauceLabs.
- Mobile Automation: Starting with and scaling up mobile automation was a concern for many. This includes challenges with specific tools like Appium and the complexities of mobile app hybrid automation.
- Learning and Upskilling: A lack of programming skills and the need to quickly learn new technologies and programming languages were frequently cited. This includes upskilling for specific tools like Cypress and understanding advanced coding concepts.
- AI and Advanced Testing Techniques: Many respondents expressed interest or challenges in integrating AI into their test automation processes. This includes leveraging generative AI techniques, using AI for test case generation, and keeping up with cutting-edge AI-based testing technologies.
- Organizational and Process Challenges: Getting buy-in from other team members and convincing stakeholders of the benefits of automation were significant concerns. Additionally, having a good process, balancing between manual and automated testing, and dealing with changing product roadmaps and unclear specifications were also mentioned.
- Cloud and Infrastructure Challenges: Several respondents mentioned difficulties related to cloud environments. This includes running automation test suites on the cloud without manual intervention, dockerizing existing Selenium scripts, and challenges with specific cloud platforms like Azure. There's also mention of the need for solutions that can be hosted on internal networks and not just cloud-based.
- Advanced Testing Scenarios: There were mentions of specific, advanced testing scenarios that respondents found challenging. This includes testing data science projects, mechatronics in vehicle autopilot, security testing, in-sprint automation, and testing integrations with tools like Adobe Analytics. The diversity of these challenges underscores the wide range of applications and domains in which automation testing is applied.
- Team and Collaboration Challenges: Some respondents highlighted team dynamics and collaboration challenges. This includes getting buy-in from developers, dealing with developers' code that affects automation, training team members unfamiliar with coding, and the challenge of introducing automation into small Agile teams. There's also mention of the difficulty of transitioning from testing a monolithic application to a microservice-based system.
- Genuine Content Over Vendor Promotion: Multiple respondents preferred genuine content, craftsmanship workshops, and practical sessions over vendor-specific content. They would appreciate less promotion and more genuine insights.
- Practical Over Theoretical: There's a desire for more practical demonstrations and real-world examples rather than just theoretical knowledge. Attendees want actionable insights they can apply in their roles.
- Presentation Quality: The charisma and presentation skills of the presenter matter. Even if a topic is interesting, how it is presented can make a significant difference. Some attendees suggested guiding presenters on enhancing their presentation skills.
- Access to Materials: Attendees would appreciate easy access to materials, supporting documents, and a connection to an active community.
- Diverse Topics: While many respondents are interested in AI and its applications in testing, there's also a desire for a broader range of topics. Some attendees wished to move beyond just UI testing and delve into backend, performance, security, and other areas.
- Vendor Presentations: While there's a general sentiment against too much vendor-driven content, some attendees expressed that if vendors provided deeper, more valuable sessions that connect their products to practical use cases, it would be more beneficial.
- Real-World Challenges: Attendees are interested in discussions about real-world challenges in the testing domain, such as the evolution from QA to QE to DE, challenges in automation maturity, and the impact of rapidly evolving technology.
- Interactive Workshops: Code-along workshops and interactive sessions were appreciated in previous events, and attendees would like to see more of them.
- Limit AI Focus: While AI is a trending topic, some attendees expressed that they would like the event not to be overly focused on AI. They want a balanced approach that addresses the current challenges faced by testers.
- Sponsorship Feedback: There's feedback about sponsors' presentations being too high-level and not adding value. Attendees would appreciate sponsors connecting their products to useful, longer sessions and providing practical insights.
Please submit your session idea by answering the questions below: