On Monday, January 15, 2018, PQA’s Director of Quality, Mike Hrycyk, brought together a group of PQA testers to form an Automation Panel.

Automation has been one of the testing world’s most discussed and most intimidating topics over the past few years. As more and more clients ask about Automation testing, and as project teams integrate it into their testing strategies, there are a lot of questions around its importance and the role it can, or should, play in a testing environment.

This Panel transcription is the first in a series of three. Part 1: What is the Role of Automation in the Testing World? will touch on the relationship between Automation and Manual Testing, the benefits and limitations of Automation, and the possible transition paths one can take from being a Manual Tester to learning more about how to Automate.

MEET OUR PANELISTS

Mike Hrycyk, Director of Quality: Mike has been in the world of quality since he first did user acceptance testing 18 years ago. He has survived all of the different levels and a wide spectrum of technologies and environments to become the quality dynamo that he is today.

Nathaniel (Nat) Couture, VP of Professional Services: Nat has over 15 years of experience in leadership roles in the technology sector. He possesses a solid foundation of technical skills and capabilities in product development including embedded systems, software development and quality assurance.

John McLaughlin, Senior Automation Consultant: John is a Senior Automation Consultant with more than 12 years of experience in software testing and test Automation. Coming from a background in Computer Science, John has designed frameworks and automated testing strategies for a number of software projects while providing training on test Automation and its various toolsets.

Jim Peers, QA Practitioner: Jim brings more than 16 years of experience in software development and testing across multiple industry verticals. After working in the scientific research realm for a number of years, Jim moved into software development, holding roles as a tester, test architect, developer, team lead, project manager, and product manager.

Benoit (Ben) Poudrier, Senior Automation Specialist: Ben is an Automation Specialist who has performed manual, functional, automated, and performance testing, in addition to building regression testing suites for several clients. He has trained many people on Ranorex Automation and has designed and implemented various Automation frameworks while working at PQA.

Jordan Coughlan, Automation Software Tester: Jordan is a recent graduate from the University of New Brunswick with a Bachelor’s degree in Computer Science. As a member of the Advanced Solutions Team, he has contributed to a variety of projects, taking on different roles as an Automation, performance, and security tester.

PART 1: WHAT IS THE ROLE OF AUTOMATION IN THE TESTING WORLD?

Mike: What is the role of Automation in this new testing world? Where does Automation fit? Should everyone be an Automater? Is there still room for Manual Testers? We're hearing these questions in talks, articles, and conferences. Can you tell me how you would answer them?

Jim: What I see in the people I talk to, and with my clients, is that Automation is an important part of what they do in terms of producing a safety net of tests that prevent regressions, or at least surface them quickly. Those tests can be built into a CI (Continuous Integration) pipeline, which is increasingly the way of working these days. So, these tests get run on a per check-in basis; it's very fast, and it happens many times per day.

Now, to your point about whether Manual Testers are still needed: absolutely, and that will never change. I think Automation provides the underpinning of work that Manual Testers used to have to do but no longer need to. Instead, they're free to do more exploratory kinds of testing and provide the value that they're very good at delivering.
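
To make Jim's per check-in idea concrete (an editorial sketch, not something from the panel): teams typically configure a CI server such as Jenkins to run the tests on every push, but the core of the idea can be approximated with a simple git hook. This sketch assumes a Python project whose regression pack runs under pytest; all paths and names are illustrative.

```python
#!/usr/bin/env python3
# Hypothetical .git/hooks/post-commit script: run the regression pack on
# every check-in, approximating what a CI pipeline does on each push.
import subprocess
import sys

# Run the fast regression tests; a real pipeline would also build the
# product, deploy it to a test environment, and publish the results.
result = subprocess.run(["pytest", "tests/regression", "-q"])
sys.exit(result.returncode)  # a non-zero exit marks the check-in as broken
```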

John: And you should think about who we're testing for. We're all testing for an end user, and not an automated end user but a real person. So, to me, Manual testing is never going to go away. We can get quite good at scripting and making smart scripts that are pretty powerful, but scripts can't make decisions the way an intelligent or experienced tester can. I believe the trick is to use Automation as a tool, not as a catch-all for testing. Use it to help your Manual test team explore and challenge the system a little differently than they could on their own.

Ben: I agree with everything that's been said. I view it more as a supplemental tool that Manual Testers can use to reach a particular area of the application faster. Sometimes Manual testing can take weeks or months to get to a certain point; if you can automate some parts of that path, it becomes much easier for Manual Testers to reach the end point and test the things that come at the end of a project.

John: You can think about it in terms of time, but there's also challenge. There are things you can do with a scripting language that will challenge the application differently than doing it manually over and over again. I think that's one of the aspects that gets lost in test Automation. People who request test Automation want test coverage that repeats the regression tests, when really, if they had an Automation tester sit with an experienced Manual Tester, they could come up with some pretty interesting and meaningful challenges for the application. They would be using Automation knowledge, but not necessarily building a test that you're going to repeat over, and over, and over again.

Mike: A simple example of that is just permutations.
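
As an editorial illustration of Mike's point (the function and field names here are made up): a few lines of Python can generate every ordering of a handful of inputs, a volume of cases nobody would step through manually.

```python
# Sketch: drive the same check against every ordering of a set of inputs.
from itertools import permutations

FIELDS = ["name", "email", "phone", "address"]

def submit_form(order):
    """Stand-in for driving the application with fields filled in 'order'."""
    print("filling fields in order:", order)
    return True  # pretend the application accepted the submission

# 4 fields give 24 orderings; 8 fields would already be 40,320 cases.
for order in permutations(FIELDS):
    assert submit_form(order), f"submission failed for order {order}"
```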

Nat: At the end of the day, when we're talking about Automation, people tend to take a narrow view. One of the main objectives of Automation is to reduce manual labour, and people tend to see the biggest manual labour in testing as executing the tests. But there are all kinds of things you can automate within the realm of quality assurance, from environment management and deployment to data preparation, loading, transfer, and migration. So, for all of the things that involve lots of manual labour, we should seek solutions that reduce the manual aspect of testing. Now, there are obviously going to be a lot of tasks that can't, or shouldn't, be automated, but I think people sometimes have too narrow a view of what Automation is. It's about leveraging the tools we have, our computers and our ability to program, to reduce that manual labour. It's not always about executing against the user interface, and it's not always about test execution; it's about the other things, too. And even in a CI world, I ask: will there still be Manual Testers?

If you look at what the big guys are doing, like LinkedIn and Facebook, from what I gather, a huge amount of their QA is actually being carried out manually; it's just not the company that's paying for it. They deploy a release on a subset of their servers and divert a small slice of real-time traffic in their live environment to that new build. So, they're effectively crash-testing the build with a portion of their user base. That's really Manual testing, just without incurring the cost internally. We can get more intelligent about that as well.
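
What Nat describes is usually called a canary release. Here is a toy, editorial sketch of the routing decision (every name in it is hypothetical): divert a small, sticky fraction of live users to the new build and watch what happens.

```python
# Toy canary routing: send a small, consistent slice of traffic to the
# new build while everyone else stays on the stable release.
import hashlib

CANARY_PERCENT = 5  # divert 5% of users to the new build

def serves_canary(user_id: str) -> bool:
    # Hash the user id so the same user always lands on the same build.
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 100
    return bucket < CANARY_PERCENT

for uid in ["alice", "bob", "carol"]:
    build = "canary" if serves_canary(uid) else "stable"
    print(uid, "->", build)
```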

Jim: Those are great points. As Nat said, we don't tend to think about automating tasks; we think about automating tests. But the tasks, like separating data or creating containers, are important Automation activities; they're part of what Automation is. I was at a conference last year where testers from The Guardian talked about how they put out releases. They watch for indicators that something might be a problem; monitoring is the big thing for them. They don't actually manually test what they ship. Instead, they look for little blips in what's going on, and once those blips come up, they fix them. It happens hundreds of times a day as they put out releases. That's the pace of things now, and it was quite interesting to hear how they do it.

Mike: There is a counterpoint. I agree with a lot of what you said, but the counterpoint is banks, financials, and other types of business where you can't let something unsafe be found in the wild. It's a big, scary thing.

Jim: The Guardian, in my example, is a newspaper, so mistakes can be tolerated. But for health companies, or national security, or companies that control planes, those things can't get out into the wild.

Mike: My next question is a loaded question. My personal belief is that an Automater should become a Manual Tester first. So, let’s talk about whether you feel that’s true, but let’s also talk about your own introduction to Automation. Did some of you walk out of school, sit down, and then all of a sudden become an Automater? Did you get into testing first, and how did that work?

John: I think it is very important to do the testing part first. I came directly out of school and went right into Automation, and I picked up the testing techniques as I moved along. In retrospect, we probably could have built a stronger test, or a stronger solution, had we had the core knowledge and core strategies that go into straight testing. I think it can be done coming from a coding background and working into the testing knowledge, but it requires somebody with a development skillset to recognize that testing and testing techniques may not be their strength and to be open to learning. If you do it the other way around and start with testing, folks can sometimes be intimidated by the development language and technology, and that can hinder them too. You have to find the sweet spot in the middle: somebody who recognizes that, regardless of the direction you take, there's going to be a challenge, and who is open to rolling with that.

Mike: There’s not a lot of training in university and tech courses about Automation. So, how did you actually get started, John?

John: We kind of flew by the seat of our pants. We had Manual test cases that we knew we wanted to automate, and a lot of it was trial and error. We would script a whole bunch of things and then assess the pack as a whole. We used the knowledge we gained through different coding exercises and decided amongst ourselves how to make it better and easier to use. We were thinking of ourselves almost as users of the automated solution, considering that user base rather than the users of the application. That was generally how we got started. Everything was trial and error, building on successes and failures as we moved forward, and, of course, building our testing base along the way.

Ben: I want to offer an example from a development background. I came out of college and decided to work as a programmer in the gaming industry. I learned how the dev industry works, and after being laid off just three years after I started, I realized I didn't want to do that anymore; there was just no job security in it. So, I moved back to Fredericton to go back to school, and one of my friends helped me find a job at PQA. I started doing Manual testing, which was okay, but I wanted to use my more technical skills and background to get things done more efficiently. Eventually, that chance came along when PQA first formed the Advanced Solutions Team and was interviewing people to join. I had just come back to work after a long break and decided to join. I believe I was the first member of that team, and I never really looked back. I think it's great; it's really a tool that supplements testers' work and makes things easier for them.

Nat: I think there are two bits of knowledge that come into play here, and I don't think Automaters necessarily need to be testers. Someone needs to come up with what the good tests are, or what should be tested. It's the Automater's job to know that, and they could come up with it themselves if they have a good foundation in testing. But if they don't, that doesn't mean they can't automate, because at the end of the day, what they're trying to do is automate a task. If everyone building software had to be an expert user of that software, you'd never get any software written. Automating any business process or task, like an automated test, requires a technical skillset. When you marry the two – good tester knowledge with the technical skillset – you get the best of both worlds. So, I don't think you necessarily need to be a tester, but it certainly helps when a technical person has that understanding. Then you don't have to build a team to define what you're going to test before you automate it.

Jim: I used to be a research scientist doing chemical physics, and I would have to write subroutines to model my data, so I had to learn Fortran. It was entirely self-taught, and this goes back to when the mountains were cooling, of course. Then I did an internship at IBM as a developer, and worked at other companies, before coming to PQA. During that time, I started building Automation that was entirely home-built: Python scripts driving C++ DLLs. From there, I tried some record-and-playback tools. Now, I've happily settled into things like REST Assured and Postman, where you're driving Automation at the API level. That's what I'm most interested in right now.
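
For readers who haven't seen API-level Automation, here is a minimal editorial sketch in Python using the requests library; the endpoint and response fields are invented. REST Assured expresses the same kind of check in Java, and Postman does it through a GUI and its Newman runner.

```python
# Minimal API-level check in the style Jim describes; the URL and the
# response fields are hypothetical.
import requests

def test_get_user():
    resp = requests.get("https://api.example.com/users/42", timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    assert body["id"] == 42
    assert "email" in body

if __name__ == "__main__":
    test_get_user()
    print("API check passed")
```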

Jordan: I’m sort of similar to John where I came to PQA straight out of school and jumped into Automation. I did a bit of Manual testing in parallel when I was learning Automation, so those two things together did influence each other. Certainly doing Manual testing helps when doing Automation.

Benoît Poudrier is an Automation Specialist at PLATO Testing. Ben has been immersed in programming and QA for over 10 years. He has worked in various roles throughout his career and his experience working with a broad range of clients has provided Ben with a wide array of skills. He has performed manual, functional, automated and performance testing, and he has built regression testing suites for several clients. In addition to training many people on Ranorex automation, he has designed and implemented various automation frameworks while working at PLATO. Ben has spent over a year learning about security testing on his own through various online courses and through extensive research on a wide range of security-oriented topics.

https://www.linkedin.com/in/benoit-poudrier-807a1323/

Jim Peers is currently Test Manager at the Integrated Renewal Program at the University of British Columbia and is an alumnus QA Practitioner from PLATO Testing. Jim has more than sixteen years of experience in software development and testing in multiple industry verticals. After working in the scientific research realm for a number of years, Jim moved into software development, holding roles as tester, test architect, developer, team lead, project manager, and product manager. As a trusted technical advisor to clients, Jim has created test strategies, approaches, and plans for the most complicated of systems, and loves to mentor and assist testers on multiple projects.

https://www.linkedin.com/in/jim-peers-70977a6/, @jrdpeers

Mike Hrycyk has been trapped in the world of quality since he first did user acceptance testing 21 years ago. He has survived all of the different levels and a wide spectrum of technologies and environments to become the quality dynamo that he is today. Mike believes in creating a culture of quality throughout software production and tries hard to create teams that hold this ideal and advocate it to the rest of their workmates. Mike is currently the VP of Service Delivery, West for PLATO Testing, but has previously worked in social media management, parking, manufacturing, web photo retail, music delivery kiosks and at a railroad. Intermittently, he blogs about quality at http://www.qaisdoes.com.

Twitter: @qaisdoes
LinkedIn: https://www.linkedin.com/in/mikehrycyk/

Nathaniel (Nat) Couture, BSc, MSc, has over 15 years of experience in leadership roles in the technology sector. He possesses a solid foundation of technical skills and capabilities in product development, including embedded systems, software development, and quality assurance. With a decade of experience as CTO at two successful small ICT companies in New Brunswick, Canada, Nat has proven himself a solid leader, capable of getting the job done when it counts by focusing on what's important. He's a very effective communicator who has mastered the art of simplicity. Nat has served as an expert witness on matters of software quality. He is always driving to adopt and create technologies that incrementally reduce manual labor and at the same time improve the quality of the end product.

LinkedIn: https://www.linkedin.com/in/natcouture/
Twitter: @natcouture
