Ep 77: The sonic screwdriver won’t get me out of this one

This week I talk to Dan Billing! (Check out his podcast: Screen Testing (co-hosted with another friend of the show Neil Studd!))

We cover how to get into Security Testing, a brief look into the mindset of security testing, and share resources to allow you to start Security Testing ethically, legally, and without making your sys admins angry.

Topics covered:

  • Resources and tools!
  • Being legal and ethical
    • Check your local laws – in the UK/EU/US it’s illegal to hack a production site
    • There are deliberately vulnerable practice sites built for training and practising this kind of testing
    • If you’re bringing this testing into your workplace, seek permission first
      • Talk to your system admins/security team/technical team/line manager
    • Get a quarantined environment to work on
    • Take a backup of the environment first
    • Warn your sys admin team before you start crawling sites/running reports – they may have logging and be alerted to suspicious behaviour (and do you ever really want to piss off your sys admins?)

Thanks to Dan for being on the show, and thanks for reading/listening. If you want to support the show you can rate and review us on iTunes or check out the Patreon!

Ep 73: There is no spoon!

This week on the show we’re talking to Maaike Brinkhof about cognitive biases! There are so many biases, but we talk about a few here, and share some resources for learning more.

Why are we biased?

Four categories

  • Too much information → We filter it
  • Not enough meaning → We fill in the gaps
  • The need to act fast → We want to feel in control
  • What should we remember? → We pick out what stands out

Is it a bad thing or a blessing in disguise?

As with all things, it can be both. Biases are an evolutionary adaptation – brains can’t process everything – but they can also be a crutch: mental shortcuts that can cause issues.

If you view biases as purely a bad thing, you’re missing the point. You can choose to view them as something to learn more about – and a way to get to know yourself better – such as learning to recognise when you might be biased and trying to adjust your behaviour.

Biases in testing – it’s not just about your own biases

Often when we are testing, we are looking for problems caused by biases from people around us, such as the biases of developers or project managers. Often people will write their own biases into products without realising it.

We also have to be aware of our own biases guiding us, hiding or obscuring information. We might like to think we are objective in our thinking, but we are not perfect either.

Understanding biases can also help you explain and justify your testing, questions, problems or information that you’re providing to people.

Let’s talk about some biases!

Confirmation bias – the biggest bias of all, which can be broken down into several ‘smaller’ biases.

How do you deal with biases?

  • Work alone less and pair or mob more
  • Focus and de-focus
  • Self-awareness of your own concentration levels and behaviours
  • Awareness of biases
  • Referring to and using heuristics
  • Stepping back and examining your thinking, concentration, and working patterns to avoid relying on biases to take shortcuts

Books and courses we’ve mentioned

Ep 56: And our survey says

An interview with Rosie Hamilton!

This year Rosie did a large survey of software testers and then did the number crunching to produce a snapshot of testers in 2016. All the data and the scripts she used are up on GitHub for transparency, and for others to do their own analysis if they wish.

Rosie has now made a web app for people to explore the data she found without having to run scripts or learn R. She has just published a blog post about how she did this, with a link to the app.

I’ve been interested in the make-up of testers – where we come from, and when and how we decide to become testers – for a while now, and I knew when I saw the survey that I wanted to invite Rosie on to the show to chat about it.

We talk about the survey process from inception to analysis: what the survey taught her about testers and about surveys, where testers come from, and her plans for the next iteration of the survey.

Summarised transcription:

The inception, the decision making, the whys, the whens, etc. Did you have it all planned out from the beginning, with the analyses you wanted to do decided before you formulated the questions, or was it more ad hoc than that?

  • It kind of happened accidentally. A lot of people had been asking me the same questions. I tried to answer from personal experience but couldn’t, so I reached out to some friends on social media to get their views, but still wasn’t any closer to answers.
  • Had a few doubts about asking people to fill out a form, mostly I wasn’t sure how I was going to analyse the results to find answers, but decided to go for it.
  • I spent about 4 weeks planning the survey. I had a list of questions I wanted answered and I worked backwards from there. I tested the survey on the testers I work with. The main concern which was raised while testing the survey was keeping it totally anonymous. Some of the questions got changed before it was unleashed on the community.
  • While the survey was open and collecting data I started planning how I was going to deal with the results. I started learning R which was a very steep learning curve at first.

Lessons learned, both from the survey and from doing the survey?

What I learned about surveys:

  • Open-ended text boxes are a real pain – cleaning up and making sense of the results takes a lot of work.
  • I should have asked more demographic questions, which would have allowed the results to be split into more groups. From the start I didn’t want to divide the sample by gender, as I thought gender was irrelevant, but I wish I had asked which country or continent people were working in, as it would have been nice to try to find some patterns there.
  • The questions which were crystal clear, with closed answers like yes/no or true/false, gave the best results.
  • The more data collected the better. You can never have too much data.
  • I learned that R is REALLY powerful. It lets you start writing the analysis before all the results are in. You can just keep feeding it data as the collection of responses grows.
  • I think being transparent from the start about why the data was being collected and what it was going to be used for, and also making the results publicly available, is what has made this study quite unique. I know there are other surveys about testing, like the ‘state of testing’, but the raw data from that one is not in the public domain – only someone’s interpretation of the results is available.
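
The point above about R letting you write the analysis before all the results are in – and simply re-running it as responses grow – can be sketched in any language. Here is a minimal Python illustration (the `studied_computing` column is a hypothetical stand-in for a real survey question, not one of Rosie’s actual fields):

```python
import csv
import io
from collections import Counter

def analyse(responses_csv):
    """Tally the answers to one survey question from raw CSV text.

    Re-running this as new responses arrive recomputes the summary
    over the larger dataset - the same idea as re-running an R script
    while the survey is still open and collecting data.
    """
    reader = csv.DictReader(io.StringIO(responses_csv))
    return Counter(row["studied_computing"] for row in reader)

# Early in the survey:
batch1 = "studied_computing\nyes\nno\nno\n"
print(analyse(batch1))  # Counter({'no': 2, 'yes': 1})

# Later, with more responses appended, the same analysis just re-runs:
batch2 = batch1 + "no\nyes\nno\n"
print(analyse(batch2))  # Counter({'no': 4, 'yes': 2})
```

Because the analysis is a function of the whole dataset rather than a running total, nothing needs to be migrated or recalculated by hand as the response file grows.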

What I learned about testers:

  • We are a really diverse group and we all have our own stories.
  • I spoke to one tester while the survey was going on who said they had struggled to answer the question about what made them apply for their first testing job. They said they came to work one day and were told ‘you are a tester now, your old job doesn’t exist’.
  • I feel testing still has a bit of a stigma around being an unglamorous job. It is very hard to hire testers. I also feel that testing is not really an aspirational career choice, given how few testers wanted to test while in education.
  • One of the results which surprised me was that 2 out of 3 testers who started testing in the last two years didn’t study computing. I think students who study computing are mostly moving into development jobs, which means junior testing jobs are being filled by people without computing backgrounds.
  • There was one person who completed the survey who, when asked what they liked about testing, answered ‘none of the above’. One in every four testers isn’t happy in their job, so I think testing isn’t for everyone. I also think there are some really bad testing jobs out there. In the worst workplaces we have testers getting blamed for missing bugs, and management not understanding what testers do and forcing testers to sign disclaimers stating there are no bugs.

Feedback from other people?

The feedback has been really positive. I’ve also realised even though the blog posts were written for a testing audience, other groups have been reading about the survey.

Future plans?

I have this crazy plan to turn all the data from this survey into a web application. I think this will be good because it will provide an interface for people to explore the data themselves. If I manage to create this web app that I’m dreaming about, I am certainly going to blog about it.

I definitely think I am going to survey testers again next summer, which should give me enough time to design a much better survey for 2017.

Ep 33: These ARE the testers you’re looking for

This week I talk to Amy Newton. Amy is Head Of Testing Practice at Vertical IT. She’s incredibly passionate about testers, and the testing community in the North West.

We discuss:

Ep 11: There’s No I In Team

(But There Is In Chris)

Today’s recording is a bit echoey, and that’s because I recorded this episode at a venue called Madlab, who very kindly lent me a room for this interview.

This episode is an interview with Chris Northwood, who is Lead Dev at BBC Connected Studios, working on a fantastic project called BBC Taster.

Disclaimer: Chris is talking to me in a personal capacity, not on behalf of or representing the BBC.

We talk about how software development works at the BBC, and in Chris’ team in particular, and the benefits of having QA embedded in a Scrum team.

Chris Northwood co-runs Manchester Tech Nights, a meetup aimed at Manchester’s technology community that happens every other month. Find out more by going to http://manchestertechnights.org/.

Find Chris on Twitter: @cnorthwood

Ep 8: Rage Against the QA

This week I talk to Mike Bell, Drupal Dev, friend, and colleague about the relationship between QA, Dev, and Client.

We talk about how QAs and Devs think differently, reporting bugs to devs, writing AC, and the effect of planetary movements on testing. I also try, and fail, to remember the term ‘galumphing’ and the name of the person who coined it, James Bach.

Mike Bell will be talking at PHPNW15 this year, and has previously spoken at NWDUG and various Drupal Camps.

In between recording this episode and posting it, it’s been announced that Mike will be talking as part of a keynote at DrupalCamp Barcelona! Read more about his talk here: https://events.drupal.org/barcelona2015/sessions/mental-health-and-open-source