Ep 79: It Takes Two (Rebroadcast)

Back in October last year, I spoke to Maaret Pyhäjärvi about pair testing. We did a pair testing session, then recorded our debrief and thoughts on it. It’s a great episode, and the show notes are full of great resources, including the mindmap we used during our session.

However, the editing and sound quality were not amazing. Maaret was quite quiet compared to me, and it was difficult to listen to. I’ve been wanting to re-edit and rebroadcast it for a while, and circumstances finally came together to let me do it. The sound still isn’t perfect, but I think it’s a lot clearer, and the volume levels make more sense. I’ve also made the outro a bit quieter so there isn’t a sudden big jump in sound.

I hope you all enjoy the new improved episode 61!

Ep 78: May the Testbash be with you

So this week we finally discuss Testbash Brighton and the freshly announced Testbash Manchester!

First, Testbash Brighton. Testbash was awesome as always. Brighton moved to a new venue but all the normal Testbash awesomeness applied!


Conference Day Takeaways

  • Nice to see a focus on empathy
  • Empathy for developers
  • Empathy for customers – Tito customer feedback/Ethics in testing
  • Good balance of tech/non-tech
  • Range of topics! Starting as a new tester in agile, AI and testers, ethics, how testing has changed, API testing, Toolsmiths.
  • Not all speakers were testers: there was a professor, plus two devs turned CEOs!
  • Testers creating better Turing tests kind of blew my mind, despite being bloody obvious now that I think about it
    • Testers as the ideal middleman between understanding how people work and how machines work.
  • Lots of juicy ideas to take into the new job I’ve just started – Amy Philips’ and Del Dewar’s talks covered ideas on starting on a new project and what good leadership really means.

Open Space Takeaways

  • A nice, low pressure environment to share
    • Can be a discussion or a question, not necessarily a talk or workshop (though they can be done as well)
    • Dan Billing did a demo of Burp Suite and a whirlwind tour of Ticket Magpie (see episode 77)
  • Matt did a session on bringing testing into education (schools, uni, etc – see episode 75 for our talk on a testing syllabus)
  • I did a session on mental health with Mike Talks, which was so awesome
  • A talk popped up during the last break of the open space where we shared stories about people in our lives we were grateful for, and why – it was a genuine love-in.

Testbash Takeover!

So if you check out the lineup for Testbash Manchester you should see a couple of familiar names! Matt is doing a workshop on API testing, aimed at people who want to get into API testing but may not have done so before.

I am giving a talk called Anxiety Under Test. With as few spoilers as possible, I’m going to be talking about my anxiety, and how I’ve applied the strategies I’ve learned to help cope with my anxiety to testing, with a quick look at how you can check in with your own mental health and keep on top of it.

The rest of the lineup looks A+ and we’re looking forward to an amazing few days!

If you can’t make the conference there will be the pre- and post-Testbash meetups as always. These are free to attend, so come along and hang out with a bunch of great people!

Ep 77: The sonic screwdriver won’t get me out of this one

This week I talk to Dan Billing! (Check out his podcast: Screen Testing (co-hosted with another friend of the show Neil Studd!))

We cover how to get into security testing, take a brief look at the security testing mindset, and share resources to let you start security testing ethically, legally, and without making your sys admins angry.

Topics covered:

  • Resources and tools!
  • Being legal and ethical
    • Check your local laws – in the UK/EU/US it’s illegal to hack a production site
    • There are some fake sites built for practising this kind of testing (such as Dan’s own Ticket Magpie)
    • If you’re bringing this testing into your workplace, seek permission first
      • Talk to your system admins/security team/technical team/line manager
    • Get a quarantined environment to work on
    • Take a backup of the environment first
    • Warn your sys admin team before you start crawling sites or running reports – they may have logging in place and be alerted to suspicious behaviour (and do you ever really want to piss off your sys admins?). There’s a small scope-check sketch below.
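
To make the permission-first idea concrete, here’s a minimal sketch in Python of an allow-list guard you could bake into any home-grown tooling. The hostnames are hypothetical examples, not real practice sites:

```python
# A minimal sketch of "seek permission first": hard-code the practice
# targets you own or have written permission to test, and make every
# home-grown tool check the list before sending a single request.
from urllib.parse import urlparse

ALLOWED_HOSTS = {
    "localhost",           # e.g. a deliberately vulnerable app run locally
    "127.0.0.1",
    "ticketmagpie.local",  # a hypothetical quarantined lab instance
}

def assert_in_scope(url: str) -> None:
    """Refuse to proceed if the target isn't an approved practice host."""
    host = urlparse(url).hostname
    if host not in ALLOWED_HOSTS:
        raise PermissionError(f"{host!r} is not an approved practice target")

assert_in_scope("http://localhost:3000/login")  # fine
# assert_in_scope("https://example.com")        # would raise PermissionError
```

It won’t stop a determined mistake, but it makes “oops, wrong target” much harder once you start pointing crawlers at things.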

Thanks to Dan for being on the show, and thanks for reading/listening. If you want to support the show you can rate and review us on iTunes or check out the Patreon!

Ep 76: You don’t own the bug

This week I talk about Bug Advocacy! This came swirling into my mind after seeing a couple of older blog posts about the matter and it turns out I have feelings on Bug Advocacy!

Definition: Simply put, bug advocacy is exactly that – advocating for a bug, advocating that it should be prioritised and fixed.

There are different ways to do this:
– A decent bug report. A bug report in and of itself is bug advocacy, because you’re saying ‘hey, I found a thing, and this is the effect it has’. If you give enough information, you’ll convey the importance of the bug and its effects.
– Advocacy during prioritisation meetings. Telling people that you think this bug is important, that it should be fixed.

The first of these is definitely something testers should do. Bug reports should contain enough information that people can see and understand the effect and impact a bug or defect will have on the system and its users. This information is crucial so that the bug can be prioritised correctly against the team’s priorities.
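
As a purely illustrative sketch of what ‘enough information’ might look like (the product, fields, and details here are all made up – this isn’t a prescribed format):

```python
# A hypothetical bug report as a data structure: if you can fill in every
# field, the report largely advocates for itself.
bug_report = {
    "title": "Checkout total ignores discount code on mobile Safari",
    "steps": [
        "Add any item to the basket",
        "Apply discount code SAVE10",
        "Open the basket in mobile Safari and go to checkout",
    ],
    "expected": "Total shows the 10% discount applied",
    "actual": "Total shows full price; the discount is silently dropped",
    "impact": "iOS customers are overcharged – likely refunds and complaints",
    "environment": "iOS 10.3, Safari, staging build 1.4.2",
}
```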

Sometimes you might not agree with how the team has prioritised your bug. They might not think it’s actually a blocking bug, or that it’s that big a deal. Maybe they don’t think it’s a bug but instead a change request.

So what do you do here?
You can make sure that the PO or lead dev truly understands what the bug is and what effect it has. Maybe demo the bug to them, or talk them through it. If they explain their reasoning to you, you can either counter it or accept it (however begrudgingly).

You can fight for it, and this is where people start to be anti-advocacy. If a tester’s job is to present information only, and let other team members take that information and use it, then fighting for a bug to be fixed goes against that.

There are also plenty of reasons a bug might not be prioritised in a way you agree with: ignorance, incompetence, lack of time or resources, or knowledge of the business and its users that you don’t have.

If the team doesn’t think it’s worth fixing or fixing any time soon, you have to let it go. Just because you raised the bug, doesn’t mean you own the bug.

Further Reading
https://medium.com/allthingstesting/death-to-bug-advocacy-82cf0c057900
https://www.safaribooksonline.com/library/view/lessons-learned-in/9780471081128/ch04.html
https://www.associationforsoftwaretesting.org/training-2/courses/bug-advocacy/

Ep 75: What is the plural of syllabus?

A few weeks ago I took part in Weekend Testing Europe, and I enjoyed it so much we did a whole episode on it! There is a blog post on the site that covers the scenario and discussions: http://weekendtesting.com/?p=4496

I was in Group C (the C stood for Cool): http://weekendtesting.com/wp-content/uploads/2017/02/WTEU73-Group-C.pdf

Our mission was to imagine we’d been granted permission to present a testing module within a university Computer Science course for a single semester (12 weeks of 1-hour classes), plus homework/assignments.

This didn’t give us a lot of time. I found there were a few immediate difficulties in coming up with a syllabus.

Where do you start?

Do we start with the philosophical stuff, such as “What is testing?” or “What is a tester?”? Or do we jump straight into the nitty-gritty (techniques, heuristics, etc.)?

Honestly, if someone had said ‘Quality is value to some person’ to me back then, I wouldn’t have known what it meant. It obviously makes sense to me now, but I’m not sure it would’ve at the time. YMMV and all that. To balance that, you don’t want to go too deep and detailed too early; an overview and introduction are needed.

What don’t you include?

By choosing what you include you obviously don’t include some things, but there are also things you make a deliberate choice not to include. Do we give a flavour of the types of testing you can specialise in (security testing, performance testing, etc)?

There’s not enough time to cover all of it in any real detail, but a high-level look at how varied testing is might be enough to give people an idea of what to look into more.

What do you include?

Everyone has a different testing role and path into testing, which makes boiling testing down into a series of lectures difficult, but I think there are some core skills we can set down:

  • Communication
  • Thinking
  • The various flavours of testing
  • The role of automation
  • The role of manual/exploratory testing
  • Shift left and right

Things I didn’t include but considered:

  • Learning
    • They’ll be getting some of these skills by being at uni.
  • Code skills
    • This is part of a comp sci degree. Yes, there will be differences in how to code tests, but I feel the degree will give a good background.
  • Note taking.
    • While this can be done as part of an assignment, I feel, again, that some of these skills come from uni.
  • Homework:
    • Practical
      • How to provide a good practical experience
        • Test real apps?
        • Test other programming students’ project work?
      • What experience to offer?
        • App?
        • Website?
        • API?
        • Mobile?
    • Bug reports?
    • Testing notes
    • Non-practical
  • Resources out of the wazoo
    • Clear
    • Concise
    • Useful
    • Relevant

Challenges with this idea

There are several challenges with trying to create a module like this in reality:

  • How do we keep the module up-to-date? The industry moves extremely fast.
  • How do we balance keeping the content general but interesting? How do we improve awareness of specialised areas without having to go into detail?
  • How do we help improve awareness of testing? Modules such as this would only help those who attend university, and many testers don’t.
  • Do we even need a degree or module? We wouldn’t want to see such a module becoming a prerequisite for testing and turning people away. Maybe it’s even a strength of testing that people come to it from lots of different backgrounds.

Ep 74: Around the World in 80 Tests

Firstly: This month is #TryPod, a month where podcasters and people who love podcasts bring podcasts to those who don’t listen to them yet. I share the love for some of my latest subscriptions here.

Secondly: Matt and I will both be at Testbash Brighton! Come say hi if you’re there 😀

And now, onto the episode. This week we’re discussing localisation:

Here are our talking points, in both mindmap and notes form (with a small code sketch after the list):

  • Cultures
    • language
      • error messages
      • context for translations
      • play on words
    • expectations
    • Local holidays
    • Personal name and title conventions
    • Aesthetics
    • Colour symbolism
    • Ethnicity
    • Socioeconomic status of people
    • Local customs and superstitions
    • Differences in address, post code and telephone numbers
    • Measurements
    • Voltage and current standards
  • Translations
    • Video or audio – dubbing or subtitles?
    • Printed materials?
    • Logos or graphics?
    • Layout issues with the length of translation or character sizes
    • Dialect? Register? Variety?
    • Format of numbers?
    • Date and time format?
    • Different calendars?
    • Do you need to translate? Common phrases or graphics
    • Consistency heuristic
  • Regulatory compliance
    • Laws
    • Taxes
    • Restrictions
    • Disclaimers
    • Great Firewall of China
  • Third party service availability e.g. Google Maps or PayPal
  • Device differences
  • Languages
    • Complex text layout where characters change depending on context
    • Writing direction (left to right or right to left)
    • Different text sorting rules
    • Numeral systems
    • Different pluralisation rules
    • Different punctuation
    • Keyboard shortcuts
    • Accessibility/Screen reading
  • Technical challenges
    • Translating once is easy – but maintaining translations as new text and fixes arrive?
    • Building software that is easy to localise
    • Separating locale-dependent parts
    • Deployments
    • Timezones
    • Daylight savings
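
To make a few of these concrete, here’s a minimal Python sketch. It assumes Python 3.9+ for zoneinfo and the third-party Babel library for CLDR-based formatting (pip install babel); the locale codes are just examples:

```python
# Locale-sensitive formatting differs in ways that are easy to hard-code
# wrongly; Babel (assumed installed) exposes the CLDR rules directly.
from datetime import date, datetime
from zoneinfo import ZoneInfo

from babel import Locale
from babel.dates import format_date
from babel.numbers import format_decimal

# Number separators flip between locales - never bake in "," and ".".
print(format_decimal(1234567.89, locale="en_GB"))  # 1,234,567.89
print(format_decimal(1234567.89, locale="de_DE"))  # 1.234.567,89

# Date formats and ordering differ too.
print(format_date(date(2017, 4, 1), locale="en_US"))  # Apr 1, 2017
print(format_date(date(2017, 4, 1), locale="fi_FI"))  # 1.4.2017

# Pluralisation isn't just singular/plural: Polish has one/few/many.
for n in (1, 3, 5):
    print(n, Locale("pl").plural_form(n))  # one, few, many

# Daylight savings: the same wall-clock time has different UTC offsets
# either side of a transition (London's clocks went forward 26 Mar 2017).
before = datetime(2017, 3, 25, 12, 0, tzinfo=ZoneInfo("Europe/London"))
after = datetime(2017, 3, 27, 12, 0, tzinfo=ZoneInfo("Europe/London"))
print(before.utcoffset(), after.utcoffset())  # 0:00:00 vs 1:00:00
```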

Ep 73: There is no spoon!

This week on the show we’re talking to Maaike Brinkhof about cognitive biases! There are so many biases, but we talk about a few here, and share some resources for learning more.

Why are we biased?

Four categories

  • Too much information → We filter it
  • Not enough meaning → We fill in the gaps
  • The need to act fast → We want to feel in control
  • What should we remember? → We pick out what stands out

Is it a bad thing or a blessing in disguise?

As with all things, it can be both. Biases are evolutionary – brains can’t process everything – but they can be a crutch, a mental shortcut that can cause issues.

If you view biases as a bad thing, then you’re missing the point. You can choose to view them as something to learn more about and a way of getting to know yourself better – such as learning to recognise when you might be biased and trying to adjust your behaviour.

Biases in testing – it’s not just about your own biases

Often when we are testing, we are looking for problems caused by biases from people around us, such as the biases of developers or project managers. Often people will write their own biases into products without realising it.

We also have to be aware of our own biases guiding us, hiding or obscuring information. We might like to think we are objective in our thinking, but we are not perfect either.

Understanding biases can also help you explain and justify your testing, questions, problems or information that you’re providing to people.

Let’s talk about some biases!

Confirmation bias is the biggest bias of all, and can be broken down into several ‘smaller’ biases.

How do you deal with biases?

  • Work alone less and pair or mob more
  • Focus and de-focus
  • Self-awareness of your own concentration levels and behaviours
  • Awareness of biases
  • Referring to and using heuristics
  • Stepping back and examining your thinking, concentration, and working patterns to avoid relying on biases to take shortcuts.

Books and courses we’ve mentioned

Ep 72: Mayday! Mayday!

Today, in the first of what will probably be a semi-regular series – ‘How is [career] like testing?’ – we cover Air Crash Investigations.

Investigating what has caused an accident/bug

  • First action?
    • Find the black boxes! (logs/monitoring)
    • Collect all of the evidence (effects of the bug, clues in data that led to it, eye witness accounts)
    • Come up with hypotheses from the data and attempt to disprove them
    • Simulate the problem/Steps to reproduce
    • Sometimes it’s not possible to identify the exact problem; you can only give a summary of findings and suggest possible improvements (see the log-triage sketch after this list)
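
As a toy illustration of the “find the black boxes” step, here’s a small Python sketch that collects the lines leading up to the first error in a log. The file name and log format are assumptions, not from the episode:

```python
# A hypothetical log-triage helper: keep a rolling "flight recorder" of
# recent lines so that when the first ERROR appears we have the lead-up
# as evidence for forming (and trying to disprove) hypotheses.
from collections import deque

CONTEXT_LINES = 5  # how much lead-up to keep, like a recorder's tail

def evidence_around_first_error(path: str) -> list[str]:
    tail = deque(maxlen=CONTEXT_LINES)
    with open(path) as log:
        for line in log:
            if "ERROR" in line:  # assumed log level marker
                return list(tail) + [line]
            tail.append(line)
    return []  # no error found

# Usage (assuming an app.log exists):
for line in evidence_around_first_error("app.log"):
    print(line, end="")
```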

Implement new procedures

  • Providing evidence and explanation for new procedures
  • A tester/investigator only provides recommendations, they do not implement them

Post-incident investigation and action

  • Mentality
    • Focus on facts
    • Thinking outside of the box, outside of the immediately obvious
    • Gathering as much information as possible – even if it appears irrelevant – then assessing with all the information
    • Coming up with hypotheses and attempting to disprove them
    • Being unafraid of uncovering the truth, despite politics

  • Needing to work fast
    • Evidence may disappear over time
    • A need to quickly discover if there are any fatal design flaws that could affect other similar planes
  • Always considering human factors
    • Considering the mental state of those involved
      • What were they thinking?
      • Were they stressed or under pressure? Did they have other motivations which drove their decisions?
      • What was their experience?
    • Psychology, sociology
      • Seniority affecting juniors?
      • Popularity (“no one questions this person”)
    • Cultural or language barriers
      • Tenerife accident – “OK” meaning both confirmation of receipt and the giving of clearance.
      • Korean Air Cargo Flight 8509 – communication breakdown because junior officers wouldn’t take control from the captain.

Not like testing?

  • It’s only investigating incidents, not really being involved directly in the process of building
  • We have to explore what hasn’t happened yet and try to create the problems, as well as triage and investigate them.

Some anecdotes…

Test environment not the same as production

Atlantic Southeast Airlines Flight 2311

  • Propellers got stuck in a position that caused drag
  • Was tested for this scenario, but only on the ground in a lab
  • Accident happened because it behaved differently at altitude, in the air
  • The investigator (tester) felt sure this was the case, but had to prove it with a live test that shocked the engine suppliers

Checklists – how difficult it is to write them

1999 South Dakota Learjet crash

  • Checklist for depressurisation warning started with diagnosing the problem
  • By the time the crew got to the check about donning their oxygen masks, they had already lost their ability to think
  • The checklist should start with donning masks, no matter what

The interaction of humans with automation – people tend to assume software is smarter than it really is

Asiana Airlines Flight 214

  • Plane crashed due to landing short of the runway
  • Caused by the pilots inputting the wrong setting into the autopilot
  • Pilots were trained to let the automation assist them
  • This broke down when they gave the automation the wrong command and they didn’t understand its behaviour
  • There’s a lesson here on abstracting important procedures and relying on automation
  • Automation cannot think or second-guess, but people can believe it’s more intelligent than it really is

Ep 70: The Novella Of Testing

This week I am talking to Mel Eaden AKA Mel the Tester!

This is gonna be a two-parter as we spoke for a while about many, many things, so part one today, part two next week, then back to our normal fortnightly schedule the week after that. That’s right, three episodes in three weeks! That’s how much we love you, listeners <3

As well as being a tester, Mel organises the Ministry of Testing’s writing community, which is an awesome place for anyone who wants to write, edit, proof-read or in some way contribute to the Dojo on the Ministry of Testing. If you’re interested, ping Mel on either the Testers.io or Ministry of Testing Slack, or email melthetester@gmail.com

The basis of this episode was the tension between testers being storytellers and being concise, sparked by this blog post from Mel:

http://testingandmoviesandstuff.blogspot.co.uk/2016/10/my-november-goal-and-now-for-rest-of.html

My personal goal for November is to answer a question first and then ask if the person wants more of a story or explanation of why I gave the answer I did. This is my personal goal based on feedback. I was made aware of the roundabout way I approach my answers, thinking I have to give an explanation up front for everything I’m going to say when a person doesn’t have context.

This intrigued me – I often feel like I’m just spaffing on at people, and figuring out what’s too much info and what’s the info they actually need is something I think about a lot, and so I reached out to Mel to be on the show.

We also discuss when soapboxing is needed, testers as journalists and translators, and ‘technical testing’.

TMI?

  • Balance between context and not blinding people with details
  • The issue is how do you know if they have enough context?
    • Ask them if they need more info
    • Could point them towards a document/place/email with more context if possible?
  • Cultural habits – people get stuck in conversations they can’t seem to get out of. “If you stand there like you don’t have somewhere else to go, I’ll keep talking.”
  • Getting annoyed with the short version of the answer – I keep asking questions because I want to have the conversation and flesh it out completely when someone is explaining something to me. I like details, no matter how long it takes to get them.

Bug demos

  • We agree there are some bugs that don’t need demoing
  • It’s sometimes hard to know which without domain knowledge, but you can learn where the lines are on the job – see what prompts devs to come over to you to ask
  • Stopping the demo train – call a meeting? Demo to everybody at once? Ask what’s missing from the defect? Email chains?

Story tellers

  • I have a habit of overthinking decisions and so arming myself with lots of reasons why, which then turn into a massive response when someone asks ‘so what are you doing with x?’, when all they needed was my decision
  • Over-thinking the solution
  • Big picture thinking vs small picture thinking (Hammer/Nail problem for some people while for me it’s more “Why do I have a hammer?” or “Are there only nails?”)
  • Using examples and stories in writing (my current business card says Quality Storyteller)