I want to revisit mobile testing, as I realised I focused a lot on choosing the correct devices and tools and then was just ‘…and then you test stuff? Websites are easy to test LOL’, which is a little reductive, so I’m gonna talk about actual testing. I think the reason I skipped over it is that my input into mobile stuff generally comes in at requirement capture. We’ve had more than one client completely forget that you can’t hover on a mobile device, so if they want that feature, we’ll have to consider how to implement it on a mobile device in a way that makes sense.
There are also things to consider like how user journeys and expectations change on mobile devices. You need to make sure links aren’t too small or too close together, and you need to make sure you don’t rely on hovers, dragging, or anything gesture-y that might either go against what gestures already mean on a phone or that people would have trouble carrying out on a phone screen. Swiping is about as complex as you want to go outside of games, I think.
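One way to handle the hover problem in code is to feature-detect hover support before wiring up any hover-dependent UI. A minimal sketch, assuming the standard `window.matchMedia` API; the match function is injected here so the logic can run (and be tested) outside a browser, and the fallback names are made up for illustration:

```javascript
// Sketch: decide whether hover-dependent UI (tooltips, dropdown-on-hover)
// should be enabled. In a browser you'd pass window.matchMedia.bind(window);
// injecting it keeps the logic runnable outside a browser too.
function supportsHover(matchMediaFn) {
  // '(hover: hover)' matches when the primary pointing device can hover
  // (e.g. a mouse); phones and tablets report '(hover: none)' instead.
  const result = matchMediaFn('(hover: hover)');
  return Boolean(result && result.matches);
}

// Usage in a browser (enableTooltips/showInfoOnTap are hypothetical):
//   if (supportsHover(window.matchMedia.bind(window))) { enableTooltips(); }
//   else { showInfoOnTap(); }
```

The point is that the decision is based on what the device can actually do, not on guessing from the user agent string.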
I also missed some great tools you get from using emulators, and emulation of the touch screen is one of them: if you can’t test on devices, you can use Chrome’s dev tools, where the cursor is replaced by a small circle to show the touch area.
I also like the network and location emulation. You can enter a latitude and longitude and the emulated phone will act like you’re in that spot for any geolocation-dependent features you may need (good for looking at shipping on shop sites).
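If you’d rather pin the location in unit tests as well as in the emulator, you can stub the geolocation provider. A hypothetical sketch – `shippingRegion` and its region rule are invented for illustration; the fake just mimics the call shape of `navigator.geolocation.getCurrentPosition`:

```javascript
// A fake geolocation provider with the same call shape as
// navigator.geolocation, so geo-dependent code can be tested anywhere.
function makeFakeGeolocation(latitude, longitude) {
  return {
    getCurrentPosition(success) {
      success({ coords: { latitude, longitude } });
    },
  };
}

// Hypothetical geo-dependent feature: pick a shipping region by latitude.
function shippingRegion(geo, callback) {
  geo.getCurrentPosition((pos) => {
    callback(pos.coords.latitude >= 0 ? 'northern' : 'southern');
  });
}

// In production you'd call shippingRegion(navigator.geolocation, cb);
// in a test you pass a fake pinned to any spot you like.
```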
Part two, and we hit the desktop versions.
You can’t fully support every browser; it’s like trying to hit WCAG AAA: it’s impossible for all areas of a site or application, and even then you’ll still miss some users. So you hit the most you sensibly can, then try to degrade as gracefully as possible. As long as people can use it on IE7, does it matter if it’s not as pretty as on IE11 or Firefox?
Cross-browser testing is important because people have their own weird setups (one of my coworkers installed Fedora on his Mac because he couldn’t get on with OSX, for example), so you want to make sure as many people as possible can access and use your site or app.
We’ve been revisiting our browser support policy in light of Microsoft kicking most versions of IE out of support, and from reviewing the stats it does look like most IE users are on IE11. Oddly enough, IE8 is the next most popular IE version, but that’s only 2% market share, and it’s no longer supported, so it’s definitely in the ‘gracefully degrade’ category of browsers.
And market share is how you define the browsers you’re willing to support. You can check it on a global scale, and, if you’ve got access to a current site with Google Analytics, you can pull data that’s directly relevant to the client to put a proposal together.
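As a sketch of how that proposal might shake out, here’s a hypothetical helper that splits browsers into ‘fully support’ and ‘gracefully degrade’ buckets by a market-share threshold (the share figures and the cut-off are made up, not real analytics data):

```javascript
// Classify browsers by market share: at or above the threshold gets full
// support; below it gets the 'works, but may not be pretty' treatment.
function classifyBrowsers(shares, thresholdPercent) {
  const full = [];
  const degrade = [];
  for (const [browser, share] of Object.entries(shares)) {
    (share >= thresholdPercent ? full : degrade).push(browser);
  }
  return { full, degrade };
}

// Example with made-up figures (percent of sessions), 5% cut-off:
const policy = classifyBrowsers(
  { Chrome: 55, Safari: 15, Firefox: 10, IE11: 6, IE8: 2 },
  5
);
// policy.full  → Chrome, Safari, Firefox, IE11
// policy.degrade → IE8
```

The real decision is a judgment call, of course, but the data-driven split gives you something concrete to put in front of a client.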
There’s more than just browser versions and browser/OS combinations to think of; there are things like web standards: which browsers support which parts of CSS3/HTML5? What about JS? Webfonts? OH GOD. It’s a pain in the arse, but it’s got to be done.
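The usual way to cope with patchy standards support is feature detection rather than browser sniffing: probe for the actual capability, then degrade gracefully when it’s missing. A minimal sketch (the `win` object is injected so the checks can run outside a browser; in a real page you’d pass `window`):

```javascript
// Feature-detect instead of sniffing browser versions: probe for the
// actual capability, then fall back gracefully when it's missing.
function detectFeatures(win) {
  const nav = win.navigator || {};
  return {
    localStorage: typeof win.localStorage !== 'undefined',
    geolocation: 'geolocation' in nav,
    // CSS feature queries (CSS.supports) arrived later than most of the
    // features you'd query with them, so the detector needs detecting too.
    flexbox: !!(win.CSS && win.CSS.supports && win.CSS.supports('display', 'flex')),
  };
}

// In a real page: const features = detectFeatures(window);
// then e.g. if (!features.localStorage) fall back to cookies.
```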
So once we’ve got a policy covering the browsers we’ll fully support, and the ones where we’ll ensure the system works and is usable but may not look as good, I can base my testing on that policy.
You can save more time by only testing on the latest versions of Chrome and Firefox, as these are usually upgraded automatically, and even then, unless someone has dug out a really old version, the differences are minimal. It’s only Safari and IE that we have to pay attention to versions for.
Again, Ghostlab can be used to test multiple browsers in one go. I have some issues using it with a VM, so testing IE on my Mac with it is a pain, but even if I just test Opera/Chrome/Firefox/Safari in one go and then do IE separately, it still reduces the workload a good amount.
I have two strategies for cross-browser testing. I test each story individually on browsers – it’s rare that anything other than visual bugs comes out of this testing – and then, towards the end of the project, I do a couple of test sessions to give the site a full integration check, and I’ll do cross-browser and device testing then. This catches the integration or visual bugs that may have been missed while testing individual stories, and makes sure everything is more polished.
Like with device testing, I tend to screenshot and make notes as I go, then create issues/write up afterwards.