Today, besides developing the actual application, there is a host of testing frameworks and methodologies that have to be implemented during development. They range from unit tests and continuous integration to screen tests and social and third-party API testing, to name a few. For the most part these tests have become the norm in everyday app development life. However, one thing this chap never thought of preparing for was user testing. I don’t mean UX testing, I mean actual field testing with real customers. It might be because, up until now, we’ve only built apps that didn’t require us to collect data before shipping.
So we’re building this mobility mapping application for the disabled community. Think Google Maps for wheelchair users! Put that way, it sounds easy, right?
- Collect location data from users
- Record other metrics to determine difficulty and obstacles
- Implement the strokes algorithm we’ve spent months developing to drive obstacle detection.
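The collection side of the list above can be sketched roughly like this. This is a minimal illustration, not the app's actual schema: the field names, thresholds and the crude heuristic are all hypothetical stand-ins for the real detection algorithm.

```python
from dataclasses import dataclass, field
import time

@dataclass
class LocationSample:
    """One data point recorded while a volunteer traverses a route."""
    latitude: float
    longitude: float
    # Extra metrics used downstream to estimate difficulty and obstacles
    # (names are illustrative only).
    speed_mps: float = 0.0
    vibration_rms: float = 0.0  # accelerometer-derived surface roughness
    timestamp: float = field(default_factory=time.time)

def is_potential_obstacle(sample: LocationSample,
                          speed_floor: float = 0.3,
                          vibration_ceiling: float = 2.5) -> bool:
    """Crude stand-in heuristic: a near-stop or a vibration spike flags a
    candidate obstacle for the real algorithm to examine."""
    return (sample.speed_mps < speed_floor
            or sample.vibration_rms > vibration_ceiling)

sample = LocationSample(-33.8688, 151.2093, speed_mps=0.1, vibration_rms=0.4)
print(is_potential_obstacle(sample))  # a near-stop flags a candidate
```

The point of a record like this is that the raw GPS trace alone isn't enough; the extra per-sample metrics are what let you infer difficulty after the fact.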
Wait up, we also need path navigation from point A to point B. Ok! Based on paths taken by other wheelchair users. Ok? So how are you going to populate the data for the path navigation to work before public release? EASY, give the app to a few hundred volunteers so they can go out and map the streets of Sydney and Melb…ohhhhh I see your point!!!
From a commercially strategic point of view, going iOS first makes sense. Not so much when you factor in the limitations Apple imposes on licensed devices: only 100 devices per development license, and once a device is registered it can’t be removed from the account until the license renews. Facing an open-ended number of unknown volunteers, we knew we had a major problem on our hands.
The early choices for us were either:
- License a few more individual app development accounts: Costly and a nightmare to manage the different certificates, profiles and builds.
- Switch to Enterprise licensing, which would give us 500 devices: We need to work smarter, not harder. Plus it's still costly.
- Develop an Android version at the last minute: Android doesn’t like it when it’s 2nd choice.
- Try to devise a plan rather than throw money at the problem.
We went with choice number 4, and here is how we did it:
#1 Divide and Conquer
The first step was identifying what we hadn’t accounted for:
- A field testing strategy
- A device collection, registration and app distribution strategy
- An automated on-boarding process
- A monitoring and testing strategy for the volunteers
#2 Number of Devices
There is no way around this, so we had to set a cap on the number of outside devices we would add to our system, and decide how many internal devices would need to be circulated during tests.
- We built a simple spreadsheet that outlines the exact number of volunteers needed for each area before the test
- We segmented the volunteers by wheelchair type and collection routes
- Identified what type of device each volunteer would have (i.e. owned or internal)
- Chose which team members would be on site during the collection process.
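The planning spreadsheet described above boils down to a few simple counts, which can be sketched like this. The roster entries, segment names and device labels are made up for illustration; the real plan lived in the spreadsheet.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass(frozen=True)
class Volunteer:
    name: str
    wheelchair_type: str  # e.g. "manual" or "powered"
    route: str            # collection route assigned for the day
    device: str           # "owned" (their iPhone) or "internal" (ours)

# Illustrative roster, not real volunteer data.
roster = [
    Volunteer("A", "manual",  "CBD-north", "owned"),
    Volunteer("B", "powered", "CBD-north", "internal"),
    Volunteer("C", "manual",  "CBD-south", "internal"),
]

# Every participating device consumes a slot against the 100-device cap,
# whether the volunteer owns it or we hand it out.
registered = len(roster)

# Internal devices we need to circulate during the test.
internal_needed = sum(1 for v in roster if v.device == "internal")

# Coverage check: volunteers per (route, wheelchair type) segment,
# so no segment is left unmapped before the collection day.
coverage = Counter((v.route, v.wheelchair_type) for v in roster)

print(registered, internal_needed, coverage[("CBD-north", "manual")])
```

Keeping the cap, the internal-device count and the segment coverage in one place is what made it possible to commit to a fixed number of registered devices up front.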
#3 On-Boarding
Early on we identified that we needed a proper user guide on how to acquire a UDID, and how to install the app and collect data. That might seem like overkill, but ordinary users are not computer geeks. We wanted the user guides to be a team effort, and we didn’t want to buy extra InDesign licenses for the team, so we searched for a cloud-based tool that:
- Allows multiple team members to edit at the same time
- Has InDesign-like graphic capabilities
- Doesn’t s**k
After a brief search we picked LucidPress (Note: Affiliate link). We designed the layout, wrote the content and added the screenshots all in a single day.
For the on-boarding process we felt that spreadsheets weren’t going to cover it, not with the amount of back and forth and different team members having to do different tasks. So we re-purposed our CRM tool, Insightly, to handle the on-boarding process. Once a volunteer enters the system, a set of automated task-sets and events is executed in sequence, each task and event assigned to the right person, and it worked beautifully. Except for that first batch, when I woke up and found that my calendar was two feet high!
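The task-set automation described above can be sketched as follows. The task descriptions and assignee roles are hypothetical; in practice this logic lived inside Insightly's automation, fired when a volunteer entered the CRM.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Task:
    description: str
    assignee: str  # team member role responsible for the task

# Hypothetical task-set; each entry becomes a calendar item per volunteer.
ONBOARDING_TASKS = [
    Task("Email the user guide and collect the device UDID", "coordinator"),
    Task("Register the UDID on the developer account", "ios-dev"),
    Task("Cut a provisioned build and send the install link", "ios-dev"),
    Task("Confirm install and schedule a collection slot", "coordinator"),
]

def onboard(volunteer: str) -> list:
    """Expand the task-set into per-volunteer entries, in sequence,
    each assigned to the right person."""
    return [f"{volunteer}: {t.description} -> {t.assignee}"
            for t in ONBOARDING_TASKS]

entries = onboard("volunteer-042")
print(len(entries))  # one calendar entry per task in the set
```

Multiply four tasks by a few dozen volunteers in one batch and it becomes obvious how a calendar can end up two feet high overnight.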
#4 Distribution and Monitoring
We had one of two choices in this area: go with TestFlight or Fabric. We liked Fabric; we’ve been using it ever since it was only Crashlytics. It is not as seamless as TestFlight for non-technical people, and it has its installation quirks, but the good parts outweigh the bad ones. The real-time monitoring and crash tracing have placed Fabric at the top of our dev stack.
1st Field Collection Day
The very first big day, so how did it go? Well, in the first few hours we faced turbulence, lack of experience I guess, but we didn’t burn and crash, which is good. The tools and processes held together; we fixed the morning’s issues, and after a few short episodes of drama the collection commenced and the day ended successfully. We went back, iterated on the process and fixed the broken steps. The remaining collection days didn’t have any issues.
Real user testing is not something far off; it can happen in any project, and the current licensing systems don't account for it. Maybe one day, when it becomes an issue for enough organizations, those restrictions will fade away or platforms will offer an all-in-one solution. In the meantime, plan for it. If nothing else, take it as an innovative practice for you and your team. Innovation is part of what makes this whole business fun, even if it's crude and rough around the edges.
Have you or your team faced a similar situation? I'm curious to learn about your innovative solution. Write it below in the comments section.