My approach


In any design effort, I prefer to follow the strategy below.

Fortunately, it is a common-sense approach that is easy to adapt to any work environment, whether waterfall, Agile, Lean, design thinking, or something else.

Essentially, the key is to find out what users need, design solutions for those needs, test those solutions with users, and then learn from the results.

2012 - 2013 / A startup dreams of bringing mobile fun to the masses

Mobile games


Threadbare was an Agile startup in San Francisco. The studio wanted to create a line of tablet games, focusing on the turn-based strategy genre.

As with most startups, they had very limited funding. The goal, therefore, was to produce MVP versions of games as quickly as possible in order to find a winner and start generating revenue.

The process

We generally followed the approach outlined in “My approach” above, but the process was considerably shortened compared to non-Agile environments.

For the first couple of weeks, the development team learned iOS, while I studied research on what makes games fun. I produced several lists of game elements that we could build into our designs.

After that initial period, the work became a series of sprints. In the first, the dev team started on the core code while I laid out the basic UI.


My role

I was brought on as a UX consultant. My main job was to ensure that the games we created were fun, playable, and something on which people wanted to spend money. However, I also ensured that they were usable and learnable.

...and the results

One game was released; it got favorable reviews but was not a commercial success.

One game made it to mid-stage development but had to be put on hold because of technical challenges.

One game made it to early-stage development but was canceled when the company shut down.

 

The activities

  • Page flows (Google Docs*)
  • Storyboards (Google Docs)
  • Wireframes (Google Docs)
  • Mockups (Google Docs and Axure)
  • Prototypes (Axure): http://fnq3ia.axshare.com/
  • Game tutorials
  • Usability testing (remote)
  • User stories
  • Design principles

* Google Docs was the main creation tool because it provides good collaboration capabilities at no cost. Axure was used for prototypes.




Challenges, strategies, and lessons

Very limited resources meant development considerations tended to have higher priority than UX ones.

This is an issue in many Agile environments. It worked out well enough in this case because, despite the distance, I had good communication with the dev team: I understood the constraints they were under, and they understood the need to prioritize essential UX issues. Communication is the lesson.

I worked remotely from Norway with an Agile development team in San Francisco.

This issue was difficult to overcome, and we were unable to fully address it. We made good use of Skype and Google Docs for sharing and collaboration, but it was clear that my full-time presence with the rest of the team would have increased the efficiency and efficacy of our work. The main lesson is that co-location is best for Agile work, but there are ways to make it work with a distributed team.



2012 - 2015 / Saving lives by improving drone use in search and rescue missions

Command & control


The problem (opportunity)

Drones have an increasing presence in search-and-rescue (SAR) events.

However, their use is haphazard, because field commanders have no direct visualization of where the drones are or what they are recording.

Rather, that information reaches them indirectly via radio from the drone operators out on scene.

 

The idea

Provide a system for first responders in SAR efforts to locate, direct, and monitor the feeds of different drones being operated on scene.

My role

I led the design of the system’s UX and UI. The focus was on needs identification, requirements generation, human-factors considerations, wireframing, and evaluation.

 

The approach

My team followed the user-centered design process outlined in “My approach” above. It was a multi-year effort divided roughly into two stages: research and iterative design.

End users were involved throughout the process, providing input through interviews, workshops, participatory design sessions, and usability tests.

 

The research stage

The research stage began with an analysis of the SAR domain. We reviewed previous projects and research, studied existing systems in the area, talked with experts in the field, and had a workshop with SAR stakeholders.

We also used a SAR operations manual to create a mind map showing the interactions between different roles in SAR efforts.

The result was a set of user requirements for the system interface.

 

The design stage

Design proceeded over several iterations separated by workshops with users. It produced:

  • Usage scenarios (to focus the design efforts)
  • Wireframes (created in PowerPoint for easier collaboration with remote teams, and easier transfer to Word for final delivery)
  • Mockups (also PowerPoint)
  • Storyboards (PowerPoint)
  • Paper prototypes (…paper)
  • Working prototypes (built in C# by front-end developers on the team; see the sketch below)
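
To make that last item concrete, here is a minimal sketch of the kind of data model such a working prototype might be built around. It is illustrative only: the names (DroneStatus, CommandConsole), the coordinates, and the feed URLs are hypothetical and not taken from the project's actual C# code.

    // Hypothetical sketch: a command console that aggregates telemetry
    // from multiple drones so a field commander can see at a glance
    // where each drone is and which video feed it provides.
    using System;
    using System.Collections.Generic;

    // One telemetry report from a drone on scene.
    record DroneStatus(string DroneId, double Lat, double Lon, double AltitudeM, Uri VideoFeed);

    class CommandConsole
    {
        // Latest known status for each drone, keyed by drone ID.
        private readonly Dictionary<string, DroneStatus> _drones = new();

        // Called whenever a telemetry packet arrives from a drone.
        public void UpdateStatus(DroneStatus status) => _drones[status.DroneId] = status;

        // Print a simple overview for the field commander.
        public void PrintOverview()
        {
            foreach (var d in _drones.Values)
                Console.WriteLine($"{d.DroneId}: ({d.Lat:F5}, {d.Lon:F5}) at {d.AltitudeM} m, feed: {d.VideoFeed}");
        }
    }

    class Program
    {
        static void Main()
        {
            var console = new CommandConsole();
            console.UpdateStatus(new DroneStatus("drone-1", 61.12345, 6.54321, 120, new Uri("rtsp://example.local/drone-1")));
            console.UpdateStatus(new DroneStatus("drone-2", 61.12400, 6.54100, 80, new Uri("rtsp://example.local/drone-2")));
            console.PrintOverview();
        }
    }

A real implementation would render this overview graphically (locations on a map alongside live video) rather than as text, but the underlying aggregation problem is the same.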

Challenges, strategies, and lessons

The project team included a dozen partners (a mix of academic, corporate, SME, and research organizations) scattered across Europe, making coordination difficult. The situation was complicated by the fact that the partners all had different work models and short-term goals.

The partners handled this by arranging a number of meetings throughout the year. However, the different groups still operated within their own silos. One consequence was that, early in the project, we discovered that the actual implementation of the system interface had diverged from my team’s designs. Getting realigned required some extra effort, which was a valuable demonstration of the need to set up good communication channels in any distributed project team.

SAR events occurred infrequently and irregularly, so opportunities to observe them or to test solutions in the wild were rare.

We had to be both strategic and opportunistic with our user testing and research. On the one hand, we had to be ready to gather input whenever the chance presented itself. On the other, we had to look ahead for events (e.g., product demonstrations, first-responder training exercises) that would give us access to end users. We also had to work at extending our professional networks to get access to the right people.