HCD is For Everyone

Many organizations are requiring human-centered design practices. But what does that really mean?

It's increasingly common to see companies say they center their customers' needs, or to see government agencies adopt customer- or user-experience requirements (such as the recent Executive Order on Customer Experience affecting federal agencies).

However, change is difficult, even if it yields positive results. Organizational changes often require support, resources, training, and monitoring to measure their value. Research on change management from Gartner shows that half of all organizational change initiatives will fail and that nearly 75% of change-affected employees experience moderate to high stress as a result (also known as change fatigue).

Research from Gartner shows that changes (even if positive) can cause stress and performance issues

Our challenge: Support lasting HCD practices while minimizing change fatigue.

Our learning plan had 3 parts: 1:1 dyads, custom content, and safe risk-taking

1:1 Dyads

Our learning program paired an expert practitioner (in this case, a UX Researcher or UX Designer) with 2 apprentices for 6 weeks. This program became the expert's main work focus so that they could devote appropriate time to the apprentices without being overburdened.

The dyads created a working agreement to align on expectations related to working hours, communication preferences, and escalation paths (if needed). This helped the dyads find quick harmony for their 6 weeks together.

The dyads also carved out time to meet several times a week to discuss the material, answer any questions, or shadow other practitioners doing work relevant to the apprentice.

1:1 time facilitated an accelerated learning path for dyads (Photo: WOCinTechChat.com)

Custom content, not one-size-fits-all curriculum

After an initial conversation about the apprentice's skills and learning styles, the expert put together a learning plan (like the excerpted plan shown below) for the apprentice. This included a mixture of content, including websites, artifacts, online classes, and ideas for applying the skills they were learning. Apprentices reviewed, revised, and approved their plans before moving forward.

Apprentice participation and buy-in were crucial to ensure an active and effective learning experience during their short time with the expert.

An excerpt from an apprentice's learning plan with customized goals and skills areas

Safe risk-taking

Traditional learning is often assessed by exams or practical demonstrations. However, a 6-week, fully remote learning dyad pairing an HCD expert with a federal employee was anything but traditional. Creating an exam or requiring a project didn't seem relevant to this program, but how else could the apprentices test out their skills and receive feedback?

Following education research that shows that safe risk-taking increases positive learning outcomes, apprentices were given opportunities to apply their new skills in supervised environments where "failure" did not carry heavy consequences.

One apprentice, a registered nurse, held a workshop on how health care could change with new technologies. The workshop was attended by their colleagues and several HCD practitioners, who all participated and provided feedback on the content and delivery.

Another apprentice, a technical manager, wanted to practice assessing program needs. To do this, they conducted 1:1 interviews and analyzed the results with an affinity diagram to look for themes and insights.

Excerpt from the planning session used to collaboratively structure an apprentice's workshop event

What was the apprentices' experience of this program?

Always iterating, always open to feedback

We were open with our apprentices about how experimental this learning format was and how crucial their feedback would be to ensuring that future iterations of the program were valuable. We collected feedback in three ways:

  • Weekly reflection surveys completed by the apprentices

  • A daily diary log completed by the expert

  • A post-program retrospective with all parties in attendance

After the 6-week program was over, team members (who did not serve as experts or apprentices) synthesized all the feedback collected to determine what worked, what changes should be made, and how the apprentices experienced the program. This data was plotted onto a matrix to assess for level of effort and value to the apprentice and/or expert.

A sample of the effort/value matrix used to synthesize program feedback holistically

Future HCD apprenticeship programs are currently being planned using the lessons learned from this pilot to improve and deliver value.

For example, one observation from the expert was that apprentices needed more time from their supervisors to focus on the program: "I think expecting a 15% commitment from apprentices without actually lowering their workload is just a recipe for them to be behind OR overwork themselves."

Additionally, the expert benefited from support and backup from other team members, stating, "[colleague] gave me some really good feedback on [apprentice's] plan, which is making me feel more confident."

Selected apprentice feedback

Ann was dedicated to being successful, regardless of time constraints
— Apprentice 1
My SMEs are fully engaged and committed to the process and Ann and [colleague name redacted] are very intentional in trying to help me achieve my objectives in the program
— Apprentice 2
