Here you’ll find a few quick examples of work I’ve done…
Working through complexity
I initiated this piece of work following a team retro where I shared my feelings of overwhelm and found that some of the team felt the same.
It had been decided that the team would start with the performance reporting part of the offer.
However, when we started to get into the detail, it became clear that the team did not agree on what the work actually was.
To help the team get aligned, I asked them to do some ‘homework’: send me their definition of performance management as a quick message on Teams.
On a Miro board I plotted the definitions on a scale from the smallest amount of work implied to the largest.

I then took the key words from those definitions and arranged them so that you could work across the board, combining them into tasks that would make up the performance management work.
I shared this at a team meeting to demonstrate not only that we were not aligned, but also that if we tried to deliver every one of those tasks we would be taking on a huge amount of work.
This helped the team understand why the work had been feeling so overwhelming.
It also kicked off a piece of work with our team leader to create and agree a shared definition of the work.
Understand what the data is telling you
When colleagues in the team shared the data from the latest round of partner reporting, it seemed to show that a high number of organisations had not completed all the sections.
When I was asked to help draft the follow-up email prompting people to complete their reporting, I considered what the data was telling us, to see whether I could include some further support in the email. Looking at the data from the partner perspective raised questions. I wondered why people would start a section but then not finish it. I also questioned why people would leave a section 60% finished but then tell us it was complete (via a ‘completed’ button).
My questions prompted a colleague to call a couple of the organisations to get their take on it. Just by speaking to three people, she began to suspect that the issue might be on our side.

We learned that although some organisations had not completed their reporting, there were significantly fewer than we had originally thought. This meant, as my colleague shared above, that we avoided undermining confidence in the new system and creating a lot more work for those partners and the team.
Get to the test
The team had agreed to introduce a consistent way to measure audience feedback for every event.
We had an idea of what we wanted to ask: a combination of quantitative and qualitative questions.
It was suggested that tools like Slido and Mentimeter had the functionality we wanted, but we would need to work through the logistics of getting accounts.
There were a couple of events coming up that we could test with, but time was running out.
I started to explore the tools we had available. I got up and running with Microsoft Forms and put together a prototype using the questions we had agreed.

When I shared this with the team, we were able to move forward. Although it wasn’t perfect, we knew we’d be able to test with it.
It meant we could focus on planning how to integrate the evaluation test into the upcoming meetings. Without it, we might have missed the opportunity.
