00:00 – Introduction: Caroline and Nicola set the stage for the episode and introduce guest Emily Williams.
01:20 – Ice-breaker: Most unusual interview questions.
02:38 – What does great customer success look like? Emily explains why strong relationships and trust are the cornerstone of successful delivery.
04:28 – Defining success beyond go-live: How to measure impact after implementation and keep iterating.
06:50 – Lessons from client partnerships: Emily reflects on moments that shaped her approach to customer success.
09:59 – What successful clients do differently: Consistency and alignment across markets as the key to scale.
12:01 – When things don’t go to plan: Why delays or “no” answers can sometimes lead to better outcomes.
16:00 – Myth-busting in assessments: Emily challenges the idea that “shorter equals better” for candidate experience.
20:16 – What others should know about Customer Success: Emily explains why CS isn’t just a team, but a mindset shared across the organisation.
22:41 – Science or Fiction? Quickfire round where Emily reveals truths and myths about onboarding, fast implementation, customisation, candidate experience, and scalability.
31:18 – Key Takeaways: Emily discusses the importance of clear communication and business-wide understanding when rolling out assessments. Nicola and Caroline wrap up with their key takeaways on trust, collaboration, and a customer success mindset.
35:56 – Outro: Closing thanks and where to find more episodes.
Caroline Fry (00:00)
Welcome to The Score, where we break down what's really working in hiring, separate the science from the fluff, and hear from the experts shaping the future of talent.
Nicola Tatham (00:08)
I'm Nicola Tatham, Chief IO Psychologist. I help organisations put the right people into the right roles with skills assessments that are scientifically valid, reliable, and fair.
Caroline Fry (00:19)
And I'm Caroline Fry, Head of Product at Sova. I build the bridge between assessment science and hiring reality, making it work for recruiters and candidates.
Nicola Tatham (00:28)
In each episode, we bring in a guest with talent assessment expertise to help us tackle the big challenges in hiring.
Caroline Fry (00:35)
Because when it comes to talent, science-backed decisions aren't optional.
Nicola Tatham (00:39)
Today we're joined by Emily Williams. She's Sova's Customer Success Director.
Emily leads the team that makes sure talent assessment solutions go beyond implementation and actually drive value for clients and candidates alike.
So Emily's seen it all – global rollouts, tricky timelines, stakeholder wrangling. Today she's here to share what makes delivery really work, what real partnerships look like, and why customer success is a lot more strategic than people think.
So hi Emily, good to have you with us. Caroline, I'll let you ask the first question.
Emily Williams (01:12)
Hi. Thanks for having me.
Caroline Fry (01:20)
Thank you. Hi Emily, welcome. We're going to start with something light and easy to break us in. What's the most unexpected question you've ever been asked during an interview?
Emily Williams (01:25)
That's an interesting question. I've never had any of the ones people tend to mention, like “what ice cream flavour would you be?” or anything unusual like that.
But when I was at uni, I applied for a youth working role at our local community centre. They gave us a series of monsters that represented different emotions, and asked us to turn that into some kind of exercise or game for teenagers. I thought that was quite unexpected.
It was interesting because I was a teenager myself at the time, trying to figure out how you'd make that into a game that other teenagers would actually want to do – which obviously most wouldn't! But it was definitely an unusual and memorable experience.
Nicola Tatham (02:33)
Yeah, that does make some sense for the role you were applying for. Interesting. Thank you.
So, moving on to some questions that are slightly more relevant to the role you're in now. Let's start with quite a big one: what does great customer success actually look like in our space, the assessment space? And how is that different from just supporting the technology?
Emily Williams (03:07)
It's a great question. I think customer success is multifaceted. There are so many different roles a customer success team plays for clients.
One of those is relationships. Building a strong, foundational relationship with clients and really understanding what they're aiming to achieve helps to shape the direction.
The tech is part of it – making sure there's a clear understanding of the solution and what it's bringing to the organisation is key. But it's also about looking at the different groups of individuals using that technology, and thinking about how their goals align with the overall direction we're heading in.
There's that initial relationship building, which is vital when you implement something for the first time. But it doesn’t stop there. It's about looking at the whole 360 view of what we're aiming to achieve and the different ways we could get there.
So yes, there's a lot within it, but I think the cornerstone is strong relationship building, trust, and making sure everyone is heading in the same direction towards the same goals.
Nicola Tatham (04:28)
So how do you define success beyond just getting something live? I mean, getting something live is one element of success in its own right. But how do you define success beyond that, especially when every client’s goal is different? And even within a client, different stakeholders will define success differently.
Emily Williams (04:50)
Exactly. Every client has a different goal, and within that, different stakeholders have their own.
Like I mentioned, it’s about getting to the “why”. That’s the most critical thing, because if you really unpick why we’re doing something, that drives everything else.
Caroline Fry (05:12)
Thank you.
Emily Williams (05:18)
You're all aiming to achieve something ultimately. The initial go-live tends to be what everyone focuses on – it's the immediate goal. But then it’s about asking: how often is this being used across business areas or markets? Are there blockers stopping some areas from adopting it? What can we do to engage them?
It’s also about looking at the way solutions have been built. Often, with a first go-live, things are created in a very specific way, especially for new customers who aren’t yet familiar with our solutions. But once they’ve used the platform for a while, there’s often iteration. They might realise some things work better than others, so how can we enhance this and make it even stronger?
We can learn a lot from the data after that first go-live, and use it to shape what comes next. So I think success isn’t static – it’s about never resting, always iterating, and being creative about how we drive outcomes while keeping the ultimate goals in mind.
Caroline Fry (06:36)
Thinking more widely about your experience at Sova and also previously in CS, can you share a moment from a client partnership that really stuck with you, and maybe something that changed your approach to how you work?
Emily Williams (06:50)
That's an interesting question. When I first joined Sova, I was in the professional services team, then I had a bit of time out on maternity leave, and came back into customer success.
What’s been interesting for me is that there have been a number of moments that feel quite similar. For example, where I had worked with customers in my previous role, we were talking about what they wanted to achieve and how we were planning to deliver on that. Then, after taking some time away and coming back into those conversations, I found the goals were largely the same. There are variations, particularly if we’re working with new stakeholders, but broadly the goals are still the same as months or even years before.
What I find really interesting is that we’re often driving forward towards the goal, but not reflecting back on the journey so far. What great things have we already done that are bringing us closer to that ultimate goal? I think that’s really important, because it’s not just a tick-box of “do these three things and we’ll get there.”
Sometimes the goal might feel far away or unattainable because it’s not something you see immediate results from. But in the meantime, there’s great work happening that slowly moves the needle and gets you further towards that end goal. It’s about reflecting on all those other things – the progress already made – rather than always just looking forward to the next thing.
Caroline Fry (08:48)
Yeah, I was going to ask how you make that actionable, even with clients who might feel like they don’t have the time to look back. But it’s a good lesson for all of us, right? To look at the progress we’ve made and do those check-ins. It’s the same with individual development – why not do that collectively with vendor or supplier partnerships? Do clients respond to that? Are they open to considering the journey they’ve taken with us?
Emily Williams (08:57)
Some more than others – it depends on where they are in their journey. It’s a really good thing to do at milestone moments, like the start of a new year, when you can sit down, reflect, and have conversations about how you’ve been doing things and how you want to move forward.
It’s particularly nice when you do it with the broader teams who’ve actually delivered on some of this, because they might be hearing about the goals we’re trying to reach, but not necessarily about the progress that’s already been made. So it varies, but it’s always a nice thing to do.
Nicola Tatham (09:59)
So related to that, in your experience, what is it that your most successful clients do differently? What makes them stand out? And we won’t ask you to name any names.
Emily Williams (10:19)
I think one of the main things is consistency – in a number of different ways. Consistency is really important for making sure there’s a solution in place that works for everyone.
When we’re reflecting back on goals and what we’re trying to reach, if there’s a more consistent process in place across markets or business areas, it gives the opportunity to use data to see where things are working and where they could be improved or enhanced. That really helps drive a deeper understanding of how to optimise.
It’s a big journey – a huge change management piece – for a business to align and build that consistency across processes. But ultimately, it helps us see clearly where they are and how to move further towards their goals.
There are still lots of clients that do things in a more bespoke way across different areas, and that works too. But where I’ve seen it work really well at scale, there’s definitely that consistency, even if the journey looks slightly different for each part of the business.
Caroline Fry (11:40)
So my next question: we’ve talked about the best examples of things going well – theoretically, in an ideal world, how things can work. Can you tell us about a time when something didn’t go to plan from your perspective, but actually ended up being a better outcome in the end? Whether that’s a client implementation or something more specific.
Emily Williams (12:01)
There have been many times where things haven’t quite gone to plan. One that comes to mind – and this has happened a few times – is around features in our platform and specific requests clients have made, which Caroline, you’ll sympathise with.
Caroline Fry (12:27)
Yes.
Emily Williams (12:28)
Clients sometimes ask for very specific things for our platform to do. That might be because they’ve used a different platform before that did that exact thing, or because they’ve thought of another way of doing something to reach their goal.
Sometimes we can do it – the feature request is quick, and we roll it out. Other times it’s a “yes, but not right now,” and we do it further down the line. From their perspective, that can feel like something hasn’t gone to plan, because they were hanging their hopes on that feature.
But what it allows us to do is go back to the “why”. Why do you need the platform to do this specific thing in this specific way? Is it because you’ve done it before and it worked well? What challenge are we solving? Where has this come from, and how can we move it forward?
It might still end up being a feature request – and that’s fine – but sometimes there’s a different way to use the platform, or a different process we can put in place, that achieves the same thing in a more creative way without relying on the tech. That often helps uncover what we’re really trying to achieve, instead of assuming the platform has to solve it directly.
Nicola Tatham (14:00)
So instead of knee-jerk reactions, it’s about stopping, pausing, reflecting – why do we need this? Is there a quicker alternative route? Do we actually need to do it this way, or is it just because it’s always been done this way? Is there a more innovative approach? That makes sense.
Emily Williams (14:10)
Yeah, absolutely. Sometimes a longer-term focus helps eradicate the problem entirely. It’s about thinking creatively about how we could do things differently to get the same outcome, without relying on the platform to change.
Caroline Fry (14:49)
Speaking for every product person ever – that is hugely appreciated. We often find those knee-jerk requests end up being features that get used once and then just make a platform more complex.
You’re basically doing a product process in those conversations: finding the pain point and figuring out the right solution.
Caroline Fry (15:17)
The solution might be technical, but it might also be process. Especially for us, because our platform is about people, it’s not always clear-cut that the solution should be technical.
Doing that analysis early helps, and it’s good for transparency with clients – helping them understand why something may or may not be approved, and encouraging them to think in that way too. So, great job – thanks.
Emily Williams (15:31)
Hmm.
Nicola Tatham (15:52)
So what's one myth about rolling out assessments that you wish we could leave behind for good?
Emily Williams (16:00)
Good question. One myth is that a shorter assessment automatically means a great candidate experience.
There’s been a big drive, especially in recent years, to make assessments as short as possible, with the idea that candidates will have a better experience because they’re not spending as long taking the assessment. And that’s true to a point – shorter assessments can improve the experience, particularly if there are multiple components being measured.
But candidate experience isn’t just about how long they spend in the platform or how they perceive the steps they’re going through. It’s also about the broader context of what great candidate experience really is. Fairness is a huge part of that. If candidates aren’t having a fair experience – even if they’re not aware of it at the time – it will impact them and the outcomes of their journey.
So it’s about that trade-off, Nic, which you’re always talking about:
Nicola Tatham (17:16)
Yes, constantly.
Emily Williams (17:22)
Finding the balance – keeping assessments as short and engaging as possible, while also making sure we don’t lose the really vital parts that ensure validity and fairness.
Nicola Tatham (17:35)
Exactly. It comes back to those pillars that underpin good assessment: reliability and fairness. And I think probably in the last ten years, candidate experience has risen to sit alongside those other pillars. In the past, I remember designing assessments where it was just one psychometric test, but…
Nicola Tatham (18:02)
Some assessments used to take over an hour to complete – just for one – and people were asked to do a battery of them. It’s about finding that sweet spot and balance, recognising that short isn’t always going to give you the psychometric robustness you want. That’s not to say we can’t develop shorter assessments that are still valid and robust, but it’s about not cutting corners to the detriment of quality or of those underlying principles. So yeah, I completely agree with that, Emily.
Caroline Fry (18:35)
I think product and platform play a big part in that too. Short doesn’t necessarily equal good candidate experience. What matters is that the assessment is engaging to take.
Nic and I are working on designs right now that are flexible in length, as long as the candidate experience is engaging and enjoyable. We’ve all taken very dry, stale tests and had a poor loading experience – not exactly exciting. Gamified assessments have had their day, and things have evolved.
I think there’s more we can do to make assessments genuinely engaging. Good doesn’t equal short – it equals good and engaging. That’s a challenge both my team and Nic’s team have to rise to when we’re developing things.
Emily Williams (19:23)
Hmm.
Caroline Fry (19:34)
So we can capture the data we need in the time it takes, without creating a negative experience for candidates along the way.
Emily Williams (19:45)
Mm-hmm. Yeah, absolutely.
Caroline Fry (19:48)
Slightly related to that – since we’ve talked about both product and IO – your team also works closely with many other business functions. What’s one thing you wish each of them better understood about your world in customer success? Be as open and frank as you like, we can take it.
Nicola Tatham (20:11)
I’ve got my notepad ready for this.
Caroline Fry (20:12)
Yeah, me too.
Emily Williams (20:16)
Customer success is interesting because I don’t think it’s universally well understood. It looks a little different in every business. Increasingly, there’s a lot more great knowledge sharing about the function out there, but it’s still something of an unknown.
Because it’s so multifaceted in terms of what customer success teams focus on and deliver, that can make it confusing. Often the customer success team are the ones who “wear all the hats” – regardless of the question or situation, we’re the team people turn to, which means we’re pulled in many directions at once.
One thing that would be really useful for everyone to understand is that customer success isn’t just a team within the organisation – it’s also a mindset. It’s something every team across the business contributes to. Customer success is threaded into what we all do every day, regardless of which team we’re in.
I mentioned earlier the internal relationships and the trust we place in colleagues across the business to deliver their part of the journey brilliantly. We wouldn’t be successful without the support we get from every team – everyone contributes to driving customer outcomes and helping us reach those goals.
So it’s less about what we do in customer success, and more about the contribution everyone makes, and how much we rely on those great relationships and the knowledge that sits across the business. We’re constantly reaching out to every department. That’s something I think isn’t always fully understood, but it would be great if it were.
Caroline Fry (22:41)
I like it. So effectively we’re all one big customer success team. Makes sense.
Okay, before we close the episode, Emily, we’d like to play a short game called “Science or Fiction.” We’ll throw out some statements and ask you to tell us which are good assessment advice and which are myths. No pressure.
The first one: science or fiction – great onboarding guarantees great outcomes.
Emily Williams (22:45)
Yeah, pretty much, yeah. I think onboarding is really critical.
Caroline Fry (23:16)
Okay, I like it. Straight to the point. Let’s carry on. Nic, are you going to ask the next one?
Emily Williams (23:16)
Yeah. Sorry – do I need to expand on this?
Nicola Tatham (23:26)
I’ll add to that one. Emily said great onboarding guarantees great outcomes – I think there’s a nuance to it. You need great onboarding if you’re going to get a great outcome, but you also need more than that. You need a great platform, a great team, great assessments.
So yes, it’s science – but with a caveat. You can’t get a great outcome without a great onboarding experience, but you could have a great onboarding experience and still end up with a poor outcome. Controversial, but true.
Emily Williams (24:01)
Mm, yeah.
Caroline Fry (24:07)
Okay.
Emily Williams (24:11)
Yeah, very true.
Caroline Fry (24:12)
So we could flip it and say: bad onboarding guarantees bad outcomes. That’s probably more clear-cut.
Nicola Tatham (24:20)
Yes.
Emily Williams (24:22)
Exactly. The onboarding phase is the foundation that sets you on the right path. There are iterations after that, but you really need to be set up for success from the start. It’s critical. I completely agree – there’s more to it than just onboarding, but it’s a fundamental step.
Caroline Fry (24:29)
Mm-hmm.
Nicola Tatham (24:38)
It means you don’t want to cut corners when it comes to onboarding. Which takes me to my next statement: fast implementation means you have to cut corners. Science or fiction?
Emily Williams (24:51)
I would say fiction. If you’re doing something very quickly, you have to prioritise and figure out what’s essential for success. That might mean changing the scope slightly.
But doing things fast, in a consistent way, or making good use of off-the-shelf tools rather than customising everything, doesn’t mean the outcomes won’t be as strong. It’s about asking: what are we really trying to achieve, and how can we do that in the time available?
Sometimes it means doing things differently than you first planned, but it doesn’t mean you won’t see the results you’re aiming for.
Nicola Tatham (25:57)
So fast implementation doesn’t mean cutting corners. It means thinking more innovatively about how to get there – creating efficiencies, prioritising key steps, and pushing others further down the line, but without detriment to quality.
Cutting corners has such a negative tone to it, and that’s not something we’d ever encourage. Fast implementation is about using innovation, better processes, and so on.
You mentioned “off the shelf.” The flip side of that is customisation. So the next statement is: customisation always makes the solution better. Science or fiction?
Emily Williams (26:34)
Mm-hmm, yeah. Customisation can be great and sometimes it’s exactly what clients need to make their solution work best for them. But it’s not always the thing that leads to great outcomes.
As I said before, it’s about figuring out what we’re trying to achieve, and the time we have available. Then we decide the best route forward. And we don’t always have to keep things the same – there are opportunities to iterate and develop later.
Nicola Tatham (27:30)
It also depends on what you mean by “better.” If your success criterion is delivery by December and it’s already November, then the best solution is going to be off the shelf, because it’s easier to get live for the client.
Emily Williams (27:39)
Yeah, absolutely. We also have so much data that backs up our off-the-shelf assessments – which we may not have for customised ones. A lot of work goes into validating those assessments to make sure they’re predictive of success in the role.
So it’s about asking: what’s the best route right now, and what could it look like moving forward? Some of the best solutions I’ve seen clients implement use a combination of both – off the shelf and customised.
Caroline Fry (27:50)
Yes.
Emily Williams (28:14)
Some customised parts and some off-the-shelf parts – that combination works really well for many clients. So again, really getting under the skin of what they’re looking to achieve is important.
Nicola Tatham (28:23)
Next one, Emily. The statement is: candidate experience is someone else’s problem. Science or fiction?
Emily Williams (28:33)
Fiction. Candidate experience is everyone’s problem.
Different teams might focus on different parts of it, but everybody needs to have a focus on candidate experience – that’s ultimately what makes the solution work well. Of course, there are other factors like efficiency, but candidate experience is sometimes lost along the way, and that’s really critical.
It’s about thinking through how candidates interact with the platform, the stages of their journey, what information they need to have, and when. Some clients do this really, really well. The key is to make sure candidate experience is always brought to the forefront.
Nicola Tatham (29:27)
So a definite fiction. No grey area in that one.
Emily Williams (29:29)
Yeah.
Nicola Tatham (29:33)
Final one for you, Emily. You can’t build a truly scalable solution without saying no sometimes.
Emily Williams (29:44)
Science. That’s true.
I talked earlier about consistency being key to scalability. You have to consider all the inputs people bring to the overall journey – but ultimately, some things just won’t work if global or cross-business consistency is what you’re aiming for. There will always be requests that simply aren’t scalable. Saying no is an important part of that journey.
That can include feature requests or requests from different teams, markets, or business areas. But saying no is also tied to building trust. If you always say yes, you’re not being transparent or honest – with clients or internally. You have to be able to say no and suggest a different approach that will actually deliver the best outcomes.
Nicola Tatham (31:18)
Thank you, Emily. Thank you so much for all of the insights you’ve shared today. One more question: if there’s one piece of advice you’d give to someone about to roll out a talent assessment solution, what would it be?
Emily Williams (31:36)
Also a good question. I’d say: make sure what you’re doing is well understood by the business. That’s ultimately what leads to buy-in and helps you reach the goals I keep talking about.
If it’s clear what you’re doing, why you’re doing it, and what outcome it will drive – whether that’s efficiency, fairness, or something else – then you’ll get better adoption from the different users in the platform day to day. That also drives the data you need to start iterating and improving as you move forward. So yes, that would be my one piece of advice.
Nicola Tatham (32:29)
That brings us to the end of our podcast for today. Emily, we really appreciate your time and your insights – it’s been fascinating.
As a final reflection, there’s been a lot to take in. With my Sova hat on, it feels clear that relationships between teams within our business are vital to good customer success. We all need to understand what our clients’ goals are, how they define success, and then align across our teams to achieve that.
Sometimes what one team is trying to achieve can feel like it’s in competition with another’s priorities. But we need to work closely together to get to the point where we’re all pushing in the same direction for the client. The only way to do that is by building trust, having strong communication, and being willing to say no to each other when needed.
Emily Williams (33:47)
Mmm.
Nicola Tatham (33:54)
It’s not just about client relationships – it’s also about internal ones. We need to be comfortable saying, “That might not work, let’s think about a different way of achieving it.” That would be my main reflection. Caroline, anything you’d add?
Caroline Fry (34:14)
Not much to add. I think the comment you made, Emily, about us all having that customer success ethos or mindset really stood out. It makes sense – that’s why we’re all here: to build the best solutions, the best experiences, and get the best outcomes for our clients.
And like Nic said, relationships, trust, and transparency are key. Just as clients have competing priorities, we have them internally too. But when your team acts as the conduit for what the client wants – and we know you’ve had those nuanced conversations already – it gives the rest of us confidence. We can trust that when you bring challenges to the business, they’ve been carefully thought through with the client, and we can all work together to solve them.
Caroline Fry (35:35)
I think the value of your team really stands out. Every time Nic and I do this and speak to people within our business, I learn so much. It shows how all the different parts of our organisation work together to deliver the best outcomes for clients.
So thank you for everything you’ve shared. But yes, the time has come to close the episode. Thanks, Emily – and thanks to everyone tuning into The Score.
Emily Williams (35:56)
Thank you.
Caroline Fry (36:04)
If you found this episode helpful – whether you’re rolling out assessments or supporting talent strategies – check out our previous episodes. New episodes drop every two weeks on YouTube, Spotify, or wherever you get your talent acquisition fix.
Nicola Tatham (36:19)
Thank you.
Caroline Fry (36:19)
Emily, you’re amazing. That was so good.
Customer success is not just the job of one team. It’s a mindset that runs through the whole organisation. While Emily’s team leads on client relationships, successful outcomes rely on support and input from every department. Internal trust and collaboration are just as important as the work done directly with clients.
Success comes from understanding client goals in detail and aligning everyone around them. Launching a solution is only the first step. What matters is how it is adopted across the business, how obstacles are managed, and how the solution evolves over time. Trust and clear communication are essential, both with clients and internally. Sometimes that means being honest and saying no when requests don’t support long-term goals.
There’s a common belief that shorter assessments equal better candidate experience. In reality, quality depends on fairness, engagement, and clarity as much as speed. A well-designed assessment can be longer and still positive for candidates, while a short but poorly designed test can feel frustrating or meaningless. Everyone involved in building and delivering assessments has a role in shaping candidate experience.
Good onboarding is the foundation for long-term success. Without it, programmes risk poor adoption and weak outcomes. But onboarding alone is not enough. A strong platform, sound science, and consistent delivery are all needed to make sure assessments continue to create value well beyond the first launch.
Fast implementations do not have to cut corners. They require clear priorities, efficient processes, and sometimes making use of off-the-shelf tools rather than building heavy customisation from day one. Moving quickly can still mean moving well, as long as quality and fairness remain central.
Customisation can add value, but it is not always the best answer. Off-the-shelf assessments come with strong validation and data, making them reliable starting points. Some of the best solutions combine customised and off-the-shelf elements, balancing client needs with proven approaches.
Success is not defined by going live. It is about how well solutions are used across markets or teams, whether they deliver against agreed goals, and how they improve over time through iteration and learning from data. Taking time to reflect on progress is just as important as looking ahead to the next milestone.
Emily’s single piece of advice is simple: make sure the purpose and outcomes of the assessment are well understood across the business. Clear communication of the “why” builds buy-in, drives adoption, and ensures the data generated is meaningful and actionable.
Talent assessment programmes succeed when science, technology, and relationships all come together. They require clarity of purpose, collaboration between teams, and an ongoing focus on fairness and candidate experience. Success is not about a single moment at launch but about building trust, adapting to challenges, and keeping client goals at the centre.
Sova is a talent assessment platform that provides the right tools to evaluate candidates faster, fairer and more accurately than ever.