#5 Changing our relationship with services
Hi,
We are super excited that Mark Priestly has recently joined as Partner, bringing a wealth of tech, product, and design expertise to the team. We can't wait to see the impact he will have.
Over the past month, I've been reviewing third-party privacy policies to ensure they align with GDPR and our own privacy standards before committing to use them. I've noticed one common issue - vague language around data security. Statements like "We will do our best to protect your personal information" and "strict procedures and security features are in place to prevent unauthorised access" lack clear, detailed commitments or specific actions to mitigate risk. If companies offered this transparency from the start, it would earn my trust and save me a lot of time spent requesting more information. - Jo
We are Projects by IF. We help our clients move faster, with less risk, by creating products and services that earn and maintain trust. We help our clients do 3 things:
- Grow in new markets.
- Deepen customer relationships.
- Derisk innovation.
Learn more or talk to us.
What we’ve been up to in the studio
Trust is a critical factor in AI adoption
AI is increasingly everywhere. It's in our mapping apps, fitness trackers, and customer service chatbots. It's making decisions in hiring, finance, and healthcare. But most people don't see how these systems work. And that's a problem.
Our latest blog post explores why a lack of trust in AI creates adoption challenges for organisations, and what to do about it.

How to achieve meaningful adoption of AI
To date, product teams have prioritised interfaces that are easy to use. For years the mantra has been "Don't make users think!"
With AI, this is exactly the wrong model: it risks misuse of, or over-reliance on, the AI. Instead we need to "Make users think!" - designing interfaces that help users evaluate and understand enough about the AI at the point of use, without losing the ambition of simplicity.
This enables co-production between the AI and human users - reducing risk and skills fade, and enabling meaningful adoption.
John Maeda featured some of our thinking on adoption of AI in his 11th #DesignInTech annual report (https://lnkd.in/gsp5SYTH). Thanks John!

AI Agents will change our relationship with services and each other
Listen to Sarah on the Design of AI podcast sharing her extensive knowledge and expertise on the role of trust, and how it will change with the increasing use of AI.
Sarah speaks to the hosts about AI agents, and asks: what will happen when we entrust agents to operate on our behalf? What will we want to understand about their actions, in order to trust that they are operating as we expect?
Empowering Communities: Shaping AI and Data for a Better Future
It was great playing a role in the Connected by Data unconference with DXW and the Ada Lovelace Institute.
Connected by Data have another event coming up as part of their project on giving communities a powerful say in public sector data and AI. It's happening on Wednesday 9 April, and here's the link to register your interest.
Photography by Paul Clarke

Jeni Tennison sharing her wisdom
Coming up
Calling all public service product people
On 4th April we will be at Product for People's unconference, giving a lightning talk on trust in AI. Product for People arrange community events for public service product people across government, health, and care. Tickets have sold out, but there's a waiting list.
What we have been reading
23andMe's inadequate product design, security, and policy led to major data breaches and a loss of trust in 2021 and 2023, and the company has now filed for bankruptcy. 23andMe's trust breach has featured on our website as a cautionary tale of what not to do. It doesn't surprise us that the organisation is in even more trouble. What could be more personal, or matter more, than data about your ancestry?
Originally, hackers stole ancestry data, enabled by poor data practices that failed to recognise or act on the relationship between individuals and their relatives. 6.9 million accounts were breached: 50% of users were impacted, and countless relatives of those users were also affected. Whilst any purchaser of 23andMe must abide by the company's current privacy policies that protect consumer data, customers are making data deletion requests en masse.

A year ago Klarna cut jobs by 50%, with the CEO claiming AI could do all human jobs. We knew this wasn't true, and a year later they have admitted they got it wrong. It's not just the hype here, it's also the lack of authenticity - fuelling confusion around AI's real capabilities. Private and public organisations that tried to follow Klarna's lead wasted innovation effort, time, and resources.
Tanya O'Carroll won her case against Meta on use of data in targeted advertising. The settlement reaffirms our right to object to personal data being used for direct marketing. You can send a GDPR objection email to Meta to exercise your rights against unwanted direct marketing and data misuse.
We worked with Luminate in 2023 on issues around this case. Our work was to uncover ways organisations could advertise more responsibly, in ways that genuinely meet people's needs. That work is published, and we'd love to hear what you think about it.
See you next month -
— The IF Team