AI is the new ESG – but faster and messier

By Claire Southeard, Managing Director, Lansons

There’s a familiar pattern emerging in how organisations are responding to AI. And it looks a lot like the early days of ESG.

A complex, fast-moving issue. Multiple stakeholders with competing expectations. A rush to define positions, often before the strategy is fully formed.

But there is one critical difference: AI is moving much faster. And the consequences of getting it wrong are more immediate.

At our recent roundtable with senior communications and corporate affairs leaders exploring the impact of AI on corporate reputation, one point came through clearly: the margin for vagueness on AI has disappeared.

Organisations are being asked to explain their position on AI now, not in six months.

Meanwhile, the reality vs expectation gap is widening

Most (all?) organisations are still experimenting, piloting, and building AI capability. At the same time, they are under increasing pressure to communicate with clarity and confidence.

Regulators are moving quickly from high-level principles to hard expectations on governance and accountability. Investors want evidence of impact and return, not stated ambition.

Employees are asking direct questions about skills, job security and what AI means for them.
And customers are forming views based on what they can find (or, increasingly, what AI systems select to surface).

The result is a growing disconnect between internal realities and external demands for communication, and that tension is creating reputational risk.

The reputation fundamentals remain. But the stakes are higher

Security. Responsibility. Accuracy. Transparency. The foundations of strong corporate reputation haven’t changed.

But AI is shifting the conditions under which organisations are judged against these things – and that’s what makes this a different kind of reputation challenge. The risks are immediate, the scrutiny is constant, and the gaps are harder to hide.

AI will expose how decisions are really made, where accountability sits, and whether trust built over time is still warranted.

Waiting to control the narrative is futile

The instinct is to hold back – until the strategy is clearer, the data is stronger, or the organisation feels ready to speak with confidence. It’s understandable. But it’s the wrong call.

The debate is happening regardless. Stakeholders aren’t waiting for a polished position. And silence doesn’t create space; it leaves a vacuum that others fill.


Communications both about AI and for AI

It’s not enough to define a corporate position on AI. You also need to ensure your organisation is visible in the AI-mediated environments where audience understanding and perceptions are shaped.

That means managing two communications tracks in parallel:

  • A corporate narrative about AI: what you are doing, the impact that is having, and how you are managing it responsibly, communicated in a way that resonates with investors, regulators, employees, and customers.
  • A content strategy for AI: ensuring your organisation’s story, expertise and value remain visible and trusted.

Both demand clarity and consistency – and neither one can wait for the other to be ‘complete.’

Confident, coherent communication is key

The organisations navigating this transition most effectively are not necessarily those moving fastest on AI operationally. What sets them apart is how they communicate.

It requires alignment across strategy, governance and comms, and leadership willing to speak openly about the journey – even if the destination remains uncertain.

Get comfortable with uncertainty and open about learning

The clamour for answers on ESG grew over time. Initially, imperfect disclosures were expected. Narratives evolved gradually. There was time and space to iterate.

AI hasn’t offered the same luxury. Mistakes are immediate and visible. Narratives are tested in real time. Inconsistency is quickly exposed (not least by AI systems themselves).

It demands a different kind of communications leadership. One that is comfortable operating without complete answers, open about what it is still learning, and willing to move before the full picture is clear.

Because if ESG was steering a tanker through uncharted waters, AI is building the plane while flying it…

**

Want to join our next roundtable discussion?

We are hosting a series of breakfast roundtables for senior communications and corporate affairs professionals discussing the impact of AI on reputation and trust. We have dates coming up in May and June, so please contact us if you are interested in joining one and we will reserve you a place, or add you to the waiting list for our autumn dates.
