Cyber incidents are human crises first. Treat them that way.

When organisations prepare for cyber incidents, the focus is usually on technology, containment, and recovery timelines. These are essential considerations, but they are not the whole picture. In practice, every serious cyber incident is also a human crisis, and how leaders respond to that reality often determines whether the organisation emerges stronger or quietly weakened.

In the midst of an incident, it is people who make the critical decisions. It is people who investigate, communicate, contain, and recover. Under pressure, they work long hours, juggle incomplete information, and carry the weight of knowing that mistakes can have serious consequences. Yet staff welfare is still too often treated as an afterthought, something to be dealt with once systems are restored and headlines fade.

This is a mistake, both ethically and operationally.

From experience, the organisations that handle incidents most effectively are not those with the most elaborate technical playbooks, but those that recognise the limits of human endurance and plan accordingly. Fatigue, stress, and fear are not abstract concepts during an incident. They are daily realities that increase the likelihood of errors, slow decision-making, and damage long-term resilience.

Senior leaders play a decisive role here. The tone they set before an incident ever occurs shapes how teams behave when pressure hits. If staff believe that exhaustion is a badge of honour or that speaking up will be seen as weakness, problems will go unreported until they become unavoidable. Conversely, when leaders actively encourage openness and normalise asking for support, teams are far more likely to sustain performance through prolonged disruption.

Incident response plans often describe roles, escalation paths, and technical actions in great detail, but say little about how people will actually cope with the demands placed upon them. Who steps in if a key individual becomes unavailable? How are rest periods protected when the situation feels urgent? How are expectations set around working hours, especially when incidents stretch into nights or weekends? These questions matter because, left unanswered, they become a source of stress in themselves.

Communication is another area where leadership intent is tested. During an incident, silence breeds anxiety, not just among those responding directly but across the wider organisation. At the same time, uncontrolled information flow can overwhelm responders and distract them from critical tasks. Striking the right balance requires preparation and discipline, not improvisation under fire.

There is also a deeper concern that many leaders underestimate. During serious incidents, staff worry about personal consequences. They worry about job security, reputational damage, and whether their own data has been exposed. Clear, honest communication about what is known, what is not yet known, and how the organisation intends to navigate the situation can significantly reduce this background stress and help people stay focused on the work that matters.

Practising incident response is often framed as a way to test systems and processes. It is just as valuable as a way to prepare people. Exercises help teams build confidence, understand their roles, and experience pressure in a controlled environment. They also give leaders an opportunity to observe how individuals react under stress and where additional support or adjustments may be needed.

Ultimately, putting staff welfare at the heart of incident response is not about being soft. It is about being realistic. Cyber incidents are rarely short, neat, or predictable. They demand sustained effort from individuals who are not immune to stress or fatigue. Organisations that plan for this reality are better equipped to respond effectively and recover fully.

For senior leaders responsible for cyber security, the message is simple: if your incident response strategy does not explicitly account for the wellbeing of the people executing it, it is incomplete. Technology may fail fast, but people fail quietly, and the cost of ignoring that truth is often paid long after the incident itself is declared over.