The gap between redesigning a process and embedding one is not a communication problem. It is a structural one — and most institutions discover it only after the design work is complete.
What looked like transformation was documentation — necessary, but not sufficient.
Institutions that had completed their as-is and to-be process mapping found that the new process existed on paper but not in practice. Staff continued working in familiar ways. Workarounds persisted. SLAs were set but not enforced. The quality layer that should have converted documentation into adoption was never built.
Supervisory expectations were increasing, elevating SLA performance from a management metric to a governance one. And institutions were managing distributed workforces for the first time — discovering that hybrid work had exposed how much of their operational knowledge lived in informal channels that no longer existed.
This issue examines three things: the implementation gap, and why designed processes do not become adopted ones without deliberate quality infrastructure; what distributed work revealed about process integrity; and why SLA frameworks treated as targets rather than accountability structures fail to deliver the governance outcomes they were designed for.
Why Redesign Is Not Adoption
The quality layer that converts documentation into operational reality
Process documentation is not process adoption. The distinction seems obvious when stated plainly. In practice, institutions consistently treat the completion of as-is and to-be mapping as the end of the transformation work rather than the beginning of it.
When institutions began operating redesigned processes, they confronted an uncomfortable reality. The new processes existed — mapped, documented, approved, and communicated. Staff had attended briefings. Procedures had been updated. And yet the processes were not being followed.
Not because of resistance, but because the infrastructure for adoption had not been built. No attestation framework. No daily quality checks. No coaching structure for teams with high defect rates. No consequence mechanism for SLA breaches. The design phase had produced documentation. What it had not produced was the operational discipline needed to embed the new way of working into daily practice.
Rising supervisory expectations around operational discipline meant that institutions holding documented standards they could not demonstrate compliance with had opened a governance gap. A robust quality layer, had it been built, would have closed that gap before it became a regulatory exposure.
A redesigned process that is not adopted is just documentation with a different date on it. The implementation gap is structural, not motivational — and closing it requires a quality layer that most institutions never build.
The Quality Layer — What It Consists Of
- Standard templates — consistent documentation formats that make compliance verifiable and deviation visible across all units
- Attestation frameworks — formal confirmation by process owners that standards are being met, on a defined schedule
- Daily spot checks — quality inspections that identify defects before they compound, not after they become audit findings
- Coaching structures — support for teams with elevated defect rates, distinguishing between knowledge gaps and process design failures
- Automated controls — where feasible, auto-approval and auto-certification for routine transactions to remove manual intervention points
- SLA reporting cadence — operational performance reporting that reaches the right governance level at the right frequency
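The daily spot-check and coaching elements of the quality layer can be sketched in code. This is a minimal illustration only: the sampling approach, field names, and the 5% coaching threshold are assumptions for the example, not figures the text prescribes.

```python
import random
from dataclasses import dataclass

@dataclass
class Transaction:
    """A completed transaction subject to quality inspection (illustrative schema)."""
    team: str
    passed_check: bool

def daily_spot_check(transactions, sample_rate=0.1, coaching_threshold=0.05, seed=None):
    """Sample a fraction of the day's transactions, compute the defect rate
    per team, and flag teams above the (assumed) coaching threshold."""
    rng = random.Random(seed)
    sample = [t for t in transactions if rng.random() < sample_rate]

    # Tally checked volume and defects per team.
    by_team = {}
    for t in sample:
        checked, defects = by_team.get(t.team, (0, 0))
        by_team[t.team] = (checked + 1, defects + (0 if t.passed_check else 1))

    # Teams whose sampled defect rate exceeds the threshold go to coaching review.
    return {
        team: defects / checked
        for team, (checked, defects) in by_team.items()
        if checked and defects / checked > coaching_threshold
    }
```

The point of the sketch is the structure, not the numbers: the check runs daily, it is team-level rather than individual, and it feeds a coaching route rather than a disciplinary one.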
Measuring What Changed — Not What Was Designed
The measurement problem in 2021 was that most institutions were measuring design outputs rather than adoption outcomes. Process maps completed, procedures updated, briefings delivered. These metrics confirmed activity. They did not confirm change.
Turnaround Time
Is the redesigned process actually faster? TAT reduction is the primary operational benefit of process improvement. If turnaround time is not falling, the process has not been adopted.
Defect Rate
Are errors, rework, and exceptions declining? A stable or rising defect rate after process redesign indicates that the new process has not replaced the old one in practice.
SLA Compliance
Are commitments to members being met consistently? SLA performance is the most visible adoption metric and the one most directly connected to member experience.
Cost-to-Income Impact
Is operational cost declining relative to income? Process improvement that does not show in the cost-to-income ratio has not delivered financial value regardless of what the process maps say.
Programme completion should be measured against operational outcomes, not documentation milestones. A process is implemented when it is being followed consistently — not when it has been mapped.
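The four adoption metrics above can be expressed as a simple before/after comparison. The sketch below is illustrative only: the field names, the use of median TAT, and the pass criteria are assumptions for the example, not a reporting standard the text defines.

```python
from statistics import median

def sla_compliance(turnarounds, sla_days):
    """Share of cases completed within the SLA commitment."""
    return sum(1 for t in turnarounds if t <= sla_days) / len(turnarounds)

def defect_rate(total_cases, defective_cases):
    """Errors, rework, and exceptions as a share of all cases."""
    return defective_cases / total_cases

def cost_to_income(operating_cost, income):
    """Operational cost relative to income."""
    return operating_cost / income

def adoption_verdict(before, after):
    """A process counts as adopted only if outcome metrics moved,
    not because design milestones were completed. `before`/`after`
    are dicts with assumed keys: turnarounds, sla_days, cases,
    defects, cost, income."""
    return {
        "tat_falling": median(after["turnarounds"]) < median(before["turnarounds"]),
        "defects_falling": defect_rate(after["cases"], after["defects"])
                           < defect_rate(before["cases"], before["defects"]),
        "sla_improving": sla_compliance(after["turnarounds"], after["sla_days"])
                         > sla_compliance(before["turnarounds"], before["sla_days"]),
        "cir_falling": cost_to_income(after["cost"], after["income"])
                       < cost_to_income(before["cost"], before["income"]),
    }
```

Any verdict that rests on "process maps completed" rather than on these outcome deltas is measuring activity, not change.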
When the Office Was the Process
What distributed work revealed about institutional knowledge
The shift to hybrid and remote work in 2021 exposed something institutions had not documented and in many cases had not recognised: a significant proportion of their operational knowledge lived in informal channels — conversations, proximity, observation — that no longer existed in a distributed environment.
Institutions were operating in a genuinely changed environment. Staff who had worked in physical proximity for years were now distributed across home offices, branch locations, and hybrid arrangements. The informal coordination mechanisms that had allowed undocumented processes to function — the quick conversation, the desk visit, the observed practice — were gone.
What quality teams began reporting was a consistent failure mode: process integrity deteriorated not because staff were unwilling to follow procedures, but because the knowledge required to follow them correctly had never been formally captured.
New staff joining during the hybrid period had no access to the informal apprenticeship model through which operational knowledge had been transmitted. Experienced staff, isolated from colleagues, were making individual judgements about edge cases that had previously been handled through informal consultation.
You cannot document what you do not know is being done. The institutional memory that made undocumented workarounds functional disappeared with the office. What remained was the process map — and the gap between it and reality.
Exception Handling
How non-standard cases are resolved. This knowledge typically resided with one or two experienced individuals and was transmitted by proximity, not by documentation.
System Workarounds
How system limitations were compensated for in daily operation. These workarounds were functional but invisible — until they stopped working without the informal coordination that maintained them.
Informal Escalation Paths
How problems were actually resolved versus how the formal escalation structure said they should be. The informal path was faster and more effective — and completely invisible to the formal governance structure.
The process that worked in the office worked because the office compensated for what the documentation missed. Remote work removed the compensation mechanism.
Why SLA Frameworks Fail in Practice
What happens when service level agreements are treated as targets rather than accountability structures, and what effective SLA governance actually requires
Service level agreements were, for most institutions, aspirational targets dressed as governance instruments. The difference between the two only became visible when rising operational risk scrutiny began to treat SLA performance as a signal of institutional discipline — not just a customer service metric.
Most institutions had SLA frameworks in some form. Turnaround time commitments for loan processing, account opening, and transaction settlement were documented across the sector. What varied enormously was whether those commitments were being met, measured, and governed.
Increasing operational risk scrutiny brought SLA performance into governance conversations in a way it had not been before. Institutions that had treated SLAs as aspirational targets found themselves unable to demonstrate compliance — not because performance was poor, but because measurement systems were inadequate and accountability structures were absent.
Three structural failures explain why SLA frameworks consistently underperform as governance instruments in small-market financial institutions.
Setting an SLA is not the same as governing to one. A target without an accountability structure, a measurement mechanism, and a consequence framework is not an SLA. It is an aspiration with a number attached.
Failure 1 — Targets Without Measurement
SLAs set without a corresponding measurement system produce no accountability. If performance is not tracked, the SLA cannot function as a governance instrument regardless of how precisely the target is stated.
Failure 2 — Measurement Without Accountability
Performance data that reaches no decision-maker with authority to act produces no change. SLA reporting that lands in a monitoring function without escalation triggers is performance theatre, not governance.
Failure 3 — Accountability Without Consequence
Named owners with no mechanism for consequence produce the same outcome as anonymous SLAs. Accountability without consequence is just visibility — useful, but not governance.
An SLA framework that cannot answer three questions is not functioning as a governance instrument: Who owns this SLA? How is performance measured? What happens when it is breached?
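The three questions map directly onto the structure an SLA record needs. The sketch below is an assumption-laden illustration, not a prescribed design: the fields and the breach-count escalation rule are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class SLA:
    name: str
    owner: str                    # Who owns this SLA?
    target_days: int              # How is performance measured?
    escalate_after_breaches: int  # What happens when it is breached?

def review_period(sla, turnarounds):
    """Count breaches in a reporting period and decide whether the
    escalation trigger fires. Measurement without an escalation path
    is the 'performance theatre' failure the text describes."""
    breaches = sum(1 for t in turnarounds if t > sla.target_days)
    return {
        "owner": sla.owner,
        "breaches": breaches,
        "escalate": breaches >= sla.escalate_after_breaches,
    }
```

A framework that cannot populate all three fields for every SLA, and act on the `escalate` flag, is an aspiration with a number attached.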
2022 Focus Areas
Execution conditions shaping the year ahead
As institutions move into 2022, the process redesign and quality embedding work of 2020–2021 is producing a new visibility problem: the gap between what redesigned processes require from staff and what the existing workforce is capable of delivering. The capacity illusion is about to surface.
The Skills Gap Emerging
Redesigned processes — particularly those incorporating automation and digital channels — require capability profiles that differ significantly from the roles they replaced. Training programmes designed to bridge this gap are discovering that the gap is not primarily about knowledge. It is about analytical capability and comfort with ambiguity that cannot be addressed through procedural training alone.
The Retention Risk
Staff most valuable to quality embedding — those with deep institutional knowledge and the capability to operate redesigned processes — are also the most mobile. As the Jamaican economy reopened and regional competition for financial services talent increased, institutions found that transformation was creating retention pressure they had not anticipated or budgeted for.
Where Attention Should Concentrate
- Capability Assessment — Before beginning the next phase of process redesign, assess whether the workforce has the capability to operate what has already been designed. Redesigning ahead of capability is as wasteful as redesigning without adoption infrastructure.
- Knowledge Capture — The institutional memory exposed as a gap in 2021 needs to be systematically captured before it walks out the door. This is not a documentation exercise — it is a structured knowledge elicitation programme treated as a transformation workstream in its own right.
- New Instrument Readiness — The operational readiness question for any new payment instrument is not technological — it is structural. Can the institution absorb it without disrupting the operational environment it is still stabilising? Institutions that cannot answer this before committing to an integration timeline will answer it mid-implementation.
Designing a process is an intellectual exercise.
Embedding it is an operational one.
Most institutions only budget for the first.
The implementation gap is not a failure of intent or effort. It is a structural consequence of treating process redesign as the transformation deliverable rather than the transformation input. The deliverable is adoption — measurable, sustained, and auditable change in how the institution actually operates.
Closing the gap requires a quality layer that is separate from, and as deliberate as, the design work that preceded it. Attestation, spot checks, coaching, automated controls, and SLA governance are not administrative overhead. They are the transformation programme.
Institutions that built this layer in 2021 are entering 2022 with a genuine operational foundation. Those that did not are carrying forward the same implementation gap — now compounded by a capability gap that the redesigned processes have made visible.
Workforce & Capability
The capacity illusion, the skills gap institutions cannot see, and why process transformation and people transformation are the same problem.
About This Publication
Signal is a research series from Tumblehill Holdings, written for executives responsible for transformation execution in financial services — not those designing strategy, but those accountable for delivery.