Every TA team has rolled out improvement initiatives that survived the planning meeting and quietly died within six months. Interview training for hiring managers. Weekly pipeline reviews. A new approval workflow. Each one launched the same way. The slide deck was clean, the HRBP nodded, and by month three nobody was following it.
The pattern repeats because the default response to any TA problem is the same. People aren't doing the right thing, so let's get them to. More training. More oversight. More process steps. Every one of these initiatives depends on dozens of people sustaining a behavior change that the system itself does nothing to support.
Two principles, both violated
Systems design theory offers a cleaner diagnostic. Paul Hawken, an environmentalist and entrepreneur who wrote The Ecology of Commerce while trying to understand why industrial systems resist improvement, states it plainly. Good design follows two principles. First, change the fewest possible elements to get the greatest result. Second, remove stress from the system rather than adding it.
Bad design does the opposite on both counts. It targets the most uncontrollable element (usually human behavior) and layers on enforcement to hold the change in place.
Hawken's original example was environmental policy. The U.S. spent over a trillion dollars on pollution regulation: inspectors, compliance mandates, penalties for violations. The approach demanded that thousands of companies change their behavior, monitored by thousands of regulators. Revenue-neutral green taxes in several European countries changed one variable, the price signal, and achieved larger emission reductions with a fraction of the enforcement overhead. One structural lever, moved once, versus decades of behavioral policing.
A design that requires a hundred people to change their habits needs constant supervision and degrades the moment attention shifts elsewhere. A design that changes one default or one information flow redirects behavior without anyone needing to want to behave differently.
Where TA improvement fails both tests
Take the most common TA improvement initiative. Interview training. The target is hiring manager behavior, which is the most uncontrollable element in the hiring system. You're asking fifty managers across different functions, geographies, and experience levels to remember a two-hour workshop three months later when they're twenty minutes into an interview with a candidate they already like. The training adds stress. Now there's a new technique to recall on top of everything else happening in the room.
Run it through the two questions. Are we changing the fewest elements? No, we're trying to change every interviewer. Are we removing stress? No, we're adding one more thing for them to hold in their heads.
The good-design alternative is to change the scorecard. When a scorecard requires evidence-linked ratings before an overall recommendation (instead of an overall impression followed by reasons invented to match it), the quality of assessment data improves without anyone receiving training. The form does the cognitive work that the workshop could only install temporarily. One structural element changed. Stress removed from the evaluator, who no longer has to remember techniques. The scorecard structure does the remembering.
The same pattern appears when time-to-fill creeps up. Weekly pipeline reviews, escalation procedures, additional approval gates. Each addition creates more meetings and more overhead. All of it targets the thing you can least control, which is whether recruiters and hiring managers prioritize speed in a given week. Meanwhile, the actual bottleneck is often one approval step where the HRBP signs off on a role that three other stakeholders have already approved, or a scheduling sequence that requires four calendars to align before a panel can happen. Remove that one structural chokepoint and the system accelerates. No weekly review needed.
The most stubborn version of this shows up with hiring manager intake. Every organization I've worked in has tried some version of training managers to write better job descriptions. High resistance. Whatever managers learned faded within weeks. What actually moved the needle was redesigning the intake form so the fields themselves shaped the thinking. When the form asks "What will success look like at six months?" instead of "List the requirements," the output improves because the question does the work. The form is the intervention, not the training.
Applying the filter
The next time a TA improvement project lands on your desk, run it through both questions before committing resources.
Start with who has to change. If the answer involves hiring manager behavior, recruiter habits, or the way interviews are conducted across the org, you're targeting the most uncontrollable element. Look for the structural variable underneath instead. The default, the form, the sequence that could change once and produce the same result without sustained behavior change.
Then count. How many new steps, meetings, approvals, and reporting requirements does the initiative introduce? How many does it eliminate? If the first number is larger, you're adding stress. Look for what you can take away. The approval step no one reads, the status update that duplicates information already in the ATS, the phone screen that exists because two teams don't share notes.
Most TA improvements that lasted changed something structural and removed something unnecessary. The ones that died tried to change how people act and gave them more to do.
And if the filter itself matters enough to use, put it on the proposal template. Two fields. "What structural element changes?" and "What does this remove?" Advice in someone's head is behavioral. A field on a form is structural. The model's own logic applies to itself.
When the filter doesn't apply
Sourcing a CTO from a global talent pool of two hundred people depends on relational skill, timing, and judgment that you can't structurally induce. No form redesign builds trust with a passive candidate who isn't looking. And in crisis hiring, when a VP leaves unexpectedly and the role needs filling in three weeks, adding meetings and compressing timelines is the right response. The filter works for recurring organizational patterns. One-time coordination problems and relationship-dependent work operate on different rules.
The next time someone pulls up a slide deck with a new TA initiative, you'll have two questions. Most initiatives can't answer them.
Models in this article
Good Design Principles: Good system design changes the fewest structural elements to get the greatest result and removes stress rather than adding it; bad design targets human behavior and layers on enforcement.
Discipline: Systems design
Key research: Donella Meadows (leverage points); Thaler & Sunstein (nudge/default effects)
Source: Paul Hawken, The Ecology of Commerce (1993)
The Recruiting Lattice applies mental models from diverse disciplines to the daily craft of talent acquisition.