CRIAD
BIM · clash detection · construction technology · coordination · rework

We Automated the Easy Part: Why BIM Clash Detection Isn't Fixing What's Actually Broken

Harsh Singh

If you've worked on a large construction project in the last decade, you've sat through the meeting. You know the one. Someone opens Navisworks, runs the clash test, and the screen fills with red. Hundreds of clashes. Sometimes thousands. The room goes quiet for a second, someone sighs, and then you spend the next two hours triaging which red dots actually matter.

During my time at Disperse, I watched this scene play out across dozens of projects we monitored. And here's what I've come to believe: clash detection, as it's practiced today, is one of the most oversold promises in construction technology.

Don't get me wrong — the concept is sound. Finding conflicts in a digital model before they become conflicts on a physical site is obviously better than the alternative. McKinsey estimated that digital adoption, including BIM, could boost construction productivity by 14–15% and cut costs by up to 20%. The UK Government's BIM Strategy targeted 33% cost savings. These are real numbers backed by real research.

But theory and practice are different animals.

The noise problem nobody admits

Here's what actually happens on a typical project: you federate your models, run the automated clash detection, and get back a report with somewhere between 500 and 5,000 clashes. Studies show that up to 60% of those are false positives. A light fixture flagged because it's 1mm too close to a wall. A pipe "clashing" with another pipe at an intentional connection point. Geometry that overlaps because two teams modelled the same element independently.
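The triage logic a coordinator applies by hand is not complicated — that's part of what makes the manual grind so galling. Here's a minimal sketch of it in Python; the `Clash` fields, tolerance value, and "intentional pair" list are illustrative assumptions, not any specific tool's API:

```python
# Hypothetical clash triage: reduce a raw clash list to candidates that
# deserve a human's attention. Fields and thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class Clash:
    element_a: str          # e.g. "MEP/Pipe-1042"
    element_b: str          # e.g. "STR/Beam-07"
    penetration_mm: float   # overlap depth reported by the clash test

# Element-type pairs that routinely overlap on purpose (connections, fittings).
INTENTIONAL_PAIRS = {("Pipe", "Pipe"), ("Fitting", "Pipe")}

def element_type(element_id: str) -> str:
    # "MEP/Pipe-1042" -> "Pipe"
    return element_id.split("/")[1].split("-")[0]

def triage(clashes, tolerance_mm=25.0):
    """Keep clashes deep enough to matter that aren't modelled connections."""
    kept = []
    for c in clashes:
        if c.penetration_mm < tolerance_mm:
            continue  # sub-tolerance graze, e.g. the 1mm light fixture
        pair = tuple(sorted((element_type(c.element_a), element_type(c.element_b))))
        if pair in INTENTIONAL_PAIRS:
            continue  # intentional connection point, not a site problem
        kept.append(c)
    return kept
```

The point isn't that this filter is hard to write — it's that every project rebuilds some version of it, by hand, in a spreadsheet, every week.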

The actual critical clashes — the ones that would cost you real money on site — are buried somewhere in that haystack. And someone has to find them.

That someone is usually your BIM coordinator. And they're drowning.

The human cost of "automated" detection

We call it automated, but what we really mean is that the detection is automated. The resolution is entirely manual. Your coordinator runs the test, filters the noise, categorises what's left, assigns responsibilities, schedules a coordination meeting, walks the room through each clash, documents the decisions, and then follows up to make sure the changes actually get made in the model.

Now multiply that by every model update. On a busy project, that's weekly. Sometimes more.

I've spoken to BIM managers who spend 60–70% of their week on clash management alone. Not designing. Not solving problems. Not adding value. Just triaging an ever-growing list of red dots.

The meeting that eats your week

Then there are the coordination meetings themselves. In theory, these are where smart people get together to solve design conflicts before they hit site. In practice, a huge chunk of the time goes to reviewing low-impact issues, confirming that previously identified clashes were actually fixed, and debating whether something is a "real" clash or a modelling artefact.

The high-impact conversations — the ones about the duct that won't fit below the beam, or the structural change that invalidates the MEP coordination for an entire floor — get compressed into whatever time is left. If they happen at all.

Meanwhile, the project manager is looking at their watch wondering why coordination is taking so long when they were told BIM would make everything faster.

The numbers that should worry us

The construction industry has a rework problem, and it's enormous. Rework accounts for roughly 52% of total cost growth on projects and drives schedule overruns of up to 22%. About 70% of rework incidents trace back to design inconsistencies or errors — exactly the kind of thing clash detection is supposed to catch.

Globally, McKinsey puts the cost of construction inefficiency at $1.6 trillion per year, with budget overruns ranging from 20% to 45%. And 85% of projects across 20 countries over a 70-year study period experienced cost overruns, averaging 28%.

So we have a tool that's supposed to prevent rework, and we have an industry that's still losing trillions to rework. Something isn't connecting.

The real problem: detection without intelligence

Here's what I think is actually going on. Clash detection tools are very good at answering one question: do these two geometries overlap? They are terrible at answering the questions that actually matter:

  • Is this clash going to cost me money on site?
  • Which clashes are connected to design changes I haven't seen yet?
  • What's the downstream impact if I ignore this one?
  • Has this same type of clash been resolved before, and how?
  • Who actually needs to act on this, and by when?

In other words, we've automated the easy part and left the hard part entirely to humans. The detection is instant. The intelligence is still manual.

What better looks like

I don't think the answer is better clash detection. I think the answer is making the underlying data accessible enough that you catch problems before they become clashes in the first place.

If every person on the project could query the federated model in plain language — "show me what changed in the MEP model since last Tuesday" or "are there any structural elements within 100mm of the new ductwork?" — you wouldn't need a coordinator to spend three days preparing a clash report. You'd catch conflicts as they emerge, not after they've compounded.
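Under the hood, a query like "structural elements within 100mm of the new ductwork" is just a proximity test over geometry. Here's a sketch using axis-aligned bounding boxes as a stand-in for real model geometry — the element names and box representation are assumptions for illustration:

```python
# Minimal "within 100mm" proximity query over axis-aligned bounding boxes.
# Real model geometry is richer; this shows the shape of the computation.
from dataclasses import dataclass

@dataclass
class Box:
    name: str
    mins: tuple  # (x, y, z) in mm
    maxs: tuple

def gap_mm(a: Box, b: Box) -> float:
    """Shortest distance between two boxes (0 if they overlap)."""
    total = 0.0
    for lo_a, hi_a, lo_b, hi_b in zip(a.mins, a.maxs, b.mins, b.maxs):
        d = max(lo_b - hi_a, lo_a - hi_b, 0.0)
        total += d * d
    return total ** 0.5

def near(structural, ductwork, threshold_mm=100.0):
    """Pairs of (structural, duct) elements closer than the threshold."""
    return [(s.name, d.name) for s in structural for d in ductwork
            if gap_mm(s, d) <= threshold_mm]
```

None of this is exotic computation. What's missing on most projects isn't the maths — it's the plain-language front end and live data that would let a site manager ask the question without opening specialist software.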

If design changes triggered automatic alerts to the affected trades — not a 500-line clash report, but a targeted notification saying "the beam on Level 3, Grid C-7 moved 200mm south, this affects your ductwork run" — you'd resolve issues in hours instead of waiting for the next coordination meeting.

And if the model's version history was genuinely auditable — diffs you could actually read, not just a log of file uploads — you'd stop spending half your coordination meetings trying to figure out what changed and when.
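A readable diff is, again, conceptually simple: compare two snapshots of the model and report what moved, appeared, or disappeared. This sketch assumes a snapshot is a mapping of element id to position — a simplification of real model data, but it shows how a diff like "Beam-C7 moved 200mm" falls out, ready to route to the affected trade:

```python
# Hedged sketch of a readable model diff between two version snapshots.
# A snapshot here is {element_id: (x, y, z) in mm} — an illustrative
# simplification of real federated model data.
def diff_models(old, new, min_move_mm=10.0):
    """Return human-readable change lines between two snapshots."""
    changes = []
    for elem_id, new_pos in new.items():
        if elem_id not in old:
            changes.append(f"added: {elem_id}")
            continue
        move = max(abs(a - b) for a, b in zip(old[elem_id], new_pos))
        if move >= min_move_mm:
            changes.append(f"moved: {elem_id} by {move:.0f}mm")
    for elem_id in old:
        if elem_id not in new:
            changes.append(f"removed: {elem_id}")
    return changes
```

Feed output like this into a notification rule per trade, and "the beam on Level 3 moved 200mm south" becomes an alert instead of an archaeology exercise at the next coordination meeting.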

The shift we need

The construction industry doesn't need more clash reports. It needs fewer reasons to generate them. That means:

  • Accessible data — not locked behind specialist software that only three people on the project can use
  • Live models — not weekly exports that are already outdated by the time anyone opens them
  • Change intelligence — not "something changed" but "here's what changed, here's who it affects, and here's how bad it is"
  • Team-wide visibility — so the site manager, the QS, and the project manager can all see the same truth without asking the BIM team to extract it

We built Criad because we kept watching this pattern repeat. Smart people, expensive software, and projects that still overrun because the data didn't reach the right person at the right time.

Clash detection isn't broken. But it's not enough. And pretending it is enough is costing us all a lot more than we like to admit.


Harsh Singh is the co-founder and CEO of Criad, an AI-powered BIM platform that federates construction models into a single live source of truth. If you're tired of drowning in clash reports, you can book a call or join the waitlist.