A GIS data audit is the most common first engagement we run with a new utility client — and for good reason. It answers the questions a utility needs answered before making any significant technology or remediation investment: how reliable is our data, where are the gaps, and what would it actually take to fix them?
But "GIS data audit" is a phrase that gets used to describe a wide range of activities, from a two-day desk review of schema documentation to a multi-month field verification program. What you're buying when you commission a GIS data audit depends entirely on how it's scoped. This article is a guide to scoping one honestly — what questions to ask, what to expect in a proposal, and how to recognize whether what you're being offered will actually tell you what you need to know.
What a GIS data audit should answer
A useful audit answers four questions, each at a level of specificity that supports real decision-making:
- Completeness: What percentage of the network exists in the GIS, and with what attribute completeness? Which asset classes are well-represented, and which have significant gaps?
- Accuracy: Where the data exists, how closely does it match physical reality? Is the topology correct? Are the attributes reliable? Are positions within acceptable tolerances?
- Currency: How current is the data? When was the last significant update? What is the typical lag between a field change and a GIS update?
- Governance: Who owns the data, how is it maintained, and where do the workflows break down? This is often the most important finding, and the one least likely to be surfaced by a purely technical audit.
An audit that answers only the first two questions tells you what's wrong. An audit that answers all four tells you why it's wrong — which is what you need to fix it.
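The completeness question above is ultimately a counting exercise, and it helps to see how small that exercise is once the data is exported. The sketch below is a minimal illustration, assuming an exported attribute table with hypothetical field names (`asset_class`, `material`, `diameter`, `install_year`) rather than any real utility schema:

```python
from collections import defaultdict

# Hypothetical exported GIS attribute records; field names are
# illustrative assumptions, not a real utility schema.
records = [
    {"asset_class": "main", "material": "PVC", "diameter": 150, "install_year": 1998},
    {"asset_class": "main", "material": None, "diameter": 100, "install_year": None},
    {"asset_class": "service", "material": "PE", "diameter": None, "install_year": 2005},
]

REQUIRED = ["material", "diameter", "install_year"]

def completeness_by_class(records, required=REQUIRED):
    """Percent of required attributes populated, grouped by asset class."""
    filled = defaultdict(int)
    total = defaultdict(int)
    for rec in records:
        cls = rec["asset_class"]
        for field in required:
            total[cls] += 1
            if rec.get(field) is not None:
                filled[cls] += 1
    return {cls: round(100 * filled[cls] / total[cls], 1) for cls in total}

print(completeness_by_class(records))
```

The point of breaking the statistic out by asset class, rather than quoting one network-wide number, is that a single blended percentage hides exactly the gaps the audit exists to find.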
The three types of audit
Before scoping, it helps to be clear about which type of audit you need. They're genuinely different exercises with different costs and different outputs.
| Type | What it involves | What it produces | Typical duration |
|---|---|---|---|
| Desktop audit | Analysis of existing GIS data — topology checks, attribute completeness analysis, schema review, comparison against other records (as-builts, CMMS, engineering drawings) | Statistical completeness and quality report, topology error inventory, schema gap analysis | 2–4 weeks |
| Governance audit | Interviews with GIS owners, field supervisors, engineers, IT; review of existing workflows, policies, and as-built processes | Data ownership map, workflow gap analysis, governance recommendations | 2–3 weeks |
| Field verification | Physical verification of GIS records against network infrastructure — GPS collection, visual inspection, comparison of field observations against GIS attributes | Field vs. GIS discrepancy report, verified asset inventory for a bounded area | 4–12 weeks depending on scope |
Most utilities need a combination of desktop and governance audits to make sound decisions. Field verification is reserved for situations where digital record accuracy is genuinely unknown and the stakes — regulatory compliance, operational safety, capital investment — are high enough to justify the cost of physical verification.
What to look for in a proposal
A well-scoped audit proposal answers these questions before you have to ask them:
What data sources are included? A GIS audit that only looks at the GIS is an incomplete audit. The most useful analysis compares GIS records against as-builts, engineering drawings, CMMS work orders, inspection records, and any other authoritative source of record. Each comparison surface exposes a different class of discrepancy.
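A cross-source comparison of this kind reduces to matching records by asset identifier and classifying the mismatches. The sketch below is illustrative, assuming hypothetical asset IDs and a single `status` attribute; a real comparison would cover many attributes and handle ID-matching ambiguity:

```python
# Hypothetical record sets keyed by asset ID; IDs and the "status"
# field are illustrative, not drawn from any real GIS or CMMS.
gis = {"V-101": {"status": "open"}, "V-102": {"status": "closed"}, "V-104": {"status": "open"}}
cmms = {"V-101": {"status": "open"}, "V-102": {"status": "open"}, "V-103": {"status": "open"}}

def compare_sources(gis, cmms):
    """Classify each asset ID discrepancy between two sources of record."""
    findings = []
    for asset_id in sorted(set(gis) | set(cmms)):
        if asset_id not in gis:
            findings.append((asset_id, "missing_from_gis"))
        elif asset_id not in cmms:
            findings.append((asset_id, "missing_from_cmms"))
        elif gis[asset_id]["status"] != cmms[asset_id]["status"]:
            findings.append((asset_id, "attribute_mismatch"))
    return findings

print(compare_sources(gis, cmms))
```

Note that each comparison direction surfaces a different class of discrepancy — assets the CMMS knows about but the GIS doesn't, and vice versa — which is why auditing the GIS in isolation understates the problem.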
What asset classes are covered? "Auditing the network" means different things. Does it include services as well as mains? Appurtenances as well as primary network? Customer assets? The scope of asset classes affects both the cost and the completeness of the findings.
What does "topology check" mean specifically? Topology errors range from trivially correctable (duplicate vertices, short segments) to operationally significant (disconnected networks, incorrect connectivity, wrong flow direction). A proposal should specify which topology rules will be validated and what the pass/fail criteria are.
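One of the operationally significant checks named above — detecting disconnected networks — can be sketched as a connected-components pass over the edge list. The node IDs below are hypothetical, and a production check would use the GIS platform's own topology engine rather than this standalone traversal:

```python
from collections import defaultdict

def connected_components(edges):
    """Group nodes into connected components via graph traversal."""
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)
    seen, components = set(), []
    for start in graph:
        if start in seen:
            continue
        comp, stack = set(), [start]
        while stack:
            node = stack.pop()
            if node in comp:
                continue
            comp.add(node)
            stack.extend(graph[node] - comp)
        seen |= comp
        components.append(comp)
    return components

# Hypothetical pipe segments as (from_node, to_node) pairs;
# the (N5, N6) segment forms an isolated island.
edges = [("N1", "N2"), ("N2", "N3"), ("N5", "N6")]
islands = connected_components(edges)
print(len(islands))
```

More than one component in a network that should be fully connected is exactly the kind of finding a proposal should commit to reporting, with explicit pass/fail criteria.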
What are the deliverables, and in what format? A narrative report is necessary but not sufficient. A useful audit also produces a structured findings dataset — an error layer in the GIS, a prioritized error register, attribute completeness statistics by asset class — that your team can act on directly rather than translate from a document.
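As a minimal illustration of what "structured findings" means in practice, the sketch below emits a severity-sorted error register as CSV using only the standard library. The finding records, asset IDs, and the 1-to-3 severity convention are all assumptions for the example:

```python
import csv
import io

# Hypothetical audit findings; the severity ranking (1 = most urgent)
# is an assumed convention, not a standard.
findings = [
    {"asset_id": "M-210", "error": "disconnected_segment", "severity": 1},
    {"asset_id": "V-118", "error": "missing_install_year", "severity": 3},
    {"asset_id": "M-067", "error": "wrong_flow_direction", "severity": 2},
]

def error_register_csv(findings):
    """Emit a severity-sorted error register as CSV text."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["asset_id", "error", "severity"])
    writer.writeheader()
    for row in sorted(findings, key=lambda r: r["severity"]):
        writer.writerow(row)
    return buf.getvalue()

print(error_register_csv(findings))
```

The format matters less than the principle: the GIS team should be able to sort, filter, and assign findings directly, not re-key them from a PDF.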
Does it include workflow interviews? The governance layer is easy to omit because it requires time from people across multiple departments, not just GIS staff. If a proposal doesn't include stakeholder interviews as an explicit deliverable, the governance findings will be surface-level at best.
An audit that tells you your data is 60% complete is useful. An audit that tells you why it's 60% complete — and what it would take to get to 90% — is what you need to build a budget.
What to do with the findings
A GIS data audit is not an end in itself. The findings report is the input to three decisions:
- Remediation prioritization: Which gaps are most urgent — from a safety, regulatory, or operational standpoint — and which can be addressed over time?
- Remediation scoping: What would it actually cost to fix the highest-priority gaps? A good audit provides enough specificity in the findings to build a defensible remediation estimate.
- Technology sequencing: Which technology investments are ready to proceed on the current data foundation, and which should wait until specific data quality thresholds are met?
The third decision is the one most utilities get wrong. The pressure to proceed with a GIS modernization program — Utility Network migration, mobile GIS rollout, ADMS integration — often leads organizations to treat the audit findings as a parallel workstream rather than a precondition. The result is a modernization built on a foundation that hasn't been fixed.
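Treating the audit as a precondition means the sequencing decision becomes an explicit gate rather than a judgment call made under schedule pressure. A minimal sketch, with illustrative threshold values — real gate values are utility-specific and should come out of the audit itself:

```python
# Illustrative readiness thresholds; actual gate values are
# utility-specific and should be set from the audit findings.
THRESHOLDS = {"completeness_pct": 90.0, "topology_errors_per_km": 0.5}

def ready_to_proceed(metrics, thresholds=THRESHOLDS):
    """True only when every audited metric clears its gate."""
    return (metrics["completeness_pct"] >= thresholds["completeness_pct"]
            and metrics["topology_errors_per_km"] <= thresholds["topology_errors_per_km"])

print(ready_to_proceed({"completeness_pct": 62.0, "topology_errors_per_km": 2.1}))
```

Writing the gate down, even this simply, forces the organization to state which data quality thresholds each technology investment actually depends on.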
Red flags in an audit proposal
- No governance or workflow component — the proposal stays entirely within the GIS and doesn't engage the organizational layer
- Deliverables described only as reports, with no structured data outputs that the GIS team can act on directly
- Timeline that seems very short for the scope — a three-week "comprehensive audit" of a large distribution network is probably not comprehensive
- No field component for a network where digital record accuracy is genuinely unknown
- Findings delivered only at the end, with no interim check-in — an audit that surfaces unexpected problems mid-engagement needs room to adjust scope
When a GIS audit is the right first step
A GIS data audit makes sense as a first engagement when any of these are true:
- You're planning a significant technology investment (Utility Network migration, ADMS integration, mobile GIS rollout) and need to understand data readiness before committing
- You've had regulatory or audit pressure related to infrastructure records and need an objective assessment of current state
- Operations has expressed concerns about map accuracy and you need a quantified answer rather than an anecdotal one
- You're building a multi-year GIS improvement program and need a defensible baseline to measure progress against
It doesn't make sense as a first step when you already have a clear, well-quantified picture of your data quality problems and what's causing them. In that case, the audit findings are already in hand and the next step is remediation design.
If you're not sure whether you need an audit or something else, the right starting point is a discovery conversation — a 30-minute call where we understand your situation and tell you what we think the most useful first step is. Sometimes that's an audit. Sometimes it's a governance review. Sometimes it's something more targeted. We'll tell you which, and why.