Richard Sutcliffe

I Built a Sector Intelligence Platform in Three Weeks. Any Board Should Be Asking Why.

What a personal build with an AI development tool taught me about technology leadership

The Housing Ombudsman publishes thousands of structured findings every year about how landlords handle complaints, who committed maladministration, and what remedies were ordered. For any housing association trying to understand its regulatory exposure, this is the most important public dataset in the sector. Almost nobody reads it at scale. I built a platform to do so, on my own, in three weeks of evenings and weekends. The interesting question is not whether you believe me. It is what this single fact does to the economics of product development as your board understands them.

I am the CTO of ThinkTribal, a UK consultancy building AI products for social housing. Twenty-five years across the NHS, Accenture, and AND Digital. Programmes, business cases, product backlogs, steering groups. None of those activities built the platform. I sat down with an AI development tool and built it myself.

The problem was structural, not technical

A compliance manager preparing for a board meeting reads the cases relevant to their own organisation, and if they are thorough, a handful from peers. The sector-wide picture sits in thousands of individual records, unlinked, unfiltered, impossible to compare. A Head of Governance asking whether the organisation is typical or exposed has no efficient way to answer.

The platform fills the gap. It takes the full corpus of published determinations, structures them, applies risk scoring, and turns the result into board-ready intelligence. Landlord profiles, peer benchmarking, sector-wide risk signals, and inspection packs generated on demand. An AI synthesis layer reading patterns a human team would need weeks to assemble.

Building it yourself means something specific

I want to be precise here, because there is a press-release version of this story and an honest one. The honest one is more useful.

Working with an AI development tool is closer to directing than to writing code. You describe what you need, review what comes back, correct it, refine it, iterate. The AI writes the code. You make the decisions. Architecture decisions. Data model decisions. User experience decisions. The constant stream of small judgement calls about what to prioritise, what to defer, and what to cut.

This is where twenty-five years of cross-domain experience turns out to matter more than it ever did in a conventional development setting. I drew on data governance knowledge to structure the records correctly. User research experience to design search filters matched to how compliance managers think. Product management instincts to decide which features belonged in the MVP and which belonged in a later tier. Metadata strategy work to ensure the risk scoring methodology was defensible.

None of this required me to write JavaScript. All of it required me to know what good looks like, and to recognise when the AI had produced something below the bar.

The generalist experience the technology industry spent twenty years treating as a weakness turns out to be the thing that makes AI-assisted development work. A specialist who knows one domain deeply will build a narrow tool. A generalist who has worked across architecture, governance, user research, benefits management, testing, and delivery will build something coherent across all of those dimensions. The AI handles the implementation. The human handles the integration of concerns.

What this means if you sit on a board

The cost of building a sector-specific intelligence platform has collapsed. Not by ten or twenty per cent, but by an order of magnitude. The product I built would have required a team of four or five, a six-figure budget, and six months of calendar time to reach the same point two years ago. I built it in evenings and weekends, using a tool costing less per month than a decent lunch.

Most boards have not absorbed this shift. A CTO or CDO with the right domain knowledge and the willingness to get their hands dirty is now a credible alternative to commissioning a team. Not for everything, and not without governance. But for proving out a product concept, validating market demand, and getting something real in front of users, the solo AI-assisted build is now a serious option.

The questions any board should ask are straightforward. Does the person building it have the domain expertise to make the right design decisions, or are they relying on the AI to decide things it should not be deciding? Is there a governance framework around the build, including version control, test coverage, and architecture documentation? Is the product designed for handover to a team if it succeeds, or is it locked inside one person's head?

For this build, the answers were yes, yes, and yes. GitHub for version control. Playwright for test-driven development. Architecture Decision Records. A clear separation between what I built and what a team would need to maintain. Those were not afterthoughts. They were the things separating a prototype from a product.

The leaders who close the loop will win

The technology industry has spent the last decade telling senior leaders to stay strategic and delegate the building to specialists. The advice made sense when building was slow, expensive, and required deep expertise in a single language or framework.

It makes less sense now. The leaders who understand both the problem domain and the technology well enough to direct an AI through a build will produce better products, faster, than the leaders who write a brief and wait for a team to interpret it. Not because teams are unnecessary, but because the feedback loop between domain insight and implementation has shortened to the point where intermediaries introduce more friction than value in the early stages.

This does not replace development teams. It changes what a CTO does in the first phase of a product's life. And the people best placed to take advantage of it are the ones with broad, cross-domain experience who have spent their careers being told they should specialise.

They were right not to.