Platform Teams Forgot They Have Customers
Why most internal platforms look like catalogs of unused features, and the three metrics that force the conversation back to adoption.
The engineering director had fourteen slides ready, one per platform feature shipped that quarter. Each slide carried a small green check. He was proud of it, and he should have been. The team had delivered everything on the roadmap, on time, under budget.
I asked him a question I ask every platform leader now. How many of those fourteen features had more than three teams using them last month?
Quiet.
He said he would check. The answer came back a week later, and we went through it together. Nine of the fourteen had single-team adoption. Two had zero. The three capabilities everyone in the room actually argued about in hallway conversations were not on the deck, because the platform team had not built them yet. That gave us a useful place to start: what would it take to get those three into the next quarter’s plan, and what could we sunset to make room.
That meeting stayed with me. The director was doing his job well by the definition he had been given. His scorecard covered deliverables shipped, incident count, and SLA. All three were green. Adoption was not on the scorecard, so it never became a target, and the team had no reason to think of it as part of their work.
This is the quiet trap every platform team slides into. They get budget because the business wants faster delivery. They get leadership because good engineers want to multiply what other engineers can do. Then they get handed a scorecard pulled straight from infrastructure operations, and the product discipline that would have saved them never enters the room.
A platform is a product. The customer is an internal team. The measure of success is whether that team picks your thing when no one is forcing them to.
Say that to a platform leader and they will agree. Every one of them has read the InfoQ piece, or the DORA report, or the Team Topologies book. The language has traveled. The behavior has not. Most platform teams I have worked with over the years still do not interview their customers. They do not run discovery. And when adoption does not happen on its own, the team calls it a marketing problem.
Here is a conversation I had with a platform PM about two years into a rollout I was advising:
Me: Which three features drive most of the value for your customer teams?
PM: I can tell you which three we shipped last.
Me: That is not the same question.
PM: Right. I would have to go ask.
She did not come back for a week. When she did, she had a spreadsheet. Three features had real traction. Two had a single power user apiece. Seven had been adopted once, then abandoned. The team had been running for eight quarters. They had never retired a single thing.
That is the pattern. Platform teams accumulate capabilities without ever pruning them. A product team that refused to sunset features would not survive its next review. Platform teams do it every quarter, and they call it stability.
One pattern I had to correct in myself as an advisor was letting platform leaders define their own success on infrastructure terms, because those terms were easier to measure and harder to argue with. Uptime. Deploy frequency. P95 latency. All useful. None of them tell you whether the thing is working as a product.
The hardest work in a platform rollout is the moment a platform PM has to walk into a review and say, we are killing feature seven, and we are redirecting two engineers to the thing three teams have been begging for. That conversation costs political capital, costs sunk-cost grief, and costs a certain amount of face for the engineer who built feature seven. It is still the job.
A platform team that does not own adoption is a team producing output without owning outcome. Artifacts ship on schedule while the customer teams quietly route around them. Six quarters in, someone asks why adoption is the bottleneck, and the platform team points at everyone else. Adoption is hard. Treating it as someone else’s responsibility is what keeps a platform team from ever fixing it.
When a platform team picks up product discipline, the work looks different in three places.
Discovery becomes a habit. Real conversations with the five teams most likely to pick up the next capability, asking what is painful right now and what they would trade for it. The format is boring on purpose: a half-hour call, two questions, a notebook. Sometimes the answer is boring too. Often the answer reshapes the roadmap.
Features get killed, out loud, in the review, with a short note to affected users and a migration path where possible. Nobody celebrates a sunset. Good product teams still do them, because shipping and keeping are different decisions.
Adoption becomes the measure that matters most. That brings me to the three metrics I would put on any platform team’s scorecard.
First, weekly active teams. Count unique internal teams that used a platform capability in the last seven days. Not accounts, not API calls. Teams. This number should grow each quarter or the platform has a product problem, not a marketing one.
Second, feature sunset rate. Count how many capabilities you retired last quarter. Zero is a red flag. Platform backlogs grow forever, but platform surface area should not. A healthy team retires roughly 10 to 20 percent of its capability count per year, sometimes more when the organization’s needs shift.
And then, time to first value for a new team. Measure the days from the moment a new engineering team decides to adopt your platform until they ship something real to production on top of it. I have seen this number range from two days in the best platforms I have worked with, to over three months in the worst. At three months, the platform is not a product; it is a second layer of infrastructure that customer teams have to work around.
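None of the three metrics needs special tooling. Here is a minimal sketch of how they might be computed from a flat usage log; the event shape (team, capability, day, event kind) and the sample data are hypothetical, stand-ins for whatever telemetry your platform already emits.

```python
from datetime import date, timedelta

# Hypothetical event log: (team, capability, day, kind)
# kind is "use", "adopt", or "first_prod_ship".
events = [
    ("payments", "ci-runner", date(2024, 6, 3), "use"),
    ("payments", "ci-runner", date(2024, 6, 5), "use"),
    ("search",   "ci-runner", date(2024, 6, 4), "use"),
    ("search",   "secrets",   date(2024, 5, 1), "adopt"),
    ("search",   "secrets",   date(2024, 5, 9), "first_prod_ship"),
]

def weekly_active_teams(events, as_of):
    """Unique teams with any 'use' event in the last seven days."""
    window_start = as_of - timedelta(days=7)
    return len({team for team, _, day, kind in events
                if kind == "use" and window_start < day <= as_of})

def sunset_rate(retired_last_year, capability_count):
    """Fraction of the capability surface retired per year.
    Roughly 0.10 to 0.20 is healthy; zero is a red flag."""
    return retired_last_year / capability_count

def time_to_first_value(events, team):
    """Days from a team's adopt decision to its first production ship."""
    adopt = min(day for t, _, day, k in events if t == team and k == "adopt")
    ship = min(day for t, _, day, k in events
               if t == team and k == "first_prod_ship")
    return (ship - adopt).days

print(weekly_active_teams(events, date(2024, 6, 6)))  # -> 2 teams this week
print(time_to_first_value(events, "search"))          # -> 8 days
```

The point of counting teams rather than API calls in `weekly_active_teams` is that one enthusiastic script can inflate call volume forever without a single new customer showing up.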
None of these numbers are hard to collect. The reason most platform teams do not collect them is that the results force conversations nobody wants to have. Feature seven gets killed, engineer four switches projects, and the roadmap gets rewritten around something that looks less impressive on a slide.
That is the job. A platform team that is not willing to do it ends up running a subsidized infrastructure group with better branding, and the adoption number will never move no matter how many features ship.
Platform as a product starts the quarter someone has the courage to measure it like one.