I Drove $31M in Bookings. The System Said I Needed Improvement.
Here's what my peers wrote in my performance review: "One of the better examples of what the Leadership Principles are all about." "A thought leader and transformation driver." "His ability to lead large-scale cloud transformations, combining deep technical expertise with strategic long-term vision." "A trusted advisor across organizations."
Here's what the system said: Needs Improvement. Development Needed.
Same review. Same year. Same company.
The Numbers
I led the largest cloud consulting engagement in the healthcare and life sciences (HCLS) vertical for a major cloud provider's professional services org. An $80B biopharma company. 55 engineers across 5 scrum teams. $18M in revenue. $31M in bookings. The engagement became the reference architecture — the blueprint for how every future deal in that vertical was sold. I flew out with the VP of the entire HCLS vertical to help close deals with other Fortune 500 pharma companies. Leadership was pulling me off the engagement to do trusted advisor work with new customers. I gave private executive briefings before dinner at the industry's largest conference to CIOs and CISOs from some of the largest pharmaceutical companies in the world.
I got production access approved for our consultants in a customer environment — something that had never been done before. Two people tried before me and failed. I wrote a 30-page security narrative, sat in front of review boards for 9 months, and got it approved. Then I got a third-party consulting partner on our paper approved for prod access too — that was unheard of.
Account vending went from 30 days to 45 minutes. The customer got early access to generative AI services because the foundation we built made it possible. Their CIO presented our work on the keynote stage at the industry's largest conference. The platform won Intelligent Digital Enterprise of the Year, Data Mesh of the Year, and a CIO 100 Award.
What I Didn't Do
While I was running all of that — while I was up until midnight protecting my team during a production incident, while I was banning the sales team from making promises we couldn't deliver, while I was navigating political firestorms between consultancies and internal security teams — I didn't write enough internal blog posts. I didn't get a basic intro-to-AI certification by a specific date, even as I was working with the AI product team during early release, helping my customer implement the service before it was generally available. I now run three AI platforms. But the system needed that certificate. I submitted some timecards late.
That's what the system measured. Not the $31M. Not the reference architecture. Not the executive briefings at the industry's largest conference. Timecards.
The Customer Satisfaction Score
It gets better. Part of the performance system was tied to customer satisfaction surveys. Our customer gave us an 8 out of 10. In this company's system, an 8 was a nuclear event. Suddenly directors were calling the customer, leadership was in crisis mode, and every engineer on the engagement took a hit in their performance review.
The customer's actual feedback? He was happy with the thought leadership — meaning me and my leads — but unhappy with the number of offshore resources. A staffing decision his boss and the sales team made. Not us. He later told me: "Brian, from now on just tell me what to put in the forms. I'm not trying to cause problems for you all."
But the score was in the system. And the system doesn't do nuance.
The Sales Credit Problem
Here's the part that still stings. The engagement I built became the go-to-market template for the entire vertical. My peers wrote it explicitly: "This has translated to a go-to-market approach" for the industry. But I never got credit for "building the business." The sales team packaged up what I delivered and sold it as their own. In the performance system, they got the points. I got told to write more blog posts.
Why This Matters Beyond My Story
I'm not writing this for sympathy. I took the severance. I'm building OutcomeOps now and I've never been more energized. I'm writing this because every senior engineer and technical leader in a large organization has some version of this story. The system rewards the people who optimize for the system, not the people who own the outcome.
When you measure what actually matters — time to outcome, customer success, business impact, architectural quality — you find the people who are actually driving your organization forward. When you measure timecards and certifications, you find the people who are best at filling out forms.
I built OutcomeOps because I'm done waiting for large organizations to figure this out. The Outcome Engineer doesn't exist inside a stack ranking system. They exist in operating models that escape the local optimization trap and align everyone around one thing: did the customer win?
My peers knew the answer. The system didn't care.