Vendor Evaluation

You Did Not Buy Software. You Bought a Retainer.

How to tell the difference — and why most restaurant AI platforms are consulting engagements wearing a SaaS price tag

April 9, 2026 · 7 min read · superGM Intelligence Team
competitive · vendor evaluation · industry · operations

There is a class of restaurant technology platform that markets itself as software and operates as consulting.

You pay a monthly license. You receive, in practice: a quarterly analysis of your own operation delivered by their team, a Success Manager whose job is to interpret the platform for you, and an implementation project that runs for months because the software cannot configure itself for your operation.

This is not software. This is a retainer with a good interface.

The tell is simple. Count the humans between the signal and the action. Every one of them is a consulting layer wearing a software price tag.

The QBR Is the Consulting Invoice

The Quarterly Business Review is standard in enterprise software. A vendor rep meets with your leadership team, walks through a deck of your operational data, and tells you what it means.

Consider what is actually happening in that meeting. Your vendor has built software that processes your operational data. The software surfaces data. The data requires interpretation. The interpretation is performed by their team. The meeting exists to deliver the interpretation to you.

That is a consulting deliverable. Consulting firms charge $400 to $600 per hour for that deliverable. Your QBR is bundled into your license fee. It is not a gift. It is the business model.

The question to ask: if your vendor stopped showing up for the QBR, would the software still generate value? If the answer is no — if the platform requires their team to explain what it means — you are not running software. You are running a retainer.

The Implementation Project Is a Discovery Engagement

A platform that requires an extended implementation project is telling you, by the length of that timeline, how much of the software can configure itself. The answer is: not enough.

Implementation specialists are consultants. Their job during the implementation project is to learn your operation and configure the platform to serve it. That learning — mapping your workflows, understanding your compliance exposure, identifying your yield patterns — is work that the software was supposed to do autonomously.

It did not. Their team is doing it instead. You are paying for the implementation project. You are funding a consulting engagement to compensate for what the software could not do by itself.

Ask your vendor: what does the platform do on day one, before the implementation project is complete? The honest answer tells you the ratio of software to consulting in what you purchased.

The Success Manager Is a Consultant

Customer success management is a legitimate function. It exists to help customers get value from software they purchased. The question is what the success manager actually does.

In a genuine software product, the success manager helps customers use features they have not discovered, troubleshoot edge cases, and maximize adoption. In a consulting-ware product, the success manager is an ongoing interpretation layer. They attend your operational reviews. They help you understand what the platform surfaced. They translate the output into action recommendations.

The translation requirement is the tell. Software that requires a translator is not software that you can operate independently. The success manager is the translator. The translation is the product. You are paying for a person, not a platform.

The Playbook Is the Confession

When a vendor provides you with a playbook — a guide to how to use their intelligence to drive operational outcomes — read it carefully. Not for the content. For what it implies.

The playbook exists because the software surfaces outputs that require a procedure to act on. The procedure cannot be automated. If it could be automated, the software would automate it and there would be no playbook. The playbook is documentation of the human workflow required to extract value from the platform.

That human workflow is a consulting engagement encoded in a PDF. You are managing the engagement. Your operators are executing the deliverables. The vendor calls it a playbook. Consulting firms call it a methodology.

The Benchmark Contextualization Call

Benchmarking platforms produce reports comparing your performance to industry peers. The reports are accurate. The comparison is meaningful. And then — because the data requires context to be actionable — the vendor schedules a call to explain what the comparison means for your specific operation in your specific markets.

That call is an analyst engagement. The analyst works for the benchmarking company. You are paying for the benchmark report. The analyst interpretation is bundled in. It is presented as customer success. It is a consulting deliverable.

Without the analyst call, the benchmark report produces: a number that is above or below median. With the analyst call, the benchmark report produces: a number that is above or below median, with a 45-minute explanation of why and what to do about it. The 45-minute explanation is the value. The value is human. The human is on their payroll. The software is the reason the call is necessary.

The Roadmap Seat Is Unpaid Consulting

Many platforms offer customers a seat on the product roadmap — input into what gets built next. This is framed as a partnership. It is, in practice, a user research engagement.

When you participate in roadmap discussions, you are doing the work of their product team. You are scoping the problem they should solve next. You are describing the gap between what their software does and what you need. That work has value to them. They use it to build the next version. You are not compensated for it.

At best, the roadmap seat means the next version will address your problem. At best, you are also the beta customer for the next version. At best, your problem is solved in Q3 of next year, for a platform you are paying for starting now.

What Software Actually Looks Like

Software does not require a Success Manager to generate value. It generates value while you are sleeping. It does not need you to understand its outputs — it acts on them. It does not have an implementation project because it configures itself against your operational parameters. It does not need a playbook because the playbook is encoded in the system.

Software produces outcomes, not insights. An insight is a piece of intelligence that a human must interpret and act on. An outcome is a thing that happened. The GM reached Table 9 before the window closed. The emergency order was placed before the vendor window expired. The pricing adjustment executed before service started. Those are outcomes. No consulting required.

Count the humans between the signal and the action.

Each one is a layer of consulting you are paying for at software prices.

