AI for operations: docs, vendors, and processes that do not drift
Generating SOPs from chaos, summarizing vendor contracts, building process maps. The unsexy work AI is genuinely good at.
Operations is the team that gets blamed when entropy wins. A vendor auto-renews because nobody flagged the 60-day notice. The onboarding doc still references a tool you stopped using in 2023. Two teams run the same process three different ways and nobody notices until a customer complaint surfaces it. The job, stripped down, is fighting drift. Documents go stale. Processes mutate. Contracts pile up unread. SOPs from 2022 quietly become fiction.
This is exactly the kind of work AI is unreasonably good at, because pattern-matching against recurring structures is the thing it does best. You are not asking it to be creative. You are asking it to read a lot, notice the shape, and produce something boringly consistent. That is its happy place.
Here are the four ops workflows where it earns its keep.
1. Generating SOPs from chaos
Most of your team's process knowledge lives in a Slack thread, a Loom recording, or someone's head. The bottleneck isn't writing the SOP; it's transcribing the chaos into something readable.
Paste the raw input into Claude (long context windows make this easy) or ChatGPT and ask for a numbered SOP with owners, inputs, outputs, edge cases, and the things that break. Granola or Otter will hand you the transcript if it started as a meeting.
Here's a Slack thread where Maria explains how she runs payroll.
She's leaving in 6 weeks. I need this turned into an SOP her replacement can follow.
Output format:
1. Numbered steps, with the tool used at each step
2. Owner column (who does this) and a "called when" trigger
3. Inputs required before starting, outputs produced
4. Edge cases Maria mentions (she calls them "weird ones")
5. A failure mode list: what happens if step X breaks, who gets paged
Voice: practical, no fluff. Assume the reader has done payroll before, just not here.
[paste Slack thread]
The result is a draft. It is not the SOP. Send it to the person who actually does the work and have them red-pen it before you publish. More on that below.
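If you run this handoff play more than once, the prompt scaffolding is worth templating so every SOP comes out with the same structure. A minimal sketch; the function name, field names, and template wording are illustrative, not from any library:

```python
# Template mirroring the prompt structure above; fill it per departing
# teammate and paste the result into Claude or ChatGPT.
SOP_PROMPT = """\
Here's a {source} where {person} explains how they run {process}.
{person} is leaving in {timeline}. I need this turned into an SOP a replacement can follow.

Output format:
1. Numbered steps, with the tool used at each step
2. Owner column (who does this) and a "called when" trigger
3. Inputs required before starting, outputs produced
4. Edge cases {person} mentions
5. A failure mode list: what happens if step X breaks, who gets paged

Voice: practical, no fluff. Assume the reader has done {process} before, just not here.

{raw_input}
"""

def build_sop_prompt(person, process, raw_input,
                     source="Slack thread", timeline="6 weeks"):
    """Fill the template with the specifics of one handoff."""
    return SOP_PROMPT.format(person=person, process=process,
                             raw_input=raw_input, source=source,
                             timeline=timeline)

print(build_sop_prompt("Maria", "payroll", "[paste Slack thread]"))
```

The payoff is consistency: every SOP your team generates asks for the same owner column, the same edge cases, the same failure modes, so the docs are comparable six months later.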
2. Vendor contract summaries
A 30-page MSA takes a careful reader an hour. Claude reads it in eight seconds. Paste the contract and ask for the fields you actually care about: renewal date, auto-renewal clause, termination notice period, payment terms, liability cap, indemnification language that looks unusual, and any data or IP clauses that diverge from your standard template.
Drop the output into Airtable or a Notion database with a renewal-date view, and you get something most companies your size do not have: a contract calendar that pings you 90 days before every renewal.
This is a draft. For anything material (a master agreement, a partnership deal, a contract with real liability exposure) you still want a lawyer reading the actual paper. AI gets you to the right questions faster; it does not replace the answer.
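The renewal-ping math behind that contract calendar is simple enough to wire up yourself. A sketch, assuming the extracted fields land as rows like these (the field names and vendor entries are illustrative; in practice the rows come from your Airtable or Notion database):

```python
from datetime import date, timedelta

def ping_date(renewal: date, notice_days: int, buffer_days: int = 30) -> date:
    """Flag the contract `buffer_days` before the termination-notice
    window opens: a 60-day notice plus a 30-day buffer gives the
    90-days-out ping."""
    return renewal - timedelta(days=notice_days + buffer_days)

contracts = [
    {"vendor": "Acme SaaS", "renewal": date(2025, 12, 31), "notice_days": 60},
    {"vendor": "DataCo",    "renewal": date(2025, 8, 15),  "notice_days": 30},
]

for c in contracts:
    print(c["vendor"], "ping on", ping_date(c["renewal"], c["notice_days"]))
```

Keying the ping off the notice period rather than a flat 90 days means a contract with a 120-day termination clause surfaces earlier, which is exactly the one you don't want to miss.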
3. Process maps and workflows
Take a multi-team process (employee onboarding, vendor procurement, the customer escalation path) and ask for a Mermaid diagram or a step-by-step doc. Then ask the better question: at each step, what are the failure modes? What happens if the input is missing? Who gets paged?
Most ops docs describe the happy path. The failures are where your team actually spends time. Force the model to enumerate them and you end up with a doc that's useful at 2am, not just on training day.
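If you want the diagram itself under version control rather than trapped in a chat transcript, the step list is easy to turn into Mermaid source with a few lines of code. A sketch; the onboarding steps and failure modes here are illustrative placeholders:

```python
def to_mermaid(steps):
    """steps: list of (step, failure_mode) tuples; failure_mode may be None.
    Returns Mermaid flowchart source for the happy path plus failure branches."""
    lines = ["flowchart TD"]
    for i, (step, failure) in enumerate(steps):
        lines.append(f'    S{i}["{step}"]')
        if i > 0:
            lines.append(f"    S{i-1} --> S{i}")
        if failure:
            lines.append(f'    S{i} -->|fails| F{i}["{failure}"]')
    return "\n".join(lines)

onboarding = [
    ("Offer signed",         None),
    ("HRIS record created",  "No manager assigned: page People Ops"),
    ("Accounts provisioned", "IT ticket stalls: escalate after 24h"),
]
print(to_mermaid(onboarding))
```

Paste the output into anything that renders Mermaid (Notion, GitHub, most wikis) and the failure branches show up as labeled edges next to the happy path, instead of living only in someone's head.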
4. Internal Q&A across all your docs
The "where did we agree on the new PTO policy" question shouldn't be a 20-minute scavenger hunt across Notion, Drive, Slack, and Confluence. Glean and Guru both index across those surfaces and let anyone ask in plain language. Notion AI does a lighter version inside Notion alone.
The unlock isn't the search box; it's that people stop pinging you with questions whose answers already exist. Ops gets its time back.
The trap
AI generates plausible processes that don't match what your team actually does. It will invent a step that sounds reasonable. It will assign an owner who hasn't done that work in two years. It will smooth over the messy real-world workaround that exists for a reason.
Always have someone who actually does the work review the SOP before it ships. Don't publish a doc claiming "this is how we do it" without that loop. Plausible and accurate are not the same thing, and ops is the team that pays when they get confused.
Light automation
Once the docs are clean, the next layer is the "when X happens, draft Y" pattern. Zapier, Make, and n8n all have AI Actions that let you chain a trigger (new vendor invoice, new hire in your HRIS, contract renewal in 90 days) to a model call and an output. New vendor in NetSuite, draft the security questionnaire. New hire on Monday, draft their week-one onboarding doc from the role template.
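The shape of that pattern is worth seeing stripped down. In Zapier, Make, or n8n the draft step would be a model call against your SOP or contract data; this local sketch stubs that out so only the routing logic shows. The trigger names, event fields, and draft functions are all illustrative:

```python
# "When X happens, draft Y": map each trigger to a draft step.
def draft_security_questionnaire(event):
    return f"Security questionnaire draft for vendor {event['vendor']}"

def draft_onboarding_doc(event):
    return f"Week-one onboarding doc for {event['name']} ({event['role']})"

ROUTES = {
    "vendor.created": draft_security_questionnaire,
    "hire.starting":  draft_onboarding_doc,
}

def handle(trigger, event):
    """Route a trigger to its draft step; unknown triggers fall through."""
    handler = ROUTES.get(trigger)
    return handler(event) if handler else None

print(handle("vendor.created", {"vendor": "Acme SaaS"}))
```

Note what each draft function consumes: the role template, the vendor record, the contract fields. That's why the earlier workflows come first, since the automation is only as good as the structured inputs feeding it.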
This is a deeper topic and it gets its own guide. For now, know it exists and that the patterns above are the inputs that make it work. Clean SOPs and structured contract data are what light automation runs on.
Closer
That wraps Pillar 2: nine guides covering AI by role. Founders, sales, marketing, support, finance, recruiting, design, engineering, ops. The pattern repeats across all of them. AI handles the recurring structure, you handle the judgment calls and the relationships.
Pillar 3 is for readers who want to go from using AI to building with it. Custom workflows, agents, internal tools, the move from "I prompt Claude in a tab" to "Claude runs against our data on a schedule and ships output to the team that needs it." If the audit you ran in Pillar 1 turned up patterns no off-the-shelf tool quite covers, that's where you go next.