The firm
Cascade Wealth Partners is a 35-employee Registered Investment Adviser based in the Pacific Northwest with approximately $450 million in assets under management (AUM) across roughly 850 client households. The firm provides comprehensive wealth management: financial planning, investment management, tax-aware portfolio construction, and a small estate-planning collaboration with outside attorneys.
The firm has been SEC-registered since 2019, when its AUM crossed the threshold for federal registration. Its examination cycle runs every three to four years; the last examination, in early 2022, concluded without significant findings.
The forcing function
Sarah, the firm's Chief Compliance Officer, had been following industry publications closely through 2023 and 2024. She read about the SEC's "AI Washing" enforcement actions against advisers who mislabeled their use of AI in marketing materials. She read about RIAs receiving exam questions on AI tool inventories and client data handling. She read about deficiency findings citing "inadequate supervision of artificial intelligence in adviser workflows."
She read about the late-2024 Risk Alert from the SEC's Division of Examinations (formerly OCIE) that flagged AI use as an examination focus for 2025.
Sarah didn't have an AI policy. She didn't have an AI tool inventory. She didn't know what her firm's twenty-eight advisors and support staff were using — and she had a strong suspicion the answer was "more than zero."
In an October 2024 staff meeting, she made the case to the managing partners: the firm's next SEC examination would almost certainly include AI-related questions, and the firm would almost certainly not have ready answers. The cost of getting ahead of it would be small; the cost of finding out during the exam would not.
The managing partners agreed. The firm had three to six months before they could plausibly be scheduled for an examination, given their cycle position.
The decision
Sarah evaluated three paths:
Path 1: Do it internally with a template. Adapt one of the AI policy templates circulating in industry compliance forums, run a half-day training, and hope the SEC examiner would accept it.
Path 2: Hire a generalist compliance consultant. Have an outside firm produce a policy and run training.
Path 3: Engage a specialist for a structured, documented Sprint. Get a written evidence package the SEC examiner could look at.
She picked Path 3 for three reasons:
- A generic AI policy from a template wouldn't address the specific tools her advisors were using — and she didn't know what those tools were yet without a discovery process.
- A documented Sprint produced what the examiner would actually look for: an inventory, a risk classification, a control map, training records, written acknowledgments. Templates don't produce those.
- The cost differential between Paths 2 and 3 was modest; the difference in evidence quality was significant.
She found Shadow AI Labs through a referral from her cyber insurance broker, who had been recommending the Sprint to mid-sized advisers. Cascade engaged for the AI Risk Sprint — $5,500, two weeks.
Inside the Sprint
Deliverable 01 — AI Tool Discovery (Days 1–4)
Browser telemetry across all 35 employees, an anonymous survey, a procurement audit, and a review of recent IT tickets for mentions of AI features. Findings:
- 14 AI tools in active use across the firm — Sarah had estimated the answer would be six to eight
- Three financial planning software vendors had silently activated AI features in 2024 — features that ingested client financial data as part of "AI-assisted plan generation"
- One CRM tool had launched an AI meeting-notes feature that captured advisor-client call summaries to vendor-controlled infrastructure
- Six advisors were using consumer ChatGPT accounts for "market commentary drafting" and "client letter drafting" — including two who had pasted client portfolio summaries to get personalized language suggestions
- Two support staff were using AI-powered transcription tools for paper documentation backfill, including some client onboarding documents containing identifying information
The finding Sarah cited most often was the financial planning software piece: three vendors the firm had held documented contracts with for years had introduced AI features that ingested client data, without any contract amendment signed by Cascade.
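One discovery input, the browser-telemetry review, can be sketched as a scan of exported browsing logs against a watchlist of known AI service domains. The watchlist, the `user,url` log format, and the sample entries below are illustrative assumptions, not the actual telemetry method used in the Sprint.

```python
# Illustrative sketch: flag visits to AI-service domains in exported
# browser/proxy logs. Watchlist and log format are assumptions for
# demonstration only, not the firm's real telemetry pipeline.
from collections import Counter
from urllib.parse import urlparse

AI_DOMAIN_WATCHLIST = {
    "chat.openai.com", "chatgpt.com", "claude.ai",
    "gemini.google.com", "otter.ai", "fireflies.ai",
}

def flag_ai_visits(log_lines):
    """Count visits per (user, AI domain) from 'user,url' CSV lines."""
    hits = Counter()
    for line in log_lines:
        user, url = line.strip().split(",", 1)
        host = urlparse(url).netloc.lower()
        if host in AI_DOMAIN_WATCHLIST:
            hits[(user, host)] += 1
    return hits

sample = [
    "advisor01,https://chat.openai.com/c/abc123",
    "advisor01,https://www.bing.com/search?q=bonds",
    "staff07,https://otter.ai/my-notes",
]
print(flag_ai_visits(sample))
```

A scan like this only surfaces candidates; the survey and procurement audit are what turn raw hits into a verified inventory.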
Deliverable 02 — Risk Classification Matrix (Days 3–6)
Each tool mapped against NIST AI RMF severity and a fiduciary-specific criticality dimension (which weights potential breach of duty of care or duty of loyalty as a separate factor):
| Severity | Count | Pattern |
|---|---|---|
| Critical | 3 | Consumer AI used with client portfolio or planning data |
| High | 4 | Vendor AI features activated without contract update |
| Medium | 5 | Productivity tools with some advisor-client communication exposure |
| Low | 2 | General-use tools with limited client-data context |
The Critical-severity classification covered the patterns most likely to surface in an SEC examination's adviser-workflow questions.
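A two-dimensional classification like this can be expressed as a simple scoring function that weights fiduciary criticality above general severity. The numeric weights and tier cutoffs below are illustrative assumptions, not values from the NIST AI RMF or the Sprint's actual matrix.

```python
# Illustrative sketch: combine a NIST-style severity score with a
# fiduciary-criticality score to bucket tools into tiers.
# Weights and cutoffs are assumptions for demonstration only.

def classify(tool, severity, fiduciary_risk):
    """severity, fiduciary_risk: 1 (low) .. 4 (critical)."""
    score = severity + 1.5 * fiduciary_risk   # fiduciary duty weighted higher
    if score >= 9:
        tier = "Critical"
    elif score >= 7:
        tier = "High"
    elif score >= 5:
        tier = "Medium"
    else:
        tier = "Low"
    return (tool, tier)

inventory = [
    ("Consumer ChatGPT with client data", 4, 4),  # pasted portfolio data
    ("Planning vendor AI feature",        3, 3),  # no contract amendment
    ("AI meeting transcription",          2, 2),
    ("Generic grammar checker",           1, 1),
]
for tool, sev, fid in inventory:
    print(classify(tool, sev, fid))
```

The point of the separate fiduciary dimension is visible in the scoring: a tool with moderate technical severity can still land in a high tier if it touches duty-of-care or duty-of-loyalty exposure.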
Deliverable 03 — SEC Examination Readiness Gap Analysis (Days 5–9)
Side-by-side comparison of likely examination questions (drawn from the Division of Examinations Risk Alert, recent enforcement actions, and the published 2025 examination priorities) against Cascade's documented state. The deliverable mapped 9 likely examination scopes:
- 7 of 9 had no documented evidence (AI tool inventory, AUP, training records, supervision framework, client communication review, marketing material review, vendor procurement review)
- 1 of 9 had partial documentation (existing IT security controls overlapped with some AI scope but did not specifically address AI-tool usage)
- 1 of 9 had complete documentation (existing cybersecurity policy covered relevant data-handling controls at a general level)
All 7 gaps were addressable in 90 days if remediation started immediately. The deliverable explicitly framed each one in the language Sarah could expect to see in an exam request list.
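A scope-by-scope mapping like this can be kept as a lightweight evidence register. The sketch below uses the nine scopes listed above and reproduces the 7/1/1 tally; the dictionary structure is an illustrative assumption, not the deliverable's actual format.

```python
# Illustrative sketch: an examination-scope evidence register that
# tallies documentation status the way Deliverable 03 reports it.
from collections import Counter

scopes = {
    "AI tool inventory":           "none",
    "Acceptable Use Policy":       "none",
    "Training records":            "none",
    "Supervision framework":       "none",
    "Client communication review": "none",
    "Marketing material review":   "none",
    "Vendor procurement review":   "none",
    "IT security controls":        "partial",   # overlaps AI scope generally
    "Cybersecurity policy":        "complete",  # covers data handling broadly
}

tally = Counter(scopes.values())
gaps = [s for s, status in scopes.items() if status == "none"]
print(tally)
```

Keeping the register in a structured form makes the 90-day remediation trackable: each "none" entry becomes a roadmap item with an owner and acceptance criteria.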
Deliverable 04 — AUP & Advisor-Specific Provisions (Days 6–10)
The AUP was drafted with explicit provisions for an RIA practice:
- Sanctioned tools list with categories: financial planning software (vendor-specific provisions for the three vendors whose AI features had been activated), CRM (with the meeting-notes AI feature explicitly addressed), market research and commentary tools (with sanctioned alternatives that did not retain client interaction data), productivity (Microsoft Copilot E5 with documented exclusions for client portfolio data), and prohibited (consumer AI for any client-data context).
- Client communication provisions. Specific language addressing AI use in client-facing materials — both the firm's marketing materials (per the "AI Washing" enforcement focus) and individual advisor communications.
- Fiduciary framing. Each provision included a brief annotation linking it to the relevant fiduciary obligation — duty of care for tools that could affect recommendation quality, duty of loyalty for tools that could create undisclosed conflicts of interest, and duty of confidentiality for tools that could expose client information.
- No-blame internal reporting standard, with explicit reference to SEC whistleblower protections.
Training was outlined as three modules totaling 60 minutes, with advisors receiving an additional 20-minute module on fiduciary-specific AI considerations.
Deliverable 05 — 90-Day Pre-Examination Roadmap (Days 9–13)
Sequenced action plan with owners, effort estimates, and acceptance criteria. Highlights:
- Week 1: Distribute AUP firm-wide with required acknowledgment. Disable the 3 Critical-severity tools at the network and endpoint level. Initiate contract amendment conversations with the three financial planning software vendors.
- Month 1: Microsoft Copilot E5 deployed firm-wide. Modules 1 and 2 training delivered. Initial AI tool inventory submitted to internal compliance log.
- Month 2: Vendor contract amendments executed (or vendor AI features disabled where amendments could not be reached). Module 3 training delivered. Supervision framework operationalized.
- Month 3: First quarterly governance committee meeting. Examination-readiness evidence package assembled and stored in compliance vault.
Deliverable 06 — Executive Readout (Day 14)
A 60-minute readout with the Chief Compliance Officer, the two managing partners, the CFO, and the firm's outside ERISA/securities counsel. The 22-page PDF report — including the 14-tool inventory, line-by-line examination-scope mapping, vendor amendment recommendations, and acceptance criteria for each remediation step — was delivered same-day. Three decisions were made during the meeting:
- Authorize the Implementation engagement to execute the 90-day roadmap.
- Add AI governance as a standing agenda item on the firm's quarterly compliance committee, with the CCO chairing.
- Update the firm's ADV Part 2A disclosure to reflect the firm's documented AI tool governance, in advance of the next annual update.
The follow-on
Cascade engaged Shadow AI Labs for the AI Governance Implementation — $22,000 over eight weeks, given the smaller scope of a proactive engagement vs. a post-incident recovery. The Implementation covered:
- Microsoft Copilot E5 deployment and migration
- Vendor contract amendment support (legal review, redlines, negotiation guidance)
- AUP training delivered and acknowledgment infrastructure stood up
- ADV Part 2A AI disclosure language drafted (reviewed by Cascade's outside counsel before filing)
- Compliance vault structure for the AI governance evidence
The Fractional retainer was deferred — Sarah and her team had the capacity to operate the program in steady state, and the firm preferred to bring SAL back as needed for new vendor reviews or material policy changes.
The SEC examination
Nine months after the Sprint, Cascade received the SEC examination notice. The exam request list included three AI-specific sections:
- AI tool inventory with vendor information and data-handling controls
- AI Acceptable Use Policy and training records
- Description of supervisory review for AI-assisted adviser communications
Sarah handed the examiner the complete evidence package. The exam concluded without AI-related findings. In the exit interview, the examiner noted verbally that Cascade's documentation "represents the kind of proactive program we hope to see more of from advisers in this size range."
The exam findings overall were minor — two small documentation gaps in the marketing material review process, unrelated to AI. Both were addressed in the deficiency letter response within thirty days.
The numbers
| Category | Year 1 cost |
|---|---|
| AI Risk Sprint | $5,500 |
| AI Governance Implementation | $22,000 |
| Microsoft Copilot E5 (35 seats × 12 mo) | $14,000 |
| Vendor contract amendments (legal review) | $8,500 |
| Staff training time (35 × 60 min + advisor supplement) | ~$11,000 |
| CCO time on the engagement (~80 hours allocated) | ~$15,000 |
| Total Year 1 investment | ~$76,000 |
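The Year 1 total can be verified directly from the line items in the table, treating the two approximate figures at their stated estimates:

```python
# Year 1 cost line items as stated in the table above (approximate
# figures taken at their stated estimates).
line_items = {
    "AI Risk Sprint":                    5_500,
    "AI Governance Implementation":     22_000,
    "Microsoft Copilot E5 (35 x 12 mo)": 14_000,
    "Vendor contract amendments":        8_500,
    "Staff training time (est.)":       11_000,
    "CCO time, ~80 hours (est.)":       15_000,
}

total = sum(line_items.values())
print(f"${total:,}")   # -> $76,000
```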
The counterfactual — an SEC examination deficiency finding related to AI — would have looked like:
- Deficiency remediation legal fees: $25,000–60,000 (industry-typical)
- Documentation back-fill consultant: $30,000–80,000 (compressed timeline)
- Potential follow-up enforcement action: highly variable; "AI Washing" actions in 2024 ranged from $200K to $4M depending on scope
Even at the low end of those ranges, the proactive Sprint represented a meaningful expected-value play. The CCO's stated rationale was simpler: "I would rather know what we are doing about AI than find out what we are not doing about AI in an exam."
Sarah's reflection
"The two findings the examiner did flag were minor and unrelated to AI. The big finding was the absence of an AI finding: I could see her checking each item off her list as I handed it to her. We spent less than two basis points of AUM to protect the firm. When the examination concluded, the managing partners asked me what was next. I told them: the next examination, in three to four years. We will be ready for that one too."
What we'd tell another RIA
1. The examination cycle is the forcing function
You know roughly when your next exam is coming. The window for proactive remediation is the period before the request list arrives. Once the list is in your hands, every documented evidence gap is a deficiency in formation.
2. Your existing vendors may have given you AI exposure without telling you
The financial planning software space is the most common vector: multiple vendors introduced AI features in 2024 without contract amendments. Inventory your existing vendor relationships first; the AI tools you knew about are usually a smaller problem than the AI features that were silently enabled.
3. Fiduciary framing matters
A general-purpose AI policy will not satisfy an examiner looking at adviser conduct. The AUP needs to specifically address how AI use intersects with duty of care, duty of loyalty, and duty of confidentiality — because those are the lenses the examiner uses.
4. ADV Part 2A is part of the package
Advisers who use AI in any adviser-client workflow should consider whether disclosure language is appropriate in the next annual ADV update. The SEC has been clear that "AI Washing" (overstating AI use) is an enforcement target — but understating documented AI use, or omitting it entirely, is also a disclosure issue.
Heading into an SEC examination cycle?
If your firm has not documented its AI tool inventory, AUP, training program, or supervisory framework, and your next examination is 6 to 18 months out, the Sprint produces the evidence package the examiner will ask for.
Take our free AI Risk Assessment to see where your firm sits relative to the 2025 examination priorities — or book a Discovery call to talk through your specific examination timeline.
This case study is a composite based on real-world engagement patterns with SEC-registered investment advisers. Firm name, CCO name, and specific operational details have been modified to protect confidentiality while preserving the educational value of the scenario.
