Windsurf Track
Module 32
Windsurf Across All ThreadCo Brands: Maya wants both developers using Windsurf consistently across all three ThreadCo brands. This module covers the enterprise setup: SSO, a shared base .windsurfrules for all ThreadCo projects, a model access policy (Sonnet for most work, Opus only for campaign copy), and the 90-day productivity measurement plan.

Enterprise Windsurf Deployment

Deploying Windsurf across an engineering organisation requires decisions about data residency, model access controls, usage monitoring, and integration with existing developer toolchains. This module covers the enterprise-specific considerations.

Windsurf for Teams

Windsurf for Teams adds centralised licence management, SSO integration, usage dashboards, and admin controls. Team admins can set organisation-wide default models, restrict which models developers can access, and view per-user usage metrics.

Data Privacy Controls

By default, code context sent to Windsurf is not used for model training. For organisations with strict data residency requirements, enable the "no telemetry" mode which prevents any code from being logged. Verify the current Codeium DPA covers your jurisdiction before rollout.

Self-Hosted Models

Windsurf supports connecting to self-hosted models (Ollama, vLLM, Azure OpenAI) or using your organisation's own Anthropic API key. This keeps all inference within your cloud boundary -- essential for air-gapped environments or organisations that have negotiated zero-logging agreements.
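Before pointing Windsurf's custom-provider setting at a self-hosted endpoint, it is worth a pre-flight check that the server is actually up. A minimal sketch, assuming a local Ollama instance on its default port (`/api/tags` is Ollama's model-list route; the function name is ours, not Windsurf's):

```python
# Pre-flight check before configuring Windsurf against a self-hosted model.
# Assumes Ollama's default port; /api/tags is Ollama's model-list endpoint.
import json
import urllib.request
from urllib.error import URLError


def ollama_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Return model names served by a local Ollama instance, or [] if unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=3) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (URLError, OSError, ValueError):
        return []
```

Returning an empty list rather than raising keeps the check safe to run in a setup script: an empty result simply means "do not enable the self-hosted provider yet".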

Developer Productivity Metrics

Windsurf for Teams provides: completion acceptance rate, lines of AI-assisted code per developer, Flow usage frequency, and time-in-AI-mode. Use these to identify power users (potential Champions), measure productivity gains, and justify the licence cost to leadership.
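The raw dashboard counts are more useful to leadership as ratios. A minimal sketch of that roll-up (the parameter names are illustrative, not Windsurf's API):

```python
def metrics_summary(accepted: int, shown: int, ai_lines: int, total_lines: int) -> dict:
    """Fold raw dashboard counts into the two ratios leadership asks about.

    accepted/shown   -> completion acceptance rate
    ai_lines/total   -> share of committed lines that were AI-assisted
    """
    return {
        "acceptance_rate": f"{accepted / max(shown, 1):.0%}",
        "ai_line_share": f"{ai_lines / max(total_lines, 1):.0%}",
    }
```

For example, 300 accepted completions out of 1,000 shown and 2,500 AI-assisted lines out of 10,000 committed gives a 30% acceptance rate and a 25% AI line share.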

Enterprise Rollout Checklist

Procurement and Legal

Review Codeium DPA and Terms of Service. Confirm data residency model meets your requirements. Sign enterprise agreement. Set up SSO (Okta, Azure AD, Google Workspace). Assign admin accounts.

Security Review

Assess what code context leaves the developer machine (completion requests, Flow prompts, indexed codebase metadata). Confirm no source code is retained by Codeium beyond the inference request. Document findings in your vendor risk register.
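One concrete review step is confirming which repository files would ever be eligible to leave the machine as context. A sketch using glob-style exclusion patterns like those in a context policy (note that `fnmatch` is not path-aware, so `*` and `**` both match across `/` here, which errs on the side of excluding too much):

```python
from fnmatch import fnmatch


def is_excluded(path: str, patterns: list[str]) -> bool:
    """True if `path` matches any exclusion pattern and must never be sent as context."""
    return any(fnmatch(path, pattern) for pattern in patterns)
```

Run it over `git ls-files` output with your exclusion list and record the results in the vendor risk register; for example, `is_excluded("app/secrets/api_key.txt", ["**/secrets/**"])` is true.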

Pilot Cohort

Start with 5-10 volunteer developers across 2-3 teams. Measure baseline productivity metrics before rollout. Collect structured feedback after 4 weeks: what worked, what did not, what conventions need to be added to .windsurfrules.
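Baseline-versus-pilot comparison comes down to signed percentage changes. A tiny helper sketch (assumes a non-zero baseline):

```python
def delta(baseline: float, after: float) -> str:
    """Signed percentage change from baseline, e.g. for PR cycle time in days."""
    return f"{(after - baseline) / baseline:+.0%}"
```

For instance, PR cycle time dropping from 10 days to 7 reports as `-30%`.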

Shared .windsurfrules Baseline

Work with the pilot cohort to create a base .windsurfrules template for each major stack your organisation uses. These become the starting point for every new project repository.
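As a hypothetical starting point for one stack (the conventions below are illustrative, not ThreadCo's actual ones), such a template might look like:

```
# .windsurfrules -- TypeScript/React baseline (hypothetical example)
- Use TypeScript strict mode; no `any` without a justifying comment.
- Components are function components with hooks; no class components.
- All user-facing strings go through the i18n helper, never inline literals.
- Tests live next to the file under test as *.test.tsx.
```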

Training Programme

Run 2-hour hands-on workshops for each engineering team -- not slide decks. Each developer solves a real task from their backlog using Windsurf during the session. Practical experience in the first session is the strongest predictor of long-term adoption.

Measure and Report

After 90 days, measure: PR cycle time reduction, lines of AI-assisted code, developer NPS, and estimated hours saved per sprint. Report to leadership with cost-per-developer vs productivity gain. This is your evidence base for org-wide rollout.
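The cost-per-developer versus productivity-gain comparison is simple arithmetic; a sketch with hypothetical inputs (seat price, hours saved, and hourly rate are all things you must measure or negotiate, not givens):

```python
def licence_roi(seats: int, seat_cost_month: float,
                hours_saved_per_dev_month: float, hourly_rate: float) -> dict:
    """Monthly licence cost vs the estimated value of developer hours saved."""
    cost = seats * seat_cost_month
    value = seats * hours_saved_per_dev_month * hourly_rate
    return {
        "monthly_cost": cost,
        "monthly_value": value,
        "ratio": round(value / cost, 1),
    }
```

For example, 10 seats at 30/month with 4 hours saved per developer per month at a 75/hour loaded rate gives a 10x cost-to-value ratio; that ratio is the headline number for the leadership report.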

ShopMate -- Windsurf Enterprise Config

JSON -- Windsurf Enterprise Policy for ThreadCo
{
  "organisation": { "name": "ThreadCo Ltd", "ssoProvider": "google" },
  "modelPolicy": {
    "allowedModels": ["claude-sonnet-4-6", "claude-haiku-4-5-20251001"],
    "defaultModel": "claude-haiku-4-5-20251001",
    "opusAllowedGroups": ["senior-dev"]
  },
  "privacy": { "disableTelemetry": true, "disableTraining": true },
  "contextPolicy": {
    "excludePatterns": ["**/.env", "**/secrets/**", "**/brands.yaml"]
  }
}
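Before distributing a policy file like the one above, a quick sanity check catches the most common mistake: a default model missing from the allow-list. A minimal sketch; the keys mirror the example above, not an official Windsurf schema:

```python
def validate_policy(policy: dict) -> list[str]:
    """Return a list of problems found in a model-policy dict (empty if OK)."""
    problems = []
    model_policy = policy.get("modelPolicy", {})
    allowed = set(model_policy.get("allowedModels", []))
    default = model_policy.get("defaultModel")
    if default and default not in allowed:
        problems.append(f"defaultModel {default!r} is not in allowedModels")
    if not policy.get("privacy", {}).get("disableTraining"):
        problems.append("privacy.disableTraining should be true for enterprise rollout")
    return problems
```

Run it in CI against the checked-in policy JSON so a bad edit fails the build rather than silently locking developers out of their default model.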
Python -- scripts/windsurf_productivity.py
# 90-day productivity report for Maya's investor update
import subprocess
from datetime import datetime, timedelta

def git_report(days: int = 90) -> dict:
    """Summarise commit activity over the last `days` days.

    Commits whose subject mentions 'Cascade' (Windsurf's agent) are used
    as a rough proxy for AI-assisted work.
    """
    since = (datetime.now() - timedelta(days=days)).strftime("%Y-%m-%d")

    def count(extra: str = "") -> int:
        # Count lines in Python rather than piping to `wc -l`, so the
        # script also works outside a POSIX shell.
        out = subprocess.getoutput(f"git log --since={since} --oneline {extra}").strip()
        return len(out.splitlines()) if out else 0

    commits = count()
    ai_commits = count("--grep='Cascade'")
    features = count("--grep='feat:'")
    return {
        "total_commits": commits,
        "ai_assisted_commits": ai_commits,
        "new_features": features,
        "ai_assist_rate": f"{ai_commits / max(commits, 1):.0%}",
    }

if __name__ == "__main__":
    stats = git_report()
    print("Last 90 days:")
    for key, value in stats.items():
        print(f"  {key}: {value}")