Kai Meredith

Hi, I'm Kai.

This portfolio documents my work in architecting internal tools, automating critical workflows, and deploying systems that allow organizations to run smoothly.

Internal Operations / Enterprise Architecture

The MJO Dashboard: A Full-Stack Operations Platform

AppSheet · Google Apps Script · Acuity API · Slack API · Google Cloud (GCP)
The Mission: MJO is a high-volume public-facing social services office in NYC. We manage court-mandated services for 3,000+ participants annually with a team of 35. I architected and built the entire day-to-day digital operations platform to streamline foot traffic, automate complex compliance reporting, and bridge the gap between our team and the court system. I led development from 0 to 1 and drove adoption across the team.
Screenshots: MJO Dashboard · Front Desk SPOG

System Ecosystem

graph TB
  User[35+ Staff Members] -->|Mobile/Web App| UI[AppSheet Editor Layer]
  subgraph Backend [Backend]
    direction TB
    UI -->|Read/Write| DB[(Google Sheets DB)]
    DB -->|On Edit| Triggers[Apps Script Triggers]
  end
  subgraph Automation [Automation Layer]
    direction TB
    Triggers -->|Sync| Cal[Acuity Scheduling API]
    Triggers -->|Alert| Slack[Slack Webhooks]
    Triggers -->|Generate| Docs[Google Docs Reports]
    Triggers -->|Email| Gmail[Gmail API]
  end
  Cal -.->|Appointment Data| DB
  style UI fill:#ffffff,stroke:#333,stroke-width:2px
  style DB fill:#e8f5e9,stroke:#2e7d32
  style Triggers fill:#e3f2fd,stroke:#1565c0

Deep Dives: Operational Impact

Situational Awareness Engine

Front Desk Single-Pane-of-Glass

The Problem: A new office layout broke sightlines between reception and staff. Reliance on paper sign-ins created safety blindspots and slowed down critical communication about high-needs clients.

The Build: An AppSheet dashboard acting as a "Mission Control" for reception.

  • Real-Time Logic: Pulls Acuity API data to show today's appointments, wait times, and group rosters in one view.
  • Safety Alerts: A "Special Instructions" virtual column flags banned or high-needs clients instantly upon name entry.
  • Slack Integration: Signing a client in triggers a webhook that posts to the office channel and auto-tags the assigned social worker in a thread.
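
To make that Slack step concrete, here is a minimal sketch of the sign-in alert, assuming a bot token and channel ID stored in Script Properties (the function and property names are illustrative, not the production code):

// Hypothetical sketch: post a sign-in alert and tag the assigned social worker.
function postSignInAlert(clientName, workerSlackId, specialInstructions) {
  const props = PropertiesService.getScriptProperties();
  const text = ':wave: ' + clientName + ' just signed in. <@' + workerSlackId + '>' +
      (specialInstructions ? '\n:warning: ' + specialInstructions : '');
  UrlFetchApp.fetch('https://slack.com/api/chat.postMessage', {
    method: 'post',
    contentType: 'application/json',
    headers: { Authorization: 'Bearer ' + props.getProperty('SLACK_BOT_TOKEN') },
    payload: JSON.stringify({ channel: props.getProperty('OFFICE_CHANNEL_ID'), text: text })
  });
}
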
High-Stakes Automation

Compliance Reporting Stack

The Problem: Court reporting was a manual nightmare. Finding cases, looking up dockets, and drafting emails took ~3 minutes per case. With 30+ cases a day, this cost the team 5-6 hours weekly.

The Build: A "Human-in-the-Loop" automation suite using Google Apps Script.

  • Drive Database: A script auto-generates a nested folder structure (Year > Month > Week > Day) for organized filing (sketched after this list).
  • Template Engine: Pulls case metadata to pre-fill headers in Google Doc templates, leaving the narrative body for human validation.
  • Email Drafts: Auto-drafts emails to court parts based on a user-to-court mapping system.
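
As a sketch of the folder generator described above (not the production script; the root-folder property and date formats are assumptions):

// Illustrative sketch: ensure a Year > Month > Week > Day folder path exists in Drive.
function getDailyFolder_(date) {
  const rootId = PropertiesService.getScriptProperties().getProperty('REPORTS_ROOT_FOLDER_ID');
  const names = [
    Utilities.formatDate(date, 'America/New_York', 'yyyy'),
    Utilities.formatDate(date, 'America/New_York', 'MM MMMM'),
    'Week ' + Math.ceil(date.getDate() / 7),
    Utilities.formatDate(date, 'America/New_York', 'yyyy-MM-dd')
  ];
  // Walk the path, creating any level that doesn't exist yet.
  return names.reduce(function (parent, name) {
    const existing = parent.getFoldersByName(name);
    return existing.hasNext() ? existing.next() : parent.createFolder(name);
  }, DriveApp.getFolderById(rootId));
}
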
Closed-Loop Logistics

Phone Distribution System

The Problem: Social workers requested phones via ad-hoc Slack DMs. Requests arrived across multiple channels, overwhelming the sole Office Manager and leading to lost tickets.

The Build: A centralized ticketing system with automated state management.

  • Workflow Automation: "New Request" triggers a DM to the Office Manager. "Acknowledge" notifies the Social Worker.
  • State Machine: Tracks status from Pending → Prep → Ready/Shipped → Completed (sketched below).
  • Audit Trail: Created a reliable "Source of Truth" for leadership to audit hardware expenses.
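
A minimal sketch of that state machine, with illustrative names throughout (the DM helper and request fields are assumptions, not the production code):

// Hypothetical sketch: validate a status transition, then notify the right person.
const TRANSITIONS = {
  'Pending':       ['Prep'],
  'Prep':          ['Ready/Shipped'],
  'Ready/Shipped': ['Completed']
};

function advanceRequest(request, nextStatus) {
  const allowed = TRANSITIONS[request.status] || [];
  if (allowed.indexOf(nextStatus) === -1) {
    throw new Error('Invalid transition: ' + request.status + ' -> ' + nextStatus);
  }
  request.status = nextStatus;
  // New requests DM the Office Manager; later stages DM the requesting Social Worker.
  const recipient = nextStatus === 'Prep' ? request.officeManagerId : request.requesterId;
  sendSlackDM_(recipient, 'Phone request ' + request.id + ' is now "' + nextStatus + '".');
  return request;
}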

Consulting Work

Retail / Payroll Automation

Kinship Barbershop: Real-Time Payroll ETL

The Job: A high-volume Brooklyn barbershop needed to decouple its payroll logic from the limitations of its POS. I engineered a custom ETL pipeline that extracts raw Square data, applies complex commission rules that standard software can't express, and feeds a real-time Looker Studio dashboard for owner visibility.

System Architecture

graph TB
  SQ[Square API] -->|Fetch Payments/Orders| AS[Apps Script ETL]
  subgraph TL[" "]
    direction TB
    AS -->|Resolve Staff| ID[Identity Mapping]
    ID -->|Logic| CL[Commission Rules Engine]
    CL -->|Split| TAX[Tax & Fee Separation]
  end
  TL_LABEL["Transformation Layer"]:::groupLabel
  TL_LABEL -.-> AS
  TAX -->|Write Row| DB[(Google Sheet 'Processed')]
  DB -->|Visualize| LS[Looker Studio Dashboard]
  state[Cursor State] -.->|Sync Token| AS
  style TL fill:#ffffff,stroke:#e6e2da,stroke-dasharray:4 4
  classDef groupLabel fill:transparent,stroke:transparent,color:#6b7280,font-size:12px,font-weight:600

Key Engineering Challenges

  • Missing Staff Attribution: Square's API often drops staff IDs on Quick Charge transactions. I built a heuristic fallback, findStaffByCustomerAppointment_, that correlates payment timestamps with booking history to recover the lost attribution (a sketch follows after this list).
  • Complex Commission Logic: Implemented a tiered logic engine that handles different commission rates for Services vs. Products, splits processing fees 50/50, and isolates tax liabilities based on owner/contractor status.
  • Incremental Sync: Implemented cursor-based pagination (SYNC_CURSOR_KEY) to fetch only modified transactions, respecting API rate limits.
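
Below is my sketch of how that fallback heuristic could work; it assumes a simplified, flattened booking shape rather than Square's full nested structure:

// Sketch: when a Square payment has no staff ID, pick the same customer's booking
// that is closest in time to the payment (within an assumed 6-hour window).
function findStaffByCustomerAppointment_(payment, bookingsByCustomer) {
  const candidates = bookingsByCustomer[payment.customer_id] || [];
  const paidAt = new Date(payment.created_at).getTime();
  let best = null;
  let bestGap = 6 * 60 * 60 * 1000;
  candidates.forEach(function (b) {
    const gap = Math.abs(new Date(b.start_at).getTime() - paidAt);
    if (gap < bestGap) { bestGap = gap; best = b; }
  });
  return best ? best.staffId : null;
}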

Code Snippet

syncSquareToSheet.js
// From syncSquareToSheet()
// Logic: Batch processing with Idempotency & Staff Recovery

function syncSquareToSheet() {
  const props = PropertiesService.getScriptProperties();
  const nowIso = new Date().toISOString();
  // Resume from last sync cursor
  const beginIso = props.getProperty(SYNC_CURSOR_KEY) || isoDaysAgo_(30);

  const payments = fetchPaymentsUpdatedSince_(beginIso, nowIso);
  
  // Batch Fetch related Orders to minimize API calls (N+1 problem)
  const orderIds = unique_(payments.map(p => p.order_id));
  const ordersById = batchRetrieveOrders_(orderIds);
  
  // Heuristic: Pre-fetch booking info to resolve missing staff
  const bookingInfo = prefetchBookingInfo_(ordersById);

  // Note: catalogInfo, staffById, customersById, commissionData, and the
  // existing-row index (paymentRowIndex) are built earlier in the full function.
  payments.forEach(p => {
    // Transform raw API data into payroll row
    const row = buildProcessedRow_(
        p, ordersById[p.order_id], catalogInfo, staffById, 
        customersById, commissionData, bookingInfo
    );
    // Upsert logic (Update if exists, Append if new)
    if (paymentRowIndex[p.id]) {
      updates.push({row: paymentRowIndex[p.id], values: row});
    } else {
      appends.push(row);
    }
  });

  // (Batched sheet writes for `updates` and `appends` happen here in the full script.)
  props.setProperty(SYNC_CURSOR_KEY, nowIso);
}
Consulting / Resource Planning

New Tomorrow Partners: Resource Forecasting Middleware

Harvest API · Airtable API · Apps Script · Idempotent Sync
The Job: A consulting firm struggled to forecast resource availability across multiple projects because its data lived in siloed tools. I built a middleware solution that synchronizes Harvest time-tracking data with Airtable, creating a unified 'Source of Truth' that automatically calculates utilization rates and burn charts.

System Architecture

graph TB
  HV[Harvest API] -->|Fetch Time & Projects| MW[Middleware Script]
  subgraph ML[" "]
    direction TB
    MW -->|Aggregate| AG[Group by User/Week]
    AG -->|Calculate| UTIL[Utilization %]
    UTIL -->|Check| ID[Idempotency Check]
  end
  ML_LABEL["Middleware Logic"]:::groupLabel
  ML_LABEL -.-> MW
  ID -->|Upsert Payload| AT[Airtable API]
  AT -->|BI View| V[Forecasting Interface]
  style ML fill:#ffffff,stroke:#e6e2da,stroke-dasharray:4 4
  classDef groupLabel fill:transparent,stroke:transparent,color:#6b7280,font-size:12px,font-weight:600

Key Engineering Challenges

  • API Rate Limiting & Idempotency: The Airtable API enforces strict rate limits. I implemented a syncIdempotent function that first fetches all existing record IDs, builds a lookup map, and splits rows into separate `toInsert` and `toUpdate` batches to minimize API calls.
  • Data Normalization: Harvest stores data in nested JSON objects. The middleware flattens this into a relational structure suitable for Airtable, calculating "Burn Rate" and "Projected Finish" dates on the fly.
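
As an illustration of that flattening step, here is a sketch assuming Harvest's v2 time-entry shape; the output field names and week key are my own choices, not the production schema:

// Sketch: flatten a nested Harvest time entry into a row keyed for idempotent upserts.
function flattenTimeEntry_(entry) {
  const week = Utilities.formatDate(new Date(entry.spent_date), 'Etc/UTC', "YYYY-'W'ww");
  return {
    'Entry Key': entry.user.id + '|' + entry.project.id + '|' + week,
    'Person':    entry.user.name,
    'Project':   entry.project.name,
    'Client':    entry.client.name,
    'Week':      week,
    'Hours':     entry.hours,
    'Billable':  entry.billable
  };
}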

Code Snippet

syncIdempotent.js
// From syncIdempotent()
// Logic: Batching requests to respect API limits & prevent duplicates

function syncIdempotent(sheetName, tableId, uniqueFieldName) {
  // 1. Fetch all existing Airtable records to build a Lookup Map
  const existingRecords = fetchAllAirtableRecords(tableId);
  const recordMap = new Map();
  existingRecords.forEach(r => recordMap.set(r.fields[uniqueFieldName], r.id));

  // 2. Diff Local Data vs Remote Data
  const {hdr, data} = read(sheetName);
  const uniqueColIndex = hdr.indexOf(uniqueFieldName);
  const toInsert = [];
  const toUpdate = [];

  data.forEach((row) => {
    const uniqueVal = row[uniqueColIndex];
    const fields = mapRowToFields(row);

    if (recordMap.has(uniqueVal)) {
      toUpdate.push({ id: recordMap.get(uniqueVal), fields: fields });
    } else {
      toInsert.push({ fields: fields });
    }
  });

  // 3. Execute Batched API Calls (Max 10 per request)
  while (toInsert.length > 0) {
    const batch = toInsert.splice(0, 10);
    airtablePOST(tableId, { records: batch });
    Utilities.sleep(250); // Rate limit backoff
  }
  // (A matching PATCH loop drains `toUpdate` in batches of 10, omitted here.)
}
Insurance Marketing Automation

Public Loss Adjusters: Direct Mail Drip Campaigns

Google Sheets · Postalytics API · Time-Based Triggers · Webhooks
The Job: To automate lead nurturing without a heavy CRM, this firm needed a lightweight, reliable drip campaign. I turned Google Sheets into a state machine that tracks lead maturity and triggers physical mailers via the Postalytics API, bridging the digital-to-print gap automatically.

System Architecture

stateDiagram-v2
  [*] --> Week1
  Week1 --> Week2: 7 Day Delay
  Week2 --> Week3: 7 Day Delay
  Week3 --> Week4: 7 Day Delay
  Week4 --> Week5: 7 Day Delay
  state "Trigger Check" as TC
  Week1 --> TC
  TC --> PostalyticsAPI: Active & Due?
  PostalyticsAPI --> MailSent
  MailSent --> UpdateRow: Set Next Trigger

Key Engineering Challenges

  • State Management without a Database: Used Google Sheets itself as the state store. The script calculates nextTriggerDate based on the current stage, and an hourly time-driven trigger (`weeklyTrigger`) checks for rows that have matured.
  • Visual Feedback: Implemented a dynamic progress bar function (generateProgressBar) that updates cell values with emojis (🟩⬜) to give non-technical staff instant visual status of a lead's position in the drip campaign.
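
The progress bar itself is only a few lines; a sketch, assuming a five-week campaign:

// Sketch: render "stage 2 of 5" as 🟩🟩⬜⬜⬜ for at-a-glance status in the sheet.
function generateProgressBar(stageNum) {
  const TOTAL_STAGES = 5; // assumed campaign length
  const done = Math.min(Math.max(stageNum, 0), TOTAL_STAGES);
  return '🟩'.repeat(done) + '⬜'.repeat(TOTAL_STAGES - done);
}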

Code Snippet

processLead.js
// From processLead()
// Logic: State machine for drip campaign execution

function processLead(rowIndex, weekStage) {
  // (sheet, props, authHeader, and the row data for rowIndex come from the
  //  surrounding context in the full script.)
  const now = new Date();

  // 1. Validate Data Integrity
  if (!hasRequiredFields(row) || row[IS_PAUSED_INDEX]) return;

  // 2. Construct Payload for Direct Mail API
  const payload = buildPayload(row);
  const webhookUrl = props.getProperty(weekStage + '_WEBHOOK');

  // 3. Execute External Action
  const response = sendToPostalytics(webhookUrl, payload, authHeader);

  if (response.getResponseCode() === 200) {
    // 4. Update State Machine
    const nextWeek = new Date(now.getTime() + (7 * 24 * 60 * 60 * 1000));
    const stageNum = parseInt(weekStage.replace('WEEK_', ''));
    
    // Update Stage, Progress Bar, and Next Trigger Date
    sheet.getRange(rowIndex, 2).setValue('Week ' + stageNum);
    sheet.getRange(rowIndex, 3).setValue(generateProgressBar(stageNum));
    sheet.getRange(rowIndex, 5).setValue(nextWeek);
  }
}
Recruiting / CRM

Busy Bees Maid Service: Hiring Automation CRM

Twilio API · Google Docs (Templates) · Forms · Event-Driven
The Job: A service business faced high candidate drop-off due to slow manual follow-ups. I developed an event-driven hiring CRM that instantly engages applicants via SMS (Twilio) and manages the interview lifecycle, ensuring no candidate falls through the cracks during off-hours.

System Architecture

graph TD
  Manager[Manager Changes Status] -->|Trigger| Script
  Script -->|Decision| Check{Which Stage?}
  Check -->|Invite| SMS[Send Twilio SMS]
  Check -->|Reject| Email[Send Gmail]
  SMS -->|Wait| Form[Watch for Form Submit]
  Form -->|Update| Sheet[Update Status: 'Completed']
  Sheet -->|Notify| Slack[Slack Alert]

Key Engineering Challenges

  • "Quiet Hours" Logic: The system queues SMS messages. If a manager updates a status at 11 PM, the script detects this and schedules the Twilio API call for 8 AM the next morning to avoid disturbing candidates.
  • Dynamic Templating: Built a mapping system where the script pulls email/SMS body text from specific Google Docs (`Template Mappings` sheet), allowing the non-technical owner to rewrite copy without touching the code.
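
A minimal sketch of the quiet-hours guard referenced in the snippet below, assuming a 9 PM to 8 AM window and a queue kept in Script Properties (both are assumptions):

// Assumed quiet-hours window: 9 PM to 8 AM local time.
function isQuietHours() {
  const hour = new Date().getHours();
  return hour >= 21 || hour < 8;
}

// Sketch: queue the candidate; a daily 8 AM time-driven trigger (not shown) drains the queue.
function scheduleMessageForMorning(candidate) {
  const props = PropertiesService.getScriptProperties();
  const queue = JSON.parse(props.getProperty('SMS_QUEUE') || '[]');
  queue.push({ phone: candidate.phone, template: 'INVITE_SMS' });
  props.setProperty('SMS_QUEUE', JSON.stringify(queue));
}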

Code Snippet

onEdit.js
// Logic: Watch for Status Change -> Execute Action
// Note: installed as an installable "on edit" trigger, since simple onEdit
// triggers can't call external services like Twilio.

function onEdit(e) {
  const range = e.range;
  const sheet = range.getSheet();
  
  // Only trigger on 'Pipeline Progress' column changes
  if (sheet.getName() === 'Application' && range.getColumn() === PIPELINE_COL) {
    const status = e.value;
    const row = range.getRow();
    const candidate = getCandidateData(row);

    if (status === 'Invite to Interview') {
       // Check Quiet Hours before sending SMS
       if (isQuietHours()) {
         scheduleMessageForMorning(candidate);
       } else {
         sendTwilioSMS(candidate.phone, getTemplate('INVITE_SMS'));
       }
    }
  }
}