After five years of stitching together time trackers, KPI sheets, and appraisal docs that never spoke to each other, I built PulseWork OS — the AI Work OS for mid-size teams — on two AI development platforms in parallel. Live product in 10 hours. Scalable SaaS foundation in weeks. Zero engineers hired. Try it free →
For years, running OnPoint meant operating with a fundamental gap in visibility. Who is working on what right now? Are my team members being utilized efficiently? How do I evaluate someone fairly at appraisal time? Are project allocations matching actual capacity? Is my team productive — or just busy?
The market response to these questions is supposed to be HRMS software. So that's where I looked. I evaluated multiple platforms. I subscribed to a few. None of them connected the four things I needed: daily work → KPIs → reports → appraisals. Each tool owned one slice. None owned the whole picture.
So the data stayed scattered. And every appraisal cycle, I made decisions about people's careers using a fraction of the information I needed.
| What I Needed | What the Market Offered |
|---|---|
| Daily work logging that captures what was done | Time trackers that captured how long |
| AI-generated performance reports | Static dashboards |
| KPIs tied to actual work output | KPIs as standalone spreadsheets |
| Appraisal workflows informed by daily data | Appraisal tools disconnected from operations |
| Built for agencies, lightweight and fast | Built for enterprises — bloated, expensive, slow |
I'm not a developer. I can read code, I understand architecture, but I don't write software for a living. So the idea of building a platform should have been off the table. Then AI changed the equation.
If AI can write code from natural language descriptions, then the bottleneck is no longer engineering hours. The bottleneck is clarity of thought. That, I had.
I knew exactly what I needed. I'd been frustrated for years. I had a full mental model of the product. What I needed was an execution partner that wouldn't get tired, wouldn't quit, and wouldn't need a six-figure salary.
I decided to run a parallel experiment — build the same product on two different AI development platforms, learn the strengths of each, and let the better approach win.
Each platform optimized for something different. Running them in parallel meant I didn't have to bet on one — I learned which tool fit which stage of the journey.
CLAUDE.md briefing document — read at every session start
The Claude Code track centered on one strategic file: CLAUDE.md. A master briefing document — read by the AI at the start of every session — containing the product vision, the complete stack with exact versions, file structure conventions, architectural rules, role definitions, and the current phase priorities.
With this in place, every coding session began with a fully briefed engineer. No drift. No re-explaining. No context loss. Sub-directories of the codebase had their own scoped CLAUDE.md for module-specific rules — so the auth module didn't have to know about the work-logs schema, and vice versa.
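As a sketch of what such a briefing file can look like — the headings, stack versions, and module names below are illustrative, not the actual Pulse file — a root CLAUDE.md along these lines covers the same ground:

```markdown
# CLAUDE.md — project briefing (read at the start of every session)

## Product vision
PulseWork OS: daily work logs → KPIs → reports → appraisals, in one system.

## Stack (exact versions — illustrative)
- Next.js 14.2, TypeScript 5.4
- PostgreSQL 16, Prisma 5.x

## Conventions
- Feature folders live under src/modules/<feature>
- Modules communicate through service interfaces; no cross-module DB access

## Current phase
Phase 2: KPI engine. Do not touch appraisal workflows yet.
```

A module directory can then carry its own scoped file (e.g. a hypothetical src/modules/auth/CLAUDE.md) holding only that module's rules, so session context stays small and relevant.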
The investment pays for itself many times over: every hour spent writing the briefing document saved roughly ten hours of confused output downstream.
What started as a work-logging tool evolved into a full work intelligence platform. Each phase shippable on its own, each building toward the next.
What started as my own work-logging frustration is now a live product positioned as the AI Work OS for mid-size teams — built for the organizations that have outgrown spreadsheets but refuse to drown in enterprise software. Teams too small for SAP. Too serious for sticky notes.
PulseWork OS — live at pulse-stage.onpointnexus.com
Measured against OnPoint's own pre-Pulse baseline — using gut-feel project allocation, manual monthly report compilation, and 2-hour-per-person appraisal prep.
CLAUDE.md is the new system prompt.

Lovable is a slingshot — fast, precise, and gets you to the target. Claude Code is a workshop — slower to set up, but capable of building anything you can describe. For Pulse, both played essential roles. Lovable proved the concept and got it into the team's hands. Claude Code is building the scalable SaaS product.
Pilot tier is free for up to 50 people, with every module included. Or — if you're a founder thinking about building your own AI-assisted product — I'm happy to share the strategy openly: the CLAUDE.md template, the phase planning approach, the parallel-platform method.