Case Study
Building a Fitness App from Scratch: React Native, Apple Watch, and Health Integration
Developed a fitness-focused mobile application in React Native for Energize Me UK, owning frontend delivery for a greenfield product. Integrated Apple Watch health data, achieved cross-platform support with Expo, and built voice features using text-to-speech and speech-to-text APIs. Delivered approximately 35% of the frontend roadmap while reducing code duplication by ~25% through reusable component design.
React Native Developer · 6 months
TL;DR
- 25% reduction in code duplication due to modular component architecture
- 35% of the full frontend roadmap delivered during the six-month engagement
- 100% cross-platform support achieved through Expo, targeting both iOS and Android
Context
At Energize Me UK Limited, I joined as a freelance React Native developer to build a fitness-focused mobile application from the ground up. The product had no existing codebase — the challenge was to architect the frontend from scratch to support complex AI-driven workflows, real-time Apple Watch health data integration, voice interactions, and a roadmap that included conversational AI and calendar syncing.
My Role
I was the sole frontend engineer for the duration of the engagement. I worked directly with non-technical stakeholders to translate ambiguous product requirements into concrete, user-focused features. Architecture decisions, component design, and delivery were entirely my responsibility.
Approach
The most important early decision was component architecture. Given the complexity of the planned feature set — AI workflows, health data, voice input/output — I broke down each domain into modular, self-contained components. This prevented the codebase from becoming entangled as new features were added, and reduced duplication by approximately 25% compared to a less structured approach.
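The module-boundary pattern described above can be sketched roughly as follows. All names here (`FeatureModule`, `ModuleRegistry`) are illustrative, not from the actual codebase: each feature domain exposes a narrow typed contract, and screens compose registered modules instead of re-implementing shared logic, which is where the duplication savings come from.

```typescript
// Shared contract every feature domain (health, voice, AI) implements.
// Names are hypothetical; the real app's interfaces are not in the source.
interface FeatureModule {
  readonly name: string;
  init(): Promise<void> | void;
  teardown(): void;
}

// A simple registry screens can pull modules from, so each feature is
// implemented once rather than per-screen.
class ModuleRegistry {
  private modules = new Map<string, FeatureModule>();

  register(mod: FeatureModule): void {
    if (this.modules.has(mod.name)) {
      throw new Error(`Module "${mod.name}" already registered`);
    }
    this.modules.set(mod.name, mod);
  }

  get(name: string): FeatureModule {
    const mod = this.modules.get(name);
    if (!mod) throw new Error(`Unknown module "${name}"`);
    return mod;
  }
}

// Usage: the voice feature registers once and is reused everywhere.
const registry = new ModuleRegistry();
registry.register({ name: "voice", init: () => {}, teardown: () => {} });
console.log(registry.get("voice").name); // "voice"
```

The key design choice is that screens depend only on the `FeatureModule` contract, so adding a new domain (e.g. calendar syncing later in the roadmap) does not entangle existing code.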
Apple Watch integration used Apple HealthKit to read health metrics in real time. Cross-platform support was managed through Expo, which allowed a single codebase to target both iOS and Android while keeping native module access available when needed.
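A minimal sketch of the read path, assuming a HealthKit bridge library such as `react-native-health` (the source does not name the exact package, so the sample shape below is an assumption). The normalization step is kept as pure TypeScript so it stays testable outside the native bridge:

```typescript
// Hypothetical shape of a heart-rate sample as returned by a HealthKit
// bridge (assumed; the actual library used is not named in the case study).
interface HeartRateSample {
  value: number;      // beats per minute
  startDate: string;  // ISO timestamp
}

// Pure helper: average bpm over a window of samples. Keeping this logic
// out of the native layer makes it reusable across iOS and Android paths.
function averageBpm(samples: HeartRateSample[]): number | null {
  if (samples.length === 0) return null;
  const total = samples.reduce((sum, s) => sum + s.value, 0);
  return Math.round(total / samples.length);
}

// In the app, samples would arrive from the native bridge, roughly:
//   AppleHealthKit.getHeartRateSamples(options, (err, samples) => {
//     const bpm = averageBpm(samples);
//   });

console.log(averageBpm([
  { value: 62, startDate: "2024-01-01T09:00:00Z" },
  { value: 68, startDate: "2024-01-01T09:01:00Z" },
])); // 65
```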
Voice features were implemented using text-to-speech and speech-to-text APIs, enabling conversational interactions with the app. These were designed as isolated modules so they could be extended into a full conversational AI experience as the product roadmap progressed.
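The isolation of the voice modules can be sketched as below, assuming Expo's `expo-speech` for text-to-speech; the speech-to-text provider is hidden behind an interface since the source does not name it. Splitting long responses into sentence-sized utterances is one way to keep TTS responsive, and that helper is pure TypeScript:

```typescript
// Abstraction over whichever speech-to-text API the app uses
// (hypothetical; the provider is not named in the case study).
interface SpeechToText {
  transcribe(): Promise<string>;
}

// Pure helper: split a reply into sentence-sized utterances so the
// TTS engine starts speaking sooner instead of queueing one long string.
function splitUtterances(text: string): string[] {
  return text
    .split(/(?<=[.!?])\s+/)
    .map((s) => s.trim())
    .filter((s) => s.length > 0);
}

// In the app this would drive expo-speech, roughly:
//   import * as Speech from "expo-speech";
//   for (const u of splitUtterances(reply)) Speech.speak(u);

console.log(splitUtterances("Great run! Your pace improved. Keep it up."));
// ["Great run!", "Your pace improved.", "Keep it up."]
```

Because the module only exposes `transcribe` and a speak path, swapping in a full conversational AI backend later means changing the implementation, not the screens that use it.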
Throughout the engagement, I worked closely with non-technical stakeholders, translating requirements that arrived as rough ideas into well-scoped, deliverable components with clear acceptance criteria.
Results
I delivered approximately 35% of the full frontend roadmap during the six-month engagement, with the work laying the foundation for the remaining roadmap items: conversational AI, calendar integration, and expanded real-time health tracking. The modular architecture meant the team could continue building on the codebase without needing to refactor the foundation I had established.
Learnings
Working with non-technical stakeholders on a greenfield product reinforced the importance of scoping before building. Requirements that sounded simple — "the app should talk to the user" — contained significant technical complexity. Taking time to break those into concrete, bounded problems before writing code prevented mid-sprint rework and kept delivery predictable.
