When I joined Zola Electric as Head of Engineering in early 2024, our first challenge was crystal clear: How could we harness Gen AI to power reliable, affordable energy for 3 billion people—reaching the homes, schools, clinics, and businesses that are the backbone of their communities? With AI announcements flooding in daily, it’s like drinking from a firehose – especially for a lean team like ours. But here’s the silver lining: being small and nimble means we can move fast, experimenting with the latest, most powerful tools without getting slowed down by organizational resistance. Better yet, we can shape our product roadmap around these emerging AI capabilities instead of being held back by legacy systems.

Revolutionizing Customer Support with Zi Assistant

Today, Zola delivers community-level electrification through our technology platform in more than 10 countries across 4 continents. The launch of our Zola iNTELLIGENCE (Zi) Assistant has transformed how we support our global customers, offering written and spoken assistance in their preferred languages through its deep understanding of our knowledge base. The application includes a feedback loop, retains the history of previous interactions, provides a personalized experience for each user, and ensures continuity in conversations across different contexts.
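As a rough illustration, that kind of conversation continuity can be modeled as per-user session memory that is replayed into each new prompt. The sketch below uses illustrative names and shapes, not our production schema:

```typescript
// Hypothetical sketch of per-user session memory (illustrative, not our schema).
interface Turn {
  role: "user" | "assistant";
  text: string;
  lang: string; // the user's preferred language for this turn
}

class ConversationHistory {
  private turns = new Map<string, Turn[]>(); // keyed by user id

  append(userId: string, turn: Turn): void {
    const list = this.turns.get(userId) ?? [];
    list.push(turn);
    this.turns.set(userId, list);
  }

  // The last N turns are replayed into each new prompt, so context
  // carries across sessions and channels for the same user.
  recent(userId: string, n = 10): Turn[] {
    return (this.turns.get(userId) ?? []).slice(-n);
  }
}
```

Persisting this per user, rather than per session, is what lets a conversation resume where it left off regardless of channel.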

We started with OpenAI models, with Pinecone serving as our vector store for managing our knowledge base. We designed the architecture so that we can easily plug in alternative models as AI technology rapidly evolves. As Gen AI capabilities continue to mature, we intend to deliver several AI agents on our platform, such as call analysis, credit assessment, and site configuration tools, while also deploying other ML use cases such as predictive maintenance to assist our customers in their community-level electrification journey.
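To illustrate the pluggable design, here is a minimal sketch in TypeScript (our application language). The interface and class names are hypothetical; in a real deployment `VectorStore` would be backed by Pinecone and `ChatModel` by an OpenAI client, but the assistant code depends only on the interfaces, so swapping providers is a configuration change:

```typescript
// Hypothetical sketch of a pluggable model layer (names are illustrative).
interface ChatModel {
  complete(prompt: string): Promise<string>;
}

interface VectorStore {
  // Returns the text of the top-k most similar passages.
  query(embedding: number[], topK: number): Promise<string[]>;
}

// In-memory stand-in for a vector store, ranked by cosine similarity.
class InMemoryStore implements VectorStore {
  constructor(private docs: { embedding: number[]; text: string }[]) {}

  async query(embedding: number[], topK: number): Promise<string[]> {
    const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
    const cosine = (a: number[], b: number[]) =>
      a.reduce((s, x, i) => s + x * b[i], 0) / (norm(a) * norm(b));
    return this.docs
      .map(d => ({ text: d.text, score: cosine(embedding, d.embedding) }))
      .sort((x, y) => y.score - x.score)
      .slice(0, topK)
      .map(d => d.text);
  }
}

// Retrieval-augmented answer: fetch relevant passages, then prompt the model.
async function answer(
  model: ChatModel,
  store: VectorStore,
  queryEmbedding: number[],
  question: string
): Promise<string> {
  const context = await store.query(queryEmbedding, 3);
  return model.complete(`Context:\n${context.join("\n")}\n\nQuestion: ${question}`);
}
```

Because `answer` never names a concrete provider, a new model or store only needs to implement the two interfaces.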

Building a Living Knowledge Base

A decade of operations in Tanzania has given us a wealth of business data and market insights – vital knowledge that powers our Zi Assistant. With rotating teams, as is common in a startup, the first challenge is the lack of a cohesive understanding of in-house data. We set out to build a dynamically generated knowledge base, leveraging Gen AI tools, our existing data stores, and our code base.

To make this work, we embedded efficient AI-guided conversations with subject matter experts (SMEs) in our process. The team also built a prompt library to reliably elicit the right response for each content type. The entire process included three important feedback loops: searching for additional documentation to fill gaps, updating the overall structure based on the code base, and revising component documentation based on SME feedback. Tools like Claude and Cursor have been invaluable in this process, though diagramming tools like Lucid and Mermaid are still building out their Gen AI capabilities.
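A prompt library of this kind can start as simply as a map from content type to prompt template. The content types and wording below are illustrative, not our actual library:

```typescript
// Hypothetical sketch of a prompt library keyed by content type.
type ContentType = "api-doc" | "runbook" | "architecture-overview";

const promptLibrary: Record<ContentType, (topic: string) => string> = {
  "api-doc": t =>
    `Document the public API of ${t}: endpoints, parameters, and error codes.`,
  "runbook": t =>
    `Write an operational runbook for ${t}: symptoms, diagnosis steps, remediation.`,
  "architecture-overview": t =>
    `Describe the components and data flows of ${t}, suitable for a Mermaid diagram.`,
};

// Selecting the template by content type keeps responses consistent
// regardless of which engineer (or which model) runs the generation.
function buildPrompt(type: ContentType, topic: string): string {
  return promptLibrary[type](topic);
}
```

Keeping templates in one typed map also makes the SME feedback loop concrete: revising a template improves every future document of that type.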

Evolving Our Data Architecture

Our next step was to build a scalable data architecture that can enable the various use cases on our Zi platform and surface valuable insights from the data to our customers. This included a re-assessment of the needs of our data platform: storage classes, data quality, ML Ops, and dynamic, data-driven interfaces such as “Prompt to Dashboard” analytics and reports. Part of this journey is also choosing an appropriate framework for tracking AI rubrics – accuracy, completeness, and quality/usefulness (a user-adoption KPI). The system surfaces areas with low scores via an admin portal, enabling our customer support team to update the knowledge base and maintain proper supervision.
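One way to operationalize those rubrics is to aggregate per-answer scores by knowledge-base area and flag the areas that fall below a threshold, which is roughly the signal an admin portal like ours would surface. The field names and threshold below are illustrative, not our production schema:

```typescript
// Hypothetical sketch of rubric aggregation (illustrative fields and threshold).
interface RubricScore {
  area: string;         // knowledge-base section the answer drew on
  accuracy: number;     // 0..1
  completeness: number; // 0..1
  usefulness: number;   // 0..1, e.g. derived from user feedback
}

// Returns the areas whose average overall score falls below the threshold,
// i.e. the sections the support team should prioritize for updates.
function lowScoringAreas(scores: RubricScore[], threshold = 0.6): string[] {
  const byArea = new Map<string, { sum: number; n: number }>();
  for (const s of scores) {
    const overall = (s.accuracy + s.completeness + s.usefulness) / 3;
    const agg = byArea.get(s.area) ?? { sum: 0, n: 0 };
    agg.sum += overall;
    agg.n += 1;
    byArea.set(s.area, agg);
  }
  return Array.from(byArea.entries())
    .filter(([, a]) => a.sum / a.n < threshold)
    .map(([area]) => area);
}
```

Averaging the three rubrics equally is itself a design choice; weighting usefulness higher would bias the queue toward what users actually complain about.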

To learn more about our data architecture and ML Ops journey, read Data Architecture and ML Ops to scale our Zi platform by our senior data engineer Mansi Kapoor.

Supercharging Development with AI

As a team, we were excited to re-think our end-to-end software development life cycle (SDLC) to best leverage AI tools, frameworks, and methodologies. This meant reimagining everything from product managers building UX with prompts to auto-converting designs to code, with AI code assistants supporting the full application development cycle. We made a deliberate choice to use TypeScript across our application stack (Material UI, Next JS, and Node JS) and found our optimal tooling combination in v0 + Cursor + Claude. The results have been dramatic: we can now complete development of a full feature module with 5-10 screens within a week, compared to more than two weeks before. While security concerns aren’t significant at our current scale of adoption – and we’re using the tools’ privacy modes as a starting point – we’re working on locking down other aspects as we grow.

For insights into our tech stack choices and AI tools adoption, check out “New Tech Stack and AI tools adoption at the start of our CRM development journey” by our senior full-stack engineer Alex Marchenko.

Looking Ahead

This is a dynamic landscape and we’re just getting started. We see tools evolving across the stack that can both improve SDLC efficiency and enhance our product feature set. There is more to be done on cost optimization for AI and the associated data components as we scale. Given our focus on the global south, we expect challenges with data and model access in areas with weak networks, as well as complexities around regional accents and languages as we progress on this journey. The key is a mission-driven culture that enables the team to champion new ideas and tools while maintaining alignment on the end strategy and outcomes.