From browser to hardware — Arduino, ESP32, Raspberry Pi…
Products that ship
Models that think
Code meets hardware
27 projects across web, AI, and robotics
An autonomous AI agent that scrapes major Ghanaian news sites daily, summarises headlines using an LLM, converts the summary into a voice audio briefing via ElevenLabs, and delivers a morning email digest with the audio attached. Built as a scheduled pipeline — scraper feeds into summariser, summariser feeds into voice generator, output fires through email automatically every morning with zero manual input.
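The stage chaining above can be sketched as a single function; the stage callables here are hypothetical stand-ins for the real scraper, LLM summariser, ElevenLabs client, and mailer.

```python
def run_pipeline(scrape, summarise, synthesise_voice, send_email):
    """Chain the stages: scraper -> summariser -> voice -> email.

    Each argument is a callable standing in for one stage of the
    real pipeline (names are illustrative, not the project's API).
    """
    headlines = scrape()                      # list of headline strings
    summary = summarise(headlines)            # one digest paragraph
    audio = synthesise_voice(summary)         # audio bytes from TTS
    send_email(body=summary, attachment=audio)
    return summary
```

A scheduler (cron, or a hosted job runner) fires `run_pipeline` once each morning, which is what makes the digest zero-touch.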
Extended the MPU6050 head tracker into a full wireless RC car control system. Strapped the IMU to a helmet, applied a Kalman filter to fuse gyroscope and accelerometer data — eliminating the drift and phantom movements that made raw gyro integration unusable for physical control. The filtered yaw angle maps directly to steering and pitch to throttle. ESP8266 on the transmitter side reads the sensor at 50Hz, runs the Kalman filter onboard, and fires packets over ESP-NOW to a second ESP8266 on the car. The receiver ESP forwards motor commands over SoftwareSerial to an Arduino Uno which drives the L298N motor driver. Built a side-by-side browser visualisation in Next.js showing two 3D head models receiving the same sensor data — one raw, one filtered — with live scrolling charts for pitch, roll and yaw so the noise reduction is visible in real time. Also added a web UI with a D-pad and speed slider as a fallback controller. Turn head left, car turns left.
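The sensor-fusion step can be sketched with the standard one-dimensional Kalman filter commonly paired with the MPU6050 (the firmware runs an equivalent in C++ on the ESP8266); the noise parameters below are typical defaults, not the values tuned for this build.

```python
class Kalman1D:
    """Fuse a gyro rate with an accelerometer-derived angle.

    q_angle/q_bias/r_measure are assumed defaults, not the tuned values.
    """

    def __init__(self, q_angle=0.001, q_bias=0.003, r_measure=0.03):
        self.angle = 0.0                       # fused angle estimate
        self.bias = 0.0                        # estimated gyro bias
        self.P = [[0.0, 0.0], [0.0, 0.0]]      # error covariance
        self.q_angle, self.q_bias, self.r = q_angle, q_bias, r_measure

    def update(self, accel_angle, gyro_rate, dt):
        # Predict: integrate the bias-corrected gyro rate
        rate = gyro_rate - self.bias
        self.angle += dt * rate
        P = self.P
        P[0][0] += dt * (dt * P[1][1] - P[0][1] - P[1][0] + self.q_angle)
        P[0][1] -= dt * P[1][1]
        P[1][0] -= dt * P[1][1]
        P[1][1] += self.q_bias * dt
        # Update: blend in the drift-free accelerometer angle
        S = P[0][0] + self.r
        K0, K1 = P[0][0] / S, P[1][0] / S
        y = accel_angle - self.angle
        self.angle += K0 * y
        self.bias += K1 * y
        P00, P01 = P[0][0], P[0][1]
        P[0][0] -= K0 * P00
        P[0][1] -= K0 * P01
        P[1][0] -= K1 * P00
        P[1][1] -= K1 * P01
        return self.angle
```

The gyro contributes fast, smooth motion; the accelerometer contributes a drift-free reference; the filter estimates the gyro's bias on the fly, which is what kills the phantom movements of raw integration.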
A Model Context Protocol (MCP) server exposing four clinical tools to any AI agent — patient demographics lookup, active medication retrieval, lab result analysis, and drug interaction checking. Built on FHIR R4 standards, the server fetches real patient data from FHIR-compliant hospital systems and falls back to structured mock data when needed. Implements the SHARP extension spec for automatic patient context propagation — when a clinician has a patient file open in their EHR, the patient ID and FHIR credentials flow automatically into every tool call via HTTP headers without any manual input. The drug interaction checker catches dangerous combinations like Warfarin + Aspirin before they reach the patient. An AI agent connecting to this server autonomously chains tool calls — fetching medications then checking each one against a new drug — without being explicitly programmed to do so. It reads the tool descriptions and figures out the sequence itself. Deployed via Docker on Render with SSE transport, registered on the Prompt Opinion healthcare AI marketplace where any compliant agent can discover and invoke the tools.
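The interaction check reduces to a pairwise lookup of the new drug against each active medication; the one-entry table below is a hypothetical stand-in for the real clinical data source behind the tool.

```python
# Hypothetical known-interactions table; the real server consults a
# clinical interaction source, not a hard-coded dict.
INTERACTIONS = {
    frozenset({"warfarin", "aspirin"}): "Major: increased bleeding risk",
}

def check_interactions(active_meds, new_drug):
    """Return a warning for every active medication that is known to
    interact with the proposed new drug."""
    warnings = []
    for med in active_meds:
        pair = frozenset({med.lower(), new_drug.lower()})
        if pair in INTERACTIONS:
            warnings.append((med, new_drug, INTERACTIONS[pair]))
    return warnings
```

Exposed as an MCP tool, this is the second half of the chain the agent discovers on its own: fetch the medication list with one tool, then run each entry through this check.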
A multi-agent simulation engine inspired by MiroFish — a Chinese open-source swarm intelligence system that went viral with 10k+ GitHub stars. The engine extracts character personalities directly from any raw text document, builds independent AI agents from those profiles, and simulates how those characters would interact around a given question or scenario. For the Bible demo: key scripture passages are fetched per character, fed to an extractor that derives personality traits, beliefs, speech style, inner conflicts and relationships solely from the text evidence — not from the LLM's general knowledge. Each character becomes a fully independent agent with its own system prompt. Agents are genuinely unaware of each other's profiles — they only know what has been said in the shared conversation transcript, which is what produces emergent behaviour nobody explicitly programmed. A final Report Agent watches the entire simulation and synthesizes the conversation into a structured insight — identifying the core tension, what each character represents, where they found common ground, and what the exchange reveals that no single perspective could. The engine is document-agnostic. The Bible is the demo. Feed it a news article, a novel chapter, a historical speech, or any private document the LLM has never seen — it extracts whoever is in that text and simulates them.
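The isolation that produces the emergent behaviour comes down to message assembly: each agent's context holds only its own profile prompt plus the shared transcript, never the other profiles. The names and message shapes below are illustrative, not the engine's actual schema.

```python
def build_messages(profile_prompt, transcript, speaker):
    """Assemble one agent's view of the simulation.

    profile_prompt: this agent's extracted-personality system prompt.
    transcript: shared list of {"speaker": ..., "text": ...} turns.
    speaker: this agent's own name, used to mark its prior turns.
    """
    msgs = [{"role": "system", "content": profile_prompt}]
    for turn in transcript:
        # Own past turns come back as "assistant"; everyone else's
        # words arrive as "user" content -- no profiles are shared.
        role = "assistant" if turn["speaker"] == speaker else "user"
        msgs.append({"role": role,
                     "content": f'{turn["speaker"]}: {turn["text"]}'})
    return msgs
```

Because each agent only ever sees spoken words, any alliance, disagreement, or shift of position has to emerge through the conversation itself.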
A personal AI agent running on Discord, powered by Groq (LLaMA 3.3 70B) with a full function-calling tool loop. The agent receives natural language messages and autonomously decides which tools to invoke — weather lookups via OpenWeatherMap, live currency conversion across 170+ currencies including GHS, real-time news digests via NewsAPI, web search via Tavily, and anime discovery via the Jikan/MyAnimeList API. Built with a clean multi-layer architecture separating the agent loop, tool registry, Discord gateway, and conversation memory. The core loop keeps iterating — calling tools, feeding results back to the LLM, and chaining calls — until the model produces a final answer. Per-channel conversation memory persists across message turns within a session.
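The core loop can be sketched generically; the reply shape below is a simplified stand-in for Groq's actual tool-call response format, and the single-tool-per-turn handling is an illustration, not the bot's exact control flow.

```python
def agent_loop(llm, tools, messages, max_iters=8):
    """Minimal function-calling loop.

    llm: callable returning either {"tool": name, "args": {...}} to
         request a tool call, or {"text": ...} for a final answer.
    tools: registry mapping tool name -> callable.
    """
    for _ in range(max_iters):
        reply = llm(messages)
        if "text" in reply:
            return reply["text"]              # model is done
        # Execute the requested tool and feed the result back so the
        # model can keep chaining calls.
        result = tools[reply["tool"]](**reply["args"])
        messages.append({"role": "tool",
                         "name": reply["tool"],
                         "content": str(result)})
    raise RuntimeError("tool loop did not converge")
```

The iteration cap is the safety valve: a confused model cannot spin forever calling tools, and per-channel memory is just the `messages` list persisted between turns.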
A fully autonomous job search agent that runs every two hours without supervision. Searches government employment portals across Germany, Canada, Poland, Netherlands, New Zealand, and Japan for visa-sponsored factory, farm, and manual labour positions. The German Federal Employment Agency API alone returns 90+ fresh listings per run. For each job found, the agent calls a CV matcher that scores the role against the applicant's background using an LLM. Jobs scoring above 60% proceed to cover letter generation — a tailored, human-sounding letter written specifically for that role, company, and country. Jobs with apply emails receive the letter and CV automatically by email. Jobs without emails are logged to Firebase for manual form submission. Built as a separated MCP server and agent — the tools run as a standalone MCP server while the agent loop is a separate Python process that connects to it. Firebase Realtime Database serves as persistent memory, preventing duplicate applications across runs. A digest email arrives every morning summarising what was applied to overnight.
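The scoring gate and duplicate guard can be sketched as one triage pass; the field names and in-memory `seen_ids` set are stand-ins for the real job schema and the Firebase-backed memory.

```python
def triage(jobs, score_fn, seen_ids, threshold=0.60):
    """Filter freshly scraped jobs down to the ones worth applying to.

    score_fn: callable returning a 0..1 CV-match score (the real
    system asks an LLM for this). seen_ids stands in for Firebase
    persistence so re-runs never re-process the same listing.
    """
    shortlist = []
    for job in jobs:
        if job["id"] in seen_ids:
            continue                # already handled on a prior run
        if score_fn(job) >= threshold:
            shortlist.append(job)   # proceeds to cover-letter stage
        seen_ids.add(job["id"])     # remember it either way
    return shortlist
```

Everything in the shortlist then branches: listings with an apply email get the generated letter and CV sent automatically, the rest are logged for manual form submission.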
A test project for tracking head orientation in real time using an MPU6050 IMU on an ESP8266. Mounted the sensor on a bottle cap to sit on the head. Firmware reads pitch, roll, and yaw at 50Hz — gyro calibrates on startup, deadzone filters idle noise, angles integrated from gyro data. A Python bridge reads serial output and pushes live JSON over WebSocket to a Next.js frontend that renders a 3D head model in Three.js mirroring the movement. Wanted to see if the yaw angle alone was stable enough to use as a directional control signal — it is.
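The per-axis integration step is small enough to show whole (the firmware does the equivalent in C++); the deadzone threshold below is an assumed value, the real one was tuned on hardware.

```python
GYRO_DEADZONE = 0.5   # deg/s; assumed threshold, tuned on the device

def integrate_yaw(yaw, gyro_z, bias_z, dt):
    """Integrate the calibrated gyro rate into a yaw angle.

    bias_z comes from the startup calibration; rates inside the
    deadzone are treated as idle noise and ignored.
    """
    rate = gyro_z - bias_z
    if abs(rate) < GYRO_DEADZONE:
        return yaw                  # deadzone: filter idle jitter
    return yaw + rate * dt          # dt is the 50 Hz sample period
```

This is exactly the raw integration whose slow drift motivated the Kalman-filtered follow-up project.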
End-to-end election management system built for organisations that want to run secure, transparent digital votes without the logistical overhead of paper ballots. The platform is split into two distinct surfaces: a sleek, dark-themed voter portal and a full-featured admin control panel. On the admin side, election officers can upload an entire voter list from a standard Excel file in one step — the system parses the sheet client-side, previews every row, flags duplicate student IDs in real time, hashes all passwords server-side with bcrypt, and bulk-inserts only clean records. Single voters can also be added manually through a form. Each voter can be individually enabled or disabled at any point, and admins can search, filter by department or status, and see a live department-breakdown bar chart that updates as the voter list grows. Election lifecycle is controlled through a four-state machine — Not Started → Active → Paused → Closed — with a dedicated Elections panel showing a pulsing live indicator, timestamp trail, real-time turnout percentage, and a gradient progress bar. Admins can pause voting mid-session to address issues, resume it, or close it permanently, all with a confirm dialog to prevent accidents. A full reset wipes all votes and voter statuses for a clean re-run. The admin dashboard also features a live Voting Results chart with a Standings view (ranked aspirants per position with crown indicator and progress bars) and a Trend view (cumulative votes per hour as a multi-line chart), both polling the database every 30 seconds.

On the voter side, students visit the home URL and see a context-aware screen — a countdown-style holding page if the election hasn't started, a paused notice if it's temporarily halted, or a dark-themed login card when voting is live. Authentication uses Student ID and password with a JWT session cookie valid for eight hours.
Once in, voters navigate between positions using pill-shaped tabs that turn green as each choice is confirmed. Votes save automatically on selection with a quiet 'Saved ✓' indicator — no submit button required per position. Voters can freely change their selection on any position for as long as the election remains active.
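The four-state lifecycle can be sketched as an explicit transition table; the state names come from the description above, the guard logic is illustrative rather than the platform's actual code.

```python
# Allowed moves in the election lifecycle:
# Not Started -> Active -> (Paused <-> Active) -> Closed
TRANSITIONS = {
    "not_started": {"active"},
    "active": {"paused", "closed"},
    "paused": {"active", "closed"},
    "closed": set(),                 # terminal: only a full reset re-runs
}

def transition(state, target):
    """Move the election to a new state, rejecting illegal jumps
    (e.g. reopening a closed election)."""
    if target not in TRANSITIONS[state]:
        raise ValueError(f"illegal transition {state} -> {target}")
    return target
```

Driving every admin action through one guarded table is what lets the voter portal render the right holding, paused, or login screen from the state alone.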
Full-stack evangelism tracking platform for church teams. Workers log GPS coordinates of every person they visit; pastors and admins monitor team activity in real time on a shared live map. Features a public landing page showing all visited locations interconnected with animated arcs, a real-time group chat with WebRTC voice calls, per-visit records with filtering, and role-based access (Admin / Worker / User) with an approval flow.
Affiliate marketing platform where product owners list items and offer commissions. Affiliates copy product links, share them, and earn commissions paid directly to their mobile money accounts when sales are made.
Full-featured POS for restaurant management with role pages for hot kitchen, cold larder, and pastries. Supports facial/PIN login, real-time order tracking via sockets, and an admin panel for taxes, discounts, tables, and sales reports.
Full-stack app for managing and scraping RSS feeds. Go backend handles feed creation, scraping, and Firebase auth. Next.js frontend provides a dashboard to create, view, and follow feeds with real-time updates.
Ticketing platform where users can sell and purchase event tickets. Sellers list tickets with event details; buyers browse and securely pay through Paystack.
Donation platform where users create payment endpoints for receiving donations via mobile money and credit cards. Each endpoint generates a unique QR code — no donor account required.
Dynamic NGO platform enabling organisations to manage donations, create projects, and collect funds via Mobile Money, keeping fundraising transparent and efficient.
Web app for online food ordering from a local restaurant. Features menu browsing, cart management, and secure online ordering.
Web app using the NASA API to provide real-time space data. Shows a map with your location and the ISS position, distance, and speed. Also displays astronauts aboard the ISS and lets users browse Mars rover images by Sol (Martian day).
Web-based click game where players shoot ravens moving across the screen. Includes sound effects. Built entirely with vanilla JS.
Listens to live preaching and suggests relevant Bible verses in real time. Processes speech and displays full scripture references based on implicit and explicit cues from the preacher.
Uses sentence embeddings to compare and match sentences based on semantic similarity. Provides a 2D visualization using UMAP for dimensionality reduction. Input sentences and instantly see how closely related they are.
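The similarity score at the core is cosine similarity between embedding vectors. The sketch below shows the metric itself on plain lists; the actual sentence-embedding model (and the UMAP projection) are left out.

```python
import math

def cosine_similarity(a, b):
    """Semantic closeness of two embedding vectors: 1.0 means the
    same direction, 0.0 means unrelated (orthogonal)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)
```

In the app, each input sentence is first mapped to a vector by the embedding model; this score then ranks the pairs, and UMAP squashes the same vectors to 2D for the visualisation.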
On-screen game using OpenCV to track hand movements via a camera in real time. Players burst randomly appearing balls by moving their hands over them — no controller needed.
Multi-domain text classification system built end-to-end. Fine-tuned BERT and DistilBERT models achieving 90%+ accuracy across news categorisation (AG News), spam detection (SMS), and toxic content moderation (Jigsaw). Includes a modular preprocessing pipeline handling class imbalance and text normalisation. Models deployed to Hugging Face Hub.
ASL sign language image classifier trained on Google Colab using a free T4 GPU. Improved accuracy from 60% (fully-connected network) to 85% by switching to a CNN architecture on the same dataset. Documented real-world domain-shift limitations and deployed the final model to Hugging Face Hub for public inference.
Applied K-Means clustering to 10,422 cryptocurrencies to group coins by market behaviour, revealing four distinct clusters — BTC, USDT, altcoins, and outliers — useful for identifying true portfolio diversification. Used the elbow method to determine K=4, StandardScaler for normalisation across vastly different scales, and PCA for 2D visualisation of 11-dimensional data.
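The pipeline maps directly onto scikit-learn: a minimal sketch on a feature matrix, assuming the same scale-normalise, cluster, then project-to-2D order described above (the elbow sweep over candidate K values is omitted).

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def cluster_coins(features, k=4):
    """Cluster coins by market behaviour.

    features: (n_coins, n_features) array of market metrics on
    wildly different scales. Returns (labels, coords_2d).
    """
    scaled = StandardScaler().fit_transform(features)   # equal footing
    labels = KMeans(n_clusters=k, n_init=10,
                    random_state=0).fit_predict(scaled)
    coords = PCA(n_components=2).fit_transform(scaled)  # for plotting
    return labels, coords
```

Scaling first matters most here: without it, a feature like market cap (spanning orders of magnitude) would dominate every distance and K-Means would effectively cluster on that one column.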
A purely analog wireless system with zero microcontrollers. A salvaged phone speaker acts as a microphone — blowing air or making sound generates a signal that gets amplified by an op-amp, modulated onto a 433 MHz RF transmitter, and received wirelessly to trigger an LED on the other end. Raw electronics, no firmware.
Flutter app for tracking devices and discovering nearby amenities. Users search for places like schools, restaurants, or hospitals, view them on a map, and adjust the search radius with a distance slider.
Flutter app for tracking and saving visited locations. Users can store routes and revisit them. Integrates Gemini API to provide historical information about the current location.