Hack #8: Cursor
14 May, 08:58
There are hundreds of millions of visually impaired people around the world, and things many of us do without thinking are far from simple for them. Beacon is a voice-first, hands-free mobile app built to help visually impaired users understand and interact with the world around them.

The app runs on a smartphone mounted in a lightweight chest harness and pairs with a Bluetooth remote that lives in the user's pocket and starts the assistant instantly, turning the phone into wearable AI. There are no screens, menus, or touch interactions: users simply talk, and Beacon sees the world and talks back in real time, with natural speech and sub-second barge-in.

Beacon delivers real-time scene narration, hazard detection, object recognition, and reading of signs, menus, books, and labels, all spoken aloud through natural conversation. It also supports turn-by-turn walking navigation, live web search, location-aware weather, and smart-home control, entirely by voice.

Beacon was built with Cursor for rapid agentic development, ElevenLabs for the speech layer (Conversational AI Agents, Text-to-Speech, Custom Voice Design, Sound Effects), Gemini as the vision intelligence layer, Firecrawl and SerpAPI for web search, Google Maps and OSRM for navigation, Open-Meteo for weather, and Tuya for smart-home control.

"Beacon. A guiding light for everyday freedom."
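To make the stack concrete, here is a minimal sketch of what the core see-then-speak loop might look like, assuming the official `google-generativeai` and `elevenlabs` Python SDKs. The `capture_frame()` helper is hypothetical, standing in for the chest-mounted phone camera; the voice ID is just an example, and the real app runs on-device rather than as a desktop script.

```python
# Minimal sketch of an assumed vision-to-speech loop: a camera frame goes to
# Gemini for a description, which ElevenLabs turns into spoken audio.
import os

import google.generativeai as genai
from elevenlabs import play
from elevenlabs.client import ElevenLabs
from PIL import Image

genai.configure(api_key=os.environ["GEMINI_API_KEY"])
eleven = ElevenLabs(api_key=os.environ["ELEVENLABS_API_KEY"])
vision = genai.GenerativeModel("gemini-1.5-flash")


def capture_frame() -> Image.Image:
    # Hypothetical placeholder: on-device this would grab a frame from the
    # chest-mounted phone camera rather than read a file from disk.
    return Image.open("frame.jpg")


def narrate_scene() -> None:
    frame = capture_frame()
    # Ask Gemini for a short, safety-first description of the scene.
    response = vision.generate_content(
        [frame, "In two sentences, describe this scene for a blind "
                "pedestrian, mentioning obstacles and hazards first."]
    )
    # Convert the description to speech and play it back immediately.
    audio = eleven.text_to_speech.convert(
        voice_id="EXAVITQu4vr4xnSDxMaL",  # example voice ID; any voice works
        model_id="eleven_turbo_v2_5",     # low-latency model for real-time use
        text=response.text,
    )
    play(audio)


if __name__ == "__main__":
    narrate_scene()
```

In the real product this loop would sit behind the ElevenLabs Conversational AI agent, which handles barge-in and turn-taking; the sketch only shows the vision and TTS hops.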

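The location-aware weather feature is the simplest integration to illustrate, since Open-Meteo's forecast endpoint is free and keyless. A sketch of how a spoken report might be assembled, with the coordinates assumed to come from the phone's GPS and the resulting string handed to the TTS layer above:

```python
# Sketch of a location-aware weather report via Open-Meteo's public API.
import requests


def current_weather(lat: float, lon: float) -> str:
    resp = requests.get(
        "https://api.open-meteo.com/v1/forecast",
        params={
            "latitude": lat,
            "longitude": lon,
            "current_weather": True,  # returns temperature, wind speed, etc.
        },
        timeout=10,
    )
    resp.raise_for_status()
    now = resp.json()["current_weather"]
    return (f"It is currently {now['temperature']} degrees Celsius "
            f"with wind at {now['windspeed']} kilometres per hour.")


# Example: a report for central London, ready to be spoken aloud.
print(current_weather(51.5074, -0.1278))
```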