Hack #1: Firecrawl
26 Mar, 14:14
Beacon is a voice-first AI assistant built specifically for visually impaired users — people who navigate the world without ever touching a screen. You call a number, you talk, and Beacon handles the rest.

Built on ElevenLabs Conversational AI, Beacon listens, reasons, and responds in natural speech using a custom voice. It uses Firecrawl Search as its live intelligence layer, fetching real-time weather forecasts, nearby services, load shedding schedules, community safety alerts, and road closures from across the web, then distilling them into clean spoken summaries. No raw data, no screen, no friction.

From a single voice conversation, Beacon can check the weather, find a nearby pharmacy, verify there's no load shedding before you leave home, flag any safety alerts in your area, add a reminder to Google Calendar, and send a WhatsApp message — all hands-free, all by voice.

What makes Beacon different is who it's built for. Most AI assistants assume a sighted user at some point; Beacon assumes limited screen interaction. It's independence, delivered through a phone call.

(The number is not live — this is a concept build for the hackathon.)

#ElevenLabs Music & TTS used.
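To make the "fetch, then distill into speech" flow concrete, here is a minimal Python sketch of that pipeline. The Firecrawl v1 search endpoint URL, the request/response shape, and the helper names (`firecrawl_search`, `spoken_summary`) are assumptions for illustration, not Beacon's actual implementation:

```python
import json
import urllib.request

# Assumed Firecrawl Search REST endpoint; check the Firecrawl docs for the
# current URL and request schema before relying on this.
FIRECRAWL_SEARCH_URL = "https://api.firecrawl.dev/v1/search"

def firecrawl_search(query: str, api_key: str, limit: int = 3) -> list[dict]:
    """Run a live web search via Firecrawl (network call; shape assumed)."""
    payload = json.dumps({"query": query, "limit": limit}).encode()
    req = urllib.request.Request(
        FIRECRAWL_SEARCH_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        # Assumed response envelope: {"data": [{"title": ..., ...}, ...]}
        return json.load(resp).get("data", [])

def spoken_summary(results: list[dict], topic: str) -> str:
    """Distill raw search hits into one clean sentence suitable for TTS.

    The assistant never reads raw data aloud; it speaks a short summary.
    """
    if not results:
        return f"Sorry, I couldn't find anything about {topic}."
    titles = [r.get("title", "an untitled result") for r in results[:3]]
    return f"Here's what I found about {topic}: " + "; ".join(titles) + "."
```

A voice turn would then be roughly `spoken_summary(firecrawl_search("load shedding schedule Cape Town", api_key), "load shedding")`, with the resulting sentence handed to the ElevenLabs voice for playback.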
