1. Use Be My Eyes
Blind and low-vision people launch Be My Eyes to recruit remote human and AI helpers whenever inaccessible design or missing visual information blocks daily life.
2. ENABLE model location
Be My Eyes operates after other builders ship websites, appliances, documents, hotel rooms, or stores without sufficient accessibility. Blind and low-vision users then compensate: they open the app, route a task through a global pool of volunteers or AI, and restore access that upstream actors could have embedded earlier. The tool sits inside the assistive-technology platform economy, where companies, investors, and partners fund infrastructure while users and volunteers supply labor. (Be My Eyes)
3. What it is
Developers at a Danish startup created Be My Eyes as a smartphone-first service that connects blind and low-vision users with sighted volunteers and, more recently, with an AI-based image describer. (Be My Eyes)
Through the app, a blind user starts a session; the system routes a live camera feed or still image to either:
- a volunteer who joins through one-way video / two-way audio and narrates the scene, or
- Be My AI, a GPT-4-powered assistant that receives images and returns detailed descriptions plus follow-up dialog. (Be My Eyes)
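To make the AI path concrete, here is a minimal sketch of an image-description session with follow-up dialog, assuming an OpenAI-style chat-completions API. Be My Eyes has not published its integration, so the model name, prompts, and every function name below are illustrative stand-ins, not the production design.

```python
# Minimal sketch of an image-description loop with follow-up dialog.
# Assumes the OpenAI Python SDK and an OPENAI_API_KEY in the environment;
# Be My AI's real pipeline, model, and prompts are not public.
import base64

from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o"  # stand-in; Be My AI is described only as GPT-4-based


def encode_image(path: str) -> str:
    """Base64-encode a local image so it can travel inline in the request."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("ascii")


def describe(image_path: str, question: str) -> list[dict]:
    """Send one image plus a question; return the message history so the
    user can keep asking follow-up questions about the same image."""
    messages = [{
        "role": "user",
        "content": [
            {"type": "text", "text": question},
            {"type": "image_url",
             "image_url": {"url": f"data:image/jpeg;base64,{encode_image(image_path)}"}},
        ],
    }]
    reply = client.chat.completions.create(model=MODEL, messages=messages)
    messages.append({"role": "assistant",
                     "content": reply.choices[0].message.content})
    return messages


def follow_up(messages: list[dict], question: str) -> str:
    """Continue the dialog; the image stays in context from the first turn."""
    messages.append({"role": "user", "content": question})
    reply = client.chat.completions.create(model=MODEL, messages=messages)
    answer = reply.choices[0].message.content
    messages.append({"role": "assistant", "content": answer})
    return answer
```

A session might then run `history = describe("thermostat.jpg", "What do the buttons on this panel do?")` followed by `follow_up(history, "Which one raises the temperature?")`.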
Within the ENABLE temporal frame, Be My Eyes functions as a navigator-side compensation:
- Upstream builders in retail, banking, transit, hospitality, and software design and ship products that demand vision.
- Blind and low-vision people then carry the cost of inaccessibility by operating Be My Eyes, holding the phone, framing the scene, and articulating questions.
- Volunteers and AI services absorb part of the visual labor that inaccessible systems offload. (Wikipedia)
4. Why it matters
Inside the global digital-accessibility and platform economy, Be My Eyes reveals a layered redistribution of labor.
- Companies that neglect accessible content, signage, or interfaces still collect revenue.
- Blind and low-vision users recruit volunteers or AI through the app, spend time steering cameras, and often repeat tasks across many inaccessible contexts.
- Volunteers donate cognitive and emotional labor; AI infrastructure consumes investment and technical work from Be My Eyes, OpenAI, and corporate partners. (Wikipedia)
The ENABLE frame highlights this as downstream work: instead of a bank, hotel, or grocer embedding accessible flows, people with limited vision must stitch access together from phone, app, volunteer, and AI every time a label, interface, or environment fails them. Partnerships with firms like Hilton, Meta, and Tesco extend the platform into hotels, smart glasses, and supermarkets, so compensation flows into more physical settings while upstream obligations around accessible design remain unevenly met. (Wikipedia)
5. Real-world example
A blind traveler checks into a Hilton property in North America. Hilton staff previously trained Be My AI on common room layouts and controls. The traveler opens Be My Eyes from the hotel room, taps the Hilton option, and streams the camera feed to a staff member in a call center, who narrates thermostat controls, curtain switches, and safe placement so the guest can settle in without flagging down on-site staff. (Stories From Hilton)
Another traveler shops in a Tesco supermarket that now participates in a six-month Be My Eyes pilot. The shopper starts a call through the Tesco tile, connects to a store employee, and requests help locating a specific brand and verifying expiration dates. The employee scans shelves with the phone camera and guides the shopper through aisles that previously forced them to rely on friends or strangers. (Wikipedia)
Another user launches Be My Eyes from their phone, connects with a volunteer, and receives guidance while locating a missing object at home and later navigating a public space. (Reddit)
Another user relies on Be My AI to read a restaurant menu and then escalates to a human helper when AI descriptions leave ambiguity, an AI-first, human-fallback flow sketched below. (Be My Eyes)
These scenes unfold inside a broader assistive-tech platform system, where corporate partners integrate the app into their own processes and where blind users navigate both accessible and inaccessible infrastructure on the same day.
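The menu scene above follows that AI-first, human-fallback pattern. Here is a hypothetical sketch of it, reusing `describe` from the earlier example; `request_volunteer` is an illustrative stand-in, since in the real app escalation simply means the user places a volunteer call rather than invoking any public API.

```python
def request_volunteer(image_path: str) -> str:
    """Stand-in for escalating to a sighted volunteer over live video;
    Be My Eyes exposes this as an in-app action, not a public API."""
    raise NotImplementedError("connect a one-way video / two-way audio call")


def read_menu(image_path: str) -> str:
    """Try the AI description first; escalate to a human if doubt remains."""
    history = describe(image_path, "Read this menu section by section, with prices.")
    description = history[-1]["content"]
    # input() is a crude stand-in for the user's own judgment call.
    if input("Did that answer your question? [y/n] ").strip().lower().startswith("y"):
        return description
    return request_volunteer(image_path)  # ambiguity left: hand off to a human
```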
6. What care sounds like (hypothetical)
"I design every interaction so you navigate without asking anyone for help."
"I label every element clearly so you read or scan what you need at your own pace."
"I check each flow with blind users until every barrier falls away before launch."
7. What neglect sounds like (hypothetical)
"Our team ships the visual update first and leaves blind users to figure out whatever breaks."
"We push new features every quarter and postpone accessibility fixes whenever the backlog grows."
"We skip testing with screen-reader users again and treat their complaints as edge-case noise."
8. What compensation sounds like (hypothetical)
"I angle my phone up and down for three full minutes before the volunteer catches the tiny logo I need."
"I reopen the app twice every morning because the first connection drops while I stand in the cold with my groceries."
"I juggle my cane, my bags, and the phone at once; every small task turns into a mini-rehearsal before I even reach a human voice."
These quotes locate the labor with blind and low-vision users, who must coordinate devices, environments, and strangers to reclaim basic access.
All observations sit within the digital-accessibility and assistive-technology platform system.