Panel demonstration shows limits of AI translation and automated video for client‑facing materials
Summary
In a live example, a panelist demonstrated that an AI tool produced unintelligible Somali audio, incorrect script choices and mixed subtitles; panelists urged community review and native‑speaker QA before deploying automated multilingual content.
Morgan Morissette, Home Safe coordinating attorney at Legal Aid of Southeastern Central Ohio, presented an example of an AI platform used to create short multilingual videos and highlighted several failures that make unvetted outputs risky for client‑facing use.
Morissette played a 15-second excerpt of an eight-minute AI video and said the tool used the wrong script and produced incomprehensible spoken Somali: the video substituted different alphabets, mixed Chinese and Hindi script into the subtitles, and used synthesized speech that native Somali speakers would not understand. She said only a Somali-speaking paralegal on her team identified the issues. "The spoken portion is incomprehensible… the words are all technically words, but it is spoken in a way that no one who actually speaks the language would understand it," Morissette said.
Panelists recommended relying on community speakers and bilingual staff for quality assurance, and treating AI-generated multilingual content as a draft requiring human review. Morissette said she does use AI tools for drafting emails and volunteer materials but warned that client-facing translations and audio must be validated by native speakers to avoid serious communication failures.
The panel did not endorse any specific AI vendor; instead, it advised legal‑aid programs to pilot outputs with community reviewers and to keep human oversight in any automated multilingual workflow.
Closing note: attendees were urged to involve native speakers in QA and to prioritize human review before public deployment.

