INTERVIEW
Wall Street has been parsing text for years to spot changes. Bloomberg publishes side-by-side comparisons of Fed announcements to show whether the central bank's language is becoming more hawkish or dovish.
AI can now listen in on how executives deliver company updates.
Alex Kim, a Ph.D. candidate in accounting at the University of Chicago, has pioneered ways to quantify subtle vocal cues using AI. Communications research has long suggested that up to 70% of the information in oral communication is conveyed through voice rather than the words themselves.
Kim's work on AI and LLMs has caught the attention of dozens of hedge funds worldwide, including some of Wall Street's biggest names.
The paper, "Vocal Delivery Quality in Earnings Conference Calls," has been conditionally accepted for publication in the Journal of Accounting and Economics. Companies like Markets EQ and Speech Craft Analytics are marketing this new way of understanding tone.
Just a few years ago, real-time transcription and vocal…