
Powell Calls AI One of the Most Important Tech Shifts in Decades

Hey, it's Matt. Here’s what’s up in AI + Wall Street.

THE FED

Fed Chair AI Evangelist

Definitely Not Real! Made with ChatGPT and Canva

“[AI is] like a better version of a person.”

This sounds like a quote from a VC bro.

But it was actually uttered by the world’s most important central banker, Fed Chair Jerome Powell, during an interview yesterday at the Economic Club of Chicago.

“Like everyone who’s been exposed to what it’s capable of, it’s just — it’s beyond… I started thinking about it as like a better version of Google, but it’s not. It’s like a better version of a person.”

He wasn’t joking.

He went further, saying AI could be “one of the two or three things most likely to bring dramatic change to the economy all around the world in the next 20 years.”

That’s pretty striking for someone who chooses his words very carefully.

It signals that we may be moving past the “AI hallucinates so how can it be useful?” stage of public discourse now that central bankers are evangelists.

In the short term, Powell says he doesn’t know if AI will replace people or just make them more productive.

But Powell noted that, over the last 250 years, technological innovation has tended to boost productivity and living standards in the long run.

Check out the exchange below, which starts at 41:35.

USE CASES

Quant Vet Chan Talks AI Potential and His Latest Book

Ernie Chan, a quantitative finance expert and author of multiple books on the subject, believes generative AI holds a lot of potential for financial markets but says today’s tools won’t replace humans quite yet.

Chan, who began his AI career at Morgan Stanley’s lab in 1997 and now runs PredictNow, has long worked at the intersection of AI and finance.

At a recent conference, Chan said many of the “quant luminaries” in attendance were "up to their chin in generative AI." While they didn’t reveal exactly what they were working on, he noted that several claimed to be finding significant value from it.

His latest book, co-authored with GenAI expert Hamlet Batista, looks at how generative AI and other ML techniques can enhance trading by improving signal generation, risk management, and portfolio optimization across both text and structured financial data.


"A lot of common perception is that generative AI is all about large language models, particularly for finance," Chan said. "But it is much broader than that. You should be able to train it on any data, such as market time series."

Chan explained that the mathematical framework powering generative AI can be applied to numerical time series just as effectively as to text. He referenced published research where models were pre-trained on stock returns data from thousands of companies across global markets, similar to how language models are pre-trained on vast text corpora.

"The idea of generative AI is the same premise as language," Chan said. "Essentially you can pre-train a model on a massive amount of data, whether that's relevant or not to your particular problem." This approach allows quants to overcome the traditional limitations of financial data scarcity by leveraging patterns learned across broader datasets.
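The "time series as language" idea Chan describes can be sketched in miniature (this is purely illustrative, not Chan's or any paper's actual method): discretize daily returns into a small "vocabulary" of buckets, then fit a next-token model on series pooled across many stocks, the same next-token objective a language model optimizes on text.

```python
import numpy as np

def to_tokens(returns, bin_edges):
    """Discretize a return series into integer tokens (a tiny 'vocabulary')."""
    return np.digitize(returns, bin_edges)

def fit_bigram(token_series_list, vocab_size):
    """Count next-token transitions pooled across many stocks' series --
    a miniature version of pre-training on a broad dataset."""
    counts = np.ones((vocab_size, vocab_size))  # add-one smoothing
    for toks in token_series_list:
        for a, b in zip(toks[:-1], toks[1:]):
            counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)  # row = P(next bucket | today's)

rng = np.random.default_rng(0)
bins = np.array([-0.01, 0.0, 0.01])  # 4 buckets: big down / down / up / big up
universe = [rng.normal(0, 0.01, 500) for _ in range(100)]  # fake returns, 100 "stocks"
model = fit_bigram([to_tokens(r, bins) for r in universe], vocab_size=len(bins) + 1)
# Each row of `model` is a distribution over tomorrow's return bucket given today's.
```

A real transformer replaces the bigram counts with attention over long token histories, but the pooling across thousands of series is the same trick that lets language models sidestep data scarcity.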

Chan mentioned a paper titled "StockGPT" which applied transformer architecture to numeric stock return data rather than text. According to the authors, the model achieved annual returns of 119% with a 6.5 Sharpe ratio in out-of-sample testing.

Chan explained that generative AI offers two key advantages for quantitative finance: the ability to pre-train models on massive datasets before fine-tuning for specific problems, and better capability to recognize when market conditions fall outside a model's training parameters.

"In applying AI to finance, we have always been limited by data scarcity," Chan said. "Generative AI opened this new arena for us."

However, Chan expressed skepticism about current generative AI tools like ChatGPT for developing investment strategies, describing his experiments as "quite a disaster."

"They don't have a deep understanding. They just read a lot of articles and stick it in," Chan said. "It is purely parroting what they read."

Chan compares their best use case to Microsoft’s “co-pilot” concept: useful for offering suggestions and catching mistakes, but not flying solo. At PredictNow, his firm focuses on using AI for macro-level risk management, drawing on features from across asset classes, not just equities.

While he’s unimpressed with today’s chatbots, that could change quickly.

“Five years is a long time in AI,” he said. “We may see sudden, qualitative leaps in ability that surprise all of us.”

DATA

AI Keeps Getting Cheaper


If you follow AI at all, you know that it’s very expensive to build these large language models because you need an enormous amount of computing power, and Nvidia is pretty much the only game in town.

But what’s less known is that using AI keeps getting cheaper, dramatically so. I wrote two months ago about how OpenAI’s pricing has dropped more than 90% in two years.

Earlier this week, OpenAI released a model lineup that is less expensive, while outperforming older models. For example, GPT-4.1 mini matches or exceeds GPT‑4o in intelligence evals while reducing latency by nearly half and reducing cost by 83%.

Lower costs will lead to more adoption. We’re seeing this play out.

Rajarshi Gupta, head of AI at cryptocurrency exchange Coinbase, said the company decided to "invest heavily" in generative AI as a platform around two years ago, soon after OpenAI released GPT-3.5.

The improved quality of "cheap and small models" from Gemini has been "phenomenal," Gupta said during a Google Cloud roundtable.

"As you start building these more complex use cases, you realize that sending everything to the top-end LLM is very expensive and has a lot of latency."

Coinbase routes simpler queries to the cheaper, smaller models, reserving the top-end LLMs for the complex queries that actually need them.
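The routing pattern Gupta describes can be sketched in a few lines. The model names and the complexity heuristic below are placeholders, not Coinbase's actual setup; production routers often use a small classifier instead.

```python
# Hypothetical cost-aware router: cheap model for easy queries, big model otherwise.
CHEAP_MODEL = "small-model"    # placeholder name
BIG_MODEL = "top-end-model"    # placeholder name

def route(query: str) -> str:
    """Crude complexity heuristic: long or multi-step prompts go to the big model."""
    looks_complex = len(query.split()) > 50 or "step by step" in query.lower()
    return BIG_MODEL if looks_complex else CHEAP_MODEL

print(route("What's my balance?"))  # -> small-model
```

The payoff is exactly the trade Gupta names: most traffic avoids the latency and cost of the top-end model, which only sees the queries that justify it.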

OpenAI remains one of the pricier LLM providers, and these recent pricing changes suggest it’s feeling competitive pressure as cheaper models continue to improve.

Bottom line: Expensive AI today will be cheap tomorrow.

RESEARCH

Watch Out for Bias in AI + Investing Research


I skim a lot of finance and AI papers and the first thing I do is plug them into Claude and say

  • “Please summarize this” (in case AI becomes our robot overlords, I can point out that I was polite.)

  • and then “How did the authors address look-ahead bias, i.e., making sure the LLM wasn’t trained on the data being tested?”

I wrote about this topic a couple months back after reading this academic paper on the subject. It makes a compelling case that LLMs’ standout results in finance may be overstated, driven largely by look-ahead bias—models often see the answers before the test.

Other studies have shown that LLMs can memorize specific information even if it appears just once during training.

Even the experts don’t know exactly how LLMs do what they do. So just a word of caution…
