I enjoy speaking at financial firms on human rationality. Unless otherwise requested, I offer an introduction to the field of cognitive biases – known bugs in human reasoning – with a special emphasis on how these biases make us stumble in life in general, and trading in particular. Speaking fee currently midrange for the financial sector (not cheap, not expensive).


“Eliezer Yudkowsky’s talk was the sharpest introduction I’ve seen to heuristics and biases. Yudkowsky has thought deeply about maximizing the real-world positive effect, and he emphasizes key pitfalls I’ve never seen discussed in other presentations. The talk was among the best we’ve ever had at providing practical value – an exceptional investment. If you don’t come away seriously questioning your decision processes, you weren’t paying attention.”

— Sandor Lehoczky, Jane Street Capital

My cognitive biases paper is a good example of the kind of material I present. That particular paper was a chapter for a book on Global Catastrophic Risks, so it’s tuned to an audience interested in black swans and high stakes. The talk I give at trading firms is more targeted to the problems of thinking about finance. Still, the book chapter serves as a concentrated sample of the experimental results I draw on and my style of exposition.

My usual talk is not mathy, but please let me know in advance if I will be speaking to quants.

On the topics of transhumanism / the Singularity / AI risks / self-improving AI, I usually speak to spread the word. Inquiries from popular venues should describe the size and makeup of the audience. Academic conferences should be aware that I write very slowly in formal style, so I’m unlikely to contribute a paper – a presentation (with slides) is more doable. Private venues will be charged my standard speaking fee.

For speaking inquiries, contact yudkowsky@gmail.com. Please include “Speaking” in the subject line.

Speaker bio:

Eliezer Yudkowsky is an Artificial Intelligence theorist who also writes on the topic of human rationality. He cofounded the Machine Intelligence Research Institute, a nonprofit devoted to research and advocacy on the topics of Artificial General Intelligence, self-improving AI, and superintelligence. Yudkowsky was one of the founding directors of the World Transhumanist Association. He previously blogged on human rationality at the econblog Overcoming Bias and now writes at the community blog Less Wrong; together the two blogs have received over 7 million pageviews. Yudkowsky has appeared on the BBC, the History Channel, and the Discovery Channel, and has presented at popular, academic, and government conferences and workshops.