STAT+: Why we don’t need AI to be board-certified

You’re reading the web edition of STAT’s AI Prognosis newsletter, our subscriber-exclusive guide to artificial intelligence in health care and medicine. Sign up to get it delivered in your inbox every Wednesday.
Do androids dream of test anxiety?
I’ve never met a standardized test I wasn’t preeeeetty good at (except for the chemistry GRE). However, as anyone who’s ever had test anxiety can attest, these multiple-choice exams aren’t a great measure of how smart you are or how good of a lawyer or doctor you might be. These tests are mostly really good at determining if you’re good at taking multiple-choice tests.
At least back in early 2023, the way people talked about ChatGPT passing medical boards or bar exams assumed that ChatGPT learned the way humans do: building up a body of knowledge you can only regurgitate once you've truly digested the material. The discourse also assumed that, having learned that information, the AI models would be able to apply it in the future as an AI lawyer or AI doctor. But large language models do not learn and retain a body of knowledge. They have simply memorized which words, averaged across the entire internet, tend to appear alongside other words, and they have learned which combinations of those words will impress a human reading the output.