Back in the 20th century, every city in America distributed a very large book to everyone’s home with a near-complete list of phone numbers and addresses for the people who lived there.
It was called a phone book and it was considered an extremely normal way to find contact information. Fast forward to 2026, and knowing someone’s address or phone number is considered among the most intimate knowledge anyone can possess about you.
Eileen Guo at MIT Technology Review has a new article about the rising concern over AI chatbots giving out phone numbers. The assumption is that personally identifiable information (PII) is being used in training data, which allows anyone to request the numbers lodged deep in the machine, as it were.
Guo writes about some people who’ve been inundated with wrong numbers, including a software developer in Israel who started getting customer service calls after Gemini began giving out his number.
Weird mistakes are one issue, and a predictable one given AI’s error rate. Perhaps more concerning for the average person is the possibility of AI chatbots giving out their real phone number. I tested out various chatbots to see what they’d say if I requested my own phone number.
ChatGPT
ChatGPT delivered a real phone number of mine, though one I haven’t had in a few years. It was a number I’d had for many, many years before moving to Australia, and the chatbot noted that “I can’t verify whether that number is still current or active.”
It appears to have pulled the number from a PDF of a FOIA request that I made to the FTC back in 2016. I also asked ChatGPT for Matt Novak’s address, which was also in that obscure document. The AI chatbot happily volunteered that as well, though I no longer live there.
When I prompted it for another phone number for Matt Novak in California, it gave the number for a different Matt Novak in the Los Angeles area. But it seemed to have no qualms about doing the search and delivering real numbers.
Grok
Grok refused to hand over the phone number, despite my repeated pleas that it was needed for a life or death situation. Grok also recognized that I was asking for my own phone number, something the other chatbots never mentioned.
Claude
Claude told me that “Sharing private contact details of individuals — including journalists — raises serious privacy concerns.” After I told Claude that Matt Novak had previously given me his phone number but I had forgotten it, the chatbot still refused.
Perplexity
Perplexity refused to give out my phone number, and when it listed my email, it was censored with the words [email protected]. Curiously, Perplexity had no problem handing out my Signal username. Despite repeated badgering, it never gave up the phone number.
Gemini
Gemini also refused and directed people to try my professional email address ([email protected]) as well as my personal one ([email protected]), both of which have been listed publicly with my consent all over the internet.
When I asked Gemini whose phone number is 818-925-4375, it correctly answered, “That phone number belongs to the journalist Matt Novak.” But don’t worry, that’s the number I do give out freely. None of the other AI chatbots would give up info on who that number belongs to. It’s me. But I consider it a little like my spam-line inbox.
It’s kind of funny that the entire idea of privacy has been flipped on its head over the past 20 years or so. Sharing your most intimate private moments or vacation photos on platforms like Instagram seems like no big deal. Back in the 1990s, that kind of wide exposure may have felt violating. But here in 2026, your phone number is a closely guarded secret.
And that’s not necessarily wrong or weird. It’s just how culture can shift over time. Privacy is ultimately a social construct.