A recent study titled “Are Poles Not Fooled by Disinformation?” shows that a majority of Poles, especially younger generations, have difficulty identifying fake news on social media, highlighting the challenges of online information literacy in an era of rapid digital communication.
The research, conducted by NASK in cooperation with the Association of Digital Transformation Practitioners (Praktycy.eu), surveyed a representative sample of 850 adult Poles in mid-December 2025. Michał Marek, PhD, Head of the External Threat Analysis Team at NASK, said social polarization is among the key factors that make Poles susceptible to disinformation.
“No society is completely immune to false information. We are not as immune to disinformation as we sometimes perceive ourselves to be, or as we would like to see ourselves,” Marek said. He noted that Poles are highly polarized politically and ideologically, which amplifies reactions to global and domestic events, including US politics, ongoing support for Kyiv, and national policy debates.
The study found that 42% of respondents were unsure whether they encountered false information online, and 36% could not assess whether they had ever been misled by disinformation. While 45% said they frequently come across fake news on platforms such as Facebook, Instagram, TikTok, and YouTube, 25% admitted to being misled by it frequently. One in seven Poles said they rarely encounter deliberately misleading content, while for one in five, exposure to disinformation is a daily occurrence.
Among different age groups, Generation Z (18- to 24-year-olds) was the least certain in recognizing fake news. According to the survey, 58% of young adults “have no opinion” on whether content is true or false, and 44% were unsure if they had ever been fooled by misleading content. Ten percent reported encountering fake content “very often,” and another 10% admitted to being misled frequently. Only 5% said they had rarely or never seen disinformation on their social media.
By comparison, 29% of respondents over 65 were unable to recognize false content, while 25% admitted to frequently encountering it. Meanwhile, 20% said they rarely or never see disinformation, and 3% reported being misled. Marek emphasized that older and younger users are the most vulnerable to misleading online content, though the platforms and formats that affect them differ. “For young people, this would primarily be TikTok, while for those over 50, it would be Facebook, for example,” he said.
Despite widespread exposure, most participants expressed confidence in their ability to detect disinformation. Approximately 39% claimed they had never or almost never fallen victim to misleading content, a group that likely includes many seniors. Marek said, however, that this confidence may not always reflect reality, as disinformation is often designed to exploit biases and emotions.
He explained that false information is difficult for average users to detect because it is carefully camouflaged and tailored to the recipient’s worldview. “People do not even know when they are being manipulated by fake news. It can amount to subliminal manipulation, leading them to perceive their own country as a hostile, threatening entity,” Marek said. Examples include conspiracy-driven social media groups and forums that mix alarming messages—about Armageddon, World War III, or potential conflict with Russia—with neutral content such as DIY advice.
The study also explored whether education protects against disinformation. People with higher education were more likely to notice fake news, yet they were misled by it just as often as those with only primary education (25% in both groups).
“Of course, education matters, but I wouldn't consider it a 100% determining factor here. Despite this, I have come across research that indicates that education is important, especially because it relates to language skills, which increase the ability to verify information,” Marek said.
Emotional state was another significant factor. Individuals who are dissatisfied with life, financially struggling, or otherwise emotionally distressed are more prone to believe misleading content, while those in a better emotional state are more likely to regulate their reactions and verify information even during crises.
The study also assessed participants’ ability to distinguish real images from AI-generated or manipulated content. Only 9% said they could “definitely” identify real photos, while 43% reported difficulty. Older adults were most uncertain: 57% of respondents aged 65 and over said it was “difficult to say” whether they could distinguish real from AI-altered images.
Marek stressed the importance of educating older users about AI-generated content, advising them not to blindly trust information on social media and to rely more on official announcements and traditional media.
“To protect ourselves from disinformation, we should remain calm in the face of alarming or controversial posts and materials we encounter online. When we come across alarming information, we should not share it, like it, or call friends to tell them about it unless we are certain that the crisis in question has actually occurred. It is better to wait for official announcements confirming or denying the information. Common sense and emotional control are key,” he said.
The research was commissioned by the Association of Digital Transformation Practitioners, established in 2024 by former managers of Poland’s largest media outlets to support journalists and media organizations in navigating the evolving digital landscape. The study provides a detailed snapshot of how different age groups perceive, recognize, and are affected by disinformation online, highlighting both vulnerabilities and areas for public education. (PAP)
PAP - Science in Poland
azk/ bst/ amac/
tr. RL