What transpires when those tasked with safeguarding order, lacking an understanding of science, begin to regard every new and innovative study with unwarranted fear? This is no mere hypothetical question; it is the reality faced by Dr. Oleg Maltsev, a distinguished scientist whose work has been criminalized not because of demonstrable harm, but because its purpose and implications have eluded those charged with judging it.
It is one matter when ignorance resides in certain quarters of society, among those who, whether through want of opportunity, reluctance to engage with knowledge, or a peculiar propensity to embrace falsehood over truth, remain estranged from the pursuit of understanding. Yet it becomes a far more perilous affair when such intellectual deficiencies rise to the levels where decisions of weight and consequence are made.
Maltsev’s plight exemplifies what sociologist Stanley Cohen famously termed moral panic: a collective reaction in which certain individuals or practices are framed as existential threats to social order. But is this panic justified? Or does it reflect a broader failure to engage with complex ideas?
In this article, we explore how misunderstanding, moral panic, and the dynamics of suggestion can recast scientific innovation as criminal activity. Most importantly, we examine what can be done to prevent future cases in which fear trumps reason and society punishes what it cannot comprehend.
How Science Becomes Criminalized
In the realm of science, “criminalization” refers to the process by which certain research or practices are deemed illegal and made subject to prosecution. Ideally, this would occur only when scientific activities genuinely threaten public safety, ethical standards, or national security. In practice, however, criminalization is not always grounded in genuine risk or wrongdoing: as the Maltsev case shows, it often stems from misunderstanding, amplified fears, and institutional missteps.
At the heart of this process are three key mechanisms: knowledge gaps, institutional shortcomings, and media distortion.
The Knowledge Gap: Science operates in a world of intricate methodologies and specialized terminology—far removed from the language and experiences of the general public. This knowledge gap creates fertile ground for misinterpretation. Much like overhearing a foreign language and imagining meanings that aren’t there, people often “hear” science without comprehending its true context. As a result, complex research can be reframed as dangerous or unethical simply because it’s not easily understood.
Institutional Shortcomings: Law enforcement agencies and policymakers frequently find themselves tasked with interpreting whether scientific activities fall within legal or ethical boundaries. Strangely enough, while they seem to have no trouble rounding up researchers like Maltsev and building cases against them, they rarely demonstrate the same expertise in understanding the research itself. This curious imbalance, exceptional skill at making arrests paired with a hopeless lack of comprehension, underscores the need for specialized training to prevent such fiascos.
Rather than consulting scientific experts to gain clarity, authorities often fall back on assumptions or sensationalized narratives. This approach creates a chilling effect: scientists, wary of being misunderstood or vilified, may steer clear of innovative or controversial fields altogether. Meanwhile, the legal system continues to operate as if ignorance of science were a valid substitute for evidence, treating unfamiliarity as proof of wrongdoing.
Media Distortion: Jean Baudrillard’s theory of hyperreality aptly explains how media coverage can distort public perceptions of science. In this age of 24/7 news, potential risks—no matter how remote—are amplified into apocalyptic scenarios, overshadowing the nuanced realities of research. Headlines emphasizing fear over facts create societal panic, pressuring law enforcement to act against “threats” that are more imagined than real.
The Psychology of Fear and Suggestion
Analyzing the Maltsev case through Boris Sidis’s research provides valuable insights into how society could be so easily persuaded to see a scientist as a threat. Sidis, a pioneer in the study of suggestion and the subconscious mind, highlighted the powerful and often subtle ways suggestibility shapes collective behavior. His work on mass psychology sheds light on the mechanisms that likely drove the public’s quick acceptance of the narrative against Maltsev.
Sidis believed that people have different levels of suggestibility, which can become significantly heightened under conditions of stress, fear, or uncertainty. In the Maltsev case, the societal environment likely amplified this susceptibility. Factors such as Russia’s military aggression, the personal tragedies affecting nearly every Ukrainian family, and the witch hunts initiated by certain government authorities created a climate that was ripe for the kind of psychological manipulation Sidis described. During times of collective anxiety, individuals are more likely to accept external suggestions without critical thought, especially when these suggestions align with their deepest fears.
Sidis also warned that those in positions of power often exploit suggestion to shape public opinion, presenting their claims as indisputable truths. In the Maltsev case, the portrayal of Maltsev as a threat likely stemmed from influential figures or institutions whose statements carried undue authority. This manipulation of trust bypasses critical analysis, making it easier for people to adopt and spread the narrative as fact.
Bridging the Gap: A Way Forward
If we are to avoid repeating the injustices of the Maltsev case, society must adopt a more rational and informed approach to scientific research. This requires coordinated efforts to address the root causes of misunderstanding and fear.
Training Law Enforcement and Policymakers: Law enforcement officials need more than the ability to arrest scientists; they require training to understand the principles and processes of scientific work. Programs should focus on the fundamentals of scientific methodology, the distinction between theoretical risks and real-world threats, and historical examples of science being criminalized. Collaborative workshops between law enforcement and scientific institutions could foster mutual understanding and reduce missteps.
Effective Public Science Communication: The scientific community must take responsibility for making complex ideas accessible. Scientists can employ storytelling, analogies, and visual tools to explain their work in relatable terms. Public forums, interactive Q&A sessions, and media partnerships can build trust and dispel myths, ensuring that research is seen for what it is: an attempt to expand knowledge, not endanger society.
Collaboration Across Disciplines: Bringing together scientists, media professionals, and legal authorities is essential for creating balanced narratives. Cross-disciplinary workshops can help journalists and law enforcement understand scientific concepts, while dedicated science liaisons within legal and media organizations can bridge communication gaps. Such collaboration ensures that research is reported and evaluated accurately, reducing the risk of moral panic.
From Fear to Understanding
Banning or prosecuting scientific research risks dragging society back into an intellectual dark age, reminiscent of times when ignorance and fear dictated the limits of knowledge. Consider the era of witch hunts, when superstition overpowered reason, and questioning dominant narratives often led to persecution.
The criminalization of Academician Maltsev’s research reveals more about society’s shortcomings than it does about science. It is a story of how fear, misunderstanding, and institutional failure can conspire to stifle progress and punish innovation. Furthermore, such actions set a perilous precedent. If one scientist’s work can be reframed as criminal based on misunderstanding or fear, what is to stop other groundbreaking research from facing the same fate?
Yet, it also offers an opportunity for reflection and change.
Wouldn’t it be refreshing if the same institutions that excel at identifying “threats” also excelled at understanding them? By fostering education, dialogue, and collaboration, we can build a society where scientific inquiry is not feared but celebrated as a cornerstone of human progress. Only through understanding can we dismantle the mechanisms of fear that have historically hindered the pursuit of knowledge.