
Why Is Our Education System Failing to Educate?


From the Brownstone Institute

By Julie Ponesse

I suspect many of you know my story. But, for those who don’t, the short version is that I taught philosophy — ethics and ancient philosophy, in particular — at Western University in Canada until September 2021 when I was very publicly terminated “with cause” for refusing to comply with Western’s COVID-19 policy.

What I did — question, critically evaluate and, ultimately, challenge what we now call “the narrative” — is risky behaviour. It got me fired, labeled an “academic pariah,” chastised by mainstream media, and vilified by my peers. But this ostracization and vilification, it turns out, was just a symptom of a shift towards a culture of silence, nihilism, and mental atrophy that had been brewing for a long time.

You know that parental rhetorical question, “So if everyone jumped off a cliff, would you do it too?” It turns out that most would jump, at a rate of about 90 percent, and that most of that 90 percent wouldn’t ask any questions about the height of the cliff, alternative options, accommodations for the injured, etc. What was supposed to be a cautionary rhetorical joke has become the modus operandi of the Western world.

Admittedly, I am a bit of an odd choice as the keynote speaker for an education conference. I have no specialized training in the philosophy of education or in pedagogy. In graduate school, you receive little formal instruction about how to teach. You learn by experience, research, trial by fire, and by error. And, of course, I was terminated from my position as a university teacher. But I do think a lot about education. I look at how many people are willing to outsource their thinking and I wonder, what went wrong? Confronted with the products of our public school system every day for 20 years, I wonder what went wrong? And, finally, as the mother of a 2-year-old, I think a lot about what happens in the early years to encourage a better outcome than we are seeing today.

My aim today is to talk a bit about what I saw in university students during my teaching career, why I think the education system failed them, and the only two basic skills any student at any age really needs.

Let’s start by doing something I used to do regularly in class, something some students loved and others hated. Let’s brainstorm some answers to this question: What does it mean to “be educated?”

[Answers from the audience included: “to acquire knowledge,” “to learn the truth,” “to develop a set of required skills,” “to get a degree.”]

Many answers were admirable, but I noticed that most described education passively: “to be educated,” “to get a degree,” “to be informed” are all passive constructions.

When it comes to writing, we are often told to use the active voice. It is clearer, more emphatic, and creates greater emotional impact. And yet the predominant way we describe education is passive. But is education really a passive experience? Is it something that just happens to us like getting rained on or being scratched by a cat? And do you need to be acted on by someone else in order to become educated? Or is education a more active, personal, emphatic and impactful experience? Might “I am educating,” “I am learning” be more accurate descriptions?

My experience in the classroom was certainly consistent with thinking of education as a passive experience. Over the years, I saw an increasing trend towards timidity, conformity, and apathy, all signs of educational passivity. But this was a stark departure from the university culture that met me as an undergraduate in the mid-90s.

As an undergraduate, my classes were robust theaters of The Paper Chase-style effervescent debate. But there was a palpable shift sometime in the late 90s. A hush fell over the classroom. Topics once relied on to ignite discussion — abortion, slavery, capital punishment — no longer held the same appeal. Fewer and fewer hands went up. Students trembled at the thought of being called on and, when they did speak, they parroted a set of ‘safe’ ideas and frequently used “of course” to refer to ideas that would allow them to safely navigate the Scylla and Charybdis of topics considered to be off-limits by the woke zealots.

The stakes are even higher now. Students who question or refuse to comply are rejected or de-enrolled. Recently, an Ontario university student was suspended for asking for a definition of “colonialism.” Merely asking for clarification in the 21st century is academic heresy. Professors like myself are punished or terminated for speaking out, and our universities are becoming increasingly closed systems in which autonomous thought is a threat to the neoliberal groupthink model of ‘education.’

I spent some time thinking in concrete terms about the traits I saw in the novel 21st-century student. With some exceptions, most students suffer from the following symptoms of our educational failure. They are (for the most part):

  1. “Information-focused,” not “wisdom-interested”: they are computational, able to input and output information (more or less), but lack the critical ability to understand why they are doing so or to manipulate the data in unique ways.
  2. Science and technology worshipping: they treat STEM (science, technology, engineering and mathematics) as a god, as an end in itself rather than an instrument to achieve some end.
  3. Intolerant of uncertainty, complications, gray areas, and open questions, and generally unable to formulate questions themselves.
  4. Apathetic, unhappy, even miserable (and I’m not sure they ever felt otherwise, so they may not recognize these states for what they are).
  5. Increasingly unable to engage in counterfactual thinking. (I will return to this idea in a moment.)
  6. Instrumentalist: everything they do is for the sake of something else.

To elaborate on this last point, when I used to ask my students why they were at university, the following sort of conversation would usually ensue:

Why did you come to university?

To get a degree. 

Why? 

So I can get into law school (nursing or some other impressive post-graduate program). 

Why? 

So I can get a good job. 

Why? 

The well of reflex answers typically started to dry up at that point. Some were honest that the lure of a “good job” was to attain money or a certain social status; others seemed genuinely perplexed by the question or would simply say: “My parents tell me I should,” “My friends are all doing it,” or “Society expects it.”

Being an instrumentalist about education means that you see it as valuable only as a way to obtain some further, non-educational good. Again, the passivity is palpable. In this view, education is something that gets poured into you. Once you get enough poured in, it’s time to graduate and unlock the door to the next life prize. But this makes education, for its own sake, meaningless and substitutable. Why not just buy the subject-specific microchip when it becomes available and avoid all the unpleasant studying, questioning, self-reflection, and skill-building?

Time has shown us where this instrumentalism has gotten us: we live in an era of pseudo-intellectuals, pseudo-students, and pseudo-education, each of us increasingly less clear about why we need education (of the sort offered by our institutions), or how it’s helping to create a better world.

Why the change? How did intellectual curiosity and critical thinking get trained out of our universities? It’s complex but there are three factors that surely contributed:

  1. Universities became businesses. They became corporate entities with boards of governors, customers, and ad campaigns. In early 2021, Huron College (where I worked) appointed its first board of governors with members from Rogers, Sobeys, and EllisDon, a move author Christopher Newfield calls the “great mistake.” Regulatory capture (of the sort that led the University of Toronto to partner with Moderna) is just one consequence of this collusion.
  2. Education became a commodity. Education is treated as a purchasable, exchangeable good, which fits well with the idea that education is something that can be downloaded to anyone’s empty mind. There is an implicit assumption of equality and mediocrity here; you must believe that every student is roughly the same in skill, aptitude, interest, etc., to be able to be filled this way.
  3. We mistook information for wisdom. Our inheritance from the Enlightenment, the idea that reason will allow us to conquer all, has morphed into information ownership and control. We need to appear informed to seem educated, and we shun the uninformed or misinformed. We align with the most acceptable source of information and forgo any critical assessment of how they attained that information. But this isn’t wisdom. Wisdom goes beyond information; it pivots on a sense of care, attention, and context, allowing us to sift through a barrage of information, selecting and acting only on the truly worthy.

This is a radical departure from the earliest universities, which began in the 4th century BC: Plato teaching in the grove of Academus, Epicurus in his private garden. When they met to discuss, there were no corporate partnerships, no boards of directors. They were drawn together by a shared love of questioning and problem-solving.

Out of these early universities was born the concept of the liberal arts — grammar, logic, rhetoric, arithmetic, geometry, music and astronomy — studies which are “liberal” not because they are easy or unserious, but because they are suitable for those who are free (liberalis), as opposed to slaves or animals. In the era before SMEs (subject matter experts), these were the subjects thought to be essential preparation for becoming a good, well-informed citizen and an effective participant in public life.

In this view, education is not something you receive and certainly not something you buy; it is a disposition, a way of life you create for yourself grounded in what Dewey called “skilled powers of thinking.” It helps you to become questioning, critical, curious, creative, humble and, ideally, wise.

The Lost Art of Counterfactual Thinking

I said earlier that I would return to the subject of counterfactual thinking: what it is, why it’s been lost, and why it’s important. And I would like to start with another thought experiment: close your eyes and think about one thing that, had it been different over the last three years, might have made things better.

What did you pick? No WHO pandemic declaration? A different PM or President? Effective media? More tolerant citizens?

Maybe you wondered, what if the world was more just? What if truth could really save us (quickly)?

This “what if” talk is, at its core, counterfactual thinking. We all do it. What if I had become an athlete, written more, scrolled less, married someone else?

Counterfactual thinking enables us to shift from perceiving the immediate environment to imagining a different one. It is key for learning from past experiences, planning and predicting (if I jump off the cliff, x is likely to happen), problem solving, innovation and creativity (maybe I’ll shift careers, arrange my kitchen drawers differently), and it is essential for improving an imperfect world. It also underpins moral emotions like regret and blame (I regret betraying my friend). Neurologically, counterfactual thinking depends on a network of systems for affective processing, mental simulation, and cognitive control, and its impairment is a symptom of a number of mental illnesses, including schizophrenia.

I don’t think it would be an exaggeration to say that we have lost our ability for counterfactual thinking en masse. But why did this happen? There are a lot of factors — with political ones at the top of the list — but one thing that surely contributed is that we lost a sense of play.

Yes, play. Let me explain. With a few exceptions, our culture has a pretty cynical view of the value of play. Even when we do it, we see play time as wasted and messy, allowing for an intolerable number of mistakes and the possibility of outcomes that don’t fit neatly into an existing framework. This messiness is a sign of weakness, and weakness is a threat to our tribal culture.

I think our culture is intolerant of play because it is intolerant of individuality and of distractions from the messaging we’re “supposed” to hear. It is also intolerant of joy, of anything that helps us to feel healthier, more alive, more focused and more jubilant. Furthermore, it doesn’t result in immediate, “concrete deliverables.”

But what if there was more play in science, in medicine and in politics? What if politicians said “What if we did x instead? Let’s just try out the idea?” What if, instead of your doctor writing a script for the “recommended” pharmaceutical, s/he said “What if you reduced your sugar intake… or… tried walking more? Let’s just try.”

“The stick that stirs the drink”

The non-superficiality of play is hardly a new idea. It was central to the development of the culture of Ancient Greece, one of the greatest civilizations in the world. It is telling that Greek words for play (paidia), children (paides) and education (paideia) have the same root. For the Greeks, play was essential not just to sport and theatre, but to ritual, music, and of course word play (rhetoric).

The Greek philosopher, Plato, saw play as deeply influential to the way children develop as adults. We can prevent social disorder, he wrote, by regulating the nature of children’s play. In his Laws, Plato proposed harnessing play for certain purposes: “If a boy is to be a good farmer or a good builder, he should play at building toy houses or at farming and be provided by his tutor with miniature tools modelled on real ones…One should see games as a means of directing children’s tastes and inclinations to the role they will fill as adults.”

Play is also the basis of the Socratic method, the back-and-forth technique of questioning and answering, trying things out, generating contradictions and imagining alternatives to find better hypotheses. Dialectic is essentially playing with ideas.

A number of contemporaries agree with Plato. The philosopher Colin McGinn wrote in 2008 that “Play is a vital part of any full life, and a person who never plays is worse than a ‘dull boy’: he or she lacks imagination, humour and a proper sense of value. Only the bleakest and most life-denying Puritanism could warrant deleting all play from human life…”

And Stuart Brown, founder of the National Institute for Play, wrote: “I don’t think it is too much to say that play can save your life. It certainly has salvaged mine. Life without play is a grinding, mechanical existence organized around doing things necessary for survival. Play is the stick that stirs the drink. It is the basis of all art, games, books, sports, movies, fashion, fun, and wonder — in short, the basis of what we think of as civilization.”

Education as Activity

Play is key but it’s not the only thing missing in modern education. The fact that we have lost it is a symptom, I think, of a more fundamental misunderstanding about what education is and is meant to do.

Let’s go back to the idea of education being an activity. Perhaps the most well-known quotation about education is “Education is not the filling of a pail, but the lighting of a fire.” It litters university recruitment pages, inspirational posters, mugs, and sweatshirts. Typically attributed to William Butler Yeats, the quotation is actually from Plutarch’s essay “On Listening” in which he writes “For the mind does not require filling like a bottle, but rather, like wood, it only requires kindling to create in it an impulse to think independently and an ardent desire for the truth.”

The way Plutarch contrasts learning with filling suggests that the latter was a common, but mistaken, idea. Strangely, we seem to have returned to the mistake and to the assumption that, once you get your bottle filled up, you are complete, you are educated. But if education is a kindling instead of a filling, how is the kindling achieved? How do you help to “create an impulse to think independently?” Let’s do another thought experiment.

If you knew that you could get away with anything, suffering no consequences, what would you do?

There is a story from Plato’s Republic, Book II (discussing the value of justice) that fleshes out this question. Plato describes a shepherd who stumbles upon a ring that grants him the ability to become invisible. He uses his invisibility to seduce the queen, kill the king, and take over the kingdom. Glaucon, one of the interlocutors in the dialogue, suggests that, if there were two such rings, one given to a just man and the other to an unjust man, there would be no difference between them; they would both take advantage of the ring’s powers, suggesting that anonymity is the only barrier between a just and an unjust person.

Refuting Glaucon, Socrates says that the truly just person will do the right thing even with impunity because he understands the true benefits of acting justly.

Isn’t this the real goal of education, namely to create a well-balanced person who loves learning and justice for their own sakes? This person understands that the good life consists not in seeming but in being, in having a balanced inner self that takes pleasure in the right things because of an understanding of what they offer.

In the first book of his canonical ethical text, Aristotle (Plato’s student) asks what is the good life? What does it consist of? His answer is an obvious one: happiness. But his view of happiness is a bit different from ours. It is a matter of flourishing, which means functioning well according to your nature. And functioning well according to human nature is achieving excellence in reasoning, both intellectually and morally. The intellectual virtues (internal goods) include: scientific knowledge, technical knowledge, intuition, practical wisdom, and philosophical wisdom. The moral virtues include: justice, courage, and temperance.

For Aristotle, what our lives look like from the outside — wealth, health, status, social media likes, reputation — are all “external goods.” It’s not that these are unimportant but we need to understand their proper place in the good life. Having the internal and external goods in their right proportion is the only way to become an autonomous, self-governing, complete person.

It’s pretty clear that we aren’t flourishing as a people, especially if the following are any indication: Canada recently ranked 15th on the World Happiness Report, we have unprecedented levels of anxiety and mental illness, and in 2021 a children’s mental health crisis was declared and the NIH reported an unprecedented number of drug overdose deaths.

By contrast with most young people today, the person who is flourishing and complete will put less stock in the opinions of others, including institutions, because they will have more fully developed internal resources and they will be more likely to recognize when a group is making a bad decision. They will be less vulnerable to peer pressure and coercion, and they will have more to rely on if they do become ostracized from the group.

Educating with a view to the intellectual and moral virtues develops a lot of other things we are missing: research and inquiry skills, physical and mental agility, independent thinking, impulse control, resilience, patience and persistence, problem solving, self-regulation, endurance, self-confidence, self-satisfaction, joy, cooperation, collaboration, negotiation, empathy, and even the ability to put energy into a conversation.

What should be the goals of education? It’s pretty simple (in conception, even if not in execution). At any age, for any subject matter, the only two goals of education are:

  1. To create a self-ruled (autonomous) person from the ‘inside out,’ who…
  2. Loves learning for its own sake

Education, in this view, is not passive and it is never complete. It is always in process, always open, always humble and humbling.

My students, unfortunately, were like the Republic’s shepherd; they measured the quality of their lives by what they could get away with, by what their lives looked like from the outside. But their lives were like a shiny apple that, when you cut into it, is rotten on the inside. And that interior emptiness left them aimless, hopeless, dissatisfied and, ultimately, miserable.

But it doesn’t have to be this way. Imagine what the world would be like if it were made up of self-ruled people. Would we be happier? Would we be healthier? Would we be more productive? Would we care less about measuring our productivity? My inclination is to think we would be much, much better off.

Self-governance has come under such relentless attack over the last few years because it encourages us to think for ourselves. And this attack didn’t begin recently nor did it emerge ex nihilo. John D. Rockefeller (who, ironically, co-founded the General Education Board in 1902) wrote, “I don’t want a nation of thinkers. I want a nation of workers.” His wish has largely come true.

The battle we are in is a battle over whether we will be slaves or masters, ruled or self-mastered. It is a battle over whether we will be unique or forced into a mold.

Thinking of students as identical to one another makes them substitutable, controllable and, ultimately, erasable. Moving forward, how do we avoid seeing ourselves as bottles to be filled by others? How do we embrace Plutarch’s exhortation to “create […] an impulse to think independently and an ardent desire for the truth?”

When it comes to education, isn’t that the question we must confront as we move through the strangest of times?

Author

  • Julie Ponesse

    Dr. Julie Ponesse, 2023 Brownstone Fellow, is a professor of ethics who has taught at Ontario’s Huron University College for 20 years. She was placed on leave and banned from accessing her campus due to the vaccine mandate. She presented at The Faith and Democracy Series in 2021. Dr. Ponesse has now taken on a new role with The Democracy Fund, a registered Canadian charity aimed at advancing civil liberties, where she serves as the pandemic ethics scholar.



The CDC Planned Quarantine Camps Nationwide


From the Brownstone Institute

By Jeffrey A. Tucker

No matter how bad you think Covid policies were, they were intended to be worse. 

Consider the vaccine passports alone. Six cities restricted public indoor spaces to the vaccinated only: New York City, Boston, Chicago, New Orleans, Washington, D.C., and Seattle. The plan was to enforce this with a vaccine passport. It broke. Once the news leaked that the shot didn’t stop infection or transmission, the planners lost public support and the scheme collapsed.

It was undoubtedly planned to be permanent and nationwide if not worldwide. Instead, the scheme had to be dialed back.

The CDC’s edicts did incredible damage. The agency imposed the rent moratorium. It decreed the ridiculous “six feet of distance” and mask mandates. It forced Plexiglas as the interface for commercial transactions. It implied that mail-in balloting must be the norm, which probably flipped the election. It delayed the reopening as long as possible. It was sadistic.

Even with all that, worse was planned. On July 26, 2020, with the George Floyd riots having finally settled down, the CDC issued a plan for establishing nationwide quarantine camps. People were to be isolated, given only food and some cleaning supplies. They would be banned from participating in any religious services. The plan included contingencies for preventing suicide. There were no provisions made for any legal appeals or even the right to legal counsel. 

The plan’s authors were unnamed, but the document included 26 footnotes. It was completely official. It was only removed around March 26, 2023. During the entire intervening time, the plan survived on the CDC’s public site with little to no public notice or controversy.

It was called “Interim Operational Considerations for Implementing the Shielding Approach to Prevent COVID-19 Infections in Humanitarian Settings.” 

The phrase “absence of empirical data” means, in effect, that nothing like this has ever been tried. The point of the document was to map out how it could be possible and to alert authorities to possible pitfalls to be avoided.

“This document presents considerations from the perspective of the U.S. Centers for Disease Control & Prevention (CDC) for implementing the shielding approach in humanitarian settings as outlined in guidance documents focused on camps, displaced populations and low-resource settings. This approach has never been documented and has raised questions and concerns among humanitarian partners who support response activities in these settings. The purpose of this document is to highlight potential implementation challenges of the shielding approach from CDC’s perspective and guide thinking around implementation in the absence of empirical data. Considerations are based on current evidence known about the transmission and severity of coronavirus disease 2019 (COVID-19) and may need to be revised as more information becomes available.”

The meaning of “shielding” is “to reduce the number of severe Covid-19 cases by limiting contact between individuals at higher risk of developing severe disease (‘high-risk’) and the general population (‘low-risk’). High-risk individuals would be temporarily relocated to safe or ‘green zones’ established at the household, neighborhood, camp/sector, or community level depending on the context and setting. They would have minimal contact with family members and other low-risk residents.”

In other words, these are what used to be called concentration camps.

Who are these people who would be rounded up? They are “older adults and people of any age who have serious underlying medical conditions.” Who determines this? Public health authorities. The purpose? The CDC explains: “physically separating high-risk individuals from the general population” allows authorities “to prioritize the use of the limited available resources.”

This sounds a lot like condemning people to death in the name of protecting them.

The model establishes three levels. First is the household level. Here high-risk people are “physically isolated from other household members.” That alone is objectionable. Elders need people to take care of them. They need love and to be surrounded by family. The CDC should never imagine that it would intervene in households to force old people into separate places.

The model jumps from households to the “neighborhood level.” Here we have the same approach: forced separation of those deemed vulnerable.

From there, the model jumps again to the “camp/sector level.” Here it is different. “A group of shelters such as schools, community buildings within a camp/sector (max 50 high-risk individuals per single green zone) where high-risk individuals are physically isolated together. One entry point is used for exchange of food, supplies, etc. A meeting area is used for residents and visitors to interact while practicing physical distancing (2 meters). No movement into or outside the green zone.”

Yes, you read that correctly. The CDC is here proposing concentration camps for the sick or anyone they deem to be in danger of medically significant consequences of infection.

Further: “to minimize external contact, each green zone should include able-bodied high-risk individuals capable of caring for residents who have disabilities or are less mobile. Otherwise, designate low-risk individuals for these tasks, preferably who have recovered from confirmed COVID-19 and are assumed to be immune.”

The plan says in passing, contradicting thousands of years of experience, “Currently, we do not know if prior infection confers immunity.” Therefore the only solution is to minimize all exposure throughout the whole population. Getting sick is criminalized.

These camps require a “dedicated staff” to “monitor each green zone. Monitoring includes both adherence to protocols and potential adverse effects or outcomes due to isolation and stigma. It may be necessary to assign someone within the green zone, if feasible, to minimize movement in/out of green zones.”

The people housed in these camps would need to be given good explanations of why they are denied even basic religious freedom. The report explains:

“Proactive planning ahead of time, including strong community engagement and risk communication is needed to better understand the issues and concerns of restricting individuals from participating in communal practices because they are being shielded. Failure to do so could lead to both interpersonal and communal violence.”

Further, there must be some mechanisms to prohibit suicide:

Additional stress and worry are common during any epidemic and may be more pronounced with COVID-19 due to the novelty of the disease and increased fear of infection, increased childcare responsibilities due to school closures, and loss of livelihoods. Thus, in addition to the risk of stigmatization and feeling of isolation, this shielding approach may have an important psychological impact and may lead to significant emotional distress, exacerbate existing mental illness or contribute to anxiety, depression, helplessness, grief, substance abuse, or thoughts of suicide among those who are separated or have been left behind. Shielded individuals with concurrent severe mental health conditions should not be left alone. There must be a caregiver allocated to them to prevent further protection risks such as neglect and abuse.

The biggest risk, the document explains, is as follows: “While the shielding approach is not meant to be coercive, it may appear forced or be misunderstood in humanitarian settings.”

(It should go without saying, but the “shielding” approach suggested here has nothing to do with the focused protection of the Great Barrington Declaration. Focused protection specifically says: “schools and universities should be open for in-person teaching. Extracurricular activities, such as sports, should be resumed. Young low-risk adults should work normally, rather than from home. Restaurants and other businesses should open. Arts, music, sport and other cultural activities should resume. People who are more at risk may participate if they wish, while society as a whole enjoys the protection conferred upon the vulnerable by those who have built up herd immunity.”)

Four years of research have turned up truly shocking documents and evidence of what happened in the Covid years, and this one certainly ranks at the top of the list of totalitarian schemes for pathogenic control prior to vaccination. It is quite simply mind-blowing that such a scheme could ever be contemplated.

Who wrote it? What kind of deep institutional pathology exists that enabled this to be contemplated? The CDC has 10,600 full-time employees and contractors and a budget of $11.5 billion. In light of this report, and everything else that has gone on there for four years, both numbers should be zero.

Author

Jeffrey A Tucker

Jeffrey Tucker is Founder, Author, and President at Brownstone Institute. He is also Senior Economics Columnist for Epoch Times, author of 10 books, including Life After Lockdown, and many thousands of articles in the scholarly and popular press. He speaks widely on topics of economics, technology, social philosophy, and culture.


They Are Scrubbing the Internet Right Now


From the Brownstone Institute

By Jeffrey A. Tucker and Debbie Lerman

Instances of censorship are growing to the point of normalization. Despite ongoing litigation and more public attention, mainstream social media has been more ferocious in recent months than ever before. Podcasters know for sure what will be instantly deleted and debate among themselves over content in gray areas. Some, like Brownstone, have given up on YouTube in favor of Rumble, sacrificing vast audiences if only so that their content survives to see the light of day.

It’s not always about being censored or not. Today’s algorithms include a range of tools that affect searchability and findability. For example, the Joe Rogan interview with Donald Trump racked up an astonishing 34 million views before YouTube and Google tweaked their search engines to make it hard to discover, while even presiding over a technical malfunction that disabled viewing for many people. Faced with this, Rogan went to the platform X to post all three hours.

Navigating this thicket of censorship and quasi-censorship has become part of the business model of alternative media.

Those are just the headline cases. Beneath the headlines, there are technical events taking place that are fundamentally affecting the ability of any historian even to look back and tell what is happening. Incredibly, the service Archive.org, which has been around since 1994, has stopped taking images of content on all platforms. For the first time in 30 years, we have gone a long stretch of time – since October 8-10 – without this service chronicling the life of the Internet in real time.

As of this writing, we have no way to verify content that has been posted for three weeks of October leading to the days of the most contentious and consequential election of our lifetimes. Crucially, this is not about partisanship or ideological discrimination. No websites on the Internet are being archived in ways that are available to users. In effect, the whole memory of our main information system is just a big black hole right now.

The trouble on Archive.org began on October 8, 2024, when the service was suddenly hit with a massive distributed denial-of-service (DDoS) attack that not only took down the service but introduced a level of failure that nearly took it out completely. Working around the clock, Archive.org came back as a read-only service, where it stands today. However, you can only read content that was posted before the attack. The service has yet to resume any public display of mirroring of any sites on the Internet.

In other words, the only source on the entire World Wide Web that mirrors content in real time has been disabled. For the first time since the invention of the web browser itself, researchers have been robbed of the ability to compare past with future content, an action that is a staple of researchers looking into government and corporate actions.

It was using this service, for example, that enabled Brownstone researchers to discover precisely what the CDC had said about Plexiglas, filtration systems, mail-in ballots, and rental moratoriums. That content was all later scrubbed off the live Internet, so accessing archive copies was the only way we could know and verify what was true. It was the same with the World Health Organization and its disparagement of natural immunity which was later changed. We were able to document the shifting definitions thanks only to this tool which is now disabled.

What this means is the following: Any website can post anything today and take it down tomorrow and leave no record of what they posted unless some user somewhere happened to take a screenshot. Even then there is no way to verify its authenticity. The standard approach to know who said what and when is now gone. That is to say that the whole Internet is already being censored in real time so that during these crucial weeks, when vast swaths of the public fully expect foul play, anyone in the information industry can get away with anything and not get caught.

We know what you are thinking. Surely this DDoS attack was not a coincidence. The timing was just too perfect. And maybe that is right. We just do not know. Does Archive.org suspect something along those lines? Here is what they say:

Last week, along with a DDOS attack and exposure of patron email addresses and encrypted passwords, the Internet Archive’s website javascript was defaced, leading us to bring the site down to access and improve our security. The stored data of the Internet Archive is safe and we are working on resuming services safely. This new reality requires heightened attention to cyber security and we are responding. We apologize for the impact of these library services being unavailable.

Deep state? As with all these things, there is no way to know, but the effort to blast away the ability of the Internet to have a verified history fits neatly into the stakeholder model of information distribution that has clearly been prioritized on a global level. The Declaration of the Future of the Internet makes that very clear: the Internet should be “governed through the multi-stakeholder approach, whereby governments and relevant authorities partner with academics, civil society, the private sector, technical community and others.”  All of these stakeholders benefit from the ability to act online without leaving a trace.

To be sure, a librarian at Archive.org has written that “While the Wayback Machine has been in read-only mode, web crawling and archiving have continued. Those materials will be available via the Wayback Machine as services are secured.”

When? We do not know. Before the election? In five years? There might be some technical reasons, but it would seem that if web crawling is continuing behind the scenes, as the note suggests, that material too could be made available in read-only mode now. It is not.

Disturbingly, this erasure of Internet memory is happening in more than one place. For many years,  Google offered a cached version of the link you were seeking just below the live version. They have plenty of server space to enable that now, but no: that service is now completely gone. In fact, the Google cache service officially ended just a week or two before the Archive.org crash, at the end of September 2024.

Thus the two available tools for searching cached pages on the Internet disappeared within weeks of each other and within weeks of the November 5th election.

Other disturbing trends are also turning Internet search results increasingly into AI-controlled lists of establishment-approved narratives. The web standard used to be for search result rankings to be governed by user behavior, links, citations, and so forth. These were more or less organic metrics, based on an aggregation of data indicating how useful a search result was to Internet users. Put very simply, the more people found a search result useful, the higher it would rank. Google now uses very different metrics to rank search results, including what it considers “trusted sources” and other opaque, subjective determinations.

Furthermore, the most widely used service that once ranked websites based on traffic is now gone. That service was called Alexa. The company that created it was independent. Then one day in 1999, it was bought by Amazon. That seemed encouraging because Amazon was well-heeled. The acquisition seemed to codify the tool that everyone was using as a kind of metric of status on the web. It was common back in the day to take note of an article somewhere on the web and then look it up on Alexa to see its reach. If it was important, one would take notice, but if it was not, no one particularly cared.

This is how an entire generation of web technicians functioned. The system worked as well as one could possibly expect.

Then, in 2014, years after acquiring the ranking service Alexa, Amazon did a strange thing. It released its home assistant (and surveillance device) with the same name. Suddenly, everyone had them in their homes and would find out anything by saying “Hey Alexa.” Something seemed strange about Amazon naming its new product after an unrelated business it had acquired years earlier. No doubt there was some confusion caused by the naming overlap.

Here’s what happened next. In 2022, Amazon actively took down the web ranking tool. It didn’t sell it. It didn’t raise the prices. It didn’t do anything with it. It suddenly made it go completely dark.

No one could figure out why. It was the industry standard, and suddenly it was gone. Not sold, just blasted away. No longer could anyone figure out the traffic-based website rankings of anything without paying very high prices for hard-to-use proprietary products.

All of these data points, which might seem unrelated when considered individually, are actually part of a long trajectory that has shifted our information landscape into unrecognizable territory. The Covid events of 2020-2023, with massive global censorship and propaganda efforts, greatly accelerated these trends.

One wonders if anyone will remember what it was once like. The hacking and hobbling of Archive.org underscores the point: there will be no more memory.

As of this writing, fully three weeks of web content have not been archived. What we are missing and what has changed is anyone’s guess. And we have no idea when the service will come back. It is entirely possible that it will not come back, that the only real history to which we can take recourse will be pre-October 8, 2024, the date on which everything changed.

The Internet was founded to be free and democratic. It will require herculean efforts at this point to restore that vision, because something else is quickly replacing it.

Authors

Jeffrey A. Tucker and Debbie Lerman
