Brownstone Institute
Why Is Our Education System Failing to Educate?

From the Brownstone Institute
I suspect many of you know my story. But, for those who don’t, the short version is that I taught philosophy — ethics and ancient philosophy, in particular — at Western University in Canada until September 2021 when I was very publicly terminated “with cause” for refusing to comply with Western’s COVID-19 policy.
What I did — question, critically evaluate and, ultimately, challenge what we now call “the narrative” — is risky behaviour. It got me fired, labeled an “academic pariah,” chastised by mainstream media, and vilified by my peers. But this ostracization and vilification, it turns out, was just a symptom of a shift towards a culture of silence, nihilism, and mental atrophy that had been brewing for a long time.
You know that parental rhetorical question, “So if everyone jumped off a cliff, would you do it too?” It turns out that most would jump at the rate of about 90 percent and that most of the 90 percent wouldn’t ask any questions about the height of the cliff, alternative options, accommodations for the injured, etc. What was supposed to be a cautionary rhetorical joke has become the modus operandi of the Western world.
Admittedly, I am a bit of an odd choice as the keynote speaker for an education conference. I have no specialized training in the philosophy of education or in pedagogy. In graduate school, you receive little formal instruction about how to teach. You learn by experience, research, trial by fire, and by error. And, of course, I was terminated from my position as a university teacher. But I do think a lot about education. I look at how many people are willing to outsource their thinking and I wonder: what went wrong? Confronted with the products of our public school system every day for 20 years, I wondered the same thing. And, finally, as the mother of a 2-year-old, I think a lot about what happens in the early years to encourage a better outcome than we are seeing today.
My aim today is to talk a bit about what I saw in university students during my teaching career, why I think the education system failed them, and the only two basic skills any student at any age really needs.
Let’s start by doing something I used to do regularly in class, something some students loved and others hated. Let’s brainstorm some answers to this question: What does it mean to “be educated?”
[Answers from the audience included: “to acquire knowledge,” “to learn the truth,” “to develop a set of required skills,” “to get a degree.”]
Many answers were admirable but I noticed that most describe education passively: “to be educated,” “to get a degree,” “to be informed” are all passive constructions.
When it comes to writing, we are often told to use the active voice. It is clearer, more emphatic, and creates greater emotional impact. And yet the predominant way we describe education is passive. But is education really a passive experience? Is it something that just happens to us like getting rained on or being scratched by a cat? And do you need to be acted on by someone else in order to become educated? Or is education a more active, personal, emphatic and impactful experience? Might “I am educating,” “I am learning” be more accurate descriptions?
My experience in the classroom was certainly consistent with thinking of education as a passive experience. Over the years, I saw an increasing trend towards timidity, conformity and apathy, all signs of educational passivity. But this was a stark departure from the university culture that met me as an undergraduate in the mid-90s.
As an undergraduate, my classes were robust theaters of The Paper Chase-style effervescent debate. But there was a palpable shift sometime in the late 90s. A hush fell over the classroom. Topics once relied on to ignite discussion — abortion, slavery, capital punishment — no longer held the same appeal. Fewer and fewer hands went up. Students trembled at the thought of being called on and, when they did speak, they parroted a set of ‘safe’ ideas, prefacing with “of course” the positions that would allow them to navigate safely between the Scylla and Charybdis of topics deemed off-limits by the woke zealots.
The stakes are even higher now. Students who question or refuse to comply are rejected or de-enrolled. Recently, an Ontario university student was suspended for asking for a definition of “colonialism.” Merely asking for clarification in the 21st century is academic heresy. Professors like myself are punished or terminated for speaking out, and our universities are becoming increasingly closed systems in which autonomous thought is a threat to the neoliberal groupthink model of ‘education.’
I spent some time thinking in concrete terms about the traits I saw in the novel 21st-century student. With some exceptions, most students suffer from the following symptoms of our educational failure. They are (for the most part):
- “Information-focused,” not “wisdom-interested:” they are computational, able to input and output information (more or less), but lack the critical ability to understand why they are doing so or to manipulate the data in unique ways.
- Science and technology worshipping: they treat STEM (science, technology, engineering and mathematics) as a god, as an end in itself rather than an instrument to achieve some end.
- Intolerant of uncertainty, complications, gray areas, open questions, and they are generally unable to formulate questions themselves.
- Apathetic, unhappy, even miserable (and I’m not sure they ever felt otherwise so they may not recognize these states for what they are).
- Increasingly unable to engage in counterfactual thinking. (I will return to this idea in a moment.)
- Instrumentalist: everything they do is for the sake of something else.
To elaborate on this last point, when I used to ask my students why they were at university, the following sort of conversation would usually ensue:
Why did you come to university?
To get a degree.
Why?
So I can get into law school (nursing or some other impressive post-graduate program).
Why?
So I can get a good job.
Why?
The well of reflex answers typically started to dry up at that point. Some were honest that the lure of a “good job” was to attain money or a certain social status; others seemed genuinely perplexed by the question or would simply say: “My parents tell me I should,” “My friends are all doing it,” or “Society expects it.”
Being an instrumentalist about education means that you see it as valuable only as a way to obtain some further, non-educational good. Again, the passivity is palpable. In this view, education is something that gets poured into you. Once you get enough poured in, it’s time to graduate and unlock the door to the next life prize. But this makes education, for its own sake, meaningless and substitutable. Why not just buy the subject-specific microchip when it becomes available and avoid all the unpleasant studying, questioning, self-reflection, and skill-building?
Time has shown us where this instrumentalism has gotten us: we live in an era of pseudo-intellectuals, pseudo-students and pseudo-education, each of us increasingly unclear about why we need education (of the sort offered by our institutions) or how it’s helping to create a better world.
Why the change? How did intellectual curiosity and critical thinking get trained out of our universities? It’s complex but there are three factors that surely contributed:
- Universities became businesses. They became corporate entities with boards of governors, customers and ad campaigns. In early 2021, Huron College (where I worked) appointed its first board of governors with members from Rogers, Sobeys, and EllisDon, a move author Christopher Newfield calls the “great mistake.” Regulatory capture (of the sort that led the University of Toronto to partner with Moderna) is just one consequence of this collusion.
- Education became a commodity. Education is treated as a purchasable, exchangeable good, which fits well with the idea that education is something that can be downloaded to anyone’s empty mind. There is an implicit assumption of equality and mediocrity here: you must believe that every student is roughly the same in skill, aptitude, and interest for them all to be filled this way.
- We mistook information for wisdom. Our inheritance from the Enlightenment, the idea that reason will allow us to conquer all, has morphed into information ownership and control. We need to appear informed to seem educated, and we shun the uninformed or misinformed. We align with the most acceptable source of information and forego any critical assessment of how they attained that information. But this isn’t wisdom. Wisdom goes beyond information; it pivots on a sense of care, attention, and context, allowing us to sift through a barrage of information, selecting and acting only on the truly worthy.
This is a radical departure from the earliest universities, which began in the 4th century BC: Plato teaching in the grove of Academus, Epicurus in his private garden. When they met to discuss, there were no corporate partnerships, no boards of directors. They were drawn together by a shared love of questioning and problem-solving.
Out of these early universities was born the concept of the liberal arts — grammar, logic, rhetoric, arithmetic, geometry, music and astronomy — studies which are “liberal” not because they are easy or unserious, but because they are suitable for those who are free (liberalis), as opposed to slaves or animals. In the era before SMEs (subject matter experts), these were the subjects thought to be essential preparation for becoming a good, well-informed citizen and an effective participant in public life.
In this view, education is not something you receive and certainly not something you buy; it is a disposition, a way of life you create for yourself grounded in what Dewey called “skilled powers of thinking.” It helps you to become questioning, critical, curious, creative, humble and, ideally, wise.
The Lost Art of Counterfactual Thinking
I said earlier that I would return to the subject of counterfactual thinking: what it is, why it’s been lost and why it’s important. And I would like to start with another thought experiment: close your eyes and think of one thing that, had it been different over the last 3 years, might have made things better.
What did you pick? No WHO pandemic declaration? A different PM or President? Effective media? More tolerant citizens?
Maybe you wondered, what if the world was more just? What if truth could really save us (quickly)?
This “what if” talk is, at its core, counterfactual thinking. We all do it. What if I had become an athlete, written more, scrolled less, married someone else?
Counterfactual thinking enables us to shift from perceiving the immediate environment to imagining a different one. It is key for learning from past experiences, planning and predicting (if I jump off the cliff, x is likely to happen), problem solving, innovation and creativity (maybe I’ll shift careers, arrange my kitchen drawers differently), and it is essential for improving an imperfect world. It also underpins moral emotions like regret and blame (I regret betraying my friend). Neurologically, counterfactual thinking depends on a network of systems for affective processing, mental simulation, and cognitive control, and disruptions to it are associated with a number of mental illnesses, including schizophrenia.
I don’t think it would be an exaggeration to say that we have lost our ability for counterfactual thinking en masse. But why did this happen? There are a lot of factors — with political ones at the top of the list — but one thing that surely contributed is that we lost a sense of play.
Yes, play. Let me explain. With a few exceptions, our culture has a pretty cynical view of the value of play. Even when we do play, we see the time as wasted and messy, allowing for an intolerable number of mistakes and the possibility of outcomes that don’t fit neatly into an existing framework. This messiness is taken as a sign of weakness, and weakness is a threat to our tribal culture.
I think our culture is intolerant of play because it is intolerant of individuality and of distractions from the messaging we’re “supposed” to hear. It is also intolerant of joy, of anything that helps us to feel healthier, more alive, more focused and more jubilant. Furthermore, it doesn’t result in immediate, “concrete deliverables.”
But what if there was more play in science, in medicine and in politics? What if politicians said “What if we did x instead? Let’s just try out the idea?” What if, instead of your doctor writing a script for the “recommended” pharmaceutical, s/he said “What if you reduced your sugar intake… or… tried walking more? Let’s just try.”
“The stick that stirs the drink”
The non-superficiality of play is hardly a new idea. It was central to the development of the culture of Ancient Greece, one of the greatest civilizations in the world. It is telling that Greek words for play (paidia), children (paides) and education (paideia) have the same root. For the Greeks, play was essential not just to sport and theatre, but to ritual, music, and of course word play (rhetoric).
The Greek philosopher, Plato, saw play as deeply influential to the way children develop as adults. We can prevent social disorder, he wrote, by regulating the nature of children’s play. In his Laws, Plato proposed harnessing play for certain purposes: “If a boy is to be a good farmer or a good builder, he should play at building toy houses or at farming and be provided by his tutor with miniature tools modelled on real ones…One should see games as a means of directing children’s tastes and inclinations to the role they will fill as adults.”
Play is also the basis of the Socratic method, the back-and-forth technique of questioning and answering, trying things out, generating contradictions and imagining alternatives to find better hypotheses. Dialectic is essentially playing with ideas.
A number of contemporaries agree with Plato. The philosopher Colin McGinn wrote in 2008 that “Play is a vital part of any full life, and a person who never plays is worse than a ‘dull boy’: he or she lacks imagination, humour and a proper sense of value. Only the bleakest and most life-denying Puritanism could warrant deleting all play from human life…”
And Stuart Brown, founder of the National Institute for Play, wrote: “I don’t think it is too much to say that play can save your life. It certainly has salvaged mine. Life without play is a grinding, mechanical existence organized around doing things necessary for survival. Play is the stick that stirs the drink. It is the basis of all art, games, books, sports, movies, fashion, fun, and wonder — in short, the basis of what we think of as civilization.”
Education as Activity
Play is key but it’s not the only thing missing in modern education. The fact that we have lost it is a symptom, I think, of a more fundamental misunderstanding about what education is and is meant to do.
Let’s go back to the idea of education being an activity. Perhaps the most well-known quotation about education is “Education is not the filling of a pail, but the lighting of a fire.” It litters university recruitment pages, inspirational posters, mugs, and sweatshirts. Typically attributed to William Butler Yeats, the quotation is actually from Plutarch’s essay “On Listening” in which he writes “For the mind does not require filling like a bottle, but rather, like wood, it only requires kindling to create in it an impulse to think independently and an ardent desire for the truth.”
The way Plutarch contrasts learning with filling suggests that the latter was a common, but mistaken, idea. Strangely, we seem to have returned to the mistake and to the assumption that, once you get your bottle filled up, you are complete, you are educated. But if education is a kindling instead of a filling, how is the kindling achieved? How do you help to “create an impulse to think independently?” Let’s do another thought experiment.
If you knew that you could get away with anything, suffering no consequences, what would you do?
There is a story from Plato’s Republic, Book II (discussing the value of justice) that fleshes out this question. Plato describes a shepherd who stumbles upon a ring that grants him the ability to become invisible. He uses his invisibility to seduce the queen, kill her king, and take over the kingdom. Glaucon, one of the interlocutors in the dialogue, suggests that, if there were two such rings, one given to a just man, and the other to an unjust man, there would be no difference between them; they would both take advantage of the ring’s powers, suggesting that anonymity is the only barrier between a just and an unjust person.
Refuting Glaucon, Socrates says that the truly just person will do the right thing even with impunity because he understands the true benefits of acting justly.
Isn’t this the real goal of education, namely to create a well-balanced person who loves learning and justice for their own sakes? This person understands that the good life consists not in seeming but in being, in having a balanced inner self that takes pleasure in the right things because of an understanding of what they offer.
In the first book of his canonical ethical text, Aristotle (Plato’s student) asks: what is the good life? What does it consist of? His answer is an obvious one: happiness. But his view of happiness is a bit different from ours. It is a matter of flourishing, which means functioning well according to your nature. And functioning well according to human nature means achieving excellence in reasoning, both intellectually and morally. The intellectual virtues (internal goods) include scientific knowledge, technical knowledge, intuition, practical wisdom, and philosophical wisdom. The moral virtues include justice, courage, and temperance.
For Aristotle, what our lives look like from the outside — wealth, health, status, social media likes, reputation — are all “external goods.” It’s not that these are unimportant but we need to understand their proper place in the good life. Having the internal and external goods in their right proportion is the only way to become an autonomous, self-governing, complete person.
It’s pretty clear that we aren’t flourishing as a people, especially if the following are any indication: Canada recently ranked 15th on the World Happiness Report, we have unprecedented levels of anxiety and mental illness, and in 2021 a children’s mental health crisis was declared and the NIH reported an unprecedented number of drug overdose deaths.
By contrast with most young people today, the person who is flourishing and complete will put less stock in the opinions of others, including institutions, because they will have more fully developed internal resources and they will be more likely to recognize when a group is making a bad decision. They will be less vulnerable to peer pressure and coercion, and they will have more to rely on if they do become ostracized from the group.
Educating with a view to the intellectual and moral virtues develops a lot of other things we are missing: research and inquiry skills, physical and mental agility, independent thinking, impulse control, resilience, patience and persistence, problem solving, self-regulation, endurance, self-confidence, self-satisfaction, joy, cooperation, collaboration, negotiation, empathy, and even the ability to put energy into a conversation.
What should be the goals of education? It’s pretty simple (in conception even if not in execution). At any age, for any subject matter, the only 2 goals of education are:
- To create a self-ruled (autonomous) person from the ‘inside out,’ who…
- Loves learning for its own sake
Education, in this view, is not passive and it is never complete. It is always in process, always open, always humble and humbling.
My students, unfortunately, were like the Republic’s shepherd; they measured the quality of their lives by what they could get away with, by what their lives looked like from the outside. But their lives were like a shiny apple that, when you cut into it, is rotten on the inside. And their interior emptiness left them aimless, hopeless, dissatisfied and, ultimately, miserable.
But it doesn’t have to be this way. Imagine what the world would be like if it were made up of self-ruled people. Would we be happier? Would we be healthier? Would we be more productive? Would we care less about measuring our productivity? My inclination is to think we would be much, much better off.
Self-governance has come under such relentless attack over the last few years because it encourages us to think for ourselves. And this attack didn’t begin recently nor did it emerge ex nihilo. John D. Rockefeller (who, ironically, co-founded the General Education Board in 1902) wrote, “I don’t want a nation of thinkers. I want a nation of workers.” His wish has largely come true.
The battle we are in is a battle over whether we will be slaves or masters, ruled or self-mastered. It is a battle over whether we will be unique or forced into a mold.
Thinking of students as identical to one another makes them substitutable, controllable and, ultimately, erasable. Moving forward, how do we avoid seeing ourselves as bottles to be filled by others? How do we embrace Plutarch’s exhortation to “create […] an impulse to think independently and an ardent desire for the truth?”
When it comes to education, isn’t that the question we must confront as we move through the strangest of times?
The Doctor Will Kill You Now

From the Brownstone Institute
Way back in the B.C. era (Before Covid), I taught Medical Humanities and Bioethics at an American medical school. One of my older colleagues – I’ll call him Dr. Quinlan – was a prominent member of the faculty and a nationally recognized proponent of physician-assisted suicide.
Dr. Quinlan was a very nice man. He was soft-spoken, friendly, and intelligent. He had originally become involved in the subject of physician-assisted suicide by accident, while trying to help a patient near the end of her life who was suffering terribly.
That particular clinical case, which Dr. Quinlan wrote up and published in a major medical journal, launched a second career of sorts for him, as he became a leading figure in the physician-assisted suicide movement. In fact, he was lead plaintiff in a challenge of New York’s then-prohibition against physician-assisted suicide.
The case eventually went all the way to the US Supreme Court, which added to his fame. As it happened, SCOTUS ruled 9-0 against him, definitively establishing that there is no “right to die” enshrined in the Constitution, and affirming that the state has a compelling interest to protect the vulnerable.
SCOTUS’s unanimous decision against Dr. Quinlan meant that his side had somehow pulled off the impressive feat of uniting Antonin Scalia, Ruth Bader Ginsburg, and all points in between against their cause. (I never quite saw how that added to his luster, but such is the Academy.)
At any rate, I once had a conversation with Dr. Quinlan about physician-assisted suicide. I told him that I opposed it ever becoming legal. I recall he calmly, pleasantly asked me why I felt that way.
First, I acknowledged that his formative case must have been very tough, and allowed that maybe, just maybe, he had done right in that exceptionally difficult situation. But as the legal saying goes, hard cases make bad law.
Second, as a clinical physician, I felt strongly that no patient should ever see their doctor and have to wonder if he was coming to help keep them alive or to kill them.
Finally, perhaps most importantly, there’s this thing called the slippery slope.
As I recall, he replied that he couldn’t imagine the slippery slope becoming a problem in a matter so profound as causing a patient’s death.
Well, maybe not with you personally, Dr. Quinlan, I thought. I said no more.
But having done my residency at a major liver transplant center in Boston, I had had more than enough experience with the rather slapdash ethics of the organ transplantation world. The opaque shuffling of patients up and down the transplant list, the endless and rather macabre scrounging for donors, and the nebulous, vaguely sinister concept of brain death had all unsettled me.
Prior to residency, I had attended medical school in Canada. In those days, the McGill University Faculty of Medicine was still almost Victorian in its ways: an old-school, stiff-upper-lip, Workaholics-Anonymous-chapter-house sort of place. The ethic was hard work, personal accountability for mistakes, and above all primum non nocere – first, do no harm.
Fast forward to today’s soft-core totalitarian state of Canada, the land of debanking and convicting peaceful protesters, persecuting honest physicians for speaking obvious truth, fining people $25,000 for hiking on their own property, and spitefully seeking to slaughter harmless animals precisely because they may hold unique medical and scientific value.
To all those offenses against liberty, morality, and basic decency, we must add Canada’s aggressive policy of legalizing and, in fact, encouraging industrial-scale physician-assisted suicide. Under Canada’s Medical Assistance in Dying (MAiD) program, which has been in place only since 2016, physician-assisted suicide now accounts for a terrifying 4.7 percent of all deaths in Canada.
MAiD will be permitted for patients suffering from mental illness in Canada in 2027, putting it on par with the Netherlands, Belgium, and Switzerland.
To its credit, and unlike the Netherlands and Belgium, Canada does not allow minors to access MAiD. Not yet.
However, patients scheduled to be terminated via MAiD in Canada are actively recruited to have their organs harvested. In fact, MAiD accounts for 6 percent of all deceased organ donors in Canada.
In summary, in Canada, in less than 10 years, physician-assisted suicide has gone from illegal to both an epidemic cause of death and a highly successful organ-harvesting source for the organ transplantation industry.
Physician-assisted suicide has not slid down the slippery slope in Canada. It has thrown itself off the face of El Capitan.
And now, at long last, physician-assisted suicide may be coming to New York. It has passed the state Assembly and Senate, and awaits only the Governor’s signature. It seems that the 9-0 Supreme Court shellacking back in the day was just a bump in the road. The long march through the institutions, indeed.
For a brief period in Western history, roughly from the introduction of antibiotics until Covid, hospitals ceased to be a place one entered fully expecting to die. It appears that era is coming to an end.
Covid demonstrated that Western allopathic medicine has a dark, sadistic, anti-human side – fueled by 20th-century scientism and 21st-century technocratic globalism – to which it is increasingly turning. Physician-assisted suicide is a growing part of this death cult transformation. It should be fought at every step.
I have not seen Dr. Quinlan in years. I do not know how he might feel about my slippery slope argument today.
I still believe I was correct.
Trump Covets the Nobel Peace Prize

From the Brownstone Institute
Many news outlets reported the announcement of the Nobel Peace Prize on Friday by saying President Donald Trump had missed out (Washington Post, Yahoo, Hindustan Times, Huffington Post), not won (USA Today), fallen short (AP News), lost (Time), etc. There is even a meme doing the rounds about ‘Trump Wine.’ ‘Made from sour grapes,’ the label explains, ‘This is a full bodied and bitter vintage guaranteed to leave a nasty taste in your mouth for years.’

For the record, the prize was awarded to María Corina Machado for her courageous and sustained opposition to Venezuela’s ruling regime. Trump called to congratulate her. Given his own attacks on the Venezuelan president, his anger will be partly mollified, and he could even back her with practical support. He nonetheless attacked the prize committee, and the White House assailed it for putting politics before peace.
He could be in serious contention next year. If his Gaza peace plan is implemented and holds until next October, he should get it. That he is unlikely to receive it is more a reflection on the award than on Trump.
So He Won the Nobel Peace Prize. Meh!
Alfred Nobel’s will stipulates the prize should be awarded to the person who has contributed the most to promote ‘fraternity between nations…abolition or reduction of standing armies and…holding and promotion of peace congresses.’ Over the decades, this has expanded progressively to embrace human rights, political dissent, environmentalism, race, gender, and other social justice causes.
On these grounds, I would have thought the Covid resistance should have been a winner. The emphasis has shifted from outcomes and actual work to advocacy. In honouring President Barack Obama in 2009, the Nobel committee embarrassed itself, patronised him, and demeaned the prize. Obama’s chief qualification was who had preceded him as president: the prize was a one-finger send-off to President George W. Bush.
There have been other strange laureates, including some prone to waging war (Henry Kissinger, 1973), some tainted by association with terrorism (Yasser Arafat, 1994), and some honoured for contributions to fields beyond peace, such as planting millions of trees. Some laureates were subsequently discovered to have embellished their records, and others proved to be flawed champions of the human rights that had won them the treasured accolade.
Conversely, Mahatma Gandhi never got the prize, not for his contributions to the theory and practice of non-violence, nor for his role in toppling the British Raj as the curtain-raiser to worldwide decolonisation. The sad reality is how little practical difference the prize has made to the causes it espoused. It brings baubles and honour to the laureates, but it has lost much of its lustre as far as results go.
Trump Was Not a Serious Contender
The nomination processes start in September and nominations close on 31 January. The five-member Norwegian Nobel committee scrutinises the list of candidates and whittles it down between February and October. The prize is announced on or close to 10 October, the date Alfred Nobel died, and the award ceremony is held in Oslo in early December.
The calendar rules out a newly elected president in his first year, with the risible exception of Obama. The period under review was 2024. Trump’s claims to have ended seven wars and boasts of ‘nobody’s ever done that’ are not taken seriously beyond the narrow circle of fervent devotees, sycophantic courtiers, and supplicant foreign leaders eager to ingratiate themselves with over-the-top flattery.
Trump Could Be in Serious Contention Next Year
Trump’s 20-point Gaza peace plan falls into three conceptual-cum-chronological parts: today, tomorrow, and the day after. At the time of writing, in a hinge moment in the two-year war, Israel has implemented a ceasefire in Gaza, Hamas has agreed to release Israeli hostages on 13-14 October, and Israel will release around 2,000 Palestinian prisoners (today’s agenda). So why are the ‘Ceasefire Now!’ mobs not out on the streets celebrating joyously instead of looking morose and discombobulated? Perhaps they’ve been robbed of the meaning of life?
The second part (tomorrow) requires Hamas demilitarisation, surrender, amnesty, no role in Gaza’s future governance, resumption of aid deliveries, Israeli military pullbacks, a temporary international stabilisation force, and a technocratic transitional administration. The third part, the agenda for the day after, calls for the deradicalisation of Gaza, its reconstruction and development, an international Peace Board to oversee implementation of the plan, governance reforms of the Palestinian Authority, and, over the horizon, Palestinian statehood.
There are too many potential pitfalls to rest easy on the prospects for success. Will Hamas commit military and political suicide? How can the call for democracy in Gaza and the West Bank be reconciled with Hamas as the most popular group among Palestinians? Can Israel’s fractious governing coalition survive?
Both Hamas and Israel have a long record of agreeing to demands under pressure but sabotaging their implementation at points of vulnerability. The broad Arab support could weaken as difficulties arise. The presence of the internationally toxic Tony Blair on the Peace Board could derail the project. Hamas has reportedly called on all factions to reject Blair’s involvement. Hamas official Basem Naim, while thanking Trump for his positive role in the peace deal, explained that ‘Palestinians, Arabs and Muslims and maybe a lot [of] people around the world still remember his [Blair’s] role in causing the killing of thousands or millions of innocent civilians in Afghanistan and Iraq.’
It would be a stupendous achievement for all the complicated moving parts to come together in stable equilibrium. What cannot and should not be denied is the breathtaking diplomatic coup already achieved. Only Trump could have pulled this off.
The very traits that are so off-putting in one context helped him to get here: narcissism; bullying and impatience; a bull-in-a-china-shop style of diplomacy; indifference to what others think; dislike of wars and love of real estate development; bottomless faith in his own vision, negotiating skills, and ability to read others; personal relationships with key players in the region; and credibility as both the ultimate guarantor of Israel's security and someone prepared to use force if obstructed. Israelis trust him; Hamas and Iran fear him.
The combined Israeli-US attacks to degrade Iran's nuclear capability underlined the credibility of threats of force against recalcitrant opponents. Unilateral Israeli strikes on Hamas leaders in Qatar highlighted to uninvolved Arab states the very real dangers of continued escalation amidst the grim Israeli determination to rid itself of Hamas once and for all.
Trump Is Likely to Be Overlooked
Russia has sometimes figured in the politics of the Nobel Peace Prize. The mischievous President Vladimir Putin has suggested Trump may be too good for the prize. Trump's disdain for and hostility to international institutions, and his assaults on the pillars of the liberal international order, would have rubbed Norwegians, among the world's strongest supporters of rules-based international governance, net zero, and foreign aid, the wrong way.
Brash and public lobbying for the prize, like calling the Norwegian prime minister, is counterproductive. The committee is fiercely independent. Nominees are advised against making the nomination public, let alone orchestrating an advocacy campaign. Yet, one laureate is believed to have mobilised his entire government for quiet lobbying behind the scenes, and another to have bad-mouthed a leading rival to friendly journalists.
Most crucially, given that Scandinavian character traits tip towards the opposite end of the scale, it is hard to see the committee overlooking Trump's loud flaws: vanity, braggadocio, and a lack of grace and humility. Trump supporters discount his character traits and take his policies and results seriously; haters cannot get past the flaws to evaluate policies and outcomes seriously. No prizes for guessing which group the Nobel committee is likely to belong to. As is currently fashionable to say when cancelling someone, Trump's values do not align with those of the committee and the ideals of the prize.