
The Road Not Taken
By Philip Mella
Pikes Peak Courier

A Brief History of Governance

It’s convenient, if misinformed, to think America’s system of governance evolved
serendipitously from the minds of a few unquestionably enlightened men, our Founding
Fathers. In truth, there’s a discernible thread of evolutionary thought in how nations
designed and developed their governments that directly influenced our founders.
One of the oldest sets of governing laws is the Code of Hammurabi, which dates to
1754 BC, in ancient Mesopotamia. It lists 282 laws, with punishments scaled to the
severity of the offense, covering everything from wages and terms for commodity
exchanges to contractual liability, inheritance, divorce, and paternity.
Besides belying the notion that our distant forebears inhabited chaotic and unstructured
societies, the code bears a remarkable affinity to our own laws. It also underscores
the fact that people are drawn to and support strong, enlightened leaders whose rulings
provide structure, security, and predictability to their lives.
In the Western and Eastern Roman Empires, we see the emergence of the foundation
of Western civil law, from the Twelve Tables (c. 449 BC), which codified the citizens’
rights and duties, to Justinian’s Corpus Juris Civilis (“Body of Civil Law,” 530 AD), which
formed the bulwark of jurisprudence. These constituted Western Europe’s legal and
civil infrastructure through the 18th century, and deeply informed the thinking of
America’s Founders.
That blueprint of advanced civic institutions was adopted by a truly enlightened ruler,
Charlemagne, the first Holy Roman Emperor, who united most of Europe with his
reforms, from monetary and ecclesiastical to cultural and governmental. Anglo-Saxon
law in England, which prevailed from the 6th century until the Norman Conquest (1066),
saw the transition from tribal kinship rules to guilds and townships with codified
authority.
We turn next to the Magna Carta, the 1215 foundational document that informed both
English and American constitutional law. It was the result of protracted abuses and led
to a vital rebalancing of the relationship between the monarch and the barons, which
indirectly provided more accountable representation to ordinary people.
The Thirty Years’ War, triggered by religious tensions, devastated Europe but led in
1648 to the Peace of Westphalia, the cornerstone of our modern understanding of
national sovereignty.
But it was our Founding Fathers, who studied the Greeks and Romans, and the writings
of such prescient thinkers as John Locke—whose intellectual fingerprints are on our
founding documents—that established our unprecedented system of checks and
balances, the rule of law, and our hallowed liberties.
In retrospect we can clearly see how subsequent nations advanced the work of their
predecessors: the precocity of Hammurabi’s Code, the ingenuity of Roman
jurisprudence, Charlemagne’s bold reforms, Anglo-Saxon law, and the enlightened
Magna Carta all informed our Founders’ thinking. It’s a legacy of which we can be
proud, and a reminder that until the advent of America, untold millions lived and died in
fundamentally unjust nations.

Artificial Intelligence: Blessing or Bane?

Unlike most great technological advancements, the advent of Artificial Intelligence
(AI) raises unprecedented warnings. One of the world’s premier physicists, the late
Stephen Hawking, darkly observed that AI could easily outperform humans and become
“a new life form.” In an interview with Wired magazine, he said, “I fear that AI may
replace humans altogether. If people design computer viruses, someone will design AI
that improves and replicates itself.”
The father of modern computing, Alan Turing, posited that “if a human could not
distinguish between responses from a machine and a human, the machine could be
considered intelligent.” AI research began in the mid-1950s at Dartmouth College, and
early programs astonished the world by solving simple algebraic problems and proving
logical theorems.
Decades later, so-called “expert systems” appeared. These replicated human
reasoning in complex problem-solving using “if-then” rules, but AI purists
dismissed them because they could not learn from their environment. Subsequent
advancements were largely due to Moore’s Law, which observes that the number of
transistors in a dense integrated circuit (i.e., a semiconductor chip) doubles roughly
every two years; that law has since been deemed outmoded.
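For readers curious how such “if-then” systems work, here is a minimal sketch of a
rule engine, written in Python; the rules and facts are hypothetical illustrations, not
any particular historical system.

    # Minimal sketch of an "if-then" expert system: each rule pairs a
    # condition with a conclusion, and the engine fires the first rule
    # whose condition holds. Rules and facts are hypothetical examples.
    rules = [
        (lambda facts: "fever" in facts and "cough" in facts, "possible flu"),
        (lambda facts: "fever" in facts, "possible infection"),
    ]

    def infer(facts):
        for condition, conclusion in rules:
            if condition(facts):      # the "if" part
                return conclusion     # the "then" part
        return "no conclusion"

    print(infer({"fever", "cough"}))  # prints: possible flu

Moore’s Law, by contrast, is simple arithmetic: n doublings multiply the transistor
count by 2^n, so five two-year doublings yield a roughly 32-fold increase per decade.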
In 1997, IBM’s chess computer, Deep Blue, beat reigning champion Garry Kasparov,
which heralded the development of “deep-learning” methods. An apt example is the
Deep Neural Network, a subcategory of the Artificial Neural Network, which learns the
mathematical transformations that turn an input into an output. It uses probability
computations to filter large amounts of data and draw a logical conclusion. It features
prominently in speech and visual recognition software, which have broad commercial
and military applications.
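To make that concrete, here is a minimal Python sketch of a deep network’s forward
pass; the random weights are placeholders standing in for a trained model, and the
layer sizes are arbitrary.

    import numpy as np

    # Minimal sketch of a deep-network forward pass: each layer applies a
    # weight matrix (the learned "mathematical transformation") followed
    # by a nonlinearity, and a final softmax turns raw scores into
    # probabilities. The weights are random placeholders, not a trained model.
    rng = np.random.default_rng(0)
    layers = [rng.normal(size=(4, 8)), rng.normal(size=(8, 3))]

    def forward(x):
        for w in layers[:-1]:
            x = np.maximum(x @ w, 0.0)        # hidden layer with ReLU
        scores = x @ layers[-1]               # output layer scores
        exps = np.exp(scores - scores.max())  # softmax: scores to probabilities
        return exps / exps.sum()

    print(forward(np.ones(4)))  # three class probabilities summing to 1

In a real system the weights are not random but learned from data, which is precisely
the ability the early expert systems lacked.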
Although these advances are beneficial in everything from medicine to mechanization,
there’s an unavoidably threatening aspect to AI that’s recently become apparent. To
wit, we’re on the cusp of developing machines that so closely replicate human cognitive
functions, including consciousness, that ethical concerns must be examined.
Those include software deliberately programmed with morally reprehensible capabilities.
The notion of Artificial Moral Agents was introduced by Wendell Wallach in his book
Moral Machines, in which he asks whether software designers should be constrained
from developing morally injurious programs. This obviously raises First Amendment
considerations.
Equally unsettling are autonomous systems that can act with machine intelligence
independent of human restraints. Charles T. Rubin, a political scientist, has written of
his concern that as the sophistication of autonomous systems rises, the risk that “any
sufficiently advanced benevolence may be indistinguishable from malevolence”
commensurately increases.
Is there a clear line between human cognition and emotions such as empathy and their
precise replication in a machine? Joseph Weizenbaum, one of the fathers of AI and an
MIT professor who died in 2008, was convinced AI would never be able to replicate
human traits such as sympathy, and he criticized scientists who characterize complex
computer functions as tantamount to human thought processes.
In a truly disturbing twist, Hans Moravec, a robot designer, and his colleagues have
predicted the merging of humans and machines into cyborgs that would be far more
intelligent—and lethal—than either.
Another area of concern is AI’s impact on labor. The Economist suggested that “AI
could do to white-collar jobs what steam power did to blue-collar ones during the
Industrial Revolution.” However, others have observed that automation has often
produced a net increase in employment, owing to somewhat unpredictable downstream
micro- and macroeconomic effects.
We’re clearly in ethically and technically uncharted territory where caution and attention
to the potential for unintended consequences should be paramount. We can only
hope—and pray—that AI will play a positive role in our lives, as well as those of future
generations.

By Philip Mella
Pikes Peak Courier

Can America Survive?

Last week we explored the moral malaise that is suffocating our nation. This week we’ll
examine whether we’ve reached the point of no return. We begin with a 2017 Gallup
poll that found 81 percent of Americans believe our state of moral values is fair or poor.
Social conservatives tend to believe we’re in moral decline, but this poll found 71
percent of social liberals concur.
The Assyrian, Babylonian, and Roman Empires shared a common fate: each
succumbed to forces that led to its demise. Although the forensics of their
downfall are complex and varied, the precursor to their decline can be traced to internal
decay, itself the result of a moral collapse.
In his 1998 book, The Death of Outrage, former Education Secretary William Bennett,
wrote, "National prosperity is largely dependent on good private character. If lying, sloth,
lack of discipline, and personal irresponsibility become commonplace, the national
economy grinds down. The breaking up of families means more foster homes and lower
high school graduation rates. Just as there are enormous financial benefits to moral
health, there are enormous financial costs to moral collapse.”
Historians have documented the trendlines of decline and have defined several stages.
They begin with a rejection of God, then proceed to a fracturing of the traditional family,
on to the degradation of human life (abortion), thence to base and immoral
entertainment, on to violent crime, a declining middle class, and an insolvent
government. Then government itself thrives on society’s moral decay, growing ever
more powerful and autocratic. The final stage is a failure of the people to understand
what’s happening due to an incremental debasement of traditional education and moral
discipline.
It’s clear that through our acquiescence to evil we have willfully and obtusely checked
every box on this list. But where exactly are we on the continuum and can the trend be
reversed? Although I’m generally optimistic, I’m also a realist, and I see little or no
evidence of a nationwide moral awakening; in fact, quite the opposite. We can blame
politicians, but they merely mirror the moral and cultural contours of the nation, because
most of them are drawn to the siren song of power.
We’re in an uncharted spiritual desert, sans the historical moral markers that guided our
behavior. We abort over a million innocent souls a year, we embrace immoral behavior
while wondering why our youth seem lost, and the rules for civic engagement have
been fundamentally rewritten in a code that borders on barbarism.
Yet our nation is economically sound, if spiritually bereft, which when coupled with our
preeminent military, means there’s little risk of cataclysmic failure. More likely, we and
future generations will become acclimated to a morally compromised nation which
includes a culturally coarsened landscape reflected in the dark world of our anarchical
social media.
In his book, The End of Christendom (1980), Malcolm Muggeridge, a Christian convert,
wrote: “Civilizations, like every other human creation, wax and wane…there can never
be a lasting civilization any more than there can be a lasting spring or lasting happiness.
It’s in the nature of man and of all that he constructs to perish, and it must ever be so.
The world is full of the debris of past civilizations…Man is fated to exist in the no man’s
land between the perfection he can conceive and the imperfection that characterizes his
own nature and everything he does.”
I understand that this column is at once somber and sobering. But it also realistically
characterizes the challenges we must face, if this exceptional nation, under God, has
any hope of survival.

Pikes Peak Courier
Can we learn the lessons of history?

Because it is susceptible to wide and often subjective interpretation, the study of history
has always been controversial. The quandary known as the Thucydides Trap is an
excellent case in point: should a nation take preemptive action against a rising power,
fighting a war that could deplete its resources, or stand by and risk losing a defensive
war later?
Monumental historical events can be characterized by the antipodal twins of humility
and hubris, the former leading to deeper strategic insight, the latter to myopic blunders.
Moreover, our view of history is circumscribed by the horizon of our understanding,
which is filtered through the hazy lens of personal bias.
The study of history reveals certain patterns in the causation of landmark events, from
the fall of empires and republics to the conflagration of war. Because of that, historians
question why we repeat the mistakes of the past. As has been observed, “History
doesn’t repeat itself, but it does rhyme.”
The misappraisal of evolving events recalls Hegel’s maxim that the Owl of Minerva only
flies at sunset, meaning we can only fully understand events belatedly. My lifelong
study has divined three broad catalysts for major historical events: hegemony (the
desire to expand power), revanchism (revenge), and irredentism, the reclaiming of
ethnic lands that arguably belong to a nation (e.g., Russia’s annexation of Crimea).
The question is whether we can more productively script unfolding events or whether
human foibles unavoidably consign us to a deterministic fate, until, in the words of
Macbeth, “the last syllable of recorded time.” Let’s examine the illustrative example of
the Fall of the Roman Empire.
The academic consensus is that the precursors of Rome’s decay were embedded in its
remarkable success. But many historians focus on internal civic decay, overlooking the
cultural rot that took root well before its fall was evident. Cultural confidence is the
bulwark against not only kinetic invasions but also the divisiveness that incrementally
transforms collective optimism into abject pessimism.
The corruption behind the empire’s incipient deterioration led to the failure of its
far-flung outposts, the monetary devaluations that fueled inflation, and, critically, the
vulnerabilities that encouraged the Visigoths to invade from the north. Events conspired
with a succession of parochial leaders who failed to appreciate the symptoms of
weakness that were clearly taking hold of the empire’s physical defenses.
Fast forward to World War One and the tangled multinational treaty obligations that
papered over the nations’ underlying historical animosities and competitive aspirations,
effectively guaranteeing war. This dovetailed with the epic failure of military
leadership such as British Field Marshal Lord Haig, under whom two million soldiers
perished. The Treaty of Versailles, of which French Marshal Ferdinand Foch presciently
said, “This is not a peace. It is an armistice for twenty years,” foreshadowed the horrors
of World War Two. Subsequent efforts to impose a permanent peace, with arguably
more enlightened treaties such as Locarno and the Kellogg-Briand Pact, as well as the
League of Nations, were ignored by the belligerents, which led again to war.
Could that war have been prevented if the French had stopped Hitler when he invaded
the Rhineland in 1936? Hitler said if he had seen one French bayonet he would have
retreated. History is replete with successful and failed preemptive interventions but
quelling a gathering storm in its infancy is almost always wise.
In conclusion, world history rhymes with the cadence of fear that threats may jeopardize
our security, a fear best tempered by the system of checks and balances enshrined in
the ingenious structure of America’s tripartite government.

By Philip Mella

The Death of History
Pikes Peak Courier

Consistent with post-modernism’s disdain for absolutes and for fidelity to objective
analysis, many contemporary historians bring a studied agnosticism to the lessons of
history. Just as misguided, they filter the evidence through their politically biased views.
These jaundiced approaches to historical analysis have unavoidably found their way
into the cultural groundwater of our thinking.
Beginning in the late 1950s a cultural revolution took root. It was anti-establishment in
character, featuring a skewed understanding of history blended with a hubris worthy of
Greek tragedy. From this were bred such notions as Situational Ethics, which insists
that ethical decisions should be driven by circumstances, not inviolable moral
absolutes.
From our public education system to popular culture, the only absolute is the supremacy
of relativism. Liberals in particular appear intent on convincing us that right and wrong
are in the eyes of the beholder, that we all have the right to realign historical truths to
our unique perspective, or, more candidly, to suit our convenience.
Questioning the veracity of historical events can be intellectually healthy, but only after
objectively studying the history in question. But concurrent with this revolution was a
willful rejection of historical truths, underwritten by a consensus that the record is
inherently untrustworthy. Thought leaders concluded that our absolutes and truths,
those that inform our Constitution as well as much of Western civilization, are merely
convenient conventions intended to shape public opinion and control social behavior.
This ill-conceived and malign thinking has infected history books and curricula from
grade school through college, and despite the unprecedented amount of information at
our fingertips, knowledge of history is rapidly fading. There’s an effort by the liberal
cognoscenti to nullify historical truths that conflict with presumably enlightened
modernism, which they conflate with a preoccupation with identity politics and class.
Collateral damage includes parents questioning their own authority, and teachers and
administrators faithfully reflecting this agnosticism regarding right and wrong; our
children’s educational and moral development suffers as a result.
Statues of universally recognized great political leaders are under assault because
these historical figures lived in times that accepted scourges, such as slavery, that were
subsequently rejected. But rather than rejecting or sanitizing these figures, we should
understand their flaws in light of the standards accepted in their day, while studying
their ingenious contributions to our nation.
There’s also a disturbing intellectual unanimity in most university history departments
that excoriates Western civilization. That is likely related to the fact that the ratio of
liberal to conservative professors is about seventy to one. The tacit enforcement of
liberal orthodoxy as settled wisdom is antithetical to a liberal arts education.
Speech codes are emerging at schools and universities as administrators become more
dismissive of basic Constitutional rights. They selectively target language deemed
offensive, which amounts to a prior restraint on the free exchange of ideas.
The book “Academically Adrift: Limited Learning on College Campuses” reports a
widespread lack of intellectual rigor at our universities. Debra Humphreys, vice
president for communications and public affairs at the Association of American
Colleges and Universities, said the book shows that “you can accumulate an awful lot of
credits and not learn anything.”
When our collective history is extinguished or distorted beyond recognition, it allows
what H.G. Wells described as control by a credentialed elite. This “emergent class of
capable men,” Wells wrote, will assume the task of “controlling and restricting…the
non-functional masses.” This new elite, he predicted, would replace democracy with “a
higher organism” of what he called “the New Republic.”
We may not be on the precipice of that event, but the evident trends should be deeply
concerning to all.


By Philip Mella
Is God Listening?

One of the paradoxes of our modern age is that as our knowledge and understanding of
the universe have expanded, so has the number of people who profess no faith in God.
Although there are many intervening and confounding variables, it reminds us that
paramount among sins is that of pride, for it’s the gateway through which all other
human failings enter.
Against that somewhat guarded preamble, and in light of the monstrous horrors of wars,
dictators, and despots in the 20th century, modern man has often questioned whether
God listens to our prayers. It’s a fair question, which as we’ll see, may elicit some
surprising answers.
For background, we must examine how our culture has inadvertently or otherwise led to
a weakening of faith. Despite its aspirations of scrupulous objectivity, our federal
judiciary has generally reflected cultural trends, responding with ever greater fidelity to
society’s warp and woof. Therefore, in the past half-century we’ve witnessed the
judiciary’s overreach relative to the Establishment Clause of the First Amendment—the
so-called ‘separation of church and state’.
The outcome is the incremental banishment of God and Christian symbols, such as the
crucifix, from the proverbial public square, not to mention our schools. Religious thought
leaders call that the secularizing of society, which some celebrate, and others see as
the demonstrable cause of the moral decay that has clearly taken root. It’s been an
imperceptible, slow process, which means it’s all the more insidious. But when moral
certainty—which is implausible without faith—is removed, confusion and its close ally,
evil, fill the vacuum.
Returning to our deeper understanding of the universe, in a recent editorial on Apollo
11, columnist George Will wrote, “The universe, 99.9 percent of which is outside
Earth’s atmosphere, is expanding at 46 miles per second per megaparsec. (One
megaparsec is approximately 3.26 million light years.) This cooling cinder called Earth,
spinning in the darkness is a minor speck of residue from the Big Bang, which lasted
less than a billionth of a trillionth of a trillionth of a second 13.8 billion years ago.” Will, a
professed agnostic, predictably fails to consider whether the hand of God created the
Big Bang.
Regardless, the human imagination simply can’t comprehend this vastness, which
encourages us to question whether an omniscient God is overseeing it all. Psalm 8
captures that sense of forlornness by asking, “What is man that thou art mindful of
him?” That returns us to the question of whether God listens to us when we pray.
The question is complicated by the fact that we humans are forever yearning: we
instinctively pray for favorable outcomes, and when we’re disappointed, we naturally
question whether God is, in fact, with us. The problem lies in the unavoidable premise,
which is that we use the metrics—i.e., the tools—we best understand. That is, when we
pray for a given outcome, we constantly check for any hint that it’s trending the way we
hoped—that is to say, on our terms.
Since spiritual humility suggests we often don’t know what’s best for us, it’s wise to use
a different approach. A clue is that Hebrews 11 tells us, “faith is the assurance of things
hoped for, the conviction of things not seen.” Faith is a kind of translator of God’s will,
but its code is purposely written in the language of prayer. If God doesn’t answer us it
doesn’t mean He’s not listening, it may mean we’re not hearing what He’s telling us. Or,
that we think we know better than God what’s good for us. Ergo, the sin of pride.

By Philip Mella
April 15, 2019
Serious Problems with ‘Red Flag’ Law

Governor Jared Polis recently signed into law the controversial “red flag” gun bill
allowing the seizure of a person’s guns without due process if that individual is deemed
a threat to him- or herself, or others. The Democrat-sponsored law allows family,
household members, or law enforcement to petition a court for an “extreme risk
protection order” (ERPO).
After the guns are seized the individual will be given a hearing within 14 days to
determine if a longer-term order is warranted for up to 364 days. The court can order a
mental health evaluation, as well as mental health treatment. The law places the
burden of proof on the gun owner to prove that he or she no longer poses a risk in order
to regain possession of the firearms. The presumption of innocence is thereby abrogated.
Democrats believe the law will save lives and that it therefore supersedes issues of due
process. Republicans are convinced it’s an unconstitutional breach of our rights under
the Second, Fifth and Fourteenth Amendments. That it’s politically volatile is supported
by the fact that it has triggered recall threats as well as demands by state Attorney
General Phil Weiser that sheriffs uphold the law or resign.
Currently, 38 of Colorado's 64 counties, including Teller, have declared their opposition,
and 35 have passed formal resolutions in opposition. Many of the resolutions declare
their jurisdictions are Second Amendment "sanctuary" or "preservation" counties, and
pledge not to allocate resources to enforcement of the law.
I spoke with Teller County Sheriff Jason Mikesell, who has said he would risk being
jailed before he would enforce a court order to seize a person’s guns without due
process. “This has a number of serious problems, starting with who substantiates the
allegations of the accuser? We have family members who come in and make
allegations which often turn out to be false. Judges will be required to make decisions
based on an accuser’s perception. And, with this law, the accused has no
representation or recourse.”
In addition, the sheriff talked about the financial burden of an accused who must hire a
psychiatrist and an attorney to defend himself. “That can cause significant financial
hardship for someone merely facing an accusation. And, what if my deputies are
searching for guns and find apparently illegal drugs?”
House Minority Leader Patrick Neville says the bill would discourage citizens from
seeking help because of the stigma associated with mental illness. “No one should feel
they have to choose between their guns and getting the help they need,” Neville said in
a statement.
Many critics of this law have argued that its focus is misplaced, that it targets weapons
instead of treating people with mental illness. A review of our civil commitment laws
highlights that issue. There are two main legal principles that underlie the state’s
interest in the process of civil commitment. The first is parens patriae, a Latin term
meaning “parent of the country.” It refers to a doctrine from English common law that
assigns to the government a responsibility to intervene on behalf of citizens who cannot
act in their own best interest. The second is police power, the state’s responsibility to
protect the safety and interests of its citizens.
The deinstitutionalization movement of the past half-century has resulted in a patchwork
of ineffective laws with many people suffering chronic mental illness in community
settings without psychiatric support, some of whom perpetrate heinous crimes. This law
will undoubtedly face court challenges, but our Democratic legislators are in uncharted
territory because an ERPO nullifies due process under the law.

