This is hilarious. Syrian boy brings clock to school. From Duffle Blog.
The post A different boy with a different clock gets treated differently appeared first on Ricochet.
I’ve been pondering an item I saw in the Wall Street Journal the other day about states creating “automatic IRAs” for residents who don’t have a retirement plan at work. So far only three states – California, Illinois, and Oregon – have approved such programs, and none has yet gone into operation. The general idea is that workers without any other retirement plan would have an automatic payroll deduction into an IRA, but they could opt out if they want.
In general, I like to see the states moving into areas that are thought (incorrectly) to be federal concerns. The federal social security system has no legitimate constitutional basis – it was upheld in the midst of the New Deal by an FDR-friendly majority of the court. Justice Cardozo’s opinion was based mainly on the idea that Social Security was good policy given the “crisis” of the Depression. The policy argument looks a little thin now, with the Social Security Trust Fund facing depletion. That’s not to mention the inherent unfairness of paying into a hypothetical retirement account that your heirs cannot inherit.
Granted, a number of factors make one suspicious of the current state initiatives. First, the fact that California and Illinois are among the early adopters suggests that their legislatures are looking to plunder retirement savings in the manner of the US Congress. Second, the Obama Administration itself supports the state initiatives, which is normally a good enough reason for opposition.
But the fact that state retirement programs are blessed by Democrats could be an opening for a conservative policy proposal to shift retirement savings from Uncle Sam to the states. Policy wonks should be able to devise a system in which workers in states with a state-based retirement plan get a full or partial exemption from Social Security taxes, or perhaps a tax credit against Social Security taxes. For younger workers in those states, Social Security could be phased out entirely in favor of the state-based system.
In order to qualify, the state-based system would have to offer privately-managed, fully-portable retirement accounts. That shouldn’t be difficult – isn’t that what state 529 college-savings plans already do? In New York, for example, the 529 plan offers only funds managed by Vanguard, and the money in those funds is – as far as I know – completely off-limits to the pols in Albany. Basically, state-based retirement accounts should be like 529 plans, except that workers are automatically enrolled and would have to affirmatively opt out if they don’t want to participate.
Granted, some state systems would be poorly designed, but the beauty of federalism is that the competition for businesses and taxpayers would create powerful incentives to get it right. Besides, almost anything would be an improvement over Social Security. I also appreciate the serious intellectual question of whether the government should have any role whatsoever in retirement planning. However, given the political reality, I don’t think we’ll ever get government entirely out of the game, so why not a competitive 50-state market for automatic retirement accounts?
Watch as a Suicide Bomber kills his own mourners (as his explosive vest detonates):
The post Allen West aptly sums this up: “Yes, God Is Great.” appeared first on Ricochet.
“The American project, as originally defined, is dead,” says Dr. Charles Murray.
The AEI scholar and accomplished author joins Ben on today’s episode to discuss his new book, By the People: Rebuilding Liberty Without Permission.
Murray believes that an emasculated limited government has allowed the modern regulatory state to run amok, steamrolling the liberties of everyday Americans. But it’s not all bad. He says there’s reason for hope.
Murray argues that by litigating rather than legislating, conservatives and libertarians can roll back the administrative state. Reasoning that “the regulatory apparatus cannot withstand a regulatory assault,” the scholar proposes an innovative approach to reclaiming rights lost: legal defense funds.
“I want to make government into an insurable hazard,” he quips.
Click here to subscribe or use the embedded link below to listen.
When presidential candidate Carly Fiorina, during the second GOP debate on Wednesday, referenced video footage of a baby who survived an abortion and was filmed “on the table, its heart beating, its legs kicking,” the media called her a liar.
Amanda Marcotte of Slate said that the video didn’t exist:
It was quite a performance, and it opens the question of what Fiorina was inhaling before she watched those videos. There is nothing in the videos made by CMP, either in the edited or full-length versions, that has anything approaching images of legs kicking or hearts beating.
Sarah Kliff of Vox called it “pure fiction:”
But the things Fiorina describes — the legs kicking, the intact “fully formed fetus,” the heart beating, the remarks about having to “harvest its brain” — are pure fiction.
Refinery 29 has also claimed that the videos don’t exist in a piece entitled: “Fiorina’s Planned Parenthood Comment Was Graphic, Upsetting — And Totally Made-Up.”
Fusion also joined the media’s bandwagon of denial and nearly copied Planned Parenthood’s talking points in their report:
To be clear, Fiorina, like the other Republicans attacking Planned Parenthood, doesn’t have her facts straight. None of the videos have anyone talking about “harvesting” brains. The supposedly macabre video she’s talking about was highly, selectively edited by right-wing activists.
In the video in question, a technician is talking about harvesting the brain of an alive, fully formed fetus. While she tells her story, there is footage of another baby of roughly the same gestational age as the one whose brain she harvested. This baby is seen still kicking and its heart still beating.
While it is obviously not the same baby as the one whose brain she harvested, the footage helps viewers understand what a 19-week-old baby looks like while hearing the testimony of an ex-employee who harvested brains from babies of the same age. Illustrating stories with appropriate images is a common journalistic technique, one used by all media outlets.
Others assert that the baby was stillborn, a falsehood that goes back to The Hill, which made this claim in a story in August:
The anti-abortion-rights group targeting Planned Parenthood is acknowledging that its most recent video used an image of a stillborn baby that was made to look like an aborted fetus. The Center for Medical Progress posted a new link on its video late Thursday, adding that one of the images was actually a baby named Walter Fretz, born prematurely at 19 weeks.
This is inaccurate on multiple counts. The video shows two different babies, neither of whom was stillborn. One was an image of Fretz, who was not stillborn, but was born prematurely at 19 weeks and died in his parents’ arms. This image of Fretz appears at the 8:59 mark of the video, where he is wrapped in a blanket and has a clip on his umbilical cord to keep it from getting infected.
Earlier in the video, around the 5:56 mark, there is footage of another baby boy around the same gestational age as Fretz who was not stillborn either, but a baby who survived an abortion and was left in a metal bowl to die. In the footage, he kicks his legs and twitches his arms during the final moments of his life, and a pair of forceps lies beside him. The footage was provided by The Center for Bio-Ethical Reform, a pro-life organization headquartered in Lake Forest, California.
The Hill’s claim is inaccurate, as neither of these babies was stillborn. Both were born alive and died outside the womb. One was a survivor of an abortion who was left to die of exposure in a metal bowl at the abortion clinic, while the other was born to a mother who wanted him, and died in her arms.
The baby seen in the footage at the 5:56 mark was indeed taken from inside an abortion clinic, according to the owner of the footage. Gregg Cunningham, executive director of The Center for Bio-Ethical Reform, the organization that obtained the footage and provided it to CMP, said in a statement to The Federalist:
“The video clip we provided to CMP depicted an intact delivery abortion. It was filmed at an abortion clinic. It was not a miscarriage. Mothers don’t go to abortion clinics to miscarry. Had this case been a miscarriage, the mother would have presented at a hospital and her baby would have been rushed to an Isolette for appropriate neonatal care — not abandoned to writhe and eventually expire in a cold, stainless steel specimen vessel. As regards the organizational affiliation of the abortion facility in which this termination was performed, our access agreements forbid the disclosure of any information which might tend to identify the relevant clinics or personnel with whom we work. Preserving confidentiality is vital to future clinic access. I can, however, assure you that the footage in question is not anomalous. It is representative of the frequent outcomes of many late term intact delivery terminations performed at clinics of all organizational affiliations.”
The media have consistently failed to cover the Planned Parenthood footage, and now they are covering up the truth. The reality is that babies of the same gestational age are having their organs harvested every day. These videos feature graphic footage of abortionists mangling babies to harvest organs to sell. They feature abortionists admitting that babies often survive those abortions. They show high-level Planned Parenthood officials encouraging this organ-harvesting scheme, acknowledging that it is happening, and attempting to skirt scrutiny of it.
The videos Carly Fiorina referenced do indeed exist, and they reveal the barbaric realities of the abortion industry.
Enthusiastic fans stream into Madison Square Garden. The atmosphere inside the building is thick with anticipation. Strobe lights and Jumbotron clips illuminate eager spectators’ faces. Tonight’s event has all the trappings of a Knicks game, but the garden isn’t showcasing an athletic competition. Instead, it’s hosting the League of Legends North American Summer Finals, a live video-game competition attended by thousands of enthusiastic fans.
Aside from a brief middle-school infatuation with “Starcraft,” a spiritual predecessor to “League of Legends” that is best understood as the Lucy to the newer game’s fully articulated Homo sapiens, I am utterly unfamiliar with the world of competitive video gaming. An acquaintance helped organize the event and offered us tickets, and her description was intriguing enough to get us to show up.
Before arriving, a cursory Google search revealed that competitive gaming is a popular and growing subculture, complete with its own celebrities, customs, and even a spiritual homeland (South Korea). As we enter the stadium, a quick glance at our surroundings reveals that it has also found an impressively large and boisterous audience for its live events.
The “League of Legends” championship pits two teams of five gamers against each other in a digital Battle Royale. Spectators observe the clash between the contestants’ computer-controlled avatars, all of whom boast special abilities and seem vaguely inspired by European or Asian mythology, on several massive screens. The gamers’ characters fight over a symmetrical map dotted with defensive outposts, exotic terrain features, and monsters that can be dispatched for bonuses.
The ultimate objective is to destroy the opposing team’s base, which is where contestants’ heroes begin the match and where they are (eventually) reincarnated after dying. To the uninitiated, watching the competition on-screen is a bit like hovering over an impossibly busy Looney Tunes brawl.
The experience of walking into a live video-game tournament is almost indistinguishable from going to a National Basketball Association game. Vendors—including a “League of Legends”-themed tattoo parlor—line the halls leading up to our section. The ushers, long inured to the full range of human spectacle, calmly and competently direct guests to their appointed seats. The ubiquity of hot dogs, popcorn, and overpriced beer suggests the “League of Legends” culinary experience isn’t appreciably different from your average Knicks game.
The demographics of the event are about what you’d expect. Female spectators, some sporting elaborate costumes, are in attendance, but the line to the men’s room spills out onto the concourse while the ladies’ room is less heavily trafficked. Seven of the ten competitors on stage are of Asian descent. Before the match kicks off, an enthusiastic MC interviews this season’s North American League of Legends MVP with the help of a Korean translator.
By the time the match begins, the gamers are hunched over a row of computer terminals on a stage at one end of the stadium. The terminals are illuminated by team colors and feature front-facing displays that alternate between the competitors’ screen names—“Lustboy” was a particular favorite—and pictures of their chosen avatars.
The gamers are equipped with massive headsets that allow them to communicate with their teammates; despite sitting next to each other, they don’t exchange so much as a glance during competition. In “Ender’s Game,” another middle-school favorite of mine, Orson Scott Card imagined teenage strategists remotely directing interstellar fleets via ansibles, console-like devices that eerily anticipated today’s gaming terminals. Unsurprisingly, one of Israel’s top missile-defense operators has admitted to being weaned on Warcraft, another computer game that prizes fast-twitch reflexes and tactical improvisation. League of Legends might just be training our next generation of ace drone pilots.
If you’ve ever watched “SportsCenter” or “Monday Night Football,” the jargon of a live video-game event will seem oddly familiar. Competitive video gaming may be relatively new, but the announcers are already well versed in the hoary clichés of Late American sportese. We are solemnly informed that players “overcame adversity” to reach the finals.
“Internal strife” splintered one contender, resulting in former teammates competing against each other on their game’s biggest stage. On the Jumbotron, a sideline reporter peppers celebrity competitors with the same questions LeBron James fields before every Cavaliers game. Presumably, one of these fresh-faced teenagers will eventually sour on his PR obligations and start taking cues from Gregg Popovich’s surly public persona.
At times, the activity’s newness shines through and the seams begin to show. A gamer commenting from the broadcast booth is surprisingly critical of one contestant’s tactics, eliciting a hail of boos and catcalls from the crowd. Whatever the merits of his analysis, it’s a striking departure from the conventions that govern professional athletes’ carefully-choreographed TV appearances.
Much of the pageantry we associate with live sports is easily adapted to live video gaming. The constant visual and audio barrages so beloved by modern stadiums—the walk-up music, the hype videos, the highlight reels—are well suited to an event that centers on fast-moving, riotously colorful cartoon brawls. At a baseball game, the announcements, advertisements, and exhortations to chant and clap are twenty-first-century practices awkwardly grafted onto a nineteenth-century pastime. At a “League of Legends” tournament, the game itself is already optimized for the modern stadium experience.
Other borrowed elements are best left to the athletes. Clips of a professional linebacker talking about toughness or a hulking power forward striding towards Madison Square Garden are a bit easier to swallow than similar posturing from a teenaged point-and-click artist.
All this pomp and ceremony raises one inevitable question. Why would anyone want to attend an event that can just as easily be enjoyed remotely? Sports fans still have good reasons to buy tickets to a game, even in an age of HDTV and lavish, league-affiliated cable channels. Athletes’ body language, an offensive lineman’s stellar block, or a telling glance between a coach and star player are easily overlooked if you’re not watching the game in person. Ten unexpressive teenagers staring into their computer consoles as the action plays out on a Jumbotron is another thing entirely.
In a pre-event interview, one gamer dutifully answered a question about playing at home versus playing onstage in front of thousands of screaming fans, but from our vantage point, it’s difficult to tell what difference the venue makes. The gamers are utterly absorbed by their terminals, and crowd noise doesn’t seem to penetrate their massive, military-style headgear. As soon as the first match ends, the contestants walk off the stage without so much as a wave to their audience.
Maybe the experience of attending a live event with thousands of fellow enthusiasts will be enough to sustain live gaming, even as the competitors they cheer for remain totally insulated from their surroundings. The possibility brings to mind Iain M. Banks’ Culture series, another science-fiction universe that anticipated the rise of e-sports.
Banks imagined technology where the aural and visual experience of attending a concert is indistinguishable from watching the performance remotely, yet the possibility of witnessing a transcendent event in person was still enough to incite a mad scramble to “be there.” Does watching an animated brawl play out on a stadium Jumbotron qualify as a transcendent live experience? I’m skeptical, but then again, I use a flip phone.
“I feel like I just got back from the future,” says my brother as we exit the garden. Maybe so. Or maybe we’ll look back on all this as a symptom of competitive gaming’s growing pains, an awkward attempt to emulate another recreational activity it had no business copying in the first place.
Pope Francis’s visit to the USA is fast approaching and, for many of us Americans, it cannot come soon enough. I am not Catholic, but the Catholic Church has encouraged me in what I see as an especially dark time here in the United States.
Religion is dying here. If the current trends continue, we are only a generation away from sharing a designation like that of France or Italy or Germany: just another largely secular, post-Christian nation. My generation, the 30- and 20-somethings, is leading our nation’s secularization.
Increasingly, there is a sense here—just like in Argentina and Europe, I’m sure—that religion ought to be a private matter. Within my short lifetime I have seen the way we speak about religion change. What our political leaders used to call “freedom of religion” is now called “freedom of worship.” The former, language borrowed from our Constitution, entails protection for the practice of religion in both public and private. The latter—freedom of worship—indicates a narrower scope for religion, protecting only practices that occur within religious institutions.
Freedom of religion has been diminished just within the last few years. Religious organizations like the Little Sisters of the Poor are increasingly forced to choose between their religious convictions and their desire to help society’s most vulnerable. Religious people who run modest enterprises are being forced to choose to either violate their consciences or endure penalties that will force them into insolvency. If a private citizen chooses principle over practicality, the government can even silence them.
Indications are that religious persecution here will worsen. In a keynote speech at this year’s Women in the World Summit, a frontrunner in the 2016 presidential campaign pledged to expand access to abortion worldwide, which means “deep-seated cultural codes, religious beliefs, and structural biases have to be changed.”
It should be no surprise that Hillary Clinton thinks this way when just this year our country’s highest court changed the definition of marriage—an institution that stood unchanged for millennia.
This is precisely why the Catholic Church continues to be a source of encouragement for me. Unlike my own particular brand of mainstream Protestantism, which seems to be following the winds of cultural and political change, the Catholic Church continues to stand firmly for timeless Christian principles.
Of course, when churches like mine begin selecting the biblical principles to which they want to adhere, all sorts of inconvenient longstanding moral codes can be lost. Most Protestant denominations have become comfortable with divorce and concupiscence, many are firmly pro-choice, and some have even begun to endorse the idea that Christ is not the unique means to salvation.
I appreciate Francis’s pastoral sensibility and emphasis on compassionate evangelization. He has convinced me that he cares deeply about Christians who have lost their way. Like Francis, I am “a sinner whom the Lord has looked upon,” and my own story of radical conversion teems with the sorts of poor people to whom he’s spent a long career ministering.
I am a very unlikely convert to Christianity: I was a secular Jew who spent many years studying postmodern philosophy at a good urban university. Because of my studies, I formed the idea that the only good was a life devoted to fighting the structural iniquities that had kept so many people from achieving their potential.
So after graduation I joined the Peace Corps and ended up in Paraguay, where I taught beekeeping to subsistence farmers. I arrived in my tiny village in eastern Paraguay with a very limited understanding of the indigenous language, a basic knowledge of modern beekeeping practices, and a deep-seated messiah complex. But the Lord was at work even then, even at the height of my hubris. I had to be puffed up like that to know the importance of the humility that would follow.
Oh man, was I humbled.
My ivory-tower understanding of the neo-colonial roots of global poverty left me woefully unprepared for the reality of how poor people actually lived in rural Paraguay. Even the training I had received in the Peace Corps was inadequate. It turned out that most folks were not particularly interested in beekeeping. It was 2002, and the wheat and soy boom had arrived. When I came to town, the Brazilians had already been there for years with their big tractors, innovative implements of modern agriculture, and access to capital. The Brazilians leased land to grow soy and wheat, sharing proceeds with the locals. Compared to the huge profits of three harvests a year, my beekeeping trade seemed like a novel hobby at best.
I was awfully lonely, and my work seemed laughably inadequate. But I did not despair for long. Eventually I met one desperately poor farmer, Felix, who did want to learn to keep bees. Unlike most folks in my area, he owned no farmland, and to provide for his wife and four kids he worked other people’s farms for a pittance. Beekeeping seemed perfect for him: It would provide his family with some sustenance, maybe even something to sell. Best of all, he didn’t need land to do it.
Felix gave me a renewed sense of purpose. I gave him a new trade. We quickly became best friends. Appropriating bees from the wild is hard work, especially in Paraguay where the bees are all of the aggressive Africanized variety (a.k.a. killer bees). It required accessing the hives, then sifting through tens of thousands of angry stinging insects to find the queen. It was a lot like finding the proverbial needle in the haystack.
We would spend long days in the hot sun, canvassing the Paraguayan countryside for bee colonies. During these odysseys Felix and I spent a lot of time talking about life, family, and faith. I quickly learned that Felix’s faith was a very important part of his life and identity.
The way his faith manifested itself was remarkable. Even though Felix was desperately poor, his faith seemed to sustain him. He wasn’t just poor compared to me; no, he was poor even compared to the subsistence farmers who were his neighbors. Yet, remarkably, he was at peace. He truly possessed a peace that surpassed all understanding.
Meanwhile, I couldn’t help but notice the obvious contrast: Even though I had been given every privilege of the Western world, I was never satisfied. In short, I was beginning to see, firsthand, that there are various kinds of poverty: material and spiritual. So I began a journey of discovery. I read the Bible for the first time in my life. I listened to evangelical programs on my little battery-operated shortwave radio. And I continued to spend time with Felix and talk about what I was learning.
When I left Paraguay, I was a Christian. I like to say that my story of conversion is the modern inversion of missionary tradition: young man from rich nation travels to a poor country and is converted by the locals.
Central to the story of my own conversion is a truth that I think has become the most abiding theme of Pope Francis’s pontificate: Materialism is a major impediment to evangelization. This materialism manifests itself both in the philosophical sense—meaning only what we can see is true—and in the cultural sense—meaning we have an unhealthy, sometimes idolatrous relationship with worldly things.
This widespread materialist affliction is a recurring subject of Francis’ pronouncements, writings, and homilies. I don’t know for sure if he came to this understanding of our modern malaise the same way I did—through powerful relationships with those in material poverty—but his enduring affection for the poor and vulnerable and inclination to live simply suggests to me that perhaps he thinks about our postmodern culture in the same way I do.
It is my hope that Francis, as the first-ever pope to address a joint session of Congress, will provide just the sort of forceful rebuke that our wayward culture needs, even though politicians of all stripes will try to claim his statements lend moral credibility to their preferred policies (even when these programs clearly contravene core Christian doctrines about human life).
Government leaders—like those who supervised me when I was in the Peace Corps—will claim that Francis’s emphasis on environmental stewardship affirms the sort of work the Peace Corps does in poor countries like Paraguay. These leaders ignore the important differences between Francis’s ideas about environmental stewardship and their brand of radical environmentalism, which largely guides international development work abroad.
In every Peace Corps office around the world, there are well-worn sets of books with innocuous titles like Se Puede Planear la Familia. I checked out these books and shared them with Felix, explaining—quite patronizingly—how he could avail himself of various scientifically proven methods of contraception. It was a manifestation of the West’s fatal conceit: I did not see people born into poverty for their potential. All I did was calculate a cost. Development workers abroad would be reluctant to deliver the sorts of education programs that we euphemistically call “family planning” and “women’s health” if we had any sense that human beings—no matter how materially poor—are imbued with the Imago Dei.
It speaks volumes about Francis’s compassion for those who are suffering that his first trip back to South America as pope focused on the poorest countries. Of course this is nothing new: My Paraguayan friends tell me Francis spent a lot of time in Buenos Aires ministering to marginalized Paraguayan immigrants who arrived penniless in the Argentinian capital looking for a way to make a living.
Similarly, when he comes to the United States, it has been announced that Francis will continue to demonstrate his solidarity with those in poverty by visiting Catholic Charities in Washington DC and a Catholic school in East Harlem. I also hope that Francis’s focus on those living in material poverty does not come at the expense of the many of us who are materially comfortable but have lost our way spiritually. In my experience, there is a much broader spiritual poverty here than in Latin America.
As an architect of the New Evangelization in Latin America, Francis has led the way for the church to re-engage those who have closed their hearts and minds to the Gospel. Now that his evangelistic purview is the whole world, I hope Francis’s efforts include reaching out to people like the young man I was when I went to Paraguay, young people who have forsaken the wisdom of ages for the tidy neo-Marxist philosophies we learn in universities.
Comfortable, college-educated people of my age have an almost religious dedication to often sanctimonious (and wholly secular) ideas about our generation’s role in righting the wrongs of the past. Many will simply interpret Francis’s focus on the environment and poverty as an encouragement for the sorts of morally bereft and economically dysfunctional development programs the Peace Corps typifies.
If Francis can imagine a way to affirm my generation’s devotion to the marginalized while delivering a stern warning against the sort of degenerate sentimentality and paternalism that advocating for the poor can engender, then I think Francis could have an astounding impact here. Sure, he’ll probably upset just about everyone here if he does that. But, then again, he isn’t coming to be a comfort. He’s coming to be a witness for Christ.
(In case you were worried about Felix, don’t be. Last time I called down to Paraguay, I heard things were pretty good for him. He now has 11 kids, so I guess my family-planning instruction was less than successful.)
As we pick apart the candidates’ performances Wednesday night in the GOP primary debate, at least some Republicans must be wondering why, with so many experienced officeholders in the race, they cannot rid their party of Donald Trump.
The answer has less to do with Trump than with Americans’ trust in government, which has been on a half-century-long slide and has reached its nadir in the Obama administration. Support for outsider GOP candidates like Trump or Ben Carson, who now polls in second place behind Trump, or the rise of Vermont Sen. Bernie Sanders among Democrats disillusioned with Hillary Clinton, is less of an endorsement of a particular candidate than a general rejection of—even rage against—the political establishment.
As Michael Barone recently noted, voters are “willfully suspending disbelief in challengers who would have been considered laughable in earlier years.” Barone cites the deep unpopularity of the Obama administration’s major policy initiatives like Obamacare and the Iran deal, coupled with the perception that Republicans in Congress have been unable or unwilling to fight the White House on these things, as evidence that for many voters experience doesn’t count for much anymore.
Barone is right. Witness the exit last week of former Texas Gov. Rick Perry, the candidate with the most experience and arguably the strongest résumé in the GOP field. But the problem goes back further than just the past seven years of the Obama administration. One can trace the erosion of trust in government back nearly 50 years, from its high point in 1964, when 77 percent of those polled expressed trust in government, to its record low of 19 percent in 2013.
Pew Research Center compiled polling data on public trust in government from 1958 to 2014 in a handy interactive chart that everyone should spend a lot of time staring at this election cycle. Aside from brief spikes in trust following the First Gulf War in 1991 and the attacks of September 11, trust in government has been declining steadily since the first years of the Johnson administration.
Notice the only two spans during which public trust in government actually grew: the Reagan and Clinton years—eras marked by bipartisan cooperation between the White House and Congress. Although Ronald Reagan was always a staunch conservative, he worked with Democratic Speaker Tip O’Neill to pass major tax cuts in his first term and comprehensive tax and immigration reform in his second. During Bill Clinton’s first term, the 1994 midterm elections put Republicans in charge of the House for the first time in four decades, and Clinton responded by working with Speaker Newt Gingrich to pass welfare reform and a balanced budget, among other initiatives.
This cooperation was possible only because Reagan and Clinton were able to fracture coalitions within the opposing party. They couldn’t just ram through White House policies with one-party supermajorities, as Obama has done. Voters saw Republicans and Democrats (at least some of them) working together to pass major legislation, and that helped build Americans’ confidence in government throughout these two administrations.
When George W. Bush took office in 2001, he was riding a wave of rising confidence in government. Historic polling data from Gallup corroborates Pew’s research on this point: trust in government crested in the aftermath of 9/11 then fell sharply with the invasion of Iraq, from which it hasn’t recovered.
For a brief moment after 9/11, some predicted a renaissance of New Deal-style liberalism among American voters from across the political spectrum. The trauma of the attacks would usher in a resurgence of faith in the competency of government to accomplish difficult things—the kind of confidence Americans had during the Eisenhower and Kennedy administrations.
That November, a Gallup poll found 89 percent of respondents approved of how Bush was handling the war on terrorism, and 77 percent approved of Congress. The national mood about the federal government was very mid-twentieth century. “Had not government won the war, ended the Depression, built the highways, brought electricity to the farmlands, vanquished the economic terror of old age?” wrote the late Michael Kelly in January 2002. “Government did great things for all of us.”
Sure enough, the Bush administration tried to do great things—it tried to vanquish al-Qaeda and rebuild Iraq, tried to pass immigration reform, and succeeded in expanding Medicare, bailing out the banks, and passing No Child Left Behind. After 9/11, compassionate conservatism met the War on Terror.
But the result of all this ambitious government action was an administration whose policies today are almost universally denounced by conservatives. Back in 2002, Kelly, then editor of The Atlantic Monthly, described Bush’s domestic philosophy as “activist, desirous of progress in addressing social ills. Liberal, you might say.” Today, it’s hard to find a self-identifying conservative who will endorse Bush’s domestic record. While many conservatives, including the more serious GOP candidates, want to address social ills, they’re deeply suspicious of the federal government’s ability to accomplish what it sets out to do.
It’s not hard to see why. The end of the Bush era brought what many considered to be one major government failure after another—the subprime mortgage crisis, the financial crisis, the Great Recession. The government’s failure to prevent these things, combined with responses that hampered a full economic recovery, damaged government credibility among Republicans and Democrats alike. It’s one of the reasons Obama got elected.
But the failures, and the perception of dishonesty, continued under Obama. In its efforts to sell the Affordable Care Act, the Obama White House made promises that simply weren’t true—if you like your plan you can keep it, families will save $2,500 a year on premiums, and so on. Five years later, the healthcare law is still deeply unpopular with most Americans.
On foreign policy, Obama has prized a nuclear deal with Iran above all else, leaving his administration with no viable response to the Syrian civil war, ISIS, the disintegration of Iraq, and the refugee crisis now enveloping Europe. The Iran deal, like Obamacare, is being pushed through Congress along strict party lines. Many Americans understandably feel Washington, DC is more partisan and dysfunctional than ever.
Taken together, the Bush-Obama years have given both conservatives and liberals plenty of reasons to distrust the government and doubt its competency. As recent Pew polls show, dissatisfaction with the political establishment predates the rise of outsiders like Trump and Ben Carson.
For now, the feeling is stronger among Republicans than Democrats, but the trend is bipartisan. Lawrence Lessig, the progressive, bespectacled Harvard Law professor who announced his candidacy for the Democratic nomination last week, flatly summed up the mood: “There is no connection between what the average voter wants and what our government does.”
It’s a line Trump himself might have delivered.