Thursday, January 31, 2013

More worms out of the right-wing can



One might be forgiven for being baffled by the way in which the Right embraces the wrong side of history. What person would be proud to have history record that they were a racial bigot, someone who subsidized the rich and crushed the impoverished, and fabricated rationales in order to send thousands to die in a needless war? It is an astonishing fact that there are plenty of people like this here in America.

Of course, there are variations in the level of common sense and credibility. Take for instance Donald Trump; the good thing is that he doesn't hold public office, and thus isn't as dangerous as, say, Michele Bachmann. Another politician who seems intent on outdoing her colleagues in exposing the ass-end of history is the blonde, blue-eyed demi-fascist and social conservative Marsha Blackburn—who happens to be a representative from the state of Tennessee, where there is no shame in ignorance.

Blackburn is one of the original "birthers," and so naturally she has to reinvent herself as a "skeeter." This is in regard to the latest "controversy" surrounding Barack Obama—his claim, in response to gun control legislation, that he often skeet shoots. I doubt this is true, but on the other hand, I doubt he would have made this claim if he had never skeet shot before; he probably did it on a whim once or twice. It's not a big deal to me or most people if this is the case, and to turn this into a focal point for ideological battle merely shows to what lengths extremists on the right, desperate for an "edge," will go. As I heard someone on the radio say, at least Obama didn't shoot someone flush in the face with buckshot—like Dick Cheney did. I mean, where do these people get their sense of "balance" anyways? Fox News?

Blackburn—who has drawn derision for her claim that the do-nothing (except start unpaid wars) Republican Party is the party of “big ideas”—has had frequent run-ins with Democrats on television news shows, angering her counterparts and questioners by perpetually veering off-topic or making claims that have no basis in reality; this probably explains why the far-right American Conservative Union annually gives her a “100 percent” rating.  The ACU happens to sponsor something called the Conservative Political Action Conference, whose “big tent” has room not just for paleo-conservatives like Blackburn, but for speakers and sponsorship from hate groups and otherwise extremist organizations like VDARE and the John Birch Society. 

Although Blackburn is one of the more extreme of the current crop of Republicans in Congress—frequently referring to Obama in borderline racist terms, no doubt emanating from racist inclinations—this is only enough to earn her a 60 percent rating from the Birchers. But not to worry; that is the "average" score that the JBS gives to Republicans, especially those who don't (publicly) support the things that are near and dear to a Bircher's heart—like the complete abolition of civil rights safeguards, voting rights, Social Security, Medicare and other things that are not explicitly mentioned in the original draft of the Constitution. Most Democrats rate so low that Birchers don't even bother with them.

It is an unfortunate fact in our low-information news media world that people have heard of the John Birch Society but have no clue what the society stands for; in fact, most people probably think that it is just some harmless gathering of old farts. That must have been the case the day, some years ago, when I walked into the "old" Kent Public Library, back when it had a glass enclosure in the lobby in which local organizations could post displays. On this particular day I noticed that there was a new display, something about being a "patriot" and opposing the multicultural threat to America; there were pictures of "nice," fair-skinned Caucasian families to demonstrate the "point." I found this somewhat ironic, because most of the clientele of the library were minorities, probably because they didn't have as much access to computers and the Internet as the white folks in those pictures did.

I noticed something else about that display: It was sponsored by the John Birch Society. To say I was appalled would be an understatement. The audacity of the group would have been beyond belief had the patrons of the library actually been aware of its beliefs; but it is a testament to how the extreme right has gone "mainstream"—or how the media has allowed so many people in this country to be so ill-informed. The "ultraconservative" Birchers—who claim to be "constitutionalists" but don't believe the Constitution safeguards the rights of minorities or the poor—first gained infamy with their claim that the politically moderate Dwight Eisenhower was a communist agent, and that the fluoridation of water was a communist plot (a conspiracy theory lampooned in Dr. Strangelove); during the 1960s, their hyper-opposition to civil and voting rights legislation made them as popular as the Ku Klux Klan in the Southland.

Environmentalism and the UN's "Agenda 21" initiative on "sustainable" growth are also frequent targets of the JBS, which fits the organization's prime directive to discover communist "bogeymen" and other threats to the White American Way in every dark corner; the problem is that these mainly exist in the dark corners of their own minds. Not surprisingly, anyone who is not of European origin is particularly concerning to them, since such people are for some reason associated with "communism" and "socialism." To the Birchers, nonwhites threaten to "destroy" their anachronistic vision of what America is. It is ironic, then, that the Birchers tend to ignore the fact that "socialism" in some form or another (like universal health care) is a concept of European origin, and most countries employ some facet of it.

Anyways, I immediately went to the front desk and demanded to know who had allowed that display, and asked if they were aware that the JBS was a fascist, white supremacist organization; I received the impression that they were surprised that someone could have such strong feelings about this. I didn't stop there; I wrote an email to the King County Library expressing my outrage. When I returned a few days later, the display was gone. But the Birchers had their "say," even if hardly anyone paid any heed to their "message."

And the Birchers and all the right-wing money that backs them have an ally in Blackburn, who believes in "free speech"—which is why she is calling for "21st Century" regulations for the Internet, meaning that she opposes "net neutrality" and the equal access and distribution of content on the Internet, which to the Right is little different from the old "Fairness Doctrine." You can't just let those "liberal" bloggers say anything they want, like facts and such. All this means, of course, is that it is OK to regulate people, but not corporations. But are not corporations "people"? Oh, never mind. This is all of course done in the name of corporate "competitiveness," since the businesses that run the "free" Internet must be able to control its content and who has access, regardless of whether that access has been paid for. The question then is who is allowed "in." The fanatical fringe that visits right-wing favorites like the neo-Nazi website Stormfront?

We can't allow such an imbalance of access and opinion in a country where the top 20 percent own 90 percent of the wealth; it is only "free" access to the Internet that is keeping anything close to a balance in the struggle to be heard in this country. People like Blackburn, who psychologically are still living in the days of plantations and slavery, do not exist in a vacuum, and having any standing at all with the John Birch Society (it should be a point of shame for anyone who "achieves" a rating higher than 10 percent from the Birchers) testifies to the fact that too many members of Congress are ideologically tied to extreme groups backed with money from shadowy rich guys (oh yes, Blackburn also believes it is a "competitive advantage" for wealthy sponsors of right-wing propaganda Internet sites to conceal their identities). Someone has to expose them, even if the audience that needs to know is too busy doing other things—like being mere appendages to their "smart" phones, or watching the lives of the pathetic privileged on "reality" TV.

Monday, January 28, 2013

Women in combat jobs not such a "so what" proposition



Sometimes the decision one makes sounds worse than its actual effect. For example, gay marriage is not the end of civilization as we know it; in time most people who oppose it will hardly give it another thought, because ultimately it has no impact on their own lives. On the other hand, the policies of Ronald Reagan and George W. Bush may have had some political cachet with the right-wingers at the time, but their long-term effects have been a disaster for this country; if anyone “benefited” it was the rich elite, the financial industry and the military-industrial complex—while the vast majority of Americans saw their standard of living decline markedly and a future still under threat by Republicans with chainsaws. 

Still other decisions made for political reasons are ones whose consequences no one really knows. Take for example the announcement by outgoing Secretary of Defense Leon Panetta lifting the ban on women in combat positions in the military. I doubt that it was "coincidence" that Republican Robert Gates suddenly stepped aside for not-fully-explained reasons in 2011—and within months his replacement, Panetta, set aside the "don't ask, don't tell" policy on gays in the military. The decisions coming down the pike were ones Gates was likely not comfortable being responsible for, but his replacement was. Panetta has a longer history of support for civil rights than he does of military experience; the former Republican had only two years of service in the Army, but he took seriously his responsibilities in civil rights enforcement in the Nixon administration, which aroused anger among its more committed right-wingers. Panetta would eventually leave his post as well as switch parties. One may wonder why President Obama selected him as CIA director when he had so little experience in intelligence matters, let alone military ones; but he seems to have shared the president's social agenda. After the "don't ask, don't tell" decision, Panetta set the groundwork for the latest policy change in February of 2012, opening up new jobs for women that were "closer" to the front lines.

But simply being a "boot" in Afghanistan or Iraq and potentially being subject to enemy fire from irregular forces—rather than actively seeking it out and suppressing it—doesn't necessarily qualify as the "combat" experience that is being used to justify the policy change. Only the tiniest fraction of fatalities on the American side are women, and according to the Department of Defense breakdowns, most of them were caused by "accidents" or were otherwise noncombat-related. And this during a "real" war. During the Gulf War, the ground "fighting" lasted all of four days with a handful of American casualties, but there was great effort to manufacture phony "heroes"--even by Hollywood in the 1996 film "Courage Under Fire." "Jarhead"--about the failed search by Marines to find an enemy to shoot at--was at least closer to the truth. Someone who was there told me all enlisted soldiers were awarded Army Commendation Medals—apparently for the hardship of spending six months cleaning sand out of their weapons every day while camped out in the Saudi desert. I was also told that Bronze Stars were being passed out to officers—male and female—as if they were prizes in a box of Cracker Jacks (and according to some reports, they still are). Back in the day, the medal was awarded for heroism in combat; now, it can be awarded for sitting at a desk as a database administrator.

Some may say all of this is part of a strategy to cheapen the hardship of actual combat. In our highly technological world, physical and psychological standards have been "modified" in the belief that combat is an eight-hour day at the office and then you come home at night. With the wars coming to an end, life will return to "normal." What is that? In military posts stateside or kasernes in Germany, that means waking up early, doing PT, standing in formation, on some days training and bullshitting, other days cleaning equipment and bullshitting, or going to the motor pool and doing yet another preventive maintenance check on your vehicle—and bullshitting—then going to lunch, returning to do more (or less) of the same as before, standing in formation one more time, and going back to the billets or apartment. If there was an inspection the next day, you would spend all night with the floor wax or the shoe and brass polish.

Twice a year you would go on a month-long field exercise, which was mostly putting things up and taking them down, practicing radio communications and SOI decoding, and going through the motions of what you would actually do in a combat situation. Sometimes your squad was tasked to conduct a "sneak" attack on another squad from a different unit; after "contact," if they were dumb enough to hang around, soldiers would argue about who shot or captured whom to avoid embarrassment. I remember helping guard some "prisoners" when one of them tried to be a smartass and wrestle my M-16 away from me; he was taken away to see the medics after I whacked him on the kneecap with the butt of my rifle. And then you would return to post and it would almost seem like you were on vacation, until the monotony of the old routine set in. When I was in the Army, there was this old saying: the best place to be was between the place you were leaving and the place you were going.

Between Vietnam and the current wars, this is the life that "combat" soldiers experience during the vast majority of their time in service. This is the environment into which it is proposed that women will be integrated in ground combat units; with the wars winding down, it is a suspiciously convenient time for Panetta to make his decision, since we won't have to know its practical effect for years to come. There are of course multiple issues to be resolved. For example, there are the physical fitness requirements. Will female soldiers be required to meet the minimum physical standards of male soldiers, given the fact that they will (or should) be required to carry their own load? In the "prime" age group, for a score of 100 the time required for female soldiers to complete the 2-mile run is 15:36 (13 minutes for male soldiers), while the number of push-ups is 46 (80 for males); these numbers are barely above the minimum requirement for male soldiers, and the minimum numbers for female soldiers—19:36 in the run and 17 push-ups—would get a male soldier booted out of the service as physically unfit. That minimum time in the run is barely a fast walking pace, and if you can't do 17 legitimate (not phony) push-ups in 30 seconds at ages 18-26, that testifies to an extreme lack of upper body strength.

These numbers didn't come out of thin air; they are based on years of statistical analysis. Now some people will say big deal, or that the PT requirements are "sexist," or that upper body strength is overrated. But there are other issues to consider. Camaraderie and trust are important factors in morale and unit cohesion, especially in the ground combat environment. How is this to be gained if women are seen to require "special treatment" because of their strange hygiene requirements, syndromes and sensitivities? There is also the matter of this idea that modern warfare doesn't require face-to-face contact with the enemy, that if it isn't just pushing a button against an enemy miles away, it is just a mop-up operation after some target is bombed to hell; but fighting irregular forces in Iraq and Afghanistan has turned out to be only marginally different from what was encountered in Vietnam. It is also interesting to note that much of this policy change seems to be driven by female pilots who do little but the "glamorous" duties, mostly against non-existent opposition. Officers and career enlisted female soldiers also chafe at the previous ban, although much of this has to do with the level of "respect" they may or may not feel they are given.

Nevertheless, if a "change" is coming, with soldiers returning to a peacetime environment this is the time to figure out how this new policy will be integrated into reality; the result might only be fool's gold, perhaps even a degradation in combat efficiency that is not immediately apparent. And make no mistake—not all recruits want to be in combat units; when I was in the Army, people actually knew the difference between having a real job and being a "grunt." The only college "credit" I received for my seven years in the service was for "physical education." It was the glory-seeking officers who didn't mind seeing the pawns pay the price for the decisions they made on the battlefield. It was the fortunate people—and females particularly benefited by default—who got to "choose" to go into an MOS that might actually have some value in the civilian world, which is why some soldiers can find good jobs after they leave the service, and others cannot.

Thursday, January 24, 2013

Hillary speaks



Hillary Clinton appeared before a Congressional committee investigating the Benghazi attack, looking rather more fit and feisty than we had been led to believe. Wearing horn-rimmed glasses to take on a more credible pose, she claimed to take "responsibility" for the episode, although that responsibility did not include room for any failure on her part. She admitted to "deficiencies" and "inadequacies" at the State Department that led to the failure to adequately secure the Benghazi diplomatic sites—and which continue to be an issue—but none of this had anything to do with her personally or the people advising her. Someone forgot to tell her that taking responsibility implies some fault, such as failure to pay attention to the details of the job, or to recognize problems as they occur and correct them.

While a discussion concerning the disorder and power vacuum that followed the overthrow of Khadafy in Libya is a legitimate issue, to merely say that diplomats "accept a level of risk" and that they "cannot work in bunkers and do their jobs" seems more a dodge than an explanation—particularly given the Benghazi report's finding that there were disagreements and a lack of coordination between diplomats and the State Department in regard to security arrangements in Libya. The security of American diplomats in Arab countries that recently experienced internal upheaval should have received more urgent attention than it appears it was given. Whose fault was it that this did not happen? It is a little late in the day, as Clinton abandons her post, to say "So it is our responsibility to make sure they have the resources they need and to do everything we can to reduce the risks."

Taking "responsibility" after almost four years of relative neglect (instead, travelling to more countries than any other Secretary of State) seems too retrospective; if no one "is more committed to getting this right" and is more "determined to leave the State Department and our country safer, stronger, and more secure" than she is, that doesn't say much for the attitude that prevailed before. "For me, it's personal." Well, why wouldn't it be, when one's "rock star" reputation is at stake?

I don't doubt Clinton has a real sense of regret over the deaths of Americans in Benghazi, and some of the attacks made on her by John McCain and Rand Paul do seem politically motivated. She also rightly accused House Republicans of holding up funds for security cooperation in Libya. Nor can she be blamed for the prevailing conditions on the ground. Nevertheless, she can be blamed for a lack of inquisitiveness and for not responding to those conditions; for Clinton to point to the Benghazi board's claim that the "level of responsibility for the failures that they outlined was set at the assistant secretary level and below"—which may itself have been politically motivated to absolve her of any blame for the security deficit in Benghazi—as a defense for her lack of attention does not pass the smell test.

In the meantime, while Clinton claims that steps are being taken to improve security for U.S. diplomats, evidence on the ground suggests that there is little to support this claim, at least not in Libya, where “security” still remains in the hands of mercurial local militias.

Wednesday, January 23, 2013

President's change in philosophy will only work if he has someone to work with



In his second inaugural speech, President Barack Obama revealed what is presumably his true ideological and social philosophy, while at the same time calling for the citizenry to help him out and “do their part.” No more of the failed bipartisan experiment, no more “reaching across the aisle.” The “people”—that is, those who are public-spirited rather than bigoted or narcissistic—should be guiding the direction of public policy, and not the cupidity of private interest. No more dealing on an equal level with moronic types like the man who owns the car with the license plate frame that says “Liberalism is a Mental Disorder,” after spending the previous four years declaring “Comrade Obama: The Enemy Within.” 

Obama stated his intention over the next four years to breathe new life into the words "We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable rights, that among these are Life, Liberty, and the pursuit of Happiness." No more of this libertarian/right-wing animal kingdom world:

“For history tells us that while these truths may be self-evident, they have never been self-executing; that while freedom is a gift from God, it must be secured by His people here on Earth.  The patriots of 1776 did not fight to replace the tyranny of a king with the privileges of a few or the rule of a mob.  They gave to us a Republic, a government of, and by, and for the people, entrusting each generation to keep safe our founding creed.”

Obama, of course, knows that government of, for and by the people is under assault by small but powerful forces, and that Republicans and conservative Democrat toadies "speak" for these forces. On an ABC News roundtable discussion following the inauguration speech, a right-wing commentator had the audacity to claim that Obama does not have a mandate—that he only spoke for that infamous "47 percent." Forget the fact that it is red Republican states that have the highest percentage of that "47 percent," and that the 51 percent of the popular vote for Obama was more than the less than 48 percent that George W. Bush received in 2000; it didn't matter that Bush had no "mandate" to promulgate the disastrous tax, deregulation and war policies that followed. It didn't even matter that an examination of all the disputed votes in Florida by the National Opinion Research Center revealed that Al Gore was the legitimate winner. It never does to Republicans who represent the interests of a few; all they understand is antebellum class distinctions and power.

In his speech, Obama attempted to invoke John F. Kennedy's "Ask not what your country can do for you, but what you can do for your country" call for citizen responsibility. "Through it all, we have never relinquished our skepticism of central authority, nor have we succumbed to the fiction that all society's ills can be cured through government alone. Our celebration of initiative and enterprise; our insistence on hard work and personal responsibility, are constants in our character." America's "possibilities" are "limitless," since the country has "youth and drive; diversity and openness; an endless capacity for risk and a gift for reinvention. My fellow Americans, we are made for this moment, and we will seize it – so long as we seize it together."

“Together,” of course, doesn’t mean just the Koch brothers and the people who greeted Obama’s reelection with racial slurs. “For we, the people, understand that our country cannot succeed when a shrinking few do very well and a growing many barely make it.  We believe that America’s prosperity must rest upon the broad shoulders of a rising middle class.  We know that America thrives when every person can find independence and pride in their work; when the wages of honest labor liberate families from the brink of hardship.” 

So, with the wars ending and the economy more or less out of recession, it is time to address the deficit, the future of so-called “entitlement” programs, and immigration reform. I didn’t care much about the Republican “responses” after the speech, since the party has no credibility save to such small-minded bigots as mentioned above; the House of Representatives under Republican rule has become a leaderless, rudderless, ossified structure incapable of movement of mind. It is one thing to stand on “principle”; it is quite another when it is based on an inability to look peripherally. Simply taking a meat cleaver to the budget without ascertaining its effects just because you believe in “small” government is merely foolish. What comes next will depend upon whether we have leaders who want to go forward, or fall back on petty, rock-headed partisan-thinking.

Monday, January 21, 2013

Fanatics and one-hit wonders



A week after the Seattle Seahawks were knocked out of the playoffs, the biggest sports story is the imminent return of the Seattle Supersonics, based on reports of the sale of the Sacramento Kings to a local ownership group that includes Microsoft billionaire Steve Ballmer. That is, it should be; fat chance of that being the case on the local ESPN radio affiliate when Brock Huard and Mike Salk are on the air. They candidly confess they know little about basketball, and besides, they just want to talk about—who else—Russell Wilson. Words like "homer" and "fan"—or even "fanatic"—are inadequate to describe these two. Logical discussion is taboo, and critical analysis is only permissible if it doesn't touch the Hall-of-Fame-destined Wilson.

This morning these two "discussed" which of the quarterbacks in the conference title games they would prefer to have on the team rather than Wilson; naturally, they couldn't think of anyone. Brock and Salk apparently didn't want to add Aaron Rodgers or Peyton Manning to the mix, because then they really would expose themselves as shysters. They didn't bring Robert Griffin III into the mix either, which was interesting because Salk was once willing to sell half the team to acquire him; bringing Griffin III into the discussion would of course mean addressing his injury issues.

Anyways, the dislike these two (especially Salk) have for Matt Flynn verges on the psychotic; to Salk, Flynn is like an annoying wasp that has to be swatted away before it stings him. He also thinks that the Seahawks should get rid of Flynn because he is no "help" to Wilson, given their differing styles; what Salk is really saying is that he is afraid that if Wilson goes down and Flynn comes in and plays anything like he did in Green Bay, fans—if not Pete Carroll and some of the more political players—might actually come to believe that this is how a real quarterback is supposed to play. Salk's dislike is so moronic and so rooted in simple malice that he cannot tolerate the presence of any quarterback on the team who might expose Wilson's limitations in the light of day. Salk clearly cannot accept being made a fool of, so he must be rid of Flynn.

Fanboys like these are also unfamiliar with the history of this team. The Seahawks have never drafted a quarterback who went on to have a productive career. Russell Wilson may (or may not) break that mold, but we shouldn't forget that this team once drafted a quarterback No. 2 overall who at first appeared to be a "franchise" quarterback: Rick Mirer. Mirer did have a promising rookie campaign—and would never have another productive season. Packer fans like me should be all too familiar with the case of Don Majkowski; in 1989 he emerged seemingly from nowhere to lead the Packers to their first 10-win season since 1972, throwing for 4310 yards and 27 touchdowns. But injuries derailed his career, as did his inability to grasp new coach Mike Holmgren's West Coast offense; with Brett Favre champing at the bit, all it took was yet another injury to put an end to his days as a starter.

The history of the NFL is littered with other "one-hit wonders." Probably the most "noteworthy" case was that of Cincinnati Bengals quarterback Greg Cook, who passed away last year. Although I don't remember seeing him play, I did read football magazines of the time detailing Cook's awe-inspiring rookie season. In his second start he completed 14 of 22 passes for 327 yards and 3 touchdowns, and his 9.41 yards-per-pass average is still a rookie record. But much like the case of Detroit Tigers pitcher Mark Fidrych, his potential for greatness would never be realized due to injury. Strange as it may seem, sports medicine back in the day was hardly the scientific marvel it is now. During the season, Cook tore his rotator cuff, but the exact nature of the injury went undiagnosed. He missed three games but ill-advisedly returned; Cook admitted to extensive use of painkillers during the season, because he didn't want to "relinquish" his starting spot. But in doing so he caused further damage, and he was forced to retire after the failure of several surgical procedures. Drew Brees would suffer a similar injury, but medical science would allow him to subsequently perform well enough to have three 5,000-yard passing seasons.

There are other cases, however, of quarterbacks who blinded observers with their brilliance—and then "disappeared" despite playing years more in the league. Mark Rypien led the 1991 Washington Redskins to a Super Bowl win, throwing for 3,564 yards and 29 touchdowns; he never had the same numbers before or after. Jim McMahon led the 1985 Chicago Bears to the Super Bowl—the only year he came close to playing a full season; during his injury-plagued career he never won another playoff game after that Super Bowl.

But at least these two made it as far as the Super Bowl. Bill Kenney of Kansas City threw for 4348 yards and 24 touchdowns in 1983 and disappeared in a puff of smoke. Ditto for Scott Mitchell of the Detroit Lions, who threw for 4338 yards and 32 touchdowns in 1995. Same in 1999 for Steve Beuerlein of the Carolina Panthers, who threw for 4436 yards and 36 touchdowns. In 2007, Cincinnati's Derek Anderson tossed for 3787 yards and 29 touchdowns; and then—nothing. Rob Johnson never had a great season; injuries, and the fact that there was always another quarterback on the team who was more popular than he was, derailed what some were predicting would be a potential Hall of Fame career. The Packers' Lynn Dickey threw for 4458 yards and 32 touchdowns in 1983, numbers he would never again approach; but I wouldn't put him in this group since, when healthy, he was a consistently productive quarterback.

This past season saw five rookie quarterbacks—Andrew Luck, Ryan Tannehill, Brandon Weeden, Griffin III and Wilson—throw for more than 3000 yards. The question now is who will sustain that success. It is my contention that Luck has the best potential for a long, productive career, particularly since the Colts—unlike Seattle—are a team that wasn't poised to win now and can only improve in time. Griffin III has unquestioned athletic ability, but for all quarterbacks dependent on their running ability, injury—particularly to the knees—seriously undermines their effectiveness; Wilson is clearly in the same category of quarterback as Griffin III, and his future depends, more than that of "conventional" quarterbacks, upon avoiding injury. This season, the success of offenses that utilize running quarterbacks has been more a function of defensive coordinators' lack of familiarity with them. But as Eric Mangini told ESPN's Mike and Mike, coaches and coordinators have all offseason to devise ways to attack the "zone-read" or "read-option" offense—including hitting the quarterback, and often, which will wear down a quarterback in much the same way as a running back and limit his career span.

I'm not saying that Wilson is a "one-hit wonder"—he may be a two-hit or even three-hit "wonder." But anyone who denies the history lives in a fantasy world, which I believe most of the local media here do anyways.