The Mezunian

Positivity is the opium of the people, but mockery is the opium of the mad

Reconsidering Economic Philosophy and its Relations to Democracy

A major flaw in the way we look at economics is that it is constantly confined to the dichotomy of “progovernment” vs. “antigovernment”—often mistakenly described as “socialism” vs. “capitalism”—neither of which is a particularly accurate reflection of reality.

The idea that we should even debate whether there should be economic regulations at all is absurd when compared to other issues: After all, we do not question whether or not we should have public-safety laws at all—we do not view the issue as a choice between a draconian prison state wherein the government jails you for any minor infraction and a society in which murder, rape, and theft are legal. Any rational person can distinguish between just laws and unjust laws, and thus one should at least be able to distinguish between draconian economic regulations and less draconian ones.

The Myth of Economic Freedom and the Double Standards of Laissez-Faire

For one thing, the concept of an unregulated economy is a myth. Every economy is regulated to some degree, even if it is regulated by whoever is able to kill their competition—which, it should be pointed out, is hardly more ethical than an economic dictatorship. If we define “freedom” as the ability of each individual to do whatever she wants, then a society can only be free in instances wherein there is no scarcity. For instance, freedom of speech is possible because one’s speech cannot prevent another from giving her opinion[1]. However, because economics is based on scarcity, it is impossible for everyone to get what she wants. If two people each want control over the same land, then there must be some way to regulate that conflict of interest.

Capitalism is itself inherently reliant on state regulation and thus is hardly a “free” economy. What is often called “property rights” is actually property control maintained by state power. If an individual tries to use property deemed to belong to a certain individual or organization—what is called “theft” or “trespassing”—then that individual or group will sic the government on them, in the same way that corporations who violate regulations may be suppressed by the government. Interestingly, the latter should actually be considered less oppressive, since it rarely involves the same level of violence as the former. Nevertheless, laissez-faire not only expects the former, but demands it, while the latter is prohibited.

Indeed, this double standard leads to absurd policies in terms of domestic issues. For instance, by this logic, putting poor thieves in jail should be considered just as “socialist” as giving them government assistance; both cost tax dollars and involve government force. In fact, the latter is often more cost-effective, since it allows poor people to remain potentially productive workers, while the former separates them from the work force entirely[2]. And yet, the former is acceptable—demanded, even—in a laissez-faire economy, whereas the latter is condemned. In this case, the only consistent pattern is that government intervention is acceptable only if it punishes people, not if it helps them—a depressingly antisocial ethical rubric.

A more accurate framing of the conflict between the “progovernment” and “antigovernment” forces is whether the current distribution of income is ethical. Those who support laissez-faire policies believe that it is and that the government should not tamper with the natural distribution of the market—in fact, they believe the government must use force to maintain it against individuals who try to redistribute income themselves through theft. The “progovernment” forces can actually be divided into two camps: “Modern liberals,” who generally support the market but believe some intervention is necessary to keep it practical, and “socialists,” who believe the market is inherently flawed and should be replaced with a different economic system entirely. The pro-laissez-faire camps almost always base their views on government intervention on their support of meritocracy.

Questioning Capitalist Meritocracy

What is odd about the common assumption that the capitalist market is meritocratic is that there is absolutely no evidence to back it up. It is almost always based on circular logic: Capitalism is proven to be meritocratic by defining merit based on who succeeds in the market. If one questions, say, whether Donald Trump is truly skilled enough to deserve his immense riches, an apologist will likely either say, “Well, if he’s successful, he must be,” or accuse the questioner of being jealous—which is odd, since nobody would ever accuse those who criticize Stalin’s control of the Soviet Union’s economy of being jealous of his ability to maintain power.

The claim that those who are unsuccessful should just “pull themselves up by their bootstraps” follows the same logic: Could one not say that those who were not successful enough to be in Stalin’s place in the Soviet Union should have quit whining and “pulled themselves up by their bootstraps” so that they could be dictator instead? Once again, the claim that laissez-faire is free while socialism is controlled is based on an assumption—one that was discredited earlier.

Indeed, the idea that one could “pull herself up by her bootstraps” contradicts the reality of economics. In reality, one needs access to some resources—whether natural or capital—in order to make something; to do otherwise would be magic. Thus, if one wants to be a computer programmer one needs access to a computer; if one wants to grow food one needs access to land, seeds, water, and other materials. The same applies to intelligence, actually: We generally think of education as a matter of personal choice, but it is actually shaped by external elements. After all, nobody can just learn mathematics entirely on her own; one needs access to literature or other sources from which to glean this material.

Thus, if one is utterly destitute, “pulling oneself up by one’s bootstraps” is literally impossible. Meanwhile, those who have control over resources have an advantage independent of their own skill. Indeed, the property owners are not even necessary for value to be created; all that is needed is labor and property. Workers could simply use the land to make their own value and keep the profits for themselves[3]. It is only because of state-backed property control that property owners are able to force workers to sell the majority of the value of their labor in exchange for access to such property (since resources without labor are at least a little more useful than labor without resources, which, as stated earlier, is utterly useless). This is often called “exploitation,” but should more accurately be called “extortion”: It is the threat of force that makes workers submit to owners, not inherent merit.

It is ironic that such a system is often contrasted with Leninist statism, for it is actually very similar; the only difference is that “capitalism” somewhat separates the police from the corporations (and is more a case of feuding economic dictators than a single economic dictator), whereas “socialism” has them combined. In fact, the irony is that the only reason liberal democracies are so much better than Leninism is not the absence of state interference but the presence of somewhat democratic interference from the government, which ensures that corporations do not overstep people’s rights too much. If a capitalist country were to rid itself of its “government” completely, one would simply end up with an assortment of corporate dictators—virtually the same as Stalinism.

Granted, there is a logic to one being able to benefit from one’s funds if one has earned said funds. For instance, if two workers make the same amount of money for the same amount of work and one saves that money for investment, it makes sense that the saver should benefit from her saving while the worker who splurges is at a disadvantage. The problem with this scenario is the emphasis on “if one has earned said funds.” There are at least two problems with this condition: Inheritance and ownership of natural resources.

The issue with inheritance should be self-evident: It causes some to be born with more wealth than others. Obviously, being lucky enough to be born into a certain wealth goes against meritocracy, which requires that all success be based on one’s own effort or intelligence. Since it has been established that wealth is a determining factor for more wealth, this also affects future wealth. For instance, it is well established that wealth affects success at school[4], likely due to the quality of the school itself; nutrition, which affects one’s brain power; the need to spend time working for extra funds—time that could be spent studying; and many other factors. In addition, the fact that some children go hungry while others live rich lives is itself an appalling injustice. After all, can one really say that a child deserves to starve because she was born poor? Can we really say that a five-year-old should “pull herself up by her bootstraps”?

Private ownership of natural resources is a much greater threat to the claim that capitalism is meritocratic, for such resources are a significant part of the means of production. The injustice of private ownership of natural resources is simple: Natural resources, by their nature, were not created by humans, but were there before any of us were born, and thus no individual has any greater right to them than any other. The only just ownership of natural resources is one that is equal: Democratic control. To give any individual more control over any natural resource than another would be to give that individual an undeserved privilege and to give the other an undeserved penalty.

There are two defenses against this discrepancy, but both fall apart. The first is that there are no true “natural resources,” since humans change them with labor. The first problem with this argument is that even if one alters natural resources, at least some of their value is still independent of the labor; thus, at least some of those resources still deserve to be democratically controlled. Second, this does not take into consideration whether that individual had any right to alter the resource in the first place. After all, if someone were to trespass on someone else’s property and dig a hole, would that make that part of the land hers? No, she would likely be arrested for changing someone’s property without permission. The same logic applies to those who alter our resources without democratic permission. Indeed, one could argue that a lot of the tampering that has gone on with our natural resources decreases their value, once environmental issues are taken into account.

The second defense of private ownership of natural resources is the adage “Finders keepers, losers weepers”—essentially John Locke’s homesteading principle in playground form. Other than the fact that this is an utterly fabricated rule, it also inherently violates meritocracy by giving certain people advantages based on the chance of birth: It gives people lucky enough to be born earlier an advantage over those born later, as they are able to snatch up all the resources before the younger generations even have a chance to claim any. More importantly, this runs into the aforementioned problem of inheritance, anyway.

Besides, this rule was never truly followed, anyway: Our ancestors did not gain their wealth by just finding land; they stole it through brute force. This is especially true in terms of the US, wherein all of our natural resources were stolen from the indigenous population. That in itself should invalidate practically all private ownership of natural resources in the US, as all of it is based on theft. After all, if one were to steal from someone else and hand that wealth over to her kin, that kin would still have to give it back; the same should apply here.

Both the inheritance and the natural-resource problems spell trouble for most instances of private ownership of capital (non-natural means of production), as well. Since all capital is based on natural resources (in fact, all resources, period, are based on natural resources), some of the wealth derived from capital belongs to the public. Granted, exactly how much, and how this would be applied in practice, is debatable, so the precise division is a wash. What it does offer, however, is an ethical justification for the taxation of capital. Inheritance poses further problems for capital made before any of us were born, since inheritance is unmeritocratic.

So it has been established that all natural resources should be democratically controlled and that most capital should be as well. Thus, the only economic system compatible with meritocracy is one in which the majority of the means of production are democratically controlled. Since capitalism is defined as a society in which the majority of the means of production are privately held (which, by its very nature, is undemocratic), it is incompatible with meritocracy.

A New Way to Evaluate Economics and the True Reason Leninism Failed

Which brings us to the conclusion: If capitalism is neither free nor meritocratic, what ethical value does it serve? The only logical answer is none: Capitalism is an inherently unjust system.

This leads to an ethical dilemma: What is a just economic system? Socialism? Surely one would not consider countries such as the former Soviet Union or North Korea just.

In reality, this is an issue of terminology rather than actual principles. It has already been established what is required for ethical economic control: Democratic control. Without democratic control of the economy, some individuals are unfairly kept away from resources they have just as much right to use; without economic democracy, certain individuals have unfair advantages in making more wealth than others.

The only issue in terms of “socialism” is whether “socialism” is defined as simply a government-controlled economy (whether democratic or undemocratic) or as a democratically-controlled economy (which arguably may be devoid of government control if government doesn’t exist—anarcho-socialism—depending on one’s definition of “government”). Since there are no objective qualifications for what “socialism” should truly mean, this is impossible to decide—and insignificant, anyway.

Thus, we should evaluate our economy not in terms of “progovernment” vs. “antigovernment,” but in terms of “democratic control” vs. “undemocratic control,” with the former preferable.

Indeed, empirical evidence shows this to be a worthier concern. There are two empirical patterns that at first seem inconsistent: On one hand, we have statist economies like the Soviet Union and North Korea contrasted with capitalist economies such as the US, wherein the latter is clearly superior in terms of health standards, poverty, and overall public happiness—what ultimately makes a good economy. This is usually the evidence against “socialism,” the supposed proof that capitalism is the best economic system. And yet, when one compares developed countries with less government-interventionist economies, such as the US, with developed countries with more government-interventionist economies—“social democracies” such as Sweden and Switzerland—one finds a direct correlation between government intervention and public well-being[5]. But while these patterns seem inconsistent, there is in fact a consistency: Democracy. Leninist countries such as the Soviet Union were not democratic—they actually had less economic democracy than countries like the US, in which the public at least had some influence through a somewhat democratic government. Since the developed social democracies were, well, democracies[6], their government interference was actually somewhat democratic, and thus they had more economic democracy than liberal democracies such as the US and much more than “socialist” dictatorships. Thus, the real correlation is that greater economic democracy leads to greater public well-being.

This brings us to the overarching fallacy of economics: The US wasn’t superior to the Soviet Union because it was capitalist; it was superior because it was democratic. Meanwhile, the US falls behind the rest of the industrial world because the US is too capitalist. If the US wants to improve its faltering economy it must democratize its economy—most likely through government intervention[7]. Though there are flaws to that issue as well…

Concerns About Statism: Republicanism vs. Direct Democracy

While it is one thing to say that an economy must be democratically controlled, it is a whole different issue to decide how to do so. Earlier it was often assumed that a democratic economy comes from statism under a democratic government—but this is merely an assumption. What needs to be asked is what a democratic government is.

Much like “socialism,” there comes the question of what truly defines a “government.” Often it is described as an oppressive organization separate from the people. Laissez-faire libertarians often decry government-interventionist economies as a form of economic tyranny, wherein some bureaucracy arbitrarily decides who does what and who gets what. Of course, it has been established that capitalism is no better—in fact, since it is utterly undemocratic, it is worse. However, this does not mean their concern is without value. The problem is that it shows a flaw not with socialism but with the form of government itself. After all, would they argue that it is good for a detached bureaucracy to create other laws arbitrarily? Thus, the concern here is not whether socialism is democratic, but whether the government is democratic in itself.

And this concern has value to it, for the fact is that the US is not a true democracy, but a republic[8]. The people do not directly run it, but instead vote in temporary dictators who make decisions independent of their choices. If these government officials lie during the election about what they plan to do in office, then they have scammed the public; since there is no way to prevent government officials from doing so, this effectively makes the election system futile. Thus, Barack Obama can run on the promise to shut down Guantanamo Bay or not to cut social security, fail to do either, and there is nothing the public can do in reaction, except wait four more years so they can pick another government official who will lie to them.

The US in particular has three other factors that harm democracy: The monopolization of the electoral system by two political parties, the monopolization of the mass media by the rich, and lobbying—AKA corruption. Much like in the market[9], monopolization can tarnish electoral results by harming competition. The only way democracy can function is if people have the ability to vote for someone else when a candidate supports a policy of which they do not approve. In such a competition politicians have to compete over who can satisfy the public the most, leading to optimal public satisfaction. However, when the electorate is monopolized by a small set of factions—as in the US, which is dominated by the Republican and Democratic parties—then the factions can indulge in “policy fixing” to ensure that the public does not get what it wants and that there is nothing it can do to change that. Thus, the fact that the majority of Americans are against cutting social security[10] is irrelevant; both Obama and Romney have shown that they support cutting social security, and thus no matter how the public votes, it gets cut. The majority of Americans also support ending the US invasion of Afghanistan[11]; Obama and Romney do not, and thus the US stays in Afghanistan regardless of their wishes. There is nothing the public can do to change this, for there are laws—many bipartisanly crafted by the Republicans and Democrats, coincidentally![12]—that greatly disadvantage third parties, virtually ensuring that they will never win.

Private control of the mass media also poses a threat to democracy, due to the media’s immense influence on public thought. Much like how governments use government-controlled media to keep themselves in power, it is unreasonable not to assume that corporations such as General Electric or News Corp do the same with their television and radio stations or newspapers[13]. Private control over the media—at least limited media, such as network television or radio—is usually defended on the grounds of “freedom of speech,” but this defense is false and, in fact, hypocritical. As mentioned, there is no such thing as economic freedom; just different people having different power over resources. The same applies to network television and radio. After all, if regulating the way corporations run their networks is a violation of freedom of speech, then so is the regulation against network or radio pirates who hijack transmissions. The latter is not considered a violation, not because it is any less a regulation, but because we assume that the private owners are the rightful owners; however, the claim of meritocratic ownership of television or radio is dubious: If one looks back in history, one will see that both media were created by the government, using public funds, and handed off to private companies for private benefit, defended by tax-funded police and regulations. Thus, private control of the media is just as statist as government control of the media.

That is not to say that a government-controlled media is necessarily better, even if the government is republican. For in a republic the government is detached enough from the public that it can twist the media to support itself, thus rigging elections in its favor—risking a vicious cycle of power. On the other hand, placing network television and radio under more directly democratic control is certainly possible, and would at least minimize the negative effects of corporate or government control. Of course, another solution is for the public to rely less on limited media, such as television and radio, for their views, and to partake of a wider variety of sources, though how to carry this out would be difficult to discern, and would pose practicality issues.

The electoral system’s saturation with lobbying also has an extremely negative effect on American democracy. One can see its effects in a Princeton study that compared policy to public polls among the rich, middle class, and poor, which showed that policy—from both Democrats and Republicans—generally leaned in favor of the rich’s desires and against the poor’s, with the middle class in between[14]. This is, of course, incompatible with democracy: For any just society, each individual must have equal power over society. Giving people greater power for greater wealth not only creates a pseudo-oligarchical political system, it is also, ironically, incompatible with meritocracy, for more power means a better ability to manipulate the social structure to get oneself more wealth, regardless of whether that gain is based on true merit or not. One need only look at how popular tax cuts for the rich, bank bailouts, and business subsidies (which, incidentally, take up more than four times as much government funding as welfare[15])—all forms of statism favorable to the rich—are with the government to see this in action.

Since the rich have such tight control over the US electoral system, the prospect of economic democratization within that system is obviously laughable. The fact that even popular, extremely moderate policies such as not weakening welfare—much less increasing it—seem impossible only confirms this. The prospect of loosening the rich’s control of the electorate is also slim because of that very control, unless a significant portion of the rich decides that public outcry has become so great that loosening their own control is the only way to prevent outright revolution, which could cause them to lose control completely. Whether this is possible, and whether it would be possible to build further successes from such an increase in democratic power, is difficult to discern. Historically, it has never seemed to work: One can look at the rise of “austerity” in Europe today to see that the “social democracy” experiment of the mid-20th century has failed; that even when the rich give up power, they still have enough to build it back up[16].

Though this describes a particular “capitalist” mode of republican vicious cycle, the same risk is present in state “socialism.” In addition to the problems of a government-controlled media, as mentioned earlier, when a government sufficiently separate from the public—even in representative “democracy”—has control of the economy that government can use that economic power to extort people into voting for them, or else risk being starved, thus creating a “socialist” mode of republican vicious cycle. Arguably, this is the case in Cuba, though whether Cuba even counts as a republic is debatable.

Admittedly, a true, direct democracy may pose practicality issues—especially in a country as large as the US. Nevertheless, there are moderate reforms that may at least increase democracy a little within the US: Stronger limits on lobbying and the aforementioned democratization of the media would lessen the influence of money on the electorate; and changing the voting system to “instant runoff” voting[17] and opening the media to alternative parties would help loosen the Democratic and Republican parties’ monopoly on the electoral system. A public recall or other punishment for politicians who act against their campaign promises would also be necessary to make elections even minimally relevant.
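For readers unfamiliar with how “instant runoff” voting actually counts ballots, here is a minimal sketch in Python; the `instant_runoff` function and its ballot format are my own illustrative assumptions, not any official specification, and it ignores edge cases like elimination ties:

```python
from collections import Counter

def instant_runoff(ballots):
    """Pick a winner from ranked ballots by instant runoff.

    ballots: list of ranked candidate lists, highest preference first,
    e.g. ["A", "B", "C"]. (Hypothetical format for illustration.)
    """
    # All candidates still in the race.
    candidates = {c for ballot in ballots for c in ballot}
    while True:
        # Each ballot counts for its highest-ranked surviving candidate;
        # ballots whose every choice is eliminated are exhausted.
        tally = Counter(
            next(c for c in ballot if c in candidates)
            for ballot in ballots
            if any(c in candidates for c in ballot)
        )
        leader, votes = tally.most_common(1)[0]
        total = sum(tally.values())
        # A majority of live ballots wins; otherwise drop the weakest
        # candidate and redistribute those ballots to later choices.
        if votes * 2 > total or len(tally) == 1:
            return leader
        candidates.discard(min(tally, key=tally.get))
```

For example, with four A>B ballots, three B>A ballots, and two C>B ballots, a plurality count would elect A with four first-choice votes even though a majority of voters prefer B to A; under instant runoff, C is eliminated first and those ballots transfer to B, who then wins with a majority. This is exactly how the system lets voters support a third party without “wasting” their ballots.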

Of course, even if the electoral system were to remain republican, it would still be possible to plan the economy at a more directly democratic level, though the risk of the republican government snatching control of the economy back from the public would still be great—and the chance of such a solution arising in the current US electoral system, controlled as it is by the rich, is virtually nil.

Nonetheless, it seems that an increase in democracy, both in the economy and in the electorate itself, would be beneficial, with direct democracy being the best solution. Whether this, or even moderate reform in that direction, is possible is impossible to discern. On the other hand, though the US government has shown itself not to be particularly consistent in its support for democracy, democracy is still a beloved aspect of American culture, and a truly democratic US should not be written off so easily.

[1] This becomes more complicated when it comes to one’s ability to have one’s speech heard, such as through the media; but this is usually considered an economic issue, anyway.

[2] For instance, a Florida, US study—Brann, H. I. (1993). “Education, Incarceration, or Welfare? A Comparative Analysis of Institutional Costs.”—shows that education and welfare are both cheaper than incarceration.

[3] Note that while the term “workers” may only seem to denote blue-collar workers, this does not necessarily exclude white-collar workers or even managers (decision-makers can add value by deciding the best way to employ labor so that it creates the most value). On the other hand, whether a manager’s decision-making is good enough so that she deserves the job and not someone else should be based on a quality independent of ownership (democratic choice, for instance, would be a good, though imperfect, alternative).

[4] A quick Google search turns up numerous papers. Here are a few that just scratch the surface:

Dyson, A., Gunter, H., Hall, D., Jones, L., & Raffo, C. (2007). “A review of research on the links between education and poverty.” Joseph Rowntree Foundation.

Wenglinsky, H. (1998). “Does It Compute? The Relationship between Educational Technology and Student Achievement in Mathematics.” Policy Information Center.

Ladd, H. F. (2011). “Education and Poverty: Confronting the Evidence.” Duke Sanford School of Public Policy.

[5] Gould, E. & Wething, H. (2012). “U.S. poverty rates higher, safety net weaker than in peer countries.” Economic Policy Institute.  (For direct correlation between high government spending and low poverty, look at the graph at the bottom of the page.)

[6] Well, republics, to be more accurate. See next section for more on this.

[7] The only alternative would be through direct democracy—what is often called “anarcho-socialism.”

[8] The founding fathers themselves refused to consider the US a democracy. Read Federalist No. 10 or Peters, E. (2011). “What the Founders Thought About Democracy.” for a few quotes. Impressively, this “libertarian” seems to praise the founding fathers for their hatred of true democracy, agreeing with their claim that it hurt minority rights, while conveniently forgetting to mention the founding fathers’ immense hypocrisy on this subject in regards to slavery, Native American rights, gender rights, and so on. More interestingly, he praises the US War Department’s criticism of democracy, including the criticism that it is “anarchy” (AKA authentic libertarianism). Thus, apparently democracy is both too much tyranny and too much freedom at the same time!

[9] Socialist market; as established a capitalist market is already inherently tarnished.

[10] Bethel, T. N., Reno, V. P., & Tucker, J. V. (2013). “Strengthening Social Security: What Do Americans Want?” National Academy of Social Insurance.

[11] Madison, L. (2012). “Poll: Support for war in Afghanistan hits all-time low.” CBS News.

[12] See for a good summary of laws that block primary access.

See also for information about the “15 Percent Rule” that blocks candidates who cannot win at least 15 percent within national polls from the national debates—effectively blocking all but the Democratic and Republican candidates.

[13] Indeed, read Chomsky, N. and Herman, E. S.’s (1988) Manufacturing Consent: The Political Economy of the Mass Media for a more detailed look at how this is the case.

[14] Bartels, L. M. (2005). “Economic Inequality and Political Representation.” Department of Politics and Woodrow Wilson School of Public and International Affairs, Princeton University.

[15] Zepezauer, M. (2004). Take the Rich Off Welfare. South End Press.

[16] See Guinan, J. (2012). “Social democracy in the age of austerity: the radical potential of democratising capital,” Renewal, for more information.

[17] See for more information about “instant runoff voting.”


A Look at RPGs: Super Paper Mario’s Story

This will just be a quick comment on the writing of Super Paper Mario with no judgment on its gameplay or overall quality (I’ve never actually played it, just watched it, though its gameplay does look interesting).

My overall assessment of Super Paper Mario’s writing is that the writers saw the success of The Thousand Year Door’s mix of minor seriousness and wit and then tried too hard to replicate it in this game, making it feel forced. This is especially noticeable in its attempts at drama, which are mostly lackluster.

The main problem I have with Super Paper Mario’s story is that I do not like Count Bleck as a villain, mainly because his motivations are stupid. Essentially, he’s butthurt that his girlfriend’s dead, so he wants to destroy the world. However, he’s not some crazy psychopath who just goes around fucking shit up out of pure grief, which would be understandable; instead he spends at least a year concocting a detailed plan to do so. With this kind of sanity and this long a wait, he should have been sane enough to get over his grief. More importantly, why does he have such close henchmen who help him do this? What do they gain by destroying the world—and thus themselves, too?

You might be surprised to learn that he eventually learns the error of his ways, only to be surpassed by another evil. Luckily he’s able to save the day because of his and his girlfriend’s love. Yeah. No, it isn’t explained how this works, and, yes, it’s an utter deus ex machina.

Super Paper Mario does actually succeed in a legitimately touching and surprising scene in which some young heart lady needs to essentially kill herself to save the world, but this is completely ruined at the end of the game when she is magically revived. And then the writers had the gall to have her mother say, “We don’t know how she came back to life, but who cares?” as if the writers themselves were just throwing their arms up in the air and saying, “I don’t fucking know.”

The humor is also forced. The Thousand Year Door had a good balance between the original Paper Mario, which barely tried to be funny at all, and this, which tried too hard. Whereas The Thousand Year Door had a lot of wit and some subtle silliness, Super Paper Mario just throws silliness in your face to the point that it sometimes becomes annoying. There is still some wit, such as most of chapters two and three, but a lot of other parts are just grating. Pixls, who are much less memorable than the partners from the first two games, merely have pointless gimmicks or strange speech patterns like a bad standup comic rather than actually witty dialogue. And then there’s O’Chunks, who’s just full of the stupidest random humor. At one point he literally yells, “Broccoli!” or something else inane. The Thousand Year Door was able to be funny without relying on such lazy crutches. For instance, in that game Goombella would sometimes make short comments during her tattles or scenery comments that made them more entertaining; but she did not speak in silly accents. I mean, she did say “like” sometimes, but she wasn’t all “Yo, yo, Daddy-o! Yooz got the stache with the plan, my man!” like one of those insipid Pixls might be. Shit, even the Yoshi partner, who came close to that line, wasn’t that bad.

This is not to say that Super Paper Mario’s story isn’t entertaining (and it’s still much more interesting than the average RPG story, which I could never even talk about since I would fall asleep during any of them and forget any details about them); but it did have quite a few awkward moments, which The Thousand Year Door seemed to avoid.

Posted in A Look at RPGs, Video Games

A Look at RPGs: Final Fantasy IV

I never understood why Final Fantasy IV is considered so highly among the series (at least compared to the other five of the first six games; I never really played any of the newer ones, and thus will abstain from writing about them). In terms of actual gameplay the fourth iteration is far less interesting than any of the others. The odd-numbered games had the class system, the sixth had the esper system; even the unpopular second game had its interesting (if unintuitive) experience system. Final Fantasy IV allows no such customization: What characters you get to use is based on the whim of the story, characters learn spells only when the game says they should, and the story is very linear save for a few sidequests.

Final Fantasy IV’s gameplay is not merely less interesting than the first three; some of its gameplay aspects actually hinder its quality. While Final Fantasy VI—the only other game out of the first six to have character differentiation that is out of your control—has many characters that are great to use in their own ways (if you know how to use them), a lot of Final Fantasy IV’s characters just suck. What other game makes you play as a fucking bard? He is objectively inferior to the other characters in every way except that he can split potions to heal every character a tiny fraction during battle, essentially wasting said potions (this only applies to the Japanese or Game Boy Advance versions; in the US Super Nintendo version he really sucks).

Or how about the dipshit spoiled-brat twins, mages who join too early in the game to have any useful magic abilities (later on they develop better magic, but only in the Game Boy Advance remake—and by that point Rosa and Rydia are still superior anyway)? They do have one move that does quite a lot of damage, but it leaves both of them useless in battle for so many turns that Cecil and Yang will likely defeat whatever enemies you’re facing—or one of the twins will be killed—before the attack even gets a chance to go off. The fact that they’re so weak that they die in so few hits, and that one of them is your only healing-magic user, makes the latter very likely[1].

This is made worse by the fact that your character roster changes all the time, killing off characters in out-of-nowhere scenes just so they can dump you with another character you’ll have for maybe two minutes. And if they’re equipped you lose that equipment forever, unless you de-equip them first, which requires you to already know they will die, essentially punishing new players for not being psychic, which is objectively a bad game design decision.

Granted, Final Fantasy IV is mainly praised for its story, not its gameplay. But its story doesn’t fare much better. The aforementioned character changes are a story problem as well as a gameplay issue. The game introduces characters, kills them a few minutes later, and then expects the player to be devastated. Hey, remember those two dumbass twins you just met and hate? Well, now they’ve turned to stone to keep a tower from collapsing randomly. And you can’t heal them because of some rule we made up, which doesn’t even apply in-game[2]. Tellah kills himself by using a spell that costs more magic than he has… somehow? And you can’t bring him back to life with a Phoenix Down or anything, because, uh… magic! It’s all so silly you can’t take any of it seriously. For god’s sake, Rydia gets teary-eyed over the death of a character she never even met, before she even rejoined your party. Her parents died earlier, but apparently this Tellah person she never met was more important to mourn.

What’s worse is that at the end of the game it turns out everyone is alive. I’m not making that up. Yes, even Cid, who straps a fucking bomb on himself and dashes himself against a tower, lived somehow. Why have these characters die only to bring them back later? It’s stupid from a story perspective and it’s stupid from a gameplay perspective. It’s annoying having to readjust to different characters in battle, and it disperses the story’s attention among so many characters that none of them get properly developed—and thus I don’t care when their dumb asses are killed, because I know that when my magic person dies he’ll soon be replaced with other magic users who are virtually the same. It would have been better had they limited the game to a few characters and developed them throughout, instead of the clusterfuck we got.

This game’s story is often praised for its depth, but I fail to see it anywhere, to the point that I wonder if these people played the same game I did. The villains have no motivations; they just want to collect all the crystals so they can ruin the world. Why? Because it’s evil! The king doesn’t attack innocent countries because he’s greedy or for any other motivation that causes real humans to commit evil actions; he’s just brainwashed! Why does Kain keep betraying you? That darn brainwashing Golbez! Hell, even Golbez—a kindergarten caricature of pure evil, with his dark-knight clothing and dialogue that would make an airport-novel villain look deep—turns out to be brainwashed by the true villain, Zeromus, whatever the fuck it is. You learn this at the very end of the game, and learn nothing about what Zeromus is. The game just says, “Oh, hey, we tricked you, this random blob monster’s the real villain instead!”

The best that can be said about Final Fantasy IV’s story is that it’s so awful it’s hilariously awesome. Not only do you need to collect all four elemental crystals, but then you need to dive underground to find the four dark crystals. Then you’ll be able to fly to the fucking moon, where you confront Golbez and later Zeromus (as well as gaining, and then quickly losing, yet another unremarkable character).

And yet, in some cases the story’s stupidity actually harms the gameplay. Magnet Cave is a great example. At the end of it is a “Dark Elf” boss so strong that it instantly kills you. This was so original that it had already been done in Final Fantasy III—except in a much more logical, overall better way[3]. With all of the monsters you fight in this game, a fucking “dark” elf is the one that’s so dangerous? Really? Furthermore, in this game, in order to be revived you need to have some certain harp. How do you get this harp? You need to guess that you should talk to that stupid spoony bard to get it from him. But he’s mentioned in-game in such a nonchalant way that no one would think to bother talking to him. I’m sorry, but when someone mentions that Edward’s resting because he’s hurt (an excuse to keep him out of your party), my first thought is, “Good, tell him to stay the fuck out of my party,” not, “Oh, I’d better talk to him so he can give me some magic harp to revive me when the evil Dark Elf one-hit kills my party.” The whole plot thread is completely random.

Sometimes such randomness is just silly, rather than harmful. At one point you’re about to be killed by another scripted boss. “Oh no, we’re done for no—Oh, wait, Rydia’s randomly arrived to help us. How she got here isn’t answered, but sure, why not?”

The usual defense of Final Fantasy IV’s flaws is that it is old, so criticism is unwarranted. This ignores that the game compares unfavorably to even older games. I’m not saying Final Fantasy IV is not as good as those new-fangled games that I’ve never even played; I’m comparing it to the first and the third—the latter of which is certainly superior. And to those who defend its cliché medieval setting because the game is old, I offer two games released earlier: Earthbound Zero and Phantasy Star.

And even if Final Fantasy IV might have been the first truly story-driven RPG, it wasn’t the first story period, was it? You can’t just say that because no other video game had a story about a hero who starts out evil and turns good, nothing else, such as literature, had done it, either. Video games do not only compete with each other for one’s time, but with other mediums, including books. If a particular video game’s quality is based almost entirely on story, then it had better have a story interesting enough that I would rather play it than read a book—especially when books were much cheaper then; why pay sixty dollars for a boring game just for the story when you can pay only five dollars for a book with the same quality of story? If it cannot do this, then it should not base its quality on mere story. The other five Final Fantasies do not fall into this trap because they have gameplay that is interesting (even if none of their stories—except perhaps the sixth iteration’s to some extent—are particularly good, either). Of course, if an RPG has a story that truly does compete with other mediums, then it can succeed. For instance, Mother 3 is mainly story-based, but its story is so creative and interesting that it competes not just with other video games, but books, too. Mother 3 could be written as a book and it would be a solid story; I can only imagine Final Fantasy IV as some airport fantasy novel.

Now, after all of this ranting, and after probably already turning off anyone who might have been persuaded by it, I will say that I do not think Final Fantasy IV is a bad game—and I have beaten it, which is more than I can say about most RPGs[4]. I just consider it more an okay RPG than a good one. And, in fairness, there are certainly worse RPGs. Don’t get me started on the sheer blandness of Golden Sun.

[1] I hate how RPGs always make the healer weak, ensuring that the person best able to heal people is always the first one to be killed.

[2] The rule is that because they chose by their own will to turn themselves to stone, others cannot heal them. Have a character turn himself to stone in battle and watch someone else heal him.

[3] Not only is this done in FFIV after already being done in FFIII, it is done twice in this game, the second time at the beginning of Zeromus’s battle.

[4] Interestingly, this includes FFII and FFV—though in their cases it could be due to mere difficulty. I also find it odd that a lot of people claim FFIV is hard, because it’s actually pretty easy. If I remember correctly, I beat Zeromus on my first try.

Posted in A Look at RPGs, Video Games

Conservative Political Correctness

Like many political terms, “political correctness” is so vague it is almost meaningless. There is no objective way to measure whether something is PC or not, so anyone can gleefully dismiss any criticism of racism as politically correct, regardless of the true level of racism being criticized.

What is “political correctness”? It is most often used to denigrate what is believed to be “liberal” prissiness about certain language or depictions—most commonly those the liberal considers prejudiced but the conservative does not. This is most common when traditions are criticized. For instance, the manufactured “War on Christmas,” which Fox News apparently still trots out every December despite the obvious melodramatic goofiness of its title, is the complaint that liberals’ imagined insistence on holiday celebrations that do not privilege a certain religion mars the US’s traditional preference for Christian culture.

This seems arbitrarily specific. Why should prissiness be bad only when fighting tradition, and not when defending it? Certainly conservative think tanks’ obsession with the “War on Christmas” (I don’t believe average conservatives care about this tripe, either) is just as triflingly whiny as liberals’ purported insistence on cultural agnosticism. Should we not be as annoyed at them for wasting our time with such mindless pap as we supposedly are at liberals for whining about Jews’ feelings getting hurt when Chanukah is ignored?

Evidence of conservative bitchiness about culture offending their precious traditions abounds. The conservative whining against religiously incorrect works, such as that Satanic rock ’n’ roll and Dungeons & Dragons—often made by Christians anyway, as the latter was—is, of course, well established. But even nonfundie conservatives like to whine. Businesses bristle whenever their precious little laissez-faire superstitions are mocked. Hell, Fox News devoted numerous hours to exposing the secret anticapitalism within a Sesame Street special because it dared to argue that businesses committing antisocial actions just for money are bad—a moral already well established within our culture, from Dickens to It’s a Wonderful Life.

Indeed, the desperate search for the mythical “liberal bias” in the media is akin to paranoid Marxists finding the maintenance of reactionary class distinctions within every work. Already we have the infamous Conservapedia project, wherein Schlafly finally gets around to purging the liberal bias that has somehow snuck into the Bible, laughably breaking a major Bible law himself (in fairness, Christian fundamentalists ignoring their own rules is utterly shocking).

This discrepancy can easily be explained by examining another common thread of conservative ideology: their hatred of “moral relativism.” Now, when they complain about “moral relativism,” they do not mean we should base our morals on objective science rather than cultural superstitions; conservative Christians have no problem denigrating atheists as “arrogant” for making fun of their beliefs while, in the same breath, criticizing Muslims—and atheism, too, actually—in the same fashion. (Liberal Christians usually do this, too; but they actually support secularization, so they are at least consistent.) What conservatives really mean by “moral relativism” is that liberals dare not respect their superstitious traditions—Christianity, American superiority, and laissez-faire—unconditionally.

This is why conservatives rely on labeling as a form of argument against liberals. To call them anti-American, socialist, or communist should itself be enough of an argument; there is apparently no need to actually explain why being these things should be bad or to even have an objective definition for what these terms mean. It all means the same: Liberals are evil because they are not conservative. Q.E.D.

There is actually a logical reason why conservatives act this way, in every country in the world: in every country, traditions are treated as the default good[1]. Christianity is good because it is the default in America—it is part of its culture—in the same way that Islam is good in Middle Eastern countries because it is the default there; laissez-faire is good because it is the default economy. Ideas that contradict these values are treated as blasphemous.

Conservatives love to use the term “common sense” to describe their beliefs. This is apt: when your beliefs are the default, so closely embedded within the culture that one cannot be raised in it without absorbing them like propaganda, it is easy for them to appear to be common sense. Laissez-faire appears obvious when one is raised on a constant diet of pro-laissez-faire arguments and frameworks, in the same way that those in the Soviet Union knew it was obvious that their economic problems were caused by western imperialism.

Because what is usually known as “leftism” by its nature goes against these traditions, it is much harder to defend, even if logically superior. When one is raised within the dichotomy of “free” economy vs. “command” economy—terms that even that pretend science, economics, depressingly uses—it is difficult to explain socialist ideology, which completely rejects such a dichotomy. In this context socialists look positively insane: why would they want an economy dominated by a bureaucratic state?

Most Americans cannot meaningfully be pro-laissez-faire for the simple reason that they do not truly know what its alternatives are. Those who have never read socialist texts cannot be taken seriously when they argue against socialism—and yet that is precisely what most Americans do. We hear conservative pundits or politicians make the most ridiculous remarks against socialism, and few even question whether one should accept an ideology’s definition from people obviously biased against it. After all, when the pro-laissez-faire rich tell us socialism is bad, they must be telling the truth; why would they lie about something when lying would benefit them?

This dishonesty has the added defect of making some ideas seem positively crazy. This may explain why anarchism is pretty much invisible—the modern movement as well as its historical elements. How can one explain those who oppose both government and capitalism without giving up the myth of “antigovernment” capitalism? Of course, when anarchists are portrayed, it must be with the same honesty applied to socialism: by portraying the ideology as exactly the opposite of what it really is. Thus anarchists are depicted as bomb-throwing totalitarians, or the term (like “libertarianism”) is snatched and applied to laissez-faire movements, in contradiction to history. This is why many Americans can watch Bill O’Reilly jokingly call himself an anarchist on The Daily Show and not notice the obvious overreaching irony: conservatives actually want you to believe that they’re the ones who are antigovernment.

As long as Americans hide away from Proudhon and Dawkins because they’re afraid of having their delicate traditions offended they will never be able to understand politics accurately. They will not even have true opinions at all, willingly subjecting their freedom of opinion to the dominant ideology unconditionally. For such a thing to occur in a country that prides itself on freedom of thought is a much direr form of political correctness than non-Christians hurting Christian fundamentalists’ narcissistic need for special privileges.

[1] “Marxist” countries such as the former Soviet Union and Cuba are exceptions. At the very least, theirs are not traditional superstitions, but a kind of modern superstition enforced in the same way as the conservative kind.

Posted in Politics