Me and Summer Wars: Rage Against The Machine

Deus Ex Machina

Let’s face it: human beings today cannot live without technology. It’s everywhere, from cellphones, to desktops and laptops, to the most critical of operations like national security, international relations, and even weapons of mass destruction. But we’re not stopping there. We’re never stopping there. The addictive allure of technology only makes us yearn for more of its benefits. It’s not hard to predict that in the near future, everything will be done at the simple push of a button. But before you say Google is great, let me ask you this: is it really okay?

Let’s look at the example given in Summer Wars: OZ. It’s the ideal world that substitutes for everything we have in reality, from sports, to games, to legal transactions. Everything can be done anytime, anywhere. No long queues, no shortage of consumable content, no end to the features it can offer. Convenient, right? Then Love Machine comes in, takes over OZ, and uses this convenience to do its playful bidding. If not for the Jinnouchis and Koiso Kenji, it could’ve plunged the world into chaos.

Since time immemorial, humans have depended too heavily on science. They think that anything to do with science is fine and righteous because of its offspring, technology. Science doesn’t betray humanity’s faith and materialism the way religion does, and most of the time it delivers the desired results. Even weapons of war and mass destruction were given “consideration” in their creation. To kill an opposing few to save the many is fine and righteous, right? Therefore we should make weapons for that cause!

But what if science suddenly betrays us? What if science suddenly turns its back on us? What if science suddenly tries to destroy us using the very technology that both science and humanity developed?

A good example of Science + Humanity = Technology FAIL. Then again, what about our mecha?

Humanity then started to fear science and technology. People thought that once these things were put to use, they could never be stopped. They feared that science would give technology sentience. They feared science would generate an irreversible error through the use of technology. They feared that science would set off doomsday and genocide, with no reset code to stop it. Humanity tries to restrain itself from its desires, but the temptation wins. It’s like a dog owner and his dog: the dog bites the owner, and the owner, instead of putting the dog down, beats it, puts a tighter leash on it, and keeps it around. If we were briefed on what happened to Love Machine after Summer Wars, this could be the case. The US government would make a public apology, “destroy” Love Machine, and then build a more lethal but obedient version of it. What happens after that, I don’t want to know. Rather, I’m far too scared of the program’s capabilities to even imagine what comes after the aftermath. If the plot and events of the movie had failed to rein in the dangerous possibilities of that technology, it would have screwed everything up. And I mean everything.

So is overdependence on technology really okay? I don’t think so.

16 Responses to “Me and Summer Wars: Rage Against The Machine”


  • Do you know how to program? If you don’t, then think about it like this: a computer is a purely logical machine. It does what it’s told to. For a machine to observe, learn, and change its own code is nearly impossible. A robot will do whatever it’s supposed to do until it runs out of battery. It must then recharge itself, but it doesn’t know that. Do you really think it will learn, through observation, that it can charge itself by plugging a cable into its back, even if that isn’t part of its routines? It’s impossible.

    And besides, even if there were a way to do that, do you really think that actual scientists haven’t discussed the potential dangers of a true A.I. extensively? All this “the computers will turn against us” stuff is complete bullshit.

    • This is an example of a guy using his left brain for his thought processes… and doing it wrong, because he thought that “science betraying humanity = sentience”. Please learn to do a probabilistic study and research “human error”, kthx.

      Also, this is a prime example of a person defending science and technology as more logical and beneficial than it is dangerous and destructive (in a Summer Wars kind of sense, so let’s stick to the topic, k?).

  • I like how this article rides upon so many assumptions as to be rendered laughable. Plus, it doesn’t help that you don’t define what an overdependence on technology signifies. Is using an electric stove to cook or a faucet with clean running water with which to wash our hands or getting regular vaccinations to improve our resistance against disease overdependence? Somehow, your worries here come off as being a reaction against things you don’t understand well enough, and so, you toss in all this fear-mongering rather than thinking it through.

    • This reaction is based on the advanced, state-of-the-art technology used in Summer Wars. I can see how your examples could be cited as a type of overdependence on technology, but only in a lesser, harmless degree. But if you can launch nukes via OZ over the Internet, how screwed can your technology get?

      • So wouldn’t this be more of a case of making your technology idiot-proof? I mean, people program these things. If someone left a loophole that big within the program/system/whatever, then it’s the fault of the human rather than the technology itself. Technology is just a tool. Humans are the ones in charge of figuring out how to use it appropriately.

        • Isn’t that the reason why people are overdependent on tech? Tech doesn’t disobey, and it almost never fails. But what if there’s an underlying purpose behind some of it? You can’t tell. Human influence on technology is a major factor in that betrayal, true. But don’t you think we’re doing it way too much?

          • This is where it starts moving into the realms of speculation and conspiracy theory which runs more on imagined fears. Granted, a little bit of caution never hurts, but somehow, your writings here seem to be more than just “a bit cautious.”

            And no, as you can probably guess, I don’t think we’re doing it way too much. An open, interconnected world will help in bringing about greater understanding and exposure to new ideas. And that in turn will lower the sorts of prejudices that we’ve seen in the past.

            • But what about other factors? Pollution? Destroying Mother Nature? War? Nukes are by far the worst can of worms the world has had to open. What’s next, then?

  • Maybe I too use only the left side of my brain, because I cannot understand how science can betray us. Science and technology, at their core, have always been and will always be instruments for improving human welfare. Using words and phrases like “betray” and “turn its back” implies that technology itself can somehow decide to undermine us, when it has no capability to do so, at least not in reality.

    The sentence that stood out to me was your question, “To kill an opposing few to save the many, it’s fine and righteous, right?” because that’s a moral issue: it isn’t about overdependence on technological advancement, but about how we choose to use it.

  • Hence the central point of those ‘technological sociology’ courses that engineering majors are required to take (depending on your university): they’re always about how technology can go really wrong and how we, as the innovators, must apply science properly.

    Except no one pays attention in that class, so expect your Doomsday Device =)

    Whether something is okay or not doesn’t really matter in the end; otherwise Earth would already be Eden.

    • It’s always that “out of sight, out of mind” kind of policy that common people (and corporate idiots) follow. It angers me that anything that doesn’t concern them gets ignored, even if it means putting everyone and the planet in danger, which is wrong in every sense.

  • This reminds me of a webcomic that got a chuckle out of me recently:

    http://dresdencodak.com/2009/09/22/caveman-science-fiction/

    ME AM PLAY GODS!

  • I agree that humans fear the technology they create. It’s pretty obvious, especially in science fiction and dystopian stories. Your analogy about the dog and the owner is logical. But I cannot approve of the murder of the dog, no matter how logical the action is. That’s human, and that’s me joking around ^_^

  • You’ve been gone a while… Hope everything’s all right.

  • Science is not some kind of object that can run amok of its own accord. It’s a human creation: a particular method of discovering knowledge, and usually the collection of knowledge that comes from it (including the ability to invent various new technologies). If science is merely a system for finding knowledge, then it’s entirely the fault of the scientist, and of the community he or she lives in, if that skill is used to discover a new way of waging war instead of developing a cure for a disease, efficient ways of recycling, more drought-tolerant crops, etc. Science cannot be “righteous”; only the people using it can. Also, science has nothing to do with “faith” in the first place. If you start having faith in something, it stops being genuine science and starts being mere opinion, or a religion.

    Actually, it would be nice if “all” human beings had access to even the most useful inventions like antibiotics and filtered water from a tap. Then maybe we could fret (or not fret) more equally about being overdependent on technology.
