Let’s face it: human beings today cannot live without technology. It’s everywhere, from cellphones, to desktops and laptops, to the most critical of operations like national security, international relations, and even weapons of mass destruction. But we’re not stopping there. We’re never stopping there. The addicting allure of technology only makes us yearn for more of its benefits. It’s not hard to predict that in the near future, everything will be done with a simple push of a button. But before you say Google is great, let me ask you this: is it really okay?
Let’s look at the example given in Summer Wars: OZ. It’s an ideal world that substitutes for everything we have in reality, from sports, to games, to legal transactions. Everything can be done anytime, anywhere. No long queues, no shortage of content to consume, no end to the features it can offer. Convenient, right? Then Love Machine comes in, takes over OZ, and uses this convenience to do its playful bidding. If not for the Jinnouchis and Koiso Kenji, it could’ve plunged the world into chaos.
Since time immemorial, humans have depended too much on science. They think that anything that has to do with science is fine and righteous because of its offspring, technology. Science doesn’t betray humanity’s faith and materialism the way religion does, and most of the time it delivers the desired results. Even weapons of war and mass destruction were given “consideration” at their creation. To kill an opposing few to save the many: it’s fine and righteous, right? Therefore we should make weapons for this cause!
But what if science suddenly betrays us? What if science suddenly turns its back on us? What if science suddenly tries to destroy us using the very technology that both science and humanity developed?
A good example of Science + Humanity = Technology FAIL. Then again, what about our mecha?
Humanity then started to fear science and technology. They thought that once they used them, they could never be stopped. They feared that science would give technology sentience. They feared that science would suddenly generate an irreversible error through the use of technology. They feared that science would set off doomsday and genocide, with no reset code to stop it. Humanity tries to restrain itself from its desires, but the temptation defeats it. It’s like a dog owner and his dog: the dog bites the owner, and the owner, instead of killing the dog, beats it, puts a tighter leash on it, and still keeps it around. If we were briefed on what happened to Love Machine after Summer Wars, this could be the case. The US Government would make a public apology, “destroy” Love Machine, and then make a more lethal but obedient version of it. What happens after that, I don’t want to know. I’m too scared of the program’s capabilities to even imagine what comes after the aftermath. If the movie’s plot and events had failed to effectively restrict the dangerous possibilities of the technology’s concept, it would have screwed everything up. And I mean everything.
So is overdependence on technology really okay? I don’t think so.