Tuesday, April 19, 2011

Wandering in Wonderland

I made it to my vantage point safe and sound after narrowly avoiding a collision with two students who were completely immersed in texting on their smartphones. Now, everything is peaceful. Well, not really. Groups of students are marching from one place to another while sharing laughs and gossip. A few of them are alone, but they are accompanied by the tunes flowing into their ears. Personally, I find it quite ironic that so many still believe we are far from a migration to the virtual world, when we have already replaced the sounds of nature with constant digital sounds, and the views that surround us with photos from our friends’ social profiles.
However, it is clear that cyberspace is not always bad for us - it depends on how one uses it. I believe this first-year seminar helped me better understand what cyberspace really is and what lies behind it, and I am not referring here to the underlying code. Cyberspace has been shaped as we see it today by our wishes and needs, such as efficiency and permanent connectivity. It is amazing that I was able to do most of my research for the projects in this course online. Nevertheless, before taking this seminar, I would have taken it all for granted. I want to let myself be surprised more often by what cyberspace can offer. After all, it creates wonderlands for different people to simply have fun and take a break from reality. And who controls all these wonders? We want to believe that we have the power. However, perhaps the most important thing I have learnt is that there are moguls vying to control the switch to cyberspace. Am I afraid that someone will take over cyberspace and create a virtual reality to replace our actual reality? I do not know. I guess the novels I have read and the movie “The Matrix” just made me imagine possible scenarios. I still want to explore cyberspace, not simply label it as something scary and quit using it. Maybe the key is moderation, or maybe simply awareness.
It seems as if I have written down all my worries as well as my dreams. But as I advanced in this course, I realized there is much more to cyberspace than I thought. It made me realize that I want open access to the Internet, but that there are plenty of people out there eager to install a switch to restrict that access. There is more to cyberspace than the apparent ease of wandering through it.
I wonder if all these students rushing around would prefer to keep using their flashy devices and maintain their shallow connection to cyberspace… In the future, it might be entertaining to use a virtual world in the course to get a better grasp of how these digital worlds function and what they could be used for. After Gibson’s consensual hallucination and Anderson’s dumbed-down universe, direct experience with a digital world might be refreshing.

Tuesday, April 12, 2011

Wide Open Virtual World

Massively multiplayer online games are addictive, a waste of time, reclusive, destructive. These are just a few of the characteristics that many young people come up with when they think of games. So, why is Dr. Castronova devoting an entire book to the subject of migrating to the digital world through play? Maybe there is something more to these online games. As Castronova emphasizes in his book Exodus to the Virtual World, most of these games bring to life a new world, a virtual one, but, at the same time, one that has the proportions of a real world. Knights that fight dragons and save princesses with magic swords… Or the same world envisioned in the movie The Lord of the Rings or any other movie you can think of. Presently, any successful movie can be turned into a game. You do not have to limit yourself to just watching blue avatars on the big screen; you can actually be the one that saves Pandora (Avatar). Moreover, you can have this experience as many times as you would like.

Pandora in Second Life

Still believe that gaming is just a wasteful activity? Why would we want to be heroes in a different universe, when we have our own goals that we strive to achieve every day? Personally, I think being completely pleased with ourselves is an elusive concept. In one scenario, you might not have a very successful life. Years pass and you feel that your role in this life is meaningless. This is the category of people that, according to Castronova, is most likely to embrace the virtual worlds: “Not everyone lives in a community with rich traditions, faiths, and stories that put meaning into everyone’s life, whereas in synthetic world, everyone is asked to complete quests, fight enemies, and become a hero” (69). On the other hand, we could accomplish everything we aimed for: the high-paying job and the perfect family. But what comes next? You can make a change in the world, reach high recognition, and then what? Even if we can do all this, we are still human, so we will always desire more. It is in our nature; that is why, if we do not set the bar high, boredom might settle into our lives.
So, are we all going to move to the virtual land? It is still unclear, but I can see how this exodus could benefit everyone regardless of status, money, race, or appearance. We all want to be challenged and to overcome obstacles in order to feel good about ourselves. Additionally, we want to have fun, to simply enjoy our existence and be happy. As Castronova puts it, “helping people find happiness may involve something other than giving them the things they currently seek” (88). This is what these online games promise.
Fantasy is how many of us describe these virtual games. However, the number of people who commit themselves to gaming is far from unreal. The virtual world is not so virtual anymore; its implications will affect our “real” lives sooner or later.

Work cited:
Castronova, Edward. Exodus to the Virtual World: How Online Fun Is Changing Reality. New York: Palgrave Macmillan, 2007. Print.

Monday, April 4, 2011

Project 3 Final: The Power of Our Brains

“Computing machines can do readily, well, and rapidly many things that are difficult or impossible for man, and men can do readily and well, though not rapidly, many things that are difficult or impossible for computers. That suggests that a symbiotic cooperation, if successful in integrating the positive characteristics of men and computers, would be of great value” (J. C. R. Licklider).


In 1960, in his paper “Man-Computer Symbiosis”, Licklider described how humans and machines, both with positive and negative characteristics, could complement one another. This cooperation was based on the idea that computers are tools that can reduce the time humans spend on tedious tasks such as finding basic information or making calculations. Licklider envisioned the way personal computing would develop and the design of the Internet. He admits, though, that a truly perfect interaction between humans and machines depends on the input and output equipment (“Man-Computer Symbiosis”).
While Licklider imagined that speech recognition could be a viable solution for the future of communications, voice-commanded computers have mostly been seen in movies rather than in real life. In recent years, researchers have directed their efforts towards brain-computer interfaces. This paper examines the recent developments in the technology behind human-machine interfaces and the prospective applications of this technology. The analysis shows that this technology provides great opportunities, particularly for disabled people, while it can also be used in the military, the entertainment industry, and communication. As with any other scientific breakthrough, there have been several negative reactions, but these cannot hide the potential of this technology.
A brain-computer interface (BCI) represents a way of communicating between a human being and a computer through thoughts rather than motor movements. As explained in the article “From Your Brain to the Machine”, BCIs are controlled by brain waves generated by particular thoughts (28). In most research experiments, electrodes have been implanted on the brain’s surface in order to process the brain waves. However, chip-like devices that can be implanted in the brain, allowing the mind to control computers, have also been invented. For instance, BrainGate is one neural implantable device that has already been successfully tested on humans. According to its website, the company behind BrainGate aims to improve the condition of severely disabled people, such as quadriplegics and Parkinson’s patients, with the help of neural implantable devices (Company).
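To make the idea of “brain waves generated by particular thoughts” concrete, noninvasive BCIs commonly work by measuring the power of specific EEG frequency bands and mapping it to commands. The sketch below is a hypothetical, deliberately simplified illustration, not the method used by BrainGate or any system cited in this paper; the frequency band, threshold, and synthetic signals are illustrative assumptions. It estimates power in the 8-12 Hz mu band, which motor imagery tends to suppress, and turns a one-second window of signal into a “move” or “rest” command.

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Mean spectral power of `signal` within the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= lo) & (freqs <= hi)
    return power[band].mean()

def classify(window, fs=256, threshold=1.0):
    """Map a one-second EEG window to a command.

    Motor imagery tends to suppress the mu rhythm (8-12 Hz) over the
    motor cortex, so strong mu power is read here as 'rest' and
    suppressed mu power as 'move'. The threshold is arbitrary and
    would have to be calibrated per user in a real system.
    """
    return "rest" if band_power(window, fs, 8, 12) > threshold else "move"

# Synthetic one-second windows standing in for recorded EEG:
fs = 256
t = np.arange(fs) / fs
resting = 5.0 * np.sin(2 * np.pi * 10 * t)    # strong 10 Hz mu rhythm
imagining = 0.1 * np.sin(2 * np.pi * 30 * t)  # mu rhythm suppressed

print(classify(resting, fs=fs))    # rest
print(classify(imagining, fs=fs))  # move
```

A real system would use many electrode channels, per-user calibration, and a trained classifier rather than a fixed threshold, but the overall pipeline (acquire a window of signal, extract spectral features, map the features to a command) has this general shape.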

Although this interaction between humans and machines is often considered purely science fiction, the possibility of direct communication between humans and machines has been investigated since 1973 (Vidal). At present, innovation in technology takes place at an extremely fast pace. Therefore, the fact that research in this field has been conducted for almost half a century with no definite results might actually lead many people to claim that BCI research should be dismissed, as it will never be safe. However, the slow development of brain-computer interface technology can be justified by the challenges researchers face when dealing with the human brain. The paper "Brain-computer interfaces as new brain output pathways" elaborates on some of the limitations of BCIs, which can be caused by inexperienced users and, most importantly, by the variability and the wide distribution of brain functions (Wolpaw 615).
In recent years, there have been successful experiments on animals as well as on human subjects. Furthermore, as presented in Patrick Tucker’s article “Prospects for Brain-Computer Interfacing”, researchers strongly believe that the technologies will develop more quickly in the near future (8). At the University of Pittsburgh, researchers have used brain-computer interface technology to teach a monkey to control a robotic arm with its thoughts alone. This is just one example of the potential of direct brain-machine communication. As has already been demonstrated, the technology could provide a new life for persons with severe motor disabilities. In 2008, CBS’s 60 Minutes covered a story on BrainGate, admitting that “Once in a while, we run across a story that is hard to believe until you see it. That’s how we felt about this story when we first saw human beings operating computers, writing emails, and driving wheelchairs with nothing but their thoughts” (“Brain Power”).
In the article “Chips in Your Head”, Ohl and Scheich noted that 400,000 Americans suffer from partial or total paralysis due to spinal cord injuries (65). Therefore, there is an evident need for brain-computer interfaces that could provide new ways of interaction for paralyzed people. While people will not have their own blue avatar, researchers are trying to move towards a world like the one envisioned in the movie Avatar. Besides brain-wave control of computers, another future prospect is the control of external prosthetics.

In an excerpt from his new book published in Scientific American, Nicolelis provides a thorough image of the coming “Age of Machines” by showing the numerous applications of BCIs (81). For instance, the exoskeleton, a robotic suit that allows physically impaired people to move, could be controlled by thought alone. Nicolelis’s positive approach towards this technology stems from the idea that human beings will finally break away from the limits of their own bodies: “I feel an intense calling to embrace the amazing opportunities that freeing our brains from the limits of our terrestrial bodies can bring to our species. In fact, I wonder how anyone could think otherwise, given the tremendous humanistic prospects that brain-machine interface research promises to unleash” (81).
The extent to which BCIs have reached a larger audience is shown in the transhumanism movement, which promotes the enhancement of humans with the aid of technology. The information available on Humanity+, the website that promotes this movement, is centered on the idea that posthumans will become extremely intelligent and resilient to diseases thanks to technologies such as genetic engineering, neural interfaces, or anti-aging therapies (Transhumanist FAQ). Even with these seemingly limitless promises, there have been serious concerns regarding the moral implications of such devices. Many fear that this is just humanity's desire to play God, which will lead to dramatic consequences for the entire human race.
However, this new form of interaction between humans and computers is not only meant to be used as a tool for extending the capabilities of the human body. In an interview for “Il Sole”, Nijholt explains that BCIs could simply be used at home for turning on the lights, but an even more interesting use would be in gaming. Games that use a BCI have already been tested. As mentioned on the website Emotiv, the company Emotiv has already developed a wearable interface that measures and interprets brain activity as well as facial expressions (“EPOC neuroheadset”). Furthermore, in 2007, the technology news website Ars Technica announced that an interface had been developed that allows people to walk or fly their avatars in Second Life just by thinking of it (“Researchers help users control Second Life avatars via brain activity”). This technology has the potential to redefine the entire concept of immersion in virtual realities. Two experiments have been conducted to evaluate how a BCI can be used in a Cave-like virtual reality system; while several limitations of the BCI are exposed, the results also show the level of engagement: some of the participants reported identifying with their avatars and said that using a BCI felt more natural in a virtual environment (Friedman et al. iv).
While this might be considered a negative aspect, as the limits between the real and the digital world become blurry, there are numerous educational purposes for which mind-controlled games could be used. In many of these applications concentration is key and, since teenagers are likely to devote their complete focus to a game, they can simultaneously learn. Additionally, once BCIs permeate the entertainment industry, the costs of these devices will likely become more affordable for a broader audience.
Another relevant use of such systems is for simulations, especially for military training. As stated online in the "Department of Defense Fiscal Year (FY) 2010 Budget Estimates", DARPA is currently working on the project Silent Talk, which will enable “user-to-user communication on the battlefield without the use of vocalized speech through analysis of neural signals” (12). It is acknowledged that DARPA is also investigating the possibility of telepresence, a concept that implies that a soldier is “physically present at a base or concealed location, but has the ability to sense and interact in a removed, real-world location through a mobile BCI device” (Kotchetkov et al. iv, 4). While there are ethical dilemmas surrounding the implantation of chips into soldiers’ minds, these technologies could have a significant impact on reducing the number of people who lose their lives in combat.
In an interview from 2004, Sergey Brin, the co-founder of Google, made a very intriguing remark: “it’s credible to imagine a leap as great as that from hunting through library stacks to a Google session, when we leap from today’s search engines to having the entirety of the world’s information as just one of our thoughts” (“Google Guys”). While this idea was not taken seriously back in 2004, connecting all human brains to the Internet might now seem the ultimate goal of BCI technology. This paper has introduced several applications of BCI that are already functional or that will be used in the near future. The question that arises is whether this new channel of communication between humans and the Internet will lead to a new stage in the evolution of humanity.
In The Age of Spiritual Machines, Kurzweil goes as far as envisioning a technology that will allow humans to “port their entire mind file” into a computer and, thus, reach immortality (126). The predictions he made for 2009 included wireless communication, computers in all sizes, virtual business transactions, special devices for people with sensory impairment, and cyber music (277). Most of these ideas have materialized, so is Kurzweil right about the future of the human-machine interface? As BCIs are currently designed, the devices serve as a one-way channel that transmits commands to a computer or robot. Furthermore, all the applications of this technology presented in this paper still require years of research to go mainstream. Therefore, it is very difficult to predict the next step of BCI research, since the devices and technologies currently under investigation might not become applicable in a broader context.
What Kurzweil might be right about is the possibility of accessing all knowledge. Although it will not be the exact scenario from The Matrix, where one can insert a chip and immediately know how to fly a helicopter, neural chips could provide a way to look through a huge database of information – the Internet. This could potentially change how search engines are organized. Instead of keywords, the Internet would display results based on one’s thoughts and feelings, which are likely to be more accurate. This brings to light another concern over the capabilities of BCIs. Many worry that such devices could read one’s mind. As Wolpaw explains, a BCI cannot read minds; rather, it enables the user to develop new skills that are coordinated by brain signals instead of muscles (“Brain-Computer Interface Research Comes of Age” 352).
As impressive as it might sound to have a permanent wireless link to the Internet, further research on the brain's intricacies might determine whether it would be possible to connect one brain to another. Tapscott mentions in his book Grown Up Digital a conversation with the founder of Facebook, Mark Zuckerberg, in which Zuckerberg explains his vision: “In order for us to be successful in this century, we’re going to need to be more connected and we’re going to need to have a better understanding of where other people are coming from and just greater sense of like, we’re all connected” (qtd. in Tapscott 55). Thus, brain-to-brain connections would achieve Facebook’s main goal but, at the same time, they might render social networks obsolete. The BCI promises a web of intelligence and, as Nicolelis imagines, a net in which “individual participants not only communicate back and forth with one another just by thinking but also vividly experience what their counterparts feel and perceive as they seamlessly adhere to this true 'mind meld'” (83).
There are definitely numerous applications of brain-computer interfacing. All the current research suggests that the medical applications, as well as the military ones, could make a positive difference in the lives of many people. Beyond that, BCIs are likely to be used in virtual reality on a large scale. As for the way everyone will interact with the computer on a daily basis, in Wired for Thought, Stibel argues that the Internet will come to resemble a brain, so the connection between human brains and the Internet will become a reality: “One day, entering thoughts, feelings, or aspirations into the Web will generate a similar process – it will open up endless information, dynamically organized based on your thoughts” (154).

Once confined to the pages of science fiction, brain-computer interfaces have attracted the interest of the scientific world over the past years. The research conducted shows that the prospects of the technology behind this different way of communication between humans and machines are immense in medicine, as well as in the military and in entertainment. Even though this paper’s focus is not the ethics behind BCIs, the moral factor is relevant for further research. It remains the most powerful argument against having a neural chip implant and, thus, altering the human body, which is seen as a violation of the unwritten moral code. However, considering that BCIs have already enabled people with severe motor impairments to express themselves in new ways, developments in this field will likely continue at a faster pace. Indeed, taking into account all the advances presented, the world that the writer Gibson envisioned in 1984 might become tangible in the near future.

Works Cited
"From Your Brain to the Machine." Mechanical Engineering 132.11 (2010): 28-29. Academic Search Complete. Web. 27 Mar. 2011.
Cheng, Jacqui. “Researchers help users control Second Life avatars via brain activity.” Ars Technica. 15 Oct. 2007. Web. 27 Mar. 2011. http://tinyurl.com/5um2g6b.
Chittaro, Luca. “Playing with Brain-Computer Interfaces: An Interview with Anton Nijholt.” Il Sole 24 Ore (Italy). 5 Apr. 2008. Web. 27 Mar. 2011. http://tinyurl.com/4dlwwac.
Friedman, D., Leeb, R., Pfurtscheller, G., & Slater, M. "Human-Computer Interface Issues in Controlling Virtual Reality With Brain-Computer Interface." Human-Computer Interaction 25.1 (2010): 67-94. Academic Search Complete. Web. 27 Mar. 2011.
Kotchetkov, I.S., Hwang, B.Y., Appelboom, G., Kellner, C.P., and Connolly Jr., E.S. “Brain-computer interfaces: military, neurosurgical, and ethical perspective.” Neurosurg Focus, 28. 5(2010). Web. 27 Mar. 2011.
Kurzweil, Ray. The Age of Spiritual Machines: When Computers Exceed Human Intelligence. New York: Viking, 1999. Print.
Licklider, Joseph C.R. “Man-Computer Symbiosis.” IRE Transactions on Human Factors in Electronics HFE-1.1 (1960): 4-11. Web. 27 Mar. 2011.
Nicolelis, Miguel A. L. "Mind Out of Body." Scientific American 304.2 (2011): 80-83. Academic Search Complete. Web. 27 Mar. 2011.
Ohl, Frank W., and Henning Scheich. "Chips in Your Head." Scientific American Mind 18.2 (2007): 65-69. Academic Search Complete. Web. 27 Mar. 2011.
Sheff, David. “Google Guys”. Playboy, September 2004. Web. 27 Mar. 2011. http://tinyurl.com/dfrujk.
Stibel, Jeffrey M. Wired for Thought. Boston, MA: Harvard Business, 2009. Print.
Tapscott, Don. Grown up Digital: How the Net Generation Is Changing Your World. New York: McGraw-Hill, 2009. Print.
Tucker, Patrick. "Prospects for Brain-Computer Interfacing." Futurist 44.5 (2010): 7-9. Academic Search Complete. Web. 27 Mar. 2011.
Vidal, Jacques J. “Toward Direct Brain-Computer Communication.” Annual Review of Biophysics and Bioengineering 2 (1973): 157-180. Web. 27 Mar. 2011.
Wolpaw, J. R. "Brain-computer Interfaces as New Brain Output Pathways." The Journal of Physiology 579.3 (2007): 613-19. Web. 27 Mar. 2011.
Wolpaw, Jonathan R. "Brain-Computer Interface Research Comes of Age: Traditional Assumptions Meet Emerging Realities." Journal of Motor Behavior 42.6 (2010): 351-353. Academic Search Complete. Web. 27 Mar. 2011.
“Department of Defense Fiscal Year (FY) 2010 Budget Estimates”. Defense Advanced Research Projects Agency. May 2009. Web. 27 Mar. 2011. http://tinyurl.com/62cura5.
“Brain Power.” 60 Minutes. CBS. Web. 27 Mar. 2011. http://tinyurl.com/ma7pr2.

Tuesday, March 29, 2011

FEED

What does the future look like with all this technology advancing at such a fast pace? In the novel Feed, Anderson depicts a world in which computers are not outside, but inside one’s brain. Through a neural chip implanted at birth, called the feed, people get access to all the available information, watch the news and movies, see all the new advertisements, make purchases, and talk with other people. Is it not the world that we all dream of? We could be permanently connected to a net with a much larger scope than the Internet. We could get all the latest information and become “supersmart without ever working” (Anderson 47).
What more can we ask for? Actually, the main character in the novel, Titus, explains that “the braggest thing about the feed, the thing that made it really big, is that it knows everything you want to hope for, sometimes before you even know what those things are” (48). But is this the greatest or the worst part about the feed? This passage brings to light the real state of things: mega corporations such as Feedlink or American Feedware control what people wish for and can even create false desires. Personally, I found myself wondering whether Anderson is criticizing a society that relies too much on technology or, rather, our own reality, in which advertising dictates our preferences. How many ads make us feel as if we were not able to live without one specific product? It is questionable whether companies are really able to predict what we will need or want, or whether they are only giving us the impression that this is the thing that can complete our existence. In the latter case, the goal would be to make us all buy more and, thus, spend more money.
Somehow I think that many are aware of the negative influence of advertisements, but no real action is taken. An interesting perspective in Feed is that of Titus: “Of course, everyone is like, da, da, da, evil corporations, oh they’re so bad, we all say that, and we all know they control everything. I mean, it’s not great, because who knows what evil shit they’re up to. Everyone feels bad about that. But they’re the only way to get all this stuff, and it’s no good getting pissy about it, because they’re still going to control everything, whether to like it or not.” (48-49) Is this our own excuse? We admit that ads are evil but, at the same time, we believe that there is no point in rising up against them and the media giants behind them, because we could never succeed. Anderson seems to condemn this exact feeling of complacency. After all, many claim that advertisements do not alter their choices, but when the “revolutionary” pair of jeans appears on a huge billboard, they will all rush to buy it. Or have we reached the moment when the mega corporations that aim to modify our decisions have become a necessary evil?

Anderson, M.T. Feed. Cambridge, Mass.: Candlewick Press, 2002. Print.

Sunday, March 27, 2011

Project 3: The Power of Our Brains

“Computing machines can do readily, well, and rapidly many things that are difficult or impossible for man, and men can do readily and well, though not rapidly, many things that are difficult or impossible for computers. That suggests that a symbiotic cooperation, if successful in integrating the positive characteristics of men and computers, would be of great value.” (J. C. R. Licklider)

In 1960, in his paper “Man-Computer Symbiosis”, Licklider described how humans and machines, both with positive and negative characteristics, could complement one another. This cooperation was based on the idea that computers are tools that can reduce the time humans spend on tedious tasks such as finding basic information or making calculations. Licklider envisioned the way personal computing would develop and the design of the Internet. He admits, though, that a truly perfect interaction between humans and machines depends on the input and output equipment.
While Licklider imagined that speech recognition could be a viable solution for the future of communications, voice commanded computers have mostly been seen in movies rather than in real life. Over the past years, researchers have directed their efforts towards brain-computer interfaces. This paper examines the recent developments in the technology behind human-machine interfaces and the prospective applications of this technology.
A brain-computer interface (BCI) represents a way of communicating between a human being and a computer through thoughts rather than motor movements. As explained in the article “From Your Brain to the Machine”, BCIs are controlled by brain waves generated by particular thoughts (28). In most research experiments, electrodes have been implanted on the brain’s surface in order to process the brain waves. However, chip-like devices that can be implanted in the brain, allowing the mind to control computers, have been invented. For instance, BrainGate is one neural implantable device that has already been successfully tested on humans. According to its website, the company behind BrainGate aims to improve the condition of severely disabled people, such as quadriplegics and Parkinson’s patients, with the help of neural implantable devices (Company).
Although this interaction between humans and machines is often considered purely science fiction, the possibility of direct communication between humans and machines has been investigated since 1973 (Vidal). At present, innovation in technology takes place at an extremely fast pace. So, the fact that research in this field has been conducted for almost half a century with no definite results might actually lead many people to claim that BCI research should be dismissed, as it will never be safe. However, the slow development of brain-computer interface technology can be justified by the challenges researchers face when dealing with the human brain. Some of the limitations of BCIs are due to the variability of brain signals and the difficulty of mapping brain activity.
In recent years, there have been successful experiments on animals, as well as on human subjects. Furthermore, as presented in Patrick Tucker’s article “Prospects for Brain-Computer Interfacing”, researchers strongly believe that the technologies will develop more quickly in the near future (8). At the University of Pittsburgh, researchers have used brain-computer interface technology to teach a monkey to control a robotic arm with its thoughts alone. This is just one example of the potential of direct brain-machine communication. As has already been demonstrated, the technology could provide a new life for persons with severe motor disabilities. In 2008, CBS’s 60 Minutes covered a story on BrainGate, admitting that “Once in a while, we run across a story that is hard to believe until you see it. That’s how we felt about this story when we first saw human beings operating computers, writing emails, and driving wheelchairs with nothing but their thoughts” (“Brain Power”).
In the article “Chips in Your Head”, Ohl and Scheich noted that 400,000 Americans suffer from partial or total paralysis due to spinal cord injuries (65). Therefore, there is an evident need for brain-computer interfaces that could provide new ways of interaction for paralyzed people. While people will not have their own blue avatar, researchers are trying to move towards a world like the one envisioned in the movie Avatar. Besides brain-wave control of computers, another future prospect is the control of external prosthetics.

In an excerpt from his new book published in Scientific American, Nicolelis provides a thorough image of the coming “Age of Machines” by showing the numerous applications of BCIs (81). For instance, the exoskeleton, a robotic suit that allows disabled people to move, could be controlled by thought alone. Nicolelis’s positive approach towards this technology stems from the idea that human beings will finally break away from the limits of their own bodies: “I feel an intense calling to embrace the amazing opportunities that freeing our brains from the limits of our terrestrial bodies can bring to our species. In fact, I wonder how anyone could think otherwise, given the tremendous humanistic prospects that brain-machine interface research promises to unleash” (81).

The interaction between humans and computers is not restricted to medical applications. In an interview for “Il Sole”, Nijholt explains that BCIs could simply be used at home for turning on the lights, but an even more interesting use would be in gaming. Games that use a BCI have already been tested. As mentioned on the website Emotiv, the company Emotiv has already developed a wearable interface that measures and interprets brain activity as well as facial expressions (“EPOC neuroheadset”). Furthermore, in 2007, the technology news website Ars Technica announced that an interface had been developed that allows people to walk or fly their avatars in Second Life just by thinking of it (“Researchers help users control Second Life avatars via brain activity”). This technology has the potential to redefine the entire concept of immersion in virtual realities. Two experiments have been conducted to evaluate how a BCI can be used in a Cave-like virtual reality system; while several limitations of the BCI are exposed, the results also show the level of engagement: some of the participants reported identifying with their avatars and said that using a BCI felt more natural in a virtual environment (Friedman et al. iv).
While this might be considered a negative aspect, as the boundary between the real and digital worlds becomes blurrier, there are numerous educational purposes for which mind-controlled games could be used. In many of these applications concentration is the key and, since teenagers are likely to devote their complete focus to a game, they can simultaneously learn. Additionally, once BCIs permeate the entertainment industry, the cost of these devices will become affordable for a larger audience.
Another relevant use of such systems is in simulations, especially for military training. As stated in the online budget estimates for 2010, DARPA is currently working on the Silent Talk project, which will enable “user-to-user communication on the battlefield without the use of vocalized speech through analysis of neural signals” (12). DARPA is also investigating the possibility of telepresence, a concept in which a soldier is “physically present at a base or concealed location, but has the ability to sense and interact in a removed, real-world location through a mobile BCI device” (Kotchetkov et al. iv, 4). While there are ethical dilemmas surrounding the implanting of chips into soldiers’ brains, these technologies could significantly reduce the number of people who lose their lives in combat.
In an interview from 2004, Sergey Brin, the co-founder of Google, made a very intriguing remark: “it’s credible to imagine a leap as great as that from hunting through library stacks to a Google session, when we leap from today’s search engines to having the entirety of the world’s information as just one of our thoughts” (“Google Guys”). While this idea was not taken seriously back in 2004, connecting all human brains to the Internet now seems like the ultimate goal of BCI technology. This paper has so far introduced several applications of BCI that are already functional or that will be usable in the near future. The question that arises is whether this new channel of communication between humans and the Internet will lead to a new stage in the evolution of humanity.
In The Age of Spiritual Machines, Kurzweil goes as far as envisioning a technology that will allow humans to “port their entire mind file” into a computer and thus reach immortality (126). The predictions he made for 2009 included wireless communication, computers of all sizes, virtual business transactions, special devices for people with sensory impairments, and cyber music (277). Most of these ideas have materialized, so is Kurzweil right about the future of the human-machine interface? As currently designed, BCIs serve as a one-way channel that transmits commands to a computer or robot. Furthermore, all the applications of this technology presented in this paper still require years of research to go mainstream. Therefore, it is very difficult to predict the next step of BCI research, since the devices and technologies currently under investigation might not become applicable in a broader context.
What Kurzweil might be right about is gaining access to all knowledge. Although it will not be the exact scenario from The Matrix, where one can insert a chip and immediately know how to fly a helicopter, neural chips could provide a way to look through a huge database of information: the Internet. This could potentially change how search engines are organized. Instead of keywords, results could be returned based on one’s thoughts and feelings, which are likely to be more accurate. This brings to light another concern over the capabilities of BCIs: many worry that such devices could read one’s mind. As Wolpaw explains, a BCI cannot read minds; rather, it enables the user to develop new skills (“Brain-Computer Interface Research Comes of Age” 352).
As impressive as it might sound to have a permanent wireless link to the Internet, further research into the brain’s intricacies might determine whether it would be possible to connect one brain to another. In his book Grown Up Digital, Tapscott mentions a conversation with the founder of Facebook, Mark Zuckerberg, in which Zuckerberg explains his vision: “In order for us to be successful in this century, we’re going to need to be more connected and we’re going to need to have a better understanding of where other people are coming from and just greater sense of like, we’re all connected” (qtd. in Tapscott 55). Thus, brain-to-brain connections would achieve Facebook’s main goal but, at the same time, they might render social networks obsolete. The brain-computer interface promises a web of intelligence and, as Nicolelis imagines, a net in which “individual participants not only communicate back and forth with one another just by thinking but also vividly experience what their counterparts feel and perceive as they seamlessly adhere to this true ‘mind meld’” (83).
There are definitely numerous applications of brain-computer interfacing. Current research suggests that the medical applications, as well as the military ones, could make a positive difference in the lives of many people. BCIs are also likely to be used in virtual reality on a large scale. As for the way everyone will interact with computers on a daily basis, in Wired for Thought, Stibel argues that the Internet will come to resemble a brain and that, as a result, the connection between human brains and the Internet will become a reality: “One day, entering thoughts, feelings, or aspirations into the Web will generate a similar process – it will open up endless information, dynamically organized based on your thoughts” (154).


Transhumanism is a movement that promotes the enhancement of humans with the aid of technology. According to “Humanity Plus”, the website that promotes this movement, posthumans will be extremely intelligent and resilient to disease thanks to technologies such as genetic engineering, neural interfaces, or anti-aging therapies (“Transhumanist FAQ”). Many are against this movement, claiming that it raises several moral issues, such as playing God.
Even though this paper’s focus is not the ethics behind BCI, the moral factor is relevant for further research. Taking into account all the advances presented here, the world that Gibson depicted in Neuromancer might become tangible in the near future.

Works cited:
Licklider, Joseph C.R. “Man-Computer Symbiosis.” IRE Transactions on Human Factors in Electronics 1 (Mar. 1960): 4-11. Web. 27 Mar. 2011.
"From Your Brain to the Machine." Mechanical Engineering 132.11 (2010): 28-29. Academic Search Complete. Web. 27 Mar. 2011.
Kotchetkov, I.S., B.Y. Hwang, G. Appelboom, C.P. Kellner, and E.S. Connolly Jr. “Brain-computer interfaces: military, neurosurgical, and ethical perspective.” Neurosurg Focus 28.5 (2010). Web. 27 Mar. 2011.
Nicolelis, Miguel A. L. "Mind Out of Body." Scientific American 304.2 (2011): 80-83. Academic Search Complete. Web. 27 Mar. 2011.
Sheff, David. “Google Guys.” Playboy, Sept. 2004. Web. http://tinyurl.com/dfrujk. 27 Mar. 2011.
Friedman, D., R. Leeb, G. Pfurtscheller, and M. Slater. "Human-Computer Interface Issues in Controlling Virtual Reality With Brain-Computer Interface." Human-Computer Interaction 25.1 (2010): 67-94. Academic Search Complete. Web. 27 Mar. 2011.
Tucker, Patrick. "Prospects for Brain-Computer Interfacing." Futurist 44.5 (2010): 7-9. Academic Search Complete. Web. 27 Mar. 2011.
Vidal, Jacques J. “Toward Direct Brain-Computer Communication.” Annual Review of Biophysics and Bioengineering 2 (1973): 157-180. Web. 27 Mar. 2011.
Defense Advanced Research Projects Agency. “Department of Defense Fiscal Year (FY) 2010 Budget Estimates.” May 2009. Web. http://tinyurl.com/62cura5. 27 Mar. 2011.
Cheng, Jacqui. “Researchers help users control Second Life avatars via brain activity.” Ars Technica, 15 Oct. 2007. Web. http://tinyurl.com/5um2g6b. 27 Mar. 2011.
Chittaro, Luca. “Playing with Brain-Computer Interfaces: An Interview with Anton Nijholt.” Il Sole 24 Ore (Italy), 5 Apr. 2008. Web. http://tinyurl.com/4dlwwac. 27 Mar. 2011.
“Brain Power.” 60 Minutes. CBS. Web. http://tinyurl.com/ma7pr2. 27 Mar. 2011.
Wolpaw, Jonathan R. "Brain-Computer Interface Research Comes of Age: Traditional Assumptions Meet Emerging Realities." Journal of Motor Behavior 42.6 (2010): 351-353. Academic Search Complete. Web. 27 Mar. 2011.
Kurzweil, Ray. The Age of Spiritual Machines: When Computers Exceed Human Intelligence. Viking, 1999. Print.
Stibel, Jeffrey M. Wired for Thought: How the Brain Is Shaping the Future of the Internet. Boston: Harvard Business Press, 2009. Print.
Tapscott, Don. Grown Up Digital: How the Net Generation Is Changing Your World. New York: McGraw-Hill, 2009. Print.
Ohl, Frank W., and Henning Scheich. "Chips in Your Head." Scientific American Mind 18.2 (2007): 65-69. Academic Search Complete. Web. 27 Mar. 2011.

Tuesday, March 22, 2011

Connected to Facebook or the Matrix?

The Matrix and The Social Network are two movies that bring two startling ideas to life. One introduces us to a reality in which people are controlled by intelligent machines, while the other depicts the reality of Facebook and its founding. Many might say that The Matrix is simply science fiction, whereas The Social Network tells the real story behind the largest social network. Is it really that simple?
Both movies come from Hollywood, a place synonymous with making money. In fact, The Matrix might carry more weight simply because it was released at a time when new, meaningful ideas could still be brought to the big screen. Just over ten years later, things have changed: blockbusters roll out every week, and the emphasis is placed on famous actors and actresses, and special effects, rather than on content. So, is that not also the case with The Social Network, which came out in 2010?
Of course, I do believe that there is no smoke without fire. As the poster for the movie puts it, “YOU DON’T GET TO 500 MILLION FRIENDS WITHOUT MAKING A FEW ENEMIES”. Indeed, things are often much more complicated in the real world. Still, I cannot help noticing that Facebook is one of the few websites that does not annoy its users with countless flashy advertisements. As the movie also shows, Zuckerberg was reluctant about advertising on Facebook from the beginning. This paints him as a person who does not care only about money. Now, if I apply the characteristics of Wu’s “defining moguls”, there must also be a need for power or greatness. This too is debatable, as Facebook was initially a network restricted to Harvard students, and the idea of expansion seems not to have been entirely Zuckerberg’s. In an interview for Time Magazine, he shares his opinion about the movie: “the biggest thing that thematically they missed is the concept that you would have to want to do something — date someone or get into some final club — in order to be motivated to do something like this. It just like completely misses the actual motivation for what we're doing, which is, we think it's an awesome thing to do.”
geek&poke
Going back to The Matrix, the concept presented by the Wachowski brothers seems more startling to me. Even though I have expressed my enthusiasm for the development of AI in previous posts, I have to wonder if this scenario could actually take place. It is ultimately all about control. Intelligent machines could take over our lives and everything we do or think. At the same time, Facebook could control us through advertisements. It is true that our personal information (for example, gender or age) is used to display specific ads on different walls, but it is not as if a third party gets access to private information. Moreover, Facebook has proved very useful in promoting democracy during the recent events in Africa. According to Zuckerberg, Facebook is meant to bring power back to the people. In his book Grown Up Digital, Tapscott shares what he discussed with Zuckerberg in 2008 regarding the future of Facebook: “[Facebook] wasn’t just going to be a social network for friends. It would be a social utility, like a power grid” (64). It will be interesting to see if humans will rely on Facebook to fight the coming age of machines.
Works cited:
Wu, Tim. The Master Switch: The Rise and Fall of Information Empires. New York: Knopf, 2010. Print.
Tapscott, Don. Grown Up Digital: How the Net Generation Is Changing Your World. New York: McGraw-Hill, 2009. Print.
"Person of the Year." Time (2010). Web. http://tinyurl.com/2u4jmvf. 22 Mar. 2011.

Tuesday, March 15, 2011

The technology behind human beings

Human beings have a dual nature, as good and bad are merged together. As much as we would like, we can never become perfect. All our creations are made the same way, with both good and bad parts. So, is technology the evil that will destroy our planet? No, because there are no completely bad or completely good technological advances, only shades of gray…
Apparently, many people understand that technology is needed. However, what we are most afraid of is that soon we will build intelligent machines that will destroy us or, at least, control everything we do. That is one of the leading arguments of Kaczynski in his manifesto against technology. But he is not the only person who predicts such a bleak future. Personally, I find it ironic that we are afraid of robots that could think and talk just like a human being, but we do not even consider the harm that our dependence on smartphones, social networks, and the Internet could cause. I think this second scenario is more dangerous (and more relevant at present) because we do not even realize that something else is already directing our lives.
What Kaczynski proposes is a return to the wild. Nevertheless, going back to a primitive stage would only trigger another cycle of evolution. It is in our nature to strive for something important. There is no other way. If we all set goals that we could easily achieve, what would we do next? Live a life of boredom. It is true that we often complain that we have too much to do, but can you imagine not doing anything meaningful for years? This is how we, as humans, function.
Yes, in our quest for greatness, we may go as far as to build machines that are smarter than us and that will try to take us down. But once we realize this, we will fight for our freedom. One beautiful thing about being human is that, when chained, we always strive to break loose. After all, we are the most complex of creatures, and artificial machines are just tools that we have created to assist us. If there is a shift in the natural poles of power, we will fight back for supremacy.
So, human beings are flawed. But our flaws make us unique and help us survive. I am probably too optimistic, but that is simply because I enjoy technology, or at least the good parts of it. I also believe that this digitized world makes us appreciate nature more. After a day in front of a computer, surrounded by noise, I feel the need to go out and breathe.

Proposal: The Power of Our Brains

What is a human’s most valuable asset? The brain is surely what makes the human race unique and the engine that keeps the process of evolution going. With the recent advances in artificial intelligence, it is debatable whether the supercomputers that we are building will rule us.
Brain-computer interfaces (BCIs) are often assumed to belong to cyberpunk science fiction. However, there have been several innovations in the field. Personally, as we become more and more worried about who is in charge, I believe the BCI should be considered a prospective solution.
At this point in my research, I have discovered that BCIs could make a huge impact on the lives of disabled people. There are known cases in which motor-impaired persons were able to communicate through their thoughts. Additionally, the technology could take online gaming to a new level. What is more intriguing is whether we can establish a connection between our brains and the Internet. This would mean that we could constantly access information just by using our thoughts.
Nevertheless, there are numerous implications that need to be considered. First, surgery is required in order to become fully immersed in the machine. Currently, most research experiments have been done by placing electrodes on one’s head, but to have a permanent link to the machine, a chip has to be inserted into the brain. The cost and the dangers of brain surgery are valid reasons for people to be against BCIs. Moreover, many might interpret the merging of humans and machines as another way of playing God.
Will we witness an upgrade of our species or our own destruction? The BCI definitely has the potential to revolutionize our world. So, my research will focus on whether this technology will reach a large audience in the near future.

“Brain Power.” 60 Minutes. CBS. Web. http://tinyurl.com/ma7pr2. 14 Mar. 2011.
This video introduces people who are communicating through BCI, and an experiment conducted on a monkey that offers a better idea of how BCI works and how it might develop.

Stix, Gary. "Jacking into the Brain." Scientific American 299.5 (2008): 56-61. Academic Search Complete. EBSCO. Web. 14 Mar. 2011.
The article raises questions about the future of the merging of computers and human brains. While it is more of a thought-provoking piece of writing, the article was very useful as it guided my initial research.

Stibel, Jeffrey M. Wired for Thought: How the Brain Is Shaping the Future of the Internet. Boston, MA: Harvard Business Press, 2009. Print.
Stibel argues that the brain will shape the future form of the Internet, and he offers a complete vision of a new connected world. This book is how I learnt about BrainGate, a device that, once implanted in the brain, allows us to control electronics.

Friedman, D., R. Leeb, G. Pfurtscheller, and M. Slater. "Human-Computer Interface Issues in Controlling Virtual Reality With Brain-Computer Interface." Human-Computer Interaction 25.1 (2010): 67-94. Academic Search Complete. EBSCO. Web. 14 Mar. 2011.
The paper investigates how a BCI works in virtual reality in order to understand how it might behave in a real-life scenario. It provides useful insight into how a BCI functions.

Wolpaw, Jonathan R. "Brain-Computer Interface Research Comes of Age: Traditional Assumptions Meet Emerging Realities." Journal of Motor Behavior 42.6 (2010): 351-353. Academic Search Complete. EBSCO. Web. 14 Mar. 2011.
This article focuses on how BCI could improve the lives of motor-disabled persons. It is useful as it explains why mind reading would not be possible through the interface.

Tucker, Patrick. "Prospects for Brain-Computer Interfacing." Futurist 44.5 (2010): 7-9. Academic Search Complete. EBSCO. Web. 14 Mar. 2011.
The article presents how a team of students was able to control a robot through a BCI. It also describes future applications of the BCI such as video games, navigation systems, and neurotherapy.

Monday, February 28, 2011

Connections

“We need to founder in the face of difficulty, we need to suffer solitudes, we need to know how utterly weird it is that we're here, no?” This is the question that Sven Birkerts poses in the online conference that appeared in The Atlantic Monthly Online on the America Online network. However, he does not ask for an answer; it is a rhetorical question that encapsulates his biggest critique of how we all use the Internet. Birkerts claims that we, as students, yearn to be connected all the time to as many friends as possible, and forget to spend some time with ourselves.
It is interesting how Birkerts points out that more and more students fail at the “old print rituals”, such as reading and understanding a book. But it is not our fault. So, what charges can be brought against us?
We were not born in cyberspace, but it is true that most of us have been exposed to computers from an early stage in life. It is therefore natural to turn to electronic communication, but have we not gone a bit too far? This is where Birkerts’s questioning of our uses of the Internet seems reasonable.
The monster of every childhood is the darkness. As we grow, the fear of being left all alone in the dark seems to be replaced by that of having no friends on Facebook, or no one to chat with. We live in a world in which the one with the most likes on Facebook is the most popular, while the one who is not connected is simply left behind. This is a world that we have created – it was not implied by the birth of the Internet – and this is what Birkerts criticizes.
An explanation might lie in our way of measuring success in life. If more people know our name, it must mean that we hold a very important social position and are much closer to achieving our dreams of greatness. As we move forward on this path of becoming celebrities, we drift farther away from our own selves; Birkerts has noticed that “For some decades now we have been edging away from the opaqueness of private life and toward the transparency of a life lived within a set of systems, electronic and other”.

There are definitely unquestionable advantages that the existence of the Internet has brought us. Nevertheless, as with everything else we have created, there is also a downside. After all, the Internet did not come with a manual, so we have started using it in ways that are not exactly proper. Personally, I am not against any web-related activities, but I cannot help wondering whether the desire to be connected is taking over our lives, our private selves. This feeling of “connectedness” has effects similar to those of a drug. Once the light from our contact point appears on the circuit board that Birkerts talks about, we become happy. But will it not also make us numb to what happens in the real world? In this context, the consequences that Birkerts foresees, the flattening of historical perspectives and our transformation into shallow human beings, are definitely plausible.

Birkerts, Sven. "Into the Electronic Millennium". Boston Review (1991). Web. 28 Feb. 2011. http://bostonreview.net/BR16.5/birkerts.html.

Online Conference - "Is Cyberspace Destroying Society". The Atlantic Monthly Online (1995). Web. 28 Feb. 2011.
http://www.theatlantic.com/past/docs/unbound/aandc/trnscrpt/birkerts.htm.

Wednesday, February 23, 2011

Megacorporations vs. Individuals: The face-off that will change cyberspace forever

Verizon has gone to court to block the net neutrality regulations that are meant to ensure that ISPs cannot favor particular Internet content and services. Apple’s App Store is redefining the term “exclusivity”, controlling every application you try to run on your iDevices. Moreover, until recently, AT&T was the only service provider for iPhones.
If we connect the dots, we can see how the media behemoths are each striving to grow their own “walled garden”. This version of the world is very similar to the future Tim Wu predicts in The Master Switch: The Rise and Fall of Information Empires. And with Verizon, Google, Facebook, Apple, and AT&T, that future does not seem very far away. World 2.0 might resemble Gibson’s world, but that world is light-years ahead of us.

"Internet Map"

At first glance, the very structure of the Internet seems to support Gibson’s view of a world in which anyone can enter cyberspace and access all existing data. After all, the Internet puts everything one could possibly want within easy reach. However, almost without our realizing it, the machines we use on a daily basis no longer serve their original purposes. Wu contends that the new machines are “designed for consumption, not creation” (293). He illustrates this with devices such as the iPod and the iPad, whose attractive interfaces have managed to draw attention away from the lack of power given to the individual. A counterargument here could be the ability to develop apps; however, these apps usually cannot be used across different operating systems and must be approved before one can install them on one’s device. The result is an information industry that is more polished and more convenient, but that offers less choice. As Wu suggests, this is very similar to what happened in the information industries of the twentieth century. Gibson underestimated the conglomerates’ strong belief that they alone are apt to identify and fulfill popular desires.
William Gibson is quick to notice the importance of the lone individuals who work their magic in cyberspace. The “console cowboys”, as he names them, have the power (the brains) to rule the world, to access everything. The main character, Case, is a lone cyber-cowboy who works from the shadows and has no higher aspirations. However, Gibson fails to grasp the complexity of human beings. Drawing on the theories of the economist Joseph Schumpeter, Wu points out that there exists “a special breed”: those people who dream of becoming gods, of founding private empires (29). After all, who does not wish to be the boss, to be better than everyone else? Whether one wants to serve one’s nation, as in the case of Mr. Vail, or simply to enter celebrity circles, as in the case of Steven Ross, most people dream of greatness and of being renowned for their deeds.
In the world of Neuromancer, Case does everything he is told, even if he does not know who is actually behind the decisions. His indifference is perfectly illustrated at the end, when he does not care that the A.I. has become “the sum total of the works” (269). Case can do all this because he is an outsider, a disinterested party. Wu emphasizes the importance of distance in the process of creating. While outsiders “can afford to scuttle a perfectly sound ship”, those closer to an industry “face a remarkably constant pressure not to invent things that will ruin their employer” (Wu 20). At the same time, Wu stresses the importance of the “right” distance for innovation, which is becoming ever more difficult to achieve. When companies like Microsoft and Apple take a stand against open source or free software, it becomes obvious that we are heading in the opposite direction from the world of outlaws Gibson envisioned. One might argue that all these companies are just trying to keep their businesses running, but these giants reached their current stage of development thanks to the openness of the Internet. Now they want to close the circle, so that other players cannot disrupt their lasting supremacy. Even Google, the epitome of openness, has become a partner of Verizon – the same company that is eager to block free access to sites such as Netflix.

Wu shows how, as a result of this forced closure, creativity and innovation are severely limited. He describes the Kronos effect to explain how the natural process of innovation can be hindered by monopolistic companies: “the efforts undertaken by a dominant company to consume its potential successor in their infancy” (25). This has serious consequences, as the ranks of lone outsiders are decimated and, more importantly, they are discouraged from reaching their full potential. As it becomes more difficult to penetrate the web created by large corporations, the idea of a 3D cyberspace, of the “consensual hallucination” that Gibson imagines, remains far-fetched.
Gibson feared that once we gained access to technology we would not be able to understand its power and would end up building artificial intelligences and destroying our humanity. Nevertheless, this has not occurred. We have developed AIs, but we are still far from becoming one with the machine. Apparently, the big corporations envisioned the same outcomes and decided to rescue us from a bleak future by controlling our power. The issue of controlled content has also been tackled by Lawrence Lessig in his work “Open Code and Open Societies”. He underlines the Internet’s original character: “a space built on a commons, where because most early code governing the net was open code, and where because of the architectural principle of end-to-end, the network owner could not control how the net would be used – the resource of the net was left open for innovation”, but makes it clear that things have changed and that the trend is to control everything one can do on the Internet.
There is another aspect that Gibson did not take into consideration: the role of the government. The only example of a law enforcement body is the Turing Police, which completely fails to accomplish its task. At present, however, it is evident that people still turn to the government for guidance; otherwise riots would break out all over the globe. We have not yet reached the stage where we reject government interference in our actions. Thus, Gibson fails to analyze the implications of the existence of a government. Wu, on the other hand, understands that federal agencies play a significant part. As he aptly emphasizes: “history shows that in seeking to prevent the exercise of abusive power in the information industries, government is among the actors whose power must be restrained” (304). So, while the government may be the only actor that can divide the power of the information industry and prevent a monopoly, it is also the one that can intervene in the market in favor of a new technology or a large corporation.
Wu’s future of a world ruled by megacorporations is definitely the most plausible one. However, this conclusion goes beyond the arguments that he presents. For me, it is simply the result of acknowledging the current state of affairs in the information industry. The vision of monopolies taking over what we see or do on the Internet is no longer a science fiction scenario, but something that could happen very soon.
What Gibson imagined remains a revolutionary idea. Still, it is worth mentioning that both options have a negative impact on humanity. We can choose between two evils: the absence of freedom and too much freedom. It will also be interesting to see whether, once Wu’s fears have materialized, Gibson’s world might actually become a desirable escape. Wu argues that the attractiveness of Hollywood blockbusters and of dazzling devices such as smartphones makes most of us forget the danger of an information empire:
“To see what is sacrificed to such efficiency, polish, and convenience, however, takes work, and to some it may forever remain invisible. It requires appreciation of the merits of systems in which, so to speak, the trains do not always run on time. It requires appreciation of the forms of speech and technical innovation that are excluded in the name of perfection and empire” (305).
While this is a valid point, there still exists a remote possibility that everyone will understand the dangers of monopolies once we find ourselves face to face with the consequences of ruthless empires taking over. In that case, Gibson’s wonderland might not seem just an illusion. Indeed, a rebellion against the corporate system might be set in motion, and it might eventually lead to lone console cowboys ruling the world.

Works Cited:
Wu, Tim. The Master Switch: The Rise and Fall of Information Empires. New York: Knopf, 2010. Print.
Gibson, William. Neuromancer. New York: Ace Books, 1984. Print.
Lessig, Lawrence. “Open Code and Open Societies.” Web. http://www.lessig.org/content/unpublished/. 23 Feb. 2011.