Tuesday, April 19, 2011

Wandering in Wonderland

I made it to my vantage point safe and sound after narrowly avoiding a collision with two students completely immersed in texting on their smartphones. Now everything is peaceful. Well, not really. Groups of students march from one place to another, sharing laughs and gossip. A few of them are alone, but they are accompanied by the tunes flowing into their ears. Personally, I find it quite ironic that so many still believe we are far from a migration to the virtual world, when we have already replaced the sounds of nature with constant digital sounds, and the views that surround us with photos from our friends’ social profiles.
However, it is clear that cyberspace is not always bad for us; it depends on how one uses it. I believe this first-year seminar helped me better understand what cyberspace really is and what lies behind it, and I am not referring here to the underlying code. Cyberspace has been shaped into what we see today by our wishes and needs, such as efficiency and permanent connectivity. It is amazing that I was able to do most of my research for the projects in this course online. Nevertheless, before taking this seminar, I would have taken it all for granted. I want to let myself be surprised more often by what cyberspace can offer. After all, it creates wonderlands where different people can simply have fun and take a break from reality. And who controls all these wonders? We want to believe that we have the power. However, perhaps the most important thing I have learnt is that there are moguls vying to control the switch to cyberspace. Am I afraid that someone will take over cyberspace and create a virtual reality to replace our actual reality? I do not know. I guess the novels I have read and the movie “The Matrix” just made me imagine possible scenarios. I still want to explore cyberspace, not simply label it as something scary and quit using it. Maybe the key is moderation, or maybe simply awareness.
It seems as if I have written down all my worries as well as my dreams. As I advanced in this course, I realized there is much more to cyberspace than I thought. It made me realize that I want open access to the Internet, but there are plenty of people out there eager for a switch that would restrict that access. There is more to it than the apparent ease of wandering through cyberspace.
I wonder if all these students running around would prefer to keep using their flashy devices and maintain only a shallow connection to cyberspace… In the future, it might be entertaining to use a virtual world in the course to get a better grasp of how these digital worlds function and what they could be used for. After Gibson’s consensual hallucinations and Anderson’s dumbed-down universe, direct experience with a digital world might be refreshing.

Tuesday, April 12, 2011

Wide Open Virtual World

Massively multiplayer online games are addictive, a waste of time, reclusive, destructive. These are just a few of the characteristics that many young people come up with when they think of games. So why is Dr. Castronova devoting an entire book to the subject of migrating to the digital world through play? Maybe there is something more to these online games. As Castronova emphasizes in his book Exodus to the Virtual World, most of these games bring to life a new world, a virtual one, but at the same time one that has the proportions of a real world. Knights that fight dragons and save princesses with magic swords… Or the same world envisioned in the movie The Lord of the Rings, or any other movie you can think of. Presently, any successful movie can be turned into a game. You do not have to limit yourself to watching blue avatars on the big screen; you can actually be the one who saves Pandora (Avatar). Moreover, you can have this experience as many times as you would like.

Pandora in Second Life

Still believe that gaming is just a wasteful activity? Why would we want to be heroes in a different universe when we have our own goals that we strive to achieve every day? Personally, I think being completely pleased with ourselves is an elusive concept. In one scenario, you might not have a very successful life. Years pass and you feel that your role in this life is meaningless. This is the category of people who, according to Castronova, are most likely to embrace virtual worlds: “Not everyone lives in a community with rich traditions, faiths, and stories that put meaning into everyone’s life, whereas in synthetic world, everyone is asked to complete quests, fight enemies, and become a hero” (69). On the other hand, we could accomplish everything we aimed for: the high-paying job and the perfect family. But what comes next? You can make a change in the world, reach high recognition, and then what? Even if we can do all this, we are still human, so we will always desire more. It is in our nature; that is why, if we do not set the bar high, boredom might settle upon our lives.
So, are we all going to move to the virtual land? It is still unclear, but I can see how this exodus could bring benefits to everyone regardless of status, money, race, or appearance. We all want to be challenged and to overcome obstacles in order to feel good about ourselves. Additionally, we want to have fun, to simply enjoy our existence and be happy. As Castronova puts it, “helping people find happiness may involve something other than giving them the things they currently seek” (88). This is what these online games promise.
Fantasy is how many of us describe these virtual games. However, the number of people who commit themselves to gaming is far from unreal. The virtual world is not so virtual anymore; its implications will affect our “real” lives sooner or later.

Work Cited:
Castronova, Edward. Exodus to the Virtual World: How Online Fun Is Changing Reality. New York: Palgrave Macmillan, 2007. Print.

Monday, April 4, 2011

Project 3 Final: The Power of Our Brains

“Computing machines can do readily, well, and rapidly many things that are difficult or impossible for man, and men can do readily and well, though not rapidly, many things that are difficult or impossible for computers. That suggests that a symbiotic cooperation, if successful in integrating the positive characteristics of men and computers, would be of great value” (J. C. R. Licklider).


In 1960, in his paper “Man-Computer Symbiosis”, Licklider described how humans and machines, each with positive and negative characteristics, could complement one another. This cooperation was based on the idea that computers are tools that can reduce the time humans spend on tedious tasks such as finding basic information or making calculations. Licklider envisioned the way personal computing would develop and the design of the Internet. He admits, though, that a truly perfect interaction between humans and machines depends on the input and output equipment (“Man-Computer Symbiosis”).
While Licklider imagined that speech recognition could be a viable solution for the future of communications, voice-commanded computers have mostly been seen in movies rather than in real life. In recent years, researchers have directed their efforts towards brain-computer interfaces. This paper examines the recent developments in the technology behind human-machine interfaces and the prospective applications of this technology. The analysis shows that this technology provides great opportunities, particularly for disabled people, while it can also be used in the military, the entertainment industry, and communication. As with any other scientific breakthrough, there have been several negative reactions, but these cannot hide the potential of this technology.
A brain-computer interface (BCI) is a way of communicating between a human being and a computer through thoughts rather than motor movements. As explained in the article “From Your Brain to the Machine”, BCIs are controlled by brain waves generated by particular thoughts (28). In most research experiments, electrodes have been implanted on the brain’s surface in order to process the brain waves. However, chip-like devices that can be implanted in the brain, allowing the mind to control computers, have also been developed. For instance, BrainGate is one neural implantable device that has already been successfully tested on humans. According to its website, BrainGate, the company behind these chips, aims to improve the condition of severely disabled people, such as quadriplegics and Parkinson’s patients, with the help of neural implantable devices (Company).
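To make the idea of being “controlled by brain waves” concrete, here is a minimal sketch in Python. It is purely illustrative and not BrainGate’s or any cited study’s actual decoding method: the sampling rate, the choice of the mu band, and the threshold are my own assumptions, and the “recording” is a simulated signal rather than real electrode data.

# Illustrative sketch only: simulate one window of an EEG-like signal,
# measure its power in the 8-12 Hz (mu) band, and map that to a command.
# Assumptions: 256 Hz sampling, a fixed threshold, strong mu power = "rest".
import numpy as np

FS = 256          # sampling rate in Hz (assumed)
WINDOW_SEC = 1.0  # length of the analysis window in seconds

def band_power(signal, fs, low, high):
    """Average spectral power of `signal` between `low` and `high` Hz."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].mean()

def decode_command(window, fs=FS, threshold=50.0):
    """Map one window of samples to a command: strong mu-band activity is
    treated as 'rest', suppressed mu-band activity as 'move'."""
    mu_power = band_power(window, fs, 8, 12)
    return "rest" if mu_power > threshold else "move"

if __name__ == "__main__":
    t = np.arange(0, WINDOW_SEC, 1.0 / FS)
    # A 10 Hz rhythm plus noise stands in for recorded brain activity.
    simulated = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(len(t))
    print(decode_command(simulated))

In a real system, the simulated signal would be replaced by recordings from implanted or scalp electrodes, and the fixed threshold by a classifier trained on each user.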

Although this interaction between humans and machines is often considered purely science fiction, the possibility of direct communication between humans and machines has been investigated since 1973 (Vidal). At present, innovation in technology takes place at an extremely fast pace. Therefore, the fact that research in this field has been conducted for almost four decades with no definite results might actually lead many people to claim that BCI research should be dismissed, as it will never be safe. However, the slow development of brain-computer interface technology can be explained by the challenges researchers face when dealing with the human brain. The paper “Brain-Computer Interfaces as New Brain Output Pathways” discusses some of the limitations of BCIs, which can be caused by inexperienced users and, most importantly, by the variability and the wide distribution of brain functions (Wolpaw 615).
In recent years, there have been successful experiments on animals as well as on human subjects. Furthermore, as presented in Patrick Tucker’s article “Prospects for Brain-Computer Interfacing”, researchers strongly believe that these technologies will develop more quickly in the near future (8). At the University of Pittsburgh, researchers have used brain-computer interface technology to teach a monkey to control a robotic arm with its thoughts alone. This is just one example of the potential of direct brain-machine communication. The technology could provide a new life for people with severe motor disabilities, as has already been demonstrated. In 2008, CBS’s 60 Minutes covered a story on BrainGate, admitting that “Once in a while, we run across a story that is hard to believe until you see it. That’s how we felt about this story when we first saw human beings operating computers, writing emails, and driving wheelchairs with nothing but their thoughts” (“Brain Power”).
In the article “Chips in Your Head”, Ohl and Scheich note that 400,000 Americans suffer from partial or total paralysis due to spinal cord injuries (65). Therefore, there is an evident need for brain-computer interfaces that could provide new ways of interaction for paralyzed people. While people will not have their own blue avatar, researchers are trying to move towards the kind of world envisioned in the movie Avatar. Besides brain-wave control of computers, another future prospect is the control of external prosthetics.

In an excerpt from his new book published in Scientific American, Nicolelis provides a vivid picture of the coming “Age of Machines” by showing the numerous applications of BCIs (81). For instance, the exoskeleton, a robotic suit that allows physically impaired people to move, could be controlled solely by thought. Nicolelis’s positive attitude towards this technology stems from the idea that human beings will finally break away from the limits of their own bodies: “I feel an intense calling to embrace the amazing opportunities that freeing our brains from the limits of our terrestrial bodies can bring to our species. In fact, I wonder how anyone could think otherwise, given the tremendous humanistic prospects that brain-machine interface research promises to unleash” (81).
The extent to which BCIs have reached a larger audience is shown by the transhumanism movement. Transhumanism is a movement that promotes the enhancement of humans with the aid of technology. The information available on Humanity+, the website that promotes this movement, is centered on the idea that posthumans will become extremely intelligent and resilient to diseases thanks to technologies such as genetic engineering, neural interfaces, or anti-aging therapies (Transhumanist FAQ). Even with these limitless promises, there have been serious concerns regarding the moral implications of such devices. Many fear that this is simply a human desire to play God, one that will lead to dramatic consequences for all of humanity.
However, this new form of interaction between humans and computers is not meant solely as a tool for extending the capabilities of the human body. In an interview for Il Sole, Nijholt explains that BCIs could simply be used at home for turning on the lights, but an even more interesting use would be in gaming. Games that use a BCI have already been tested. As mentioned on the Emotiv website, the company has already developed a wearable interface that measures and interprets brain activity as well as facial expressions (EPOC neuroheadset). Furthermore, in 2007, the technology news website Ars Technica announced that an interface had been developed that allows people to walk or fly their avatars in Second Life just by thinking about it (“Researchers help users control Second Life avatars via brain activity”). This technology has the potential to redefine the entire concept of immersion in virtual realities. Two experiments have been conducted to evaluate how a BCI can be used in a Cave-like virtual reality system; while several limitations of the BCI are exposed, the results also show the level of engagement: some participants reported identifying with their avatars and found it more natural to use a BCI in a virtual environment (Friedman et al. iv).
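As a rough illustration of how a game might consume such decoded commands, the sketch below maps a stream of hypothetical mental-command labels onto avatar actions. The command names and the Avatar class are invented for this example; this is not Emotiv’s SDK, the Second Life interface described above, or any product’s real API.

# Hypothetical sketch: dispatch decoded mental commands to avatar actions.
from dataclasses import dataclass

@dataclass
class Avatar:
    x: float = 0.0
    y: float = 0.0
    altitude: float = 0.0

    def walk_forward(self, step=1.0):
        self.y += step          # move the avatar forward on the ground

    def fly_up(self, step=1.0):
        self.altitude += step   # raise the avatar into the air

    def hover(self):
        pass                    # no detected intent: stay in place

# Dispatch table from (invented) decoded mental commands to actions.
ACTIONS = {
    "push": Avatar.walk_forward,   # imagining pushing -> walk forward
    "lift": Avatar.fly_up,         # imagining lifting -> fly upward
    "neutral": Avatar.hover,       # nothing detected -> hover
}

def apply_command(avatar, command):
    ACTIONS.get(command, Avatar.hover)(avatar)

if __name__ == "__main__":
    me = Avatar()
    for cmd in ["push", "push", "lift", "neutral"]:
        apply_command(me, cmd)
    print(me)   # Avatar(x=0.0, y=2.0, altitude=1.0)

The point of the sketch is only the architecture: the game logic stays ordinary; what changes is that the command labels come from a brain-signal decoder instead of a keyboard.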
While this blurring of the limits between the real and the digital world might be considered a negative aspect, there are numerous educational purposes for which mind-controlled games could be used. In many of these applications concentration is key and, since teenagers are likely to devote their complete focus to a game, they can learn at the same time. Additionally, once BCIs permeate the entertainment industry, the cost of these devices will become more affordable for a larger audience.
Another relevant use of such systems is in simulations, especially for military training. As stated in the online “Department of Defense Fiscal Year (FY) 2010 Budget Estimates”, DARPA is currently working on the Silent Talk project, which will enable “user-to-user communication on the battlefield without the use of vocalized speech through analysis of neural signals” (12). DARPA is also investigating the possibility of telepresence, a concept in which a soldier is “physically present at a base or concealed location, but has the ability to sense and interact in a removed, real-world location through a mobile BCI device” (Kotchetkov et al. iv, 4). While there are ethical dilemmas about implanting chips into soldiers’ minds, these technologies could have a significant impact on reducing the number of people who lose their lives in combat.
In an interview from 2004, Sergey Brin, the co-founder of Google, made a very intriguing remark: “it’s credible to imagine a leap as great as that from hunting through library stacks to a Google session, when we leap from today’s search engines to having the entirety of the world’s information as just one of our thoughts” (“Google Guys”). While this idea was not taken seriously back in 2004, connecting all human brains to the Internet might now seem the ultimate goal of BCI technology. This paper has introduced several applications of BCIs that are already functional or that will be used in the near future. The question that arises is whether this new channel of communication between humans and the Internet will lead to a new stage in the evolution of humanity.
In The Age of Spiritual Machines, Kurzweil goes as far as envisioning a technology that will allow humans to “port their entire mind file” into a computer and thus reach immortality (126). The predictions he made for 2009 included wireless communication, computers in all sizes, virtual business transactions, special devices for people with sensory impairments, and cyber music (277). Most of these ideas have materialized, so is Kurzweil right about the future of the human-machine interface? As BCIs are currently designed, the devices serve as a one-way channel that transmits commands to a computer or robot. Furthermore, all the applications of this technology presented in this paper still require years of research to go mainstream. Therefore, it is very difficult to predict the next step of BCI research, since the devices and technologies currently under investigation might not become applicable in a broader context.
What Kurzweil might be right about is achieving access to all knowledge. Although it will not be the exact scenario from The Matrix, where one can insert a chip and immediately know how to fly a helicopter, neural chips could provide a way to look through a huge database of information – the Internet. This could potentially change how search engines are organized. Instead of keywords, results would be based on one’s thoughts and feelings, which are likely to be more accurate. This brings to light another concern over the capabilities of BCIs. Many worry that such devices could read one’s mind. As Wolpaw explains, a BCI cannot read minds; it enables the user to develop new skills that are coordinated by brain signals instead of muscles (“Brain-Computer Interface Research Comes of Age” 352).
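Returning to the search-engine idea above, the contrast between today’s keyword lookup and a hypothetical thought-driven search can be sketched as follows. Everything here is made up for illustration: the documents, the three-dimensional “thought vector”, and the similarity ranking are assumptions, not a description of any real search engine or BCI.

# Speculative sketch: keyword matching vs. ranking by similarity to a
# decoded "thought vector". All data below is invented for illustration.
import numpy as np

DOCS = {
    "helicopter flight manual": np.array([0.9, 0.1, 0.0]),
    "history of aviation":      np.array([0.6, 0.3, 0.1]),
    "cooking with herbs":       np.array([0.0, 0.2, 0.9]),
}

def keyword_search(query):
    """Today's model: return documents whose title contains the query term."""
    return [title for title in DOCS if query.lower() in title.lower()]

def thought_search(thought_vector, top_k=2):
    """Hypothetical model: rank documents by cosine similarity to a decoded
    'thought vector' instead of matching literal keywords."""
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    ranked = sorted(DOCS, key=lambda t: cosine(DOCS[t], thought_vector), reverse=True)
    return ranked[:top_k]

if __name__ == "__main__":
    print(keyword_search("helicopter"))
    print(thought_search(np.array([0.8, 0.2, 0.0])))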
As impressive as it might sound to have a permanent wireless link to the Internet, further research into the brain’s intricacies might determine whether it would be possible to connect one brain to another. In his book Grown Up Digital, Tapscott mentions a conversation with the founder of Facebook, Mark Zuckerberg, in which Zuckerberg explains his vision: “In order for us to be successful in this century, we’re going to need to be more connected and we’re going to need to have a better understanding of where other people are coming from and just greater sense of like, we’re all connected” (qtd. in Tapscott 55). Thus, brain-to-brain connections would achieve Facebook’s main goal but, at the same time, they might render social networks obsolete. The BCI promises a web of intelligence and, as Nicolelis imagines, a net in which “individual participants not only communicate back and forth with one another just by thinking but also vividly experience what their counterparts feel and perceive as they seamlessly adhere to this true ‘mind meld’” (83).
There are definitely numerous applications of brain-computer interfacing. Current research shows that the medical applications, as well as the military ones, could make a positive difference in the lives of many people. On the other hand, BCIs are most likely going to be used on a large scale in virtual reality. As for the way everyone will interact with computers on a daily basis, Stibel argues in Wired for Thought that the Internet will come to resemble a brain, and so the connection between human brains and the Internet will become a reality: “One day, entering thoughts, feelings, or aspirations into the Web will generate a similar process – it will open up endless information, dynamically organized based on your thoughts” (154).

Once confined to the pages of science fiction, brain-computer interfaces have attracted the interest of the scientific world in recent years. The research conducted so far shows that the prospects of this new way of communicating between humans and machines are immense in medicine, as well as in military and entertainment applications. Even though this paper’s focus is not the ethics behind BCIs, the moral factor is relevant for further research. It remains the most powerful argument against having a neural chip implant and thus altering the human body, which many see as a violation of an unwritten moral code. However, considering that BCIs have already enabled people with severe motor impairments to express themselves in new ways, developments in this field will continue at a faster pace. Indeed, taking into account all the advances presented, the world that the writer Gibson envisioned in 1984 might become tangible in the near future.

Works Cited
"From Your Brain to the Machine." Mechanical Engineering 132.11 (2010): 28-29. Academic Search Complete. Web. 27 Mar. 2011.
Cheng, Jacqui. “Researchers help users control Second Life avatars via brain activity”. Ars Technica. 15 Oct. 2007. Web. 27 Mar. 2011. http://tinyurl.com/5um2g6b.
Chittaro, Luca. “Playing With Brain-Computer Interfaces. An interview with Anton Nijholt”. Il Sole 24 Ore (Italy, 5 April 2008). Web. 27 Mar. 2011. http://tinyurl.com/4dlwwac.
Friedman, D., R. Leeb, G. Pfurtscheller, and M. Slater. "Human-Computer Interface Issues in Controlling Virtual Reality With Brain-Computer Interface." Human-Computer Interaction 25.1 (2010): 67-94. Academic Search Complete. Web. 27 Mar. 2011.
Kotchetkov, I.S., B.Y. Hwang, G. Appelboom, C.P. Kellner, and E.S. Connolly Jr. “Brain-Computer Interfaces: Military, Neurosurgical, and Ethical Perspective.” Neurosurgical Focus 28.5 (2010). Web. 27 Mar. 2011.
Kurzweil, Ray. The Age of Spiritual Machines: When Computers Exceed Human Intelligence. New York: Viking, 1999. Print.
Licklider, Joseph C.R. “Man-Computer Symbiosis.” IRE Transactions on Human Factors in Electronics HFE-1.1 (1960): 4-11. Web. 27 Mar. 2011.
Nicolelis, Miguel A. L. "Mind Out of Body." Scientific American 304.2 (2011): 80-83. Academic Search Complete. Web. 27 Mar. 2011.
Ohl, Frank W., and Henning Scheich. "Chips in Your Head." Scientific American Mind 18.2 (2007): 65-69. Academic Search Complete. Web. 27 Mar. 2011.
Sheff, David. “Google Guys”. Playboy, September 2004. Web. 27 Mar. 2011. http://tinyurl.com/dfrujk.
Stibel, Jeffrey M. Wired for Thought. Boston, MA: Harvard Business, 2009. Print.
Tapscott, Don. Grown up Digital: How the Net Generation Is Changing Your World. New York: McGraw-Hill, 2009. Print.
Tucker, Patrick. "Prospects for Brain-Computer Interfacing." Futurist 44.5 (2010): 7-9. Academic Search Complete. Web. 27 Mar. 2011.
Vidal, Jacques J. “Toward Direct Brain-Computer Communication.” Annual Review of Biophysics and Bioengineering 2 (1973): 157-180. Web. 27 Mar. 2011.
Wolpaw, J. R. "Brain-computer Interfaces as New Brain Output Pathways." The Journal of Physiology 579.3 (2007): 613-19. Web. 27 Mar. 2011.
Wolpaw, Jonathan R. "Brain-Computer Interface Research Comes of Age: Traditional Assumptions Meet Emerging Realities." Journal of Motor Behavior 42.6 (2010): 351-353. Academic Search Complete. Web. 27 Mar. 2011.
“Department of Defense Fiscal Year (FY) 2010 Budget Estimates”. Defense Advanced Research Projects Agency. May 2009. Web. 27 Mar. 2011. http://tinyurl.com/62cura5.
“Brain Power.” 60 Minutes. CBS. Web. 27 Mar. 2011. http://tinyurl.com/ma7pr2.

Tuesday, March 29, 2011

FEED

What does the future look like with technology advancing at such a fast pace? In the novel Feed, Anderson depicts a world in which computers are not outside the body, but inside one’s brain. Through a neural chip implanted at birth, called the feed, people get access to all the available information, watch the news and movies, see all the new advertisements, make purchases, and talk with other people. Is it not the world that we all dream of? We could be permanently connected to a net with a much larger scope than the Internet. We could get all the latest information and become “supersmart without ever working” (Anderson 47).
What more can we ask for? Actually, the main character in the novel, Titus, explains that “the braggest thing about the feed, the thing that made it really big, is that it knows everything you want to hope for, sometimes before you even know what those things are” (48). But is this the greatest or the worst part about the feed? This passage brings to light the real state of things: mega corporations such as Feedlink or American Feedware control what people wish for and can even create false desires. Personally, I found it intriguing to wonder whether Anderson is criticizing a society that relies too much on technology or, rather, our own reality in which advertising dictates our preferences. How many ads make us feel as if we were not able to live without one specific product? It is questionable whether companies can really predict what we will need or want, or whether they are only giving us the impression that this is the thing that can complete our existence. In the latter case, the goal would be to make us all buy more and, thus, spend more money.
Somehow I think that many are aware of the negative influence of advertisements, but no real action is taken. An interesting perspective in Feed is that of the character Titus: “Of course, everyone is like, da, da, da, evil corporations, oh they’re so bad, we all say that, and we all know they control everything. I mean, it’s not great, because who knows what evil shit they’re up to. Everyone feels bad about that. But they’re the only way to get all this stuff, and it’s no good getting pissy about it, because they’re still going to control everything, whether we like it or not” (48-49). Is this our own excuse? We admit that ads are evil but, at the same time, we believe that there is no point in rising against them and the media giants behind them, because we could never succeed. Anderson seems to condemn exactly this feeling of complacency. After all, many claim that advertisements do not alter their choices, but when a “revolutionary” pair of jeans appears on a huge billboard, they all rush to buy it. Or have we reached the moment when the mega corporations that aim to shape our decisions have become a necessary evil?

Anderson, M.T. Feed. Cambridge, Mass.: Candlewick Press, 2002. Print.

Sunday, March 27, 2011

Project 3: The Power of Our Brains

“Computing machines can do readily, well, and rapidly many things that are difficult or impossible for man, and men can do readily and well, though not rapidly, many things that are difficult or impossible for computers. That suggests that a symbiotic cooperation, if successful in integrating the positive characteristics of men and computers, would be of great value.” (J. C. R. Licklider)

In 1960, in his paper “Man-Computer Symbiosis”, Licklider described how humans and machines, each with positive and negative characteristics, could complement one another. This cooperation was based on the idea that computers are tools that can reduce the time humans spend on tedious tasks such as finding basic information or making calculations. Licklider envisioned the way personal computing would develop and the design of the Internet. He admits, though, that a truly perfect interaction between humans and machines depends on the input and output equipment.
While Licklider imagined that speech recognition could be a viable solution for the future of communications, voice-commanded computers have mostly been seen in movies rather than in real life. In recent years, researchers have directed their efforts towards brain-computer interfaces. This paper examines the recent developments in the technology behind human-machine interfaces and the prospective applications of this technology.
A brain-computer interface (BCI) is a way of communicating between a human being and a computer through thoughts rather than motor movements. As explained in the article “From Your Brain to the Machine”, BCIs are controlled by brain waves generated by particular thoughts (28). In most research experiments, electrodes have been implanted on the brain’s surface in order to process the brain waves. However, chip-like devices that can be implanted in the brain, allowing the mind to control computers, have also been developed. For instance, BrainGate is one neural implantable device that has already been successfully tested on humans. According to its website, the company BrainGate aims to improve the condition of severely disabled people, such as quadriplegics and Parkinson’s patients, with the help of neural implantable devices (Company).
Although this interaction between humans and machines is often considered purely science fiction, the possibility of direct communication between humans and machines has been investigated since 1973 (Vidal). At present, innovation in technology takes place at an extremely fast pace. So the fact that research in this field has been conducted for almost four decades with no definite results might actually lead many people to claim that BCI research should be dismissed, as it will never be safe. However, the slow development of brain-computer interface technology can be explained by the challenges researchers face when dealing with the human brain. Some of the limitations of BCIs are due to the variable frequencies of brain signals and the difficulty of mapping brain activity.
In recent years, there have been successful experiments conducted on animals as well as on human subjects. Furthermore, as presented in Patrick Tucker’s article “Prospects for Brain-Computer Interfacing”, researchers strongly believe that these technologies will develop more quickly in the near future (8). At the University of Pittsburgh, researchers have used brain-computer interface technology to teach a monkey to control a robotic arm with its thoughts alone. This is just one example of the potential of direct brain-machine communication. The technology could provide a new life for people with severe motor disabilities, as has already been demonstrated. In 2008, CBS’s 60 Minutes covered a story on BrainGate, admitting that “Once in a while, we run across a story that is hard to believe until you see it. That’s how we felt about this story when we first saw human beings operating computers, writing emails, and driving wheelchairs with nothing but their thoughts” (“Brain Power”).
In the article “Chips in Your Head”, Ohl and Scheich note that 400,000 Americans suffer from partial or total paralysis due to spinal cord injuries (65). Therefore, there is an evident need for brain-computer interfaces that could provide new ways of interaction for paralyzed people. While people will not have their own blue avatar, researchers are trying to move towards the kind of world envisioned in the movie Avatar. Besides brain-wave control of computers, another future prospect is the control of external prosthetics.

In an excerpt from his new book published in Scientific American, Nicolelis provides a vivid picture of the coming “Age of Machines” by showing the numerous applications of BCIs (81). For instance, the exoskeleton, a robotic suit that allows physically impaired people to move, could be controlled solely by thought. Nicolelis’s positive attitude towards this technology stems from the idea that human beings will finally break away from the limits of their own bodies: “I feel an intense calling to embrace the amazing opportunities that freeing our brains from the limits of our terrestrial bodies can bring to our species. In fact, I wonder how anyone could think otherwise, given the tremendous humanistic prospects that brain-machine interface research promises to unleash” (81).

The interaction between humans and computers is not restricted to medical applications. In an interview for Il Sole, Nijholt explains that BCIs could simply be used at home for turning on the lights, but an even more interesting use would be in gaming. Games that use a BCI have already been tested. As mentioned on the Emotiv website, the company has already developed a wearable interface that measures and interprets brain activity as well as facial expressions (“EPOC neuroheadset”). Furthermore, in 2007, the technology news website Ars Technica announced that an interface had been developed that allows people to walk or fly their avatars in Second Life just by thinking about it (“Researchers help users control Second Life avatars via brain activity”). This technology has the potential to redefine the entire concept of immersion in virtual realities. Two experiments have been conducted to evaluate how a BCI can be used in a Cave-like virtual reality system; while several limitations of the BCI are exposed, the results also show the level of engagement: some participants reported identifying with their avatars and found it more natural to use a BCI in a virtual environment (Friedman et al. iv).
While this might be considered a negative aspect, as the limits between the real and the digital world become blurrier, there are numerous educational purposes for which mind-controlled games could be used. In many of these applications concentration is key and, since teenagers are likely to devote their complete focus to a game, they can learn at the same time. Additionally, once BCIs permeate the entertainment industry, the cost of these devices will become more affordable for a larger audience.
Another relevant use of such systems is in simulations, especially for military training. As stated in the online budget estimates for 2010, DARPA is currently working on the Silent Talk project, which will enable “user-to-user communication on the battlefield without the use of vocalized speech through analysis of neural signals” (12). DARPA is also investigating the possibility of telepresence, a concept in which a soldier is “physically present at a base or concealed location, but has the ability to sense and interact in a removed, real-world location through a mobile BCI device” (Kotchetkov et al. iv, 4). While there are ethical dilemmas about implanting chips into soldiers’ minds, these technologies could have a significant impact on reducing the number of people who lose their lives in combat.
In an interview from 2004, Sergey Brin, the co-founder of Google, made a very intriguing remark: “it’s credible to imagine a leap as great as that from hunting through library stacks to a Google session, when we leap from today’s search engines to having the entirety of the world’s information as just one of our thoughts” (“Google Guys”). While this idea was not taken seriously back in 2004, connecting all human brains to the Internet might now seem the ultimate goal of BCI technology. Up to this point, this paper has introduced several applications of BCIs that are already functional or that will be used in the near future. The question that arises is whether this new channel of communication between humans and the Internet will lead to a new stage in the evolution of humanity.
In The Age of Spiritual Machines, Kurzweil goes as far as envisioning a technology that will allow humans to “port their entire mind file” into a computer and thus reach immortality (126). The predictions he made for 2009 included wireless communication, computers in all sizes, virtual business transactions, special devices for people with sensory impairments, and cyber music (277). Most of these ideas have materialized, so is Kurzweil right about the future of the human-machine interface? As BCIs are currently designed, the devices serve as a one-way channel that transmits commands to a computer or robot. Furthermore, all the applications of this technology presented in this paper still require years of research to go mainstream. Therefore, it is very difficult to predict the next step of BCI research, since the devices and technologies currently under investigation might not become applicable in a broader context.
What Kurzweil might be right about is achieving access to all knowledge. Although it will not be the exact scenario from The Matrix, where one can insert a chip and immediately know how to fly a helicopter, neural chips could provide a way to look through a huge database of information – the Internet. This could potentially change how search engines are organized. Instead of keywords, results would be based on one’s thoughts and feelings, which are likely to be more accurate. This brings to light another concern over the capabilities of BCIs. Many worry that such devices could read one’s mind. As Wolpaw explains, a BCI cannot read minds; it enables the user to develop new skills (“Brain-Computer Interface Research Comes of Age” 352).
As impressive as it might sound to have a permanent wireless link to the Internet, further research into the brain’s intricacies might determine whether it would be possible to connect one brain to another. In his book Grown Up Digital, Tapscott mentions a conversation with the founder of Facebook, Mark Zuckerberg, in which Zuckerberg explains his vision: “In order for us to be successful in this century, we’re going to need to be more connected and we’re going to need to have a better understanding of where other people are coming from and just greater sense of like, we’re all connected” (qtd. in Tapscott 55). Thus, brain-to-brain connections would achieve Facebook’s main goal but, at the same time, they might render social networks obsolete. The brain-computer interface promises a web of intelligence and, as Nicolelis imagines, a net in which “individual participants not only communicate back and forth with one another just by thinking but also vividly experience what their counterparts feel and perceive as they seamlessly adhere to this true ‘mind meld’” (83).
There are definitely numerous applications of brain-computer interfacing. Current research shows that the medical applications, as well as the military ones, could make a positive difference in the lives of many people. On the other hand, BCIs are most likely going to be used on a large scale in virtual reality. As for the way everyone will interact with computers on a daily basis, Stibel argues in Wired for Thought that the Internet will come to resemble a brain, and so the connection between human brains and the Internet will become a reality: “One day, entering thoughts, feelings, or aspirations into the Web will generate a similar process – it will open up endless information, dynamically organized based on your thoughts” (154).


Transhumanism is a movement that promotes the enhancement of humans with the aid of technology. The information available on Humanity+, the website that promotes this movement, states that posthumans will be extremely intelligent and resilient to diseases thanks to technologies such as genetic engineering, neural interfaces, or anti-aging therapies (“Transhumanist FAQ”). Many are against this movement, claiming that it raises serious moral issues, such as playing God.
Even though this paper’s focus is not the ethics behind the BCI, the moral factor is relevant for further research. Taking into account all the advances presented, the world that Gibson depicted in Neuromancer might become tangible in the near future.

Works Cited:
Licklider, Joseph C.R. “Man-Computer Symbiosis.” IRE Transactions on Human Factors in Electronics HFE-1.1 (1960): 4-11. Web. 27 Mar. 2011.
"From Your Brain to the Machine." Mechanical Engineering 132.11 (2010): 28-29. Academic Search Complete. Web. 27 Mar. 2011.
Kotchetkov, I.S., B.Y. Hwang, G. Appelboom, C.P. Kellner, and E.S. Connolly Jr. “Brain-Computer Interfaces: Military, Neurosurgical, and Ethical Perspective.” Neurosurgical Focus 28.5 (2010). Web. 27 Mar. 2011.
Nicolelis, Miguel A. L. "Mind Out of Body." Scientific American 304.2 (2011): 80-83. Academic Search Complete. Web. 27 Mar. 2011.
Sheff, David. “Google Guys.” Playboy, September 2004. Web. 27 Mar. 2011. http://tinyurl.com/dfrujk.
Friedman, D., R. Leeb, G. Pfurtscheller, and M. Slater. "Human-Computer Interface Issues in Controlling Virtual Reality With Brain-Computer Interface." Human-Computer Interaction 25.1 (2010): 67-94. Academic Search Complete. Web. 27 Mar. 2011.
Tucker, Patrick. "Prospects for Brain-Computer Interfacing." Futurist 44.5 (2010): 7-9. Academic Search Complete. Web. 27 Mar. 2011.
Vidal, Jacques J. “Toward Direct Brain-Computer Communication.” Annual Review of Biophysics and Bioengineering 2 (1973): 157-180. Web. 27 Mar. 2011.
Defense Advanced Research Projects Agency. “Department of Defense Fiscal Year (FY) 2010 Budget Estimates.” May 2009. Web. 27 Mar. 2011. http://tinyurl.com/62cura5.
Cheng, Jacqui. “Researchers help users control Second Life avatars via brain activity.” Ars Technica. 15 Oct. 2007. Web. 27 Mar. 2011. http://tinyurl.com/5um2g6b.
Chittaro, Luca. “Playing With Brain-Computer Interfaces. An Interview with Anton Nijholt.” Il Sole 24 Ore (Italy, 5 Apr. 2008). Web. 27 Mar. 2011. http://tinyurl.com/4dlwwac.
“Brain Power.” 60 Minutes. CBS. Web. 27 Mar. 2011. http://tinyurl.com/ma7pr2.
Wolpaw, Jonathan R. "Brain-Computer Interface Research Comes of Age: Traditional Assumptions Meet Emerging Realities." Journal of Motor Behavior 42.6 (2010): 351-353. Academic Search Complete. Web. 27 Mar. 2011.
Kurzweil, Ray. The Age of Spiritual Machines: When Computers Exceed Human Intelligence. New York: Viking, 1999. Print.
Stibel, Jeffrey M. Wired for Thought: How the Brain Is Shaping the Future of the Internet. Boston: Harvard Business Press, 2009. Print.
Tapscott, Don. Grown Up Digital: How the Net Generation Is Changing Your World. New York: McGraw-Hill, 2009. Print.
Ohl, Frank W., and Henning Scheich. "Chips in Your Head." Scientific American Mind 18.2 (2007): 65-69. Academic Search Complete. Web. 27 Mar. 2011.

Tuesday, March 22, 2011

Connected to Facebook or the Matrix?

The Matrix and The Social Network are two movies that bring to life two startling ideas. One introduces us to a reality in which people are controlled by intelligent machines, while the other depicts the reality of Facebook and its founding. Many might say that The Matrix is simply science fiction, whereas The Social Network tells the real story behind the largest social network. Is it really that simple?
Both movies come from Hollywood, a place synonymous with making money. In fact, The Matrix might have more value simply because the time when it was released allowed new, meaningful ideas to be brought to the big screen. A little more than ten years later, things have changed: blockbusters roll out every week, and the emphasis falls on famous actors and actresses and on special effects rather than on content. So, is that not the case with The Social Network, which came out in 2010?
Of course, I do believe that there is no smoke without fire. As the poster for the movie puts it, “YOU DON’T GET TO 500 MILLION FRIENDS WITHOUT MAKING A FEW ENEMIES”. Indeed, it is often much more complicated in the real world. Still, I cannot help noticing that Facebook is one of the few websites that does not annoy its users with countless flashy advertisements. As the movie also shows, Zuckerberg was reluctant about advertising on Facebook from the beginning. This describes him as a person who does not care only about money. Now, if I analyze the characteristics of Wu’s “defining moguls”, there must also be a need for power or greatness. This is also debatable, as Facebook was initially a network restricted to Harvard students and the idea of expansion seems not to have been entirely Zuckerberg’s. In an interview for Time Magazine, he shares his opinion about the movie: “the biggest thing that thematically they missed is the concept that you would have to want to do something — date someone or get into some final club — in order to be motivated to do something like this. It just like completely misses the actual motivation for what we're doing, which is, we think it's an awesome thing to do.”
Going back to The Matrix, the concept presented by the Wachowski brothers seems more startling to me. Even though I have expressed my enthusiasm for the development of AI in previous posts, I have to wonder if this scenario could actually take place. It is ultimately all about control. Intelligent machines could take over our lives and everything we do or think. At the same time, Facebook could control us through advertisements. It is true that specific ads appear on our walls based on our personal information (for example, gender or age), but it is not as if a third party gets access to private information. Moreover, Facebook has proved very useful in promoting democracy during the recent events in Africa. According to Zuckerberg, Facebook is meant to bring the power back to the people. In his book Grown Up Digital, Tapscott shares what he discussed with Zuckerberg in 2008 regarding the future of Facebook: “[Facebook] wasn’t just going to be a social network for friends. It would be a social utility, like a power grid” (64). It will be interesting to see if humans will rely on Facebook to fight the coming age of machines.
Works Cited:
Wu, Tim. The Master Switch: The Rise and Fall of Information Empires. New York: Knopf, 2010. Print.
Tapscott, Don. Grown Up Digital: How the Net Generation Is Changing Your World. New York: McGraw-Hill, 2009. Print.
"Person of the Year." Time (2010). Web. 22 Mar. 2011. http://tinyurl.com/2u4jmvf.

Tuesday, March 15, 2011

The Technology behind Human Beings

Human beings have a dual nature, with good and bad merged together. As much as we would like, we can never become perfect. All our creations are made the same way, with both good and bad parts. So, is technology the evil that will destroy our planet? No, because there are no completely bad or completely good technological advances, only shades of gray…
Apparently, many people understand that technology is needed. However, what we are most afraid of is that we will soon build intelligent machines that will destroy us or, at least, control everything we do. That is one of the leading arguments of Kaczynski in his manifesto against technology. But he is not the only person who predicts such a bleak future. Personally, I find it ironic that we are afraid of robots that could think and talk just like a human being, yet we do not even consider the harm that our dependence on smartphones, social networks, and the Internet could cause. I think this second scenario is more dangerous (and more relevant at present) because we do not even realize that something else is leading our lives.
What Kaczynski proposes is a return to the wild. Nevertheless, going back to a primitive stage would only trigger another cycle of evolution. It is in our nature to strive for something important. There is no other way. If we all set goals that we could easily achieve, what would we do next? Live a life of boredom. It is true that we often complain that we have too much to do, but can you imagine not doing anything meaningful for years? This is how we, as humans, function.
Yes, in our quest for greatness, we may go so far as to build machines that are smarter than us and that will try to take us down. But once we realize this, we will fight for our freedom. One beautiful thing about being human is that, when chained, we always strive to break loose. After all, we are the most complex creatures, and artificial machines are just tools that we have created to assist us. If there is a change in the natural poles of power, we will fight back for supremacy.
So, human beings are flawed. But our flaws make us unique and help us survive. I am probably too optimistic, but that is simply because I enjoy technology, or at least the good parts of it. I also believe that this digitized world makes us appreciate nature more. After a day in front of a computer, surrounded by noise, I feel the need to go out and breathe.