This post is hopefully the last in the series on Pune’s 1st Mini Debian Conference #MiniDebconfIndia, specifically in connection with distributed computing projects and SETI@home in particular. This post is part of events which happened on Day 1.
In the last 10 minutes of his presentation, Vikram shared his experience of playing/working with a distributed computing project. His was SETI@home (Search for Extraterrestrial Intelligence), one of the pioneers of distributed computing, which has been up and running since May 1999. The basic idea itself is pretty simple. There are billions and billions of stars and planets, and a similarly vast number of galaxies — some 60 million galaxies counted to date. With such huge numbers, the possibility that intelligent life (as we assume/know it) exists elsewhere is worth taking seriously. The Universe and its galaxies have been around far longer than us; on a cosmic calendar, Homo sapiens i.e. us have only appeared in the last three/five minutes, so the possibility of extraterrestrial intelligent life is not without merit. One just needs to read Carl Sagan and Isaac Asimov to see what possibly could be. Taking that argument further, it is also highly possible/probable that they may be intellectually and technologically superior to us. Again logically, if they are technologically superior to us, then they have known about radio waves far longer than we have. We predicted/discovered radio waves only in the 19th century, so they may have been using them way before that.
Now as we all know, radio waves propagate as a band of frequencies, and as already elaborated, it is a large sky. Scientists get the data from the Arecibo Observatory, which is somewhat similar to our GMRT near Pune. The idea is basically to point the telescope at a patch of sky, listen on/map that patch within a certain frequency band, and see if some radio signal is being emitted that has a distinctive signature. They are looking for something like what our cable/direct-broadcast or communications satellites emit: some specific activity on a specific band that tells us they are there. It is a finding-a-needle-in-a-haystack scenario. Now the data that comes through is raw, and it needs to be processed to weed out any radio contamination (if any) and to independently verify whether something out of the ordinary is there. This whole process of cleaning and verifying data takes huge computing resources and time.
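The signature hunt described above boils down to spectral analysis: take a stretch of samples, transform it to the frequency domain, and flag any bin that rises far above the noise floor. Here is a minimal, hypothetical sketch in Python — the function name, the SNR threshold, and the synthetic data are all illustrative assumptions, not SETI@home’s actual pipeline:

```python
# Illustrative sketch only: synthesize noisy "radio" data, then look for a
# frequency bin whose power stands far above the noise floor. SETI@home's
# real pipeline is far more sophisticated (chirped signals, drift rates, etc.).
import numpy as np

def find_narrowband_peak(samples, sample_rate, snr_threshold=10.0):
    """Return (frequency_hz, snr) of the strongest bin if it exceeds
    snr_threshold relative to the median bin power, else None."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    noise_floor = np.median(spectrum[1:])      # skip the DC bin
    peak = int(np.argmax(spectrum[1:])) + 1
    snr = spectrum[peak] / noise_floor
    return (freqs[peak], snr) if snr >= snr_threshold else None

# Usage: a weak 1 kHz tone buried in Gaussian noise is still found,
# because its power is concentrated in a single frequency bin.
rng = np.random.default_rng(0)
t = np.arange(0, 1.0, 1.0 / 8000.0)
data = rng.normal(0, 1, t.size) + 0.2 * np.sin(2 * np.pi * 1000 * t)
hit = find_narrowband_peak(data, 8000.0)
```

The same trick is why narrowband beacons are attractive search targets: natural radio sources smear their energy across wide bands, while an artificial carrier concentrates it into one bin.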
Now instead of having a dedicated computer/data centre do this work, it was thought to be a better idea if some of the spare computing cycles of a PC (Personal Computer) could be used. If one is reading/working on a spreadsheet or presentation, or reading an e-book or something like that, a lot of computing cycles go to waste while the machine is still drawing full power. These spare computing cycles can be effectively used by SETI@home or any other distributed computing project.
The whole architecture runs something like the following. There are central servers (with quite a bit of computing resources) whose main job is to break the raw data into small packets that the clients (i.e. our computers) can understand and process, and then to collect that data back. The same packet is sent to 2-3 computers (in some cases even more) so that the results can be independently verified. On the client side this job is done by software such as BOINC. When 3 or more clients give the same result, that processed data is taken as good. Now of course, this is somewhat of a compromise, because it all relies on the assumption that microprocessors and memory do not go bad; otherwise they will return a wrong solution. There have also been known cases of specific chips which gave bad answers to this sort of computation. This process/procedure is common to almost all distributed computing projects.
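The agreement step above can be sketched in a few lines. This is not actual BOINC code — the function name and quorum value are illustrative assumptions — but it shows the idea: the same work unit goes to several clients, and a result is accepted only when enough of them agree.

```python
# Hypothetical sketch of quorum-based result validation, as described above.
# A flaky CPU or bad memory stick produces an outlier that simply fails
# to reach the quorum and gets discarded.
from collections import Counter

def validate_results(results, quorum=3):
    """results: values returned by independent clients for one work unit.
    Returns the agreed value if at least `quorum` clients match, else None."""
    if not results:
        return None
    value, count = Counter(results).most_common(1)[0]
    return value if count >= quorum else None

# Three of four clients agree, so the result is accepted.
accepted = validate_results([42, 42, 41, 42])   # -> 42
# No value reaches the quorum, so the work unit must be re-issued.
rejected = validate_results([42, 41, 40])       # -> None
```

In a real project the server would then credit the agreeing clients and resend failed work units to fresh hosts.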
There have been a lot of criticisms of the SETI@home project over the years :-
1. Huge resources (both computing power and the time of volunteers and scientists) are being spent on a theoretical idea which is a big challenge to prove true.
2. The equipment that is being used to get the raw data may not be good enough.
3. The aliens/extraterrestrials may have other, more efficient ways of communicating than we know of (telepathy, for instance, or hive minds such as the Borg, which is part of Star Trek folklore).
4. Even if we do hear something, it would be decades if not centuries before we can do meaningful interstellar travel. There was a nice article I read some time back (it is a theory) which tells us where we possibly stand in galactic evolution, and when we could achieve or take part in interstellar travel, which is in the realm of science fiction today.
5. These resources could be put to far better use on more realistic real-life problems/distributed computing projects, such as those listed on the World Community Grid homepage. Look at the projects under the Research heading.
6. The findings of the project have religious and societal implications. As an example, look at the Creation-Evolution controversy.
7. There have been conspiracy theorists who claim that such a signature was detected in 1967 but then vanished. It was at the height of the Cold War, so nothing much is known.
8. There have also been theories that, while the computing is done under the garb of research for mankind, it may actually be research towards making weapons of mass destruction.
The last two are unsubstantiated reports, as they appear only on conspiracy sites.
On the doable side, if one is interested, it would help if much more love were given to the BOINC client in Debian; as can be seen, they are looking for somebody to help maintain the package.
Finally, if you are fascinated by distributed computing, you can find a whole list of such projects over here. That’s all for now.