This post is going to be somewhat of a roundup of interesting news and analysis about FOSS and technology in general.

First things first. After writing about my GNOME issues in the last article, for some inexplicable reason I decided to do some further research about the new 3 TB HDDs and was shocked to learn that the BIOS we all know doesn't support booting from anything beyond 2 TB. In fairness, the BIOS has served us quite well for the past 20-odd years; it is a testament to the technology of the time it was designed that it has lasted this long. Anyway, we now need to use UEFI. Strictly speaking, it's not the BIOS itself so much as the MBR partitioning scheme which has the 2 TB limit. Sadly the article does not talk about this, and one finds the issue only on the talk page, but that's how things are. There is also coreboot, but finding a mainstream motherboard which ships with coreboot by default would be like finding a needle in a haystack; it remains a niche thing rather than something you'd find on one of the popular boards.
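To see where the 2 TB figure comes from: the MBR partition table records the start and size of each partition as 32-bit sector counts, so with the traditional 512-byte sectors the addressable space tops out just above 2 TB. A quick back-of-the-envelope check (plain arithmetic, nothing board-specific):

```python
# MBR stores partition start and size as 32-bit LBA sector counts.
SECTOR_SIZE = 512          # bytes, the traditional sector size
MAX_SECTORS = 2 ** 32      # largest count a 32-bit field can hold

limit_bytes = MAX_SECTORS * SECTOR_SIZE
print(limit_bytes)                    # 2199023255552
print(limit_bytes / 10 ** 12, "TB")   # about 2.2 TB (exactly 2 TiB)
```

GPT, the partition scheme that UEFI uses, records 64-bit LBAs instead, which is why a 3 TB drive needs it.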

Anyway, I was intrigued to see whether any UEFI-based motherboards were available in India, and while a handful of them do have UEFI, the only one which seemed to be in supply is the Asus P8H67-M EVO REV 3.0, on the new H67 B3 chipset. It's a high-end board and seems to cost around Rs 8k just for the board itself.

Gigabyte offers much broader support, with some sort of hybrid EFI BIOSes (an interim and cheaper solution). If I were building a system today, I would definitely wait another 6 months and get either one of the pre-flashed hybrid EFI motherboards from Gigabyte or one of the cheaper UEFI motherboards expected from Asus.

Two things were interesting to learn:

a. UEFI has its own boot manager
b. GRUB2 has some support for UEFI

although how much support, and how good it is, I don't really know.
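As an aside, on Linux it is easy to check whether the running system was booted through UEFI firmware or a legacy BIOS: the kernel exposes the `/sys/firmware/efi` directory only on UEFI boots. A minimal sketch:

```python
import os

def booted_via_uefi() -> bool:
    # The kernel creates /sys/firmware/efi only when it was started by
    # UEFI firmware; on a legacy BIOS boot the path simply does not exist.
    return os.path.isdir("/sys/firmware/efi")

if __name__ == "__main__":
    print("UEFI boot" if booted_via_uefi() else "legacy BIOS boot")
```

This is also roughly how installers decide whether to set up GRUB2 in its BIOS or EFI flavor.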

I did see some screenshots of UEFI, and they tell me it would make things a lot easier to understand for the new generation picking things up.

One of the more interesting bits of news was Google finally entering fibre broadband rollouts in the US. This rumor had been circulating for a long time, at least 2 to 2.5 years that I know of. It is a reality now: they have a site, an FAQ and a forum for Q&A to answer interested communities. One can build all kinds of interesting hypotheses on it, from being able to collect all kinds of net-behavior data cheaply, to offering cheap services, to running all sorts of experiments on their own network, to buying up entrepreneurs; one can imagine anything one wants. What is interesting is that it should sharpen competition for Net access in the geographical areas where Google operates.

On the Debian front there is an interesting initiative called DEX. The idea is simply this: a lot of distributions base their work on Debian (almost 300-odd distros) and carry a lot of patches. They may be re-inventing the wheel, and there may be good patches which never make it into Debian; there is just no way to know. Ubuntu has been trying to engage with Debian on this for 3-4 years now (maybe more), hence there is now a DEX Ubuntu team. From a downstream distributor's perspective it makes sense: if an upstream distributor takes care of the patches, that much management bandwidth is freed, there is less of a maintenance burden for the downstream developer, and that energy can be used elsewhere. I do remember some discussion a few years ago when Red Hat was also invited but declined; whether it was arrogance or something else I cannot really say. I mention Red Hat because I happened to see Ubuntu's Matt Zimmerman's blog post on the same topic, with Red Hat's Jan Wildeboer asking them to contribute patches upstream (to the package repository). While the idea does have merit, not every package developer would be interested in integrating every patch, nor may they have the time to review any and all patches. As it is, distribution developers and maintainers have somewhat of a strained on-off relationship with upstream.

Also, for all Red Hat's work, they do have a history of obfuscating their patches for a time when doing bleeding-edge development, most probably due to competitive business pressures. By obfuscation I mean things like publishing the patches but not putting any sort of documentation within or beside the code explaining what it does. While I'm no programmer, this has been seen and talked about in various Ubuntu and Debian mailing lists, IRC channels and blogs over time.

Ubuntu hasn't been a saint either; see the whole saga before Launchpad became open source, and some of the pieces which still need open-sourcing.

While many think it will be a shoo-in, I think the real work has yet to begin. Only after two or three cycles of both distributions working together can there be any analysis of whether it made sense for Debian and Ubuntu to take this journey jointly. It will face pressure points similar to those between upstream and distributors, as well as distribution loyalties. That means 6 point releases and/or two LTS releases from Ubuntu's side, and 2 or 3 feature-based releases from Debian's.

As a new user of Debian, one question that has perplexed me is the Debian policy of also freezing unstable when the testing repository is frozen ahead of a stable release. I do not really know the reasons behind that policy, and while I did ask on the relevant mailing list, I did not get any response.

One common-sense perspective is that by freezing the unstable repository as well, they make sure that all developers and maintainers put their energy into making the new stable release happen. Of course, if there are new versions of packages, they can go into experimental or wait in a queue; either way, they do not get tested enough.

On the GNOME side of things, I came across an interesting article/blog post by Josselin Mouette. I wish I had been in Bengaluru to see the talk, but luck would not have it. Anyway, the most interesting aspect of it is the proposal to give distribution maintainers commit access to the old branches that GNOME module maintainers don't touch anymore.

Whether this would work out in the real world remains to be seen, but if it does, it would make for better and longer-lasting GNOME support, which is good. I, for instance, am not so sure about GNOME 3.0. One of the many reasons is that the new GNOME Shell is based on JavaScript, and from what little I have read, in browser-development blogs and elsewhere, JavaScript and security seem to have somewhat of a rocky relationship. That's why I do not know or understand the reasons for basing GNOME 3.0 on JavaScript, apart from the bling factor and the fact that there are many more JavaScript programmers readily available than, say, Python ones (insert your favorite programming language here).

That's about it for this time from me. Last but not least, India won the 2011 Cricket World Cup. Congratulations to the Indian team, and congratulations India.

