Webmail saga continues

I was pleased to see a reply from Daniel as a reaction to my post. I read and re-read his blog a couple of times yesterday and once more today, to question my own understanding and to see if there is any way I could make life easier and simpler for myself and the people I interact with, but I am finding it somewhat of an uphill task. I will not limit myself to e-mail alone, as I feel that until we get and share the big picture, the discussion will remain incomplete.

Allow me to share a few observations below –

1. The first one is probably cultural in nature (whether it is specific to India or worldwide, I have no contextual information). Very early in my professional and personal life I understood that e-mails are leaky by design. By leaky I mean liable to be leaked by individuals for profit or some similar motive.

Also, e-mails were and are used as misinformation tools by companies and individuals, quoting a subset or superset of them without providing the contextual information in which they were written. While this could be construed as a straw man, I do not know any other way. So the best way, at least for me, is to construct e-mails such that even if some information is leaked, I am OK with it being leaked or being in the public domain; it just hurts less. I could probably cite 10-15 high-profile public outings from the last 2-3 years alone, and these are millionaires and billionaires, people on whom many others rely for their livelihoods, who should have known better. Indian companies do have specific clauses in their communications saying that anything you exchange with them is subject to privacy, and if you share it with somebody you can be prosecuted; if, on the other hand, the company does the same, it gets a free pass.

2. Because of my own experiences I have been pretty circumspect about, even slightly paranoid of, anybody promising or selling the Kool-Aid of total privacy. Another example, of slightly more recent vintage, pains me even today: a Mozilla add-on for which I had filed an RFP (Request For Package), which someone from pkg-mozext-maintainers@lists.alioth.debian.org (probably to be moved to salsa in the near future) packaged, and whom I thanked for it. Two years later it came to the fore that, under the guise of protecting us from bad cookies or whatever the add-on was supposed to do, it was actually tracking us and selling this information to third parties.

This was found out by a security researcher casually auditing the code two years down the line (not Mozilla), and was then confirmed by other security researchers as well. It was a moment of anguish for me, as so many people's privacy had been invaded even though there were good intentions from my side.

It was also a bit sad, as I had assumed (perhaps incorrectly) that Debian does some automated security audit along with the hardening flags that it uses when a package is built. This isn't to show Debian in a bad light, but to understand and accept that Debian has its own shortcomings in many ways. I did hear recently that a lot of packages from Kali would make it into Debian proper; hopefully some of those packages could also serve as additional tools to look at packages when they are being built 🙂
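To give a flavour of what already exists on the hardening side, Debian ships a hardening-check script (these days in the devscripts package, earlier in hardening-includes) which reports whether a built binary carries the expected hardening features; a minimal sketch, with exim4 picked purely as an arbitrary example binary:

$ sudo aptitude install devscripts
$ hardening-check /usr/sbin/exim4
# reports whether the binary is a Position Independent Executable and has
# stack protection, fortified source functions, read-only relocations, etc.

Of course this checks how the binary was built rather than what the code does, so it is complementary to, not a substitute for, the kind of behavioural audit that caught the add-on above.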

I do know it's a lot to ask, as Debian is a volunteer effort. I am happy to test or contribute in whichever way I can if, in doing so, we can raise the bar that intended or unintended malicious applications have to clear. I am not a programmer, but I'm sure there must still be some way I could add strength to the effort.

3. The other part is that I don't deny Google is intrusive. Google is intrusive not just in e-mail but in every way: every page that uses Google Analytics or other Google infrastructure can be used to track where you are and what you are doing. They have embedded themselves in web pages to such an extent that it has become almost impossible to view most web pages (some exceptions remain) without allowing google.com to see what you are seeing. I use requestpolicy-continued to know which third-party domains are on a web page, and I see fonts.googleapis.com, google.com and some of the others almost all the time. The problem is that we also don't know how much information Google gathers. For example, even if I don't use the Google search engine, if I am researching any particular topic and 3-4 of the websites I visit use Google in any form or manner, it would be easy to infer what information, and what line of inquiry, I am pursuing. That is actually as much of a problem to me as e-mail, if not more, and I have no solution for it. Tor and torbrowser-launcher are and were supposed to be an answer to this problem, but most big CDNs (Content Delivery Networks) like Cloudflare are hostile to it, so privacy remains an elusive dream there as well.
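For the curious, you can get a crude first look at the third-party origins a page pulls in even without an add-on; a rough sketch with curl and grep (example.com is a placeholder, and this only catches URLs visible in the raw HTML, not ones added later by JavaScript):

$ curl -s https://example.com/ | grep -Eo 'https?://[^/" ]+' | sort -u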

4. It becomes all the more dangerous in the mobile space, where Google's Android is the dominant vendor. The carrier-handset locking which is prevalent in the West has also started making inroads in India, and in the manufacturer-carrier-operating-system complex such things will become more common. I have no firm information about other vendors, but from what I have seen I think the majority are probably doing the same; the iPhone is also supposed to have a lot of nastiness where surveillance is concerned.

5. My main worry with Protonmail, or any other vendor, is: should we just take them at face value, or is there some way for people around the world to be assured, and, in case things take a turn for the worse, to file a claim for damages if those terms and conditions are not met? I asked this question in my previous post and looked for an answer in your post, but didn't find an appropriate one. The only way I see out of this is decentralized networks and apps, but they too leave much to be desired. Two examples I can share of the latter. Diaspora started with the idea that I could have my profile on one pod and, if for some reason I didn't like the pod, I could take all the info to another pod, with all the messages, everything, in an instant. At least until a few months back, when I tried to migrate to another pod, I found that this feature doesn't work/is still a work in progress.

Similarly, zeronet.io is another service which claims to use decentralization, but in the last year or so I haven't managed to send a single mail to another user.

I used these two examples as both are FOSS and both have considerable communities and traction built around them. Security and/or anonymity still take a back seat there, though, as of yet.

I hope I was able to share where I’m coming from.


Webmail and a whole class of problems.

Yesterday I was reading Daniel Pocock's 'Do the little things matter' and while I agree with parts of his assessment, I feel it is incomplete unless taken from the perspective of a user with limited resources, knowledge, etc. I am a Gmail user, so I am trying to add a bit of that perspective here. I usually wait a day or more when I feel myself getting inflamed/heated, and his post seemed to me a bit arrogant in perspective, implying that Gmail users don't have any sense of privacy. While he is perfectly entitled to his opinion, I *think* just blaming Gmail is an easy way out; the problems are multi-faceted. Allow me to explain what I mean.

The problems he shares are not, I think, Gmail's alone, but common to all webmail providers: those providing services free of cost as well as those charging a fee. Regardless of which you pick, the same questions arise. Almost all webmail providers give you a mailbox, an e-mail ID and a web interface to interact with the mails you get.

The first problem, which Daniel essentially tries to convey, is a deficit of trust, and I *think* that applies to all webmail providers. Until and unless you can audit and inspect the code, you are just 'trusting' somebody else to provide you a service. What pained me while reading his blog post is that he could have gone much further, but chose not to: what happens when webmail providers break your trust was not explored at all.

Most of the webmail providers I know are outside my geographical jurisdiction. While in one way it is good that the government of the day cannot directly order them to check my mails, it also means that I have no means to file a suit or prosecute the company in case breaches do occur. I am talking here as an everyday user, a student, and not a corporation which can negotiate, make iron-clad agreements and have some sort of liability claim for many an unforeseen circumstance. So no matter how you skin it, most users, or to put it more bluntly, almost all non-corporate users, are at a disadvantage when negotiating the terms of a contract with their mail provider.

So whether the user uses one webmail provider or another, it's the same thing. Even outfits like riseup, who updated/shared their warrant canary, show that even they are vulnerable. It is also probably easier for webmail services to have backdoors, as they can be pressured by one government or another.

So the only way to really solve it is to run your own mail server, which, to tell the truth, is no solution, as it's a full-time job: you are responsible for everything. For each new vulnerability you come to know of, you are supposed to patch it, get it patched, or find some sort of workaround. In the last 4-5 years alone this has become more complex, as more and more security features have been added as each new vulnerability, or class of vulnerabilities, has revealed itself. Add to that that a mail server should have at the very least something like RAID 1 to lessen the chance of data loss. While I have friends who have the space and money to invest in and maintain a mail server, most people won't have the time, energy and space to do the needful, and I don't see that changing in the near future.
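To give a flavour of just one of those chores, here is a minimal sketch of creating a RAID 1 mirror with mdadm; the device names are hypothetical, and a real setup needs partitioning, monitoring and tested restores on top of this:

$ sudo aptitude install mdadm
$ sudo mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda1 /dev/sdb1
$ sudo mkfs.ext4 /dev/md0   # put a file system on top of the mirror
$ cat /proc/mdstat          # watch the initial sync of the two halves

And RAID 1 only protects against a dying disk; it does nothing against accidental deletion or a compromised server, which is why backups remain a separate chore on top.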

Add to that, over the years when I worked for companies, most of the time I found I needed more than one e-mail client, as mails in a professional setting need to be responded to quickly, and most GUI-based mail clients have subtle bugs which you only discover while using them.

A couple of years back I was working with Hamaralinux. They have their own mail server. Without going into technical details of the features both parties needed and wanted: I started out using Thunderbird, sticking to stable releases. Even then, I used to see subtle bugs which sometimes corrupted the mail database or misbehaved in one way or another. I had to resort to using Evolution, which provided comparable features, and even there I found bugs, so most of the time I found myself hopping between the two mail clients.

Now if you look at the history of the two clients you would assume that most of the bugs would not be there, but surprisingly they were. At least for Thunderbird, I remember Gecko used to create a lot of problems, besides other things. I did report the bugs I encountered, and while some of them were worked upon, the solutions often took days, sometimes even weeks, to arrive. The case with Evolution was somewhat similar. At times I also witnessed broken formatting and things like that, but that is outside the purview of this topic.

Crudely, AFAIK, these are the basic functions an e-mail client absolutely needs to perform (a rough sketch of this conversation follows the list) –

a. Authenticate the user to the mail server.
b. If the user is genuine, go ahead to the next step; otherwise reject the user at this stage itself.
c. Let the genuine user through to their mailbox.
d. Once in the mailbox (mbox), it probably looks at the time-stamp of the last delivered mail and checks whether any new mail has come by diffing the times (either using GMT or epoch+GMT).
e. If any new mail has come, it starts transferring those mails to your box.
f. If there are any new mails to be sent, it transfers them at this point.
g. If automatic acknowledgment of received mails is enabled and available, it does that as well.
h. Ideally, you should be able to view and compose replies offline at will.
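In IMAP terms (where the "what is new" question of step d is actually answered with flags and UIDs rather than raw timestamps), steps a-e look roughly like the transcript below; imap.example.com, the credentials and the message number 42 are placeholders, the annotations after <- are not part of the protocol, and step f happens over SMTP on a separate connection:

$ openssl s_client -quiet -crlf -connect imap.example.com:993
a1 LOGIN alice@example.com secret    <- steps a-c: authenticate, reach the mailbox
a2 SELECT INBOX                      <- open the mailbox
a3 SEARCH UNSEEN                     <- step d: ask which mails are new
a4 FETCH 42 (BODY[])                 <- step e: pull one new mail down
a5 LOGOUT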

In reality, at times I used to see transfers left incomplete, meaning the mail server still had mails but for some reason the connection got broken (maybe due to some path in between, or something else entirely).

At times even notifications of new mails would not come.

Sometimes offline Thunderbird used to lock the mails or the mbox at my end, and I had to either use Evolution or some third-party tool to read the mails, and rely on webmail to send my reply.

Notice that in all this I haven't mentioned SSH or any sort of encryption or anything like that.

It took me a long time to figure out https://wiki.mozilla.org/MailNews:Logging but, as you can see, it drags you away from the work you wanted to do in the first place.
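For the record, the gist of that wiki page is a pair of NSPR environment variables; roughly, for debugging IMAP traffic (the module name and verbosity level are per that page, and newer Thunderbird versions use MOZ_LOG instead):

$ export NSPR_LOG_MODULES=IMAP:5   # log the IMAP module at maximum verbosity
$ export NSPR_LOG_FILE=/tmp/imap.log
$ thunderbird                      # started from the same shell, so it inherits both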

I am sure some people would suggest either Emacs or alpine or some other tool which works, and I'm sure it worked right off the bat for them; for me, I wanted something which had a GUI and which I didn't have to think too much about. This also points to the reason why Thunderbird was eventually moved out of Mozilla: so that the community could do feature work and bug-fixing faster than Mozilla did, or had the resources or the will to do.

From a user perspective I find webmail more compelling, even with the leakages Daniel described, because even though it's 'free' it also has built-in redundancy. AFAIK they keep enough redundant copies of the mail database that even if the node holding my mails dies, they will simply be resurrected from the other copies and given to me in a timely fashion.

While I do hope that in the long run we get better tools, in the short-to-medium term, at least from my perspective, it's more about which compromises you are able to live with.

While I'm too small and too common a citizen for the government to take notice of me, I think it's too easy to blame 'X' or 'Y'. I believe the real revolution will only begin when there are universal data-protection laws for all citizens, irrespective of country, and companies and governments are made answerable and liable for any sort of interactive digital service provided. Unless we raise people's consciousness about security in general, and have some sort of multi-stakeholder meetings and understanding in real life, including people from security, e-mail providers, general users, free software hackers, regulators and, if possible, even people from the legislature, I believe we will just be running around in circles.

RequestPolicy Continued

Dear Friends,

First up, I saw a news item about a fake Indian e-visa portal. As it was Sunday, I decided to see if there indeed is such a mess. I dug out the torbrowser-bundle (tbb), checked the IP it was giving me (some Canadian IP starting with 216.xxx.xx.xx), typed in 'Indian visa application' and used duckduckgo.com to see which result cropped up first.

I deliberately used tbb as I wanted to ensure the query wasn't coming from an Indian IP, where the chances of being served a fake Indian e-visa portal should be negligible; scamsters would surely be knowledgeable enough to differentiate between IPs coming from India and IPs coming from other countries.
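If you want to double-check from a shell which exit IP Tor is giving you, the Tor Project exposes a small endpoint for exactly that; a sketch, assuming the Tor Browser Bundle's SOCKS proxy on its default port 9150 (a standalone tor daemon usually listens on 9050 instead):

$ curl --socks5-hostname 127.0.0.1:9150 https://check.torproject.org/api/ip
{"IsTor":true,"IP":"216.xxx.xx.xx"}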

The first result duckduckgo.com gave was https://indianvisaonline.gov.in/visa/index.html

I then proceeded to install whois on my new system (more on that in another blog post) –

$ sudo aptitude install whois

and proceeded to see whether it's the genuine thing or not, and this is the information I got –

$ whois indianvisaonline.gov.in
[standard .IN registry terms-of-use notice trimmed]

Domain ID:D4126837-AFIN
Created On:01-Apr-2010 12:10:51 UTC
Last Updated On:18-Apr-2017 22:32:00 UTC
Expiration Date:01-Apr-2018 12:10:51 UTC
Sponsoring Registrar:National Informatics Centre (R12-AFIN)
Registrant ID:dXN4emZQYOGwXU6C
Registrant Name:Director Immigration and Citizenship
Registrant Organization:Ministry of Home Affairs
Registrant Street1:NDCC-II building
Registrant Street2:Jaisingh Road
Registrant Street3:
Registrant City:New Delhi
Registrant State/Province:Delhi
Registrant Postal Code:110001
Registrant Country:IN
Registrant Phone:+91.23438035
Registrant Phone Ext.:
Registrant FAX:+91.23438035
Registrant FAX Ext.:
Registrant Email:dsmmp-mha@nic.in
Admin ID:dXN4emZQYOvxoltA
Admin Name:Director Immigration and Citizenship
Admin Organization:Ministry of Home Affairs
Admin Street1:NDCC-II building
Admin Street2:Jaisingh Road
Admin Street3:
Admin City:New Delhi
Admin State/Province:Delhi
Admin Postal Code:110001
Admin Country:IN
Admin Phone:+91.23438035
Admin Phone Ext.:
Admin FAX:+91.23438035
Admin FAX Ext.:
Admin Email:dsmmp-mha@nic.in
Tech ID:jiqNEMLSJPA8a6wT
Tech Name:Rakesh Kumar
Tech Organization:National Informatics Centre
Tech Street1:National Data Centre
Tech Street2:Shashtri Park
Tech Street3:
Tech City:New Delhi
Tech State/Province:Delhi
Tech Postal Code:110053
Tech Country:IN
Tech Phone:+91.24305154
Tech Phone Ext.:
Tech FAX:
Tech FAX Ext.:
Tech Email:nsrawat@nic.in
Name Server:NS1.NIC.IN
Name Server:NS2.NIC.IN
Name Server:NS7.NIC.IN
Name Server:NS10.NIC.IN

It seems to be a legitimate site, as almost all the information checks out. I know for a fact that all, or 99% of, Indian government websites are done by NIC, the National Informatics Centre. The only thing which rankled me was that the domain is unsigned as far as DNSSEC goes, but then I haven't seen NIC being as pro-active about web security as they should be, considering they handle many sensitive internal and external government websites.
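You can see the unsigned status yourself with dig (from the dnsutils package); a quick sketch, the idea being that no RRSIG records in the answer and no DS record at the parent means the zone is unsigned:

$ dig +dnssec +noall +answer indianvisaonline.gov.in A   # answer carries no RRSIG records
$ dig +short indianvisaonline.gov.in DS                  # empty output: no DS at the parent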

I did send them an e-mail imploring them to use this security feature.

To be doubly sure, one could also use an add-on like ShowIP, add it to your Firefox profile, and use any of the web services to obtain the IP address of the website.

For instance, the same website which we are investigating gives –

Doing a whois of that address tells us that NICNET has got/purchased a whole range of addresses – 65,025 of them – which it uses.
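The command-line route to the same two facts, with the actual address left as a placeholder since it can change with hosting:

$ host indianvisaonline.gov.in    # the A record gives you the website's IP
$ whois <ip-address> | grep -iE 'inetnum|netname|descr'   # shows the owning netblock (NICNET) and its size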

One can look at NIC's Wikipedia page to understand the scope it works under.

So from both accounts it is safe to assume that the website and page are legit.

Well, that's about it for the site. While this is, and should be, trivial for most Debian users, it might or might not be for all web users, but it is one way in which you can find out whether a site is legitimate.

A few weeks back, I read Colin's blog post about Kitten Block, which was also put on p.d.o. (Planet Debian).

So let me share RequestPolicy Continued –

Requestpolicy Continued Mozilla Add-on

This is a continuation of RequestPolicy, which was abandoned upstream by the original developer and still resides in the Debian repo.


I did file a ticket, 870607, covering both the name change and where the new watch file should point.

What it does is similar to what Adblock/Kitten Block does, and more. It basically blocks any third-party domain from loading until you give it permission. In that respect it is very similar to another add-on, uBlock Origin.

I like RPC, as it's known, because it hardly has any learning curve.

You install the add-on, see which third-party domains you need, and just allow them. For instance, fonts.googleapis.com and ajax.googleapis.com are used by many sites nowadays, while pictures or pictographic content are usually looked after by either Cloudflare or CloudFront.

One of the big third parties that you will encounter, of course, is google.com along with gstatic.com. A lot of people use gstatic and its brethren for spam protection (reCAPTCHA and the like), but that comes at the cost of user identifiability and also the controversial crowdsourced image recognition.

It is a good add-on which does remind you of competing offerings elsewhere, but it is also a stark reminder of how deeply, and at what levels within sites, Google has penetrated.

I use tor-browser and RPC so my browsing is distraction-free, as loads of sites have nowadays moved to huge bandwidth-consuming animated ads and the like. While I'm on a slow but non-metered (eat/surf all you want) kind of service, for those on metered plans (x bytes for y price, including upload and download) the above is also a godsend.

On the upstream side, they do need help both with development and with testing the build. While I'm not sure, I think the maintainer didn't reply to or act on my bug because he knew that WebExtensions were around the corner. Upstream has said he hopes to have a new build compatible with WebExtensions by the end of February 2018.

On the Debian side of things, I have filed 870607, but I know it will probably only be acted upon once the port to WebExtensions has been completed and some testing done, so it might take time.

Analyzing security vulnerabilities in Debian.

This will be a slightly longish post about security vulnerabilities: how to find out about them in Debian, and what possible steps you could take to minimize the fallout once you know of them.