Security by obscurity – and why filtering the internet is a waste of time
Thursday December 31st 2009, 4:17 am

Stuart Anderson’s comments on Kate Lundy’s blog item, Further thoughts on the filter, merit preservation in their entirety:

The biggest problem with listing banned internet content, and then disclosing which exact content you have banned, is that circumvention of the filter is trivial. You then effectively have a public index of banned content, which people can go straight to. If you are in the business of being a censor, this is clearly an undesirable outcome.

If you don’t disclose your blocklist, then at least people who are circumventing the filter don’t have an index to the content that you consider so objectionable (they just have the entire internet. Whoops).

The problem, of course, is that security through obscurity is a terrible strategy. The list has already been leaked, and it will be leaked again without question. The Government might have a remote chance of ensuring their own internal security, but good luck with all the censorware vendors, all the ISPs, ACMA, the classification board, and the rest of the hands this must by necessity pass through.

Regardless of the official position the Government takes, the simple fact is that for all intents and purposes the list is already public. It’s still on Wikileaks (and “wikileaks australia blacklist” is the second autocomplete in Google’s search box, with over 1 million results. What does that tell you?).

As a (scant) face-saving measure (and as demonstrated by Conroy during the earlier leak), if you don’t disclose the list you at least have the possibility of denial. “No, no, no, this isn’t the real list – we don’t arbitrarily ban dentists and tuckshops”. We all know it was the bona fide list, and that it was full of bad data. However, you’ll sooner get Conroy to erect an altar to the Liberal party in his office than admit the list is genuine. Politicians seem to prefer being worldwide laughing stocks to taking ownership of their own problems – I have no idea why that is.

Some other issues with disclosure:

* You must be accountable – the Government was (quite rightly) pilloried for obvious mistakes and deliberate acts of political censorship on the list. If people see a list, they’ll want to know why you’ve banned a particular thing, and they’ll argue with you about it. The only thing the Government gets from disclosing the list is more grief.

* People can reverse engineer the ACMA complaints process (and the filtering mechanism itself). By submitting sites, and then seeing what makes the list and what doesn’t, you can learn exactly what sets them off and what kind of things they ignore. By having a known dataset that the filter blocks you could easily DDoS it (and I’m sure there are a whole bunch of other nasty things that a list would make easier to do).

* The hypothetical scenario where the filter is 100% effective makes releasing the list pointless. A list that cannot be checked is as good as a list you cannot access. Acting like this is a valid scenario is pointless – in the real world, a 100% effective filter is no more possible than a 0% effective one.
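The reverse-engineering point above is worth spelling out, because the attack is almost embarrassingly simple. A minimal sketch, with a simulated filter standing in for the real thing (the `probe_filter` function, the URLs, and the `SIMULATED_BLOCKLIST` are all made up for illustration – in practice the probe would be a real HTTP request compared against the ISP’s block page):

```python
# Illustrative sketch only: partition candidate URLs into blocked vs allowed
# by probing a filter. The filter is simulated by a callable here; a real
# probe would fetch each URL and check for the ISP's block-page response.

def probe_filter(candidates, is_blocked):
    """Classify candidate URLs by whether the filter blocks them."""
    blocked, allowed = [], []
    for url in candidates:
        (blocked if is_blocked(url) else allowed).append(url)
    return blocked, allowed

# Hypothetical blocklist standing in for the secret ACMA list.
SIMULATED_BLOCKLIST = {"http://example.org/banned-page"}

blocked, allowed = probe_filter(
    ["http://example.org/banned-page", "http://example.org/harmless-page"],
    lambda url: url in SIMULATED_BLOCKLIST,
)
```

Run enough candidates through a loop like this and the “secret” list reconstructs itself, one probe at a time – which is exactly why keeping the list confidential buys so little.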

So in short, they’re damned if they do, and damned if they don’t.

We don’t need an accountability mechanism for the filter, because we don’t need the filter. Bolting a bureaucratic system of disclosure and appeals to it is utterly pointless when the fundamental idea of internet censorship is flawed. Lending the idea of censorship credence by seriously discussing accountability in the system is counter productive – there is simply no way this system can work, or be fair and accountable. It’s just not possible, not technically and not even in principle.
