"The Politics of Code - Shaping the Future of the Next
Internet" - Conference Report and Notes
This was a conference run by the Oxford Internet Institute and the
Programme in Comparative Media Law and Policy, and held at the Oxford
Union (which hosted the Free Music Debate with Hilary Rosen a few
months back).
There is now a website, http://www.codepolitics.info/, which has MP3s
of all the talks and most of the presentation slides. Well worth
checking out.
Other articles on the conference include:
- The Register - quite Esther Dyson specific
- BBC - quite Lawrence Lessig specific
Yes, it's me again. The guy who wrote one of the many reports on the
Union Debate on the Future of Music, which was strangely one of the
most widely circulated, if not one of the most widely dissed....
Doubtless you'll be glad to hear that I've no intention of changing
from my eclectic writing style, but hopefully if you can cope with it
you'll find some things of interest below.
Anyway, this Thursday I was back at the Oxford Union, this time for a
conference entitled "The Politics of Code - Shaping the Future of the
Next Internet". The keynote speakers were everyone's favourite poster
boy on copyright and law, Lawrence 'Larry' Lessig, and Esther Dyson,
former ICANN Chairman. Also joining them were a number of heavyweights
from across most of the spectrum, though disappointingly this time we
lacked anyone from the "far end" of the pro-control camp.
The conference kicked off with a 20-minute slideshow and talk by
Lessig (for which, it being quite early in the morning, I wasn't
taking very detailed notes, doh!). The slideshow was quite similar to
his Free Culture talk, but with a greater emphasis on the generalities
of control and copyright, and less on the specifics, as you'd expect
given the audience. However, we were assured by the conference
organisers that they'd be putting all the slides on their website
shortly, so hopefully you'll be able to view it yourself (it was
good).
One interesting addition to his talk, compared to what seems to be
his norm, was a brief discussion of the landmark 1774 House of Lords
ruling (Donaldson v. Beckett), which for the first time freed the
publishing of Shakespeare's works. He also gave some insightful
comments on the differences between the US and the UK/EU stances on
copyright issues (didn't write them down, sorry...).
Part way through his talk, he pulled up the infamous Adobe E-Book reader, and
showed us the access
controls it placed on three works - one now in the public domain, one
which was never copyrighted, and his own book. It was a very insightful
demonstration of just how easily code now removes our fair use abilities, and
how much more could happen in future.
Disappointingly, he shied away from the debate on whether we really
need copyright at all. Maybe it was in deference to his audience, but
it did lead to a comical discussion at lunch with an academic who was
writing a paper on copyright, but had never heard of Richard Stallman,
the GNU project, or the GPL!
Lessig was keen to point out that he wasn't anti-market, but quite
the opposite. He tried to show (though I'm not sure how many people
who didn't already get it were won over by his speech) how the current
laws and lobbying are for control and concentration, not for
competition and free markets.
An interesting statistic he rolled out (I guess you need to look in
his Eldred vs Ashcroft filings to find the source) was on the value of
a copyright term to the author over time. He said that "Life+50",
compared to an indefinite term, would give you 99.3% of the full
value, while "Life+70" would give you 99.7%. Who, apart from a
well-rehearsed Bob Dylan at a congressional hearing, would argue that
an artist creating a work will be at all bothered by that extra 0.4%,
to be claimed long after their death?
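Those numbers make sense once you treat royalties as a discounted
cash flow. Here's a minimal sketch in Python - the 7% discount rate,
the constant royalty stream, and the 75/95-year effective terms are my
own illustrative assumptions, not the figures from the filings, but
they land in the same ballpark:

    # Fraction of a perpetual (indefinite) royalty stream captured by a
    # finite copyright term, discounted at rate r. Assumptions (mine,
    # not the filings'): constant royalties, and an author living ~25
    # years after creation, so "Life+50" is roughly a 75-year term.
    def value_fraction(years, rate=0.07):
        return 1 - (1 + rate) ** -years

    for label, years in [("Life+50 (~75 years)", 75),
                         ("Life+70 (~95 years)", 95)]:
        print(f"{label}: {value_fraction(years):.1%} of the full value")

    # Prints roughly 99.4% and 99.8% - the same shape as the quoted
    # 99.3% and 99.7%.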
After Lessig's speech, we had two other shorter talks before a panel debate.
Christian Larrinaga
reiterated a lot of what Lessig said, but with a bit more of a technical
slant. He kept things nice and short, and I'd say he's definitely a speaker
to keep an eye out for.
Finally, it was the turn of Peter
Davies to speak. Coming from a business background, he had a different
slant on some things, which was a refreshing change. Most importantly, he
seemed far happier to trample on individual's rights and accesses in order
to stop large scale commercial piracy than the other speakers. He was
against the recent US copyright extensions to "life+70", but did seem to hint
he didn't think "life+50" was all that bad. One of his great lines was
People will find there'll be some anomalies coming from this (in relation
to content control). True, but who says we have to like or accept them?
During the Q&A session, Lessig talked the most, few of the questions
were directly answered, and fewer of the audience could hear the
questions! Still, it was interesting to hear the areas of agreement,
though finding someone a bit more content-company friendly than Peter
would've spiced things up a bit.
IPv6
Owing to my networking interests, I opted to attend the IPv6 panel
rather than the DRM one. This had two parts: first an intro to IPv6,
followed by a talk on how IPv6 can be a double-edged sword for
privacy.
The intro talk was given by Axel Clauberg of Cisco, and didn't
contain much, if anything, that I didn't already know - and presumably
most people reading this will already know it too. However, I think
the main points relevant to the issues under discussion were:
- IPv4 has 32-bit addressing to v6's 128-bit. While v4 has a theoretical
4 billion addresses, really only around 250 million are usable (see
RFC 3194) - there's some quick arithmetic on this after the list
- This means IPv4 is normally deployed with NATing, firewalls,
filtering routers etc. All of these break the end-to-end (e2e) principle
of the original internet, and add control, filtering, a need to
"register new applications" and test them with these devices, etc.
- IPv6 gives you back end-to-end by removing the need for NATing, taking
you back towards what made the original internet so good for innovation
- New, non-computing devices are on the increase and need to be connected,
which needs both more addresses and new methods of auto-configuring them
- Privacy - you can have temporary IPs as there are enough to go round,
and there are specs describing how to get temporary IPs for connections.
However, will this be allowed, and just how much privacy does it offer?
- Security - IPSec is built in, and more addresses mean it's harder to
portscan (again, see the arithmetic after the list), but that was about
it. I felt this section was a bit thin, especially in light of new,
high-speed port scanners
- People interested were pointed at euro6ix.org and 6net.org
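To give a feel for the numbers behind the addressing and port-scanning
points, here's a quick back-of-the-envelope sketch in Python. The
million-probes-a-second scanner speed is my own assumption - and, as
noted above, the newer high-speed scanners get their results by being
smarter than a blind sweep, which is why the security story still felt
a bit thin:

    # Address-space arithmetic behind the bullets above.
    v4_total = 2 ** 32        # theoretical IPv4 space, ~4.3 billion
    v6_total = 2 ** 128       # theoretical IPv6 space, ~3.4e38
    print(f"IPv4: {v4_total:,} addresses")
    print(f"IPv6: {v6_total:.2e} addresses")

    # A single IPv6 /64 subnet leaves 2^64 host addresses to sweep, so
    # a blind scan at an assumed million probes per second takes a while:
    seconds = 2 ** 64 / 1_000_000
    print(f"Sweeping one /64: ~{seconds / (365 * 24 * 3600):,.0f} years")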
IPv6 And Privacy
After this talk (which was starting to make me wish I'd gone to the DRM
one), Alberto Escudero Pascual gave a talk that I thought was really
good. I'd definitely recommend that people interested in IPv6 and/or
privacy go read some of his papers.
His talk had no real message beyond "Think about what these things
can do, and which way they can go". Below are many of the areas he
touched on:
- Privacy - a right or a feature?
Technical => e2e, PKI etc
Legal => legislation and regulation
- We need to bridge the techno-legal divide, and this is going to be hard
- Policies get written with old technologies in mind, and don't map well
to new ones (eg RIP)
- The Data Protection vs Snooping problem, which is only getting worse
- GUIDs - Globally Unique Identifiers
IPv6 with Mobile IP offers these, while IPv6 with temporary addresses
allows us to avoid them (there's a sketch of the difference after this
list). However, will either of these be mandated or banned by
legislation?
- IPv6 can be good or bad for privacy - policy controls which of its
features will be used
- How will an IPv6 address compare in terms of identifiability to the
likes of credit card numbers, passport numbers, IMEI numbers, phone numbers
etc
- We already have less privacy in the digital world than in the real world,
but will this get worse or better?
You can send a postcard without signing it IRL, but email requires sender
information - Sender Anonymity
You can buy things from a shop with cash, but online you need a credit card -
Buyer Anonymity
- The internet lacks the infrastructure for legitimate anonymity - eg
some countries (such as Sweden) have FOI (freedom of information) acts
where no record of who requested what is made, yet the government
websites log who accessed them
- What is "content" and what is "traffic data"? Old terms don't map well
onto the new medium - rules need defining generally (based on the amount
of personal data accessed) rather than by trying to apply old definitions
to specific fields
- Need to consider location privacy as well as data privacy
- Policies need more insight and review, rather than being written in
private and delivered in a "finished" form
- Telephone number =~ FQDN (fully qualified domain name, eg www.foo.com)
SS7 signalling number =~ IP address
Law enforcement and policies seem to treat IPs like telephone numbers,
but they're not (see the resolver sketch below)
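To make the GUID point concrete: a stable IPv6 interface identifier is
built from the network card's MAC address (the EUI-64 scheme from RFC
4291), so it follows the machine from network to network, while
temporary addresses in the spirit of RFC 3041 randomise it. A minimal
Python sketch - the MAC address is made up, and real temporary
addresses are derived from a hashed history value rather than plain
randomness:

    import os

    def eui64_from_mac(mac):
        # Stable interface ID: split the MAC in half, insert 0xFFFE,
        # and flip the universal/local bit (RFC 4291, Appendix A).
        b = bytearray(bytes.fromhex(mac.replace(":", "")))
        b[0] ^= 0x02                   # flip the universal/local bit
        return bytes(b[:3]) + b"\xff\xfe" + bytes(b[3:])

    mac = "00:11:22:33:44:55"          # hypothetical MAC address
    print("stable ID:   ", eui64_from_mac(mac).hex(":"))  # same everywhere
    print("temporary ID:", os.urandom(8).hex(":"))        # changes over time

The stable ID is the trackable GUID; the temporary one is how you avoid
it - and which of the two you get to use may end up being a policy
decision rather than a technical one.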
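And on the telephone-number analogy: the FQDN is the stable,
user-facing identifier, while the IPs behind it can change from query
to query and over time. A quick resolver sketch (www.example.com
stands in for the www.foo.com placeholder above):

    import socket

    # One name can map to several addresses, and the mapping can change
    # over time - an IP identifies a routing endpoint, not a subscriber.
    for *_, sockaddr in socket.getaddrinfo("www.example.com", 80,
                                           proto=socket.IPPROTO_TCP):
        print(sockaddr[0])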
Internet Governance
First up was Esther Dyson.
The Register
did a big thing on her talk on ICANN, so I'll concentrate on what lessons
she thinks we can learn from ICANN about internet governance.
- It's hard to decide on what counts as a consensus - who should vote,
and how much should different people's / groups' votes count?
- Beware of centralised power
- Beware of how charters are drawn up - keep it open
- Keep governance lightweight. Make hard decisions by consensus, but at
an appropriate local level. Stay independent
- ICANN acts as judge, jury and executioner on issues and rules - ensure
future organisations can't
- Jury systems for deciding issues sound great but are hard to make
work - how to register interest, how to pick people, how to decide on
issues, how to advise the jury, etc.
Then we had Richard Hill of the ITU, telling us what he thought ICANN
and similar internet governing bodies could learn from the ITU:
- ICANN should:
- Only specify what they have to, otherwise leave things up to the regions
to pick what's best for them
- Publish standards and information (be secretarial)
- Work on consensus
- Be similar to IETF or ITU, pick open or closed membership as seems
appropriate
- Be non-controversial - no-one has heard of the ITU because they don't
do anything controversial, and everyone who does know of them doesn't
disagree with them
- Where possible, make things voluntary but recommended (the IETF also
does this, for reasons best explained in their slides)
- Have a bottom-up, consensus-based decision process; that way most
people will trust you. Be transparent to your membership
(Hans Kraijenbrink and Diane Cabell also spoke)
Liberty By Design
Alan Davidson of the Centre for Democracy and Technology spoke about
Liberty by Design, the idea of getting the public interest supported
in code from the word go. He had a number of examples of where the CDT
is trying to do this, what problems they've found, and where they
think the future for this sort of thing may lie. Here goes:
- RFIDs - can be very good for supply chain management, transport,
automating shopping checkouts etc, but can also be bad. The main
problems revolve around who decides where they go, and who can read
them once they're there. CDT worked with the RFID people, and got a
"suicide" feature built in - you can turn them off for good. However,
problems still remain - who will be able to turn them off, will shops
do it by default, how hard will it be to do, etc.
- Give people choice - with RFIDs, people may opt not to turn them off,
or they may be forced to turn them off, but the code gives these policy
options. Without the suicide feature, there wouldn't be these options
(there's a toy sketch of this after the list)
- "If code is law, then how can we include the public interest in the
development of technology and standards?"
- "Liberty by Design" - include important values early in the process.
Once they're there, they can have an important global impact
- Reception - standards groups are normally open to suggestions, occasionally
hostile, but always anxious
- OPES - has good feature, can be
used for bad things (eg censorship, tampering with content etc). Breaks the
end-to-end model. CDT have made people aware of these issues, now the OPES
team are working to avoid the pitfalls that have been pointed out
- Need to fix standards early on - eg DVD has many problems, but it's too
late now to change / fix them, big inertia to change
- Standards can be boring, but are important. Problems can come with
getting involved with closed standards groups and private companies /
consortia
- People aren't aware, and usually can't understand the problems. Sheer
size of the problem - many many standards groups, very few people involved
- Need more people involved in the standards process, but how to make
this scale? How to make sure it works?
- Would government intervention help? Would they really represent the
public?
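As a footnote to the RFID point, here's a toy model of how a "suicide"
feature turns a policy question into a code question. This is entirely
hypothetical Python - no real RFID API looks like this - but it shows
the options the feature creates:

    # A toy RFID tag with a permanent opt-out. Who may call kill(), and
    # whether it's called by default, are the policy questions above.
    class RfidTag:
        def __init__(self, tag_id):
            self.tag_id = tag_id
            self.killed = False

        def read(self):
            # Readers only get the ID while the tag is alive.
            return None if self.killed else self.tag_id

        def kill(self):
            # Permanent, irreversible opt-out - the "suicide" command.
            self.killed = True

    tag = RfidTag("pallet-42")
    print(tag.read())   # "pallet-42" - readable in the supply chain
    tag.kill()          # eg at the checkout, if the shop opts in
    print(tag.read())   # None - the consumer's privacy option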
More on Internet Governance
This panel debate had a number of speakers, not all of whom said new
things. The focus of this section was more on standards and
regulation. First up, Raymund Welre spoke on government involvement:
- Initially, internet regulation was a mix of government regulation
and private regulation. The government's involvement didn't affect the
architecture or the technology
- Now the internet is commercialised. Where is government regulation
needed, and where will it be harmful?
- Infrastructure - consensus-based, from standards bodies (both open and
closed), and may have some government involvement
- Content and Conduct - Mix of government regulation and netiquette.
- Government should regulate content and conduct, not code and architecture
Then it was the turn of Harald Alvestrand, chairman of the IETF, to
talk about standards and regulation. (The ->'s are supposed to
indicate a cycle.)
- "Openness is more important than regulation"
- Standards liberate value to users
- Open standards are more valuable than proprietary ones
- Voluntary more important than mandated
- Proprietary Standards: Benefit -> Usage -> Cost goes up ->
- Theory of Standards: Standard -> Products -> Benefits ->
- Mandated Standards: Low-quality products -> Low benefits ->
Low usage -> ... no use => high cost, no incentive, non-mandated wins
- Voluntary Standards: Benefit -> Usage -> Quality -> ... starts with
the benefits and moves on from there; cost goes down, not up
- Open standards: Standards -> Usage -> Participation ->