Public goods from public agencies

Lawrence Lessig says this:

the strong bias of public policy should be to spread public goods at their marginal cost. Compromises are no doubt necessary if private actors are to contribute voluntarily to the production of public goods; but public entities, such as governments, should not indulge in these compromises unless they are necessary.

I agree, and hope to put up a more substantial post on it in future. In the meantime, I’ve been provoked into this post by the contrast between the BBC and some other public agencies, particularly in Australia. (Actually, having visited the BBC website, it’s not clear how much better they’re doing than the ABC, but I take something they’re doing as a jumping-off point.)

The ABC is experimenting with ‘podcasting’, and it’s proving a very successful experiment, substantially increasing the reach of Radio National. That’s a mercy, since I’m told the average age of its listeners rises by some large fraction of a year every year – a trend which, as they say, is ‘unsustainable’ – and sad.

Anyway, the BBC seems to have been fairly adventurous about it, posting podcasts of all of Beethoven’s symphonies. Amazing it took so long. The ABC, however, appears concerned about the additional bandwidth cost its podcasting trial is lumping it with, and as a result is rolling it out slowly. Many months after the launch of podcasts, they remain unavailable for many, perhaps most, Radio National programs. The internet is obviously the place to start an archive of broadcasting right now. With Google able to supply 2.5 gigabytes of storage to each e-mail user for free, it seems incredible that the ABC cannot afford the gigabyte or so a week it would need to preserve most Radio National audio and make it available to whoever wanted to download it.

Meanwhile, judging by their copyright claims, government agencies are making their private goods somewhat more public than they were a few years ago. But the Budget papers are not yet regarded as public goods:

This work is copyright. You may download, display, print and reproduce this material in unaltered form only (retaining this notice) for your personal, non-commercial use or use within your organisation. Apart from any use as permitted under the Copyright Act 1968, all other rights are reserved.

Come on guys, the Budget papers want to go free. That’s what’s happening in Britain. As Slashdot reports:

The BBC is reporting that they view the piracy of a Doctor Who episode before its broadcast date earlier this year as a ‘wake-up call about the demand for new technology’, in a refreshing change of opinion from most media/broadcasting corporations, who would damn this piracy without hesitation. They are forming plans to simulcast the television channels BBC1 and BBC2 on the web, as well as allowing users (only in the UK to start with, unfortunately) access to shows for a week after the broadcast date. It is worth noting that they are already trying out a system where they make shows available on the web before television broadcast, with The Mighty Boosh. Other BBC3 comedies are due to follow suit and become available on the internet.


Update: Those good old Poms. Here’s a story of the UK research councils stipulating that research funding will be conditional on open access to the results of research.

I’ll reproduce it in case the link ever gets lost.

Publishers make last stand against open access
Donald MacLeod
Tuesday August 30, 2005

Guardian Unlimited

Publishers and learned societies are fighting a last ditch action to stop the research findings of thousands of British academics being made freely available online.

The UK research councils, which control billions of pounds worth of funding, have announced their intention to make free access on the internet a condition of grants in a bid to give British research more impact worldwide as it is taken up and cited by other researchers.

The move has been backed by Sir Tim Berners-Lee, the inventor of the world wide web, and other academics.

But publishers who fear that open access will hit sales and damage the UK’s 25% share in the £7bn worldwide learned journals market are lobbying hard against the proposal. Both sides believe the battle has reached a critical stage.

Ian Diamond, the chief executive of Research Councils UK, the umbrella body representing the eight research councils, has proposed that from October academics archive final versions of their papers in repositories belonging to their own universities or subject bodies. These would not be edited, and possibly corrected, by a journal, but would be available free of charge to other researchers via the internet.

This month, the Association of Learned and Professional Society Publishers (ALPSP), whose members publish more than 8,000 journals, wrote privately to Prof Diamond seeking consultation and urging delay.

The policy would not only damage big publishers, but also hurt scores of learned societies, which publish journals, said Sally Morris, the association’s chief executive.

Journals organise the all-important peer review process, which is the quality control for research – although the academics involved do it for free – and this has to be paid for somehow, she pointed out.

Once all of a journal’s content was available free online, university librarians would stop buying it, she said. The advent of Google Scholar meant it was now easy to find the contents of a journal scattered among different repositories.

Ms Morris conceded that those physics journals where 100% of content was open access had not lost subscriptions yet, but there was a worrying trend of academics no longer reading the journals.

“We are worried that the research councils in the UK are trying to push in the direction of a parallel economy without thinking of the possible damage to the journals on which they parasitise.

“We need to talk together to maximise the dissemination of funded research, but without killing the goose. We need to examine very carefully the real risk to publishers and what we can do to minimise it,” added Ms Morris.

But a letter to the research councils signed by Sir Tim Berners-Lee, Stevan Harnad, professor of cognitive science at the University of Southampton, and other advocates of open access, dismisses the publishers’ fears.

“Not only are these claims unsubstantiated, but all the evidence to date shows the reverse to be true: not only do journals thrive and co-exist alongside author self-archiving, but they can actually benefit from it – both in terms of more citations and more subscriptions,” they said.

This entry was posted in Economics and public policy.
Stephen Bounds


The actual hard disk space required to store a substantial chunk of the ABC’s archives wouldn’t be super-expensive.

However, that’s predominantly a fixed cost. The killer is the *bandwidth* required to make the downloads available to all and sundry.

Let’s say 50,000 users access the service and each download two shows a week, on average, of 10MB each (a fairly conservative estimate).

That’s around 4,000 GB a month of downloads. Assuming bandwidth costs of $5/GB, and excluding other maintenance costs such as hiring systems administrators, this modest system would cost around $20,000 a month – some $240,000 a year. Hardly chump change for the ABC.
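The arithmetic behind those figures can be checked directly. This is just the back-of-the-envelope calculation above restated as code; all the numbers (50,000 users, 2 shows a week, 10MB per show, $5/GB) are the commenter’s assumptions, not measured data.

```python
# Back-of-the-envelope check of the bandwidth figures above.
# All inputs are the assumed values from the comment, not real ABC data.

users = 50_000          # assumed number of users of the service
shows_per_week = 2      # average downloads per user per week
mb_per_show = 10        # assumed size of one show in MB
weeks_per_month = 4     # rough weeks per month
cost_per_gb = 5         # assumed bandwidth cost in dollars per GB

gb_per_month = users * shows_per_week * mb_per_show * weeks_per_month / 1000
monthly_cost = gb_per_month * cost_per_gb
annual_cost = monthly_cost * 12

print(f"{gb_per_month:,.0f} GB/month")  # 4,000 GB/month
print(f"${monthly_cost:,.0f}/month")    # $20,000/month
print(f"${annual_cost:,.0f}/year")      # $240,000/year
```

So the quoted $240,000 is the annual figure implied by $20,000 a month of bandwidth.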

Nicholas Gruen

Yes, but a massive amount of the bandwidth relates to recent programs. I’d be surprised if having the rest available as an archive added much to the monthly bandwidth. Also, if one distributed all files as MP3s, they could be secondarily distributed through P2P systems. One could then choke the bandwidth of all files older than, say, four weeks, but leave ID codes for each program on the website, so you could then search for the file on a P2P file-sharing community.

Then again, given my ignorance, there could easily be something wrong with this logic. Any suggestions?

Stephen Bounds

P2P is an interesting idea. However, I think if the ABC were to choke off provision of the files after a certain time period, accessibility would be greatly compromised. Studies have repeatedly shown that a majority of P2P users are consumers, not producers; most people do not make their downloaded files available to others for upload, meaning that after four weeks there may not be many copies in the community to find.

Additionally, most companies, public or not, tend to dislike losing control when distributing their works, and the ABC may still be suspicious of such a decentralized concept.

This makes the situation a perfect scenario for using a P2P program like BitTorrent.

BitTorrent differs from other P2P programs because it uses a central ‘tracker’ URL serving .torrent files to reliably identify the file to download. This tracker information is then used to find other users currently downloading the same file, and a copy is obtained efficiently by sharing and combining file segments between users. (The BitTorrent algorithm cleverly discourages parasitic users by matching download and upload rates – e.g. you can get 10kB/s of downloads only if other users can upload from you at 10kB/s.)

The ABC is happy because it can track download numbers and will have far lower bandwidth bills, since users simultaneously downloading the same file will mainly obtain it from each other rather than from the ABC.

Users are happy because there is always at least one copy of the file available – the ABC’s “seed” copy – and they are guaranteed that the file they are downloading matches their request.
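The reciprocation idea described above can be illustrated with a toy sketch. This is not the real BitTorrent protocol (which uses periodic rechoking, optimistic unchoking and piece selection); it only shows the core tit-for-tat rule that a peer uploads to the few peers currently uploading fastest to it, so free-riders get choked. The peer names and rates are made up for the example.

```python
# Toy sketch of BitTorrent-style tit-for-tat choking (not the real
# protocol): each round, upload only to the few peers that have been
# uploading to us the fastest; everyone else is "choked".

def choose_unchoked(upload_rates, slots=4):
    """Given {peer: observed upload rate to us, in kB/s}, return the
    set of peers we will upload to: the fastest `slots` reciprocators."""
    ranked = sorted(upload_rates, key=upload_rates.get, reverse=True)
    return set(ranked[:slots])

# Hypothetical swarm: how fast each peer uploads to us.
swarm = {"anna": 40, "ben": 25, "carl": 0,  # carl is a free-rider
         "dee": 15, "eve": 30}

unchoked = choose_unchoked(swarm, slots=3)
print(unchoked)  # the three fastest uploaders; carl stays choked
```

The effect is the one the comment describes: a peer that uploads nothing (carl) never earns an upload slot, so download speed roughly tracks what you give back.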