Friday 30 April 2010

Supporting early career researchers

At the JISC2010 conference earlier this month, I chaired a session on "researchers of tomorrow" - looking at how universities can best support this group.

Four projects were presented, all of which used different methodologies and examined slightly different demographics. "Researchers of tomorrow" might be postgraduate research students who are young; or those who are just starting their research career, regardless of age; or perhaps undergraduates who are not yet researching at all. Nonetheless, all researchers undertake similar tasks in their work, regardless of their field of study (finding and analysing data, organising information, and writing and sharing their findings), and those at the start of their research careers are likely to face similar challenges too.

Despite the differences in the approaches taken, it was very encouraging to see common themes emerging across all the projects.

One was the tension between competition and collaboration: whilst many early career researchers are happy to share with others and to work together, it remains the case that competition for funding, jobs and publications is fierce.

In addition, early career researchers always seem to find out about new technologies and tools from people - whether that means colleagues, friends, family or others, online or offline. There is often time pressure, which limits willingness to take risks and try new technology that might not prove useful. Time constraints also mean that training and support systems need to fit into research life: for instance, being offered at times and places which work for the early career researcher, and supporting learning whilst undertaking a real task with real information, so as not to waste time.

I look forward to the future results of the British Library study [4], and to seeing how Esther Dingley's study fits into it all!

[1] Cambridge-led study: CommentPress site and downloadable report

[2] OCLC review of JISC VRE work

[3] Other OCLC study - covered in the slides

[4] British Library study (interim results only, the project runs for another 2 years)

Monday 26 April 2010

JISC Cambridge Library Widgets Project

For those of you who thought widgets were bits of plastic that made beer fizzy, here's a new project to open your eyes. The JISC Cambridge Library Widgets Project is a collaboration between the University Library and CARET, aiming to deliver a joined-up, relevant and accessible suite of library services through small, portable web interfaces (or widgets).

The UL, CARET and Arcadia already have a history in widgets. They are a perfect tool for fast, lightweight development, for trying out ideas, and for knitting together the stuff of academic life. The UL and CARET collaborated on the Cambridge Library Widget - info (and screencast) here: http://www.lib.cam.ac.uk/toolbox/camlibwidget.html - and recent Arcadia Fellow Harriet Truscott did an enormous amount of work on an exam papers widget, which is one of the ideas we'll be taking forward.

We'll also be looking at interfaces for mobile devices, at integration with DSpace (our institutional repository) and with CamSIS (our student registry), and at some fun, innovative tools for the future.
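To give a flavour of what "small and portable" means in practice, here is a rough sketch of the kind of thing a library widget boils down to: a few lines of script that fetch structured data from a library service and render it into whatever page happens to host them. To be clear, this is illustrative only - the endpoint URL, field names and course code below are invented for the example, not part of any existing UL or CARET API.

```typescript
// Hypothetical sketch of a lightweight "exam papers" widget.
// The endpoint and its JSON shape are assumptions for illustration.

interface ExamPaper {
  title: string; // e.g. "Paper 1"
  year: number;
  url: string;   // link to the paper in the repository
}

async function renderExamPapersWidget(containerId: string, course: string): Promise<void> {
  const container = document.getElementById(containerId);
  if (!container) return;

  // Fetch exam-paper metadata from a (hypothetical) library JSON endpoint.
  const response = await fetch(
    `https://library.example.ac.uk/api/exam-papers?course=${encodeURIComponent(course)}`
  );
  const papers: ExamPaper[] = await response.json();

  // Render a plain list of links; the host page supplies all the styling.
  const list = document.createElement("ul");
  for (const paper of papers) {
    const item = document.createElement("li");
    const link = document.createElement("a");
    link.href = paper.url;
    link.textContent = `${paper.title} (${paper.year})`;
    item.appendChild(link);
    list.appendChild(item);
  }
  container.innerHTML = "";
  container.appendChild(list);
}

// Host page: <div id="exam-papers"></div>, then:
renderExamPapersWidget("exam-papers", "NST-IA"); // course code is a placeholder
```

The details will differ - a real gadget sits inside a container (iGoogle, a VLE, a departmental page) and uses the container's own plumbing to fetch data - but the shape is the point: lightweight glue between an existing data source and the pages people already use.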

We'll be blogging here: http://culwidgets.blogspot.com/ and I'll also be adding any major updates to the Arcadia blog.

Wednesday 21 April 2010

OUP takes on Wikipedia

From ArsTechnica:

The OBO tool is essentially a straightforward, hyperlinked collection of professionally-produced, peer-reviewed bibliographies in different subject areas—sort of a giant, interactive syllabus put together by OUP and teams of scholars in different disciplines. Users can drill down to a specific bibliographic entry, which contains some descriptive text and a list of references that link to either Google Books or to a subscribing library's own catalog entries, by either browsing or searching. Each entry is written by a scholar working in the relevant field and vetted by a peer review process. The idea is to alleviate the twin problems of Google-induced data overload, on the one hand, and Wikipedia-driven GIGO (garbage in, garbage out), on the other.

"We did about 18 months of pretty intensive research with scholars and students and librarians to explore how their research practices were changing with the proliferation of online sources," Damon Zucca, OUP’s Executive Editor, Reference, told Ars. "The one thing we heard over and over again is that people were drowning in scholarly information, and drowning in information in general. So it takes twice as much time for people to begin their research."

OBO grew out of that research, with the goal of helping scholars and students deal with information overload, possibly by skipping Google entirely. The resulting bibliography is fairly simple and lean, which is exactly the point. The messy and often politicized work of sorting and sifting the information has already been done for users, so that they can drill down directly to a list of the main publications in their target area.

"You can't come up with a search filter that solves the problem of information overload," Zucca told Ars. OUP is betting that the solution to the problem lies in content, which is its area of expertise, and not in technology, which is Google's and Microsoft's.

To trust OBO's content, you have to trust its selection and vetting process. To that end, OUP is making the list of contributing scholars and editors freely available. Each subject area has an Editor in Chief who's a top scholar in the field, and an editorial board of around 15 to 20 scholars. The EIC and editorial board either write the bibliographic entries themselves, or they select other scholars to do the work.

The launch version of OBO covers only four subject areas: Classics, Islamic Studies, Social Work and Criminology. But OUP has plans to add 10-12 new subject areas (known as modules) within the next year. Each subject area contains between 50 and 100 individual entries, and that number should grow at the rate of about 50 to 75 entries per year.


And the cost of all this peer-reviewed quality? Why, $29.95 a month or $295.00 a year.

He never told a lie, maybe. But did he pay his library fines?

Lovely story in today's Guardian.
Founder of a nation, trouncer of the English, God-fearing family man: all in all, George Washington has enjoyed a pretty decent reputation. Until now, that is.

The hero who crossed the Delaware river may not have been quite so squeaky clean when it came to borrowing library books.

The New York Society Library, the city's only lender of books at the time of Washington's presidency, has revealed that the first American president took out two volumes and pointedly failed to return them.

At today's prices, adjusted for inflation, he would face a late fine of $300,000.

The library's ledgers show that Washington took out the books on 5 October 1789, some five months into his presidency at a time when New York was still the capital. They were an essay on international affairs called Law of Nations and the twelfth volume of a 14-volume collection of debates from the English House of Commons.

The ledger simply referred to the borrower as "President" in quill pen, and had no return date.

Sure enough, when the librarians checked their holdings they found all 14 volumes of the Commons debates bar volume 12.

Under the rules of the library, the books should have been handed back by 2 November that same year, and their borrower and presumably his descendants have been liable to fines of a few cents a day ever since.

C'mon Obama: pay up. It's your Office's responsibility.

Thursday 15 April 2010

Library of Congress decides to archive Twitter firehose

From today's New York Times:

The Twitter archive will join the ambitious “Web capture” project at the library, begun a decade ago. That effort has assembled Web pages, online news articles and documents, typically concerning significant events like presidential elections and the terrorist attacks of 9/11, Mr. Raymond said.

The Web capture project already has stored 167 terabytes of digital material, far more than the equivalent of the text of the 21 million books in the library’s collection.

Some online commentators raised the question of whether the library’s Twitter archive could threaten the privacy of users. Mr. Raymond said that the archive would be available only for scholarly and research purposes. Besides, he added, the vast majority of Twitter messages that would be archived are publicly published on the Web.

“It’s not as if we’re after anything that’s not out there already,” Mr. Raymond said. “People who sign up for Twitter agree to the terms of service.”

Knowing that the Library of Congress will be preserving Twitter messages for posterity could subtly alter the habits of some users, said Paul Saffo, a visiting scholar at Stanford who specializes in technology’s effect on society.

“After all,” Mr. Saffo said, “your indiscretions will be able to be seen by generations and generations of graduate students.”

People thinking before they post on Twitter: now that would be historic indeed.

Monday 12 April 2010

Collaborative composition moves up a gear

Copyright 2010: back to first principles





On Friday last, Counterpoint, the British Council's thinktank, held a conference in London to mark the tercentenary of the Statute of Anne, the first piece of legislation on copyright. I was one of the two opening speakers. Here's my script on "Getting back to first principles".

...............................

When I think about this stuff, two images come to mind.

The first was conjured up by a fellow-countryman of mine in 1726. This is how he tells it:

“I was extremely tired, and with that, and the heat of the weather, and about half a pint of brandy that I drank as I left the ship, I found myself much inclined to sleep. I lay down on the grass, which was very short and soft, where I slept sounder than ever I remembered to have done in my life, and, as I reckoned, about nine hours; for when I awaked, it was just daylight. I attempted to rise, but was not able to stir: for, as I happened to lie on my back, I found my arms and legs were strongly fastened on each side to the ground; and my hair, which was long and thick, tied down in the same manner. I likewise felt several slender ligatures across my body, from my armpits to my thighs. I could only look upwards; the sun began to grow hot, and the light offended my eyes. I heard a confused noise about me; but in the posture I lay, could see nothing except the sky. In a little time I felt something alive moving on my left leg, which advancing gently forward over my breast, came almost up to my chin; when, bending my eyes downwards as much as I could, I perceived it to be a human creature not six inches high, with a bow and arrow in his hands, and a quiver at his back. In the mean time, I felt at least forty more of the same kind (as I conjectured) following the first. I was in the utmost astonishment, and roared so loud, that they all ran back in a fright; and some of them, as I was afterwards told, were hurt with the falls they got by leaping from my sides upon the ground.”


This is Jonathan Swift’s Gulliver, on the first of his celebrated travels.

The second image comes from Joseph Tainter’s intriguing book The Collapse of Complex Societies, in which he examined a number of sophisticated civilisations that flourished for aeons and then suddenly collapsed: these civilisations included those of the Romans, the Lowlands Maya and the Chacoans. Each of these societies had impressively complex social structures and very advanced technology, and yet, despite this, they collapsed, impoverishing and scattering their citizens and leaving little behind. How, Tainter asked, did this happen?

His answer was that they hadn’t collapsed despite their cultural sophistication, but because of it. Tainter’s account describes societies which, through a combination of social organization and environmental luck, find themselves with a surplus of resources. Managing this surplus makes each society more complex, and for a time the marginal value of this complexity is positive: each additional bit of complexity more than pays for itself in improved output. But over time, the law of diminishing returns reduces the marginal value, until it disappears completely. At this point, any additional complexity is pure cost. “Tainter’s thesis”, as Clay Shirky’s useful summary puts it, “is that when society’s elite members add one layer of bureaucracy or demand one tribute too many, they end up extracting all the value from their environment it is possible to extract --and then some”.

***

What have these two images to do with intellectual property?

Well, first of all, the Internet is our Gulliver, and the pygmies crawling about him are IP lawyers and their corporate clients.

My generation was lucky enough – or maybe smart enough, it doesn't matter – to invent something magical: a gigantic, global machine for springing surprises. Or, to put it more prosaically, a network for enabling disruptive innovation. The architecture of the TCP/IP-based Internet, with its lack of central control and its neutrality towards applications, has stimulated an astonishing wave of creativity in the decades since it was switched on in January 1983. Among the surprises sprung by the network to date have been: email, the World Wide Web, streaming media, peer-to-peer networking, cloud computing, VoIP, blogging, Flickr, social networking and powerful search engines. These innovations have transformed our information environment, to the point where life without them has become inconceivable.

The arrival of this unruly giant on our Lilliputian shores, however, caused panic in many quarters, particularly in those which had hitherto made a good living out of the status quo. And their response to it – as evidenced most recently, for example, in the undignified scramble to pass the Digital Economy Act in the dying hours of a Parliament – has been to attempt to immobilise the giant by binding it with billions of silken threads, woven by IP lawyers, in the hope that it can be rendered impotent and life can go back to the status quo ante.

But if we allow that to happen then we’re done for. Capitalism needs explosive innovation: that’s the source of its dynamism. It can’t get by on the cosy incrementalism of old business models. We desperately need Joseph Schumpeter’s waves of creative destruction if we’re to feed our exploding global population, provide citizens with health care and develop technologies which might arrest and eventually reverse global warming. But we’re stuck with an Intellectual Property regime that was shaped by old communications technology and the special interests that grew up around it, and is increasingly a barrier to innovation rather than an incentiviser of it.

Which brings me to Tainter, and his gloomy thesis about collapse. As many of today’s contributors have pointed out, our existing IP regime is increasingly hindering creativity rather than facilitating it. The content industries would dearly love to extend this regime to cover everything that goes on in the networked world. If they succeed it will be, in my view, the step too far that Tainter observed in the societies that he studied. And those who recommend it will find that, far from extracting even more value from the system for their shareholders, they may just choke it to death.

My fear is that this is what will, in fact, happen. Our situation is now one best described by the theory of incompetent systems – that is to say, systems that can't fix themselves because the components which need to change are driven by short-term considerations and are unable to think longer-term. Global warming belongs in the same category.

***

But perhaps this is too gloomy a thought to stomach on such a bright Spring morning. So let’s make an effort to be optimistic. If, by some miracle, we actually were able to muster the collective resolve to do something about our plight before it is too late, what should we do? To what First Principles should we return?

Historically, our approach to IP has been couched too much in terms of particular communications technologies – print, records, movies, broadcast, and so on. If we were to have the opportunity to redesign the system then we should escape from these shackles; we should formulate the design in terms of general principles rather than particular instantiations of transient technologies. Among other things, this would involve:

  • Explicit recognition that an IPR is not a presumptively absolute right but a temporary, conditional monopoly granted by society.

  • A default assumption that any creative product is in the public domain unless the creator explicitly asserts ownership of his or her rights.

  • A globally-agreed definition of ‘fair use’ that emphasises its status as a condition of the grant of a temporary monopoly, and not a privilege grudgingly granted by rights holders.

  • A return to a legislative philosophy which decides copyright duration by balancing the need to incentivise innovators with society's need for unrestricted access to creative outputs. This implies: (i) an obligation on legislators to seek objective assessments of the public interest in the context of any requests to extend IPRs; (ii) that appropriate durations may be different for different forms of expression; (iii) that durations should be regularly reviewed and adjusted to match changing circumstances; and (iv) that lawmaking on intellectual property should be strictly evidence-based, in the way that legislating on e.g. pharmaceutical products is. Rights holders petitioning for extensions of their temporary monopolies would be required to provide evidence that the proposed extensions would lead to increased innovation or some other tangible public benefit.

  • A redesign of the copyright system to make it efficient, in the sense that it is easy to identify rights holders.

  • Abolition of strict liability: penalties for inadvertent infringement should be proportional to the actual losses suffered by rights holders, and in the event of disputes compensation should be determined by independent arbitration.


None of this is rocket science. These principles seem to me to be patently obvious, if you'll excuse the pun. Some of them were obvious in 1710, and many were understood – and extensively discussed – by the framers of the US Constitution in the 1780s. Yet over the intervening 300 years we appear to have forgotten many of them. It'd be nice to think that we can begin learning from our mistakes. But I wouldn't bet on it.

Evil Empires and many Davids ... latest on publisher practices

Meredith Farkas has recently blogged about EBSCO's continued policy of providing market 'exclusives' on popular academic journal titles in their packages, to the extent of removing journal backfiles from JSTOR. This also ties users into EBSCO's set of discovery tools, which many find unwieldy and restrictive.

Furthermore, the society that publishes the journal seems to be complicit in this practice. Comments on the blog reveal a lot of anger from librarians. Barbara Fister summed the situation up neatly at libraryjournal.com:

Farkas offers a synonym for “exclusive”: extortion. She also, in an incisive analysis worthy of von Clausewitz, surmises that the society has found a nifty way to increase their membership by removing its journal from most libraries that had it through the nearly-ubiquitous Academic Search Premier. You want the journal? Join the club.

Out of this situation, librarians tend to arrive at three conclusions:

• Articles published in this journal have deliberately been made into an artificially scarce resource.
• The society involved and/or EBSCO stand to make more money because of this new scarcity.
• Fewer people will have access to the information—and parties to the exclusive agreement think that’s great!

What’s interesting in all this is that the major providers of access to this journal—libraries—were not consulted at all. In fact, Farkas only found out about it because links through Serials Solution that used to point her patrons to access points in JSTOR and Academic Search Premier had simply vanished.

Who’s got the ransom note?
Here’s where I think the word “extortion” doesn’t quite work. Usually, extortionists communicate with their victims: “pay us, or else.” In this case, libraries have had to stumble across the “or else” before they could do some frantic research to find out where to send the ransom and what it will cost while patrons stand at the reference desk or wait on the phone, rolling their eyes at stupid librarians.

Are we so insignificant that nobody even bothered to copy us on the ransom note?

I personally feel that the issues raised in these two articles tie directly into the mission of the Arcadia fellowship: exploring the role of the library in the networked environment, and specifically how we can meaningfully interact with the providers of online scholarly content.

As well as being curators of print and digitised material, libraries currently play a major role in purchasing and negotiating large packages of online material. However, when faced with this kind of practice we can arguably appear obsolete - unnecessary middlemen.

Lorcan Dempsey visited Cambridge late last year and remarked in a discussion that 'a medium sized academic library that uses EBSCO for the majority of its online material might as well replace its library with EBSCO' (or words to that effect). Another recent survey in the States has warned of the increasing irrelevance of academic libraries.

Potential solutions for dealing with this kind of practice include negotiating collectively, as we in the UK are increasingly doing through JISC Collections. In a follow-up post, Meredith asks whether consortia in the States are doing this. Certainly, the response from the library community to EBSCO's practices shows that we are at least willing to co-ordinate on sharing experiences and feelings regarding publisher and vendor practice.

Or, on a more radical note, maybe it's time to give the whole peer review model that underpins academic publishing a serious rethink? Digital repositories continue to be only marginally successful as alternative and open publishing methods, and will continue to be so as long as the processes of peer review remain with journal publishers. Peer review is in itself a sound and trusted method of ensuring academic credibility, but why do the methods and processes of review, and the distribution of peer-reviewed output, need to remain in the hands of profit-making bodies?

With librarians and publishers getting together at UKSG this week, it's all food for thought.

Tuesday 6 April 2010

Farewell to Arcadia - an end or a beginning?


My final post on the Arcadia Project blog, as the Exam Paper Browser project has now come to an end. It was a challenging project - creating a test case for the potential to provide staff and students with personalised resources and services, and linking together three different University IT services with Google's gadget technology. But after just 10 weeks, we now have a working prototype of the system up and running. Of course, this prototype Exam Paper Browser can be developed by the University to provide a central bank of online exam papers, but it also offers a starting point for developing a whole range of tools that display information from DSpace in easy-to-use formats directly on any web page. It should be comparatively easy, for example, to develop a tool that automatically displays PhD theses submitted by Department members on a Department web page.
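For anyone curious about the mechanics, here is a hedged sketch - emphatically not the project's actual code - of how a gadget-style tool might pull recent theses out of DSpace and drop them into a departmental page: fetch the collection's syndication feed, parse it, and write the latest items into the page. The domain, feed path and handle below are placeholders (the real feed URL depends on the DSpace version and configuration), and a real gadget would fetch the feed through its container's proxy to get around the browser's same-origin restrictions.

```typescript
// Hypothetical sketch: list recent items from a DSpace collection's RSS feed.
// The feed URL pattern and the collection handle are placeholders, not real endpoints.

async function showRecentTheses(containerId: string, collectionHandle: string): Promise<void> {
  const container = document.getElementById(containerId);
  if (!container) return;

  // Assumed DSpace syndication feed for the collection (path varies by installation).
  const feedUrl = `https://repository.example.ac.uk/feed/rss_2.0/${collectionHandle}`;
  const xmlText = await (await fetch(feedUrl)).text();

  // Parse the RSS in the browser and pull out the title and link of each item.
  const doc = new DOMParser().parseFromString(xmlText, "application/xml");
  const items = Array.from(doc.querySelectorAll("item")).slice(0, 10);

  const list = document.createElement("ul");
  for (const item of items) {
    const title = item.querySelector("title")?.textContent ?? "Untitled";
    const link = item.querySelector("link")?.textContent ?? "#";
    const li = document.createElement("li");
    const a = document.createElement("a");
    a.href = link;
    a.textContent = title;
    li.appendChild(a);
    list.appendChild(li);
  }
  container.innerHTML = "";
  container.appendChild(list);
}

// A department page would embed something like:
// <div id="recent-theses"></div>
// showRecentTheses("recent-theses", "1810/12345"); // handle is a placeholder
```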

So, let's hope that this is a beginning rather than an end, and that this will be the start of a whole range of gadgets pulling material out of DSpace to be displayed in helpful ways across the University.

I'm now back in CARET, and will be sharing my new-found knowledge about gadget design with my colleagues here, before taking up a new post at the Botanic Gardens.

Finally, a big thank you to all those at the University Library and beyond who were so consistently helpful and welcoming - special thanks go to Huw, Lizz, Ed, Ray, Elin, Barbara, Simon, Sebastiaan and the wonderful people on my support team!

DIY book scanning



Above is the elevator pitch. The video below tells the whole story.



Of course this isn't the way Serious Libraries would approach the task. Perish the thought!

Thanks to the Berkman Center for the videos.