Intuitive interfaces

I think there is a limit to how “intuitive” library resource discovery tools can be.  The more complicated the system behind the interface, the more one needs to know about how it works in order to use it well.  This is different from usability, which is about optimising the match between user intention and means to achieve it.

Do you remember the brief fashion for federated search in the late 2000s?  These interfaces were promoted as a simple way to search multiple databases simultaneously.  However, the reality was that such systems would display results in the order they were returned from the remote servers (rather than ranked by relevance*, as many users expected) and would often display only the first 50 results retrieved, rather than every matching record.  Once users understood what a federated search tool was doing, it often prompted them to return to searching native interfaces separately, where they could at least be more confident that each tool was performing a comprehensive search.
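The mechanics described above can be sketched in a few lines. This is a toy illustration, not any real product's code; the source names, scores, and the 50-record cap are stand-ins for the behaviour the paragraph describes:

```python
# Toy sketch of early federated search: results are merged in the order
# each remote server responds, and each source is truncated to its
# first 50 records. No global relevance ranking happens at all.

MAX_PER_SOURCE = 50  # the per-source cap described above

def federate(responses):
    """responses: list of (source_name, hits) in *arrival* order,
    where hits is a list of (title, relevance_score) tuples."""
    merged = []
    for source, hits in responses:
        # Truncation: only the first records from each source survive.
        merged.extend((source, title) for title, _score in hits[:MAX_PER_SOURCE])
    return merged  # arrival order, not relevance order

def rank_by_relevance(hits):
    """What many users expected instead: one relevance-ranked list."""
    return sorted(hits, key=lambda h: h[1], reverse=True)

# A fast source with a weak match responds before a slow source with a
# strong match, so the weak match heads the federated list.
responses = [
    ("FastDB", [("weak hit", 0.2)]),
    ("SlowDB", [("strong hit", 0.9)]),
]
```

Running `federate(responses)` puts FastDB's weak hit first simply because it arrived first, which is exactly the behaviour that sent users back to the native interfaces.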

*Relevancy ranking of results – in itself, another concept that, once understood, tends to be discarded in favour of more transparent ordering, e.g. by publication date.  Relevancy algorithms are often closely guarded secrets, and I understand that they operate on a popularity basis, where the most-downloaded or most-cited articles rank highest in search results.  This may work well for general web searches, but it’s hardly how scholars would want their academic searches to operate, especially as research often involves seeking obscure or niche information which by definition will score poorly on popularity.
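To make the footnote concrete, here is a deliberately crude popularity-style scorer. The weights and figures are invented for illustration; no real discovery system publishes its formula, which is part of the problem:

```python
# Hypothetical popularity-only scoring: downloads and citations are all
# that count, so a niche but highly relevant study sinks to the bottom.

def popularity_score(downloads, citations, w_downloads=1.0, w_citations=2.0):
    # Note there is no notion of topical relevance anywhere in here.
    return w_downloads * downloads + w_citations * citations

articles = [
    # (title, downloads, citations) -- made-up numbers
    ("Widely downloaded survey", 5000, 800),
    ("Obscure but on-topic study", 12, 3),
]

ranked = sorted(
    articles,
    key=lambda a: popularity_score(a[1], a[2]),
    reverse=True,
)
```

However relevant the obscure study is to the query, it can never outrank the popular survey under a scheme like this, which is the researcher's complaint in a nutshell.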

Disruptive forces in the “staggeringly profitable” business of academic publishing

There is an excellent long read in today’s Guardian: Is the staggeringly profitable business of scientific publishing bad for science?

Learn how academic publishing became so profitable, how library subscription costs rose rapidly (the serials crisis) and Big Deals began, and how open access developed as an alternative to subscription publishing (see also my other posts on open access).

Sci-Hub, a different way of disrupting the subscription and paywall model, is in the news at the moment: US court grants Elsevier millions in damages from Sci-Hub – though it’s far from clear if or when they may receive any of it.

Meanwhile, Finnish researchers have launched a boycott against Elsevier: “The group behind Tiedonhinta.fi statement urges researchers to refrain from peer review and editorial duties for journals owned by publishing giant Elsevier.  The boycott is launched on a new website nodealnoreview.org. The site welcomes also signatures from international colleagues all around the world, who are worried about cost of and access to research literature in their own countries.”

The 10 commandments of experimental data

Here is the original French version by Charles Nepote:
Les 10 commandements de l'expérimentation data

  1. À apprendre, pas forcément à réussir, tu chercheras
  2. Des hypothèses tu formuleras ou l’exploration tu assumeras
  3. Tes réussites comme tes échecs tu partageras
  4. À la diversité des profils et contributeurs tu veilleras
  5. Frugalité, agilité, simplicité tu chériras
  6. Un accès aux données tu obtiendras
  7. De l’intérêt des données tu ne préjugeras pas
  8. Dans des univers nouveaux, les données tu chercheras
  9. Face aux données, un esprit critique tu garderas
  10. À ces principes tout le monde adhérera

And my quick-and-dirty English translation:

  1. You will seek to learn rather than succeed
  2. You will formulate hypotheses or take an exploratory approach
  3. You will share both your successes and your failures
  4. You will be alert to diversity of profiles and contributors
  5. You will seek frugality, agility, and simplicity
  6. You will make your data accessible
  7. You will not prejudge the appeal or point of your data
  8. You will seek data in new fields
  9. You will maintain a critical eye when faced with data
  10. Everyone will adhere to these principles

Sounds like an excellent manifesto.  Feel free to improve upon my translation!

Supporting Evolving Research Needs

My notes from yesterday’s “Supporting Evolving Research Needs” conference organised by ALISS, the Association of Librarians and Information Professionals in the Social Sciences.

1. The Systematic Review – is the social sciences librarian involved? If not, why not?

Alan Gomersall, Senior Visiting Research Fellow, Centre for Evidence & Policy, King’s College London

Alan spoke of his experience of working with academics involved in doing systematic reviews to inform national policy.  He found that the academics only searched one database (Medline) and did not use synonyms, broader/narrower keywords, or related terms when searching.  He and a colleague wrote a paper about this, to try to find out why the academics’ search skills were so poor.

His paper identified weaknesses in the systematic review process, e.g. academics ignoring all grey literature on the grounds that it wasn’t peer-reviewed.

Home Office guidelines for systematic review focus on synthesis of findings, not search strategies.  Alan’s work shows that key UK information is being systematically excluded in favour of information from the big-name US databases.

Possible points of failure:

  • Uni library fails to invest in appropriate databases
  • Social sciences librarian & academic staff fail to work together
  • Academic’s poor search skills
  • Too much trust placed in WoK, Sociological Abstracts etc
  • Social sciences librarian never leaves confines of the library

Alan encourages everyone to trial or subscribe to Social Policy & Practice, a good source of UK information

Further questions

  • Are UK unis ignorant of the many excellent but small social science databases?
  • Are UK database producers failing to market their products?
  • Are UK library schools limiting student training to a few well-known US services which offer discounts for educational purposes e.g. WoK?
  • Influence of the Campbell Collaboration and the refusal by many US databases to accept grey literature

Social sciences librarians must engage with their academics!

Evidence Network site – option to sign up for Alan’s free bimonthly newsletter

Miggie Pickton argues for librarians to be involved in systematic reviews and included in research bids

Centre for Reviews & Dissemination at York Uni – sets a good standard

2. “What did I do wrong?” – a project to support independent learning practices to avoid plagiarism

Helen Hathaway, Liaison Team Manager (Science) and Information Skills Coordinator, University of Reading Library

Panic, stress, anxiety, confusion – lots of emotional issues about plagiarism and referencing

Does TurnItIn help with academic practice/referencing? Mixed answers.  May sensitise students to good practice.  Some academics report that it fails to detect plagiarism.

Referen©ite, Uni of Auckland – student voice videos give perspectives on the importance of correct referencing, e.g. it shows respect for predecessors’ ideas

Uni of Reading have developed a re-purposable resources toolkit – the “Academic integrity toolkit”.  Aimed at academics.  It’s meant to be bite-sized and incorporated into teaching, not just given out to students for them to read (/ignore).  Considering publishing it as an Open Educational Resource.  For now, guest access to their Blackboard can be arranged.  Contact details here.

Results of research

  • Crucial to go beyond formatting and show role of correct referencing in academic writing
  • Many students failed to engage with skills training
  • Students report lack of consistency and difficulty in finding guidance
  • Implications of alternative academic cultures and experiences (international students)

3. Supporting the Research data management [RDM] process – a guide for Librarians

John Southall, LSE Data Librarian

Digital media formats aren’t future-proof, and researchers have trouble referring back to their notes from 5 or 10 years ago if they can no longer open the files, or no longer have the appropriate disk drive

The strengths of digital media are that it is easily stored, produces perfect copies, and has great potential for sharing and re-use

RDM includes docs, spreadsheets, research notebooks/codebooks, questionnaires, transcripts, audio, images, videotapes.  A lot of data is generated before any paper is drafted.

UK Data Archive – best practice for creating, preparing, storing and sharing data

Research data objects are acquired or generated during the research process.  Includes protocols and methodologies

Common themes in RDM:

  • Storage and preservation issues
  • Metadata
  • Research ethics (of data creation, of sharing)
  • Data management plan and planning

Not just compliance.  Consider what you would do if you lost your research data tomorrow…

Contact details for John

4. Identifiers for Researchers and Data: Increasing Attribution and Discovery

John Kaye, Lead Curator Digital Social Science, British Library

ODIN = ORCID (Open Researcher and Contributor ID) and DataCite Interoperability Network

Identifiers such as DOIs uniquely identify research objects.  DOIs are assigned by DataCite and CrossRef.  I think the difference is that DataCite makes DOIs for things that aren’t articles, whereas CrossRef assigns DOIs for articles.  ARK = Archival Resource Key, a URL scheme for creating persistent identifiers.
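The structure of a DOI is simple enough to sketch: an opaque string of the form `prefix/suffix`, where the prefix always begins with `10.` and identifies the registrant, and the whole thing resolves through the doi.org proxy. A minimal helper, assuming only that convention:

```python
# Minimal sketch of DOI structure and resolution. A DOI looks like
# "10.<registrant>/<suffix>"; prepending "https://doi.org/" yields a
# resolvable persistent link, whichever agency (CrossRef, DataCite)
# assigned it.

def doi_to_url(doi):
    prefix, _, suffix = doi.partition("/")
    if not prefix.startswith("10.") or not suffix:
        raise ValueError("not a DOI: " + doi)
    return "https://doi.org/" + doi
```

For example, `doi_to_url("10.1000/182")` gives `https://doi.org/10.1000/182` – the persistent link works even if the publisher's own URL scheme changes, which is the whole point of the identifier.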

ImpactStory – view impact of your work using traditional citation metrics and social citations.  Log in using ORCiD details.  See also this introduction to using ImpactStory.

5. Sharing information literacy teaching materials openly: Experiences of the CoPILOT project

Nancy Graham, Subject Advisor (Medicine), University of Birmingham and Dr Jane Secker, Copyright and Digital Literacy Advisor, LSE

OER = open educational resources.  Like a CC licence for resources you’ve created.  OER Commons.  OERs are complementary to Open Access, MOOCs, RDM

DELILA = Developing Educators’ Learning and Information Literacies for Accreditation.  Cross-institutional project to adapt digital and IL [information literacy] resources to OER.

Project CoPILOT – funded by JISC/HEA and aimed to develop a strategy to promote international sharing.  Project is a sub-group of CILIP IL Group.

Mailing list: IL-OERS@jiscmail.ac.uk

Wiki: http://iloer.pbworks.com

Twitter: @CoPILOT2013

CoPILOT – like crowdsourcing of IL materials: a gateway of links to sites where materials are hosted.  Good use of tags will be important.

6. Supporting research by becoming a researcher

Miggie Pickton, Research Support Librarian, University of Northampton

Miggie’s slides from this presentation

My notes from a similar presentation at Umbrella 2011.  Contact details for Miggie.

Librarians as researchers: that’s a good IDEA

I really enjoyed this session, led by Miggie Pickton (University of Northampton) and Carolynn Rankin (Leeds Metropolitan University).

What does research look like?  Everyday research skills include: reading, watching, questioning, summarising, presenting, listening, choosing, organising, writing up, reflecting.  Many of us are already doing research, but maybe we just don’t realise that it is research!

Research is the professionalisation of everyday skills (Blaxter, 2008)

Library practitioners are often highly innovative in their practice and undertake research-related activity as a normal part of their working lives.

This new knowledge and understanding is often not recognised as research nor is it shared with the wider professional community.

We did an icebreaker exercise to meet each other and learn about the types of research activity we had each been involved in:

For each person we noted their name and library service, and what they had done:

  • Has had to provide evidence of service value
  • Has engaged with an external quality benchmark
  • Has had to collect statistics for annual reviews
  • Has run a focus group
  • Has written and/or presented a report to their organisation
  • Has helped a service user find resources for their research
  • Has contributed to a publication
  • Has explored ways of improving their service

Miggie and Carolynn introduced the framework for developing your research:

I=interest, issue, idea
D=develop, discuss, define
E=engage, elaborate, enact
A=advocate, advertise, apply

I=interest, issue, idea

  • Identify a project or research opportunity that interests you or meets a need
  • What do I want to know?
  • How could this help my practice or benefit my organisation?
  • What’s in it for me?

D=develop, discuss, define

  • Define the research question
  • What has been done on this before? Where is the evidence base? Where are the gaps?
  • Develop the project proposal – SMART objectives, appropriate methods

E=engage, elaborate, enact

  • Partnership and connections
  • Look for common goals
  • Win-win agenda
  • Who will you engage with and how?
  • This might be partners, colleagues, management, funders, policymakers

A=advocate, advertise, apply

  • Who needs to know about your work? Service users, managers, funders, policymakers
  • Where will your research make a difference?
  • Effecting change within and beyond the library

S=Skills

Finally, the multiplier effect comes in when you add skills.

This session was practical and energising, and it started me thinking about the many ways I could apply these ideas to my work.