My learning round-up from the 2014 UKSG Annual Conference
Video of talk – Slides from talk
The Big Picture: more machines, more people
Is the journal publishing model still fit for purpose? Why has it worked so well for 350 years? David identifies four main trends: shifts in scholarship, the end of the article, research objects, and social machines.
In The Big Picture (see image), David identifies four zones of interaction between more machines, and more people.
“Knowledge infrastructure” can be anything from a journal to a library. The Data Deluge is now called Big Data. In future, will research start with data rather than a hypothesis? See what patterns emerge, then try to explain them?
See also: Beyond the PDF2 conference outcomes, including why we need an open alternative to Google Scholar.
The R Dimensions
See also: David’s presentation “e-Research and the Demise of the Scholarly Article”
David points out that reproducibility [of an experimental procedure] is not the same as reproduction [of an experiment].
researchobject.org – “the Knowledge Hub for the Research Object community, to disseminate knowledge about Research Object, its concept, adoption, and other latest development.”
R Dimensions – “Research Objects facilitate research that is…” (see image) The list of R words was added to during the talk itself via Twitter!
Implementing e-resource access for alumni: Anna Franca, King’s College London
Video of talk – Slides from talk
As Anna mentions in her talk, the main issue for e-resources managers is setting up authentication so that alumni can access only the pool of resources which include them as authorised users, and not the wider pool of platforms and databases licensed for current students and staff.
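One common way of implementing that kind of split (not described in the talk; shown here purely as an illustration) is a proxy-level group restriction, for example using OCLC EZproxy’s Group directive, so that alumni accounts are assigned to a group that can only reach the stanzas licensed for them. A hypothetical sketch, with made-up resource names:

```
# Hypothetical EZproxy config.txt sketch. Resources under "Group Default"
# are reachable only by current staff/students; an alumni login is
# assigned to "Group Alumni" and sees only the resources in that group.
Group Default
Title Example Full Database (hypothetical)
URL http://fulldb.example.com
Domain fulldb.example.com

Group Alumni
Title Example Alumni Collection (hypothetical)
URL http://alumni.example.com
Domain alumni.example.com
```

The group an account belongs to is set at authentication time, so the same proxy can serve both populations from one configuration.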
See also: Extending access to academic research content to NHS users: a pilot (Carolyn Alderson) video – further info
Trust and authority in scholarly communications in the light of the digital transition: David Nicholas, CIBER Research Ltd; Carol Tenopir, University of Tennessee
Video of talk – Final report
This was an interesting report into how academics judge trust and authority in sources they use, and how they view open access publications. It’s a pity the slides from the talk aren’t available (I’ve nudged the conference organisers and will update this post if they become available).
Traditional indicators of trust include journal name, journal reputation, and author expertise. Now, access issues are also included – the reader has to be able to get to the article.
Reality of trust for academics:
- many read things they “trust” that they would never cite e.g. Wikipedia
- politics influence citing and publishing
- cite to protect yourself and add “trustworthiness”
- publish to help your career, which clouds the picture; younger academics are more conservative than older academics for this reason
- use different criteria for reading, citing, and publishing
Trust in reading is complex. To decide if a document is trustworthy:
- read abstract and methodology
- check for credible data and sound logic
- look at source’s references – end of first stage, navigational metrics
- colleague recommendations
- experience with author – end of second stage, social metrics
- familiarity with journal
- peer-review linked to quality
- impact factor is a factor… – end of third stage, traditional metrics
How trustworthiness is determined for citing:
- known and trusted authority – author, journal, or conference
- Seminal work in the field
- Supports methodology
- Research group/institution known
How trustworthiness is determined when deciding where to publish:
- traditional metrics still important
- influenced by promotion criteria
- institutional research policies
- audience of a journal
- likelihood of getting published
Differences by age group (and by discipline). Sciences are happy with OA if peer-reviewed; Humanities are more comfortable with traditional options. Younger researchers (under 40) are more likely to trust non-traditional methods of dissemination, e.g. social media, but they conform to norms when dealing with older researchers. They feel pressure to publish in highly-ranked journals to obtain research grants.
Academics cite people they know because they trust them. They cite OA journals if they are properly peer-reviewed. There is lots of confusion about the economic model and peer review in OA, especially among older academics. Lots of older academics think OA means not peer reviewed! [Le sigh.]
But academics recognise that there are problems with peer review. They still see it as essential, despite its flaws and pitfalls.
Metrics – trust and impact factor:
Altmetrics did not come up. Most participants were unfamiliar with them, or sceptical. They like metrics that can easily be understood. Authors like to see the number of views and downloads for their articles – just don’t call it altmetrics! Popular from a publishing point of view, but not from a trust angle.
When considering trust in an online environment, connectedness is key. A link sent by a contact carries more weight than one found by search, for example.
Some common thoughts about Open Access:
I think there is scope for the library to get involved and help educate academics on this!
The impacts of impact – challenges and opportunities of ‘multichannel’ academic work: Ernesto Priego, City University London
Video of talk – Ernesto’s article on this topic
Some great quotes:
- “Publishing – where content goes to die.”
- “Like reading the first few pages of Morrissey’s autobiography, you soon realise that academics never read any of the contracts they sign.”
- “In publishing, we’re aiming for collaboration, but what we’re getting is competition.”
I liked Ernesto’s description of scholarly publishing as a network of interconnected outputs – nodes, not monoliths.
Ernesto argued that publishing should be affordable (article/output processing charges, with waivers for students, the unemployed, etc.); sustainable; some (not all) rights reserved; easy to mine; using DOIs; and easy to map, measure, track, reproduce, and share. Scholarship should promote a culture of sharing, and it should be rewarding, not exhausting.
- “Why do we publish? Social, public responsibility in the act of research. Public money, public access.”
JiscLAMP – Library Analytics and Metrics Project
Slides of talk
JiscLAMP was a project to develop a prototype shared library analytics service for UK academic libraries. Find out more at the JiscLAMP site.
“ORCID provides a persistent digital identifier that distinguishes you from every other researcher and, through integration in key research workflows such as manuscript and grant submission, supports automated linkages between you and your professional activities ensuring that your work is recognized” – it’s like a DOI for a person. Register now to get your ORCID identifier!
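As an aside, the final character of an ORCID iD is a check digit, and ORCID documents the algorithm used (ISO 7064 MOD 11-2), so an iD can be sanity-checked locally. A minimal Python sketch; the iD below is ORCID’s own documented example identifier:

```python
def orcid_checksum(orcid_id: str) -> str:
    """Compute the expected check character for an ORCID iD,
    using the ISO 7064 MOD 11-2 algorithm that ORCID documents."""
    digits = orcid_id.replace("-", "")[:-1]  # drop the check character
    total = 0
    for d in digits:
        total = (total + int(d)) * 2
    remainder = total % 11
    result = (12 - remainder) % 11
    return "X" if result == 10 else str(result)

def is_valid_orcid(orcid_id: str) -> bool:
    """True if the iD's final character matches its computed checksum."""
    return orcid_id.replace("-", "")[-1] == orcid_checksum(orcid_id)

# ORCID's documented example iD passes the check:
print(is_valid_orcid("0000-0002-1825-0097"))  # True
```

A mistyped final digit (e.g. ending in 8 instead of 7) fails the check, which is exactly what the check digit is for.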
ORCID Live – waiting for one to pop up in Antarctica
ORCID Live allows you to see IDs being registered in real time – it’s quite hypnotic watching the pins dropping 🙂
Discovery or displacement?: a large-scale longitudinal study of the effect of discovery systems on online journal usage: Michael Levine-Clark, University of Denver Libraries; John McDonald, University of Southern California
Video of talk – Slides from talk – Summary by Rose Robinson
I enjoyed the statistical approach, and it provided an interesting alternative to the 2013 UKSG report “Impact of library discovery technologies” which was somewhat inconclusive. See also some earlier thoughts on an ethnographic approach in this area.
The main point to ponder for me was the finding that the increase in e-resource use varied by the discovery interface used, with the greatest increase seen for Summon, followed by Primo, then EDS and WorldCat Local (which were similar).
And not forgetting the intrepid band of runners who joined me for a run before the conference dinner on Tuesday 🙂 We really should have this on the official programme next year!