Why Altmetrics?
Zohreh Zahedi (@ZohrehZahedi, [email protected])
Centre for Science and Technology Studies (CWTS), Leiden University
"Bibliometrics, Scientometrics & Alternative Metrics: which tools for which strategies?", ADBU, 1 April 2015, BULAC, Paris, France

Outline
• The importance of altmetrics
• Introduction of altmetrics (concept, tools and data sources)
• What do we know?
• What does it show?
• What can we do?
• Problems & opportunities

Why are altmetrics important?
• Different ways of measuring the impact of research: pros & cons
• Traditional metrics: peer review and citation analysis
• Novel web-based metrics: 'altmetrics' or 'social media metrics'
Limitations:
• Scope: partial vs. complete impact/quality of research
• Format: a limited set vs. diverse research products (articles and reviews vs. datasets, blogs, software, etc.)
• Speed: a long time to accumulate vs. real-time impact
• Audience: measuring only the use of research by researchers (scientific impact) vs. by broad audiences such as the general public (societal impact?)
• Real-time impact: superficial use?
• Reliability: gaming, manipulation or boosting of the impact?
• Producers: human or robot?

Then what is the best approach for measuring the impact/quality of research?
Still an unanswered question, but there is no single way!

What are altmetrics?
• There is no exact definition of altmetrics (from a conceptual point of view)
• "It is a good idea but a bad name" (Rousseau & Ye, 2014)
• A diversity of terms has been suggested: 'social media metrics', 'influmetrics', etc.
• Altmetrics are seen as metrics about articles (article-level metrics) vs. journal impact factors (journal-level metrics)
• First introduced in the 'Altmetrics manifesto' by Priem et al. in 2010
• Some consider all views, downloads, readerships, mentions in social media and news media, etc. as altmetrics
• Refers to different activities on different platforms: blogs, Twitter, Facebook, Wikipedia, reference management tools, etc.

Altmetrics definition
Altmetrics refers to mentions of scientific outputs in social media (e.g. Twitter, Facebook, blogs), in crowdsourced tools (e.g. Mendeley), or to any online activity around research products captured by altmetrics tools.

What are the tools and data sources for altmetrics?
A diversity of tools is available, among them:
• ImpactStory
• Altmetric.com
• PLOS ONE
• Plum Analytics
• Mendeley
Some offer an open API! They provide anything from a single metric to a diverse set of metrics.

What type of altmetrics data & tools?

What type of altmetrics data? Example: Mendeley
• A free reference management tool
• A database of scientific outputs
• More than 2.8 million users
• Usage statistics based on users
• Open API
• Distribution of readerships by users across LR fields
• What are the most common types of users in Mendeley?

Limitations of altmetrics tools
• Different metrics available: readerships, tweets, FB shares, comments, blogging, etc.
• Limitations:
  • No clear meaning of these metrics
  • Manipulability
  • Difficult scalability and data collection (although APIs are available; see the example sketch below)
  • No normalization of indicators
  • Low level of data standardization
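The data-collection point above is easiest to see with a concrete call. The following sketch is not from the presentation: it is a minimal illustration of how one might query the free Altmetric.com public v1 API for a single DOI. The endpoint and JSON field names reflect that public API as I understand it and should be verified against the current documentation; the example DOI is arbitrary.

```python
# Minimal sketch: fetch a few altmetric counts for one DOI from Altmetric.com.
# Assumptions (not from the slides): the v1 endpoint and the JSON field names
# used below; check the current Altmetric API documentation before relying on them.
import requests

def altmetric_counts(doi: str) -> dict:
    """Return selected social-media counts for a DOI, or {} if it is not tracked."""
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    if resp.status_code == 404:  # DOI not (yet) seen by Altmetric.com
        return {}
    resp.raise_for_status()
    data = resp.json()
    return {
        "twitter": data.get("cited_by_tweeters_count", 0),
        "facebook": data.get("cited_by_fbwalls_count", 0),
        "blogs": data.get("cited_by_feeds_count", 0),
        "wikipedia": data.get("cited_by_wikipedia_count", 0),
        "mendeley_readers": int(data.get("readers", {}).get("mendeley", 0) or 0),
    }

if __name__ == "__main__":
    # Arbitrary example DOI, purely for illustration.
    print(altmetric_counts("10.1371/journal.pone.0000308"))
```

Free public endpoints like this are typically rate-limited, which is one aspect of the scalability problem noted in the limitations above; large-scale collection generally requires an API key or bulk access from the provider.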
What do we know about altmetrics?
• Main research topics, so far:
  – Coverage
  – Correlations
  – Content analysis
  – Data problems, quality & validity

Coverage of publications
• Which altmetrics data sources & tools have the highest coverage of WoS publications?
• A random sample of 20,000 WoS publications from all fields of science (2005-2011), collected with ImpactStory: Mendeley has the highest coverage (62.6%).

Data source            Papers with metrics     %      Papers without metrics     %
Mendeley readers       12,362                  62.6   7,392                      37.3
Twitter                324                     1.6    19,448                     98.3
Wikipedia mentions     289                     1.4    19,483                     98.6
Delicious bookmarks    72                      0.3    19,700                     99.7

Coverage of publications?
- OK for Mendeley (>70%)
- Low for other sources (Robinson-García et al., 2014)
- Increasing over time (Costas et al., 2014)

(Cor)relations with citations
• Moderate correlations with citations for Mendeley
• Weak for the other sources (F1000, Twitter, blogs, news, etc.)
• Also not very good at filtering highly cited publications, except for Mendeley
[Figure: precision-recall curves for JCS (blue line) and total altmetrics (green line) for identifying the PPtop10% most highly cited publications (Zahedi, Costas & Wouters, 2014; Waltman & Costas, 2014)]

Coverage by fields
• There are some remarkable patterns (Costas et al., 2015):
  – Twitter: stronger in Social Sciences and General Medicine, lower in Natural Sciences and Humanities

Content analysis
• Analysis of the titles and abstracts of publications with altmetrics [Costas et al., 2014; Zahedi & Ness, 2014]
• Distribution of citations and altmetrics by:
  • Disciplines (subject categories)
  • Topics (terms in the titles and abstracts)
• VOSviewer (www.vosviewer.com)

What are the fields with a higher density of readerships vs. citations?
Readership activity vs. citation activity:
- Social Sciences
- Humanities
[Term map: readership activity is densest in the social sciences, e.g. literature & political science, cognitive psychology, marketing]

Data problems: DOI vs. title retrieval strategy
[Figure comparing title search and DOI search]

Data problems: incorrect metadata
[Figure: per-field percentages of correct vs. incorrect metadata (Author, DOI, ISSN, Issue, Pages, Source, Title, Volume, Year) for title search (n=182) vs. DOI search (n=241)]

Data problems: error types
• What is the best retrieval strategy for collecting Mendeley reader counts? (a minimal comparison sketch appears after the opportunities slide below)
• Consistency of altmetrics data among different providers is necessary!

ImpactStory vs. Mendeley vs. Altmetric.com
• The same document across different altmetrics providers

Data problems
• Inconsistencies (Zahedi et al., 2014)

Data problems summary
- Inconsistencies across different altmetrics providers (due to their different data collection processes, timing, etc.) (Zahedi, Fenner & Costas, 2014)
- Fluctuation in Mendeley coverage and readership counts over time and across different retrieval strategies (Bar-Ilan, 2014; Zahedi, Haustein & Bowman, 2014)
- Duplicates, different versions of the same document, data quality (incomplete or incorrect metadata), retrieval errors (DOI vs. title-author search?), user profile updates?

What are the opportunities?
– As an indicator of hot/popular topics being discussed
– As an informative tool for identifying usage patterns by different user categories (students vs. researchers vs. other professionals or the general public): informing other types of impact (societal and cultural vs. scientific impact)
– As an indicator for evaluating scientific impact: maybe for Mendeley, but not yet for Twitter, blogs, news media, etc.
– As a complementary tool for measuring research impact, especially for fields with low coverage in citation databases
But it is still not known whether altmetrics show research impact, visibility, attention, buzz, popularity or noise.
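To make the DOI-versus-title retrieval comparison from the data-problems slides concrete, here is a minimal, hypothetical sketch (not part of the presentation) that fetches Mendeley reader counts with both strategies and flags disagreements. The endpoint paths, the view=stats parameter, the reader_count field, and the ACCESS_TOKEN placeholder are all assumptions about the public Mendeley catalog API; check the current documentation before using them.

```python
# Sketch: compare Mendeley reader counts retrieved by DOI vs. by title search.
# Assumptions (not from the slides): the /catalog and /search/catalog endpoints,
# the view=stats parameter and reader_count field of the Mendeley API, and that
# ACCESS_TOKEN holds a valid OAuth token obtained separately.
import requests

API = "https://api.mendeley.com"
ACCESS_TOKEN = "..."  # hypothetical placeholder; obtain via Mendeley OAuth
HEADERS = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "Accept": "application/vnd.mendeley-document.1+json",
}

def readers_by_doi(doi: str):
    """Reader count of the catalog record matched on DOI (None if no match)."""
    r = requests.get(f"{API}/catalog", params={"doi": doi, "view": "stats"},
                     headers=HEADERS, timeout=10)
    r.raise_for_status()
    docs = r.json()
    return docs[0].get("reader_count") if docs else None

def readers_by_title(title: str):
    """Reader count of the top title-search hit (may be the wrong document)."""
    r = requests.get(f"{API}/search/catalog", params={"title": title, "view": "stats"},
                     headers=HEADERS, timeout=10)
    r.raise_for_status()
    docs = r.json()
    return docs[0].get("reader_count") if docs else None

def compare(doi: str, title: str) -> None:
    """Flag publications where the two retrieval strategies disagree."""
    by_doi, by_title = readers_by_doi(doi), readers_by_title(title)
    if by_doi != by_title:
        print(f"Mismatch for {doi}: DOI search={by_doi}, title search={by_title}")
```

As the data-problems slides note, disagreements between the two strategies typically point to duplicates, incomplete or incorrect metadata, or different versions of the same document rather than to real differences in readership.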
But many unanswered questions remain, and still more research is needed!

Any questions?
Thank you very much for your attention!