Open access
SIG Special Topics
Tutorial
13 June 2022

How to Document Scientific and Clinical Impact of Research: Six Steps to Success

Publication: Perspectives of the ASHA Special Interest Groups
Volume 7, Number 3
Pages 679-695

Abstract

Purpose:

The purpose of this article was to provide a brief tutorial about impact metrics and how to use these metrics to document scientific impact. In addition, examples are provided that describe possible ways to document clinical impact of research.

Method:

We briefly introduce traditional bibliometrics for journals (e.g., impact factor), articles (e.g., citation counts), and authors (e.g., h index). We describe alternative metrics (i.e., altmetrics) that focus on other types of dissemination metrics such as usage (e.g., downloads, reads, and views), engagement (e.g., comments, shares, and replies), and attention (e.g., Altmetric Attention Score). We also discuss how these metrics are used by others to make decisions about employment, tenure and promotion, funding, and the like. We detail six steps to documenting the scientific and clinical impact of your research. Steps include establishing an ORCID (Open Researcher and Contributor ID) account, creating research profiles on academic platforms, engaging in broad dissemination activities, harvesting bibliometric and altmetric data, adding bibliometric and altmetric information to your curriculum vitae (CV), and submitting your promotion and tenure portfolio with confidence. Examples for ways in which scientific impact and clinical impact of research can be documented on your CV are provided.

Results:

Readers will have an introductory understanding of bibliometrics and altmetrics and how these data may be used in the evaluation of their research impact and reach. Some strategies are offered as means to increase the scientific and clinical impact of research using research profiles on academic platforms. Others are suggestions for documenting scholarly activity aimed at increasing the clinical impact of research.

Conclusions:

Creating profiles with organizations that document scientific impact and engaging in the research community using less traditional methods may allow researchers to achieve a broader reach and greater clinical impact for their research. Documenting such research outputs should facilitate a researcher's career advancement.
Research is conducted, published, read and cited—but also tweeted, blogged, posted, shared, commented on, uploaded, downloaded, bookmarked, and followed. (Wasike, 2021, p. 416)
Documentation of research productivity and scientific impact by those in academia is vital to make annual performance review decisions, to support tenure and/or promotion, to support continuing review applications, to inform employment decisions, and to contribute to decisions about grant funding (Acquaviva et al., 2020; Agarwal et al., 2016; Bakker et al., 2019; Cabrera et al., 2018). Researchers have long been in the position of documenting scholarly productivity and scientific impact for these purposes and, until recently, primarily have used traditional bibliometrics to do so. Bibliometrics are statistical methods aimed to quantify the quality of a journal, an article, and/or an author (e.g., journal, article, and author metrics; described in detail later). For example, “citation counts” is a metric that refers to the number of times a given article has been cited by others. Presumably, a high citation count is indicative of an impactful article.
More recently, alternative metrics (altmetrics) have emerged as an important supplement to bibliometrics (Chen & Wang, 2021). Altmetrics are based on audience engagement with research-based products and go beyond the traditional measures of publication citations. These metrics are not limited to the number of citations an article receives, for example, but can also provide an index of overall engagement that clinical and academic communities had with articles, presentations, and book chapters. Together with bibliometrics, alternative impact metrics can provide supplementary evidence of scientific and clinical impact of research. Knowlton et al. (2019) suggest that bibliometrics and altmetrics complement each other to provide an accelerated translation from research to practice, with patients as the beneficiaries. Altmetrics are emerging to fill the evidence and relevance gaps left by traditional systems and citation indices (Wilsdon et al., 2015). Given the sheer volume of bibliometrics coupled with the explosion in alternative metrics, an in-depth tutorial regarding the definitions, advantages and disadvantages, and examples of these metrics is in order. Because many of these terms may be new to the reader, we have included brief definitions of the terminology in Table 1.
Table 1. Definitions of metric terminology in alphabetical order.
Term | Definition
Altmetric | A company that provides metadata (alternative metrics) to institutions and to academic platforms (proper noun)
altmetrics | Commonly used as a collective term to refer to alternative metrics (not a proper noun)
alternative metrics | Metadata digitally extracted from publications and presentations to indicate the reach and significance or impact of clinical or scientific research
author metrics | Data representing an author's contributions to science or research
bibliometric data | The data that represent the number of publications, number of citations, and so on
bibliometrics | The practice of using statistical methods to describe publication data
citations | The practice of giving credit to another author for their ideas or quoted material
citation metrics | Data representing the number of citations within a certain time frame
clinical impact | The extent to which recommendations based on empirical evidence have reached clinical practice and/or influenced policy
impact | The action (reach and significance) that occurs because of exposure to or education about something
journal metrics | Data representing the value of the journal in providing information or contributing to science or research
metrics | Quantitative measures or indicators used to track performance
scientific impact | The influence that a publication or research finding has on science or on society
Note. Definitions were generated from the synthesis of information presented in this tutorial.
Although the story of scientific impact through the presentation of bibliometric data is most recognized and understood, it does not represent the whole story and does not fully represent the impact of research in clinical practice (van Eck et al., 2013). Clinical impact has been described as the extent to which research and recommendations based on empirical evidence reach clinical practice and/or influence policy (Spencer, 2022). The better a researcher can document the clinical impact of their efforts, the better the individual can demonstrate their value to their employers and to society (Cabrera et al., 2018). A successful research career is defined, in large part, by the scientific and clinical impact of the research produced by a given professional. It is critical not only that the research community captures the impact of efforts on academic and nonacademic communities but also that promotion and tenure committees in communication sciences and disorders take note and recognize that both bibliometric and alternative metrics are available for use to document and describe scientific and clinical impact. Traditional bibliometrics capture the academic audience in a very narrow way, whereas altmetrics expand the impact to include both academic and nonacademic audiences. Although clinical impact can also be captured and documented in other ways (e.g., service, teaching, and leadership), the focus of this article is on the combined use of bibliometric and alternative metrics to represent scientific and clinical impact.
The overall purpose of this article then is to provide researchers with a brief tutorial on how to document the scientific and clinical impact of research using traditional bibliometric and alternative metric (altmetric) data and to advocate for promotion and tenure committees in communication sciences and disorders to include consideration of bibliometric and altmetric data as part of tenure and promotion applications. We provide a brief overview of bibliometric and altmetric indicators followed by six simple steps leading to successful documentation of scientific and clinical impact for promotion and tenure applications.

Making Sense of Metrics

Impact is defined by its reach and significance. In other words, impact reflects the extent to which the findings and implications of research reach the intended audiences and how important the research is for the specific field or discipline. Bibliometrics and altmetrics are both indicators of research impact. For the most part, bibliometrics tell the story of impact through publication metrics. In contrast, altmetrics tell the story of stakeholder engagement with and use of publications and other information. For the novice, bibliometric and altmetric terminology and associated indicators can be overwhelming and confusing. We break it down by providing background information, examples, and explanations of each.

Bibliometrics

Traditional bibliometric data refer to the use of statistical methods to describe publication data. There are three types of traditional bibliometric indicators: journal metrics, article metrics, and author metrics. Bibliometrics measure how the research community responds to research and how long the research remains relevant within the academic or scientific community. They have long been used by promotion and tenure committees to indicate how impactful and important a researcher's contributions to the field are. Researchers whose publications appear in well-respected, high-impact-factor journals and accrue large numbers of citations are considered to have a significant impact on the field. Bibliometrics have a long history as indicators of academic success and have withstood the test of time. The disadvantage of bibliometric data is that they take time to accrue (i.e., publication time, dissemination time, and translational research time) and therefore do not provide immediate feedback to promotion and tenure committees.
Bibliometrics were used in the legal field in the early 1800s, emerging in academic literature in the 1920s (Shapiro, 1992). Initially limited to publication counts and citation analysis of books and articles, they have evolved over time. Although the methodology of bibliometrics has advanced with the use of digital technology, bibliometrics were well established for decades before the advent of computers. We briefly describe each type of bibliometric and what it means. Table 2 provides a list of the most common journal, article, and author metrics. The definition, an example, and an explanation are provided for each metric.
Table 2. Traditional bibliometric indicators for journals, articles, and authors.
Metric | Metric name | Definition | Example | Explanation
Journal metrics | Journal impact factor (JIF) | Average number of times articles published in a specific journal have been cited in the past 2 years | JIF (2019) for JSLHR = 1.873 | Numerator = citations received in 2019 by articles published in 2017 and 2018; denominator = number of articles published in 2017 and 2018
Journal metrics | 5-year journal impact factor (5Y-JIF) | Average number of times articles published in a specific journal have been cited in the past 5 years | 5Y-JIF (2019) for JSLHR = 2.242 | Numerator = citations received in 2019 by articles published in 2014 through 2018; denominator = number of articles published in 2014 through 2018
Journal metrics | Journal immediacy index (JII) | Average number of citations for articles published in the current year divided by the total number of articles from the current year | JII (2019) for JSLHR = 0.374 | Numerator = citations in 2019 of articles published in 2019; denominator = number of articles published in 2019
Journal metrics | Cited half-life | Median age of a journal's articles that were cited in a specific year | JSLHR's cited half-life for 2019 = 10.8 y | Half of the JSLHR articles cited in 2019 were published within the past 10.8 years
Journal metrics | Citing half-life | Median age of the citations produced by a journal during a specific year | JSLHR's citing half-life for 2019 = 12.2 y | Half of the references cited by JSLHR in 2019 were published within the past 12.2 years
Article metrics | Citation count | Number of times the work was cited in other works | Citation count for the Smith & Weber (2017) article published in JSLHR = 90 | Total number of citations for the Smith & Weber (2017) article as of 09/01/21
Article metrics | Field-weighted citation impact (FWCI) | Citations received by the article relative to the average number of citations received by similar articles over the same period of time | FWCI = 1 | The article has been cited as often as the average for similar articles as of 09/01/21
Author metrics | h index | Number of publications that have a citation number equal to or greater than h for a given author | h index = 8 | If an author has 10 publications and 8 of them have been cited 8 times or more, the h index is 8
Author metrics | g index | The largest number g such that the author's top g publications together have received at least g² citations | g index = 10 | The top 10 publications have been cited at least 100 times (10²)
Author metrics | m quotient | The h index divided by the number of years in the researcher's publishing career | m quotient = 0.8 | An h index of 8 over a 10-year publishing career yields an m quotient of 0.8
Note. JSLHR = Journal of Speech, Language, and Hearing Research; y = years.

Journal Metrics

Journal-level metrics facilitate comparison of journals within and across fields and reflect citation patterns, impact, and prestige of a given journal. The most common journal-level metric used is the journal impact factor (JIF). This metric was first considered by Eugene Garfield in 1955, and it has evolved over the years (see Garfield, 1999, for a brief review). The Journal Citation Reports (JCR) calculates the JIF for a given journal, in a given year, by dividing the number of citations received in the JCR year by articles published in that journal in the previous 2 years by the total number of articles published in that journal in those 2 years. The JIF ratio indicates the average number of times that articles published in that journal have been cited in the past 2 years. Additional journal-level metrics include (a) 5-year impact factor, (b) journal immediacy index, (c) cited half-life, and (d) citing half-life. Other journal metrics can be found at https://www.scopus.com/source/eval.uri (subscription is required) or https://scholar.google.com/intl/en/scholar/metrics.html. Refer to Table 2 for examples.
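To make the calculation concrete, the worked example below uses hypothetical counts (not JSLHR's actual citation or article counts, which JCR uses to produce the 1.873 value reported in Table 2):

\[
\mathrm{JIF}_{2019} \;=\; \frac{\text{citations received in 2019 by articles published in 2017 and 2018}}{\text{number of citable articles published in 2017 and 2018}} \;=\; \frac{300}{160} \;=\; 1.875
\]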
Departmental leadership and institutions may consider it desirable, or in some cases even required, for their faculty to publish in “high-impact” journals or journals with higher journal-level metrics, as these journals are often viewed as important or prestigious. Acceptance rates for work submitted to high-impact journals are often very low, suggesting that only the highest quality work is selected for publication by the journal. Thus, rightly or wrongly, journal-level impact metrics may be critiqued during performance reviews or by tenure and promotion committees as an indicator of the quality of work a given faculty member is conducting (Wasike, 2021).

Article Metrics

Article metrics provide data about a given piece of research including articles, book chapters, conference proceedings, and so on. The traditional article citation metrics provide authors with information about how often the article was cited. The author can then compare the citations of one article to those of other articles. Examples of traditional citation metrics for a given article include (a) citation count and (b) field-weighted citation impact (Bernard Becker Medical Library, 2021a). See Table 2 for definitions, explanations, and examples.

Author Metrics

Historically, author metrics focused on a count of the number of publications of an author, but current-day author metrics include mathematical equations to document an author's impact (Stuart et al., 2017). Probably the most widely used author metric is the h index, which was developed by Hirsch (2005, 2007). The h index is based on the number of publications by an author and the number of times each publication is cited. It is determined by the number of publications (h) that have a citation number equal to or greater than h. Author metrics include variants of the h index (Stuart et al., 2017) and others such as the g index and m quotient (Bernard Becker Medical Library, 2021b). Definitions and examples of each are provided in Table 2.
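For readers who prefer a computational illustration, the short Python sketch below shows how these author metrics can be computed from a list of per-publication citation counts. It is a minimal sketch using hypothetical counts, not a tool provided by any of the platforms discussed in this tutorial.

# Minimal sketch (Python) of the author metrics defined above, using hypothetical citation counts.

def h_index(citations):
    """Largest h such that the author has h publications each cited at least h times."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def g_index(citations):
    """Largest g such that the top g publications together have at least g**2 citations."""
    counts = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, cites in enumerate(counts, start=1):
        total += cites
        if total >= rank ** 2:
            g = rank
    return g

# Hypothetical record: 10 publications, 8 of them cited 8 or more times (cf. Table 2).
citations = [25, 17, 12, 9, 8, 8, 8, 8, 3, 1]
h = h_index(citations)      # 8
g = g_index(citations)      # 9 for this particular record
m_quotient = h / 10         # h index divided by a 10-year publishing career = 0.8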

Alternative Metrics (Altmetrics)

Alternative metrics (altmetrics) are indicators or metadata used to quantify how research publications and presentations are used, mentioned, and receive attention in the nonacademic community. They provide data that indicate the reach and significance or impact of scientific and clinical research. Alternative metrics are collected from the digital dissemination of research products. As such, data regarding mentions of the research products, the use of the information, and the attention the research publication or presentation receives are available almost immediately. Because they are new and have not yet withstood the test of time, altmetrics can be considered to be still evolving. The disadvantage of alternative metrics is that there is currently no single accepted paradigm or taxonomy for categorizing the multitude of altmetrics proposed. New platforms with new metrics are emerging on a regular basis, which leaves researchers in a quandary about how to use the alternative metrics to best represent the scientific and clinical impact of the work. We provide a brief background of altmetrics, discuss altmetric indicators that have been used, and present altmetric indicators for select platforms. We present the altmetrics in four domains: (a) attention, (b) engagement, (c) usage, and (d) citations. We define the metrics we have categorized into each domain and specify the platform in which the metric is reported (see Table 3).
Table 3. Alternative metric (altmetric) indicators for usage, engagement, attention, and citations.
Variable | Metric name | Definition | Platform
Usage | Clicks | Number of clicks or shares | Kudos
Usage | Downloads | The total number of times an item (PDF) has been downloaded | Figshare, Kudos, Publons
Usage | Reads | Number of clicks on the read publication button | Kudos, Publons
Usage | Views | Number of abstract views on a platform | Figshare, Kudos
Usage | All Readers Welcome | The grade level (n and above) at which your writing is easily understood, based on abstracts and titles | Impactstory
Usage | Open Access | The percentage of your research that is free to read online | Impactstory
Usage | Open Hero | All of your publications are free to read online | Impactstory
Usage | Open License | Percentage of your research with a public domain license | Impactstory
Usage | Open Science Triathlete | A "win times three": an open access paper, an open data set, and open-source software | Impactstory
Usage | Software Reuse | Your research software is open access | Impactstory
Engagement | Shares | Number of share referrals | Figshare, Kudos
Engagement | Tweets | Number of times the research output has been tweeted | Kudos
Engagement | Recommendations | The count of the recommendations made on a platform | Publons
Engagement | Big in Japan | Your work was saved or shared by someone in Japan | Impactstory
Engagement | Follower Frenzy | Someone with n followers has tweeted your research | Impactstory
Engagement | Global Reach | Your research has been saved and shared in n countries | Impactstory
Engagement | Global South | Of people who save and share your research, n% are in the Global South | Impactstory
Engagement | Kind of a Big Deal | Your research has been tweeted by n scientists who are considered Big Deals on Twitter | Impactstory
Engagement | Rickroll | You have been tweeted by a person named Richard | Impactstory
Engagement | Wikitastic | Your research is mentioned in n Wikipedia articles | Impactstory
Attention | Altmetric Attention Score | An indicator of the amount of attention a research output has received, derived from an algorithm influenced by the quantity of posts mentioning the output and the quality of the posts' sources | Figshare, Kudos
Attention | Clean Sweep | Every publication (on ORCID) published since 2012 has been saved and shared online | Impactstory
Attention | Greatest Hit | Top publication has been saved and shared n times | Impactstory
Attention | Hot Streak | The number of consecutive months in which someone has shared your research online | Impactstory
Citations | Beamplot | Web of Science author impact plot; advantageous for promotion and tenure applications | Publons
Citations | Citations | Number of citations from all publications, number of citations for individual publications, average citations per document, average citations per year | Figshare, Google Scholar, Kudos, Publons
Citations | Global impact plot | Plot depicting number of citations across the globe; very advantageous in demonstrating global author impact | Publons
Citations | h index | Number of publications that have a citation number equal to or greater than h for a given author (i.e., if an author has 10 publications and 8 of them have been cited 8 times or more, the h index would be 8) | Google Scholar, Publons
Citations | i10 index | Number of publications with at least 10 citations | Google Scholar
Use of the term altmetrics dates back to 2010 when it was first proposed by Priem et al. to describe alternative metrics as indicators of how stakeholders use the information available in publications (Priem et al., 2010). They described use as an indicator of how much and to what extent the research resonates with stakeholders. Examples of altmetric data include (a) usage or a quantification of clicks or views, (b) captures or how often the work is “captured” for future use through downloads or exportation into referencing programs, (c) mentions in other sources such as in blogs or news outlets, and (d) social media attention such as the mention of the work on social media platforms such as Facebook or Twitter (Kunze et al., 2020; Saberi & Ekhitiyari, 2019; Wasike, 2021). In a comprehensive review of altmetrics used in research assessment and management, Wilsdon et al. (2015) identified the following as alternative indicators of impact: (a) image downloads, views, or tags; (b) video views, comments, or sentiments; (c) web or URL citations, web presentation citations, blog, or forum citations; (d) Mendeley or other bookmarks; (e) Google Book or Google Scholar citations; (f) website followers, downloads, or views; (g) social media followers, downloads, views, tags, sentiments, comments, tweets, and retweets; and (h) Web of Science data set citations. In general, altmetrics reflect how much attention a scholarly product receives, the reach of the dissemination efforts, and the level of influence and impact with regard to the effect upon a larger field of study or the impact on public health (Altmetric, 2021).
Alternative metrics are diverse, and we have categorized and summarized these into four broad areas representing (a) usage, (b) engagement, (c) attention, and (d) citations. Alternative metrics are collected across researcher academic profiles and social media platforms as diverse indicators of clinical impact of research and stakeholder interaction. It is incumbent upon the clinical researcher to tell the story about the impact of their research by demonstrating the multiple ways it resonates with stakeholders. Scientific and clinical impact is a story told by the researcher describing the usage of, engagement with, attention to, and citation of the research by capturing as many indicators as needed to tell the story. Although there are many social and academic platforms that report altmetric data, we have limited our presentation to the five academic platforms that offer unique benefits to the researcher and maximize the researcher's effort in documenting productivity and impact. The academic platforms we include in this tutorial are (in alphabetical order) Figshare (https://figshare.com/), Google Scholar (https://scholar.google.com/), Impactstory (https://profiles.impactstory.org/), Kudos (https://growkudos.com/), and Publons (https://publons.com/account/). These platforms are depicted in Table 4. We will describe these in more detail later in this tutorial.
Table 4. Purpose and function of academic platforms reporting altmetric data.
Figshare
Purpose: Designed to allow researchers to keep all research outputs in one place, including data sets, software code, and publications, with open access options.
Benefits: Upload, share, cite, and preserve all research outputs; generate DOIs for presentations and preprint manuscripts; discover other researchers' open access outputs; secure long-term preservation of data.
Altmetrics reported: Altmetric Attention Score, downloads, views, citations.

Impactstory
Purpose: Designed as a free, open-source, open data, and open access website that interfaces with ORCID and Scopus to help researchers explore and share the impact of their research.
Benefits: Run by a nonprofit foundation; developed with radical transparency and open communication; interfaces with ORCID; provides access to citing research articles; linked to Facebook posts, blog posts, Wikipedia articles, etc.; interfaces with Twitter so research can be easily shared; funded by the National Science Foundation and the Alfred P. Sloan Foundation.
Altmetrics reported: All Readers Welcome, Open Access, Open Hero, Open License, Open Science Triathlete, Software Reuse, Big in Japan, Follower Frenzy, Global Reach, Global South, Kind of a Big Deal, Rickroll, Wikitastic, Greatest Hit, Clean Sweep, Hot Streak.

Google Scholar
Purpose: Google Scholar Profiles provides a simple way for authors to showcase their academic publications; tools enable an author to see who is citing their articles, to graph citations over time, and to compute other citation metrics.
Benefits: Readily available to other researchers; easily accessible; can follow other researchers; no login required.
Altmetrics reported: Citations, h index, i10 index.

Kudos
Purpose: Developed to facilitate the visibility of academic presentations and publications through dissemination across multiple venues.
Benefits: Focused on visibility and dissemination; partners with Altmetric, CrossRef, and ORCID; links to Europe PubMed Central; collaborates with Editorial Manager to allow authors to enter a plain language summary when they submit a paper for publication.
Altmetrics reported: Altmetric Attention Score, clicks, downloads, reads, views, share referrals, tweets, citations.

Publons
Purpose: Developed to allow authors to track peer publication reviews and grant reviews as evidence of service, in addition to tracking publications, citations, and journal editing, all in one place.
Benefits: Free to peer reviewers; links with Web of Science, ORCID, and citation management software such as EndNote, Zotero, or Mendeley.
Altmetrics reported: Downloads, reads, recommendations, beamplot, global impact plot, h index.

Usage

The story about reach can be told via the numbers of clicks, downloads, reads, and views. For example, the number of clicks on an abstract or the number of downloads for a publication on a website tells the story of how the research output displayed on a platform is used or accessed. Usage answers the question: How is this research used by the academic and nonacademic community? Most academic platforms report the traditional usage information of clicks, downloads, reads, and views (e.g., Figshare, Kudos, and Publons). Impactstory goes a step further by reporting six unique usage indicators as shown in Table 3, along with the definitions of each. These include All Readers Welcome, Open Access, Open Hero, Open License, Open Science Triathlete, and Software Reuse. Although the definitions are provided in the table, the three that speak to health literacy and open access merit further explanation here. All Readers Welcome is an indicator reflecting the reading level of an author's writing; it is reported as the grade level at and above which the writing is easily understood. For example, one of the present authors' All Readers Welcome status is Grade 13 and above. Open Access is acknowledgment of the percentage of a researcher's outputs that are free to read online, whereas Open Hero is recognition that the entire body of an author's research is free and available to read online. Whereas Table 3 depicts the alternative metric names by domain with a definition for each, Table 4 provides a brief overview of the purpose, benefits, and metrics reported by each of the academic platforms.

Engagement

The narrative about significance can be told through engagement indicators such as shares, tweets, and recommendations. An indicator qualifies as an engagement metric if it involves communication with other academic or nonacademic constituents. For example, the more times a research product is shared on social media and academic platforms, the more influence it is likely to have on clinical practice or policy. Likewise, the more a research product has been tweeted or retweeted, the more impact it is perceived to have. At times, blogs and print popular media may pick up a particular research output and popularize it. This type of digital engagement is captured as a recommendation metric. The significance of the research unfolds through the story of stakeholder engagement. These traditional engagement metrics are reported by a few academic platforms (e.g., Kudos and Publons). Impactstory reports seven unique metrics including Big in Japan, Follower Frenzy, Global Reach, Global South, Kind of a Big Deal, Rickroll, and Wikitastic (see Table 3 for definitions). Some of these metrics reflect a sense of humor and qualify as “FUN facts,” whereas others are a little more serious. More serious metrics include Global Reach, which reports the number of countries in which a researcher's work has been saved and shared; Global South, which reports the percentage of people who have saved or shared a researcher's work that reside in the Global South; and Wikitastic, which reports the number of Wikipedia articles in which a researcher's work has been cited.

Attention

The story about the attention that a research product receives has typically been told through article bibliometric data in the form of article citation metrics; however, we have separated attention and citations into two separate domains. Attention has evolved to be an indicator of the amount of attention a research output has received, and although it may include citation information, it is not limited to citations. The attention domain is reported by the academic platforms Figshare, Kudos, and Impactstory (see Table 3).
The Altmetric Attention Score is the first metric in Table 3 in the attention domain. Altmetric (https://www.altmetric.com/) is a Digital Science product and an example of a company that collects attention data about research products and shares metadata with institutions and academic platforms. To avoid confusion, it is important to note that there is a generic term altmetrics that refers to any alternative metric, as indicated in Table 1, and there is this company named Altmetric. Designed as an indicator of attention, Altmetric provides the Altmetric Attention Score, which is a weighted count of all the mentions for an individual research product. This score is computed from a complex algorithm incorporating several variables such as blog mentions, policy documents, and Wikipedia citations. The algorithm also includes LinkedIn, Twitter, Facebook, Reddit, and Pinterest posts and the number of Dimensions and Web of Science citations among other indicators (Altmetric, 2021). The Altmetric Attention Score is automatically shared with some academic platforms and is featured on the Figshare and Kudos platforms. The researcher has the option on other platforms to grant access to Altmetric (e.g., Impactstory).
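Altmetric's actual algorithm is proprietary and more nuanced than any simple formula; the sketch below only illustrates the general idea of a source-weighted count of mentions. The weights, sources, and counts are invented for illustration and are not Altmetric's values.

# Illustrative sketch only: a source-weighted count of mentions.
# The weights and counts below are hypothetical; they are NOT Altmetric's actual algorithm.

def weighted_attention(mentions, weights):
    """Sum each mention count multiplied by a per-source weight."""
    return sum(weights.get(source, 0) * count for source, count in mentions.items())

weights = {"news": 8, "blog": 5, "wikipedia": 3, "twitter": 1, "facebook": 0.25}  # hypothetical
mentions = {"news": 1, "blog": 2, "twitter": 14, "facebook": 4}                   # hypothetical
score = weighted_attention(mentions, weights)  # 8 + 10 + 14 + 1 = 33.0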
Unique attention indicators introduced and provided by Impactstory include Clean Sweep, Greatest Hit, and Hot Streak. Clean Sweep is a metric reported when every publication on ORCID (Open Researcher and Contributor ID; defined in the Citation section) has been saved and shared online. Greatest Hit is an indicator of the publication that has been saved and shared the most times, and Hot Streak is recognition of the number of consecutive months that a researcher's outputs have been saved and shared online. Although these indicators are unique to Impactstory, it is anticipated that the number and diversity of attention indicators will continue to evolve and increase.

Citations

The science of bibliometric data has changed significantly since the 1920s and has evolved alongside computing applications. Once counted and indexed by hand, citation metrics are now extracted digitally and reported on digital platforms. Digital citation platforms such as Dimensions (https://www.dimensions.ai/), ORCID (https://orcid.org/), Scopus (https://www.scopus.com/), and Web of Science (https://www.webofscience.com/) have made these citation data more readily available. Academic platforms such as Figshare, Google Scholar Profiles, Impactstory, Kudos, and Publons often report citation data. They may report findings from digital platform partners such as Scopus or Web of Science, or they may collect their own citation data. The advantage of citation metrics is that they condense the indicator for a publication to one number. For example, Figshare, Kudos, Google Scholar Profiles, and Publons report standard citation data for the number of individual citations per research output. Google Scholar Profiles and Publons report the h index (see Tables 1 and 3). Google Scholar Profiles provides a unique citation index, called the i10 index, which is the number of publications with at least 10 citations. Publons has created visual tools to help the researcher display the number of citations over time (i.e., Beamplot) and across the globe (i.e., Global Impact Plot) to better communicate the scientific and clinical impact of their research (see Table 3).
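As a quick illustration of how compact these citation indicators are, the i10 index can be computed from the same kind of per-publication citation list used for the author metrics above (hypothetical counts):

def i10_index(citations):
    """Number of publications with at least 10 citations (the Google Scholar i10 index)."""
    return sum(1 for cites in citations if cites >= 10)

i10_index([25, 17, 12, 9, 8, 8, 8, 8, 3, 1])  # 3 publications have 10 or more citations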

Academic Platforms and Researcher Profiles

Some academic platforms use metadata from Altmetric (2021) to create a snapshot of a researcher's impact. Altmetric tracks the online presence of research outputs (e.g., articles, presentations, data sets, and clinical services) through a unique digital identifier such as a digital object identifier (DOI; Altmetric, 2021). Altmetric data can be accessed directly by researchers who are members of subscribing institutions or organizations; however, they are not available for individual accounts. For example, American Speech-Language-Hearing Association (ASHA) journals currently provide access to Altmetric data. These data can be accessed for individual research outputs when the researcher logs into the ASHA journal and retrieves the publication. Altmetric provides the researcher with the Altmetric Attention Score; the number of blog, Twitter, and Mendeley mentions; the total citations; recent citations; field citation ratio; and relative citation ratio. The field citation ratio indicates the relative citation performance of an article when compared to similarly aged articles in its subject area, whereas the relative citation ratio indicates the relative citation performance of an article when compared to other articles in its area of research. Although ASHA journals are not considered to be in the same category as other academic platforms, it is important that researchers are thoroughly familiar with the options regarding where and how metadata for scientific and clinical impact can be accessed and retrieved.
Academic platforms are digital programs whose function is to create virtual spaces for academicians, researchers, and college students to share information. An individual typically registers for an account on the platform and then creates a user profile. We refer to the user profile as the research or researcher profile in this tutorial. The research profile includes the demographic data about an individual researcher and the publications and/or presentations associated with that researcher. Uploading additional publications and presentations manually is an option on most platforms. The display of productivity forms the researcher's record. Each academic platform was developed with a goal in mind to meet one or more needs identified for the target audience. Therefore, each platform may serve very different but equally important purposes. Although there are many academic platforms, we have limited the focus of this tutorial to the five academic platforms that are listed in Table 4 (i.e., Figshare, Google Scholar Profiles, Impactstory, Kudos, and Publons).

Six Simple Steps in Documenting Clinical Impact of Research

Scientific and clinical impact of research cannot be created and documented without dissemination. Knowlton et al. (2019) suggest strategically disseminating research through research profiles linked to social media platforms to build professional reputations. This enables researchers to achieve a broader reach and increased impact of their work. In this tutorial, we describe six steps for documenting clinical impact of research. We describe specific academic platforms that interface with specific social media platforms; however, we do not discuss how to create impact via social media (see Davidson et al., 2022). Specifically, we describe how and why to (a) register for an ORCID account, (b) identify and establish suitable academic platforms for which to create research profiles, (c) engage in broad dissemination strategies, (d) harvest bibliometric and altmetric data, (e) add bibliometric and altmetric data to your curriculum vitae (CV), and (f) submit your promotion and tenure application with confidence.

Step 1. Create an ORCID Account

Registering for an ORCID (https://orcid.org/) account is the first step in tracking scientific and clinical impact of research. An ORCID number is a unique 16-character persistent author identifier that is free of charge to researchers and is user controlled (ORCID, 2021). User controlled means that you make all the decisions about how the data are shared with other platforms. Individuals involved in research, including graduate students, who publish or plan to publish research outputs or who present at national conferences should register for an ORCID number. When an ORCID account is created, an ORCID profile is generated with an accompanying record of publications. The publications in the record are linked to the ORCID number and include any name, or variation of the name, that the researcher has indicated in the profile as belonging to them as an author. For example, articles published under a maiden name and under a married name are both linked to the record. The ORCID platform automatically retrieves publication and citation data from Clarivate's Web of Science and from Scopus, which are two of the most highly regarded sources of bibliometric data (Baas et al., 2020).
The ORCID number is typically included on all professional information, affiliation agreements, grants, publications, and peer reviews of articles. Some researchers include it in their e-mail signature line on professional communication. Most journals require the author's ORCID number during the process of submitting for publication. Through their ORCID number, researchers can connect their records to institutions, funding agencies, and publishers, as well as add unconventional research products (e.g., presentations, data sets, and software) with DOI numbers to their research record. Researchers have complete control over their profile and how much information from their record is shared with the public. The data and metrics associated with the ORCID number are available to be shared across platforms (e.g., Figshare, Impactstory, Kudos, and Publons) and allow automatic population of research profiles and records. ORCID supports 37 types of “works” from articles to presentations to performances. Manual uploads of research outputs to the ORCID record are also allowed. Once the ORCID profile and record are established, a researcher may choose to add author notes on manuscript submissions and scholarly works in the ORCID record.
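For researchers comfortable with a small amount of scripting, the public ORCID registry can also be queried programmatically. The sketch below assumes ORCID's public v3.0 REST endpoint and the "group" key in its JSON response (current at the time of writing and subject to change); the ORCID iD shown is ORCID's published example record, used purely for illustration.

import requests  # third-party HTTP library (pip install requests)

ORCID_ID = "0000-0002-1825-0097"  # ORCID's documented example record, for illustration only
url = f"https://pub.orcid.org/v3.0/{ORCID_ID}/works"
response = requests.get(url, headers={"Accept": "application/json"}, timeout=30)
response.raise_for_status()

# Each "group" bundles the versions of one work attached to the ORCID record.
work_groups = response.json().get("group", [])
print(f"ORCID iD {ORCID_ID} lists {len(work_groups)} works in the public registry.")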

Step 2. Identify Suitable Academic Platforms and Establish Research Profiles

There are several academic platforms that allow researchers to set up and view their research profiles. Research profiles may include employment, education, and other demographic data. The academic platforms we have identified as the most beneficial for researchers are those with optional links to ORCID accounts embedded. A researcher may identify which of these are most suitable for their use based on criteria such as those that (a) have features that are advantageous and meet their needs, (b) report the desired bibliometric and/or altmetric data, and (c) interface with the social platform(s) used.
The most well-known academic platforms designed for research networking are Academia, LinkedIn, and ResearchGate. Students or researchers may already have a profile on one or more of these platforms. We do not elaborate on these three platforms in our tutorial because they do not interface with ORCID and must be maintained separately from the ORCID record. If a researcher has already established a community in one or more of these platforms, they may choose to continue to maintain it. If researchers have not already established a profile and are not benefitting from the associated community, then Google Scholar Profiles might be the best choice. The primary reason we recommend it is because it is open access and is indexed by Google Scholar, which increases stakeholder, clinician, and public access to research products.
A researcher new to ORCID and academic platforms may want to limit their research profiles to those that can be populated automatically from ORCID with permission (i.e., Figshare, Impactstory, Kudos, and Publons). This will make the most efficient use of a researcher's time. As there are advantages to each of the different ORCID-populating profiles, we provide a summary of each of these platforms and their benefits in Table 5. To view samples of data provided by Impactstory profiles, see Figure 1. An example of a researcher's Kudos public profile is shown in Figure 2. A sample of the Web of Science Beamplot from Publons is shown in Figure 3.
Table 5. Academic platforms linked to ORCID—unique features and benefits.
Platform | Unique features and benefits summary
Figshare
Figshare is part of the Digital Science portfolio and serves as a repository where researchers or other users can make all their outputs available to the wider community in a citable, shareable, and discoverable manner. Figshare provides an alternative venue for difficult-to-cite media such as videos, data sets, presentations, and infographics. Once the research product is uploaded and edited with identifying information, the product can be published. When it is published, it is available to the public, and a DOI is generated and attached to the publication. This research product can be added to ORCID via the API. Once the connection is established, the API has bidirectional synchronization allowing population of the ORCID record from Figshare and vice versa. Files can be dragged and dropped to populate Figshare. Figshare can also be linked to Impactstory.
Impactstory
Impactstory is a not-for-profit open-source tool owned by OurResearch (https://ourresearch.org/). It is a project supported by the National Science Foundation and the Alfred P. Sloan Foundation with the primary aim of highlighting the impact of Open Science activities. One of their goals is to promote nontraditional research outputs (e.g., data sets, code, blogs, and infographics) by showcasing them and reporting metrics beyond traditional citation data. Traditional publications included in the ORCID record are factored into Impactstory's achievements. You can discover the online impact of your research by logging in to Impactstory with your Twitter or ORCID login information. Application programming interfaces with ORCID and Twitter allow for easily accessible communication among the platforms. See Figure 1 for an Impactstory profile example.
Kudos
Kudos is like Impactstory and Figshare in that it was developed to help researchers ensure publications get found, read, and cited. This platform is focused on accelerating the impact of scientific and clinical research products by making them more accessible to the public. The Kudos platform focuses on the way that the research story is told and facilitates circulation to stakeholders through a unique dissemination strategy. For each publication, the author has the option of adding a plain language title, a statement describing what it is about and why it is important in lay language, and an author's perspective. Kudos reports that it is the only platform disseminating across multiple networks and channels and the only platform to aggregate the most relevant metrics in one place with a map of outreach activities. When the researcher creates a research profile, the option of importing data from ORCID is presented. Optional links are also available for Academia, Facebook, LinkedIn, Mendeley, ResearchGate, Twitter, WeChat, Weibo, and YouTube platforms. A Kudos public profile is shown in Figure 2.
Publons
Publons (https://publons.com/) is part of the Clarivate Web of Science family. The Publons website has training modules for individuals who wish to learn how to review journal articles. Publons purports that recognition of an individual's completion of a journal article review is advantageous because (a) it can be verified and documented on a CV and (b) it increases review invitation rates and decreases turnaround time. High-quality journal article reviews receive their own DOIs, making them permanently citable and indexable additions to a CV. Publons has a feature where researchers can export their CV directly from Publons. In addition, it has a direct link for easy import to the EndNote citation management software. Reviews completed for journals are not public and are not shared; however, they are verified, a service proving to be of value for promotion and tenure portfolio documentation. Publons' primary aim is to provide accountability for and recognition of journal article reviews; however, its platform also includes a dashboard and the ability to track publications, citation metrics, peer review of journal articles, and journal editing work in one place. Researchers can link their Publons account to their ORCID account, giving them the option of transferring their review records (with documentation of the manuscript review and the review content hidden) to their ORCID record as a one-time option or automatically when Publons is updated. This link is bidirectional. Researchers can also depict the scientific or clinical impact of research by including the Web of Science Author Impact Beamplot (see Figure 3) in promotion and tenure applications.
Note. Figshare, https://figshare.com/; Impactstory, https://profiles.impactstory.org/; Kudos, https://growkudos.com/; Publons, https://publons.com/account/. API = application programming interface; CV = curriculum vitae.
Figure 1. Example of an Impactstory profile.
Figure 2. Example of a Kudos public profile.
Figure 3. Web of Science Author Impact Beamplot example.

Step 3. Engage in Broad Dissemination Strategies

Once researchers have created an ORCID account, registered their unconventional or undocumented research outputs with Figshare, added the nontraditional DOIs to ORCID via the optional bidirectional application interface, and established one or more additional user accounts with academic profiles (e.g., Impactstory, Kudos, and Publons), they are ready to disseminate. Strategies specific to creating scientific and clinical impact through social media dissemination are provided by Davidson et al. (2022). We provide additional strategies to increase usage, engagement, and attention specific to the Figshare, Impactstory, Kudos, and Publons platforms. Citations tend to increase as a result of these dissemination efforts.
Figshare provides an in-depth tutorial about how to edit or delete data; upload and publish data; sync ORCID; and share, cite, or embed data (Figshare, 2021). Figshare accounts are provided with 20 GB of free private space and unlimited public space. After an author edits the research product with a title, author, key words, description, and funding if applicable, a license is chosen (based on reuse options), and it is published. Once it is published, it is permanently available with a DOI and can be shared and cited. Figshare has a link to Impactstory for dissemination purposes. This is one way to self-archive preprints to make them accessible to a broad audience. More information about bridging the gap between scientific research and clinical practice can be found at https://www.csdisseminate.com/.
Log in to Impactstory with your ORCID number and password. Impactstory uses ORCID open access links to easily import information. You can upload your preprints and open access publications to Impactstory. Uploading research products (e.g., articles, slides, and videos) from Figshare with DOIs is an ideal method of disseminating these publications. Impactstory has a direct link to Twitter for creating impact. They also provide directions for using social media automation tools (i.e., cyborgs and robots; OurResearch, 2021). Impactstory relies on Altmetric and CrossRef metadata to highlight researcher achievements and includes optional links to Figshare and Mendeley.
Kudos offers the easiest access to social media via direct links to Academia, Facebook, LinkedIn, Mendeley, ResearchGate, Twitter, WeChat, Weibo, and YouTube. Researchers also have the option of generating a link to a public profile or to an individual article's public profile to share to a website. Figure 2 shows an example of a Kudos public profile for one author. The Kudos website where a researcher can establish their own Kudos profile is https://growkudos.com. Researchers can add an image to generate interest and attention to each publication, maximizing usage and engagement with stakeholders. The lay title, abstract, and plain language statement of the contribution to science or health represent a step in the translational science continuum. Choices for geographic dissemination and sharing via apps popular in other countries (e.g., WeChat and Weibo) further promote engagement and impact.
Publons offers limited options for engagement; therefore, usage and engagement data are more readily available on Figshare, Impactstory, and Kudos. For additional strategies to increase usage, engagement, attention, and ultimately citations of research products through social media dissemination, see Davidson et al. (2022).

Step 4. Harvest Bibliometric and Altmetric Data

When the researcher is ready to begin documentation for promotion and tenure, they will go to their research profiles on the platforms where the metrics are summarized in their record. For scientific impact, the best sites to check for bibliometrics are Scopus and Web of Science. The promotion and tenure portfolio narrative about scientific impact may include the JIF for the journals one has published in, the citation count for specific publications, the h index, and, if the researcher has a profile on Google Scholar, the Google Scholar i10 index. It is also important to note that the alternative platforms report some of the same bibliometric data (e.g., Figshare, Kudos, and Publons). For example, Publons is a Web of Science platform owned by Clarivate, and Publons reports Web of Science data.
The researcher would then turn their attention to alternative metrics to supplement their narrative and CV with usage, engagement, and attention data. For the altmetric data that reflect the impact of the traditional and unique indicators as well as social media strategies, researchers look to Figshare, Impactstory, Kudos, and Publons to supplement bibliometric data. Bibliometric and altmetric indicators for each of these platforms are listed by usage, engagement, attention, and citations in Table 3.
Figshare provides data for usage (e.g., downloads and views), engagement (e.g., shares), attention (e.g., Altmetric Attention Score), and citations (e.g., citation counts), especially for nontraditional research outputs that may otherwise not have been trackable or citable. This is where the researcher would go for data about national and international presentations and perhaps videos or data sets that have been uploaded and assigned a DOI. More information about Figshare metrics can be found in Table 3.
Impactstory provides supplemental data about their unique usage, engagement, and attention indicators. Usage includes the research outputs in terms of readability and open access. Engagement is depicted by indicators that emphasize the level of engagement (e.g., Follower Frenzy) and engagement reach (e.g., Global Reach). Impactstory's FUN indicators are summaries of research outputs in other unique categories (e.g., Big in Japan). See Table 3 for more details regarding the impact indicators Impactstory highlights. A sample Impactstory profile is shown in Figure 1.
Kudos provides researchers with click, download, read, and view counts for usage; shares and tweets as indicators of engagement; and the Altmetric Attention Score and citations as indicators of attention. Kudos was developed as a platform to tell the research story in a way that is translated for easy access by stakeholder populations and the public. The dissemination strategy used by Kudos may lead to greater distribution of research outputs in mainstream media, thus increasing the reach and significance of research to target populations. See Table 3 for a summary of the metrics Kudos reports. An example of a Kudos public profile is shown in Figure 2.
Publons offers limited usage (i.e., downloads and reads) and engagement (i.e., recommendations) options; however, they offer a myriad of attention indicators for journal reviews. The metrics for attention for journal reviews include alternative metrics such as average review word count, number of verified reviews, and review to publication ratio. Publons also offers traditional bibliometrics such as citations, h index, and ResearchGate score. Author impact tools are also available. See Table 3 for specific metrics. An example of the Web of Science Author Beamplot is shown in Figure 3.

Step 5. Add Bibliometric and Altmetric Data to CV

Bibliometric and altmetric data can be added to any research output citation on the candidate's CV to be submitted as part of the promotion and tenure application. Although including traditional bibliometric data such as citation metrics and altmetric data for each publication or other type of research output can be quite time consuming, they are important for significant career events such as the promotion and tenure application. They can also be included in applications for new positions. Sometimes these data are required by institutions. Due to the dynamic nature of these data, inclusion is typically limited to important career events. If including these bibliometric and altmetric data is not needed or desired, then another alternative is to highlight the most impactful publications or research outputs in the narrative portion of the application.
Here, we provide several examples of how to use bibliometric and altmetric data on a CV and on a promotion and tenure application to showcase scientific impact and clinical impact of research efforts. Examples include peer-reviewed journal articles, non–peer-reviewed journal articles, national or international conference presentations, individual audio podcast episodes, individual video podcast episodes, individual Instagram live episodes, audio podcast series, video podcast series, individual blog posts, social media outreach, patents, products and other research deliverables, interviews with media, media mentions about research as separate section on CV, and media mentions about research included as part of a citation in your traditional publication list. These examples, shown in Table 6, are not exhaustive and only serve to stimulate the researcher's creative imagination regarding ways to document clinical impact.
Table 6. Bibliometric and altmetric examples on a curriculum vitae (CV) in a promotion and tenure application.
Category of research outputs | Reference example
Peer-reviewed publications | Author, A. (Year). Title in sentence case. Journal Name in Title Case Italicized, Vol. #(issue #), page#–page#. https://doi.org/10.000/10.1000x. Citations: 5 / Twitter mentions: 4 / Mendeley bookmarks: 21 / Facebook shares: 22
Non–peer-reviewed publications | Author, A. (Year). Title in sentence case. Publication Magazine in Title Case Italicized, Vol. #(issue #), page#–page#. https://doi.org/10.000/10.1000x. Citations: 3 / Twitter mentions: 14 / Mendeley bookmarks: 17 / Facebook shares: 22
National or international presentations | Author, A. (Year, Conference Dates). Title in sentence case italicized [Conference presentation]. Conference Name, Location. https://doi.org/10.000/10.1000x. Citations: 2 / Twitter mentions: 21 / Mendeley bookmarks: 1 / Facebook shares: 121
Individual audio podcast episode | Author, A. (Host). (Year, Month Day). Title of the audio podcast (No. Episode number) [Audio podcast episode]. In Podcast name. Production Company. URL. Twitter mentions: 4 / Facebook shares: 22
Individual video podcast episode | Author, A. (Host). (Year, Month Day). Title of the video podcast (No. Episode number) [Video podcast episode]. In Podcast name. Production Company. URL. Twitter mentions: 22 / Facebook shares: 4
Individual Instagram live episode | Author, A. (or Username). “Title of the Instagram Live video.” Instagram, other contributors, (Year, Month Day). URL. Twitter mentions: 15 / Retweets: 6 / Facebook shares: 6
Audio podcast series | Author, A. (Host). (Year–present if active, or year–year if inactive). Title of audio podcast [Audio podcast]. Production Company. URL. Twitter mentions: 22 / Retweets: 8 / Facebook shares: 13
Video podcast series | Author, A. (Host). (Year–present if active, or year–year if inactive). Title of video podcast [Video podcast]. Production Company. URL
Individual blog post | Author, A. (Year, Month Day). Title of individual blog post. Site name (if applicable). URL to specific post. Citations: 3 / Twitter mentions: 14 / Facebook shares: 22
Social media outreach | Lab Webpage Name, URL link, (Year, Month published). XX Hits, XX Bookmarks, or XX other altmetric data. Citations: 3 / Twitter mentions: 3 / Facebook shares: 27
Social media outreach | Twitter: @[Your Research or Lab Page Twitter Handle]. Number of Followers: XXX
Social media outreach | Author, A. (Host). (Year–present if active, or year–year if inactive). Title of audio podcast [Audio podcast]. Production Company. URL. Twitter mentions: 7
Social media outreach | Author, A. (or Username). “Title of the Instagram Live video.” Instagram, other contributors (Year, Month Day). URL. Twitter mentions: 9
Patents | Author, A. (Year patent issued). Title of invention (Country Patent number), filed (Month Day, Year), and issued (Month Day, Year).
Products or other deliverables | Author, A. (Year). Title of Product. Location of Publisher: Publishing Co. URL to access product. Clicks: #
Products or other deliverables | Author, A. (Year). Title of clinical tool or other resource. Brief description of tool. URL. Clicks: #
Interviews with media | Author, A. (Year, Month Day). Interview by First Name Last Name. Publication Information. Medium. URL (if applicable). XXXX audience views/downloads from [date range].
Media mentions about research (separate section on CV) | Author, A. (Year, Month Day). Title of Article. Publication information, URL if applicable [brief description of your article that was highlighted]. XXXX audience views/downloads from [date range].
Media mentions about research (included as part of citation in your traditional publication list) | Author, A. (Year). Title in sentence case. Journal Name in Title Case Italicized, 1(1), 1–21.* https://doi.org/10.000/10.1000x. Citations: 5 / Twitter mentions: 4 / Mendeley bookmarks: 21 / Facebook shares: 22
*This publication was highlighted by the media. Author, A. (Year, Month Day). Title of Article. Publication information, URL if applicable. XXXX audience views/downloads from [date range].
Note. These are provided as examples of ways a researcher can capture scientific and clinical impact of research using a combination of bibliometric and altmetric metadata.
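As a practical illustration of gathering the kinds of counts shown in Table 6, the sketch below queries Altmetric's free Details Page API for one DOI and prints a suffix that could be pasted after a citation. This is only an assumption-laden example: the response field names used here (score, cited_by_tweeters_count, readers, cited_by_fbwalls_count) come from the API's public documentation rather than from this tutorial, and the DOI is a placeholder, so verify the fields against the JSON actually returned.

    # Minimal illustrative sketch: pull a few altmetric counts for one DOI from
    # Altmetric's public Details Page API and format them in the style of Table 6.
    # Field names are assumptions to be checked against the actual response.
    import requests

    def cv_altmetric_suffix(doi: str) -> str:
        resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=30)
        if resp.status_code == 404:  # no altmetric record exists for this DOI
            return "No altmetric data found"
        resp.raise_for_status()
        data = resp.json()
        parts = [
            f"Altmetric Attention Score: {round(data.get('score', 0))}",
            f"{data.get('cited_by_tweeters_count', 0)} Twitter mentions",
            f"{data.get('readers', {}).get('mendeley', 0)} Mendeley bookmarks",
            f"{data.get('cited_by_fbwalls_count', 0)} Facebook shares",
        ]
        return " / ".join(parts)

    if __name__ == "__main__":
        print(cv_altmetric_suffix("10.1000/example-doi"))  # placeholder DOI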

Step 6. Submit Tenure and Promotion Packet With Confidence

There is a long-standing tradition in academia of using bibliometrics to demonstrate impact when preparing promotion and tenure applications. Research in communication sciences and disorders suggests that supplementing bibliometrics with altmetrics can help convey the true impact of research and clinical efforts in the community at large (Stuart, 2018a, 2018b; Stuart et al., 2017). Wilsdon et al. (2015) report that altmetrics have been used successfully by researchers in academia to provide supplementary impact data. A solid understanding of the use and potential abuse of altmetrics is important for applying altmetric data effectively in promotion and tenure applications.
Researchers preparing promotion and tenure packets should take care to choose metrics from profiles and platforms that are robust and accurately represent the scope of their work. Quantitative metrics can provide valid and reliable support; however, they replace neither a cohesive, well-written narrative carefully explaining the reach and significance of the research outputs nor expert peer assessment of performance. Researchers should also document their data collection and analysis process so that details can be provided to the evaluation team if requested; doing so assures transparency. Metrics selected for inclusion should be clearly defined in a manner that can be understood across diverse professions and career paths, and metrics that vary across platforms should be interpreted in a manner consistent with each platform's intended purpose.
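One lightweight way to document the data collection process described above is to log every harvested metric with its source and retrieval date in a small file that can be shared with an evaluation committee on request. The sketch below illustrates that record-keeping idea only; the file name and column headings are arbitrary choices, not a standard.

    # Minimal illustrative sketch: append each harvested metric to a provenance
    # log so the collection date, source, and value can be shown to reviewers.
    # File and column names are invented for this example.
    import csv
    from datetime import date
    from pathlib import Path

    LOG = Path("impact_metrics_log.csv")
    FIELDS = ["retrieved_on", "source", "identifier", "metric", "value"]

    def log_metric(source: str, identifier: str, metric: str, value) -> None:
        new_file = not LOG.exists()
        with LOG.open("a", newline="") as fh:
            writer = csv.DictWriter(fh, fieldnames=FIELDS)
            if new_file:
                writer.writeheader()
            writer.writerow({
                "retrieved_on": date.today().isoformat(),
                "source": source,
                "identifier": identifier,
                "metric": metric,
                "value": value,
            })

    if __name__ == "__main__":
        log_metric("Crossref", "10.1000/example-doi", "citations", 5)
        log_metric("Altmetric", "10.1000/example-doi", "twitter_mentions", 4)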

A Call to Action for Tenure and Promotion Committees/Committee Members

Tenure and promotion decisions often rely on bibliometric indicators to demonstrate the scientific and clinical impact of research. Alternative metrics have emerged with the potential to fill in the gaps and provide additional information not communicated by bibliometrics. We caution, however, that neither type of metric should be the only indicator of impact, because metrics do not reflect other important contributions of a researcher, such as leadership, service, and teaching. Nonetheless, both types of metrics have distinct advantages and disadvantages for assessing the impact of research, which we offer here for consideration.

Pros and Cons of Bibliometric Data

Faculty members applying for tenure and promotion are often encouraged or, in some cases, required to quantify the value of their work. In practice, this means reporting the number of publications, the number of citations per publication, the impact factors of the journals in which they have published, the number of grant dollars obtained, and their individual h index (see Table 2).
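For readers less familiar with how the h index is derived from raw citation counts, the short sketch below computes it for an invented publication list (the largest h such that h publications each have at least h citations); the citation numbers are made up for illustration.

    # Minimal illustrative sketch: compute an author's h index from per-publication
    # citation counts, i.e., the largest h such that h papers have >= h citations.
    def h_index(citations: list[int]) -> int:
        ranked = sorted(citations, reverse=True)
        h = 0
        for rank, count in enumerate(ranked, start=1):
            if count >= rank:
                h = rank
            else:
                break
        return h

    if __name__ == "__main__":
        # Invented counts: four papers each have at least 4 citations, so h = 4.
        print(h_index([25, 10, 7, 4, 2, 1]))  # prints 4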

Advantages

Bibliometrics analyze data, which is the essence of scientific research in all disciplines. The culture of citation is embedded in the reputation system of research and is a method of expressing recognition of and influence on others' work (Bornmann & Leydesdorff, 2014). Bibliometrics are reasonably well established as valid and reliable tools in the general assessment of research (Bornmann & Leydesdorff, 2014). One advantage of bibliometric data is that they serve as an indicator of quantity, although that quantity is limited to scholarly productivity, a very narrow perspective on impact. In addition, specific types of research output (i.e., peer-reviewed publications) are valued and counted, whereas other types of research outputs do not have equivalent bibliometric indicators. Bibliometrics are seen as objective measures, which means research impact can be compared more readily than with peer review, which is seen as a subjective measure. The bibliometric process is transparent, and results are easily reproduced. Bibliometrics are readily accessible across a wide variety of disciplines, are inexpensive to produce and use, and require relatively little time to collect.

Disadvantages

Bibliometric data are numerical by nature and have highly skewed distributions, which require appropriate statistical methodologies to assess (Bornmann & Leydesdorff, 2014). For example, the JIF is heavily influenced by a small number of highly cited articles and minimally affected by articles that are cited rarely, if at all. Likewise, the h index relies on h as an arbitrary cutoff for selecting significant publications. One of the most apparent disadvantages is that bibliometric data take time to accumulate. Three years has emerged as a reliable window for measuring the impact of publications with bibliometric data (Bornmann & Leydesdorff, 2014), and in many cases the true impact of research is not recognized until decades after an article is published. Other disadvantages are that bibliometrics do not distinguish why a work is cited and are not, by themselves, indicators of quality; for example, a publication may be heavily cited as a bad example of research or for otherwise negative reasons. Metrics can also be exploited by researchers and journals to artificially boost bibliometric scores, and some feel that bibliometrics skew research by encouraging people to write articles they think will be cited rather than articles that are valuable as research. Finally, bibliometrics do not account for variation between disciplines in publication frequency or citation culture; for example, one cannot compare medicine to allied health.
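To make the skew concrete, consider a hypothetical journal with ten citable articles, one cited 90 times and the other nine cited once each: a mean-based, JIF-style figure is driven almost entirely by the single outlier, as the brief sketch below shows (all numbers are invented for illustration).

    # Minimal illustrative sketch: invented citation counts for ten articles in
    # one journal, showing how a mean-based, JIF-style figure is dominated by a
    # single highly cited outlier while the median stays low.
    from statistics import mean, median

    citations = [90] + [1] * 9       # hypothetical counts for 10 articles
    print(mean(citations))           # 9.9  (JIF-style average)
    print(median(citations))         # 1.0  (the typical article)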

Pros and Cons of Altmetric Data

Numerous alternative indicators have been derived from digital commercial products developed to collect data that supplement traditional bibliometric data (see Table 3 for a list and definitions of alternative metrics). Altmetrics have been proposed as a way to assess the societal impact of research and to obtain early evidence of impact (Thelwall, 2020). Although there are distinct advantages to including these indicators in promotion and tenure applications, they are considered nontraditional. With few systematic methods of identifying and measuring the societal impact of research, altmetrics offer a potential solution (Holmberg et al., 2019). These indicators provide important information to peer reviewers about research and clinical impact and should be weighed and considered for inclusion by promotion and tenure committees as indicators of societal impact and early evidence of reach.

Advantages

Impact is an expectation of research outcomes and drives funding decisions, regardless of whether that impact is scientific or clinical. Impact involves diverse stakeholders and represents a broad audience beyond academia (Holmberg et al., 2019). Altmetrics capture mentions in social media and elsewhere online, revealing something about the influence or impact research has made in the short term (Holmberg et al., 2019; Thelwall, 2020). Examples of the metrics provided on social media, in digital databases, and on other research-specific platforms are provided in Table 3.
Holmberg et al. (2019) suggest that altmetric data can be divided into two categories: (a) data sources that reflect new forms of scholarly communication, such as citations, and (b) indicators that reflect other aspects of how scientific information is shared, received, and discussed. Using indicators in the second category as early signals of stakeholder usage, engagement, and attention offers a significant advantage: these metrics appear soon after publication or presentation and serve as early indicators of the potential long-term impact of the research. A second significant advantage is the ability to capture societal impact from diverse stakeholders beyond the academic community. Another is the ability to quantify evidence for nonstandard outputs such as YouTube videos, conference presentations, and gray literature (Holmberg et al., 2019).

Disadvantages

Although there is significant support for using altmetrics to capture societal impact beyond the academic community and to serve as early evidence of the potential long-term scientific or clinical impact of research, confusion and controversy persist about how to incorporate altmetric data into the decision-making process (Holmberg et al., 2019; Thelwall, 2020). Ethical issues also arise; for example, tweets generated by Twitter bots rather than humans can introduce bias, and the use of bots in social media points to the potential for manipulation of the data (Holmberg et al., 2019). At present, collecting altmetric data can be time-consuming because of the diversity of platforms and the lack of standards and normalization, and the availability and richness of altmetric data are highly inconsistent across topics (e.g., farming vs. health care issues). Guidelines for incorporating altmetric data into promotion and tenure decisions are likely to vary by discipline, and clear-cut guidelines are not yet readily available. Ultimately, the value of altmetric data must be weighed carefully against these risks.

Conclusions

Documentation of the scientific and clinical impact of research is an increasingly vital task for faculty, including those in speech-language pathology and audiology. We suggest that researchers become acquainted with the bibliometrics and alternative metrics they will likely be required to present to their employers or institutions. We also suggest that researchers actively engage in the activities available to those with profiles on specific platforms; doing so will help advance the impact of their research in the broader scientific and clinical communities. Perhaps most importantly, these data should be documented and shared with other parties for the advancement of career opportunities, such as employment or tenure and promotion applications, as well as funding opportunities and the like. Without careful tracking and documentation of scholarship efforts, researchers' impact will likely be underestimated.

References

Acquaviva, K. D., Mugele, J., Abadilla, N., Adamson, T., Bernstein, S. L., Bhayani, R. K., Büchi, A. E., Burbage, D., Carroll, C. L., Davis, S. P., Dhawan, N., Eaton, A., English, K., Grier, J. T., Gurney, M. K., Hahn, E. S., Haq, H., Huang, B., Jain, S., … Trudell, A. M. (2020). Documenting social media engagement as scholarship: A new model for assessing academic accomplishment for the health professions. Journal of Medical Internet Research, 22(12), e25070.
Agarwal, A., Durairajanayagam, D., Tatagari, S., Esteves, S. C., Harlev, A., Henkel, R., Roychoudhury, S., Homa, S., Puchalt, N. G., Ramasamy, R., Majzoub, A., Ly, K. D., Tvrda, E., Assidi, M., Kesari, K., Sharma, R., Banihani, S., Ko, E., Abu-Elmagd, M., … Bashiri, A. (2016). Bibliometrics: Tracking research impact by selecting the appropriate metrics. Asian Journal of Andrology, 18(2), 296–309.
Altmetric. (2021). Altmetric for researchers. https://www.altmetric.com/audience/researchers/
Baas, J., Schotten, M., Plume, A., Côté, G., & Karimi, R. (2020). Scopus as a curated, high-quality bibliometric data source for academic research in quantitative science studies. Quantitative Science Studies, 1(1), 377–386.
Bakker, C. J., Bull, J., Courtney, N., DeSanto, D., Langham-Putrow, A. A., McBurney, J., & Nichols, A. (2019). How faculty demonstrate impact: A multi-institutional study of faculty understandings, perceptions, and strategies regarding impact metrics. In D. M. Mueller (Ed.), Recasting the narrative: The proceedings of the ACRL 2019 Conference (pp. 556–568). Association of College and Research Libraries. https://www.ala.org/acrl/sites/ala.org.acrl/files/content/conferences/confsandpreconfs/2019/HowFacultyDemonstrateImpact.pdf
Bernard Becker Medical Library. (2021a). Quantifying the impact of my publications: Article metrics. https://beckerguides.wustl.edu/impactofpublications/ALM
Bernard Becker Medical Library. (2021b). Quantifying the impact of my publications: What is the h index? https://beckerguides.wustl.edu/c.php?g=299569&p=2001203
Bornmann, L., & Leydesdorff, L. (2014). Scientometrics in a changing research landscape: Bibliometrics has become an integral part of research quality evaluation and has been changing the practice of research. European Molecular Biology Organization Reports, 15(12), 1228–1232.
Cabrera, D., Roy, D., & Chisolm, M. S. (2018). Social media scholarship and alternative metrics for academic promotion and tenure. Journal of the American College of Radiology, 15(1), 135–141.
Chen, J., & Wang, Y. (2021). Social media use for health purposes: Systematic review. Journal of Medical Internet Research, 23(5), e17917.
Davidson, M. M., Mahendra, N., & Nicholson, N. (2022). Creating clinical research impact through social media: Five easy steps to get started. Perspectives of the ASHA Special Interest Groups. Advance online publication.
Garfield, E. (1955). Citation indexes for science. Science, 122(3159), 108–111.
Garfield, E. (1999). Journal impact factor: A brief review. CMAJ, 161(8), 979–980.
Hirsch, J. E. (2005). An index to quantify an individual's scientific research output. Proceedings of the National Academy of Sciences of the United States of America, 102(46), 16569–16572.
Hirsch, J. E. (2007). Does the h index have predictive power? Proceedings of the National Academy of Sciences of the United States of America, 104(49), 19193–19198.
Holmberg, K., Bowman, S., Bowman, T., Didegah, F., & Kortelainen, T. (2019). What is societal impact and where do altmetrics fit into the equation? Journal of Altmetrics, 2(1).
Knowlton, S. E., Paganoni, S., Niehaus, W., Verduzco-Gutierrez, M., Sharma, R., Iaccarino, M. A., Hayano, T., Schneider, J. C., & Silver, J. K. (2019). Measuring the impact of research using conventional and alternative metrics. American Journal of Physical Medicine & Rehabilitation, 98(4), 331–338.
Kunze, K. N., Polce, E. M., Vadhera, A., Williams, B. T., Nwachukwu, B. U., Nho, S. J., & Chahla, J. (2020). What is the predictive ability and academic impact of the altmetrics score and social media attention? The American Journal of Sports Medicine, 48(5), 1056–1062.
OurResearch. (2021). Impact challenge day 11: Social media automation for academics. OurResearch blog. https://blog.ourresearch.org/impact-challenge-social-media-automation-academics/
Priem, J., Taraborelli, D., Groth, P., & Neylon, C. (2010). Altmetrics: A manifesto. http://altmetrics.org/manifesto
Saberi, M. K., & Ekhitiyari, F. (2019). Usage, captures, mentions, social media and citations of LIS highly cited papers: An altmetrics study. Performance Measurement and Metrics, 20(1), 37–47.
Shapiro, F. R. (1992). Origins of bibliometrics, citation indexing and citation analysis: The neglected legal literature. Journal of the American Society for Information Science, 43(5), 337–339.
Smith, A., & Weber, C. (2017). How stuttering develops: The Multifactorial Dynamic Pathways Theory. Journal of Speech, Language, and Hearing Research, 60(9), 2483–2505.
Spencer, T. (2022). Clinical impact of research: Introduction to special forum. Perspectives of the ASHA Special Interest Groups. Advance online publication.
Stuart, A. (2018a). Audiology faculty author impact metrics as a function of institution. American Journal of Audiology, 27(3), 354–365.
Stuart, A. (2018b). Speech-language pathology faculty author impact metrics as a function of institution. Perspectives of the ASHA Special Interest Groups, 3(10), 62–82.
Stuart, A., Faucette, S. P., & Thomas, W. J. (2017). Author impact metrics in communication sciences and disorder research. Journal of Speech, Language, and Hearing Research, 60(9), 2704–2724.
Thelwall, M. (2020). The pros and cons of the use of altmetrics in research assessment. Scholarly Assessment Reports, 2(1).
van Eck, N. J., Waltman, L., Van Raan, A. F. J., Klautz, R. J. M., & Peul, W. C. (2013). Citation analysis may severely underestimate the impact of clinical research as compared to basic research. PLOS ONE, 8(4), Article e62395.
Wasike, B. (2021). Citations gone #social: Examining the effect of altmetrics on citations and readership in communication research. Social Science Computer Review, 39(3), 416–433.
Wilsdon, J., Allen, L., Belfiore, E., Campbell, P., Curry, S., Hill, S., Jones, R., Kain, R., Kerridge, S., Thelwall, M., Tinkler, J., Viney, I., Wouters, P., Hill, J., & Johnson, B. (2015). The metric tide: Report of the independent review of the role of metrics in research assessment and management. Report. Higher Education Funding Council for England.

Information & Authors


History

  • Received: Sep 13, 2021
  • Revised: Dec 14, 2021
  • Accepted: Jan 11, 2022
  • Published online: Mar 18, 2022
  • Published in issue: Jun 13, 2022

Authors

Affiliations

Nova Southeastern University, Fort Lauderdale, FL
Duke University School of Medicine, Durham, NC
Durham Veterans Affairs Health Care System, NC

Notes

Disclosure: The authors were members of the American Speech-Language-Hearing Association Clinical Research, Implementation Science, and Evidence-Based Practice Committee upon the writing of this article.
Correspondence to Nannette Nicholson: [email protected]
Editor-in-Chief: Patrick Finn
Publisher Note: This article is part of the Forum: Clinical Impact of Research.
