SHARE Score

R5

Community Engagement

Social mentions, blog posts, Wikipedia, policy citations

Bucket: Reuse (R)
Type: Outcome metric (not FAIR-derived)

Justification

Social mentions, Wikipedia references, and policy citations capture impact beyond academia that formal citations miss.

Practical Guide


Track social mentions; they capture impact beyond academia.

Community engagement — social media mentions, Wikipedia citations, policy references — captures impact that formal citations miss. No repository in our validation set natively tracks social engagement, so we have no citation data. But services like Altmetric and PlumX show that datasets referenced in policy documents or Wikipedia have outsized real-world impact.

Why this signal matters despite the numbers

No citation data is available because community engagement metrics require external services (Altmetric, PlumX, DataCite Event Data) that repositories in our validation set do not natively track.

For Repositories

  • Integrate with DataCite Event Data for social mention tracking
  • Consider Altmetric or PlumX integration for broader impact metrics
  • Display community engagement indicators on dataset landing pages
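The DataCite Event Data integration above can be sketched with a stdlib-only client. The endpoint (`https://api.datacite.org/events`) and the `meta.total` field follow the public Event Data API, but treat the exact parameter and field names as assumptions to verify against the current documentation; the function names are illustrative.

```python
# Sketch: count DataCite Event Data events for a dataset DOI.
# Assumes the public Event Data API response shape, where
# meta.total holds the total number of matching events.
import json
import urllib.parse
import urllib.request


def total_events(payload: dict) -> int:
    """Extract the total event count from an Event Data response body."""
    return payload.get("meta", {}).get("total", 0)


def fetch_event_count(doi: str) -> int:
    """Query the events endpoint for a DOI and return the match count."""
    query = urllib.parse.urlencode({"doi": doi, "page[size]": 1})
    url = f"https://api.datacite.org/events?{query}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return total_events(json.load(resp))
```

A repository could run this per dataset at index time and surface a nonzero count as a community-engagement indicator on the landing page.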

For Depositors

  • Share your dataset on academic social media (Twitter/X, Mastodon)
  • If your data is policy-relevant, highlight it in policy briefs
  • Track mentions beyond citations — Wikipedia references, blog posts, news articles

Outcome metric — not yet measurable in our validation data. Requires external services (Altmetric, PlumX, DataCite Event Data).

Standards Sources

Convergence score: 1/4 independent sources (Bibliometric)

| Standard            | Field / Property     | Obligation Level |
| ------------------- | -------------------- | ---------------- |
| Altmetric           | Attention score      | Service          |
| PlumX               | Social media metrics | Service          |
| DataCite Event Data | Social events        | API              |

FAIR Principle Alignment

Primary mapping: Outcome metric (not FAIR-derived)

This is an outcome metric not derived from FAIR principles. The R (Reuse) bucket intentionally measures realized impact rather than metadata quality, enabling validation that deposit-time signals predict downstream use.

How This Signal Is Measured

Measured via Altmetric attention score or DataCite Event Data social events. Binary for v1: any engagement = 1.

Empirical Evidence (Zenodo, n=1.3M)

Per-signal statistics use Zenodo as the primary validation source because it is the largest general-purpose repository with structured DataCite metadata, natural variance across all 25 signals, and available citation/usage data. Domain-specific repositories exhibit ceiling effects or restricted variance that preclude per-signal discrimination. Cross-repository validation is reported separately.

Data Source

Zenodo (CERN)

1,328,100 records analyzed

Interpretation: Not directly measurable in Zenodo metadata. Altmetric and PlumX provide social engagement scores but require API access. DataCite Event Data captures some social events.

Cross-repository note: Community engagement metrics are increasingly important for demonstrating broader impact beyond academic citations, particularly for policy-relevant datasets.

Quantitative Evidence

Scoring Formula

altmetric_score > 0 || social_events > 0 → 4 pts

Contribution: 4 of 100 points · Reuse bucket (0–20)
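The scoring formula above is a binary check, so the implementation is a one-liner. A minimal sketch (function and parameter names are illustrative, not taken from the framework's codebase):

```python
# R5 scoring sketch: binary for v1, so any recorded engagement earns
# the signal's full 4 points within the Reuse bucket (0-20 range).
def community_engagement_points(altmetric_score: float = 0.0,
                                social_events: int = 0) -> int:
    """Return 4 if any social engagement is recorded, else 0."""
    if altmetric_score > 0 or social_events > 0:
        return 4
    return 0
```

Because v1 does not weight by engagement volume, a single DataCite social event scores the same as a high Altmetric score.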

Data Gap

Empirical validation not yet available for this signal

Community engagement metrics require Altmetric API or PlumX (subscription services) or DataCite Event Data (social mentions). No repository in our validation set natively tracks social engagement. Planned for future integration.

Method: Not yet computed · Source: Requires external service
