For a long time, a university’s name was its best defense. If a prestigious institution published a paper, the world assumed it was solid. But in today’s high-pressure, fast-moving research ecosystem, prestige is no longer a self-sustaining currency. The reality is that institutional reputation is fragile.
Relying on “trust” or waiting for a public scandal to break isn’t just risky; it’s outdated. It’s time to move from reactive crisis management to data-driven governance.
At SCImago, we built IRIS to change the conversation. Many people hear “integrity” and think of punishment. We think in terms of diagnostics. IRIS is not designed to play detective or issue verdicts; rather, it functions as a high-tech smoke detector for structural vulnerabilities. By applying Z-score normalization (where 0 represents the global average), we identify statistical outliers that may signal areas of institutional risk.
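To make the Z-score idea concrete, here is a minimal sketch of how standardization flags outliers. The function names, the example retraction rates, and the 1.5-standard-deviation cutoff are all illustrative assumptions, not part of IRIS itself; IRIS normalizes each indicator against the relevant global distribution.

```python
def z_score(value, mean, std):
    """Standardize a raw indicator value; 0 means 'at the global average'."""
    return (value - mean) / std

# Hypothetical retraction rates (retracted works per 1,000 publications).
rates = {"Univ A": 0.4, "Univ B": 0.5, "Univ C": 2.9, "Univ D": 0.6}

values = list(rates.values())
mean = sum(values) / len(values)
std = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5

# Flag institutions well above the average (illustrative 1.5-sigma cutoff).
flagged = {name: round(z_score(v, mean, std), 2)
           for name, v in rates.items()
           if z_score(v, mean, std) > 1.5}
```

With the toy data above, only “Univ C” stands out; a flag like this is a prompt for review, not a verdict.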
The 9 Signals: Decoding Your Institution’s Health
We’ve identified nine specific indicators that act as early-warning signals. On their own, they are prompts for reflection; together, they tell the story of your research culture.
| Indicator | What it measures | The “Red Flag” |
|---|---|---|
| Rate of Multiple Affiliations | Proportion of output where authors list more than one institutional affiliation. | Is this legitimate mobility, or strategic “affiliation shopping” that dilutes credit to the home institution? |
| Rate of Retracted Output | Share of an institution’s output formally withdrawn from the scientific record. | Does this reflect a robust culture of scientific correction, or systemic integrity failures (e.g., plagiarism, fabrication)? |
| Rate of Institutional Self-Citation | Proportion of citations received that originate from the institution’s own authors. | Is the institution becoming scientifically insular and lacking necessary external validation? |
| Rate of Output in Discontinued Journals | Share of output in journals removed from major databases (like Scopus) due to quality/ethical concerns. | Are researchers failing in their due diligence and falling prey to questionable or predatory publishing practices? |
| Rate of Hyper-Authored Output | Proportion of works with an anomalous, statistically outlying number of co-authors for the discipline. | Is this necessary large-scale collaboration, or a veil for unjustified guest/honorary authorship? |
| Gap in Normalized Impact | Difference between overall Field-Weighted Citation Impact (FWCI) and impact when holding corresponding authorship. | Does high-impact research rely too heavily on external partners, creating a sustainability risk for independent capacity? |
| Rate of Hyperprolific Authors | Proportion of contributions from researchers exceeding an extreme threshold (e.g., >25 works per year). | Are production incentives compromising the depth, transparency, and reality of individual intellectual contributions? |
| Rate of Output in Institutional Journals | Proportion of scholarly production published in venues managed by the institution itself. | Are we risking endogamy, compromising peer-review objectivity, and limiting external validation? |
| Rate of Redundant Output | Output with exceptionally high bibliographic overlap (>70%) with other works by the same authors in the same year. | Is coherent research being artificially fragmented (“Salami Slicing”) to inflate publication metrics? |
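As an example of how one of these signals can be computed, the sketch below checks two works against the redundant-output threshold. The 70% figure comes from the indicator definition above; the specific overlap measure (shared references as a fraction of the shorter reference list) and all names are assumptions for illustration.

```python
def bibliographic_overlap(refs_a, refs_b):
    """Overlap coefficient between two sets of cited references (0..1)."""
    a, b = set(refs_a), set(refs_b)
    if not a or not b:
        return 0.0
    # Shared references relative to the shorter list (illustrative choice).
    return len(a & b) / min(len(a), len(b))

# Hypothetical reference lists of two same-year works by the same authors.
paper_1 = ["ref1", "ref2", "ref3", "ref4", "ref5"]
paper_2 = ["ref1", "ref2", "ref3", "ref4", "ref9"]

overlap = bibliographic_overlap(paper_1, paper_2)
is_redundant = overlap > 0.70  # candidate for "salami slicing" review
```

Here four of five references are shared (overlap 0.8), so the pair would be flagged for human review.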
For institutional leaders, this is about moving from “I think” to “I know.”
- Spot patterns early: Identify “Salami Slicing”, endogamous publishing loops, or reliance on discontinued journals before they escalate into structural problems.
- Prioritize your efforts: You can’t oversee every paper. IRIS highlights which departments may need more training or support in identifying quality venues, and which policies deserve closer review.
- Leadership dashboards: Bring integrity into the boardroom. By integrating these indicators into daily management, integrity becomes a performance asset—not just a compliance requirement.
Integrity is the foundation that protects every other achievement your institution makes. In the era of evidence-based governance, “trust me” is good—but “here is the evidence” is better.