Distribution by country of the research institutions in the SIR 2012 among the first 100, 250, 500, 1000, 2000 and all (3290)

August 17, 2012

Data source: Scimago Institutions Rankings

Counts are cumulative; a 0 means the country has no institution within that tier.

Country  100  250  500  1000  2000  Total
USA       41   80  138   218   361    521
CHN       14   29   54    99   218    332
FRA        5   12   24    59   143    204
JPN        5   12   20    59   127    185
GBR        6   16   30    60   119    165
ESP        1    4   14    32    73    153
DEU        4   16   38    57    93    129
IND        1    1    3    18    43    128
ITA        2   10   21    43    77    128
BRA        1    4    7    20    42    104
KOR        1    5   12    31    56     91
CAN        4   10   20    37    57     89
AUS        4    8   10    24    53     78
TWN        1    3    7    20    52     73
TUR        0    0    3    14    46     62
POL        1    1    2    14    33     49
IRN        0    1    4     8    23     45
NLD        2    7   13    16    22     35
RUS        1    2    2     5    15     34
CHE        1    3    5    14    21     33
MEX        0    1    4     7    15     32
GRC        0    1    4     7    14     31
PRT        0    0    2     7    10     29
CZE        1    2    2     4    14     28
SWE        0    3   10    13    18     26
ARG        1    1    2     4    10     22
BEL        1    2    6    11    16     22
ZAF        0    0    1     5    12     22
IRL        0    0    2     4     8     21
AUT        0    0    3     9    15     20
ISR        0    3    5     9    14     20
DNK        0    3    4     6     9     19
FIN        0    1    3    10    15     19
ROU        0    0    0     4    10     19
THA        0    0    1     3    11     18
MYS        0    0    2     5     8     17
NOR        0    1    4     5    10     17
PAK        0    0    0     0     7     16
CHL        0    0    1     3     5     15
EGY        0    0    0     3     9     15
NZL        0    0    2     5     9     15
SGP        2    3    3     3     5     12
HKG        0    3    5     6     8     10
HRV        0    0    1     1     6     10
NGA        0    0    0     0     3      9
HUN        0    1    1     6     7      8
SRB        0    0    1     1     5      8
UKR        0    1    1     1     3      8
COL        0    0    0     1     4      7
SAU        0    0    0     1     3      7
TUN        0    0    0     0     3      7
VEN        0    0    0     0     4      7
JOR        0    0    0     0     2      6
SVK        0    0    1     2     3      6
SVN        0    0    1     2     3      6
BGR        0    0    1     1     2      5
DZA        0    0    0     0     1      5
LTU        0    0    0     0     3      5
ARE        0    0    0     0     1      4
BGD        0    0    0     0     0      4
VNM        0    0    0     0     0      4
BLR        0    0    0     1     2      3
EST        0    0    0     1     2      3
IDN        0    0    0     0     0      3
ISL        0    0    0     0     1      3
MAR        0    0    0     0     1      3
PER        0    0    0     0     0      3
QAT        0    0    0     0     0      3
ARM        0    0    0     0     0      2
AZE        0    0    0     0     1      2
CUB        0    0    0     0     1      2
CYP        0    0    0     0     1      2
GEO        0    0    0     0     0      2
GHA        0    0    0     0     0      2
KEN        0    0    0     0     1      2
LBN        0    0    0     0     1      2
LKA        0    0    0     0     0      2
LVA        0    0    0     0     1      2
PHL        0    0    0     0     0      2
TZA        0    0    0     0     0      2
BIH        0    0    0     0     0      1
BWA        0    0    0     0     0      1
CIV        0    0    0     0     0      1
CMR        0    0    0     0     1      1
CRI        0    0    0     0     0      1
ETH        0    0    0     0     1      1
JAM        0    0    0     0     0      1
KWT        0    0    0     0     1      1
LUX        0    0    0     0     0      1
MAC        0    0    0     0     1      1
MDA        0    0    0     0     0      1
MKD        0    0    0     0     0      1
MLT        0    0    0     0     0      1
MNE        0    0    0     0     0      1
MWI        0    0    0     0     0      1
OMN        0    0    0     0     1      1
PAN        0    0    0     0     0      1
PRI        0    0    0     1     1      1
SDN        0    0    0     0     0      1
SEN        0    0    0     0     0      1
TTO        0    0    0     0     0      1
UGA        0    0    0     0     1      1
URY        0    0    0     0     1      1
UZB        0    0    0     0     1      1
ZMB        0    0    0     0     0      1
ZWE        0    0    0     0     0      1
Total    100  250  500  1000  2000   3290

Normalized Impact of Universities (I)

June 11, 2012

At Scimago Lab we include the Normalized Impact indicator in every SIR ranking we build, in both the world and Ibero-American series. This indicator can be seen as a substitute for the overused Impact Factor, as it conveniently captures the scientific impact of institutions and corrects the biases that arise when heterogeneous outputs – such as those of large universities – are analyzed. This post is the first of a two-part series describing its main features and computation methods.

When the scientific output of a university is analyzed to evaluate its impact and compare it with other universities’, we must bear in mind that many factors influence the number of citations the university receives, such as its thematic research profile, the document types in which its output is published, the “age” of the papers under analysis, etc. These biases make a direct comparison of the citation counts obtained by universities unfeasible, so rankings built on raw citation counts should be avoided. The same holds for journal rankings (such as IF-based ones), but that matter will not be discussed here.

At the level of universities and large research centres (national agencies, big companies, etc.), the Normalized Impact indicator is better at revealing the scientific impact (or visibility) of an institution’s output because it includes tools to correct the topic-type-age biases. Normalized Impact adjusts the citation count achieved by every paper by comparing it to an international standard, thereby normalizing the disparity of citation scores obtained by papers in different domains, of different ages and of different types.

This citedness bias is due to differing citation dynamics:

Between thematic domains. Citation behaviour differs across areas of science, so the citedness of papers in Medicine cannot be compared directly with works in the Social Sciences, Engineering, Life Sciences or Health Sciences. Every thematic domain has its own citation patterns.

The following bubble chart shows Germany’s 30 top subject categories by number of documents published in the period 2009-2010. Bubble size represents citations per document in the area. The big light-green bubble, located between 20,000 and 25,000 cites on the vertical axis, corresponds to Cell Biology (11.13 citations per document), while the small wine-red one near the bottom centre corresponds to Electrical and Electronic Engineering (2.23 citations per document). So how would this parameter affect a university with a strong Engineering focus? It seems clear that using the Impact Factor (or any other non-standardized methodology) to build the ranking would push it down, no matter how good its research was.

Germany’s 30 Top Subject Categories by number of documents published in the period 2009-2010
Source: SCImago Journal &amp; Country Rank. Period: 2009-2010. Link

Between different document types. Citation patterns also differ among the various document types (articles, reviews, letters, etc.) traditionally included in scholarly journals. So reviews should not be compared directly with original articles, conference papers, letters, etc.

Among papers of different “ages”. Usually, when a ranking is built, the output of the universities being compared includes papers of different ages, that is, papers published in several different years. A paper published three years before another has had more chances to be cited, because it has had more time to become known to the scientific community, so the ages of the papers in the set influence the number of expected citations.

The following chart shows, year by year, the citations achieved by Canada’s output, both the total number of citations and self-citations (that is, Canadian papers citing other Canadian papers). The line clearly declines at the end, as it approaches the current date. So should the 2006 output be compared on equal terms with 2010’s, when fewer citations are “in play”? It must be taken into account that when universities are compared to build rankings, papers of different ages are measured with this evident age bias.

Annual Series of Citations achieved by Canada’s Output
Source: SCImago Journal &amp; Country Rank. Period: 2009-2010. Link

How international standards are built

What NI does is build citation standards that are used to normalize the citations achieved by an institution. The normalization is then carried out either paper by paper or in a single step, by summing all the university’s citations (i.e. the citations received by the institution’s whole output) and normalizing afterwards.

The citation value of a “standard paper” is computed for every topic-type-age combination. For instance, to build a “standard 2008 review in Oncology”, the citations to every review in Oncology published in 2008 are counted and divided by the number of such reviews; to build a “standard 2009 original article in Applied Mathematics”, the mean number of citations achieved by all the original articles in Applied Mathematics published in 2009 is computed. This is done for each of these triads, so at the end of the process we have a set of standard paper values ready to be used to normalize the output of institutions.
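As a sketch of this procedure, the standard values and the item-by-item normalization described above can be written as follows. The record fields (`field`, `doc_type`, `year`, `citations`) are illustrative assumptions, not SCImago’s actual schema:

```python
from collections import defaultdict

def standard_values(world_papers):
    """Mean citations of a 'standard paper' for every (field, doc type, year) triad."""
    totals, counts = defaultdict(float), defaultdict(int)
    for p in world_papers:
        key = (p["field"], p["doc_type"], p["year"])
        totals[key] += p["citations"]
        counts[key] += 1
    return {key: totals[key] / counts[key] for key in totals}

def normalized_impact(inst_papers, standards):
    """Item-oriented NI: average, over the institution's papers, of each paper's
    citations divided by the standard value of its triad."""
    ratios = [p["citations"] / standards[(p["field"], p["doc_type"], p["year"])]
              for p in inst_papers
              if standards.get((p["field"], p["doc_type"], p["year"]))]
    return sum(ratios) / len(ratios) if ratios else 0.0
```

For example, if the only two 2008 Oncology reviews in the database received 10 and 6 citations, the standard value of that triad is 8, so an institution’s review with 12 citations contributes a ratio of 1.5.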

How to interpret NI scores

NI scores are standardized to the world average, which is assigned a value of 1: a score of 0.8 means the institution is cited 20% below the world average, and 1.3 means it is cited 30% above.

The following chart shows the top ten universities by output in China and France, ordered by NI score. As can be seen, only two Chinese universities reach or exceed the world average; the rest have a scientific impact below the international standard, in some cases 40% lower (those with a 0.6 score). The largest French universities, on the other hand, present outputs with significant impact, all of them more than 40% above the world average (1.4 to 1.7).

Source: SCImago Institutions Rankings. Period: 2006-2010

In the next post of this series we will discuss the two main methods used to compute the Normalized Impact indicator: the so-called CROWN indicator from the University of Leiden and the “Field Normalized Citation Score” established by the Karolinska Institutet. We will also compare NI with the other indicators used in the SIR rankings and describe their features.

Borja González Pereira works at Scimago Lab as Content and Media Manager. He is also a researcher at the SCImago Research Group, where he participates in the development of the SJR indicator and in renowned bibliometric projects including Scimago Journal &amp; Country Rank, Scimago Institutions Rankings and The Atlas of Science. His main interests include Bibliometrics and Scientometrics.

World’s Top 100 Companies by Scientific Knowledge Output, a Scientometric Characterization (SCOPUS 2003-2010)

June 6, 2012

To date we have been posting brief scientometric tables of universities and national research agencies. To expand our analysis, the current post is devoted to private companies as scientific agents, so we’ll turn the spotlight on the world’s major companies and their research performance. We will consider the Top 100 scientific producers and show their indicators related to scientific impact, international collaboration, ability to place their papers in top journals, specialization rate and excellence output.

The research activity carried out at big companies is concentrated in Northern America and Western Europe: among the Top 100, 44 are based in Northern America and 36 in Western Europe. The list also shows a prominent bias by activity sector: most of the selected companies belong to ICT (46), PHARMACEUTICAL (26) or ENERGY (14).

The following table contains some key scientometric indicators of these companies based on SCOPUS data during the period 2003-2010:

Download Table

 

The indicators included in the table are the following:

Output

The number of scientific papers published in scholarly journals reveals the ability of an institution to create scientific knowledge.  In co-authored publications, a score is assigned to each contributing institution through the author’s institutional address.

International Collaboration

It shows the ability of institutions to create international research links, measured as the share of output produced in collaboration with foreign institutions. The values are computed by identifying the institution’s output whose affiliations include more than one country address.
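A minimal sketch of this computation, assuming each paper record carries the list of country codes found in its affiliations (a hypothetical schema):

```python
def international_collaboration(papers):
    """Percentage of papers whose affiliations span more than one country."""
    if not papers:
        return 0.0
    intl = sum(1 for p in papers if len(set(p["countries"])) > 1)
    return 100.0 * intl / len(papers)
```

A paper with two affiliations in the same country does not count as international; only distinct country codes matter.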

Normalized Impact

Normalized Impact scores indicate the scientific impact that institutions have on the scientific community. To obtain a fair measurement of impact, the calculation removes the influence of institution size and research profile, making it well suited for comparing research performance. Normalized Impact values show the ratio between an institution’s average scientific impact and the world average impact of publications of the same time frame, document type and subject category. Values are expressed relative to the world average, which is set to 1: a score of 0.8 means the institution is cited 20% below the world average, and 1.3 means it is cited 30% above. Normalized Impact is computed using the methodology established by the Karolinska Institutet in Sweden, where it is named the “item-oriented field-normalized citation score average”; the long name reflects the fact that citation values are normalized at the level of individual articles. Further information on the methodology can be found in the Bibliometric Handbook for Karolinska Institutet.

High Quality Publications Q1

The ratio of publications that an institution publishes in the world’s most influential scholarly journals. Journals considered for this indicator are those ranked in the first quartile (25%) of their categories, as ordered by the SCImago Journal Rank (SJR) indicator.

Specialization Rate

The Specialization Rate indicates the extent of thematic concentration or dispersion of an institution’s scientific output. Values range from 0 to 1, indicating generalistic versus specialized institutions respectively. This indicator is computed as a Gini index, as used in economics.
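The Gini-based computation can be sketched as follows, taking the institution’s paper counts per subject category as input. This is a simplified reading of the indicator, not SCImago’s exact implementation:

```python
def specialization_rate(counts_per_category):
    """Gini coefficient of an institution's output across subject categories:
    0 = evenly spread (generalist), values near 1 = highly specialized."""
    xs = sorted(counts_per_category)
    n, total = len(xs), sum(counts_per_category)
    if n == 0 or total == 0:
        return 0.0
    # G = 2 * sum(i * x_i) / (n * total) - (n + 1) / n, with x sorted ascending
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return 2 * weighted / (n * total) - (n + 1) / n
```

An institution publishing equally in four categories scores 0, while one whose entire output sits in a single category out of four scores 0.75 (approaching 1 as the number of categories grows).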

Excellence

Excellence indicates the number of an institution’s papers included in the set formed by the 10% most cited papers in their respective scientific fields. It is a measure of the high-quality output of research institutions.
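One way to sketch this count: derive, for each field, the citation threshold of the world’s top 10% most cited papers, then count the institution’s papers at or above it. Tie handling and field assignment in the real indicator may differ:

```python
from collections import defaultdict

def excellence_count(inst_papers, world_papers, top=0.10):
    """Number of institution papers among the world's top `top` fraction of
    most-cited papers in their respective fields."""
    by_field = defaultdict(list)
    for p in world_papers:
        by_field[p["field"]].append(p["citations"])
    thresholds = {}
    for field, cites in by_field.items():
        cites.sort(reverse=True)
        k = max(1, int(len(cites) * top))   # size of the top set
        thresholds[field] = cites[k - 1]    # minimum citations to enter it
    return sum(1 for p in inst_papers
               if p["citations"] >= thresholds.get(p["field"], float("inf")))
```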

Leadership

The number of research papers in which an author from the company is the main contributor, identified as the paper’s corresponding author, who is expected in most cases to be the lead researcher.

 

The largest National Bodies for Research in the world: some performance indicators (2003-2010)

May 4, 2012



Download the report

Last year we published a series of posts characterizing, through scientometric methods, the research activity carried out in some of the major national bodies for research in Europe. Specifically, the series covered the Consiglio Nazionale delle Ricerche CNR (Italy), the Max Planck Society (Germany), and the Spanish Consejo Superior de Investigaciones Científicas. With the goal of highlighting some global characteristics of the research outcomes of national organizations, this post releases an informative table containing a selection of performance indicators for the six largest national bodies for research (by number of scientific publications):

  • Centre National de la Recherche Scientifique,
  • Chinese Academy of Sciences,
  • Russian Academy of Sciences,
  • Max Planck Gesellschaft,
  • Consejo Superior de Investigaciones Científicas,
  • Consiglio Nazionale delle Ricerche
Félix de Moya Anegón is Research Professor at the Institute of Public Goods and Policies (IPP) of the Spanish National Research Council (CSIC). His academic interests include scientometrics, bibliometrics, research evaluation and science policy; he has published around 100 papers in these fields. He is the SCImago Research Group’s main researcher, where he has led renowned bibliometric projects including Scimago Journal &amp; Country Rank, Scimago Institutions Rankings and The Atlas of Science. Prof. De Moya is also an advisor on science policy issues for national science and technology organizations and research institutions around the world.

Universities by country among the Top 100 by research field based on SCOPUS and SCIMAGO

April 17, 2012

For this post, we have built rankings of universities based on their output in each of the major scientific fields. We then took the Top 100 in each field and broke them down by country. The result is the following table, where each row represents a country and each cell shows how many institutions that country has within the Top 100 in the corresponding field. The ranking is ordered by the average number of institutions that countries place in the Top 100. We used Scopus as the data source, after a disambiguation process of institution names.

Download the table.

Download the full list of universities.

 


Forecasting exercise: How World Scientific Output will be in 2018

April 11, 2012

Based on how nations’ scientific output increased during the period 2003-2010 (top 50), we propose the following forecasting exercise (PDF): we use these data to obtain a prediction, not necessarily realistic but nonetheless suggestive, of the scientific output scenario for 2018.
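The extrapolation behind such an exercise can be sketched as compound growth: take each country’s 2003-2010 growth, convert it to an annual rate, and project it forward. This hypothetical helper assumes constant growth, which is exactly why the prediction is “not necessarily realistic”:

```python
def project_output(output_2010, growth_2003_2010, target_year=2018):
    """Project a country's output to target_year by extending its 2003-2010
    compound annual growth rate. growth_2003_2010 is the total growth over
    the seven-year span (e.g. 1.0 means output doubled)."""
    cagr = (1 + growth_2003_2010) ** (1 / 7) - 1
    return output_2010 * (1 + cagr) ** (target_year - 2010)
```

Under this assumption, a country that doubled its output between 2003 and 2010 would double it again by 2017.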

Forecasting Exercise

Growth rate (percentage) and Total Output by country during the period 2003-2010 (click on the graph to access the table with all the data of this forecasting exercise)

Source: SCOPUS. Elaboration: SCImago


Science Indicators of Spanish HEIs 2006-2010

February 20, 2012

Science Indicators of Spanish HEIs 2006-2010
Download the report [English]

Felix de Moya Anegon. This post is devoted to characterizing the research activity carried out at Spanish Higher Education Institutions over the period 2006-2010. The following table (in PDF format) shows scientometric indicators of research output, international collaboration, normalized impact, % of output in first-quartile journals according to the SCImago Journal Rank indicator, and the number of works of excellence (the ordering criterion) published by each institution.

The indicators shown are the following:

Output

The output, or number of scientific papers published in scholarly journals, reveals the ability of an institution to produce scientific knowledge. Output values are affected by institution size and research profile, among other factors. The Output indicator forms the basis for more complex metrics. In co-authored publications, a score is assigned to each contributing institution through the author’s institutional address.

International Collaboration

This indicator shows the ability of institutions to create international research links, measured as the share of output produced in collaboration with foreign institutions. The values are computed by identifying the institution’s output whose affiliations include more than one country address.

Normalized Impact

Normalized Impact scores indicate the scientific impact that institutions have on the scientific community. To obtain a fair measurement of such impact, the calculation removes the influence of institution size and research profile, making it well suited for comparing research performance. Normalized Impact values show the ratio between an institution’s average scientific impact and the world average impact of publications of the same time frame, document type and subject category. Values are expressed relative to the world average, which is set to 1: a score of 0.8 means the institution is cited 20% below the world average, and 1.3 means it is cited 30% above. Normalized Impact is computed using the methodology established by the Karolinska Institutet in Sweden, where it is named the “item-oriented field-normalized citation score average”; the long name reflects the fact that citation values are normalized at the level of individual articles. Further information on the methodology can be found in the Bibliometric Handbook for Karolinska Institutet.

High Quality Publications Q1

The ratio of publications that an institution publishes in the world’s most influential scholarly journals. Journals considered for this indicator are those ranked in the first quartile (25%) of their categories, as ordered by the SCImago Journal Rank (SJR) indicator.

Excellence

Excellence indicates the number of an institution’s papers included in the set formed by the 10% most cited papers in their respective scientific fields. It is a measure of the high-quality output of research institutions.


Performance Indicators of Italian CNR Institutes

July 5, 2011

Performance Indicators of CNR Institutes
Download the report [English]
Descargar el informe [Español]

Felix de Moya Anegon. This is the third post in our series on performance indicators of government organizations and leading national scientific institutions (national research councils and science academies). After analyzing the Spanish CSIC and the German Max Planck Society, we now focus on the Italian Consiglio Nazionale delle Ricerche CNR and its research institutes. Funded mainly by public money, the leading Italian scientific institution is a government agency with 107 institutes spread throughout the country (four of them have been discontinued) and grouped into 11 thematic departments. In 2009, the scientific output attributable to CNR institutes represented more than 10% of Italian scientific output. Again, we present in this post a table (in PDF format) that, based on Elsevier’s Scopus database, shows scientometric indicators of research output, citation impact and international collaboration for the CNR institutes. Its main goal is not to rank the institutes, but to highlight some of the differential characteristics of the research outcomes achieved by these centers.

To show trends, the table includes three consecutive 5-year periods (2003-2007, 2004-2008 and 2005-2009). The ordering criterion is the output of the CNR institutes during the period 2005-2009. The institutes are color-marked to indicate which ones have Normalized Impact (NI) values higher than the NI average over all CNR institutes, which surpass the Italian NI average (but fall below the CNR average), and which fall below the Italian NI average.

The following table provides some background information by showing the same four indicators and periods that you will find in the PDF report, but referring to the entire CNR organisation and to Italy:

CNR         Output    % IC   NI    % Q1
2003-2007   39,949    41.8   1.2   66.0
2004-2008   38,528    42.2   1.2   64.8
2005-2009   37,916    42.7   1.3   63.8

ITALY       Output    % IC   NI    % Q1
2003-2007   292,346   35.8   1.2   54.8
2004-2008   311,598   36.5   1.2   53.9
2005-2009   328,826   37.2   1.2   53.1

The indicators shown are the following:

Output

The output, or number of scientific papers published in scholarly journals, reveals the ability of an institution to produce scientific knowledge. Output values are affected by institution size and research profile, among other factors. The Output indicator forms the basis for more complex metrics. In co-authored publications, a score is assigned to each contributing institution through the author’s institutional address.

International Collaboration IC(%)

This indicator shows the ability of institutions to create international research links, measured as the share of output produced in collaboration with foreign institutions. The values are computed by identifying the institution’s output whose affiliations include more than one country address.

Normalized Impact NI

Normalized Impact scores indicate the scientific impact that institutions have on the scientific community. To obtain a fair measurement of such impact, the calculation removes the influence of institution size and research profile, making it well suited for comparing research performance. Normalized Impact values show the ratio between an institution’s average scientific impact and the world average impact of publications of the same time frame, document type and subject category. Values are expressed relative to the world average, which is set to 1: a score of 0.8 means the institution is cited 20% below the world average, and 1.3 means it is cited 30% above. Normalized Impact is computed using the methodology established by the Karolinska Institutet in Sweden, where it is named the “item-oriented field-normalized citation score average”; the long name reflects the fact that citation values are normalized at the level of individual articles. Further information on the methodology can be found in the Bibliometric Handbook for Karolinska Institutet.

High Quality Publications Q1(%)

The ratio of publications that an institution publishes in the world’s most influential scholarly journals. Journals considered for this indicator are those ranked in the first quartile (25%) of their categories, as ordered by the SCImago Journal Rank (SJR) indicator.


Performance Indicators of German Max Planck Institutes

June 25, 2011

Performance Indicators of Max Planck Institutes
Download the report [English]
Descargar el informe [Español]

Felix de Moya Anegon. In the previous post, on the performance indicators of Spanish CSIC research institutes, we started a series of posts intended to characterize, through scientometric methods, the research activity carried out at large government organizations and leading national scientific institutions (national research councils and science academies), mainly in countries of Europe and Asia. We now turn to the German Max Planck Society and its research institutes (Max Planck Institutes, MPIs). The Max Planck Society is not a government institution; it is a registered association with its registered seat in Berlin. As of January 1, 2011, the Max Planck Society promotes research in 80 institutes and research facilities of its own; four institutes and one research facility are situated abroad. The following table (in PDF format) shows scientometric indicators of research output, citation impact and international collaboration for the MPIs, based on Elsevier’s Scopus database. Its main goal is not to rank the MPIs, but to highlight some of the differential characteristics of the research outcomes achieved by these institutes.

To show trends, the table includes three consecutive 5-year periods (2003-2007, 2004-2008 and 2005-2009). The ordering criterion is the output of the MPIs during the period 2005-2009. The institutes are color-marked to indicate which ones have Normalized Impact (NI) values higher than the NI average over all MPIs, which surpass the German NI average (but fall below the MPI average), and which fall below the German NI average.

The following table provides some background information by showing the same four indicators and periods that you will find in the PDF report, but referring to the entire Max Planck organisation and to Germany:

MAX PLANCK  Output    % IC   NI    % Q1
2003-2007   46,111    64.1   1.8   73.5
2004-2008   48,338    64.5   1.8   72.7
2005-2009   50,038    65.0   1.8   72.2

GERMANY     Output    % IC   NI    % Q1
2003-2007   577,881   41.2   1.3   52.6
2004-2008   609,033   42.1   1.3   52.3
2005-2009   634,385   43.1   1.3   52.0

The indicators shown are the following:

Output

The output, or number of scientific papers published in scholarly journals, reveals the ability of an institution to produce scientific knowledge. Output values are affected by institution size and research profile, among other factors. The Output indicator forms the basis for more complex metrics. In co-authored publications, a score is assigned to each contributing institution through the author’s institutional address.

International Collaboration IC(%)

This indicator shows the ability of institutions to create international research links, measured as the share of output produced in collaboration with foreign institutions. The values are computed by identifying the institution’s output whose affiliations include more than one country address.

Normalized Impact NI

Normalized Impact scores indicate the scientific impact that institutions have on the scientific community. To obtain a fair measurement of such impact, the calculation removes the influence of institution size and research profile, making it well suited for comparing research performance. Normalized Impact values show the ratio between an institution’s average scientific impact and the world average impact of publications of the same time frame, document type and subject category. Values are expressed relative to the world average, which is set to 1: a score of 0.8 means the institution is cited 20% below the world average, and 1.3 means it is cited 30% above. Normalized Impact is computed using the methodology established by the Karolinska Institutet in Sweden, where it is named the “item-oriented field-normalized citation score average”; the long name reflects the fact that citation values are normalized at the level of individual articles. Further information on the methodology can be found in the Bibliometric Handbook for Karolinska Institutet.

High Quality Publications Q1(%)

The ratio of publications that an institution publishes in the world’s most influential scholarly journals. Journals considered for this indicator are those ranked in the first quartile (25%) of their categories, as ordered by the SCImago Journal Rank (SJR) indicator.


Performance Indicators of Spanish CSIC Research Institutes

June 20, 2011

Felix de Moya Anegon. The major European national research councils – the Max Planck Society in Germany, the Centre National de la Recherche Scientifique (CNRS) in France, Italy’s Consiglio Nazionale delle Ricerche (CNR) and Spain’s Consejo Superior de Investigaciones Científicas (CSIC) – and the science academies of Eastern European and Asian countries, including China, Russia and Ukraine, have highly complex institutional structures consisting of tens or hundreds of research institutes with diverse scientific missions. From a scientometric point of view, characterizing such institutional complexity demands performance indicators that go beyond global structures and focus on each institute, research center and research laboratory that makes up these scientific organizations.

Performance Indicators of CSIC Research Institutes
Download the report [English]
Descargar el informe [Español]

With this in mind, this post starts a series devoted to characterizing the research activity carried out at national research councils and science academies all over the world, through scientometric methods and from an agglutinative perspective. The following table (in PDF format) shows scientometric indicators of research output, impact and collaboration for Spain’s CSIC research institutes, based on Elsevier’s Scopus database. Its main goal is not to rank CSIC centers, but to highlight some of the differential characteristics of the research outcomes achieved by these institutes.

To show value trends, the table includes three consecutive five-year periods (2003-2007, 2004-2008 and 2005-2009). The ranking criterion is each institution's output during 2005-2009. The institutes are also color-coded to indicate which have Normalized Impact (NI) values above the CSIC average, which exceed the Spanish average, and which fall below it.

To help put the information in the table into context, we first include the same indicators for the CSIC as a whole and for Spain. The CSIC accounts for 15.3% of total Spanish research output:

CSIC        Output    % IC   NI    % Q1
2003-2007   35,429    47.8   1.4   69.9
2004-2008   38,665    48.6   1.4   69.5
2005-2009   41,929    49.4   1.4   68.7

SPAIN       Output    % IC   NI    % Q1
2003-2007   230,731   33.3   1.1   49.3
2004-2008   252,422   33.9   1.1   48.4
2005-2009   273,482   34.8   1.1   47.8

The indicators shown are the following:

Output

The output, or number of scientific papers published in scholarly journals, reveals an institution's ability to produce scientific knowledge. Output values are affected by institution size and research profile, among other factors. The Output indicator forms the basis for more complex metrics. For co-authored publications, a score is assigned to each contributing institution identified through the authors' institutional addresses.
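As an illustration only (not SCImago's actual implementation), the assignment of credit to every institution listed on a paper can be sketched in a few lines of Python. The record fields (`"institutions"`) are hypothetical, and the sketch assumes whole counting, i.e. every contributing institution receives full credit for a co-authored paper:

```python
from collections import Counter

def whole_count_output(publications):
    """Whole counting: every institution appearing in a paper's
    affiliations receives full credit (a score of 1) for that paper."""
    counts = Counter()
    for pub in publications:
        for inst in set(pub["institutions"]):  # dedupe repeated addresses
            counts[inst] += 1
    return counts

pubs = [
    {"institutions": ["CSIC", "CNRS"]},  # co-authored paper
    {"institutions": ["CSIC"]},
]
print(whole_count_output(pubs))  # CSIC credited twice, CNRS once
```

Fractional counting (splitting each paper's credit among its institutions) would be an equally plausible convention; the post does not specify which one is used.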

International Collaboration IC(%)

This indicator shows an institution's ability to create international research links, measured as the share of its output produced in collaboration with foreign institutions. Values are computed by identifying the publications whose affiliations include addresses in more than one country.
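A minimal sketch of this ratio, again with hypothetical record fields (`"countries"` holding the country of each affiliation address on a paper):

```python
def international_collaboration(publications):
    """Percentage of papers whose affiliation addresses
    span more than one country."""
    intl = sum(1 for pub in publications
               if len(set(pub["countries"])) > 1)
    return 100.0 * intl / len(publications)

pubs = [
    {"countries": ["ES", "FR"]},  # international collaboration
    {"countries": ["ES", "ES"]},  # domestic co-authorship only
    {"countries": ["ES"]},
]
print(international_collaboration(pubs))  # one of three papers, ~33.3%
```

Note that purely domestic co-authorship does not count: only the presence of at least two distinct countries among the addresses does.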

Normalized Impact NI

Normalized Impact scores indicate the scientific impact that institutions have on the scientific community. To measure such impact fairly, the calculation removes the influence of institution size and research profile, making it well suited to comparing research performance. A Normalized Impact value is the ratio between an institution's average scientific impact and the world average impact of publications of the same time frame, document type and subject category. Values are expressed relative to the world average, which is set to 1: a score of 0.8 means the institution is cited 20% below the world average, while 1.3 means it is cited 30% above it. Normalized Impact is computed using the methodology established by the Karolinska Institutet in Sweden, where it is called the "item-oriented field-normalized citation score average"; the long name reflects the fact that citation values are normalized at the level of the individual article. Further information on the methodology is available in the Bibliometric Handbook for Karolinska Institutet.
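The item-level normalization described above can be sketched as follows. This is a simplified illustration, not the Karolinska or SCImago implementation; the baseline table and its keys (year, document type, field) are hypothetical stand-ins for world-average citation rates drawn from the full database:

```python
def normalized_impact(papers, world_baselines):
    """Item-oriented field-normalized citation score average:
    each paper's citation count is divided by the world-average
    citations for items of the same year, document type and
    subject field, then the per-paper ratios are averaged."""
    ratios = [
        p["citations"] / world_baselines[(p["year"], p["doctype"], p["field"])]
        for p in papers
    ]
    return sum(ratios) / len(ratios)

# Hypothetical world-average citation rates per (year, doctype, field)
baselines = {
    (2007, "article", "chemistry"): 10.0,
    (2008, "article", "physics"): 8.0,
}
papers = [
    {"citations": 13, "year": 2007, "doctype": "article", "field": "chemistry"},
    {"citations": 12, "year": 2008, "doctype": "article", "field": "physics"},
]
print(round(normalized_impact(papers, baselines), 2))  # 1.4
```

Because every paper is normalized against its own reference class before averaging, a chemistry-heavy institute is not penalized (or rewarded) merely for working in a field with different citation habits.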

High Quality Publications Q1(%)

The ratio of publications an institution publishes in the world's most influential scholarly journals. Journals considered for this indicator are those ranked in the first quartile (top 25%) of their categories as ordered by the SCImago Journal Rank (SJR) indicator.
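Given a precomputed set of Q1 journals (which in practice comes from the SJR category rankings), the indicator reduces to a simple share. A sketch with made-up journal names:

```python
def q1_share(publications, q1_journals):
    """Percentage of an institution's papers appearing in journals
    ranked in the top quartile (Q1) of their SJR category."""
    in_q1 = sum(1 for pub in publications if pub["journal"] in q1_journals)
    return 100.0 * in_q1 / len(publications)

q1 = {"Journal A", "Journal B"}  # hypothetical Q1 journal set
pubs = [
    {"journal": "Journal A"},
    {"journal": "Journal B"},
    {"journal": "Journal C"},  # not in Q1
    {"journal": "Journal A"},
]
print(q1_share(pubs, q1))  # 75.0
```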

