Forecasting Exercise: What World Scientific Output Will Look Like in 2018

Based on how nations' scientific output increased during the period 2003-2010 (top 50 countries), we propose the following forecasting exercise (PDF). We use this data to obtain a prediction of the scientific output scenario for 2018 that is not necessarily realistic, but suggestive nonetheless.

Forecasting Exercise

Growth rate (percentage) and Total Output by country during the period 2003-2010 (click on the graph to access the table with all the data of this forecasting exercise)

Source: SCOPUS. Elaboration: SCImago
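The post does not spell out its projection method, but the natural mechanics behind such an exercise are compound growth. Here is a minimal Python sketch; the output figures below are invented placeholders for a hypothetical country, not Scopus data:

    # Minimal sketch: extend a country's 2003-2010 compound annual growth rate
    # (CAGR) forward to 2018. The output figures below are invented.
    out_2003 = 120_000            # papers published in 2003 (hypothetical)
    out_2010 = 310_000            # papers published in 2010 (hypothetical)

    cagr = (out_2010 / out_2003) ** (1 / 7) - 1   # 7 annual steps: 2003 -> 2010
    out_2018 = out_2010 * (1 + cagr) ** 8         # 8 more steps: 2010 -> 2018
    print(f"CAGR = {cagr:.1%}, projected 2018 output = {out_2018:,.0f}")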


Félix de Moya Anegón is Research Professor at the Institute of Public Goods and Policies (IPP) of the Spanish National Research Council (CSIC). His academic interests include scientometrics, bibliometrics, research evaluation and science policy, fields in which he has published around 100 papers. He is the SCImago Research Group's main researcher, where he has led renowned bibliometric projects including Scimago Journal & Country Rank, Scimago Institutions Rankings and The Atlas of Science. Prof. De Moya is also an advisor on science policy issues for national science and technology organizations and research institutions around the world.

Science Indicators of Spanish HEIs 2006-2010

Felix de Moya Anegon. This post is devoted to characterizing the research activity carried out at Spanish Higher Education Institutions over the period 2006-2010. The following table (in PDF format) shows scientometric indicators of research output, international collaboration, normalized impact, % of output in first-quartile journals according to the Scimago Journal Rank indicator, and the number of works of excellence (the ordering criterion) published by each institution.

The indicators shown are the following:

Output

The output, or number of scientific papers published in scholarly journals, reveals the ability of an institution to produce scientific knowledge. Output values are affected by institution size and research profile, among other factors. The Output indicator forms the basis for more complex metrics. For co-authored publications, a score is assigned to each contributing institution through the authors' institutional addresses.

International Collaboration

This indicator shows the ability of institutions to create international research links, through the ratio of their output that has been produced in collaboration with foreign institutions. The values are computed by identifying the papers in an institution's output whose affiliations include more than one country.
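As an illustration, the computation reduces to counting papers whose affiliations span more than one country; a minimal Python sketch with hypothetical paper records:

    # Minimal sketch: IC(%) is the share of an institution's papers whose
    # affiliations include more than one country. Records are hypothetical.
    papers = [
        {"id": 1, "countries": {"ES"}},
        {"id": 2, "countries": {"ES", "DE"}},
        {"id": 3, "countries": {"ES", "US", "FR"}},
    ]

    international = sum(1 for p in papers if len(p["countries"]) > 1)
    ic_pct = 100.0 * international / len(papers)
    print(f"IC = {ic_pct:.1f}%")   # 66.7% for this toy sample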

Normalized Impact

Normalized Impact scores indicate the scientific impact that institutions have on the scientific community. In order to obtain a fair measurement of such impact, its calculation removes the influence of institution size and research profile, making it ideal for comparing research performance. Normalized Impact values show the ratio between the average scientific impact of an institution and the world average impact of publications of the same time frame, document type and subject category. The world average is set to 1, so a score of 0.8 means the institution is cited 20% below the world average and 1.3 means it is cited 30% above the world average. Normalized Impact is computed using the methodology established by the Karolinska Institutet in Sweden, where it is named the "Item oriented field normalized citation score average"; the long name reflects the fact that citation values are normalized at the level of the individual article. Further information on the methodology can be found in the Bibliometric Handbook for Karolinska Institutet.
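A minimal Python sketch of that item-level normalization; the world-average lookup table and citation counts are invented stand-ins for values that would be computed over the whole database:

    # Minimal sketch of item-oriented field normalization: each paper's citation
    # count is divided by the world average for papers of the same year, document
    # type and subject category; the institution's NI is the mean of these ratios.
    # All figures are invented.
    world_avg = {   # (year, doc_type, category) -> world mean citations per paper
        (2008, "article", "Oncology"): 10.0,
        (2008, "article", "Mathematics"): 2.5,
    }
    papers = [
        {"year": 2008, "doc_type": "article", "category": "Oncology", "citations": 13},
        {"year": 2008, "doc_type": "article", "category": "Mathematics", "citations": 2},
    ]

    ratios = [p["citations"] / world_avg[(p["year"], p["doc_type"], p["category"])]
              for p in papers]
    ni = sum(ratios) / len(ratios)
    print(f"NI = {ni:.2f}")   # (1.3 + 0.8) / 2 = 1.05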

High Quality Publications Q1

Ratio of an institution's publications that appear in the world's most influential scholarly journals. Journals considered for this indicator are those ranked in the first quartile (top 25%) of their categories as ordered by the SCImago Journal Rank (SJR) indicator.

Excellence

Excellence indicates the number of an institution's papers that are included in the set formed by the 10% most cited papers in their respective scientific fields. It is a measure of the high-quality output of research institutions.
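A sketch of how such a top-10% rule could be applied per field; the citation distribution below is invented and tiny, whereas the real computation runs over the full database:

    # Minimal sketch: a paper counts as "excellent" when its citations reach the
    # 90th percentile of its field's world citation distribution. Numbers invented.
    field_citations = {"Oncology": [0, 1, 2, 3, 5, 8, 12, 20, 35, 60]}

    def top10_threshold(field):
        cites = sorted(field_citations[field])
        return cites[int(0.9 * len(cites))]   # simple rank-based 90th percentile

    my_papers = [{"field": "Oncology", "citations": 70},
                 {"field": "Oncology", "citations": 4}]
    excellence = sum(1 for p in my_papers
                     if p["citations"] >= top10_threshold(p["field"]))
    print(f"Excellence = {excellence}")   # 1 of 2 papers in the top 10%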


Performance Indicators of Italian CNR Institutes

Felix de Moya Anegon. This is the third post in our series on performance indicators of government organizations and leading national scientific institutions (national research councils and science academies). After the analyses of the Spanish CSIC and the German Max Planck Society, we now focus on the Italian Consiglio Nazionale delle Ricerche (CNR) and its research institutes. Mainly publicly funded, Italy's leading scientific institution is a government agency with 107 institutes spread throughout the country (four of them have been discontinued), grouped into 11 thematic departments. In 2009, the scientific output attributable to CNR institutes represented more than 10% of Italian scientific output. Again, we present in this post a table (in PDF format) that, based on Elsevier's Scopus database, shows scientometric indicators of research output, citation impact and international collaboration for the CNR institutes. Its main goal is not to rank the institutes, but to highlight some of the differential characteristics of the research outcomes achieved by these centers.

In order to show trends, the table includes three consecutive 5-year periods (2003-2007, 2004-2008 and 2005-2009). The ordering criterion is the output of the CNR institutes during the period 2005-2009. The institutes are color-marked to indicate which ones have Normalized Impact (NI) values higher than the NI average over all CNR institutes, which ones surpass the Italian NI average (but fall below the CNR average), and which ones fall below the Italian NI average.
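The color marking amounts to a three-band classification. A minimal sketch in Python, using the 2005-2009 CNR and Italy averages from the background table below; the institute names and NI values are invented:

    # Minimal sketch of the three-band color marking: compare each institute's NI
    # first with the CNR-wide average, then with the Italian national average.
    # The averages come from the table below (2005-2009); institutes are invented.
    CNR_NI, ITALY_NI = 1.3, 1.2

    def band(ni):
        if ni > CNR_NI:
            return "above CNR average"
        if ni > ITALY_NI:
            return "above Italian average, below CNR average"
        return "below Italian average"

    for name, ni in [("Institute A", 1.5), ("Institute B", 1.25), ("Institute C", 0.9)]:
        print(f"{name}: NI {ni} -> {band(ni)}")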

The following table provides some background information by showing the same four indicators and periods that you will find in the PDF report, but for the entire CNR organization and for Italy:

CNR         Output    % IC   NI    % Q1
2003-2007   39,949    41.8   1.2   66.0
2004-2008   38,528    42.2   1.2   64.8
2005-2009   37,916    42.7   1.3   63.8

ITALY       Output    % IC   NI    % Q1
2003-2007   292,346   35.8   1.2   54.8
2004-2008   311,598   36.5   1.2   53.9
2005-2009   328,826   37.2   1.2   53.1

The indicators shown (Output, International Collaboration IC(%), Normalized Impact NI and High Quality Publications Q1(%)) are defined in the Science Indicators of Spanish HEIs post above.


Performance Indicators of German Max Planck Institutes

Felix de Moya Anegon. In the previous post on Performance Indicators of Spanish CSIC Research Institutes, we started a series of posts intended to characterize, through scientometric methods, the research activity carried out at large government organizations and leading national scientific institutions (national research councils and science academies), mainly in countries in Europe and Asia. We now turn to the German Max Planck Society and its research institutes (Max Planck Institutes, MPIs). The Max Planck Society is not a government institution; it is a registered association with its registered seat in Berlin. As of January 1, 2011, the Max Planck Society promotes research in 80 institutes and research facilities of its own; four institutes and one research facility are situated abroad. The following table (in PDF format) shows scientometric indicators of research output, citation impact and international collaboration for the MPIs, based on Elsevier's Scopus database. Its main goal is not to rank the MPIs, but to highlight some of the differential characteristics of the research outcomes achieved by these institutes.

In order to show trends, the table includes three consecutive 5-year periods (2003-2007, 2004-2008 and 2005-2009). The ordering criterion is the output of the MPIs during the period 2005-2009. The institutes are color-marked to indicate which ones have Normalized Impact (NI) values higher than the NI average over all MPIs, which ones surpass the German NI average (but fall below the MPI average), and which ones fall below the German NI average.

The following table provides some background information by showing the same four indicators and periods that you will find in the PDF report, but for the entire Max Planck organization and for Germany:

MAX PLANCK   Output    % IC   NI    % Q1
2003-2007    46,111    64.1   1.8   73.5
2004-2008    48,338    64.5   1.8   72.7
2005-2009    50,038    65.0   1.8   72.2

GERMANY      Output    % IC   NI    % Q1
2003-2007    577,881   41.2   1.3   52.6
2004-2008    609,033   42.1   1.3   52.3
2005-2009    634,385   43.1   1.3   52.0

The indicators shown (Output, International Collaboration IC(%), Normalized Impact NI and High Quality Publications Q1(%)) are defined in the Science Indicators of Spanish HEIs post above.


Performance Indicators of Spanish CSIC Research Institutes

Felix de Moya Anegon. The major European national research councils, such as the Max Planck Society in Germany, the Centre National de la Recherche Scientifique (CNRS) in France, Italy's Consiglio Nazionale delle Ricerche (CNR) and Spain's Consejo Superior de Investigaciones Científicas (CSIC), and the science academies of Eastern European and Asian countries, including China, Russia and Ukraine, have highly complex institutional structures consisting of tens or hundreds of research institutes with diverse scientific missions. From a scientometric point of view, the characterization of such institutional complexity demands performance indicators that go beyond global structures and put the focus on each institute, research center and research laboratory that makes up these scientific organizations.

With this in mind, this post starts a series devoted to characterizing, through scientometric methods and at the level of their individual institutes, the research activity carried out at national research councils and science academies all over the world. The following table (in PDF format) shows scientometric indicators of research output, impact and collaboration for Spain's CSIC research institutes, based on Elsevier's Scopus database. Its main goal is not to rank CSIC centers, but to highlight some of the differential characteristics of the research outcomes achieved by these institutes.

In order to show trends in the values, the table includes three consecutive 5-year periods (2003-2007, 2004-2008 and 2005-2009). The ranking criterion is the output of the institutions during the period 2005-2009. The institutes are also color-marked to indicate which ones have Normalized Impact (NI) values higher than the CSIC NI average, which ones surpass the Spanish NI average, and which ones fall below it.

To help put the information in the table into context, we first include here the same indicators for the entire CSIC organization and for Spain. CSIC represents 15.3% of total Spanish research output:

CSIC        Output    % IC   NI    % Q1
2003-2007   35,429    47.8   1.4   69.9
2004-2008   38,665    48.6   1.4   69.5
2005-2009   41,929    49.4   1.4   68.7

SPAIN       Output    % IC   NI    % Q1
2003-2007   230,731   33.3   1.1   49.3
2004-2008   252,422   33.9   1.1   48.4
2005-2009   273,482   34.8   1.1   47.8

The indicators shown (Output, International Collaboration IC(%), Normalized Impact NI and High Quality Publications Q1(%)) are defined in the Science Indicators of Spanish HEIs post above.


Scientific Excellence Georeferenced. The neighborhood matters.

While the current dynamics of worldwide scientific output are bringing to the surface research institutions from developing countries with important results, the location of the institutions producing excellent outcomes, that is, those representing truly significant scientific advances, does not seem to be changing.

Although a formal study would require a detailed analysis by scientific field, an exploratory analysis of the group formed by the research institutions that annually publish 100 or more scientific papers (around 3,000 worldwide, as indexed in the Scopus database) reveals a geographic distribution with a strong bias, explained by research spending patterns in different regions of the world. There is a large concentration of institutions in North America, Europe, India, China and Japan and, to a lesser extent, in the southern hemisphere: in Chile, Argentina and, more prominently, in Brazil, South Africa and Australia.

Neighborhood influence

By depicting research institutions grouped into four levels of Normalized Impact (NI), we obtain a picture, such as the one shown, of the geographic distribution of research excellence around the world. Taking into account that, with rare exceptions, institutions reaching NI scores higher than the world average (values above 1) are concentrated in North America, Western Europe and Australia/New Zealand, we must conclude that if the geographic bias in scientific output is high, the bias affecting scientific impact is even higher.

NI world map

Impact can be considered to reflect the use researchers make of previously generated scientific knowledge. With this in mind, the map suggests that the regions that produce the most (Western Europe and North America) primarily use the knowledge generated in their own area. This would explain the concentration of high impact in highly productive regions, and it implies that research institutions in the world's most productive regions accumulate a reputational capital that is due to their geographical context and that is, on the other hand, unattainable for institutions located in less productive regions. It remains to be seen what will happen with China, a newcomer to the elite of the world's most productive scientific countries. Put another way, the neighborhood of a research institution affects the scientific reputation it can achieve in global terms, unless it can reach beyond its neighborhood through inter-regional alliances with reputed institutions from highly productive regions.

Scientific Dependence

For developing countries, an unintended consequence of the need to collaborate with researchers from highly productive regions is what could be called "scientific dependence". Even though it is difficult to measure the role played by researchers in scientific works just by studying affiliation fields, we do know that certain research institutions show extremely high rates of International Collaboration (IC), which can indeed be associated with situations of scientific dependence. For example, when an institution's IC reaches or exceeds 80% of its total output, so that its independent production ends up being marginal, it clearly operates in a situation of "scientific dependence". Furthermore, when an institution shows high collaboration rates together with an outstanding impact (NI) within its own country and/or region, that impact is most likely due to the collaboration, and it may be concluded that an external scientific dependence exists.

Some Latin American countries constitute an interesting case of scientific dependence, where lower scientific development, a high international collaboration rate and a relatively high impact at the regional scale come together. See the high correlation between the IC and NI values in the following table.


Country       Output    IC (%)   NI
Brazil        135,259   25.35    0.77
Mexico         52,527   38.65    0.75
Argentina      34,927   42.93    0.90
Chile          20,534   52.64    0.95
Venezuela       8,489   44.40    0.65
Colombia        8,292   53.90    0.83
Cuba            6,915   39.38    0.46
Puerto Rico     4,476   57.95    1.00
Uruguay         2,711   64.66    0.95
Peru            2,654   75.77    1.13
Costa Rica      1,946   71.17    1.16
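The correlation the post points to can be checked directly from the table values; a quick sketch using only the Python standard library (statistics.correlation requires Python 3.10 or later):

    # Pearson correlation between IC and NI for the countries in the table above.
    from statistics import correlation   # available since Python 3.10

    ic = [25.35, 38.65, 42.93, 52.64, 44.40, 53.90, 39.38, 57.95, 64.66, 75.77, 71.17]
    ni = [0.77, 0.75, 0.90, 0.95, 0.65, 0.83, 0.46, 1.00, 0.95, 1.13, 1.16]
    print(f"Pearson r = {correlation(ic, ni):.2f}")   # clearly positive (about 0.8)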

Science Indicators of Spanish Public Universities

We are releasing here a short post with some interesting indicators that can help shed light on the performance of the Spanish scientific system; these relate specifically to publicly funded universities.

The table is available as a PDF and can be downloaded here: Science Indicators of Spanish Public Universities

The list below describes the indicators you will find in the table.

  • Normalized Impact (NI): (ordering criterion) the weighted citation average of the universities. By normalizing citation counts for differences among fields, this indicator represents, in an indirect way, the average scientific quality of the universities. Analysis Period: 2005-2009
  • Weighted Scientific Production (WSP): the result of assessing each scientific paper by its Normalized Impact and adding these values up for the entire institution. In this way, each paper counts for more than 1 when cited above the world average (for its field, type and year) and for less than 1 when cited below it; see the sketch after this list. Year 2008.
  • Equivalent Full-Time Faculty Productivity (EFTF-P): scientific output per equivalent full-time faculty member. Year 2008.
  • Total Expenditure in Research (TER): the combination of the different research expenditure items at a university. Year 2008.
  • Expenditure per Paper (EP): average expenditure per published scientific paper at a university (considering its Weighted Scientific Production). Year 2008.
  • Total Equivalent Full-Time Faculty (EFTF): number of full-time academics, converting part-time positions to full-time equivalents. Year 2008.
  • Expenditure per Equivalent Full-Time Faculty (TER/EFTF): the ratio between Total Expenditure in Research (TER) and Total Equivalent Full-Time Faculty (EFTF). Year 2008.
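A minimal Python sketch of how WSP and the expenditure ratios defined above fit together; every figure below is an invented placeholder:

    # Minimal sketch: WSP sums per-paper normalized impacts, so papers cited above
    # the world average for their field, type and year count for more than 1.
    # The derived indicators are simple ratios. All numbers are invented.
    paper_ni = [1.3, 0.8, 2.1, 0.4, 1.0]   # per-paper normalized impact, 2008
    wsp = sum(paper_ni)                     # Weighted Scientific Production (5.6)

    eftf = 1200.0                           # Equivalent Full-Time Faculty
    ter = 45_000_000.0                      # Total Expenditure in Research (euros)

    eftf_p = len(paper_ni) / eftf           # output per full-time-equivalent faculty
    ep = ter / wsp                          # Expenditure per (weighted) Paper
    ter_per_eftf = ter / eftf               # expenditure per faculty member
    print(f"WSP={wsp}, EFTF-P={eftf_p:.4f}, EP={ep:,.0f}, TER/EFTF={ter_per_eftf:,.0f}")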

Overall thoughts about Normalized Impact

The evolution of the Normalized Impact of Spanish universities highlights the overall improvement in the average impact of the Spanish system, as well as a slight decrease among the top universities. It remains true that the public system as a whole is balanced and comparable with scientifically developed countries, specifically when compared with some English-speaking countries. More than 75% of Spanish public universities are above the world average impact; however, none of them reaches an average impact more than 50% above the world average.

Chart: Evolution of Normalized Impact in Spanish Universities

The following chart relates the average productivity of academics to the Normalized Impact of universities (columns EFTF-P and NI in the table) and shows a clear positive relationship between these variables. In other words, in those universities where academics' quantitative output in terms of scientific publications is higher, their qualitative outcomes are higher too. This means that as faculty publication activity at a university becomes more intense, the average quality of its outcomes also rises, among other things as a consequence of the greater chances of scientific synergies taking place within the institution.

Professors' productivity and the Normalized Impact of universities

Institutional Collaboration in Global Science

Felix de Moya Anegon. Nowadays, georeferenced maps of science are being widely used in research analysis and evaluation because of their ability to represent contextual information; see for instance Beauchesne's captivating images in his post on maps of scientific collaboration between researchers, or Bornmann & Waltman's geographical density maps. There are many more examples. However, I will be posting on a different kind of map: a scientific collaboration map that overcomes the topographical rigidity exhibited by the former. The proposed maps represent the nearly 3,000 research institutions that account for about 80% of world scientific output.

World Collaboration Map

In the picture, institutions are linked to one another based on research collaboration as stated in paper affiliations (the list of institutions in the chart is exactly the same as the one used to elaborate the Scimago Institutions Rankings World Report 2010). According to the algorithm used to make the picture, collaboration links act as gravitational forces, in such a way that the closeness between nodes (and clusters) represents collaboration strength; each node represents an institution and colors symbolize world regions.

Scientific collaboration tends to take place among neighbors (preferably, though not exclusively), so one might expect a graph illustrating collaboration links among research institutions to group them into regions, as in the figure. These regional cluster formations, besides highlighting intra-regional vs. inter-regional collaboration strength, help us analyze the degree of centrality reached by the different regions of the world within the global network of research institutions. You can observe not only size differences in regional sub-networks but also how "central" different regions are within the processes of scientific knowledge generation and communication.
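A toy version of such a layout can be produced with the force-directed (spring) algorithm in the networkx library; the institutions and collaboration counts below are invented, and this is a sketch of the general technique rather than the algorithm actually used for the map:

    # Minimal sketch: collaboration counts become edge weights, and a spring
    # (force-directed) layout pulls strongly collaborating institutions together,
    # mimicking the "gravitational" behavior described above. Data are invented.
    import networkx as nx

    G = nx.Graph()
    G.add_weighted_edges_from([
        ("MIT", "Harvard", 120),        # strong intra-regional link
        ("Oxford", "Cambridge", 150),
        ("MIT", "Oxford", 40),          # inter-regional link
        ("Tsinghua", "Peking", 90),
        ("Tsinghua", "MIT", 10),        # weak inter-regional link
    ])

    # Heavier edges act as stronger attractive forces between nodes.
    pos = nx.spring_layout(G, weight="weight", seed=42)
    for node, (x, y) in pos.items():
        print(f"{node}: ({x:+.2f}, {y:+.2f})")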

Reputed vs. Emerging Science

In this kind of representation, central positions tend to reflect higher reputation levels, while peripheral ones imply predominantly local collaboration patterns. As a consequence, researchers belonging to centrally placed institutions are sought as collaborators by researchers from all around the world.

As was to be expected, research institutions from North America (USA and Canada) and Europe compete for the central positions. These regions have intense research collaboration links, as the wide contact front between institutions from both regions highlights. Meanwhile, the self-organizing system of world science keeps pushing the traditionally peripheral regions (Asia, Latin America, the Middle East and Africa) outwards, even though these regions currently exhibit higher growth rates than the central ones.

In fact, despite the increasingly important role played by Asia in global research output, mainly due to the growth of Chinese science, Asian institutions are still far from reaching the reputation levels achieved by some European and North American research institutions, hence the outlying position of the Asian cluster. This is so even though Asia and North America have strong collaboration links involving many Asian countries.

Latin America, Oceania and Middle East

These regions each maintain priority collaboration links, at a regional scale, with at least two other regions: Latin America with Europe (mainly Spain and Portugal) and North America; in this case the picture shows that Latin America-Europe collaboration links are so intense that the boundary between the two regions is not well defined in the map. Similarly, the deep links that Australian and New Zealand research institutions have with their North American counterparts lead to the blurred boundaries observed in the map between the Oceania and North America regional clusters.

Featured Neighborhoods

The Middle East region deserves a separate look. It is the only region divided into two separate areas in the map. Arab countries are placed at the bottom of the map, primarily connected to Europe, mainly Eastern Europe (white nodes), and to a lesser extent to Asia and North America, while Israel sits at the junction of Europe, North America and Latin America.
The overlapping nodes from Western and Eastern Europe depict only one research cluster for Europe, suggesting that in terms of research collaboration there is no Western Europe separate from Eastern Europe.
African institutions share their collaboration links between Europe and the Middle East.
Japanese institutions fit their institutional links within the North American collaboration web.

Further Analyses

It is obvious that the relative weight of biomedicine and the health sciences is very important in this kind of representation, given the skewed thematic distribution of world research output and the fact that all papers whose authors belong to the same institution are excluded. In the future it will be appropriate to carry out more thorough analyses devoted to specific scientific fields, where possible differences in collaboration patterns can be highlighted; meanwhile, this post can serve as a discussion-provoking contribution.


The research impact of National Higher Education Systems

Felix de Moya Anegon. Research is not the only activity carried out at universities, but we know that the ability of Higher Education Institutions (HEIs) to generate scientific knowledge is an obvious symptom of their general performance. The existence of renowned researchers at a university is not by itself a sign of global quality; however, nowadays it is hard to think of advanced human capital training unconnected to high-quality knowledge generation processes.

In this regard, bibliometric indicators measuring the average quality of HEIs' research output are valuable reference marks of research capability when it comes time to disseminate the acquired knowledge and pass it on to students. For this reason, it makes sense to put HEIs through benchmarking processes using the Normalized Impact (NI)* reached by their research output. It also makes sense to compare the impact distributions of national higher education systems to assess aspects such as the heterogeneity of universities within a country or its comparative average level, so that policymakers can consider whether it is better to sacrifice homogeneity to promote top-class institutions, or whether excellence can be reached by the system itself without giving up equity.

The following chart, generated from the SIR World Report 2010 (based on Scopus data), shows, in a comparative way, the Normalized Impact distributions for all the universities belonging to the world's 50 most productive countries in terms of scientific knowledge.

Chart: Normalized Impact distributions of universities in the world's 50 most productive countries

[Download data: Microsoft Excel | Open Office]

Universities within countries are distributed in quartiles ranging from higher to lower NI. As can be seen in the chart, the 50 countries can be divided into two groups: on one hand, those which have more than 75% of their universities above the world average and, on the other, those which have the same percentage below it. We find 24 countries in the first group, featuring the USA, UK, Germany, France, Canada, Italy and Spain, and 26 in the second one, including China, Japan, Korea, India, Brazil and Russia.
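A sketch of that two-group split in Python; the per-university NI values are invented, and the real chart works with the quartiles of each country's full distribution:

    # Minimal sketch: a country joins the first group when more than 75% of its
    # universities have NI above the world average of 1, and the second group
    # otherwise (a simplification of the quartile chart). Values are invented.
    universities = {
        "Country A": [1.6, 1.4, 1.2, 1.1, 1.05],
        "Country B": [1.3, 0.9, 0.8, 0.7, 0.60],
    }

    for country, nis in universities.items():
        share_above = sum(ni > 1 for ni in nis) / len(nis)
        group = "first" if share_above > 0.75 else "second"
        print(f"{country}: {share_above:.0%} above world average -> {group} group")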

The presence of Japan or Korea in the second group might seem surprising, but it is not. We must consider that the NI distributions show the average value of the international visibility reached by universities' research output in those countries. Therefore, being a technologically developed country is not incompatible with the relatively low research visibility shown by its academic institutions. The reasons for this are varied and this may not be the place to deal with them, but it can be briefly pointed out that, according to the SIR World Report 2010, while one can easily find large Japanese and Korean companies carrying out high-profile research among the world's top 100 institutions ordered by NI (e.g. Toyota, Nippon, Samsung, Toshiba, Fujitsu, Hitachi and Mitsubishi), it is hard to find a Japanese or Korean university even in the top 400 of the same ranking.

It is also worth analyzing the level of homogeneity shown by different national higher education systems. The chart shows some countries with similar NI values across all their universities; these countries, which belong to either the first or the second of the aforementioned groups, show small differences between their extreme NI values, e.g. Belgium and Ukraine. On the contrary, the most heterogeneous countries, from an NI perspective, are those where universities that reach very high impact rates coexist with others presenting very low NI rates. This phenomenon is related not only to the size or complexity of the educational system, but also to the public vs. private balance and the level of openness to the globalized market of each national higher education system.

To conclude, I would mention that a system's ability to attract talent is one of the most important factors supporting the rise of research reputation and, attached to it, the scientific visibility reached by universities around the world.

* Normalized Impact scores indicate the scientific impact that institutions have on the scientific community. In order to obtain a fair measurement of such impact, its calculation removes the influence of institution size and research profile, making it ideal for comparing research performance. [More]


A new venture

We are starting up the Scimago Lab blog with this post. The blog aims to establish itself as the main channel for sharing our particular vision of our work, as well as important issues relating to research evaluation, quantitative analysis of scientific information, scientometrics news and so on. We will also report on and discuss Scimago Lab's product schedule, development, and strategic direction here.

We are confident this blog will be as useful to you as it is exciting for us, and we encourage everyone interested to participate and set out their ideas.

The Scimago Lab staff

Contact


Email: contact@scimago.es

Twitter: @scimago


Scimago Lab is a division of the SRG S.L. company. Copyright 2010-2020.