World’s Top 100 Companies by Scientific Knowledge Output, a Scientometric Characterization (SCOPUS 2003-2010)

To date we have been posting brief scientometric tables of universities and national research agencies. To expand our analysis, the current post is devoted to private companies as scientific agents, so we turn the spotlight to the world's major companies and their research performance. We consider the Top 100 scientific producers and show their indicators related to scientific impact, international collaboration, ability to place their papers in top journals, specialization rate and excellence output.

The research activity carried out at big companies is concentrated in Northern America and Western Europe (44 and 36, respectively, of the Top 100). The list also shows a prominent bias by activity sector: most of the selected companies belong to ICT (46), PHARMACEUTICAL (26) or ENERGY (14).

The following table contains some key scientometric indicators of these companies based on SCOPUS data during the period 2003-2010:

Download Table

 

The indicators included in the table are the following:

Output

The number of scientific papers published in scholarly journals reveals the ability of an institution to create scientific knowledge. In co-authored publications, a score is assigned to each contributing institution through the author's institutional address.

International Collaboration

This indicator shows the ability of institutions to create international research links through the ratio of their output produced in collaboration with foreign institutions. The values are computed by identifying the institution's publications whose affiliations include addresses in more than one country.
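As a rough sketch, the ratio can be computed as follows; the data structure and country codes are invented for illustration and are not actual Scopus fields.

```python
# Illustrative sketch: International Collaboration as the share of papers
# whose author affiliations span more than one country.
# Each paper is represented here by the set of countries in its affiliations.

def international_collaboration(papers):
    """Return the share of papers whose affiliations span more than one country."""
    if not papers:
        return 0.0
    intl = sum(1 for countries in papers if len(set(countries)) > 1)
    return intl / len(papers)

# Toy example: 4 papers, 2 of them with authors from more than one country.
papers = [
    {"ES"},             # domestic only
    {"ES", "DE"},       # international
    {"US"},             # domestic only
    {"US", "FR", "JP"}, # international
]
print(international_collaboration(papers))  # 0.5
```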

Normalized Impact

Normalized Impact scores indicate the scientific impact that institutions have on the scientific community. To obtain a fair measurement of impact, the calculation removes the influence of institution size and research profile, making it well suited to comparing research performance. Normalized Impact values show the ratio between the average scientific impact of an institution and the world average impact of publications of the same time frame, document type and subject category. The values are expressed as ratios relative to the world average, which is set to 1: a score of 0.8 means the institution is cited 20% below the world average, and 1.3 means it is cited 30% above it. Normalized Impact is computed using the methodology established by the Karolinska Institutet in Sweden, where it is named the "Item oriented field normalized citation score average"; the long name reflects the fact that citation values are normalized at the level of the individual article. Further information on the methodology can be found in the Bibliometric Handbook for Karolinska Institutet.
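A minimal sketch of the item-level normalization described above; the world baselines below are invented numbers, not actual Scopus averages.

```python
# Illustrative sketch: each paper's citations are divided by the world
# average for papers of the same year, document type and subject category,
# and the resulting per-paper ratios are averaged.

def normalized_impact(papers, world_avg):
    """papers: list of (citations, (year, doc_type, field)) tuples.
    world_avg: dict mapping (year, doc_type, field) -> world mean citations."""
    scores = [cites / world_avg[key] for cites, key in papers]
    return sum(scores) / len(scores)

papers = [
    (10, (2008, "article", "Chemistry")),    # cited at 2x the world average
    (3,  (2008, "article", "Mathematics")),  # cited at the world average
]
baselines = {
    (2008, "article", "Chemistry"): 5.0,
    (2008, "article", "Mathematics"): 3.0,
}
print(normalized_impact(papers, baselines))  # 1.5: 50% above world average
```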

High Quality Publications Q1

Ratio of publications that an institution publishes in the world's most influential scholarly journals. Journals considered for this indicator are those ranked in the first quartile (25%) of their categories as ordered by the SCImago Journal Rank (SJR) indicator.

Specialization Rate

The Specialization Rate indicates the extent of thematic concentration or dispersion of an institution's scientific output. Values range from 0 to 1, indicating generalist versus specialized institutions respectively. This indicator is computed as a Gini index, as used in economics.
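A minimal sketch of a Gini index over an institution's output per subject field, assuming that is the quantity the indicator is computed on (the post does not spell out the exact formula):

```python
# Illustrative sketch: Gini coefficient over output counts per field.
# 0 means output spread evenly across fields (generalist); values near 1
# mean output concentrated in few fields (specialized).

def gini(values):
    """Gini coefficient of a list of non-negative counts."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0
    # Standard formula based on the cumulative ranked sum.
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * cum) / (n * total) - (n + 1) / n

print(gini([50, 50, 50, 50]))  # 0.0 -- perfectly even across four fields
print(gini([0, 0, 0, 200]))    # 0.75 -- all output in a single field
```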

Excellence

Excellence indicates the number of an institution's papers included in the set formed by the 10% most cited papers in their respective scientific fields. It is a measure of the high-quality output of research institutions.
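The counting rule can be sketched as follows; the citation figures and field distributions are illustrative, and the exact percentile convention used by SCImago may differ.

```python
# Illustrative sketch: count an institution's papers that fall into the
# world's top 10% most cited within each of their fields.

def excellence_count(inst_papers, world_citations):
    """inst_papers: list of (field, citations) pairs for one institution.
    world_citations: dict mapping field -> list of world citation counts."""
    count = 0
    for field, cites in inst_papers:
        world = sorted(world_citations[field])
        # Citation count marking the start of the field's top 10%.
        threshold = world[int(0.9 * len(world))]
        if cites >= threshold:
            count += 1
    return count

# Toy field of 100 papers cited 0..99 times: the top-10% threshold is 90.
world = {"Physics": list(range(100))}
print(excellence_count([("Physics", 95), ("Physics", 10)], world))  # 1
```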

Leadership

Number of research papers in which an author from the company is the main contributor, identified as the paper's corresponding author, who in most cases is expected to be the lead researcher.

 

The largest National Bodies for Research in the world: some performance indicators (2003-2010)

Last year we published a series of posts characterizing, through scientometric methods, the research activity carried out in some of the major National Bodies for Research in Europe. Specifically, the series covered the Consiglio Nazionale delle Ricerche CNR (Italy), the Max Planck Society (Germany), and the Spanish Consejo Superior de Investigaciones Científicas. With the goal of highlighting some global characteristics of the research outcomes of national organizations, we are releasing in this post an informative table containing a selection of performance indicators for the six largest National Bodies for Research (by number of scientific publications):

  • Centre National de la Recherche Scientifique,
  • Chinese Academy of Sciences,
  • Russian Academy of Sciences,
  • Max Planck Gesellschaft,
  • Consejo Superior de Investigaciones Científicas,
  • Consiglio Nazionale delle Ricerche
Félix de Moya Anegón is Research Professor at the Institute of Public Goods and Policies (IPP) of the Spanish National Research Council (CSIC). His academic interests include scientometrics, bibliometrics, research evaluation and science policy; he has published around 100 papers in these fields. He is SCImago Research Group's main researcher, where he has led renowned bibliometric projects including Scimago Journal & Country Rank, Scimago Institution Rankings and The Atlas of Science. Prof. De Moya is also an advisor on science policy issues for national science and technology organizations and research institutions around the world.

Universities by country among the Top 100 by research field based on SCOPUS and SCIMAGO

For this post, we have compiled rankings of universities based on their output in each of the major scientific fields. We then took the Top 100 in each field and broke them down by country. The result is the following table, where each row represents a country and each cell shows how many institutions that country has within the Top 100 in the corresponding field. The ranking is ordered by the average number of institutions each country places in the Top 100. We have used Scopus as the data source, after a disambiguation process of institution names.

Download the table.

Download the full list of universities.

 


Forecasting exercise: What World Scientific Output Will Look Like in 2018

Based on how nations' scientific output increased during the period 2003-2010 (top 50), we propose the following forecasting exercise (PDF). We use this data to obtain a prediction, not necessarily realistic but nonetheless suggestive, of the scientific output scenario for 2018.

Forecasting Exercise

Growth rate (percentage) and Total Output by country during the period 2003-2010 (click on the graph to access the table with all the data of this forecasting exercise)

Source: SCOPUS. Elaboration: SCImago
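The kind of extrapolation behind such an exercise can be sketched by compounding the observed 2003-2010 average annual growth rate forward to 2018; the figures below are illustrative, not the actual table values.

```python
# Illustrative sketch: project a country's 2010 output to 2018 assuming
# it keeps growing at the average annual rate observed over 2003-2010.

def project_output(out_2003, out_2010, target_year=2018):
    """Compound the 2003-2010 average annual growth rate forward."""
    annual_growth = (out_2010 / out_2003) ** (1 / 7)  # seven year-steps
    return out_2010 * annual_growth ** (target_year - 2010)

# A country that doubled its output over 2003-2010 would, under this
# naive assumption, slightly more than double again by 2018:
print(round(project_output(100_000, 200_000)))
```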


Science Indicators of Spanish HEIs 2006-2010

Felix de Moya Anegon. This post is devoted to characterizing the research activity carried out at Spanish Higher Education Institutions over the period 2006-2010. The following table (in PDF format) shows scientometric indicators of research output, international collaboration, normalized impact, percentage of output in first-quartile journals according to the SCImago Journal Rank indicator, and the number of works of excellence (the ordering criterion) published by each institution.

The indicators shown are the following:

Output

The output, or number of scientific papers published in scholarly journals, reveals the ability of an institution to produce scientific knowledge. Output values are affected by institution size and research profile, among other factors. The Output indicator forms the basis for more complex metrics. In co-authored publications, a score is assigned to each contributing institution through the author's institutional address.

International Collaboration

This indicator shows the ability of institutions to create international research links through the ratio of their output produced in collaboration with foreign institutions. The values are computed by identifying the institution's publications whose affiliations include addresses in more than one country.

Normalized Impact

Normalized Impact scores indicate the scientific impact that institutions have on the scientific community. To obtain a fair measurement of impact, the calculation removes the influence of institution size and research profile, making it well suited to comparing research performance. Normalized Impact values show the ratio between the average scientific impact of an institution and the world average impact of publications of the same time frame, document type and subject category. The values are expressed as ratios relative to the world average, which is set to 1: a score of 0.8 means the institution is cited 20% below the world average, and 1.3 means it is cited 30% above it. Normalized Impact is computed using the methodology established by the Karolinska Institutet in Sweden, where it is named the "Item oriented field normalized citation score average"; the long name reflects the fact that citation values are normalized at the level of the individual article. Further information on the methodology can be found in the Bibliometric Handbook for Karolinska Institutet.

High Quality Publications Q1

Ratio of publications that an institution publishes in the world's most influential scholarly journals. Journals considered for this indicator are those ranked in the first quartile (25%) of their categories as ordered by the SCImago Journal Rank (SJR) indicator.

Excellence

Excellence indicates the number of an institution's papers included in the set formed by the 10% most cited papers in their respective scientific fields. It is a measure of the high-quality output of research institutions.


Performance Indicators of Italian CNR Institutes

Felix de Moya Anegon. This is the third post in our series on performance indicators of government organizations and leading national scientific institutions (National Research Councils and Science Academies). After the analyses of the Spanish CSIC and the German Max Planck Society, we now focus on the Italian Consiglio Nazionale delle Ricerche CNR and its research institutes. Mainly publicly funded, Italy's leading scientific institution is a government agency with 107 institutes located throughout the country (four of them have been discontinued) and grouped into 11 thematic departments. In 2009, the scientific output attributable to CNR institutes represented more than 10% of Italian scientific output. As before, we present in this post a table (in PDF format) that, based on Elsevier's Scopus database, shows scientometric indicators of research output, citation impact and international collaboration for the CNR institutes. Its main goal is not to rank the institutes, but to highlight some of their differential characteristics with regard to the research outcomes achieved by these centers.

In order to show trends, the table includes three consecutive 5-year periods (2003-2007, 2004-2008 and 2005-2009). The ordering criterion is the output of the CNR institutes during the period 2005-2009. The institutes are color-marked to indicate which have Normalized Impact (NI) values higher than the NI average over all CNR institutes, which surpass the Italian NI average (but fall below the CNR average), and which fall below the Italian NI average.

The following table provides some background information by showing the same four indicators and periods that you will find in the PDF report, but for the entire CNR organization and for Italy:

CNR         Output   % IC   NI    % Q1
2003-2007   39,949   41.8   1.2   66.0
2004-2008   38,528   42.2   1.2   64.8
2005-2009   37,916   42.7   1.3   63.8

ITALY       Output    % IC   NI    % Q1
2003-2007   292,346   35.8   1.2   54.8
2004-2008   311,598   36.5   1.2   53.9
2005-2009   328,826   37.2   1.2   53.1

The indicators shown are the following:

Output

The output, or number of scientific papers published in scholarly journals, reveals the ability of an institution to produce scientific knowledge. Output values are affected by institution size and research profile, among other factors. The Output indicator forms the basis for more complex metrics. In co-authored publications, a score is assigned to each contributing institution through the author's institutional address.

International Collaboration IC(%)

This indicator shows the ability of institutions to create international research links through the ratio of their output produced in collaboration with foreign institutions. The values are computed by identifying the institution's publications whose affiliations include addresses in more than one country.

Normalized Impact NI

Normalized Impact scores indicate the scientific impact that institutions have on the scientific community. To obtain a fair measurement of impact, the calculation removes the influence of institution size and research profile, making it well suited to comparing research performance. Normalized Impact values show the ratio between the average scientific impact of an institution and the world average impact of publications of the same time frame, document type and subject category. The values are expressed as ratios relative to the world average, which is set to 1: a score of 0.8 means the institution is cited 20% below the world average, and 1.3 means it is cited 30% above it. Normalized Impact is computed using the methodology established by the Karolinska Institutet in Sweden, where it is named the "Item oriented field normalized citation score average"; the long name reflects the fact that citation values are normalized at the level of the individual article. Further information on the methodology can be found in the Bibliometric Handbook for Karolinska Institutet.

High Quality Publications Q1(%)

Ratio of publications that an institution publishes in the world's most influential scholarly journals. Journals considered for this indicator are those ranked in the first quartile (25%) of their categories as ordered by the SCImago Journal Rank (SJR) indicator.


Performance Indicators of German Max Planck Institutes

Felix de Moya Anegon. With the previous post on Performance Indicators of Spanish CSIC Research Institutes, we started a series of posts intended to characterize, through scientometric methods, the research activity carried out at large government organizations and leading national scientific institutions (National Research Councils and Science Academies), mainly in countries of Europe and Asia. We now turn to the German Max Planck Society and its research institutes (Max Planck Institutes, MPIs). The Max Planck Society is not a government institution; it is a registered association with its registered seat in Berlin. As of January 1, 2011, the Max Planck Society promotes research in 80 institutes and research facilities of its own; four institutes and one research facility are situated abroad. The following table (in PDF format) shows scientometric indicators of research output, citation impact and international collaboration for the MPIs, based on Elsevier's Scopus database. Its main goal is not to rank the MPIs, but to highlight some of their differential characteristics with regard to the research outcomes achieved by these institutes.

In order to show trends, the table includes three consecutive 5-year periods (2003-2007, 2004-2008 and 2005-2009). The ordering criterion is the output of the MPIs during the period 2005-2009. The institutes are color-marked to indicate which have Normalized Impact (NI) values higher than the NI average over all MPIs, which surpass the German NI average (but fall below the MPI average), and which fall below the German NI average.

The following table provides some background information by showing the same four indicators and periods that you will find in the PDF report, but for the entire Max Planck organization and for Germany:

MAX PLANCK   Output   % IC   NI    % Q1
2003-2007    46,111   64.1   1.8   73.5
2004-2008    48,338   64.5   1.8   72.7
2005-2009    50,038   65.0   1.8   72.2

GERMANY     Output    % IC   NI    % Q1
2003-2007   577,881   41.2   1.3   52.6
2004-2008   609,033   42.1   1.3   52.3
2005-2009   634,385   43.1   1.3   52.0

The indicators shown are the following:

Output

The output, or number of scientific papers published in scholarly journals, reveals the ability of an institution to produce scientific knowledge. Output values are affected by institution size and research profile, among other factors. The Output indicator forms the basis for more complex metrics. In co-authored publications, a score is assigned to each contributing institution through the author's institutional address.

International Collaboration IC(%)

This indicator shows the ability of institutions to create international research links through the ratio of their output produced in collaboration with foreign institutions. The values are computed by identifying the institution's publications whose affiliations include addresses in more than one country.

Normalized Impact NI

Normalized Impact scores indicate the scientific impact that institutions have on the scientific community. To obtain a fair measurement of impact, the calculation removes the influence of institution size and research profile, making it well suited to comparing research performance. Normalized Impact values show the ratio between the average scientific impact of an institution and the world average impact of publications of the same time frame, document type and subject category. The values are expressed as ratios relative to the world average, which is set to 1: a score of 0.8 means the institution is cited 20% below the world average, and 1.3 means it is cited 30% above it. Normalized Impact is computed using the methodology established by the Karolinska Institutet in Sweden, where it is named the "Item oriented field normalized citation score average"; the long name reflects the fact that citation values are normalized at the level of the individual article. Further information on the methodology can be found in the Bibliometric Handbook for Karolinska Institutet.

High Quality Publications Q1(%)

Ratio of publications that an institution publishes in the world's most influential scholarly journals. Journals considered for this indicator are those ranked in the first quartile (25%) of their categories as ordered by the SCImago Journal Rank (SJR) indicator.


Performance Indicators of Spanish CSIC Research Institutes

Felix de Moya Anegon. The major European national research councils (the Max Planck Society in Germany, the Centre National de la Recherche Scientifique (CNRS) in France, Italy's Consiglio Nazionale delle Ricerche (CNR) and Spain's Consejo Superior de Investigaciones Científicas (CSIC)), together with the science academies of Eastern European and Asian countries including China, Russia and Ukraine, have highly complex institutional structures consisting of tens or hundreds of research institutes with diverse scientific missions. From a scientometric point of view, the characterization of such institutional complexity demands the use of performance indicators that go beyond global structures and put the focus on each institute, research center and research laboratory that makes up these scientific organizations.

With this in mind, this post starts a series devoted to characterizing the research activity carried out at national research councils and science academies all over the world, through scientometric methods and from an aggregated, institute-level perspective. The following table (in PDF format) shows scientometric indicators of research output, impact and collaboration for Spain's CSIC research institutes, based on Elsevier's Scopus database. Its main goal is not to rank CSIC centers, but to highlight some of the differential characteristics of the research outcomes achieved by these institutes.

In order to show value trends, the table includes three consecutive 5-year periods (2003-2007, 2004-2008 and 2005-2009). The ranking criterion is the output of the institutions during the period 2005-2009. The institutes are also color-marked to indicate which have Normalized Impact values higher than the CSIC NI average, which surpass the Spanish NI average, and which fall below it.

Beforehand, to help put the information in the table into context, we include here the same indicators for the entire CSIC institution and for Spain. CSIC represents 15.3% of the total Spanish research output:

CSIC        Output   % IC   NI    % Q1
2003-2007   35,429   47.8   1.4   69.9
2004-2008   38,665   48.6   1.4   69.5
2005-2009   41,929   49.4   1.4   68.7

SPAIN       Output    % IC   NI    % Q1
2003-2007   230,731   33.3   1.1   49.3
2004-2008   252,422   33.9   1.1   48.4
2005-2009   273,482   34.8   1.1   47.8

The indicators shown are the following:

Output

The output, or number of scientific papers published in scholarly journals, reveals the ability of an institution to produce scientific knowledge. Output values are affected by institution size and research profile, among other factors. The Output indicator forms the basis for more complex metrics. In co-authored publications, a score is assigned to each contributing institution through the author's institutional address.

International Collaboration IC(%)

This indicator shows the ability of institutions to create international research links through the ratio of their output produced in collaboration with foreign institutions. The values are computed by identifying the institution's publications whose affiliations include addresses in more than one country.

Normalized Impact NI

Normalized Impact scores indicate the scientific impact that institutions have on the scientific community. To obtain a fair measurement of impact, the calculation removes the influence of institution size and research profile, making it well suited to comparing research performance. Normalized Impact values show the ratio between the average scientific impact of an institution and the world average impact of publications of the same time frame, document type and subject category. The values are expressed as ratios relative to the world average, which is set to 1: a score of 0.8 means the institution is cited 20% below the world average, and 1.3 means it is cited 30% above it. Normalized Impact is computed using the methodology established by the Karolinska Institutet in Sweden, where it is named the "Item oriented field normalized citation score average"; the long name reflects the fact that citation values are normalized at the level of the individual article. Further information on the methodology can be found in the Bibliometric Handbook for Karolinska Institutet.

High Quality Publications Q1(%)

Ratio of publications that an institution publishes in the world's most influential scholarly journals. Journals considered for this indicator are those ranked in the first quartile (25%) of their categories as ordered by the SCImago Journal Rank (SJR) indicator.


Scientific Excellence Georeferenced. The neighborhood matters.

While the current dynamics of worldwide science output are bringing to the surface research institutions from developing countries with important results, the location of the institutions producing excellent outcomes, that is, those representing truly significant scientific advances, does not seem to be changing.

Although a formal study would need a detailed analysis by scientific fields, an exploratory analysis of the group formed by the Research Institutions which annually publish 100 scientific papers or more (3,000 worldwide, as indexed in Scopus database) reveals that the resulting geographic distribution has a strong bias which is explained by research spending patterns in different regions of the world. There is a large concentration of institutions in North America, Europe, India, China and Japan, and to a lesser extent, in the southern hemisphere, in Chile, Argentina and more prominently in Brazil, South Africa and Australia.

Neighborhood influence

By depicting research institutions grouped into four levels of Normalized Impact (NI), a picture like the one shown of the geographic distribution of research excellence around the world is obtained. Taking into account that, with rare exceptions, institutions reaching NI scores higher than the world average (values above 1) are concentrated in North America, Western Europe and Australia/New Zealand, we must conclude that if the geographic bias in scientific output is high, the bias affecting scientific impact is even higher.

NI world map

Impact can be considered to reflect the use researchers make of previously generated scientific knowledge. With this in mind, the map suggests that the regions that produce the most (Western Europe and North America) primarily use the knowledge generated in their own area. This explains the concentration of high impact in highly productive regions, and implies that research institutions in the world's most productive regions accumulate a reputational capital that is due to their geographical context and is, conversely, unattainable for institutions located in less productive regions. It remains to be seen what will happen with China, a newcomer to the elite of the world's most productive scientific countries. Put another way, the neighborhood of a research institution affects the scientific reputation it can achieve in global terms, unless it can go beyond its neighborhood through inter-regional alliances with reputed institutions from highly productive regions.

Scientific Dependence

For developing countries, an unintended consequence of the need to collaborate with researchers from highly productive regions is what could be called "scientific dependence". Even though it is difficult to measure the role played by researchers in scientific works by studying the affiliation fields alone, what we do know is that certain research institutions show extremely high rates of International Collaboration (IC), which can indeed be associated with situations of scientific dependence. For example, when an institution's IC reaches or exceeds 80% of its total output, so that its exclusive production becomes marginal, it clearly operates in a situation of scientific dependence. Furthermore, when an institution shows high collaboration rates together with an outstanding impact (NI) within its own country and/or region, such impact is most likely due to the collaboration, and it may be concluded that an external scientific dependence exists.

Some countries in Latin America constitute an interesting case of scientific dependence, where lower scientific development, a high international collaboration rate and a relatively high impact on a regional scale come together. Note the high correlation between IC and NI values in the following table.

 

Country       Output    IC      NI
Brazil        135,259   25.35   0.77
Mexico        52,527    38.65   0.75
Argentina     34,927    42.93   0.90
Chile         20,534    52.64   0.95
Venezuela     8,489     44.40   0.65
Colombia      8,292     53.90   0.83
Cuba          6,915     39.38   0.46
Puerto Rico   4,476     57.95   1.00
Uruguay       2,711     64.66   0.95
Peru          2,654     75.77   1.13
Costa Rica    1,946     71.17   1.16
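The correlation mentioned above can be checked directly on the table's figures with a plain Pearson coefficient:

```python
# IC and NI values for the eleven Latin American countries in the table,
# in the same row order (Brazil through Costa Rica).
ic = [25.35, 38.65, 42.93, 52.64, 44.40, 53.90, 39.38, 57.95, 64.66, 75.77, 71.17]
ni = [0.77, 0.75, 0.90, 0.95, 0.65, 0.83, 0.46, 1.00, 0.95, 1.13, 1.16]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

print(round(pearson(ic, ni), 2))
```

On these eleven countries the coefficient comes out strongly positive (around 0.78), consistent with the association between collaboration and impact described above.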

Science Indicators of Spanish Public Universities

We are releasing here a small post with some interesting indicators that can help shed light on the performance of the Spanish scientific system, specifically those related to publicly funded universities.

The table is available as a pdf and can be downloaded here: Science Indicators of Spanish Public Universities

The list below is a description of the indicators you will find on the table.

  • Normalized Impact (NI): (ordering criteria) weighted citation average of universities. By normalizing citation counts for differences among fields this indicator represents, in an indirect way, the Average Scientific Quality of the universities. Analysis Period: 2005-2009
  • Weighted Scientific Production (WSP): It is the result of assessing each scientific paper by its Normalized Impact and then adding them up for an entire institution. In this way, each paper counts higher than 1 when cited above world average (in its field, type and year) and lower than 1 when cited below. Analysis Period: 2008
  • Equivalent Full-Time Faculty Productivity (EFTF-P): Scientific Output by Equivalent Full-Time Faculty. Year 2008.
  • Total Expenditure in Research (TER): Combination of the different Research Expenditure Items at a University. Year 2008.
  • Expenditure per Paper (EP): Average expenditure per published scientific paper at a University (considering its Weighted Scientific Production). Year 2008.
  • Total Equivalent Full-Time Faculty (EFTF): Number of Full-Time Academics by converting part-time to full-time equivalents. Year 2008.
  • Expenditure per Equivalent Full-Time Faculty (TER/EFTF): Relationship between Total Expenditure in Research (TER) and Total Equivalent Full-Time Faculty (EFTF). Year 2008.

Overall thoughts about Normalized Impact

The evolution of the Normalized Impact of Spanish universities highlights the overall improvement in the average impact of the Spanish system, as well as a slight decrease among the top universities. It remains true that the whole public system is balanced and comparable with scientifically developed countries, specifically when compared with some English-speaking countries. More than 75% of Spanish public universities are above the world average impact; however, none of them reaches an average impact more than 50% above the world average.

Chart: Evolution of Normalized Impact in Spanish Universities

The following chart relates the average productivity of academics to the Normalized Impact of universities (columns EFTF-P and NI in the table) and shows a clear positive relationship between these variables. In other words, in universities where academics' quantitative output in terms of scientific publications is higher, their qualitative outcomes are higher too. This means that as faculty publication activity becomes more intense, the average quality of outcomes also rises, partly as a consequence of the greater chances of scientific synergies within the institution.

Professors' productivity and the Normalized Impact of universities

Contact

scimago lab

mail contact@scimago.es

twitter @scimago

twitter Our blog

Scimago Lab is a division of the SRG S.L. company. Copyright 2010-2020.