- Journal information
- First published
- 30 Mar 2017
- Publication frequency
- 4 times per year
- Open Access
Page range: 1 - 13
New developments in the study of delayed recognition are discussed.
Based on these new developments a method is proposed to characterize delayed recognition as a fuzzy concept.
A benchmark value of 0.333, corresponding to linear growth, is obtained. Moreover, a case is discovered in which an expert found delayed recognition several years before citation analysis could discover this phenomenon.
Like all citation studies, this one is database dependent.
Delayed recognition is turned into a fuzzy concept.
The article presents a new way of studying delayed recognition.
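The 0.333 benchmark can be illustrated with a small sketch. Assuming (this is our reading, not stated explicitly above) that the underlying quantity is the normalized area under a publication's cumulative citation curve: when yearly citations grow linearly, the cumulative curve is quadratic, and that area tends to 1/3.

```python
def normalized_cumulative_area(yearly_citations):
    """Area under the normalized cumulative citation curve.

    Both axes are scaled to [0, 1]; the area is approximated as the
    mean of the normalized cumulative citation counts.
    """
    cumulative, total = [], 0
    for c in yearly_citations:
        total += c
        cumulative.append(total)
    return sum(v / total for v in cumulative) / len(cumulative)

# Linearly growing yearly citations: the cumulative curve is quadratic
# and the normalized area approaches 1/3 (the 0.333 benchmark).
linear = list(range(1, 1001))
print(round(normalized_cumulative_area(linear), 3))  # prints 0.334
```

A "sleeping beauty", whose citations arrive late, has a strongly convex cumulative curve and an area well below 1/3; a paper cited early and then forgotten has a concave curve and an area above 1/2.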
- Delayed recognition
- Sleeping beauty
- Fuzzy membership function
- Open Access
A comparison of citation disciplinary structure in science between the G7 countries and the BRICS countries
Page range: 14 - 30
This study aims to compare the characteristics of citation disciplinary structure between the G7 countries and the BRICS countries.
In this contribution, which uses about 1 million Web of Science publications and two publication years (1993 and 2013), we compare the G7 countries and the BRICS countries with regard to this type of structure. For the publication year 2013, cosine similarity values regarding the citation disciplinary structures of these countries (and of nine other countries) were used as input to cluster analysis. We also obtained cosine similarity values for a given country and its citation disciplinary structures across the two publication years. Moreover, for the publication year 2013, the within-country Jeffreys-Matusita distance between publication and citation disciplinary structure was measured.
First, the citation disciplinary structures of countries depend on multiple and complex factors. It is therefore difficult to completely explain the formation and change of the citation disciplinary structure of a country. This study suggests some possible causes, whereas detailed explanations might be given by future research. Second, the length of the citation window used in this study is three years. However, scientific disciplines differ in their citation practices. Comparison between citations across disciplines using the same citation window length may affect the citation discipline structure results for some countries.
First, the results of this study are based on the WoS database. However, in this database some fields are covered to a greater extent than others, which may affect the citation disciplinary structure results for some of the studied countries. In future research, we might repeat this study using another database (such as Scopus) and, in that case, compare the two outcomes. Second, the use of a constant journal set meant that a large share of the journals covered by WoS in 2013 is ignored in the study. Thus, disciplinary structure is studied based on a quite restricted set of publications. These limitations should be kept in mind when the results of this study are interpreted.
Disciplinary structure at the country level is a salient topic for S&T policy makers, especially those from developing countries. This study examines disciplinary structure from the perspective of academic impact, and the results provide evidence for decisions on discipline strategy and funding allocation. In addition, the Jeffreys-Matusita distance is introduced to measure the similarity between citation disciplinary structure and publication disciplinary structure. By applying this measure, some new observations were drawn; for example, based on the comparison of publication and citation disciplinary structures, the paper finds that most BRICS countries have less impact despite more publications.
The outcome of the cluster analysis indicates that the G7 countries and the BRICS countries are quite heterogeneous regarding their citation disciplinary structure. For a majority of the G7 countries, the citation disciplinary structure tends to be more stable than for the BRICS countries with regard to the years 1993 and 2013. Most G7 countries, with the United States as an exception, turned out to have lower values on the Jeffreys-Matusita distance than the BRICS countries, indicating a higher degree of heterogeneity between the publication and the citation disciplinary structure for the latter countries. In other words, the BRICS countries still receive far fewer citations in most disciplines than their publication output would suggest, whereas the G7 countries can expect more citations than their publication output would suggest, thereby generating relatively more impact than the BRICS countries.
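The two measures used in the analysis are easy to state concretely. Below is a minimal sketch using hypothetical four-discipline share vectors; the study's actual discipline classification is much finer, and the discrete Matusita form shown here is one common variant of the Jeffreys-Matusita distance, which may differ in detail from the study's formulation.

```python
import math

# Hypothetical disciplinary share vectors for one country
# (invented for illustration; each vector sums to 1).
pub_shares = [0.40, 0.30, 0.20, 0.10]  # publication disciplinary structure
cit_shares = [0.25, 0.35, 0.25, 0.15]  # citation disciplinary structure

def cosine_similarity(p, q):
    # Cosine of the angle between two structure vectors: 1 = identical shape.
    dot = sum(a * b for a, b in zip(p, q))
    norms = math.sqrt(sum(a * a for a in p)) * math.sqrt(sum(b * b for b in q))
    return dot / norms

def matusita_distance(p, q):
    # Discrete Matusita distance between two probability distributions:
    # sqrt( sum_i (sqrt(p_i) - sqrt(q_i))^2 ); 0 = identical, sqrt(2) = disjoint.
    return math.sqrt(sum((math.sqrt(a) - math.sqrt(b)) ** 2 for a, b in zip(p, q)))

print(round(cosine_similarity(pub_shares, cit_shares), 3))  # prints 0.949
print(round(matusita_distance(pub_shares, cit_shares), 3))  # prints 0.165
```

A larger within-country Matusita distance between the two vectors signals a stronger mismatch between where a country publishes and where it is cited, which is the quantity compared across the G7 and BRICS countries above.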
- Citation disciplinary structure
- Open Access
Minimum Representative Size in Comparing Research Performance of Universities: the Case of Medicine Faculties in Romania
Page range: 32 - 42
The main goal of this study is to provide a reliable comparison of performance in higher education. In this respect, we use scientometric measures associated with the faculties of medicine in the six health studies universities in Romania.
The method to estimate the minimum necessary size, proposed in Shen et al. (2017), is applied in this article. We collected data from the Scopus database for the academics of the departments of medicine within the six health studies universities in Romania during the period 2009 to 2014. Two kinds of statistical treatments based on that method are implemented: pairwise comparison and one-to-the-rest comparison. All the results of these comparisons are shown.
According to the results, Cluj and Tg. Mureş show the best and the worst performance, respectively, since they have reasonably small values of the minimum representative size in both kinds of comparison and for all citation indexes.
Only the six faculties of medicine in the health studies universities in Romania are analyzed.
Our methods of comparison play an important role in ranking data sets associated with different collective units, such as faculties, universities, and institutions, based on aggregate scores such as the mean or the total.
We applied the minimum representative size to a new empirical context: that of the departments of medicine in the health studies universities in Romania.
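The idea behind a minimum representative size can be sketched with bootstrap resampling. The code below is a loose illustration under our own assumptions (hypothetical citation counts and a simple win-rate criterion); it is not the exact procedure of Shen et al. (2017).

```python
import random

def minimum_representative_size(scores_a, scores_b, trials=2000, level=0.95, seed=42):
    """Smallest bootstrap sample size m such that the mean of a size-m
    resample of scores_a exceeds the mean of a size-m resample of
    scores_b in at least `level` of the trials.

    A small m means the two units are reliably distinguishable even
    from small samples; None means no size up to min(len(a), len(b))
    separates them at the requested level.
    """
    rng = random.Random(seed)
    n = min(len(scores_a), len(scores_b))
    for m in range(1, n + 1):
        wins = sum(
            sum(rng.choices(scores_a, k=m)) / m > sum(rng.choices(scores_b, k=m)) / m
            for _ in range(trials)
        )
        if wins / trials >= level:
            return m
    return None

# Hypothetical per-academic citation counts for two faculties.
faculty_high = [12, 15, 9, 20, 14, 11, 18, 13, 16, 10]
faculty_low = [3, 5, 2, 6, 4, 3, 7, 2, 5, 4]
print(minimum_representative_size(faculty_high, faculty_low))  # prints 1
```

In a pairwise comparison, this procedure is run for each pair of units; in a one-to-the-rest comparison, one unit's scores are compared against the pooled scores of all the others.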
- Research evaluation
- Minimum representative size
- Bootstrap sampling
- Medicine departments
- Open Access
Page range: 43 - 80
Recently, a vast number of scientific publications have been produced in cities in emerging countries. It has long been observed that the publication output of Beijing has exceeded that of any other city in the world, including such leading centres of science as Boston, New York, London, Paris, and Tokyo. Researchers have suggested that, instead of focusing on cities’ total publication output, the quality of the output in terms of the number of highly cited papers should be examined. However, in the period from 2014 to 2016, Beijing produced as many highly cited papers as Boston, London, or New York. In this paper, another method is proposed to measure cities’ publishing performance by focusing on cities’ publishing efficiency (i.e., the ratio of highly cited articles to all articles produced in that city).
First, 554 cities are ranked based on their publishing efficiency, then some general factors influencing cities’ publishing efficiency are revealed. The general factors examined in this paper are as follows: the linguistic environment of cities, cities’ economic development level, the location of excellent organisations, cities’ international collaboration patterns, and their scientific field profile. Furthermore, the paper examines the fundamental differences between the general factors influencing the publishing efficiency of the top 100 most efficient cities and the bottom 100 least efficient cities.
Based on the research results, the conclusion can be drawn that a city's publishing efficiency will be high if it meets the following general conditions: it is in a country in the Anglosphere–Core; it is in a high-income country; it is home to top-ranked universities and/or world-renowned research institutions; researchers affiliated with that city collaborate most intensely with researchers affiliated with cities in the United States, Germany, England, France, Canada, Australia, and Italy; and its highly cited articles appear mostly in high-impact multidisciplinary journals, in health sciences disciplines (especially general internal medicine and oncology), and in natural sciences disciplines (especially physics, astronomy, and astrophysics).
It is always problematic to demarcate the boundaries of cities (e.g., New York City vs. Greater New York), and there is no consensus among researchers regarding this issue. The Web of Science presents city names as reported in the addresses by the authors of publications. In this paper, cities correspond to the spatial units between the country/state level and the institution level as indicated in the Web of Science. Furthermore, it is necessary to highlight that the Web of Science is biased towards English-language journals and journals published in the field of biomedicine. These facts may influence the outcome of the research.
Publishing efficiency, as an indicator, shows how successful a city is at the production of science. Naturally, cities have limited opportunities to compete for components of the science establishment (e.g., universities, hospitals). However, cities can compete to attract innovation-oriented companies, high-tech firms, and the R&D facilities of multinational companies by, for example, establishing science parks. The positive effect of this process on a city's performance in science can be observed in the example of Beijing, whose publishing efficiency has increased rapidly.
Previous scientometric studies have examined cities' publication output in terms of the number of papers or the number of highly cited papers, which are largely size-dependent indicators; this paper, however, attempts to present a more quality-based approach.
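The efficiency indicator itself is a simple ratio, which is what makes it size-independent. A minimal sketch with invented city names and counts (the real study ranks 554 cities this way):

```python
# Hypothetical city-level counts, invented for illustration.
cities = {
    "City A": {"articles": 50000, "highly_cited": 1200},
    "City B": {"articles": 8000, "highly_cited": 400},
    "City C": {"articles": 20000, "highly_cited": 600},
}

def publishing_efficiency(counts):
    # Ratio of highly cited articles to all articles produced in the city.
    return counts["highly_cited"] / counts["articles"]

# Rank cities by efficiency: a small city with a high share of highly
# cited papers can outrank a far larger producer.
ranked = sorted(cities, key=lambda c: publishing_efficiency(cities[c]), reverse=True)
for city in ranked:
    print(city, round(publishing_efficiency(cities[city]), 3))
```

Here "City B" ranks first with an efficiency of 0.05 despite having the smallest output, illustrating why this measure complements size-dependent counts.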
- Publishing efficiency
- Spatial scientometrics
- Highly cited articles
- Web of Science
- Open Access
Page range: 81 - 98
To get a better understanding of the way in which university rankings are used.
A detailed analysis of the activities of visitors to the website of the CWTS Leiden Ranking.
Visitors to the Leiden Ranking website originate disproportionately from specific countries. They are more interested in impact indicators than in collaboration indicators, while they are about equally interested in size-dependent and size-independent indicators. Many visitors do not seem to realize that they should decide for themselves which criterion they consider most appropriate for ranking universities.
The analysis is restricted to the website of a single university ranking. Moreover, it does not provide any detailed insights into the motivations of visitors to university ranking websites.
The Leiden Ranking website may need to be improved in order to make it clearer to visitors that they should decide for themselves which criterion they want to use for ranking universities.
This is the first analysis of the activities of visitors to a university ranking website.
- CWTS Leiden Ranking
- University ranking
- University ranking website
- Web log analysis