Hi readers!
You have probably heard the term “Web of Science” (WoS), especially if you are a scientist or an educator.
WoS began in 1964 as the Science Citation Index (SCI), created by Dr. Eugene Garfield and the Institute for Scientific Information (ISI) to revolutionize literature searches through citation mapping. It later evolved into the Web of Science (WoS), an organized citation database offering high-quality, peer-reviewed research, particularly strong in the sciences.
Its advantages include comprehensive citation tracking, strong bibliometric tools for research evaluation, curated high-impact content, multiple reference databases, and citation data drawn from academic journals, conference proceedings, and other scholarly documents across many disciplines. It is currently owned by Clarivate and contains 79 million records in its Core Collection and 171 million records across the platform, including citation indexes and journal impact factors.
However, it is a paid subscription service, has a more selective approach compared to Google Scholar, and may exhibit bias toward English-language and Western-centric publications.
A citation index is built on the idea that citations link related research items and lead readers to related scientific literature. It also makes it possible to locate the literature with the greatest impact in a particular field. For example, a paper’s influence can be gauged by linking it to all the papers that have cited it. In this way, current trends, patterns, and emerging research fields can be assessed. According to Eugene Garfield, the “father of citation indexing of academic literature”:
“Citations are the formal and unambiguous linkages between papers that have particular points in common. Citation indexing is built around these linkages and lists publications that have been cited while also identifying the sources of the citations. Every paper found provides a list of new citations with which to continue the search.” The simplicity of citation indexing is one of its main strengths.
In November 2009, Thomson Reuters introduced the Century of Social Sciences, a service that contains files capturing social sciences research dating back to the beginning of the 20th century. Because of this, WoS now has indexing coverage from 1900 to the present.
As of February 2017, the multidisciplinary coverage of WoS encompassed over a billion cited references and 90 million records, covering more than 12,000 high-impact journals and 8.2 million records across 160,000 conference proceedings, with 15,000 proceedings added each year.
However, WoS does not index all journals.
CiteScore (a comprehensive journal-level metric from Scopus, Elsevier) measures the average citations received by all items published in a journal over a four-year period. Unlike the Journal Impact Factor, it includes articles, reviews, conference papers, letters, and notes, and it is updated annually with monthly tracker updates. There is a significant positive correlation between the Impact Factor and other journal evaluation metrics. However, an analysis by Elsevier identified 216 journals from 70 publishers that were in the top 10 percent of the most-cited journals in their subject categories, yet did not have an Impact Factor.
It appears that the Impact Factor does not provide comprehensive and unbiased coverage of high-quality journals.
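The CiteScore calculation described above can be sketched in a few lines. This is a minimal illustration of the published formula (citations in the four-year window divided by items published in that window); the function name and the sample figures are invented for the example, not taken from Scopus data.

```python
def cite_score(citations_to_window: int, items_in_window: int) -> float:
    """Average citations per item over a four-year publication window.

    CiteScore(Y) = citations received in years Y-3..Y to items
    published in Y-3..Y, divided by the number of those items.
    """
    if items_in_window == 0:
        raise ValueError("no published items in the window")
    return citations_to_window / items_in_window

# Hypothetical journal: 800 items published in 2020-2023 that
# together received 3,200 citations in 2020-2023.
print(cite_score(3200, 800))  # 4.0
```

Because every document type in the window counts in the denominator, a journal cannot raise its CiteScore simply by publishing more uncited front matter, which is one way it differs from the Journal Impact Factor.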
While marketed as global reference tools, Scopus and WoS have been characterized as “structurally biased against research produced in non-Western countries, non-English language research, and research from the arts, humanities, and social sciences.”
The WoS database has a Core Collection consisting of three online indexing databases:
- Science Citation Index: Covers more than 9,200 journals across 178 scientific disciplines. Coverage is from 1900 to the present, with over 53 million records.
- Book Citation Index (BCI): Covers more than 116,000 editorially selected books. Coverage is from 2005 to the present, with over 53.2 million records.
- Conference Proceedings Citation Index (CPCI): Covers more than 205,000 conference proceedings. Coverage is from 1990 to the present, with over 70.1 million records.

Since 2008, the platform has also added six regional online citation databases, including:
- Chinese Science Citation Database, produced in partnership with the Chinese Academy of Sciences,
- SciELO Citation Index (established in 2013) covering Brazil, Spain, Portugal, the Caribbean, South Africa, and 12 additional Latin American countries,
- Korean Citation Index (2014), with updates from the National Research Foundation of Korea,
- Russian Science Citation Index (2015),
- Arabic Regional Citation Index (2020).
- (There is no South Asian citation index.)
We could create our own citation index with the help of the Pakistan Academy of Sciences and share it with the Arabic Regional or Chinese Citation Index, if the HEC and relevant authorities wish to do so.
The citation indexes listed above contain references that have been cited by other articles. Researchers may use them to conduct searches to locate articles that cite earlier or current publications. Searches can be made by topic, author, source title, or location.
Two chemistry databases, Index Chemicus and Current Chemical Reactions, allow users to create structure drawings, enabling them to locate chemical compounds and reactions.
The literature indexed includes scholarly books, peer-reviewed journals, original research articles, reviews, editorials, chronologies, abstracts, and other items from disciplines such as agriculture, biological sciences, engineering, medical and life sciences, physical and chemical sciences, anthropology, law, library sciences, architecture, dance, music, film, and theater.
WoS also includes BIOSIS and the Zoological Record, an electronic index of zoological literature that serves as the unofficial register of scientific names in zoology.
Citation analysis examines the frequency, patterns, and graphs of citations in documents, using citation links from one document to another to reveal properties of the documents and to identify the most important documents in a collection.
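The idea of using citation links to surface important documents can be shown on a toy citation graph. The paper IDs and links below are invented purely for illustration; real citation analysis runs the same in-degree count (and more sophisticated measures) over millions of records.

```python
from collections import Counter

# Each pair (a, b) means "paper a cites paper b" (toy data).
citations = [
    ("P1", "P3"), ("P2", "P3"), ("P4", "P3"),
    ("P2", "P1"), ("P4", "P1"), ("P4", "P2"),
]

# In-degree = number of times each paper is cited.
cited_counts = Counter(target for _, target in citations)

# The most-cited paper is a rough proxy for the most
# influential document in this small collection.
most_cited, count = cited_counts.most_common(1)[0]
print(most_cited, count)  # P3 3
```

In practice, raw citation counts are only a starting point; as the criticisms below note, they must be read against field norms and other factors.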
However, criticism has been raised regarding the questionable use of citation analysis to compare the impact of different scholarly articles without considering other factors that may affect citation patterns.
(I have personally experienced such practices while searching through Google Scholar, impact factors, and citations of each paper I published, and I observed how citations are managed. I can demonstrate this.)
One recent criticism focuses on “field-dependent factors”: citation practices vary from one area of science to another, and even between fields within a discipline. I have observed this in my own field (wide hybridization), a specialized area in which relatively few researchers work. If a search for one of my papers shows 55 citations while a similar paper from another country earns hundreds, one may question the integrity of the paper and its citation patterns, not of Google, which simply displays the available data.
Dear readers, this is how citations are sometimes managed.
Like other scientific approaches, scientometrics and bibliometrics have their own limitations; in 2010, they were criticized for deficiencies in the calculation processes underlying the Thomson Reuters Web of Science.
For example, citation distributions are often highly skewed toward established journals. Journal Impact Factor properties are field-specific and can be manipulated by editors, either by prioritizing review articles (which are cited more frequently) or by altering editorial policies, making the process less transparent.
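The skew described above is easy to see with a toy distribution. The citation counts below are invented for illustration: a couple of highly cited papers pull the mean far above the median, which is exactly why a journal-level average can misrepresent a typical article.

```python
from statistics import mean, median

# Hypothetical citation counts for ten papers in one journal:
# most are cited a handful of times, two are cited heavily.
citations_per_paper = [0, 0, 1, 1, 2, 2, 3, 4, 120, 180]

print(mean(citations_per_paper))    # 31.3
print(median(citations_per_paper))  # 2.0
```

A mean of 31.3 against a median of 2 shows how a few outliers can dominate an average-based metric such as the Impact Factor.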
The accepted view is that, for greater accuracy, journal-level metrics must be supplemented with article-level metrics and peer review.
Studies on methodological quality and reliability have found that the reliability of published research in several fields may decrease with increasing journal rank. Thomson Reuters responded that “no one metric can fully capture the complex contributions scholars make to their disciplines, and many forms of scholarly achievement should be considered.”
The San Francisco Declaration on Research Assessment (DORA) recognizes the need to improve the ways researchers and scholarly outputs are evaluated and denounces the practice of correlating the Journal Impact Factor with the merits of a specific scientist’s contributions.
According to DORA, this practice creates bias and inaccuracies in research evaluation because the Impact Factor is a flawed and easily manipulated metric that poorly measures individual article quality or researcher impact. It suffers from disciplinary bias, skewed citation distributions (where a few papers drive high scores), and the promotion of “salami slicing.” DORA argues that it should not be used for hiring or funding decisions.
DORA advises the use of article-level metrics that focus on citations of specific papers rather than journal averages when evaluating research output.
For qualitative assessment, expert panel evaluations should be used instead of relying solely on numerical, automated metrics.
DORA also states that
“the Impact Factor should not be used as a surrogate measure of the quality of individual research articles (which sometimes may result from salami slicing), or in hiring, promotion, or funding decisions.”
(Then why focus only on citations and Impact Factors?)
Salami slicing refers to breaking a large, cohesive research dataset into smaller segments to increase the number of publications.
In academic publishing, it describes fragmenting a single coherent body of research into multiple papers to inflate publication counts.
This practice may distort the literature by leading readers to believe that each article is based on a different dataset, which is often not the case. Salami slicing is considered a form of scientific misconduct.
Dear readers, this is WoS. It is up to you to decide whether to depend wholeheartedly on WoS scientometrics and bibliometrics tools to evaluate science and scientists.
See you again. Take care. Bye.


