


Database Comparisons: WoS, Scopus, GS, Dimensions

A summary comparison of large multidisciplinary research databases. It deliberately focuses on content and search features rather than pricing, as pricing varies for each subscribing institution.

Web of Science vs. Scopus vs. Google Scholar vs. Dimensions

(Last updated Nov. 18, 2025)

Databases compared: Web of Science Core Collection[1], Scopus[2], Google Scholar, and Dimensions[3].

Total records
  • Web of Science: 95+ million
  • Scopus: 90.6+ million
  • Google Scholar: 399 million[4]
  • Dimensions: 147+ million

Journals
  • Web of Science: 22,619+ total (~7,500 of which are from ESCI)
  • Scopus: 27,950 active titles; 15,450 inactive titles
  • Google Scholar: Unknown
  • Dimensions: 77,471[5]

Preprints
  • Web of Science: Yes, via the Preprint Citation Index
  • Scopus: Unknown
  • Google Scholar: Unknown
  • Dimensions: Yes

Books
  • Web of Science: 157,000+
  • Scopus: 292,000; 1,167 book series
  • Google Scholar: Number unknown, but integrated with Google Books
  • Dimensions: 116,643 books/edited books; 300,408 monographs[6]

Proceedings
  • Web of Science: 10.5 million
  • Scopus: 11.7+ million conference papers
  • Google Scholar: Unknown
  • Dimensions: 8.8 million

Period Covered
  • Web of Science: 1945-present; coverage goes back to 1900 if Century of Science is purchased
  • Scopus: Records go back to 1788; cited references for 1970-present
  • Google Scholar: Unknown
  • Dimensions: 1665-present

Non-English publications
  • Web of Science: Yes, if the publication has an English abstract; 4% of publications are non-English (excluding ESCI)
  • Scopus: Yes, if the publication has an English abstract; 20% of publications are non-English
  • Google Scholar: Articles published in many languages
  • Dimensions: 14% of publications are non-English[7]

Update frequency
  • Web of Science: Daily
  • Scopus: Daily
  • Google Scholar: Unknown
  • Dimensions: Daily (ca. 2-3 days after deposit in Crossref)

Author Profiles
  • Web of Science: Algorithm generated
  • Scopus: Algorithm generated
  • Google Scholar: Author created
  • Dimensions: Algorithm generated

Citation Analysis
  • Web of Science: Yes
  • Scopus: Yes
  • Google Scholar: No
  • Dimensions: Yes

Mark Records
  • Web of Science: Yes
  • Scopus: Yes
  • Google Scholar: Yes; requires a Google Account login
  • Dimensions: Yes; requires registering to use this feature

Export Records
  • Web of Science: Yes, en masse
  • Scopus: Yes, en masse
  • Google Scholar: Only if logged in; saved records in the user library can be mass exported
  • Dimensions: Yes; requires registering to use this feature

Systematic Reviews
  • Web of Science: Yes
  • Scopus: Yes
  • Google Scholar: Limited advanced search features
  • Dimensions: Complex Boolean searches for systematic reviews require using their API[15] (see the example sketch after the chart)

Strengths
  • Web of Science:
      • Covers "journals of influence", which has become synonymous with higher quality
      • Organization name unification
      • Ability to search & browse for incorrectly cited references & author name misspellings
      • Includes altmetrics when available
      • Exportable visualizations for author & citation reports
  • Scopus:
      • Larger coverage of Social Sciences, Arts & Humanities than WoS[8]
      • Includes trade publications
      • Includes altmetrics when available
  • Google Scholar:
      • Includes a wide range of document types, e.g., theses and white papers
      • Finds more citations than any other database in this table, regardless of subject area[9]
      • Wider book coverage than other databases due to Google Books integration
  • Dimensions:
      • Significantly more publications than WoS or Scopus
      • Larger coverage of Social Sciences, Arts & Humanities than WoS and Scopus[10]
      • Includes clinical trials, grants, datasets and policy documents
      • Integrated with ReadCube and Altmetric

Weaknesses
  • Web of Science:
      • Covers only "journals of influence"
      • Difficulty searching unusual author name formats (hyphenated and compound names, umlauts, etc.) and ampersands in journal titles
  • Scopus:
      • Owned by a publisher and may not be neutral in its content inclusion
      • Errors in reference lists, including author names in the wrong order, phantom citations, and transcription errors in author names and article titles[11]
      • Cloned DOIs: the same DOI appearing on multiple papers[12]
  • Google Scholar:
      • Questionable content quality and many non-peer-reviewed sources
      • Difficult to narrow down common author name searches
      • Must create a Google Scholar profile to create reports
  • Dimensions:
      • Owned by a publisher and may not be neutral in its content inclusion
      • Missing citation links are a significant problem[13]
      • A large percentage of articles are missing metadata, notably author affiliations[14]
      • Search interface is very different and may confuse users
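
For Dimensions, the chart notes that complex Boolean searches for systematic reviews require using their API[15]. The sketch below is a rough, non-authoritative illustration of what such a query can look like using the third-party dimcli Python client for the Dimensions Analytics API; the API key, endpoint, search phrase, and returned fields are placeholder assumptions rather than values taken from this guide, so verify them against the current Dimensions DSL documentation and your institution's subscription.

```python
# Sketch only: a Boolean search against the Dimensions Analytics API,
# assuming an institutional API key and the third-party client installed
# with: pip install dimcli
import dimcli

# Placeholder credentials and endpoint; replace with your institution's values.
dimcli.login(key="YOUR_API_KEY", endpoint="https://app.dimensions.ai")

dsl = dimcli.Dsl()

# Example DSL query (syntax assumed from the public Dimensions DSL docs):
# Boolean operators and escaped exact phrases go inside the quoted "for" string.
results = dsl.query(r"""
    search publications
        in title_abstract_only
        for "\"systematic review\" AND (telehealth OR telemedicine)"
    return publications[id + title + year + doi]
    limit 100
""")

# Each returned record is a plain dict; print a quick summary.
for pub in results.publications:
    print(pub.get("year"), pub.get("doi"), pub.get("title"))
```

The same DSL statement can also be sent directly to the Analytics API endpoint documented for your subscription; dimcli is simply one convenience wrapper around it.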

Due to major changes in content coverage for these databases post-2015, and the launch of Dimensions in 2018, data from older research studies is outdated and has been replaced or updated in this chart. All database providers (with the exception of Google Scholar, which is a black box) have been very responsive to identified weaknesses and have made improvements to eliminate some of them.

[1] Some data in this column is from Resources for Librarians and Administrators, March 2025. Where possible, the comparison chart above has included information on which Web of Science data points include or exclude Emerging Sources Citation Index (ESCI).

[2] Some data in this column is from Scopus Content Coverage Guide, March 2023.

[3] Dimensions. Why Dimensions?, June 2024. [Scroll down the page to see this section. Written by Dimensions staff, this webpage provides a content comparison of Dimensions, Scopus, Web of Science, and OpenAlex, with notes at the bottom about the data sources used.] Note that Dimensions does not: define what “publications” covers; give a category for “number of journals”; or say whether the Web of Science numbers are just the Core Collection or also include ESCI.

[4] Aguillo, Isidro F. 2025. Transparent Ranking of Repositories 2025. figshare. Preprint. https://doi.org/10.6084/m9.figshare.29099234.v2

[5] Singh, Vivek Kumar, Prashasti Singh, Mousumi Karmakar, et al. 2021. “The Journal Coverage of Web of Science, Scopus and Dimensions: A comparative analysis.” Scientometrics, 126(6): 5113-5142. https://doi.org/10.1007/s11192-021-03948-5. On page 5117 they state that “Dimensions contains more than 74,000 journal entries”; however, three other places in the article state the Dimensions journal list “contained 77,471 entries.” Per email communication with Dimensions staff on 8/14/25, "Dimensions contains more publication sources than just journals, including book series, preprint platforms, and conference proceedings. There are about 115k sources with online ISSNs."

[6] Singh, "Journal Coverage," 5123.

[7] Visser, Martijn, Nees Jan van Eck, and Ludo Waltman. 2021. "Large-scale Comparison of Bibliographic Data Sources: Scopus, Web of Science, Dimensions, Crossref, and Microsoft Academic." Quantitative Science Studies 2 (1): 20–41. https://doi.org/10.1162/qss_a_00112. [Note: this study did not include the Emerging Sources Citation Index.] See p. 33 for non-English data.

[8] Singh, "Journal Coverage," 5133-5134.

[9] Martín‑Martín, Alberto, Mike Thelwall, Enrique Orduna-Malea, et al. 2021. "Google Scholar, Microsoft Academic, Scopus, Dimensions, Web of Science, and OpenCitations’ COCI: a multidisciplinary comparison of coverage via citations." Scientometrics, 126:871-906. https://doi.org/10.1007/s11192-020-03690-4. [Google Scholar found 88% of all citations, followed by Microsoft Academic (which is now defunct), Scopus, Dimensions, then Web of Science. This study used the Web of Science Core Collection which included Emerging Sources Citation Index.]

[10] Singh, "Journal Coverage," 5133-5134.

[11] Franceschini, Fiorenzo, Domenico Maisano, and Luca Mastrogiacomo. 2016. “Empirical Analysis and Classification of Database Errors in Scopus and Web of Science.” Journal of Informetrics, 10(4): 933-953. https://doi.org/10.1016/j.joi.2016.07.003

[12] Franceschini, Fiorenzo, Domenico Maisano, and Luca Mastrogiacomo. 2016. “The Museum of Errors/Horrors in Scopus.” Journal of Informetrics, 10(1): 174-182. https://doi.org/10.1016/j.joi.2015.11.006

[13] Visser, "Large-Scale Comparison," 37.

[14] Basson, Isabel, Marc-André Simard, Zoé Aubierge Ouangré, et al. 2022. “The Effect of Data Sources on the Measurement of Open Access: A comparison of Dimensions and the Web of Science.” PLOS One, 17(3): e0265545. https://doi.org/10.1371/journal.pone.0265545. [Note: it compares only the open access article coverage in these databases.]

[15] Gusenbauer, Michael and Neal R. Haddaway. 2020. "Which Academic Search Systems are Suitable for Systematic Reviews or Meta-Analyses? Evaluating Retrieval Qualities of Google Scholar, PubMed, and 26 Other Resources." Research Synthesis Methods, 11:181-217. https://doi.org/10.1002/jrsm.1378.
