Scientific citation

From Wikipedia, the free encyclopedia
The reference section in a scientific paper

Scientific citation is the practice of providing a detailed reference in a scientific publication, typically a paper or book, to previously published (or occasionally private) communications that have a bearing on the subject of the new publication.[citation needed] The purpose of citations in original work is to allow readers of the paper to refer to cited work to assist them in judging the new work, to locate background information vital for future development, and to acknowledge the contributions of earlier workers.[citation needed]

To a considerable extent, the quality of work, in the absence of other criteria, is judged on the number of citations received, adjusting for the volume of work on the relevant topic.[citation needed] While this is not necessarily a reliable measure, counting citations is trivially easy, whereas judging the merit of complex work can be very difficult.[citation needed]

Previous work may be cited regarding experimental procedures, apparatus, goals, previous theoretical results upon which the new work builds, theses, and so on. Typically, such citations establish the general framework of influences and the mindset of the research, indicate to what part of science the work belongs, and help determine who conducts the peer review.[citation needed]

Patent references

In patent law, the citation of previous works, or prior art, helps establish the uniqueness of the invention being described. The focus in this practice is to claim originality for commercial purposes, so the author is motivated to avoid citing works that cast doubt on their originality; in this respect it is not "scientific" citation. Inventors and their lawyers have a legal obligation to cite all relevant art; not doing so risks invalidating the patent.[citation needed] The patent examiner is obliged to list all further prior art found in searches.[citation needed]

Digital object identifier (DOI)

A digital object identifier (DOI) is a persistent identifier or handle used to uniquely identify various objects, standardized by the International Organization for Standardization (ISO).[1] DOIs are an implementation of the Handle System;[2][3] they also fit within the URI system (Uniform Resource Identifier). They are widely used to identify academic, professional, and government information, such as journal articles, research reports, data sets, and official publications.

A DOI aims to resolve to its target, the information object to which the DOI refers. This is achieved by binding the DOI to metadata about the object, such as a URL where the object is located. Thus, by being actionable and interoperable, a DOI differs from ISBNs or ISRCs which are identifiers only. The DOI system uses the indecs Content Model to represent metadata.
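The actionability described above can be sketched in a few lines of Python. The `doi_to_url` helper below is hypothetical; the real system involves a Handle System lookup by the doi.org proxy, which redirects to the URL registered in the DOI's metadata.

```python
def doi_to_url(doi: str) -> str:
    """Build the actionable resolver URL for a DOI string.

    The doi.org proxy resolves a DOI by looking up its metadata
    (including the current URL of the object) in the Handle System.
    """
    doi = doi.strip()
    # Every DOI starts with the directory indicator "10." followed by
    # a registrant prefix, a slash, and a registrant-chosen suffix.
    if not doi.startswith("10.") or "/" not in doi:
        raise ValueError(f"not a well-formed DOI: {doi!r}")
    return "https://doi.org/" + doi

# Example: the DOI of de Solla Price's 1965 paper cited below
print(doi_to_url("10.1126/science.149.3683.510"))
# → https://doi.org/10.1126/science.149.3683.510
```

Following that URL in a browser (or with an HTTP client) yields a redirect to wherever the publisher currently hosts the object, which is what makes a DOI persistent in a way a plain URL is not.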

Research and development

Citation analysis is a method widely used in metascience.

Citation analysis

Citation analysis is the examination of the frequency, patterns, and graphs of citations in documents. It uses the directed graph of citations, links from one document to another, to reveal properties of the documents. A typical aim is to identify the most important documents in a collection. A classic example is the citations between academic articles and books.[4][5] For another example, judges of law support their judgements by referring back to judgements made in earlier cases (see citation analysis in a legal context). A further example is provided by patents, which cite prior art, earlier patents relevant to the current claim. The digitization of patent data and increasing computing power have led to a community of practice that uses these citation data to measure innovation attributes, trace knowledge flows, and map innovation networks.[6]
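The simplest version of this aim, finding the most-cited documents in a collection, can be sketched directly on the directed graph of citations (all paper names below are hypothetical):

```python
from collections import Counter

# A citation graph as a list of directed edges (citing, cited).
edges = [
    ("paper_B", "paper_A"), ("paper_C", "paper_A"),
    ("paper_C", "paper_B"), ("paper_D", "paper_A"),
]

def rank_by_citations(edges):
    """Rank documents by in-degree, i.e. how often they are cited."""
    counts = Counter(cited for _, cited in edges)
    return counts.most_common()

print(rank_by_citations(edges))
# → [('paper_A', 3), ('paper_B', 1)]
```

In-degree is only the crudest measure; graph-based methods can also weight a citation by the importance of the citing document, in the spirit of eigenvector centrality.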

Documents can be associated with many other features in addition to citations, such as authors, publishers, journals as well as their actual texts. The general analysis of collections of documents is known as bibliometrics and citation analysis is a key part of that field. For example, bibliographic coupling and co-citation are association measures based on citation analysis (shared citations or shared references). The citations in a collection of documents can also be represented in forms such as a citation graph, as pointed out by Derek J. de Solla Price in his 1965 article "Networks of Scientific Papers".[7] This means that citation analysis draws on aspects of social network analysis and network science.
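The two association measures just mentioned can be sketched as follows, assuming each document is represented by its list of references (all identifiers hypothetical):

```python
def bibliographic_coupling(refs_a, refs_b):
    """Coupling strength: number of references the two documents share."""
    return len(set(refs_a) & set(refs_b))

def cocitation(citing_lists, a, b):
    """Co-citation strength: number of documents that cite both a and b."""
    return sum(1 for refs in citing_lists if a in refs and b in refs)

# Two documents sharing part of their reference lists
refs_x = ["p1", "p2", "p3"]
refs_y = ["p2", "p3", "p4"]
print(bibliographic_coupling(refs_x, refs_y))  # → 2 shared references

# Three citing documents; p1 and p2 appear together in two of them
citing = [["p1", "p2"], ["p1", "p2", "p4"], ["p3"]]
print(cocitation(citing, "p1", "p2"))  # → 2
```

Note the asymmetry in perspective: coupling looks backward at shared references and is fixed once both documents are published, while co-citation looks forward at shared citers and can grow over time.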

An early example of automated citation indexing was CiteSeer, which indexed citations between academic papers, while Web of Science is an example of a modern system that includes more than just academic books and articles, reflecting a wider range of information sources. Today, automated citation indexing[8] has changed the nature of citation analysis research, allowing millions of citations to be analyzed for large-scale patterns and knowledge discovery. Citation analysis tools can be used to compute various impact measures for scholars based on data from citation indices.[9][10][note 1] These have various applications, from identifying expert referees to review papers and grant proposals, to providing transparent data in support of academic merit review, tenure, and promotion decisions. Competition for these limited resources may lead to ethically questionable behavior aimed at increasing citations.[11][12]
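As an illustration of such impact measures, one widely used example computable from citation-index data is the h-index (chosen here as a representative measure, not one singled out by the text above):

```python
def h_index(citation_counts):
    """A scholar has index h if h of their papers have >= h citations each.

    citation_counts: per-paper citation counts, in any order.
    """
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:       # the rank-th most-cited paper has >= rank citations
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))   # → 4
print(h_index([25, 8, 5, 3, 3]))   # → 3
```

The second example shows a known property of the measure: one very highly cited paper (25 citations) does not raise the h-index by itself, which is part of why multiple complementary indicators are usually reported.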

A great deal of criticism has been made of the practice of naively using citation analyses to compare the impact of different scholarly articles without taking into account other factors which may affect citation patterns.[13] Among these criticisms, a recurrent one focuses on "field-dependent factors", which refers to the fact that citation practices vary from one area of science to another, and even between fields of research within a discipline.[14]

Citation frequency

Modern scientists are sometimes judged by the number of times their work is cited by others; this is in fact a key indicator of the relative importance of a work in science. Accordingly, individual scientists are motivated to have their own work cited early, often, and as widely as possible, while all other scientists are motivated to eliminate unnecessary citations so as not to devalue this means of judgment.[15] A formal citation index tracks which refereed and reviewed papers have cited which other such papers. Baruch Lev and other advocates of accounting reform consider the number of times a patent is cited to be a significant metric of its quality, and thus of innovation.[citation needed] Reviews often replace citations to primary studies.[16]

Citation-frequency is one indicator used in scientometrics.

Replication crisis

Some studies explore citations and citation frequencies. Researchers found that papers in leading journals with findings that cannot be replicated tend to be cited more than reproducible science. Results that are not reproducible, or are not published in a sufficiently transparent and replicable way, are more likely to be wrong and may slow progress; according to one author, "a simple way to check how often studies have been repeated, and whether or not the original findings are confirmed" is needed. The authors also put forward possible explanations for this state of affairs.[17][18]

Progress and citation consolidation

Various results from scientific citation analysis[19]

Two metascientists reported that in a growing scientific field, citations disproportionately go to already well-cited papers, possibly slowing and in some cases inhibiting canonical progress. They find that "structures fostering disruptive scholarship and focusing attention on novel ideas" could be important.[20][21][22]

Other metascientists introduced the 'CD index', intended to characterize "how papers and patents change networks of citations in science and technology", and reported that it has declined, which they interpret as "slowing rates of disruption". They propose linking this to changes in three indicators of the "use of previous knowledge", which they interpret as showing that "contemporary discovery and invention" are informed by "a narrower scope of existing knowledge". The overall number of papers has risen while the number of "highly disruptive" papers has not; the 1998 discovery of the accelerating expansion of the universe, for example, has a CD index of 0. Their results also suggest that scientists and inventors "may be struggling to keep up with the pace of knowledge expansion".[23][21][19]
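The intuition behind the CD index can be sketched in simplified form, following the usual Funk and Owen-Smith formulation (this sketch omits the time windows and other details of the published measure, and the graph below is made up):

```python
def cd_index(focal, focal_refs, later_reference_lists):
    """Simplified CD index for one focal paper.

    Each later paper that cites the focal paper or its references
    contributes:
      +1 if it cites the focal paper but none of its references
         (the focal work eclipses its predecessors: disruption),
      -1 if it cites both the focal paper and its references
         (the focal work is used alongside them: consolidation),
       0 if it cites only the focal paper's references.
    The index is the mean contribution, ranging from -1 to +1.
    """
    refs = set(focal_refs)
    scores = []
    for cited in later_reference_lists:   # reference list of one later paper
        cites_focal = focal in cited
        cites_refs = bool(refs & set(cited))
        if not cites_focal and not cites_refs:
            continue                      # outside the focal paper's network
        if cites_focal and not cites_refs:
            scores.append(1)
        elif cites_focal and cites_refs:
            scores.append(-1)
        else:
            scores.append(0)
    return sum(scores) / len(scores) if scores else 0.0

# Focal paper F builds on R. Later papers: A cites F only, B cites both
# F and R, C cites R only -> contributions +1, -1, 0 -> CD = 0.0
print(cd_index("F", ["R"], [["F"], ["F", "R"], ["R"]]))  # → 0.0
```

A CD index of 0, as reported for the accelerating-expansion discovery, thus means the disruptive and consolidating contributions balance out rather than that the work had no influence.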

IT systems

Research discovery

Stages of research and publication processes and metadata, including citation metadata[24]

Recommendation systems sometimes also use citations to find studies similar to the one the user is currently reading, or studies that the user may be interested in and find useful.[25] Better availability of integrable open citation information could help address the "overwhelming amount of scientific literature".[24]

Q&A agents

Knowledge agents may use citations to find studies that are relevant to the user's query; in particular, citation statements are used by scite.ai to answer a question while also providing the associated reference(s).[26][additional citation(s) needed]

Wikipedia

Years of publication of a set of analyzed scientific articles referenced in Wikipedia[27]

There have been analyses of citations of science information on Wikipedia and of scientific citations on the site, e.g. identifying the most relevant or most-cited scientific journals and categories and the dominant domains.[27] Since 2015, the altmetrics platform Altmetric.com has shown the citing English Wikipedia articles for a given study, later adding other language editions.[27][28] The Wikimedia platform under development, Scholia, also shows "Wikipedia mentions" of scientific works.[29]

A study suggests a citation on Wikipedia "could be considered a public parallel to scholarly citation".[30] A scientific publication being "cited in a Wikipedia article is considered an indicator of some form of impact for this publication", and it may be possible to detect certain publications through changes to Wikipedia articles.[31] Wikimedia Research's Cite-o-Meter tool showed a league table of which academic publishers are most cited on Wikipedia,[30] as does a page by the Academic Journals WikiProject.[32][33][circular reference][additional citation(s) needed] Research indicates that a large share of academic citations on the platform are paywalled and hence inaccessible to many readers.[34][35]

"[citation needed]" is a tag added by Wikipedia editors to unsourced statements in articles, requesting that citations be added.[36] The phrase reflects Wikipedia's policies of verifiability and no original research and has become a general Internet meme.[37]

Differentiation of semantic citation contexts

Percent of all citances in each field that contain signals of disagreement[38]

The tool scite.ai tracks and links citations of papers as 'Supporting', 'Mentioning', or 'Contrasting' the study, differentiating between these citation contexts to some degree. This may be useful for evaluation and metrics and, for example, for discovering studies or statements that contrast with statements within a specific study.[39][40][41]

Retractions

The Scite Reference Check bot is an extension of scite.ai that scans new article PDFs "for references to retracted papers, and posts both the citing and retracted papers on Twitter" and also "flags when new studies cite older ones that have issued corrections, errata, withdrawals, or expressions of concern".[41] Studies have suggested as few as 4% of citations to retracted papers clearly recognize the retraction.[41] Research found "that authors tend to keep citing retracted papers long after they have been red flagged, although at a lower rate".[42]
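The core lookup behind such a check amounts to intersecting a paper's reference list with a known-retraction set. The sketch below is illustrative only: the DOIs are made up, and the real bot parses article PDFs and queries scite.ai's database rather than a local set.

```python
# Hypothetical known-retraction set (made-up DOIs for illustration).
RETRACTED = {"10.1000/retracted.1", "10.1000/retracted.2"}

def flag_retracted(reference_dois):
    """Return the cited DOIs that appear in the known-retraction set."""
    return sorted(set(reference_dois) & RETRACTED)

# Reference list of a hypothetical new article
refs = ["10.1000/ok.7", "10.1000/retracted.1", "10.1000/ok.9"]
print(flag_retracted(refs))  # → ['10.1000/retracted.1']
```

A production version would also need to distinguish citations that acknowledge the retraction from those that do not, which, per the figures above, is the common failure mode.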

Notes

  1. ^ Examples include subscription-based tools based on proprietary data, such as Web of Science and Scopus, and free tools based on open data, such as Scholarometer by Filippo Menczer and his team.

References

  1. ^ "ISO 26324:2012(en), Information and documentation – Digital object identifier system". ISO. Archived from the original on 17 June 2016. Retrieved 20 April 2016.
  2. ^ "The Handle System". Handle.Net Registry. Archived from the original on Jan 7, 2023.
  3. ^ "Resources (including Factsheets)". DOI. Archived from the original on Dec 25, 2022.
  4. ^ Rubin, Richard (2010). Foundations of library and information science (3rd ed.). New York: Neal-Schuman Publishers. ISBN 978-1-55570-690-6.
  5. ^ Garfield, E. Citation Indexing – Its Theory and Application in Science, Technology and Humanities. Philadelphia: ISI Press, 1983.
  6. ^ Jaffe, Adam; de Rassenfosse, Gaétan (2017). "Patent citation data in social science research: Overview and best practices". Journal of the Association for Information Science and Technology. 68 (6): 1360–1374. doi:10.1002/asi.23731.
  7. ^ Derek J. de Solla Price (July 30, 1965). "Networks of Scientific Papers" (PDF). Science. 149 (3683): 510–515. Bibcode:1965Sci...149..510D. doi:10.1126/science.149.3683.510. PMID 14325149.
  8. ^ Giles, C. Lee; Bollacker, Kurt D.; Lawrence, Steve (1998), "CiteSeer", Proceedings of the third ACM conference on Digital libraries - DL '98, New York: Association for Computing Machinery, pp. 89–98, doi:10.1145/276675.276685, ISBN 978-0-89791-965-4, S2CID 514080
  9. ^ Kaur, Jasleen; Diep Thi Hoang; Xiaoling Sun; Lino Possamai; Mohsen JafariAsbagh; Snehal Patil; Filippo Menczer (2012). "Scholarometer: A Social Framework for Analyzing Impact across Disciplines". PLOS ONE. 7 (9): e43235. Bibcode:2012PLoSO...743235K. doi:10.1371/journal.pone.0043235. PMC 3440403. PMID 22984414.
  10. ^ Hoang, D.; Kaur, J.; Menczer, F. (2010), "Crowdsourcing Scholarly Data", Proceedings of the WebSci10: Extending the Frontiers of Society On-Line, April 26-27th, 2010, Raleigh, NC: US, archived from the original on 2015-04-17, retrieved 2015-08-09
  11. ^ Anderson, M.S.; Ronning, E.A.; de Vries, R.; Martinson, B.C. (2007). "The perverse effects of competition on scientists' work and relationships". Science and Engineering Ethics. 13 (4): 437–461. doi:10.1007/s11948-007-9042-5. PMID 18030595. S2CID 2994701.
  12. ^ Wesel, M. van (2016). "Evaluation by Citation: Trends in Publication Behavior, Evaluation Criteria, and the Strive for High Impact Publications". Science and Engineering Ethics. 22 (1): 199–225. doi:10.1007/s11948-015-9638-0. PMC 4750571. PMID 25742806.
  13. ^ Bornmann, L.; Daniel, H. D. (2008). "What do citation counts measure? A review of studies on citing behavior". Journal of Documentation. 64 (1): 45–80. doi:10.1108/00220410810844150. hdl:11858/00-001M-0000-0013-7A94-3. S2CID 17260826.
  14. ^ Anauati, Maria Victoria; Galiani, Sebastian; Gálvez, Ramiro H. (11 November 2014). "Quantifying the Life Cycle of Scholarly Articles Across Fields of Economic Research". Available at SSRN: https://ssrn.com/abstract=2523078
  15. ^ Aksnes, Dag W.; Langfeldt, Liv; Wouters, Paul (2019-01-01). "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories". SAGE Open. 9. doi:10.1177/2158244019829575. hdl:1887/78034. S2CID 150974941.
  16. ^ Gurevitch, Jessica; Koricheva, Julia; Nakagawa, Shinichi; Stewart, Gavin (March 2018). "Meta-analysis and the science of research synthesis". Nature. 555 (7695): 175–182. Bibcode:2018Natur.555..175G. doi:10.1038/nature25753. ISSN 1476-4687. PMID 29517004. S2CID 3761687.
  17. ^ "A new replication crisis: Research that is less likely to be true is cited more". phys.org. Retrieved 14 June 2021.
  18. ^ Serra-Garcia, Marta; Gneezy, Uri (2021-05-01). "Nonreplicable publications are cited more than replicable ones". Science Advances. 7 (21): eabd1705. Bibcode:2021SciA....7.1705S. doi:10.1126/sciadv.abd1705. ISSN 2375-2548. PMC 8139580. PMID 34020944.
  19. ^ a b Park, Michael; Leahey, Erin; Funk, Russell J. (January 2023). "Papers and patents are becoming less disruptive over time". Nature. 613 (7942): 138–144. arXiv:2106.11184. Bibcode:2023Natur.613..138P. doi:10.1038/s41586-022-05543-x. ISSN 1476-4687. PMID 36600070. S2CID 255466666.
  20. ^ Snyder, Alison (14 October 2021). "New ideas are struggling to emerge from the sea of science". Axios. Retrieved 15 November 2021.
  21. ^ a b Thompson, Derek (11 January 2023). "The Consolidation-Disruption Index Is Alarming". The Atlantic. Retrieved 25 February 2023.
  22. ^ Chu, Johan S. G.; Evans, James A. (12 October 2021). "Slowed canonical progress in large fields of science". Proceedings of the National Academy of Sciences. 118 (41). Bibcode:2021PNAS..11821636C. doi:10.1073/pnas.2021636118. ISSN 0027-8424. PMC 8522281. PMID 34607941.
  23. ^ Tejada, Patricia Contreras (13 January 2023). "With fewer disruptive studies, is science becoming an echo chamber?". Advanced Science News. Archived from the original on 15 February 2023. Retrieved 15 February 2023.
  24. ^ a b Nüst, Daniel; Yücel, Gazi; Cordts, Anette; Hauschke, Christian (4 January 2023). "Enriching the scholarly metadata commons with citation metadata and spatio-temporal metadata to support responsible research assessment and research discovery". arXiv:2301.01502 [cs.DL].
  25. ^ Beel, Joeran; Gipp, Bela; Langer, Stefan; Breitinger, Corinna (1 November 2016). "Research-paper recommender systems: a literature survey". International Journal on Digital Libraries. 17 (4): 305–338. doi:10.1007/s00799-015-0156-0. ISSN 1432-1300. S2CID 254074596.
  26. ^ "How does ask a question work?". scite.ai. Retrieved 25 February 2023.
  27. ^ a b c Arroyo-Machado, Wenceslao; Torres-Salinas, Daniel; Herrera-Viedma, Enrique; Romero-Frías, Esteban (10 February 2020). "Science through Wikipedia: A novel representation of open knowledge through co-citation networks". PLOS ONE. 15 (2): e0228713. arXiv:2002.04347. Bibcode:2020PLoSO..1528713A. doi:10.1371/journal.pone.0228713. ISSN 1932-6203. PMC 7010282. PMID 32040488.
  28. ^ "New Source Alert: Wikipedia". Altmetric. 4 February 2015. Retrieved 25 February 2023.
  29. ^ Arroyo-Machado, Wenceslao; Torres-Salinas, Daniel; Costas, Rodrigo (20 December 2022). "Wikinformetrics: Construction and description of an open Wikipedia knowledge graph data set for informetric purposes". Quantitative Science Studies. 3 (4): 931–952. doi:10.1162/qss_a_00226. hdl:10481/80532. S2CID 253107766.
  30. ^ a b Priem, Jason (6 July 2015). "Altmetrics (Chapter from Beyond Bibliometrics: Harnessing Multidimensional Indicators of Scholarly Impact)". arXiv:1507.01328 [cs.DL].
  31. ^ Zagorova, Olga; Ulloa, Roberto; Weller, Katrin; Flöck, Fabian (12 April 2022). ""I updated the <ref>": The evolution of references in the English Wikipedia and the implications for altmetrics" (PDF). Quantitative Science Studies. 3 (1): 147–173. doi:10.1162/qss_a_00171.
  32. ^ Katz, Gilad; Rokach, Lior (8 January 2016). "Wikiometrics: A Wikipedia Based Ranking System". arXiv:1601.01058 [cs.DL].
  33. ^ "Wikipedia:WikiProject Academic Journals/Journals cited by Wikipedia". Wikipedia. 15 September 2022. Retrieved 25 February 2023.
  34. ^ Leva, Federico (21 February 2022). "Wikipedia is open to all, the research underpinning it should be too". Impact of Social Sciences. Retrieved 25 February 2023.
  35. ^ Tattersall, Andy; Sheppard, Nick; Blake, Thom; O'Neill, Kate; Carroll, Chris (2 February 2022). "Exploring open access coverage of Wikipedia-cited research across the White Rose Universities" (PDF). Insights: The UKSG Journal. 35: 3. doi:10.1629/uksg.559. S2CID 246504456.
  36. ^ Redi, Miriam; Fetahu, Besnik; Morgan, Jonathan; Taraborelli, Dario (13 May 2019). "Citation Needed: A Taxonomy and Algorithmic Assessment of Wikipedia's Verifiability". The World Wide Web Conference. WWW '19. San Francisco, CA, USA: Association for Computing Machinery. pp. 1567–1578. doi:10.1145/3308558.3313618. ISBN 978-1-4503-6674-8. S2CID 67856117.
  37. ^ McDowell, Zachary J.; Vetter, Matthew A. (2022). "What Counts as Information: The Construction of Reliability and Verifability". Wikipedia and the Representation of Reality. Routledge, Taylor & Francis. p. 34. doi:10.4324/9781003094081. hdl:20.500.12657/50520. ISBN 978-1-000-47427-5.
  38. ^ Lamers, Wout S; Boyack, Kevin; Larivière, Vincent; Sugimoto, Cassidy R; van Eck, Nees Jan; Waltman, Ludo; Murray, Dakota (24 December 2021). "Investigating disagreement in the scientific literature". eLife. 10: e72737. doi:10.7554/eLife.72737. ISSN 2050-084X. PMC 8709576. PMID 34951588.
  39. ^ Khamsi, Roxanne (1 May 2020). "Coronavirus in context: Scite.ai tracks positive and negative citations for COVID-19 literature". Nature. doi:10.1038/d41586-020-01324-6. Retrieved 19 February 2022.
  40. ^ Nicholson, Josh M.; Mordaunt, Milo; Lopez, Patrice; Uppala, Ashish; Rosati, Domenic; Rodrigues, Neves P.; Grabitz, Peter; Rife, Sean C. (5 November 2021). "scite: A smart citation index that displays the context of citations and classifies their intent using deep learning" (PDF). Quantitative Science Studies. 2 (3): 882–898. doi:10.1162/qss_a_00146. S2CID 232283218.
  41. ^ a b c "New bot flags scientific studies that cite retracted papers". Nature Index. 2 February 2021. Retrieved 25 January 2023.
  42. ^ Peng, Hao; Romero, Daniel M.; Horvát, Emőke-Ágnes (21 June 2022). "Dynamics of cross-platform attention to retracted papers". Proceedings of the National Academy of Sciences. 119 (25): e2119086119. arXiv:2110.07798. Bibcode:2022PNAS..11919086P. doi:10.1073/pnas.2119086119. ISSN 0027-8424. PMC 9231484. PMID 35700358.
