Academic journal
- In relation to the natural sciences, see Scientific journal
An academic journal is a regularly issued, peer-reviewed periodical that publishes scholarship relating to an academic discipline. An academic journal provides a place for the introduction and scrutiny of new research, and is often a forum for the critique of existing research. This is most often manifested in the publication of original research articles (reports of research findings), review articles, and book reviews. Scientific journals and journals in the quantitative social sciences vary somewhat in form and function from journals in the humanities and qualitative social sciences. American and British systems of academic publishing are similar; other regions have somewhat different practices.
Scholarly articles
In academia, manuscript submissions are generally unsolicited. Professional scholars generally submit an article to a journal; then, the editor (or co-editors) determines whether to reject the submission outright (often on grounds of not being appropriate to the subject of the journal) or to send out the article for peer review. Peer review is often a double-blind system: the author does not know who the reviewers are and the reviewers do not know who the author is. In order to facilitate this double-blind system, authors eschew self-references in their articles, and their names never appear on any page headings of the manuscript.
The journal editor chooses the reviewers. There are usually two reviewers; a third is sometimes asked if the two disagree. In some fields, three reviewers is the norm. The opinions of these outside reviewers are used in the determination to publish the article, to return it to the author for revision, or to reject the article. (There are many variations on this process, discussed in the article on peer review). Even accepted articles are subject to further (often considerable) editing by the journal before publication. Because of this lengthy process, an accepted article will typically not appear in print until several months, at the very least, after its initial submission—several years is not unknown.
The process of peer review is generally considered critical to establishing a reliable body of research and knowledge. Scholars can only be expert in a limited area; they rely on peer-reviewed journals to provide reliable and credible research which they can build upon for subsequent or related research. As a result, significant scandal ensues when an author is found to have falsified the research included in a published article, as many other scholars, and more generally the field of study itself, have relied upon that research.
Review articles
Review articles, often called "reviews of progress," serve as a check on the research published in the journals. Unlike research articles, review articles are usually solicited from long-standing experts in the field. Some journals are entirely devoted to review articles, others contain a few in each issue, and most do not publish review articles at all. Such reviews often cover the research of the preceding year, though some cover longer or shorter periods; some are devoted to very specific topics, others to general surveys. Some are enumerative, intending to list all significant articles in a subject. Others are selective, including only what their authors judge worth covering. Yet others are evaluative, aiming to give a judgment of the state of progress in the field. Some are published in series, covering a complete subject field each year, or covering a number of specific fields over several years.
Unlike original research articles, review articles tend to be solicited, and are sometimes planned years in advance. Authors are often paid a few hundred dollars for such reviews. Because of this, the standard definitions of open access do not require review articles to be open access, although many are. Review articles are typically relied on by students beginning study in a field, and for current awareness by those already in it.
Due to concerns about the inconsistent quality of review articles,[1][2] the systematic review article, which uses techniques from meta-analysis, has been developed. The systematic review forms the core of evidence-based medicine.
Book reviews
Book reviews of scholarly books serve as a check on the research published in book form. Unlike articles, book reviews tend to be solicited. Journals typically have a separate book review editor who determines which new books should be reviewed and by whom. If an outside scholar accepts the book review editor's request to review a book, he or she generally receives a free copy of that book from the journal in exchange for a timely and publishable review. Publishers or authors send books to book review editors in the hope that their books will be reviewed. The length and depth of reviews vary considerably from journal to journal. The extent to which textbooks and other non-scholarly books are covered also varies from journal to journal.
Prestige
The prestige of an academic journal is established over time. It can reflect many factors, some but not all expressible quantitatively. Prestige is usually expressed in terms of the impact factor (a measure of popularity), but various alternatives are emerging.[3][4] The many alternatives to the impact factor have been compared.[5][6]
Impact factor
In the sciences and the quantitative social sciences, the impact factor is a convenient numerical measure, reflecting the number of later articles citing the articles already published in the journal. There are other possible quantitative factors, and there is some question whether citation counts are the best quantitative measure of prestige (see the discussion at impact factor), as well as whether any quantitative measure can reflect true prestige at all. An excellent review of the pitfalls of the impact factor is available [1].

In the Anglo-American humanities, there is not yet a tradition (as currently exists in the sciences) of assigning numerical prestige "values" to journals in order to quantify the relative importance of research, based on the number of references made to an article in other academic articles. Perhaps a key reason for this is the relative unimportance of academic journals in these fields, as contrasted with the importance of academic monographs. Very recently, there has been some preliminary work towards determining the validity of such measurement [2], for example by Faculty of 1000 [3].
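To make the calculation concrete, the following is a minimal sketch of the conventional two-year impact factor: citations received in one year to items the journal published in the two preceding years, divided by the number of citable items published in those years. The counts are hypothetical and serve only to illustrate the arithmetic.

```python
# Minimal sketch of a two-year impact factor; all counts are hypothetical.
citations_received_2009 = {2007: 320, 2008: 410}  # citations in 2009, by the cited item's publication year
citable_items = {2007: 150, 2008: 170}            # articles and reviews published in each year

impact_factor_2009 = sum(citations_received_2009.values()) / sum(citable_items.values())
print(round(impact_factor_2009, 3))  # (320 + 410) / (150 + 170) = 2.281
```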
Google Web-URL citations
Google Web-URL citations have been compared to the Impact factor.[7]
Faculty of 1000
For biology articles, Faculty of 1000 is another ranking index for individual papers.
H-index
The H-index, also called the Hirsch index, is another alternative measure.[8] It is available online at http://www.scimagojr.com/, which describes it as "an index that quantifies both the scientific productivity and the scientific impact of a journal (it is also applicable to scientists, countries...). The index is based on the set of the journal's most quoted papers and the number of citations that they have received in other publications."[9]
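As a quick illustration of the definition quoted above, the sketch below computes an h-index from a list of citation counts; the counts are hypothetical.

```python
def h_index(citation_counts):
    """Largest h such that at least h items have h or more citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for a journal's most-cited papers.
print(h_index([25, 17, 12, 9, 7, 4, 2, 1]))  # -> 5: five papers have at least 5 citations each
```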
q̅(J)
Ranking journals by the average quality of their papers has been proposed.[10]
PageRank and its variations
Bollen et al. proposed using the PageRank algorithm employed by Google to distinguish the "quality" of citations and hence improve on the impact factor calculation.[11][12][13]
Rank  ISI Impact Factor            PageRank                   Combined
 1    52.28  ANNU REV IMMUNOL      16.78  NATURE              51.97  NATURE
 2    37.65  ANNU REV BIOCHEM      16.39  J BIOL CHEM         48.78  SCIENCE
 3    36.83  PHYSIOL REV           16.38  SCIENCE             19.84  NEW ENGL J MED
 4    35.04  NAT REV MOL CELL BIO  14.49  PNAS                15.34  CELL
 5    34.83  NEW ENGL J MED         8.41  PHYS REV LETT       14.88  PNAS
 6    30.98  NATURE                 5.76  CELL                10.62  J BIOL CHEM
 7    30.55  NAT MED                5.70  NEW ENGL J MED       8.49  JAMA
 8    29.78  SCIENCE                4.67  J AM CHEM SOC        7.78  LANCET
 9    28.18  NAT IMMUNOL            4.46  J IMMUNOL            7.56  NAT GENET
10    28.17  REV MOD PHYS           4.28  APPL PHYS LETT       6.53  NAT MED
The table shows the top 10 journals by ISI Impact Factor, PageRank, and a modified system that combines the two (based on 2003 data). Nature and Science are generally regarded as the most prestigious journals, and in the combined system they come out on top. That the New England Journal of Medicine is cited even more than Nature or Science might reflect the mix of review articles and original articles that it publishes. It is necessary to analyze the data for a journal in the light of a detailed knowledge of the journal literature.
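The core of the PageRank-style approach is a recursive weighting in which a citation from a highly cited journal counts for more than one from a rarely cited journal. The sketch below is a minimal power-iteration implementation over a small, hypothetical journal-to-journal citation matrix; it is not the exact weighted procedure of Bollen et al., and the journal names and counts are invented.

```python
import numpy as np

# Hypothetical journal-to-journal citation counts: entry [i, j] is the number
# of citations from journal i to journal j (self-citations excluded).
journals = ["J. Alpha", "J. Beta", "J. Gamma"]
C = np.array([[0, 30, 10],
              [20, 0, 40],
              [ 5, 25, 0]], dtype=float)

def journal_pagerank(C, damping=0.85, iterations=100):
    """Power iteration for a PageRank-style prestige score."""
    n = C.shape[0]
    # Row-normalise so each journal distributes its outgoing citations as weights.
    T = C / C.sum(axis=1, keepdims=True)
    r = np.full(n, 1.0 / n)
    for _ in range(iterations):
        r = (1 - damping) / n + damping * (T.T @ r)
    return r / r.sum()

for name, score in zip(journals, journal_pagerank(C)):
    print(f"{name}: {score:.3f}")
```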
Eigenfactor
The Eigenfactor has been proposed as an alternative to the impact factor (http://www.eigenfactor.org/ and http://well-formed.eigenfactor.org/).[14][15]
Y factor
The Y factor is similar to the Eigenfactor.[16]
Online usage
In the health sciences, alternatives have been developed for predicting the impact of an article soon after publication, without waiting the two years needed to calculate an impact factor. These alternatives have included predicting later citation counts from early online "hit counts"[17] and from other factors such as "indexing in numerous databases; number of authors; abstraction in synoptic journals; clinical relevance scores; number of cited references" and the nature of the article.[18]
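The cited studies fit multivariable statistical models; the sketch below is only a toy ordinary-least-squares version of the idea, with invented numbers, showing how early signals (such as first-week online hits) might be turned into a predicted two-year citation count.

```python
import numpy as np

# Hypothetical training data: one row per article, columns are early signals
# such as [online hits in week 1, number of authors, number of cited references].
X = np.array([[520,  3, 28],
              [140,  7, 45],
              [980, 12, 60],
              [ 60,  2, 15],
              [310,  5, 33]], dtype=float)
y = np.array([34, 18, 75, 5, 22], dtype=float)  # citations observed after two years

# Fit ordinary least squares with an intercept term.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predict the two-year citation count for a new article from its early signals.
new_article = np.array([1, 400, 4, 30], dtype=float)
print(round(float(new_article @ coef), 1))
```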
Tweeting
The frequency of tweeting about an article may predict its citation count;[19] however, this finding is controversial.[20]
Total citations / total articles
Scopus, a product of Elsevier B.V., has introduced its Analytics service, which provides interactive charting of the total citations divided by the total number of articles, for multiple journals and publication years.[21]
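The underlying arithmetic is simply a ratio per journal and year; the tiny sketch below illustrates it with invented tallies.

```python
# Hypothetical yearly tallies for one journal.
records = {2006: {"citations": 410, "articles": 95},
           2007: {"citations": 560, "articles": 110},
           2008: {"citations": 630, "articles": 120}}

for year, counts in sorted(records.items()):
    print(year, round(counts["citations"] / counts["articles"], 2))
```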
SCImago Journal Rank (SJR)
SCImago Journal Rank (SJR), also based on data from Scopus, borrows from the PageRank concept to weight citations according to the prestige of the citing journals.[22] According to its website, SJR "is an indicator that expresses the number of connections that a journal receives through the citation of its documents divided between the total of documents published in the year selected by the publication, weighted according to the amount of incoming and outgoing connections of the sources."[23]
Prior work had explored using citation data to enhance information retrieval from biomedical journals.[24] SJR is available online at http://www.scimagojr.com/.
Index Copernicus
Index Copernicus is similar to the impact factor.
Financial operation
Academic journals in the humanities and social sciences are usually subsidized by universities or professional organizations, and do not exist to make a profit. However, they often accept advertisements (usually from academic book publishers) as a way of offsetting production costs. It is standard practice for academic journals to charge libraries much higher subscription rates than individual subscribers pay. Editors of journals tend to have other professional responsibilities, most often as teaching professors. In the case of the very largest journals, there are sometimes paid staff to assist in the editing. The production of the journals is almost always done by paid staff at the publisher. Publishers in these fields are often university presses, some of which, such as Oxford University Press, specialize in such journals.
New developments
In recent years, the Internet has revolutionized the production of, and access to, academic journals. Journal content is often available online via services subscribed to by academic libraries. Individual articles are indexed in databases by subject, and can increasingly be found through services such as Google Scholar. Other specialized databases serve as platforms for disseminating journals (e.g., JSTOR or ScienceDirect). Some of the smallest and most specialized journals are prepared in-house by an academic department and published only on the internet; recently, such publication has sometimes taken the form of a blog.
Open access
There is currently a movement in higher education encouraging open access, either through self-archiving, in which the author places his or her paper in a repository where it can be searched for and read, or through publishing in an open access journal, which does not charge for subscriptions and is instead either subsidized or financed through author page charges. To date, open access has had a much greater effect on science journals than on those in the humanities.
References
- ↑ Antman EM, Lau J, Kupelnick B, Mosteller F, Chalmers TC (1992). "A comparison of results of meta-analyses of randomized control trials and recommendations of clinical experts. Treatments for myocardial infarction". JAMA 268 (2): 240–8. PMID 1535110.
- ↑ Tatsioni A, Bonitsis NG, Ioannidis JP (2007). "Persistence of contradicted claims in the literature". JAMA 298 (21): 2517–26. DOI:10.1001/jama.298.21.2517. PMID 18056905.
- ↑ Citation metrics.
- ↑ Anderson, Kent (2008). Citations: Incitement or Excitement? The Scholarly Kitchen. Retrieved on 2008-04-09.
- ↑ Davis P. (2009) Scientific Impact Measures Compared.
- ↑ Bollen J, Van de Sompel H, Hagberg A, Chute R (2009). "A principal component analysis of 39 scientific impact measures". PLoS One 4 (6): e6022. DOI:10.1371/journal.pone.0006022. PMID 19562078. PMC 2699100.
- ↑ Kousha, Kayvan; Mike Thelwall (2007). "Google Scholar citations and Google Web-URL citations: A multi-discipline exploratory analysis". J. Am. Soc. Inf. Sci. Technol. 58 (7): 1055–1065. DOI:10.1002/asi.v58:7. Retrieved on 2008-09-26.
- ↑ Anderson K. The “h-index”: An Objective Mismeasure? Retrieved on 2008-06-30.
- ↑ http://www.scimagojr.com/help.php
- ↑ Stringer, M.J.; M. Sales-Pardo & L.A.N. Amaral (2008), "Effectiveness of Journal Ranking Schemes as a Tool for Locating Information", PLoS ONE 3 (2): e1683, DOI:10.1371/journal.pone.0001683.
- ↑ Bollen, Johan; Marko A. Rodriquez, Herbert Van de Sompel (2006-12-23). "Journal status". Scientometrics 69 (3): 669–687. DOI:10.1007/s11192-006-0176-z. Retrieved on 2010-05-05.
- ↑ Journal Status, Johan Bollen, Marko A. Rodriguez, and Herbert Van de Sompel, May 17, 2006.
- ↑ Dellavalle RP, Schilling LM, Rodriguez MA, Van de Sompel H, Bollen J (July 2007). "Refining dermatology journal impact factors using PageRank". J. Am. Acad. Dermatol. 57 (1): 116–9. DOI:10.1016/j.jaad.2007.03.005. PMID 17499388.
- ↑ Bergstrom CT. (2007) Eigenfactor: Measuring the value and prestige of scholarly journals. C&RL News 68(5).
- ↑ Philip Davis (2008-07-23). Eigenfactor. The Scholarly Kitchen.
- ↑ Bollen, Johan; Marko A. Rodriquez, Herbert Van de Sompel (2006-12-23). "Journal status". Scientometrics 69 (3): 669–687. DOI:10.1007/s11192-006-0176-z. Retrieved on 2010-05-05.
- ↑ Perneger TV (2004). "Relation between online "hit counts" and subsequent citations: prospective study of research papers in the BMJ". BMJ 329 (7465): 546–7. DOI:10.1136/bmj.329.7465.546. PMID 15345629.
- ↑ Lokker C, McKibbon KA, McKinlay RJ, Wilczynski NL, Haynes RB (2008). "Prediction of citation counts for clinical articles at two years using data available within three weeks of publication: retrospective cohort study". BMJ. DOI:10.1136/bmj.39482.526713.BE. PMID 18292132.
- ↑ Eysenbach G (2011). "Can tweets predict citations? Metrics of social impact based on twitter and correlation with traditional metrics of scientific impact". J Med Internet Res 13 (4): e123. DOI:10.2196/jmir.2012. PMID 22173204.
- ↑ Tweets, and Our Obsession with Alt Metrics, The Scholarly Kitchen. Retrieved on 2012-01-04.
- ↑ Anonymous. Scopus Journal Analyzer. Elsevier B.V. Retrieved on 2008-07-21.
- ↑ Falagas ME, Kouranos VD, Arencibia-Jorge R, Karageorgopoulos DE. Comparison of SCImago journal rank indicator with journal impact factor. FASEB J. 2008 Apr 11. PMID 18408168.
- ↑ http://www.scimagojr.com/help.php
- ↑ Bernstam EV, Herskovic JR, Aphinyanaphongs Y, Aliferis CF, Sriram MG, Hersh WR. Using citation data to improve retrieval from MEDLINE. J Am Med Inform Assoc. 2006 Jan-Feb;13(1):96–105. Epub 2005 Oct 12. PMID 16221938.