Journal rankings are often based on average citation metrics. But a journal can have a high average because it published a few great articles and many mediocre ones, or because it published many consistently good articles. A single number per journal also cannot tell us how much overlap there is in the distribution of citations across journals. Of course, citations are far from a perfect quality metric, but they are arguably the best metric we currently have, and authors who are more highly cited are certainly more influential in the profession, on average.
I investigated the full distribution of journals’ citations, combining Academic Sequitur data with citation data from Semantic Scholar.* The exercise is simple: take all the papers that a journal has published in the past 5 years (2016-2021) and calculate the share of papers with fewer than C citations per year for various values of C. Plotting the shares against C gives a cumulative distribution function (CDF) of citations for each journal. The closer a journal’s CDF is to the x-axis, the more highly cited papers it has.
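The calculation described above can be sketched in a few lines. This is a minimal illustration with made-up citation rates, not the actual matched Semantic Scholar data: for each threshold C, it computes the share of a journal's papers with fewer than C citations per year.

```python
import numpy as np

def citation_cdf(cites_per_year, thresholds):
    """Share of papers with fewer than C citations/year, for each C."""
    cites = np.asarray(cites_per_year, dtype=float)
    return [(cites < c).mean() for c in thresholds]

# Hypothetical citations-per-year for one journal's 2016-2021 papers.
papers = [0.5, 2, 3, 7, 12, 15, 22, 40, 80, 150]
thresholds = [1, 5, 10, 25, 50, 100]
print(citation_cdf(papers, thresholds))  # [0.1, 0.3, 0.4, 0.7, 0.8, 0.9]
```

Plotting these shares against the thresholds gives the CDF curves shown in the figures; a curve closer to the x-axis means more highly cited papers.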
The CDFs for the top 5 economics journals are shown below. It’s very clear that the Quarterly Journal of Economics leads the pack, with many more highly cited papers than the other four journals across the distribution of C. For “moderate” citation numbers, the Journal of Political Economy is next, but it has fewer highly cited articles than the other top 5 journals. American Economic Review and Review of Economic Studies are pretty close together, but AER has more papers with very few citations and more papers with large numbers of citations. Econometrica has substantially more papers that are cited infrequently than the other four journals, but catches up to three of them at the top of the distribution.
The next obvious exercise is to compare the distribution of citations in the top 5 journals to other journals. If people like these graphs, I will follow up with more journals, but let’s start with the AEJs. To keep things simple, I combined the top 5 journal data into one CDF (which is overall pretty similar to AER’s CDF).
A couple of things are worth highlighting. Even though AEJ: Macro has a relatively high number of low-cited articles, it’s a powerhouse when it comes to highly cited articles, outperforming the top 5 index. The best articles in AEJ: Applied are as good as those in the top 5 (at least up to the 100 cites/year cutoff). AEJ: Applied also has a notably better distribution of citations than AEJ: Policy. Finally, AEJ: Micro is clearly the worst of the bunch, although perhaps theory papers are just cited less frequently?
And, of course, there’s a good amount of overlap across journals. If your paper is being cited 15 times per year, it’s doing better than 60 percent of the papers in the top 5! About 31 percent of AEJ: Macro papers, 28 percent of AEJ: Applied papers, 24 percent of AEJ: Policy papers, and 9 percent of AEJ: Micro papers have 15 or more citations per year, so we’re not talking peanuts here. Such overlaps are why I think it’s so important to follow new relevant papers across a variety of journals.
What do you think about this way of visualizing journal quality? Which journal(s) do you want to see the distribution for?
*Due to the large number of articles, the matching process was automated, so the data aren’t perfect (e.g., some articles could not be matched). But any mismatches are unlikely to substantively affect the conclusions.