Category Archives: OnTheMedia

What’s in a last name?

In a new paper published today in PNAS, Jacopo and I try to extract as much information as possible from an ostensibly meager source of data: a list of the names of all the researchers working in Italy, at the Centre National de la Recherche Scientifique in France, and at public universities in the US.

We show that by using simple randomizations, one can highlight several interesting facts about these different academic systems.

In how many ways can you randomize a list of last names?

The work is a bona fide exercise in style: by introducing subtle variations of the randomization algorithm, we show that in Italy researchers work in the region where they were born and raised, while in the US geography does not influence the distribution of researchers; in France, we can detect academic couples working in the same unit; we demonstrate that academic nepotism in Italy (the focus of a previous paper of mine) is declining; finally, we show that in the US immigration is field-specific.
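The flavor of these randomizations can be sketched in a few lines. Assuming the data are simply lists of last names per unit (department, region, etc.) — the actual data and code are on GitHub; this is not the published implementation — a minimal test compares the observed within-unit name sharing to a null model in which names are shuffled across units:

```python
import random
from collections import Counter

def shared_pairs(names):
    """Count pairs of researchers sharing a last name within one unit."""
    return sum(c * (c - 1) // 2 for c in Counter(names).values())

def excess_sharing(units, n_rand=1000, seed=0):
    """Compare observed within-unit last-name sharing to a null model
    that shuffles names across all units, preserving unit sizes.
    Returns (observed, mean under the null, one-sided p-value)."""
    rng = random.Random(seed)
    observed = sum(shared_pairs(u) for u in units)
    pool = [name for u in units for name in u]
    sizes = [len(u) for u in units]
    null = []
    for _ in range(n_rand):
        rng.shuffle(pool)
        it = iter(pool)
        null.append(sum(shared_pairs([next(it) for _ in range(s)])
                        for s in sizes))
    p = sum(x >= observed for x in null) / n_rand
    return observed, sum(null) / len(null), p
```

Restricting which names may be swapped with which (e.g., only within a region, or only within a field) is what produces the different "variations" of the test.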

Jacopo Grilli & Stefano Allesina
Last name analysis of mobility, gender imbalance, and nepotism across academic systems
PNAS, 2017

Because the article is paywalled for 6 months, I have stored a personal copy here.
The data and code are available on GitHub.

The article is being covered by the popular press. Below we link to some of the coverage:
In English: EurekAlert; Science Life; U Chicago News; Nature;
In Italian: Corriere della Sera; adnkronos; La Repubblica; Lettera43; Il Giornale; Wired; Wired (again); Corriere della Sera 1 and 2; Il Messaggero; Il Tempo; Sole 24 ore; Rai TV TG 1; TGCOM 24; Il Foglio (genius! …not really); Il Fatto Quotidiano; Il Foglio (again); La Stampa;
Other languages: Naked science (Russian); Science Times (Korean); (Russian); Xataka Ciencia (Spanish);

More is more (when it comes to abstracts)

In the summer of 2013, I was coordinating a class meant to prime incoming graduate students on what it takes to succeed in graduate school. One session dealt with writing good abstracts. You have heard the usual advice: keep it short and simple, avoid jargon, write it for a general reader, etc.

I thought that it would be fun to test whether following this type of advice increases readership (citations). After a few months, I pitched this idea to my friend James Evans, and we decided to try it out with the help of Cody Weinberger, an undergraduate student in my laboratory.

We collected about 1M abstracts from 8 disciplines, and we tested the impact of following the usual advice on citations, after accounting for obvious factors such as the age of the article, the journal where it was published, etc. To our surprise, we found that following some of the most common suggestions leads to a significant decrease in citations!
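As a rough sketch of this kind of analysis — not the published pipeline — suppose each paper has a log-citation count, a binary abstract feature (say, whether it follows a given piece of advice), and a journal-year group label that absorbs the "obvious factors". A within-group (fixed-effects style) regression then isolates the feature's effect:

```python
import numpy as np

def feature_effect(log_cites, feature, group_ids):
    """Within-group regression: demean outcome and feature inside each
    journal-year group, then fit a single slope by least squares.
    Demeaning absorbs journal and year effects, so the slope reflects
    only within-group variation in the abstract feature."""
    y = np.asarray(log_cites, float).copy()
    x = np.asarray(feature, float).copy()
    g = np.asarray(group_ids)
    for gid in np.unique(g):
        m = g == gid
        y[m] -= y[m].mean()
        x[m] -= x[m].mean()
    return float(x @ y / (x @ x))
```

A negative slope for a feature like "short abstract" would correspond to the surprising result above: papers following the advice gather fewer citations than their journal-year peers.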

You can read the results here:
Cody J. Weinberger, James A. Evans, Stefano Allesina
Ten Simple (Empirical) Rules for Writing Science
PLoS Computational Biology, 2015


The short article starts with a quote from Boyle’s “Proemial Essay”. Robert Boyle was one of the main proponents of using “modern” scientific articles (rather than books) to disseminate science. Amusingly, while describing the advantages of this approach, Boyle already laid out guidelines on how the essays should be written: we have been told how we should write our science for at least 350 years!

Update: The Chronicle of Higher Education features a Q&A with Cody.

Countries and Citations

Update: Joe Palca produced a story on this paper for the series “Joe’s Big Idea”. The story was broadcast today (15/12/14) in NPR’s Morning Edition.

We live in a world dominated by rankings. Besides soccer teams, movies, and restaurants, rankings of universities and researchers have become commonplace.

The Scientific Wealth of Nations has been measured in many ways, all centered on a very simple idea: if a country producing a certain proportion of papers (pp) accrues a much larger proportion of citations (pc), then the country is producing high-quality science. Conversely, countries for which pc < pp would produce lower-quality research.
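In code, this measure reduces to a ratio of shares. The country labels and counts below are made up for illustration:

```python
def scientific_wealth(papers, citations):
    """For each country, the ratio of its share of citations (pc) to its
    share of papers (pp). Values above 1 suggest above-average impact
    per paper; values below 1 suggest the opposite."""
    tot_p = sum(papers.values())
    tot_c = sum(citations.values())
    return {c: (citations[c] / tot_c) / (papers[c] / tot_p)
            for c in papers}

# Hypothetical counts: country "CH" writes 10% of the papers but
# collects 20% of the citations, so its ratio is 2.
ratios = scientific_wealth({"CH": 100, "XX": 900},
                           {"CH": 200, "XX": 800})
```

The paper's point, developed below, is that this simple ratio ignores where the papers were published.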

This appealing simplicity, however, conceals one of the most important factors determining the influence of a scientific article: the journal where it was published. Clearly, publishing a paper in Nature would guarantee a much wider audience than that reached by The Bulletin of Koala Research — even for papers of the same quality.

We thus took 1.25M articles in eight disciplines (from 1996 to 2012), and parsed the country of affiliation of all the authors. We then measured how the country(ies) of affiliation influenced the journal placement (i.e., where the paper was published) and the citation performance (i.e., whether the article received more or fewer citations than its “peers”). Unlike other studies, we kept a tally for each possible combination of countries, such that we can see which international collaborations are more effective.

[Figure: Effect of country(ies) of affiliation on the journal placement in ecology. Clearly, we should collaborate with people in Switzerland!]

The paper was published today in PLoS One:

Matthew J. Smith, Cody Weinberger, Emilio M. Bruna and Stefano Allesina
The Scientific Impact of Nations: Journal Placement and Citation Performance
PLoS One 9(10):e109195

Here’s the press release on the Computation Institute website.

Some afterthoughts:

  • Originally, we thought of measuring the effect of the institution (rather than country) of affiliation—how much is an Oxford affiliation worth? We’re sufficiently proficient in regular expressions to distinguish India from Indiana, but affiliations like The Miami University in Oxford, Ohio made us decide to stick with countries.
  • In the paper, we start by talking about the 1982 study by Peters and Ceci. This is one of the most intriguing papers I’ve ever seen, and even the lengthy commentary (which you can find here) is a pleasure to read.
  • In hindsight, we should have changed our own affiliations to the wonderful ones used by Peters & Ceci. The Northern Plain Center for Human Potential sounds just right!

Robots in Hiring Committees

When I sit on a hiring committee, I ask myself: is it even possible to predict who’s going to be a good scientist by looking at a CV? It turns out that yes, it is possible.

A piece in Nature on the delicate topic of measuring scientists’ productivity. Daniel Acuna, Konrad Kording and I tried to predict the future h-index of neuroscientists. You can find the article here:
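The idea can be sketched as a regression of future h-index on CV features. The published model used penalized regression over features such as current h-index, years since first article, and number of distinct journals; the unregularized least-squares version below, with made-up feature values, only illustrates the mechanics:

```python
import numpy as np

def fit_h_predictor(features, future_h):
    """Ordinary least squares of future h-index on CV features.
    `features` is a list of feature vectors (one per researcher);
    an intercept column is added automatically."""
    X = np.column_stack([np.ones(len(features)),
                         np.asarray(features, float)])
    coef, *_ = np.linalg.lstsq(X, np.asarray(future_h, float), rcond=None)
    return coef

def predict_h(coef, features):
    """Predict future h-index for new researchers."""
    X = np.column_stack([np.ones(len(features)),
                         np.asarray(features, float)])
    return X @ coef
```

In practice one would fit on a training cohort and evaluate predictions some years later — which is exactly the delicate part when such numbers feed into hiring decisions.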

Daniel E. Acuna, Stefano Allesina & Konrad P. Kording
Future impact: Predicting scientific success
Nature 489, 201–202

Nature has an editorial on this work:
Count on me

Some of the coverage:
The Chronicle of Higher Education