Here come old flat top / He come groovin’ up slowly / He got joo joo eyeball / Come together
Lennon & McCartney
We have been working on the applications of random matrix theory to ecology for four years. By now, it is quite clear that the most important challenge ahead is to extend the theory to the case of structured networks (as described here). A new study we just published is a first step in this direction:
In this work, we studied community matrices produced according to the cascade model, in which “big fish eat little fish”. These matrices look like this:
where the red squares represent negative coefficients (effects of predators on prey), and the blue ones positive coefficients (effects of prey on predators). These matrices produce a peculiar spectrum, suggestive of an “eyeball”:
In the paper, we derive simple, analytical results that allow us to approximate the spectrum (and hence the stability) of the eyeball.
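To make the construction concrete, here is a minimal sketch (in Python/NumPy, not the R package from the paper) of sampling a cascade-model community matrix and computing its spectrum. The parameter values, the half-normal interaction strengths, and the sign convention (predator effects in the upper triangle) are illustrative assumptions, not the paper’s exact specification.

```python
import numpy as np

# Sketch of a cascade-model community matrix (illustrative parameters).
# Species are ordered by "size"; for each interacting pair i < j, the
# bigger fish j eats the smaller fish i: the effect of the predator on
# the prey is negative, that of the prey on the predator is positive.
rng = np.random.default_rng(0)

S = 250        # number of species (assumed)
C = 0.2        # connectance: probability that a given pair interacts (assumed)
sigma = 1.0    # scale of the interaction strengths (assumed)

M = np.zeros((S, S))
for i in range(S):
    for j in range(i + 1, S):                    # i < j: j is the bigger fish
        if rng.random() < C:
            M[i, j] = -abs(rng.normal(0, sigma))  # predator j harms prey i
            M[j, i] = abs(rng.normal(0, sigma))   # prey i feeds predator j

eigs = np.linalg.eigvals(M)    # the spectrum ("eyeball") in the complex plane
print(eigs.real.max())         # the rightmost eigenvalue governs stability
```

Plotting `eigs` in the complex plane reproduces the eyeball-shaped spectrum; the real part of the rightmost eigenvalue is what the analytical approximations in the paper target.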
I wrote an R package that performs the analysis described in the paper, and published it on GitHub.
A bit of backstory: in December 2014, I gave a talk at UC Davis, and, at dinner, Sebastian Schreiber mentioned that if I liked problems involving eigenvalues, I should look at the classic Hanski & Ovaskainen model. Back in Chicago, Gyuri (who had just started his postdoc) and Jacopo (who was visiting from Italy) thought it would be a good project to jumpstart our collaboration…
In the summer of 2013, I was coordinating a class meant to prime incoming graduate students on what it takes to succeed in graduate school. One session dealt with writing good abstracts. You have heard the usual advice: keep it short and simple, avoid jargon, write it for a general reader, etc.
I thought that it would be fun to test whether following this type of advice increases readership (citations). After a few months, I pitched this idea to my friend James Evans, and we decided to try it out with the help of Cody Weinberger, an undergraduate student in my laboratory.
We collected about 1M abstracts from eight disciplines, and tested the impact of following the usual advice on citations, once we had accounted for obvious factors such as the age of the article, the journal where it was published, etc. To our surprise, we found that following some of the most common suggestions leads to a significant decrease in citations!
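The “accounting for obvious factors” step can be sketched as a regression with journal fixed effects. The following is a toy illustration on synthetic data; the variable names, effect sizes, and model form are all assumptions for the sake of example, not the paper’s actual specification.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Synthetic data (all names and effects invented for illustration):
abstract_len = rng.normal(150, 30, n)       # words in the abstract
journal = rng.integers(0, 5, n)             # 5 hypothetical journals
age = rng.integers(1, 15, n)                # years since publication
log_cites = (0.004 * abstract_len + 0.3 * journal + 0.1 * age
             + rng.normal(0, 0.5, n))       # longer abstracts "help" here

# Design matrix: feature of interest plus controls (journal dummies, age).
J = np.eye(5)[journal]                      # journal fixed effects
X = np.column_stack([abstract_len, age, J])
beta, *_ = np.linalg.lstsq(X, log_cites, rcond=None)

print(beta[0])   # effect of abstract length, net of journal and age
```

The point of the fixed effects is that a feature correlated with journal placement (say, short abstracts appearing in high-impact journals) is not mistaken for a direct effect on citations.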
The short article starts with a quote from Boyle’s “Proemial Essay”. Robert Boyle was one of the main proponents of using “modern” scientific articles (rather than books) to disseminate science. Amusingly, while describing the advantages of this approach, Boyle already laid out guidelines on how the essays should be written: we’ve been told how we should write our science for at least 350 years!
I thought that this would be a great opportunity to review the progress my laboratory has made on the study of the stability of large ecological systems. Even better, this article could outline a research program on this topic, listing the main challenges that we are facing.
My former student Si Tang (now pursuing a second PhD in Statistics) and I set to work with this idea in mind. You can now read this hybrid between a review and a list of “grand challenges”:
We live in a world dominated by rankings. Besides soccer teams, movies and restaurants, rankings of Universities and researchers have become commonplace.
The Scientific Wealth of Nations has been measured in many ways, all centered on a very simple idea: if a country producing a certain proportion of papers (pp) accrues a much larger proportion of citations (pc), then the country is producing high-quality science. Conversely, countries for which pc < pp would produce lower-quality research.
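As a worked example of that idea (with made-up numbers): a country producing 2% of the world’s papers but drawing 4% of the citations scores well on this measure.

```python
# Toy illustration of the "scientific wealth" comparison (numbers invented).
pp = 0.02   # proportion of the world's papers produced by a country
pc = 0.04   # proportion of the world's citations accrued by those papers

ratio = pc / pp
print(ratio)   # citations outpace output: "high-quality" science
# pc < pp (a ratio below 1) would instead flag lower-quality research
```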
This appealing simplicity, however, conceals one of the most important factors determining the influence of a scientific article, the journal where it was published. Clearly, publishing a paper in Nature would guarantee a much wider audience than that reached by The Bulletin of Koala Research — even for papers of the same quality.
We thus took 1.25M articles in eight disciplines (published between 1996 and 2012), and parsed the country of affiliation of all the authors. We then measured how the country(ies) of affiliation influenced journal placement (i.e., where the paper was published) and citation performance (i.e., whether the article received more or fewer citations than its “peers”). Unlike other studies, we kept a tally for each possible combination of countries, so that we could see which international collaborations are more effective.
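Tallying combinations of countries (rather than single countries) only requires treating each paper’s set of affiliations, regardless of order, as the unit of counting. A minimal sketch with invented data:

```python
from collections import Counter

# Toy tally of country combinations on papers (data invented for illustration).
papers = [
    ["US"], ["US", "UK"], ["UK", "US"], ["IT"], ["US", "UK", "CN"],
]

# frozenset ignores author order, so US–UK and UK–US count as one combination.
combos = Counter(frozenset(countries) for countries in papers)
print(combos[frozenset({"US", "UK"})])   # 2
```

Each combination’s papers can then be compared against their “peers” to ask which collaborations over- or under-perform.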
Originally, we thought of measuring the effect of the institution (rather than the country) of affiliation: how much is an Oxford affiliation worth? We are sufficiently proficient in regular expressions to distinguish India from Indiana, but affiliations like Miami University in Oxford, Ohio made us decide to stick with countries.
In the paper, we start by discussing the 1982 study by Peters and Ceci. It is one of the most intriguing papers I’ve ever seen, and even the lengthy commentary (which you can find here) is a pleasure to read.
In hindsight, we should have changed our own affiliations to the wonderful ones used by Peters & Ceci. The Northern Plain Center for Human Potential sounds just right!