Gender Effects on Authorship Credit
Recent research indicates that in some academic disciplines, the gender of one’s coauthors can significantly affect one’s career. Justin Wolfers, in the Jan 8 online NY Times, summarizes the work of Heather Sarsons in her Harvard dissertation in economics (a deliberately solo-authored paper): “[In economics], [w]hen women write with men, their tenure prospects don’t improve at all. That is, women get essentially zero credit for the collaborative work with men. Papers written by women in collaboration with both a male and female co-author yield partial credit. It is only when women write with other women that they are given full credit. These differences are statistically significant… The bias that Ms. Sarsons documents is so large that it may account on its own for another statistic: Female economists are twice as likely to be denied tenure as their male colleagues.” Wolfers supplements this with some anecdotes and refers to an earlier article of his documenting such a bias in the wider sphere, such as news coverage. Sarsons notes that the effect seems to be discipline dependent; she finds no such effect in sociology, for example, and a couple of conjectured explanations are offered.
Locations of Newly Discovered Species Hidden
“Academic journals have begun withholding the geographical locations of newly discovered species after poachers used the information in peer-reviewed papers to collect previously unknown lizards, frogs and snakes from the wild,” reports Arthur Neslen in the Jan 1 edition of The Guardian. Such species usually have a rather small population and geographical extent, and are often located in countries in which their protection has very low priority. In earlier cases, some going back decades, specimens of a newly described species have appeared on the commercial market within weeks of publication, in some cases severely depleting the wild population.
Gender Bias in Teaching Evaluations
A statistician believes the evidence that students’ evaluations of teachers in colleges and universities disadvantage female teachers, owing to student bias, has accumulated to the point that class action lawsuits concerning the use of such evaluations in hiring, promotion and tenure will start this year. Philip Stark, a professor of statistics at Berkeley, together with junior colleagues Anne Boring in Paris and Kellie Ottoboni at Berkeley, states in the abstract of a paper published in January, “SET [Student evaluations of teaching] are biased against female instructors by an amount that is large and statistically significant…gender biases can be large enough to cause more effective instructors to get lower SET than less effective instructors.” “These findings are based on nonparametric statistical tests applied to two datasets: 23,001 SET of 379 instructors by 4,423 students in six mandatory first-year courses in a five-year natural experiment at a French university, and 43 SET for four sections of an online course in a randomized, controlled, blind experiment at a US university.” In an article concerning this study, Stark is quoted: “Replication of this kind of experiment and analysis elsewhere would strengthen the argument. Eventually, lawsuits will lead universities to do the right thing, if only to mitigate financial risks.”
Rutgers Faculty Vote on Tracking Faculty Productivity
In December, the faculty of the School of Arts and Sciences of Rutgers University-New Brunswick, by a vote of 92 to 40, adopted a resolution calling on the School to stop using data obtained via a contract with Academic Analytics, LLC, a proprietary database for tracking faculty productivity, in decisions about faculty members’ careers. The resolution also called on the School to release to each faculty member the personal data collected by the firm. This was reported in a short note in Inside Higher Ed, which has links to earlier analyses and to the text of the resolution; there are also articles in The Chronicle of Higher Education and other venues. The University of Maryland College Park was formerly a client of Academic Analytics, but according to the Office of Institutional Research, Planning and Assessment, the University could not validate the data it received and dropped the contract without using the data for anything substantive, at either the unit or the faculty level.