
Insights from a survey-based analysis of the academic job market

Jason D. Fernandes, Sarvenaz Sarabipour, Christopher T. Smith, Natalie M. Niemi, Nafisa M. Jadavji, Ariangela J. Kozik, Alex S. Holehouse, Vikas Pejaver, Orsolya Symmons, Alexandre W. Bisson Filho, Amanda Haage

Preprint posted on 2 December 2019 https://www.biorxiv.org/content/10.1101/796466v2

Article now published in eLife at http://dx.doi.org/10.7554/eLife.54097

Shining a spotlight into the black box of the academic job market

Selected by Jonny Coates, Dey Lab, Sejal Davla, Maiko Kitaoka

Many recent articles and studies have highlighted the increasing number of STEM PhD degrees awarded globally each year; this stands in sharp contrast to the shrinking and highly competitive academic job market. Those who aim to continue in academia after their stint as a postdoc face a time-consuming and difficult application process before “making it” and reaching the ultimate academic goal: a tenure-track or permanent faculty position at a research university or institute.

However, even as the number of applicants has continued to increase, and the applicant pool has become ever more competitive and diverse, the hiring process has stagnated. Anecdotal information dominates the narrative on how exactly applicants are evaluated and which combination of qualities hiring committees are searching for in the next wave of new faculty. Previous studies have called for more transparency in the process, but there is little concrete evidence to establish what makes a good faculty candidate. Consequently, potential applicants are often left to use their best judgment to navigate the job market. Fernandes and Sarabipour et al. capitalize on the large pool of applicants from the 2018-2019 cycle to determine what the application process is really like, through a survey distributed worldwide via social media and university mailing lists. In addition, they surveyed faculty on hiring committees to see both sides of the process and determine what current faculty expect from potential new colleagues.

Remit/scope of the study

The lack of transparency and feedback in the job market makes it difficult to compete for academic positions. The respondents, applicants in the 2018-19 application cycle, were largely based in the life sciences and in North America. Over half of the respondents (54%) were successful in their faculty application, and all were at early stages of their careers.

Key conclusions

Several important points stand out immediately:

  1. Candidates receive virtually no feedback at any stage of the process, either from the institutions they apply to or from their direct mentors.
  2. No single metric alone guarantees a job offer, but some combinations of metrics help the applicant, such as a higher number of citations or postdoctoral and independent/transition funding (e.g. a K99 in the US). Overall, hiring committees appear to be seeking well-rounded candidates.
  3. Contrary to popular belief, the majority of applicants did not have a first-author Cell, Nature, or Science (CNS) paper, and this did not impede the success of their applications. This demonstrates a clear mismatch of expectations, again pointing to a lack of information and communication.
  4. Individuals who focused only on academic job applications, rather than also applying for non-academic jobs, were more prepared and had higher success rates.
  5. Search committees see interdisciplinary research as a weakness, even though such research is favoured by institutions and funding agencies.
  6. Women fared better in securing fellowships, but men outperformed them in all publication and citation metrics; these metrics are, however, influenced by gender-biased publication practices.

It’s all about the future

Applicants often viewed a lack of funding and/or CNS papers as major obstacles to a successful application. However, hiring committees stated that the single most important factor in determining a successful application was the research proposal and future plans, which may surprise potential applicants who assume other metrics carry more weight. Fernandes and Sarabipour et al. offer good advice for applicants, stating that “trainees should avoid overreliance on any single criteria…but should instead focus on communicating why their proposed research program is of interest”. This should remove some of the pressure that applicants put on themselves, while also providing a strong argument for more rounded career development and opportunities for trainees.

Preprints

A growing body of evidence shows that preprints accelerate the pace of scientific knowledge sharing: they are entirely open access and are often posted together with submission to a journal (Abdill and Blekhman 2019; Fraser et al. 2019). As preLighters and avid supporters of preprints, we were excited to see that both applicants and search committees viewed preprints favourably during the hiring process, as a way to demonstrate scientific productivity. More than half of the applicants had posted preprints during their career, and 40% had a preprint that had not yet been published in a journal at the time of their faculty application. The increasing acceptance of preprints in STEM fields, despite these manuscripts not being peer-reviewed, is a significant step forward for early career researchers in improving their chances on the job market.

Mentoring

Mentorship emerged as a key factor throughout the preprint. Applicants listed “poor mentorship” as one of the biggest obstacles to success, while search committees identified mentorship as essential to it. This preprint, in addition to others, highlights a clear mandate for institution-level action to provide mentorship to trainees at every stage of training, not just to PhD students. Institutions should implement formal mentorship programmes for trainees, and applicants should seek out additional sources of mentorship, such as the Future PI Slack community or mentoring programmes run by various societies.

Next steps: How to shift the culture

Overall, it’s clear that the academic job search is an antiquated system, and there is a dire need for increased transparency and communication throughout the process. More than 50% of the applicants were successful in receiving at least one job offer, but in the authors’ own words, “Of over 300 responses by job applicants, we did not receive a single positive comment on the process”. Search committees need to clarify what they value in an applicant, and current faculty need to be aware of these expectations when advising and mentoring their postdocs through the process. While most people likely agree with the principle of greater transparency, the outstanding question of how the academic community can effect these changes remains. The issues highlighted by this and other studies, such as the lack of transparency and incorrect assumptions, remain in the spotlight. Unfortunately, even when the need for change is widely accepted, academic culture has yet to progress to match expectations.

The key take-home is that even something as simple as a timely rejection email could save applicants valuable time and stress. As applicants spend, on average, three hours personalizing their application materials, a few lines of feedback on the candidate’s application from the search committee would be a valuable practice in this confusing process. While providing feedback may be more time-consuming for a small search committee overloaded with applications, acknowledging the time, effort, and stress involved makes a world of difference and could help applications improve with each year. As for the applicants themselves? This study highlights key metrics that will improve their chances in the academic job market and provides a roadmap for early career trainees wanting to pursue an academic career.

 

Questions for the authors: 

  1. Does the length of the postdoc impact success/offer rates? This is particularly relevant for overseas applicants who are applying to US-based positions.
  2. Can you speculate on other job markets outside of the US (or break down the data you currently have)? In addition, is it possible to determine whether international applicants experience an advantage/disadvantage compared to home applicants in the US and other job markets?
  3. Lack of mentorship is a huge issue that appears in the data. Did the applicants propose any guidelines to improve mentorship?
  4. Search committees stated that there were too many good applicants for the number of positions, and that applicants often underperformed at the interview stage. Does this bias the process towards more confident, advantaged applicants? Is there a disadvantage to female and other underrepresented minority candidates?
  5. How would you design a follow-up study to this? Is there an opportunity to re-capture the same pool in the future for longitudinal studies?

 

References:

  1. Abdill RJ, Blekhman R. Tracking the popularity and outcomes of all bioRxiv preprints. eLife 2019;8:e45133.
  2. Fraser N, Momeni F, Mayr P, Peters I. The effect of bioRxiv preprints on citations and altmetrics. bioRxiv 2019. doi: https://doi.org/10.1101/673665.

Tags: academia, academic culture, faculty job market, mentoring, postdocs, preprints, stem, tenure-track

Posted on: 8 December 2019

doi: https://doi.org/10.1242/prelights.15649


Author's response

Amanda Haage shared

Questions for the authors: 

 

  1. Does the length of the postdoc impact success/offer rates? This is particularly relevant for overseas applicants who are applying to US-based positions.

For postdocs above or below the median length of 4 years, we saw no significant difference in offer % (the number of offers a candidate receives divided by the number of applications they sent; see Figure 4B). We didn’t look at very long or short postdocs, nor did we examine any possible relationship between postdoc length, country of training, and offer %. We think our non-US dataset is probably too small to say much anyway.

  2. Can you speculate on other job markets outside of the US (or break down the data you currently have)? In addition, is it possible to determine whether international applicants experience an advantage/disadvantage compared to home applicants in the US and other job markets?

We don’t have enough data to speculate accurately. Although we didn’t restrict the survey to any particular market, our responses were mostly from the North American job market. We hope that publishing these results will attract more respondents to our future surveys and open up the possibility of comparing different countries’ markets. Looking back through the data, we received only 9 responses from applicants who were looking for jobs outside the country in which they were currently working. From personal experience in our group, this can be a disadvantage for those targeting primarily undergraduate institutions (PUIs), as these smaller schools usually cannot sponsor visa applications.

Another issue highlighted by our respondents’ comments was that non-US postdoctoral researchers face challenges in obtaining US-based fellowships that require US permanent residency or citizenship. This could be examined further in our future surveys.

  3. Lack of mentorship is a huge issue that appears in the data. Did the applicants propose any guidelines to improve mentorship?

The applicants themselves didn’t propose any specific guidelines in their comments on the lack of mentorship, but we think our data support a need for improved mentorship at various career stages in academia. Applicants specifically wished for more feedback on their application materials, as well as more general mentorship on the process of being on the market. We hope that PIs reading our survey will realize that applicants overall feel they need more direct guidance through this process. In line with what other groups have proposed, we think that mentorship training for PIs and trainees, established by training institutions, would be beneficial and would help alleviate some of the problems highlighted by our respondents in the long term. Mentorship and trainee satisfaction could also be factored into the criteria for tenure and promotion.

  4. Search committees stated that there were too many good applicants for the number of positions, and that applicants often underperformed at the interview stage. Does this bias the process towards more confident, advantaged applicants?

We aren’t able to say, as we don’t have any measurement of “confidence”. However, application number was the strongest factor in obtaining an offer, so applicants who talked themselves out of applying for a position, or feared they weren’t “ready” or weren’t a good “fit”, may have cost themselves a shot at an interview. That said, a confident person who gets an interview but gives an average talk isn’t likely to get an offer either, so it’s really hard to say how or whether this biases the system. Regardless, we’re hoping that studies like ours can increase transparency, and thereby give people awareness and possibly some additional confidence as they navigate the job search process.

  5. Is there a disadvantage to female and other underrepresented minority candidates?

We weren’t able to look at underrepresented minority (URM) candidates due to concerns regarding data privacy, but we hope to collect that kind of data responsibly in the future. We did look at self-identified gender and found that traditional publication metrics were biased against women (many of these biases have been previously documented). That said, the outcomes for women and men were roughly the same, suggesting that many search committees may be accounting for this kind of bias. Because of these differences in publication metrics, female gender was significantly and positively associated with offer status in our logistic regression. That doesn’t mean women were getting more jobs; just that they were getting offers at roughly the same rate as men while still facing biases in publication metrics.
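For readers unfamiliar with this kind of analysis, a minimal sketch of a logistic regression relating offer status to gender while adjusting for publication metrics is shown below. This is not the authors’ analysis code: the data file and all column names are hypothetical placeholders, and the model is only an illustration of the general approach described in the answer above.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical survey table: one row per applicant, with a binary offer outcome,
# self-identified gender, and publication/citation metrics (placeholder names).
df = pd.read_csv("survey_responses.csv")

# Logistic regression of offer status on gender, adjusting for publication metrics.
# Coefficients are log-odds; a positive coefficient indicates higher odds of an offer.
model = smf.logit(
    "received_offer ~ C(gender) + first_author_papers + total_citations",
    data=df,
).fit()

print(model.summary())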

  6. How would you design a follow-up study to this? Is there an opportunity to re-capture the same pool in the future for longitudinal studies?

We are interested in pursuing a follow-up study. This work came about as a result of a collaboration between members of the Future PI Slack community, so we had to figure out a lot of basic logistics regarding survey design, data collection, and analysis. We’re hoping that having worked through many of these issues will allow us to ask questions about things like URM status (which we avoided this time due to privacy concerns), examine if and how the market is changing from year to year, and expand the search committee survey. We’ve also received feedback from postdoctoral researchers and faculty about what they’d like to see in a new survey (analysis of particular markets, journals besides CNS, measurements of mentor reputation), and we will see if we can integrate some of these parameters into future analyses.
