’Tis the season for student projects!
As a postdoc I’m not teaching this year and I’m not pining for a pile of grading, but this is a bit of the academic phenology that I miss. I’ve found myself idly re-listening to my favorite student-created podcast on alpine heath snowbank communities from a final project last spring. But while I wallow in the nostalgia of the ghosts of student projects past, a recent PLOS ONE paper presents an innovative partnership of student projects and citizen science in Australia.
Here, Nicola Mitchell and her colleagues showcase the Journal Project — a semester-long project for first-year students in large (266–586 students) introductory biology classes that incorporates natural history, data analysis, and writing peer-reviewed papers. Mitchell, a senior lecturer at the University of Western Australia, has led the Journal Project since its inception in 2011. After reading this paper, I’m torn between two equally strong urges. Should I pose as a freshman, Never-Been-Kissed-style, and join Mitchell’s class, or wholesale borrow this project design and teach a Journal Project with an American citizen science program?
Citizen science describes programs that involve non-scientists (i.e., the citizens, though recently organizations like The Audubon Society have moved toward the term ‘community science’) in some (or all) aspects of study design, data collection, or data analysis. The most common model of citizen science draws on volunteer-collected data to expand the spatial or temporal scale of observations, while scientists determine the questions, analysis, and interpretation of the data.
Earlier this month Ferris Jabr published a wonderful essay in New York Times Magazine on the citizen science app iNaturalist. With iNaturalist, you can snap a photo of an organism and receive identification help almost immediately, placing a name in the palm of your hand. Jabr writes: “Learning the names of wild things changes the way we look at nature and the way we think about it… [It] is an exercise in perspective and empathy, transforming the outdoors from a pastoral backdrop into a world of parallel societies inhabited by diverse creatures, each with its own character and career.”
This sentiment so beautifully matches the advice of one of my favorite botanical heroes: “never be content with the common name only. Search, inquire, study, until you have discovered the title by which science recognizes your favorite. There are dozens of swamp pinks; there is only one Arethusa bulbosa; there are scores of Mayflowers, but only one Epigaea repens.” This is from Annie Sawyer Downs, and though she was a 19th century botanist, I believe she would have enjoyed iNaturalist for the same reasons that Jabr does.
Just last weekend iNaturalist recorded its 7 millionth observation; the app has become a repository, archiving our human encounters (smartphone in hand) with the natural world over the last nine years.
The Journal Project is linked to the Australian phenology citizen science program ClimateWatch. ClimateWatch, like iNaturalist, depends on volunteer-collected data. It is specifically focused on documenting changes in seasonal events (budding, leafing, flowering) and behavior (nesting, breeding, migration) for a suite of common species — the phenologies of these species are tracked as indicators of climate change.
The Journal Project first requires students to collect data for ClimateWatch, and then to assess the program’s volunteer-collected data. I genuinely love the requirement to collect data — in her paper Mitchell includes this quote from a student about their experience collecting ClimateWatch data: “it creat[ed] awareness of the different kinds of plants and birds. They are not just trees, they are now jacarandas and banksia and birds are not black-tail bird or crows they are willie wagtails and magpie lark etc.” This naming is the Jabr & Sawyer Downs effect, and I’m so impressed that a large intro bio class was able to facilitate this kind of natural history engagement through citizen science. But the data collection is just the beginning: early in the semester students are divided into teams and given a raw dataset — all of the records (i.e., no data quality checks) for a given species that have been submitted to ClimateWatch since the launch of its website in 2009.
Assessing the quality of volunteer-collected data can be thorny: were the volunteers where they say they were? Did they actually see what they said they saw? I emailed with Emily Bennett, a recent student who participated in the Journal Project, and asked which part of the experience was more challenging, making her own observations or assessing the quality of others’ observations. Bennett writes:
Contrary to my initial expectations before undertaking the Journal Project, I found that assessing the quality of others’ observations was one of the most challenging components of the Journal Project; the quantity of observations provided by ClimateWatch Citizen Scientists and combined with the need to generate extensive and yet efficient criteria posed a significant challenge to my group, especially when we realised that the majority of the observations provided did not meet our specified criteria needed to be deemed reliable…After considering criteria for evaluating research from other scientists, I found that making my own observations was a much easier task, and that I was actively seeking ways to improve my observations in both detail and accuracy to ensure that it would meet the criteria for data quality and be of use to other scientists.
(I am beyond impressed by Bennett and her fellow students here: I assessed the quality of volunteer-collected phenology data in New Hampshire as a M.S. student, and if you want to know more, you should definitely cite McDonough MacKenzie et al. 2017).
The Journal Project presents students with real data and asks them to engage with two big real-world questions: Do their data provide evidence of phenological or distributional shifts? Does citizen science produce reliable data? Each Journal Project group writes a paper that grapples with one or both of these questions, and then submits this paper to the peer-review process. Peers (other students) review the papers, but so do ‘Subject Editors’ (PhD students and postdocs). Revisions are re-submitted and the best articles are published in an online student journal, Cygnus, on the final day of the semester.
I emailed with two Subject Editors because I wanted to hear more about their role in the class and how closely the Journal Project mirrored the reality of writing a peer-reviewed paper.
Mavra Grimonprez explained: “Subject editing…is a very interesting process: we do not have contact with the students [aside from] email and reading their articles, exactly as peer-reviewers do. So we are not biased in any way while marking as we can just assess the quality of the work submitted to us…though we still get a good picture of the writers’ personality through reading. It is a thrilling adventure to watch the students improve, see who is responsive to feedback, who is not, watch the groups’ dynamics unfold, etc.”
When I asked what the Subject Editors received from the experience, Grimonprez said, “I was writing my own article when I applied to be a subject editor: all the guidance and advice I had received from my supervisor, I could in turn provide it to the students and a lot of it made suddenly even more sense as I was in the reviewer’s shoes!” She also said that serving as a Subject Editor improved her own writing, a sentiment echoed by Jamie Tedeschi, who wrote “Serving as Subject Editor has not only improved my own manuscript writing, but [gave] me confidence as a manuscript reviewer…I have become a better critic of scientific writing, and have learned how to give valuable and constructive feedback to my peers of all stages in their careers, whether they be undergraduate students, colleagues or supervisors.”
Another beneficiary of the Journal Project is ClimateWatch itself. Mitchell and her coauthors report that by November 2014, 41% of the ClimateWatch records were from University of Western Australia students. Citizen science programs are generally inundated with retirees volunteering to collect data, but often struggle to engage younger audiences. The Journal Project exposes students to citizen science, but it also flips the script as the participants analyze their own data and grapple with data quality in a volunteer-based program. As Mitchell told me, “I guess, like the students, I was pretty shocked to discover how many errors are in the data (some datasets more so than others) – basically the citizen scientists frequently record the wrong species. That said, there IS a lot of good data in each species dataset, and now that ClimateWatch has matured to seven years old, the largest datasets would be valuable for phenological research once erroneous data are removed.” Mitchell’s students are engaged in natural history, data assessment and data analysis, paper writing, peer review, and publishing — an ambitious, broad, and inquiry-based experience that produces both incredible projects and well-trained scientists.
Mitchell shared some recent student reflections with me, including these two heart-warmers:
“Participating in ClimateWatch and the Journal Project have change[d] my perspective of science and affirmed to me that I am on the right path regarding careers…The Journal project was very intimidating, but the experience was rewarding and taught so much more than just learning from a syllabus.” — Emma Kuzminski
“We would like to thank you for the submission of our paper to the Cygnus journal. This has made our day. Being in a scientific peer reviewed journal, where people can see our paper and use it for further research is an amazing start to all of our careers… For a team that was petrified at the start of the year, this has filled us with confidence.” —Sean Davey (and team)
I encourage you to cuddle up with a copy of Mitchell’s PLOS ONE paper this winter. You’ll drift off to sleep with visions of student engagement dancing in your head… Happy holidays!