FROM ROUTINE TO PLAYFUL: ENHANCING VIROLOGY TEACHING WITH SCIENCE FICTION AND GAMIFICATION
This article has been reviewed by the following groups
Listed in
- Evaluated articles (PREreview)
Abstract
Given that traditional methodologies for teaching about viruses often spark little interest among high school students, this research aimed to develop a dynamic and innovative methodological approach to teaching virus content through science fiction combined with active methodologies, in order to make learning more attractive and comprehensible to students. For this purpose, an experimental and qualitative approach was adopted with 20 second-year high school students from the Federal Institute of Education, Science, and Technology of Ceará. Initially, participants watched an 18-minute fictional video, accompanied by a dynamic questionnaire on the main virology concepts, presented in a fictional context. Subsequently, the students' perception of the produced material was assessed through a second questionnaire. The results indicated a high rate of correct answers on the first questionnaire, while the responses to the second questionnaire revealed that the strategy of presenting content through elements of science fiction was positively evaluated by 95% of the participants. The students' comments showed a preference for the adopted teaching technique over conventional teaching methods, supporting the inference that the dynamic approach was effective in arousing interest, promoting understanding, and making learning more engaging.
Article activity feed
This Zenodo record is a permanently preserved version of a PREreview. You can view the complete PREreview at https://prereview.org/reviews/14569352.
Commentary
First off, I wanted to congratulate you on the preprint and on publishing it here on PREreview. Briefly, I specialise in education and critical digital literacy in games. I have taken courses in statistical analysis and meta-research, including open research and peer review. I must note a certain bias towards progressive and transformative research. Please note that this review was conducted in about 4 hours.
Summary
This is a classroom test in which some teachers looked at whether an 18-minute video was or wasn't engaging for the 20 students who came that day. Calling it experimental and gamified strikes me as an overstatement: with such a reduced sample size it is complicated to even suggest avenues for research (let alone get into correlation). The paper does not seem to add a new technique to teaching, nor does it justify why the methods used are contextually relevant, or whether the content was adapted (personalised) in a particularly novel way.
Briefly, although the "video technique" may be somewhat atypical in virology teaching in Brazil, I do not think writing up a narrow situation like this pushes the scientific literature in the field forward. Without trying to be rude or to minimise your effort, ask yourself: what if you tested every class intervention and called it a paper because it had a survey? Would that create research noise? (I ask myself this too, in a "publish or perish" environment.)
Major limitations
I'd say I am missing a "table 1" to clarify details about the sample beyond those you mention in the first paragraph of the methodological procedures; I'd argue it would aid the clarity of the text. Furthermore, I am missing other contextual cues that could have affected the process: which classes did the students have before and after? What time of day was it? Did they have homework or exams, and was this a graded task? How long did it take? Who presented the activity, and why? This paper by Brown et al. summarises my advice on what to report to enhance scientific and public trust.
On a similar note, you mention the exclusion criterion was "Not adhering to the inclusion criteria"; does this mean that if a student intentionally answered at random, you would not have excluded their responses? What if they disrupted the classroom or added offensive qualitative comments? Also, regarding ethics, how were students prepared to participate in this research? How was the data anonymised, and how did you track and ensure there was consent? With a form, digital or physical, and when? I'd argue this is of relevance to curb possible novelty effects that could have inflated the scores you mention on page 16.
Education has a wealth of references and international literature on what works and what doesn't, so I suggest you focus on systematic reviews of techniques/interventions rather than specific papers (as you do in the Introduction). The line of action I am suggesting is intended to point readers towards resources that help them, instead of guiding them to particular, non-generalisable case studies.
To this end, I find Corwin's Visible Learning meta-analysis library and the Penn State Clearinghouse database of educational policy/classroom practices particularly useful.
I take it that your aim/objective is in the last paragraph of the introduction, right? Is it "This study aimed to explore the efficiency of a more engaging and interactive approach to teaching virology, using elements of science fiction and gamification as methodological tools. Through this new pedagogical strategy …"? If that's the case, I don't think this presents an exhaustive thesis: more engaging compared to what? How is this strategy really new? Why do you think it is relevant in your case?
Again, to make my point clear: you could present this article to a newspaper or use it internally to review your educational practices, and I would find that commendable, cool work! Nevertheless, I must point out that you present your approach as "new" and "innovative", but to me it sounds like any MOOC that uses narrative examples to catch its users' attention. If you accept this point, I think it would be fair to mention it in the paper.
I am not fond of arguments from authority like "Dickey (2005), in his analysis of immersive learning environments […] significantly increase student engagement and motivation", because I feel they detract from the why. If they increase engagement, say why. Is this correlation or causation? How did Dickey or other authors arrive at that conclusion? I may know this, but some readers may not.
Despite having plenty of sources, the citation style used for them is somewhat inconsistent. All references start with the authors, but some are followed by a full stop whereas others are not. Some have the publication date, whereas others have only the access date. You link half of your sources, and often use the non-secure http gateway. Some have their DOI written out and others have it as a link. You give the ISBN for some books, but not all. Because of the above, I found myself unable to locate some of the articles, like that of "Pliessnig". In summary, this lack of systematisation in the references can hinder replication and expansion efforts.
From what I gathered, the project used intentional (purposive) sampling, lacked blinding, and used an unvalidated instrument that might have had difficult or threatening questions. These aspects are not mentioned in the manuscript.
The selective reporting on page 14 led me to fear that your results were cherry-picked to match a specific paper narrative. Although not atypical nor necessarily forbidden, I am afraid you claim "In the light of the obtained results, it becomes clear that …" and nevertheless do not share the full results. Where can I find the raw data? Could you upload it to a repository like Zenodo or Figshare?
Additionally, I wonder how you analysed the results, given that you do not mention the protocols or procedures followed for the qualitative or quantitative surveys. Did you follow a strict set of guidelines to try to establish causation, as in descriptive research?
I assume this is not the case, so I'd argue you should mention that the analyses were exploratory, for transparency's sake. Otherwise, can you show that you scheduled and followed clear criteria (beyond being approved by your local ethics committee, which I thank you for)? In this sense, I recommend looking into the process before the actual research takes place: pre-registration is being adopted or required at increasing rates, and you may want to look into OSF.
Minor limitations
I was a bit frustrated that the article is in English and yet most of the sources are in Portuguese. I wonder whether alternatives exist that at least have the main texts translated into both languages. That would make the article faster to review for non-native speakers, and perhaps it could help you reach a broader audience.
The graphs have some consistency issues. I believe the clearest are charts 2 and 3, because they use a biromantic colour palette and indicate the variable and the number of responses as both a percentage and an absolute value (reading from left to right). The pie charts, on the other hand, would present difficulties for people with impaired vision because of the lack of colour contrast; I suggest adding the values and tags on the chart itself (and not in a side/bottom legend), or changing the visualisation to bars (see the sketch below).
Additionally, I believe all the graphs have fairly small text. Lastly, I'd recommend avoiding 3D graphs, especially those without labels, since the extra lines create noise and it can be difficult to grasp what value or percentage each variable has. Nevertheless, I appreciate that most charts are vector/text and not PNGs with artefacts.
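To make the bar-chart suggestion concrete, here is a minimal matplotlib sketch; the answer options and response counts are invented for illustration and are not the paper's data. The point is that each bar carries its absolute value and percentage directly, so no side legend is needed and colour contrast becomes less of an accessibility problem.

```python
# Minimal sketch: a horizontal bar chart with values labelled on the bars.
# The options and counts below are hypothetical, not taken from the preprint.
import matplotlib.pyplot as plt

options = ["Strongly agree", "Agree", "Neutral", "Disagree"]
counts = [12, 7, 1, 0]  # hypothetical answers from a 20-student class

fig, ax = plt.subplots(figsize=(6, 3))
bars = ax.barh(options, counts, color="#4C72B0")  # one high-contrast colour

# Print "count (percentage)" at the end of each bar instead of using a legend.
total = sum(counts)
ax.bar_label(bars, labels=[f"{c} ({c / total:.0%})" for c in counts], padding=3)

ax.set_xlabel("Number of responses")
ax.set_xlim(0, max(counts) + 4)  # leave room for the labels
ax.invert_yaxis()                # show the first option at the top
fig.tight_layout()
plt.show()
```

A single readable typeface size, direct labels, and a flat (non-3D) layout address the contrast, legend, and noise issues raised above in one go.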
To get a sense of variety and of more appealing data visualisation, which can help attract readers, I recommend looking at The Economist or The New York Times, and perhaps playing with the tool RAWGraphs.
I am aware of the popularity that authors like Foucault, Freire, or Nussbaum have gained in reshaping education away from the monotonous reproduction of Western values, and I see that you mention Freire and that some of the authors you credit use the others. Nevertheless, I think these tenets are not "new" and can be arrived at through informal, step-by-step logic. I'd personally like to see more science that promotes a procedural and inclusive understanding by walking through the topics, even if the papers are meant for people who already know these authors and their theses.
I appreciate the times when you cite the specific pages you used when referencing an author for an opinion, like "According to Pliessnig and Kovaliczn (2007, p. 5), the weakness of the methodologies used by biology teachers creates a dependency on the use of the textbook (when available) or limits them to using the chalkboard to "deliver content"". And yet, I think you needn't spend time on some of these claims; I'd trust readers to be able to see that textbooks and the chalkboard can create a reproductive, content-memorisation bias.
I found the section "Use of science fiction as a teaching resource" very enjoyable and straightforward. I'd argue that, if you wanted to go further, you could link it to Levitas' Utopia as Method, whereby fiction enables the construction of alternative futures, which contrasts with the stagnation of the capitalist realism that Mark Fisher describes.
I repeatedly read generalist/ambiguous terms (usually adjectives and adverbs) like "strategically delimited", "committed involvement", "it is important", "demonstrates a strong", "it is essential", or "highly promising". To me, these can make the paper sound less "scientific" and more "headline-seeking"; this might have been intentional, but I nevertheless wanted to warn you.
Finally, I generally recommend that any study include a small section on its limitations, as an act of scientific honesty, indicating possible biases during the process and threats to its validity.
Competing interests
The author declares that they have no competing interests.