Interactive Simulations in Physics Education

This document was written to fulfill the research paper component of the University of Waterloo’s Certificate in University Teaching program. It is not peer-reviewed or published in a formal journal.

Abstract

The use of interactive media including simulations, games, and responsive graphics in post-secondary education has increased commensurately with the proliferation of computing technology. Simulations can improve student engagement and facilitate discovery; interactive media can be especially useful in physics education, where concepts commonly have no obvious human-recognizable visual representation and often rely heavily on geometric analogy. Many authors in the last two decades have developed interactive simulations to teach concepts in fields such as electromagnetism and kinematics; large open educational resource (OER) collections of these media such as the University of Colorado’s Physics Education Technology (PhET) or the MyPhysicsLab project exist in an evolving educational ecosystem alongside commercial alternatives such as Pearson’s long-standing Mastering Physics courseware series and newer subscription-based libraries. While many authors have formally investigated the efficacy of these learning tools on student motivation and achievement, as well as developed data-based design principles, research investigating how best to incorporate these tools in course design is itself actively evolving. This paper reviews the former two items in the context of physics education, while casting a more critical eye to the latter and exploring the implementation of simulations in an undergraduate course.

Introduction and Background

Interactive Media in Education, Taxonomy

Interactive media in education takes many forms, from simple navigable video clips and polling instruments to intricate webpages and simulations. All these tools seek to actively incorporate input from learners and react accordingly to demonstrate particular concepts or direct learning processes. This paper considers simulations or sims; sims are electronic representations of phenomena. Learners may alter key parameters of a sim, which then faithfully demonstrates the resulting dynamics or representation of the phenomenon. We may contrast a sim with an interactive webpage or application: whereas a webpage may display content or material based on learner input, even in a sophisticated manner such as through identifying a learner’s current interest or ability, a sim provides a responsive, representative environment in which exploration and experimentation can occur.

Interactive Media in Education, Chronology

or: For $4200 CAD, I can get you 512kb RAM and a simulated cannonball

Formal academic interest in the efficacy and proper implementation of interactive media in higher education has existed as long as the technology itself: see e.g. reviews by Miller [1987] (1), De Jong and Van Joolingen [1998] (2), or Rutten et al. [2012] (3), which examine individual studies beginning in the early 1970s. Early authors were often attracted to computer simulations by the hope that they would better engender problem solving and inquiry skills (see e.g. (4,5)), as simulations offered students the ability both to perform experiments on phenomena which they normally couldn't meaningfully control, and to perform experiments faster, cheaper, or more safely than in conventional laboratory settings.

While individual early studies often showed ostensibly significant positive influences on student concept retention (6), hypothesis-forming (7), and attitude (8), meta analyses (attempting to synthesize these results into a cohesive pedagogical approach as computing technology became more widespread starting in the mid-1980s) found large variation in their efficacies. This was commonly attributed to the lack of consideration for how new technology was being incorporated into broader course design, with authors criticizing poor connection between simulations and lecture learning goals (1), simulations not being intentionally designed to facilitate hypothesis forming and data interpretation (2), and instructor passivity during simulation activities (3). We can ourselves readily find 20th century studies with no or negative results, e.g. (9).

An illustrative example can be found in a trio of early meta analyses published by Kulik, Kulik, and Bangert-Drowns examining the effectiveness of computer-based education in elementary (10), secondary (11), and post-secondary (12) settings. While the pre-university analyses found strong differences in efficacy based on the implementation of computer elements, these differences atrophied by the time learners entered university. The authors conjectured that this was due to the improved self-regulation capacity of older students, with the entertainment and structuring benefits of computer-assisted learning becoming less valuable. When we compare these initial results to more recent meta analyses which do show efficacy in post-secondary contexts (e.g. (13)), analyses which are themselves the beneficiaries of decades of research on integrating these tools into teaching theory, we might argue that these tools do not intrinsically improve learning in higher education. While useful, these tools remain strongly reliant on their intentional integration into existing pedagogical strategies.

In the last twenty years research has studied the design and implementation of simulations for class contexts more thoroughly. As personal computing devices have become ubiquitous and capable, literature has explored connections with other developing pedagogical paradigms such as flipped-classroom learning (e.g. (14,15)) and game-based learning (e.g. (16–18)).

A quick case for interactive media in physics

or: The Board is Not Enough

Although interactive media has been widely adopted by other fields including chemistry1, biology2, and, appropriately enough, computer science3, we confine ourselves to considering its use in physics education. We can establish the general merits of these tools in higher education while noting the unique appeal they offer to physics educators. Many authors have critiqued or measured the ineffectiveness of conventional lectures for developing hypothesis-forming and intuition skills in physics undergraduates, see e.g. (22–24). Indeed, in examining the literature we might see the development of sims for physics education as a particular manifestation of broader trends towards inquiry- and active-learning-based pedagogies.

Sims offer a representative environment which allows for learner-directed exploration of the phenomenon at hand; much contemporary literature examines sims through the lens of discovery/inquiry learning (26). Sims encourage students to form hypotheses and interrogate phenomena, instead of having an understanding recited to them. This flexibility allows students to form their own internal models for the concepts, and can have positive effects on engagement and interest (3).

Representation

Many physical disciplines are subject to issues of representation and the actual medium-based (as opposed to interactivity-, entertainment-, or inquiry-based) limitations of instructional elements. Student understanding of physical concepts is complicated by the fact that many relevant phenomena are largely if not entirely imperceptible to humans, or, in the case of quantum phenomena, not directly observable at all. Several authors have tied this to the importance of model-building in physics education, arguing that being able to manipulate and switch between active representations improves learners' ability to conceptualize difficult concepts, e.g. (27). Consider that if we subscribe to an analogy-centric view of explaining phenomena (a common enough perspective in contemporary physics literature, see e.g. reviews (28,29) and studies (30–32)), we find ourselves in the awkward position of having to create coherent references between an invisible process and some visible surrogate. Even having arrived at some representation which we find satisfying, we remain constrained by the medium used to actually… constitute it. By way of an example, suppose we are tasked with introducing the polarization state of electromagnetic radiation to students using the representation of transverse waves. In a conventional classroom setting with access to chalk and board, we might invoke the following diagrams4:

Explaining that the magnetic component points "into the board" and is thus not visible, we are stuck describing a 4D phenomenon (3 space + time) in a 2D medium (2 space and static), relying on paired projections into the space to illustrate the dynamics. Even having developed an enviable ability to draw in accurate 3D perspective, we would find ourselves a dimension short. Using a 4D medium instead (3 space and animated), we might invoke the following diagram (linked to decrease load on this page).
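To make concrete what the board sketch projects, the phenomenon being drawn is the propagating transverse field, which in standard notation (independent of any particular sim) takes a form like:

$$\mathbf{E}(z,t) = E_0 \cos(kz - \omega t)\,\hat{\mathbf{x}}, \qquad \mathbf{B}(z,t) = \frac{E_0}{c}\cos(kz - \omega t)\,\hat{\mathbf{y}}$$

The left-hand sides depend on a spatial coordinate and time and point along two further spatial directions, while a static board diagram can only render a slice at fixed $t$ projected onto the plane of the board, hence the reliance on paired projections.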

Interactivity

But an ostensible improvement in representation still remains 'flat' until we put it into action: we must also consider the value of interactivity. In the prior diagram, we are given the freedom to drag and select a perspective of our choosing, and to speed the phenomenon up and down5. Critically however, we can alter the constituent red and blue base states to see the effect on the output grey state. This allows learners to construct their own nascent understanding of the geometry of polarization, beyond being simply told through 2D diagrams that these states exist under certain conditions. This is a basic instance of learning a complex topic through interrogation, investigation, inquiry, discovery, or whatever near-synonym suits you best – sims allow learners to observe and prod phenomena themselves.
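The geometry the learner sweeps through by hand has a compact written counterpart. Identifying the red and blue states with the two orthogonal transverse components (an identification I make here for illustration), the output state is the superposition:

$$\mathbf{E}(z,t) = E_x \cos(kz - \omega t)\,\hat{\mathbf{x}} + E_y \cos(kz - \omega t + \varphi)\,\hat{\mathbf{y}}$$

Here a relative phase $\varphi = 0$ gives linear polarization, $\varphi = \pm\pi/2$ with $E_x = E_y$ gives circular polarization, and intermediate choices give elliptical states: precisely the parameter space a learner explores by dragging the base states.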

Efficiency and physics for the non-physicist

Not every student who needs to understand a given phenomenon has the luxury of spending a course (or even multiple courses) learning the requisite material to engage with it from a first-principles perspective. As an example, polarimetric analysis of RADAR data is becoming an increasingly important method in the remote sensing of climate systems. However, it is unreasonable to require environmental studies students to take an electromagnetism course and a Fourier analysis course simply to engage with one class of practical methods6. Once concrete learning goals have been identified, sims can allow students to investigate precisely the aspects of a phenomenon that are relevant, skipping unnecessary formalism. Moreover, sims can offer a more engaging, less intimidating environment to students unsure of their ability (33). Sims can also allow students to develop a nascent familiarity in less time than simple recitation (3).


Ultimately the appeal of sims then becomes the ability to personally investigate well-thought-out representations of physical phenomena, rather than rely on either alienatingly dense theoretical descriptions or high-level but disjointed characterizations.

The Limitations and Pitfalls of implementing sims

or: OK, but let’s walk back that excitement

Rutten et al.'s broad 2012 review of sim implementation concluded that while significant effort has been placed on studying their design, far less research considers sim development as a holistic part of course development, taking into account curriculum goals, overall lesson structure, and the role of the instructor during simulation activities (3). Early meta analysis of computer simulation studies found that students struggle with creating hypotheses and designing new experiments when these activities are not intentionally supported in the instruction design (2); these results mirror similar conclusions from studies on physical experimentation, see e.g. studies on inquiry- vs. verification-based laboratory design (34–36). Since the emergence of these technologies in the late 1980s, authors have argued that new media offers little or no intrinsic benefit to teaching and must be implemented with intention as a single tool in a broader pedagogical paradigm; see e.g. Salomon's critique (37), Dillenbourg's criticism and suggested methods of computer-integrated learning (38), or the conclusions of Schittek et al.'s meta analysis in the context of medical instruction (39).

This can perhaps be seen as an extension of a short-sighted tendency authors have identified: that of educators viewing instruction as the direct transfer of understanding to students, see e.g. Wieman's critique of conventional lecture-based instruction (40). Indeed, in my own attempt to fold interactive media into my lectures, I was quickly enamored with the potential to finally come up with representations of phenomena which did justice to my own internal model, unrestricted by conventional media. Yet this obscures the point of teaching: to be of service to learners, to help them achieve their learning development and goals. The entire point of these tools remains to encourage students to form their own models, and to see the interchangeability and relative merits of ostensibly "competing" models. My own experience and the literature at large remind us to be careful and intentional when creating these tools, ensuring they exist to serve a specific purpose within the broader course design and that they don't usurp our overall pedagogical strategy for a course.

Implementation Principles

So how do we use sims effectively? We can identify common methods of in-class sim use, which may be commingled in a given implementation.

Instructor-Led

Instructor-led uses of sims are straightforward extensions of conventional lectures with an obvious eye towards allowing more active learning. In basic implementations the sim may simply be a visual aid depicting the phenomena being discussed, with the instructor manipulating the parameters and demonstrating outcomes as those concepts arise in the lecture script. The instructor is also free to merely act as a 'gatekeeper' for the sim, supplying the class with goals to be achieved or hypotheses to be tested. Typical methods of active discussion (e.g. paired students, small groups, discussion circles, anonymous submission, clickers) can then be used to encourage participation and elicit responses. Studies of student responses to sims commonly find that while sims are effective at conveying concept knowledge, this then leads students to more complex questions that require the aid of a subject matter expert (27,41). Illustrative examples can be found in (19,24,42).

Guided Inquiry

In guided inquiry, students are invited to explore a sim freely while pursuing a set of questions or outcomes to generate. Students may be asked to explain how the phenomenon in question depends upon a particular parameter (e.g. (19,43)) or to seek a combination of inputs which produces a particular result (e.g. (42)). This is in obvious parallel to conventional laboratory design, and sims can be used outside of lecture in an analogous manner. However, the convenience and responsiveness of simulations make them well-suited to guided inquiry within the classroom as well. After providing initial background, instructors may ask students to actively investigate course concepts. When doing so, it is important to consider:

1 - Care must be taken to provide instructions which are specific enough to align purposefully with learning goals, but are not so specific as to constrain students' investigation of the simulated environment. Being overly prescriptive limits students' ability to form their own goals and hypotheses (19).

2 - Instructions and goals should be few and intentional, directly aligned with that lecture's learning goals. Providing too many goals may limit students' absorption of content and limit the amount of time available for thorough investigation (19).

3 - Peer-to-peer instruction and discussion are useful to inquiry. Guided inquiry tasks with sims can be approached in small groups, and easily folded into existing strategies such as think-pair-share.

Specific examples of guided inquiry with sims in-class can be found in, e.g. (19,43–45).

Free or Open Inquiry

In the absence of guidance we arrive at free or open inquiry, where students explore a simulated environment without externally-provided instruction or motivation. This method can be dangerous: it lacks inherent connection to learning goals, risks alienating or overwhelming students who feel intimidated by the content, and ultimately relies entirely on the sim design itself. In the related context of game-based learning, many authors have empirically shown the importance of ensuring the design guides learners to activities which are explicitly linked to desired cognitive processes7. Guidance may help decrease cognitive load, induce reflection which learners may otherwise eschew, and decrease frustration or feelings of inability (27,33). In their analysis of physics sim design, Adams et al. noted that "exploration is not always productive" and suggested identifying and pruning any avenues not aligned with primary learning goals (27)8. However, some authors suggest free inquiry can be a useful way for learners already capably familiar with other inquiry methods to develop hypothesis-building and experimental design skills (26,47). In these cases regular discussion, either in groups or with instructors, is likely critical to encourage reflection and contextualize knowledge gained. Free inquiry must be carefully implemented: while it is sometimes romanticized as simulating what it is to "do real science", taken to the extreme it may result in a deleterious condition known as "graduate studies".

Thinking of sims as tools for conducting investigation, we can see these categories map well onto inquiry-based pedagogy classifications, see e.g. the Confirmation/Structured/Guided/Open inquiry hierarchy of (26) or the Confirmation/Structured/Guided/Open/Authentic hierarchy of (48). Ultimately, we must ask ourselves: is our use of sims supporting or usurping the learning goals? Are the desired learning processes actually being induced in students?

Design Principles

Sims not developed intentionally as part of a broader collection or educational resource project often reflect the designer’s own perspective (or technical ability…). But what principles should guide us, if we want to add such a tool to our teaching?

Encouraging Exploration

As discussed earlier, countless authors have examined and extolled the importance of exploration or discovery learning in physics education, see e.g. (13). Sims then need to be designed from the start to facilitate exploration and interrogation of the phenomena at hand. This puts an emphasis on interactivity and generality: sims should allow students to manipulate all relevant parameters easily and concoct any combination they deem worth observing. Sims can also be designed, or supplemented with written material, to create puzzles or leading inquiries for the student to fixate upon and solve (27). As we look at the following design principles, we will see how they support exploration.

Credibility and Realism

If the goal is to get students to engage in active exploration, we need to construct sims that students think are worthwhile. The validity of the sim and its clear connection to students' experience is then key: in a study of students' responses to a quantum mechanics sim, learners were observed to take the activity less seriously and spend less time with it when instructors showed it produced erroneous results for a specific parameter combination (27). Evidently exploration becomes more frustrating when you can't be confident that the representation is accurate. Similarly, if the representation is not relevant, what's the point? Students respond more enthusiastically to sims with representations of familiar, everyday objects, and attempt to recreate responses they already expect in order to 'test' sims (49). Sims must be accurate and comprehensive (relative to their scope).

Fun!9

“Users were disappointed that the temperature could reach thousands of degrees and the box remained intact, so we added a feature where the lid flies off under extreme conditions. Now users are more satisfied.” (49)

Exploration is more captivating when it's, well, enjoyable. Students spend more time with and are more likely to engage deeply with sims they find at least somewhat enjoyable (41). Students are aware of how much learning they need to do to be successful in a competitive system: perhaps adding a little fun helps students feel more at ease and be less self-critical. Fun can be a distraction, however, essentially offering a different objective than actually interrogating the phenomenon. Care needs to be taken to remove sim elements that are "too fun" yet do not engender any desired learning processes.

Student Interview and Input

While following overarching pedagogical principles provides a reliable framework for creating useful tools, it is important to actively solicit student thoughts on each simulation. This provides empirical evidence about whether desired learning processes are occurring: in their study of sim design effects on student learning and attitude, PhET team researchers used multiple rounds of interview and redesign for 52 sims assessed by 89 non-science students, finding that:

These interviews always reveal interface weaknesses, resolve interface questions that were not agreed upon by the team, and often reveal pedagogically undesirable (and occasionally unexpected desirable) features and subtle programming bugs. (27) [emphasis mine]

Although the need to actually empirically investigate the effectiveness of a sim is obvious enough, the process of doing so requires much care and effort. First, it is critical that the sampled students accurately represent the target population, whether in considering students of different educational backgrounds and previous academic performance or in ensuring that a representative minimum of students belonging to marginalized groups are included. Moreover, students should have little to no prior exposure to the particular material. The method of interview needs to monitor both the learning outcomes and processes. Thus authors have commonly observed students as they interact with the sim for the first time: think-aloud transcriptions (27) and directed questioning (50) are commonly used to probe students' thoughts.

Beyond this, student input is particularly important in the context of universal design and ensuring equitable outcomes for learners. In their study of physics sim design, Adams et al. found that efficacy varied substantially with (27):

1 - Student familiarity with the material

2 - Student familiarity with the style of sim

3 - Students' own expectations of their understanding

4 - Students' perception of the sim as connected to their experience of the real world

Students became intimidated, self-conscious, or frustrated when simulations were too difficult to use. This emphasizes intuitiveness and flexibility in sim design, to accommodate learners with differing prior exposure to the material or to the interface conventions used. In their companion paper on UI design, Adams et al. identified click-and-drag, checkbox, slider, and grabbable interfaces as being most commonly understood, and the use of minimalism and intentional visual cues as preventing learners from feeling overwhelmed or lost (49). Moreover, consistency in design and representation between sims was critical to making them intuitive and to relating concepts between them.

Interestingly, students who had previously taken a course on a given topic were more likely, when not actively directed, to simply use sims as visual aids for their own explanations, then become self-conscious and self-critical and switch from mastery orientation to performance orientation when faced with difficulty remembering concepts. This underscores our main point: students need to be engaged in active exploration for sims to be effective, with their response to and use of sims guided by the instructor.

Reflections on my own implementations

It’s straightforward enough, if time-consuming, to consult the literature. But seeing as the lesson learned has been that experimentation is key to learning, it would seem we need to design and introduce some sims ourselves to appreciate what works and what doesn’t. I wrote several simulations for my teaching of GEOG 371 - Advanced Remote Sensing in Fall 2023. GEOG 371 is an excellent test course to implement physics sims in – while it deals intimately with satellite-based instruments and the nature of electromagnetic radiation, it is ultimately concerned with practical applications to climate monitoring. Students are typically enrolled in environmental majors such as geomatics, geography/environmental management, and resources/sustainability and have relatively nascent physics training. Despite this they are expected to form a nontrivial understanding of notoriously confusing topics such as polarization state, phase, and various atmospheric light-matter interactions such as Rayleigh scatter.

I sought to use sims to help teach three concepts which are typically not well understood by students in the course:

1 - How waves sum (necessary to understand constructive and destructive interference)

2 - Basic polarization states (necessary to understand RADAR imaging)

3 - How speckle arises (necessary to interpret RADAR images)
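The first of these, how waves sum, admits a compact numerical sketch. The following is a minimal illustration of my own (not the classroom sim itself) of how the phase offset between two unit-amplitude sinusoids controls the amplitude of their sum:

```python
import math

def peak_of_sum(phase_diff, samples=1000):
    """Sample the sum of two unit-amplitude sinusoids offset by phase_diff
    over one period and return the peak amplitude of the resulting wave."""
    ts = [2 * math.pi * i / samples for i in range(samples)]
    summed = [math.sin(t) + math.sin(t + phase_diff) for t in ts]
    return max(abs(s) for s in summed)

# In phase: fully constructive, the peak amplitude doubles to 2.
# Offset by pi: fully destructive, the waves cancel everywhere.
print(peak_of_sum(0.0))        # ~2.0
print(peak_of_sum(math.pi))    # ~0.0
```

Sweeping `phase_diff` between these extremes traces out the intermediate amplitudes students were asked to investigate.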

The goal here is to efficiently get students to acquire an elementary intuitive understanding of these topics, and to engage them in topics that students are often intimidated by. These sims differed in form and implementation while staying generally consistent with the principles outlined above: consistent representation, minimalism and reduced cognitive load, intuitive controls, fully general and bug-free. The simulations were used in-lecture using a guided approach: after outlining the basic theory in conventional lecture, I demonstrated the basic functionality of the sims to the class. Students were then given a short amount of time (1-2 minutes) to familiarize themselves with the sims individually, before pairing up with a partner to investigate a set of questions. After paired experimentation students were asked to voluntarily share their findings with the class; this process is analogous to and adapted from traditional think-pair-share activities (51).

Intuitively I understood the appeal of the following design choices:

1 - Minimalism and clarity of design to decrease cognitive load

2 - Consistency of representation to prevent confusion and make connections between sims

3 - Intuitive, fun10 controls over programmatically simpler options

The sum of waves sim took this form (linked to decrease load on this page). The polarization state sim took this form.

The speckle sim took the following form:

(interactive sim embedded here: input fields set the distance from target one and target two to the satellite, in metres)

It depicts a RADAR satellite 600km above the Earth, showing how the relative placement of reflectors in a scene can cause the returned waves to constructively or destructively interfere on return, making the pixel bright or dark. A detailed explanation can be found here but is not necessary for this discussion. The partner section below extends the concept to the statistical level, generating the output of many random configurations.
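The arithmetic underlying the sim can be sketched as follows. This is a simplified model of my own (the wavelength and the two-reflector restriction are illustrative assumptions, not the sim's exact parameters), in which the two-way path difference between two equal reflectors sets the phase offset between their returns:

```python
import cmath
import math

WAVELENGTH = 0.055  # metres; an assumed C-band RADAR wavelength

def pixel_intensity(d1, d2, wavelength=WAVELENGTH):
    """Relative brightness of a pixel containing two equal reflectors at
    one-way distances d1 and d2 (metres) from the satellite. The two-way
    path difference 2*(d2 - d1) sets the phase offset between the two
    returned waves; their coherent sum ranges from 0 (dark) to 4 (bright)."""
    phase = 2 * math.pi * (2 * (d2 - d1)) / wavelength
    return abs(1 + cmath.exp(1j * phase)) ** 2

# Equal distances: the returns add constructively, a bright pixel.
# A quarter-wavelength offset in one-way distance gives a half-wavelength
# two-way difference, so the returns cancel and the pixel is dark.
print(pixel_intensity(600e3, 600e3))                   # ~4.0
print(pixel_intensity(600e3, 600e3 + WAVELENGTH / 4))  # ~0.0
```

Drawing many random reflector placements and histogramming the resulting intensities is exactly the statistical extension the partner section performs.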

These simulations were generally well-received, though to what extent this was simply due to them being interesting diversions from lecture is unclear. Due to the nature of the course and limited resources, comprehensive student interview or response testing was not conducted beyond a majority of students indicating they ‘enjoyed’ the sims. Given the goal of producing certain outcomes or hypothesizing about the effects of certain changes, several student pairs shared ideas commensurate with the level of understanding needed to apply the concepts to the course material.

Ultimately however, as discussed above we would need to measurably tie the use of these sims to the achievement of learning goals, and incorporate student feedback cyclically, to be confident about their merits as teaching tools. Their creation itself has instead become my own learning experience, and perhaps a useful free resource for anyone desperate enough to find themselves on page 3+ of Google search.

Conclusion

Sims offer students a uniquely valuable opportunity to engage in active, personalized exploration of phenomena in an efficient and engaging way. They allow students to interrogate relationships directly, form and test hypotheses, and interact with representations too complex for chalk and board. Moreover they can offer students a less intimidating exposure to concepts which they have little foreknowledge of. However sims are only effective when designed and implemented with consideration of broader pedagogical principles. The following are the most salient in both my review of the literature and own experience:

1 - Sims should be designed to facilitate engaged exploration of phenomena

2 - Sims must be empirically evaluated by monitoring the experiences of learners and ensuring they align directly with learning goals

3 - Guidance is needed to ensure sim use is productive and to encourage reflection on the results of students' experiments

The difficulty of creating good sims (requiring programming, subject-matter, and pedagogical expertise) can make their development and use sparse, limited to large collections with dedicated staff and testing procedures, e.g. PhET. Worse, it can make their use superficial or misguided. Yet when intentionally developed as part of a larger educational program, their effect on engagement and learning is too substantial to dismiss.

References

1.
Miller MD. Simulations in medical education: A review. Medical Teacher. 1987;9(1):35–41.
2.
De Jong T, Van Joolingen WR. Scientific discovery learning with computer simulations of conceptual domains. Review of educational research. 1998;68(2):179–201.
3.
3. Rutten N, Van Joolingen WR, Van Der Veen JT. The learning effects of computer simulations in science education. Computers & Education. 2012;58(1):136–53.
4. Rivers RH, Vockell E. Computer simulations to stimulate scientific problem solving. Journal of Research in Science Teaching. 1987;24(5):403–15.
5. Breuer K, Kummer R. Cognitive effects from process learning with computer-based simulations. Computers in Human Behavior. 1990;6(1):69–81.
6. Eylon BS, Ronen M, Ganiel U. Computer simulations as tools for teaching and learning: Using a simulation environment in optics. Journal of Science Education and Technology. 1996;5:93–110.
7. De Jong T. Learning and instruction with computer simulations. Education and Computing. 1991;6(3-4):217–29.
8. Geban Ö, Askar P, Özkan Ï. Effects of computer simulations and problem-solving approaches on high school students. The Journal of Educational Research. 1992;86(1):5–10.
9. Steinberg RN. Computers in teaching science: To simulate or not to simulate? American Journal of Physics. 2000;68(S1):S37–41.
10. Kulik JA, Kulik CLC, Bangert-Drowns RL. Effectiveness of computer-based education in elementary schools. Computers in Human Behavior. 1985;1(1):59–74.
11. Bangert-Drowns RL. Effectiveness of computer-based education in secondary schools. Journal of Computer-Based Instruction. 1985;12(3):59–68.
12. Kulik CLC, Kulik JA. Effectiveness of computer-based education in colleges. AEDS Journal. 1986;19(2-3):81–108.
13. Smetana LK, Bell RL. Computer simulations to support science instruction and learning: A critical review of the literature. International Journal of Science Education. 2012;34(9):1337–70.
14. Falode OC, Mohammed IA. Improving students’ geography achievement using computer simulation and animation packages in flipped classroom settings. Journal of Digital Educational Technology. 2023;3(2):ep2303.
15. Wu HT, Mortezaei K, Alvelais T, Henbest G, Murphy C, Yezierski EJ, et al. Incorporating concept development activities into a flipped classroom structure: Using PhET simulations to put a twist on the flip. Chemistry Education Research and Practice. 2021;22(4):842–54.
16. Martens A, Diener H, Malo S. Game-based learning with computers: Learning, simulations, and games. Transactions on Edutainment I. 2008;172–90.
17. Whitton N. Motivation and computer game based learning. Proceedings of the Australian Society for Computers in Learning in Tertiary Education, Singapore. 2007;1063:1067.
18. Pellas N, Mystakidis S. A systematic review of research about game-based learning in virtual worlds. J Univers Comput Sci. 2020;26(8):1017–42.
19. Moore EB, Chamberlain JM, Parson R, Perkins KK. PhET interactive simulations: Transformative tools for teaching chemistry. Journal of Chemical Education. 2014;91(8):1191–7.
20. Akpan JP. Issues associated with inserting computer simulations into biology instruction: A review of the literature. The Electronic Journal for Research in Science & Mathematics Education. 2001.
21. Gibson B, Bell T. Evaluation of games for teaching computer science. In: Proceedings of the 8th Workshop in Primary and Secondary Computing Education. 2013. p. 51–60.
22. Hussain A, Azeem M, Shakoor A. Physics teaching methods: Scientific inquiry vs traditional lecture. International Journal of Humanities and Social Science. 2011;1(19):269–76.
23. Laws PW. Calculus-based physics without lectures. Physics Today. 1991;44(12):24–31.
24. Thornton RK. Using large-scale classroom research to study student conceptual learning in mechanics and to develop new approaches to learning. In: Microcomputer-Based Labs: Educational Research and Standards. Springer; 1996. p. 89–114.
25. Bruner JS. The act of discovery. Harvard Educational Review. 1961.
26. Banchi H, Bell R. The many levels of inquiry. Science and Children. 2008;46(2):26.
27. Adams WK, Reid S, LeMaster R, McKagan SB, Perkins KK, Dubson M, et al. A study of educational simulations part I: Engagement and learning. Journal of Interactive Learning Research. 2008;19(3):397–419.
28. Coll RK, France B, Taylor I. The role of models and analogies in science education: Implications from research. International Journal of Science Education. 2005;27(2):183–98.
29. Glynn SM, Duit R, Thiele RB. Teaching science with analogies: A strategy for constructing knowledge. In: Learning Science in the Schools. Routledge; 2012. p. 247–73.
30. Podolefsky NS, Finkelstein ND. Analogical scaffolding and the learning of abstract ideas in physics: An example from electromagnetic waves. Physical Review Special Topics - Physics Education Research. 2007;3(1):010109.
31. Didiş N. The analysis of analogy use in the teaching of introductory quantum theory. Chemistry Education Research and Practice. 2015;16(2):355–76.
32. Jonāne L. Using analogies in teaching physics: A study on Latvian teachers’ views and experience. Journal of Teacher Education for Sustainability. 2015;17(2):53–73.
33. Kirschner PA, Sweller J, Clark RE. Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist. 2006;41(2):75–86.
34. Ural E. The effect of guided-inquiry laboratory experiments on science education students’ chemistry laboratory attitudes, anxiety and achievement. Journal of Education and Training Studies. 2016;4(4):217–27.
35. Suits JP. Assessing investigative skill development in inquiry-based and traditional college science laboratory courses. School Science and Mathematics. 2004;104(6):248–57.
36. Myers MJ, Burgess AB. Inquiry-based laboratory course improves students’ ability to design experiments and interpret data. Advances in Physiology Education. 2003;27(1):26–33.
37. Salomon G. Technology’s promises and dangers in a psychological and educational context. Theory Into Practice. 1998;37(1):4–10.
38. Dillenbourg P. Integrating technologies into educational ecosystems. Distance Education. 2008;29(2):127–40.
39. Schittek M, Mattheos N, Lyon H, Attström R. Computer assisted learning: A review. European Journal of Dental Education. 2001;5(3):93–100.
40. Wieman C. Why not try a scientific approach to science education? Change: The Magazine of Higher Learning. 2007;39(5):9–15.
41. Rieber LP, Noah D. Games, simulations, and visual metaphors in education: Antagonism between enjoyment and learning. Educational Media International. 2008;45(2):77–92.
42. Perkins K, Adams W, Dubson M, Finkelstein N, Reid S, Wieman C, et al. PhET: Interactive simulations for teaching and learning physics. The Physics Teacher. 2006;44(1):18–23.
43. Batuyong CT, Antonio VV. Exploring the effect of PhET interactive simulation-based activities on students’ performance and learning experiences in electromagnetism. Asia Pacific Journal of Multidisciplinary Research. 2018;6(2):121–31.
44. Moore EB, Herzog TA, Perkins KK. Interactive simulations as implicit support for guided-inquiry. Chemistry Education Research and Practice. 2013;14(3):257–68.
45. Ogegbo AA, Ramnarain U. Teaching and learning physics using interactive simulation: A guided inquiry practice. South African Journal of Education. 2022;42(1).
46. Tobias S, Fletcher JD, Wind AP. Game-based learning. In: Handbook of Research on Educational Communications and Technology. 2014. p. 485–503.
47. Zion M, Mendelovici R. Moving from structured to open inquiry: Challenges and limits. Science Education International. 2012;23(4):383–99.
48. Buck LB, Bretz SL, Towns MH. Characterizing the level of inquiry in the undergraduate laboratory. Journal of College Science Teaching. 2008;38(1):52–8.
49. Adams WK, Reid S, LeMaster R, McKagan S, Perkins K, Dubson M, et al. A study of educational simulations part II: Interface design. Journal of Interactive Learning Research. 2008;19(4):551–77.
50. Cook DA, Hamstra SJ, Brydges R, Zendejas B, Szostek JH, Wang AT, et al. Comparative effectiveness of instructional design features in simulation-based education: Systematic review and meta-analysis. Medical Teacher. 2013;35(1):e867–98.
51. Kaddoura M. Think pair share: A teaching learning strategy to enhance students’ critical thinking. Educational Research Quarterly. 2013;36(4):3–24.

Footnotes

  1. See, e.g., (19)↩︎

  2. See, e.g., (20)↩︎

  3. See, e.g., (21)↩︎

  4. Likely with less neat handiwork!↩︎

  5. While this may seem like a somewhat trivial improvement, anyone who has dealt with interpreting more advanced wave properties, like polarization state, may see the value added here.↩︎

  6. Albeit extremely cool and interesting methods, studied by extremely cool and interesting, yet modest and well-adjusted, people who are fun at parties↩︎

  7. See table 38.2 in Handbook of Research on Educational Communications and Technology (46)↩︎

  8. Consider, for example, the principle of error tolerance in universal design.↩︎

  9. ʸᶦᵖᵖᵉᵉ !↩︎

  10. and haaaaard to program…↩︎