
A WEB-BASED FORMATIVE FEEDBACK SYSTEM DEVELOPMENT
BY UTILIZING ISOMORPHIC MULTIPLE CHOICE ITEMS
TO SUPPORT PHYSICS TEACHING AND LEARNING

Sentot Kusairi

Universitas Negeri Malang (Indonesia)

Received July 2019

Accepted November 2019

Abstract

Formative feedback plays an important role in assisting students in their learning process. However, providing information about student weaknesses and strengths is one of the challenges faced by teachers when implementing formative assessment. This study aims to develop a web-based formative feedback system that is able to provide specific feedback. This is development research with steps including needs analysis, model design, model development, and a limited model trial. The research succeeded in developing a web-based formative feedback system that utilizes isomorphic multiple choice items, consisting of the Tryout and Webvoting applications. The results of the initial trial, involving 22 high school physics teachers and 44 prospective physics teachers, showed that the model can be used by both students and teachers. The Tryout application allows teachers to obtain feedback on individual students and on groups of students. Based on this feedback and information, the teacher can also discuss students' learning difficulties using the Webvoting application. More extensive trials are needed to determine the effectiveness of this system.

 

Keywords – Web-based formative feedback system, Formative assessment, Isomorphic multiple choice item, Physics.

To cite this article:

Kusairi, S. (2020). A web-based formative feedback system development by utilizing isomorphic multiple choice items to support physics teaching and learning. Journal of Technology and Science Education, 10(1), 117-126. https://doi.org/10.3926/jotse.781

 


1. Introduction

Feedback is a key element of learning assessment. It consists of various kinds of information given to students so that they can improve their learning outcomes (Burns & Foo, 2013). Feedback received on time supports students in becoming more involved in learning, knowing whether they have achieved their targets, and managing their own learning (Barana & Marchisio, 2016a; Asadi, Azizinezhad & Ehsani-Fard, 2017). Effective feedback provides opportunities for students to close the gap between their current and expected abilities and gives the teacher information to sharpen the learning process (Nicol & Macfarlane-Dick, 2006; Gedye, 2010). Formative feedback should also be personal, motivate learning, and relate to the assessment criteria (Hatziapostolou & Paraskakis, 2010). Nevertheless, providing timely formative feedback is still a problem in learning. Some of the causes are the limited time teachers have, the large number of students, and the variety of problems the students experience (Kusairi, 2012).

One effort researchers have made to overcome the problem of providing timely feedback is developing computer-assisted assessments (Faber, Luyten & Visscher, 2017). Because computers can process and store data, their use can increase flexibility in the timing of assessments, reduce the time needed to mark assessment results, and reduce the cost of conducting assessments (Baleni, 2015). Computer-assisted assessment can also provide faster feedback (Denton, Madden, Roberts & Rowe, 2008). Computer-network-based assessments can provide feedback to students in real time and can also provide feedback to the teacher quickly. Some studies also show that computer-based assessment can encourage students to learn more effectively (Barana & Marchisio, 2016b). The use of clickers with appropriate items helps students conduct self-assessment and reflection (Ludvigsen, Krumsvik & Furnes, 2015).

However, delivering specific feedback to students is still a challenge for computer-assisted assessment. The use of multiple choice items in computer-assisted assessments has so far only given students a score as soon as they finish the test (Attali & van der Kleij, 2017). Multiple choice items also have the weakness that answers can easily be guessed. Feedback in the form of a test score does not give students the right information to close the gap between their current and expected abilities. The use of feedback in the teacher's interaction with students during the learning process also needs attention (Havnes, Smith, Dysthe & Ludvigsen, 2012).

A new score report based on a formative assessment mechanism has been developed to provide feedback not only on students' final scores but also on sub-scale scores, percentile positions, and appropriate self-regulation strategies. The results of that study show that the new score report supports students' independent learning more effectively than conventional reports (Zou & Zhang, 2013). Computer-assisted formative assessment can also be used to support learning that involves discrimination tasks and the development of appropriate cognitive strategies (Bhagat & Spector, 2017). Research further shows a particular pattern of student engagement with feedback that reflects productive study strategies and significantly predicts higher performance (Chen, Breslow & DeBoer, 2018). Teachers also need such information, which has implications for the development of new score reports (Hopster-den Otter, Wools, Eggen & Veldkamp, 2017). However, timely and specific computer-assisted feedback that helps students learn and helps teachers make learning decisions has not been widely reported (Floratos, Guasch & Espasa, 2015).

The use of isomorphic multiple choice items is an alternative solution. With isomorphic items, feedback to students can be more effective in helping them recognize their weaknesses and strengths and learn better. Isomorphic items have been used in several studies (Attali & van der Kleij, 2017) and have also been reported to be effective in the development of interactive multimedia (Kusairi, Alfad & Zulaikah, 2017). Other findings suggest that multiple-choice items tend to function as easy questions when scoring is based only on the number of correct answers (Kastner & Stangla, 2011).

This study aims to develop a model of web-based formative feedback that utilizes isomorphic multiple choice items in physics learning and to conduct a preliminary trial of the model. In the development process, several questions are answered: a) What is the design of the web-based formative feedback system to be developed? b) What web-based formative feedback model was developed? c) Can the web-based formative feedback model be used by teachers and students?

2. Method

This research is development research aimed at producing a model of a web-based formative feedback system that utilizes isomorphic multiple choice items. The development steps, carried out in the Physics Department, Faculty of Mathematics and Science, Universitas Negeri Malang, included needs analysis, model design, model development, and a limited model trial. The needs analysis stage was carried out by interviewing several lecturers, especially those teaching basic physics courses, as well as undergraduate students. Based on the data from the needs analysis stage, a formative feedback model design was developed that utilizes isomorphic multiple choice items.

The resulting system design was then realized in the form of a web-based application. Before being tested, the model was evaluated by senior lecturers who are experts in the field of physics learning. Limited trials were conducted with a team of lecturers, 22 high school physics teachers, and 44 prospective physics teachers. After receiving training on using the Tryout and Webvoting applications, the secondary physics teachers were asked to fill out a questionnaire with closed and open-ended items.

3. Results

The web-based formative feedback system using isomorphic items is an application that can help teachers identify students' mastery of concepts and follow up in class discussions. The system consists of the Tryout application and the Webvoting application and can be accessed from a personal computer or smartphone connected to the internet. The Tryout application is used to design and deliver tests and to provide feedback to students and teachers, while Webvoting is an application for identifying students' responses to items with the help of their smartphones. The Tryout application can be carried out outside of face-to-face learning hours, whereas Webvoting can be used to follow up on students' weaknesses through discussions in face-to-face class meetings. The mechanism for using the Tryout and Webvoting applications is described in Figure 1.

Some of the characteristics of this system are as follows. 1) The system implements isomorphic items: each indicator of competency achievement is covered by three multiple-choice items with five options each. 2) The system allows teachers or lecturers who already have an account to fill in the indicator column and the items according to their learning needs. 3) Once a test has been released by the teacher or lecturer, students who already have an account can access the test wherever and whenever they have an internet connection. 4) When the time for answering runs out or the student has submitted their answers, they can see the results directly through the feedback provided by the system. 5) The system can only be accessed by registered users, so system security is guaranteed. The teacher or lecturer can also see students' performance on the items, both individually and as a group.
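To make the item structure concrete, the following is a minimal sketch, in Python, of how a test built from isomorphic items could be represented. The class names, fields, and the example physics items are illustrative assumptions, not the actual data model of the Tryout application.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Item:
    """One multiple-choice item with five options (A-E) and a single key."""
    stem: str
    options: List[str]   # exactly five answer choices
    key: str              # label of the correct option, e.g. "C"

@dataclass
class Indicator:
    """A learning indicator covered by three isomorphic items."""
    description: str
    items: List[Item] = field(default_factory=list)  # expected length: 3

@dataclass
class Test:
    title: str
    duration_minutes: int
    indicators: List[Indicator] = field(default_factory=list)

# Hypothetical example: one indicator with three isomorphic items on Newton's second law
indicator = Indicator(
    description="Apply Newton's second law to a single object",
    items=[
        Item("A 2 kg block is pushed with a net force of 6 N. Its acceleration is ...",
             ["1 m/s^2", "2 m/s^2", "3 m/s^2", "6 m/s^2", "12 m/s^2"], "C"),
        Item("A 4 kg cart experiences a net force of 8 N. Its acceleration is ...",
             ["0.5 m/s^2", "1 m/s^2", "2 m/s^2", "4 m/s^2", "8 m/s^2"], "C"),
        Item("A 5 kg box accelerates at 2 m/s^2. The net force on it is ...",
             ["2.5 N", "5 N", "7 N", "10 N", "25 N"], "D"),
    ],
)
test = Test(title="Dynamics tryout", duration_minutes=30, indicators=[indicator])
```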

The items used in this web-based formative feedback system are isomorphic items, namely three items that look different on the surface but are developed from the same learning indicator. Each item is an ordinary multiple choice item with five options. Figure 2 shows an example of the test interface.

Based on the students' responses, the system conducts an analysis based on the number of correct answers each student gives on a given indicator. The logic for producing formative feedback is shown in Figure 3.

 

Figure 1. The Web-based Formative Feedback System

 

Figure 2. Students' Test Interface in the Tryout Application

 

Figure 3. Flow Chart Mechanism of Producing Feedback

Based on the students' responses, the program identifies each user's answers on the items related to a given indicator. A student is classified as "understand", "moderately understand", or "not understand". A student is classified as "understand" if all three isomorphic multiple choice items are answered correctly, as "moderately understand" if exactly one item is answered incorrectly, and as "not understand" if only one item is answered correctly or all items are answered incorrectly. Because the questions are assigned randomly, it is hoped that this feedback model can anticipate students who merely guess their answers: a student who guesses will most likely receive "not understand" feedback.
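As a concrete illustration of this classification rule, a minimal Python sketch might look like the following. The function names and the (indicator, item) dictionary format are assumptions made for illustration; they are not the actual source code of the Tryout application.

```python
def classify_indicator(num_correct: int, num_items: int = 3) -> str:
    """Classify mastery of one indicator from the number of correct answers
    among its isomorphic items, following the rule described above."""
    if num_correct == num_items:          # all three items correct
        return "understand"
    if num_correct == num_items - 1:      # exactly one wrong answer
        return "moderately understand"
    return "not understand"               # one or zero correct answers

def feedback_per_indicator(responses: dict, keys: dict) -> dict:
    """responses and keys map (indicator, item_index) -> chosen/correct option."""
    correct_counts = {}
    for (indicator, _item_idx) in keys:
        correct_counts.setdefault(indicator, 0)
    for (indicator, item_idx), chosen in responses.items():
        if chosen == keys[(indicator, item_idx)]:
            correct_counts[indicator] += 1
    return {ind: classify_indicator(n) for ind, n in correct_counts.items()}

# Example: a student answers the three items of indicator 1 with C, C, A (keys: C, C, D)
keys = {(1, 0): "C", (1, 1): "C", (1, 2): "D"}
responses = {(1, 0): "C", (1, 1): "C", (1, 2): "A"}
print(feedback_per_indicator(responses, keys))  # {1: 'moderately understand'}
```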

A limited trial of the web-based formative feedback model was carried out with 22 teacher-training students, hundreds of physics students, and physics teachers. In general, the students had no difficulty using the application once they had registered and obtained an account. They could even access the questions presented in the application from a smartphone. In a test of 15 items covering 5 different indicators, the students were given 30 minutes, but most of them completed the questions and submitted their answers in less than 30 minutes. Immediately after submitting their answers, they received individual feedback as shown in Figure 4. This formative feedback can serve as material for reflecting on their learning outcomes. Most of the students stated that the model was very useful for their physics learning because it provides more specific information, not just a score as usually given by conventional multiple choice tests.

As soon as all students complete the test, the teacher can also see the accumulated results through reports. Examples of the reports received by the lecturer or teacher are shown in Figure 5, (a) from the group of prospective teachers and (b) from the group of teachers. The information received by the teacher is the result of an analysis of the performance of all students: every student who answers all the items of a particular indicator correctly contributes to the percentage of mastery for that indicator. Figure 5 shows that for learning indicator 5, 80% of the students reached mastery, while for learning indicator 2, 60% did. For the other indicators, most students did not reach mastery, which shows that the students' mastery of the tested topic is still low. It also appears that, for different samples, the system provides different information depending on the students' mastery of the concepts.
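As an illustration of how such a group report could be computed, the sketch below assumes, as described above, that a student masters an indicator only when every isomorphic item of that indicator is answered correctly. The function and variable names are hypothetical, not taken from the application itself.

```python
from collections import defaultdict
from typing import Dict, Tuple

def class_mastery(all_responses: Dict[str, Dict[Tuple[int, int], str]],
                  keys: Dict[Tuple[int, int], str]) -> Dict[int, float]:
    """Return, per indicator, the percentage of students who answered
    every isomorphic item of that indicator correctly."""
    items_per_indicator = defaultdict(list)
    for (indicator, item_idx) in keys:
        items_per_indicator[indicator].append(item_idx)

    mastered_count = defaultdict(int)
    for student, responses in all_responses.items():
        for indicator, item_idxs in items_per_indicator.items():
            if all(responses.get((indicator, i)) == keys[(indicator, i)]
                   for i in item_idxs):
                mastered_count[indicator] += 1

    n_students = len(all_responses)
    return {ind: 100.0 * mastered_count[ind] / n_students
            for ind in items_per_indicator}
```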

 

Figure 4. Example of Individual Student Feedback

In addition to individual and class reports, the teacher can also see all students' answers on each learning indicator. These data are still raw, but they can be copied into and processed by other applications such as spreadsheets. The teachers can then process the available data further according to their needs.
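The export format described here is not specified in detail; as one assumed possibility, raw responses could be written as a CSV file that opens directly in a spreadsheet program, as in the sketch below.

```python
import csv

def export_raw_responses(all_responses, keys, path="raw_responses.csv"):
    """Write one row per student per item: chosen option, key, and correctness."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["student", "indicator", "item", "chosen", "key", "correct"])
        for student, responses in all_responses.items():
            for (indicator, item_idx), key in sorted(keys.items()):
                chosen = responses.get((indicator, item_idx), "")
                writer.writerow([student, indicator, item_idx + 1,
                                 chosen, key, int(chosen == key)])
```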

 

Figure 5. Example of Group Feedback: (a) prospective teachers, (b) teachers

Based on the data on learning difficulties obtained from the Tryout application, the teacher can follow up on the problems by using the Webvoting application. The teacher can choose a multiple choice item to be discussed interactively in class. Questions can be displayed on the projector screen and can also be seen on the students' smartphones. The number of students choosing each option, and the number of correct answers, can then be displayed by the teacher on the class projector. Figure 6 shows a graph generated by the Webvoting application in which the students' answers are spread across almost all of the options. Based on this information, the teacher can decide on the next step in the teaching and learning process, for example asking students to discuss how to resolve their difficulties.
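The core of such a display is a simple tally of votes per option. The following is a minimal, illustrative sketch with assumed names; the Webvoting application may compute this differently.

```python
from collections import Counter

def tally_votes(votes, options=("A", "B", "C", "D", "E")):
    """Count how many students chose each option for one displayed item."""
    counts = Counter(votes)
    return {opt: counts.get(opt, 0) for opt in options}

# Example: responses gathered from students' smartphones for one item
votes = ["A", "C", "C", "B", "E", "C", "D", "A", "C", "B"]
print(tally_votes(votes))   # {'A': 2, 'B': 2, 'C': 4, 'D': 1, 'E': 1}
```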

More than 30 high school physics teachers received training on how to use the Tryout and Webvoting applications. Afterwards, they were asked to fill out a questionnaire, and twenty-two of them returned it. The secondary physics teachers' perceptions of the Tryout and Webvoting applications are shown in Table 1.

 

Figure 6. Example of the Graph of Student Response on Webvoting

No | Description | Strongly agree (%) | Agree (%) | Disagree (%) | Strongly disagree (%)
1 | Information about the students' conceptual understanding is needed by the teacher in the learning process. | 77.27 | 22.73 | 0.00 | 0.00
2 | Information about group conceptual understanding is needed by the teacher to make decisions in learning. | 77.27 | 22.73 | 0.00 | 0.00
3 | The Tryout application can help teachers identify students' conceptual understanding. | 59.09 | 36.36 | 0.00 | 0.00
4 | The Tryout application can help the teacher provide feedback about students' conceptual understanding. | 68.18 | 27.27 | 0.00 | 0.00
5 | Isomorphic questions in the Tryout application can be developed by the teacher. | 31.82 | 63.64 | 0.00 | 0.00
6 | The Webvoting application can assist teachers in identifying students' conceptual understanding. | 45.45 | 50.00 | 0.00 | 0.00
7 | The Webvoting application can help teachers give feedback on students' conceptual understanding. | 63.64 | 36.36 | 0.00 | 0.00
8 | The Tryout application is useful for supporting students' learning. | 68.18 | 31.82 | 0.00 | 0.00
9 | The Webvoting application is useful for supporting the teaching and learning process. | 68.18 | 31.82 | 0.00 | 0.00

Table 1. Secondary physics teachers' perceptions of the Tryout and Webvoting applications

Hundreds of prospective physics teachers were also introduced to the Webvoting application. Afterwards, they were asked open-ended questions about their perceptions of the Webvoting application. Some of their responses are as follows.

“Webvoting application is very helpful for students and teachers in learning, especially in assessment”

“The teacher knows the number of students who have answered and the teacher can see how many students have answered correctly and answered incorrectly.”

“Good, can identify student difficulties”

“Very helpful in the process of learning and understanding concepts”

“Very good for use in learning systems. because with this webvoting will not drop students mentality if students answer the question incorrectly.”

They were also asked what obstacles they foresaw when implementing the Webvoting application:

“If the internet network is bad, it can hamper the process of implementing web voting”

“Requires a stable internet connection, students can cheat each other”

4. Discussion

In this development research, a model of web-based formative feedback has been developed that can generate specific feedback as soon as the user completes the test (timely feedback). The feedback is also quite effective because it provides not only the score usually given by conventional multiple-choice tests but also feedback per learning indicator and on the extent to which the user has mastered each indicator. The model can also provide teacher or lecturer users with reports on the accumulated results for all students (a group profile). In general, therefore, the model is able to provide information and feedback to teachers and students, so that they can reflect on learning outcomes and modify learning according to students' needs.

The model developed, namely the Tryout and Webvoting applications, is similar to other computer-assisted assessments in that it can provide timely feedback. The advantages of the model compared to other models are as follows. 1) The judgment of a student's ability is based not on one multiple choice item but on three, which reduces the influence of guessing. 2) The model can give the teacher feedback on the accumulated abilities of the students in one class. 3) The model also allows the teacher to see students' performance on each indicator. 4) The tests and feedback can be accessed from anywhere and at any time.

The model also has several disadvantages. 1) At least three multiple choice items are needed for each indicator, which lengthens the process of writing and processing the items. 2) The model is not yet equipped with remediation for students. 3) The performance of all students on each individual item is not yet processed automatically; in this respect, the model needs to be developed further to make it easier for teachers and lecturers to analyze each test item. 4) The items used in this feedback model are only multiple choice questions, so students merely choose an answer and the level of authenticity is low. 5) Because the questions are worked on over the internet, students' answers may not describe their true abilities.

In general, the model has been shown to provide feedback and to offer benefits for both students and teachers. Therefore, the model can be implemented in physics learning in high school, in college, and for other needs. How the feedback model influences students' learning and the learning process conducted by the teacher needs to be examined further.

5. Conclusion

A web-based formative feedback system consisting of the Tryout and Webvoting applications has been developed. The system has been evaluated by experts in the field of physics learning and tested by lecturers and students. The web-based formative feedback system that utilizes isomorphic multiple-choice items has been shown to provide more specific feedback for both students and teachers: not just a score, as usually given by conventional multiple-choice tests, but also students' mastery of each specific learning indicator.

The system can also be used or adapted in other subjects by developing isomorphic multiple choice items for those subjects. Further studies on the impact of the model on students' learning of physics concepts or other subjects are needed, and more extensive trials are needed to determine the effectiveness of the system.

Acknowledgment

The writer would like to express his sincerest appreciation to the various parties who contributed to this study. The greatest honor is dedicated to DP2M, which funded the research through a competitive grant, to the research team, to the program development team from Illiyin Studio, to MGMP Malang, and to the PPG students who were involved in implementing the model. Hopefully the development of this model can contribute to improving the quality of teaching, especially physics teaching.

Declaration of Conflicting Interests

The author declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

The author received no financial support for the research, authorship, and/or publication of this article.

References

Asadi, M., Azizinezhad, M., & Ehsani-Fard, E. (2017). Formative Assessment and Feedback as Predictors of Students’ Engagement. Research in Applied Linguistics, 8(0), 291-298. https://doi.org/10.22055/rals.2017.12933

Attali, Y., & van der Kleij, F. (2017). Effects of feedback elaboration and feedback timing during computer-based practice in mathematics problem solving. Computers & Education, 110(Supplement C), 154-169. https://doi.org/10.1016/j.compedu.2017.03.012

Baleni, Z.G. (2015). Online formative assessment in higher education: Its pros and cons. The Electronic Journal of E-Learning, 13(4), 228-236.

Barana, A., & Marchisio, M. (2016a). Ten Good Reasons to Adopt an Automated Formative Assessment Model for Learning and Teaching Mathematics and Scientific Disciplines. Procedia - Social and Behavioral Sciences, 228, 608-613. https://doi.org/10.1016/j.sbspro.2016.07.093

Barana, A., & Marchisio, M. (2016b). Ten Good Reasons to Adopt an Automated Formative Assessment Model for Learning and Teaching Mathematics and Scientific Disciplines. Procedia - Social and Behavioral Sciences, 228, 608-613. https://doi.org/10.1016/j.sbspro.2016.07.093

Bhagat, K.K., & Spector, J.M. (2017). Formative assessment in complex problem-solving domains: The emerging role of assessment technologies. Journal of Educational Technology & Society, 20(4), 312-317.

Burns, C., & Foo, M. (2013). How is feedback used? – The international student response to a Formative Feedback Intervention. The International Journal of Management Education, 11(3), 174-183. https://doi.org/10.1016/j.ijme.2013.06.001

Chen, X., Breslow, L., & DeBoer, J. (2018). Analyzing productive learning behaviors for students using immediate corrective feedback in a blended learning environment. Computers & Education, 117(Supplement C), 59-74. https://doi.org/10.1016/j.compedu.2017.09.013

Denton, P., Madden, J., Roberts, M., & Rowe, P. (2008). Formative feedback: A comparative case study. British Journal of Educational Technology, 39(3), 486-500. https://doi.org/10.1111/j.1467-8535.2007.00745.x

Faber, J.M., Luyten, H., & Visscher, A.J. (2017). The effects of a digital formative assessment tool on mathematics achievement and student motivation: Results of a randomized experiment. Computers & Education, 106, 83-96. https://doi.org/10.1016/j.compedu.2016.12.001

Floratos, N., Guasch, T., & Espasa, A. (2015). Recommendations on Formative Assessment and Feedback Practices for stronger engagement in MOOCs. Open Praxis, 7(2), 141-152. https://doi.org/10.5944/openpraxis.7.2.194

Gedye, S. (2010). Formative assessment and feedback: a review. Planet, 23(1), 40-45. https://doi.org/10.11120/plan.2010.00230040

Hatziapostolou, T., & Paraskakis, I. (2010). Enhancing the Impact of Formative Feedback on Student Learning Through an Online Feedback System. Electronic Journal of E-Learning, 8(2), 111-122.

Havnes, A., Smith, K., Dysthe, O., & Ludvigsen, K. (2012). Formative assessment and feedback: Making learning visible. Studies in Educational Evaluation, 38(1), 21-27. https://doi.org/10.1016/j.stueduc.2012.04.001

Hopster-den Otter, D., Wools, S., Eggen, T.J.H.M., & Veldkamp, B.P. (2017). Formative use of test results: A user’s perspective. Studies in Educational Evaluation, 52, 12-23. https://doi.org/10.1016/j.stueduc.2016.11.002

Kastner, M., & Stangla, B. (2011). Multiple Choice and Constructed Response Tests: Do Test Format and Scoring Matter? Procedia - Social and Behavioral Sciences, 12(Supplement C), 263-273. https://doi.org/10.1016/j.sbspro.2011.02.035

Kusairi, S. (2012). Analisis asesmen formatif fisika sma berbantuan komputer. Penelitian Dan Evaluasi Pendidikan, Dies Natal, 3, 68-87.

Kusairi, S., Alfad, H., & Zulaikah, S. (2017). Development of Web-Based Intelligent Tutoring (iTutor) to Help Students Learn Fluid Statics. Journal of Turkish Science Education (TUSED), 14(2).

Ludvigsen, K., Krumsvik, R., & Furnes, B. (2015). Creating formative feedback spaces in large lectures. Computers & Education, 88, 48-63. https://doi.org/10.1016/j.compedu.2015.04.002

Nicol, D.J., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218.

Zou, X., & Zhang, X. (2013). Effect of different score reports of Web-based formative test on students' self-regulated learning. Computers & Education, 66(Supplement C), 54-63. https://doi.org/10.1016/j.compedu.2013.02.016



