NeltaChoutari June 2014 Issue
Towards a Student-Friendly Testing System
Human life is full of tests. We test our blood pressure or temperature to make sure our body is functioning well. If the reading is at a normal level, we assume our lifestyle is fine, whereas if any indicator is above or below the acceptable range, we seek treatment. Testing thus helps us find out about a situation and act accordingly.
We teach our students and assess how much learning has been achieved, and for this we take the help of tests. Tests help us assess the achievement of our students and, ultimately, our own achievement as teachers. They also help us diagnose what worked well and what did not, so that we can take remedial measures. A test result serves basically two functions: forward looking and backward looking. In our context, however, tests in most cases serve only the forward-looking function. In other words, we interpret students' results, decide whether to promote or fail them, and the role of the test is over. Isn't it necessary to explore the underlying causes of failure? Isn't there a chance that the fault lies in the curriculum itself, or in the teaching methodology? Yet nobody seems to bother. We have rarely heard of the School Leaving Certificate (SLC) board or the universities doing what is needed to explore the causes of poor results and improve the tests. The authorities seem like a machine that stamps students either PASS or FAIL. Where is the interaction? Where is the research? Where is the change? When we do not look back after the results, we never get to correct the faults in the system, and we keep victimizing students.
Apart from this, there are major problems in our test administration system, beginning with the quality of the tests themselves. Whether it is the SLC or a university examination, the condition is rather pathetic. The English language test papers themselves contain deviant language forms, never mind other aspects of quality! A test paper is supposed to go through a number of phases before it reaches examinees: after a test is constructed on the basis of a specification grid, it is piloted, then improved and finalized on the basis of feedback from the pilot. Our test papers, however, are so powerful that whatever faults they contain manage to escape the grip of our skilful experts! If we compare our so-called standardized tests with teacher-made tests, in many respects the latter stand out. Yet such a poorly constructed paper-pencil test of a few hours makes a decision about a student's life, a decision that determines the student's career and future. In this regard, a test is a very serious and critical matter; in our context, however, it is taken very lightly. A test of a few hours is set randomly and marked impressionistically to pass judgement on a student's life. Is a student's ability only what he or she can fill in on paper in a few hours? If not, the test has no right to decide anybody's life. A test is supposed to assess students' abilities, knowledge, understanding and attitudes. Ours, however, tests something absurd: students' memory (if not memory, then cheating skill), calligraphy and writing speed. It is not meant for assessing students' originality, innovation and creativity; in fact, it is absurd to talk about testing such abilities through a few hours of memory-driven calligraphy marathon competition! It is this unscientific testing that contributes to test anxiety in students. They dislike tests because tests do them no justice.
Their actual abilities are never tested, yet in the outer world their quality is measured with reference to the certificates such tests produce. In fact, this paper-pencil test is destroying the lives of many, leaving them mentally crippled. How, then, can we expect them to love tests?
Testing has gained a bad name because of this faulty process. However, the situation is not as hopeless as it seems. We can do something to make tests worth practising, and the subject teacher or expert can be a saviour here. Language teachers can devise a student-friendly test, a modification of formative assessment. Our students now need to be assessed on the basis of their classroom performance, which Garcia and Pearson (1994) have termed performance assessment. Throughout the session or semester, students do a lot of things in the classroom (at least in schools): they read and write plenty, get involved in classroom discussions, express their views, and work in pairs or groups to solve problems. All this requires them to perform actual language skills. Besides that, they take part in various co-curricular activities in and outside the institution. Moreover, teachers should assign them project work, mini-researches and other tasks that involve language abilities, knowledge, understanding and attitudes. Meanwhile, ongoing internal tests, i.e. formative assessment, will continue alongside. Whatever activities students perform in or outside the classroom, teachers need to record them in an individual portfolio. Based on such records, students' actual assessment can be made, and such assessment is more valid and reliable than a paper-pencil test. It captures students' actual performance, and it also encourages them to keep learning throughout the year, making them learning-oriented rather than merely test-oriented. Furthermore, such assessment makes tests familiar to students, so they are no longer scared of them. However, this kind of assessment demands individual records of students, which increases teachers' workload; that workload, therefore, needs to be balanced.
Performance assessment can replace the traditional paper-pencil test in the days to come. At present, however, we can at least use it as a supplement to the paper-pencil test, which would minimize the dominance of the memory-driven calligraphy marathon competition!
What is in this issue?
We have attempted to make this a language testing special issue, as we thought it necessary to create a discourse on language testing in order to change the face of the present testing system. So, let's wonder, ponder, share and care about language testing in this issue. Besides making this a language testing special, we have invented a new genre, the 'interactive article'. The idea is to bring experts and readers together to discuss and interact on a particular theme, and to explore among ourselves its unlimited possibilities.
One of the challenges of a thematic issue is to maintain variety. We have attempted to overcome this challenge by raising multiple issues within the issue. In the interactive article, we bring forward a number of issues of language testing in our context, including multilingual competence testing, formative assessment, professionalism in testing, the quality of tests (including testing listening and speaking at the secondary level), the faulty test construction process, and administration and validation. In the next entry, Balram Adhikari reflects on his own experience of marking university answer sheets. His thought-provoking article reveals the quality of students' writing at university and compels us to ponder the impressionistic way of marking answer sheets; it also brings forward the language-versus-content debate in marking. We have not only raised issues but also attempted to offer some suggestions. Ashok Raj Khati and Manita Karki present an alternative practice of language testing through classroom assessment, based on a lecture delivered by the prominent scholar Prof. Dr. Tirth Raj Khaniya. Umes Shrestha, for his part, shares his ideas about the faulty paper-pencil testing system in our context, describes his practice of marking answer sheets in a liberal way, and offers some alternatives to the existing system. In the next entry, a research-based article drawn from his Master's thesis, Bhupal Sin Bista shows that listening skill is neglected in both teaching and testing in most secondary schools of our country. He further reveals a distinct gap between teachers' theoretical knowledge and its application in the classroom, and then suggests ideas for teaching and testing listening effectively.
Last but not least, we have attempted to add a bit of black humour by depicting part of the language testing scenario in pictorial form.
Here is a list of contents included in this issue:
- Testing the Testing System of Nepal: An Interactive Article: Choutari Editors
- The Taste of Testing Students’ Test Papers: Bal Ram Adhikari
- Teaching and Testing Listening at Secondary Level: Bhupal Sin Bista
- Classroom Assessment: A New Era in Language Testing or An Additional Exercise?: Ashok Raj Khati and Manita Karki
- It is Funny and Yet Serious!: Choutari Team
- Giving a Benefit of Doubt in Assessment: Umes Shrestha
Now, I request you to share what you read and like, drop a comment to encourage the writers, and join the conversation by writing new entries for the upcoming issues of Choutari.
Lastly, I extend my sincere gratitude to Shyam Sharma and Balram Adhikari for their rigorous support and constructive feedback at every step in making this issue possible. Similarly, I am indebted to Praveen Kumar Yadav, Umes Shrestha and Ushakiran Wagle for their physical and moral support in materializing this issue!
June Issue, 2014