This is a group for anyone involved in Occupationally Directed Education Training and Development Practices (ODETD) in any capacity. You are invited to join if this is your field of expertise or simply of interest. This is an opportunity to raise your voice on skills issues and to have the support of like-minded people.
Please only add discussion topics or reply to discussions. Advertising of your company, your products, services and events is not permitted in this group.
What are the “Principles of Assessment”?
9th Jul 2014 at 7:51 am #32422
PRINCIPLES OF ASSESSMENT
In order that effective quality assurance is maintained in all assessments, there needs to be some form of principles and practice that guide implementation. The quality of assessment is important in order to provide credible certification.
The following are some of the principles of assessment that are used to ensure fairness in assessment. In addition, adherence to these principles assures all parties involved in learning of the credibility of the assessment process, while at the same time providing accurate information about the achievements of the student who has been assessed.
VALIDITY
- The assessment process and the evidence required must be appropriate to what is being assessed. It must relate to the desired outcomes.
- Valid assessment implies the method used is the one most likely to give an accurate picture of the individual’s competence within a particular area.
To achieve validity, assessors should:
- State clearly what outcome is being assessed
- Use appropriate sources of evidence
- Use appropriate assessment methods
- Use the right instrument or test to measure learning
Validity has a particular significance within the NQA because what is being assessed is the evidence presented. This evidence must relate to what the student has learned.
If the evidence is not valid, the student will have to be reassessed using different, additional or more appropriate ways of gathering the required evidence of learning. No evidence is automatically valid or invalid; it is the learner's interpretation of the evidence, and how they present and justify its relevance, that makes it valid.
RELIABILITY
- The judgement confirms that the candidate's performance will be of a consistent standard in a range of different contexts.
- The same assessor would make the same judgement about the learner on a number of different occasions.
- Other assessors would make the same judgement about the learner.
Reliability and fairness are closely linked. The learner should feel confident that they have been fairly treated by the assessor, and the assessors should operate within a structured format that ensures the consistency of the assessment.
Consistency in the NQA is essential for employers, colleges and institutions of learning. Assessors will be required to confirm that the standards give a clear and accurate picture of how someone can perform in employment or in preparation for higher learning.
Each time an assessment is conducted, assessors need to ensure that similar conditions prevail and that procedures are appropriate and according to policy. They should:
- Give clear and concise instructions without creating any doubt.
- Ensure the assessment criteria and guidelines for the module are adhered to.
- Clarify the module requirements for learners.
- Make sure that internal and external quality management systems are in place and followed.
- Use the appropriate checklist to ensure reliability.
The method of assessment selected must be appropriate to the module being assessed. This relates back to reliability and validity.
SUFFICIENT AND SYSTEMATIC
The evidence is enough to prove competence, and planning and recording are sufficiently rigorous to ensure that assessment is fair.
Although evidence overload is an area of concern for assessors, there is a fine line between adding evidence for the sake of it and providing evidence that is sufficient to show competence.
Experience has shown that insufficient evidence does not usually mean too little evidence overall, but too little appropriate and relevant evidence.
If the assessment is to have meaning for candidates, then time should be spent discussing their own performance. Students need to learn to assess their own strengths and weaknesses while working out what evidence is most appropriate to demonstrate competence.
AUTHENTICITY
The evidence is genuine and has been produced by the learner. This is reasonably straightforward in direct observation.
However, work prepared prior to the assessment, or prepared away from the assessor, when presented to the assessor needs to be checked to ensure it is the work of the learner. The assessor needs to use every means available to make sure they give fair recognition to every authentic contribution.
INTEGRATION OF WORK OR LEARNING
Evidence collected is integrated into the work or learning process where this is appropriate and feasible.
FAIRNESS
The majority of assessments contain some subjective judgements, particularly when assessment is of more complex skills and knowledge. There may be situations that lead to subjective interpretations by an assessor. This may well reflect personal bias or prejudice, which is neither fair to the learner nor reliable as evidence of competence.
Another prejudice can occur if the assessor knows the learner personally and has formed an opinion of him or her prior to the assessment. It is advisable that there is strict adherence to the module standards and assessment requirements. Assessors should not allow their personal preferences to cloud their judgment.
The appeals procedure gives learners an opportunity to appeal against any judgement they see as unfair. In addition the moderator or verifier’s role is designed to eliminate such unfairness.
For assessment procedures to be fair, assessors should take note of the following:
- Perceptions of inequality related to the appropriate learning tools and other related resources.
- Perceived bias such as race, gender, age, disability, social class etc. in the assessment process.
The process must be transparent and open to any scrutiny from the learners or persons involved in the assessment process. Learners have options available to them if they feel they have been compromised or unfairly treated – the appeals policy allows such options.
Assessors must find the most effective way of moving the student towards competence. This should be done in a manner which is fast, manageable and cost effective. The purpose of the NQA system is to promote the development of skills and to promote equity while at the same time ensuring effectiveness. The entire assessment system must therefore be manageable.
One aspect that contributes to the success of outcomes based education and assessment is that the learner is an essential part of the process and must therefore be kept involved. Learners can contribute to the planning and accumulation of evidence, particularly where RPL is concerned. The learner should be confident about the process and how assessment will take place as well as all the criteria that apply.
CONSISTENCY
An assessment is only legitimate if assessors in different locations would make the same judgement about the same candidates based on the same evidence. There must be consistency and standards related to the assessment criteria.
9th Jul 2014 at 12:29 pm #32442 – DK Teyim, Participant
Thanks for the details on VARCCS. I believe one of my fundamental problems is in the memos provided by the material developers. Let's say material is verified as correct by the SETA. What happens if the assessor and the (internal) moderator cannot agree on some of the answers? As an assessor, I believe that some of the memo answers provided by the material developer are misleading and incorrect, but the internal moderator wants me to stick to the memo. This even includes calculation errors in the maths components.
Do I stick to the memo? (My conscience says no: garbage in = garbage out.)
Do I redo the memo and use it with my learners? (That is what I do, and the new memo is communicated to the provider.)
Do I ask the provider to update the memo using my adjusted memo? (Yes, I do that, but the provider is not interested and says the material already passed verification.)
Will there be any future problems with the use of my "unverified memo"? I have no idea; I have never had any.
Any advice?
9th Jul 2014 at 12:39 pm #32441 – Lynel Farrell, Keymaster
That is such a good question, DK! What the provider must take into consideration is that learning material (which includes all instruments such as assessment tools and marking memorandums) must be current, reviewed and updated at all times. The mistakes you have noted must be sent to the provider (please keep this on record). Should the mistakes be picked up by the verifiers, they might just reject the verification until such time as the instruments are corrected. Incorrect information could disadvantage a learner, and this we never allow. It is only right that you notify the provider of the mistakes and give the corrections.
If the learning material is correct and the outcomes are not being changed, then the instruments must be updated and given a new version for quality control and record keeping. This must be documented and kept as evidence.
9th Jul 2014 at 12:46 pm #32440 – DK Teyim, Participant
Thank you Lynel. That is exactly what I have been doing. Whether the provider agrees to update and go with my changes or not, as long as I can back my answers up, I go with them, and I make sure I communicate with the provider and keep my records.
10th Jul 2014 at 9:50 am #32439 – Alexander Robertson, Participant
DK Teyim, I was once stopped from doing facilitation and assessing by one training provider because I told them that their manuals needed to be updated, as there were items in them which were no longer applicable. I was informed that their other assessors liked teaching out-of-date material, and that they could not have my material up to date while the others' was out of date!
Why teach something which is out of date? I know where you are coming from. Another said, "Well, correct the manual!" No recognition for the corrections!
10th Jul 2014 at 10:13 am #32438 – Daniel Hadebe, Participant
You got it right. The chances of a learner being disadvantaged become smaller, so long as we assessors stick to these principles.
10th Jul 2014 at 10:29 am #32437
DK, please note the following:
- Training material approved by the SETA meets minimum requirements and is not cast in stone; it should be flexible, taking into account the current context;
- The memos approved by the SETA with the training material should at all times serve as a guide to the assessors, not as a my-way-or-the-highway issue;
- As a registered assessor who has signed the assessor code of conduct, you are responsible for making independent assessment decisions, guided by the assessment guide and memos;
- Verifiers from the SETAs should not dictate terms; they have a duty to listen to assessors and moderators (remember, you are the expert on the subject matter and, in most cases, they are not). The assessor and moderator standards should guide the conduct and work of assessors and moderators;
- You can review the memo at provider level, and this does not need SETA approval prior to implementation. Just explain it in the assessment and moderation reports and give a clear rationale; and
- This is a view that I promote in my SETA: SETAs and providers should work towards a common goal. We are equal and should be allowed space to register our views and/or decisions.
Good luck
10th Jul 2014 at 1:02 pm #32436
Nice post… although most ETD practitioners can recite these assessment principles, many fail to grasp their correct application; thus, an excellent discussion topic. The experience that DK shared cements my view that practitioners fail to grasp the principles, and it raises a few questions. I shall elaborate as briefly as possible.
Firstly, it is my understanding that competent learners should demonstrate an ability to apply what they have learnt in a variety of contexts. SAQA further states that assessments should assess 'applied competence'; viz. practical, foundational and reflexive competence. Now, relating to DK's questions, I have some questions of my own. How can a 'memo' be cast in stone when the assessment is required to produce evidence that will demonstrate 'applied competence'? In particular, I refer to the matter of reflexive competence. I'm sorry if I am going to hurt anybody's sensitivities here, but any training manager (particularly those at the SETAs) who endorses an assessment that does not evidence this has failed.
Sorry if I seem to be rambling, but I am getting to the point. The/my point is that both the assessment and memorandum should be flexible enough to allow for learners to demonstrate competence within their individual work/personal context. Thus, the memorandum becomes a ‘proposed solution’ (as opposed to a ‘model answer’) that caters for such differing contexts and much of it cannot therefore be cast in stone. There are loads of free resources to guide assessors (and training managers) in this regard, so there really is no excuse. A good departure point is this old SAQA document that I just dusted off – it may have been written over a decade ago, but it is still ‘gold’ – http://www.saqa.org.za/docs/guide/2001/assessment.pdf
Lastly, to the training providers out there who are paying loads of money to developers to develop material for them. If they are not developing resources that will help your trainers lead learners towards ‘Applied Competence’ (AND assess for it), you have been short changed (no pun intended). That aside, the advice given to DK (and her approach) is sound.
Many thanks for stimulating the debate via your post, Des.
10th Jul 2014 at 8:06 pm #32435 – Cas Olivier, Participant
I fully support your view, Steve (it has been a long time since we heard from you).
My latest (negative) experience was with a provider/developer who requested me to moderate material for accreditation purposes.
NONE of the activities 'developed' (in fact drawn from other resources) met the requirements of the specific outcomes and assessment criteria.
When challenging the provider on the principle of validity, the motivation was: “These activities were used in a previous programme and the learners enjoyed them”.
Des, therefore the principles of assessment in fact start with learning material development, keeping in mind the assessments which are going to take place in future.
11th Jul 2014 at 6:21 am #32434 – Lynel Farrell, Keymaster
Absolutely awesome reply, Mthunzi! SETAs and providers should work together!
11th Jul 2014 at 6:58 am #32433
If there are errors on the marking memo, it is up to you as the assessor to ensure they are corrected prior to assessment. A moderator does not have the right to insist that you make use of something that is wrong. If this is the case, then you as the assessor have the right to appeal to the verifier. The evidence must be VALID and RELIABLE – nothing in the assessment should impact negatively on the learner.
11th Jul 2014 at 6:59 am #32432 – Alexander Robertson, Participant
If it is not aligned to the US, it is NOT good enough, no matter how much the learners enjoy it! I agree with you wholeheartedly. I have come across this many times; as a facilitator, and now as a provider as well, I need to ensure my material is up to date with market trends as well as US aligned.
11th Jul 2014 at 7:06 am #32431
We have a responsibility as assessors and moderators to conduct the assessment or moderation bearing in mind the content of the training manual. This is what was used in training, and its content is what has been imparted to the learner. We would therefore be required to assess accordingly, making use of the related instrument and marking memorandum. If, however, there are errors in either the training material or the marking memo, then we are obligated to discuss these with the provider concerned. We are also obligated to recommend changes and to note these recommendations in our assessor and/or moderator report for the benefit of the verifier. The provider would then be obligated to rectify them before the next assessment.
11th Jul 2014 at 7:10 am #32430
Developers must also make sure they cater for the entry-level requirements, and providers must ensure these are complied with. Assessment instruments must take into account diagnostic assessment when required, as well as formative assessments, summative assessments and CCFO verification or assessment.
These areas are so often overlooked.
11th Jul 2014 at 7:17 am #32429
Hi Cas. Yes, material development should be aligned to the SOs and ACs in all cases, and the assessment must be aligned to these as well as to the CCFOs. The problem, of course, is that many developers have just taken it upon themselves to develop training material without fully understanding the requirements of a qualification or a unit standard. No developer should be developing unless they have undergone training on 115755 Develop outcomes-based assessment; for that matter, I believe all assessors should also be trained on this unit standard, because it is for people who design and develop assessments to facilitate consistent, credible, reliable, fair and unbiased assessments of learning outcomes.
11th Jul 2014 at 1:04 pm #32428
YES Cas! Of course it starts with the material development. I'm quite surprised that nobody latched onto your comment: "These activities were used in a previous programme and the learners enjoyed them". If that's the provider's sole motivation, then where does the end user come into the picture? Anyhow, I don't even want to go into secondary evaluation, but I must add one question in that regard: how can providers continue to attract business if their focus is on learners "enjoying the assessment activities"?
Yes, I have been very quiet (and busy), Cas, and I will make a catch-up email a priority 🙂
11th Jul 2014 at 1:14 pm #32427
Indeed, Des, they are, but my primary point was the issue of assessing for 'applied competence'. I think I shall post something about that and see what new insights we might gain from it.
11th Jul 2014 at 1:59 pm #32426 – Irene James, Participant
Hi Mthunzi, I agree with all of your points 100%. But the question I would like to ask is why we are allowing the SETAs to dictate to us in the first place. "Model answers" belong to an old-fashioned, traditional methodology (tick to get the marks, all 30 or 50% of them, to "pass") and have little place, if any, in a competency-based OBE methodology.
Are we still expected to churn out little clones who spot and swot and spit out rote-learned "model answers" that are forgotten within a week or two? Where is the value in that (unless, of course, it is a critical or essential bit of knowledge that needs to be embedded)?
As quality providers, we should be working with well-developed rubrics that lead us to the scope of evidence we are looking for in responses. As facilitators, we are subject matter experts in our field (or else we should not be facilitating), and we know what constitutes an excellent, mediocre or inferior response in an assessment, surely? (Incidentally, that is why I also do not believe that the facilitator and assessor functions should be split. How do you assess the CCFOs if you have not been present in the learning?) I say we should rally to get rid of the notion of model answers altogether and move towards a more appropriate system of assessment rubrics. Perhaps we could start by explaining to the SETAs what an assessment rubric is?
11th Jul 2014 at 2:04 pm #32425
Yes Irene 🙂
11th Jul 2014 at 2:06 pm #32424 – Irene James, Participant
Agree 100%, Steve. Glad you are on the same page!
11th Jul 2014 at 11:16 pm #32423 – Victoria Siphiwe Mamvura-Gava, Participant
I agree with both of you, Des and Cas.