To run a successful 360 multi-rater review, follow these best practices:
- Purpose - Development should be the fundamental purpose of any 360 multi-rater review exercise.
- Raters - Participants should be rated only by those who have sufficient recent working experience with them. A simple rule is to select only raters who have worked with the participant within the past three months.
- Number of Raters - Research by Greguras & Robie (1998) suggests that four supervisors, eight peers and nine direct reports should be included in a 360 review to achieve statistically reliable results. However, this depends greatly on the participant’s job function and the organisational structure. At EngageRocket, we encourage a minimum of 10 raters for each participant.
- Confidentiality - It is good practice to ensure that feedback remains confidential. Providing this assurance allows raters to be open, honest and constructive. By default, EngageRocket sets a confidentiality threshold of 3 before aggregated results are displayed. This can be changed on the preferences page.
- Questions - EngageRocket provides a set of well-researched, validated questions. Questions should be clearly focused on a specific set of skills, competencies, or behaviours that are trainable.
- Ongoing process - A participant’s development plan should be revisited regularly, as skills and competencies are learned and developed over time, and a participant’s requirements may also change. To support development, the 360 multi-rater review process should be repeated regularly so that participants can measure progress, identify ongoing development needs, and adjust their development plans.
When the 360 multi-rater review process is run properly, you will see measurable results across the organisation. If you have more questions for us, write to us at [email protected].