Research

Reliability of an injury scoring system for horses

Cecilie M Mejdell1*, Grete HM Jørgensen2, Therese Rehn3, Kjersti Fremstad2, Linda Keeling3 and Knut E Bøe2

Author Affiliations

1 Section for Domestic Animal Health and Welfare, National Veterinary Institute, P.O.Box 750 Sentrum, 0106 Oslo, Norway

2 Department of Animal and Aquacultural Sciences, Norwegian University of Life Sciences, P.O.Box 5003, 1432 Ås, Norway

3 Department of Animal Environment and Health, Swedish University of Agricultural Sciences, Box 7068, 750 07 Uppsala, Sweden


Acta Veterinaria Scandinavica 2010, 52:68  doi:10.1186/1751-0147-52-68

Published: 31 December 2010



The risk of injuries is a major concern when keeping horses in groups, and there is a need for a system to record external injuries in a standardised and simple way. The objective of this study, therefore, was to develop and validate a system for injury recording in horses and to test its reliability and feasibility under field conditions.


Injuries were classified into five categories according to severity. The scoring system was tested for intra- and inter-observer agreement as well as agreement with a 'golden standard' (diagnosis established by a veterinarian). The scoring was done by 43 agricultural students who classified 40 photographs presented to them twice in a random order, 10 days apart. Attribute agreement analysis was performed using Kendall's coefficient of concordance (Kendall's W), Kendall's correlation coefficient (Kendall's τ) and Fleiss' kappa. The system was also tested on a sample of 100 horses kept in groups where injury location was recorded as well.
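The agreement statistics named above are standard and easy to reproduce. As an illustration only (the data below are not from the study), a minimal sketch of Fleiss' kappa for a subjects-by-categories count matrix and of Kendall's W for tie-free rankings might look like this:

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for multiple raters and nominal categories.
    ratings[i][j] = number of raters who placed subject i in category j;
    every row must sum to the same number of raters n."""
    N = len(ratings)            # number of subjects
    n = sum(ratings[0])         # raters per subject
    k = len(ratings[0])         # number of categories
    # Overall proportion of assignments falling in each category.
    p = [sum(row[j] for row in ratings) / (N * n) for j in range(k)]
    # Observed agreement per subject, then averaged.
    P_bar = sum((sum(c * c for c in row) - n) / (n * (n - 1))
                for row in ratings) / N
    # Agreement expected by chance.
    P_e = sum(pj * pj for pj in p)
    return (P_bar - P_e) / (1 - P_e)


def kendalls_w(rankings):
    """Kendall's coefficient of concordance, W, for m raters ranking
    n subjects; assumes complete rankings with no ties."""
    m, n = len(rankings), len(rankings[0])
    # Rank sum for each subject across raters.
    R = [sum(r[i] for r in rankings) for i in range(n)]
    R_bar = m * (n + 1) / 2
    S = sum((Ri - R_bar) ** 2 for Ri in R)
    return 12 * S / (m * m * (n ** 3 - n))
```

For example, `fleiss_kappa([[3, 0], [0, 3], [3, 0]])` returns 1.0 (perfect agreement), and three raters producing identical rankings give `kendalls_w` of 1.0. Note that ordinal injury scores typically contain ties, so published analyses (as in this study) use tie-corrected variants.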


Intra-observer agreement showed Kendall's W ranging from 0.94 to 0.99, and 86% of observers had kappa values above 0.66 (substantial agreement). Inter-observer agreement had an overall Kendall's W of 0.91 and a mean kappa value of 0.59 (moderate). Agreement between all observers and the 'golden standard' had a Kendall's τ of 0.88 and a mean kappa value of 0.66 (substantial). The system was easy to use for trained persons under field conditions. Injuries in the more serious categories were not found in the field trial.


The proposed injury scoring system is easy to learn and use, even for people without a veterinary education; it shows high reliability and is clinically useful. The injury scoring system could be a valuable tool in future clinical and epidemiological studies.