I get low coder reliabilities with Krippendorff’s Alpha. What do I do?

Krippendorff’s Alpha (KALPHA) is calculated both during the development of the codebook and in the analysis phase. It is important to get your intercoder reliability in order before you start coding your sample. Two things can cause KALPHA to be low. The first, and most important, reason is low agreement between coders. The other is that the data are binary and one of the categories is coded very rarely. To find out which problem you have, calculate the percent agreement between all coders alongside KALPHA.

If agreement between all coders is actually low, either the codebook should define the categories more clearly or the coders should be trained better. Don’t forget to check whether one coder disagrees with all the others; in that case this coder should either be trained better or asked to leave the research. If the agreements are high, the data are binary, and it is evident that one of the categories is coded rarely, you can ignore KALPHA and use another measure of intercoder reliability.

If your intercoder reliability turns out low after the data collection, you can try to improve it by reducing the number of categories in your variables; with fewer categories there may be more agreement. For more information about KALPHA, see:
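To illustrate the rare-category problem, here is a minimal sketch in plain Python (the function names are mine, not from a standard package) that computes percent agreement and Krippendorff’s Alpha for nominal data. With a binary variable where one category occurs only once, percent agreement looks high while KALPHA collapses:

```python
from collections import Counter
from itertools import permutations

def percent_agreement(coder1, coder2):
    """Share of units on which two coders assigned the same code."""
    matches = sum(a == b for a, b in zip(coder1, coder2))
    return matches / len(coder1)

def kalpha_nominal(units):
    """Krippendorff's Alpha for nominal data.

    `units` is a list of lists: each inner list holds the codes that
    the coders assigned to one unit (missing codings simply omitted).
    """
    o = Counter()  # coincidence matrix of ordered value pairs
    for values in units:
        m = len(values)
        if m < 2:
            continue  # a unit coded by fewer than two coders is uninformative
        for a, b in permutations(values, 2):
            o[(a, b)] += 1.0 / (m - 1)
    n_c = Counter()          # marginal totals per category
    for (a, _), w in o.items():
        n_c[a] += w
    n = sum(n_c.values())
    observed = sum(w for (a, b), w in o.items() if a != b)
    expected = sum(n_c[a] * n_c[b] for a in n_c for b in n_c if a != b)
    return 1.0 - (n - 1) * observed / expected

# Binary variable, 10 units, two coders; category 1 occurs only once.
coder1 = [0, 0, 0, 0, 0, 0, 0, 0, 0, 1]
coder2 = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
units = list(zip(coder1, coder2))
print(percent_agreement(coder1, coder2))  # 0.9 — looks fine
print(kalpha_nominal(units))              # 0.0 — Alpha collapses
```

The coders here agree on nine out of ten units, yet Alpha is zero, because nearly all of that agreement is on the frequent category and would be expected by chance. This is the situation in which the percent-agreement check reveals that the low KALPHA is a base-rate artifact rather than genuine disagreement.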