Authors
Radu-Daniel Vatavu, Jacob O Wobbrock
Publication date
2015/4/18
Book
Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems
Pages
1325-1334
Description
We address in this work the process of agreement rate analysis for characterizing the level of consensus between participants' proposals elicited during guessability studies. Two new measures, i.e., disagreement rate for referents and coagreement rate between referents, are proposed to accompany the widely-used agreement rate formula of Wobbrock et al. [37] when reporting participants' consensus for symbolic input. A statistical significance test for comparing the agreement rates of k ≥ 2 referents is presented in analogy with Cochran's success/failure Q test [5], for which we express the test statistic in terms of agreement and coagreement rates. We deliver a toolkit to assist practitioners in computing agreement, disagreement, and coagreement rates, and in running statistical tests for agreement rates at the p = .05, .01, and .001 levels of significance. We validate our theoretical development of agreement rate analysis in …
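The agreement rate the abstract refers to can be sketched in a few lines: for one referent, it is the fraction of participant pairs whose proposals match. This is a minimal illustration, not the authors' toolkit; the function name and example data are hypothetical.

```python
from collections import Counter

def agreement_rate(proposals):
    """Pairwise agreement rate AR(r) for a single referent.

    `proposals` is a list of symbolic labels, one per participant.
    Identical labels form the groups P_i; AR(r) is the number of
    agreeing participant pairs over all pairs:
        AR(r) = sum(|P_i| * (|P_i| - 1)) / (|P| * (|P| - 1))
    """
    n = len(proposals)
    if n < 2:
        return 0.0
    groups = Counter(proposals)  # size of each identical-proposal group
    return sum(c * (c - 1) for c in groups.values()) / (n * (n - 1))

# Hypothetical elicitation data: 20 participants, three distinct proposals
props = ["swipe"] * 12 + ["tap"] * 5 + ["pinch"] * 3
print(round(agreement_rate(props), 3))  # → 0.416
```

The disagreement rate for a referent mentioned in the abstract is then simply the complement, 1 − AR(r), over the same proposal set.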
Total citations
Citations per year — 2015: 3, 2016: 14, 2017: 19, 2018: 22, 2019: 40, 2020: 44, 2021: 33, 2022: 37, 2023: 32, 2024: 17