Human activity analysis has been investigated and applied in various fields, such as context-aware computing, search engines, social network services, location-based services, automated visual surveillance, and multimodal human-computer interaction. However, human activities in the review process of academic conferences have seldom been explored. There is no doubt that the review process plays an important role in deciding whether a paper is accepted. In this paper, we present our work on understanding review activities by analyzing the anonymized review data of two conferences (ACM SIGCOMM and UIC). Descriptive statistics and data mining techniques are adopted in the analysis. We obtained several interesting findings that help explain how reviewers give their reviews in academic conferences, such as the relationships among score, confidence, and review length, and reviewer activity patterns. © 2013 SERSC.
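The abstract refers to examining the relationships among review score, reviewer confidence, and review length with descriptive statistics. A minimal sketch of how such an analysis could look, assuming hypothetical column names and made-up example values (this is not the authors' code or data), is shown below.

```python
# Sketch of a descriptive analysis of review records.
# The DataFrame contents are illustrative placeholders only.
import pandas as pd

# Hypothetical anonymized review records: one row per review.
reviews = pd.DataFrame({
    "score":      [5, 3, 4, 2, 5, 1, 4, 3],                  # overall rating
    "confidence": [4, 3, 4, 2, 5, 3, 4, 2],                  # reviewer self-reported confidence
    "length":     [820, 310, 560, 150, 900, 90, 640, 280],   # review text length (characters)
})

# Per-variable descriptive statistics (mean, std, quartiles, ...).
print(reviews.describe())

# Pairwise Pearson correlations among score, confidence, and review length.
print(reviews.corr(method="pearson"))
```

A positive correlation between confidence and review length in such a table would, for example, suggest that more confident reviewers tend to write longer reviews; the actual findings are reported in the paper itself.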
CITATION STYLE
Yu, Z., Zhang, X., Liang, Y., & Zhang, D. (2013). How did reviewers give their reviews? Characterizing review activity in academic conferences. International Journal of Multimedia and Ubiquitous Engineering, 8(5), 361–374. https://doi.org/10.14257/ijmue.2013.8.5.36