Community heuristics for user interface evaluation of crowdsourcing platforms

Simon à Campo (Corresponding author), Vassilis Javed Khan, Konstantinos Papangelis, Panos Markopoulos

Research output: Contribution to journal › Article › Academic › peer-review

30 Citations (Scopus)
164 Downloads (Pure)


Crowdsourcing is growing rapidly in both industry and academia, introducing new ways of conducting work and improving our understanding of how to utilize the potential of crowds. Related research has emphasized how to improve crowdsourcing platforms and related practices to foster collaboration, motivation, trust, quality, and creativity. However, these challenges do not seem to be as apparent in vibrant online communities. Research on how to make online communities work provides insights into how to address the challenges crowdsourcing faces right now. For this work, we have gathered relevant design guidelines (heuristics) for online communities from the literature and applied them to 20 crowdsourcing platforms to evaluate how well those platforms conform to the heuristics. The heuristics can be used as a tool for designers of crowdsourcing platforms to evaluate how to improve their platforms and to compare them to the competition. Finally, our paper highlights the challenges crowdsourcing platforms currently face in acquiring positive aspects of online communities.

Original language: English
Pages (from-to): 775-789
Number of pages: 15
Journal: Future Generation Computer Systems
Publication status: Published - Jun 2019


  • Crowdsourcing
  • Design methods
  • Online community analysis and support


