Community heuristics for user interface evaluation of crowdsourcing platforms

Simon à Campo (Corresponding author), Vassilis Javed Khan, Konstantinos Papangelis, Panos Markopoulos

Research output: Contribution to journal › Journal article › Academic › peer review

15 Citaten (Scopus)
1 Downloads (Pure)


Crowdsourcing is growing rapidly in both industry and academia, introducing new ways of conducting work and improving our understanding of how to utilize the potential of crowds. Related research has emphasized how to improve crowdsourcing platforms and related practices to foster collaboration, motivation, trust, quality, and creativity. However, these challenges do not seem to be as apparent in vibrant online communities. Research on how to make online communities work provides insights into how to address the challenges crowdsourcing currently faces. For this work, we have gathered relevant design guidelines (heuristics) for online communities from the literature and applied them to 20 crowdsourcing platforms to evaluate how those platforms conform to the heuristics. The heuristics can be used as a tool for designers of crowdsourcing platforms to evaluate how to improve these platforms and to compare them to their competition. Finally, our paper highlights the challenges crowdsourcing platforms currently face in acquiring the positive aspects of online communities.

Original language: English
Pages (from-to): 775-789
Number of pages: 15
Journal: Future Generation Computer Systems
Status: Published - June 2019

