Sponsored Tasks and Solver Participation in Crowdsourcing Contests
Prof. Jianqing Chen
Ashbel Smith Professor in Information Systems
Jindal School of Management
The University of Texas at Dallas
Crowdsourcing platforms provide venues for firms seeking solutions (seekers) to interact with individuals who can provide them (solvers). As crowdsourcing contest platforms have grown in popularity, with numerous tasks posted daily, a concern has emerged that many similar tasks compete for solver attention and that some tasks fail to attract sufficient solver participation. To alleviate this concern, in addition to regular task listings, many crowdsourcing platforms offer sponsorship programs under which seekers pay an extra fee to highlight their tasks and draw solvers’ attention. We examine the effect of sponsorship on solver participation using a unique data set collected from a leading crowdsourcing platform. In contrast to platforms’ claims about the effect of sponsorship on participation, we find that sponsorship does not always boost participation in crowdsourcing contests; it increases the number of participants only when a task’s prize amount is already high. Furthermore, even when the number of participants increases, the increase comes primarily from low-ability solvers. We also find that when sponsorship increases the total number of submissions, it does so only by attracting more participants; sponsorship does not increase the number of submissions an individual solver makes after joining a task. A more granular analysis reveals that the increased competition solvers anticipate from sponsorship affects the participation of high-ability solvers but not that of low-ability solvers, which explains the difference in their participation decisions when facing sponsored tasks. Finally, we find that the effect of sponsorship weakens over the duration of a task for high-ability solvers and is weaker for solvers with more experience on the platform.