A Worldwide Team Rolls In – Crowdsourcing Asteroid Hunters

April 2, 2014 Appirio

photo credit: NASA #Cosmos

It’s all too common.

Here’s the formula:

  • Big Announcement! (bright lights, glitter, much pomp and circumstance)

  • [crickets]

  • Confusion, tears, lack of closure, and general melancholy

Just a few days ago, Topcoder and the NASA Tournament Lab announced the Asteroid Data Hunter Challenge, part of the Asteroid Grand Challenge series. To avoid the all-too-common scenario mentioned above, I figured we should provide an update on the progress of this currently ongoing, potentially earth-saving, series of crowdsourcing challenges.

Now that the review phase has closed for the first contest, here are a few data points from the challenge to noodle on:

  1. Out of the 382 total registrants for the contest, 329 are brand new members (registered on or after 03/10/14, which is when the challenge was officially announced)

  2. The challenge has received a total of 24 submissions – 5 didn’t pass the initial screening, and 19 went on for review (for context, we typically get 2 to 4 submissions for content creation / problem statement contests)

  3. Of the top 5 submissions, 4 of the members registered after the announcement of the contest (including the 1st place winner)

  4. The top 5 winners (in order of placement) are from: Romania, Poland, Spain, the United States & the United Kingdom


What can we glean from these initial stats?

  • Fresh Thinking, On Tap. 86% of the participants are brand new to Topcoder. This is surely due in part to the promotional power and gravity of the NASA ecosystem. Regardless, it’s a testament to crowdsourcing’s ability to bring fresh thinking to every challenge, even for repeat sponsors like NASA.

  • Bet Big, Win Big. A major selling point for a competition-based crowdsourcing model like Topcoder can be the efficiency of the model: you only pay for results. However, that efficiency can be applied in multiple ways, and in an innovation-driven challenge like the Asteroid Data Hunter Challenge, bigger prizes easily translate into bigger (or more) results.

  • Joy’s Law Revised. Joy’s Law is pretty clear: “No matter who you are, most of the smartest people work for someone else”. Or do they? Topcoder finally provides a way for organizations to tap into a world of top technical development expertise, regardless of who they work for.

Again, this is just round 1. The #AsteroidHunters search goes on. If you want to be a part of the experience, go to Topcoder.com/asteroids and click on one of the asteroids on the left side of the page… maybe you’ll find something interesting yourself.

The challenge continues.

