Community evaluation of crowd-sourced ideas: An explorative study on how to improve the prediction accuracy of crowdsourcing communities
Compare 5 offers
Best price: € 43.95 (as of 01.07.2017)
1
Community evaluation of crowd-sourced ideas
DE NW
ISBN: 9783330509733 or 3330509732, in German, new.
Ships from: Germany, delivery time: 7 days.
In 2008, Google initiated an ideation challenge called 'Project 10 to the 100'. They asked, quite openly, 'What would help most?' and received more than 150,000 ideas from people all over the globe. As Google's 'Project 10 to the 100' and many other real-life examples show (e.g. Cisco's 'I-Prize' or Nokia's 'Tune Remake'), the number of submissions to crowdsourcing contests can be stunning. However, the majority of submitted ideas are usually of very low quality; according to Sturgeon's law (1958), 90% of everything is crap. Firms are often overloaded by the 'noise' generated on crowdsourcing platforms and face the problem of not being able to filter and select the best ideas (or only being able to do so with substantial effort). Recently, scholars have proposed that integrating crowdsourcing communities into the evaluation process is a very promising method for filtering out high-quality ideas. In this explorative study, Georg Terlecki-Zaniewicz analyzed eleven crowdsourcing platforms and concluded with a framework that makes the evaluation of crowd-sourced ideas through community evaluation both more efficient and more accurate.
2
Community evaluation of crowd-sourced ideas: An explorative study on how to improve the prediction accuracy of crowdsourcing communities (2017)
DE PB NW
ISBN: 9783330509733 or 3330509732, in German, AV Akademikerverlag, June 2017, paperback, new.
Ships from: Germany, free shipping.
From dealer/antiquarian bookseller, AHA-BUCH GmbH [51283250], Einbeck, Germany.
New stock. 92 pp. English.
3
Community evaluation of crowd-sourced ideas (2008)
DE PB NW
ISBN: 9783330509733 or 3330509732, in German, paperback, new.
Ships from: Germany, free shipping.
From dealer/antiquarian bookseller, European-Media-Service Mannheim [1048135], Mannheim, Germany.
Publisher: AV Akademikerverlag | An explorative study on how to improve the prediction accuracy of crowdsourcing communities | Format: Paperback | Language: English | 92 pp.
4
Community evaluation of crowd-sourced ideas
DE HC NW
ISBN: 9783330509733 or 3330509732, in German, AV Akademikerverlag, hardcover, new.
The description of this offer is of low quality or in a foreign language.
5
Community evaluation of crowd-sourced ideas: An explorative study on how to improve the prediction accuracy of crowdsourcing communities (2017)
EN PB NW
ISBN: 9783330509733 or 3330509732, in English, 92 pages, Av Akademikerverlag, paperback, new.
Ships from: Germany, ready to ship in 1-2 working days, free shipping.
From dealer/antiquarian bookseller, dodax-shop-eu.
The description of this offer is of low quality or in a foreign language.