CIR 2011
Deadline: June 09, 2011 | Date: July 28, 2011
Venue/Country: Beijing, China
Updated: 2011-06-01 07:17:48 (GMT+9)
**Deadline Extension to Thursday, June 9th**

CALL FOR PAPERS

The advent of crowdsourcing is driving a disruptive shift in IR areas such as evaluation, learning to rank, and the development of new hybrid man+machine systems which blend automation and crowd computation (potentially in real time) to deliver innovative functionality and search experiences. Traditionally labor-intensive tasks like judging relevance and annotating training data can now be accomplished more quickly and accurately, and at a fraction of traditional costs. Despite this potential and early successes of crowdsourcing in IR, many significant challenges remain with regard to theory, methodology, policy, and best practices that currently limit our ability to realize this potential in practice.

We invite submissions of papers describing novel, unpublished research addressing one or more of the following areas:

* General: theoretical, experimental, and/or methodological developments advancing state-of-the-art knowledge of crowdsourcing for IR
* Applications: search blending automation with the crowd, especially real-time systems which must model dynamic and temporal properties of crowd behavior
* New functionality: use of crowdsourcing to realize innovative search features (e.g. using geographic dispersion of the crowd for local search or to detect geo-specific intents)
* Machine learning: consensus labeling for vote aggregation, active learning strategies for efficient labeling, and learning to rank with noisy crowd labels and multi-labeling (an illustrative vote-aggregation sketch appears after the committee list below)
* Evaluation: evaluating systems with noisy and multi-labeled relevance judgments
* Infrastructure: new software packages and toolkits which simplify or otherwise improve general support for crowdsourcing or for particular tasks (e.g. TurKit, Get Another Label)
* Human factors and task design: how to design effective interfaces and interaction mechanisms for the crowd; how to enable effective crowd performance on tasks traditionally requiring scarce and expensive domain experts; how different forms of crowdsourcing or crowd motivations (fun, socialization, prestige, economic, etc.) might be selected or tailored for different IR tasks (e.g. Page Hunt)
* Vision: reflective or forward-looking position papers on the use of crowdsourcing for IR

IMPORTANT DATES

Workshop Papers
Submissions: June 9, 2011
Notification of acceptance: July 5, 2011
Camera-ready: July 12, 2011
Workshop: July 28, 2011

WORKSHOP ORGANIZERS

Vaughn Hester, CrowdFlower, USA
Matthew Lease, University of Texas at Austin, USA
Alex Sorokin, CrowdFlower, USA
Emine Yilmaz, Microsoft, UK

You can email the organizers at cir2011-org *at* googlegroups *dot* com.

PROGRAM COMMITTEE

Omar Alonso, Microsoft Bing
Paul Bennett, Microsoft Research
Adam Bradley, Amazon.com
Ben Carterette, University of Delaware
Charlie Clarke, University of Waterloo
Harry Halpin, University of Edinburgh
Jaap Kamps, University of Amsterdam
Martha Larson, Delft University of Technology
Gabriella Kazai, Microsoft Research
Mounia Lalmas, University of Glasgow
Edith Law, Carnegie Mellon University
Don Metzler, University of Southern California
Stefano Mizzaro, University of Udine
Stefanie Nowak, Fraunhofer IDMT
Iadh Ounis, University of Glasgow
Mark Sanderson, RMIT University
Mark Smucker, University of Waterloo
Ian Soboroff, National Institute of Standards and Technology
Siddharth Suri, Yahoo! Research
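For readers new to the "consensus labeling for vote aggregation" topic listed above: the basic idea is to combine redundant, noisy labels from multiple crowd workers into a single judgment per item. The sketch below is a minimal illustration using simple majority voting; the function name and data layout are hypothetical and are not tied to the workshop, to any submission, or to any particular crowdsourcing toolkit.

```python
from collections import Counter

def aggregate_majority_vote(judgments):
    """Aggregate noisy crowd labels by simple majority vote.

    judgments: dict mapping an item id to the list of labels that
    workers assigned to that item. Returns a dict mapping each item
    id to its most frequent label; ties are broken by whichever
    label was seen first.
    """
    return {
        item: Counter(labels).most_common(1)[0][0]
        for item, labels in judgments.items()
    }

if __name__ == "__main__":
    # Hypothetical relevance judgments from three workers per document.
    crowd = {
        "doc1": ["relevant", "relevant", "not_relevant"],
        "doc2": ["not_relevant", "not_relevant", "relevant"],
    }
    print(aggregate_majority_vote(crowd))
    # {'doc1': 'relevant', 'doc2': 'not_relevant'}
```

Majority voting is only the simplest baseline; much of the research solicited above concerns weighting workers by estimated reliability or modeling label noise explicitly rather than counting votes equally.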