
    CIRSE 2010 - The 2nd International Workshop on Contextual Information Access, Seeking and Retrieval Evaluation


    Website: www.irit.fr/CIRSE


    Deadline: January 20, 2010 | Date: March 28, 2010

    Venue/Country: Milton Keynes, U.K.

    Updated: 2010-06-04 19:32:22 (GMT+9)

    Call For Papers - CFP

    The 2nd International Workshop on Contextual Information Access, Seeking and Retrieval Evaluation
    in conjunction with ECIR-2010
    Milton Keynes, UK, March 28th, 2010
    Aims
    Since the 1990s, interest in the notion of context in Information Access, Seeking and
    Retrieval has increased. Many researchers have been concerned with the use of context in
    adaptive, interactive, personalized or collaborative systems, the design of explicit
    and implicit feedback techniques, the investigation of relevance, and the application of the
    notion of context to problems such as advertising or mobile search.
    The previous edition of this workshop, held in Toulouse (CIRSE 2009), and related workshops and
    conferences, e.g. IR in Context (IRiX, 2005), Adaptive IR (AIR, 2006, 2008),
    Context-based IR (CIR, 2005, 2007) and Information Interaction in Context (IIiX, 2006, 2008),
    gathered researchers exploring theoretical frameworks and applications focused
    on contextual IR systems.
    An important issue which gave rise to discussion has been evaluation. It is commonly accepted
    that the traditional evaluation methodologies used in the TREC, CLEF, NTCIR and INEX campaigns
    are not always suitable for capturing the contextual dimensions of the information
    seeking/access process. Indeed, laboratory-based or system-oriented evaluation is challenged by
    the presence of contextual dimensions such as user interaction, profile or environment, which
    significantly affect the relevance judgments or usefulness ratings made by the end user.
    Therefore, new research is needed to understand how to overcome the challenge of user-oriented
    evaluation and to design novel evaluation methodologies and criteria for contextual information
    retrieval evaluation.
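    To make the system-oriented methodology the workshop questions concrete, a Cranfield-style measure such as nDCG is computed from a fixed set of graded relevance judgments, independent of any user, task or context. A minimal Python sketch (the run data here is invented for illustration):

    ```python
    import math

    def dcg(relevances):
        """Discounted cumulative gain over a ranked list of graded judgments."""
        return sum(rel / math.log2(rank + 2) for rank, rel in enumerate(relevances))

    def ndcg(ranked, ideal):
        """nDCG: DCG of the system ranking normalised by the ideal ranking."""
        return dcg(ranked) / dcg(ideal)

    # Static, user-independent judgments: every user and every context
    # yields the same score, which is precisely the limitation at issue.
    system_run = [3, 2, 0, 1]                     # graded relevance, retrieval order
    ideal_run = sorted(system_run, reverse=True)  # best possible ordering
    print(round(ndcg(system_run, ideal_run), 3))  # → 0.985
    ```

    Because the judgments are fixed per topic, nothing in this computation reflects who is searching or why, which is why such measures struggle with contextual dimensions.
    
    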
    This workshop aims to shape future research by bringing together IR researchers
    working on, or interested in, the evaluation of approaches to contextual information access, seeking
    and retrieval, in order to exchange ideas and promote discussion on the future directions of evaluation.
    Topics
    Both theoretical and practical research papers are welcome from the research and industrial communities,
    addressing the main conference theme.
    Original and unpublished papers are welcome on any aspect including:
    User, system, context and task modelling for information access, seeking and retrieval evaluation.
    Novel techniques for implicit or explicit feedback evaluation.
    Learning algorithms that use non-traditional relevance judgments (clickthrough data, query streams, user interactions, …).
    Novel evaluation measures, test collections and methodologies for operational evaluation, or extensions of traditional ones.
    Contextual and user simulation algorithms.
    Accuracy evaluation of personal profiles built using implicit set-level responses.
    Merging rankings from collaborative system outputs.
    Application and evaluation of context-based systems for distributed retrieval, personal search, mobile search,
    digital libraries, archives and museums.
    Application and evaluation of context-based access to broadcast television recordings, and to image, video and music collections.
