    MAMCA 2011 - Workshop on Multimodal Audio-based Multimedia Content Analysis (MAMCA 2011)


    Website: www.icme2011.org


    Deadline: February 20, 2011 | Date: July 11, 2011

    Venue/Country: Barcelona, Spain

    Updated: 2011-01-31 10:27:25 (GMT+9)

    Call For Papers - CFP

    By definition, multimedia content is composed of multiple forms, including audio, video, text/subtitles, and others. Traditionally, applications and algorithms that work with such content have considered only a single modality, allowing, for example, search over textual tags while ignoring any information available from the other modalities. The limitations of this approach are obvious, and there is a recent trend towards multimodal processing, in which different content modalities complement each other, or are used to bootstrap the analysis of new modalities.

    Audio is a prominent part of multimedia content, backed by extensive research in the speech and music communities, although this research is usually performed on audio-only systems. The utility of audio-only systems is often limited by the quality of the acoustic environment or the information contained therein, so they can benefit from a multimodal analysis of multimedia data to enhance performance, robustness, and efficiency.

    The main goal of the workshop is to explore ways in which audio processing can be enhanced, bootstrapped, or facilitated by other available information modalities. We are interested not only in applications that show successful combinations of audio and other sources of information, but also in algorithms that effectively integrate them and leverage complementary information from each modality to obtain an enhanced result, in terms of degree of detail, coverage of the corpus, or other enabling factors.

    The workshop will provide a forum for publication of high-quality, novel research on multimedia applications and multimodal processing, with a special focus on the audio modality.

    Paper format

    MAMCA 2011 solicits regular technical papers of up to 6 pages following the ICME author guidelines. The proceedings of the workshop will be published as part of the IEEE ICME 2011 main conference proceedings and will be indexed by IEEE Xplore. Papers must be original and must not have been submitted to or accepted by any other conference or journal.

    Paper submission

    Papers can be submitted through https://cmt.research.microsoft.com/ICMEW2011

    Selection process

    Papers submitted to the workshop will be peer-reviewed by members of the community with extensive experience both in audio processing and in the other relevant modalities considered. The review process will be semi-blind, with reviewers assigned manually so that each submitted paper generally receives three high-quality reviews.

    List of topics

    Topics include, but are not limited to:

    Effective fusion of audio with other modalities

    Multimodal input applications, where one input is audio

    Multimodal databases

    Bootstrapping of multimodal systems

    Co-training for labeling new data

    User-in-the-loop calculations to detect preferences

    Games with a purpose to label new data

    Improving robustness through multimodality

    Prediction of modality preference

    Applications that utilize multimodality

    Important dates

    Paper submission deadline: February 20th 2011

    Paper acceptance notification: April 10th 2011

    Camera-ready paper: April 20th 2011

    Workshop day: July 11th 2011

