    JEDM 2011 - CALL FOR PAPER SUBMISSIONS FOR SPECIAL ISSUE OF JEDM: DIAGNOSTIC MEASUREMENT IN COMPLEX LEARNING ENVIRONMENTS USING EVIDENCE-CENTERED DESIGN: A SNAPSHOT OF THE CURRENT STATE OF THE ART


    Website: educationaldatamining.org/JEDM

    Category: JEDM 2011

    Deadline: March 31, 2011 | Decision date: July 01, 2011

    Venue/Country: U.S.A.

    Updated: 2011-01-05 00:38:35 (GMT+9)

    Call For Papers - CFP

    CALL FOR PAPER SUBMISSIONS FOR SPECIAL ISSUE OF JEDM

    DIAGNOSTIC MEASUREMENT IN COMPLEX LEARNING ENVIRONMENTS USING EVIDENCE-CENTERED DESIGN:

    A SNAPSHOT OF THE CURRENT STATE OF THE ART

    Guest Editors

    André A. Rupp, University of Maryland (ruppandr@umd.edu)

    Brian Nelson, Arizona State University (brian.nelson@asu.edu)

    Rebecca Nugent, Carnegie Mellon University (rnugent@stat.cmu.edu)

    Aim of Special Issue

    We invite paper submissions for a special issue of the peer-reviewed Journal of Educational Data Mining with a focus on diagnostic measurement for digitally-mediated complex learning environments. We welcome papers that describe projects at various stages of development, with an emphasis on articulating the complex, integrative interplay between decisions about the design of the learning environment; the identification, accumulation, and synthesis of evidence from the resulting data; and the presentation and reporting of feedback to different stakeholders. Ideally, authors would describe in some detail the role that established or more novel, promising statistical methods from areas such as educational data mining, multivariate data analysis, and diagnostic measurement can play in this work.

    We are seeking papers that are methodologically rigorous, lay out the interconnections between the domain analysis, domain modeling, evidence identification, data analysis, and reporting processes in the described project, and discuss directions for future research and practice. We are not necessarily looking for papers that showcase the supposedly ‘cleanest’ and ‘slickest’ solutions to challenging modeling and measurement problems; rather, we explicitly welcome descriptions of the ‘messiness’ that these projects face, as long as those descriptions strive towards the best possible solutions. In short, our aim in this special issue is to showcase the diversity of complex digitally-mediated learning environments and their associated data-modeling strategies.

    Structure of Special Issue

    To make the structure of the special issue coherent, we ask that authors use the language of the evidence-centered design framework (e.g., Mislevy, Steinberg, Almond, & Lukas, 2006), which is particularly suitable for describing the different components of principled assessment design with complex tasks for ill-structured domains. The ECD framework identifies different layers and models at which different kinds of activities take place. These activities include modeling the target domain via appropriate tasks, assembling the tasks into a coherent assessment, and delivering the assessment with suitable interfaces.
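
    For readers less familiar with ECD, the short Python sketch below illustrates how the components named above (student, task, evidence, and assembly models) might be represented as simple data structures. This is a purely hypothetical illustration with names of our own choosing; it is not part of the ECD specification or of any particular assessment delivery system.

    ```python
    from dataclasses import dataclass, field
    from typing import Dict, List

    # Hypothetical, illustrative representations of the ECD components named
    # above; the class and field names are our own choices and are not part of
    # the ECD framework's formal specification or any delivery software.

    @dataclass
    class StudentModel:
        # Latent proficiency variables the assessment is meant to inform.
        proficiencies: Dict[str, float] = field(default_factory=dict)

    @dataclass
    class TaskModel:
        # A task template: which observable features of the elicited work
        # product can later serve as evidence.
        task_id: str
        observable_features: List[str]

    @dataclass
    class EvidenceModel:
        # Rules linking observable features of a task to student-model variables.
        task_id: str
        feature_to_proficiency: Dict[str, str]

    @dataclass
    class AssemblyModel:
        # Which tasks are assembled into a coherent assessment form.
        task_ids: List[str]
    ```

    In an actual project these structures would, of course, be elaborated considerably and tied to the presentation and delivery architecture of the learning environment.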

    Using ECD language, we ask submitting authors to address specifically how they pursued the processes of domain analysis and domain modeling and how they operationalized the different elements in the student model(s), evidence models, task models, assembly models, and presentation model. A core emphasis should be placed on the methodologies that are used or developed to extract and accumulate empirical evidence for creating diagnostic feedback; these may be non-parametric, semi-parametric, or parametric models from educational measurement, or computational methods from educational data mining. Notwithstanding this methodological focus, it will be important to highlight the trustworthiness of the resulting scores, classifications, or profiles and the defensibility of the resulting interpretations. In other words, we ask authors specifically to address how they investigate their adaptations of traditional notions of reliability (e.g., Haertel, 2006) and inferential validation (e.g., Kane, 2006).
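
    As one concrete (and deliberately minimal) illustration of such evidence accumulation, the sketch below updates the posterior probability of mastery for a single dichotomous attribute after each scored observation, in the spirit of DINA-type diagnostic classification models. The function and the slip and guess values are hypothetical assumptions for illustration only; they do not prescribe a method for submissions.

    ```python
    # Hypothetical sketch of parametric evidence accumulation: the posterior
    # probability of mastery for one dichotomous attribute is updated after
    # each scored observation, in the spirit of DINA-type diagnostic
    # classification models. The slip and guess values are illustrative only.

    def update_mastery(prior: float, correct: bool,
                       slip: float = 0.1, guess: float = 0.2) -> float:
        """Return P(mastery | observed response) for a single observation."""
        # Likelihood of the observed response under mastery vs. non-mastery.
        p_obs_given_mastery = (1 - slip) if correct else slip
        p_obs_given_nonmastery = guess if correct else (1 - guess)
        numerator = p_obs_given_mastery * prior
        denominator = numerator + p_obs_given_nonmastery * (1 - prior)
        return numerator / denominator

    # Accumulate evidence from three scored task features.
    posterior = 0.5  # uninformative prior on mastery
    for outcome in [True, True, False]:
        posterior = update_mastery(posterior, outcome)
    print(round(posterior, 3))  # posterior belief in mastery after all evidence
    ```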

    In addition to a series of core papers that serve as illustrative methodological case studies, the special issue will contain an initial framing paper by the editors that describes the ECD framework and its utility for complex learning environments (e.g., Rupp, Gushta, Mislevy, & Shaffer, 2010). The issue will also contain a final synthesis paper by the editors that looks at lessons learned and future directions for intradisciplinary and interdisciplinary research, especially as it pertains to methodological development tied to the kinds of projects featured in the special issue. In this piece we will also focus on synthesizing the compromises that were made for the development, implementation, and analysis of the diagnostic assessment systems presented in the different case studies as a result of the competing forces of research objectives, practitioners’ needs, statistical requirements of certain methodologies, and other resource constraints.

    Review Process

    As stipulated by the JEDM reviewing guidelines, each submission will be blind-reviewed by three colleagues in the field, as is typical for peer-reviewed journals; our aim is to have one reviewer be a potential contributor to the issue, one be a member of the editorial board, and one be an external reviewer from a relevant community. If you believe that certain colleagues are especially qualified to review your submission, please do not hesitate to suggest their names to us, keeping in mind, of course, any potential conflicts of interest.

    Submission Guidelines

    We invite submissions that are no longer than 60 double-spaced pages with 1-inch margins using a 12-point font, including all text, tables, figures, references, and appendices. We ask that you use the 6th edition of the APA style guide for preparing your submission.

    All submissions can be made electronically via email to André A. Rupp (ruppandr@umd.edu). Please send both a blinded Word version (.doc, .docx) and a blinded PDF version (.pdf) of your submission. In addition, please provide contact information for all authors in a separate Word document and note in your email to whom correspondence should be sent.

    Deadlines

    Please send your submission to the above email address by March 31, 2011. We are aiming for a review cycle of approximately three months, so you should receive feedback and a decision by approximately July 1, 2011.

    Closing Comment

    The availability of modern digital technology has opened up exciting possibilities for diagnostic measurement that can support learning along both non-traditional and more traditional pathways. The wealth of data that many of these complex learning environments provide has to be translated into meaningful evidence about learning, which is not a trivial task and represents one of the key new frontiers in diagnostic measurement.

    We are sincerely excited about the intellectual and practical challenges that these opportunities hold and hope that you will help us lay out a snapshot of the current state of the art for this kind of work by taking us up on our invitation to contribute to this special issue. We look forward to receiving your submission and welcome any questions that you may have about the issue!

    Sincerely,

    André A. Rupp, Brian Nelson, and Rebecca Nugent

