Deakin University ‘Getting Published in Academic Journals’.
The aim of this session is to demystify the processes involved in identifying a suitable paper, choosing an appropriate journal, and understanding the steps from writing to acceptance. It offers the perspective of an experienced journal editor, who handles many submissions from doctoral students, on what goes on behind the scenes when a paper is considered.
The University of Auckland – Umeå University A methods workshop on test item analysis using both Classical and Item Response Theories.
Checking test items for quality: Classical Test Theory and Item Response Theory methods. The use of multiple-choice test questions is widespread in higher education and in standardised international testing of school achievement (e.g., PISA, PIRLS, TIMSS). However, not every item contributes positively to the calculation of score estimates of student achievement. This workshop will give an overview of the theoretical background for classical and modern approaches to the statistical evaluation of items in a test, and will demonstrate the classical test theory approach (using SPSS and MS Excel) and the item response theory approach (using the R package ltm).
Participants may wish to read the following in preparation: Brown, G. T. L., & Abdulnabi, H. (2017). Evaluating the quality of higher education instructor-constructed multiple-choice tests: Impact on student grades. Frontiers in Education, 2(24). doi:10.3389/feduc.2017.00024
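The classical part of the workshop rests on two item statistics that are straightforward to compute: item difficulty (the proportion of examinees answering correctly) and item discrimination (the corrected item-total correlation). As a minimal illustrative sketch only (the workshop itself uses SPSS, MS Excel, and the R package ltm; the response data below are made up):

```python
# Classical Test Theory item statistics on hypothetical 0/1 scored data:
# difficulty = proportion of examinees answering the item correctly;
# discrimination = point-biserial correlation between the item score and
# the rest-of-test score (total minus the item itself).

def item_difficulty(responses, item):
    """Proportion correct for one item (near 0 = hard, near 1 = easy)."""
    return sum(row[item] for row in responses) / len(responses)

def point_biserial(x, y):
    """Pearson correlation of a dichotomous item score with a total score."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def item_discrimination(responses, item):
    """Corrected item-total correlation (item vs. total minus the item)."""
    item_scores = [row[item] for row in responses]
    rest_scores = [sum(row) - row[item] for row in responses]
    return point_biserial(item_scores, rest_scores)

# Five examinees x four items (1 = correct, 0 = incorrect) -- made-up data.
responses = [
    [1, 1, 1, 0],
    [1, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 0, 0],
]

for j in range(4):
    print(f"item {j}: difficulty = {item_difficulty(responses, j):.2f}, "
          f"discrimination = {item_discrimination(responses, j):.2f}")
```

An item with a low or negative discrimination flags exactly the situation the abstract describes: an item that does not contribute positively to the score estimate.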
University of Illinois Assessment as Reasoning from Evidence: Using Evidence-Centered Design Processes to Develop High Quality Assessments
This workshop will provide an overview of assessment as a process of reasoning from evidence and an introduction to Evidence-Centered Design (ECD) as a process for developing high-quality, valid assessments of various cognitive constructs. The ECD process and its products will be illustrated across these constructs. A specific use case will be highlighted: the development of assessments of science learning intended to support ongoing classroom teaching and learning.
Utrecht University How to analyze feedback quality, based on research conducted in collaboration with Jan Willem Strijbos. Interested PhD students will be asked to bring their own data if possible.
University of Helsinki, Centre for Educational Assessment Learning to Learn and Key Competencies: Emergent Multidisciplinary Concepts in Studies on Development and Assessment in the Schooling Context.
The foundations of the personalized framework to be applied in supporting participants’ research agendas consist of the following distinctions: psychology/education; development/learning/teaching/intervention; roles/positions and obligations/rights; disciplines/general education/liberal education; grading schools/measuring students; ecological validity and ecological fallacy; normative and criterion-referenced measures; added value and socio-economic factors (explanations or excuses); the Brunswik symmetry principle and Gustafsson’s construct under/over-representation rule; world educational politics: cross-curricular, key, PISA, transversal and 21st-century competencies and literacies; among others.
Some possibly relevant references: Blömeke, S., & Gustafsson, J.-E. (Eds.) (2017). Standard Setting in Education: The Nordic Countries in an International Perspective. Springer. Cronbach, L. J., & Snow, R. E. (1977). Aptitudes and Instructional Methods: A Handbook for Research on Interactions. Irvington Publishers. Hartig, J., Klieme, E., & Leutner, D. (Eds.) (2007). Assessment of Competencies in Educational Contexts. Hogrefe. Olson, D. (2003). Psychological Theory and Educational Reform: How School Remakes Mind and Society. Cambridge University Press.
Zuyd University of Applied Sciences Two sides of the same coin: The similarities between assessment and educational research
Every day, many students are assessed to determine whether they have mastered certain learning outcomes. To design an assessment that fits its purpose, several important steps have to be taken: the learning goal(s) should be clearly defined, success criteria should be set, students should understand which performances they are to demonstrate and what evidence those performances should yield, and transparent assessment procedures are needed to reach a decision on students' performance level. The goals of this workshop are threefold: 1) presenting the main steps of designing a performance assessment, 2) presenting the main steps of designing an educational research study, and 3) presenting a conceptual framework that shows how designing an assessment and setting up an educational research study are two sides of the same coin. During the workshop, there will be opportunities for interaction and hands-on assignments. I am looking forward to meeting you in Helsinki!