
LAK 2023 : Learning Analytics and Knowledge Conference



Conference Series : Learning Analytics and Knowledge
 
Link: https://www.solaresearch.org/events/lak/lak23/general-call/
 
When Mar 13, 2023 - Mar 17, 2023
Where Arlington, Texas, USA
Submission Deadline Oct 3, 2022
Categories: learning analytics, HCLA
 

Call For Papers

We welcome submissions from both research and practice, encompassing theoretical, methodological, empirical and technical contributions to the learning analytics field. Learning analytics research draws on many distinct academic fields, including psychology, the learning sciences, education, neuroscience, computer science and design, and we encourage submissions conducted in any of these traditions. We also welcome research that validates, replicates or examines the generalizability of previously published findings, as well as research that examines the adoption of existing learning analytics methods and approaches.

This year, we encourage contributors to consider how collective action can tackle concerns and issues associated with the implementation of learning analytics. Learning analytics impacts both technical and social systems. We invite papers that address bias, privacy, ethics, transparency and accountability through multiple lenses, including the design, implementation and evaluation stages of learning analytics. Accountable analysis means providing a degree of transparency and explanation, and adjusting the transparency of data and computation to the needs of different stakeholders. Trust goes hand in hand with transparency in decision-making; whether decisions about predictions and interventions are fair and explainable is an ethical issue. Much remains to be done around human behavior and social values, such as respecting privacy, providing equal opportunities, and ensuring accountability. Grounded in diversity, equity, and belonging, inclusive learning analytics identifies and breaks down systemic barriers to inclusion and fosters a culture in which every learner knows they belong, feels empowered to bring their whole self to learning, and is inspired to learn.

For the 13th annual conference, we encourage authors to address the following questions related to LAK23's theme of "Towards Trustworthy Learning Analytics":

What are the essential components of building a trustworthy LA system?
How do we give diverse stakeholders a voice in defining what will make LA trustworthy?
How can we develop and evaluate instruments or frameworks for measuring the trustworthiness of an LA system?
Is there anything distinctive about trustworthiness in teaching and learning, or can we borrow unproblematically from notions of trustworthiness in other fields?
How can we develop models or frameworks that can measure the fairness, bias, transparency or explainability of an LA system?
How do we develop human-in-the-loop predictive or prescriptive analytics that benefit from instructor judgement?
How can we enable students or instructors to share their perceptions of the trustworthiness of an LA system?
How can we reliably and transparently model student competencies?
Other topics of interest include, but are not limited to, the following:

Implementing Change in Learning & Teaching:

Ethical issues around learning analytics: Analysis of issues and approaches to the lawful and ethical capture and use of educational data traces; tackling unintended bias and value judgements in the selection of data and algorithms; perspectives and methods that empower stakeholders.
Learning analytics adoption: Discussions and evaluations of strategies to promote and embed learning analytics initiatives in educational institutions and learning organizations. Studies that examine processes of organizational change and practices of professional development that support impactful learning analytics use.
Learning analytics strategies for scalability: Discussions and evaluations of strategies to scale capture and analysis of information in useful and ethical ways at the program, institution or national level; critical reflections on organizational structures that promote analytics innovation and impact in an institution.
Equity, fairness and transparency in learning analytics: Consideration of how certain practices of data collection, analysis and subsequent action impact particular populations and affect human well-being, specifically groups that experience long term disadvantage. Discussions of how learning analytics may impact (positively or negatively) social change and transformative social justice.
Understanding Learning & Teaching:

Data-informed learning theories: Proposals of new learning/teaching theories or revisions/reinterpretations of existing theories based on large-scale data analysis.
Insights into specific learning processes: Studies to understand particular aspects of a learning/teaching process through the use of data science techniques, including negative results.
Learning and teaching modeling: Creating mathematical, statistical or computational models of a learning/teaching process, including its actors and context.
Systematic reviews: Studies that provide a systematic and methodological synthesis of the available evidence in an area of learning analytics.
Evidencing Learning & Teaching:

Finding evidence of learning: Studies that identify and explain useful data for analysing, understanding and optimising learning and teaching.
Assessing student learning: Studies that assess learning progress through the computational analysis of learner actions or artefacts.
Analytical and methodological approaches: Studies that introduce novel analytical techniques, methods, and tools for modelling student learning.
Technological infrastructures for data storage and sharing: Proposals of technical and methodological procedures to store, share and preserve learning and teaching traces, taking appropriate ethical considerations into account.
Impacting Learning & Teaching:

Human-centered design processes: Research that documents practices of giving an active voice to learners, teachers, and other educational stakeholders in the design process of learning analytics initiatives and enabling technologies.
Providing decision support and feedback: Studies that evaluate the use and impact of feedback or decision-support systems based on learning analytics (dashboards, early-alert systems, automated messages, etc.).
Data-informed decision-making: Studies that examine how teachers, students or other educational stakeholders come to, work with and make changes using learning analytics information.
Personalised and adaptive learning: Studies that evaluate the effectiveness and impact of adaptive technologies based on learning analytics.
Practical evaluations of learning analytics efforts: Empirical evidence about the effectiveness of learning analytics implementations or educational initiatives guided by learning analytics.

Related Resources

CoSinE 2022   10th Illia O. Teplytskyi Workshop on Computer Simulation in Education
FAIML 2023   2023 International Conference on Frontiers of Artificial Intelligence and Machine Learning (FAIML 2023)
IJCNN 2023   International Joint Conference on Neural Networks
CFDSP 2023   2023 International Conference on Frontiers of Digital Signal Processing (CFDSP 2023)
IEEE ICA 2022   The 6th IEEE International Conference on Agents
MLDM 2023   19th International Conference on Machine Learning and Data Mining
SEKE 2023   The 35th International Conference on Software Engineering and Knowledge Engineering
DMKD 2023   2023 6th International Conference on Data Mining and Knowledge Discovery (DMKD 2023)