
LAK 2023 : Learning Analytics and Knowledge Conference

Conference Series : Learning Analytics and Knowledge
 
Link: https://www.solaresearch.org/events/lak/lak23/general-call/
 
When: Mar 13, 2023 - Mar 17, 2023
Where: Arlington, Texas, USA
Submission Deadline: Oct 3, 2022
Categories: learning analytics, HCLA
 

Call For Papers

We welcome submissions from both research and practice, encompassing different theoretical, methodological, empirical and technical contributions to the learning analytics field. Learning analytics research draws on many distinct academic fields, including psychology, the learning sciences, education, neuroscience, computer science and design. We encourage the submission of work conducted in any of these traditions. We also welcome research that validates, replicates and examines the generalizability of previously published findings, as well as research that examines the adoption of existing learning analytics methods and approaches.

This year, we encourage contributors to consider how collective action can tackle concerns and issues associated with the implementation of learning analytics. Learning analytics affects both technical and social systems. We invite papers that address bias, privacy, ethics, transparency and accountability through multiple lenses, including the design, implementation and evaluation stages of learning analytics. Accountable analysis means providing an appropriate degree of transparency and explanation, and adjusting the transparency of data and computation to the needs of different stakeholders. Trust goes hand in hand with transparency in decision-making: whether the predictions and interventions a system makes are fair and explainable is an ethical question. Much remains to be done around human behavior and social values, such as respecting privacy, providing equal opportunities, and ensuring accountability. Grounded in diversity, equity, and belonging, inclusive learning analytics identifies and breaks down systemic barriers to inclusion, fostering a culture in which every learner knows they belong, feels empowered to bring their whole self to learning, and is inspired to learn.

For the 13th annual conference, we encourage authors to address the following questions related to LAK23's theme of "Towards Trustworthy Learning Analytics":

What are the essential components of building a trustworthy LA system?
How do we give diverse stakeholders a voice in defining what will make LA trustworthy?
How can we develop and evaluate instruments or frameworks for measuring the trustworthiness of an LA system?
Is there anything distinctive about trustworthiness in teaching and learning, or can we borrow unproblematically from notions of trustworthiness in other fields?
How can we develop models or frameworks that can measure the level of fairness, bias, transparency or explainability of an LA system?
How do we develop human-in-the-loop predictive or prescriptive analytics that benefit from instructor judgement?
How can we enable students or instructors to share their perceptions of the trustworthiness of an LA system?
How can we reliably and transparently model student competencies?
Other topics of interest include, but are not limited to, the following:

Implementing Change in Learning & Teaching:

Ethical issues around learning analytics: Analysis of issues and approaches to the lawful and ethical capture and use of educational data traces; tackling unintended bias and value judgements in the selection of data and algorithms; perspectives and methods that empower stakeholders.
Learning analytics adoption: Discussions and evaluations of strategies to promote and embed learning analytics initiatives in educational institutions and learning organizations. Studies that examine processes of organizational change and practices of professional development that support impactful learning analytics use.
Learning analytics strategies for scalability: Discussions and evaluations of strategies to scale capture and analysis of information in useful and ethical ways at the program, institution or national level; critical reflections on organizational structures that promote analytics innovation and impact in an institution.
Equity, fairness and transparency in learning analytics: Consideration of how certain practices of data collection, analysis and subsequent action impact particular populations and affect human well-being, specifically groups that experience long-term disadvantage. Discussions of how learning analytics may impact (positively or negatively) social change and transformative social justice.
Understanding Learning & Teaching:

Data-informed learning theories: Proposals of new learning/teaching theories or revisions/reinterpretations of existing theories based on large-scale data analysis.
Insights into specific learning processes: Studies to understand particular aspects of a learning/teaching process through the use of data science techniques, including negative results.
Learning and teaching modeling: Creating mathematical, statistical or computational models of a learning/teaching process, including its actors and context.
Systematic reviews: Studies that provide a systematic and methodological synthesis of the available evidence in an area of learning analytics.
Evidencing Learning & Teaching:

Finding evidence of learning: Studies that identify and explain useful data for analysing, understanding and optimising learning and teaching.
Assessing student learning: Studies that assess learning progress through the computational analysis of learner actions or artefacts.
Analytical and methodological approaches: Studies that introduce novel analytical techniques, methods, and tools for modelling student learning.
Technological infrastructures for data storage and sharing: Proposals of technical and methodological procedures to store, share and preserve learning and teaching traces, taking appropriate ethical considerations into account.
Impacting Learning & Teaching:

Human-centered design processes: Research that documents practices of giving an active voice to learners, teachers, and other educational stakeholders in the design process of learning analytics initiatives and enabling technologies.
Providing decision support and feedback: Studies that evaluate the use and impact of feedback or decision-support systems based on learning analytics (dashboards, early-alert systems, automated messages, etc.).
Data-informed decision-making: Studies that examine how teachers, students or other educational stakeholders come to, work with and make changes using learning analytics information.
Personalised and adaptive learning: Studies that evaluate the effectiveness and impact of adaptive technologies based on learning analytics.
Practical evaluations of learning analytics efforts: Empirical evidence about the effectiveness of learning analytics implementations or educational initiatives guided by learning analytics.
