
Building Institutional Capacities and Competencies for Systemic Learning Analytics Initiatives

Kimberly E. Arnold
University of Wisconsin–Madison, 1305 Linden Drive, Madison, WI 53706, USA
kim.arnold@doit.wisc.edu

Grace Lynch
University of New England, Armidale, NSW 2351, Australia
grace.lynch@une.edu.au

Daniel Huston
Rio Salado College, 2323 West 14th Street, Tempe, AZ 85281, USA
daniel.huston@riosalado.edu

Lorna Wong
University of Wisconsin System, 1220 Linden Drive, Madison, WI 53706, USA
lwong@uwsa.edu

Linda Jorn
University of Wisconsin–Madison, 1305 Linden Drive, Madison, WI 53706, USA
jorn@wisc.edu

Christopher W. Olsen
University of Wisconsin–Madison, 500 Lincoln Drive, Madison, WI 53706, USA
cwolsen@wisc.edu

ABSTRACT
The last five years have brought an explosion of research in the learning analytics field. However, much of what has emerged has been small-scale or tool-centric. While these efforts are vitally important to the development of the field, in order to truly transform education, learning analytics must scale and become institutionalized at multiple levels throughout an educational system. Many institutions are currently undertaking this grand challenge, and this panel will highlight cases from the University of Wisconsin System, the Society for Learning Analytics Research, the University of New England, and Rio Salado College.

Categories and Subject Descriptors


C.3 [Special-Purpose and Application-Based Systems]; H.1.2 [User/Machine Systems]: Human Factors, Human Information Processing; H.2.0 [Database Management]: General – Security, Integrity; J.1 [Administrative Data Processing]: Education; K.3 [Computer Uses in Education]: General; K.4.3 [Organizational Impact].

General Terms
Design, Human Factors, Management.

Keywords
Learning Analytics, Capacity Building, Sustainability, Higher Education, Leadership, Systemic Application, Cultural Change
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author. Copyright is held by the owner/author(s). LAK '14, Mar 24-28 2014, Indianapolis, IN, USA ACM 978-1-4503-2664-3/14/03. http://dx.doi.org/10.1145/2567574.2567593

1. INTRODUCTION

Over the past decade learning analytics (LA) has become a bona fide field. Learning analytics has been explicitly listed in The New Media Consortium's Horizon Report: Higher Education Edition for the past four years, with the 2014 edition reporting a time to adoption of one year or less [3]. Various techniques routinely used in learning analytics appear on Gartner's Emerging Technologies Hype Cycle for 2013, including quantified self, prescriptive analytics, big data, content analytics, activity streams, and predictive analytics [2]. In 2013, big data, which for our purposes we construe as big educational data, reached the peak of inflated expectations on the hype cycle [2]. Despite significant data collection activity, the education sector has been slow to adopt a strategic and systemic view of analytics. It is envisaged that education systems that do make the transition towards data-informed planning, decision making, and teaching and learning will hold significant competitive and quality advantages over those that do not [12].

However, despite the explosion of learning analytics research, most of what emerges in the field consists of course-level, small-scale, or tool-centric approaches. While these efforts are vitally important to the development of the field, in order to truly transform education, LA must scale and become institutionalized at multiple levels throughout an educational system. Identifying the precise competitive advantages that analytics can bring to the education sector is a multifaceted and complex undertaking. In short, LA practitioners must begin applying the wealth of evidence-based research and processes to the everyday operations of teaching and learning. In order to achieve these goals, institutions must invest significant resources and employ systems-level thinking in order to grow institutional capacities.

2. FRAMING QUESTION FOR THE PANEL DISCUSSION


While LA is concentrated on the education space (formal and informal learning), a vast amount of prior research exists in closely aligned fields such as educational data mining, social network analysis, the learning and computer sciences, and many more that can be applied to advance development. However, this is often problematic as a result of disciplinary silos, skepticism, and a lack of acceptance and knowledge of analytics [6]. Using specific case studies, a panel of change agents from the University of Wisconsin System (UWS), the Society for Learning Analytics Research, the University of New England (UNE), and Rio Salado College (RSC) will discuss the grand challenge learning analytics is currently facing, focusing on the following overarching question: What organizational capacities are necessary for successful systemic adoption of learning analytics? While this is often a daunting task, these case studies demonstrate how institutions have managed to make significant inroads in system-level application of learning analytics by adopting a multipronged strategy focused on the scalability, sustainability, and diversification of organizational competencies.


3. ORGANIZATIONAL CAPACITY FOR LEARNING ANALYTICS

Many have asserted the necessity of building organizational capacity in learning analytics [5, 8, 9, 11], but limited attention has been paid to the larger policy and strategy considerations that influence the adoption and deployment of analytics in educational settings. Fostering student success involves many levels of an organization and, thus, any strategy pertaining to student success must involve multiple stakeholders and contributions. It should be made clear that many systemic LA implementations to date have centered on a perspective of analytics as a technology, as a tool, and as a means to measure. Yet analytics, like any other technological system, is complex and encapsulates the social and cultural domains. Pugliese presents five stages of student success analytics: 1) technology infrastructure, analytics tools, and applications; 2) policies, processes, practices, and workflows; 3) values and skills; 4) culture and behavior; and 5) leadership [10]. These stages, combined with a compelling, multi-faceted framework for optimized student success through analytics from Norris and Baer, have provided a foundation for optimization of learning analytics at a systemic level [7]. In order to achieve sufficient organizational capacity and demonstrate institutional commitment, all five stages are ideally addressed concurrently, with appropriation of adequate human and non-human resources.

3.1 Technology Infrastructure, Analytics Tools, and Applications

The technology infrastructure underlying analytics, as well as the analytic tools and applications themselves, are fundamental concerns when considering how to implement and deliver learning analytics solutions, particularly at scale. While this category is very broad, for the purposes of this panel we will focus on the underlying technical infrastructure and on analytic tools.

3.1.1 Technical infrastructure

The technical infrastructure is a foundational element of any LA initiative. It allows for collection, storage, transformation, and access to raw or processed data, and must be implemented and managed in strict accordance with all appropriate IT security and data privacy protocols, including any relevant local, state, and federal policies. These important considerations often exist in tension with the desire to explore and innovate. However, establishing processes such as regular monitoring of data intake, transformation, and extract procedures, as well as the overall health of the analytics technical infrastructure, helps ensure compliance. In addition, it is important to regularly subject the data to examination by subject-matter experts who can validate data integrity and enable prompt corrective action when appropriate.
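To make the idea of routine pipeline monitoring concrete, the sketch below shows one way such integrity checks might be automated. It is a minimal illustration only, not the infrastructure used by the institutions on this panel; the table names, column names, and thresholds are hypothetical.

```python
import sqlite3
from datetime import datetime, timedelta

# Hypothetical integrity checks for a nightly LMS activity extract.
# Table and column names are illustrative, not an actual institutional schema.
CHECKS = {
    "recent_load": "SELECT COUNT(*) FROM lms_activity WHERE loaded_at >= :since",
    "null_student_ids": "SELECT COUNT(*) FROM lms_activity WHERE student_id IS NULL",
    "duplicate_events": """
        SELECT COUNT(*) FROM (
            SELECT student_id, event_time, event_type, COUNT(*) AS n
            FROM lms_activity
            GROUP BY student_id, event_time, event_type
            HAVING n > 1
        )
    """,
}

def run_integrity_checks(db_path: str) -> dict:
    """Run simple health checks and return counts for review by data stewards."""
    since = (datetime.now() - timedelta(days=1)).isoformat()
    results = {}
    with sqlite3.connect(db_path) as conn:
        for name, sql in CHECKS.items():
            params = {"since": since} if ":since" in sql else {}
            results[name] = conn.execute(sql, params).fetchone()[0]
    return results

if __name__ == "__main__":
    counts = run_integrity_checks("analytics_warehouse.db")
    # Flag conditions that warrant corrective action before models are refreshed.
    if counts["recent_load"] == 0 or counts["null_student_ids"] > 0:
        print("WARNING: extract incomplete or malformed", counts)
    else:
        print("Extract looks healthy", counts)
```

In practice, checks of this kind would feed a dashboard or alerting process reviewed by the data stewards and subject-matter experts described above.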

3.1.2 Analytics tools

Once a stable and reliable infrastructure exists, it is important to begin examining analytic tools. While analytic capabilities are built into technical infrastructure, in the case of learning analytics there is a wide range of analytic tools that might supplement existing analytics capability or enable additional analytic features. Since analytic tools will need to meet the needs of a wide variety of stakeholders, the decision to develop a new system should go through a rigorous requirements gathering and evaluation process. Especially in the context of a systems-level LA adoption, in which a move away from tool-centric application of learning analytics may be considered, it is very possible that instead of supporting a single tool, institutions will opt for a diverse analytic tool chest.

3.2 Policies, Processes, Practices, and Workflows

In order to build a sustainable, systemic-level learning analytics culture, policies, processes, practices, and workflows must be considered, including policy and governance, logistical process and project management, and budgetary and financial viability.

3.2.1 Policy and governance

Learning analytics often calls for different uses of existing data, and while these uses may be covered under existing institutional policies, the emerging definitions of data ownership and stewardship may require additional attention. Even with the national discussions occurring around this topic, each institution must broach it with the interests of its own stakeholders in mind. Given the complexity of extracting and interpreting data from multiple pedagogical approaches, it is understandable that government and quality assurance agencies have limited the variables to the more transferable, yet blunt, measures such as attrition, progression, and graduation rates. While data policies are central to LA discussions, governance is also important to think through at a systemic level. There are many levels of governance, but a structure that likely does not yet exist is one that focuses solely on the forward movement of learning analytics. Depending upon the needs of an institution, this may be a system-wide steering committee, a task force, or working groups.

3.2.2 Logistical process and project management

Due to the complexity of the technological and human infrastructures required to build a system-wide approach to learning analytics, we have found that both system- and campus-level project management are necessities. The key areas are: (a) tracking task completion, (b) developing processes and workflows, (c) coordinating communication around key deliverables and dates, and (d) ensuring that the work conducted is in line with the overall directions set by the group.

3.2.3 Budgetary and financial viability

The initial funding for the UWS LA project was provided through a UW System sponsored three-year Growth Agenda Grant. The grant has been essential in providing flexibility for the project team to overcome challenges and explore opportunities that arise. As part of the grant, the resources necessary to scale and operationalize learning analytics to additional system campuses beyond the initial pilot campuses are being assessed to help determine ongoing financial viability. UWS is in the process of building preliminary estimates into a five-year operational budget plan. The initial funding for the UNE learning analytics initiative was sponsored under a federal grant for courseware enhancement. As with the UWS project, long-term sustainability within operational budgets must be considered.

3.3 Values and Skills

Pugliese's category of values and skills is wide-reaching. Because values and skills are closely aligned, this section discusses data and analytic expertise, evaluation and research competencies, and teacher and learner support. A barrier to institution-wide adoption pertains not only to a skills and capabilities shortage, but also to the lack of a common understanding of analytics in general and of the definitions associated with user- and content-generated data [12].

3.3.1 Data expertise

Successful implementation of a learning analytics initiative requires the involvement of course-level and institutional data experts who understand how data are collected, where they are stored, and what inferences can reasonably be drawn from them. Since learning analytics often combines course-level interaction and participation data from academic technologies with administrative data from the institution's student information system (SIS), data experts from both areas need to contribute to learning analytics efforts. These staff understand what data are collected from which students, reasons for missing or incomplete data, and already-known predictors of institutional and course outcomes that could inform a learning analytics implementation. In terms of predictive analytics, data experts who understand the structure of the academic technologies, such as the learning management system (LMS), are needed to develop and interpret predictors of course success. These data experts need to work with each course instructor to understand the structure of the course and the use of the technology within it. It is important that the data experts understand which elements of the technologies are used by instructors in each course. Short of an institutional repository for course meta-data, this requires a relationship between the academic technology data expert and the course instructor. In addition to knowing what inferences can be made from existing administrative data, data expertise and leadership are also needed to inform new data collection efforts. Most administrative data are collected for operational, compliance, or legally mandated reporting purposes. Although these data often prove useful in predicting course and institutional outcomes, they are usually not collected explicitly for learning analytics purposes. Collection of new data elements for an LA effort incurs costs for staff time, leadership of new implementation efforts, database updates and maintenance, and training.
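As a concrete illustration of the integration work this expertise supports, the sketch below joins course-level LMS activity with SIS records to build a simple analysis dataset. It is a minimal, hypothetical example; the file names, column names, and derived outcome label are assumptions rather than any panel institution's actual data model.

```python
import pandas as pd

# Hypothetical extracts; real implementations would pull from governed
# institutional sources rather than flat files.
lms = pd.read_csv("lms_activity.csv")      # student_id, course_id, logins, submissions
sis = pd.read_csv("sis_enrollment.csv")    # student_id, course_id, prior_gpa, credits, final_grade

# Aggregate LMS interaction data to one row per student per course.
activity = (
    lms.groupby(["student_id", "course_id"], as_index=False)
       .agg(logins=("logins", "sum"), submissions=("submissions", "sum"))
)

# Combine course-level interaction data with administrative records.
# A left join keeps every enrollment, exposing students with no LMS activity.
dataset = sis.merge(activity, on=["student_id", "course_id"], how="left")
dataset[["logins", "submissions"]] = dataset[["logins", "submissions"]].fillna(0)

# Illustrative outcome label for later modeling: a D, F, or withdrawal.
dataset["at_risk_outcome"] = dataset["final_grade"].isin(["D", "F", "W"]).astype(int)

print(dataset.head())
```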

3.3.2 Analytic expertise

Depending on an institution's goals and objectives, learning analytics relies heavily on analytic expertise. The development of predictive models or adaptive algorithms, particularly those used to make consequential decisions and recommend courses of action to teachers or students, requires contributions from experts with backgrounds in statistics, predictive analytics, learning sciences, measurement, and data visualization. Analytic expertise is needed to develop and validate predictive models; integrate, coordinate, and use data inputs from multiple data systems; help interpret the meaning of significant predictive variables as well as determine appropriate interventions based on the strength of models; and develop data visualization tools and other means of synthesizing data for mass consumption.
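The sketch below illustrates the model-building and validation portion of this work: fitting a simple at-risk classifier on the kind of merged dataset shown earlier and checking it with cross-validation. It is a minimal example using scikit-learn with hypothetical feature names, not the predictive model used by any of the institutions discussed here.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical analysis dataset: one row per student per course,
# combining LMS activity features with SIS attributes and an outcome label.
dataset = pd.read_csv("analysis_dataset.csv")

features = ["logins", "submissions", "prior_gpa", "credits"]  # assumed columns
X = dataset[features]
y = dataset["at_risk_outcome"]  # 1 = earned D/F or withdrew, 0 = otherwise

# A transparent baseline model; interpretability matters when results
# are handed to instructors and advisors.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# Cross-validated AUC gives a first check that the model generalizes
# before any predictions are shared with faculty.
auc_scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"Mean cross-validated AUC: {auc_scores.mean():.3f}")

# Fit on all available data, then inspect which variables drive predictions.
model.fit(X, y)
coefs = pd.Series(model.named_steps["logisticregression"].coef_[0], index=features)
print(coefs.sort_values(ascending=False))
```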

3.3.3 Evaluation and research competencies

In any learning analytics program, evaluation and research constitute a core competency that must be considered. Evaluation is a key skill for examining the progress of an initiative. Depending on the needs of particular constituents, outcome measures may be very different, and an experienced evaluator will be able to map goals to outcomes so that informed decisions can be made. While evaluation and research expertise may be found in the same person, in some instances there may be a distinct need for an educational researcher with experience in learning analytics research to assist in making iterative changes in process and improve the learning analytics trajectory at an institution.

3.3.4 Teacher and learner support

Introducing the value of learning analytics to faculty and students and getting their buy-in is a must for the success of any project. Because UWS approached learning analytics as a pilot project, pilot faculty were selected based on their interest in, and willingness to experiment with, predictive data to identify at-risk student behaviors early in the course. Faculty were scaffolded with personal consultations and job aids to: critically assess the weekly predictions of at-risk students in their course; compare the indicators they normally observe with the predictions offered by the learning analytics system; validate accuracy and assist in refining the predictive models; and build data-driven intervention strategies for their course. It is imperative that faculty understand the predictive data and be able to interpret the results with confidence. Students also need a high level of support. While the faculty member or instructor has direct interaction with students in a course, students will likely need additional support and interaction as predictive data are made available to them. Students need to be properly informed of how their data are being used, as well as advised of the potential benefits and risks to them. This underscores the need for data literacy among students. As the analytics mature and students gain direct access to analytics about their learning behaviors, additional support will be needed in the form of academic advising.
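One way to picture the weekly hand-off described above is a short script that turns model scores into an instructor-facing at-risk list. The sketch below is hypothetical: the threshold, column names, and output format are assumptions for illustration, not the workflow used at UWS.

```python
import pandas as pd

# Hypothetical weekly scoring output: one row per student per course,
# with a predicted probability of an at-risk outcome from the model.
scores = pd.read_csv("weekly_predictions.csv")  # student_id, course_id, week, risk_probability

RISK_THRESHOLD = 0.7  # illustrative cut-off agreed with instructors

def weekly_at_risk_report(scores: pd.DataFrame, course_id: str, week: int) -> pd.DataFrame:
    """Return students flagged for follow-up in a given course and week."""
    current = scores[(scores["course_id"] == course_id) & (scores["week"] == week)]
    flagged = current[current["risk_probability"] >= RISK_THRESHOLD]
    # Sort so instructors see the highest-risk students first.
    return flagged.sort_values("risk_probability", ascending=False)[
        ["student_id", "risk_probability"]
    ]

report = weekly_at_risk_report(scores, course_id="BIO101", week=4)
print(report.to_string(index=False))
```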

3.4 Culture and Behavior

The promise of educational technology to underpin and drive a transformative learning experience will not be delivered through a simple adoption process. It requires educators to revisit and break the historical pedagogical, socio-cultural, and economic assumptions that can stifle educational practice [6]. It is vitally important that institutions examine these domains when targeting systems-level thinking about learning analytics, as they will be the bedrock of how any project is accepted.


3.4.1 Cultural framework for awareness and acceptance

A cultural framework for awareness and acceptance of learning analytics needs to be developed. Many myths surrounding the use of data, privacy infringement, and ownership of data need to be dispelled, and these concerns can be properly addressed once the value of learning analytics is realized. Additionally, an understanding that learning analytics takes on different forms and serves different needs takes away the urge to champion one best tool, as multiple tools can co-exist and build on each other. Partnership with faculty builds a trust relationship and defuses the suspicion they may hold toward a specific tool.

At both UWS and UNE, a major focus was placed on awareness workshops to introduce the academic community to the power of learning analytics. Academic units were engaged early in the planning and visioning process. Ongoing conversations ensured total transparency in pilot projects and encouraged involvement. Community conversations often needed to highlight potential transformation, but also needed to overlay a pragmatic lens so that realistic expectations could be maintained. Overnight successes and silver-bullet solutions in the realm of learning analytics are highly unlikely; it is vital to deliver a message that persistence and dedication will, in time, yield meaningful results.

3.5 Leadership

While addressed last in this article, committed leadership is vital to any attempt to launch a new program at both campus- and system-wide scale. Key pedagogical questions must be the drivers, rather than tools or perceived technical limitations. In addition, if an institution hopes to institutionalize LA over time, new leadership must be cultivated for the future of overall capacity building. Early successes from the case studies presented during the panel, and others, can be used to guide activity in other systems.

3.5.1 Strategic thinking and leadership

In the case of the UWS LA initiative, leadership was vital at multiple levels. In general, we followed Kotter's eight-step model of leadership: 1) establishing a sense of urgency; 2) creating a guiding coalition; 3) developing a change vision; 4) communicating the vision for buy-in; 5) empowering broad-based action; 6) generating short-term wins; 7) never letting up; and 8) incorporating the changes into the institutional culture [4]. For the statewide system, budget and project managers were critical to overall project coordination, budgeting, and interactions with the commercial analytics software provider. Each participating campus also had a principal investigator (PI) to locally manage grant reporting and interactions with the institutional review boards and campus data custodians. On the technology side, an overall project manager coordinated communications internally with project participants and instructors, and externally with the greater campus communities. Absolutely vital to success was having a leader with a deep scholarly understanding of learning analytics principles and practices and of the mechanics of creating predictive models. Finally, beyond individual leadership roles, learning analytics initiatives benefit from coordinated leadership from across the institution. Strong leadership is essential in facilitating intra- and cross-campus communication, problem solving, and strategic planning.

4. CONCLUSION

Many university systems are at the stage of small-scale analytics projects to develop recommender systems, predictive models, and attributes and profiles of successful students. The lack of enterprise models has been noted by Susan Grajek, Vice President for Data, Research, and Analytics at EDUCAUSE, in lamenting the limited broad-scale impact that analytics has made to date: "We haven't figured out how to put analytics to work pervasively throughout higher education to make a difference and resolve our most pressing issues" [1]. This panel presentation focuses on the importance of building institutional capacity for system-level implementation of learning analytics. The emphasis is on using existing research and theory as the foundation for building out new theory and research in system-level thinking to support learning analytics. No institution will achieve a system-level application of learning analytics by accident; intentional planning must occur that focuses on the emergent and iterative nature of organizational change in learning analytics. The nuanced views of the cross-functional panel will shine unique perspectives on the five stages of Pugliese's student success analytics.

5. ADDITIONAL AUTHORS

Three additional authors contributed to the creation of this panel paper: Clare Huhn and Daniel Voeks from the University of Wisconsin–Madison, and Andrew Taylor from the University of Wisconsin System.

6. REFERENCES

[1] Grajek, S. & Mulvenon, S. (2012). The Transformative Role of Analytics in Education. Conclusions Paper. SAS.
[2] Hong, H. L. & Fenn, J. (2013). Emerging Technologies Hype Cycle for 2013: Redefining the Relationship. Webinar presented 21 August 2013.
[3] Johnson, L., Adams Becker, S., Estrada, V., & Freeman, A. (2014). NMC Horizon Report: 2014 Higher Education Edition. Austin, Texas: The New Media Consortium.
[4] Kotter, J. (2008). Leading Change. Harvard Business Review Press.
[5] Lonn, S., Krumm, A. E., Waddington, R. J., & Teasley, S. D. (2012, April). Bridging the gap from knowledge to action: Putting analytics in the hands of academic advisors. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 184-187). ACM.
[6] Manyika, J., Chui, M., Brown, B., Bughin, J., Dobbs, R., Roxburgh, C., & Byers, A. H. (2011). Big Data: The Next Frontier for Innovation, Competition, and Productivity. McKinsey Global Institute.
[7] Norris, D. M. & Baer, L. L. (2012). A Toolkit for Building Organizational Capacity for Analytics. Washington, DC: Strategic Initiatives, Inc.
[8] Norris, D. M. & Baer, L. L. (2013). Building Organizational Capacity for Analytics. Boulder, CO: EDUCAUSE.
[9] Norris, D., Baer, L., Leonard, J., Pugliese, L., & Lefrere, P. (2008). Action Analytics: Measuring and Improving Performance that Matters in Higher Education. Educause Review, 43(1), 42-44.
[10] Pugliese, L. C. (2010). A New Age of Learning Management Analytics. White paper published at edu1word.org.
[11] Siemens, G. (2013). Learning Analytics: The Emergence of a Discipline. American Behavioral Scientist, 57(10), 1380-1400.
[12] Siemens, G., Dawson, S., & Lynch, G. (2013). Improving the Productivity of the Higher Education Sector: Policy and Strategy for Systems-Level Deployment of Learning Analytics. Society for Learning Analytics Research for the Australian Government Office for Learning and Teaching.
