Learning Analytics | July/August 2019

JULY/AUGUST 2019

PROVING THE VALUE OF TRAINING | 18 Connecting On-the-Job Performance to Training

WHERE TO START WITH LEARNING DATA | 26 Learning Leaders Share Tips and Best Practices

FOSTERING DATA-DRIVEN FEEDBACK | 42 How Machines are Enhancing Human Relationships

BUSINESS PERSPECTIVES ON MANAGING WORLD-CLASS TRAINING


ADVERTORIAL

SITUATIONAL LEADERSHIP®: RELEVANT THEN, RELEVANT NOW

THEN

Where were you in 1969? A better question might be, where were your parents or, perhaps, even your grandparents? A fellow by the name of Richard Nixon was president, the war in Vietnam was winding down and your relatives were probably still recovering from Woodstock. The training industry was in its infancy. There was no internet. Nobody Googled anything. Only a handful of the most forward-thinking organizations did anything remotely resembling leadership training. It was a topic tackled almost exclusively by professors at universities who wound up publishing research results in journals for each other’s consumption. One person who was in the middle of that professorial mix was our founder, Dr. Paul Hersey. He started out working for Carl Rogers in the late 1950s and, over the years, used one creative instructional technique after another to make sense of organizational behavior. This work led to his teaching a course at Ohio University in the late 1960s — a course that everybody wanted to take. It evolved into a best-selling textbook (“Management of Organizational Behavior”).

NOW

Co-authored with his colleague Ken Blanchard, that book introduced The Situational Leadership® Model and was the genesis for the launch of The Center for Leadership Studies (CLS) in 1969. During the 1970s, Dr. Hersey traveled the world about 10 times over introducing the model to thousands of university students and corporate leaders. By the early 1980s, there was certainly awareness of and interest in developing leaders, but comparatively few organizations were ready to dive in, make it a priority and commit serious resources to the cause. Then, things changed — in a hurry! In the fall of 1982, Tom Peters and Bob Waterman published “In Search of Excellence,” the first universally embraced book on management, leadership and organizational culture. Suddenly, there was a deluge of interest in leadership development. With everything that’s happened in leadership development from 1969 to 2019, one thing hasn’t changed: Situational Leadership®.

1969: Dr. Hersey launches The Center for Leadership Studies.

1970s: Dr. Hersey introduces The Situational Leadership® Model worldwide.

Dr. Paul Hersey, Founder, The Center for Leadership Studies

14 MILLION LEADERS TRAINED | 35 COUNTRIES AROUND THE GLOBE

We believe the ongoing generational appeal of this model stems from the fact that it sits on a rock-solid foundation of pioneering research. Dr. Hersey’s genius was to integrate many, seemingly disparate, contributions into a practical, repeatable framework that gives leaders a place to start:
• What is the specific task?
• What is the person’s ability and willingness to perform that task?
• What approach should the leader use based on the answers to the first two questions?
These questions are every bit as relevant for leaders today as they were five decades ago. The challenges over the last decade or so have been to ensure that the mechanisms for delivering the message remained current and that our content was cohesively united with other important leadership development frameworks. In response to those dynamics, we now feature:
• A core offering that is translated into 25 languages.
• A multitiered Situational Leadership®-based curriculum that tethers our model to coaching, emotional intelligence, DiSC®, power, leading teams and leading change.
• Content that is available in a classroom setting and online, in both blended and virtual formats.
• A customer-facing portal that houses our sustainment suite (The Four Moments of Truth™) and allows customers to access microlearning anytime, anywhere and on any device.

1982: Tom Peters and Bob Waterman publish “In Search of Excellence,” drawing interest to leadership development.

25 LANGUAGE TRANSLATIONS

On our 50th anniversary, as we reflect with pride on the reach (70% of the Fortune 500) and relevance of Situational Leadership®, we say sincere thanks to everyone who has participated in any way during our journey. We are excited about all that awaits us moving forward!

“Our purpose is to fulfill a legacy of leadership by equipping people at all levels of an organization to effectively influence, engage and succeed.” Maureen Hersey Shriver, CEO The Center for Leadership Studies

2019: The Center for Leadership Studies celebrates its 50th anniversary. Learn more at situational.com.


Management tools for the real heroes of your organization. You are an HR superhero! Your challenges may be great, but the rewards are greater. You must take risks, manage people and organizations, juggle many initiatives all at once AND be a change agent. Your job requires you to be quick-thinking, responsive, innovative and also courageous. You have the skills to help your employees realize their potential and the tenacity to achieve mission goals.

Learn how to grow your superpowers at csod.com


PERSPECTIVES KEN TAYLOR

LEARNING ANALYTICS: TABLE STAKES FOR BUSINESS SUCCESS

L&D HAS STRUGGLED WITH THE COLLECTION AND STRUCTURE OF LEARNING ANALYTICS SINCE THE BEGINNING OF TIME.

Big data is here to stay. As a function, learning and development (L&D) needs to embrace it or fall behind the other departments inside our organizations. The struggle we are all facing is where to find the skill set on our teams to tackle the mounds of available data — and put it to use to improve the impact we make when training our companies’ employees.

L&D has struggled with the collection and structure of learning analytics since the beginning of time. For years, Training Industry’s research has identified it as one of the top challenges learning leaders face. It’s about time that we start to define the role of the learning data scientist. To be successful, the role will require a combination of skills that go beyond data collection and an understanding of the language of L&D.

But I think it goes beyond just defining the role and brings us back to the fundamentals that great training organizations have embraced for years. The best companies make clear connections between training investments and their intended impact. They ensure that their training programs are specifically aligned to the core objectives of the organization. They think about measurement while they’re designing training programs, not after they roll them out. They are systematic in the implementation of learning technologies to maximize their technology stack for data trackability. They are customer-focused and mindful of the need to collect feedback from all stakeholders in the roll-out of a training program, not only the learners.

It might feel like a tall order for any company’s L&D team, but we must drive in this direction to make sure that we remain relevant and connected to the businesses we support. Learning analytics will help us make certain that not a penny of the investment we make in training is wasted. More importantly, it will allow us to better monitor and understand the value we bring to the teams we support, which I believe is table stakes in the environment of scarce resources and escalating skills-related challenges that many companies now face.

As roles morph, corporations are looking to reskill their employees. The rate of change in skill requirements continues to accelerate, and we need to be ready. The great news is that there are many resources available to help with the journey, and we are excited to add to the resource pool with this edition of Training Industry Magazine.

As always, we would love to hear your thoughts about the perspectives shared in the magazine. Please feel free to send along any suggestions for future editions of Training Industry Magazine for us to consider.

Ken Taylor is president and editor in chief of Training Industry, Inc. Email Ken.



TABLE OF CONTENTS
VOLUME 12 | ISSUE 5 | JULY/AUGUST 2019


FEATURES

18 | IS TRAINING ANALYTICS SMOKE AND MIRRORS? By Jim and Wendy Kirkpatrick
Learn how to create and implement a plan to reach Level 3 of the Kirkpatrick Model.

22 | LEVERAGING ANALYTICAL TOOLS TO TRANSFORM L&D By Preeta Chandran
Data analytics can successfully measure and improve training program effectiveness.

26 | LEVERAGING LEARNING ANALYTICS TO IMPROVE BUSINESS OUTCOMES: WHERE TO START By Ken Taylor
Learning leaders share tips for effectively leveraging learning analytics to drive business success.

33 | WAS IT WORTH IT? MEASURING THE IMPACT AND ROI OF LEADERSHIP TRAINING By Paul Leone, Ph.D.
This case study proves that leaders can’t afford not to go through training.

36 | HOW TO MAKE THE TRANSITION TO A DATA-DRIVEN LEARNING CULTURE By Jon Green
Adopting a data-driven learning culture is key in reaching organizational goals.

38 | HOW PERSONAL SKILLS AI ASSISTANTS MIGHT DISRUPT CORPORATE TRAINING By François Debois and Simon Vuillaume
Personal skills artificial intelligence assistants will soon make waves in the corporate training space.

42 | MACHINES ARE THE FUTURE OF TRAINING: HOW DATA-DRIVEN FEEDBACK IS FOSTERING IMPROVEMENT AND ENHANCING HUMAN RELATIONSHIPS By Noah Zandan
Discover how objective, data-driven feedback can better serve learners.

47 | IMPROVING INSTRUCTOR IMPACT ON LEARNING WITH ANALYTICS By Eric A. Surface, Ph.D., and Reanna P. Harman, Ph.D.
Maximize instructor impact through analytics and feedback for better learning outcomes.



IN THIS ISSUE

THOUGHT LEADERS

5 | PERSPECTIVES By Ken Taylor
Remain competitive in the marketplace by effectively utilizing your data scientists.

11 | GUEST EDITOR By Kevin M. Yates
Build an effective measurement practice in your organization by focusing on performance.

13 | SCIENCE OF LEARNING By Srini Pillay, M.D.
Discover the brain’s influence on learner retention and learning analytics.

15 | PERFORMANCE MATTERS By Julie Winkle Giulioni
Optimize the use of your learning analytics by including your learners in the process.

17 | BUILDING LEADERS By Sam Shriver and Marshall Goldsmith
Effectively measuring the impact of training begins with your organization’s strategic goals.

55 | WHAT’S NEXT IN TECH By Stella Lee, Ph.D.
Discover potential solutions and resources for the challenges presented by data collection.

57 | SECRETS OF SOURCING By Doug Harward
L&D must think of analytics from the perspective of the business to drive training objectives.

59 | LEARNER MINDSET By Michelle Eggleston Schwartz
Focusing on quality versus quantity of training can improve the effectiveness of learning.

INFO EXCHANGE

50 | CASEBOOK
Follow a sales enablement organization’s journey toward effective data measurement and analysis.

52 | GLOBAL OUTLOOK
Make your e-learning more accessible to international audiences by localizing your content.

60 | CLOSING DEALS
Coursera accelerates enterprise product innovation to meet the data analysis needs of the marketplace.

61 | COMPANY NEWS
Keep up with the latest in the training industry by reading news from the last quarter.

CONNECT WITH US
1 (866) 298-4203
editor@trainingindustry.com
TrainingIndustry.com


ABOUT OUR TEAM

STAFF

CHIEF EXECUTIVE OFFICER: Doug Harward, dharward@trainingindustry.com
EDITOR IN CHIEF & PRESIDENT: Ken Taylor, ktaylor@trainingindustry.com
EDITORIAL DIRECTOR: Michelle Eggleston Schwartz, meggleston@trainingindustry.com
MANAGING EDITOR, DIGITAL: Taryn Oesch, toesch@trainingindustry.com
ASSOCIATE EDITOR: Sarah Gallo, sgallo@trainingindustry.com
EDITORIAL INTERN: Hope Williams, hwilliams@trainingindustry.com
DESIGNER: Mary Lewis, mlewis@trainingindustry.com
DESIGNER: Kellie Blackburn, kblackburn@trainingindustry.com
DESIGNER: Alyssa Alheid, aalheid@trainingindustry.com
ADVERTISING SALES: sales@trainingindustry.com

EDITORIAL BOARD

SCOTT NUTTER, General Manager, Research, AQP & Development, Delta Air Lines
JUDI BADER, Senior Director of Learning, Arby’s Restaurant Group
MICHAEL CANNON, M.ED., Senior Director, Head of Learning & Development, Red Hat
MARC RAMOS, Vice President, Chief Learning Officer, Sitecore
MEGAN CASADOS, Director of Training, DISH
KELLY RIDER, Vice President, L&D Content Strategy & Experience, SAP Learning & Development
BARBARA JORDAN, Group Vice President, Global Learning & Development, Sims Metal Management
DR. SYDNEY SAVION, General Manager, Learning, Air New Zealand
CATHERINE KELLY, MA, BSN, RN, CPTM, Director of Learning Programs, Brookdale Senior Living
KERRY TROESTER, Director, North America Sales Training, Lenovo
NATASHA MILLER WILLIAMS, Vice President, Talent Engagement & Development, Nielsen
SHIREEN LACKEY, Talent Management Officer, Office of Business Process Integration, Veterans Benefits Administration
KEE MENG YEO, Vice President, Enterprise Talent Development, Amway
LAURA MORAROS, Global Head of Sales Learning, Facebook
MATTHEW S. PRAGER, Executive Training Manager, U.S. Government

MISSION

Training Industry Magazine connects learning and development professionals with the resources and solutions needed to more effectively manage the business of learning.

AWARDS

American Society of Business Publication Editors (ASBPE), “Fostering B2B editorial excellence”:
• 2018 Cross-Platform Package of the Year Top 10 Award
• 2017 National ONLINE Award Winner
• 2017 Regional ONLINE Award Winner

SUBSCRIPTIONS

ELECTRONIC: Sign up at TrainingIndustry.com to receive notification of each new digital issue.
PRINT: Print copies are available for purchase at magcloud.com for $15.95.

ARTICLE REPRINTS

To order reprints of articles, please contact Training Industry at editor@trainingindustry.com.

PUBLISHER

Training Industry Magazine is published bi-monthly by:
Training Industry, Inc.
6601 Six Forks Rd, Ste 120
Raleigh, NC 27615



Insights. On Demand. TRAINING INDUSTRY WEBINARS

WE PROVIDE THE ACCESS, YOU GAIN THE INSIGHTS.

From exposure to fresh perspectives and approaches from some of the world’s greatest mentors, like Alan Mulally and Marshall Goldsmith, to convenient continuing education credits and interactive demos on the latest learning technologies, we have everything you need to extend your professional development and improve your L&D initiatives.

Browse free webinars or on-demand presentations.


GUEST EDITOR KEVIN M. YATES

LET’S BUILD A MEASUREMENT PRACTICE FOR L&D

I have an idea. Let’s create a function in learning and development (L&D) that uses facts to show the influence of training and learning on people’s behavior, actions and performance. Let’s tell L&D’s story with evidence of impact. Let’s build a culture in L&D that uses data and analytics to answer questions and inform decisions. How about a function embedded in L&D that validates fulfillment of our purpose and transforms the way L&D demonstrates its value? I got it! Let’s build a measurement practice for L&D.

CHANGE OUR MINDSET

Our mindset about impact is deeply rooted in the history of what we’ve thought to be our deliverable: the class, the course, the training materials, etc. But what about impact as the deliverable? We must consider the change that comes from learning’s influence on behavior, performance and actions. The content we create is not our final deliverable. It’s not where the story of measurable impact ends. For me, the story ends when our learning solutions strengthen or build capability. We, as L&D professionals, fulfill our highest purpose when we measurably impact people’s performance. Measurable impact is the deliverable, and that should be our mindset.

CHANGE THE DISCUSSION

Let’s change the discussion by changing where the discussion starts. The discussion doesn’t start with, “We need training.” It starts with being curious about the underlying reasons for training requests.

How often do we engage in discovery-based discussion about what the organization is trying to achieve when we get the order for training? Our discussions should help us uncover how the organization will measure success. Moreover, our conversations should inform decisions for training and learning solutions that measurably impact performance.

WE FULFILL OUR HIGHEST PURPOSE WHEN WE MEASURABLY IMPACT PEOPLE’S PERFORMANCE.

When we’re clear about organization priorities and align our L&D solutions accordingly, we are using the same metrics and measures to evaluate success. We connect in ways that prevent us from just measuring our programs and initiatives. A new discussion with a different focus and an innovative strategy for engagement helps us build a measurement practice for L&D.

FOCUS ON PERFORMANCE

Our training and learning solutions should be purposefully aligned and designed to produce performance outcomes that help people and the organization. We don’t design and deliver training and learning solutions simply for the experience. Performance impact is the purpose and the goal. If performance impact is the deliverable, and we’re going to measure it, we need to know what impact looks like. More specifically, we need to know the skill and capability requirements that aid people in achieving goals, executing strategy and allowing the organization to succeed. Performance impact is the foundation upon which a measurement practice is built.

Let’s focus on influencing and changing people’s performance through training. Results are not the number of people trained, the number of training hours completed or the number of people who liked the program. Results are the extent to which behavior and actions change, and we can measure that.

BUILDING A MEASUREMENT PRACTICE

I deliberately stayed away from talking about methods, models and technology for L&D measurement. There’s work to do before we get to that. Building a measurement practice for L&D doesn’t start with measurement. That may sound contradictory, but here’s what I mean: We need a mindset shift for impact as the deliverable, a change in our discussions for alignment with organization goals, and a focus on performance outcomes. If we get these things right first, we are setting ourselves up for a successful L&D measurement practice. The methods, models and technology will be there to support us. The foundations I’ve described here will sustain us in building a meaningful measurement practice for L&D.

Kevin M. Yates is an L&D detective and, just like Sherlock Holmes, he solves mysteries. The mystery he solves is, “Did training work?” He uses facts, evidence and data to show training and learning’s impact on behavior, performance and goals. Email Kevin.



GET BEHIND YOUR EMPLOYEES TO STAY AHEAD OF THE CURVE

Keeping up with the demands of business means working faster, better, smarter. With 200+ career-focused programs, Southern New Hampshire University offers flexible, affordable degree pathways designed to upskill your employees – so you can stay competitive now and ahead of the curve tomorrow.

UPSKILL YOUR WORKFORCE. TRANSFORM YOUR ORGANIZATION. LEARN MORE AT snhu.edu/talk-talent


SCIENCE OF LEARNING SRINI PILLAY, M.D.

BRAIN-BASED LEARNING ANALYTICS AND EVALUATION PRACTICES

When we analyze learning and measure learning outcomes, we usually want to determine the training’s effectiveness. However, learning relies on more than the content provided. In the brain, other factors influence learning. Mindset, instructor trust, context, value and design also impact learning, and these factors may operate entirely outside of awareness. Learning analytics must take these dynamic and abstract factors into account. Let’s review insights on the nature of these factors, as well as how to analyze and evaluate them in learning environments.

STUDENT MINDSET

A growth mindset reflects the belief that one can successfully develop their own intelligence. When people hold this belief, it increases their intrinsic motivation to learn. Motivation is also increased when people internalize external values. As a result, people with growth mindsets are more likely to respond promptly to errors. In order to foster growth mindsets in students, trust must be established between the learner, the instructor and the learning environment.

INSTRUCTOR TRUST

Students’ trust in their instructor helps facilitate learning. Both growth mindset and instructor trust can increase the student’s commitment to learning. Trust must be established early, as opinions formed within six seconds may influence student evaluation of the instructor. It is important to note that the instructor’s mindset matters as well.

Teachers who have a growth mindset appreciate successive improvements more than those who do not. Consequently, students are more motivated by teachers with growth mindsets. Teacher feedback is least effective when focus is placed on final results rather than the learning process.

LEARNING CONTEXT

When learning is delivered, the importance of context may vary depending on the student. Some studies show that learners’ application of training depends on their prior level of knowledge and their ability to transfer knowledge across domains. In the brain, context can either improve or worsen learning. A brain circuit, involving short-term memory, long-term memory, self and emotional processing, participates in establishing learning context. Therefore, all content must address the material, contain vehicles to enhance long-term memory and remain relevant.

REWARD-BASED LEARNING

Value also guides behavior in the brain. Students are more likely to learn when they perceive the learning as valuable, as this is rewarding to the brain. Reward-based learning enhances students’ ability to attend to the information at hand. Learning feels more relevant when students are actively involved in the learning process. Learning is enhanced by content’s relevancy, because the brain feels rewarded.

LEARNING DESIGN

Finally, design is critical to learning. When learning is sublime rather than simply visually pleasing, it is arousing to the brain. In effect, students are most engaged when they are swept off their feet by content design. Instructors should evaluate their content based on this.

EVALUATING LEARNING PROCESSES GOES FAR BEYOND THE CONTENT AND LEVEL OF APPLICATION.

BRAIN-BASED EVALUATION PRACTICES

Analyzing and evaluating learning processes and design goes far beyond the content and level of application. Other factors determine brain-based learning success and should be included in the evaluation process. The following screening questions may form the basis for additional learning analytics and evaluation practices:
• Does the learner have a growth mindset?
• Does the instructor have a growth mindset?
• Does the student trust the teacher?
• Was trust established early on? How?
• Is the learning motivating?
• Has the context been optimized?
• Does the student value the learning?
• Has the student been included in the learning?
When these factors are addressed, brain-based learning can be optimized.

Dr. Srini Pillay is the CEO of NeuroBusiness Group. He is also assistant professor (part-time) at Harvard Medical School and teaches in the executive education programs at Harvard Business School and Duke CE. Email Srini.
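These screening questions lend themselves to a simple scoring rubric. The following is a minimal, hypothetical Python sketch, not something from Dr. Pillay’s practice: it encodes the questions above and computes a coverage percentage, and the 0-2 rating scale is an assumption made purely for illustration.

```python
# Hypothetical rubric built from the screening questions above.
# Ratings are illustrative: 0 = not addressed, 1 = partially, 2 = fully.
SCREENING_QUESTIONS = [
    "Does the learner have a growth mindset?",
    "Does the instructor have a growth mindset?",
    "Does the student trust the teacher?",
    "Was trust established early on?",
    "Is the learning motivating?",
    "Has the context been optimized?",
    "Does the student value the learning?",
    "Has the student been included in the learning?",
]

def brain_based_coverage(ratings):
    """Return percent coverage (0-100) of the brain-based factors."""
    total = sum(ratings.get(q, 0) for q in SCREENING_QUESTIONS)
    return 100 * total / (2 * len(SCREENING_QUESTIONS))

# Example: every factor fully addressed except learner inclusion.
ratings = {q: 2 for q in SCREENING_QUESTIONS}
ratings["Has the student been included in the learning?"] = 0
print(f"Brain-based coverage: {brain_based_coverage(ratings):.0f}%")  # 88%
```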



2019 TOP 20 COMPANIES

Training Industry Top 20 Company: Content Development
Training Industry Top 20 Company: Online Learning Library

VIEW THE LISTS


PERFORMANCE MATTERS JULIE WINKLE GIULIONI

WHO [ELSE] CARES ABOUT LEARNING IMPACT DATA? During my years as an internal learning professional, a considerable amount of my time was spent contemplating, measuring, evaluating and communicating the extent to which training and development initiatives were delivering on their promises. Like you, I debated the merits of Kirkpatrick and Brinkerhoff and explored the best methods for tracking and promoting learning results.

EVOLVING NEEDS & EXPECTATIONS L&D has never been more important in the workplace. In a fast-paced business environment, the ability to learn quickly, upgrade skills agilely and perform in new ways daily offers a powerful and sustainable competitive advantage. However, learning isn’t exclusively a priority for organizations; it has become a priority for individuals as well.

Naturally, the output of these efforts was used to inform program modifications, with lessons learned improving our content, strategies and initiatives. But at the end of the day, the real audience for this analysis was typically executives. It was all about selling management on the value of training: linking development to business priorities, justifying future budgets, the learning and development (L&D) function and even my job.

In fact, according to multiple studies, L&D currently represents one of the most important employee benefits a business can offer. Prospective employees often make the decision to join your organization based upon the training trajectory available. Additionally, existing employees choose to stay or go based upon the learning opportunities presented to them.

Positioning the role of learning in your organization’s success with executives is important; offering information about how your efforts are directly supporting results helps leaders make wise choices about prioritizing and deploying their finite resources. However, too frequently the L&D function sub-optimizes the data it gathers, evaluates and prepares. And it does so by focusing exclusively on leaders and failing to make use of it with another highly interested and influential audience: the learners. The next frontier of learning impact analytics, or at least a possible expanded focus, involves strategically leveraging data with participants for their benefit and for the benefit of the organization as a whole. In short, it’s time to start looping in the learners.

So, lift the curtain and let employees in on what you have to offer; allow people to take advantage of training. Online systems with marginal incremental costs can support a more generous and democratic approach to learning. Publish, in layman’s terms, any data available from participant reactions and recommendations to documented behavior change and key business drivers.

LIFT THE CURTAIN AND LET EMPLOYEES IN ON WHAT YOU HAVE TO OFFER.

Additionally, it’s important to remember that employees everywhere are becoming increasingly sophisticated consumers. Given the ubiquitous nature of information and learning resources, both within and outside of the workplace, employees have a variety of choices. They’re even willing to spend their own money to further their development. Sharing data with employees about the quality of your offerings can inspire credibility and confidence.

THE BENEFITS OF BEING TRANSPARENT

When employees are offered visibility regarding the value of the learning experiences available to them, your organization can benefit in the following ways:
• Learners are more receptive to engaging in L&D efforts, including online social interactions.
• Learners have a greater appetite and willingness to participate in future evaluation processes because they’ve benefitted from the output.
• Learning becomes more valued by everyone, allowing informal learning to garner greater attention and energy.
All of this creates a stronger learning culture within the organization and makes the job of the L&D department easier and more satisfying.

So, you’re gathering the data and crunching the numbers already. Why not squeeze even greater value from your training evaluation initiatives? It’s as easy as expanding your distribution list to include an additional audience. Let’s start looping in the learners!

Julie Winkle Giulioni has 25 years of experience working with organizations worldwide to improve performance through learning. Email Julie.



Understand the Sales Training Market

+10%: This year, overall spending on sales training will increase by 10%.

$2.8B: Companies worldwide will invest $2.8 billion in sales training programs and vendors this year.

In this report, you will have access to:
• The size of the overall training market and the sales training segment of that market
• An in-depth analysis of the sales training market, including key trends, topics covered and delivery methods
• Profiles of each of the 2019 Training Industry Top 20 Sales Training Companies

Understanding this market is essential to the success of your organization’s sales training strategy and investment. You’ll benefit from being able to benchmark your sales training investments, gain intelligence on how to evaluate sales training offerings and expand your understanding of the sales training partners available in today’s marketplace.

PURCHASE THE REPORT

2019 Training Industry Market Segment Reports | Top 20 Sales Training Companies | 15 Years
• Fifteen years of corporate L&D market analysis.
• Data from over 500 L&D vendor company submissions.
• Survey responses from 5,000 L&D professionals annually.
View all Training Industry reports


BUILDING LEADERS SAM SHRIVER & MARSHALL GOLDSMITH

JUSTIFYING THE INVESTMENT IN LEADERSHIP TRAINING After reflecting on the philosophy classes that we took in college, and comparing notes, we realized our experiences were strikingly similar. When it came time for the final exam, our professors dramatically turned to a chalkboard and wrote out a version of the question: “Why live?” And, tempted though we were to respond by writing down something like “Why not?” and head for the door, we succumbed to traditional expectations and filled blue book after blue book recounting what we had learned from reading the work of one philosopher after another throughout the semester.

WHEN YOU BEGIN WITH THE BUSINESS RESULTS, THERE IS AN ELEMENT OF CONSISTENCY THAT SURPRISES NO ONE.

Now, fast forward to this very moment and let’s consider the question: “Why do leadership training?” We’ll come back to what we think is a viable answer shortly but, first, a little side story. In large part because of the model that has been ascribed to him since the 1960s, there is a natural inclination to attribute measurement strategy for leadership training to Donald Kirkpatrick. He was the first person to publicly suggest the following parameters of impact analysis for a training intervention:
• Reaction: Did learners like the learning experience?
• Learning: Did the learners learn anything?
• Behavior: Did the learners change behavior because of what they learned?
• Results: Are there any results we can tie to the behavior change?
In the mid-1980s, there was a highly disruptive article published by training guru Bob Pike that essentially posed the question: “What if we turned the Four Levels Model upside down?” Donald Kirkpatrick was initially resistant to the suggestion, but later came around to the idea. Donald’s successors, Jim and Wendy Kirkpatrick of Kirkpatrick Partners, analyzed the model and inverted the starting point for effective measurement strategy. When you begin with the results most organizations seek, there is an element of consistency that surprises no one. There is a bottom-line productivity measure that needs to continue migrating upward and to the right; there is the organization’s ability to attract and keep key talent. There are also the transformational commitments many organizations make to positively impact the world in any number of creative and imaginative ways.

Regardless of what those strategic initiatives happen to be, they are the starting point for measuring the impact of training. You literally ask questions, and get answers, that cascade from results to behavior to learning instead of the other way around – for example:
• What are our key strategic initiatives?
--How can training help us achieve the objectives dictated by our strategy?
• If employees implemented what they learned in training, what would be different?
--What would people at all levels start doing? Stop doing? Do more of?
• Given the answers to those questions, what form should our training take?
--Is it both relevant and engaging?
--Does our design intentionally include and feature extended stakeholders (e.g., the managers of the trainees who possess the potential to drive desired behavior change)?
So, why do leadership training? You do leadership training to enable your organization to achieve its strategic objectives. We are well past the point in life where regurgitating what we know in a blue book or two matters to anyone.

Marshall Goldsmith is the world authority in helping successful leaders get even better. Sam Shriver is the executive vice president at The Center for Leadership Studies. Email Marshall and Sam.



IS TRAINING ANALYTICS SMOKE AND MIRRORS?

BY JIM AND WENDY KIRKPATRICK



Alisa beamed as she presented the final report for the year-long leadership development initiative she spearheaded at her organization. She recapped the thoughtfully designed, blended training curriculum and its related test scores and comments. She highlighted positive results in key company metrics, including employee retention, customer satisfaction, product sales volume and promotions of the program’s participants. She eagerly awaited the management team’s reaction.

After some pleasant nodding, someone asked, “Alisa, please explain how training specifically influenced the organizational results you have presented, such as customer satisfaction. I would think there are multiple contributors to that outcome.” Alisa began to sweat; she didn’t really have any supporting data.

If this story sounds familiar, you are in good company. Many training plans, even those for award-winning programs, are missing one critical element that will both ensure their success and demonstrate value to stakeholders: Level 3 data connecting on-the-job performance to program results. This common omission has cast doubt on the value of training and is a key factor in the cyclical nature of training budgets. However, creating and implementing a plan with the all-important Level 3 data is not as difficult as some assume.

THE BASICS OF TRAINING VALUE

Begin designing your program using the four levels of training evaluation, or the Kirkpatrick Model (see Figure 1). Savvy training professionals know that every training request should first be considered through the lens of what Level 4 result it will support, meaning, what high-level organizational outcome the training should positively influence. From there, the elusive missing piece should be designed: the Level 3 plan. Both the training professional and the organization should work together to define what needs to change in the on-the-job environment and employee performance to yield the training’s desired results.

Then, the training professional can design the training program (Level 2) and consider the type of environment that would best support the learning effort and eliminate unnecessary distractions (Level 1).

THE MISSING PIECE

There is strong agreement that formal training alone yields a fairly small portion of any program’s organizational results. On-the-job experiences are the biggest source of learning for employees, according to the “Deconstructing 70-20-10” research report published by Training Industry. An organization’s on-the-job environment and culture significantly impact what employees will or won’t do, regardless of their knowledge.

Alisa’s final report, like many similar reports, was missing the Level 3 component. She jumped from a detailed explanation of the training program to taking credit for organizational results that, by nature, have many contributing factors. One plus two does not equal four, meaning that formal training alone, no matter how good, will not produce a meaningful level of organizational results and cannot be credited with results without sufficient data showing the connection.

FORMAL TRAINING ALONE CANNOT BE CREDITED WITH ORGANIZATIONAL RESULTS WITHOUT SUFFICIENT DATA SHOWING THE CONNECTION.

During the next round of budget cuts, training will likely be on the list, because no one can really tell whether it was successful through anecdotal evidence alone. This is why creating a Level 3 plan and defining roles and responsibilities early in the planning phase is critical not only for the success of the program, but also for the continued existence of the training function.

CREATING A LEVEL 3 PLAN

The first step in creating a credible Level 3 plan is to define, in observable and measurable terms, the critical behaviors that training graduates should be able to perform as a result of the training. To ensure training graduates do what they know they are supposed to do, your plan should include a variety of methods and tools to reinforce the importance of critical behaviors, provide coaching and support in performing those behaviors, track the degree to which they are performed, reward those who perform them reliably and correct those who do not.

In reality, executives are not physically there on a daily basis, and even direct supervisors often feel that they do not have enough time to support training graduates, or do not possess a high enough comfort level in tracking performance and correcting employees who are not implementing the behaviors taught during the training. Therefore, during the program development process, there must be buy-in for roles and responsibilities after training to ensure the plan is followed. The degree to which the Level 3 plan is implemented defines the initiative’s overall success.
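To make “track the degree to which they are performed” concrete, here is a small, hypothetical Python sketch; the behavior, graduates and observation source are invented, and a real plan would feed this from surveys, manager check-ins or an LMS:

```python
# Hypothetical Level 3 tracker: records observations of critical behaviors
# per training graduate and reports each behavior's adoption rate.
from collections import defaultdict

observations = defaultdict(list)  # behavior -> list of (graduate, performed)

def record(behavior, graduate, performed):
    observations[behavior].append((graduate, performed))

def adoption_rate(behavior):
    """Percent of recorded observations in which the behavior was performed."""
    obs = observations[behavior]
    return 100 * sum(done for _, done in obs) / len(obs) if obs else 0.0

record("Holds weekly team meeting", "Alan", True)   # names are illustrative
record("Holds weekly team meeting", "Priya", False)
print(f"Adoption: {adoption_rate('Holds weekly team meeting'):.0f}%")  # 50%
```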

IMPLEMENTING A LEVEL 3 PLAN

Training analytics are often summative in nature, meaning they occur at the completion of a training program or implementation phase. Focus instead on formative evaluation, or that which occurs during a phase, so you can correct issues and maximize performance and results along the way.

Before training, find out what analytics are important to stakeholders, managers of training graduates, training graduates themselves and the training group. Determine who will gather and report the data, and decide how the data should be formatted. Determine which data will be reported throughout the initiative, and how often, to track progress and identify areas for improvement.

Let training participants know before and during training what is expected of them after training, what support is available to them and how their performance and outcomes will be tracked. Making this connection has countless benefits, including increasing training relevance and engagement, because participants can see the training’s purpose and how it connects to their personal performance and contributes to greater organizational success.

When the formal training is over, the important work begins. As training graduates return to the job, schedule regular checkpoints to ensure that the reinforcement, support and accountability actions are occurring. Automated reminders to check in with key parties, such as managers of the graduates, are a good practice. Expect to have a few setbacks, and be prepared with a remediation plan. This is a normal part of the process, and the reason that continual monitoring and reinforcement is required for success in most programs.
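Those automated reminders are easy to script. A minimal sketch follows; the two-week cadence, six checkpoints and alternating audiences are assumptions chosen for illustration, not a recommendation from the authors:

```python
# Hypothetical checkpoint scheduler for post-training reinforcement:
# emits reminder dates alternating between graduates and their managers.
from datetime import date, timedelta

def checkpoints(end_of_training, weeks_between=2, count=6):
    """Yield (date, audience) reminders after formal training ends."""
    for i in range(1, count + 1):
        when = end_of_training + timedelta(weeks=i * weeks_between)
        audience = "training graduates" if i % 2 else "managers of graduates"
        yield when, audience

for when, audience in checkpoints(date(2019, 7, 1)):
    print(f"{when}: send reinforcement check-in to {audience}")
```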

A LEADERSHIP DEVELOPMENT EXAMPLE

In Alisa’s leadership development program, a critical behavior for new leaders could be conducting weekly team meetings with all direct reports to review key projects and progress toward milestones. This behavior could be supported by requiring new leaders to submit monthly reports documenting when team meetings were held and the key issues discussed during each meeting. New leaders failing to submit their reports or to hold the required meetings would have a conversation with a senior leader about their reasons for not doing so.

The underlying belief here is that team meetings foster communication which, in turn, fosters good decision-making that contributes to timely and satisfactory job completion. This will produce pride in the team, leading to employee satisfaction, higher retention rates and possibly awards and recognition showcasing the organization as a great place to work.

FIGURE 1: THE KIRKPATRICK MODEL

LEVEL 4 | RESULTS: The degree to which targeted program outcomes occur and contribute to the organization’s highest-level result.

LEVEL 3 | BEHAVIOR: The degree to which participants apply what they learned during training when they are back on the job.

LEVEL 2 | LEARNING: The degree to which participants acquire the intended knowledge, skills, attitude, confidence and commitment based on their participation in the training.

LEVEL 1 | REACTION: The degree to which participants find the training favorable, engaging and relevant to their jobs.

These lofty outcomes can have a variety of contributors. This is where the story that underpins the data is required to credibly connect training to performance and to organizational results.

REPORTING ON LEVEL 3 ANALYTICS

Powerful data is a combination of numbers and the information that supports them. This is the formula that connects what is learned to what is done on the job to the results it creates. However, there is always the objection that some other factor created or influenced the results.


In reality, there always remain a number of factors that influence results. With a reasonable investment of resources, you can show the relative contribution of multiple factors, and create a culture of collaborative success. A simple way to connect training to organizational results is to survey training graduates and their supervisors on what factors have contributed to their success. For example, supervisors could be asked what factors they felt were most important in reducing turnover of their direct reports. The factors could be force-ranked, or respondents could select all that apply. Responses could include:
• Skills learned in leadership development training
• Regularly scheduled team meetings
• Supervisors taking a personal interest in their direct reports
• Support from human resources
• Technology and tools available to employees
• Company reward systems
• Office environment
• Company culture/values
Then, respondents can be asked to share a story related to how one or more of these factors contributed to their success. The responses to these questions show the value that training brings to the company overall, highlight other important factors contributing to the company’s success, give credit where credit is due and contribute to a team-based approach to success. (A small tallying sketch follows the example below.)

If Alisa had asked such a question, she might be able to tell a story like this: Alan, a new leader, participated in the leadership development program. He learned how to conduct an effective meeting and was asked to hold weekly team meetings. At first, he objected that no one had time to attend, and meeting attendance was sporadic. The program he was spearheading experienced a delay, which cost the company money.

SPEND LESS TIME TWEAKING THE TRAINING PROGRAM AND MORE TIME ON WHAT OCCURS BEFORE AND AFTER IT.

With some management pressure, Alan began holding regular meetings. Through team discussions, misunderstandings that were slowing progress were addressed and resolved — and the delayed program was finally launched. Client response was favorable, and a key client increased their purchases by 10% because they were so pleased with the new offering. Alan treated his team to a pizza lunch on Friday, and presented a variety of awards that were both serious and humorous in nature. Suzanne, a team member, said, “I would never leave a great team like this.”
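Here is the tallying sketch referenced above: a hypothetical pass over “select all that apply” responses that computes each factor’s share of total mentions. The response data is invented, and a force-ranked survey would need a weighted variant of the same idea.

```python
# Hypothetical tally of a "select all that apply" success-factor survey,
# reporting each factor's relative share of all mentions.
from collections import Counter

responses = [  # one list of selected factors per respondent (sample data)
    ["Skills learned in leadership development training",
     "Regularly scheduled team meetings"],
    ["Regularly scheduled team meetings", "Company culture/values"],
    ["Skills learned in leadership development training"],
]

mentions = Counter(factor for resp in responses for factor in resp)
total = sum(mentions.values())
for factor, n in mentions.most_common():
    print(f"{factor}: {100 * n / total:.0f}% of mentions")
```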

TAKING THE FIRST STEP

Creating a Level 3 plan can seem daunting for those who have never done it before. Here are a few tips to get started:
• Select one important initiative, and use it as a pilot.
• Find an executive sponsor, and share this article with them.
• Create a committee to support the pilot.
• Spend less time tweaking the training program and more time on what occurs before and after it.
Invest your time in building a process during the pilot. As a result, subsequent programs will be both faster and easier. By following the tips and tricks outlined in this article, you will soon find yourself building and implementing Level 3 plans for all major company initiatives — and reaping the benefits.

Jim and Wendy Kirkpatrick are co-authors of “Kirkpatrick’s Four Levels of Training Evaluation.” Email Jim and Wendy.



LEVERAGING ANALYTICAL TOOLS TO TRANSFORM L&D

BY PREETA CHANDRAN

The single biggest challenge facing learning leaders today is evaluating the effectiveness of their learning interventions. Currently, data analytics is all the rage. There is a vast array of analytical models, tools and data science expertise available, all of which are working to transform businesses across the globe. So, how do we leverage the latter to enable the former and give learning and development (L&D) organizations accurate tools to measure the effectiveness of their learning programs and use that data for continuous improvement and transformation?

THE CHALLENGES FACED BY CLOs AND THEIR TEAMS

L&D professionals understand that learning is crucial to business performance. However, proving the value of L&D to other parts of the organization, including business leaders and decision makers, is an uphill task. Why do chief learning officers and their teams struggle in this area? The answer lies in the practical challenges they face in measuring the effectiveness of learning programs and, hence, the challenge in showing their impact on business goals and performance. There are frameworks that address measuring learning effectiveness, the most notable being the Kirkpatrick Model:

LEVEL 1 | REACTION

This level refers to the reaction of learners, typically captured through feedback forms or “smile sheets”; the reaction phase helps L&D professionals understand to what degree participants find the learning intervention engaging and relevant to their jobs.

LEVEL 2 | LEARNING

This level refers to the degree to which learners acquire the intended knowledge, skills and attitudes addressed in the learning intervention, and it is typically measured through assessment scores.

LEVEL 3 | BEHAVIOR

This level refers to the degree to which learners apply their learning from the learning intervention to their jobs; supervisor, mentor and 360-degree feedback are some ways to capture this information.

LEVEL 4 | RESULTS

This level refers to the degree to which targeted outcomes occur as a result of the learning intervention and its impact on key business metrics.

However, many organizations following these frameworks lack the tools needed to put the framework to use. At each level of the Kirkpatrick Model, the first step toward measuring effectiveness is data collection which, for many organizations, is a difficult task to carry out beyond Level 2. According to the “State of the ROI of Learning” report by Udemy, most companies still rely on Level 1 metrics and primarily measure training satisfaction and completion rates. By relying only on Level 1 data, organizations fail to measure L&D’s impact on employee behavior, key performance indicators (KPIs) and critical business metrics. Even with the wide availability of data, how does one go about analyzing the mass of information to generate valuable insights that enable better decision making? This remains the bigger challenge. This is where data mining and analytics come into the picture. The key objective is to start a thought process on how to address the challenge of measuring learning effectiveness by leveraging cutting-edge, new-age technology. How can organizations leverage analytics to evaluate the effectiveness of their L&D interventions and drive continuous improvement?


DATA MINING AND ANALYTICS IN L&D

Data mining is the process of finding anomalies, patterns and correlations within large data sets to predict outcomes. Big data analytics is the process of collecting, organizing and analyzing large sets of data (called big data) to discover patterns and other useful information. So, how can L&D professionals leverage data mining and analytics? Figure 1 depicts which analytics tools can be leveraged at each level of the Kirkpatrick Model. Let us delve into a little more detail, with the four Kirkpatrick levels as the reference points.

REACTION

Organizations typically collect learner feedback through either physical or online survey forms. In the case of e-learning hosted on learning management systems (LMSs), feedback forms are often built into the e-learning course. The net result is a host of data (from questions that ask learners to score parameters such as the relevance of course content, ease of understanding of concepts, interactivity or engagement, etc., on a five- or 10-point scale) and/or a large amount of unstructured information through comments on various parameters in the questionnaire or survey forms.

The single biggest challenge facing learning leaders today is evaluating the effectiveness of their learning interventions.

To analyze this kind of unstructured data, there are two analytics tools that, when used in tandem, are extremely useful: text mining and sentiment analysis. Imagine a Python sentiment analyzer that takes input from the various feedback sources: the algorithm generates sentiments from the required or specified surveys, surfaces key themes and produces output that is continuously appended to the previous analysis for the same training program. Then, it generates dashboards with meaningful, insightful data and shows the progression of trends over time.

Consider the following example: Let’s say that a large global organization conducts a new manager induction program for people who have been newly promoted to managerial or supervisory roles from individual contributor roles. Feedback is gathered and run through the Python sentiment analyzer every time the training is conducted. From each session, the training team obtains instant insights on what worked well, what needs improvement, etc. Insights can be generated in numerous ways, such as by geography or by evaluation parameter. So, for example, course content might be working well across geographies. Logistics might be working well in one geography but might be a pain point in another. Learner engagement might be a pain point across the board.
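The article imagines a Python sentiment analyzer without naming one. As one possible realization, here is a minimal sketch using the open-source VADER model that ships with NLTK; the feedback records, regions and comments are made up for illustration:

```python
# Minimal sentiment sketch using NLTK's VADER analyzer (one possible tool;
# the feedback rows below are invented sample data, not real survey output).
from collections import defaultdict
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
sia = SentimentIntensityAnalyzer()

feedback = [
    {"region": "EMEA", "comment": "Great content, but the logistics were a mess."},
    {"region": "APAC", "comment": "Engaging trainer and very relevant examples."},
]

by_region = defaultdict(list)
for row in feedback:
    # "compound" ranges from -1 (most negative) to +1 (most positive)
    by_region[row["region"]].append(sia.polarity_scores(row["comment"])["compound"])

for region, scores in sorted(by_region.items()):
    print(region, round(sum(scores) / len(scores), 2))
```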

FIGURE 1: Applying analytics for evaluating training effectiveness, personalization and continuous improvement

REACTION
WHAT: Learners’ reaction to the learning intervention.
HOW: Feedback surveys or smile sheets, seeking feedback on training content, design and trainer.
ANALYTICS: Text mining and sentiment analysis; key themes generated via tools like a Python sentiment analyzer.
WHEN: After the intervention.

LEARNING
WHAT: The extent to which learners acquire the intended knowledge, skill and attitude from the learning intervention.
HOW: Pre- and post-assessments.
ANALYTICS: Assessment scores: scores, trends, anomalies; analytics powered by learning management systems.
WHEN: Before and after the intervention.

BEHAVIOR + RESULT
WHAT: The extent to which learners apply their learning on the job, and the impact on targeted business outcomes.
HOW: Performance assessments; supervisor, mentor and 360-degree feedback; performance on outcomes such as customer satisfaction, operational metrics and sales.
ANALYTICS: Artificial intelligence (AI) and predictive analytics; intelligent systems enabling integration of data from HRMS, LMS and other relevant systems, and AI algorithms driving predictive analytics.
WHEN: After the intervention, over defined time periods.



Text mining reads each sentence verbatim from the feedback forms and then breaks, chunks and organizes the information. Then, it surfaces key themes in visual formats (such as highlighting pain areas in red and successes in green). Now this is something L&D teams can work with! They now know what specific aspects are working and which are not, which geography/group/department is performing well and what others can imbibe from them, and a host of other meaningful insights. This helps L&D teams create continuous improvements over time.

LEARNING

LMSs today can generate a host of useful insights from learner-generated data. They can also generate individual and group proficiency dashboards by processing learners’ pre- and post-assessment scores or assigned target score levels and achieved score levels. Reports can also be generated to compare the scores of managers, departments, specific groups and/or locations, in addition to showcasing trending scores and the progression of scores over time.

Overall, artificial intelligence (AI) powered LMSs have the ability to scan the host of data and visually categorize scores, depict trends and flag anomalies. This empowers L&D teams much more than traditional LMSs that provide insights solely related to the amount of time learners spent on the course, individual learner scores and the most popular courses. Thus, L&D teams can accurately quantify the learning that has taken place in the organization as a result of L&D interventions.
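As a rough illustration of what such a dashboard computes under the hood, here is a short pandas sketch of pre-/post-score gains with a simple flag for weak results; the learners, departments, scores and threshold are invented:

```python
# Illustrative pre-/post-assessment analysis (pandas); sample data only.
import pandas as pd

df = pd.DataFrame({
    "learner": ["A", "B", "C", "D"],
    "department": ["Sales", "Sales", "Support", "Support"],
    "pre_score": [55, 60, 70, 40],
    "post_score": [80, 85, 72, 45],
})

df["gain"] = df["post_score"] - df["pre_score"]
print(df.groupby("department")["gain"].mean())  # proficiency trend by group

# Flag learners whose gain falls below an expected minimum (threshold is
# illustrative; a real dashboard might use a statistical cutoff instead).
print(df[df["gain"] < 10])
```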

BEHAVIOR AND RESULT

This is where AI and predictive analytics come in. Data on behavior and results over time are typically derived from outside the LMS, from systems such as HR management systems (HRMS), performance evaluations, customer surveys and/or 360 surveys. Today, even external data sources can be plugged into the learning system to provide a more accurate employee profile. Systems such as Adobe Captivate Prime have features that enable the automatic importing of user details from HRMS or other applications, such as Salesforce.com (SFDC), into the system. Thus, the system can recommend learning paths based on a combination of assessment scores and performance data. This is the essence of predictive learning.

Predictive analytics is embedded in several learning platforms today. Platforms like EdCast and several others apply AI and machine learning algorithms in order to provide corporate learners with the online training resources they require. The resources are pulled from the LMS as well as external sources, such as YouTube videos, TED Talks and learning libraries on the web. As a result, every member of your team can pursue personalized online training paths to bridge gaps and improve workplace performance.

Learners are given recommendations based on their performance according to key metrics, assessment scores, what courses they are viewing already, what courses others in similar roles are viewing, similar learning needs and/or similar learning interests. The result is wholesome learning integrated with performance and behavioral aspects. Training paths can be re-adjusted based on behavioral and performance changes, and the data is available to managers to inform evaluation feedback.
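To make the idea concrete, here is a deliberately simplified sketch of that LMS-plus-HRMS join. Real platforms rank content with machine-learned models; this toy version applies one hard-coded rule, and every employee, score, threshold and path name is an assumption:

```python
# Toy "predictive learning" pass: join LMS assessment data with HRMS
# performance data (both invented) and suggest a path where both are weak.
import pandas as pd

lms = pd.DataFrame({"employee": ["A", "B"], "coaching_score": [85, 52]})
hrms = pd.DataFrame({"employee": ["A", "B"], "kpi_attainment": [1.05, 0.78]})

profile = lms.merge(hrms, on="employee")
weak = profile[(profile["coaching_score"] < 70) & (profile["kpi_attainment"] < 0.9)]

for emp in weak["employee"]:
    # A production system would rank recommendations with ML; we hard-code one.
    print(f"{emp}: recommend the 'Coaching Fundamentals' learning path")
```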

HOW TO KICKSTART AND SUSTAIN LEVERAGING ANALYTICS FOR LEARNING

L&D professionals should not shy away from asking for support. They will need to work closely with big data and analytics experts to see where they are in terms of the quantity and quality of learning data, what improvements need to be made and the time and effort that will entail. It is the data that is key to showing results.

L&D professionals will also need to work with technology experts to understand what exists in the organization, to evaluate what features their LMS supports (assuming there is an LMS in place) and whether they are optimally utilizing the LMS; they should also evaluate whether a change in LMS implementation is required. Adopting an LMS that integrates with the organization’s HRMS and curates and aggregates content from external sources — while also employing AI and machine learning for personalized learning — is the way forward.

Ideally, this process should be followed up with an implementation roadmap and associated budgets, with relevant stakeholders enlisted for approvals. This would aid in ensuring the L&D team is committed to driving business results by understanding the value of training through effective training evaluation practices.

Having the right data will help drive continuous improvement and performance enhancement across the business.

BENEFITS OF LEVERAGING DATA MINING AND ANALYTICS IN L&D

Having the right data and accurate insight into learning effectiveness will help drive continuous improvement and performance enhancement across the business. Through these tangible contributions, L&D can establish itself not just as a cost center but as a business partner. Insights on skill levels, the alignment of employee skills with business needs, and the impact of learning on key organizational metrics enable more informed decision-making. As a result, L&D can better determine which courses need to be designed and delivered, and can make better decisions related to hiring, staffing and competency development. By leveraging data mining and analytics in L&D, business stakeholders can see how the learning organization is addressing and impacting key business imperatives, including operational efficiency, employee performance and, ultimately, business results.

Preeta Chandran is a digital transformation and L&D professional with over 18 years of experience. She is the CEO of eWandzDigital Services, a digital media company, and was formerly an operations leader with global conglomerate Genpact. Email Preeta.



LEVERAGING LEARNING ANALYTICS TO IMPROVE BUSINESS OUTCOMES: WHERE TO START

BY KEN TAYLOR

By leveraging learning data and analytics, organizations can better understand if their programs drive desired business outcomes and improved employee productivity. Learning analytics provide the opportunity and insights necessary to better meet learner expectations. Leveraging learning data properly will reveal learners’ preferences, including which modalities and design features they value. This data can also be used to uncover poor training programs and segments by evaluating whether the learning has been understood or even consumed by the learner. These discoveries allow learning leaders to prioritize learning that works and eliminate programs that waste the organization’s time and money.

We understand that leveraging learning analytics is often easier said than done. The potential and availability of data in organizations present ample opportunities and plenty of challenges. Some organizations may not have the resources to collect learning data, and organizations that do have the resources may struggle to determine whether they are collecting the right metrics. You may even have the necessary resources and the correct data but struggle to accurately interpret the results. Where do you go from there?


To gain a greater understanding of learning analytics, we asked a group of leading learning and development (L&D) companies and knowledgeable learning leaders to share their best practices and solutions for learning data collection and analysis. We hope the insights you gain from this report will provide actionable tips that will inform your organization’s learning analytics practice, as well as offer a comprehensive understanding of learning analytics’ role in your organization’s success.


WHAT ARE YOUR BUSINESS PARTNERS MEASURING?

Bonnie Beresford, Director of Performance and Learning Analytics, GP Strategies

Training departments will forever be challenged if they continue to try to show value by exclusively using their own data. Learning data provides insights and helps learning leaders manage the efficiency of the training department. However, the data is not sufficient to prove the value of training; you must reach outside the learning organization to credibly show value. You need to meet with business partners to discover what they measure and what KPIs are on their dashboards. Then align your learning solutions with these business KPIs during the design stage. Collaborating up front with business partners pays great dividends when it comes to measuring and showing value. Your partners will recognize your interest in their business, understand how your solutions are going to help, and get you the business data you need to show training's impact on the KPIs that really matter.

OPERATIONAL & PEOPLE METRICS JD Dillon, Chief Learning Strategist, Axonify

L&D must expand its definition of learning data to include a variety of operational and people metrics. Start with a clear, measurable business goal, and avoid the temptation to include a wide range of topics in your solution. Instead, focus on a specific result that can be measured using existing business data. Then, work with subject matter experts (SMEs) to identify the employee knowledge and behaviors required to achieve your goal. Look for existing data collection opportunities within the operation, especially related to behavior observation. Finally, determine the right-fit learning solution based on your identified knowledge, behavior and result targets. Ensure that your design, regardless of content modality, provides opportunities for you to measure changes in these elements before, during and after implementation. This process will allow you to determine your impact: the direct connection between your training and changes in business results.



SITUATIONAL-BASED LEARNING ASSESSMENTS

Dr. Ian Stewart, Executive Director Learning and Design, Kaplan Leadership and Professional Development

Imagine learning data that reveals:

• The current level of understanding and application across an organization.
• Those people or groups who are a training priority.
• How to personalize the learning provision.
• The effect the training has made.

Begin by harnessing the contextual richness of situational-based judgement assessments and presenting scenarios that simulate the real-world application of knowledge, replete with ecological factors such as ambiguity, time pressure and social friction.

Combine the measure of respondents' competence with the level of confidence in their responses. Although competence is a moderate indicator of behavior, competence allied to confidence makes for a much stronger indicator.

Delivered and managed online, situational judgement assessments can provide a data map of current practice, identifying respondents with low competence and high confidence who are a training priority. The function also distinguishes the respondent's strengths to inform personalized learning and offers a simple test-retest measure of training effectiveness.

PERFORMANCE MINDSET Sales Performance International

The sales training industry would benefit tremendously from a more definitive, evidence-based approach to understanding the return on capabilities development for sales. Sales organizations need to shift from a training-focused mindset (knowledge acquisition) to a performance mindset (outcome attainment). To begin, organizations need to translate their growth strategy into specific sales goals and metrics. Then data-driven models can deduce the most important competencies for attaining those goals, which also assumes the existence of well-defined sales competency models. With a capabilities model in place, organizations can apply a five-step process to drive continuous, measurable improvement. Through the use of well-defined models, integrated learning and development content, and data-driven technology, it is possible to efficiently collect data at each step of the process. Organizations can also apply integrated analytics to measure how sellers are progressing in their development, and how their changing capability levels relate to their attainment of specific outcomes.

| 28

GET ALIGNED

Tim Dickinson, Director of Strategy, Watershed

To effectively leverage data to improve business outcomes, the first and most important practice is having a shared understanding of how strategic business priorities align with their respective learning programs. Creating alignment with strategic priorities requires cross-functional discussions about selecting or defining priorities. It also begins the process of gaining stakeholder buy-in and provides the foundation for a chain of evidence to demonstrate the eventual impact of learning.



JUST GET GOING!

Andrew Joly, Director of Strategic Design, LEO Learning

For the past three years, we've asked L&D professionals to share their perspectives and progress on learning data and analytics in our annual measurement survey. Examining the top barriers to success provides interesting insights into the first steps toward accurate measurement analysis:

1. Get stakeholder buy-in | Early on in the process, engage senior stakeholders in benefits-focused conversations.
2. Build team capabilities | Buy, build or borrow data and analytics skills, and begin building those skills across your team.
3. Start small | A "narrow-but-deep" approach is a useful focus when creating your chain of evidence.
4. Understand what tools are needed | The key tool in this area is a learning record store (LRS). This will collect experience API (xAPI) statements from different learning events and connect them to business data from elsewhere in your business (a minimal example of a statement appears below).

Short and sharp. Our advice: just get going!
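For readers new to xAPI, a statement is essentially an actor-verb-object record sent to the LRS as JSON. The sketch below assembles one such statement in Python; the names, email and course ID are placeholders, and a production client would also authenticate against the LRS before posting:

```python
import json
from datetime import datetime, timezone

# A minimal xAPI "completed" statement; names, email and IDs are placeholders.
statement = {
    "actor": {"name": "Jane Doe", "mbox": "mailto:jane.doe@example.com"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://lms.example.com/courses/onboarding-101",
        "definition": {"name": {"en-US": "Onboarding 101"}},
    },
    "result": {"score": {"scaled": 0.85}, "success": True},
    "timestamp": datetime.now(timezone.utc).isoformat(),
}

# A client would POST this JSON to the LRS's /statements resource.
print(json.dumps(statement, indent=2))
```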

GRANULAR LEARNING DATA Christopher Cameron, Valamis

“Feel-good” statistics, like completion metrics, are dead. Training teams need to look at granular learning data, like learner behavior, in conjunction with business metrics to understand the true efficacy of their training. Data science allows companies to analyze huge swaths of learning data and key business metrics to see the behavioral influences on learning. If you can influence a desired behavior to meet a business goal and accurately measure the impact learning had on that outcome, management teams will be armed with the information necessary to make strategic decisions regarding training and their business.

MEASUREMENT FROM THE OUTSET

Edward Trolley, Senior Vice President of Managed Training Services, NIIT

The measurement of training value can be elusive, but it is important. Begin with establishing a measurement methodology. Next, have a measurement discussion at the outset, not at the end. It is not necessary to measure everything, but it is important to measure what matters (i.e., large projects, high-visibility projects, business issue driven projects, etc.). With the advent of analytics and integrated systems, we can tie the measurement of L&D to various metrics, like employee performance. The objective of measurement should be to consistently build confidence in your customers, so they know they will get an acceptable ROI. To reinforce this, develop a results contract at the beginning of customer discussions to define the scope of the work, the results expected, the metrics to determine the results, what needs to be done to ensure the results and any other external factors that could impact the results of training.


UNDERSTANDING FIRST Raytheon Professional Services

For L&D professionals, improving business outcomes comes down to improving the outcomes, or work output, of individual employees. This upfront analysis needs to take the whole performance system into account. It is not only about learning management system (LMS) data but rather about collecting and analyzing learning, workforce and performance measures to extract valuable insights that will guide the solution design. L&D professionals need to adopt a consultative approach combined with an analytical mindset:

• Ask the right questions with a good dose of critical thinking in order to unveil the real issues.
• Collect and analyze new types of data that will highlight the most difficult and most frequent tasks that have the greatest impact on business outcomes.

Let's call it "Understanding First." This approach not only provides insights to inform the learning strategy but also establishes the baseline measures to prove the value of our solutions.



ON-THE-JOB PERFORMANCE DATA Jim Kirkpatrick and Wendy Kayser Kirkpatrick, Kirkpatrick Partners

Most training organizations gather data about the training itself, such as measurements of participant reaction and learning. While this type of data will not directly improve business outcomes, training organizations frequently report this data and attempt to claim some credit for the outcomes. However, these stories have no credibility because numerous factors influence organizational results. Fewer organizations gather data useful to the business: that which relates to on-the-job performance after training and to changes in the key company results the training was designed to improve. The single most important thing training organizations can do is refocus their resources on supporting and measuring what happens on the job after training. This is where training value is created, and it is the area that is most often overlooked. On-the-job performance data connects training and organizational outcomes and creates the story of value required to sustain training investments.

APPLY DATA SCIENCE

Stephen Young, Ph.D., Manager of Leadership Analytics, Center for Creative Leadership

In 2018, organizations outsourced $34.3 billion in leadership development initiatives, yet only 10% of global CEOs reported that those initiatives had a clear business impact. Is your organization one of the 10% that can demonstrate impact or the 90% that can't? If you're in the 90%, you need to apply data science to the data you already have. Data only becomes an asset when it's leveraged to inform decisions. Most organizations capture data about their leaders' behaviors and employee engagement but rarely tie that data to business outcomes. The solution: Use predictive analytics to tie your leader data to tangible business metrics. The resulting models can help you prioritize the leadership investments most likely to bring improvement in critical business metrics. Then, once you've made your training investment, you can track improvement both in behaviors and in business outcomes to demonstrate return on investment (ROI). Predictive analytics helps remove risk from decision-making, so you can make strategic bets on your leadership development investments.
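A minimal sketch of what "tying leader data to business metrics" can look like in practice, assuming you already have behavior ratings and a business metric in tabular form. The data and variable names are invented, and a real analysis would require far more observations plus validation before informing investment decisions:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Invented data: rows are business units; columns are average leader ratings
# for coaching, communication and delegation (e.g., from 360 surveys).
X = np.array([
    [3.2, 4.1, 2.8],
    [4.5, 3.9, 3.5],
    [2.9, 3.0, 4.2],
    [4.8, 4.4, 3.1],
    [3.6, 3.3, 2.9],
])
y = np.array([71, 88, 64, 93, 70])  # unit-level business metric (invented)

model = LinearRegression().fit(X, y)

# Coefficients estimate how much the metric moves per rating point, i.e.,
# which behavior a development investment is most likely to pay off on.
for name, coef in zip(["coaching", "communication", "delegation"], model.coef_):
    print(f"{name:13} {coef:+.1f} per rating point")
```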


ENGAGEMENT, EXPERIENCE, EFFECT Richardson

Analytics are more than a way to verify outcomes at the end of training. Effective measurement guides the training process as it unfolds. Participants discover where they need to focus their attention and where their blind spots reside. Measurement is for the learner as much as it is for the leadership. Despite these benefits, many organizations fail to generate and measure data. Often, the data sits in disparate systems, making it difficult to connect the data to business impacts. Moreover, the available data represents different levels of accuracy, forcing the user to qualify the analytics first. Even after meeting these challenges, it is difficult to know what to measure. Simplify the process by segmenting measurement into three key parts:

1. Engagement: How are learners progressing through training?
2. Experience: How memorable is training, and how excited are learners to participate?
3. Effect: How has training improved business outcomes and skill adoption?

MORE THAN TECHNOLOGY ALONE Explorance

The top-performing corporate training organizations recognize that learning analytics require more than technology alone; it's a transformation process rooted in human cooperation and geared toward business objectives. L&D leaders first align with strategic business goals. Then, establishing a common language is crucial when agreeing upon leading indicators that describe how training results lead to improved business outcomes. For a proven and accelerated process, L&D professionals derive KPIs from a validated learning impact model. The right technology optimizes the transformation by automating data collection and reporting processes, allowing personnel to focus on analysis and actions. Internal and external benchmarks enhance the credibility of measures and shape performance expectations. These benchmarks further reinforce business executives' confidence that L&D programs function adequately and maximize impact. Finally, this process results in building practical data literacy skills at all levels in the learning organization, so all stakeholder groups can effectively leverage role-based reports, dashboards and data exploration tools to monitor progress and transparently recognize the value of training programs.

VALID & RELIABLE ASSESSMENTS

Kristin Bernor, Marketing Manager, Questionmark

With the growing demand to leverage learning data to improve business outcomes, training organizations can develop better measurement practices and prove the value of training by implementing valid and reliable assessments. Assessments are a valuable tool before, during and after training that define learning success. Organizations can gain a better understanding of the effectiveness of the learning and make the appropriate adjustments to meet goals and objectives. Key drivers for measuring learning include improving the effectiveness of learning and deliberately linking learning programs with organizational performance, individual performance and employee engagement. By utilizing assessments and analyzing those data points, organizations can prove that learning impacts these areas. Observational assessments can also be used to check on-the-job performance. Valid and reliable assessments result in meaningful measurement outcomes that prove the value of training.

TURN AWAY FROM EVENT-DRIVEN METRICS

Sam Shriver, Executive Vice President, The Center for Leadership Studies

Old habits die hard, and learning professionals have long been identified as event-driven creatures. As such, much of our collective design and development energy remains focused on the experience and the Level 1 reactions that experience produces. Developing better measurement practices begins with putting those habits in the rearview mirror. The primary stewards of your organization simply do not care what learners think or feel about company-sponsored training experiences. They seek evidence of change, tethered to training, that positively impacts the results they have responsibility to deliver. The implications of that reality suggest that the target of your measurement practices should be pretty much anybody but the trainees themselves. What are those trainees doing differently that is attributable to the experience, and what impact is it having on the outcomes key executives have a vested interest in achieving?

Ken Taylor is president and editor in chief of Training Industry, Inc. Email Ken.






WAS IT WORTH IT? MEASURING THE IMPACT AND ROI OF LEADERSHIP TRAINING

BY PAUL LEONE, PH.D.

Taking leaders of any level away from their everyday roles is a costly proposition. If business leaders and stakeholders are truly going to be convinced and buy in to leadership training, they're going to want tangible evidence of bottom-line business impact. So, as training evaluators, we always have to ask the question: "Do the revenue-generating or cost-saving benefits of the leader training outweigh the costs of the entire training experience?" In other words, is the leadership training really worth it?

Another question training evaluators should be able to answer for their stakeholders is, “How much does it benefit the business to send participants through the training sooner rather than later?” For instance, what’s the benefit of training new leaders in the first three to six months of their new role, as opposed to postponing or putting off the training? Many leaders and their managers think they can’t afford to leave the business for one or two days to go through training. As an evaluator, I wanted to conduct a case study to prove that leaders can’t afford not to go through training. The following case study is based on a two-day leadership training program for new leaders that I evaluated at Verizon.

What We Measured

To provide evidence of the impact of training, we implemented a comprehensive evaluation strategy that measured success at every level and milestone of the trainee's experience. That is, were they engaged and satisfied with the experience? Did they learn anything new? Did they apply the training and improve any key leadership behaviors back on the job? Did those behaviors impact the business? Which performance metrics did their teams and direct reports improve? What were those improvements worth to the organization in dollar value? Did the dollar benefits outweigh the costs of training, and what helped or hurt the transfer of learning back to the job? Why would some participants get a bigger impact and return on investment (ROI) than others?

This approach is based on the five-level Phillips/Kirkpatrick model of training evaluation, with a sixth level added to understand what transfer climate factors were at work to improve impact and ROI.

• Level 1: Did they like it? Were participants engaged with content, facilitators and overall delivery of the training experience?
• Level 2: Did they learn anything? What new knowledge and skills did participants gain from the training?
• Level 3: Are they doing anything differently or better? What improvements have they had in key behaviors back on the job after attending the training?
• Level 4: Are they impacting the business? Are these improved behaviors increasing their key performance metrics that can increase revenue or create cost-saving efficiencies?
• Level 5: Was it worth the investment? Did the benefits of training outweigh its costs?
• Level 6: What maximizes training impact? What factors back on the job can help or hinder the application and transfer of learning? This is powerful data to show stakeholders because it describes how successive rollouts can be optimized and underscores the importance of things like manager endorsement of the training and manager support back on the job.



Using this model (see Figure 1), we can essentially tell the extent of the training's impact — from learners' participation in the program to business results to ROI — and how to improve the training impact in the future. Additionally, by creating this comprehensive, connected and cohesive picture of impact, we would be able to show how valuable it was (in quantifiable terms) for business leaders and stakeholders to send their new leaders through the training early in their role, as opposed to later, which typically happens when the everyday demands of the business seem to take priority.


Develop leaders who inspire people to perform at a higher level and increase organizational productivity and profits.

The Training

The leadership training was delivered to over 2,000 first-time managers to help them transition from individual contributors to people leaders, and to help them ramp up their leadership skills as early as possible in their new roles. These skills include coaching their frontline employees, aligning their teams to organizational priorities and recognizing the right behaviors. These managers could be in their role anywhere from just a few months to more than a year before they attended the training, which is why we wanted to prove the value of early attendance. Ultimately, the objective is to develop leaders who inspire people to perform at a higher level and, therefore, increase organizational productivity and profits.

How We Measured Impact

Level 1: A survey was administered to participants immediately after training, asking participants to rate various factors like instructor effectiveness, course content and relevance to their job.

Level 2: We added several learning questions to the same post-training survey. Here, we asked participants whether they gained new knowledge and skills that were relevant for their roles.

Level 3: We used multi-rater feedback. First, we administered a survey three months post-training that asked participants to rate the improvement (if any) they observed in very specific leadership behaviors. We then gave these same core questions to both the participants' managers and direct reports, to compare and corroborate that these leadership behaviors did, in fact, improve over the post-training period.

Level 4: To define and isolate the impact training had on the business, we used a control group design. That is, we compared the performance improvements of a trained group of leaders who had been through the training (within their first three to six months as a new leader) and a twin sample, or control group, of leaders who had not yet attended the training (also within their first three to six months in their role). For each group, we tracked their performance three months before the training and three months after training to see if there was any improvement. The control group of untrained leaders was tracked during the same timeframe, and all other factors were controlled for (e.g., tenure, location, team size).

Level 5: To measure ROI, we turned the performance benefits we found at Level 4 into a dollar value (only the metrics we could monetize), and then compared them to the fully loaded cost of training.

Level 6: To understand which factors impacted ROI, we added several questions to the original survey asking participants about their immediate work environments, such as, "How supportive was your immediate manager when you tried to apply all your learning?" Answers to these questions were then analyzed and correlated to the amount of impact the training had on their behavioral improvements and performance.

The Results

Level 1: Satisfaction with training: The average rating was 4.73 out of 5.00 stars.

Level 2: Learning new knowledge and skills: 97% of participants gained new and valuable knowledge or skills for their job.

Level 3: Behavioral change on the job: 94% of participants showed "some" to "exceptional" improvement in the seven key leadership behaviors included in the multi-rater survey. There was consistent corroboration from both direct reports and managers that these improvements had indeed taken place.

Figure 1: Level 1-6 Training Measurement Model (Level 1: Like It? Level 2: Learn Anything? Level 3: Change Behavior? Level 4: Impact the Business? Level 5: Return on Investment? Level 6: Climates that Maximize?)


Level 4: Business impact: Results within the retail business showed the overall increase in performance for the trained group of managers was significantly higher than the performance increase of the control group. Across the five metrics we tracked, there was a 2.1% average improvement for the trained group above and beyond the control group. This became the tangible "benefit" of training to the business.

New leaders were adding more value to the business the sooner they went through the training.

Figure 2: What does it cost for each employee to put off new leader training by...?

...3 months: $2,138
...6 months: $4,276
...12 months: $8,552


Level 5: ROI:

Benefits of training: Once we partnered with the finance team to ascertain what each metric was worth to the business, we were able to identify and monetize an impressive benefit of $2,138 per participant.

Costs of training: To calculate the cost of training, we used a fully loaded cost calculator to account for every expense incurred as a result of training. These included development costs, attendance costs, participant travel and time away from the job, materials, instructors, etc. The total fully loaded cost for each participant was $1,668.

Once both benefits and costs were calculated, we then plugged these dollar values into the ROI formula. We were able to confidently and conservatively say that the business had an ROI of 29% within just the first three months post-training and a final annualized ROI of 415%. This means that not only did the business recoup the fully loaded costs of training for the participants in the study, but it also made $4.15 for every $1 spent.
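The arithmetic behind those Level 5 figures is worth making explicit. The short sketch below reproduces it from the rounded dollar amounts published here; the small differences from the stated 29% and 415% are presumably rounding in the unrounded source data:

```python
benefit_3mo = 2138  # monetized benefit per participant, first three months
cost = 1668         # fully loaded cost per participant

roi_3mo = (benefit_3mo - cost) / cost
print(f"Three-month ROI: {roi_3mo:.0%}")        # ~28%, reported as 29%

benefit_annual = benefit_3mo * 4                # benefit sustained over a year
roi_annual = (benefit_annual - cost) / cost
print(f"Annualized ROI: {roi_annual:.0%}")      # ~413%, reported as 415%
print(f"Net return per $1 spent: ${roi_annual:.2f}")  # the "$4.15 per $1"
```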

Level 6: Factors that maximized training impact: The top three factors that were most effective in maximizing the impact and ROI of the training were:

• Having an immediate manager who discussed the training with them and encouraged them to apply new skills
• Being given the opportunity and extra time to have coaching conversations with each of their direct reports
• Quickly identifying and addressing the resistors to change within their teams

Conclusion

All too often when L&D offers training to new leaders, many employees will procrastinate or postpone attendance, rationalizing that they can't afford to leave their roles to participate in training or that they need to stay focused on their metrics. What we wanted to prove with this case study was that leaders can't afford not to attend leadership training and that, the sooner they go, the sooner they can add even more value to their organization's bottom line. This is exactly what our case study proved: New leaders were adding more value to the business the sooner they went through the training. In fact, by using this measurement approach and looking at participants' business benefits three, six and 12 months post-training, we were able to show exactly how much performance advantage was created by attending the training and, at the same time, how much could be lost by not attending the training.

By duplicating the same rigorous trained-versus-control methodologies and designs other disciplined researchers might use to test whether a medicine or drug "works," we sought to prove how much training "worked" on improving job performance. Medical researchers would call these the "experimental" group (the ones who get the new medicine) and the "control" or "placebo" group (the ones who don't get the new medicine). Further, we were able to see how much would be gained or lost by putting off taking the "medicine." Results told us that if managers put off training for three months, they would lose $2,138 in improved performance. If they put off going through training for six months, they would lose $4,276, and if they put off training for a full year after they assume leadership roles, they could be losing $8,552 in extra performance. See Figure 2 for a summary of lost performance due to a lack of training among new leaders.

At first glance, this might not seem like much for a company that generates significant annual revenue — but it's a very significant finding. When you consider how many employees are stepping into new leadership roles across the enterprise and how many employees will be influenced every day by these leaders, the numbers add up. Add to this the fact that this significant benefit comes at the relatively small cost of $1,668 per participant, and it creates a compelling, and even irrefutable, case for all leaders to take their prescribed training. To put it another way, for the benefit of their organizations, leaders should hurry up and take their medicine.

Dr. Paul Leone is an author, expert ROI consultant at Verizon and thought leader in the industry. Email Paul.



HOW TO MAKE THE TRANSITION TO A DATA-DRIVEN LEARNING CULTURE

BY JON GREEN

Data here, data there, data, data everywhere. In the words of Geoffrey Moore, author of "Crossing the Chasm," "Without big data analytics, companies are blind and deaf, wandering out onto the web like deer on a freeway."


Across industries, businesses are shifting to data-driven structures, using key business insights to produce faster and more customized products and services. The learning and development (L&D) industry is no different and, these days, it is often a place where data transformation begins — especially as more and more employees look for personalized training programs.

HISTORICALLY, L&D PROGRAMS HAVE BEEN BORN OUT OF NECESSITY, NOT DATA.

Historically, L&D programs have been born out of necessity, not data. Consider the case of World War I: Men were being called overseas to serve, which meant that women were keeping businesses running in jobs where they had no prior experience. Industries had to develop training programs to help these women succeed on the job. As technology and resources evolved, organizations began tailoring L&D programs to their own companies versus an entire industry and, with the onset of data, companies could offer the ultimate individualized approach to skills training by looking at the unique needs of the individual and customer.

Adopting a data-driven culture is a must for organizations looking to stand out from the competition and attract and retain employees. By generating insights from various data sources that can include anything from customer satisfaction, waste reduction and churn to time-to-market, engagement and adoption rates, L&D professionals will evolve their programs to better benefit employees and fuel the bottom line. However, with all the data available today (2.5 quintillion bytes of data are generated worldwide every day), where does one begin? Follow the initiatives below to commence your successful data-driven L&D program and pave your way toward a data-driven learning culture:

INVESTIGATE THE EXISTING DATA POOL

Not all data are created equal, and recognizing that is the first step toward success. In many organizations, the data from which L&D insights might be extracted has historically been difficult to access or to structure in a meaningful way. Instead of defining and curating useful data, many organizations would over-supply their employees with content and training materials based on last-minute requests from business leaders or on broad, industry-focused best practice programs. Before implementing a data-driven L&D program, take a step back to understand what types of information flow through your organization and how to best utilize them in your program.

CULTIVATE THE RIGHT TEAM OF DATA-DRIVEN LEADERS

Once you've organized and assessed your data, it's time to turn to your most important asset: your people. Organizations must choose the right people to lead their transformation to a data-driven culture. While hiring data scientists might seem to be the obvious answer, it's not. With data analysis technology becoming more available and user-friendly, human data skills will become less important in the future. Instead, you should work to identify subject matter experts who possess the inherent context, personality and curiosity (fueled by learning) necessary to take on the challenge. These individuals can come from various backgrounds as long as they are detail-oriented and conscientious and can draw the right insights from the data in order to create meaningful stories for the business and, as a result, help drive transformational, data-driven change.

Once these leaders are in place, there needs to be open communication between them and the rest of the organization. For data-driven projects to be successful, it's critical that the L&D team communicates with other company stakeholders from the outset to align on processes and expected outcomes. This can be challenging due to business pressures and priorities and, in some cases, misunderstandings about the role and goals of the L&D function. However, with proper communication, you can ensure everyone is working toward the same goal: establishing a data-driven learning culture.

LOOK TO EXISTING DATA-DRIVEN PROCESSES AS THE GUIDELINE FOR SUCCESS

As companies embark on their data-driven L&D journey, they should look to existing departments that practice strong data-driven processes for inspiration. For example, marketing and operations teams are usually built on a data-driven mindset. Marketing relies on data to gain insights into customers' buying patterns and brand experiences. Operations leverages data in every move it makes to optimize people, tech and product functions across the organization. L&D can leverage these processes and strengths to achieve success.

ADOPT THE TECHNOLOGY AND PROCESSES NECESSARY FOR SUCCESS

A data-driven culture can only truly flourish with the right technology in place. Data-driven leaders should consider the benefits of next-gen technologies such as augmented reality (AR) to provide richer and more cost-effective options for on-the-job training, experiential coaching and collaboration for employees. Additionally, the 2018 Training Industry Trends Report found that artificial intelligence (AI) and machine learning are being used to help companies better understand employee learning behavior. By looking at past and real-time behavior along with the adoption of intelligent technology, organizations can predict specific training needs and gain insight into the recommended content, and learning methods, for individual employees. Ultimately, the powerful partnership of data and new technology will help improve learning proficiency across your organization.

NOT ALL DATA ARE CREATED EQUAL, AND RECOGNIZING THAT IS THE FIRST STEP TOWARD SUCCESS.

IDENTIFY KPIs THAT ARE RIGHT FOR THE ORGANIZATION

If the agreed-upon goal for both L&D and stakeholders is creating a better customer experience, data can help organizations by verifying what is working throughout the entire customer journey. When it comes to setting key performance indicators (KPIs) and goals, approaches vary by organization. An organization's goals can range broadly, from waste reduction in a shipping company to increased customer satisfaction for a consumer product. Before embarking on data analysis, the company should clearly articulate the problem that needs to be solved on an enterprise-wide level.

Indicators such as Net Promoter Score (NPS) can provide a good benchmark for organizations to understand how customers feel about their products and services. If the score is low in one particular area, companies know they need to change something; if it's high, they can feel comfortable continuing their current strategies. Once a clear objective is in place, L&D leaders can truly understand the success of their program and the adjustments it needs.

For an industry with remote or deskless workers, such as restaurants, the data and KPIs that are tracked can measure process efficiencies. This includes factors such as customer wait time, or even the number of cups left at the end of the day compared to the number of soft drink sales. Any significant deltas captured in these metrics can tell an organization a great deal about the effectiveness, or ineffectiveness, of specific learning interventions — if measured properly. The lessons learned from these efforts can be applied more broadly and may very well affect multiple facets of the business.
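For reference, NPS is derived from 0-10 "how likely are you to recommend us?" responses: the percentage of promoters (9-10) minus the percentage of detractors (0-6). A quick sketch, with invented survey responses:

```python
def nps(scores):
    """Net Promoter Score from 0-10 survey responses."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Invented responses gathered before and after a training intervention.
before = [10, 9, 8, 7, 6, 5, 9, 3, 8, 6]
after = [10, 9, 9, 8, 7, 9, 10, 6, 8, 9]
print(nps(before), "->", nps(after))  # -10.0 -> 50.0
```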

LOOKING AHEAD

As businesses look to stay relevant and competitive, L&D professionals are central to the conversation and need to incorporate data-driven training to support company goals. By developing a learning culture that is built on data analytics, organizations will see improved employee satisfaction and output. In the coming years, data-driven L&D leaders will play an even more important role in creating organizational success. Therefore, they should consider how to implement goal-based learning and strategies, including leveraging next-gen technologies, in order to create a data-driven learning culture — and, as a result, lasting success.

Jon Green is the operations manager at CGS. Email Jon.



Artificial Intelligence (AI) is here to stay, and it will inevitably impact many HR processes — but how? The following is an example of an interview between a recruiter and a job applicant:

RECRUITER: Ms. Grey, we've received your information for the job we are offering from someone named Hal. We were very impressed by how your experience seems to meet our needs.

APPLICANT: Well, Hal is always very smart when it comes to matching my credentials with a new gig.

RECRUITER: Is he your agent — like, a sports agent?

APPLICANT: You could say that! Actually, Hal identifies my next opportunities to learn by searching for future gigs, and he allows me to face new professional challenges that will develop my skills.

RECRUITER: Ok, so Hal essentially handles your career map?

APPLICANT: Yes, and Hal also enables me to learn in the workplace. Hal curates relevant content and identifies the appropriate "learning buddies" to meet my challenges when they arise. For instance, I wasn't fully proficient in the agile project management skills you requested for the job. So, Hal suggested that, beyond the microlearning program he has already designed for me, I could meet with Tom Miller, who already works in your team. Tom has already agreed to train me on agile project management and, in exchange, I'll train him on user interface (UI) design. It's a fair deal!

RECRUITER: Wow, so you haven't even started working for us yet, and you're already in touch with Tom?

APPLICANT: Yes. I feel like I'm already part of your team! Hal coaches me to make sure I make the most of these resources and learning buddies.

RECRUITER: So, do you and Hal meet regularly?



APPLICANT: Well, we plan some feedback loops together, and Hal creates assignments that add clarity to my learning path.

RECRUITER: Assignments? That sounds like you're going back to school, doesn't it?

APPLICANT: I feel I never left! My job is changing so quickly that I need to continue learning, and extra assignments allow Hal to track my progress and to publish authenticated micro-credentials via blockchain once I achieve them.

INDIVIDUALS ARE BECOMING MORE SELF-ACCOUNTABLE REGARDING THEIR SKILLS DEVELOPMENT, BUT THEY STILL NEED SUPPORT.

RECRUITER: That's what we thought was so great about your resume. It's more than simply having a diploma — we can actually track what you've achieved. That's impressive. Obviously, we want you for the job, and we'd also love to meet Hal. I think he could be a great addition to our learning and development team.

APPLICANT: Hmm… you do know that Hal is an AI-powered assistant, right?

This scenario isn't just something that would come up in a Black Mirror episode — it's a representation of the reality in which we will soon live and work.

CONTINUOUS UPSKILLING IS THE NEW NORMAL

Gig economy, swarms, squads… whatever you call it, a growing part of organizations will eventually rely on M-shaped persons (those who have a wide breadth of knowledge on various topics but shallower knowledge where appropriate) and specialists. For these individuals, continuous upskilling will be the new normal. Individuals are becoming more self-accountable regarding their skills development (similar to how they are regarding their health or their online reputation). Still, self-accountable doesn't mean alone or without support. However, as they no longer consider their employers the only ones responsible for developing many of the key skills needed for the future or for managing everyone's lifelong learning activities, there's a gap to be filled.

ENTER THE AI-POWERED LEARNING COMPANIONS

In the near future, AI-powered learning companions may very well assist learners in four ways:

1. They may curate and analyze professional opportunities according to each individual's current and future preferences and capabilities. Their goal will be to find "future gigs" that allow the learner to tackle new professional challenges and learn new skills.
2. They may also curate the appropriate content (sourcing the right learning objects from available learning repositories) and find the right learning buddies to meet those challenges. They may even have the ability to push the right content or promote human interaction at the right pace and best moment to learn.
3. They could practice coaching and provide feedback through interactive functions (e.g., natural language processing, video analysis, chatbots) to monitor the learner's progress, stimulate them and keep them engaged in the program.
4. They could track evidence of skill acquisition over time and publish micro-credentials authenticated via blockchain.

In other words, these assistants will manage a continuous workflow of pursuing the right gig, curating the best content and buddies, building adaptive learning experiences, coaching learners to keep them on track, and publishing evidence that will enable the learner to embark on a new and stimulating gig!

WHEN WILL IT BECOME A REALITY?

As the ecosystem of the learning companion is already expanding, many of the conditions for an AI assistant-powered workforce are already being met. Consider the following:

• Swarms and squads are gaining more and more interest as they help organizations become more agile.
• Employees who are not looking for a new job are enrolling on job platforms and providing data related to their jobs and skills — which helps AI thrive.
• Learning marketplaces are able to provide curated content in ASALAF (Any Skill/Any Language/Any Format).
• Badges and certificates now recognize formal and informal learning (Open Badges, xAPI).

THE FUTURE OF AI ASSISTANTS

In the very near future, personal skills AI assistants will be like agents for learners. They will manage a continuous workflow of chasing the right gig, building an adaptive learning journey, coaching learners to keep them on track and publishing evidence that will enable them to chase a new, stimulating gig with confidence.

Many challenges will arise as learners become more and more dependent on an agent that they have fed with a host of their own personal data, and that has a huge impact on their paycheck. Corporate learning will need to focus on specific business objectives to demonstrate its value, as individual learning needs will already be fulfilled.



The following technological bricks are forming the foundation for the AI-powered learning companion's success:

• Personal assistants (such as Amazon's Alexa or Google Home) with voice-activated interactions and web curation.
• Automated recruitment and job opportunity curation (through robotic AI assistants like "Mya" and "Tengai") featuring profile fit analysis, robot interviewers and behavior analysis.
• Adaptive learning platforms (like Knewton and Area9) supporting personalized learner experience (LX) management.
• Automated personality and skills assessment.
• Intelligent learning objects that coach you on various skills, such as public speaking.
• Intelligent curation tools like "Filtered," which curates content you're actually interested in.
• Open badges providers like Mozilla combined with blockchain technology, which offers a secure model for the collection and sharing of skills indicators including academic records, badges and certificates, citations and letters of recommendation.

FOCUS ON UPSKILLING AND RESKILLING THE PEOPLE WHO AREN'T TECH-SAVVY ENOUGH TO USE A LEARNING COMPANION.

DOES THIS MEAN THAT L&D WILL BE DISRUPTED?

Learners will likely become more emancipated in the near future. Modern learners often follow the mantra, "I learn what is required to become who I wish to be." This is where the sports agent analogy comes into play: Sports agents want their players to reach their full potential, just as AI-powered assistants want their learners to reach their own potential. The film "Love & Mercy" focuses on the Beach Boys' co-founder and leader Brian Wilson's struggles with mental illness. In the movie, his therapist turns into an overbearing agent showing that, well, agents can easily become greedy. In the years to come, we may face a new kind of singularity: not the one where robots become self-aware, but the one where you've given so much personal data to your personal assistant that it becomes the master, and you become the tool.

Regarding corporate learning, one of the major side effects of technology is that intermediaries disappear. So, corporate training will need to refocus on the following specific objectives to demonstrate its value:

• Develop industry-specific learning objects that will feed learning companions with relevant content for individual companies.
• Become consultants in corporate transformation. As individuals will eventually be handled by their AI skills assistants, there will still be a challenge of changing teams, organizations and company cultures in order to adapt to their digital ecosystems.
• Grow a sustainable learning culture by fostering mentoring and providing opportunities for on-the-job learning.
• Promote the employer brand. Measure and showcase the fact that in this company, people develop their skills and obtain new credentials — which has a positive impact on recruitment costs and retention rates.
• Focus on upskilling and reskilling the people who aren't tech-savvy enough to use a learning companion.

Ultimately, in order to tackle the numerous challenges outlined above, corporate learning leaders may very well need their own personal skills AI assistants.

François Debois is the head of innovation for the Cegos Group, developing digitalbased solutions that generate engagement and change the way people work. Simon Vuillaume is the director of international projects for the Cegos Group, where he works with corporate L&D managers and other Cegos team members worldwide. Email François and Simon.




MACHINES ARE THE FUTURE OF TRAINING: HOW DATA-DRIVEN FEEDBACK IS FOSTERING IMPROVEMENT & ENHANCING HUMAN RELATIONSHIPS

BY NOAH ZANDAN

TECHNOLOGY CAN ACCURATELY ASSESS THE PERFORMANCE CAPABILITIES OF PEOPLE SIGNIFICANTLY BETTER THAN HUMANS.

In today’s tight job market, human resources departments are seeking effective, cost-efficient ways to retain top talent, and study after study has shown that today’s employees are looking for organizations committed to championing their professional development. In fact, recent Gallup research has found that nearly 60% of millennials cite learning and development (L&D) opportunities as critical factors in their job searches, and Deloitte has found that 70% of young employees planning to leave their jobs cite a lack of development opportunities as a key reason for their departure.

If that's the case, it follows that businesses would be clamoring to design and implement leadership development programs to both improve their employees' skills and entice their talent to stay. Currently, it seems they do see the value in L&D. LinkedIn's 2017 Workplace Learning Report shows that 90% of leaders consider L&D critical to closing the skills gap. However, the follow-through isn't there: The same report shows that only 8% of leaders see the impact of L&D on the organization.

So, why the disparity? L&D programs are traditionally a risky investment, primarily because they fall into one of two categories:

• Personalized leadership or skills coaching that's prohibitively expensive for more than a very few select executives.
• Out-of-the-box workshops or computer-based trainings that are meant to be one-size-fits-all programs but, in reality, are far too generic to create any lasting change.

The programs in either category offer very little immediate (or long-term) evidence of return on investment (ROI), making it difficult for HR departments to discern whether their investment is making a difference in the organization and its people. There must be another option. There must be a way to offer detailed, personalized feedback to large groups of employees without busting the budget — and there must be a way to track those employees' progress to demonstrate the business value of a quality L&D program.

THE ANSWER: AI-DRIVEN LEARNING AND DEVELOPMENT

In recent years, we've been able to harness the power of machine learning and other forms of artificial intelligence (AI) to optimize a number of workplace systems and processes, from recruiting and hiring to billing to customer service and beyond. What's exciting is that we can now do the same with L&D initiatives. Advances in behavioral science give us the power to design programs that provide data-driven feedback and lasting improvement at scale, giving savvy organizations a true competitive advantage in today's corporate landscape by empowering them to provide employees with the learning opportunities they crave.

These innovations are grounded in one key idea: that technology can accurately assess the performance capabilities of people significantly better than we can. This empowers L&D teams to take the human coaches and workshop facilitators out of the equation, with programs designed to measure users' proficiency in a given area and offer adaptive, personalized learning at scale.



Innovation over the past eight years in technologies like natural language processing, vocal recognition and facial analysis means that machines can assess, as accurately as a human coach, a professional's predicted impact on their audience, their team and/or their clients, and then respond with specific feedback, insights and actionable recommendations for improvement tailored to each user's unique strengths and development areas. This is one example of innovation driving the capability for improved communication skills.

AUTOMATING THE LEARNING PROCESS THROUGH DATA-SCIENCE AND AI-DRIVEN PROGRAMS LEADS TO A MORE PERSONALIZED LEARNING EXPERIENCE.

Imagine if you could take your new salesperson and use behavioral analytics to assess their five-minute pitch against the top 10% of salespeople in your organization, then deliver them a personalized development plan. If you do this for all your salespeople and continue to measure and tweak the feedback, then the system only gets smarter and faster.

Of course, given the reputation of outdated computer-based learning programs, the idea of taking the human experts out of the equation does tend to raise eyebrows. However, we know from research and our own experience with coaching, both the "old" way and through analytics, that automating the learning process through these data-science and AI-driven programs leads to a more personalized learning experience — and more measurable results — than traditional, subjective coaching methods or content-only approaches.

3 WAYS MACHINE-DRIVEN LEARNING OUTPERFORMS TRADITIONAL COACHING

IT'S OBJECTIVE. Professionals who rely on data and research to make every major business decision are more apt to accept data than opinion, no matter how experienced the coach. After all, for somebody with a mind for facts and statistics, "You didn't seem very confident" sounds less compelling than, "Your latest presentation scored 30% less confident than average, because you used more tentative language than usual, including these five words..." The former feedback is subjective — maybe the audience felt differently — but the latter is concrete, data-driven and paired with an insight that can help drive improvement in the future.

Additionally, with data, learners can track their progress meticulously, watching their scores increase rather than relying on subjective comparisons from peers and leaders. It's one thing to hear a coach tell you you're getting better, but it's another altogether to look at a chart showing measurable progress in distinct areas and highlighting areas that need improvement. Finally, objective scores can be modeled against performance outcomes, so you can demonstrate your efforts' true ROI — in areas like performance, engagement, retention, selection and more.
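As an illustration of how a "tentative language" score might be derived, here is a deliberately simplified sketch; real products use far richer linguistic models, and the hedge-word list and baseline rate below are invented:

```python
import re

# A toy lexicon of tentative language; real systems use far larger models.
HEDGES = {"maybe", "perhaps", "possibly", "somewhat", "hopefully", "might"}
BASELINE_RATE = 1.2  # assumed average hedges per 100 words for this speaker

def tentativeness(transcript):
    words = re.findall(r"[a-z']+", transcript.lower())
    hits = [w for w in words if w in HEDGES]
    return 100 * len(hits) / len(words), hits

rate, hits = tentativeness(
    "Maybe we could possibly revisit the forecast next quarter, because "
    "the pipeline numbers might be somewhat low compared to last year."
)
delta = (rate - BASELINE_RATE) / BASELINE_RATE
print(f"hedges per 100 words: {rate:.1f} ({delta:+.0%} vs. baseline)")
print("flagged words:", hits)
```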

IT’S IMPERSONAL. Traditionally, giving feedback — in any area, from leadership to job performance to public speaking — has been a highly personal and subjective activity. No matter how skilled the coach or constructive the feedback, learners tend to take evaluations personally, dwelling on the negatives and defending their performance rather than focusing on the recommended solutions.

When it’s data rather than a human delivering the tough love, however, it creates a safer space in which learners can accept feedback. No longer is the learner hearing the coach’s perspective (which too often sounds like a judgment) but, instead, is an seeing objective analysis free of emotional impact. This means that, when trainers and coaches do get involved, they aren’t the “enemy.” Instead, they’re allies using the data to build a healthy coaching relationship on the foundation of improvement and progress.

IT’S SCALABLE. Elite training has historically been too expensive (easily adding up to

WHAT HAPPENS TO LEARNING WHEN WE TAKE HUMANS OUT OF THE PICTURE?

Given the reputation of old-school, computer-based L&D programs, the idea of world-class coaching without human involvement raises eyebrows. However, there are three distinct advantages to letting the data do the work: 1. It’s objective. Savvy leaders make their biggest decisions based on data, research, and statistics — not opinions. When the feedback is based in that concrete data and evidence they value for decision-making, it’s easier to accept than subjective opinions. 2. It’s impersonal. While feedback from a coach can feel like personal judgment no matter how positive the relationship, no emotions are involved when the data does the talking. With data-driven feedback, learners can let down their defenses and focus on improvement. 3. It’s scalable. It would take too much time and money to send a coach to every team member, but automated, data-driven programs can reach every employee — on schedule and under budget.


thousands of dollars per day) for organizations to provide to anyone but C-suite executives. Alternatively, with machine-driven learning, powerful improvement opportunities exist within an arm’s reach for every member of an organization. Individual data and insights support improvement for dozens, hundreds, or even thousands of participants, and aggregate, group-

level data can give team leaders or HR executives a holistic look at the group’s performance as a whole, helping them identify areas in which the entire team could benefit from a little extra support. In today’s business world, the best talent is eager to continuously learn and improve. That is, after all, one of the characteristics that makes them

TAKEAWAYS As of now, there is a huge gap between employees’ desire for ongoing L&D opportunities and their employers’ ability and willingness to provide those opportunities. • 90% of leaders consider L&D as critical in closing the skills gap, but just 8% of leaders see the impact of L&D efforts on their organizations (LinkedIn).

• 70% of millennials looking to leave their jobs cite a lack of development as a key reason for doing so (Deloitte).

While traditional L&D strategies were either too costly or too generic to be effective, organizations today can leverage the power of AI and machine learning to create automated L&D programs that deliver individualized, expert guidance without the exorbitant cost. Further, because they are grounded in data, they deliver true performance ROI.

such strong employees and team members. By withholding opportunities for development, companies are doing themselves a double disservice: They’re driving away their top talent, and they’re also stunting their own growth by preventing employees from learning new skills that could take their productivity, engagement and creativity to the next level. While, once upon a time, there was an excuse for refraining from making significant investments in training, today’s technology allows companies of all sizes to provide world-class, personalized development programs to every team member, tracking measurable ROI, improving employee engagement and retention, and gaining an edge over the competition — all without breaking the bank. Noah Zandan is the CEO and co-founder of Quantified Communications, a firm  that combines data and behavioral analytics to help organizations measure and strengthen the way their leaders and employees communicate and perform. Email Noah.



When it comes to revolutionizing your safety training… Think big. Go small.

Microlearning from DuPont Sustainable Solutions. Topics include: Eye Safety, Slips & Trips, Heat Stress Safety, Strains & Sprains, Lockout/Tagout, Back Safety, GHS, Device Distractions, Ladders and Arc Flash.

Make a big impact on your employees’ safety awareness with video-rich microlearning courses that provide just the information employees need to know, right when they need it… in five minutes or less. Choose from hundreds of e-learning courses and videos covering key topics such as first aid, safe lifting techniques, fire extinguishers and handwashing. Call 800-861-7668 or visit www.training.dupont.com/microlearning to learn more.

LEARNING & DEVELOPMENT


IMPROVING INSTRUCTOR IMPACT ON LEARNING WITH ANALYTICS

BY ERIC A. SURFACE, Ph.D., AND REANNA P. HARMAN, Ph.D.

EACH OF US CAN RECALL AN INSTRUCTOR WHO MADE LEARNING ENGAGING, RELEVANT AND IMPACTFUL, INSPIRING US TO APPLY WHAT WE LEARNED. UNFORTUNATELY, EACH OF US CAN ALSO RECALL AN INSTRUCTOR WHO FAILED IN ONE OR MORE OF THESE AREAS.

Instructors are force multipliers, reaching hundreds, if not thousands, of learners and impacting both their learning experience and their motivation to transfer. So, how can we improve instructor impact on learning? Learning and development (L&D) professionals use metrics and analytics to demonstrate program effectiveness and to make program management and improvement decisions. This approach can also be applied to manage, improve and develop instructors. Instructor-focused formative evaluations and analytics are typically neglected, even though they can help improve instruction and, as a result, learning outcomes. The following example demonstrates how formative instructor evaluations and analytics can improve instruction and learning.

HOW MUCH DO INSTRUCTORS MATTER?

Over a decade ago, a client project presented an opportunity to explore how much instructors matter in the learning process. We frame this case study using the following two questions, posed in the article "Two Fundamental Questions L&D Stakeholders Should Answer to Improve Learning," to explore a problem and guide evaluation and analytics:

• How well did I do?
• How can I do better?

Context: In 2005, we investigated a gap between desired and actual learner skill proficiency in a job-required foreign language training course, which lasted 18-24 weeks and was the last phase in the training pipeline for U.S. Army Special Forces (SF). Note that selecting and training each candidate was a six-figure investment. Failing to achieve the proficiency standard for graduation meant a candidate was dropped from or recycled through the training pipeline, creating not only a monetary loss but also the loss or delayed deployment of a soldier with job-focused skills. Achieving a 100% graduation rate was critical, as this was during the height of operations in Iraq and Afghanistan.

Questions: Program leaders asked themselves, "Is our training program meeting its proficiency and graduation objectives and producing the capabilities needed by operational units?" After evaluating its effectiveness, they determined it was not. Then, they asked, "How do we improve learning, graduation rates and program effectiveness?"



Approach: We helped program leaders answer the second question. Almost no data existed on diagnostic factors shown by research to impact learning. We decided on two strategies — analyzing archival learning outcome data and collecting survey data focused on diagnostic factors from future learners and instructors. We had information about the training’s objectives, structure, stakeholders and context as well as learners’ class assignments and end-of-course (EOC) proficiency scores. This allowed us to determine how much individual and class-level characteristics impacted proficiency scores. The nested structure of learning events provides the opportunity to explore sources of influence on outcomes, even in the absence of direct data on diagnostic factors associated with a level of analysis (e.g., class). For our client, each learner was nested within a class and each class within an event. Learners and classes were also nested within instructors, as each instructor taught multiple classes.

Results: Our analyses provided evidence that instructors contributed strongly to learners’ success in developing proficiency. For example, instructors accounted for 42% of differences (i.e., variance) in learner reading proficiency scores.
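For readers who want to reproduce this kind of variance partitioning on their own data, a minimal random-intercept mixed model does the job. This is an illustrative sketch, not the authors' analysis; the file and column names are assumed:

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("eoc_scores.csv")  # hypothetical: one row per learner

# Random intercept for instructor; learners are nested within instructors.
result = smf.mixedlm("reading_score ~ 1", data=df, groups=df["instructor"]).fit()

between = float(result.cov_re.iloc[0, 0])  # between-instructor variance
within = result.scale                      # residual (within-instructor) variance
icc = between / (between + within)
print(f"Share of score variance attributable to instructors: {icc:.0%}")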

SO, INSTRUCTORS IMPACT LEARNING. NOW WHAT? Identifying instructors as a lever to improve learning outcomes gave us a diagnostic factor to focus on … but now what? No specific instructor data existed to guide the creation of diagnostic survey items or of interventions to improve instruction. We determined what factor needed improving, but we still had to determine how to improve it.

How Do Instructors Impact Learning? Instructors impact learning directly through their decisions and actions in preparing and delivering training content and by interacting with learners. We defined and measured instructor performance — the decisions, actions or behaviors under instructors' control related to their roles and objectives in the learning enterprise — not instructor characteristics.

Will Instructors Differ on Performance? Since instructors were content subject matter experts (SMEs) with varying degrees of instructional experience, it was reasonable to assume there would be performance variability. Without instructor variability, this approach does not work.

INSTRUCTORS ARE IMPACT MULTIPLIERS THROUGH THEIR INFLUENCE ON LEARNERS AND NEED INSIGHTS, TOOLS AND SUPPORT TO MAXIMIZE THEIR IMPACT.

Defining Instructor Performance: We reviewed research to identify instructional behaviors empirically linked to learner outcomes that could be rated by learners, instructors and/or supervisors. We identified behaviors that fit into four performance domains:

• Learner Engagement
• Classroom Management
• Responsiveness to Learners
• Adapting to Learner Needs

Over the years, we identified additional performance domains, but these four remained relevant for instructor-led training (ILT). Training context and content, instructor effectiveness measure(s), instructional philosophy, and learner and instructor populations all impact which performance domains are relevant.

Measuring Instructor Performance: We also developed and validated instructor performance metrics, which assessed key behaviors in the four domains. Then, we collected data multiple times during and at the end of the training for two complete cycles. The metrics performed as designed, with excellent construct validity and reliability.

Does Instructor Performance Impact Learning? Performance ratings collected throughout the course, starting at the 25% course-completion mark, significantly correlated with EOC outcomes (a minimal sketch of this kind of check follows). When we retrospectively compared the performance ratings of instructors who had high- and low-proficiency classes, instructors who taught high-proficiency classes had higher ratings on all items, across all time points; higher-performing instructors had higher-performing learners. With such robust findings, we developed and piloted a feedback intervention. We distributed a feedback report to instructors with results from the 25% collection, offered guidance on interpreting the results and suggested improvement resources. When we had data from four training cohorts (two with feedback, two without), we compared instructors who received feedback to those who did not. Instructors who received feedback improved their subsequent performance ratings, and their learners had higher EOC assessment scores.
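The referenced correlation check can be as simple as the following sketch (column names hypothetical; not the authors' code):

import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("instructor_ratings.csv")  # hypothetical: one row per class

# Ratings gathered at the 25% completion mark vs. class-average EOC proficiency.
r, p = pearsonr(df["rating_at_25pct"], df["eoc_class_mean"])
print(f"r = {r:.2f} (p = {p:.3f})")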

Intervention: We implemented a formative evaluation and feedback program to deliver results and provide tools for reflection and improvement/development planning. The reports provided comparisons to help instructors determine if they needed to improve. Instructors used the report to guide conversations about development with supervisors. Supervisors used the reports to identify instructors for observation and coaching. The reports later transitioned to web-based dashboards.


ARE INSTRUCTORS STILL RELEVANT?

With so much focus on asynchronous, technology-delivered learning, it is understandable to question whether instructors and instructor-led training (ILT) are still relevant. The short answer is yes!

Approximately 67% of formal learning hours available in 2017 were instructor-led (53% traditional, 9% virtual and 5% non-online remote classroom), according to ATD research. Training Industry research concurs, finding that, on average, companies deliver 64% of their training portfolios via ILT (39%) or virtual ILT (VILT; 25%). Other recent research ("What Learners Want: Strategies for Training Delivery") found that 63% and 28% of learners, respectively, participated in at least one ILT course and at least one VILT course in the past 12 months.

Training Industry research also found that, over the next 12 months, 21% and 31% of companies plan to increase their use of ILT and VILT, respectively, while only 10% and 8% plan to decrease their use. Thus, we see a place for ILT and VILT in training portfolios and a role for instructors into the foreseeable future.

Now, we successfully answered the question, “How do we improve learning, graduation rates, and program effectiveness?” and provided a mechanism to use formative evaluation, analytics and feedback to drive improvement. Over time, instructor performance and effectiveness increased, and variability in instructor performance decreased. Thus, the program’s effectiveness increased, producing more capability.

INSTRUCTOR-FOCUSED FORMATIVE EVALUATIONS AND ANALYTICS ARE TYPICALLY NEGLECTED.

ARE YOU READY TO TRY THIS APPROACH?

Formative evaluation focused on levers, such as instructor performance, can drive continuous improvement and optimize the learning process and its outcomes. Every L&D program is different, so tailor the process as needed and let your findings guide its implementation. Before you get started, however, it is important to do the prep work:

• Ask if the training program is meeting its objectives. Asking questions about effectiveness allows stakeholders to identify gaps between actual and desired outcomes linked to their roles and objectives. Prioritize outcomes desired by multiple stakeholders. If there are no gaps, stop. If stakeholders are satisfied with current performance, stop.
• Ask if there is opportunity for improvement. Then, determine if improvement is possible given the context, stakeholders' cooperation and the outcome's measurement. If not, stop.
• Develop questions related to improvement, such as "How can I impact the focal outcome?" or "What factors drive the focal outcome?" Training effectiveness research and models identify factors that typically influence learning outcomes. Statistical techniques can identify sources of influence on outcome measures to narrow the candidates. Instructor performance is just one potential factor. Select factors to investigate that are easily measured.
• Develop and pilot metrics for the selected factors, choosing the most appropriate data sources, measurement methods and collection times to test the impact on the focal outcome. Determine if the metrics function as designed, meeting both validity and reliability standards (one simple reliability check is sketched after this list). If not, repeat until they do.
• Collect and analyze data on these metrics along with learning outcomes. Determine if there is a relationship between the factor(s) and learning outcome(s). If not, stop.
• Determine if the factor is suitable to be used in an intervention. Is the factor actionable? Does the factor's measurement occur before the focal outcome's measurement? Is there time for a change in the factor to impact the outcome? Determine if the evidence supports use of the factor in an intervention. If not, stop.
• Develop and implement an analytics intervention to improve the relevant factors and associated outcomes. Evaluate and adjust over time.
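As one example of the reliability check mentioned above, Cronbach's alpha can be computed directly from piloted ratings. This is a generic sketch with toy data, not the authors' validation procedure:

import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array; rows = respondents, columns = items on one scale."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

ratings = np.array([[4, 5, 4], [3, 3, 2], [5, 5, 5], [2, 3, 2]])  # toy data
print(f"alpha = {cronbach_alpha(ratings):.2f}")  # >= 0.7 is a common threshold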

FINAL THOUGHTS

Our case demonstrates the "two questions" approach to driving evaluation, analytics and feedback practice. Specifically, it provides an example of how instructor performance was identified as a key lever impacting learning, and how instructor performance measurement, analytics and feedback were used to improve instruction and its impact on learning outcomes. Instructors are impact multipliers through their influence on learners and need insights, tools and support to maximize their impact. Analytics help supervisors have timely performance conversations, coach instructors and provide support based on data and insights. Ultimately, analytics and development tools provide instructors agency over their professional and career development. Timely, analytics-based feedback empowers instructors to adjust their practice in process, sharpen their craft and create more value for themselves, their learners and their employers.

Dr. Eric Surface is CEO and Dr. Reanna Harman is VP for Practice at ALPS Insights. They have 35 years of combined L&D and consulting experience. ALPS Insights provides L&D evaluation, analytics and insights through its software platform, ALPS Ibex™, as well as consulting and analytics services. Email Eric and Reanna.



CASEBOOK

A YEAR IN THE LIFE: EVOLVING L&D METRICS FROM A REPORTING MANAGED SERVICE TO BUSINESS INSIGHT PARTNER BY ERIKA ROBERTSON

In 2017, my sales enablement organization tapped me, a long-time performance consultant, to lead and form a team whose focus would be learning measurement and analytics. While many organizations claim they focus on measuring learning, few truly dedicate a team to support it. At that time, we had a two-member reporting function within the enablement operations team that provided reports on course completions via Excel. We relied on those individuals, and on the enablement leads across departments, to use filters and pivot tables for further analysis. Prior to our team's creation, the enablement operations team created non-standardized questionnaires to distribute to course and event participants, without leveraging any measurement approach. Then, the same two individuals reported the results. Our leaders recognized a need.

GETTING STARTED

Tasked not only with establishing the team's initial focus and objective but with structuring the team as well, I sought guidance from mentors and set my mind on building a collaborative and innovative team. Within a few days, we had a team name – learning measurement, analytics and seller experience – and a general focus. Our team – our two business analysts, a program manager for our seller experience program, a strategic initiatives lead, an outgoing contractor who supported our dashboard creation and myself – gathered in person for a few days, taking advantage of a globally dispersed team actually being able to meet at our headquarters. This made the beginning steps doable. First, we needed to define our team's mission statement. What verbiage would be our focus and reference for our direction? While each of us relied upon technology to work and connect virtually, we appreciated and leveraged our face-to-face time. We discussed ideas, shared thoughts and wordsmithed. The mission statement itself didn't take long, as we shared this common vision:

To provide actionable insight on enablement's impact toward achievement of business outcomes aligned to company priorities.

Through measurement and analysis, we could provide clear insights to arm our performance consultants with, as well as the data behind the stories they tell. We knew that would take dedication and change management.

MEASUREMENT APPROACH

To my good fortune, every person in that room had a background in the Phillips Levels of Evaluation, and most had achieved certification. The true foundation of this team would be found in its measurement levels (see Figure 1).

INTERNAL FOCUS

After our face-to-face meeting, the real work began. How we functioned as a new team was crucial to our success. Our basic procedures and principles included:

• Team meetings – At first weekly and later transitioning to every two weeks, these were our cornerstone event for bringing our geographically dispersed team together.
• Communication – We never wanted unilateral decisions made on how to improve or remove processes, tools and approaches to measurement and analytics. Each team member provided crucial input in these areas.
• Collaboration – As processes and tools needed to evolve, we joined forces to implement that change. Each team member led, and each team member supported, multiple areas as we moved forward.

There were many additional, valuable words left on that whiteboard. Words like strategy, passion, zeal, optimistic, connect and balance. These words were ours, not for external communications but to remind ourselves what we brought to this new team. Although not mentioned specifically in our mission or principles, we all agreed that we did not want to be considered a managed service. Our team's focus was to guide the metrics maturity of our organization.


FIGURE 1: WHAT IS MEASURED? GFPR measurements align to the Phillips Levels of Evaluation:

• ROI – What were program benefits compared to costs?
• Business Impact – What business measures were affected?
• Application – Is on-the-job behavior changed?
• Learning – Were knowledge and skills learned?
• Reaction/Satisfaction – Was content relevant and valuable?
• Participation/Compliance – Did the target audience complete?

EXTERNAL FOCUS

Over the next year, we balanced completion and compliance metrics while maturing the reaction and satisfaction, application, and business impact measurements – knowing that in the future we could stand by return on investment (ROI) measurements. Change management served as a key aspect of our external focus. Renaming the formerly "reporting" team as the measurement and analytics team was significant. To our enablement organization, this change clarified our intention to be a partner in the business. To other industry professionals, it indicated the level of seriousness our enablement organization brought to the table. On each conference call we attended, we no longer used "reporting." Our team scheduled lunch-and-learns and attended other teams' calls to represent clearly who we were, our mission and how to engage our support.

SUCCESSES

We knew that as an enablement organization and as a team, we were moving toward deeper, more insightful learning measurement. Each small victory led to the larger victories chronicled below.

Implementation of measurement levels across the enablement organization: The implementation of the Phillips measurement methodology was, and remains, vital. The pyramid of measurement levels became the symbol for the intended journey both internally and externally. In conversations with field leaders, we communicate our intention of providing value to and partnering with the business.

Introduction of measurable objectives: Fundamental to our efforts, and helped in great part by team leaders and partner centers of excellence, was the demand for more detailed objectives when planning for a course, quarterly plans and yearly goals. Our team provided examples and worked with teams to turn needs into clear, measurable objectives. This alone helped our other enablement teams be clear on expectations as we planned, as well as during quarterly business reviews where we detailed what was accomplished.

Standardization of surveys: Although surveys had been developed, distributed and reported on previously, there was no uniformity. Creating survey templates was among our first tasks as a team. Both our team and enablement operations owned creation in the survey platform. We identified key metrics for measurement so that, regardless of the target audience, course or event, we had standard metrics across the board (a scoring sketch for the first metric follows the list):

• Net promoter score (NPS)
• Confidence change
• Relevancy to role
• Anticipated business impact

Progress from reporting managed service to consultative business partner: Our team's efforts included layering the beginning measurement level of completion and compliance with survey data, and then guiding the enablement leads toward identifying the behaviors that would indicate application had occurred. In many cases, we were able to connect those to business changes expected by field leaders. In one instance, we leveraged data related to the value-selling workshops held to enable a methodology and tool for our sales force. We combined the available workshop completions with survey metrics, monitored tool usage for an application metric across teams and worked with an extended team on average deal size to show impact. With our corporate business insight team, we partnered to move the measurement of this program from Level 1 to Level 4. With the business impact data, our team progressed through the levels of measurement to go beyond reporting. We, along with the enablement leads who represented various participant groups, could analyze these results for changes over time and achievement of targets set, and correlate the enablement to the business through these results. These accomplishments represent our position as a true partner to our enablement leads as they seek to satisfy the demands of the business and their field leaders. While the team now consists of additional members and the enablement organization changes constantly, we continue in the right direction, with our methodology grounding our focus to measure and analyze – not only report.

Erika Robertson has been in the tech space for 12 years, focusing on sales force enablement. Achieving the Certified ROI Professional (CRP) designation from ROI Institute allowed her to refine her focus from instructional design to measurement and analytics. Email Erika.



GLOBAL OUTLOOK

LOCALIZING E-LEARNING PROGRAMS FOR AN INTERNATIONAL AUDIENCE BY RALPH MICHAEL STROZZA

Thanks to the internet and evolving technologies, education has never been more accessible. To ensure that language is not a barrier when delivering education to global audiences, however, e-learning localization requires strategy and expertise. The aim of e-learning localization is to give a course the look and feel of having been created specifically for a particular market, regardless of language, culture or location. This article serves as a high-level blueprint for e-learning developers who are considering localizing their programs into one or more languages.

E-LEARNING IS HERE TO STAY, GLOBALLY

Due to the numerous advantages e-learning-based education provides to organizations, it is not only going to remain an effective delivery platform for training; it is projected to show significant growth in the coming years. The "E-Learning-Global Market Outlook" report by Orbis Research estimates that the global market for technology-based training will grow at a compound annual growth rate of 9.5%, from $176 billion in 2017 to over $398 billion by 2026. While e-learning revenue levels and growth estimates vary, most data point to a continued upward trend on a global scale.

In 2011, the worldwide market for self-paced e-learning was $35.6 billion. At that time, the annual growth rate was projected to be 7.6%, reaching $51.5 billion by 2016. Although 7.6% was the estimated average growth rate, different regions around the world experienced much higher growth rates, with Asia and Eastern Europe experiencing close to 17% growth and Africa and Latin America estimated close to 15%. Just five years later, in 2016, market size rose to $46.67 billion, according to Docebo.

What is fueling this growth? It can be attributed to a combination of several factors:

• Course design and implementation speed
• Easy accessibility
• Increased effectiveness through animated learning
• Escalation in the number of internet users
• Growing access to broadband
• Savings to organizations in terms of capital expenditures
• Reduction in operational expenses

MAKING COURSES AVAILABLE IN MULTIPLE LANGUAGES IS CRITICAL TO REACHING A WIDER AUDIENCE.

Given the growth rates and revenues cited previously, it appears to be universally accepted that self-paced e-learning is not going away anytime soon. Additionally, while English is currently the primary language of e-learning course development, making courses available in multiple languages is critical to reaching a wider audience and achieving overall program success.

TOOL DESIGNS FACILITATE LOCALIZATION

In the early days of Flash-based e-learning, localizing courses was a time-consuming, manual process and constituted one of the major roadblocks to multilingual e-learning course availability. Thanks to the latest development tools available today, e-learning course localization is now relatively straightforward. Popular, state-of-the-art tools, such as Adobe Captivate, incorporate XML file export and import for a simplified and streamlined localization process. For example, Captivate's functionality includes the export of captions to a text or an XML Localization Interchange File Format (XLIFF) file, as well as the subsequent import of the translated file into a copy of the original project. Tools such as Articulate Storyline support right-to-left languages such as Arabic and Hebrew, as well as double-byte character sets for languages such as Chinese (simplified and traditional), Japanese and Korean. Today's technology eliminates many of the barriers organizations used to face when localizing their e-learning programs into multiple languages, making localization more feasible for those who may previously have been apprehensive about localizing their courses.
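As an illustration, the sketch below lists caption strings still awaiting translation in an XLIFF 1.2 export of the kind such tools produce. The file name is hypothetical:

import xml.etree.ElementTree as ET

NS = {"x": "urn:oasis:names:tc:xliff:document:1.2"}
tree = ET.parse("captions_export.xlf")  # hypothetical Captivate-style export

for unit in tree.getroot().iterfind(".//x:trans-unit", NS):
    source = unit.find("x:source", NS)
    target = unit.find("x:target", NS)
    if target is None or not (target.text or "").strip():
        print(f"Needs translation [{unit.get('id')}]: {source.text}")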


E-LEARNING COMPONENTS REQUIRING LOCALIZATION

Organizations differ in their approach to e-learning course design and to the components they include therein. Let's examine some of the more common features of e-learning programs.

1. Content and narrative translation and formatting. The program content will include all on-screen text (OST), which will need to be translated and formatted. For example, text that is translated into languages that expand, such as French, German and Spanish, will need to be reformatted so that it appears properly on a screen originally designed for English text. This can present quite a challenge with more complex screens and is best done by an experienced multilingual desktop publishing (DTP) specialist. The content and presenter notes typically make up the course narrative and will include descriptions of what is being discussed, as well as instructions and tips for the user. The narrative may take the form of true voice narration or may be in the form of closed captions. Whether using voice narration or subtitles, the timing of the localized application will need to be adjusted to match the voice or subtitles with the on-screen text. It wouldn't make sense for the narrator to be speaking about one item on the screen when another is being highlighted or referenced. The cadence of each language will dictate how long a screen appears and must account for the amount of time it takes to narrate the translated text, or for someone to read the translated subtitle, before a new slide appears.

2. Graphics. On-screen graphics may contain text that needs to be translated. When text is embedded in the graphic, the text will need to be reproduced in an external file and translated, after which the original graphic can be localized. Localizing can be defined as replacing the source-language text with the target-language text and formatting it so that it appears correctly. As this step is frequently done using a graphics editing application, it is most commonly performed by a DTP specialist rather than a translator.

3. Mouse cursor movement captures. Due to the instructional nature of most e-learning programs, there are usually cursor movements pointing to items and actions that the narrator is explaining. These cursor movements need to be reflected in the localized program, and their timing needs to be synced with the accompanying narration.

4. Navigation and action buttons. Depending on how they are designed, e-learning programs will often incorporate navigational or action-related buttons containing translatable text. Examples include buttons such as PREV (previous), NEXT, SUBMIT and CONTINUE. As with embedded graphics text, this content is either manually extracted, translated and reinserted by a DTP specialist, or may already be translated in the tool. In the latter case, selecting your target language will prompt the system to display the translated navigation.

5. Subtitles and voice-over recording. Development budgets may control whether subtitles or actual voice narration is used when designing a course. Subtitling is far less expensive and time-consuming than providing a narrator. The same is true of localization; it costs far less and requires less time to translate subtitles than to record the translated narration using live voice talent. Studio time, directors and professional voice talent all add up to what can be a costly proposition. The end result of a voice-over, however, is a much more professional-looking, and sounding, product. Most importantly, the participant retains more of what is being presented, since he or she is not forced to read subtitles rather than focus on the lesson at hand.

TODAY’S TECHNOLOGY ELIMINATES MANY OF THE BARRIERS ORGANIZATIONS USED TO FACE WHEN LOCALIZING THEIR E-LEARNING PROGRAMS INTO MULTIPLE LANGUAGES.

CONCLUSION

Delivering training via e-learning platforms is a proven approach and is projected to increase year after year. The introduction of social, mobile, analytic and cloud (known as SMAC) technologies has facilitated the adoption of e-learning solutions. Organizations spend significant resources designing, developing and testing e-learning programs in order to effectively educate clients on products or services and employees on internally focused topics. By only providing education in a single language, you are excluding those who do not understand that language and those who do not understand it well enough to learn in it. Making your e-learning and other computer-based training curricula available in multiple languages will increase the number of people who can benefit from this educational tool. Additionally, comprehension and retention levels will be dramatically improved as people learn in a language they can understand and work in.

Ralph Michael Strozza serves as the chief executive officer of Interpro Translation Solutions, which he founded in 1995. He began his career in the localization industry in 1982 and has staffed and managed corporate translation teams in North America and Europe. Email Ralph.



ADVANCE YOUR CAREER IN TWO DAYS OR LESS. Training Industry is committed to empowering learning leaders. Our continuing professional development programs are designed specifically to meet the needs of the training manager and to drive the business of learning.

EXPLORE HOW TO EFFECTIVELY MANAGE LEARNING TECHNOLOGIES.

Learn how to select, purchase, implement, maintain and retire learning technologies to improve the business impact of your training organization. September 10-11 & November 7-8, 2019

MANAGING LEARNING TECHNOLOGY CERTIFICATE

CREATE ALIGNMENT BETWEEN TRAINING AND ORGANIZATIONAL GOALS. Gain the tools to build a strategic roadmap for your organization. August 27 & October 18, 2019

STRATEGIC PLANNING MASTER CLASS

VIEW THE CALENDAR


WHAT’S NEXT IN TECH STELLA LEE, PH.D.

LEARNING ANALYTICS IS MORE THAN DATA COLLECTION

The benefits of learning analytics are manifold. Learners can get real-time information on their progress and learning gaps; instructional designers and facilitators can be better informed regarding how learning activities are received by individuals and groups of learners in order to adjust learning programs accordingly. Additionally, managers are able to glean insights on how learning potentially supports and translates to on-the-job performance. The underlying assumption of learning analytics is that we will use the data collected to gain insights on the learning activities and learner behaviors, interpret the data, and provide interventions and predictions. Typically, much of the data comes from the learning management system (LMS), but also through informal learning channels, face-to-face session attendance records and evaluations, access to other content repositories, and even mobile phone usage.

LEARNING PROFESSIONALS NEED TO START WITH THE END IN MIND.

However, when it comes to implementing learning analytics, we are faced with many data problems. The following factors outline a few of the problems surrounding learning analytics and potential solutions:

POOR DATA

The main problem with the use of data in learning: First, we need to have the data.

Many learning and development (L&D) departments still rely on paper-based records, some don't get past collecting smile sheets for training evaluations and some are not in the habit of collecting any learning data at all. Often, data are out of date, stored across different databases and formats, and generally quite messy. It is hard enough just getting the data; we need to take the first step of collecting data in a format that is structured, cohesive and standardized. There are tools, and data science professionals, to help you with data cleansing using statistical software (a minimal first-pass cleaning sketch follows).
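A minimal first pass at that kind of cleansing might look like the following sketch (file and column names hypothetical):

import pandas as pd

df = pd.read_csv("training_records.csv")  # hypothetical messy export

# Standardize column names, coerce dates, and drop duplicates and unusable rows.
df.columns = df.columns.str.strip().str.lower().str.replace(" ", "_")
df["completed_on"] = pd.to_datetime(df["completed_on"], errors="coerce")
df = df.drop_duplicates(subset=["learner_id", "course_id"])
df = df.dropna(subset=["learner_id"])

df.to_csv("training_records_clean.csv", index=False)  # one structured store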

DIFFERENT TYPES OF DATA

We need to get past the Shareable Content Object Reference Model (SCORM) in terms of the types of data to collect. SCORM measures learning activity completion, but completion doesn't prove that learning has taken place. To identify whether the desired learning outcomes are being achieved, learning professionals need to start with the end in mind when designing learning activities and programs. One solution is to design formative self-knowledge checks throughout an online course to spot, in real time, whether learners have grasped the concepts or whether additional supports are needed.

I HAVE DATA AND I NEED INSIGHTS

There is a fundamental flaw in approaching learning analytics this way. We should be asking: What learning problem and, ultimately, performance problem am I trying to solve that data can help provide insights into? For example, say you are trying to solve the problem of customer complaints about the poor service provided by your call center staff. You might hypothesize that, by providing a customer service training simulation experience for the staff, the number of complaints from customers will go down by a certain percentage. Does the data reflect that? If not, you must figure out the root cause of what is not working.

DATA IS NOT KNOWLEDGE

Knowledge involves a practical understanding of subjects, an acquisition of skills and a demonstration of competencies. Data collected from various learning sources that haven't undergone thorough processing are meaningless. While it is important to collect diverse sets of data, it is from the analysis of the data that we can infer how learners learn and the conditions needed for successful learning. Once you've gained these insights, the question becomes: How do you convert insights into actions?

All in all, learning analytics represent a real opportunity for learning professionals to provide evidence-based learning interventions, enhance learning experiences and foster better performance support. Understanding and applying data strategically will go a long way toward the successful implementation of learning analytics at your workplace.

Dr. Stella Lee has over 20 years of experience in consulting, planning, designing, implementing and measuring learning initiatives. Today, her focus is on large-scale learning projects, including LMS evaluation and implementation, learning analytics and artificial intelligence applications in learning. Email Stella.



CAN THE CERTIFIED PROFESSIONAL IN TRAINING MANAGEMENT (CPTM™) CREDENTIAL SUPPORT YOUR PROFESSIONAL DEVELOPMENT AND CAREER GOALS?

Take this quiz to see if the CPTM credential is the missing link to achieving your future goals.

TAKE THE QUIZ


SECRETS OF SOURCING
DOUG HARWARD

USING BUSINESS DATA TO DRIVE TRAINING OBJECTIVES

Measuring and communicating the value of training has long been one of the most important challenges for a training professional. I often think of it as the holy grail of the profession. It's a topic that many analysts and consultants speak to often. But with all the rhetoric and attention given to the subject, the problem still seems as big as ever. Is our organization creating value for the business? How do we measure the ROI of training? Is the training creating behavior change? Why does the problem persist? I believe the problem is two-fold: We struggle with understanding what we need to measure and why we need to measure it. Training analytics often focus on understanding whether the student learned what we wanted them to learn. While this is important, we often lose focus on measuring whether the learner's behavior changed post-training, and how the learner's behavior impacted the business. When Dr. Kirkpatrick introduced the four levels of evaluation, he helped us better understand this, clarifying the relationship between measuring the learner's satisfaction with training (Level 1) and whether the training positively impacted the business (Level 4). Measuring the impact of training has often been thought of in terms of ROI. Unfortunately, training professionals have struggled to understand the financial value gained from training. We may know how much the training cost to develop, but measuring value based on financial impact is an extremely difficult question to answer – especially if that training is not aligned to a specific need of the business. If we really want to measure the impact of training, we need to think of analytics from the perspective of the business. Understanding the learner's behavior and how that behavior impacts the business is much more important in determining the future needs of training. And I believe this is at the root of understanding how to provide value. As training professionals, we must work with business leadership to understand what is happening in their organizations, such as quality issues, communication problems or delays in deliverables. This data is the source of where training can have the greatest impact on the business.

WE NEED TO THINK OF ANALYTICS FROM THE PERSPECTIVE OF THE BUSINESS.

Strategically aligning training to these kinds of problems can provide L&D with a direct line to how we create value and how we communicate our value, based on improvements to the business after the training is completed. If you can say that training is offered specifically to solve a problem, then your ability to measure the value of training becomes much clearer. It allows you to assign a financial value because you can better measure the cost of the problem to the business prior to the training. You can then measure the cost of the problem to the business after the training, measure the change and compare it to the cost of the training, ultimately giving you a return on investment (the arithmetic is sketched below).
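In code, the calculation is a few lines (all figures hypothetical):

# Cost of the business problem before and after training, net of training cost.
cost_before = 400_000   # annual cost of the problem before training ($)
cost_after = 250_000    # annual cost after training ($)
training_cost = 50_000  # cost to develop and deliver the program ($)

net_benefit = (cost_before - cost_after) - training_cost
roi_pct = 100 * net_benefit / training_cost
print(f"Net benefit: ${net_benefit:,}; ROI: {roi_pct:.0f}%")  # ROI: 200%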

The biggest hindrance to measuring value comes from offering training that has no direct link to a business problem, initiative or issue, and then trying to communicate the value of something that, unfortunately, has little to no value. This is the difference between supply-based and demand-based learning. Supply-based learning is when we offer a curriculum of courses, allow learners to register for what they want, do nothing after the course to reinforce the information or the behavior, and then try to figure out how to justify the expense. Demand-based learning is when we offer training based on a need (demand) of the business. And we understand these needs best from data we get from the business units, not from the training organization itself. From where I sit, an increased focus on data analytics is what will help training professionals more than anything else going forward. Data is widely available today, but we need to make sure we're measuring the right things. Let's change the paradigm of thinking about learning analytics from a learner's perspective and focus on understanding how we can better use business data to drive our training objectives.

Doug Harward is CEO of Training Industry, Inc. and a former learning leader in the high-tech industry. Email Doug.



THANKS TO EVERYONE WHO ATTENDED TICE 2019! A record number of learning professionals attended the 2019 Training Industry Conference & Expo and enjoyed building relationships, sharing best practices and forming positive connections with other attendees. If you weren’t able to make it, you can still see some of the highlights in the official recap video.

WATCH NOW.

SAVE THE DATE FOR TICE 2020

June 16-18, 2020 // Raleigh, North Carolina


LEARNER MINDSET
MICHELLE EGGLESTON SCHWARTZ

QUALITY VERSUS QUANTITY: HOW ANALYTICS CAN IMPROVE TRAINING EFFECTIVENESS

There is more to learning and development (L&D) than simply creating a learning program and checking a box. A multitude of considerations influence an effective L&D plan – from the state of the business market to the organization's goals to the needs of employees. Organizations must remain competitive, innovative and agile to stay relevant in today's market. This requires employees to be prepared to meet and deliver on changing expectations, which requires training. Time is a precious commodity. Who has time to dedicate days or even hours to learning? Yes, those days spent in training are worthwhile. However, if employees return to their jobs only to continue checking off items from their to-do lists, the training is wasted. The learning must be applied on the job in order to become an ingrained behavior. To streamline learning and application, employees need training personalized to their job role and function. They need training that closes skills gaps in an efficient and timely manner. They need learning solutions available on the job to solve immediate problems. To do all that effectively, L&D needs data to deliver quality training.

LEARNING ANALYTICS 101

The idea of gathering learning data and analytics may sound overwhelming – and it is – but the trick is to start small. Think about a question you have; what data do you need to answer it? Then, collect and analyze the data that you need to answer that question.

Review the metrics that you have available to you and start investigating. Why are learners returning to a specific page in your e-learning programs? How many people shared an article in your learning management system (LMS)? Are completion rates suffering or soaring in specific online programs? There are a multitude of metrics available to us. Pinpointing the right things to measure is the hard part; finding the answers to questions that are meaningful to strategic goals is the key.
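As a small, hypothetical example of that kind of investigation, here is one way to answer the completion-rate question from a raw LMS export (file and column names assumed):

import pandas as pd

df = pd.read_csv("lms_enrollments.csv")  # columns: course_id, learner_id, completed

rates = df.groupby("course_id")["completed"].mean().sort_values()
print("Suffering (lowest completion rates):\n", rates.head(5))
print("Soaring (highest completion rates):\n", rates.tail(5))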

L&D NEEDS DATA TO DELIVER QUALITY TRAINING.

IMPROVING TRAINING EFFECTIVENESS

"You can't manage what you can't measure," said organizational development and management expert Peter Drucker. And if you can't measure it, then you certainly can't improve it, Drucker believed. L&D professionals understand that learning programs must impact business results to be a worthwhile investment. In fact, their training budgets often depend on how well they can validate the connection between training and business outcomes. However, making that connection is challenging. Data can help make those connections and provide L&D with the information it needs to prove the value of training. In his article "Learning Data: The real definition and how you can prove business impact," JD Dillon, chief learning architect at Axonify, highlights four learning metrics for improving the impact of a learning strategy:

• Learner engagement: Engagement must extend beyond course completion rates and track participation in training. This engagement score will allow L&D to connect learning activity with other metrics.
• Knowledge assessments: Learning must be continuous for it to be effective. Assessments should be built into learning activities that occur routinely throughout an employee's life cycle to accurately assess knowledge over time.
• Learner confidence: Confidence is an often overlooked component in training. Low confidence is a barrier to proficiency. L&D must move beyond assessing knowledge through tests and assess the confidence of learners as they apply the knowledge on the job.
• Behavior change: Assessing how the behavior of learners has changed since returning to the job can help L&D connect knowledge growth to real-world application. Managers should be heavily involved in this process to correct any issues before they become ingrained behaviors.

Together, these metrics can help L&D professionals prove the value of training. Collecting and analyzing learning data is not easy, but it can help L&D improve the effectiveness of training through the development of more targeted, impactful learning solutions.

Michelle Eggleston Schwartz is the editorial director at Training Industry, Inc. Email Michelle.



CLOSING DEALS

COURSERA REACHES UNICORN STATUS AND ACCELERATES ENTERPRISE PRODUCT INNOVATION
BY TARYN OESCH

Online learning libraries are increasingly expanding their consumer products to include enterprise offerings — packaging online courses for entire teams, customizing courseware and offering data and analytics functions for training managers. Coursera is becoming a giant in this space, with 40 million users and more than 1,800 enterprise customers. In fact, its $103 million Series E equity round in April made it a "unicorn" (valued at over $1 billion) — a rarity in the training industry. According to Leah Belsky, vice president of enterprise, "Coursera will use the funding to accelerate product innovations, especially for the enterprise business, and to expand our partnerships with companies and governments globally." Last year, Coursera launched an artificial intelligence (AI)-powered skills benchmarking tool, and Belsky says Coursera will be building on that tool to create industry-based skills maps and "content collections mapped to specific roles, skills and industries." "The future of learning and the future of work are converging," Belsky adds. "At a time when globalization and technology advancement is reshaping our lives, it is also causing a rate of change that is outpacing human adaptability. AI, automation, shifting demographics and the changing economic landscape are all disrupting businesses, jobs and the skills they require." Coursera's goal is to help individuals and companies navigate these changes using a "scalable, stackable, on-demand and curated learning model."

THE GROWING IMPORTANCE OF DATA

"The quality of our learning catalog is our strongest differentiator," says Belsky, "but increasingly, customers are approaching us to learn from our platform data and skills insights to inform their talent transformation strategy." Different organizations have different required skills and learning needs. They need individualized data to understand those needs and determine what training to provide their employees. Online learning libraries can often provide an advantage in this area, because they have a great deal of data from learners already in their platform. By leveraging this data and offering it to enterprise customers, Coursera can enable more targeted upskilling initiatives.

THE FUTURE OF LEARNING AND THE FUTURE OF WORK ARE CONVERGING.

Belsky sees this trend growing in the near future. "Traditionally, L&D initiatives have been positioned more as an employee benefit, with the aim of improving employee retention through more broad-strokes learning opportunities," she says. "As we see technology advancement outpace human adaptability, building training programs that resolve actual skills shortages within the organization will be a priority for L&D." She believes that, rather than training being solely a top-down directive from the training team, it will become a collaborative initiative in partnership with team leaders. After all, they are the ones who know best what skills their teams need to develop in order to be successful.

Combining the training organization’s data and learning expertise with team leaders’ access to employees and subject matter knowledge will help organizations better fill skills gaps and meet business needs. And Coursera wants to be there to assist. HOW EDUCATION ATTRACTS FUNDERS SEEK, which led Coursera’s Series E funding round, is dedicated to employment and education. Its portfolio consists of companies in 18 countries aimed at “help[ing] people live more fulfilling and productive working lives and help[ing] organizations succeed,” according to its website. SEEK was founded as an online jobs site, but its investment in Coursera, along with its recent acquisition of 50% of the business of Futurelearn, demonstrates the growing reality that you cannot separate talent acquisition from talent development. Matching people to jobs is useless if those people can’t succeed in them. “This investment reflects our commitment to online education, which is enabling the up-skilling and re-skilling of people and is aligned to our purpose of helping people live fulfilling working lives,” said SEEK co-founder and CEO Andrew Bassat in the press release announcing the Coursera investment. As other funders continue to understand the importance of education at work, and the return they see on their investment in the companies that help provide that education, will we see more unicorns in the training marketplace? Taryn Oesch is the managing editor of digital content at Training Industry, Inc. Email Taryn.


C O M PA N Y N E W S

ACQU I S I T I O N S A N D PA RTN E R SHIPS BARBRI, Inc., a Leeds Equity Partners, LLC portfolio company, acquired The Center for Legal Studies, a leading provider of online programs and training courses for paralegals and other legal support professionals. The acquisition highlights BARBRI’s effort to broaden its legal education product suite and extend its online program management partnering capabilities with colleges and universities. Learning Pool, a leading online learning company, acquired learning innovation company HT2 Labs, whose Learning Locker platform has been installed more than 12,000 times worldwide, making it the world’s most-installed Learning Record Store (LRS). With the acquisition, Learning Pool will be better positioned to create innovative learning technologies and deliver exceptional customer service.

BizLibrary, a major provider of online employee training content and software that recently launched regulatory compliance training content for banks, credit unions and mortgage lenders, partnered with OnCourse Learning to offer its product alongside their collection of short, video-based lessons covering the key skills employees need to succeed in their roles. InComm, a leading provider of payments and technology services, acquired Hallmark Business Collections, a provider of integrated, personalized solutions for organizations to improve employee engagement by offering physical and digital gift cards as incentives. The acquisition represents InComm's transition from focusing on the delivery of stored value rewards to providing a full suite of end-to-end services for incentive programs.

Franklin & White, a reputable partner for assessment management services for executive coaches, leadership consultants, organizational learning and development professionals, and university executive programs, was successfully acquired by Stephan Lischke, who has been a client of the firm for decades. With the acquisition, Lischke plans to implement a variety of additional products and services. 2U, Inc., a leader in education technology, acquired Trilogy Education Services, Inc., an organization dedicated to preparing adult learners for high-growth careers in the digital economy. The acquisition will broaden 2U’s offerings across the career curriculum continuum and will help accelerate the company’s expected path to $1 billion in revenue by a full year.

INDUSTRY NEWS

IMPROVED MOBILE MEETING EXPERIENCE TO BETTER CONNECT USERS

In response to customer feedback about the realities of today’s mobile professional, Blue Jeans Network, Inc. announced that it has significantly improved its mobile meeting experience to better connect a wide variety of users. The improved mobile experience offers users a consistent, comprehensive meeting solution that supports voice, video and content collaboration from any device, at any location.

ANALYTICS SOLUTION PROVIDES COMPANIES WITH ACTIONABLE DATA

Buck, an integrated HR and benefits consulting, administration and technology services firm, launched bEquipped™, a digital-first analytics solution that offers clients valuable financial, workforce, healthcare, diversity and engagement data. The results-focused solution provides actionable insights into organizations’ people and financial performance, helping them identify future needs and create an HR program that both attracts and retains talent.

SPREADING AWARENESS, TAKING ACTION TO IMPROVE WORKPLACE INCLUSIVITY

Deloitte and The Female Quotient (The FQ) announced a strategic alliance in an effort to advance inclusion in the workplace, with a special focus on the tech industry, where women and minorities are often underrepresented. The alliance will create and host pop-ups focused on equality at global conferences, companies and college campuses to bring even more people together in pursuit of more inclusive work environments.

ADVANCING EMPLOYEE DEVELOPMENT WITH ONGOING CONVERSATIONS

Cornerstone OnDemand, a global leader in cloud-based learning, talent management and talent experience software, announced Cornerstone Conversations, a new product aimed at helping organizations achieve increased productivity, engagement and growth by implementing consistent and ongoing conversations among teams. By facilitating feedback and productive, actionable coaching, Cornerstone Conversations helps support the continual development of all employees so that they can accomplish key organizational goals.

INTERESTED IN SUBMITTING COMPANY NEWS? PLEASE SEND TO EDITOR@TRAININGINDUSTRY.COM




DATACAMP.COM/BUSINESS

Learn Data Science Online

The skills people and businesses need to succeed are changing. No matter where you are in your career or what field you work in, you will need to understand the language of data. With DataCamp, you learn data science today and apply it tomorrow.

Course topics: Data Visualization, Programming, Importing & Cleaning Data, Machine Learning, Data Manipulation, Applied Finance, Reporting, Case Studies, Probability & Statistics.

