Monday 24 November 2014

The Only Person Who Behaves Sensibly Is My Tailor


“The only person who behaves sensibly is my tailor. He takes new measurements every time he sees me. All the rest go on with their old measurements.”
—George Bernard Shaw

I’ve always enjoyed George Bernard Shaw’s writing. He was a man who made a great deal of sense to me. I started reading his books in my early teenage years and many of the ideas in them have stuck.

Shaw was a true Renaissance man – an Irish playwright and author, a Nobel Prize and Academy Award winner (how many can claim that double?) and a co-founder of the London School of Economics.

Shaw had a particular interest in education: from the way the state educates its children, where he argued that the education of the child must not take place in “the child prisons which we call schools, and which William Morris called boy farms”, to the way in which education could move from teachers “preventing pupils from thinking otherwise than as the Government dictates” to a world where teachers “induce them to think a little for themselves”.

Shaw was also a lifelong learner. Despite, or possibly because of, his own irregular early education, he focused on learning as an important activity in life. He developed his thinking and ability through a discipline of reading and reflecting, through debating and exchanging ideas with others, and through lecturing. Apart from leaving a wonderful legacy of plays, political and social treatises, and other commentaries, Shaw won the 1925 Nobel Prize in Literature for “his work which is marked by both idealism and humanity, its stimulating satire often being infused with a singular poetic beauty”. And in 1938 he won the Academy Award for his screenplay for Pygmalion (later turned into the musical and film My Fair Lady after Shaw’s death; he hated musicals – some would say sensibly – and forbade any of his plays from becoming musicals during his lifetime).

At 91, Shaw joined the British Interplanetary Society, whose chairman at the time was Arthur C. Clarke (some interesting conversations there, I’m sure).

Shaw summed up his views on lifelong learning thus:

"What we call education and culture is for the most part nothing but the substitution of reading for experience, of literature for life, of the obsolete fictitious for the contemporary real."

Shaw’s Tailor

In the statement about his tailor, Shaw was simply making the point that change is a continuous process and part of life, and that we constantly need to recalibrate if we’re to understand what’s really happening. If we do this we are more likely to have a better grasp of things and to make the appropriate adjustments and responses. It’s the sensible approach.

Shaw and Work-Based Learning

I recently came across Shaw’s quote about sensibility and his tailor again in Joseph Raelin’s book ‘Work-Based Learning: Bridging Knowledge and Action in the Workplace’. Raelin’s work is something every L&D professional should read.

The quote started me thinking about the ways we measure learning and development in our organisations.

Effective Metrics for Learning and Development

I wonder what Shaw would think if he saw the way learning and development is predominantly measured in organisations today.

The most widely used measures for ‘learning’ are based on activity, not on outcomes. We measure how many people have attended a class or completed an eLearning module, or read a document or engaged in a job swap or in a coaching relationship.

Sometimes we measure achievement rates in completing a test or certification examination and call these ‘learning measures’.

The activity measures determine input, not output. The ‘learning’ measures usually determine short-term memory retention, not learning.

I am sure that Shaw would have determined we need to do better.

Outcomes not Activity

Even with today’s interest in the xAPI (Tin Can) protocol, the predominant focus is still on measuring activity. It may be helpful to know that ‘Charles did this’ – the (noun, verb, object) structure that xAPI specifies. However, extrapolating the context and outcomes to make any sense of this type of data requires a series of further steps before it comes anywhere close to providing meaningful insight.
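For reference, an xAPI statement really does capture little more than ‘who did what’. A minimal sketch in Python (the actor name, email address and activity identifiers below are invented for illustration; the ‘completed’ verb URI is one of the standard ADL verbs):

```python
import json

# A minimal, illustrative xAPI statement: actor (noun), verb, object.
# The actor and object identifiers are hypothetical examples.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Charles",
        "mbox": "mailto:charles@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/activities/compliance-101",
        "definition": {"name": {"en-US": "Compliance 101 eLearning module"}},
    },
}

# This is the record a Learning Record Store would receive - note that it
# says nothing about context, outcome, or whether any learning occurred.
print(json.dumps(statement, indent=2))
```

Which is precisely the point: the statement records that activity occurred, and the hard interpretive work all happens afterwards.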

In many cases the activity measures simply serve to muddy the water rather than to reveal insights.

Attending a course or completing an eLearning module tells us little apart from the fact that some activity occurred. The same applies to taking part in a difficult workplace task or participating in a team activity.

Activity measurement does have some limited use. For instance, when a regulatory body has defined an activity as a legal or mandatory necessity and requires organisations to report on those activities, these reports may help to keep a CEO out of the courts or jail. But this type of measurement starts from the ‘wrong end’. A ‘learning activity is not necessarily an indicator of learning’ tag should be attached to every piece of this data.

There’s plenty of evidence beyond the anecdotal to support the fact that formal learning activity is not a good indicator of behaviour change (‘real learning’). For example, a study of 829 companies over 31 years showed diversity training had "no positive effects in the average workplace." The study reported that optional training sometimes has a positive effect, but that mandatory training has a negative effect overall.

“There are two caveats about training. First, it does show small positive effects in the largest of workplaces, although diversity councils, diversity managers, and mentoring programs are significantly more effective. Second, optional (not mandatory) training programs and those that focus on cultural awareness (not the threat of the law) can have positive effects. In firms where training is mandatory or emphasizes the threat of lawsuits, training actually has negative effects on management diversity”

Dobbin, Kalev, and Kelly, ‘Diversity Management in Corporate America’, 2007, Vol. 6, No. 4, American Sociological Association.

For further evidence that training activity does not necessarily lead to learning (changed behaviour) we need look no further than the financial services industry. Did global financial services companies carry out regulatory and compliance training prior to 2008? Of course they did – bucketsful of it. Did this training activity lead to compliant behaviour? Apparently not. It could be argued that without the training things could have been worse, but there’s no easy way to know that. The results of banking behaviour and lack of compliance were bad enough to suggest the training had little impact. I suppose we could analyse, for example, the amount of time and budget spent per employee on regulatory and compliance training by individual global banks and assess this against the fines levied against them. I doubt that there would be an inverse correlation.

(What is our response to the global financial crisis and the apparent failure of regulatory and compliance training? More regulatory and compliance training, of course!)

The Activity Measurement ‘Industry’

The ATD’s ‘State of the Industry’ report, published annually around this time of year, is a case in point of the industry that has grown up around measuring ‘learning’ activity.

ATD has been producing this annual report for years (originally as the ASTD). The data presented in the ATD annual ‘State of the Industry’ report is essentially based around activity and input measurement – the annual spend on employee development, learning hours used per employee, expenditure on training as a percentage of payroll or profit or revenue, number of employees per L&D staff member and so on.

Some of these data points may be useful to help improve the efficient running of L&D departments and therefore of value to HR and L&D leaders, but many of the metrics and data are simply ‘noise’. They certainly should not be presented to senior executives as evidence of effectiveness of the L&D function.

To take an example from the ATD data, the annual report itemises hours per year spent on ‘learning’ (which really means hours per year spent on training). The implicit assumption is that the more hours provided, the better and more focused the organisation is on developing its workforce.

But is it better for employees in an organisation to be spending 49 hours per year on ‘learning’ than, say, 30 hours per year? These are figures from the 2014 ATD report.

Even putting aside the fact that, as a species, we are learning much of the time as part of our work – not just when we engage in organisationally designed activities with a specific ‘learning’ tag – this is a point worth considering.

It could be argued that organisations with the higher figure – 49 hours per year – are more focused on developing their people. It could equally be argued that these organisations are less efficient at developing their people and simply take longer to achieve the same results. It could further be argued that organisations spending more time on trackable ‘learning’ events are simply worse at recruitment – hiring people who need more training than the ‘smart’ organisations that hire people who already have the skills and capabilities needed.

We could dig further and ask whether spending 49 hours rather than 30 hours indicates poor selection of training ‘channel’ – that organisations with the higher number are simply using less efficient channels (classroom, workshop etc.) than others that have integrated training activities more closely with the workflow (eLearning, ‘brown bag lunches’, on-the-job coaching etc.). Or is the organisation with the 49 hours per year simply stuck in the industrial age, using formal training as its only approach to building high performance – when it could (and should) be using an entire kitbag of informal, social, workplace and other approaches as well?

One could go on applying equally valid hypotheses to this data. The point is that activity data provides few if any insights into the effectiveness of learning, and only limited insight into the efficiency of learning activities.

So why is there an obsession to gather this data?

Maybe we gather it because it is relatively easy to do so.

Maybe we gather it because the ‘traditional’ measurement models – based on time-and-motion efficiency measures – are deeply embedded. These time-honoured metrics, developed for an industrial age, are not the answer. We need new approaches based on outcomes, not inputs.

Learning is a ‘Messy’ Process

The real challenge for measuring learning and development is that performance improvement often comes about in ‘messy’ ways.

Sometimes we attend a structured course, learn something new, and then apply it to our jobs. At other times we attend a structured course and meet another attendee whom we then add to our LinkedIn connections. At some later point we contact this connection to help solve a problem – because we remember they told an interesting story about overcoming a similar situation in their organisation, or in another part of our own.

This second case falls into the ‘messy’ basket. It is almost impossible to track and ‘formalise’ this type of learning through data models such as xAPI – unless we hold the unrealistic expectation that people will document everything they do at every moment, or that we can track every interaction and draw meaningful inferences from it. Even national security agencies struggle to do that.

More often than in structured events, we learn through facing challenges as part of our daily workflow, solving the problems in some way, and storing the knowledge of the successful solution for future use. We also increasingly learn and improve through our interaction with others – our peers, our team, our wider networks, or people we may not even know.

So how do we effectively measure this learning and development? Is it even worthwhile measuring?

I believe the answer to the second question is ‘Yes, when we can gain actionable insight’. It is worthwhile measuring individual, team and organisational learning and development to understand how we are adapting to change, innovating, improving our customer service, reducing our errors and so on.

This type of measurement needs to be part of designed performance improvement initiatives.

Furthermore, measuring learning frequently via performance improvement is better than measuring it infrequently.

One criticism levelled at the annual performance review process recently is that the insights (and data) it collects are gathered too infrequently. Companies like Adobe have already abolished annual performance reviews and replaced them with regular informal manager–report check-ins to review performance progress and any corrections needed. Fishbowl, a Utah-based technology company, has gone a step further and abolished not only annual performance reviews but also its managers. Companies such as W.L. Gore have been treading this path for some time. It is clear that the annual performance review – a metrics approach based on long cycle times and relative stability – will give way to new, more nuanced approaches. Learning metrics will need to follow a parallel path.

Outcome Measurement

One of the challenges for L&D is that the useful outcome metrics are not ‘owned’ by L&D. These are stakeholder metrics, not ‘learning metrics’.

If we want to determine the effectiveness of a leadership development programme the metrics we should be using will be linked to leadership performance – customer satisfaction, employee engagement levels, organisational profitability for instance.

If we want to measure the impact and effectiveness of a functional training course the metrics we should be using are whether productivity increases, first-time error rate decreases, customer satisfaction rises, quality improves and so on.

If we want to measure the benefits of establishing a community for a specific function, or around a specific topic, the metrics we should be using will be linked to similar outputs – productivity increases, increases in customer satisfaction and so on. We should also be measuring improvements in collegiate problem-solving, cross-department collaboration and co-operation, and similar outputs in the ‘working smarter together’ dimension.

These metrics need to be agreed between the key stakeholders and the L&D leaders before any structured learning and development activities are started. Without knowing and aligning with stakeholder expectations, any structured development is just a ‘shot in the dark’.

L&D also needs to consult with its stakeholders on how to obtain these metrics.

Some data may be readily available. Customer-facing departments, for example, will regularly collect CSAT (Customer Satisfaction) data. There are a number of standard methodologies to do this. Sales teams will inevitably have various measures in place to collect and analyse sales data. Technical and Finance teams will have a wealth of performance data they use. Other data will be available from HR processes – annual performance reviews, 360 feedback surveys etc.

These are the metrics that will provide useful insights into the effectiveness and impact of development activities managed by the L&D department.

Obviously these data are more nuanced than the number of people who have completed an eLearning course or attended a classroom training course, but they are more useful. Sometimes the causal links between the learning intervention and the change in output are not clearly identifiable. This is where careful, scientific data analysis, together with the level of trust between L&D and stakeholder, becomes important. The ten-year-old study by the (then) ASTD and IBM, ‘The Strategic Value of Learning’, found that:

“When looking at measuring learning's value contribution to the organization, both groups (C-level and CLOs) placed greater emphasis on perceptions as measures”

One C-Suite interviewee in this study said: “We measure (the effectiveness of learning) based on the success of the business projects. Not qualitative metrics, but the perceptions of the business people that the learning function worked with.”

New Measurements Every Time

Returning to George Bernard Shaw, one of the challenges of effective measurement is the need to review the metrics needed for each specific instance. No two situations are identical, so no two approaches to measuring impact are likely to be identical. Or, at least, we need to check whether our metrics are appropriate for each measurement we undertake.

As Robert Brinkerhoff says, “There is no uniform set of metrics suitable for everyone”.

Brinkerhoff’s Success Case Method addresses systems impact rather than trying to isolate the impact of learning individually as the more simplistic Kirkpatrick approach attempts. Brinkerhoff’s approach moves us from input metrics to stakeholder metrics – certainly on the right road.

What is also required in defining and agreeing metrics that will be useful for each and every project is a process of engagement with stakeholders and performance consulting by learning professionals.

These approaches require a new way of thinking about measurement, and new skills for many L&D professionals, but, like Shaw’s tailor, we need to ‘behave sensibly’ and stop wasting our time trying to ‘tweak’ the old methods of measurement.

Learning, and measurement, are both becoming indistinguishable from working.


Shaw: Nobel Foundation 1925. Public Domain
Tape Measure:
Creative Commons Attribution-Share Alike 3.0


Monday 17 November 2014

Embedding Learning in Work: The Benefits and Challenges

(a version of this article was originally written as background for an #OzLearn chat held on Twitter, 11th November 2014)

The Power of Embedded Learning

A common finding that has emerged from study after study over the past few years is that learning which is embedded in work seems to be more effective than learning away from work. If people learn as part of the workflow then this learning is more likely to impact performance in a positive way.

The Research

A 2009 study by the Learning & Development Roundtable, a division of the Corporate Executive Board, reported that on-the-job learning had three times the impact of formal training programs on performance improvement. The same study found employees with high exposure to on-the-job learning activities were 262% more engaged than those who had no exposure to on-the-job learning. ‘High exposure’ in this study was defined as being engaged in ‘11 or more on-the-job learning activities during the last month’.

A further 2010 study of manager development activities, by Casebow and Ferguson at GoodPractice in Edinburgh, Scotland, reported that informal chats with colleagues were both the most frequently used development activity and the one seen as most effective by the majority of managers.


Yet another study, by Bersin & Associates (now Bersin by Deloitte), published in March 2012, reported that “Organizations with strong informal learning capabilities, including the adoption and use of social learning tools, are 300% more likely to excel at global talent development than organizations without those competencies.” By their very nature, informal and social learning are embedded in the daily workflow.

An earlier, 2003, study by the Corporate Leadership Council identified 15 leader-led activities that improve performance, and found that learning through workplace experience was at least three times more effective than simply ensuring that workers had the necessary knowledge and skills to do their jobs.


There are many other studies with similar findings, and more being published on a regular basis.

Learning in Context

These findings are not at all surprising.

As long ago as 1885, Dr Hermann Ebbinghaus published his treatise Über das Gedächtnis (On Memory), which suggested context was critical for effective learning. Although Ebbinghaus’ experimental research was limited, his theory and results indicated that context and the spacing effect are key contributors to effective retention, learning and performance improvement. It could be argued that context is best provided by embedding learning in work.

Recent brain science is filling in the gaps, and we now know a lot more about the way the brain modifies itself in the light of experience, and about the neural and behavioural differences between people who approach learning with ‘open’ or ‘fixed’ mindsets. The work of Carol Dweck, a professor of psychology at Stanford University, has considerably enhanced our understanding of learning, context and mindset. Dweck’s research suggests that experience and practice, combined with a growth mindset, are critical ingredients for effective learning and development. Each of these is more powerfully experienced in the context of the workflow rather than in the more sterile atmosphere of a classroom.

The benefits are clear, but what are the challenges of embedded learning in work for L&D departments?

The Challenges

One of the major challenges is the fact that, until recently, L&D professionals have seen their primary role as instructional designers and creators of learning content and experiences, where this content and these experiences are separate from work. ADDIE (or some other instructional design approach) ruled. The learning needed to be designed, managed and measured.

Of course some effective learning experiences can be designed, managed and measured, but they tend to be in the minority. The majority of learning occurs naturally as part of the workflow. This type of learning is ‘designed’ by the individual (sometimes with input from their manager), it is self-managed, and the measurement is in terms of outputs – not by passing a test or some form of certification but by demonstrating the ability to do work better, faster, more accurately, with greater agility and levels of innovation if needed.

The challenge for L&D professionals is to develop ways to support, encourage and facilitate these ‘90’ types of learning (through the 70:20:10 lens) that occur as part of the daily workflow. This learning can’t be ‘managed’ by HR, L&D or by any of the processes and technology systems they put in place. It can, however, be supported, facilitated, encouraged, exposed and shared by HR and L&D with the outcome of improving not only individual performance, but team and organisational performance as well.

A second significant challenge (and a blind spot for many L&D departments) has been the provision of performance support. The lack of understanding, and the failure to use performance support approaches and tools, has created a significant barrier to supporting the learning that is embedded in work. Performance support is a sleeping giant that has only recently been nervously prodded by some L&D departments, despite the fact that EPSS technology has been around for at least 25 years, and other, non-technology, performance support approaches for eons.

Gloria Gery published her seminal ‘Electronic Performance Support Systems’ book in 1991, yet these powerful systems and approaches have only marginally entered L&D’s mindset. This will no doubt change as the ‘rise and rise’ of social learning further impinges on organisational learning cultures, and people turn to online communities and expert-location tools to help them improve their work and learn more effectively in the workplace. Together with ‘point-of-need’ performance support solutions (Bob Mosher and Conrad Gottfredson at ApplySynergies are doing a great job on this, as are companies such as the Australian organisation Panviva and others in Europe), the whole gamut of performance support opportunities is an open goal – if only L&D can evolve from ‘course’ to ‘resource’ thinking.

A final challenge facing many L&D professionals is that embedding learning in work almost always requires the active support of executives, business managers and team leaders. This means L&D needs to engage these groups and work closely with them, which inevitably requires a clear set of business imperatives for embedding learning in work, delivered in a way that is meaningful and compelling to these busy stakeholders. L&D professionals need to step up to the plate with their consulting and interpersonal skills if they are to enlist the critical support of these groups. This can be a big challenge, but it is one where success is critical if learning is to be effectively embedded in the workflow.


Tuesday 7 October 2014

Development Mindsets and 70:20:10

Professor Carol Dweck is a psychologist at Stanford University and the prime force behind mindset theory. Dweck’s research has led her to the conclusion that each individual places themselves on a continuum according to their implicit belief of where their own ability originates.

In simple terms this means that those who tend towards believing in ‘nature’ or innate ability as the prime factor in determining their success are defined in Dweck’s model as having ‘fixed mindsets’ or fixed theories of intelligence.

At the other end of the continuum are those that believe their success, and the success of others, comes from hard work, learning, and persistence. These people are defined as having ‘growth mindsets’ or incremental theories of intelligence.

This is an interesting theory, but so what?

Well, Dweck’s work goes further than this observation. She has found that even if people aren’t aware of their own mindset, it can be identified from their behaviours, especially their responses to failure.

Fixed-mindset people fear failure (it reflects badly on their ‘innate ability’), while growth-mindset people tend not to mind failure so much, because they believe they can overcome it by reflecting on what went wrong and then setting themselves to unlearn, re-learn and overcome its cause.

All this is very interesting. I’m sure we have all looked around at colleagues and family members and seen these different mindset types – or people behaving in these very different ways.

Moreover, neuroscience research has supported Dweck’s model. In experiments by Moser, Schroder, Heeter, Moran & Lee, the brain activity of students receiving feedback was examined, and the differences were clear.

All students’ brains were active when being told whether they had selected the right or wrong answer to a question they had previously been asked, but only the brains of students with ‘growth mindsets’ remained active to hear the details of the correct answer if they had got it wrong. The brains of those with ‘fixed mindsets’ simply shut down at this point.

New Knowledge and Crack Cocaine

Gary Marcus is another professor of psychology who has spent years studying human cognitive development. His book ‘Guitar Zero’ addresses a challenge close to my own heart – that of an adult learning a musical instrument. Marcus’ insights into the learning process reinforce Dweck’s model. Throwing himself into learning the guitar at 39, he describes the learning process as ‘addictive’. Marcus cites neuroimaging research by Knutson and Cooper which found that:

“new knowledge can bring the same sort of surge of dopamine one might get by ingesting crack cocaine.”

Knutson and Cooper also point out that the motivating force of novelty and the desire to learn new things are basic biological needs. All foraging species must have a drive to explore the unknown. We’re no different from other species, whether we’re working in our offices or other workplaces, or enjoying our time with our families and friends.

If HR and learning & performance professionals are working with something that is both a basic human driver and something whose achievement provides a kick like a horse, then maybe this is something we should be actively exploiting in our organisations.

The challenge is to ensure each of our organisations has as many people with growth mindsets as possible. These types of people are more receptive to continuous learning, and they are critical for organisational survival and growth in a changing world. Without ‘growth mindset’ people, organisations end up providing products and services to a world that is in the past.

Development Mindsets

I have been focusing for some time on the important role of development mindsets as a starting point for adopting the 70:20:10 model. My ‘development mindsets’ are identical to Dweck’s ‘growth mindsets’. They view personal, team and organisational development as something that needs to be worked at constantly. Every day. Widespread evidence of development mindsets is essential if organisations are to achieve Peter Senge’s ‘Learning Organization’ status, and if the 70:20:10 model is to be used successfully.

The French company Danone has an excellent initiative based on continuous learning and the 70:20:10 model called ‘One Learning a Day’. There is a very good short video of Danone’s approach on YouTube here. I have worked with Danone to help the company build support for One Learning a Day in the form of approaches and tools to underpin this cultural change initiative and support the drive to demonstrate the power of continuous learning and development mindsets.

Initiatives such as Danone’s help open up more people to adopting and building development mindsets. It’s not easy to change attitudes, behaviours and habits, but this change is essential if organisations are to gain full benefit from the 70:20:10 model.

Development: noun \di-ˈve-ləp-mənt, dē-\
the act or process of growing or causing something to grow or become larger or more advanced

Mindset: noun \ˈmīn(d)-ˌset\
a particular way of thinking : an attitude or set of opinions. An inclination or a habit … a way of life

Development Mindsets and the 70:20:10 Model

70:20:10 provides a clear and simple approach to extending the support of learning and development for all workers – from individual contributors to senior leaders – beyond the services traditionally delivered by the HR and Training/L&D departments. Ignore the specific numbers (they are simply helpful indicators to remind us how people learn at work, not some rigid formula to be aimed at or adhered to). Focus on putting in place the support and processes that help embed, extract and share learning as part of the workflow.

Development mindsets are critical for successful use of the 70:20:10 model.

70:20:10 relies on workers taking much more responsibility for their own development, and on team leaders, managers and senior executives supporting that development together with, and aligned to, the activities of HR and learning professionals. It has to be a full team effort.

If nothing else, 70:20:10 is an agent of change – helping strengthen cultural focus on high performance and continuous development and better positioning people to change behaviours to incorporate all the things that go with growth or development mindsets – constant enquiry, and acceptance of failure as part of the process on the road to success.

70:20:10 also focuses beyond structured learning activities to address the entire way adults learn at work – whether that is through challenging experiences and their outcomes, through opportunities to practice, through building robust, resilient and supportive personal networks, or through making space for reflection, gaining insights and ensuring improvements, where necessary, are taken on-board. A 70:20:10 implementation will provide support, tools and processes to ensure learning is deeply embedded in everyday work.

Jane Hart recently published her insights on an experiential online workshop she ran for the sales team at Pfizer in India. Reviewing the success of this event – a ‘10’ type of activity, but designed to drive ‘70’ and ‘20’ behaviours – Jane observed that:

“the organisational culture encouraged, supported and rewarded the team in their endeavours through learning and working from one another.”

This is at the heart of a successful 70:20:10 strategy, or any other change implementation. If organisational culture and values are at odds with the idea of self-directed development, openly sharing learning together, and the need for managers and leaders to play their (important) role in facilitating, encouraging and supporting continuous development, then failure is almost inevitable. If values and culture are aligned, and if managers and leaders do play their part, then the outcome is invariably successful.

Fortunately, we’re moving into an era where collaboration and sharing are recognised as increasingly important – even with one’s competitors. I have written about the rise of ‘co-opetition’ in an earlier article. There is no doubt that organisations are becoming more co-operative and collaborative within and without, and this is usually reflected in more openness and sharing and greater receptiveness to new ways of developing and reaching high performance. There are very few organisations swimming against this tide. Some may be slower to understand the benefits, but they will get there in the end.

A key driver of these changes will be the encouragement for more and more of the workforce to adopt development mindsets.

Thanks to Simon and Carol Townley of the Gorilla Learning Company for insights and the fixed/growth neuroimage.


Wednesday 20 August 2014

It’s Only 65%!

The results of yet another 70:20:10 survey were published recently.

The researchers (possibly on work experience) declared that “50:26:24 is the average learning mix in most companies right now”.

The report of the 50:26:24 survey went on to say:

“It’s widely accepted that the 70:20:10 model is the most effective learning blend for business, but getting to that perfect mix can be a challenge. It’s early days and we’ve got a long way to go, but when we crunched the first numbers on our new study, we could see that the current average mix of training in the L&D industry is actually:

  • 50% via ‘on the job learning’
  • 26% through ‘informal training’
  • 24% from ‘formal training’”

A few things flew off the page from this survey and hit me square in the temple. The comments below are not intended as a blanket criticism of this specific survey, but it did get me thinking about a number of misconceptions of what the 70:20:10 model is really about. It also got me thinking  about approaches to organisational learning in general.

Learning ≠ Training

Although the terms convey basic concepts, there still seems to be some confusion between the meaning of the words ‘learning’ and ‘training’. This confusion is not isolated to surveys such as the one above. It is a common problem and underlies many of the barriers that organisations encounter as they strive to develop and implement effective learning strategies.

‘Learning’ covers a much wider range of activities than training. Learning is a process not an event. Learning is something we’re doing every day.

Training describes a structured set of events that, when designed and assembled carefully, can provide an effective way to help people accelerate learning (and learning, at its heart, is behaviour change). However, the words ‘training’ and ‘learning’ are not interchangeable.

This may seem a small point that most of us have ‘got’ and don’t think about, but it’s important. The term ‘informal training’, for example, is meaningless, whereas the term ‘informal learning’, as Jay Cross describes it, is extremely meaningful:

“… the unofficial, unscheduled, impromptu way most people learn to do their jobs. Informal learning is like riding a bicycle: the rider chooses the destination and the route. The cyclist can take a detour at a moment’s notice to admire the scenery or help a fellow rider.”

Jay wrote the seminal book on Informal Learning.

70:20:10 is Not About the Numbers

The 70:20:10 model is not about percentages or numbers and there is no universal ‘right’ ratio.

70:20:10 is a model that describes the way adults in work generally learn.

So why use the numbers, then?

The numbers are a useful reminder that the majority of learning occurs through experience and practice within the workflow (the ‘70’), through sharing and supporting others, conversations and networks (the ‘20’),  and that a smaller amount of overall learning occurs through structured training and development activities (the ‘10’).

70:20:10 is not a recipe to be used slavishly. The numbers are a simple framework to drive change and help people focus beyond structured learning interventions to where most of the learning happens (in the ‘20’ and ‘70’).

I wrote about ‘the numbers’ on this blog back in June 2012. What I said there still holds.

Every organisation that uses the 70:20:10 framework will have individual needs and contexts. The way they support learning and development will be particular to them. If you’re in a high-compliance environment it’s likely that your people will be required to spend more time on structured training (whether this has an impact or not is another issue altogether). If you’re working in a highly innovative and creative environment it’s likely your learning and development will be skewed more towards the social and experiential types of development. So the ratios describing how people learn in a large compliance-driven organisation are likely to be different from those in an agile start-up. How we support learning and building high performance should reflect these differences.

So if you’re supporting effective learning and development in a high-compliance context, you’ll need to be aware that the ‘best’ ratio for your organisation and its individuals will be skewed by regulatory needs. If you’re supporting effective learning and development in an environment where agility and innovation are at a premium, you’d better be prepared to support higher levels of collaborative and ‘trial-and-error’ learning – in the ‘70’ and ‘20’ zones.

‘Training Types’

The 50:26:24 survey categorised three distinct and separate ‘training types’, described in this way:

“Current research suggests that the ideal training mix is 70% On the Job Training, 20% Informal Learning and 10% Formal Learning.”

I found this a strange categorisation.

Now I’m not sure if I’m over-reacting to this approach, or I simply don’t understand it, but ‘on the job training’ suggests structure and intention to me. The ‘training’ word gives that away. But then the input for this survey asks for the ‘current learning mix’ and offers ‘on the job’, ‘informal’ and ‘formal’. On-the-job learning doesn’t have to be structured and intentional. Most isn’t.

I don’t want this post to be a criticism of an individual survey design, but I do think that the designers of data-gathering surveys such as these need to think carefully about the terminology they use. In my mind ‘on-the-job’, ‘informal’ and ‘formal’ are not three mutually exclusive categories.

In real life 'on-the-job' learning can be either informal (i.e. self-directed or non-directed) or formal (i.e. experiential development that is part of a structured course or programme).

‘Formal learning’ suggests learning that is designed and directed by someone other than the learner as part of a curriculum, course, programme, module etc. Formal learning can include on-the-job activities and learning, but not necessarily.

These three types of learning are not dichotomies (if one can have dichotomous trios). The world is not black-and-white.

Without any definitions of these categories (and I couldn’t find any in this survey) I fail to see how respondents can provide consistently accurate input – potentially leading to garbage in, garbage out.

Continuous Learning is the goal of 70:20:10

The final point I would make is that focusing on ‘the numbers’ masks the fact that learning happens as a continual process and usually as part of the workflow.

That’s a fact. We humans are learning machines. We can’t help but learn as we live and work.

Even when we engage in classes, programmes and structured eLearning modules as part of the overall mix to support and accelerate learning and performance improvement we don’t stop learning as soon as we walk out the door or finish the online recall test.  We continue to learn as we put our new knowledge or skills into practice. We continue to learn as we discuss challenges and options with our colleagues. We continue to learn as we try to do things a newer, better way.

Many organisations are using the 70:20:10 framework to help create cultures of continuous learning and to build high performance. They understand that more formal learning is not necessarily better, and that by helping develop mindsets to exploit learning and development opportunities whenever and wherever possible they are much more likely to achieve their high-performance aims.

(Thanks to Adam Weisblatt and Jim Potts for allowing me to use their 70:20:10 cartoons)

Images © Adam Weisblatt and © Jim Potts. Not to be reproduced without permission of the copyright owners.


Wednesday 13 August 2014

Learning in the Collaboration Age (original post)


Many may not have noticed it at the time, but the world of learning changed in 1990.

In November of that year British computer scientist Tim Berners-Lee, together with his Belgian colleague Robert Cailliau, proposed a project to develop the use of hypertext “to link and access information of various kinds as a web of nodes in which the user can browse at will”…
The rest is history.

Over the next few years the Web turned technical networks into ubiquitous conduits for everyone to use. The Web reduced our need to hold detailed information in our flesh-and-blood memories as it blew away the barriers to easy access. The Web allowed us to reach out easily and establish connections with others that previously were impossible or extremely difficult to make.

And for Organisational Learning?

The Web has allowed us to totally redefine our traditional learning models. It has allowed us to reach beyond content-rich learning approaches and focus on experience-rich learning. It has allowed an evolution from ‘Know What’ learning to ‘Know Who’ and ‘Know How’ learning; and it has allowed the emergence from learning in the silos of our own organisations to learning with and through others across the world – easily and transparently.

The Collaboration Age

On a wider plane the Web has been the harbinger of the Collaboration Age. It has blown away many of the barriers to access and has reinforced the power and influence of collaboration and co-operation1  over silo mentalities.

In the Collaboration Age it is those who share and work together who are the winners. Those who hide behind organisational garden walls end up deep in weeds.

If we’re to succeed we must no longer just collaborate and co-operate inside the ever-softening boundaries of our own organisations. We need to do so with others, in some cases even with our competitors. The rather ungainly term ‘co-opetition’ is increasingly used to describe co-operative competition, where competitors work together to achieve increased value at the same time as they compete with each other. There is no doubt this is one of the ways forward to success.

In the world of talent, learning and performance the impact of the Collaboration Age is only now starting to take hold. The emerging understanding that invariably we need to work with others to solve problems is driving these collaborative and co-operative behaviours and, in turn, fuelling a focus on collaborative learning.

The Collaborative Age requires collaborative mindsets to drive collaborative learning. We can’t simply redesign content-rich courses and curricula and hope that changes will occur. We need new thinking, new approaches, and new strategies if we’re to fully exploit the potential.

However, we’re starting to see some fundamental changes happen in practical ways.
“If any one of us can find the answer to almost any question or problem we face almost instantly with a few clicks or a posted question, why should we need to learn and memorise all this ‘stuff’?”
This question is being answered by re-imagining traditional learning approaches and defining new and sometimes novel ways for the world of learning and development to respond to today’s challenges.

Traditional Learning to MindFind

Although experiential and social learning have been around for eons, in the past most structured organisational learning and training focused on knowledge acquisition and memorising. We filled learners up with information and then assessed their ability to recall it. We still see this today in many classrooms and eLearning programmes.

This process is (still) generally referred to as ‘knowledge transfer’ and is both overrated and totally inappropriate for the post-1990 world.

The ‘knowledge transfer’ model of training is based on a number of assumptions that no longer apply:
  • the assumption that information is generally static
  • the assumption that information or ‘knowledge’ is acontextual
  • the assumption that we work as individuals, so individual training and development is the best solution
A few years ago David James Clarke III and I developed a model to address the failings of the first and second of these assumptions. We called it MindFind. Inside the MindFind model we explained the migration of traditional learning to the find-access approach2 

Traditional Learning Approach

Find-Access Approach

The find-access approach is based on the fact that with today’s information explosion and the increasingly dynamic nature of information it only makes sense to memorise and ‘learn’ core concepts. These are the bedrock that will be needed for some time. Core concepts are likely to be unchanging, or change very slowly. They are the ‘Newtonian Laws’ of the domain or discipline. Core concepts will apply to most situations and it’s helpful to have them at hand (or in the head).

Then you need to familiarise yourself with the contextual job and project-related information. This is likely to change more frequently, so it is often a hindrance to have memorised it. This type of information is better familiarised. You know where it is. You are familiar with its nature and content, and you know how to find, recall and verify it at the point-of-need.

And ultimately, the majority of the detailed information or ‘knowledge’ we need at any time is not only ephemeral and in constant flux, it is very often contained in other people’s heads rather than being codified and held in structured documents and databases. If you have learned the ‘Know Who’ then you will be able to locate and retrieve the detailed information quickly.

The Power of Collaborative Learning

Over the past 15 years I have worked with the 70:20:10 framework and used it as a powerful tool to help organisations evolve their organisational talent and learning and development approaches from pre-1990 to present day practices. It provides a very good starting point to help make this move.

Collaboration sits firmly across the ‘other 90’ - the 70 and 20 (experiential and social elements) - in 70:20:10. We collaborate in our work teams while we learn through experience and practice. We collaborate when we share within and outside our organisations, and we often collaborate as part of our own reflective practices. In fact reflection is usually enhanced when shared with others.

Of course, the ‘10’ also provides great opportunities for collaboration.

Success in the Collaborative Age

We don’t need to go far to see tangible successes resulting from collaborative behaviours in fields other than learning and development.

Year over year, Tesla Motors, Inc. has been able to grow revenues from $413.3M USD to $2.0B USD. 
Tesla CEO Elon Musk announced in a press release, conference call and blog post on June 12, 2014 that the company would allow anyone to use its technology patents in good faith, in a bid to entice automobile manufacturers to speed up development of electric cars.
In an industry (the electric car market) that was almost killed at birth by the internal combustion engine and the mass production of petroleum more than 100 years ago, Tesla has emerged as a shining light in the second age of alternative-powered personal vehicles.

The driver for Tesla’s collaborative approach and decision to share its patents, usually the ‘gold’ for any innovative organisation, is obvious. In order to grow the market there is a mutual interest in sharing. This behaviour is successful not only for Tesla but for the industry as a whole.

The learning industry could take some lessons from Tesla.

Going Forward

In the global industry and profession of talent development there are many opportunities to adopt and exploit collaboration as a fundamental tenet of operation. In fact learning professionals owe it to the profession to build practices and platforms to not only help others exploit the benefits of collaboration, but also to collaborate themselves.

Without the adoption of collaborative mindsets, learning and development professionals and the entire industry that supports talent development will find themselves foundering and failing to join everyone else in the Collaborative Age.


1 Collaboration and co-operation are two distinct behaviours. My colleague Harold Jarche has written about the distinctions.

2 I have written about the three categories of the find-access model and how they align with the memorisation : familiarisation : on-demand model developed by Ted Gannon at Panviva in a past post.


Monday 7 July 2014

Nothing Has Changed. Everything Has Changed.

A Revolution or a Slow Demise?
I’ve recently read Clark Quinn’s excellent new book, ‘Revolutionize Learning & Development’. Clark always provides a thoughtful and enlightening perspective. There are some observations and suggestions here that get to the heart of the issue: our approaches to building capability through learning need a radical rethink.
Despite the book’s title (or maybe that’s the point of it) Clark’s focus is not so much on learning and development as on performance. It’s about the output rather than process. Learning as simply a means to improve performance.
I agree totally with this approach.
Learning itself may be a noble aim but it’s not an end in itself in the context of work. Professionals in any field are usually recognised (and paid) for results. Whether you’re working in an HR or L&D department, for a commercial company providing learning tools and services, or for a college, university or business school, results matter. It’s no different from any other profession. A lack of demonstrable results leads to consequences such as limited funding, closure or contract cancellation.
We know that the results of learning and development activities can only be determined by changes in behaviour (after all, at its heart that’s what ‘real learning’ is) and behaviour change needs to be measured in terms of what individuals, teams and organisations can do and are doing, that they couldn’t do previously, or what they’re doing better than before. ‘Knowing’ does not prove ‘learning’.
This point seems still to be lost on many HR and learning professionals.
Quinn suggests that a focus on performance augmentation should be at the core of all learning and development activity. He describes a range of methods to achieve the transformation from learning to performance. This book is very helpful in that way and builds on the work of others such as Gloria Gery, Mary Broad and Harold Stolovitch.
If learning professionals and learning departments don’t adapt and change, Quinn argues, they will be revealed not simply as having no clothes, but as being so out of step that they will wither and die, or be removed from the value chain.
The evidence seems to strongly support Quinn’s arguments.
One piece of aligning data comes from the Corporate Executive Board’s 2012 study ‘Building High Performance Capability for the New Work Environment’. This study sampled 35,000 managers and employees across the globe. Part of its focus was examining the links between stakeholder expectations, current practices and results delivered by L&D, and steps toward adapting to the needs of the ‘new work environment’.
Characteristics of this new work environment include:
  • increased demands on the amount of work done with co-workers in different locations
  • increased numbers of individuals involved in making decisions
  • increased need to navigate complex work processes
  • increased requirement for employees to use analytical competencies to process data and make decisions
  • increased importance of ‘network performance’ over individual task performance
  • increase in need for ‘network performance’ capabilities such as:
    • teamwork
    • self- and organisational awareness
    • process design
    • creativity
    • systems thinking
The startling findings of this CEB study were that in order to achieve breakthrough employee performance and achieve their short-term business goals (goals for the next 12 months) stakeholders reported they required an average improvement of around 20-25%.
Business leaders stated they required a 20% improvement in employee performance to achieve their goals, managers needed to see the performance of their teams rise by 22%, and HR leaders believed that their workforces needed to improve by 25% to achieve business goals.
At the same time the study found that simply improving existing practices in the delivery of classroom training will yield only limited gains.
The major reason for this is that by-and-large classroom training techniques have become more efficient over the years (we’ve been at it for a long time) and only marginal further improvements are possible.
Fig: From Corporate Executive Board’s ‘Building High Performance Capability for the New Work Environment’ study, October 2012.
Data from CLC Learning and Development High Performance Survey: CLC Training Effectiveness Dashboard. Used with permission.
The simple conclusion CEB draws from this study is that ‘current course and speed won’t get us there’.
However there’s a sting in the tail, too. An earlier piece of Corporate Leadership Council research, L&D Team Capabilities Survey (2011), found that although participants and their managers report high levels of satisfaction on individual learning interventions, their feedback on the performance of the L&D function as a whole was extremely low.
  • “Only 23% of line leaders report satisfaction with the overall effectiveness of the L&D function”
  • “Only 15% of line leaders report the L&D function is effective in influencing their talent strategy”
  • “Only 14% of line leaders would recommend working with L&D to a colleague”
Clearly there is a huge gap between what stakeholders need from their L&D departments and what is currently being delivered.
The imperative is clear, but are L&D leaders and professionals stepping up to address it?
Clark Quinn and I would both argue that there is some good practice and some emerging practice being demonstrated which can close these gaps and transform the learning function into an effective organisational lever. But we need wider fundamental changes if we’re to do so at speed and scale.
So, what are these fundamental changes?
Fundamental Change for L&D – Learning Lessons from Our Changing Concept and Use of Time
I believe we can learn some lessons about where we are and where we need to be as HR and L&D professionals (or as anyone who has responsibility for workforce development and performance) by looking back at how we have adapted our ideas and practices around the concept of time.
It might seem a strange analogy but I think it fits well to the current challenge.
Measuring Time by Observation

For millennia people measured time based on the position of the sun. When the sun was directly overhead it was noon. That was the only way to know, and that was the way time was calculated. The concept of hours and minutes wasn’t needed or used.
This was a bit like the one-trick classroom pony that has been used for many years. Training was seen as the only solution to performance problems. If training didn’t work, then we trained them again – and again.
Sundials and Water Clocks

As technologies such as sundials and water clocks evolved to allow us to measure time more accurately and efficiently, we adopted them.
This was a little like the uptake of online courses and eLearning. We found that we could reach more people using fewer resources, faster. However, we were still using the same mindset and thinking of solutions as being ‘courses and programmes’ – learning by physically or mentally taking people away from the workplace.
Mechanical Clocks

By the Middle Ages sundials and other approaches based on natural ‘tools’ had been replaced by the mechanical clock. Each city had its mechanical clock, which was set by the angle of the sun at noon. However, every city had its own, unique time zone.
The photograph above is of the instrumentation on an external wall of the Guildhall in Winchester, the city where I live. Winchester is one degree 19 minutes west of Greenwich, the point designated as ‘absolute zero’ in terms of longitude. The Earth rotates through 15 degrees of longitude every hour, so one degree 19 minutes corresponds to a time difference of 5 minutes 16 seconds. The ‘real’ solar time at Winchester is therefore 5 minutes 16 seconds behind the time at Greenwich. About 60 miles further west, the people of Bristol worked to a time that was a further few minutes behind Winchester.
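The arithmetic behind local solar time is simple enough to sketch. Here is a minimal Python illustration (the function name is my own, purely for illustration) that converts a longitude offset west of Greenwich into a solar-time lag, using the fact that the Earth rotates through 15 degrees of longitude per hour:

```python
def solar_time_lag_seconds(degrees, minutes=0):
    """Convert a longitude west of Greenwich into a solar-time lag.

    The Earth rotates 15 degrees per hour, so each degree of
    longitude is worth 4 minutes (240 seconds) of clock time.
    """
    total_degrees = degrees + minutes / 60
    return total_degrees * 240  # seconds of lag behind Greenwich

# Winchester: 1 degree 19 minutes west of Greenwich
lag = solar_time_lag_seconds(1, 19)
print(int(lag // 60), "min", int(lag % 60), "s")  # prints: 5 min 16 s
```

Running the same function for Bristol (roughly 2 degrees 36 minutes west) gives a lag of about 10 minutes, which is why Bristolians kept clocks a few minutes behind Winchester, and both behind London.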
This situation was a bit like the industry that has arisen over the past 20 years with multiple LMSs, content creation tools and other learning offerings. We were tied into many bespoke systems – there were some standards and reference models to make things slightly easier, but our efforts were still stuck on point solutions. We produced smarter modules and courses, better learning pathways, or integrated new technologies, but the overall outcome was to make the learning landscape more complex and full of ‘busy work’.
The Disruptive Driver

The most significant disruption to our concept of time came not through new time-keeping inventions but through a totally different invention in a totally different domain – the railway network.
In 1840 the Great Western Railway in England (running from London to Bristol and further west into Devon and Cornwall) applied a standard railway time across its network, based on London Time (or Greenwich Mean Time).
“The key purpose behind introducing railway time was twofold: to overcome the confusion caused by having non-uniform local times in each town and station stop along the expanding railway network and to reduce the incidence of accidents and near misses, which were becoming more frequent as the number of train journeys increased.
The railway companies sometimes faced concerted resistance from local people who refused to adjust their public clocks to bring them into line with London Time. As a consequence two different times would be displayed in the town and in use, with the station clocks and (time) published in train timetables differing by several minutes from that on other clocks. Despite this early reluctance, railway time rapidly became adopted as the default time across the whole of Great Britain, although it took until 1880 for the government to legislate on the establishment of a single Standard Time and a single time zone for the country”. (Wikipedia ‘Railway Time’)
There was undoubtedly opposition to this new disruptive approach to an age old issue. Charles Dickens was one who expressed concerns. In Dombey and Son he wrote "There was even railway time observed in clocks, as if the sun itself had given in." However, despite concerns and opposition, railway time was rapidly adopted across the world. A train crash in New England in the USA in 1853 caused by the guards having different times on their watches was just one of many accidents and mishaps that made it obvious that a new order was needed.
L&D Disruptions
Our approaches to learning are being confronted by not just one external disruptive driver, but by many.
  • Change is occurring at an increasingly rapid rate. Taking people away from the workplace to train them in order to keep up and do their jobs better is becoming a less viable option by the day.
  • Increasing complexity and reliance on tacit knowledge means that ‘extracting’ knowledge and codifying it into modules and courses is not only becoming more difficult, but often slows speed to capability.
  • Daily pressures make it less and less likely that people can take time away from their work to attend a course. Although it is important to have time for reflective practice and sharing with others, most structured training and development courses are built around content, not conversations, sharing and reflection.
  • Evidence suggests that most mistakes are due to errors of ineptitude (mistakes we make because we don’t make proper use of what we know) rather than errors of ignorance (mistakes we make because we don’t know enough). Yet most approaches to learning assume the opposite is the case.
  • The rise and rise of social media and technologies has opened up opportunities for people to access help and support from across wider networks in almost real time. We don’t need to ‘know’ the minutiae if we know where to find it, or who can help us. This challenges many assumptions of our current learning approaches.
There are many other disruptors confronting the existing models and practices used by L&D departments and their learning providers. The point is that, like the impact of the railway on timekeeping, there is an urgent need to adapt. The old must give way to the new. Carrying on regardless is not an option, yet many are still doing just that, or focusing only on incremental changes. Everything has changed, and failing to change in response is simply not an option.
On a positive note there are many things that L&D departments and their organisations can do to change and adapt, but they must move fast. Adopting Dr Quinn’s performance augmentation mindset, and a range of practices that will support it, is a very good start.

Thursday 1 May 2014

What Does the Training Department Do When Training Doesn’t Work?

The global training industry is large and in growth again post-2008. Data provided by the US membership organisation Training Industry suggests annual growth of around 6% per year since 2009.
Training Industry estimates the 2012 figures for the training market at $131billion in the USA and $160billion for the rest of the world – a global total of $291billion.
ASTD data suggests US companies spent $164billion on employee learning and development in 2012 saying that ‘despite a continuously changing economic environment, organizations remain committed to training and development’.
Although the Training Industry and ASTD figures vary a little, it is clear that the global training industry is a large and apparently healthy one. The annual global market of around US$300billion is equivalent to the total GDP of countries such as Denmark and Chile.
It’s not just about the money
So how well is all this money being spent?
It’s almost impossible to give an accurate answer to that question, other than to say that, as with all systems, we can be sure there’s room for improvement.
One of the challenges about how well the money is being spent is that the ‘holy grail’ for many training and learning departments - identifying the financial value of the investment in training and other structured learning and development activity (the ROI) – is rarely achieved and, many would argue, is a flawed measure anyway. We know that there’s a lot more to ‘value’ than a tangible monetary figure (try getting someone to swap their late mother’s wedding ring valued at $1,000 for a new one valued at $3,000 and you’ll get the picture).
In the ROI world the perfect solution consists of identifying a clear causal chain that links training input to the ‘hard numbers’ of output while isolating other factors. Unfortunately, life’s not always that easy. Sometimes isolating ROI is straightforward; sometimes it’s impossible. I have seen some very impressive and valid ROI results, and I have seen many other ROI projects end up in the ‘too hard’ basket.
Supporting our workforces to get better and achieve their potential is not just about the money. In the words of a famous football manager, “it’s much, much more important than that”.
Of course efficiency is important in any training activity – both in financial terms and in terms of achieving results at the speed of business. And it’s not only efficiencies within the training process that we need to look for. Sometimes it’s more efficient and more effective simply not to train at all.
We see examples of opportunities for improving performance by not training all around us. There’s an article I wrote some time ago which addresses that particular issue in detail here.
Training is not a panacea
Most learning and training professionals understand that training is not a universal panacea. Sometimes training works, sometimes it doesn’t. Occasionally the outcomes are the exact opposite of the intentions (see my article on compliance and diversity training here for examples of this).
So, what happens when it’s clear that training is not an effective or efficient solution for a specific problem?
In other words, what does the training department do when training doesn't work?
The training department response
There are a number of options available (other than shutting down altogether).
  • it can change its name
  • it can change its practices
  • it can change its skillset
  • it can change its mindset
Changing names
Name changes alone may feel good, but they usually do little or nothing in terms of real change.
Sometimes a name change does have the effect of changing perceptions. But changing perceptions alone has little effect on improving organisational performance. We have all seen ‘training departments’ become ‘learning and development’ departments, and trainers rebranded as L&D consultants or with some similar – often more exotic – title. There are some tremendous new and exciting ones that I am sure many in the profession could share.
That’s not to say that changing names should be avoided. When other changes are implemented, a change in name for the training department and a change of titles for the team is probably essential. But this needs to be part of a wider change process.
Changing practices
Changing practices, or ‘the way we do things around here’, is both a sensible and increasingly common response to the realisation that training is, to an extent, a one-trick pony that can’t cover the wide spectrum of needs in today’s fast-moving and demanding work environment.
Training’s inherent inertia often creates more problems than it solves. Training takes time to plan, prepare and deliver. Usually the best solutions to performance problems are faster and more cost-effective than training. Often the root cause of a performance problem is something that training simply can’t address: only a minority of instances of under-performance are due to a lack of knowledge or skill. Most are due to motivational or external environmental factors.
Equally, building high-performing teams and organisations is overwhelmingly a matter of exploiting opportunities for experiential learning and practice.
Experience and practice, together with coaching, exploiting networks, and sharing challenges and potential solutions with colleagues, provide the answer. Content-centric, away-from-work training approaches are, on the other hand, overwhelmingly ineffective in fast-moving, knowledge-based environments where people need to ‘know now’ and ‘know how’ almost instantly in order to perform. They are usually far more costly, too.
Learning in the workflow is the way forward. This is where I have seen the 70:20:10 framework, when used strategically, help organisations adopt new and more effective practices.
Changing skillsets
The dominant model of physical content design and delivery that has been in use over the past 100 years is proving increasingly less effective and more costly when compared with the alternatives. The problem is not only that the alternative options have expanded; the ground has moved, too. Speed and change dominate. The need to build capability well, fast and flexibly has become critical.
Meta-learning has become more important than learning facts and figures. The detail changes almost daily, so there is little use in learning it well in advance of needing to ‘do’. Ubiquitous access has done away with the need to know the detail unless it’s used on a daily basis. If it is used daily, we remember it through exposure and practice. If it isn’t, we can find it when we need to.
Training departments need to adopt new approaches, and with these new approaches comes the need for new skillsets.
Traditional training skillsets still have their place (particularly in what I’d call the ‘10’ – the structured learning part of the 70:20:10 model). But the need for skills in the design and delivery of face-to-face training events is without doubt in decline. There will always be a need for highly skilled trainers; it’s just that the world probably doesn’t need an increasing number of them. Quite the opposite.
So training departments need to help their people build new skillsets, or need to acquire new people with new skillsets – often both.
These new skillsets will be many and varied. Jane Hart, in her Social Learning Handbook 2014 explains:
‘New learning practices involve understanding it is not just about delivering courses but about helping people to make the most of how they learn naturally and continuously as they do their jobs – in the flow of work – in project or work teams.
It’s not just about internal experts telling people what they should know or do – but about peers sharing their thoughts and experiences, and in doing so learning just as much from one another.’
Identifying and exploiting workplace learning opportunities requires skills that are different from those traditionally needed by trainers.
Training needs analysis skills need to be replaced with performance consulting skills. New skills for understanding and using the Social Web, and for helping workers develop their personal knowledge management capabilities, are required. Skills in applying scaffolding theory and scaffolding learning experiences are essential. (Scaffolding is a concept introduced in the 1970s by Jerome Bruner, one of the greatest educational psychologists of our era and still, at the age of 98, a senior research fellow at New York University.)
These are just a few in the new armoury for training and learning professionals. There are many other items in this ‘new skillset’.
Changing mindsets
More than all the other changes, the most important action a training department can take when training doesn’t work is to work on changing mindsets.

Mindset: noun \ˈmīn(d)-ˌset\
a particular way of thinking : an attitude or set of opinions. An inclination or a habit…a way of life
To be successful and effective, and to be able to adapt to the emerging fast-moving, knowledge-based world, training and learning professionals need to cultivate a development mindset. This is needed above everything else.
A development mindset is one that understands that learning and development is a continuous process, and that learning and working are not separate activities but simply aspects of the same thing – doing good work and improving continuously.
A development mindset is one that sees opportunities for learning and development in everyday work activities and has the capacity to exploit them.
A development mindset is one that doesn’t need to assume the role of ‘expert’ to help others grab development opportunities as they emerge.
A development mindset is one that understands the power of learning through real-work experiences and practice rather than feeling the need to create ‘training systems’ and simulations for learning.
A development mindset is one that helps, supports, guides, advises, connects and reinforces rather than teaches or instructs.
A development mindset helps fishers fish.
That’s what the training department needs to do when training doesn’t work. It’s easy, isn’t it?