BSEE selects Radiant Digital as a training development vendor

Radiant Digital (Radiant) was awarded the NOTC Training and Learning Curriculum Development Blanket Purchase Agreement (BPA) for the Department of the Interior’s Bureau of Safety and Environmental Enforcement (BSEE).

BSEE selected Radiant as a training development vendor to enhance the capabilities of its compliance professionals. BSEE sought a firm experienced in developing curriculum for inspection, maintenance, and operations personnel at facilities where the consequences of unplanned incidents can severely impact the health and safety of workers, the well-being of surrounding communities, and the environment. It also required a firm with the experience to bring contemporary approaches to its learning programs, including mobile and extended reality (XR) applications.

“We’re excited to have been selected by BSEE to help them improve and expand the learning opportunities for their employees,” said Erik Fullerton, Radiant’s Director, Learning and Organizational Change. “BSEE’s mission is a critical one, and Radiant is proud to play a part in helping them maintain the safety of workers and protect our coastal environment.”

Radiant’s extensive experience designing learning interventions for companies in safety-critical industrial settings makes it a perfect partner for BSEE as the bureau seeks to maintain the highest level of safety at US offshore energy exploration and production facilities. Radiant’s breadth of services, such as user experience design, application design and development, and multimedia design, forms an excellent complement to the instructional systems design from which future products will spring.

“Radiant has been helping oil and gas firms train their employees to inspect, maintain, and operate their facilities for over a decade,” Fullerton continued, “and we’re looking forward to bringing that experience to bear on BSEE’s unique challenges.”

About Radiant

Radiant is a value-driven organization focused on outcomes, delivering advanced and innovative solutions across the digital enterprise. Radiant provides digital transformation services and digital experiences that help its clients become more operationally efficient, more competitive through insight into their customers, and better equipped with a motivated and knowledgeable workforce. To learn more, please visit Radiant’s website or LinkedIn.

About BSEE

The Bureau of Safety and Environmental Enforcement (BSEE), located within the Department of the Interior (DOI), has a requirement to produce training and multimedia products for its National Offshore Training Center (NOTC) program. BSEE’s mandate is the regulation, monitoring, and inspection of energy production operations on the Outer Continental Shelf (OCS). BSEE’s compliance personnel currently number approximately 400 professionals. To learn more, please visit BSEE’s website or LinkedIn.


[Webinar] Learning via Immersive Media: Insights from Flight & Aircraft Maintenance Training

https://youtu.be/s2CP5BNub6Y

Radiant presents:

"Learning via Immersive Media: Insights from Flight and Aircraft Maintenance Training"

Interest in immersive learning using AR/VR continues to grow as the cost of implementing the technology falls. But is there value in implementing immersive learning for your technical training? Hear some of the lessons learned from pilot training, where immersive learning has been used for decades, and learn about the importance of analysis when designing simulations and how simulation fidelity is matched to learner level.

Special guest speakers:


Helpful Frameworks for Redefining Your Learning

Instructional designers all have the same essential mandate to design educational interventions that result in closing performance gaps. For decades now, technology has been used to redefine the way we train. The recent pandemic has accelerated the application of technology to transform traditional classroom training into remotely delivered, blended experiences.

As IDs and facilitators, we’re under tremendous pressure to curate engaging learning experiences and achieve effective, efficient knowledge transfer. That was a tall order even when in-person training was one of the arrows in the quiver; the difficulty is greatly compounded in fully remote offerings, and more so still when the offering is both fully remote and asynchronous.

Using technology to augment or redesign learning is an opportunity to improve the learner’s experience, but frequently it produces the opposite outcome. Improving upon existing methods is difficult; we tend to stick with what’s worked in the past. This is true of in-person or remote, synchronous or asynchronous learning. We can, however, look to another group of educators for help in framing and vetting our proposed designs.

The two most popular frameworks for planning technological integration in K-12 learning contexts are the SAMR and TPACK models. While these models are frequently associated with K-12 education, they offer the ID working with adults in professional contexts a valuable tool for framing and vetting design decisions before investing in them.

SAMR visualizes a continuum of technological transformation. The first two levels, substitution and augmentation (together called enhancement), are the typical ways we use technology to change how we transfer knowledge, while the second two, modification and redefinition, seek to transform how we enable learners to reach objectives.

SAMR provides us with a handy way to think about our lesson planning and helps us set quantifiable goals, especially when updating existing content. SAMR has an oft-cited inherent pitfall: because it is a stepped framework, it may give users a false sense that the goal should always be redefinition. Not every task is best served by redefinition. We may do a disservice to learners if we over-transform functions that can be more efficiently accomplished with a traditional or low-tech solution. This is where using TPACK in conjunction with SAMR can provide us with a complete way to think about our overall offering and its objectives.

TPACK reminds us that the goal is not introducing technology itself but an improved learning outcome. It posits that three dimensions of knowledge must come together to create an effective implementation of technology: pedagogical, content, and technological expertise. In other words, technology by itself doesn’t necessarily equate to better outcomes.

  • Content knowledge: do we have the requisite knowledge to transfer it to the learners effectively?
  • Technology knowledge: do we know how to implement the technology in question properly?
  • Pedagogical knowledge: do we know the best methods to teach the content?

The goal should be the convergence of these three areas, technological pedagogical content knowledge (TPACK). Having only one or two of these areas taken together as the basis for a given objective is likely to result in a suboptimal solution.

For example, suppose you have equipment maintenance training traditionally done with a representative piece of equipment in a classroom setting, but it is no longer feasible for logistical reasons to hold the training in a central location. You decide to transform the exercises by using an AR solution so that the training can travel. You have the content knowledge (through your SMEs) to commission the models. You also have the pedagogical knowledge and, by using cognitive task analysis, have designed a relevant and effective progression of objectives supported by reciprocal teaching. But your facilitators rotate, and they don’t have adequate support at the remote training sites, resulting in significant downtime and a poor overall learner experience. Your solution fails to account for the lack of technological knowledge and therefore fails to reach TPACK convergence.

By framing potential avenues for transformation using SAMR and vetting options using TPACK, we can identify where the best bang for our development buck might lie. Both models have drawbacks, not the least of which is the lack of quantitative study tying them to the larger body of educational research. Despite this, they can provide a helpful framework when planning course enhancements or redesigns. However, their use must be coupled with evaluation criteria that define the enhancement’s success or failure.

Looking for ways to reframe our development choices is an excellent way to check assumptions and avoid mistakes. At Radiant Digital, we look for new ways to reframe our thinking about professional training to efficiently help our clients apply their training dollars. Radiant specializes in custom content creation for enterprises with learning needs unique to their organizations. We can help you design, develop, and implement your learning solutions from cognitive task analysis to learning application development and integration.

Koehler, M. J., & Mishra, P. (2009). What is technological pedagogical content knowledge? Contemporary Issues in Technology and Teacher Education, 9(1), 60-70.

Puentedura, R. R. (2013, May 29). SAMR: Moving from enhancement to transformation [Web log post]. Retrieved from http://www.hippasus.com/rrpweblog/archives/000095.html


Applying Data Analytics to Improve Training Outcomes

Introduction

Corporations spend billions of dollars on training each year ($370B globally in 2019). Given this level of investment, it is surprising how little of the expenditure goes toward examining the efficacy of those training efforts.

For over 40 years, the Kirkpatrick Model, named for its creator Dr. Donald Kirkpatrick, has provided the most extensively used training evaluation guidance. The original model had four levels, but many researchers refined it in the intervening years. Now the model is often shown with a fifth level.

Model for Measuring Training Effectiveness

The following are the levels in the Kirkpatrick Model that evaluate the outcome of training programs.

L1 – Reaction

Learners’ reaction to the learning intervention (training). Questions are subjective, e.g., do you feel that the training was beneficial?

L2 – Learning

An evaluation of the knowledge transfer achieved by the learning intervention. Questions are objective, e.g., put the steps of this work procedure in their correct order.

L3 – Behavioral Change

An evaluation of whether learners apply the desired behavioral change as part of their job function.

L4 – Business Results

An evaluation of whether the targeted behavioral changes are translating into performance improvements.

L5 – Training ROI

A ratio derived by comparing the cost of training development and administration to the financial benefits derived from the behavioral change.
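The Level 5 ratio can be sketched in a few lines of Python. The figures below are purely illustrative; in practice, isolating the monetary benefit attributable to the training is the hard part.

```python
def training_roi(program_cost, monetary_benefits):
    """Training ROI as a percentage:
    (net benefits / program cost) * 100.

    program_cost: total development and administration cost.
    monetary_benefits: measured financial benefit of the
    behavioral change (an estimate agreed with stakeholders).
    """
    net_benefits = monetary_benefits - program_cost
    return net_benefits / program_cost * 100

# A program costing $50k that yields $120k in measured benefits:
# training_roi(50_000, 120_000) -> 140.0 (%)
```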

Barriers to Implementing Evaluation Programs

Many organizations find themselves unwilling to follow up their training dollars with additional evaluation expenditures. However, this is both counterproductive and counterintuitive. Only by gathering and analyzing appropriate evaluative data can any organization hope to produce and iterate effective learning programs. This maxim becomes truer the larger an organization grows: small behavioral changes at scale can lead to millions in benefits over the life of a learning program. Why, then, are organizations (even large multinationals) so hesitant to employ sound evaluation practices as part of their standard operating procedures? We believe it’s because most training departments find the prospect overwhelming and lack the experience to justify the effort to broader stakeholders. The questions at the outset of any evaluation effort may seem simple but can be daunting to organizations without that experience.

  • Where do we start?
  • Which data do we gather?
  • How do we gather them?
  • How do we analyze them?

Our Approach

The easiest way to ensure your evaluations are providing the requisite data to make decisions is to think about the data at the outset of your initiative. If possible, this should be the first step of program design, just after the gap analysis but before you begin delineating learning objectives.


Figure 1: Continuous Improvement model for Training effectiveness using Analytics

Phase 1 – Data Identification

If you’ve performed a gap analysis, you will have identified improvement areas, even if relatively informal. It is at this stage that you should identify the evaluative data that you will gather. For example, if the gap was related to accidents on the job, the key performance indicator (KPI) that must be measured is the change in the number of accidents over a given timeframe. A learning program may have many such data points and associated underlying supporting data that must be gathered to make informed decisions on iteration, expansion, or cancellation of the program.

This effort is often skipped, but it should take place even if you never intend to evaluate beyond Level 2. The reason is related to one of the most fundamental premises of learning design: the purpose of learning is behavioral change. If you don’t know which metrics you want to affect, you can’t craft an informed behavioral change strategy, and consequently you cannot create efficient learning interventions.

Phase 2 – Learning Design, Development, and Deployment

When armed with clear targets for the metrics to be gathered, learning design becomes much more straightforward. Instructional designers work with subject matter experts to develop an approach that elicits the behavioral changes likely to affect the metrics identified in Phase 1. Only that knowledge directly tied to the identified behaviors through learning objectives should be part of the design; anything not related is extraneous and should be jettisoned.

Phase 3 – Gathering Data in the Field

Implementing all levels of the Kirkpatrick Model can be an expensive and time-consuming process. However, it is unnecessary to measure everything. We follow industry experts such as Leslie Allan, who suggest applying the levels only as appropriate. Our synthesis of this guidance:

  • Level 1 (Reaction) for all programs
  • Level 2 (Learning) for “hard-skills” programs
  • Level 3 (Behavior) for strategic programs
  • Level 4 (Results) for enterprise-wide programs or programs affecting tasks with high-cost impacts
  • Level 5 (ROI) for enterprise-wide programs or programs affecting tasks with high-cost impacts

Gathering Level 1 and 2 data is typically enabled by a learning management system and is relatively straightforward.

Level 3 may involve leveraging existing reporting avenues, or it may require new technology to be put in place to gather the needed data. For example, are workers performing every step in a given work task each time it is performed? There may already be technology to measure this in an automated fashion, or it may require self-reporting, supervisor observation, or a combination of all three.

Level 4 will ultimately require you to gather the Level 3 behavioral data and the data related to the KPI(s) that you identified in Phase 1.

At this stage, the key to success is collecting data from multiple sources, such as (1) the learning management system, (2) service management systems such as ServiceNow, (3) navigation data from UI analytics tools, and (4) post-training surveys. Though the data may look disjointed and discrete, aggregating and ingesting it so that data scientists and analysts can draw insights requires specialized knowledge.
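As a minimal sketch of this aggregation step, the snippet below joins per-learner records from two hypothetical extracts into a single row per learner. The field names and sample values are illustrative, not any product’s real schema.

```python
from collections import defaultdict

# Hypothetical extracts: one from an LMS, one from a post-training survey.
lms_scores = [{"learner": "a01", "quiz": 88}, {"learner": "a02", "quiz": 73}]
survey = [{"learner": "a01", "reaction": 5}, {"learner": "a02", "reaction": 3}]

def aggregate(*sources):
    """Normalize per-system records into one merged row per learner,
    keyed on a shared learner identifier, so analysts can work from
    a single joined view."""
    rows = defaultdict(dict)
    for source in sources:
        for record in source:
            rows[record["learner"]].update(record)
    return dict(rows)

# aggregate(lms_scores, survey)["a01"]
# -> {"learner": "a01", "quiz": 88, "reaction": 5}
```

Real pipelines add schema validation, de-duplication, and persistence, but the core task is the same: reconcile records from disjoint systems onto a common key before analysis.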

Phase 4 – Developing Insights

The Level 1–3 data gathered in the previous phase include the raw figures, responses, feedback, and other logistical information obtained directly from the digital platforms involved. This data can be overwhelming and may not make sense by itself. It must be normalized and fed to analytics platforms to yield insights, and any insights gained should be compared with the program’s objectives and goals. This is where the specialized skills of data management, data science, and data analytics are necessary to aggregate, persist, curate, and manage the data. Achieving the desired goals requires discipline and a commitment to constantly collecting and processing the information in a non-intrusive fashion.

Level 4 calls for a more rigorous analysis strategy because one must determine whether the identified behavioral changes positively affect the bottom line. You could have a highly successful training program from a behavioral change standpoint that nevertheless fails to close the performance gap. Such a result means that you failed to associate the correct behaviors with your identified KPI(s) and that the program needs modification.

Both Levels 4 and 5 require vetting from a wider stakeholder group with the expertise to reliably agree on the relationships between KPI(s), costs, and supporting behaviors. The effort and time involved make these levels only reasonable for large, high-impact programs.

 

References 

Phillips, J. J., & Stone, R. D. (2000). How to measure training results: A practical guide to tracking the six key indicators. New York: McGraw Hill.

Mazareanu, E. (2020, May 04). Global workplace training: Market size 2007-2019. Retrieved July 20, 2020, from https://www.statista.com/statistics/738399/size-of-the-global-workplace-training-market/


Flattening the Forgetting Curve for Learners and Reducing Information Load

Research suggests that people forget roughly 50% of newly learned information within a day. After completing an online course, even the brightest minds cannot remember every piece of information presented.

We tend to forget information over time if we do not apply it to our daily activities. The Ebbinghaus Forgetting Curve commonly represents the effect.

Ebbinghaus Forgetting Curve

Image Source: Cursim.com 

In the 1880s, a German psychologist named Hermann Ebbinghaus tested his memory over various periods. He gathered all the data from his studies and plotted it on a graph similar to the one above.

This work gave birth to the Ebbinghaus Forgetting Curve, a mathematical description of the rate at which memorized material is forgotten. Ebbinghaus first generated lists of random three-letter non-words (the fact that they were non-words was essential: with no pattern or prior association to lean on, his memory was genuinely challenged). He then memorized these non-words until he could recall all of them correctly.

He then tested himself at intervals ranging from under an hour to 31 days to see how many of the non-words he still remembered.

Ebbinghaus discovered that, over time, he would forget a significant number of these non-words. Forgetting began as soon as he stopped referencing the list; it progressed rapidly at first and then tapered off over the following days.
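The shape of that curve is often modeled with a simple exponential, R = e^(-t/S), where R is the fraction retained after time t and S is a memory-strength constant. The sketch below uses that form; the stability value of 24 hours is an illustrative assumption, not a measured figure.

```python
import math

def retention(hours_elapsed, stability=24.0):
    """Approximate fraction of material retained after
    hours_elapsed without review, using the exponential form
    R = e^(-t/S) commonly used to model the Ebbinghaus curve.
    'stability' (S) is a hypothetical memory-strength constant:
    the larger S is, the slower the forgetting."""
    return math.exp(-hours_elapsed / stability)

for label, hours in [("20 minutes", 1 / 3), ("1 day", 24),
                     ("6 days", 144), ("31 days", 744)]:
    print(f"{label:>10}: {retention(hours):.0%} retained")
```

Reviewing the material effectively resets the curve with a larger S, which is the mechanism that spaced repetition (below) exploits.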

The forgetting curve is tricky to tackle, especially when remembering different formats of information shared in corporate training programs. Not all training information is engineered explicitly to battle this phenomenon. Thus, understanding the forgetting curve and its influences helps instructors create more effective training content while devising ways to aid in employee knowledge retention.

When we absorb content, we are actively making relational associations with other information that we already know. This knowledge or “cognitive economy” will come from our long-term memory rather than our short-term or working memory.

Implications of the Forgetting Curve

The general scientific consensus states that there are two primary types of memory.

  1. Working Memory – Holds simple information in the short term. It is characterized by:
    • Limited capacity – Learners can hold only around five to nine items at a time (the classic “seven, plus or minus two”)
    • Limited duration – The information absorbed is easily forgotten with time or due to distractions
  2. Long-term Memory – Refers to information storage retrieved over an extended period. Repeated recall is vital to moving information into long-term memory, because information that is frequently accessed becomes easier to recall over time. Any learning designer wants their content to reside in learners’ long-term memory, which is the essential mechanism for flattening the Forgetting Curve.

Typically, the higher the instruction's cognitive load, the steeper the forgetting curve will be as learners struggle to build the schemas necessary for retention. Cognitive load is determined by the number of working memory resources a learner must use to ingest and retain information.

Reducing cognitive load is critical to designing effective instruction and can be accomplished in several ways. Chunking, retrieval practice, multiple delivery media, and spaced repetition are relatively straightforward methods that you can use to reduce cognitive load and flatten the forgetting curve for your learners. These techniques can be applied whether you're building synchronous or asynchronous instructions and with or without self-paced elements.

Chunking

Chunking refers to splitting or grouping complex, tangentially related information into cognitive chunks that form smaller, more cohesive groups. A three-day (21 seat-hour) instructor-led training course may comprise three basic groupings (one for each day) of loosely related information, but creating 10-12 chunks of highly associated information will help learner retention.

Another example of chunking is producing microlearning modules to reinforce the valuable information learned in training sessions. Each chunk of standalone learning content can last about 10 minutes, enabling learners to meet one learning objective at a time comprehensively.
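As a toy illustration of the sizing involved, the helper below splits a flat list of topics into small groups. Real chunking groups by relatedness, not position, so treat this as a sketch of chunk size only; the topic names are invented.

```python
def chunk(items, size=4):
    """Split a flat list of topics into groups of at most `size`.
    Three to five items per chunk is a reasonable default given
    working-memory limits; real course design would also group
    topics by how closely related they are."""
    return [items[i:i + size] for i in range(0, len(items), size)]

# Hypothetical topic list for a maintenance course:
topics = ["lockout basics", "permits", "isolation points", "verification",
          "re-energizing", "documentation", "audits", "exceptions"]
# chunk(topics) -> two chunks of four topics each
```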

Spaced Repetition

Spaced repetition reinforces learning at periodic intervals. Implementing it involves increasing the time between repetitions and the complexity of the information presented; this trains the brain for better recall. As time passes and the space between repetitions grows, the mind becomes better equipped to retain and recall concepts. Traditionally implemented with flashcards (there are myriad flashcard software packages), modern approaches also include mixed media and short quizzes pushed to the learner from a central learning function.

Image Source: Cursim.com 
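The expanding-interval idea can be sketched as a simple scheduler. The starting interval and growth factor below are illustrative defaults, not a validated algorithm like those in dedicated flashcard tools.

```python
from datetime import date, timedelta

def review_schedule(start, reviews=5, first_interval_days=1.0, factor=2.0):
    """Sketch of an expanding-interval schedule: each review is
    spaced roughly `factor` times as far out as the last
    (1, 2, 4, 8, ... days with the defaults)."""
    schedule, interval, due = [], first_interval_days, start
    for _ in range(reviews):
        due = due + timedelta(days=round(interval))
        schedule.append(due)
        interval *= factor
    return schedule

# review_schedule(date(2021, 1, 1)) -> reviews 1, 3, 7, 15, and 31
# days after the initial session
```

A central learning function could push a quiz or microlearning module on each scheduled date, tightening or widening the intervals based on learner performance.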

Retrieval Practice

Retrieval practice is a technique that elementary-school teachers have been using for centuries: instead of merely reviewing a chunk of information, learners test their recall of it in a ‘free’ environment. A low-pressure environment, where making mistakes does not lead to consequences, allows the learner to take chances. Immediate feedback, with an explanation of the correct answer, is essential for recall and understanding.

Unlike spaced repetition, it does not require increased intervals and complexity. You can effectively implement the Retrieval Practice Method within a given training session or provide it through intersession activities. By merely testing knowledge, you are helping the learner to retain it.

Blending it up

Creating blended learning is another way to help retention. By providing different avenues for acquiring related chunks of information, you give individual learners the ability to find their own ‘a-ha moments.’ At the basic level, you’ll be disseminating chunks through various media: videos, podcasts, infographics, and myriad other possibilities. As you progress along the blended continuum, however, you can achieve more effective knowledge transfer. Structuring an exercise to accomplish a task as part of software training can certainly be useful, but a blended approach could instead assign learners a group project to analyze the software’s usability for a given set of tasks and present potential improvements.

Wrapping up

By making it easier for your learners to absorb and retain content, you will more readily achieve the ultimate goal of all training: behavioral change. At Radiant, assisting clients with workforce transformation is one of our core missions; let us help you achieve your next learning initiative’s goals.


[Webinar] Transforming ILT to Blended Learning

https://vimeo.com/413988450

Radiant shares best practices for transforming ILT to Blended Learning in this 30-minute webinar as more companies turn to distance training for a remote workforce.


Making the Leap to Virtual Events

The shift to conducting day-to-day business virtually rather than in person has been underway for many years. The recent global health crisis has only magnified the value of effective communication with your teams, students, or potential customers via an online connection. While face-to-face meetings provide a level of engagement and interaction that is difficult to match, the cost savings and flexibility of virtual gatherings make them a compelling proposition for a wide range of industries.

Image Source: Business2Community

More so than traditional events, the success of a virtual event depends on the presenter’s preparation, enthusiasm, and participant engagement design. However, it doesn’t matter how well designed an event is if nobody can see it. Transitioning to mostly online events can be intimidating; there are myriad approaches and tools that you can utilize.

So, we’ve compiled some helpful insight to help you get a sense of your choices.

Deciding on an approach and platform primarily depends on:

  1. The size of your event(s)
  2. The frequency of your event(s)
  3. Your budget

Additional considerations include:

  • Will you need to collect payment from participants, and how will that be managed?
  • What sort of data do you need to gather on your attendees?
  • What technologies has your organization already invested in that can be leveraged?
  • What are the capabilities (time/IT knowledge) of the staff that will support these efforts?

Small Scale Events

Frequent, small events (fewer than 30 people), like corporate training sessions and sales meetings, can be produced on a variety of platforms. Many DIY platforms meet this requirement while offering features that make it easy to record, share, and follow up with your attendees.

Image Source: Brafton

Some of the top small-scale webinar tools include Zoom, Adobe Connect, WebEx Training Center, MS Teams, and Join.Me.

Key features in these platforms include:

  • Quick scheduling and app integration with major email clients: select your dates and times and create custom URLs for your webinars in moments
  • User-friendly Interface
  • Intuitive platform with options for automated webinars and customized invites
  • Chat-based instant interactions
  • Password-enabled sessions and private channels
  • Waiting rooms and host permissions
  • HD Streaming
  • Audio and Video Recording function that stores the recorded file in the cloud for easy download later
  • Multi-device, browser, and OS support
  • Group email notification
  • Information pop-ups to the audience during the webinar session
  • Integration with Facebook Live and YouTube
  • Host Control to Mute/Unmute participants
  • Viewer Engagement with dynamic presentations, interactive whiteboards, videos, and live chat.
  • File management on the cloud and secure sharing

Growing enterprises will find these features especially handy for corporate training, sales meetings, and focused team-based project discussions.

Medium Scale Events

These web conferencing tools are perfect if you want to host frequent events, especially automated webinars that transmit pre-recorded content to larger groups.

Some popular platforms include Demio, ClickMeeting, WebinarNinja, Webinars OnAir, and GoToWebinar.

These may include most of the features previously mentioned and other prominent features such as:

  • Analytical insights on past webinars and the audience of the upcoming webinar
  • Source tracking options to check which channel has maximum signups
  • Reporting on webinar performance
  • Password protection
  • Subaccounts creation
  • Closed Captioning with text captions for hard-of-hearing audiences or people using devices on mute.
  • Advanced content editing
  • Scaling-up with less time lag
  • Just-in-Time Webinars (on-the-spot sign up)
  • Custom landing pages
  • Customized invitations, branded webinars, and automated callback
  • Email automation and drip campaigns
  • Multiple presenter support
  • Audience poll and surveys for feedback
  • Data Export

You can either opt for a basic plan that comes with a host of fundamental features or a paid plan based on your budget that would include advanced features like automation, reporting, analytics, custom marketing, and more.

Large Scale Events

Companies involved in intensive and proactive marketing activities or responsible for facilitating conferences composed of 100+ participants (especially those that must show ROI based on attendee metrics) may find value in opting for robust turn-key event solution providers.

A reliable broadcast distribution technology is the backbone of these complete event management tools; turn-key virtual event platform providers include BigMarker, Cvent, Attendify, and Aventri.

Some of the top-notch features that make global-scale conferences exceptional include:

  • Reporting
  • Conference registration
  • Advanced configuration
  • Event-mailing lists linking
  • Budget management & event payment processing
  • Survey templates
  • Mobile-friendly registrations
  • Integrations with major Global Distribution Systems (GDS)
  • Ticket Management
  • Multi-event support
  • Instant Content Updates
  • App Monetization
  • Unlimited Document Sharing
  • Performance and Web Analytics Dashboard
  • Attendance Management
  • Widget Support

Preparation Tips

Regardless of what you choose, it’s critical to focus on designing interactions with your participants, doubly so in learning scenarios where learner engagement is crucial for retention.

Beyond the need for meticulous design, you can help to ensure things go smoothly by following some of these best practices:

  • Always have a team member dial in to make sure the event link is working. Also, have this person send a test question so you can see how questions appear in the webinar software.
  • Let your audience know how you will be addressing questions while you share the meeting agenda.
  • Use pre-webinar slides and announcements, possibly with a countdown for the meeting to start so that participants know they are in the right meeting room.
  • Talk to participants before the start and assure them the meeting will begin soon; this also confirms that the audio is working. Also, mute all guest lines to ensure there is no background noise.
  • Send slides to participants before the webinar so they know what to expect and can prepare questions.
  • Send out a recording of the meeting within 24 hours after the webinar, and let participants know in advance that you will do so.
  • Decide on a standardized format that everyone can comprehend, with the right branding message and visuals that can be easily viewed on mobile and desktop devices.
  • Time your Q&A sessions so that presenters can answer as many questions as possible.

Some of the popular webinar formats include:

Image Source: Slideshare

If you’re preparing for a virtual event, talk to our experts for support.


Combining Knowledge Management with Employee Engagement

Your most valuable asset is your intellectual capital, i.e., the knowledge of your employees. Unfortunately, when people move on, their knowledge goes with them. There are two different, but not mutually exclusive, strategic approaches for managing tacit and explicit knowledge.

[Image: Burchiello, noted for his paradoxical style and as the founder of a school of writing.]

The externalization of tacit knowledge has been a focus of both human-centric and ICT-centric knowledge management theory for over fifteen years. The whole conception of tacit knowledge in the knowledge management literature has been criticized for being based on an incorrect interpretation of Polanyi’s original theory of knowledge. At the same time, it has been reported that many knowledge management projects related to the externalization of tacit knowledge do not meet their objectives. These findings suggest that there is something wrong with the dominant epistemology of knowledge management theory.

Much of the focus in knowledge management, especially in the early work, has been on implementing knowledge management systems and strategies that make implicit knowledge explicit and therefore possible to store. That perspective takes a static view of knowledge, treating it as an object that can be separated from practice. Much of the early literature also takes a normative perspective, focusing on how to succeed when implementing a knowledge management strategy. However, this literature has been questioned by several researchers who have offered critical reviews and asked whether it is possible to manage knowledge at all while treating it as an object. (1) They argue that knowledge should instead be understood from a constructivist approach that treats knowledge as a social process.

Engagement

[Image: Circus Maximus, Rome. Events and social center.]

The goal of any modern knowledge management effort should be to provide a platform from which knowledge can easily be shared and captured, one that extends each employee’s network of trusted colleagues and allows them to recognize and be recognized. Employees want to engage with their colleagues and be recognized for their work. Addressing employee engagement within your knowledge management framework will lead to better dissemination of organizational knowledge and more highly engaged employees.

  • Personalization – Knowledge management systems designed to help people locate and communicate with each other focus on spreading experience and engaging employees.
  • Codification – Knowledge management systems designed to convert tacit knowledge into explicit knowledge focus on applications that facilitate the storage, transfer, and application of knowledge.
  • Colleagues – The #1 thing employees cite that they like about their jobs is their relationships with colleagues.
  • Trust – The #1 predictor of knowledge sharing is trust between employees.
  • Recognition – Less than 1/3 of employees feel strongly that they are valued for their work.

Benefits of a knowledge management platform with engagement

  • Efficiency – Through the reduction of errors and unnecessary revision
  • Quality – Through the sharing of best practices and new technology knowledge
  • Consistency – By empowering new developers and teams with reliable information
  • Adaptability – As knowledge is captured, capitalized, and not lost to attrition
  • Retention – Through enhancing employee morale
[Image: Archimedes’ revolutionary screw pump.]

Inspiring ideas

When implementing knowledge management with engagement, you have a wide variety of possible elements to choose from that will provide you different benefits.

Engagement elements – Peer recognition helps empower employees and align culture. Recognize your employees and allow them to recognize each other. Clear objectives help break down communication barriers and promote accountability by using specific, measurable, relevant, and time-bound objectives. Gamification taps into the human competitive nature; personal or team goals address employees’ need for recognition and feedback.

Insight – Information extracted from the system that allows more nimble HR decisions. Surveys allow you to effectively gauge the pulse of your organization by finding out what your employees are really thinking. Analytics helps you better understand your culture and how your employees interact.

Knowledge – Elements that facilitate knowledge retrieval and application. Provide a collaboration space that integrates with actual workflows and captures the knowledge that is produced there. A messaging feature provides useful, searchable chat and allows you to mine a wealth of data. Provide a platform for communities to form naturally and capture their knowledge. Assign designated experts or let the community select them. Build trust networks and break down silos. Let users vote up the right answers and best contributors. Designate the official best answer and capture the knowledge.
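The “vote up the right answers” mechanic described above can be sketched in a few lines. This is a minimal illustration only; the answers, authors, and vote counts below are invented example data, not output from any particular product.

```python
# Minimal sketch of community answer voting: rank answers by votes
# and surface the top-voted one as the candidate "best answer".
# All answers and vote counts here are invented illustration data.
answers = [
    {"author": "asha", "text": "Restart the indexing service first.", "votes": 7},
    {"author": "bela", "text": "Clear the cache, then reindex.", "votes": 12},
    {"author": "chen", "text": "Check disk space before anything else.", "votes": 5},
]

# Rank answers by community votes, highest first.
ranked = sorted(answers, key=lambda a: a["votes"], reverse=True)
best = ranked[0]
print(f"Candidate best answer by {best['author']} ({best['votes']} votes)")
```

In practice, a moderator or designated expert would still confirm the top-voted candidate before marking it the official best answer.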

“Knowledge Management should not be seen simply as a set of discrete tasks (storage, transfer, and creation), but as a set of continuous practices (decision making, acting, and negotiating) that support the circulation of knowledge.” — Diedrich & Guzman.

Knowledge management maturity

Where are you, and where do you want to be? Radiant will assess where you are and help you map the path to innovation. Reaching the goal of dynamic knowledge supporting continuous improvement requires sponsorship, buy-in, and a well-crafted strategy.

[Image: APQC Knowledge Management Progression, showing APQC’s levels of knowledge management maturity]

Plan the flow of knowledge

[Image: Agora at Athens. Commercial, assembly, and residents’ gathering place.]

It is a mistake to implement a knowledge management approach such as communities of practice or an expertise location system without first understanding the flow you are trying to enable. The first step in any knowledge management initiative is understanding how you want knowledge to move through the organization. Once you determine how and what knowledge needs to flow (and from and to whom), you can enable the process with knowledge management tools and approaches such as communities and networks, best practice transfer or lessons learned programs, wikis, enterprise social media, and so on.
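One lightweight way to make the intended flow concrete before selecting tools is to sketch it as a directed graph of sources and recipients. The sketch below is purely illustrative; the roles and flows are invented examples to show the shape of the exercise, not a prescription.

```python
from collections import defaultdict

# Hypothetical knowledge flows: (source role, recipient role, what flows).
# These edges are invented examples; map your own roles and topics
# before choosing communities, wikis, or other enabling tools.
flows = [
    ("field engineers", "design team", "lessons learned"),
    ("design team", "field engineers", "best practices"),
    ("retiring experts", "new hires", "tacit know-how"),
    ("support desk", "product team", "recurring issues"),
]

# Group by recipient to see who depends on whom for which knowledge.
inbound = defaultdict(list)
for source, recipient, topic in flows:
    inbound[recipient].append((source, topic))

for recipient, sources in inbound.items():
    needs = ", ".join(f"{topic} from {source}" for source, topic in sources)
    print(f"{recipient} needs: {needs}")
```

Each edge in such a map suggests a candidate enabler: a lessons-learned program for the first flow, an expertise-location or mentoring scheme for the retiring-expert flow, and so on.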

Focus on breaking down barriers

A system must be used to be effective, so a flexible, easy-to-use system that integrates naturally with existing workflows is needed. Focus on breaking down the barriers that impede knowledge flow rather than attempting to change an existing culture all at once. Blending the correct mix of elements is critical. The first step of any new knowledge management effort should be a knowledge audit to assess the current landscape and identify barriers.

For a knowledge management project to succeed, and for the Knowledge Management System (KMS) to become a stable, taken-for-granted part of daily work activities in an organization, actions by human and non-human actors beyond the project group members are needed. What these actions will be is indeterminable from the beginning and throughout the process. (2)(3)

Knowledge management is about enabling what most people want to do naturally—share what they know and learn from others. The barriers to sharing are often structural: there is not enough time, the process is cumbersome, people don’t know the source or recipients and are not sure they can trust the information, or they resist codification because they know instinctively that tacit knowledge is richer than explicit knowledge. (4)

Example elements of a knowledge management implementation

Selecting the right blend of elements is a critical part of a successful implementation.

Crafting and implementing a strategy

For a knowledge management system to work, the selection of technological solutions cannot be divorced from people and processes.

  1. Audit – Assess barriers, analyze user needs, define business goals.
  2. Plan – Confirm a vision, define the scope and elements of the effort, create a roadmap.
  3. Communicate – Solicit input from the user base and begin the change management process.
  4. Execute – Configure and integrate technology elements, adjust on user feedback.
  5. Adopt – Communicate, train, workshop, staged rollout, etc.
  6. Manage and Evaluate – Manage, evaluate, and iterate based on user response.

Selecting technology

Once you understand the barriers to information flow and your users’ needs, you can begin identifying technology to surmount those barriers. Commercial off-the-shelf (COTS) or custom applications may be selected, or a combination of the two may provide the best solution.

COTS systems offer required functionality and can be deployed with or without customization.

  • Quicker deployment
  • Strong customer support
  • Access to the user community
  • A standard and stable environment
  • Interfaces vetted for usability
  • Existing training materials
  • API integration and support for 3rd party applications
  • Flexible pricing models
  • Facilitates handoff to customer organization for ongoing support and iteration

Custom applications are purpose-built applications that integrate best-of-breed components.

  • Tailored to your specific needs and culture
  • Scalable
  • Facilitates program integration
  • Improved security – precise ‘admin’ rights to user groups
  • Use of open-source components to achieve cost savings
  • Adaptable to meet the needs of the evolving enterprise

Radiant is technology agnostic and will work with you to select the best blend of applications to match your needs. Let’s start a conversation about the best choice for your organization.

(1) e.g., Styhre, 2003; Alvesson, 2004; Tsoukas, 2008. (2) Diedrich, A., & Guzman, G. (2015). From implementation to appropriation: Understanding knowledge management system development and introduction as a process of translation. (3) Journal of Knowledge Management, 19(6), 1273–1294. (4) APQC (2015), KM white paper.