Wednesday, November 26, 2014

IDs Use Standards: Ensures Context Sensitivity

Standards are the measures that IDs use when determining whether or not they will sign off on a learning solution they have created – whether their name goes on the final product.


The competent instructional designer/developer (ID) ensures context sensitivity.

Little things can be jarring; they jangle the nerves and create distractions.  Little things out of context can blow up disproportionately and become flaming issues.  

P-20 education and workplace (adult) education often come to loggerheads over terms simply because their contexts – and the expectations based on those contexts – differ.  One of the highly touted differences between childhood education (pedagogy) and adult education (andragogy) is the undeniable fact that adults bring years of experience.  


      (Side note: having worked with special needs children and children of abuse and poverty, I contend that children bring significant experience to their learning, especially their P-20 learning, as well... experience is the essential difference, according to experts.)  

Creating learning without considering the learner’s previous experience is futile at best.  This may be the reason that so many courses spend the first twenty to thirty percent of their time defining and building a common experience base.  During this early portion of the course, the instructor and learners get acquainted, learn about each other’s jobs, roles, and experiences, compare the course goals to the learners’ goals, and map out the course’s structure.  Along the way, they discover whether there are potential barriers such as language, technology, physical environment, or just a mismatch between learner and course intent. 

Why spend that much precious time setting context?  Because context is important.  In fact, learning will not occur until the learner sees a need for it (see also: The Teachable Moment).  When learners have context, they learn. When context is missing, they struggle.

For a moment, consider the impact of requiring a course with 25%–30% of its content focused on US laws, regulations, or code.  Contextually, this is important for learners within the United States.  However, does it work in Puerto Rico, China, Australia, Canada, India, Greece, Switzerland, or Sweden?  Language differences aside, the issue of laws, regulations, and codes needs to be addressed in order for the rest of the learning to be effective outside the US. This is an essential context issue. 

Now, consider the impact of words.  The US government has enacted the Plain Writing Act [http://www.plainlanguage.gov/] requiring government agencies to write in ways that avoid confusion.  They are improving, but the task is monumental.  Very few courses start out by defining the reading level.  Even fewer courses intentionally choose a ‘voice’ for the course.  Yet both reading level and voice can impact learners’ ability to learn. 
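(As a purely illustrative aside – not part of any TIfPI standard or tool – an ID who wants to define a reading level could spot-check draft course text with a standard readability formula such as Flesch-Kincaid. The minimal Python sketch below shows the idea; the function names, the rough syllable heuristic, and the sample text are assumptions for illustration only.)

```python
# Minimal sketch: estimate the Flesch-Kincaid grade level of a draft passage.
# The syllable heuristic and helper names here are illustrative assumptions,
# not a vetted readability tool.
import re

def count_syllables(word: str) -> int:
    """Rough syllable estimate: count groups of consecutive vowels, minimum one."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    """Flesch-Kincaid grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

sample = "Plug the cable into the port. Press the power button. Wait for the green light."
print(f"Estimated grade level: {flesch_kincaid_grade(sample):.1f}")
```

Even a rough check like this can flag when course materials drift far above the intended audience’s reading level.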


Case Study #1: Fun and Games

Once upon a time, many decades ago (before web-based everything), our intrepid instructional designer had the opportunity to work on a CD-based learning game.  The project team included a skilled technical writer.  This writer started his participation in the project by asking what we (the project team) wanted our learner/player to hear in their head when they played.  It took the team a while to work it through.  Eventually, it was clear.  We wanted the game to come across as “fun”, even though it was teaching highly technical terms.  The writer reworked every sentence in the game’s material to echo that “fun” idea.  What magic did he employ?  I’m still not sure.  Technical writers are valuable members of instructional design teams because they bring an impartial eye to context and the language of that context.


Case Study #2: Developmentally Delayed Hispanic Young Adults

In another time and place, an instructional designer was asked to build a computer skills lab for developmentally delayed young adults (17-21) whose primary language was Spanish, but who spoke some English and needed to build technology-specific language in both Spanish and English.  They needed to be able to use computers to write emails and text messages, visit websites about their sports and hobbies, and play computer games.  They needed to be able to talk with their peers and co-workers about using computers.  The designer created a very repeatable lab which each learner could do multiple times to strengthen his or her skills (keyboard, mouse, and language skills).  The lab provided them with many different job aids on a binder ring.  Each index card on the ring had a term in both English and Spanish, a short explanation (under 10 words) in both English and Spanish, and a picture of the computer part or term.  For this learning, the context was concrete and factual.  The learners loved it and loved having job aids that they could share.  The shareable nature of the cards provided context for them across learning, work, and home.


Definition of a Standard – Ensures Context Sensitivity

Consider the definition and performances listed for The Institute for Performance Improvement (TIfPI’s) standard Ensures Context Sensitivity.


Definition:  Considers the conditions and circumstances that are relevant to the learning content, event, process, and outcomes.

Performances that demonstrate this standard:
  • Creates solutions that acknowledge:
    o  Culture
    o  Prior experience
    o  Relationships to work
    o  Variability in content
  • Verifies that materials reflect the capabilities of the audience (e.g., readability, language localization, plain language, global English, physical capabilities, technology limitations).
  • Maps to other learning opportunities.
  • Aligns content with learning objectives and desired outcomes.
Individuals applying for learning solution certifications with marks and badges will be asked to describe ways in which they accomplished at least three of the four performances, one of which must be:
  • Creates solutions that acknowledge:
    o  Culture
    o  Prior experience
    o  Relationships to work
    o  Variability in content

Can you see yourself doing these performances?  Can you see yourself doing at least three of the four performances with every learning solution?  

Can you see other IDs doing these performances, perhaps differently, but still doing them?  If so, you need to consider applying for a learning solutions development credential.  Get the ID Certification Handbook and visit www.tifpi.org for more information.

Want a list of all nine ID standards?   

Would you like to know about the study -- a practice analysis -- that TIfPI Practice Leaders did to generate and validate nine standards, including Ensures Context Sensitivity?   Would you like a copy of the infographic with standards and learning solution certification types?   


Tuesday, November 18, 2014

IDs Use Standards: Enhances Retention and Transfer

The competent instructional designer/developer (ID) enhances retention and transfer (of learning).  Standards are the measures that IDs use when determining whether or not they will sign off on a learning solution they have created – whether their name goes on the final product.

When we look at many different learning theories, they all address enhancing retention.  The point of learning is to change the way that the learner sees the world, thinks, and then acts.  As the field matures, practitioners must learn more about memory retrieval and the way that brains and emotions work -- but that discussion is for another day.  Today, learning is all about giving the learner a reason to change (motivation), giving them the knowledge and skills necessary to act in new ways, and giving them practice with feedback. For example, Generative Learning Theory promotes the roles of recall, integration, organization, and elaboration -- all ways to promote retention.  

Retention 

Feedback is key here, as it provides information to the learner about their progress.  Therefore, effective retention techniques include tools that assess the learner’s current state at several points during the learning (pre, peri, post). 

Studies of learning retention (as well as our experience) tell us that retention of new knowledge and skills degrades over time if not used.  Therefore, instructional designers have a portfolio of techniques to enhance retention, such as:
  • Memory aids (e.g., visuals, infographics, handouts, job aids, acronyms) 
  • The structure and organization of the learning
    o  Sequencing orders the learning events in a logical pattern – A-Z, 1-10, easiest to hardest, process steps, etc.
    o  Scaling builds learning components one on top of another, increasing complexity and difficulty with each round.
    o  Scaffolding gives learners more support and direction early in the learning, then removes those supports and guidance over time, moving toward greater autonomy and self-discovery as the learner becomes more skillful.
  • Checklists and templates to guide decisions and work products 


All of these techniques are focused on building retention.  Many of them include elements of feedback that allow learners to track progress.

Feedback is an important aspect of retention; it provides the milestones that allow learners to experience improvement and change. The most important feedback may be the one that creates the teachable moment – that moment when a potential learner internalizes the need to learn.  


Transfer

Now comes the challenge – transfer.  No matter how good the learning is in the learning environment, the ‘rubber meets the road’ when the learner must transfer their learning to their real world – often their work world. 

Many of the memory and retention techniques also work as transfer techniques.  However, every instructional designer/developer soon discovers that no matter what is taught in class, the real world trumps the world created in any learning environment.  If the workplace does not support the use of new skills, the skills are soon mothballed and then forgotten.  Therefore, learning designs that consider and even replicate aspects of the work environment assist the transfer of new skills from the learning environment to the workplace.  


Case Study #1: When 100% = Zero

In a not-so-distant universe, an instructional design consultant was required to take multiple in-house courses in order to consult at a company in a highly regulated industry.  The requirement was that every learner (the ID included) would receive 100% on all courses.  However, there was no pre-test to determine whether a learner already had some of the skills and knowledge, no intermediate feedback, no memory aids other than some pretty graphics, and lots of reading.  The final test allowed our learner-cum-consultant to retake the test as many times as needed in order to achieve the required 100% score.  After the second attempt, the testing process was all about tracking down the right answer through trial and error (and documentation of answers given that did or did not work).  Yes, mistakes and failure are important feedback and learning motivators.  However, under this set of conditions, what value did the 100% score have?  How much retention or transfer existed? (Hint: none.)  These were beautifully designed learning events with very low retention or transfer... but they did satisfy a regulatory requirement.  


Case Study #2: Acts like 1 Yr in 6 Mos

In another universe and several decades ago, an instructional designer was asked to build an on-boarding program for non-traditional software programmers.  The company hired groups of individuals who had never taken computer courses but showed aptitude for logic and the interpretation of codes (esp. music, art design, accounting).  The designer’s job was to provide scaled and scaffolded learning in code development, business communications, customer service, and the use of in-house tools to manage code and client communications.  Then, she would top it all off with a 10-day goal-based scenario, which is a kind of war game with the setting and details specific to the goals of the workplace.  In this case, the goals were around solving problems with code.  Multiple groups went through this process, then went off to work in their new work units.  Several months later, participants and their managers were asked back for a debrief.  The learners said that they did not have enough skills and needed more.  Their managers said that, at 6 months, their new employees were working the way that more traditional hires would have worked at the end of their first year… “but, of course, we need more skills sooner.” 


Case Study #3: Overheard conversation

While attending a workshop, our instructional designer overheard another participant talking with the workshop presenter.  The participant said:  “My colleagues said that I just had to take this course.  We go back to your course materials, book, and templates all the time.  But, they said, it was really worth my time to come to the class as well.  And… well, they want me to come back and tell them what’s new in the field, too.” 

Comparison

These three cases bring very different paradigms to the design of the learning and generate very different results.  Retention and transfer aspects of these course designs were handled differently and valued differently.  Their outcomes showed the difference in the design efforts to enhance retention and transfer. 


Definition of a Standard – Enhances Retention and Transfer

Consider the definition and performances listed for The Institute for Performance Improvement (TIfPI’s) standard Enhances Retention and Transfer.

Definition:  Ensures that the learning environment creates and measures recall, recognition, and replication of desired outcomes.

Performances that demonstrate this standard for certification:
  • Chooses elements of the “real” work environment, tools, and technology to include in the practice learning environment.
  • Measures readiness for learning.
  • Triggers relevant previous experience.
  • Provides interim self-assessment or skill measurement opportunities.
  • Incorporates tools for on-the-job performance.
  • Provides opportunities for the learner to integrate changed skills based on feedback.
  • Provides feedback techniques that give learners information relevant to enhancing performance, retention, and transfer.

Individuals applying for learning solution certifications with marks and badges will be asked to describe ways in which they accomplished at least three of the seven performances.

Can you see yourself doing these performances?  Can you see yourself doing at least three of these performances with every learning solution?  Can you see other IDs doing these performances, perhaps differently, but still doing them?  If so, you need to consider applying for a learning solutions development credential.  Get the ID Certification Handbook at www.tifpi.org.

Want a list of all nine ID standards?  

Would you like to know about the study -- a practice analysis -- that TIfPI Practice Leaders did to generate and validate nine standards, including Enhances Retention and Transfer?   Would you like a copy of the infographic with standards and learning solution certification types? 


____________________

Want to learn more about creating credentials like the ID Certifications with Badges?
Try the free Overview to Credentialing or Foundations of Credentialing.



Monday, November 10, 2014

IDs Use Standards: Elicits Performance Practice


 

ID Standards are the measures that IDs use when determining whether or not they will sign off on a learning solution they have created – whether their name goes on the final product. They are the hallmark of the master instructional designer.

The competent instructional designer/developer (ID) elicits performance practice:


There is an old saying, "practice makes perfect." 

At the heart of learning is change, particularly performance change.  If there is no change in performance, learning is questionable.  Therefore, practice within the learning event is an important element that allows the learner, instructor (when used), and ID to recognize whether change is occurring.
 

Eliciting performance practice is so important that it appears in all nine of the very different learning theories and theorists reviewed for the ID Practice Analysis.

Table of Instructional Design Theorists & Elicits Performance Practice

Learning theories home in on one or more specific elements of practice or the practice environment.  For some, practice is all about the thinking steps, while others elicit discovery.  For still others, it’s about integration and application.  For yet others, it’s about demonstrating mastery.  Each theory and theorist promotes different aspects of eliciting performance practice as an essential function of their theories or philosophic approaches.  However, competent instructional designers pick and choose; they use the focus that is most appropriate for the learner and the situation in which the learner must learn.  Therefore, the ID certifications do not focus on the theory, but on whether the ID demonstrates selecting techniques that promote performance practice. Reviewers do not judge the appropriateness of those techniques; they merely determine whether the candidate has shown that they did provide performance practice. 


The Serious eLearning Manifesto lists the following hallmarks of effective elearning:
  • Performance focused
  • Meaningful to learners
  • Engagement driven
  • Authentic contexts
  • Realistic decisions
  • Individualized challenges
  • Spaced practice
  • Real-world consequences

Taken together, they describe a practice environment that provides not just random activities but focused practices that reflect the world of the learner – that elicit performance practice in the e-world as preparation for real-world work.

Performance practice is just as important in instructor-led training (ILT), coaching and mentoring, goal- or problem-based scenarios, serious learning games, or any of the other learning solution types.  


Case Study:  Impacting real-world decisions

Once upon a time (all too recently), an instructional designer was asked to design an elearning solution that “taught” staff about the organizational structure – the divisions, groups, subgroups, and their leaders.  Of course, this course’s learning objectives focused on identifying whom to contact in various parts of the organization.  Since so many high-level executives had to buy into this course, it was important that the course be “outstanding” and that it showcase each division and group to its advantage.

Our intrepid ID had concerns about whether this was quality learning, even as the course was being designed and built.  There were no decisions to make, no real-world consequences, and the only challenge available was remembering the name of the group or division that did a given type of work.  However, everyone does need to recognize the key groups and divisions within their organization, so that information was authentic.  In addition, this ID had created something similar many decades ago (when elearning was in its infancy) that taught state employees about the structures of the legislative, judicial, and executive branches in which they worked.  These concepts were highly valued by the employees taking that first elearning course, so maybe this new solution would be just as valuable… or maybe not.  

Definition of a Standard – Elicits Performance Practice

Consider the definition and performances listed for The Institute for Performance Improvement (TIfPI’s) standard Elicits Performance Practice.


Definition:  Ensures that the learning environment and practice opportunities reflect the actual environment in which the performance will occur.

Performances that demonstrate this standard for an ID certification: 

  • Creates practice opportunities that mimic work tasks and work processes.
  • Chooses elements of the “real” work environment, tools, and technology to include in the practice learning environment. 
  • Scripts steps and interactions. 
  • Creates the full spectrum of support materials to ensure that learning occurs.
  • Describes for the learner what the practice opportunities will be.
  • Creates practice opportunities that connect the learner’s real work to the learning process and outcomes.

Note that any one solution may not require the use of all six performances listed.

Can you see yourself doing these performances?  Can you see yourself doing at least the two required performances with every learning solution?  Can you see other IDs doing these performances, perhaps differently, but still doing them?  If so, you need to consider applying for a learning solutions development credential.  Get the ID Certification Handbook at www.tifpi.org.


Individual IDs applying for learning solution certifications with marks and badges will be asked to describe ways in which they accomplished at least the following two required performances (and preferably more):
  • Creates practice opportunities that mimic work tasks and work processes.
  • Chooses elements of the “real” work environment, tools, and technology to include in the practice learning environment. 


Want a list of all nine ID standards?  

Would you like to know about the study -- a practice analysis -- that TIfPI Practice Leaders did to generate and validate nine standards, including Elicits Performance Practice?   Would you like a copy of the infographic with standards and learning solution certification types?


Monday, November 3, 2014

IDs Use Standards: Collaborates and Partners


Standards are the measures that IDs use when determining whether or not they will sign off on a learning solution they have created – whether their name goes on the final product.

The competent instructional designer/developer (ID) collaborates and partners:

Whom do you include on your learning solution development team?  Subject/content experts?  Project sponsors?  Other IDs working on specific parts and pieces of the whole?  Perhaps more importantly, have you ever developed a learning solution that did NOT require some degree of collaboration and partnership?

At its most basic, collaboration is working together for a creative end product, while partnering is sharing risk.  In the business world, risk tends to be related to finances (on budget), which also translates to ‘on time’, within staffing and resourcing, and producing the desired end product (or better). 

Who on your teams shares the risk of an instructional design and development project?  Consider what each of these players brings to your projects that helps manage the risks of that project:
  • Sponsoring manager or executive
  • Subject/content experts
  • Project lead, manager, or executive
  • Learning technologist
  • Graphic artist
  • Audio/videographer
  • Technical writer
  • Other instructional designers

Typically, what are the risks in a learning project?  For example, consider the impact of not being able to work with a subject expert who can give you the time and materials you need.  Or, consider the times when the sponsor made decisions without understanding the impact, then required rework when the results were not acceptable.  Think about a time when a specialist such as a graphic artist, videographer, or technical writer was not included in the project, only to require much more time for a lower-quality product.  Now, think of a time when you worked with another ID who wasn’t quite holding up their end of the project.  What was the impact?  And a time when the IDs were in tune with each other and going the extra mile together?  What was the impact of that?  Or, consider the management of the project.  Have you ever played both the instructional designer and the project manager roles simultaneously?  Have you worked on large projects with a strong project manager?  Were the risks handled differently?  Risk is an essential element of learning solution development projects, and the right team makes all the difference.

Look at the list of partners one more time.  Notice the number of partners that are there for “creative” purposes – visuals, sound, animation, quality writing.  Collaboration, by definition, is working together for a creative purpose.  We sometimes forget that instructional design and development are creative endeavors.  Check out the Business Insider article, The Difference Between Creativity and Innovation, by Andrew (Drew) C. Marshall, an innovation consultant and Principal of Primed Associates, an innovation consultancy.

“Creativity is about unleashing the potential of the mind to conceive new ideas. […] Innovation is about introducing change into relatively stable systems. It’s also concerned with the work required to make an idea viable.”

Instructional design is a premier example of creativity and innovation… or can be, when it is well done.  There is creativity in the design. Then more creativity is added by all those partners during development.  Then, if done right, the learning solution unleashes new potential in the mind of the learner as well.  (A vote for learning as a creative act.)  Add in the innovations.  For learning to be effective, the learning solution itself is a change that is introduced into a relatively stable system, whether we consider that system to be the individual learner or the business.  Innovation is also the work required to make an idea viable.  Anyone who has worked on a learning solution that addresses a unique challenge (e.g., tight timelines, limited resources, new and untested technologies, new methodologies, a new theoretical basis) knows that much creativity is required for problem identification and ideation of a solution, and that more creativity and innovation are required to make it real.  Yes, instructional design is a creative and innovative field, and most learning solutions require creativity and innovation. 

The greater the degree of creativity and innovation involved, the more important the collaborations and partnerships become.  In systems (businesses) that are risk-averse, the work of partnering may take over the work of collaborating.  That is, a key challenge of a high-risk project in a risk-averse organization is that the partnership work takes over in an attempt to minimize the risks and preempts the creative-innovative work, which effectively stalls the project. 

The quality of an end-product learning solution is directly related to the quality of the collaborations and partnerships involved. 

Case Study:  

Once upon a time, long, long ago (well, 15 years ago, anyway), a team of instructional designers was called in to create a learning solution for 5,000 industry-specific software installation project managers around the world.  Their company was installing project management software to help bring down the cost of software installations.  The team was called in 60 days before ‘go live’ on the project management software (and the end of the calendar year) and 45 days before the first training course needed to occur.  The challenges of this project included a short timeline, new technology, a process definition that had not defined key actors’ current or future roles, and the fact that one could not bring 5,000 people into headquarters in the last 15 days of the calendar year, when most employees had already scheduled their vacations.  In addition, the essential processes, steps, functions, and actions (i.e., the course content) required of software users were not yet defined and would continue to change during the 6 months following ‘go live’ as new software modules were added.  

The learning solution design and development team brought the essential audience down to 500 key learners and 150 high-profile project executives (PEs) and began brainstorming solutions that would work best for this group.  In the end, the solution was an electronic performance support system (EPS or EPSS) that allowed subject experts to change process documentation and that provided supporting information such as screen shots, video clips of key steps, diagrams, and a process workflow that matched the audience’s essential workflow from initiating a project to closing it.  Since these projects were multimillion-dollar projects, the project financial officers were key to project success; they tracked staffing hours and deliverables, invoiced clients, and tracked payments.  They were the stability of a project that would run for several years.  Therefore, they were trained as coaches to the PEs and set up with a 2-hour webinar that would get their PEs started.  The solution set was both creative and innovative in that nothing like it had been done in this company and the technologies involved were emerging.  

This was a high-risk project with many opportunities for failure.  Luckily, the organization involved was risk-tolerant and willing to provide partners who actively helped the team work through the issues.  The design and development team was experienced at creative, innovative designs and solutions under tight timelines. Together they made it happen.

This project was unique in so many ways.  However, many instructional design and development projects are just that – unique.  Collaborative creativity and innovation paired with strong partners working toward an essential goal are key hallmarks of almost all learning solutions. 

Definition of a Standard – Collaborates and Partners 

Consider the definition and performances listed for The Institute for Performance Improvement (TIfPI’s) standard, Collaborates and Partners.

Definition:  Works jointly with sponsors and other members of the solution development team to develop the solution.

Performances that demonstrate this standard for a Solution Domain Badge: 
  • Addresses sponsor’s issues and needs by listening to requests for modifications, offering solutions to modification requests, and reporting progress.
  • Participates in the project team through: 
    1. Identification of project issues
    2. Meeting attendance
    3. Regular reporting
    4. Generating ideas to resolve issues, improve sustainability, and enhance the learning solution
  • Negotiates changes to the solution involving other team members during development and solution testing.
  • Plans solution product tests that validate with the sponsor and intended audience that the right solution elements have been developed.
  • Executes product tests, including reporting results of tests.
  • Works with content experts to identify content, relevant work processes and procedures, and appropriate feedback and assessment techniques.

Note that any one solution may not require the use of all six performances listed.  Individuals applying for learning solution certifications with marks and badges will be asked to describe how they demonstrated at least three of five performances, one of which must be: identifies key partners and collaborators by role.

Can you see yourself doing these performances?  Can you see yourself doing at least three of these performances with every learning solution?  Can you see other IDs doing these performances, perhaps differently, but still doing them?  If so, you need to consider applying for a learning solutions development credential.  Get the ID Certification Handbook or just visit www.tifpi.org.


Would you like to know about the study -- a practice analysis -- that TIfPI Practice Leaders did to generate and validate nine standards, including Collaborates and Partners?   Would you like a copy of the infographic with standards and learning solution certification types?  Download these documents.