Trent: Stepping outside my Comfort Zone!

3 Dec

As this semester comes to an end, I have begun to reflect on what I have learned in this class. I will not lie, I was very uneasy about taking IT 570 at first because it was outside of my comfort zone, and I have not focused much of my graduate work on education and learning styles. However, as I have experienced with other things in the past, I think some of the best learning takes place when you step outside of your comfort zone.

At the beginning of the semester, I had very little experience with instructional design or with aligning pedagogical approaches to an overall strategy and outcome. The design project helped me realize that when I designed my own 4-H curriculum and programs in the past, the outcomes and strategies may not have been appropriately aligned. I now understand that for effective learning to take place, I must become more organized in my teaching process. I must do a better job of clearly defining goals and objectives so that learners can more easily grasp the learning strategies. I know I will be able to apply many of the objectives from this class to my own career, making me a much more effective agent when it comes to preparing instruction. I am very glad that I stepped outside of my own comfort zone!


6 Mobile Learning Trends That Grew in 2012: Trent

27 Nov

1. mLearning in the classroom…and in the workplace.

Yes, teachers are now encouraging students to bring their smart phones and tablets to class and some schools are even providing them. There are thousands of apps available for the classroom that teach math, language, and even handwriting, and text message polls that encourage class participation. The best part? Research shows that it’s actually working. A study funded by the Department of Education showed vocabulary improvement by up to 31% in Title I elementary school students after just two weeks of using a particular educational gaming app. With such fantastic results in the K – 12 sphere, it’s no wonder mobile learning is seeing such rapid uptake in the workplace, too.

2. Bring Your Own Device (BYOD)

How will BYOD affect mLearning?


Many companies and schools are adopting this policy as it is more cost effective and encourages people to keep working even after they go home for the day…or at least, that’s what the buzz is about. Reality is not always matching up so far. IBM was one of the first companies to implement this policy, and so far the results have been, well, nightmare-ish. Ideally BYOD would save companies money, but according to IBM CIO Jeanette Horan, it hasn’t. Instead it’s been a headache for IT and caused all sorts of security risks. We’re grateful IBM volunteered itself to be the BYOD guinea pig because once these issues get sorted out it’s bound to be a beautiful thing.

3. “Snack learning”

Just like it sounds, snack learning is “bite-sized” tidbits of information you can grab on the go. Meant to be consumed in a couple of minutes, snack learning is convenient when you have a five-minute break between meetings or need a quick tutorial on how to run a software program. It’s great for brushing up on an old topic or learning the basics of a new one, and it caters to all types of learners, from those of us with short attention spans to the knowledge-hungry. Learning “snacks” are perfect for additional reinforcement, quick tutorials, and the immediate assistance that the workplace demands.

4. Tin Can API

Tin Can API is the new eLearning standard.


Tin Can API is SCORM’s smart and attractive younger sibling. By using “noun, verb, object” statements, it’s able to track the important stuff, like what is actually being learned or done. Rather than the old system of recording pass/fail data, Tin Can provides trainers with useful information that can help personalize learning. Unlike SCORM, Tin Can API is also easy to implement, and major players like Articulate, Lectora, and Blackboard have already adopted it. We are big believers in Tin Can at BLP, so much so that we have developed the first Tin Can API compliant learning game engine – The Knowledge Guru. It’s launching at DevLearn 2012. Tin Can API is currently at version 0.95, but will be at 1.0 soon.
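To make the “noun, verb, object” idea concrete, here is a minimal sketch of what a Tin Can (xAPI) statement looks like under the hood: it’s just a JSON record saying who did what to which activity. The learner name, email, and course URL below are made-up placeholders, not real identifiers:

```python
import json

# A minimal xAPI statement: actor (noun), verb, object.
# "completed" is a standard verb from the ADL verb vocabulary;
# the actor and activity IDs here are hypothetical examples.
statement = {
    "actor": {
        "name": "Sally Glider",                # who did it
        "mbox": "mailto:sally@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},     # what they did
    },
    "object": {
        "id": "http://example.com/courses/safety-101",
        "definition": {"name": {"en-US": "Safety 101"}},  # what it was done to
    },
}

# Serialize for sending to a Learning Record Store (LRS).
print(json.dumps(statement, indent=2))
```

Because statements can use any verb (“attempted,” “mastered,” “demonstrated”), a trainer gets far richer data than SCORM’s pass/fail record, which is the personalization payoff described above.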

5. Location-based integration and workplace training

You may have already seen location-based integration in museums, colleges or other places where tours are common, but we know it can go further. Whether it provides auditory, visual or textual information or directions, we predict this will become a great resource for employers as they train new employees and welcome customers into their companies. With smart phone use on the rise, businesses would be absurd not to use this to their advantage. Getting creative with training, like recording podcasts for sales reps who spend most of their days on the road, is an efficient use of time and can boost productivity.

6. Cloud computing

Cloud computing is a convenient way to share files.


It’s inexpensive (or free), easy to use, and provides a central location for large amounts of information that need to be shared – what’s not to love? At BLP we use Dropbox to share files, but I also use it at home to share photos with family and friends. Most companies offer free storage up to a certain amount, then charge incrementally. The convenience of cloud computing will leave users on cloud nine as it wipes out the hassle of attaching files to e-mails or loading them onto a thumb drive and creates a simple way to collect and distribute information. It’s already changing the way we learn and work… and is only going to grow.

Mobile learning is exciting, the trends are important, but effective learning experiences still come down to rock-solid instructional design. Use these technologies, enjoy them, but always make sure decisions are driven by the ways people learn — and what motivates them.

Alternatives to Kirkpatrick: Kaufman’s 5 Levels of Evaluation: Trent

26 Nov

Below is a great article from Dianne Rees, who is a writer and instructional designer specializing in biotechnology, pharmaceuticals, health care, and legal elearning and training.

As I reflect back on my design plan and this past semester, I see the merits of this alternative method of evaluation. This semester has taught me that any instructional design requires meaningful evaluation in order to be meaningful instruction, as Dianne Rees describes below:

Meaningful instructional design requires meaningful evaluation. However, evaluation, like organizational development itself, requires buy-in at many levels. This buy-in is necessary:

  • To identify meaningful metrics
  • To collect data
  • To react to data, making appropriate improvements
  • To undertake change management necessary for these improvements

If you’ve been in this field for any length of time, you’ve probably come across Kirkpatrick’s four levels of evaluation (reaction, learning, performance, & results). Kirkpatrick’s approach has come under fire for a number of reasons (e.g., emphasis on training events, implied linearity and causality, and more) and many articles have provided thoughtful critiques.

However, in this series of posts, I haven’t come to bury Kirkpatrick’s approach or to praise it. Instead, I’m going to discuss some alternatives.

One alternative: Kirkpatrick Plus

Articulated by Kaufman, Keller, and Watkins (“Kaufman”) (1995), this evaluation framework connects performance to expectations. Kaufman proposes five levels of evaluation:

Level 1: Resources and processes
Level 1 is actually divided into two levels, 1a and 1b.

  • Level 1a focuses the evaluation lens on inputs, such as the availability and quality of materials needed to support a learning effort.
  • Level 1b considers processes. What’s their quality? Are they efficient? Are learners satisfied with them?

Compared to Kirkpatrick’s Level 1 (Reaction), Kaufman’s Level 1 focuses not only on learner satisfaction, but on the organizational factors that can impact learner satisfaction.

Level 2: Acquisition
This level is focused on individual and small group payoffs—what Kaufman calls “micro” benefits. Are the objectives or desired outcomes of the learning intervention met? It’s pretty analogous to Kirkpatrick’s Level 2 evaluation (Learning), but Kaufman notes that the learning intervention may not necessarily be training.

Level 3: Application
This is still a micro analysis, examining individual and small group impacts. The relevant inquiry here is whether newly acquired knowledge and skills are being applied on the job. Level 3 also is quite similar to Kirkpatrick’s Level 3 (Behavior/Performance).

Level 4: Organizational payoffs
Here, the analysis examines macro benefits. What are the benefits from an organizational standpoint? Level 4 is analogous to Kirkpatrick’s Level 4 (Results).

Level 5: Societal contributions
Kaufman considers this a mega analysis. How is the organization contributing to its clients and society? Is it responsive to client/societal needs?

Issues of health, continued profits, pollution, safety, and well-being are central [in this level]. The basis for mega-level concerns is an ideal vision, which is a measurable statement of the kind of world required for the health, safety, and well being of tomorrow’s children.

Level 5 has no analog in Kirkpatrick’s Evaluation Model.

A better model?

The “Kirkpatrick Plus” framework doesn’t stray that far from Kirkpatrick’s Evaluation Model and so can be subject to many of the same criticisms. Notably, while measuring organizational payoff is an important part of a meaningful evaluation, teasing apart the effects of a learning intervention from all the other variables that impact ROI is notoriously difficult. And if you think measuring organizational payoff is challenging, imagine how hard it is to measure societal impact. (This isn’t to say this evaluation aspiration isn’t a worthy one.)

I do think making the organization’s efforts part of the evaluation process (as in Kaufman’s Level 1) is an important step in the right direction. The organization’s commitment to success (e.g., by providing necessary resources, processes, and other supports) should be subject to as much scrutiny as the learner’s performance.

Still shopping for a better model? Stay tuned for the next post.


Kaufman, R., Keller, J., & Watkins, R. (1995). What works and what doesn’t: Evaluation beyond Kirkpatrick. Performance and Instruction, 35(2), 8–12.

Agile Instructional Design: Trent

26 Nov

Below is an intriguing article from Dianne Rees, an instructional designer who specializes in biotechnology, pharmaceuticals, health care, and legal e-learning and training. The article describes an alternative approach to the ADDIE model. Good read.

Agile Instructional Design

But first, a few words about ADDIE.

Next to religion, politics, and whether you’re a PC or a Mac user, ADDIE, with its sequential steps of analysis, design, development, implementation, and evaluation, tends to arouse a lot of fervor in instructional designers. ADDIE is an important model, and for a discussion of its history and morphing, see the always excellent site, Big Dog Little Dog’s Performance Juxtaposition.

However, time constraints, client logistics, and the nature of dynamic organizations often make ADDIE (at least in its older incarnations) untenable. For many instructional designers, ADDIE has become synonymous with a bygone era with a much slower pace, though arguably ADDIE was never intended to be so rigidly applied.

Agile design is an alternative approach that has a lot of merit.

Agile has its origins in the software development industry, famous for its rapid cycle times. Agile embraces the idea that development occurs in steps and iteratively, as analysis inputs are collected from busy cross-functional teams. It’s about flexible responses to a changing picture of what the situation on the ground is really like.

The important difference between Agile and ADDIE is that in Agile design, recommendations, preliminary mockups, and pieces of a project are shared with clients and target audiences early to see if they’ll fly. Adjustments are made throughout the design and development process rather than after development and/or implementation.

Although ADDIE is an important foundational model, I think that Agile design reduces the risk of spending a lot of time creating a very polished product that ultimately isn’t very useful. Agile turns clients and potential learners into active participants throughout the design process, which makes it more likely that your solution will actually be integrated into an organization’s workflow.

Whether you’re using Agile or ADDIE, there are some important tenets to stick to:

  • Make sure everyone has a shared vision about what the goal is and how to measure success
  • Ask first whether an instructional solution is really the one that’s needed
  • Think like a designer (have a systematic, but creative, approach that’s open to solutions from analogous fields)
  • Make pilot testing part of your project plan
  • Stay hungry to do better


Clark, D. (2011). ADDIE model. Retrieved September 29, 2012.

Unger, K., & Novak, J. (2011). Excerpt: Mobile game development – going into production. Retrieved March 20, 2012.

Managing Student Behavior

20 Nov

Here is a great link to an FFA/ag teaching PLE-type website. I find their October topic, Managing Student Behavior, very interesting. It is neat! Once you access each topic, it opens a discussion board where you can collaborate with other teachers and agents. This is what 4-H agents need.

4-H Horse Hippology and Horse Judging PLE

20 Nov

Thanks to Paul for sharing this awesome WordPress site from the Orange County Bit N’ Bridle 4-H Club. It really shows what 4-H agents and volunteers can do with technology. It is the perfect example of what I would envision a 4-H agent using a PLE for. They have pulled in content from many different sources and made it available in one location for their 4-H Horse Project members to access and study. I am currently working with my own Horse Bowl & Hippology students, and it is very challenging to cover all of the material within the allotted time. I have been linking sites and materials to our county website, but this “PLE” looks to be much more efficient. I am on the state committee for Horse Bowl and Hippology, and Horse Judging. I believe I will bring this site up at our upcoming Advisory Meeting as a new and innovative way to study. Thanks Paul!

I think a PLE like this would save agents so much time, and allow us to run our school and project group club so much more efficiently!


Pin Up Slide 1

20 Nov