# Instructional Design Planning
I've worked with a variety of phases in developing and managing e-learning content. It's not always cut and dried what people's roles are, and we often get into overlap between what we know, what we don't know, and how to get results. I'd like you (as an Instructional Designer) to not have to concern yourself with the SCORM standard (or any e-learning standard). That's for geeks and tech people. You're focusing on the design, which means itemizing what makes your content tick. Developers will use their tools (and mumbo jumbo) to translate that into communication with a Learning Management System. I will blend in what I feel should be the base understanding, and take it a little beyond so you can connect the dots with these concepts.
This section is the "planning" part. You should consider the types of presentations you're building so that you can define your scoring approach. This requires you to decide, essentially, whether everything gets points, percents, or fractions of points. Commonly, when you have 10 questions and each is worth a point, it's pretty simple math: 7 correct out of 10 is 70%. However, you can increase the value of certain questions so they're worth more, because they're more important or harder than the others. You can even give partial credit, so at a minimum a question is worth 0 but at a maximum it could be worth 5; if the student gets it partially correct, they receive some value between 0 and 5. All of this rolls up and will later be evaluated to see if the student passes, based on the scaled passing score I'll cover later. Essentially: what do you consider to be a passing score?
You can actually just say that they looked at it, and by default that's 100%. It's ultimately up to you. Just don't be afraid of a little math.
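As a sketch of the math above, here's one way weighted and partial-credit questions roll up into a single scaled score. The question values and earned points are illustrative, not from any real presentation:

```javascript
// Illustrative sketch: rolling weighted / partial-credit questions up
// into a scaled score (points earned divided by points possible).
function scaledScore(questions) {
  // Each entry: { earned: points awarded, max: points possible }
  var earned = 0, possible = 0;
  questions.forEach(function (q) {
    earned += q.earned;
    possible += q.max;
  });
  return possible > 0 ? earned / possible : 0;
}

// Nine 1-point questions (7 answered correctly) plus one 5-point
// question given partial credit (3 of 5 points):
var scaled = scaledScore([
  { earned: 7, max: 9 },  // the simple questions, tallied together
  { earned: 3, max: 5 }   // the partial-credit question
]);
// scaled === 10 / 14, roughly 0.714
```

The point is that weighting is just arithmetic over points earned versus points possible; the developers decide where those numbers come from.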
Some example approaches and use cases:
- Objectives - You've defined objectives per interaction, and all the scoring will be rolled up and calculated by SCOBot.
- DIY - You could have a question per page. It's just a matter of incrementing "correct" by 1 each time. Then correct divided by total = your scaled score. The correct total equals your raw score, max can equal your total questions, and min can equal zero. If scaled is greater than or equal to the scaled passing score, they pass; otherwise, they fail.
- Auto-Score - You can't be bothered with math, everyone wins. Set scaled score to 1, and success status to passed. I call this a happy ending.
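The DIY approach can be sketched in a few lines. The keys mirror the SCORM 2004 `cmi.score.*` / `cmi.success_status` data model elements, but the function itself is illustrative; how the values get written to the LMS is the developer's job:

```javascript
// Sketch of the DIY approach: one question per page, increment `correct`
// for each right answer, then derive the score fields at the end.
function diyScore(correct, totalQuestions, scaledPassingScore) {
  var scaled = correct / totalQuestions;
  return {
    "cmi.score.raw": correct,
    "cmi.score.min": 0,
    "cmi.score.max": totalQuestions,
    "cmi.score.scaled": scaled,
    "cmi.success_status": scaled >= scaledPassingScore ? "passed" : "failed"
  };
}

// 7 of 10 correct against a 0.6 scaled passing score:
var result = diyScore(7, 10, 0.6);
// result["cmi.score.scaled"] === 0.7, result["cmi.success_status"] === "passed"
```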
This depends heavily on your implementation, but whether your content is built by hand, from templates, or by some Content Management System or authoring tool, you may need to find out what all the capabilities are. Developers may need information up front, and this directly affects other systems getting what they need in order to do real-time scoring.
Within your content there is an approach to what passes or fails a student. This may vary from presentation to presentation, but I've seen anything from just forcing the student to 'passed', to actually calculating and totaling up their performance. In the design phase of a project, it's a great idea to nail this down so the developers building these presentations can take advantage of SCORM to accomplish what is needed.
imsmanifest.xml -- When creating the content package defined by the Content Aggregation Model or CAM (a package used to transfer content to an LMS), you can specify a 'minNormalizedMeasure', or scaled passing score in the SCORM world. This means that when the scaled score meets or exceeds the scaled passing score, the student will be set to 'passed'. Excerpt from an imsmanifest.xml:
```xml
<imsss:sequencing>
  <imsss:objectives>
    <imsss:primaryObjective satisfiedByMeasure="true">
      <!-- equates to cmi.scaled_passing_score at 60% -->
      <imsss:minNormalizedMeasure>0.6</imsss:minNormalizedMeasure>
    </imsss:primaryObjective>
  </imsss:objectives>
</imsss:sequencing>
```
This next comment depends greatly on the implementation and how the developer uses SCORM. Commonly, when you force 'passed', you'd also want to set the scaled score to 100%. Sometimes this doesn't happen, so you end up with 'passed' and 0% for a score. If you encounter this, simply instruct the developer to set the scaled score to 1 (aka 100%).
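A minimal sketch of that fix, assuming `api` is your SCORM 2004 runtime API object (e.g. `window.API_1484_11`) and the session has already been initialized:

```javascript
// Force a passing result so 'passed' never pairs with a 0% score.
// Assumes `api` is the SCORM 2004 runtime API (e.g. window.API_1484_11).
function forcePassed(api) {
  api.SetValue("cmi.score.scaled", "1");        // 1 means 100%
  api.SetValue("cmi.success_status", "passed"); // the forced pass
  api.Commit("");                               // persist to the LMS
}
```

If you're using a wrapper library like SCOBot instead of the raw API, the same two values still need to be set; only the call syntax differs.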
If you have several test questions, or 'interactions', they will all be pooled together. This is very important when testers go into the content and start answering questions. If they get scores that don't add up, something's gone horribly wrong, and the developers will need to correct those errors.
The SCO itself may default the value if a scaled passing score is not provided. You need to convey this to the development team so they can implement it. Note, too, that objectives defined within interactions may have their own scaled passing scores. SCORM doesn't offer any facet for defining those, so that also has to be handled at author time if you have requirements in this area.
You decide what success is. The content can decide right from wrong, but the full collection of that performance is commonly calculated by the content. This is why it's critical to get these requirements solidified for your development team.
What is progress? Is it flipping through pages? Watching a certain amount of a video? Answering at least 8 of the 10 questions? You define what progress is within the content. Developers will implement it, but if you don't define it up front, it may cause rework later in the project.
You can decide what's pushing the student's progress based on a few approaches or use cases.
- Page Turns - you can increment the progress as the student turns through the pages in the SCO.
- Last Page - you just set the progress to 100% on the very last page (no tracked progress along the way, though).
- % of Video - you might have a single-page SCO with a video. You could base it on how much of the video they watched.
- Objectives - based on how many objectives they completed within your SCO.
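The Page Turns approach above can be sketched as a simple ratio of unique pages viewed, which maps onto the SCORM 2004 `cmi.progress_measure` range of 0 to 1. The page IDs and page count here are made up:

```javascript
// Sketch of the Page Turns approach: progress is the fraction of
// unique pages viewed, capped at 1 (the cmi.progress_measure range).
function progressMeasure(pagesViewed, totalPages) {
  var seen = Object.keys(pagesViewed).length; // count unique pages only
  return Math.min(seen / totalPages, 1);
}

// Record each page as it's viewed (revisits don't double-count):
var viewed = {};
["page1", "page2", "page3"].forEach(function (id) { viewed[id] = true; });
// progressMeasure(viewed, 10) === 0.3
```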
Within the Content Aggregation Model or CAM (the transport package to a Learning Management System), you can also specify the completion threshold. Excerpt from the XML:
```xml
<adlcp:completionThreshold>0.75</adlcp:completionThreshold>
```
This means the content can compare its progress to the threshold to decide completed or incomplete. Be aware that if you set a completion threshold, the LMS will maintain the completion status if the SCO doesn't set one.
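The comparison itself is one line. This sketch assumes the content somehow reads the 0.75 threshold from the manifest; how it gets that value is up to your developers:

```javascript
// Compare the SCO's progress measure against the manifest's
// completionThreshold (0.75 in the excerpt above) to decide
// cmi.completion_status.
function completionStatus(progressMeasure, completionThreshold) {
  return progressMeasure >= completionThreshold ? "completed" : "incomplete";
}

// completionStatus(0.8, 0.75) === "completed"
// completionStatus(0.5, 0.75) === "incomplete"
```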
Developers can also default this to a value internally if one is not made available by the above. But you need to work out whether that's a global value or something that has to be set per SCO. Work out those requirements and have the developers implement them.
This is actually more important than I used to give it credit for. You have a few exit strategies to consider, since they have consequences for how the LMS handles learner attempts and the ability to resume. SCORM doesn't really make any claims about how an LMS chooses to manage attempts, but it is pretty clear about how things should behave based on the exit types used.
- "" - This is undetermined. SCO is unable to decide the exit type.
- "normal" - This is a finish or submit style exit type.
- "suspend" - Your allowing the student to come back and resume their attempt.
- "time-out" - You had a timed task, and the time ran out.
- "logout" - This is depreciated going forward, so I often ignore this as a option, but it will act like normal.
Under the specification, "" (undetermined), "normal", "time-out", and "logout" will create a new attempt if the student re-launches the SCO. This means a clean attempt. SCORM makes no rules about how an LMS stores prior attempts, or even whether it has to. Food for thought.

### What does this mean?

When a student launches a SCO, they start off with a clean slate. They are "ab initio" (at the beginning) of their session. If they exit with anything other than 'suspend', when they come back the LMS will reset the attempt and start a new one. So any prior data, answers, etc. will be gone (though possibly stored as a prior session on the LMS).
Any issues, concerns or feedback - make contact