Author Archive

Learning plans, goals and targets are important. Setting goals for learning allows us to evaluate whether or not we are learning the things that we set out to learn. It’s standard practice for e-learning courses and qualifications to have learning outcomes attached to them, and these are used to measure if our learning has been successful. They are also used by educators and trainers to evaluate whether or not their teaching and training have been effective, and are used to inform interventions, further learning requirements and amendments to learning materials and lesson plans.

Learning Goals with Tin Can

Brian Miller touched on the use of sub-statements in Tin Can to represent future plans. The spec puts it this way: “One interesting use of sub-statements is in creating statements of intention,” and gives the following example:

{
    "actor": {
        "objectType": "Agent",
        "mbox": "mailto:test@example.com"
    },
    "verb": {
        "id": "http://example.com/planned",
        "display": {
            "en-US": "planned"
        }
    },
    "object": {
        "objectType": "SubStatement",
        "actor": {
            "objectType": "Agent",
            "mbox": "mailto:test@example.com"
        },
        "verb": {
            "id": "http://example.com/visited",
            "display": {
                "en-US": "will visit"
            }
        },
        "object": {
            "id": "http://example.com/website",
            "definition": {
                "name": {
                    "en-US": "Some Awesome Website"
                }
            }
        }
    }
}



Statements come from all kinds of places: content created in authoring tools, mobile apps, learning platforms and business systems. It’s not always immediately obvious which application the statement came from, which might be useful to know. This blog explains how you can tag the statements your tool or product generates and why that information is useful.

We’ve worked hard to make the Tin Can (xAPI) spec as clear as possible and have required Learning Record Stores (LRSs) to validate incoming data to ensure the same data structure is always used. There’s no way for statements to be sent to a conformant LRS unless they follow the prescribed data structure, and you’ll find that the major LRSs are strict with the data they accept.


What’s the difference between session duration and attempt duration? Timestamp or Stored? When should you record time taken and how can you report it? This series of blogs looks specifically at duration and how to report on it.

As a provider of an LMS, LRS or other system launching content and reporting on duration information, you can use the table presented last week as a guide for reporting. In an ideal world, you can simply look at the Result Duration property of the completed/passed/failed, suspended and terminated statements to grab your attempt and session durations. Win!

Handling limited data

Unfortunately, the world is not an ideal place. In practice, many Activity Providers have not implemented duration at all, or are only reporting duration at activity completion, leaving the report viewer wondering about the time spent by learners on partially completed attempts. Many early adopters, who designed their statements before the best practice I described last week emerged, are understandably waiting for the release of CMI5 before updating their statement structure.

As an LMS provider that leaves you with two options:

  1. Encourage your activity providers to improve the data they’re sending (point them to this blog series).

  2. Work with the data they provide or you can gather yourself.

Working with the data you have most likely means using Timestamp to calculate duration. For session duration, you can simply take the first and last statements issued in a session and subtract! The harder part is working out the break points between sessions, especially if the learner re-launches the experience soon after leaving it. The following guidelines will help:

  • As the LMS launching the experience, you should know when the session started. In fact, it’s good practice for the LMS itself to issue a statement using the verb http://adlnet.gov/expapi/verbs/launched to indicate that it launched the experience. This means that even if the Activity Provider never issues a single statement, you know when the experience was launched. This is essential for reporting if the experience can be launched from multiple systems and the Activity Provider is not sending the data you need.

  • When the learner has launched the experience again, you can assume that the previous session ended at about the time of the last statement issued before that new launch.

  • When the learner hasn’t launched the experience again, you can assume that either the session is still in progress, or the last statement issued represents the end of the session.

  • To work out if the session is still in progress, you’ll need to define a session timeout period. If the activity provider is doing client side JavaScript tracking, then the LRS should define a timeout for the launch security credentials and you can use that same value. If not, define something sensible for the types of experience you’re launching. Any statements issued after the timeout period you define can be considered a new session.
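The session-splitting guidelines above can be expressed as a simple grouping pass over one learner’s statements for an activity, sorted by timestamp. This is a minimal sketch, not part of the spec: the 30-minute timeout, the `split_sessions` and `session_duration` names, and the `(timestamp, verb_id)` statement shape are all assumptions for illustration.

```python
from datetime import datetime, timedelta

LAUNCHED = "http://adlnet.gov/expapi/verbs/launched"
SESSION_TIMEOUT = timedelta(minutes=30)  # assumed value; match your launch credential timeout

def split_sessions(statements):
    """Group (timestamp, verb_id) pairs, sorted by time, into sessions.

    A new session starts at a 'launched' statement or after a gap
    longer than SESSION_TIMEOUT."""
    sessions = []
    current = []
    for ts, verb in statements:
        if current and (verb == LAUNCHED or ts - current[-1][0] > SESSION_TIMEOUT):
            sessions.append(current)
            current = []
        current.append((ts, verb))
    if current:
        sessions.append(current)
    return sessions

def session_duration(session):
    """Subtract the first statement's timestamp from the last's."""
    return session[-1][0] - session[0][0]
```

Note that this reports a zero duration for a session containing only one statement; in practice you may want a configurable minimum, since the learner presumably spent some time in the experience either side of that statement.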

Attempt duration can be harder, or even impossible, to calculate depending on what data the Activity Provider sends. Where the data allows, you can apply the rules below in priority order:

  • If the Activity Provider sends a ‘suspended’, ‘completed’, ‘passed’ or ‘failed’ statement with a Result Duration, take this as the attempt duration. If more than one of these statements is sent, the latest one in a given attempt represents the most up-to-date duration.

  • If the Activity Provider sends an ‘attempted’ statement with a Result Duration of zero then this marks the start of the attempt for the purposes of calculating attempt duration.

  • If the Activity Provider sends a ‘suspended’, ‘completed’, ‘passed’ or ‘failed’ statement without a Result Duration, then the latest of these within an attempt marks the end of that attempt. Add up the session durations of all sessions within that attempt.

  • Assume that the last statement (excluding ‘launched’ and ‘initialized’) before an ‘attempted’ statement with a Result Duration of zero was the last statement in that previous attempt.

  • If Result Duration is not used by an Activity Provider but they use the ‘attempted’ statement correctly, you can calculate the end of a previous attempt as the latest ‘suspended’, ‘completed’, ‘passed’ or ‘failed’ statement before an ‘attempted’ statement.

  • If Result Duration is not used by an Activity Provider and they use the ‘attempted’ statement incorrectly, then it may not be possible to accurately track the start and end of an attempt. The only sensible options here are either not to report attempt duration for these activities, or to allow your administrators to configure how duration is reported on a per-activity basis.
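The first two fallback rules can be sketched as a small priority function. This is an illustration only: the `attempt_duration` name, the `(verb_id, result_duration)` statement shape, and the pre-computed list of per-session durations are assumptions, not anything prescribed by the spec.

```python
from datetime import timedelta

END_VERBS = {
    "http://adlnet.gov/expapi/verbs/suspended",
    "http://adlnet.gov/expapi/verbs/completed",
    "http://adlnet.gov/expapi/verbs/passed",
    "http://adlnet.gov/expapi/verbs/failed",
}

def attempt_duration(attempt_statements, session_durations):
    """attempt_statements: (verb_id, result_duration_or_None) pairs,
    in timestamp order, for one attempt.
    session_durations: timedeltas for the sessions within that attempt,
    used as the fallback when no Result Duration was reported."""
    # Rule 1: the latest end-of-attempt statement carrying a
    # Result Duration wins.
    for verb, duration in reversed(attempt_statements):
        if verb in END_VERBS and duration is not None:
            return duration
    # Fallback: an end-of-attempt statement exists but carries no
    # Result Duration; sum the session durations within the attempt.
    if any(verb in END_VERBS for verb, _ in attempt_statements):
        return sum(session_durations, timedelta())
    return None  # cannot be determined from the data available
```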

As you can see, reporting on limited data from Activity Providers is hard! This complexity can be avoided by Activity Providers sending the data as outlined last week. If they don’t and you really need to report on their data, we can help.


I’m super excited about the latest recipe we’ve published on the registry! Not only is it a great recipe tackling an important use case, but it was written by adopters who needed it for a real project. It was written incredibly rapidly, going from first draft to ready-to-try in less than two weeks. This blog gives you the details.

Sometimes when I talk to people about recipes, they’re disappointed to hear that there isn’t yet a recipe for the use case they are interested in. “Don’t worry!” I always console them, “You can write your own.” TES took that advice to heart and one of the first things they did after hiring a developer to take Tin Can further in their app was to draft up a recipe covering the events they wanted to track. In this case, attendance at events such as meetings, classroom sessions, conferences, etc.

The actual recipe can be found here in the registry. The recipe is split into ‘Simple Attendance’, which uses a single statement to record that a group attended the event, and ‘Detailed Attendance’, which records more granular events such as scheduling, registering, joining and leaving. It’s envisaged that some recipe adopters will implement only Simple Attendance whilst others will complement it with the nuances captured by Detailed Attendance statements.

The bulk of the recipe was written by Sean Donaghy of TES. I helped by reviewing each iteration and making a couple of edits where it was faster to make the change directly than write up an explanation. I’m very happy to help anybody who wants help with reviewing a recipe they’re working on.

This first release of the recipe is considered an alpha version. Aside from the TES developers, who are busily implementing the recipe in their product, nobody else has tried the recipe yet. There are likely some changes to come as implementers run into challenges we couldn’t predict. If you do implement the recipe, we’d really appreciate your comments and feedback. You’ll use the recipe ids (http://xapi.trainingevidencesystems.com/recipes/attendance/0_0_1#simple and http://xapi.trainingevidencesystems.com/recipes/attendance/0_0_1#detailed) as a “category” Context Activity so that when you upgrade to the final release version of the recipe you can easily identify which statements used which version.
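In practice, tagging a statement with the recipe it follows means placing the recipe id in the statement’s context as a category Context Activity. The sketch below assembles just that fragment; the `recipe_category` helper name is my own, but the field names follow the xAPI statement structure.

```python
def recipe_category(recipe_id):
    """Build the context fragment that tags a statement with the
    recipe (and version) it follows, as a 'category' Context Activity."""
    return {
        "context": {
            "contextActivities": {
                "category": [
                    {"objectType": "Activity", "id": recipe_id}
                ]
            }
        }
    }

fragment = recipe_category(
    "http://xapi.trainingevidencesystems.com/recipes/attendance/0_0_1#simple"
)
```

Merge this fragment into each statement you issue under the recipe, and reporting tools can later filter statements by recipe version.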

Recipes are really important to ensure your statements can be understood by other tools. If you’re working on a Tin Can project and neither following nor writing a recipe, please do get in touch so I can help you.

You can expect this to be the Year of The Recipe for Tin Can. We already had the Open Badges recipe last month and there’s a few more in the works that will pop up as the year progresses. Watch this blog for more news sometime soon!


What’s the difference between session duration and attempt duration? Timestamp or Stored? When should you record time taken and how can you report it? This series of blogs will look specifically at duration and how to report on it.

The SCORM veterans at Rustici Software tell me that duration reporting by SCOs was notoriously patchy. That’s why you’ll notice that SCORM Cloud shows its own calculation of duration alongside the figure reported by your e-learning. It can do this because the SCO sits inside a player – either a frame or a pop-up window – that allows SCORM Cloud to keep an eye on the SCO. Tin Can does away with the need for frames and pop-ups; that’s a really good thing for everybody, but it does mean that activity providers need to be on the ball with reporting duration.

As an Activity Provider, if you want to provide good duration information you’ll need to issue statements for the events in the table below. There’s no universally agreed set of verb ids for this yet (CMI5 is still under development), but I’ve listed the verbs that I would use (and have used) when designing statements.

  • Entering the experience (http://adlnet.gov/expapi/verbs/initialized): The very first statement issued by the activity provider, as soon as it can. I recommend including a Result Duration of zero (“PT0S”) with this statement.

  • Exiting the experience (http://adlnet.gov/expapi/verbs/terminated): Issued whenever the learner leaves the activity; includes the session duration as the value of Result Duration.

  • Starting an attempt (http://adlnet.gov/expapi/verbs/attempted): Issued at the start of a new attempt. You should always include a Result Duration of zero (“PT0S”) with this statement.

  • Suspending an attempt (http://adlnet.gov/expapi/verbs/suspended): Issued when the learner leaves mid-way through an attempt; includes the current attempt duration as the value of Result Duration.

  • Resuming an attempt (http://adlnet.gov/expapi/verbs/resumed): Issued when the learner returns to an attempt that’s been suspended. You can optionally include the attempt duration tracked so far as the value of Result Duration.

  • Ending an attempt (http://adlnet.gov/expapi/verbs/passed, http://adlnet.gov/expapi/verbs/failed or http://adlnet.gov/expapi/verbs/completed): The attempt can be considered over when the learner passes, fails or completes the activity. These statements should include the final attempt duration as the value of Result Duration and a Result Completion of “true” to make clear that the attempt is complete.

The object of all these statements needs to be the main activity itself rather than something within that activity. You might also want to track duration for sub-activities such as modules, screens or interactions within an e-learning course too, but this is no substitute for tracking the duration of the activity as a whole.
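As a concrete illustration of one row of the list above, here’s how a ‘terminated’ statement carrying the session duration might be assembled. This is a sketch only: the actor, activity id and duration value are placeholders, and the helper names are my own. The one firm requirement is that xAPI expects Result Duration as an ISO 8601 duration string, such as “PT300S”.

```python
def iso8601_duration(seconds):
    """Format a whole number of seconds as an ISO 8601 duration,
    e.g. 300 -> 'PT300S'."""
    return "PT%dS" % seconds

def terminated_statement(actor_mbox, activity_id, session_seconds):
    """Build a 'terminated' statement with the session duration
    carried in Result Duration."""
    return {
        "actor": {"objectType": "Agent", "mbox": actor_mbox},
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/terminated",
            "display": {"en-US": "terminated"},
        },
        # The object is the main activity itself, not a sub-activity.
        "object": {"objectType": "Activity", "id": activity_id},
        "result": {"duration": iso8601_duration(session_seconds)},
    }

stmt = terminated_statement("mailto:test@example.com",
                            "http://example.com/course", 300)
```

The other statements in the list follow the same shape, swapping the verb and, for ‘attempted’ and ‘initialized’, fixing the duration at “PT0S”.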

Note that you should use the Context Registration property to link groups of statements together. The concept of registration is linked to the idea of a learner being enrolled on a course and can be a little complex. Sometimes a registration is equal to an attempt for all practical purposes, but that’s not always the case, and a single registration may contain multiple attempts.

In our golf prototype, for example, the user is given the option to return to a saved location or start again. If they choose to start again, that’s a new attempt but not a new registration. In our Tetris prototype each game of Tetris is a new attempt, but they all sit within a single registration. As a rule of thumb, if the LMS initiates a clear slate for the learner when launching the activity, that’s a new registration; if the learner has multiple attempts within the activity, it’s not.

What if the user closes the browser?

A common problem with SCORM was that if the user closed browser windows or got disconnected then tracking data could be lost. Tin Can solves this by allowing tracking to be performed server side so that data can be sent even after the user disconnects. It is still possible to do Tin Can with client side JavaScript though, and in this case the same problem remains.

For duration tracking, this means that the crucial final statements containing the duration information can be lost if the user closes the browser before they can be sent. There are a number of things activity providers can do to minimise this issue:

  • Make clear how learners can safely exit the course; provide some instructions and a clear save button at the top of the page. For example, the latest version of our golf prototype adds a ‘Save and Exit’ button to the navigation.

  • Consider the session over and issue statements when the learner gets to the end, not when they exit.

  • Launch the course in the same window rather than a new window and return the user to the LMS when they exit. This may require some collaboration with the LMS provider to achieve, but is a smoother experience for the user and discourages closing the window. The draft CMI5 specification includes a standard way for the LMS to provide a return URL to the activity provider.

I hope that clears up the questions you had around how to track duration and helps us all to do Tin Can better. If you do have further questions, please get in touch.

