Improve the xAPI publishing method to output more detailed interaction data

Hello!

Within our organisation we use Evolve alongside several other authoring tools. We would like to make it our primary tool, but we have to fall back on other tools whenever a customer requirement involves digging into learner response data.

For the xAPI publishing method, the statements Evolve produces include some basic information about each question, but they do not capture the essential detail of what the learner actually answered. Adding a little more information to the statements would unlock a great deal of value.
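
For illustration, here is roughly the kind of detail I mean. xAPI already defines standard fields for this (`result.response` plus a `cmi.interaction` activity definition), so a richer "answered" statement could look something like the sketch below. The actor, verb display and all of the IDs are placeholders, not what Evolve actually emits:

```ts
// A sketch of an "answered" statement that captures the learner's actual
// response, using standard xAPI fields. All IDs below are placeholders.
const answeredStatement = {
  actor: {
    objectType: "Agent",
    name: "A Learner",
    mbox: "mailto:learner@example.com",
  },
  verb: {
    id: "http://adlnet.gov/expapi/verbs/answered",
    display: { "en-US": "answered" },
  },
  object: {
    objectType: "Activity",
    id: "https://example.com/course/question-1", // placeholder activity ID
    definition: {
      type: "http://adlnet.gov/expapi/activities/cmi.interaction",
      interactionType: "choice",
      // The choices that were shown to the learner...
      choices: [
        { id: "a", description: { "en-US": "Option A" } },
        { id: "b", description: { "en-US": "Option B" } },
      ],
      // ...and which of them were correct.
      correctResponsesPattern: ["b"],
    },
  },
  result: {
    response: "b", // what the learner actually selected
    success: true,
    score: { scaled: 1.0 },
  },
};
```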

It is my understanding that the Exceed publishing method can provide this information if the content is uploaded to Intellum, but as our customers all have their own LMSs in place, this is not an option for us.

Are there any active plans in the roadmap for enhancing the xAPI data to include more information around the learner’s responses?

I would also be very interested in the response to this question.

Absolutely interested in a response and in next steps on this matter.

This would be something my team would be very interested in as well.

This would be incredibly useful for us as well!

Another enhancement we would like around xAPI score settings and publishing would be the option to NOT submit a score at all. Evolve can do this with SCORM, but not with xAPI.

This is causing issues where the scored evaluation lives in the LMS rather than in the xAPI module itself. Because the xAPI module still reports a score of zero, it skews the user score reports when the LMS averages that zero with the actual result.
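
For what it's worth, `result.score` is an optional object in the xAPI specification, so omitting it entirely should be valid. A minimal sketch of a completion statement with no score (the IDs are placeholders):

```ts
// A sketch of a "completed" statement that deliberately omits result.score,
// so the LMS has no zero to average into the learner's score. IDs are placeholders.
const completedStatement = {
  actor: { objectType: "Agent", mbox: "mailto:learner@example.com" },
  verb: {
    id: "http://adlnet.gov/expapi/verbs/completed",
    display: { "en-US": "completed" },
  },
  object: {
    objectType: "Activity",
    id: "https://example.com/course/module-1", // placeholder
  },
  result: {
    completion: true,
    // no "score" key at all, rather than score: { raw: 0 }
  },
};
```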

Hi @hbailey, can you tell us if this is on the roadmap for Evolve at all?

I am interested in this as well.

Hi, I posted this in another thread asking about interaction tracking, but it's relevant here also:

“We are considering working on reporting from a more holistic standpoint (rather than targeting one or two particular standards), but it would be a significant undertaking in development time, so we don’t have a definite timescale for this work yet. When we do the work, we would undertake research first to find out more about the kind of problems we would be trying to solve.”

Thanks for the update, Sam.

There is definitely an appetite for better reporting, and I'm intrigued to see what Intellum come up with.

Thank you for the update, Sam. I'm very interested in the outcome and timeframes once you have an idea of them. Happy to be part of the research too!

Hi there,

Did anyone try testing Evolve xAPI courses in ScormCloud?

When I checked my Evolve xAPI course in ScormCloud, I wasn’t able to get a started-but-not-finished statement. I also used some variables and logic to set a course score throughout the course, but it looks like ScormCloud didn’t receive those either.
Testing the same course in our LMS produced the same results, so I suspect that Evolve’s xAPI courses don’t send those statements.
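
For reference, this is roughly the started-but-not-finished statement I was expecting to see in the statement viewer. The ADL "initialized" verb is commonly used for this; the actor and activity IDs here are just placeholders:

```ts
// A sketch of the "course started" statement I expected to see.
// The ADL "initialized" verb is the conventional choice; IDs are placeholders.
const startedStatement = {
  actor: { objectType: "Agent", mbox: "mailto:learner@example.com" },
  verb: {
    id: "http://adlnet.gov/expapi/verbs/initialized",
    display: { "en-US": "initialized" },
  },
  object: {
    objectType: "Activity",
    id: "https://example.com/course", // placeholder
  },
  timestamp: "2024-01-01T09:00:00Z", // when the learner opened the course
};
```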

Can someone confirm?
We really need to know whether learners have started a course. Maybe someone has a workaround for this?