Staying within Suspend Data Limits of our LMS while using Logic in Courses

We are exploring the great new Logic-based features in Evolve 7, but we’ve learned that it’s possible to store too much data in our LMS’s Suspend Data. We want to make sure we don’t accidentally create a course so complex or large that it exceeds our 64KB limit and breaks course functionality.

To back up:

Using the new Logic variables requires you to turn on Active Tracking when you publish your course. This changes the way your course content uses the Suspend Data in the LMS.

Our LMS is set to allow up to 64KB of Suspend Data, but we also know that exceeding this limit produces errors in the communication between the SCORM package and the LMS, making it impossible for the LMS to store any more data once the limit has been exceeded.
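In other words, the check that matters seems to be the byte length of whatever string ends up in `cmi.suspend_data`, measured against the LMS limit. A minimal sketch of that check (the function name and limit constant are ours for illustration, not anything from Evolve or the SCORM spec):

```python
# Generic pre-flight check: does a string fit the LMS Suspend Data limit?
# 64 * 1024 matches our LMS setting; adjust for the target LMS.
MAX_SUSPEND_BYTES = 64 * 1024

def fits_suspend_limit(data: str, limit: int = MAX_SUSPEND_BYTES) -> bool:
    """UTF-8 byte length, not character count, is what counts against the limit."""
    return len(data.encode("utf-8")) <= limit

print(fits_suspend_limit("x" * 4096))         # True: 4 KB fits easily
print(fits_suspend_limit("x" * (70 * 1024)))  # False: 70 KB is over the limit
print(fits_suspend_limit("é" * (40 * 1024)))  # False: multibyte chars count double
```

Note the last case: bytes and characters diverge as soon as non-ASCII text is involved, so character counts alone can underestimate what the LMS actually stores.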

I created a test course with three text input fields that all save their contents as variables, and pasted eight paragraphs of lorem ipsum text into each field to try to fill up the storage. Even this unusual amount of user-generated data wasn’t enough to exceed the 64KB limit and bog down the course, even though it DID break the course under the original 4KB setting.
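Back-of-envelope arithmetic makes that result plausible (the ~500 bytes per lorem ipsum paragraph is an assumption, and real suspend data adds serialisation overhead we can’t see from the outside):

```python
# Back-of-envelope estimate for the test course described above.
PARAGRAPH_BYTES = 500       # assumed size of one lorem ipsum paragraph
FIELDS = 3                  # text inputs saving their contents as variables
PARAGRAPHS_PER_FIELD = 8

payload = FIELDS * PARAGRAPHS_PER_FIELD * PARAGRAPH_BYTES
print(f"~{payload / 1024:.1f} KB of raw text")       # ~11.7 KB
print("exceeds 4 KB limit: ", payload > 4 * 1024)    # True
print("exceeds 64 KB limit:", payload > 64 * 1024)   # False
```

So roughly 12 KB of user text would sail past a 4KB limit but leave plenty of headroom under 64KB, which matches what the test showed.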

Still, we’re trying to imagine any scenarios we could build that would be too big.

Has anyone else dealt with this issue and done your own testing? Any insights you could share that would help us develop guidelines for “not breaking courses” would be very helpful.

Thanks-
Ted

We have extensive experience using Logic in courses built in Evolve, and with exceeding the same 64 KB Suspend Data limit in our own LMS and in client LMSs.

Evolve has a guide to Logic, and I think the following page from it is pertinent to your question:

https://appitierre.zendesk.com/hc/en-us/articles/360015569619-Using-Logic-with-SCORM

On this page, it states in bold, “We recommend that the suspend data limit should be at least 64kb for the average sized course (3 pages with around 5-10 components per page).”

I think our smallest course has maybe ten pages, and our older ones definitely have more than five to ten components per page :rofl:

The only proper solution to this is to have the option in your LMS to extend or remove the Suspend Data limit. The page linked above shows how to do this if you have access to the admin settings within SCORM Cloud or Moodle.

We use TalentLMS currently, and have asked them if they are able to extend or remove the Suspend Data limit. Unfortunately, they are not currently looking to do that and have no plans on their roadmap to do so in the near future :man_shrugging:

It is also not that common for clients to be able to do this in the LMS they use, as they might themselves be a client of another company that sets up and controls the LMS, or their LMS might simply not have that option.

Suspend Data issues have been known about for a long time, yet it seems many LMSs still ship with this limitation, with limited or no options to change it.

After a lot of testing, the unfortunate conclusion is that if you HAVE to include certain features, such as Bookmarking, in longer courses containing long Pages full of Components and/or a lot of Variables and Triggers (our courses range from 0 to 461 Triggers and 0 to 194 Variables), the only “solution” when you keep running into issues is to enable the Local Storage Extension.

I say “solution” in double quotes because it can actually cause further issues itself. The TL;DR of Local Storage is that it utilises the user’s browser to store interaction information, such as what they have completed in the case of SCORM + Bookmarking, and works in conjunction with the information saved by the LMS itself.

However, this completely removes device agnosticism, and even browser agnosticism on the same device, for a user. If they get partway through the course on their laptop and later decide to log back into the course on another device, like a phone or tablet, their progress won’t be saved and they will have to start that same course from the beginning (or they will see some wacky behaviour, depending on the Logic included in the course).

Similarly, if they start the course on their laptop using Google Chrome and then switch to Mozilla Firefox to pick the course back up, they will have to start the same course from the beginning.

Furthermore, this will affect you and your colleagues when testing courses. Even if you Duplicate a course within Evolve, work on it, re-Publish, and upload it to the same LMS, the course will still grab bits of information saved in the browser from the previous incarnation, and the course features you were expecting to have been fixed will not work correctly, even after a reset of progress within the LMS.

We even have unconfirmed reports of the above issue when using a different LMS to test the same course in the same browser.

The workaround for the testing issue is to reset your progress in that course within the LMS, log out of your LMS, and then delete your browser history, including cookies, back to a point before you first opened that course’s first incarnation in your LMS. This can be a very bad thing if you don’t know all of your logins for the various platforms and apps you use, as that information will be deleted. It is also a terrible thing to tell clients, and for you or clients to tell users to do, because of the above downsides.

Although it looks like you’re not using it, xAPI files tend to exceed the 64 KB Suspend Data limit more often than SCORM + Advanced Tracking, because xAPI sends data for every interaction (viewing, clicking, interacting, etc.) rather than the more limited data sent by SCORM files.

A downside of SCORM files versus xAPI files concerns re-interactable MCQs. For xAPI files, if everything is working correctly and you’re not exceeding your LMS Suspend Data limit, the MCQs remain re-interactable for the user: they can freely be set to “Show Answer” if the user got the answer wrong when they answered it initially, even after logging out and back in again.

For the SCORM files, the MCQs are re-interactable to “Show Answer” while the course is open, but if a user logs out and logs back in again, the MCQs are no longer re-interactable and only show a Tick or Cross depending on whether they got it right or wrong when they first answered it.

Sorry for waffling on, but feel free to let me know if you have any follow-ups in a Reply.


Thanks for the detailed answer, Unique…, I love waffles and waffling!

I’m familiar with that bit in the documentation about the recommended suspend data limit being “at least” 64KB, and I’ve pointed out that it’s a limitation of the SCORM 1.2 standard, with a maximum of 64KB (does 2004 have a similar limit?). This says to me that there will always be a tight ceiling between the “average” Logic course and the limitations of the SCORM standard itself.

That’s unfortunate, so it seems we as Evolve designers need more tools within the interface to help us test ahead of time and make sure we’re not publishing courses that will exceed the limit. For example, my colleague recommended a character limit on input field components – that would help us control how much of that data is stored. Another idea is some kind of automated test within the Evolve interface (or during the publishing process?) that could tell us if we’re creating courses that are too big or complex.

As it stands, we are testing courses in SCORM Cloud to get a sense of which features bloat the suspend data, but would prefer this not to be a standard testing step in every course we publish! There must be a better way… :neutral_face:

That idea to use Local Storage is interesting but I see the limitations for longer courses if learners are going to complete them in more than one sitting, switching from mobile to desktop.

Do you have a sense of which components are the prime culprits for ballooning the suspend data? You mention courses with several pages and several components per page, so much of the data must be taken up just tracking the completion/bookmarking of each component (even if it’s not a Logic variable), correct?

We’re assuming that some of the simpler Variable components (like a confidence slider) would store maybe a single value like 1-5, or one byte of data, while others like the text input field can store several paragraphs of text, where 4 paragraphs roughly equals 4KB of suspend data. Is that your experience?
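As a rough check on that assumption (the ~1 KB paragraph size is assumed, and whatever serialisation overhead Evolve adds is invisible from here):

```python
# Raw footprint of two kinds of stored values (serialisation overhead ignored).
slider_value = "4"                               # a 1-5 confidence rating
paragraph = "Lorem ipsum dolor sit amet. " * 37  # roughly 1 KB of text
four_paragraphs = paragraph * 4

print(len(slider_value.encode("utf-8")), "byte for the slider")
print(len(four_paragraphs.encode("utf-8")), "bytes for four paragraphs")  # just over 4 KB
```

So one free-text field can plausibly cost thousands of times more storage than a slider, which is why input fields look like the component to watch.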

Also, thanks for raising that issue about the re-interactable MCQs. This is an issue my colleague encountered, and it’s nice to know that works in xAPI and not in SCORM. Is that a bug in Evolve’s implementation or is that due to the characteristics of the two standards?

SCORM 1.2 is technically limited to 4 KB and SCORM 2004 has the 64 KB limitation. However, as I understand it, this limitation is removed by Evolve when you select Advanced Tracking (or Evolve might simply not enforce it by default). We contacted Evolve about it and that was their response as far as I know, although I don’t have the email on hand to check.

This actually means that it’s completely down to the LMS itself to enforce a Suspend Data limit.

It would be great if there were a “visual” counter for the Suspend Data sent that we could enable (perhaps in an Extension) when Live Previewing a course in Evolve. I even tried a “hack”: extracting the zipped folder after Publishing and trying to follow a forum post on compressing the Suspend Data sent by the file.

Unfortunately, I’m not a coder, and the way courses are packaged by Evolve was too different from the way they were packaged by the authoring tool used in that forum post.
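For anyone who does code, the counter idea can be sketched generically: wrap whatever call persists the suspend data and log the size of each write. A Python sketch under stated assumptions (the `lms_set_value` stub is hypothetical; in a real SCORM package the equivalent wrapper would sit around the JavaScript API’s SetValue call):

```python
# Hypothetical shim: log the size of every suspend data write.
def lms_set_value(element: str, value: str) -> None:
    pass  # stand-in for the real LMS API call (SetValue in a SCORM package)

def set_value_with_counter(element: str, value: str, limit: int = 64 * 1024) -> int:
    """Forward the write, but report how much of the limit it uses."""
    size = len(value.encode("utf-8"))
    if element == "cmi.suspend_data":
        print(f"suspend_data write: {size} bytes ({100 * size / limit:.0f}% of limit)")
        if size > limit:
            print("WARNING: over the limit; the LMS may reject this write")
    lms_set_value(element, value)
    return size

set_value_with_counter("cmi.suspend_data", "a" * 32768)  # 32768 bytes (50% of limit)
```

Something like this running during testing would give exactly the kind of early warning a built-in counter would provide.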

An interesting aside is that the company behind that particular authoring tool went on to release a version that compresses the Suspend Data somehow, so that you are very unlikely to exceed the limits set by LMSs!
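To get a feel for why compression buys so much headroom, a generic illustration with zlib on repetitive mock state (this is not what that tool, or Evolve, actually does; repetitive key/flag strings just happen to compress extremely well):

```python
import base64
import zlib

# Highly repetitive state text (repeated keys/flags) compresses very well.
state = "component_complete=true;" * 200   # 4800 bytes of mock suspend data
packed = base64.b64encode(zlib.compress(state.encode("utf-8"))).decode("ascii")

print(len(state), "bytes raw")
print(len(packed), "bytes compressed + base64-encoded (still a plain string)")
```

Base64 is used so the compressed bytes remain a plain string the LMS can store; even with that overhead the saving on repetitive state is dramatic.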

I know that SCORM Cloud has really good granular admin options where you can actually set the Suspend Data limit to match the Suspend Data limit for the LMS that the course is supposed to end up in, but I still find testing in-platform much more useful (except when I have to keep deleting my browser history).

As you correctly surmised, I think Bookmarking is the biggest culprit, especially in xAPI courses, but also in long SCORM courses. I’ve had complete resets of courses with no Logic but containing three pretty long pages with fifty Components each, which required Local Storage to be enabled to work correctly.

MCQs probably contribute a little, but Bookmarking definitely requires the most data imo. I honestly don’t know how much Suspend Data viewing an embedded/external graphic, Text component, or Media Carousel would create. We’ve had better success with our video-only courses, as the videos are hosted externally, so there’s barely anything on a page (Video Stream, a Links Component to download a course file, maybe MCQs).

As for the MCQ thing, I think it’s a difference between the two standards. The xAPI file must create and store more data when a question is interacted with, including what the correct answer was initially set as. It seems that SCORM only flags whether the question is complete and whether the selected answer was correct. This is not actually an issue we’ve raised with Evolve Support, as we currently treat re-interactable MCQs as low priority.

It would be a different story if these courses were intended for exam revision where knowing what the correct answer is would be useful. Let me know if you raise that issue with Evolve as I’d be interested to know whether it can be fixed on their end or whether it is unfixable due to the difference in standards.


I can confirm that our IT team was able to raise our LMS Suspend Data limit to 64KB for SCORM 1.2 packages in Saba Cloud, so the limit must be set by the LMS rather than being internal to the content.
