April 2, 2020 at 11:44 am #441679
Hi y’all – we’re pushing out our training package on Monday and are doing final user testing. No one has been able to successfully get past one question near the middle of the training. I finally tried something that seems to work, so I am posing a question here: do multiple response questions have a limit on how many correct answers are possible?
The item in question asks users to select all topics discussed in the (long and meandering) section. I compiled a list of 8 choices, 4 correct and 4 incorrect. Yet the LMS consistently failed to advance the course past the question, regardless of which – or how many – options you chose. So I cut the list to 6 choices (3 of each), and now the LMS processes the page with no problems. (None of the answer options was anywhere near 255 characters.)
Has anyone else seen this behavior with multiple response questions before, or is this further evidence that there is extant alien life somewhere influencing software behavior (in other words, gremlins with coronavirus or something)?
Thanks in advance!
April 2, 2020 at 11:52 am #441681
Adam Cain · Member · 22 pts · @cainam
I think the longest one I’ve used had 10 choices, with no issues. So the only time you have problems is when the course is loaded into the LMS? If you publish as HTML and try it out locally, does it have any problems?
April 2, 2020 at 12:06 pm #441696
Brian Robertson · Member · 12 pts · @brobertson4402
As far as I know there is no limit on the number of answers to a multi-select question. I suspect there is some other issue with the question that was resolved when you reduced the number of possible answers. How is it processing feedback, etc.? I don’t think your LMS processes the page; that should depend on the Lectora output for any page processing. What happens when you preview in Lectora? Does it work there?
April 2, 2020 at 2:50 pm #441700
Trying the question locally, and previewing in Run mode, shows nothing unusual. All behaviors (including correct answers) are as expected. I’m thinking there are other culprits at work somewhere, because earlier multiple response (and other format) questions show no such problems.
Thanks for the tips!
April 2, 2020 at 5:46 pm #441703
Tim K · Member · 346 pts · @timk
Do you retain the value of the question variable(s) between sessions? The space to save variables in the LMS is limited. This could explain why it works with a “shorter” question.
Are there any errors in the console?
April 3, 2020 at 11:36 am #441748
Tim – thanks – Yes, question variable values are retained. And it’s intriguing that a selection pool of 8 items would cause some type of “overflow” when a drag/drop selection pool (a different question) of 12 items does not.
Are we seeing errors? MULTIPLE. And frankly it’s driving me nuts (ok, more so than usual). The errors we see when testing in the LMS (a platform called Journey, from a vendor named Red Vector) are usually the one attached. Sometimes they’re preceded by a window that drops open and warns that stack value limits are exceeded. The odd part is that the errors appear only during quizzes (called knowledge checks in this module).
Each of five sections has a 10-12 question quiz at the end, involving an array of t/f, multiple choice, multiple response, and drag/drop items. The errors do not appear in the first quiz, but as the user proceeds into the second quiz (after the second section) they start appearing more or less randomly. I’ve rebuilt the question items several times, validated that the question numbers and variables correspond correctly, and have just about stood on my head while publishing (no errors). But the errors continue to pop up. We’ve submitted a ticket to the vendor and asked for investigation on their side but so far there’s no response.
Attachments:
April 3, 2020 at 11:50 am #441753
Adam Cain · Member · 22 pts · @cainam
Years ago I remember modifying a test because I ran into some variable-storage limits. I can’t remember the exact scenario, or whether this will help, but what I did was leave the actual text out of the answers themselves: I used just 1a, 1b, 1c (or a shortened version of each answer) as the option text, and added text blocks outside the test question with the real answers/distractors. (Just hide the text for the actual question and use your own.) That cuts down the variable length for each question.
April 3, 2020 at 12:12 pm #441755
Tim K · Member · 346 pts · @timk
cmi.suspend_data is indeed the field in the LMS where all values that are “retained between sessions” are saved. In SCORM 1.2 the field is limited to 4096 characters (for the whole course, not per question). As you can see in your screenshot, a lot of that space is already used for page tracking.
By the way: the variable of a drag-and-drop question with 12 items is 53 characters long, while the variable of a multiple choice question is as long as all of the selected option texts, so it can easily be much longer.
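A rough sketch of that arithmetic (the per-item character counts below are hypothetical, for illustration only; real numbers would come from inspecting the actual suspend_data string):

```javascript
// SCORM 1.2 caps cmi.suspend_data at 4096 characters for the whole course.
const SUSPEND_DATA_LIMIT = 4096;

// Hypothetical character counts for values retained between sessions.
const retainedValues = {
  pageTracking: 1800,    // visited-page bookkeeping (often large)
  quiz1Answers: 900,     // selected option texts from the first quiz
  quiz2Answers: 1100,    // selected option texts from the second quiz
  bigMultiResponse: 400, // the 8-choice multiple response question
};

// Sum the retained values and compare against the field limit.
const total = Object.values(retainedValues).reduce((a, b) => a + b, 0);
console.log(`${total} of ${SUSPEND_DATA_LIMIT}:`,
  total > SUSPEND_DATA_LIMIT ? "over the limit" : "ok");
```

With numbers like these the total quietly crosses 4096 somewhere mid-course, which matches the symptom of errors starting “randomly” in the second quiz rather than at any one fixed question.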
But all values are stored in the same field, so it is not a problem with this individual question; it is just the one that usually exceeds the limit. The next question could do it instead, if your answers to the previous questions happen to be very short, e.g. if you select only 1 option for each mc question.
April 3, 2020 at 12:54 pm #441758
Tim and Adam – THANKS!! This is great information, and Adam, a great workaround! I will see how I can modify these questions as soon as I can get v18 up and running. (Yeah, I thought I could get it done fairly easily…boy was I wrong…)
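For anyone finding this thread later, a minimal sketch of Adam’s short-label workaround, assuming (this is an assumption, not Lectora’s actual serialization) that the question variable joins the selected option texts with commas:

```javascript
// The question variable stores the text of each selected option, so
// shorter option labels mean a shorter stored value and less of the
// 4096-character suspend_data budget used per question.
function storedValueLength(selectedOptions) {
  // Assumed comma delimiter; the real delimiter may differ.
  return selectedOptions.join(",").length;
}

// Made-up full answer texts versus short labels that point to text
// blocks placed outside the question, as Adam suggests.
const fullTexts = [
  "Proper donning and doffing of personal protective equipment",
  "Hand hygiene requirements before and after patient contact",
];
const shortLabels = ["2a", "2c"];

console.log(storedValueLength(fullTexts));   // well over 100 characters
console.log(storedValueLength(shortLabels)); // 5 characters
```

Selecting the same two answers stores a fraction of the characters, which is why trimming labels frees up suspend_data for every other retained value in the course.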