Lectora Forums – Tips and Tricks: Directed Evaluation Questions Based on Variables

Viewing 4 posts - 1 through 4 (of 4 total)
#407224 Score: 0
Andrew.Robertson (@Andrew.Robertson)
Member

This is a fairly large question for a tip or trick, and I think I’ve thought it out well, but I want to see if anyone else has done this or something similar.

We have a geographically diverse population with different skills and different requirements at over 90 sites. We’d like to administer one test so that it rolls up properly within our LMS, with a single enrollment for everyone rather than different enrollments per site (it makes LMS administration easier).

    Is there a way to have a pool of 50 questions and then only display certain ones based on information obtained from the LMS (e.g. user ID or site) or the user (e.g. self selection of the site)? I was thinking of using an IF/THEN check that would run when the page is loaded that would either automatically skip to the next applicable question using a next page action or display the page for user response.
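The page-load check described above could be sketched roughly like this (a hypothetical JavaScript sketch only — `SiteID`, the question/site mapping, and `goToNextPage` are illustrative stand-ins for Lectora variables and actions, not real Lectora APIs):

```javascript
// Hypothetical sketch of the "check on page load" idea. The site ID
// would come from the LMS (e.g. user ID lookup) or user self-selection;
// the mapping and function names below are illustrative.
const questionSites = {
  Q01: ["SiteA", "SiteB"], // only these sites see question 1
  Q02: ["SiteC"],
};

// Show the question if the learner's site is in its list (or if the
// question has no site restriction at all).
function shouldShowQuestion(questionId, siteId) {
  const sites = questionSites[questionId];
  return sites ? sites.includes(siteId) : true;
}

// Run when the question page loads: either display the page for a
// response, or jump ahead like a "Go To Next Page" action.
function onQuestionPageShow(questionId, siteId, goToNextPage) {
  if (!shouldShowQuestion(questionId, siteId)) {
    goToNextPage();
  }
}
```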

    #407367 Score: 1
Tim K (@timk)
Member
251 pts

I’d say it depends on how exactly you want to set up the test. Do you plan to present each learner with an individual selection of questions, or something like three levels, e.g. “Beginner”, “Advanced” and “Expert”? The latter should be easier to do. You write that you want ONE test – is there a reason for that? A Lectora course can easily contain several tests, one per level, or one test with a section for each level. Either way you can calculate an LMS score from the separate test or section scores. Skipping questions is more difficult to handle, because skipped questions are still part of the test and will be evaluated as “incorrect” when left empty.

    #407378 Score: 1
mallow76 (@mallow76)
Member
22 pts

I’d agree with Tim. Don’t try to achieve this with one single test in Lectora. If you have a core set of questions that everyone should be asked, group them into one test that is shown to everyone. Then create separate tests for all the other variations you want to target – skill level, location, etc. – and show those tests to whoever you wish using your IF/THEN logic (I’d think it will probably require the user to self-select).

You will then need to convert the score at the end of the module, because the default % score will not be correct (since, for example, a user may have completed only 2 tests out of a possible 10). In that example you would multiply the total score by 5 and then pass that value to the LMS (or, alternatively, work with the individual test scores to calculate the proper score that way). NB: to keep the calculated total scores consistent and fair, all of the optional tests should really have the same number of questions, weightings, etc.
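The conversion above can be sketched in a few lines (a hedged JavaScript sketch, assuming 10 equally weighted tests of which each learner completes 2 — the core test plus one optional test; the function name is illustrative, not a Lectora API):

```javascript
// The LMS default % counts every test, taken or not, so scale it back
// up by the ratio of total tests to tests actually taken.
function adjustedScore(defaultPercent, totalTests, testsTaken) {
  return defaultPercent * (totalTests / testsTaken);
}

// e.g. two tests scored 80% and 90%: the default % across 10 equally
// weighted tests is (80 + 90) / 10 = 17, which scales to the true
// average of 85.
console.log(adjustedScore(17, 10, 2)); // 85
```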

    #407490 Score: 1
CarlFink (@CarlFink)
Member
13 pts

And I agree with both Tim and mallow76. The system mallow76 describes is, in fact, exactly what I’ve done in the past, and it works well.

