Releases

Out now: Numbas v3.0, the marking algorithms rewrite

Today we’ve released Numbas v3.0. It’s the thing I’m second-most proud of producing in the last year (my daughter was born last October).

The marking code at the heart of Numbas has been completely rewritten, to make it much easier for question authors to change how students’ answers are marked. This has also allowed the introduction of custom part types, to make it easier to use and reuse different marking algorithms. Read the rest

Development log: March 2017 – Usability improvements

We’ve made a few changes to Numbas and the editor recently, with the aim of improving usability. It involved moving some parts of the editor around, so I thought I’d better show what we’ve done.

First of all, when you submit an answer to a part of a Numbas question, the input changes colour depending on the score you were awarded. (If you’ve got score feedback turned off, it turns the same colour no matter how you did.)

Here’s an example:

Read the rest

Development log: December 2016, Numbas 2.1

Numbas has acquired a few new features and had a bit of a tidy-up in the last couple of months, so I thought it was time to bump the version number up to 2.1 and let you all know what’s been happening with another development log.

Groups of questions in exams

You can now separate the questions in an exam into groups, allowing you to pick a subset from each group at random. This feature was requested by Ione Loots at the University of Pretoria, who wanted a way of showing students a randomly-picked variation of each question in a test. (documentation, issue) Read the rest

Run Numbas in more places than ever before with the new LTI tool provider

We’re happy to announce the release of a Basic LTI 1.1 tool provider for Numbas exams.

[Screenshot: the Numbas LTI tool provider dashboard]

One of the more complicated parts of using Numbas is getting it to work with your Virtual Learning Environment (VLE). We designed Numbas to use the SCORM standard, which ideally would allow it to run in any SCORM-compliant VLE without any configuration or input from the server administrator. However, there have always been a couple of wrinkles in that plan: not all VLEs support SCORM, and some of those that claim to don’t do it properly.

Blackboard’s SCORM player has a few long-standing bugs and missing features which mean that we haven’t recommended it for serious use. Since we can’t fix those problems ourselves, we’ve spent a long time trying to find ways to work around them. Additionally, when a large contingent of Norwegian lecturers visited us for the MatRIC colloquium this April, we discovered that very few institutions in Norway use VLEs which support SCORM. Someone suggested we look at LTI, since many more VLEs seem to support it. Read the rest

Embed GeoGebra worksheets in Numbas and mark them

You can now embed GeoGebra applets in Numbas questions and, using GeoGebra’s new exercises feature, award the student marks based on constructions within the applet. This is a huge step forward, making it much easier to include interactive diagrams in Numbas questions.

Here’s a video showing how to embed a GeoGebra applet in a Numbas question, and award the student marks if they complete a certain construction. There are even steps, giving marks for each stage of the process!

You can use the values of Numbas question variables in the definitions of objects in the GeoGebra worksheet, meaning that diagrams can accurately reflect the rest of your question.

In this question, the gradient of the slope and coefficient of friction are randomly generated in Numbas, then passed to the GeoGebra applet.
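
To give a rough idea of the Numbas side of this, here’s a minimal sketch of the kind of variable definitions such a question might use. The names and ranges are invented for illustration; the call that actually embeds the applet and passes these values to it is described in the extension’s documentation.

    // hypothetical question variables for a slope-and-friction question
    slope: random(10..40)    // angle of the slope, in degrees
    mu: random(1..9)/10      // coefficient of friction

These values can then be used in the definitions of objects in the GeoGebra worksheet, so the diagram the student sees matches the randomised numbers used elsewhere in the question.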

I’ve put together a small demo exam with a couple of questions showing some ways you can use GeoGebra inside Numbas.

To get started using GeoGebra in your own Numbas questions, read the extension’s documentation.

Numbas 2.0

I’m very proud to announce the release of Numbas 2.0, which features a completely rewritten editing interface and a reorganised item database.

[Screenshot: the Numbas v2 editor homepage]

We’ve added some very helpful new features, and changed the way the database is organised to make working in groups much easier. All exams and questions in the editor database are now organised into projects, which provide a simple way of collecting together material relating to a particular course or activity in one place.

Projects allow you to automatically grant editing rights to a group of collaborators, keep track of changes that have been made to your content, and filter out irrelevant material. Project-level comments make it easier to coordinate writing, testing, and deployment of questions and exams with your team members.

We’ve rebuilt the editing interface from the ground up, to make it cleaner and easier to use. Read the rest

A tool to analyse Numbas attempt data from a Blackboard course

Good news, everyone! We’ve found a way to get useful information about attempts on Numbas tests out of Blackboard, and present it like this:

[Screenshot: a report generated by the Blackboard SCORM analysis tool]

The current situation at Newcastle is that all of our in-course Numbas tests which count towards credit are run through a Moodle server set up specifically for the purpose, even though our institutional VLE is Blackboard.

The reason for that boils down to the fact that Blackboard doesn’t make it easy to analyse data to do with SCORM packages: the built-in SCORM reports don’t give much useful information and are tedious to generate, and it’s unclear where in the database the SCORM data lies. If a student claims that their answer was marked incorrectly, we have no way of checking it because Blackboard only gives you the student’s response to an interaction, and not the expected response. And sometimes that’s not enough: it’s much easier to work out where a student’s gone wrong if you can load up the test as they saw it. SCORM has a review mode which does that, and while I was able to add support for that to the open-source Moodle server, Blackboard is a black box and brooks no intervention.

Can you work out where this student went wrong?

Read the rest

Development log: August 2015

[Screenshot: the rewritten Numbas theme on a large screen]

Development of Numbas has continued apace over the Summer break. I’m about to go on holiday for a couple of weeks, so I thought I’d write a development log to keep you up to date with all the latest changes.

The biggest change is that I completely rewrote the default theme to use the Bootstrap framework. As well as making everything look more “modern”, it should make using Numbas on smaller screens a lot easier. When the screen is below a certain width, the question list collapses into a sliding menu, which you can reveal by clicking on the icon at the top left of the screen. The old layout, with all the navigation bumped to the bottom, led to a lot of scrolling up and down.

[Screenshot: the rewritten theme on a small screen, with the question list collapsed into a sliding menu]

Other changes

The Numbas runtime:

  • New JME functions: len(set) (code), reverse(list) and indices(list,value); there’s a short sketch of these after this list. (code, documentation)
  • There’s now a version of the table function which doesn’t require a list of column headers. (documentation)
  • The shuffle function now works on ranges as well as lists. (code, documentation)
  • “Match choices with answers” parts now have a couple of layout options, which let you remove certain elements from the grid. You might want to do this when your grid is symmetric (for example, when asking the student to state which elements of a set are equivalent to each other). (code, documentation)
  • Fixed a bug in the random number generator seed which caused a warning in Chrome. (issue)
  • The code to count significant figures in a number now copes with E notation. (issue)
  • The value of cmi.session_time in the SCORM data model is now set properly. (code)
  • The “noLeadingMinus” simplification rule rewrites -0 to 0. (code)
  • Added an option to not show the results page when the exam is finished. (issue)
  • When a question only has one part, there’s no longer a “submit part” button in the part feedback box. Instead, there’s just the “submit answer” button at the bottom of the question. (code)
  • As part of the groundwork for enabling adaptive marking, part objects now have a getCorrectAnswer method which returns the correct answer to the part in a given scope. (code, documentation). Each part also has a method studentAnswerAsJME which returns the student’s answer to the part as a JME data type. (documentation)
  • Part marking scripts need to store some information which the validation script uses to decide what feedback to give. This should now be stored in this.validation. (code)
  • Fixed a bug in Numbas.jme.display.mergeRuleset which led to some rules going missing. (code)
  • If the “minimum/maximum number of marks” options in a multiple choice part are left empty, they’re treated as 0. (issue)
  • The JME function zip(lists) no longer gets stuck in an infinite loop if you give it no arguments. (code)
  • There’s now a function Numbas.jme.tokenToDisplayString which turns a JME token into a representative string, and a dictionary Numbas.jme.typeToDisplayString which defines how that behaves for each data type. (code, documentation)
  • Fixed a bug where the names of expected variables in “mathematical expression” parts didn’t have excess whitespace trimmed. (issue)
  • Added a function Numbas.util.nicePartName which gives a human-readable identifier for a part. (documentation)
  • Nested unary minus and plus now get brackets around them when rendered as LaTeX. (issue)
  • The source code for each part type is now in a separate file, and the builtin JME functions are in a separate file to the core JME interpreter. This should make the code easier to maintain. (code)
  • Added display-only JME functions sub(name,index) and sup(name,index) to display variable names with arbitrary subscripts or superscripts. (code, documentation)
  • The logic around marking parts with zero marks has changed so we can give more useful feedback. Previously, parts with zero marks available just weren’t marked, but sometimes you want a part to be marked for adaptive marking, or just to get some feedback. Parts with zero marks now show a tick or a cross, even when they don’t contribute to the total score. (code)
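
As a quick, made-up illustration of a few of the new list functions mentioned in the runtime changes above (all names and values are hypothetical):

    marks: [2, 3, 5, 3]
    backwards: reverse(marks)    // [3, 5, 3, 2]
    threes: indices(marks, 3)    // positions of 3 in the list: [1, 3]
    order: shuffle(0..3)         // shuffle now accepts a range as well as a list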

The Numbas editor:

  • Fixed the logic to decide when to show the delete button for questions and exams. (issue)
  • Fixed a bug where links to pages on the same domain as the Numbas editor were made relative. (issue)
  • Added many help-page links that had been missing from the various part editor tabs. (issue)
  • Fields which take a JME expression use a monospace font, so they’re more readable. (issue)
  • Question and exam descriptions are sanitized to remove bad HTML. (somebody put a whole form element in their description!) (code)
  • The question search page now takes an exclude_tags parameter so you can exclude questions with certain tags. There’s no user interface for this yet. (code)
  • The sorting of tags on the question edit page is now case insensitive. (issue)
  • Added a feedback label “Needs to be tested”. (code, documentation)
  • Fixed a bug involving custom functions whose names contain a capital letter. (issue)

Adaptive marking based on answers to other question parts

A long-standing limitation of Numbas has been the inability to offer “error carried forward” marking: the principle that a student should only be penalised once for making an error in an early part. When calculations build on each other, an error in the first part can make all the following answers incorrect, even if the student follows the correct methods from that point onwards.

Numbas now supports adaptive marking: question variables can be replaced with the student’s answers to other parts before a part is marked.
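
As a minimal sketch with made-up variable names: suppose a question defines the variables below, with one part asking for the value of product and a later part asking for total.

    a: random(2..9)
    b: random(2..9)
    product: a*b            // expected answer to the first part
    total: product + 10     // expected answer to the later part

If the later part is set up to replace product with the student’s answer to the first part, total is recalculated from whatever the student actually entered, so a student who miscalculates a*b but then correctly adds 10 still earns the later part’s marks.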

Read the rest