James Denholm-Price has been using Numbas to test first year Linear Algebra students for more than two years.

Adopting Numbas allowed James to replace his multiple-choice paper question sheets with online tests, giving students the chance to check their knowledge of the subject with tests that changed on each attempt.

The randomly-generated questions reduced the risk of copying, and up to 94% of students participated in the practice tests.

Teachers were able to test new e-assessment questions “live” before adding them to final summative assessments.

Clinical Nurse Educator Antony Robinson used Numbas to test new emergency nurses at the Royal Darwin Hospital in Australia.

Antony adapted the tool to help new members of the hospital’s emergency department interpret results from an electrocardiogram, a device which measures the electrical activity of the heart.

Nurses were shown a quiz featuring animated heart rhythms, and had to answer a series of questions on what they saw before interpreting the rhythm itself.

Antony said he used Numbas for the tests as it was flexible enough to meet his needs.

**“The commercial tools weren’t able to provide for this use case easily”**, he said.

Carolijn Tacken teaches maths to classes of 12 to 18-year-old students in the Netherlands.

As many of her class use technology to research and learn, she wished to create quizzes that would reinforce the lessons in each chapter of their textbooks.

Using Numbas, she has created tests to help her students prepare for exams. Some have been uploaded to the school’s Virtual Learning Environment, while others have been added to iBooks and shared using Dropbox.

“It’s great that Numbas works on many different platforms, as a lot of my students have iPads”, she said.

Alex Goddard had recently joined the university as part of the teaching team in the Department of Chemistry. He was keen to give students a way to supplement their learning with online self-check tests.

“I was advised to use Numbas for this purpose, and got to grips with it after playing around with it for a few days”, he said.

Writing the majority of questions himself, he offered the tests to students in the university’s Virtual Learning Environment.

He hopes to roll the tests out to more year groups, and deliver tests on different topics in future.

Development on Numbas continues apace, so I thought it’d be a good idea to start writing about changes in more detail here so they don’t slip by unnoticed.

We’ve just released some major new features for Numbas, which means we’ve bumped the version number up to **v1.6**.

The two big changes are **template questions** and **custom marking scripts**.

Last week I deployed the new question search and organisation interface to the **math**centre editor. We noticed that the global question database was becoming quite unwieldy now that we have so many users (not complaining!), and finding both your questions and good questions written by others was getting harder. The new interface downplays the big list, instead presenting you with a few different ways in to the most useful parts of the site.

The questions index page now shows you a kind of ‘dashboard’, with links to the most popular tags, your recently-edited questions, as well as some highlights picked by us and your starred questions – you can save a question to this list by clicking on the star next to its name on the question edit page. You can still search the entire database by entering keywords or question titles in the search box at the top of the page.

It’s not a coincidence that we also delivered a workshop on using Numbas at eAssessment Scotland 2013. An hour really wasn’t long enough to do very much at all, but everyone seemed very positive about Numbas and keen to investigate it further. I asked for a show of hands at the start to find out who had signed up for the workshop because they’d already heard of Numbas, and I was pleasantly surprised by the response!

In future conference action, James Denholm-Price of Kingston University London will be giving a talk titled “Using Numbas for formative and summative assessment” at CETL-MSOR 2013 on the 10th of September. James used Numbas in his linear algebra course last year and has many interesting things to say about it. Also at CETL-MSOR, our two summer students, Hayley Bishop and Sarah Jowett, will be talking about their work on the maths support wiki we’re creating with Birmingham University, to complement our respective maths support centres. More on that later!

On the 23rd of October I’ll be giving a talk about Numbas at an IOP-sponsored event on “Promoting learning through technology” at Edinburgh University. I don’t think the event has a webpage or even a definite venue yet; I’ll give details when I have them.

Finally, we’ve set up a numbas-users mailing list on Google Groups. The idea is to have a place to discuss Numbas use, ask and answer questions about authoring, and talk about features you’d like to see. Bill has started it off by asking for comments on the new searching interface.

*By Dr. Nicholas Parker.*

### a. Introduction

Since 2008 the School of Mathematics and Statistics has incorporated computer-based assessments (CBAs) into its summative, continuous assessment of undergraduate courses, alongside conventional written assignments. These CBAs present mathematical questions, which usually feature equations with randomized coefficients, and then receive and assess a user-input answer, which may be in the form of a numerical or algebraic expression. Feedback in the form of a model solution is then provided to the student.
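The cycle described above — randomize coefficients, collect an answer, mark it, show a worked solution — can be sketched in miniature. The code below is purely illustrative and does not represent the internals of Numbas or *i-assess*; the function names and the linear-equation question type are assumptions chosen for the example.

```python
import random

def make_question(seed=None):
    """Generate one realization of 'solve a*x + b = c for x' with
    randomized coefficients, plus its model solution.

    Illustrative sketch only -- not the actual CBA software internals.
    """
    rng = random.Random(seed)
    a = rng.randint(2, 9)
    b = rng.randint(1, 9)
    c = rng.randint(10, 30)
    prompt = f"Solve {a}x + {b} = {c} for x."
    answer = (c - b) / a
    # The model solution is built from the same parameters, so the
    # feedback always matches the student's particular realization.
    solution = f"x = ({c} - {b})/{a} = {answer:.4g}"
    return prompt, answer, solution

def mark_numeric(submitted, answer, tol=1e-3):
    """Accept a numerical answer within a small tolerance."""
    return abs(submitted - answer) <= tol
```

Because each seed yields a numerically different realization, two students sitting the same assessment see different coefficients, which is what reduces the risk of copying.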

From 2006 until the last academic year (2011/2012), the School employed the commercial *i-assess* CBA software. However, this year (2012/2013) the School rolled out a CBA package developed in-house, *Numbas*, to its stage 1 undergraduate cohort. This software offers greater control and flexibility than its predecessor to optimize student learning and assessment. As such, this was an opportune time to gather the first formal student feedback on CBAs within the School. This feedback, gathered from the stage 1 cohort over two consecutive years, would provide insight into the student experience and perception of CBAs, assess the introduction of the new *Numbas* package, and stimulate ideas for further improving this tool.

After an overview of CBAs in Section b and their role in mathematics pedagogy in Section c, their use in the School of Mathematics and Statistics is summarized in Section d. In Section e the gathering of feedback via questionnaire is outlined and the results presented. In Section f we proceed to analyze the results in terms of learning, student experience, and areas for further improvement. Finally, in Section g, some general conclusions are presented.

### b. A Background to CBAs

**Box 1:** Capabilities of the current generation of mathematical CBA software.

- Questions can be posed with randomized parameters such that each realization of the question is numerically different.
- Model solutions can be presented for each specific set of parameters.
- Algebraic answers can be input by the user (often via LaTeX commands), often supported by a previewer for visual checking.
- Judged mathematical entry (JME) is employed to assess the correctness of algebraic answers.
- Questions can be broken into several parts, with a different answer for each part.
- On top of algebraic/numerical answers, more rudimentary multiple-choice, true/false and matching questions are available.
- Automated entry of CBA mark into module mark database.
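A common strategy for the “judged mathematical entry” capability listed above is to compare the student’s expression with the model answer numerically, by evaluating both at several sample points. The sketch below illustrates that idea only; real systems such as Numbas parse expressions with a dedicated grammar rather than Python’s `eval`, and the function name and sampling range here are assumptions.

```python
import math
import random

def algebraically_equal(expr_a, expr_b, var="x", samples=10, tol=1e-6):
    """Judge two algebraic expressions equivalent by comparing their
    values at several random sample points.

    Illustrative sketch of the sampling idea only; production marking
    engines use a safe expression parser, not eval().
    """
    rng = random.Random(0)
    env_fns = {"sqrt": math.sqrt, "sin": math.sin, "cos": math.cos}
    for _ in range(samples):
        x = rng.uniform(1, 2)  # sample away from likely singular points
        env = dict(env_fns, **{var: x})
        va = eval(expr_a, {"__builtins__": {}}, env)
        vb = eval(expr_b, {"__builtins__": {}}, env)
        # Relative tolerance so large values are not judged too strictly.
        if abs(va - vb) > tol * max(1.0, abs(va)):
            return False
    return True
```

With this approach, `(x+1)**2` and `x**2 + 2*x + 1` are judged equal without any symbolic manipulation, while structurally similar but inequivalent answers such as `x**2` and `x**3` are rejected.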

Computer-based assessment (CBA) is the use of a computer to present an exercise, receive responses from the student, collate outcomes/marks and present feedback [10]. The use of CBAs has grown rapidly in recent years, often as part of computer-based learning [3]. Possible question styles include multiple choice and true/false questions, multimedia-based questions, and algebraic and numerical “gap fill” questions. The merits of CBAs are that, once set up, they provide economical and efficient assessment, instant feedback to students, flexibility over location and timing, and impartial marking. But CBAs have many restrictions. Perhaps their over-riding limitation is their lack of *intelligence* capable of assessing methodology (rather, CBAs simply assess a right or wrong answer). Other issues relating to CBAs are the high cost of set-up, the difficulty of awarding method marks, and a requirement for computer literacy [4].

In the early 1990s, CBAs were pioneered in university mathematics education through the CALM [6] and Mathwise computer-based learning projects [7]. At a similar time, commercial CBA software became available, e.g. the Question Mark Designer software [8]. These early platforms featured rudimentary question types such as multiple choice, true/false and input of real-number answers. Motivated by the need to assess wider mathematical information, the facility to input and assess algebraic answers emerged by the mid-1990s via computer-algebra packages. First was Maple’s *AIM* system [5, 14], followed by, e.g., *CalMath* [8], *STACK* [12], *Maple T.A.* [13], *WebWork* [14], and *i-assess* [15]. This current generation of mathematics CBA suites shares the technical capabilities summarized in Box 1.


We’re giving a day-long workshop titled “Building online maths assessments using Numbas” at the University of York on the 4th of July. It’ll be an introduction to using Numbas, from logging on to the mathcentre editor, through selecting questions to make a test, to eventually writing your own questions.

The workshop is provided by the Sigma North East network for excellence in mathematics and statistics support, and attendance is free.

There’s more information, and a booking form, on the Sigma NE event page.

I’ve just released v1.5 of Numbas on Github. While there have been loads of changes since the last time I remembered to bump the version number up, the biggest change recently is that I have rewritten the default theme to use the knockout.js framework. It makes the underlying code a **lot** simpler, and allows us to do a few new things that would otherwise have been very complicated. In particular, there is now a *Review* mode which is made available when you have finished an exam – you can click on any question on the feedback page to go back to it and compare your answers with the expected answers, as well as seeing any marking feedback and the model solution in the Advice section.

The version numbers in the Numbas source code repository don’t mean too much since we push updates to the *stable* branch as soon as they’ve been tested instead of lumping them together, but it’s good to mark progress every now and then.

The **math**centre editor tracks the *stable* branch, so you can try the new features there now.

The Review mode was something we had in the previous system we used at Newcastle, and it was the one thing that many of the students asked for in a survey conducted at the end of the first semester by Dr. Nick Parker. I’m glad it’s finally in Numbas!

When you finish an exam, you can click on any question to review it.

Marking feedback and the correct answers are shown for each part.

And the model solution given in the Advice section is also revealed.