
Testing resources: the Claim Evaluation Tools database

The Claim Evaluation Tools database will help you assess people’s ability to evaluate treatment claims.
New for 2017: our manual will help you develop your own questionnaire in a few easy steps.


If you are preparing a lesson to teach people about one or more of the Key Concepts, you may be interested in evaluating your students’ learning after the lesson.

We have developed a set of multiple-choice items – the Claim Evaluation Tools database – to assess people’s ability to evaluate treatment claims. The multiple-choice items can be used to produce tests in schools and other teaching settings, or, for example, in randomised trials evaluating the outcomes of educational interventions. Also, because little is known about the public’s “baseline” ability to assess treatment claims, items from the Claim Evaluation Tools database can be used in cross-sectional studies to gauge ability in a population, providing background information to help tailor interventions to people’s educational needs.

All the items in the Claim Evaluation Tools database have been developed for use with children (from the age of 10) as well as adults.

How you can use the Claim Evaluation Tools database

Instead of a standard, fixed questionnaire, we have developed the Claim Evaluation Tools database as a flexible battery of evaluation tools from which teachers, researchers and others can select those relevant for their purposes. Each multiple-choice item is designed to address one specific Key Concept. This means that you can create your own questionnaire based on which Key Concepts you want to teach.

The Claim Evaluation Tools database is open access and free for non-commercial use.

If you are interested in trying out the multiple-choice items, please take a look at our manual, which will help you develop your own questionnaire in a few easy steps.

If you have any questions, you can use the form below to contact us and receive more information on how to proceed.


How the Claim Evaluation Tools have been validated

The items in the database have been, and continue to be, rigorously evaluated in several contexts. Evaluation includes feedback from experts and end users, and statistical testing using classical psychometric and Rasch analyses. If you would like to contribute to validating a sample set in your context, please contact us.

Multiple-choice items within the Claim Evaluation Tools database are currently available in several languages, including:

  • English
  • Luganda (Uganda)
  • Norwegian
  • Spanish (Mexico)
  • Chinese
  • German

How the Claim Evaluation Tools database is maintained

Maintenance and revision of the Claim Evaluation Tools database will reflect changes to the list of Key Concepts. Just as the Key Concepts list is expected to be a “living” document, the Claim Evaluation Tools database will continue to evolve.

Relevant publications

  • Austvoll-Dahlgren A, Oxman AD, Chalmers I, Nsangi A, Glenton C, Lewin S, et al. Key concepts that people need to understand to assess claims about treatment effects. Journal of Evidence-Based Medicine. 2015;8(3):112-25.
  • Austvoll-Dahlgren A, Nsangi A, Semakula D. Key concepts people need to understand to assess claims about treatment effects: a systematic mapping review of interventions and evaluation tools. Submitted paper. 2016.
  • Austvoll-Dahlgren A, Semakula D, Nsangi A, Oxman A, Chalmers I, Rosenbaum S, et al. Measuring ability to assess claims about treatment effects: The development of the “Claim Evaluation Tools”. Submitted paper. 2016.
  • Austvoll-Dahlgren A, Guttersrud G, Nsangi A, Semakula D, Oxman A, et al. Measuring ability to assess claims about treatment effects: A latent trait analysis of the “Claim Evaluation Tools” using Rasch modelling. Submitted paper. 2016.
