Drafting MBE Items: A Truly Collaborative Process

By: Professor Tim Davis (Wake Forest)

In 2003, I was invited to attend a meeting of NCBE’s Multistate Bar Examination (MBE) Contracts Drafting Committee, one of the seven drafting committees for the seven subjects covered on the MBE. As I observed the deliberations, the collaborative nature of the process in which committee members were engaged became readily apparent. Over time, I would come to learn more about this collaborative process, which culminates in the professionally crafted questions that appear on the MBE, and to experience how it has helped me develop expertise, affording me the opportunity to be of greater service to law students and professors.

Multistate Bar Examination (MBE)

The MBE is a six-hour, 200-question multiple-choice examination covering seven substantive areas of law: Civil Procedure, Constitutional Law, Contracts, Criminal Law and Procedure, Evidence, Real Property, and Torts. (Visit the NCBE website for the complete MBE Subject Matter Outline.) Developed and scored by NCBE, the MBE has been produced since 1972 and is administered by 54 user jurisdictions as part of the bar examination. Its purpose is to assess the extent to which an examinee can apply fundamental legal principles and legal reasoning to analyze given fact patterns. The 2019–2020 MBE Drafting Committees (one for each of the seven subjects covered on the MBE) included 44 members from 23 jurisdictions (Arizona, Arkansas, California, Colorado, District of Columbia, Florida, Idaho, Iowa, Louisiana, Massachusetts, Michigan, Minnesota, Mississippi, Missouri, New Jersey, New York, North Carolina, Ohio, Oklahoma, Pennsylvania, Texas, Virginia, Wisconsin), among them 7 practicing attorneys, 9 judges, and 28 faculty members from 24 different law schools.

The Initial Stage: Independently Drafting Questions According to Best Practice Principles

Development of an MBE item, or question, from drafting to administration is a process that can take up to three years. The initial stage of this process begins with individual effort. Each member of the seven drafting committees drafts items independently. Drafters are provided with test specifications that identify the topics to be tested within each subject area (e.g., from the MBE Contracts subject-matter outline, this might be “formation of contracts” or “contract content and meaning”). With this general guidance, the most difficult stage of the drafting process begins—conceptualizing an item. In generating ideas for items, drafters may turn to cases, treatises, Restatements of the Law, and their experiences as professors, practitioners, and judges.

Best Practice Principles for Drafting Items

Multiple-choice questions can be broken down into the following components: the stem (factual scenario); the lead-in (call of the question); and the options (potential answers), which consist of the key (the correct answer) and the distractors (the incorrect answers). (See the diagram below.)

Components of a Multiple-Choice Question

Stem: The factual scenario

Lead-in: The call of the question

Options: The potential answers

Key: The correct answer

Distractors: The incorrect answers

The four principles that guide drafters in item writing are based on best practices derived from psychometric research and specify that items should:

  • be clear and concise;
  • use only the minimum number of actors and facts necessary to support the correctness of the key and the plausibility of the distractors;
  • test core concepts rather than trivial or obscure topics; and
  • assess examinees’ knowledge of legal doctrine and their ability to apply legal reasoning and lawyering skills and strategies rather than rote memorization.

These principles also instruct drafters to aim for a level of difficulty that corresponds to the minimum competency expected of newly licensed attorneys.

Applying the Principles to Each Component of an Item

Drafters attempt to adhere to these principles when working on each component of an item.

The Stem: Particularly in regard to an item’s first component, the stem, drafters aim for clarity. Emphasis is placed on using the minimum number of characters and the minimum amount of background information necessary to support the lead-in and the options. Drafters also seek to avoid developing stems that incorporate subtle gender, racial/ethnic, or regional biases.

The Lead-in: The lead-in, the second component of an item, should be a clear, focused question that frames an examinee’s task based on information presented in the stem. Best practices dictate that drafters write lead-ins that are positively framed complete sentences. Practices to avoid include writing lead-ins that are negatively framed, that are sentence fragments, that are unfocused, or that add facts not presented in the stem.

Drafters also attempt to draft different forms of lead-ins to test particular skills. For instance, a predictive outcome lead-in (e.g., “Will the buyer prevail in a breach of contract action against the seller?”) emphasizes an examinee’s ability to synthesize law and facts to predict an outcome. In contrast, a practice-oriented lead-in (e.g., “What objection should the attorney make?”) emphasizes an examinee’s skill at developing an analytical or logical framework for solving a legal problem.

The Options: An item’s third component, the options, consists of the key and three distractors. Best practices for drafting this component include writing options that are concise, that do not repeat information from the lead-in, that are responsive to the lead-in, that are parallel in style to one another, and that are based solely on the information provided in the stem. The key must be the unequivocally correct answer; at the same time, each distractor must be plausible and supported by information in the stem.

The Next Stages: Multiple Rounds of Review

NCBE Test Editor Review

The collaboration begins to unfold more fully after a committee member drafts a set of items and submits them to the NCBE test editor assigned to that particular committee. Committee members also provide a citation to the rule, case, or other legal authority that supports each item. Test editors, all of whom are lawyers, engage in a detailed review of each question. The editor’s review focuses on an item’s compliance with NCBE best practices: that the information in the stem is clear, that the lead-in is posed as a focused question, and that the options are parallel and responsive to the lead-in. The editor also checks that the key is in fact correct, that none of the distractors could serve as a second or partial key, that the cited authority supporting the key is accurate, and that the item tests a core concept and presents a realistic factual scenario. Test editors send comments to drafters, who then have an opportunity to respond to the editor’s concerns before items are forwarded to the next stage of the process.

Outside Expert Review

New items, which become a part of each committee’s item pool, are also reviewed by two outside content experts, a practicing lawyer and a law professor, both with subject-area expertise. These reviewers engage in a process similar to that of test editors but focus primarily on whether a question tests a core concept and is realistic and whether the key represents an accurate reflection of the current law on the topic being tested. The reviewers also determine whether the item is at the appropriate level of difficulty, whether more than one option is potentially correct, whether a question is imbued with any subtle bias, and whether each distractor is plausible. External reviewers’ written comments are shared with the entire committee for discussion at the next scheduled committee meeting.

Full Committee Review and Pretesting

Drafting committees meet twice per year. In advance of each committee meeting, committee members review the materials that will be discussed. These materials consist of newly drafted items, items on which the external reviewers have commented, and pretested items that failed to meet NCBE statistical standards (as explained later). Committee members also review items that the committee chair and the test editor have selected to appear on the test forms for two future MBE administrations (a test form being the collection of items in the order in which they are presented to examinees).

At the meeting, committee members engage in lively discussion that addresses the concerns raised by the test editor and the external reviewers. Along with editing items in response to those concerns, committee members jointly edit items in response to one another’s concerns. If editing fails to resolve a concern, an item will either be assigned to a committee member for further revision after the meeting adjourns or be retired (never placed on an exam). Revised items are typically discussed at the next scheduled meeting. Items that have been revised to address concerns, or for which there are no concerns, are promoted to pretest-ready status.

Following the meeting, the test editor and the committee chair select items to be pretested from this group of pretest-ready items. At each administration of the bar exam, 25 of the 200 items that appear in an MBE exam booklet are items that are being pretested and are therefore not used to calculate the examinee’s score. (The remaining 175 scored items are those that have already successfully passed the pretest process.) Examinees’ performance on these pretest questions is carefully evaluated to determine whether the questions meet NCBE statistical standards and can be included as scored questions on a future MBE. Pretest items that fall outside of these statistical standards are submitted for review by the committee, which has the opportunity either to edit the items at a future committee meeting or to retire them.

Items that are slated to appear as scored items on future administrations are also reviewed at the semiannual meetings. Each such item would already have gone through the process described above—reviewed as a new item by a test editor, external reviewers, and the full committee; pretested; and reviewed again twice by the full committee as part of an upcoming test form.

My Experience Serving on the Committee

At the time I attended my first meeting of the MBE Contracts Drafting Committee, I did not appreciate that it would mark the beginning of one of my most rewarding professional and service endeavors and that it would also facilitate opportunities that extend beyond my committee work. The experience I have gained in drafting multiple-choice questions through my work on the committee has assisted me with the questions I draft for my students, and it has also allowed me to be of service to other law professors by leading workshops on the mechanics and best practices for drafting multiple-choice questions. It has been an honor to work with the members of the committee and NCBE staff and experience their professionalism and service not only to NCBE but also to the legal profession.

This post first appeared in The Bar Examiner and has been republished here with the permission of the author and publication. The original publication can be found here.
