An Essential California Bar Exam Supplement


BARESSAYS.COM SERVICES FOR THE CALIFORNIA BAR EXAM

For more information about why BarEssays.com is an essential study supplement for the California Bar Exam essays, read our FAQ:

-- CLICK HERE TO READ THE BARESSAYS FAQ

BarEssays.com provides a database of more than 3,000 REAL essays and performance exams written by REAL students during actual administrations of the California Bar Exam and graded by the California Bar Examiners. On BarEssays.com, you can search for any essay question tested in the past decade and view REAL high and low scoring examples of that essay. BarEssays is recommended by law schools, tutors, and review courses as an essential supplement for the essay portion of the California Bar Exam and is successfully used by thousands of students annually.

BarEssays.com Standard Membership ($175)

A standard membership to BarEssays.com includes access to the essay database, featuring 3,000+ real graded essays and performance exams returned from past California Bar Exams, with a range of high and low scoring examples for every essay tested since 2005. All memberships (standard and premium) expire after the next bar exam.

The BarEssays essay database is the only service of its kind. No other review course or study program will provide you with a range of real graded high and low scoring examples for every California Bar Exam essay tested in the last decade. Typically, review courses provide only model answers, written and heavily edited by professionals on staff. However, model answers are of limited help because they are far different from the average real high scoring essay. For this reason, review courses, tutors, and law schools recommend BarEssays as an essential supplement to any study program.

BarEssays recommends that you write at least one essay every day. That is, print out an essay question, time yourself for one hour, and write out your answer. After simulating the essay, search our database and compare your answer to our graded high and low scoring examples. Learn from the mistakes in the low scoring essays. Emulate the positive aspects of the high scoring essays.

The more essays you practice and review, the better your essay writing will become! Through this process of self-critique and examination, you will learn how to improve your scores. BarEssays members often report an “aha!” moment when they understand what they need to do to write a high scoring essay.

-- CLICK HERE FOR A SCREEN SHOT OF OUR SEARCH ENGINE

-- CLICK HERE FOR A VARIETY OF SAMPLE GRADED ESSAYS

BarEssays.com Premium Membership ($225)

A premium membership includes all of the advantages of a standard membership, plus additional material:

  • Model answers for every essay question tested on the California Bar Exam since 2005, written by a professional bar grader.
  • Professional Bar Grader Reviews - over 300 reviews of California Bar Exam essays in the database, all written by professional bar graders, detailing in line-by-line “pop-up” commentary why each essay received its score and how it could have been improved.
  • Over 40 Essay Attack Templates (detailed roadmaps) for commonly tested subjects.

Note: You must install the latest version of Adobe Acrobat Reader to read the commentary.

-- CLICK HERE FOR A SAMPLE ESSAY ATTACK TEMPLATE

Essay Submission Program: Practice Essay Grading ($99 and up)

All members of BarEssays.com are eligible to participate in our Essay Submission Program, whereby you are able to submit your own practice essays and performance exams for detailed critique and grading by a bar exam professional at negotiated low prices.

As part of the Essay Submission Program, you can submit any essay tested on the California Bar Exam from 2002 to the present and have it graded within 48-72 hours. We receive excellent reviews of our Essay Submission Program, which is available only to current members of BarEssays.com.

-- CLICK HERE FOR A SAMPLE GRADED PRACTICE CALIFORNIA BAR EXAM ESSAY

Live Essay Workshops: Video Package of 14 Workshops Available for $249

Beginning with the October 2020 California Bar Exam, BarEssays has been conducting live essay workshops, taking students through specific essay questions and teaching how to deconstruct and answer questions in detail to receive maximum points.

The workshops are conducted by a former official grader of the California Bar Exam and longtime bar exam instructor. They have been extremely popular and well received.

We are making available a package of 14 previous workshop videos and handouts for $249. The package includes reviews of 24 previously tested questions and two performance exams.

You can find detailed information about each workshop video at this link (click here).

You may view a free sample community property workshop at this link (click here).

If you are a BarEssays member, click on this link to purchase access to the workshop videos. If you are not a BarEssays member, email us to purchase access to the previous essay workshops.

Future Live Essay Workshops

In addition to the videos of previous workshops, we will conduct live workshops for each new examination. Announcements of these workshops will be sent to all BarEssays members and posted in the most popular California Bar Exam Facebook group.

National Conference of Bar Examiners

It’s All Relative—MEE and MPT Grading, That Is

This article originally appeared in The Bar Examiner print edition, June 2016 (Vol. 85, No. 2), pp. 37–45.

By Judith A. Gundersen 1

The Multistate Essay Examination (MEE) and the Multistate Performance Test (MPT) portions of the bar exam are graded by bar examiners in user jurisdictions. They are not centrally graded at NCBE, but NCBE prepares detailed grading materials for both exams and provides hands-on grader training designed to facilitate consistent, accurate, and fair grading across all MEE and MPT user jurisdictions and, in particular, for Uniform Bar Examination (UBE) jurisdictions. It is critical that grading of the MEE and MPT portions of the UBE be consistent across UBE jurisdictions, as the score portability afforded by the UBE is based on the assumption that the exams are graded in a consistent manner no matter where graded or by whom.

This article discusses the relative or rank-ordering grading philosophy NCBE uses in its grader training, the reasons we advocate this approach, and recommendations for optimal use of this grading method. I’ll start with a review of how grading materials are prepared at NCBE and the nature of our grader training.

Preparation of MEE and MPT Grading Materials

Grading materials for the MEE and MPT include the MEE analyses and MPT point sheets, which are detailed discussions of all the issues raised in the items by the item drafters and suggested resolutions or analyses of the issues. The analyses and point sheets are drafted by the authors of the items and are then discussed, edited, and revised by the respective NCBE drafting committees at semiannual meetings. Preparing MEE and MPT items and their grading materials takes at least two years and is an iterative process with many lawyer-volunteers, NCBE staff, outside content experts, and pretesters involved. 2

MEE and MPT drafters know the importance of crafting excellent grading materials. The process of preparing grading materials also serves as a good check for the drafting team on the item’s internal consistency, degree of difficulty, and gradability; it is quite common for grading materials to unveil problems with the item that were not identified by the drafter or committee at an earlier stage.

Grader Training

MEE and MPT grading materials are very thorough and can effectively guide graders through the grading process. In addition, NCBE conducts hands-on grader training sessions at its Madison, Wisconsin, headquarters the weekend following the bar exam. Graders may attend the grading workshop in person, by conference call, or via on-demand streaming as available following the workshop. Participation by user jurisdictions is high—hundreds of graders representing most MEE and MPT jurisdictions participate in one of these three ways.

The grading workshop lasts one day and consists of a dedicated session for each MEE and MPT item led by drafting committee members who are experienced grading workshop facilitators. Sessions begin with an overview of the item and grading materials, and any questions about the area of law (MEE) or the assigned task (MPT) are addressed. The participants then set about silently reading several real examinee answers (sent by bar administrators from all over the country) and grading them. Grades are assigned using a 1–6 relative score scale (as discussed later). As professors often do in law school, workshop facilitators rely on the Socratic method from time to time—graders are called on to explain the grades they gave. This is particularly true if a grade might be an outlier from grades assigned by other graders in the session. Based on the review and grading of the sample of examinee answers and the ensuing discussion between graders and facilitators, grading materials may be refined or grading weights adjusted. Final versions of the grading materials are then made available to graders in user jurisdictions a day or so after the workshop.

Grading workshop participation alerts graders to common answer trends and also gives them a head start on calibration—the development of coherent and identifiable grading judgments so that rank-ordering is consistent throughout the grading process and across multiple graders. (The focus of this article is not on calibration, but that doesn’t mean it isn’t a critical component of the grading process. See the section on calibration below.)

The Relative Grading Philosophy in Action

What is relative grading, and how does it work?

With NCBE’s grading materials in hand, graders are ready to begin the grading process in their own jurisdictions with their own examinees’ answers. But grading MEEs and MPTs isn’t like marking a paper with a score from 1% to 100% or meting out an A, B, C, D, or F (or drawing smiley or frown faces on papers; one of my sons’ third-grade teachers, whom I will call Ms. Brinkman for purposes of this article, was fond of drawing a big “frowny face” on papers that didn’t meet her standards!). Instead, NCBE trains bar examiners to grade the MEE and MPT on a relative basis—making distinctions between papers and rank-ordering them according to whatever score scale the jurisdiction has in place. (Jurisdictions may use whatever score scale they wish—e.g., 1–5, 1–6, 1–10, etc.—although NCBE uses a 1–6 score scale at its grading workshop, for reasons detailed later in this article.)

Relative grading training helps graders identify consistent standards in ranking papers and then apply those standards to put papers in piles according to their relative strength. The 1–6 scale used at the workshop simply means that a score of 6 is reserved for the best papers among all answers assigned to a particular grader. It is better than a 5, which is better than a 4, and so on, all the way to 1—a paper that is among the weakest papers. Relative grading means that in any group of answers, even if no single paper addresses all the points raised in an item, the strongest papers still deserve a 6 (using a 1–6 score scale). They do not have to be perfect nor necessarily deserve a true A or 100% (or a double “happy face” according to Ms. Brinkman). Using the same principles, a paper need not be completely devoid of content to get a 1 if the other papers are strong.
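To make the contrast concrete, here is a toy sketch in Python (purely illustrative, not NCBE's actual procedure): given one grader's holistic quality impressions of a batch of answers, fixed absolute cutoffs may award no top grades at all, while relative rank-ordering still spreads the same batch across the full 1–6 scale.

```python
# Toy illustration only -- not NCBE's procedure. Hypothetical holistic
# quality impressions (0-100) for ten answers assigned to one grader:
quality = [78, 71, 64, 62, 55, 54, 47, 40, 33, 21]

# Absolute (criterion-referenced) grading: 90+ -> 6, 80s -> 5, 70s -> 4, etc.
# No answer reaches the 90s, so nobody earns a 6.
absolute = [max(1, min(6, q // 10 - 3)) for q in quality]

# Relative grading: rank-order the answers and spread them over the 1-6 scale,
# so the strongest paper in this batch earns a 6 even though it isn't "perfect".
ranked = sorted(range(len(quality)), key=lambda i: quality[i])
relative = [0] * len(quality)
for rank, i in enumerate(ranked):
    relative[i] = 1 + (rank * 6) // len(quality)

print(absolute)  # [4, 4, 3, 3, 2, 2, 1, 1, 1, 1]
print(relative)  # [6, 5, 5, 4, 4, 3, 2, 2, 1, 1]
```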

This relative grading philosophy (also referred to as norm-based grading) may be a little different from the way many of us had our papers graded in school, where we were held to an “absolute” or “criterion-referenced” standard: we had to answer a certain number of parts of a question correctly to get a high score or an A regardless of how our fellow students answered. Or if we missed some points, we would get a low grade even if many of our fellow students also missed the same points.

NCBE’s focus on relative grading does not mean, however, that absolute or criterion-referenced grading does not belong on the bar exam; it does, particularly on the Multistate Bar Examination (MBE)—the only part of the bar exam that is equated across time and across exam forms. Equating is the process of determining comparable scores on different exam forms. For the MBE, the absolute standard or “cut score” has the same meaning across administrations and jurisdictions. A scaled score (scaled means that it is a standardized score derived after equating) of 135 on the MBE is a 135 no matter when or where earned and will always mean that the examinee passes if the cut score is 135. By contrast, essays and performance tests cannot be equated in the way a multiple-choice exam like the MBE can be, so a total raw score of, say, 24 (or 100 or 1,000) on the written part of the bar exam may have a different meaning depending on the particular exam form, the examinee pool, the grader, and the jurisdiction. 3

Because of the high-stakes nature of the bar exam, we must account for the differences in written exams across administrations, jurisdictions, and graders, and we do this by using the equated MBE score distribution as a highly reliable anchor. We weight the MEE and MPT raw scores for each examinee according to the jurisdiction’s weighting scheme (e.g., on the UBE, the MEE is weighted 30% and the MPT 20%). We then map the total weighted MEE and MPT raw scores for each examinee to the MBE scaled score distribution according to performance level. This process is referred to as scaling and has the effect of adjusting the MEE and MPT scores so that they have the same mean and standard deviation as the MBE scores do in the testing jurisdiction (standard deviation being the measure of the spread of scores—that is, the average deviation of scores from the mean).
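A minimal Python sketch of that mean-and-standard-deviation matching is below, assuming the UBE weights (MEE 30%, MPT 20%); the function and variable names are mine, and the actual NCBE scaling procedure is more involved (see endnote 4).

```python
import statistics

def scale_written_to_mbe(mee_raw, mpt_raw, mbe_scaled):
    """Illustrative sketch: weight each examinee's raw MEE and MPT totals
    (UBE weights: 30% and 20%), then linearly transform the weighted written
    scores so they share the mean and standard deviation of the jurisdiction's
    MBE scaled scores."""
    written = [0.30 * mee + 0.20 * mpt for mee, mpt in zip(mee_raw, mpt_raw)]

    w_mean, w_sd = statistics.mean(written), statistics.pstdev(written)
    m_mean, m_sd = statistics.mean(mbe_scaled), statistics.pstdev(mbe_scaled)

    # Map each examinee's weighted written total onto the MBE scaled-score
    # distribution; rank order among examinees is preserved.
    return [m_mean + m_sd * (w - w_mean) / w_sd for w in written]
```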

Scaling written scores to the MBE is a psychometrically valid practice because examinee performance on the MBE is strongly correlated to examinee performance on the combined MEE and MPT. Because the MBE is an equated exam, MBE scores have constant meaning across time and across jurisdictions, even though the items on particular exams may vary slightly in intrinsic difficulty. By scaling the combined MEE and MPT scores to the MBE scaled score distribution, we capitalize on (or leverage) the equating done to the MBE to give the MEE and MPT scores the same constancy in interpretation, despite the fact that MEE and MPT items may vary in difficulty from administration to administration.

It is important to point out that if the relative grading approach is used consistently across jurisdictions and administrations, the MEE and MPT raw scores will have the same mean and standard deviation in all jurisdictions and administrations no matter if the intrinsic difficulty of the MEE or MPT items changes or if the examinee population becomes more or less proficient. In jurisdictions that use the same grading scale, each jurisdiction will also have approximately the same raw score mean and standard deviation as well as having the same mean and standard deviation for all administrations. It is only by scaling to the MBE that differences in either the items or the examinees can be reflected in the scores. 4

Why Use Relative Grading?

There are compelling psychometric and policy reasons why, given the current process for grading the MEE and MPT, NCBE trains graders to use a relative grading approach (with subsequent scaling to the MBE) to consistently grade the MEE and MPT.

Score Scales and Grading Procedures Vary Among Jurisdictions

Because of a decentralized approach to grading the MEE and MPT, no matter how successful we are at training graders across jurisdictions to promote uniformity, we must allow for the fact that there could be some scoring variation among jurisdictions.

Relative grading does not require that all jurisdictions use the same score scale. Rather, papers placed in a particular pile (assigned grade) reflect a level of proficiency that is more similar to others in the same pile than to papers placed in a different pile, and higher grades reflect higher degrees of proficiency. As stated earlier, an examinee’s raw MEE and MPT scores are weighted appropriately, added together, and then mapped to the MBE scaled score distribution for the given jurisdiction. An examinee who performs well on all or most parts of the written portion of the exam will generally have scores that “land” on the upper end of the distribution of the MBE scaled scores for that jurisdiction. Someone who earns a lot of 6’s on her MEE and MPT answers (in a jurisdiction using a 1–6 score scale) will generally have her total written score mapped to the top of the MBE scaled score distribution for her jurisdiction; an examinee who consistently earns 1’s and 2’s on his MEE and MPT answers will usually find that his total written score maps close to the bottom of the MBE scaled score distribution. This will be true no matter what score scale is used.

Relative grading is also adaptive enough to work with different approaches to the grading process. It does not matter if each paper is read by only one grader or by two graders who have to agree; or if a single grader grades all answers to a particular item or answers are divided among several graders. Nor does it matter if grading is done over the course of a day or weekend of intense grading or over the course of two months. As long as graders achieve and maintain calibration (as discussed later), relative grading should serve to keep answer assessment consistent across time and across graders.

MEE and MPT Items May Vary in Difficulty from One Administration to the Next

As much as the drafting committees and our test development process try to standardize MEE and MPT difficulty across exam administrations, it is impossible to create items that represent exactly the same degree of difficulty. And MEE and MPT items cannot be pretested live to gather performance data in the way that MBE questions can because they’re too few and too memorable. (MBE pretest questions are indistinguishable among the scored MBE items on each MBE exam form.) Without live pretesting, we must find some other fair way to take into account differences in MEE and MPT difficulty across exam forms.

With relative grading, it doesn’t matter if an exam form represents the exact degree of difficulty as past (or future) MEEs or MPTs. Relative grading means that an examinee who sits for a harder exam is not penalized and an examinee who sits for an easier one is not rewarded, because it focuses only on how examinees do in comparison to one another on the same exam. For example, suppose that February 2016 examinees were given more difficult MEE or MPT items than those administered in, say, July 2015. That would be unfair to the February 2016 examinees or, alternatively, would seem like a windfall to the July 2015 examinees if MEE and MPT items were graded according to an absolute standard. The July 2015 examinees would get overall higher scores because the items were easier. In the world of high-stakes tests like the bar exam, this is a situation to avoid, and relative grading helps do that. It focuses on comparing answer quality according to other answers to the same items. Answers to easy items are still rank-ordered, as are answers to harder ones. Scaling the total raw score on the written portion of the bar exam to the MBE, which is equated across administrations and accounts for differences in exam difficulty, means that it doesn’t matter whether the written portion on one administration is harder than on another. As long as graders are able to rank-order answers, they can fairly and consistently grade the MEE and MPT from administration to administration regardless of differences in exam form difficulty.

Examinee Proficiency Varies from One Administration to the Next

Examinee proficiency may vary across administrations. For example, in the February administration, examinee proficiency tends to be lower due to a larger proportion of repeat test takers. We see this lower performance reflected on the MBE in February and expect to see lower scores on the MEE and MPT as well. However, asking graders to maintain consistent grading standards across administrations, examinees, and items would be extremely difficult, if not impossible. There are simply too many moving parts across test administrations to make such a grading task reasonable for maintaining score meaning across administrations. But relative grading—comparing answers among the current pool of examinees and then scaling those raw scores to the MBE—is manageable for graders and fair to examinees.

It is also important to note that using a relative grading system rather than an absolute grading system does not mean that graders are artificially inflating or deflating grades in a way that allows more examinees to pass or causes more examinees to fail. All relative grading does is help graders make rank-ordering decisions, which are critical to having the question “count” in an overall bar exam score, as discussed below. Scaling to the MBE lines up an examinee’s overall written score to a statistically accurate corresponding point on the MBE score distribution. Scaling standardizes rank-ordering decisions across time and exams.

Likewise, relative grading does not benefit or penalize examinees who sit in jurisdictions that have a weaker or stronger examinee pool. Relative grading practices work in tandem with the process of scaling to make the appropriate offset for each examinee’s position relative to his or her own jurisdiction’s examinee group and the position of that examinee group relative to other jurisdictions’ examinee groups. To make meaningful and fair comparisons across time and jurisdictions, we need to know what absolute level of performance is represented by a particular group’s average. We don’t have that absolute performance information for essays, but we do have average performance on the MBE for the relevant groups. By virtue of the equating process, those scores are on an absolute scale.

Because the data have consistently shown across groups and time that the total MBE scaled score is strongly correlated with overall performance on the written components (correlation above .80 when reliability of the two measures is taken into account), we can use MBE performance information as a proxy indicator of the groups’ general ability levels. As a result, an examinee whose total raw essay score is ranked at the top of a weak group will have, after scaling, a total scaled essay score that reflects that differential, and an examinee who is more toward the bottom of a strong group will have a total scaled essay score that accounts for that positioning as well. Similarly, offsets are made (via scaling) to account for an examinee who sits for an administration with easier essay questions or one who sits for an administration with harder essay questions. The scaling process is critical to ensure that scores have a consistent meaning and also to ameliorate any efforts at gaming the system by attempting to pick a group or an administration that is anticipated to behave a certain way (e.g., sitting for a test that is anticipated to be easy or sitting with a group that is anticipated to be particularly skilled).

Graders Vary in Harshness or Leniency

In addition to evening out the differences in MEE and MPT difficulty from one administration to the next, relative grading ameliorates grader harshness or leniency from one administration to the next and from grader to grader. Even a harsh grader in jurisdiction A has to distribute an array of grades, low to high, among papers if she uses relative grading—she can’t give all papers a low grade because then she’s not rank-ordering. A lenient grader in jurisdiction B grading the same item can’t give all papers a high grade if he uses relative grading and follows instructions to use all grade values.

If a particular question is graded by a harsh grader or a lenient grader, as long as that grader is consistently harsh or lenient in rank-ordering, examinees are not unfairly penalized or rewarded—the rank-ordering decisions made by the grader remain; the actual raw scores assigned, whether harsh or lenient, are smoothed out to fit the MBE scaled score distribution. Examinees will not be penalized even if harsher graders have a lower mean score than lenient graders. (Note that if multiple graders are assigned to grade a single question, they must be calibrated so that they do not have different levels of harshness or leniency.)

Relative Grading Facilitates the Equal Weighting of All Items

Relative grading facilitates spreading out scores, which is critical to ensuring that all items carry the weight they should in an examinee’s overall written bar exam score. The weight an item gets is strongly affected by the amount of variation that scores have on that item. The less variation, the less weight the item carries in determining the total written score value. A question that every examinee gets right doesn’t discriminate or distinguish between examinees, just as a question that every examinee gets wrong doesn’t discriminate. For example, a question asking an examinee to write the English alphabet wouldn’t distinguish between examinees, because virtually everyone would get the answer correct. Or a question asking examinees to write the Burmese alphabet would probably stump 99% of U.S. examinees. In both instances, those questions would let us know that all examinees do know the English alphabet but don’t know the Burmese alphabet, but they wouldn’t provide any information to allow us to make distinctions between examinees based on their performance.

All MBE, MEE, and MPT items are designed to elicit information about examinee performance. Relative grading on the MEE and MPT, both of which have multiple issues per item, allows graders to gather information about examinee performance and assign a score that accurately reflects examinee performance. All MEE and MPT items are drafted, reviewed, edited, and pretested to ensure that graders will be able to spread examinee scores according to relative quality if they follow grading instructions properly.

Graders should award points or credit reflecting the spectrum of the score scale used in their jurisdiction to maximize and equalize the information provided by each MEE or MPT item. Consider the following examples of what happens when a grader fails to discriminate among answers. Suppose a jurisdiction uses a 10-point score scale and a grader is using an absolute or criterion-referenced approach (or, for that matter, a relative grading approach) and no examinees address all points raised in an item. The absolute grader won’t award any papers a 10 or possibly even a 9, depending on how inadequate the answers are. And the relative grader, if not trained properly, might hold back and not award maximum points even to the best answers. So now that 10-point scale (also being used to grade other items), for purposes of this item, is a de facto 8-point scale, because no one is getting a 9 or a 10. Suppose a grader is even more extreme and uses only 5 points on a 10-point score scale—from 2 to 6, for example. The result is that the item has even less impact on examinees’ overall written scores in comparison to other items that are being graded on all points of the jurisdiction’s 1–10 scale. 5

Another way of not spreading out scores is by bunching a large percentage of scores in the middle of the score scale. For example, on a 1–6 score scale, a grader who gives 75% of her papers a 4 (whether using absolute or relative grading), is, in effect, downgrading the weight given to that item. This particular question has elicited very little information about examinee performance and has compressed the score scale from 6 points to just a few points—and mainly to one point, a 4.
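A quick numerical illustration of how bunching shrinks an item's contribution, using hypothetical grade distributions for ten examinees on a 1–6 scale:

```python
import statistics

spread  = [1, 2, 3, 3, 4, 4, 5, 5, 6, 6]   # grades using the full 1-6 scale
bunched = [3, 4, 4, 4, 4, 4, 4, 4, 4, 5]   # most papers given a 4

print(statistics.pstdev(spread))   # ~1.58: plenty of variation, so the item
                                   # still differentiates examinees
print(statistics.pstdev(bunched))  # ~0.45: little variation, so the item
                                   # carries far less weight in the total
```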

One reason why we emphasize relative grading is that it should help graders spread out scores—no examinee has to write a perfect paper to get the highest score, and no examinee has to leave the page blank to get a very low score. As long as a grader keeps that principle in mind, it should be natural to spread out scores. And the actual score distribution for each grader is easy to keep track of and need not be in equal piles. It is enough to make meaningful distinctions between relative examinee performances that reflect all or most of any given score scale.

What Is the Best Approach for Optimizing Relative Grading?

Using a manageable score scale.

While relative grading works the same no matter the score scale, it tends to work best and is easiest to manage using score scales that are relatively compressed. For example, if a grader uses a 1–100 scale, it’s conceivable that the grader could make 100 piles of rank-ordered answers, but 100 separate piles representing qualitative differences between answers can be pretty hard to wrap one’s brain around. And, of course, a 1–100 scale brings to mind grading as it’s done in school—absolute grading. That is, a 90–100 is an A and is reserved for an answer that covers all possible issues in the item, as opposed to an answer that is the best of a possibly weak group of answers. Also, probably for many jurisdictions that use a 1–100 scale, their graders assign grades by 10’s—that is, 10, 20, 30, etc.—so that the score scale is really functioning more like a 10-point scale, not a 100-point scale.

NCBE uses a 1–6 scale to train graders, in part, because six piles of answers are manageable and memorable. And we use a 6-point scale instead of a 5-point scale because a 5-point scale resembles the A, B, C, D, and F grading paradigm that makes it a bit too easy to bunch scores on the midpoint or average—a 3 or a C. Using a 1–6 scale means that graders can’t just label an answer average, or a 3—they have to make a decision as to whether it’s a 3 or a 4, that is, a little bit above an average paper or a little bit below. Because many graders tend to bunch their answers in the middle (rather than at the ends of the score scale), just by using a 1–6 score scale rather than a 1–5 scale, they have to make a choice around that critical midpoint, which makes bunching harder and spreading easier. Some jurisdictions use a 1–10 scale, which also works well provided that all points on that score scale are awarded.

Participating in NCBE’s Grader Training

While relative grading is fairly intuitive, it is informed and standardized by the MEE and MPT grading materials and the post-exam grader training that puts the relative grading principles into action. That training emphasizes reading through several answers before assigning final grades to those first several papers read, as a grader won’t yet have a good idea of what the overall pool of answers looks like. NCBE’s grader training and materials also assign weights to subparts in a question. So an examinee who performs well on one subpart of an MEE question worth 25% of the total score that could be awarded for that question is not assured a 6 unless he performs well on the other parts of the question, too, in comparison with other examinees. In other words, there is a weighting framework for assigning points, which helps to keep graders calibrated and consistent.
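As a rough sketch of such a weighting framework (the subpart weights below are hypothetical, not NCBE's actual values), a question's overall grade can be built from weighted subpart grades:

```python
def weighted_question_score(subpart_grades, subpart_weights):
    """Hypothetical illustration: combine 1-6 subpart grades according to the
    weights assigned to each subpart (weights sum to 1.0)."""
    return sum(g * w for g, w in zip(subpart_grades, subpart_weights))

# An examinee who excels on a subpart worth 25% but is weak on the rest
# does not end up near a 6 for the question as a whole.
print(weighted_question_score([6, 3, 2, 3], [0.25, 0.25, 0.25, 0.25]))  # 3.5
```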

NCBE also offers online support and hands-on workshops that demonstrate how to use relative grading. For most graders, it is not hard to rate answers in order of their relative quality. It might be a little more difficult to use the entire score scale, whatever that may be, but practices as simple as keeping track of the number of score piles and the number of answers in each score pile go a long way toward keeping score distribution (or lack thereof) front and center during the grading process. And consistency can be maintained by keeping benchmark papers for each score pile to illustrate what representative answers look like for each score—6’s, 5’s, 4’s, etc. This is particularly important if grading is done over an extended period of time or by multiple graders.

Ensuring the Calibration of Graders

No grading process will be effective if graders (especially multiple graders assigned to a single item) are not calibrated. It should not matter to examinees who grades their papers or when their papers are graded. Relative grading and absolute grading both require calibration to be consistent and fair. Under either grading method, calibration requires reading through several sample answers, a thorough understanding of and facility with the grading rubric, and agreement on the standards to be applied and how to apply them. In my experience, calibration is an exercise to which jurisdictions devote substantial resources, time, and verification to ensure that graders become and remain calibrated throughout the grading process. 6

The relative grading approach is employed widely in the jurisdictions that administer the MEE and the MPT. When used in conjunction with scaling to the MBE and proper training and calibration of graders, relative grading promotes consistency and fairness in grading the written portion of the bar exam. It compensates for the use of varying score scales and grading procedures among jurisdictions as well as differences in the harshness or leniency of the graders themselves. It neither artificially inflates nor deflates grades but facilitates the spreading out of scores, which are then scaled to the highly reliable anchor of the equated MBE score distribution. Finally, it ensures that any variation in difficulty of items from one administration to the next does not penalize or reward examinees, while facilitating the appropriate weighting of all items so that each item provides information about each examinee’s performance.

1. NCBE Testing Department staff members Dr. Mark Albanese, Dr. Joanne Kane, Dr. Andrew Mroch, and Douglas Ripkey all provided assistance with this article.
2. For a detailed explanation of how MEE and MPT items and their grading materials are prepared, see my article in the June 2015 Bar Examiner: Judith A. Gundersen, “MEE and MPT Test Development: A Walk-Through from First Draft to Administration,” 84(2) The Bar Examiner 29–34 (June 2015).
3. Some jurisdictions may grade on an absolute basis—awarding points according to the grading rubric—regardless of how other examinees answer the question. As long as this grading method is employed consistently and spreads out scores, it is an acceptable method of grading.
4. Precisely how the written portion of the exam is scaled to the MBE is complex; there is not enough space in this article to fully discuss it, nor am I qualified to explain how it is done from a true measurement perspective. For a complete discussion of the steps we undertake in scaling written scores to the MBE, see Susan M. Case, Ph.D., “The Testing Column: Demystifying Scaling to the MBE: How’d You Do That?,” 74(2) The Bar Examiner 45–46 (May 2005); see also Mark A. Albanese, Ph.D., “The Testing Column: Scaling: It’s Not Just for Fish or Mountains,” 83(4) The Bar Examiner 50–56 (December 2014).
5. Minimizing a question’s contribution to an overall written score is not necessarily problematic; if a question does not perform as intended, so that no examinees (or all examinees) get it right, then it is appropriate that that particular question’s impact is minimal. Note that graders should not artificially spread scores just for the sake of spreading them. Distinctions made between papers should be material.
6. For a full discussion of calibration, see my Testing Column in the March 2015 Bar Examiner: Judith A. Gundersen, “The Testing Column: Essay Grading Fundamentals,” 84(1) The Bar Examiner 54–56 (March 2015).


JD Advising


How—and Why—You Need To Self-Grade Bar Exam Essays

We just had a student tell us last week that learning how to self-grade bar exam essays was the best thing she did when she was studying for the bar exam.

“Self-grading” is a method where you read a bar exam essay fact pattern, then write out or bullet-point your answer and compare what you wrote with the model answer. When you compare the two, use a different color pen (or font) to write out anything you missed and to give yourself tips like an actual grader would.

Self-grading is helpful for several reasons:

  • You will learn the rules better if you are forced to write out what you didn’t know
  • You will become well acquainted with how to structure your essay
  • You will become better at picking out material facts in the fact pattern
  • You will start to “think like a bar exam grader”
  • You will find yourself improving quickly
  • You are the person most invested in your success, so you will often give yourself better quality feedback than a commercial course grader. (Note: It is not bad to hand in your essays to course graders, as they may see something you miss. But because no one is more invested in your success than you, you will give yourself the best feedback!)

How do you self-grade bar exam essays?

You need to have your answer and the model answer side-by-side.

Compare your essay with the model answer and ask yourself:

  • Did I recognize the issues? (If you missed an issue, why did you miss it? Did you misread the call of the question? Are there facts you did not address that you should have?)
  • Did I state the rules accurately? (If not, write in the rules or type them in a different color font so that they stick out to you!)
  • Did I analyze the issues correctly?
  • Did I conclude correctly?
  • Also, look at your general organization and read your writing for clarity.

Grade yourself in a different color font (or a different colored pen) as if you were actually a grader. This will get you into the mind of a bar exam essay grader. For example:

  • If you missed a rule, write out the rule.
  • If you omitted a discussion of an important fact, write out those facts when you self-grade.
  • If you did not discuss an issue, ask yourself, “What in the fact pattern should have led me to discuss that issue?”
  • If you did not conclude or if your conclusion was incorrect, why was it missing or incorrect? Was it because you did not know the rule? Did you read the fact pattern too quickly and miss a fact?

This will help you write essays that graders actually want to read!

Then you can constantly review your self-graded essays to turn weaknesses into strengths.

A few important notes on self-grading bar exam essays:

  • When you self-grade bar exam essays, you can ignore case or statute citations, policy discussion, a lengthy analysis of the history of the law, or anything else that does not directly go to the rule, application, or conclusion.
  • When you self-grade bar exam essays, use the answers promulgated by the National Conference of Bar Examiners (NCBE) if you are in an MEE state, or the answers promulgated by your state otherwise. Do not use student answers. Even highly graded student answers frequently miss points that appear in the model answers!
  • As mentioned above, you can also hand in your essay answers to a bar review course to grade! We recommend that you do not rely solely on any one grader though. You will learn the most from grading your own essays!



Make This Your Last Time - A Candid, No-BS Look at Bar Exam Preparation

Graded Bar Exam Essay Answer Bank

Are you writing bar exam essays the way the graders want? One way to check your work is to see graded essays, including high- and low-scoring answers from actual bar takers.

To that end, you’ll find two things here: the graded essay answer bank and, below it, a link to California bar essay issue charts by subject and year.

  • Mostly for California. Other states like New York and Nevada are also available.
  • For more actual essays from California takers, check out BarEssays (click here to grab a $25 coupon).

Real graded CA bar essay answers (some other states available)

Washington MEE 1: 5

Essay 1 (Business Associations): 55, 55, 55, 52.5 (55/50), 60, 55 (55/55), 55, 50

Essay 2 (Torts): 55, 55, 55 (annotated with comments), 75 (80/70), 70, 65 (65/65), 60, 60

Essay 3 (Professional Responsibility): 60, 60, 45, 50 (50/50), 70, 62.5 (60/65), 50, 60

Essay 4 (Criminal Law and Procedure): 55, 55, 50, 62.5 (60/65), 55, 52.5 (55/50), 65, 65

Essay 5 (Remedies): 40, 50, 55, 62.5 (60/65), 55, 55 (50/60), 50, 50

California PT: 60, 60, 55, 65 (65/65), 60, 65 (65/65), 60, 60

Essay 1 (Civil Procedure): 57.5 (55/60), 60, 55 (55/55), 65

Essay 2 (Constitutional Law): 57.5 (60/55), 70, 52.5 (55/50), 55

Essay 3 (Real Property): 52.5 (55/50), 55, 62.5 (70/55), 60

Essay 4 (Professional Responsibility): 67.5 (65/70), 55, 57.5 (55/60), 60

Essay 5 (Evidence): 62.5 (60/65), 65, 57.5 (60/55), 60

California 90-minute PT: 60 (60/60), 65, 75 (75/75), 60

Essay 1 (Contracts): 67.5 (65/70), 62.5 (65/60), 55, 52.5 (55/50), 65, 55, 60

Essay 2 (Constitutional Law): 57.5 (60/55), 55 (55/55), 55, 60 (60/60), 60, 55, 55

Essay 3 (Professional Responsibility): 62.5 (60/65), 60 (55/65), 50, 62.5 (60/65), 65, 55, 55

Essay 4 (Business Associations): 55 (55/55), 57.5 (60/55), 45, 60 (60/60), 50, 55, 50

Essay 5 (Wills/Community Property): 57.5 (55/60), 62.5 (60/65), 60, 57.5 (60/55), 55, 50, 65

California 90-minute PT (Objective Memo – Torts): 75 (75/75), 65 (65/65), 60, 60 (60/60), 65, 55, 50

New York MEEs and MPTs: Set 1

Nevada Essays and NPTs: Set 1

Selected answers for 2022 July MEEs and MPTs

The essays are generally structured fine. Formulaic IRAC is a good way to organize arguments.

Regarding the second essay specifically (Con Law), there were more free speech issues in the first call of the question, such as whether content-based or content-neutral rules apply in a public school. I didn’t see respective rules for those (strict scrutiny for the content-based, for example). I think you did well hitting on those threshold issues like overbreadth, but you may have missed more important issues.

This leads me to think that perhaps you are getting some issues right but not others. I encourage you to study the selected answers when they get posted on the state bar website (or here) to see which issues they discussed and how they’re resolved.

I think that you may also need to connect the rules to the facts, rather than simply listing out facts in your application paragraphs. I noticed this in the third essay (Professional Responsibility). One way is to use the word “because.”

I wanted to add that you did a good job on the PT. This is where some people lose a lot of points when they neglect it since it’s worth 2x an essay. 

Essay 1 (Criminal Law and Procedure): 67.5 (70/65), 70 (65/75), 67.5 (80/55), 65 (65/65), 50, 67.5 (60/75), 55, 65

Essay 2 (Community Property): 72.5 (80/65), 57.5 (55/60), 65 (65/65), 65 (65/65), 60, 65 (70/60), 55, 65

Essay 3 (Torts, Remedies): 62.5 (65/60), 52.5 (55/50), 55 (55/55), 57.5 (60/55), 55, 60 (60/60), 60

Essay 4 (Evidence, Professional Responsibility): 52.5 (55/50), 50 (50/50), 65 (65/65), 57.5 (60/55), 55, 62.5 (65/60), 55, 70

Essay 5 (Business Associations, Remedies): 65 (60/70), 62.5 (65/60), 65 (70/60), 62.5 (60/65), 60, 57.5 (55/60), 55

California 90-minute PT (objective memo): 52.5 (55/50), 60 (65/55), 57.5 (55/60), 57.5 (55/60), 55, 60 (65/55), 55, 60

Some things to focus on:

  • Lay out the rules first. However, try to only introduce rules that you are going to use. This keeps your answer tight and gives you more time to talk about the facts and other issues. If you are spending a ton of time writing about the rules, you may not have that luxury. Also, don’t introduce new rules after you start talking about the facts or applying another rule.
  • Focus on completely identifying the issues. Ask yourself, what are they trying to test you on with these fact patterns? The more you solve similar fact patterns, the more you’ll see issue patterns you can whip out on the exam. Also, sometimes it’s better to have more issues identified with less analysis, especially for broad calls of the question. Professional Responsibility is one such subject where you want to focus on the issues and rules first (cook the essay), then fill in your analysis as time allows.
  • Be clear with your organization and presentation. You want to make it easy for the grader to know what issue you’re discussing, what the rule is, and where you are applying the rule. And of course what your conclusion is. Use headings and paragraphs liberally. The easier you are to understand, the more likely they will give you points. Remember that reading thousands of these essays is exhausting for the grader. Give them a breath of fresh air. Writing bar essays isn’t just about legal knowledge; it’s doing a service for your client.
  • Address all the questions completely.

From the applicant:

Here are a few of my thoughts:

  • Essay #1 – The 25-point discrepancy between the first and second grader makes absolutely no sense to me. When I read my response, I cannot understand how I possibly could have been given a 55 by the second grader.
  • Essay #3 – I broke out all the elements for the equitable remedies (TROs, Preliminary Injunctions, Permanent Injunctions) because this was a remedies-heavy question. So again, not sure what I did to receive only a 55 on this essay.
  • PT – Because the bar exam was back in person in February, I had the ability to make a strategic decision to do my PT first (before Essay #4 and #5), which I did. All I could hear anyone talking about after the afternoon session on the first day was how they did not finish the PT. This made me feel even more confident because I did finish. I even spent 5 extra minutes on the PT to make sure it was bulletproof to avoid what happened to me on the PT in July. In July, I missed the infamous “Blue Pencil” issue and did not finish concluding. Yet, I got the same exact score on the PT in February as I did in July. When I walked out of the afternoon session on day 1, I thought I at least got a 65 on the PT, but I honestly was expecting a 70 or 75. With how much the PT is worth, this dropped my raw writing score significantly.
  • MBEs – my percentiles went up in most areas compared to the July bar, but my scaled score was lower. I understand the scaling changes for each exam, but I’m not quite sure what else I can do for the MBEs when I spent 90% of my time during bar prep preparing for the MBEs.

My feedback:

First, I have to say that this was one of the hardest CA exams, with a 33.9% pass rate as you know. Even with a higher raw written score, you received a lower scaled score. Unfortunately, that’s just the casualty of whatever statistical adjusting they do. Theoretically, scaled scores are equivalent to one another… Theoretically. 

Essay 1 : 

I agree with you that the disparity in scores for Essay 1 is unusual. I don’t see how it deserves a 55. You pretty much did textbook IRAC, identified numerous issues, and organized and presented it all in a clean format. The only thing I can think of is that you didn’t actually connect the rules to the facts step by step. If you analogize it to an algebra problem, each line is a small step to the answer. Here’s an example:

Rule = perfect. (Note that I also had perfect rules on my first attempt.)

  • “Here, Jim and Fred both armed themselves with handguns and went to the store on Avon street.” Great.
  • “As an accomplice, Jim can be liable for any foreseeable crimes if he aided, encouraged, or abetted the crimes.” What made Jim an accomplice? The bridge is missing. One way to tell whether you are bridging the rule to a relevant fact is whether you can use the word “because.” The better move here is to say what Jim did that would qualify as aiding, encouraging, or abetting. I don’t think the facts say such a thing.
  • “Here, it can be inferred that Jim partook in these crimes and aided, encouraged, or abetted the target crimes and any foreseeable crimes.” I disagree with the “inferred” part. You should cite a fact from the hypo to support this. You talk about conspiracy below this, which is closer to what happened in the fact pattern.

My guess is that the 2nd reader read the essay more carefully since it was a reread in the gray zone.

I didn’t read through the rest in the interest of time, but one thing that’s evident is that essay grading is subjective. Hence, any “best practices” you can do to make it easy for the grader to give you points will help. I think your formatting is nice and clean. I love how you break down the issues, and you know your rules. You’re in a great place in terms of knowledge. Perhaps better usage of the facts will help next time around. 

Essay 3 : 

You did a great job on the remedies question. When the selected answers come out, take a look at what other torts there could have been. At first glance, you probably got the major ones.

Without doing a deep dive, I can only say that it was hard to see where the application of the facts was. I saw a lot of statutes being recited, and noticed your usage of facts in the middle of one of the paragraphs.

Raw written score :

Your average written score is 61.8 (432.5 / 7). You needed a 62.78 for this exam. That’s just 7 more raw points (~439.5 vs. 432.5). Can’t help but think that you would have “passed” the essay portion if they’d kept your first read.
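For reference, a small Python sketch of that arithmetic, assuming essay scores of 67.5, 65, 55, 65, 65 and a PT score of 57.5 (which add up to the 432.5 raw units quoted above), with the PT counted at twice the weight of an essay:

```python
def raw_written_average(essay_scores, pt_score):
    """California written section: five essays plus the PT, with the PT
    weighted as two essays (seven units in total)."""
    return (sum(essay_scores) + 2 * pt_score) / (len(essay_scores) + 2)

# Assumed scores consistent with the feedback above (432.5 raw units total):
print(raw_written_average([67.5, 65, 55, 65, 65], 57.5))  # ~61.8
# A passing average of 62.78 would have required ~439.5 raw units,
# i.e., about 7 more raw points.
```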

VIDEO: review and breakdown of the 5th set of essays

Essay 1 (Civil Procedure): 65, 55, 55, 55, 62.5, 50, 50, 65

Essay 2 (Professional Responsibility): 60, 65, 50, 50, 57.5, 50, 75, 62.5

Essay 3 (Torts): 55, 55, 51 (adjusted for ExamSoft issue), 65, 62.5, 55, 55, 65

Essay 4 (Criminal Procedure): 50, 60, 50, 60, 62.5, 65, 65, 65

Essay 5 (Wills, Community Property): 70, 55, 50, 65, 72.5, 60, 60, 57.5

California 90-minute PT: 55, 55, 55, 55, 60, 45, 55, 55

  • PT-only donation: 65

Washington MEE 1: 4

Washington MEE 2: 3

Essay 1 (Evidence): 55, 45, 52.5

Essay 2 (Contracts & Remedies): 60, 50, 60

Essay 3 (Community Property): 55, 65, 80

Essay 4 (Professional Responsibility): 50, 60, 62.5

Essay 5 (Real Property): 50, 50, 52.5

California 90-minute PT: 60, 45, 55

The second column of essays has some of my annotations in bubble comments.

Essay 1 (Professional Responsibility): 57.5, 50

Essay 2 (Business Associations – Corporations): 52.5, 50

Essay 3 (Real Property): 57.5, 55

Essay 4 (Criminal Law & Procedure): 62.5, 55

Essay 5 (Remedies – Contracts): 55, 55

California 90-minute PT: 52.5, 50

Essay 1 (Torts): 60, 70, 57.5, 60, 50

Essay 2 (Professional Responsibility): 65, 75, 57.5, 57.5, 60

Essay 3 (Contracts, Remedies): 60, 65, 52.5, 60, 60

Essay 4 (CA Evidence): 65, 60, 57.5, 70, 65

Essay 5 (Business Associations): 65, 55, 65, 65, 55

California 90-minute PT: 57.5, 60, 60, 55, 70

The recurring theme I see in your essays (nice job on the PT) is that issues are missing. I think your IRACing is good. However, you can’t do the rest of the IRAC if you don’t bring up the I (issue) in the first place.

To this end, Approsheets may be helpful. In addition, I’ve heard good things about Mary Basick’s blue book (2nd ed.).

Sample answers aren’t out yet, but I’d suggest looking at the higher-scoring answers in the essay answer bank for reference.

Examples of things you could have discussed:

Q1. NIED. What recovery is possible (part of the question)

Q2. PR essays typically have general calls (what violations or obligations are there)… Issue identification is especially important.

Q3. Valid contract formation. You could have discussed more specific sub-rules and corresponding facts for the elements of specific performance (the final call of the question).

Lesser point: Since you’re handwriting, presentation becomes more important. Try to leave enough spacing between issues and discussions so that it’s easier for the grader to see where discussions begin and end.

Essay 1 (Civil Procedure): 55, 55, 55, 50, 62.5, 65, 60, 55, 65

Essay 2 (Remedies, Con Law): 60, 62.5, 70, 55, 62.5, 52.5, 55, 72.5, 50, 60

Essay 3 (Criminal Law and Procedure): 55, 60, 55, 60, 57.5, 65, 55, 55, 70

Essay 4 (Prof Resp): 60, 55, 50, 50, 55, 52.5, 55, 65, 55, 55

Essay 5 (Contracts): 60, 57.5, 60, 55, 57.5, 55, 65, 70, 60, 55

California 90-minute PT: 55, 60, 60, 65, 62.5, 55, 70, 60, 60

My assessment of the essays in the 5th column above (in bold).

Essay and PT set 1 (scores: Q1 55, Q2 57.5, Q3 62.5, Q4 55, Q5 57.5, PT 70)

Essay and PT set 2 (scores: Q1 55, Q2 62.5, Q3 60, Q4 60, Q5 62.5, PT 62.5)

Essay 1 (Wills, Trusts, Community Property): 70, 60, 55, 65

Essay 2 (Torts): 57.5, 65, 65

Essay 3 (Real Property): 52.5, 62.5, 60, 65/70

Essay 4 (Civil Procedure, Evidence): 65, 55, 60

Essay 5 (Prof Resp): 55, 52.5, 60

California 90-minute PT: 57.5, 60, 60, 65/60

Essay 1 (Contracts): 60, 55

Essay 2 (Evidence): 65, 65

Essay 3 (Prof Resp): 75 (sample from BarEssays), 55

Essay 4 (CA Comm Prop): 60, 55

Essay 5 (Con Law): 55, 50

California 90-minute PT: 55, 55

Essay and PT answers set 1 (scores: Q1 60, Q2 50, Q3 50, Q4 55, Q5 50, PT 55)

Essay and PT answers set 2 (scores: Q1 65, Q2 57.5, Q3 55, Q4 65, Q5 55, PT 60)

New York: 2018 July New York (UBE) MEE and MPT set (MEE 1-6 scores in order: 57.06, 44.55, 55.46, 41.42, 53.79, 40.09; MPT scores in order: 49.42, 52.44; total 137.7)

Essay 1: 55 (Brian’s Annotations)

Essay 2: 60

Essay 3: 60

Essay 4: 65 (Brian’s Annotations)

Essay 5: 65

California 90-minute PT: 65

Essay and PT answers set 1 (scores: Q1 55, Q2 55, Q3 65, Q4 50, Q5 60, PT 60)

Essay and PT answers set 2 (scores: Q1 55, Q2 60, Q3 55, Q4 70, Q5 60, PT 50)

Essay 1: 60, 72.5, 60, 65, 60, 65, 60

Essay 2: 60, 62.5, 60, 60, 55, 55, 65

Essay 3: 65, 62.5, 65, 55, 55, 50, 50

Essay 4: 60, 65, 65, 70, 60, 60, 65

Essay 5: 65, 62.5, 60, 60, 55, 55, 60

California 90-minute PT: 65, 57.5, 75, –, –, 55, 45

Essay 1: –, 62.5, 60, 55, 50

Essay 2: –, 57.5, 55, 60, 50

Essay 3: –, 65, 55, 70, 65

Essay 4: –, 60, 60, 60, 50

Essay 5: –, 65, 60, 55, 65

Essay 6: –, 65, 60, 60, 60

PT A: 57.5, 70, 55, 70, 65

PT B: 60, 60, 70, 65, 60

Essay 1: 55, 65, 65, 55, 55

Essay 2: 55, 55, 60, 55, 45

Essay 3: 60, 50, 60, 55, 55

Essay 4: 55, 65, 55, 55, 50

Essay 5: 55, 60, 75, 60, 55

Essay 6: 50, 60, 50, 55, 55

PT A: –, –, 50, 50, 55

PT B: 50, –, 60, 60, 55

Essay and PT answers set 1 (scores: Q1 55, Q2 60, Q3 45, Q4 60, Q5 65, Q6 60, PT A 50, PT B 55)

Essay and PT answers set 2 (scores: Q1 55, Q2 50, Q3 50, Q4 50, Q5 55, Q6 50, PT A 55, PT B 60)

Essay and PT answers set 3 (scores: Q1 60, Q2 57.5, Q3 65, Q4 52.5, Q5 65, Q6 60, PT A 60, PT B 57.5)

Nevada: Essays Day 1 (scores in order: 63.32, 90.06, 71.40 / 75.00 passing), MPTs (scores in order: 67.50, 51.91 / 75.00 passing), Essays Day 3 (scores in order: 61.52, 93.47, 76.43, 78.82 / 75.00 passing)

Essay 1: 60, 65, 62.5

Essay 2: 50, 60, 60

Essay 3: 50, 55, 60

Essay 4: 60, 70, 55

Essay 5: 50, 55, 57.5

Essay 6: 50, 60, 57.5

PT A: 55, 55

PT B: 65, 70

My analysis of the first column of essays and PTs above (in bold): my estimated scores before the actual scores were sent to me, plus my evaluation of the answers and suggestions.

Essay and PT answers set 1 (scores: Q1 55, Q2 60, Q3 65, Q4 50, Q5 60, Q6 50, PT A 60, PT B 70)

Essay and PT answers set 2 (ZIP file with images; scores: Q1 55, Q2 75, Q3 65, Q4 55, Q5 50, Q6 50, PT A 55, PT B 60)

Essay and PT answers set 3 (scores: Q1 50, Q2 50, Q3 50, Q4 60, Q5 55, Q6 55, PT A 55, PT B 55)

Essay 1: 60

Essay 2: 65

Essay 4: 55

Essay 5: 55

Essay 6: 60

PT A: 60

PT B: 55

Essay and PT answers set 1 (scores in order: Q1 60, Q2 55, Q3 55, Q4 60, Q5 55, Q6 55; PT A 50, PT B 50)

Essay and PT answers set 2 (scores in order: Q1 60/55, Q2 75/85, Q3 60/55, Q4 65/65, Q5 65/60, Q6 60/60; PT A 55/55, PT B 60/60)

Essay 1: 65

Essay 3: 55

Essay 6: 55

PT B: 70

New York: 2014 July New York essay answers set (scaled scores in order: 60.82, 44.77, 43.83, 40.62, 45.01, 31.66)

Essay and PT answers set 1 (scores in order: Q1-6 55, 55, 55, 55, 55, 55; PTs 55, 60)

Essay 1: 65, 60

Essay 2: 55, 50

Essay 3: 55, 50

Essay 4: 50, 55

Essay 5: 50, 55

Essay 6: 60, 60

PT A: 55, 65, 70

PT B: 60, 55

Issue Outlines

Sample legacy (3-hour) CA PT answers by Brian

  • PT 2008 FEB B (Dr. Snyder), with grader commentary
  • PT 2006 FEB B (Estate of Small), with grader commentary and self-analysis
  • PT 2012 FEB B (State v. Dolan), self-graded
  • PT 2009 JULY A (Farley), self-graded
  • PT 2010 JULY A (Vasquez CC&Rs), self-graded with notes, comments, and questions
  • Snow King Mountain Resort answer, notes
  • Pearson

California essay and issue charts by subject

A note from the anonymous donor on how to use the charts: Barbri has a different chart, with the subjects in the left column, the month and year of the test along the top row, and the question number where the subject and test date intersect; creating the chart this way instead made me focus on my weakest subjects rather than trying to predict which topics would be tested. There were certain subjects where I missed a lot of issues, e.g., constitutional law and community property, because I didn’t take ConLaw II or CP in law school.

Instead of doing the same number of essays for each subject, I’d only outline the subjects where I’d spotted most of the issues, and I’d use my extra time to review how the essays in my weakest subjects were organized, for as many essays as I could. Having a chart allowed me to find those essays faster, compare how they were written from year to year, and create a template for how to approach issues that I hadn’t learned in law school.

COMMENTS

  1. California Bar Exam Grading

    The California Bar Examination consists of the General Bar Examination and the Attorneys' Examination. ... Six groups, each consisting of experienced graders and up to four apprentice graders, are selected to grade the essay and PT answers. The groups convene three times early in the grading cycle for the purpose of calibration.

  2. Bar Essay Grader Recommendations : r/CABarExam

    Bar Essay Grader Recommendations. Hi all, I am a repeater who was extremely close to passing (1375). I wanna keep the same strategy I kept last time, one part of which was utilizing Themis Unlimited Grading. I submitted like 80 essays last bar prep, but I'm gonna go harder and submit 120 this time around. I was wondering if you guys can suggest any ...

  3. California Bar Examination

    The next California Bar Exam is scheduled for July 30-31, 2024. The exam will be administered in person. Applications are now available in the Applicant Portal . How to apply: Log into the Applicant Portal. From the menu options, click "View Forms." Under the Examinations menu, select the "California Bar Examination Application."

  4. The Ultimate Guide to How I Passed the California Bar Exam

    A standard subscription provides access to the website until the conclusion of the upcoming California Bar Exam, and a premium subscription includes "model answers for every essay question tested since 2005, in addition to over 300 professional bar grader reviews of essays and performance exams in the database, all written by former official ...

  5. How is the California Bar Exam Scored?

    The process is outlined below: If you receive a score at or over 1440, you automatically pass the California bar exam. If, after one reading of your answers, your score is below 1390, you fail the California bar exam. If your score is 1390 or greater but less than 1440, you will get a "second read" by a different set of graders.

  6. How Are California Bar Exam Essays Graded

    Here, we tell you how California bar exam essays are graded so that you can get that 1390 on the California bar exam! We give you a detailed guide and go thr...

  7. How To Consistently Outline Bar Exam Essays

    One of the keys to a successful bar exam essay is solid organization. For each fact pattern in an MEE, you essentially have 30 minutes to read the pattern, read the questions, and write a clear and cogent answer to each question. A typical MEE fact pattern has three or four questions, and each answer is basically a mini-essay.

  8. It's All Relative—MEE and MPT Grading, That Is

    Instead, NCBE trains bar examiners to grade the MEE and MPT on a relative basis—making distinctions between papers and rank-ordering them according to whatever score scale the jurisdiction has in place. (Jurisdictions may use whatever score scale they wish—e.g., 1-5, 1-6, 1-10, etc.—although NCBE uses a 1-6 score scale at its ...

  9. PDF California First-Year Law Students' Examination

    This publication contains the four essay questions from the October 2021 California First-Year Law Students' Examination and two selected answers for each question. The selected answers are not to be considered "model" or perfect answers. The answers were assigned high grades and were written by applicants who passed the examination.

  10. How—and Why—You Need To Self-Grade Bar Exam Essays

    Self-grading is helpful for several reasons: You will learn the rules better if you are forced to write out what you didn't know. You will become well acquainted with how to structure your essay. You will become better at picking out material facts in the fact pattern. You will start to "think like a bar exam grader".

  11. Graded Bar Exam Essay Answer Bank

    Below the essay bank is a link to California bar essay issue charts by subject and year. ... When I read my response, I cannot understand how I possibly could have been given a 55 by the second grader. Essay #3 - I broke out all the elements for the equitable remedies (TROs, Preliminary Injunctions, Permanent Injunctions) because this was a ...

  12. Dear THEMIS California Essay Grader: r/Bar_Prep

    Dear THEMIS California Essay Grader: You may want to recalibrate your grading metrics and actually match how California grades their essays. An applicant essay that gets the overall topic of the essay (e.g. contracts, property, civpro) correct but misses many of the main issues gets a 50. Yes, you get a 50 just for getting the essay topic correct.

  13. California Bar Exam Score Calculator

    The Seperac Bar Review score calculator weights each component's common-scale score as follows: Essays 1-5 at 0.071 each, the PT at 0.143, and the scaled MBE at 0.5; the weighted components combine into the written score and total score (see the sketch after this list). To purchase a subscription, click here. For subscriber comments, ...

  14. PDF California Bar Examination

    This publication contains the five essay questions from the July 2021 California Bar Examination and two selected answers for each question. The selected answers are not to be considered "model" or perfect answers. The answers were assigned high grades and were written by applicants who passed the examination after the First Read.

  15. News Releases

    State Bar Releases Results of February 2024 Bar Exam. The State Bar announced today that 1,337 applicants (33.9 percent) passed the February 2024 General Bar Exam, and 197 applicants (52.7 percent) passed the Attorneys' Exam. If those who passed satisfy all other requirements for admission, they will be eligible to be licensed by the State ...
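
Putting two of the quoted pieces together, here is a minimal sketch of the scoring arithmetic, assuming the component weights quoted in item 13 (essays 0.071 each, PT 0.143, MBE 0.5) and the first-read thresholds quoted in item 5 (1440 is an automatic pass, below 1390 fails, anything in between gets a second read). The placeholder scores and the omitted raw-to-scaled conversion are assumptions for illustration, not anything taken from the sources above.

```python
# Rough sketch of the CA total-score arithmetic described in items 5 and 13 above.
# Assumes each component has already been converted to the 2000-point common scale;
# the State Bar's conversion of raw written scores to that scale is not reproduced here.

ESSAY_WEIGHT = 0.071   # each of the five essays (~1/14 of the total)
PT_WEIGHT = 0.143      # the performance test (~2/14)
MBE_WEIGHT = 0.5       # the scaled MBE (half the total)

def total_scaled_score(essay_scaled, pt_scaled, mbe_scaled):
    """Weighted total on the 2000-point scale, using the quoted weights."""
    assert len(essay_scaled) == 5
    return (sum(ESSAY_WEIGHT * s for s in essay_scaled)
            + PT_WEIGHT * pt_scaled
            + MBE_WEIGHT * mbe_scaled)

def first_read_outcome(total):
    """First-read decision rule from item 5: pass, fail, or second read."""
    if total >= 1440:
        return "pass"
    if total < 1390:
        return "fail"
    return "second read"

if __name__ == "__main__":
    # Placeholder common-scale scores, for illustration only.
    total = total_scaled_score([1400] * 5, 1400, 1450)
    print(round(total, 1), first_read_outcome(total))  # 1422.2 second read
```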