I have a date with my editor looming, and I want to make a good impression. One of my major projects is about to be completed, and it’ll soon be time to get out the knives. But before I do that, I would love to make one editorial pass with an automated tool, if I can find one worth my time. I have something that I wrote myself, and I’ve been using it for years. It finds many of the unfortunate foibles that I am prone to, but only the ones that I’ve _noticed_ that I’m prone to. Whenever I unearth a new bad habit, I add it to my tool, and hopefully, I’ll never again publish something that contains that particular kind of mistake. But what about all the mistakes I’m still making that nobody has pointed out to me yet? Could it be that somebody has written a tool with more in it than the one I wrote myself? To find out, I thought I would take a tour of the online grammar checkers and see if there’s anything there that can help me save a little face.
For each of these tools, I will be testing it on the same input passage of text – a scene from my upcoming novel, Strange People. The sample is a first draft, approximately 1500 words in length. I won’t be making any corrections as I go, so each test will be with the exact same input. And since this is a real scene from my WIP, this should be about as realistic a test of the grammar bots’ capabilities as I can muster. So let’s get started.
The Grammarly experience starts out rather nicely. Just plop some text into the box and hit Go. Some very reassuring progress messages flit past as it checks such things as pronoun agreement, sentence structure complexity, etc. Then up comes a very stylish-looking report, telling me that for my sample text, Grammarly has found 51 errors. Whoa! 51? The whole scene is only 1500 words. I’ve got a critical mistake in essentially every single paragraph? Let me check. Nope. It’s even worse. There are only 29 paragraphs in the sample, so I have more than 1 critical error per paragraph. I am mortified. But wait! The stakes are even higher. Because apparently, one of those critical mistakes is an accusation of plagiarism. As you can imagine, I was anxious to dig down to the bottom of these mistakes and quickly get them beaten out of my manuscript.
But as helpful as Grammarly has been up to this point, they suddenly got very coy. Now they don’t want to talk to me until I’ve registered for a free account. Okay. I’ll bite. I want to know how they could possibly have found 51 critical errors – that word keeps shaking its finger at me: critical. So I launched into the account registration process. This is the name and password I want to use, and that’s the email address I want to be notified at, and no, I do not want to receive the newsletter. Now go. Oops. One last screen. Give us your credit card.
Yup. You want to know what evil secrets we’ve found lurking in your text? Give us your card, man. We promise, we won’t use it for two weeks, while you continue to evaluate our great service, but we want the card now. Pay up. Meanwhile, we’ll just hold your writing reputation hostage.
Well, guess what, Grammarly. I don’t negotiate with kidnappers. You can keep your critical errors. I can always write more.
In contrast to Grammarly’s 51, GrammarBase reported 34 critical errors on the same sample text. Let’s dive in and see what they didn’t like.
First of all, fully 20 of those 34 critical errors are apparently spelling mistakes, but upon closer inspection, it turns out that they are all attached to two words, which just happen to be character names. And yes, they were spelled correctly. Obviously their database is not going to know that, but suddenly I’m down from 34 errors to 14. Whew.
The next most commonly charged error for this sample is passive voice. Unfortunately, GrammarBase doesn’t seem to understand the proper definition of the term, because all they have done is flag usages of “to be” in its various conjugations. For example, in the sentence, “Her load had been much heavier than she had expected it to be,” GrammarBase tells me that my use of ‘to be’ is passive. Sigh. Oh well, those 5 charges of passivity all turned out to be similarly bogus, so I’m down to 9 errors now.
My next most frequent gaffe, apparently, is using overly complex language. Excellent! I know I have a tendency to be a bit verbose at times, so this could be a real help. But wait. What? The first flag I see is in the sentence, “It was the best day she could remember in quite a long time.” What on Earth could be construed as pedantic or complex in that? I could only shake my head and laugh. GrammarBase reports that, in that sentence, the word ‘quite’ was too complex, and should be simplified. Since the other flags for complexity were similarly flawed, those 3 complexity charges are eliminated from the count, and I’m down to 6 critical errors to correct. The mathematically inclined among you will note that of the 28 errors expunged so far, I have yet to change so much as a comma in my manuscript. But let’s press on.
1 charge of hidden verb, which was not hidden at all, nor was it a verb. 1 claim of redundant usage, when in fact it was poetic repetition, so ignored. 1 cliche, for using “sigh of relief,” which I agree is a common phrase. So I thought about it, but I decided that in this particular passage, that familiar phrase strikes exactly the right tone, so I ignored the flag and moved on. Two of the flags are there to suggest alternate words to replace the character names, under the assumption that they are not real words, so I ignored those too.
And that leaves us with one last error, and I think it’s got to be the most amusing alert of the entire bunch. It was raised for the following line of dialogue, in which a police officer shouts at a suspect: “Down on the floor, lady! Now!” In its politically correct wisdom, GrammarBase tells me that this use of ‘lady’ is gender biased, and potentially offensive. Oh my. Really? Perhaps I should fix that then. Doing its helpful best, GrammarBase suggests that I replace it with ‘woman.’ o.O Yeah. That won’t be viewed as sexist. So into the ignore bin with you, too, and there you have it.
Final score: GrammarBase 0, Jefferson 34. To be completely fair, I did actually consider making one of the suggested changes, so I’ll adjust that.
GrammarBase 0.5, Jefferson 33.5, which works out to a sticky rate of about 1.5%.
Judging by its interface, PaperRater is aimed more at academic writing than creative writing. For example, during the input process, they ask for your education level, and a list of citations. But this doesn’t mean it can’t be useful, so I ran my standard sample through it.
And got four suggestions. On the surface, this looks like it might not be very many flags, but then again, none of the flags on the previous candidates turned out to be very useful, so maybe four is a good number. If the flags actually have any merit.
Two of the complaints were repeats of what GrammarBase complained about – the poetic repetition and a common phrase. (Although at least PaperRater had the grace not to call it a cliche.) The third ding was for a character name again, flagged as being misspelled. But the fourth complaint was odd. The character in my sample scene is Sister Diaphana – a rather chubby woman. For some reason, PaperRater took exception to my referring to her as “the portly nun,” and suggested instead that I call her “the port nun.” I’m not sure if they wanted her wandering the docks, or if they simply thought she was the nun on the left, but either way, their suggestion is as wrong as it is humorous.
I was also amused that PaperRater recommended some exercises for me to help me expand my vocabulary. Apparently, I wasn’t using big enough words, given my education level. This is no doubt a product of having used an academic grammar tool, telling it that I have a PhD, and then feeding it a scene from my YA novel.
PaperRater: 0, Jefferson 4. For a stickiness rating of 0, but a much smaller time-waste factor to get there.
ProWritingAid doesn’t look like much. Some of the other tools have put a lot more effort into the slick presentation stuff, but let’s see how it performs where it counts.
The first difference I noted was the number of flags. ProWritingAid takes the cake so far, having reported 116 issues. That’s quite a few, but I was also pleased that they did not label them as “critical,” in some attempt to jack my fear register up a notch or three.
The first report was on overused words, and I must say, it gave me pause for thought. Now, the fact that “had” appears frequently is not a big surprise. A sizable chunk of the sample scene is written in past perfect, so of course, there are a lot of hads to be had. Another thing I liked about this report was that it didn’t just say: you used “just” or “then” 10 times, and 10 is too many. It also recommended a target frequency, suggesting that I remove one occurrence. And different frequent words obviously have different comfortable frequency ratings, because it wanted me to reduce my “see/saw” usage to five. Very interesting. I admit I’m going to have to come back and look at this report more carefully later.
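For the technically curious, a frequency report like this is simple enough to sketch. Here is a minimal Python version; the target numbers are placeholders I invented for illustration, not anything ProWritingAid has published:

```python
from collections import Counter
import re

# Hypothetical "comfortable frequency" targets for a ~1,500-word scene.
# These numbers are illustrative guesses, not ProWritingAid's real thresholds.
TARGETS = {"just": 9, "then": 9, "saw": 5, "had": 20}

def overused_words(text):
    """Return words whose count exceeds their target, as {word: (count, target)}."""
    counts = Counter(re.findall(r"[a-z']+", text.lower()))
    return {word: (counts[word], limit)
            for word, limit in TARGETS.items() if counts[word] > limit}

sample = "She had just seen him. Then she saw him again. " * 6
print(overused_words(sample))  # → {'saw': (6, 5)}
```

Run on real prose, the interesting part is tuning those per-word targets – exactly the judgment call ProWritingAid seems to be making with its different limits for “just” and “see/saw”.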
Next up was the sentence length variety assessment. On this report, I seem to have scored well, and ProWritingAid had no suggestions for improvement. The grammar and spelling report was clean, too, tripping only twice: once on a character name, and once on a word that was cut off half uttered when Sister Diaphana’s dialogue was interrupted. Curiously, none of the other checkers reported the word “unlov” as being misspelled.
The writing style report seems to have complained about what it thinks are a bunch of adverbs, but of all the occurrences of “really,” “possibly,” “heavily,” “normally,” and “lately” that it found, the only one being used in a traditional adverbial role is “sighed heavily,” and I am happy with it there, so I’m not going to remove it.
One of the really interesting reports, to me, was the “sticky sentence” report, which ProWritingAid defines as a sentence with an unusually high number of common “glue” words – conjunctions, prepositions, and the like. They suggest that sentences with a high proportion of such words can usually be rephrased to make them both simpler and more direct. And of the 10 sentences ProWritingAid flagged for me, I would say that fully half of them may need to be reduced.
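The underlying arithmetic is straightforward: count what fraction of each sentence’s words come from a glue list. Here is a rough Python sketch; the glue set and the 45% threshold are my own guesses, not ProWritingAid’s actual values:

```python
import re

# A small, illustrative set of "glue" words; a real checker uses a longer list.
GLUE = {"the", "a", "an", "and", "but", "or", "of", "to", "in", "on",
        "at", "by", "for", "with", "was", "were", "is", "that", "it"}

def sticky_sentences(text, threshold=0.45):
    """Flag sentences whose proportion of glue words exceeds the threshold."""
    flagged = []
    for sentence in re.split(r"(?<=[.!?])\s+", text.strip()):
        words = re.findall(r"[a-zA-Z']+", sentence)
        if not words:
            continue
        ratio = sum(word.lower() in GLUE for word in words) / len(words)
        if ratio > threshold:
            flagged.append((round(ratio, 2), sentence))
    return flagged

print(sticky_sentences("It was in the back of the van by the door. "
                       "Sister Diaphana smiled."))
```

The first sentence there is eight glue words out of eleven, so it gets flagged; the second is pure content words and sails through.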
My cliche report was clean, with the exception of one use of “the more the merrier,” which occurs in the dialogue of a character who is known for talking in annoying aphorisms, so it stays.
The repeated words and phrases analysis tries to point out places where words have been used frequently and close together. Some of the occurrences are legit, but some of them may need to be thinned out. I’ll be returning to this report as well.
The recurring phrase report is a breath of fresh air. ProWritingAid correctly spotted a seven-word sequence that I had used twice in the scene: “than she had expected it to be”. It also spotted duplicated 6-, 5-, and 4-word phrases that I will be rewriting in the morning. I’ve written a checker of my own that I’ve been using for a few years, and it includes a similar analysis, but this is the first time I’ve seen one of my personal checks implemented in any of the checkers I’ve examined. It suggests to me that ProWritingAid is not just knocking off the easy, commonplace evaluations. They seem to actually be diving into the heart of common stylistic mistakes and helping writers to find and remove them.
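At its core, this kind of check is n-gram counting: slide a window of four to seven words across the text and report any sequence that occurs more than once. A rough Python sketch of the idea (my own approximation, not ProWritingAid’s actual method):

```python
from collections import Counter
import re

def repeated_phrases(text, min_n=4, max_n=7):
    """Find word sequences of length min_n..max_n that occur more than once."""
    words = re.findall(r"[a-zA-Z']+", text.lower())
    repeats = {}
    for n in range(max_n, min_n - 1, -1):
        grams = Counter(tuple(words[i:i + n])
                        for i in range(len(words) - n + 1))
        for gram, count in grams.items():
            if count > 1:
                repeats[" ".join(gram)] = count
    return repeats

text = ("Her load had been much heavier than she had expected it to be. "
        "The climb was longer than she had expected it to be.")
print(repeated_phrases(text))
```

A production tool would also suppress the shorter phrases that are merely substrings of a longer repeated one; I’ve left that out to keep the sketch short.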
Anyway, there are still a half dozen or more reports to go through, but I don’t want to belabor the point. From what I’ve seen in this test, ProWritingAid is not only the best tool in its class for my needs, it is the only tool. I can see myself using PWA regularly, as a prelude to sending things to my editor, and there isn’t another tool on my list that I would ever bother returning to. Better yet, PWA never once asked for my credit card. From what I can see, the reports I cited here are available for free, always. If I had any quibble at all, it would be that I don’t generally like the idea of uploading my entire novel to somebody else’s web site in order to evaluate it. They have a plug-in for MS Word, which I would buy in a heartbeat, if I used Word, but I don’t. For now, I’ll just have to satisfy myself with the free web tool, and hope that someday they implement a plug-in for Scrivener.

[For those waiting for the summary score, I honestly can’t figure out how to calculate one. ProWritingAid gave me so many things to think about, on so many different screens, that I’m not sure how to total them. I’m also not sure how many of their suggestions I’m going to act on. But I would hazard a guess that it’s somewhere around 20% of the recommended changes, which is *way* higher than any of the other contenders.]
And if you know of another grammar tool out there that I’ve missed, add a comment below and I’ll see about adding it to the article. What checkers do YOU use?