What Do Students Think about Automated Grammar Tools?

I work at a Sino-American university in Shanghai, China. We’re a new school, but we’ve already started to expand into offering graduate degrees. Our first graduate program was the Global Master of Social Work program, in which students spend one year studying in Shanghai and one year in New York. Working with graduate students presents a unique set of literacy challenges, as they are engaged in acquiring the advanced literacy skills needed to pass as members of their chosen discipline—and this increasingly means the ability to write well.

Many graduate students, and some professionals (like me), are nervous about writing. They’re worried about clearly expressing their ideas. They’re concerned about adequately structuring and supporting their claims. And they’re worried about grammatical and mechanical accuracy. There are many places students can turn to for help: writing centers, professors, peers, and so on. But there are often affective or material barriers to activating these supports. Enter the automated grammar evaluator: an increasingly popular one is Grammarly, which offers both a free and a premium version. The premium version checks for more kinds of grammatical concerns and also offers vocabulary advice, plagiarism checks, and more varied stylistic opinions based on the communicative context (e.g., personal blog posts, business emails, academic reports of research).

At my school, we’ve decided to make Grammarly available to our graduate students—a mix of American and Chinese individuals. This decision has raised questions about how students interact with applications like Grammarly, the possible impacts on their writing, and how we can scaffold students’ use of this tool so that they can improve their writing over time. These questions led me to Cavaleri and Dianati’s (2016) article, which, rather conveniently, discussed how students at their institution reacted to using Grammarly to support their writing. They had two major takeaways. First, Grammarly fit their interpretation of the Technology Acceptance Model (TAM)—specifically, the software was perceived as both easy to use and useful. This fit with the TAM meant that students were likely to incorporate Grammarly into their writing practice. Second, they found that students reported the feedback from Grammarly was detailed enough to help them understand the root cause of their errors and to make informed decisions about how to correct them.


Cavaleri and Dianati (2016) examined the use of Grammarly, an automated grammar tool, by students at two campuses of their Australian university. They were interested in determining to what degree students were willing to incorporate Grammarly into their writing practices and how they made use of the tool. To carry out this study, they used a targeted survey approach. They began by emailing survey invitations to all students who had signed up for a copy of their institution’s site license using their school email addresses; this information was provided to the researchers by Grammarly. They then asked students a series of five-point Likert-scale questions to determine students’ attitudes towards writing and the Grammarly tool, to understand what they found helpful about the tool’s writing advice, and to gauge the degree to which they incorporated the software into their own practice. They also provided open-ended questions to elicit responses from the students about the impacts of using Grammarly on their writing.

They discovered that most of their respondents found Grammarly both easy to use and helpful in the kinds of advice that it gave. Following the Technology Acceptance Model (TAM), this suggests that students were very likely to maintain their use of Grammarly and to continue to incorporate it more fully into their writing practice in the future. A vast majority of students reported that the software also provided thorough explanations of the errors in their writing. Moreover, this feedback was structured in such a way that it increased their understanding of discrete grammatical points—such as subject/verb agreement issues. Their findings suggested that Grammarly can be a potent tool for helping writers improve in the areas of grammatical accuracy and, with the premium version, lexical range. That being said, they point out the need to train students in how to make efficient use of the software and to scaffold its incorporation into their writing practice.

Closing Comments

At NYU Shanghai, we’re still waiting to see how Grammarly pans out for our graduate student writers. But Cavaleri and Dianati (2016) provide useful tools-for-thought about deploying a tool like Grammarly in your writing-intensive classes. I suspect that we’ll be replicating their research so that we can determine whether the benefit of providing the software for free to our students is worth the cost, as it is another line item on the budget. On a personal note, I use the premium version of Grammarly for all of my own writing—from this blog post to the articles that I send out for publication. I feel it helps. It helps me to feel more confident about my writing. And I’m convinced that it got me my first compliment from an editor, who said a recent submission was “clearly written and well-edited,” praise I’d never received before I started using the software. That being said, students need to be taught that they don’t have to take every suggestion that Grammarly makes—just as they don’t have to act on every piece of feedback from their peers during peer review. They have to be taught how to make sense of the advice the software gives so that they know what to act on and what to ignore. They also need to be made aware that the software isn’t a silver bullet that will eliminate all errors. In some cases, it may introduce new errors—because of an over-zealous anti-passive-voice algorithm, for instance. Likewise, there are some errors that it may simply miss altogether.


Cavaleri, M., & Dianati, S. (2016). You want me to check your grammar again? The usefulness of an online grammar checker as perceived by students. Journal of Academic Language & Learning, 10(1), 223-236.

Joshua M. Paiz
Lecturer at NYU Shanghai
Joshua M. Paiz holds a Ph.D. in TESOL from Purdue University and is currently a lecturer and L2 writing specialist at NYU Shanghai. His research interests include L2 writing, SLA, identity in applied linguistics, and critical issues in TESOL.

3 thoughts on “What Do Students Think about Automated Grammar Tools?”

  1. Thanks for this report; it is interesting to compare this study of 18 students with another institution’s study, which used 110 students. One remarkable finding was “that 25% of participants felt less confident about their writing skills after using Grammarly, perhaps because of its tendency toward ‘overcorrections’” (Thorbes, 2016, para. 8).


    Thorbes, C. (2016, February 12). Evaluating Grammarly: A process for assessing learning technologies. Teaching and Learning Centre Blog, Simon Fraser University. Retrieved October 31, 2017, from https://www.sfu.ca/tlc/blog/grammarly.html

    1. Mura,

      Thanks for sharing this. I appreciate perspective wherever and whenever we can get it. It would be interesting to see a comparative study done with so-called L1 and L2 writers that measures confidence and then attempts to uncover what undergirds that confidence.


