
Turnitin Clarity

Turnitin is an education technology company that provides a suite of solutions for academic institutions. 


Portfolio Hero Image_edited.png

What is Turnitin?

Turnitin is an education technology company that provides a suite of solutions for academic institutions, administrators, instructors, and students. Its core offering is to check for the integrity and originality of student work. In addition, Turnitin equips instructors with tools to streamline grading and deliver meaningful written feedback.

What is Turnitin Clarity?

Clarity from Turnitin is designed to bring transparency to the student writing process by providing visibility to educators into their writing journey, including their use of AI tools along the way.

Generative research revealed that in an AI-powered world, we need more than just AI detection. We need solutions that promote learning, invite dialogue, and equip educators and students to use AI ethically and confidently. Educators needed to engage, assess, and uphold integrity with full visibility into the writing process, instead of only detecting AI-generated content after submission.

The first milestone set by the company was to establish a proof of process, ensuring that both students and instructors could trust the integrity of the work submitted. 

The Clarity ecosystem consists of a student writing environment called the Composition Workspace and a report detailing the writing process for the educators called the Writing Report. 

Composition Workspace
Editor.png

Composition Workspace allows the student to begin, pause, resume, and resubmit their work until the submission deadline. It also supports responsible and ethical use of AI.

The Writing Report
Instructor report.png

Each submission generates a report for instructors to view and interact with: the Writing Report. It equips instructors with insights and supporting data into how students constructed their writing, from creation to submission, helping them teach students to write with integrity and fostering more meaningful engagement between educators and students.

Without the Clarity add-on, the integrity of student work is assessed only after the final submission is made. Two types of reports are generated: a Similarity Report and an AI Writing Report. The Similarity Report provides a “similarity score,” indicating the percentage of text that matches existing sources, along with a detailed breakdown of the matched content and its sources. The AI Writing Report provides an “AI percentage score,” estimating the likelihood of AI involvement and highlighting specific sections predicted to be AI-generated.

Shaping Instructor's Writing Report: My Role in the Design Journey

I joined the project after the first version of the instructor report had been designed and tested.

Initial feedback indicated that instructors struggled to interpret the Instructor Report, navigate through it, and comprehend the overall message we were trying to convey about a student's proof of process.

Writing Report Redesign - Whipping up a new version!

I was tasked with rethinking the flow and screen design to improve clarity, comprehension, and ease of navigation.

To set the context for the redesign, here's an overview of the user flow before we explore how it evolved.

ChatGPT Image Oct 17, 2025 at 06_35_45 PM.png

The table below compares the two designs, pointing out the problem areas in the older design and explaining how the new version solved them.

Version 1 - Inbox | Student Submissions
  1. It was not clear which submissions were being called out for possible integrity issues.
     

  2. Instructors did not understand what 'View Report' and the flag number meant.

Version 2 - Inbox | Student Submissions
  1. Clearly called out the submissions with potential integrity issues vs. those without, using clear naming, iconography, and colors.
     

  2. For those that have integrity issues, we provided a summary of all the potential issues upfront with the option to dig deeper.

Inbox_edited.jpg
Version 1 - The Writing Report Landing Page
  1. Data and numbers in the side panel were difficult to understand. Instructors had questions like:

    1. What do those numbers mean?

    2. What are we supposed to conclude by looking at these numbers?

  2. The Writing Observations were found to be too granular to convey an overall message about the students' writing.
     

  3. IA / Navigation: It was not clear what the instructors should do and where they should go next. What should be the next logical step?

    1. Should they see the pasted text and why?

    2. Should they click on the links provided to them and why?

IMG_7901_edited.jpg
Version 2 - The Writing Report Landing Page
  1. A clear differentiation, through careful choice of words, between findings that suggest potential misconduct and those that are benign yet important observations about the writing process.
     

  2. Insight - Evidence: Each finding has a clear hierarchy of data granularity, leading with key insights supported by just enough context to be understood, while offering deeper layers of evidence only if the instructor wishes to explore further. The raw data list is removed from this view and instead connected to the insights: the insight is what matters, and the data backs it up as evidence.
     

  3. Improved the IA by no longer treating the summary and pasted text as two switchable views, modes, or options, and by removing the link to switch between them.
     

  4. Improved the overall visual representation of the findings and navigation by using a card pattern that classifies each type of finding and provides an entry point to learn more, instead of a flat list of text.

IMG_7901_edited.jpg
Version 1 - Pasted Text
  1. Partial information - If no changed pasted text was found, that still needs to be conveyed.

IMG_7902_edited.jpg
Version 2 - Pasted Text
  1. Complete information - Added an indication of how much unchanged and changed pasted text was found.

  2. Added a heading conveying how much pasted text was found.

  3. Aligned the visual style and interaction with how findings are shown on the report landing page.

IMG_7902_edited.jpg
Version 1 - Version History
  1. Navigating through the history does not clearly tell the story of the writing process
     

  2. Disconnect between findings for review and the version history: Instructors struggled to connect the summary flags with the story shown in the version history
     

  3. Clicking through the version history and searching for changes is cumbersome

IMG_7903_edited.jpg
Version 2 - Version History
  1. Introduced the concept of a timeline that can be played back to see how the entire work came together
     

  2. Each event, whether suggesting potential misconduct or benign but significant, is shown on the timeline
     

  3. The timeline tells the entire story from beginning to end through the important events, removing the disconnect present in the version history pattern

IMG_7903_edited.jpg

Feedback on the second version was far more positive, with more votes than the first, so we decided to move forward with it.

Shaping the second version to completion

Role - Lead

Approach - Design Sprinting 

Duration of each sprint - 1 week

Time - 3 weeks​

The sprint structure consisted of seven major tasks:

  1. Synthesize findings (solo + team)

  2. Pick the themes and problems within them to be solved

  3. Brainstorm and ideate on solutions

  4. Review new designs with the UX team

  5. Incorporate feedback

  6. Generate the prototype and set up the test in Userlytics

  7. Initiate testing

Sprint tasks.png

In week 1, our success rate on tasks related to both workflow actions and comprehension was 92%, which was a significant increase from the previous version.

 

In week 2, we explored more granular improvements, which improved comprehension of the pasted text events and the usefulness of the timeline.

 

Finally, in week 3, we marked improvements in comprehension of transcription and attribution events, garnering praise like:

 

“This was really informative. I haven't seen anything like [this report] before where it can give you a lot of useful information. I thought it was quite fantastic. This would make the [academic integrity] hearings go a lot quicker if you can present this kind of irrefutable evidence.”

Inbox - A Clear Starting Point

​Professor Greene is reviewing the inbox after students completed a 3-4 page essay that focuses on different species within the Mongoose family. She surveys the inbox and decides to quickly review Robert Pettigrew’s, because it has 2 flags, and she wants to learn more.​​

Screenshot 2025-10-07 at 3.03_edited.jpg

Key Details
 

  1. Instructors clearly understand from their inbox which submissions might contain integrity issues through Flags and the expanded view.
     

  2. Decided to keep only process-related insights and not include Similarity- and AI-related insights, as segregating all the reports will be handled by a separate project

Writing Report

Professor Greene sees the final submission with a writing report and notices the two flags mentioned in the inbox: Writing process and Reliance on pasted text. 

Writing report - Default.png

Key Details

Comprehension 

  1. Organized the data into a more structured format by introducing the Flags, Findings, and Observations architecture:

    • Flags are triggered if certain noteworthy findings are discovered during the writing process

    • Findings are calculated from raw data as well as certain events captured during the writing process

    • Observations are all the findings and events, each meaningful in itself when presented in the right place

  2. Included copy that explains each finding and every number in easy-to-understand language
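The Flags → Findings → Observations architecture above can be sketched as a simple data model. This is an illustrative sketch only; the class and field names are my assumptions, not Turnitin's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Observation:
    """A single meaningful event captured during writing (e.g. a paste)."""
    kind: str          # e.g. "pasted_text" (illustrative event name)
    detail: str

@dataclass
class Finding:
    """An insight computed from raw data and captured events."""
    name: str
    noteworthy: bool   # only noteworthy findings can trigger a flag
    evidence: list[Observation] = field(default_factory=list)

@dataclass
class Flag:
    """Raised when noteworthy findings are discovered in the process."""
    name: str
    findings: list[Finding]

def raise_flags(findings: list[Finding]) -> list[Flag]:
    """Trigger one flag per noteworthy finding."""
    return [Flag(name=f.name, findings=[f]) for f in findings if f.noteworthy]
```

The key design point this captures is the hierarchy: raw observations back findings as evidence, and only noteworthy findings surface as flags in the instructor's inbox.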

 

Cohesiveness

  1. Different portions of the report convey the same story regardless of where you start interacting

  2. The playback timeline is always present as an integral part of the report, instead of being shown only within version history or time spent

  3. The playback timeline and the findings in the side panel work hand in hand; interacting with either yields the same result

Flags

Flags surface potential integrity concerns within the context of student writing, helping you quickly identify where closer review may be needed.

Screenshot 2025-09-30 at 2.32.12 PM.png

Key Details
 

​Writing process flags highlight unusual patterns in writing time or revision effort.

  1. Writing Time
    This finding indicates that the student wrote this paper much faster than expected for a paper of its length
     

  2. Shorter writing time than cohort
    This finding indicates that the student spent less time writing this draft than other students in the class.
     

  3. Minimal Revision
    Professor Greene wants to know more about why the writing time was shorter, so she expands the Minimal revision finding and sees that the majority of the student’s work came from pasted text and from typed words, but very little of the student’s writing has been revised, which could be a sign that the student was typing content from an external source.
     

  4. Reliance on Pasted text
    Reliance on pasted text flags show how much a student relied on pasted content.

    • Pasted text ​in final document
      This finding indicates that more than 50% of the final word count comes from pasted text.
       

    • Significant pasting throughout the writing process
    This finding indicates that the words pasted by the student during the writing process make up more than 40% of the total words
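The two pasted-text findings above amount to simple threshold checks. As a minimal sketch, assuming word counts are already available per submission (the 50% and 40% cutoffs come from the report; the function and parameter names are illustrative):

```python
FINAL_PASTED_THRESHOLD = 0.50    # share of final word count from pasted text
PROCESS_PASTED_THRESHOLD = 0.40  # share of all words that were pasted during writing

def pasted_text_findings(pasted_in_final: int, final_words: int,
                         pasted_during_process: int, total_words: int) -> list[str]:
    """Return the pasted-text findings that apply to a submission."""
    findings = []
    if final_words and pasted_in_final / final_words > FINAL_PASTED_THRESHOLD:
        findings.append("Pasted text in final document")
    if total_words and pasted_during_process / total_words > PROCESS_PASTED_THRESHOLD:
        findings.append("Significant pasting throughout the writing process")
    return findings
```

Note that the two checks are independent: a student could paste heavily during drafting yet rewrite most of it before submitting, triggering only the second finding.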

Observations

The Observations tab helps you quickly understand how the student crafted their submission. It is a master list of all the events, suspicious or not, that occurred during the writing process. Currently, the only tracked event type is the 'pasted text event', logged when the student pastes in a block of text. In the future, events like continuous typing, which could indicate transcription, will also be tracked.

Screenshot 2025-10-07 at 4.46.26 PM.png

Writing Process Information

This section provides an overview of writing metrics with details that summarize how a student developed their final document.

  1. Total writing time reflects only the time the student actively engaged with the document through clicking or typing, not through passive actions like scrolling or idle time.
     

  2. Number of writing sessions is the number of times the student started and stopped activity in the document, by entering and leaving the writing space or by becoming inactive for an extended period.
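These two metrics could be derived from a timestamped log of active events (clicks and keystrokes). The sketch below is an assumption about how such a computation might work, not the product's actual logic; in particular, the 30-minute inactivity cutoff is invented for illustration.

```python
IDLE_CUTOFF = 30 * 60  # seconds of inactivity that ends a session (assumed value)

def writing_metrics(event_times: list[float]) -> tuple[float, int]:
    """Compute (active writing time in seconds, session count) from
    timestamps of active events; idle gaps are not counted as writing time."""
    if not event_times:
        return 0.0, 0
    times = sorted(event_times)
    active, sessions = 0.0, 1
    for prev, curr in zip(times, times[1:]):
        gap = curr - prev
        if gap >= IDLE_CUTOFF:
            sessions += 1        # extended inactivity starts a new session
        else:
            active += gap        # only engaged time counts toward the total
    return active, sessions
```

This mirrors the definitions above: idle time between sessions inflates neither the total writing time nor the activity record.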

Pasted Text Findings

This section identifies moments when content was pasted into the document from outside of the writing space.
 

Playback Interaction

Responsive Designs

Class size - Extra Large - 1600+ dp

Desktop, Ultra-wide monitors

Extralarge.png
Class size - Expanded - 840-1199 dp
Extended-1.png
Extended-2.png
Class size - Medium - 768-1024 dp

Tablet in portrait, Phone in landscape

Medium 1.png
Medium 2.png
Medium 3.png
Medium 4.png
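The class sizes above can be expressed as a width-to-class mapping. A sketch under the breakpoints stated in the headings (checked from largest down, since the stated Medium and Expanded ranges overlap; the "compact" fallback below 768 dp is my assumption, not a class named in this document):

```python
def window_class(width_dp: int) -> str:
    """Map a viewport width in dp to the layout class used by the report."""
    if width_dp >= 1600:
        return "extra-large"   # desktop, ultra-wide monitors
    if width_dp >= 840:
        return "expanded"
    if width_dp >= 768:
        return "medium"        # tablet in portrait, phone in landscape
    return "compact"           # assumed fallback below the listed classes
```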

Playback Interaction for Mobile

Handling the timeline for tablet and mobile

Onboarding

Accessibility

Annotations

Instructor Report Fast Follows

  • AI detection in pasted text

  • Submission flow updates

  • Display changes for changed pasted text

Appendix: Laying the Groundwork: Research That Shaped the Problem and the Path Forward


What should we build?

To establish proof of process, our product could do many things, so we tried to focus on what we should do. These were the fundamental questions to be answered by the foundational research.

  • What information could be used to illuminate integrity vs. process?

  • What could be used for both?

  • What information or data would be most relevant or most controversial?

  • How alike are instructors and students when they think about these data points?

Concept Testing

To answer these questions, various concepts were put in front of instructors and students, in the form of the cards shown below, to understand whether one or more product concepts resonated with the target audience and aligned with their needs and expectations.

Research Purpose

Students
  1. Understand what data on student process / provenance is most and least valuable for the students

  2. Determine if there are any types of data that the students want that we have not yet considered

  3. Understand student attitudes around sharing details of their writing process with instructors

Instructors
  1. Understand what data on student process / provenance is most and least valuable for the instructors

  2. Determine if there are any types of data that the instructors want that we have not yet considered

Concepts

Concept testing ideas.png

Deciding where we fall on the spectrum of detail

Feedback gathered from various instructors and students revealed that the product they were looking for sat somewhere between detection and surveillance.

Screenshot 2025-08-26 at 4.39.46 PM.png
Screenshot 2025-08-26 at 4.45.29 PM.png

Chosen candidates for the first milestone

The top three candidates considered most helpful to instructors and students in capturing proof of process:

  1. Pasted Text: Plenty of positive sentiment. Seen as a key indicator of academic misconduct. Participants also found value in helping students find pasted content they may have forgotten and recall lost sources
     

  2. Revision History: Students and instructors both see this as invaluable to show proof of process as well as helping with their writing growth
     

  3. Time Spent: Many participants (students and instructors) viewed a short time spent as a flag for a possible integrity issue

REFLECTIONS AND CONCLUDING NOTES

REFLECTIONS

NEXT STEPS
