From Intention to Text: AI-Supported Goal Setting in Academic Writing
Summary
This paper presents WriteFlow, an AI voice-based writing assistant designed to support reflective academic writing through goal-oriented interaction, addressing limitations of efficiency-focused writing tools by scaffolding metacognitive regulation and goal articulation. Findings from a Wizard-of-Oz study with 12 expert users demonstrate that the system effectively supports iterative goal refinement and goal-text alignment during the drafting process.
# From Intention to Text: AI-Supported Goal Setting in Academic Writing
Source: https://arxiv.org/html/2604.15800
Department of Media Technology and Interaction Design, KTH Royal Institute of Technology, Stockholm, Sweden
Department of Digital Learning, KTH Royal Institute of Technology, Stockholm, Sweden
Digital Futures, Stockholm, Sweden
Emails: yueling@kth.se, {rldavis,oviberg}@kth.se

###### Abstract
This study presents WriteFlow, an AI voice-based writing assistant designed to support reflective academic writing through goal-oriented interaction. Academic writing involves iterative reflection and evolving goal regulation, yet prior research and a formative study with 17 participants show that writers often struggle to articulate and manage changing goals. While commonly used AI writing tools emphasize efficiency, they offer limited support for metacognition and writer agency. WriteFlow frames AI interaction as a dialogic space for ongoing goal articulation, monitoring, and negotiation grounded in writers' intentions. Findings from a Wizard-of-Oz study with 12 expert users show that WriteFlow scaffolds metacognitive regulation and reflection-in-action by supporting iterative goal refinement, maintaining goal–text alignment during drafting, and prompting evaluation of goal fulfillment. We discuss design implications for AI writing systems that prioritize reflective dialogue, flexible goal structures, and multi-perspective feedback to support intentional and agentic writing.
## 1 Introduction
Academic writing is a cornerstone of higher education, serving not only as a medium for assessment but also as a powerful engine for learning, knowledge construction, and intellectual development. Rather than reproducing information, students engage in writing as a process of knowledge transformation, negotiating the dynamic interplay between rhetorical challenges (how to communicate ideas) and content-related challenges (what to communicate)[4](https://arxiv.org/html/2604.15800#bib.bib43). Through this process, tacit, experiential, and fragmented knowledge can be externalized, refined, and made transferable across contexts[23](https://arxiv.org/html/2604.15800#bib.bib45). Academic writing further fosters strategic and self-regulated cognitive skills[12](https://arxiv.org/html/2604.15800#bib.bib44), including goal setting, planning, monitoring, and revising—capabilities that are critical for scholarly inquiry, professional practice, and lifelong learning. However, despite its importance, many students struggle to meet the demands of academic writing[17](https://arxiv.org/html/2604.15800#bib.bib7),[20](https://arxiv.org/html/2604.15800#bib.bib6), highlighting a persistent gap between the recognized importance of writing and students' ability to effectively engage in it.
With the rise of large language models (LLMs), the ways students engage in academic writing practices have started to change. For example, systems such as ChatGPT show strong reasoning and open-ended text generation capabilities[26](https://arxiv.org/html/2604.15800#bib.bib68),[2](https://arxiv.org/html/2604.15800#bib.bib69), and have become increasingly embedded in students' learning[11](https://arxiv.org/html/2604.15800#bib.bib4). Yet, growing evidence suggests that reliance on such tools may undermine learning by encouraging cognitive offloading and reducing metacognitive engagement[9](https://arxiv.org/html/2604.15800#bib.bib75). In academic writing, these risks are particularly acute, since its value lies not only in text production but in sustained reflection, reasoning, and iterative goal revision[10](https://arxiv.org/html/2604.15800#bib.bib47),[12](https://arxiv.org/html/2604.15800#bib.bib44).
Academic writing is a recursive and cognitively demanding process requiring writers to manage evolving goals, integrate cross-sectional ideas, and continually evaluate and revise arguments[10](https://arxiv.org/html/2604.15800#bib.bib47),[12](https://arxiv.org/html/2604.15800#bib.bib44). However, most commercial LLM interfaces are optimized for linear, turn-based dialogue, which poorly aligns with non-linear writing processes. Revisiting earlier reasoning or managing multiple concurrent goals is cumbersome, often resulting in surface-level interactions that limit metacognitive regulation, which is a strong predictor of academic success[24](https://arxiv.org/html/2604.15800#bib.bib1).
These challenges can be seen through the lens of self-regulated learning (SRL), which emphasizes learners' active metacognitive regulation across forethought, performance, and reflection phases[30](https://arxiv.org/html/2604.15800#bib.bib80), with goal setting as a central mechanism[31](https://arxiv.org/html/2604.15800#bib.bib33). In academic writing, goals are continuously revised as ideas evolve, making dynamic goal regulation essential for preserving the epistemic value of writing with AI.
Recent AI-driven writing tools have begun incorporating an SRL lens to support self-reflection and critical evaluation[15](https://arxiv.org/html/2604.15800#bib.bib77),[25](https://arxiv.org/html/2604.15800#bib.bib78), yet do not explicitly support iterative goal adjustment. This study aims to fill this gap by examining how AI-supported systems can scaffold reflection and goal setting in academic writing. We present WriteFlow, a Google Docs–based writing assistant co-designed to support reflection-in-action[21](https://arxiv.org/html/2604.15800#bib.bib13) through AI-mediated dialogue, structured goal generation, and goal-alignment tracking. Building on prior work showing conversational interaction as a powerful design resource for fostering reflection[3](https://arxiv.org/html/2604.15800#bib.bib15), WriteFlow reconceptualizes chat-based interaction as a space for goal negotiation and metacognitive regulation.
This study contributes to research on AI-supported academic writing by (1) providing a formative account of students' writing goal–related stress and current patterns of LLM use; (2) introducing WriteFlow, a voice-based AI writing assistant that scaffolds metacognition and self-regulation through iterative construction and monitoring of writing goals; (3) providing empirical evidence of how WriteFlow supports tracking alignment between stated goals and emerging text; and (4) deriving design implications for human–AI writing systems that support evolving goal setting and metacognitive scaffolding.
## 2 Background
### 2.1 Goal Setting for Self-Regulated Academic Writing
Academic writing is guided by hierarchically structured goal structures that include abstract intentions (e.g., audience awareness) and concrete subgoals (e.g., revising a paragraph)[10](https://arxiv.org/html/2604.15800#bib.bib47),[12](https://arxiv.org/html/2604.15800#bib.bib44). Expert writers generate richer goal structures and flexibly revise them as task constraints evolve[12](https://arxiv.org/html/2604.15800#bib.bib44). Research shows planning and quality-oriented goals improve text quality and promote higher-level revision behavior[1](https://arxiv.org/html/2604.15800#bib.bib30). Specific, proximal, and appropriately challenging goals enhance metacognition, motivation, and overall writing performance[22](https://arxiv.org/html/2604.15800#bib.bib34),[19](https://arxiv.org/html/2604.15800#bib.bib31),[5](https://arxiv.org/html/2604.15800#bib.bib32). Process-oriented goals are particularly effective; compared to product-focused goals, process goals combined with feedback support learning of writing strategies, strengthen self-efficacy, and promote transfer[22](https://arxiv.org/html/2604.15800#bib.bib34). Although automated writing evaluation systems provide personalized feedback[13](https://arxiv.org/html/2604.15800#bib.bib24), they offer limited support for reflection on writers' underlying intentions or monitoring evolving goal structures during the writing process.
### 2.2 AI Writing Tools and Metacognitive Support
Recent AI writing tools have increasingly incorporated support for metacognitive processes during academic writing. Systems such as VISAR[29](https://arxiv.org/html/2604.15800#bib.bib61) enable writers to construct hierarchical goal structures during the planning phase, helping them articulate and organize content goals before drafting begins, though research indicates such interfaces can increase cognitive load[18](https://arxiv.org/html/2604.15800#bib.bib60). Other tools focus on supporting metacognitive reflection during revision and feedback stages. Friction[28](https://arxiv.org/html/2604.15800#bib.bib59) helps writers formulate actionable revision goals when editing existing drafts, while ALure[16](https://arxiv.org/html/2604.15800#bib.bib56) scaffolds self-regulated learning through structured prompts that encourage writers to reflect on their strategies and progress. Reverse outlining approaches[7](https://arxiv.org/html/2604.15800#bib.bib55) enable retrospective assessment of whether written text aligns with intended structure, promoting reflective revision. Research on feedback timing suggests that continuous, in-action feedback better supports learning than post-hoc evaluation alone[14](https://arxiv.org/html/2604.15800#bib.bib58),[8](https://arxiv.org/html/2604.15800#bib.bib57), though such approaches also risk fostering dependency if not carefully designed.
Despite these advances in metacognitive support in planning and revision stages, existing tools do not help writers track the evolving relationship between their stated goals and emerging text during the drafting process itself. Writers lack explicit mechanisms to notice when drift occurs—that is, when the text being produced no longer serves the goals originally intended. This gap is particularly acute in AI-assisted writing contexts, where generated content may subtly pull writers away from their intentions without them recognizing the misalignment until substantial revision is required. Writers need support not only for setting goals (planning tools) and evaluating completed text (revision tools), but also for maintaining awareness of goal-text alignment throughout drafting, enabling them to decide whether to realign their text with original goals or intentionally revise those goals in light of emerging insights.
Addressing this gap, we present empirical findings on how an AI-based writing assistant can be designed to support metacognitive scaffolding through iterative goal setting and monitoring. This study consists of (1) a formative study informing the co-design of WriteFlow and (2) an expert user evaluation.
## 3 Formative Study
To understand how adult writers use AI when addressing academic writing challenges, we conducted a survey with 17 students (8 female, 8 male, 1 non-binary; aged 22–34). The open-ended online survey[1], administered between May 22 and 27, 2025, included ratings of ten writing challenges grounded in the Cognitive Process Theory of Writing, which posits that writing is a non-linear, goal-directed, and recursive mental process rather than a strictly staged product[12](https://arxiv.org/html/2604.15800#bib.bib44), along with questions about coping strategies and AI use. Responses were analyzed using Reflexive Thematic Analysis[6](https://arxiv.org/html/2604.15800#bib.bib20). The study identified three key writing challenges and 20 coping strategies (detailed descriptive statistics and thematic analysis results are available in our OSF repository[2]). The rating data showed that the most stressful challenges were **evolving writing goals** and **setting writing goals**, reflecting difficulties in revising plans as new ideas and sources emerge. Participants reported using outlining, documentation, and AI tools to track evolving ideas, test structural changes, and maintain alignment with central arguments. These strategies supported reflection and reduced uncertainty during revision. In their open-ended responses, participants also highlighted a third key challenge: **preserving authorship**. Some participants (n = 6) reported primarily using ChatGPT in a human-in-the-loop manner, leveraging it for ideation, tone refinement, and feedback interpretation. However, others (n = 9) expressed concerns about overreliance, authenticity, and ownership.
Based on these findings, we derived five design requirements (R) for AI-supported reflective academic writing: (R1) facilitate goal articulation; (R2) support iterative goal refinement; (R3) enable organization and revisiting of ideas relative to goals; (R4) preserve writer voice and meaning; and (R5) ensure AI feedback is transparent, revisable, and aligned with user intent. Together, these requirements underscore the centrality of human judgment and agency in AI-assisted academic writing.
**Figure 1:** An overview of WriteFlow, a Google Docs add-on for goal-oriented academic writing. The WriteFlow interface consists of a voice agent (A) and a sidebar panel with three pages: Writing Task (B), AI Chat (C), and My Goals (D). Users can upload Google Docs and communicate with the voice agent at any stage of writing to discuss their writing direction. The agent then generates writing goals to help them plan, track, and monitor their writing process.
## 4 System Overview
Figure 1 presents WriteFlow's workflow and interface[3]. Users provide writing requirements and upload drafts, which the system uses to contextualize task understanding. Through Voice Mode, users discuss their writing plans with an AI-mediated conversational agent, which generates writing goals aligned with their intentions. Goals are stored on the **My Goals** page, where users can track progress, receive targeted suggestions, and review post-completion evaluations. WriteFlow also provides an **Outline** view that supports creating, revising, and comparing multiple outline versions across drafting stages.
**Goal Setting and Monitoring.** WriteFlow supports planning and self-regulated writing by scaffolding goal articulation, refinement, and progress monitoring. During voice-based interaction, the system helps users externalize ideas and translates them into adaptive writing goals (R1, R3). Self-evaluation cards enable users to iteratively revise goals (R2), while progress tracking (R5) and suggestion cards support focused execution and sub-goal formation.
**Goal-Text Alignment Evaluation.** Central to WriteFlow's design is the Goal Completion Evaluation feature, which directly addresses the challenge of tracking alignment between evolving goals and emerging text. After goal completion, the system evaluates alignment between goals, outlines, and written content to support reflective revision and preservation of authorial intent (R4). The Outline view further supports flexible goal evolution by allowing users to create and compare multiple outline versions across drafting stages, enabling tracking of how writing plans change over time.
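The paper does not describe how Goal Completion Evaluation computes alignment (the study used a Wizard-of-Oz setup, so no automated scorer is reported). As a minimal sketch of the idea, a drift check could score each paragraph of the draft against a stated goal; here we use bag-of-words cosine similarity as a crude stand-in for whatever semantic comparison a production system would use, and the function name and threshold are our assumptions:

```python
import re
from collections import Counter
from math import sqrt


def _vec(text: str) -> Counter:
    """Bag-of-words vector over lowercase word tokens."""
    return Counter(re.findall(r"[a-z']+", text.lower()))


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def goal_text_alignment(goal: str, paragraphs: list[str],
                        threshold: float = 0.1) -> list[tuple[int, float, bool]]:
    """Score each paragraph against a goal; flag possible drift below threshold.

    Returns (paragraph_index, similarity, drifted) triples for the writer to
    review -- the decision to realign the text or revise the goal stays human.
    """
    g = _vec(goal)
    results = []
    for i, para in enumerate(paragraphs):
        score = cosine(g, _vec(para))
        results.append((i, round(score, 3), score < threshold))
    return results
```

Note that flagged paragraphs are surfaced as prompts for reflection rather than auto-corrected: as the section above argues, drift can be a signal either to realign the text or to intentionally revise the goal.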
## 5 User Evaluation
We used WriteFlow as a design probe and conducted an exploratory Wizard-of-Oz study to answer the following research questions: **RQ1.** In what ways does the use of WriteFlow support users' metacognitive goal-oriented processes during academic writing, and what design refinements are needed?
---
[1] Survey instrument, participant demographics, and detailed findings for the formative study: OSF repository, https://osf.io/ba6d2/overview?view_only=e69b00a3acfd42529d3fb2c9c10b1ef7
[2] Survey instrument, participant demographics, and detailed findings for the formative study: OSF repository, https://osf.io/ba6d2/overview?view_only=e69b00a3acfd42529d3fb2c9c10b1ef7
[3] WriteFlow ProtoPie prototype: https://cloud.protopie.io/p/3176a8c0ab9ad1f9e99b0910