‘Attention is All You Need’ Author Suggests LLMs ‘Reflect’ in Pre-Training

  • Published on April 14, 2025
  • In AI News

The study challenges the claim that reflection in AI models emerges only after fine-tuning or reinforcement learning. 

Essential AI, a startup founded by Ashish Vaswani—co-author of the landmark ‘Attention Is All You Need’ paper that introduced transformers—released a study a few weeks ago, titled ‘Rethinking Reflection in Pre-Training’.

The research reveals that an AI model’s capacity for self-reflection on its reasoning arises during pre-training itself, rather than through fine-tuning or reinforcement learning, as is commonly assumed. 

By testing an AI model (OLMo-2) at various stages of training using tasks with intentional errors, the researchers discovered that reflection naturally emerges during the training process. 

The researchers created datasets across different domains such as mathematics, coding, logical reasoning, and knowledge acquisition. These datasets contained deliberately modified chain-of-thought (CoT) reasoning paths with introduced errors like arithmetic mistakes and logical inconsistencies. They also tested models on their ability to correct their incorrect reasoning.

A key finding was that reflection could be activated using simple and natural language triggers. 

Interjections like “wait” prompted even partially trained models to pause, recognise, and correct errors in the reasoning paths. 
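The adversarial setup described above can be sketched as follows. This is a minimal illustration only; the function names, prompt format, and error-injection scheme are assumptions for clarity, not the study’s actual pipeline:

```python
# Illustrative sketch of an adversarial chain-of-thought (CoT) probe:
# plant an arithmetic error in a reasoning trace, then append the
# "Wait," interjection used as a reflection trigger.
# (Hypothetical example; the study's datasets and prompts differ.)

def make_adversarial_cot() -> str:
    """Build a CoT with a deliberate arithmetic error and a 'Wait,' trigger."""
    flawed_cot = (
        "Q: What is 17 * 24?\n"
        "Step 1: 17 * 20 = 340\n"
        "Step 2: 17 * 4 = 58\n"   # planted error: 17 * 4 is actually 68
        "Step 3: 340 + 58 = 398\n"
        "Answer: 398\n"
    )
    return flawed_cot + "Wait,"

def shows_reflection(continuation: str) -> bool:
    """Crude check: did the model's continuation fix the planted error?

    The correct intermediate is 68 and the correct final answer is 408.
    """
    return "68" in continuation and "408" in continuation

prompt = make_adversarial_cot()
```

A model that has developed reflection would be expected to continue the prompt by spotting the flawed step (“17 * 4 = 68, so the answer is 408”), whereas a model without it would carry the error forward.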

“For instance, an OLMo-2 7B model pre-trained on four trillion tokens displays self-correction on our six self-reflection tasks,” read a section of the study. 

The study also revealed that as models underwent more training, their ability to identify mistakes and correct reasoning steadily improved.

The startup has also published a technical report outlining the research methodology and results. 

Essential AI emerged from stealth mode in December 2023, raising $56.5 million in a funding round led by Google, Thrive Capital, AMD, and others. The startup is focused on building ‘full-stack AI products’, including LLMs that increase productivity in ‘monotonous’ workflows. 

Niki Parmar, who also co-authored the ‘Attention Is All You Need’ paper, co-founded the startup with Vaswani. However, she recently joined the AI startup Anthropic. 

‘Attention Is All You Need’ was a research paper published by Google in 2017 that introduced the ‘Transformer’ architecture, which serves as the backbone for most, if not all, large language models today. 

Supreeth Koundinya

Supreeth is an engineering graduate who is curious about the world of artificial intelligence and loves to write stories on how it is solving problems and shaping the future of humanity.
