
Edtech leader PhysicsWallah has introduced its first Small Language Model (SLM), aptly named Aryabhata 1.0, tailored specifically for mathematics preparation for competitive exams like JEE Mains. This innovation reflects the company’s deeper push into AI-driven, domain-specific learning tools that are both resource-efficient and highly accurate.
Co-founder Prateek Maheshwari shared the update on LinkedIn, stating that Aryabhata 1.0—trained using just a single H100 GPU—has already delivered stellar results. The model achieved 86% accuracy in the January 2024 JEE Mains (Maths) and an impressive 90.2% in the April session, surpassing the performance of several larger, more resource-intensive models.
The model was trained on more than 130,000 meticulously curated question-and-answer pairs developed by the PhysicsWallah team. Paired with advanced fine-tuning techniques, this dataset is intended to help Aryabhata 1.0 emulate pedagogically sound reasoning patterns and sharpen problem-solving abilities for learners targeting engineering entrance exams.
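For readers curious about what "fine-tuning on curated question-and-answer pairs" can look like in practice, the sketch below shows a generic supervised fine-tuning setup using Hugging Face transformers. The base model name, data file, and hyperparameters are illustrative placeholders only; they are not details disclosed by PhysicsWallah and do not reflect its actual pipeline.

```python
# Illustrative sketch: supervised fine-tuning of a small causal language model
# on question-answer pairs. All names and settings below are placeholders,
# not PhysicsWallah's actual configuration.
import json
import torch
from torch.utils.data import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          Trainer, TrainingArguments)

MODEL_NAME = "Qwen/Qwen2.5-Math-1.5B"  # placeholder base model


class QADataset(Dataset):
    """Wraps JSONL records of the form {"question": ..., "answer": ...}."""

    def __init__(self, path, tokenizer, max_len=1024):
        with open(path, encoding="utf-8") as f:
            self.examples = [json.loads(line) for line in f]
        self.tokenizer = tokenizer
        self.max_len = max_len

    def __len__(self):
        return len(self.examples)

    def __getitem__(self, idx):
        ex = self.examples[idx]
        text = f"Question: {ex['question']}\nAnswer: {ex['answer']}"
        enc = self.tokenizer(text, truncation=True, max_length=self.max_len,
                             padding="max_length", return_tensors="pt")
        input_ids = enc["input_ids"].squeeze(0)
        attention_mask = enc["attention_mask"].squeeze(0)
        # Standard causal-LM objective: predict the next token; ignore padding.
        labels = input_ids.clone()
        labels[attention_mask == 0] = -100
        return {"input_ids": input_ids,
                "attention_mask": attention_mask,
                "labels": labels}


tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

model = AutoModelForCausalLM.from_pretrained(MODEL_NAME,
                                             torch_dtype=torch.bfloat16)

train_ds = QADataset("curated_qa_pairs.jsonl", tokenizer)  # hypothetical file

args = TrainingArguments(
    output_dir="sft-math-slm",
    per_device_train_batch_size=4,
    gradient_accumulation_steps=8,   # emulates a larger batch on one GPU
    num_train_epochs=2,
    learning_rate=2e-5,
    bf16=True,
    logging_steps=50,
)

Trainer(model=model, args=args, train_dataset=train_ds).train()
```

The gradient-accumulation setting illustrates how a single GPU can emulate a larger effective batch size, one common way small teams keep a training run feasible on a single H100-class card.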
Named after the legendary Indian mathematician Aryabhata, the model symbolizes a fusion of India’s ancient mathematical legacy with cutting-edge artificial intelligence.
Looking ahead, PhysicsWallah plans to scale Aryabhata 1.0’s capabilities to tackle JEE Advanced and cover additional mathematical domains. Maheshwari also extended an open invitation to educators, developers, and researchers to engage with the model, test its effectiveness, and contribute feedback to shape future iterations.
With a base of over 10 million paying students, PhysicsWallah's foray into AI-driven education could redefine how competitive exam preparation is delivered in India.
