Hey Everyone!
I am a Master’s student in Computer Science at the University of California, Santa Barbara, and a graduate of the Indian Institute of Technology (IIT) Palakkad, where I earned my Bachelor's degree in Computer Science. My current research focuses on Transformers and Large Language Models (LLMs), with particular interest in model efficiency, routing mechanisms, and architectural innovations such as Mixture-of-Experts (MoE), adapter-based tuning, and speculative decoding. I am broadly motivated by challenges at the intersection of language understanding, scaling laws, and structured reasoning in LLMs.

Previously, I worked on problems in Graph Machine Learning and Reinforcement Learning, and I continue to draw on that experience when designing structured attention mechanisms or analyzing inductive biases in large models. I am passionate about building scalable, interpretable learning systems that bridge theoretical foundations and real-world deployment.
Recent posts
Are Your Goroutines Treated Unfairly?
Go Runtime Scheduler
Scheduler as a Platform
Platform - Airasia
Winter of Code
My journey through the Winter of Code program