Hi, I'm Temi Lajumoke
I'm a Software & Machine Learning Engineer with over 8 years of experience building scalable, highly performant software systems across the stack: frontend web and mobile apps, distributed backend systems, deep learning models, and cloud-native infrastructure.
I currently work on distributed system runtimes and inference for Large Language Models at Amazon AI.
Beyond engineering, I compose relaxing fingerstyle guitar music and recharge in the great outdoors, whether it's kayaking on tranquil waters or snowboarding down groomed slopes.
Current Personal Research Focus
I'm fascinated by making AI more accessible and efficient. My current focus is building and training small language models (SLMs) that let AI agents run anywhere while retaining much of the capability of their larger counterparts. My work also explores orchestration and observability strategies for agentic systems built on these SLMs.
Recent (published) work includes a GPT-inspired decoder-only model and a ground-up recreation of the original Transformer architecture from the "Attention Is All You Need" paper.
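Both of those models are built around scaled dot-product attention, the core operation introduced in that paper. Here is a minimal illustrative sketch of it in plain NumPy; the function and variable names are my own for this example and are not taken from the published repositories:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention, as defined in "Attention Is All You Need".

    Q, K: (seq_len, d_k) query and key matrices; V: (seq_len, d_v) value matrix.
    Returns an array of shape (seq_len, d_v).
    """
    d_k = Q.shape[-1]
    # Similarity between every query and every key, scaled by sqrt(d_k).
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable softmax over the key dimension gives attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted sum of the value vectors.
    return weights @ V

# Example usage with random toy data.
rng = np.random.default_rng(0)
Q = K = V = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(Q, K, V)  # shape (4, 8)
```

A decoder-only (GPT-style) model additionally masks the scores so each position can only attend to earlier positions, but the core computation is the same.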