Accelerating Transformer Attention with Custom CUDA Kernels

As part of my GPU Architectures course, this project explores profiling and optimizing transformer attention mechanisms using custom CUDA extensions. The focus is on reducing inference and training latency through kernel-level optimizations that improve GPU resource utilization for deep learning workloads.
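The write-up itself contains no code, but a minimal sketch of the profiling step might look like the following, using PyTorch's built-in profiler against an unfused attention baseline. The tensor shapes and the naive_attention helper are illustrative assumptions for the sketch, not the project's actual configuration.

```python
import torch

def naive_attention(q, k, v):
    # Unfused baseline: materializes the full (seq_len x seq_len) score
    # matrix in global memory, the main target for kernel-level fusion.
    scale = q.size(-1) ** -0.5
    scores = torch.softmax((q @ k.transpose(-2, -1)) * scale, dim=-1)
    return scores @ v

# Illustrative shapes (assumed): batch 8, 12 heads, 1024 tokens, head dim 64.
device = "cuda" if torch.cuda.is_available() else "cpu"
q, k, v = (torch.randn(8, 12, 1024, 64, device=device) for _ in range(3))

with torch.profiler.profile(
    activities=[torch.profiler.ProfilerActivity.CPU,
                torch.profiler.ProfilerActivity.CUDA],
) as prof:
    naive_attention(q, k, v)

# Rank ops by GPU time to find the kernels worth replacing.
print(prof.key_averages().table(sort_by="self_cuda_time_total", row_limit=5))
```

A custom fused kernel would then be benchmarked against this baseline under the same shapes and profiler settings.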

February 2025 · Afnan Alabdulwahab

The Turing Test: Detecting LLM-Generated Content

As part of my Decoding LLMs course, this project investigates the challenges of detecting AI-generated text in academic settings. Our team explores state-of-the-art detection methods, benchmarks existing classifiers, and develops a detection model trained on diverse datasets. The goal is to enhance academic integrity by providing reliable AI-text detection tools.
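As a rough illustration of the benchmarking side, a classical baseline detector can be built with scikit-learn. The toy corpus and the character n-gram features below are assumptions made for the sketch, not the datasets or classifiers the project actually evaluates.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical toy corpus: label 1 = AI-generated, label 0 = human-written.
texts = [
    "The results demonstrate a statistically significant improvement.",
    "honestly i just crammed the night before and hoped for the best",
    "In conclusion, the aforementioned factors collectively contribute.",
    "my advisor said the draft was fine but i'm still rewriting it",
]
labels = [1, 0, 1, 0]

# Baseline detector: character n-gram TF-IDF features + logistic regression.
detector = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(),
)
detector.fit(texts, labels)

# Probability that a new passage is AI-generated, per the toy model.
print(detector.predict_proba(
    ["Furthermore, it is imperative to note the following."]
))
```

Stronger detectors in the literature fine-tune pretrained language models, but a simple pipeline like this gives a benchmark floor for comparison.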

February 2025 · Afnan Alabdulwahab

Speech Emotion Recognition

As part of my Deep Learning course, this project explores convolutional and recurrent neural networks for Speech Emotion Recognition (SER). Using the RAVDESS and TESS datasets, we train models to classify emotions from audio signals, with applications in human-computer interaction, mental health tools, and AI-driven affective computing.
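A minimal sketch of a CNN-plus-RNN model of this kind is shown below. The layer sizes and input shapes are assumptions rather than the project's tuned architecture, though the eight output classes match RAVDESS's emotion labels.

```python
import torch
import torch.nn as nn

class CNNRNNClassifier(nn.Module):
    # Illustrative sketch: a small CNN front end over mel-spectrogram frames
    # feeding a GRU; layer sizes here are assumptions, not tuned values.
    def __init__(self, n_mels=64, n_classes=8):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # halves both the mel and time axes
        )
        self.gru = nn.GRU(16 * (n_mels // 2), 128, batch_first=True)
        self.head = nn.Linear(128, n_classes)

    def forward(self, x):                       # x: (batch, 1, n_mels, time)
        f = self.conv(x)                        # (batch, 16, n_mels/2, time/2)
        f = f.permute(0, 3, 1, 2).flatten(2)    # (batch, time/2, features)
        _, h = self.gru(f)                      # final hidden state
        return self.head(h[-1])                 # emotion logits

model = CNNRNNClassifier()
spec = torch.randn(4, 1, 64, 200)  # dummy batch of mel-spectrograms
print(model(spec).shape)  # torch.Size([4, 8]): 8 RAVDESS emotion classes
```

The convolution extracts local spectro-temporal features, while the GRU models how those features evolve over the utterance before the final classification head.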

February 2025 · Afnan Alabdulwahab