Applied AI Projects: Driving Innovation with Large Models & Intelligent Workflows

My portfolio showcases a range of challenging AI projects where I’ve leveraged cutting-edge techniques, including Foundational Transformer Models, Large Language Models (LLMs), Multimodal AI, and Agentic Workflows, to deliver tangible impact. From enhancing human-computer interaction through advanced speech synthesis to building robust data-issue detection systems, these projects demonstrate my ability to translate complex AI research into practical, high-value solutions. Explore how I engineer intelligent systems to solve real-world problems.

Contact Me on LinkedIn


Clinical Data Review with Agentic AI (CluePoints)

At CluePoints, I develop agentic AI workflows for clinical data analysis. This project goes beyond traditional detection by ingesting clinical trial protocols and using them to drive the data-issue detection system, ensuring robust identification of inconsistencies in complex datasets.

  • Key Technologies: Agentic AI, LLMs (with potential applications in protocol processing), Clinical Data Analysis, Workflow Automation.
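The core idea of protocol-driven detection can be sketched in a few lines. This is a minimal, hypothetical illustration, not CluePoints code: the `ProtocolRule` type, the `detect_issues` function, and the blood-pressure clause are all invented for the example, standing in for rules that would in practice be derived from the trial protocol.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical sketch: a rule derived from the trial protocol drives detection.
@dataclass
class ProtocolRule:
    name: str
    check: Callable[[dict], bool]  # returns True when a record violates the rule

def detect_issues(records: list[dict], rules: list[ProtocolRule]) -> list[tuple[int, str]]:
    """Flag a (record index, rule name) pair for every violation found."""
    issues = []
    for i, record in enumerate(records):
        for rule in rules:
            if rule.check(record):
                issues.append((i, rule.name))
    return issues

# Example rule from an invented protocol clause:
# "systolic blood pressure must be recorded between 70 and 250 mmHg".
rules = [ProtocolRule("sbp_out_of_range",
                      lambda r: not (70 <= r.get("sbp", 0) <= 250))]
records = [{"sbp": 120}, {"sbp": 300}]
issues = detect_issues(records, rules)  # flags the second record
```

In the actual workflow, an LLM agent could extract such rules from free-text protocol sections rather than having them hand-coded.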

Learn More

AI-based Pronunciation Analysis & Feedback

Developed a pronunciation training system for English learners, leveraging a transformer-based foundational audio model (wav2vec) fine-tuned for duration-aware phoneme sequence prediction. This enabled highly accurate, data-efficient mispronunciation detection and personalized feedback.

  • Key Technologies: Transformer Models (Wav2Vec), Fine-tuning, Self-supervised Learning, Speech Representation Learning, AI in Education.
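In this system, the fine-tuned wav2vec model predicts the learner's phoneme sequence; the feedback step then compares that prediction against the reference pronunciation. The comparison step can be sketched with a standard-library sequence alignment (the phoneme symbols in the example are ARPAbet-style and the function name is illustrative):

```python
from difflib import SequenceMatcher

def mispronunciations(reference: list[str], predicted: list[str]) -> list[tuple[str, str]]:
    """Align the predicted phoneme sequence against the reference and
    return (expected, produced) pairs for each mismatch."""
    errors = []
    for op, i1, i2, j1, j2 in SequenceMatcher(a=reference, b=predicted).get_opcodes():
        if op == "replace":      # phoneme substituted
            errors.append(("/".join(reference[i1:i2]), "/".join(predicted[j1:j2])))
        elif op == "delete":     # phoneme omitted by the learner
            errors.append(("/".join(reference[i1:i2]), ""))
        elif op == "insert":     # extra phoneme produced
            errors.append(("", "/".join(predicted[j1:j2])))
    return errors

# "think" /TH IH NG K/ produced as /S IH NG K/ -> TH substituted by S
errors = mispronunciations(["TH", "IH", "NG", "K"], ["S", "IH", "NG", "K"])
```

Each `(expected, produced)` pair maps directly to a personalized feedback message, e.g. targeted practice for the TH sound.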

Learn More

Voice Generation with Controllable Emotional Expressiveness (PhD Project)

My PhD research focused on building controllable speech synthesis systems around an attention-based Deep Learning decoder that integrates text and emotional-expressiveness encoders. This work explored principles integral to modern generative AI for speech.

  • Key Technologies: Deep Learning, Attention-based Decoders, Text-to-Speech (TTS), Emotional AI, Generative AI Foundations.
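The attention mechanism at the heart of such a decoder is worth making concrete. The sketch below shows plain scaled dot-product attention over toy encoder states in pure Python; in the thesis system the decoder query would additionally be conditioned on the emotion-encoder embedding, which is omitted here for brevity.

```python
import math

def attention_weights(query: list[float], keys: list[list[float]]) -> list[float]:
    """Scaled dot-product attention: score each encoder state against the
    decoder query, then softmax the scores into a distribution over states."""
    scale = math.sqrt(len(query))
    scores = [sum(q * k for q, k in zip(query, key)) / scale for key in keys]
    m = max(scores)                          # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(query, keys, values):
    """Context vector: attention-weighted sum of the value vectors."""
    w = attention_weights(query, keys)
    dim = len(values[0])
    return [sum(w[i] * values[i][d] for i in range(len(values))) for d in range(dim)]

# Toy text-encoder states; the query matches the first state most strongly,
# so the context vector leans toward it.
keys = values = [[1.0, 0.0], [0.0, 1.0]]
context = attend([1.0, 0.0], keys, values)
```

At each decoding step, the context vector summarizes which part of the input the decoder should focus on while generating the next acoustic frame.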

Learn More

Sentiment Analysis & Emotion Recognition with Transformers

Pioneered a Transformer-based Joint-Encoding (TBJE) approach for multimodal Sentiment Analysis and Emotion Recognition, using a modular co-attention mechanism to fuse multiple modalities effectively.

  • Key Technologies: Transformer Architecture, Multimodal AI, Co-attention, Sentiment Analysis, Emotion Recognition, NLP.
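The co-attention idea can be sketched as symmetric cross-modal attention: each modality's tokens attend over the other modality's tokens. The snippet below is a simplified pure-Python illustration (the real TBJE model uses learned projections and multiple heads, none of which appear here):

```python
import math

def cross_attend(queries: list[list[float]], keys: list[list[float]]) -> list[list[float]]:
    """Each query token from one modality attends over the other modality's
    tokens (used here as both keys and values), as in co-attention."""
    scale = math.sqrt(len(queries[0]))
    out = []
    for q in queries:
        scores = [sum(a * b for a, b in zip(q, k)) / scale for k in keys]
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        total = sum(exps)
        w = [e / total for e in exps]
        out.append([sum(w[i] * keys[i][d] for i in range(len(keys)))
                    for d in range(len(keys[0]))])
    return out

# Toy example: text tokens attend to audio frames, and vice versa;
# the two fused streams would then feed a joint classifier head.
text = [[1.0, 0.0]]
audio = [[0.0, 1.0], [1.0, 1.0]]
text_attended = cross_attend(text, audio)   # text enriched with audio context
audio_attended = cross_attend(audio, text)  # audio enriched with text context
```

Running the attention in both directions is what makes the mechanism "co-" attention: each modality conditions its representation on the other before the joint encoding.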

Learn More

Emotional Voices Database (EmoV-DB)

Created an open-source emotional speech database (EmoV-DB) for training deep learning-based systems in emotional speech synthesis and analysis, addressing the critical lack of high-quality emotional data in the field.

  • Key Technologies: Deep Learning, Speech Synthesis, Emotional AI, Dataset Curation, Speech Analysis, Open-Source.

Learn More


Ready to Explore Advanced AI Solutions for Your Business?

Whether you’re looking to integrate Large Language Models, develop custom Agentic AI workflows, or leverage Multimodal AI for richer insights, I’m here to help.

Contact Me on LinkedIn