I am a Senior Data Engineer (Analytics) and continuous learner with 7 years of experience building scalable, AI-driven, data-intensive systems. Passionate about leveraging data and intelligent technologies to solve complex problems, I bring strong expertise in Python, SQL, DBT, Databricks, Airflow, ETL pipelines, and Tableau.

New England College (NEC)

University of Illinois Chicago

Thapar University


SpectraMD
Improved healthcare readmission prediction accuracy by 25% by building production-grade DBT pipelines on Snowflake, orchestrated with Airflow, delivering ML-ready datasets and Tableau analytics.
Collaborated with global data science, analytics, and QA teams to ensure consistent business logic and high-quality feature data.
Reduced infrastructure costs by $60K/month by optimizing storage using Apache Parquet.
Cut manual operational work by 30% by modernizing legacy ingestion into automated Airflow pipelines.
Improved compute efficiency by 50% and reduced errors by 80% by automating Tableau publishing and UAT validation for 40M+ row datasets.
Accelerated analytics delivery by 30% through reusable Python-based data quality frameworks.

SpectraMD
Reduced data processing time by 40% by building a scalable healthcare data lake on AWS S3 and developing ELT pipelines using DBT, Snowflake, and Airflow.
Improved analytics performance through STAR schema modeling and enabled business reporting via Tableau dashboards.
Designed an MS SQL Server data model with automated date-based backups and historical tracking, supporting visualization and monitoring of 800K+ monthly healthcare records to generate actionable business insights.

Volkswagen
Improved user experience by 20% by integrating vehicle equipment services through Spring Boot microservices on AWS while contributing to PRN platform development.
Increased online sales by 30% by building authentication workflows, search APIs, and newsletter features using AEM, Java, and React.
Reduced data processing latency by 20% by designing an optimized MongoDB architecture supporting real-time customer data for ~400K CMS users.

SpectraMD
Reduced data processing time by 20% by migrating from a monolithic system to a scalable ETL pipeline using Spring Batch and Talend, improving data integration into relational databases.
Improved data reliability and reduced manual errors by 20% by automating binary processing and data analysis workflows using Apache Camel and Jaspersoft, streamlining production releases and improving operational efficiency.
Email: deepaksinghal112@gmail.com
LinkedIn Profile →
"What doesn't kill you only makes you stronger."
— The Dark Knight Rises