10+ years of experience building scalable data infrastructure on GCP, optimizing data pipelines, and implementing cutting-edge AI solutions.
With 10 years of experience as a Data Engineer on Google Cloud Platform (GCP) and 2 years as an AI Engineer, I specialize in building scalable data infrastructure, optimizing pipelines, and implementing machine learning models.
I bring a systematic approach to problem-solving and excel at collaborating with cross-functional teams to drive projects to success. Highlights include migrating 500 petabytes of data from BigQuery to BigLake, achieving an 80% cost reduction, and developing tools that streamline data processing workflows.
MCA from Visvesvaraya Technological University (VTU)
BCA from Bangalore University
Bengaluru, India
IISc PG Level Advanced Certification
Google Professional Data Engineer
Led migration of 800 pipelines from newsflow 1.0 to 2.0, transitioning from data warehouse to data mart architecture.
Developed robust API automation framework reducing pipeline development time from weeks to hours.
Implemented AI-powered SQL code feedback system for all PRs, enhancing code quality and deployment efficiency.
Built an end-to-end pipeline to replace a paid third-party tool, handling data retrieval, processing, and delivery.
Created a Python IDE simulator that visualizes conceptual machine-state updates.
Computational Data Science
Data Engineer (Valid until 2025)
Big Data Technologies
I'm always interested in hearing about new opportunities and collaborations.
Bengaluru, India
I'm currently open to discussing new projects, innovative ideas, or opportunities to be part of your vision.