Open to AI + Data Engineering roles

Building production-ready systems as a Data Engineer

I design end-to-end data products where interaction data becomes actionable insight in near real time.

About Me

From raw data to clear, trusted decisions.

I build reliable data systems that combine analytics engineering, platform thinking, and GenAI workflows to turn fragmented data into measurable business outcomes.

Skills
Python · SQL · Databricks · Airflow · AWS · Azure · Spark · Spark SQL · PySpark · Delta Lake · Kafka · ClickHouse · dbt · Power BI · FastAPI · RAG · Data Quality · ETL Pipelines
Focus
Data products, platform observability, and analytics engineering with production-first reliability.
Explore
Projects, experience, certifications, contact, telemetry, and a local RAG assistant.
Pipeline

Streaming architecture in the product

How interactions move from the website into the analytics pipeline and back into the telemetry dashboard.

1. Capture: Web App (user interaction layer) captures user actions and event context.
2. Validate: Event Ingest (API validation layer) validates payloads and standardizes event shape.
3. Stream: Stream Buffer (Kafka / Redpanda) decouples producers from analytics consumers.
4. Query: Analytics Store (ClickHouse warehouse) stores events for low-latency analytical queries.
5. Insight: Telemetry View (operational dashboard) presents trends, metrics, and recent activity.
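The validation step above can be sketched in a few lines. This is a minimal illustration, not the site's actual ingest code: the field names (`event_type`, `session_id`, `event_id`) and the `validate_event` helper are assumptions chosen to show payload validation and shape standardization.

```python
import time
import uuid

# Illustrative schema: required fields an event payload must carry.
REQUIRED_FIELDS = {"event_type", "session_id"}

def validate_event(payload: dict) -> dict:
    """Reject malformed payloads, then return a standardized event shape."""
    missing = REQUIRED_FIELDS - payload.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return {
        # Keep a client-supplied event_id if present; it doubles as the
        # idempotency key used further down the pipeline.
        "event_id": payload.get("event_id") or str(uuid.uuid4()),
        "event_type": str(payload["event_type"]),
        "session_id": str(payload["session_id"]),
        "ingested_at": payload.get("ingested_at") or time.time(),
        "context": payload.get("context", {}),  # free-form event context
    }
```

In a FastAPI service this kind of function would typically sit behind the ingest endpoint, so every producer hands the stream buffer the same event shape.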
Ordering
Session-level ordering is preserved for reliable behavior traces.
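One common way to preserve session-level ordering in a Kafka-style buffer (a sketch of the general technique, not necessarily this pipeline's exact mechanism) is to key every message by `session_id`: all events for one session land on one partition, and a partition is consumed in order. A stable keyed partitioner can be mimicked with `crc32`:

```python
from zlib import crc32

def partition_for(session_id: str, num_partitions: int = 6) -> int:
    """Stable key-based partitioning: the same session always maps to the
    same partition, so per-session event order survives parallel consumers.
    crc32 is used instead of hash() because hash() is randomized per process."""
    return crc32(session_id.encode("utf-8")) % num_partitions
```

Cross-session ordering is deliberately not guaranteed; only the per-session trace needs to stay in order for behavior analysis.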
Idempotency
Unique event IDs protect metrics from retry duplicates.
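Retry-safe counting can be sketched as deduplication on the unique event ID before a metric is incremented. The in-memory `seen` set below is a stand-in for whatever dedup store a real pipeline would use (for example, a ClickHouse ReplacingMergeTree keyed on the event ID); the function name is illustrative.

```python
def count_unique(events, seen=None):
    """Count events idempotently: a retried event that reuses an
    already-seen event_id does not inflate the metric."""
    seen = set() if seen is None else seen
    count = 0
    for event in events:
        if event["event_id"] not in seen:
            seen.add(event["event_id"])
            count += 1
    return count
```

With this guard, an at-least-once delivery from the stream buffer can safely redeliver events: `count_unique([{"event_id": "e1"}, {"event_id": "e2"}, {"event_id": "e1"}])` counts the retried `e1` once.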
Freshness
Near real-time ingestion keeps dashboard signals actionable.