Senior Data Integration Engineer – Snowflake & Databricks

ComboCurve

United States
Posted on Jul 17, 2025
Team: Data & Data Integrations | Location: Houston, TX or Remote (US) | Type: Full‑time

About ComboCurve

ComboCurve is the modern cloud-based solution for oil & gas companies managing asset portfolios. Our intuitive platform supports engineering, finance, and strategy teams with collaborative, data-driven forecasting, economics, and reserves workflows. We're backed by top-tier energy-focused investors and trusted by leading E&P operators.

What You'll Do

You will own the full lifecycle of inbound and outbound data integrations between the ComboCurve Data Domain and our customers' analytics environments. Partnering with product, data, and customer-facing teams, you will design and build high-performance pipelines, data models, and sharing solutions on Snowflake and Databricks—enabling clients to consume, analyze, and operationalize oil and gas datasets at scale.

· Architect curated and reusable data sharing patterns.

· Develop Python/SQL code, ELT jobs, and automation workflows that meet stringent performance SLAs.

· Model relational and semi-structured data (OLTP, OLAP, facts/dims, schema‑on‑read) to support analytics.

· Tune and optimize cluster, warehouse, and query performance; profile costs and recommend right-sizing.

· Secure and govern data using Unity Catalog, Snowflake roles, OAuth, and least-privilege patterns.

· Collaborate with customer data teams to troubleshoot connectivity, schema drift, and data‑quality issues.

Core Responsibilities

· Data Modeling & Lakehouse Design (OLAP, curated, and raw zones)

· Snowflake Architecture, Warehousing, and Data Sharing

· Databricks Architecture (Workspace, Unity Catalog, Delta Live Tables)

· Performance Tuning & Cost Optimization

· Advanced SQL Development & Code Review

· Cloud Integrations & Automation (GCP preferred; AWS/Azure a plus)

· ETL/ELT Pipeline Development & Orchestration (Airflow, Workflows, dbt, or similar)

Soft Skills

· Problem Solving & Analytical Thinking

· Clear, Concise Communication—written and verbal

· Adaptability in a fast-growing environment

· Collaboration and mentoring across multi-disciplinary teams

Required Skills & Experience

· 5+ years in data engineering or integration engineering with both Snowflake and Databricks.

· Proven delivery of bi-directional data collaboration (shares, Delta Live Tables, CDC, REST APIs).

· Expert Python scripting and advanced SQL.

· Deep experience with relational (Snowflake, Postgres, MSSQL) and NoSQL (MongoDB) stores.

· Hands-on with transactional data lakes (Delta, Iceberg) and schema‑drift handling.

· Solid grasp of traditional dimensional modeling and lakehouse zoning/modeling concepts.

· Ability to work autonomously with minimal supervision and collaborate closely within agile squads.

· Strong understanding of the oil‑and‑gas data domain (well hierarchies, production, forecasts).

· Excellent documentation habits and Git‑based workflow proficiency.

Preferred Skills & Experience

· Apache Flink for streaming enrichment.

· Iceberg table format experience beyond Delta.

· Advanced Apache Spark (performance tuning, structured streaming).

· DevOps background (Terraform, CI/CD, container orchestration).

Why Join ComboCurve?

· We're shaping the next generation of O&G decision-making tools.

· You'll work in a high-impact, high-autonomy environment with some of the brightest minds in energy tech.

· Competitive salary, equity, benefits, and a team that wants you to grow.

ComboCurve is an equal opportunity employer. We celebrate diversity and are committed to building a team that represents a variety of backgrounds, perspectives, and skills.