Lead Data Architect
Ascendum Solutions
Title: Databricks Technical Lead/Architect
Location: Cincinnati, OH or Charlotte, NC (hybrid; must be in office 4 days/week)
Duration: 2+ years, contract-to-hire.
• This is a contract-to-hire position, and candidates must be prepared to transition to a full-time employee role.
Note: Candidates must be eligible to work for any employer in the United States without visa sponsorship, now or in the future.
Candidates must be located near Cincinnati, OH, or Charlotte, NC, and able to work a hybrid schedule (in office Monday through Thursday).
Top 3 Skills: Databricks-certified technical leadership; data engineering, big data, or API development with Python; data domain experience.
• Microsoft Certified: Azure Solutions Architect Expert or Databricks Certified Data Engineer/Architect certification.
Job Description
Job Summary:
We are seeking a seasoned Databricks Technical Lead to lead the design, build, and optimization of our data platform, services, APIs, and cloud migrations.
In this product-centric role within an agile delivery framework, you will ensure our data solutions align with business objectives and deliver tangible value.
Required Qualifications:
- 10+ years of experience in data engineering, big data, or API development, with at least 3 years in a leadership role.
- Proven experience leading product-centric data engineering initiatives in an agile delivery environment.
- Expertise in Azure Databricks, Apache Spark, Azure SQL, and other Microsoft Azure services.
- Strong programming skills in Python, Scala, and SQL for data processing and API development.
- Experience in building and managing APIs (REST, GraphQL, gRPC) and microservices.
- Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, and Delta Lake.
- Proficiency in CI/CD pipelines, Terraform/Bicep, and Infrastructure-as-Code.
- Experience with data security and compliance measures (e.g., encryption, access control, auditing) for sensitive HR and employee data.
- Strong problem-solving skills, with a focus on performance tuning, security, and cost optimization.
- Experience with containerization (Docker, Kubernetes) and event-driven architecture is a plus.
- Exposure to Informatica for ETL/ELT and data integration.
- Excellent communication and leadership skills in a fast-paced environment.
Preferred Qualifications:
- Experience with agile development methodologies such as Scrum or SAFe.
- Familiarity with machine learning workflows in Azure Databricks.
- Knowledge of Azure API Management and Event Hub for API integration.
- Experience with Informatica PowerCenter or Informatica Intelligent Cloud Services (IICS).
- Hands-on experience with Oracle HCM database models and APIs, including integrating this data into enterprise data solutions.
- Experience with HR Analytics.
Key Responsibilities
Core Responsibilities:
- Develop and maintain the HR data domain in a secure, compliant, and efficient manner in accordance with best practices.
- Lead a data engineering team responsible for designing scalable, high-performance data solutions, APIs, and microservices in Azure Databricks, Azure SQL, and Informatica.
- Ensure the highest levels of security and privacy regarding sensitive data.
- This is a job for an exceptional professional who deeply understands big data processing, data architecture, cloud migrations, API development, data security, and agile methodologies in the Azure ecosystem.
Key Responsibilities:
Azure Databricks & Big Data Architecture:
- Design and implement scalable data pipelines and architectures on Azure Databricks (a minimal pipeline sketch follows this list).
- Optimize ETL/ELT workflows, ensuring efficiency in data processing, storage, and retrieval.
- Leverage Apache Spark, Delta Lake, and Azure-native services to build high-performance data solutions.
- Ensure best practices in data governance, security, and compliance within Azure environments.
- Troubleshoot and fine-tune Spark jobs for optimal performance and cost efficiency.
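For illustration only, here is a minimal PySpark sketch of the kind of Delta Lake pipeline step this work involves; the storage path, table names, and columns are hypothetical placeholders, not part of the role's actual environment.

```python
# Minimal sketch of a Databricks ETL step writing to Delta Lake.
# The path, table names, and columns below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

# Read raw events from a landing zone (hypothetical ADLS path).
raw = spark.read.json("abfss://landing@examplestorage.dfs.core.windows.net/hr/events/")

# Basic cleansing: deduplicate and derive a partition-friendly date column.
cleaned = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_date", F.to_date("event_timestamp"))
)

# Write to a Delta table, partitioned for efficient downstream queries.
(
    cleaned.write.format("delta")
           .mode("append")
           .partitionBy("event_date")
           .saveAsTable("hr_bronze.events")
)

# Periodic maintenance for performance and cost: co-locate data on a common filter column.
spark.sql("OPTIMIZE hr_bronze.events ZORDER BY (employee_id)")
```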
Azure SQL & Cloud Migration:
- Lead the migration of Azure SQL to Azure Databricks, ensuring a seamless transition of data workloads.
- Design and implement scalable data pipelines to extract, transform, and load (ETL/ELT) data from Azure SQL into Databricks Delta Lake (illustrated in the sketch after this list).
- Optimize Azure SQL queries and indexing strategies before migration to enhance performance in Databricks.
- Implement best practices for data governance, security, and compliance throughout the migration process.
- Work with Azure Data Factory (ADF), Informatica, and Databricks to automate and orchestrate migration workflows.
- Ensure seamless integration of migrated data with APIs, machine learning models, and business intelligence tools.
- Establish performance monitoring and cost-optimization strategies post-migration to ensure efficiency.
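As a rough illustration of one migration step, the following PySpark sketch reads a table from Azure SQL over JDBC and lands it in Delta Lake; the server, database, secret scope, and table names are hypothetical assumptions.

```python
# Minimal sketch of migrating one Azure SQL table into Delta Lake on Databricks.
# Server, database, secret scope, and table names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

jdbc_url = (
    "jdbc:sqlserver://example-server.database.windows.net:1433;"
    "database=hrdb;encrypt=true;trustServerCertificate=false"
)

# Credentials pulled from a Databricks secret scope (dbutils is available in Databricks notebooks).
connection_props = {
    "user": dbutils.secrets.get(scope="hr-scope", key="sql-user"),
    "password": dbutils.secrets.get(scope="hr-scope", key="sql-password"),
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
}

# Read the source table in parallel by partitioning on a numeric key column.
source_df = spark.read.jdbc(
    url=jdbc_url,
    table="dbo.employee_assignments",
    column="assignment_id",
    lowerBound=1,
    upperBound=10_000_000,
    numPartitions=16,
    properties=connection_props,
)

# Land the data as a Delta table; incremental loads could later use MERGE INTO.
source_df.write.format("delta").mode("overwrite").saveAsTable("hr_bronze.employee_assignments")
```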
API & Services Development:
- Design and develop RESTful APIs and microservices for seamless data access and integrations (a minimal API sketch follows this list).
- Implement scalable and secure API frameworks to expose data processing capabilities.
- Work with GraphQL, gRPC, or streaming APIs for real-time data consumption.
- Integrate APIs with Azure-based data lakes, warehouses, Oracle HCM, and other enterprise applications.
- Ensure API performance, monitoring, and security best practices (OAuth, JWT, Azure API Management).
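For illustration only, a minimal Python sketch of a secured, read-only data endpoint; the framework choice (FastAPI), route, and token handling are assumptions, and real token validation would go through Azure AD or Azure API Management.

```python
# Minimal sketch of a read-only HR data API; framework, endpoint, and
# token-validation details are illustrative assumptions.
from fastapi import Depends, FastAPI, HTTPException
from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer

app = FastAPI(title="HR Data API (sketch)")
bearer = HTTPBearer()

def verify_token(credentials: HTTPAuthorizationCredentials = Depends(bearer)) -> str:
    # Placeholder check only: a real implementation would validate the JWT
    # (issuer, audience, signature) against Azure AD / Azure API Management.
    if not credentials.credentials:
        raise HTTPException(status_code=401, detail="Missing bearer token")
    return credentials.credentials

@app.get("/headcount/{department}")
def headcount(department: str, _token: str = Depends(verify_token)) -> dict:
    # Placeholder response; a real implementation would query a governed
    # Delta table (e.g. via Databricks SQL) rather than return a constant.
    return {"department": department, "headcount": 0}
```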
HR Data Domain & Security:
- Build and manage the HR data domain, ensuring a scalable, well-governed, and secure data architecture.
- Implement role-based access control (RBAC), encryption, and data masking to protect sensitive employee information (a masking sketch follows this list).
- Ensure compliance with GDPR, CCPA, HIPAA, and other data privacy regulations.
- Design and implement audit logging and monitoring to track data access and modifications.
- Work closely with HR and security teams to define data retention policies, access permissions, and data anonymization strategies.
- Enable secure API and data sharing mechanisms for HR analytics and reporting while protecting employee privacy.
- Work with Oracle HCM data structures and integrate them within the Azure Databricks ecosystem.
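As a minimal sketch of the data-protection side of this work, the PySpark example below pseudonymizes and redacts sensitive HR columns before publishing a governed table; the table names, columns, and group name are hypothetical, and the GRANT syntax may vary with the workspace's governance setup.

```python
# Minimal sketch of masking sensitive HR columns before exposing them to analytics consumers.
# Table names, columns, and the reader group are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

employees = spark.table("hr_silver.employees")

masked = (
    employees
    # Pseudonymize the national ID so records stay joinable but unreadable.
    .withColumn("national_id", F.sha2(F.col("national_id").cast("string"), 256))
    # Redact salary for the general analytics audience; a separate governed
    # view could retain it for authorized HR roles.
    .drop("salary")
)

masked.write.format("delta").mode("overwrite").saveAsTable("hr_gold.employees_masked")

# Restrict access to the masked table (Unity Catalog-style grant; syntax may vary).
spark.sql("GRANT SELECT ON TABLE hr_gold.employees_masked TO `hr-analytics-readers`")
```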
Product-Centric & Agile Delivery:
- Drive a product-centric approach to data engineering, ensuring alignment with business objectives and user needs.
- Work within an agile delivery framework, leveraging Scrum/Kanban methodologies to ensure fast, iterative deployments.
- Partner with product managers and business stakeholders to define data-driven use cases and prioritize backlog items.
- Promote a continuous improvement mindset, leveraging feedback loops and data-driven decision-making.
- Implement DevOps and CI/CD best practices to enable rapid deployment and iteration of data solutions.
Leadership & Collaboration:
- Provide technical leadership and mentorship to a team of data engineers and developers.
- Collaborate closely with business stakeholders, product managers, HR teams, and architects to translate requirements into actionable data solutions.
- Advocate for automation, DevOps, and Infrastructure-as-Code (Terraform, Bicep) to improve efficiency.
- Foster a culture of innovation and continuous learning within the data engineering team.
- Stay updated on emerging trends in Azure Databricks, Azure SQL, Informatica, Oracle HCM, and cloud technologies.