Summary
Rialto Capital is seeking a Lead Data Engineer to drive the development of scalable, high-performance data solutions. This leadership role requires deep technical expertise in modern data architectures, along with a strategic mindset to help evolve our enterprise data platform into a trusted, unified data lakehouse. The successful candidate will lead projects, mentor engineers, and collaborate cross-functionally to meet critical business needs.
Key Responsibilities
Strategic Data Architecture: Lead the design and implementation of data solutions aligned with the long-term vision of a centralized, enterprise-wide data lakehouse platform.
Data Pipeline Design & Development: Architect and manage ELT pipelines using Azure technologies and Microsoft Fabric, ensuring seamless integration with existing systems and alignment with business requirements.
Data Management & Optimization: Optimize data ingestion, storage, and transformation workflows in Fabric. Ensure scalable and performant solutions across structured and semi-structured data sets.
Engineering Excellence: Develop robust, reusable data models and curated tables from diverse data sources. Ensure ingestion of third-party data meets data governance, lineage, and quality standards.
Security & Integration: Collaborate with Information Security and Infrastructure teams to enforce data security policies and streamline data integration practices.
Leadership & Mentorship: Guide and mentor junior data engineers. Foster a culture of innovation, collaboration, and technical excellence. Work with offshore teams to deliver business-critical solutions.
Quality Assurance: Establish and enforce rigorous testing, validation, and monitoring processes to ensure the accuracy, reliability, and integrity of all data assets.
Delivery & Prioritization: Lead the prioritization and execution of business-driven data initiatives, including executive reporting needs and ad hoc requests.
Innovation & Continuous Improvement: Stay current with emerging data technologies. Contribute to the evolution of the enterprise data strategy and roadmap.
Specifications
- Bachelor’s degree required, Master’s preferred, in Computer Science, Data Science, Engineering, or a related field.
- 5+ years of hands-on experience in data engineering, with a strong track record of designing and delivering large-scale data solutions.
- Expertise in Python/PySpark for data processing and transformation.
- Proficiency in SQL, NoSQL databases, and data warehouse solutions.
- Strong experience with Azure data services, especially Data Factory and Microsoft Fabric.
- Proficiency in cloud platforms (Azure, AWS, GCP) and their integration with Fabric.
- Deep understanding of data modeling, data warehousing, and data architecture optimization.
- Experience with CI/CD practices and tools such as Azure DevOps.
- Prior experience in a data solution architect role is a strong plus.
- Strong analytical and problem-solving abilities with excellent communication and leadership skills.