TeckLeap
Lead Snowflake Data Engineer - Contractor

New York

In Office

10-20 Years

Posted: May 6, 2026


About Company

TeckLeap provides innovative, comprehensive, end-to-end information technology and workforce management solutions that empower businesses to achieve digital transformation, enhance operational efficiency, and drive sustainable growth.


Job Description

We are seeking a Lead Snowflake Data Engineer to design, own, and deliver end-to-end data engineering solutions in modern cloud environments. This role focuses on building scalable, high-performance data pipelines using Snowflake and Cortex AI, with full lifecycle ownership: from ingestion and transformation to modeling, optimization, and consumption.

Key Responsibilities

  • Lead the design and development of end-to-end ELT pipelines using Snowflake
  • Architect scalable data models optimized for performance, cost, and analytics consumption
  • Build and maintain backend data services using Python and PySpark
  • Leverage Snowflake Cortex AI to enable advanced analytics and intelligent data products
  • Drive performance tuning across pipelines, including query optimization, clustering, and warehouse scaling
  • Enforce best practices in data governance, security, and compliance
  • Collaborate across business, analytics, and engineering teams to deliver high-quality solutions
  • Provide technical leadership and mentorship to engineering teams
  • Communicate architecture decisions and trade-offs effectively in client-facing environments

Required Qualifications & Technical Expertise

  • 10+ years of experience, or equivalent ownership of production-grade data platforms
  • Deep expertise in:
    • Snowflake (data modeling, performance tuning, optimization)
    • Python and PySpark
    • Advanced SQL
  • Proven ability to design and deliver end-to-end data pipelines (ingestion, transformation, modeling, consumption) in cloud environments (AWS preferred)
  • Required: Ownership of at least one production-grade Snowflake pipeline end-to-end
  • Strong foundation in modern data warehousing:
    • Dimensional modeling (star/snowflake schemas)
    • ELT/ETL design patterns
    • Data marts and optimization strategies
  • Experience with distributed data processing and large-scale datasets
  • Hands-on experience with Snowflake Cortex AI integration
  • Working knowledge of React.js or similar front-end frameworks
  • Strong understanding of data governance, security, and compliance
  • Ability to:
    • Clearly explain and defend architectural decisions
    • Design systems that perform reliably at scale
    • Balance performance, cost, and maintainability

Technical Depth (Must Be Demonstrated)

Candidates should be able to clearly explain and apply the following in real-world scenarios:

Snowflake Performance & Scaling

  • Warehouse scaling modes (auto-scale, multi-cluster) and when to use them
  • Clustering keys and performance trade-offs
  • Cost vs performance optimization strategies
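As a concrete illustration of the scaling knobs above (warehouse name and settings are hypothetical), a multi-cluster warehouse might be configured as:

```sql
-- A sketch, not a prescription: multi-cluster scale-out for concurrency,
-- with auto-suspend/auto-resume to balance cost against performance.
CREATE WAREHOUSE IF NOT EXISTS analytics_wh
  WAREHOUSE_SIZE    = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4           -- auto-scales out under concurrent load
  SCALING_POLICY    = 'STANDARD'  -- 'ECONOMY' leans toward cost over latency
  AUTO_SUSPEND      = 60          -- suspend after 60s idle to control spend
  AUTO_RESUME       = TRUE;
```

Note that multi-cluster scaling addresses concurrency (more queries at once), while resizing the warehouse addresses the speed of individual queries.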

Snowflake Storage & Optimization

  • Micro-partitioning and its impact on pruning and query performance
  • Practical optimization techniques for large datasets
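To make the micro-partitioning point concrete (table and column names here are hypothetical), a clustering key aligned with a common filter column enables partition pruning:

```sql
-- Clustering aligns micro-partitions with a frequently filtered column
-- so range predicates can skip partitions instead of full-scanning.
ALTER TABLE sales SET CLUSTER BY (sale_date);

-- Inspect clustering quality (depth, overlap) for that key
SELECT SYSTEM$CLUSTERING_INFORMATION('sales', '(sale_date)');

-- With good clustering, this filter touches only the relevant micro-partitions
SELECT SUM(amount)
FROM sales
WHERE sale_date BETWEEN '2026-01-01' AND '2026-01-31';
```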

End-to-End Pipeline Design

  • Designing a complete ELT pipeline using Snowflake
  • Deciding where transformations should occur (Snowflake vs external processing)
  • Ensuring scalability, maintainability, and performance across the pipeline engagement
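The pipeline design above can be sketched in outline (all stage, table, and column names are hypothetical) as a minimal ELT flow, loading raw data first and transforming it inside Snowflake:

```sql
-- "E" and "L": land semi-structured files from an external stage
CREATE TABLE IF NOT EXISTS raw_events (v VARIANT);

COPY INTO raw_events
  FROM @raw_stage                   -- external stage assumed to already exist
  FILE_FORMAT = (TYPE = 'JSON');

-- "T" happens in-warehouse: cast and flatten into an analytics-ready table
CREATE OR REPLACE TABLE events_modeled AS
SELECT
  v:event_id::STRING              AS event_id,
  v:ts::TIMESTAMP_NTZ             AS event_ts,
  v:payload.amount::NUMBER(12,2)  AS amount
FROM raw_events;
```

Keeping transformations in Snowflake suits set-based SQL work; pushing them to external processing (e.g. PySpark) tends to pay off for complex procedural logic or very large pre-load filtering.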