Senior Data Engineer

Tech
Indianapolis, IN | Full-time | Hybrid
We are seeking a Senior Data Engineer with deep expertise in SQL, Python, and modern cloud-native data architectures. The ideal candidate will have strong hands-on experience with Microsoft Fabric as well as Snowflake and other cloud platforms. This role involves leading complex data transformation initiatives, architecting scalable solutions, and collaborating directly with clients to deliver measurable business impact.

Responsibilities
Technical Leadership & Architecture
  • Design and architect end-to-end Microsoft Fabric solutions (data lakes, warehouses, real-time analytics).
  • Build and optimize pipelines integrating diverse data sources (SQL, APIs, Salesforce, Oracle, Dynamics).
  • Implement Direct Lake connectivity and optimize semantic models for analytics.
  • Lead advanced data modeling initiatives (dimensional, star/snowflake, data vault).
  • Develop SQL queries, stored procedures, and database optimization strategies.
  • Build Python applications for data processing and automation.
  • Implement real-time analytics solutions (Event Streams, KQL Database).
Client Engagement & Delivery
  • Lead technical discovery sessions to understand client data landscapes.
  • Collaborate with stakeholders to translate business needs into technical solutions.
  • Mentor junior engineers and review code to ensure adherence to best practices.
  • Provide expertise during pre-sales and proposal development.
Performance Optimization & Governance
  • Optimize performance via indexing, partitioning, and query tuning.
  • Implement data quality, lineage, and monitoring frameworks.
  • Ensure compliance with data privacy regulations (GDPR, HIPAA, SOX).
  • Establish CI/CD pipelines for data engineering workflows.

Requirements
Technical Expertise
  • 5+ years in data engineering or analytics engineering roles.
  • Hands-on experience with Microsoft Fabric (Data Factory, OneLake, Lakehouse, Dataflows Gen2, Direct Lake, Notebooks, Power BI, KQL databases).
  • Advanced Python skills with Pandas, NumPy, SQLAlchemy, PySpark.
  • Expert-level SQL across multiple database platforms (SQL Server, PostgreSQL, Snowflake, MySQL).
  • Experience with Snowflake, Azure, and AWS.
  • Proficiency with Git and collaborative workflows.
  • Familiarity with containerization (Docker, Kubernetes).
Data Engineering Skills
  • Advanced understanding of data modeling (dimensional, normalization/denormalization, data vault).
  • Experience implementing slowly changing dimensions (SCD Types 1–7).
  • Knowledge of modern data architecture patterns (data mesh, lakehouse).
  • Strong background in data governance, lineage, and quality.
Consulting & Leadership
  • Strong client-facing communication skills.
  • Ability to work independently across multiple projects.
  • Experience in Agile/Scrum environments.

Preferred Qualifications
  • Bachelor’s degree in Computer Science, Data Engineering, or a related field.
  • Microsoft Fabric or Azure certifications.
  • Snowflake certifications (SnowPro Core, SnowPro Advanced).
  • Experience with Scala, Java, or R.
  • Experience with BI tools (Tableau, Power BI, Looker).
  • Prior consulting background.

Benefits
  • Competitive compensation package.
  • Full medical, dental, and vision coverage.
  • Fully vested 401(k) plan.
  • Flexible hybrid/onsite schedule.
  • Wellness and fitness programs.
  • Catered lunches and collaborative work environment.
  • Training and development opportunities to support continuous learning.