Tech Lead Databricks Data Engineer
REMOTE
USA or Canada (US time zones)
Are you looking for a remote job at Mitre Media? Work Remotely brings you this exciting opportunity. Here are the details:
Company: Mitre Media
Position: Tech Lead Databricks Data Engineer
Location: USA or Canada (US time zones)
Salary: $160k – $180k
More about Mitre Media
Mitre Media is redefining FinTech with AI-driven tools like Dividend.com and MutualFunds.com that empower millions of investors globally. Their portfolio leverages Large Language Models to deliver novel data insights and visually rich user experiences for individual investors and top asset managers like BlackRock and Vanguard. For over a decade, they have combined premium financial data with advanced advertising solutions in an entrepreneurial, remote-first environment.
Job Description
As the Tech Lead Data Engineer, you will architect and maintain the mission-critical data backbone that powers Mitre Media’s entire product suite. Reporting to the CTO, you will design Databricks-based ETL pipelines and model complex investment data to surface high-quality, low-latency datasets for user-facing features and AI workloads. You will lead the evolution of the cloud data platform on AWS and GCP while following the ShapeUp methodology for project planning. This role requires a pragmatic leader who can mentor engineers and ship high-impact solutions in a distributed, cross-functional team.
Preferred Qualifications
- Experience with Apache Airflow, Luigi, or Dagster for complex DAG orchestration.
- Familiarity with building feature stores or inference-ready tables for ML and LLM workflows.
- Knowledge of financial markets, including equities, ETFs, and mutual funds.
- Experience with data visualization stacks such as Looker or Tableau.
- Knowledge of vector databases and embedding pipelines for modern AI applications.
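At its core, the DAG orchestration mentioned above (Airflow, Luigi, Dagster) means declaring which tasks depend on which, then executing them in a valid order. A minimal stdlib-only sketch of that idea, using Python's `graphlib`; the task names are hypothetical, not from the posting:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks: each task lists its upstream dependencies,
# mirroring how an Airflow or Dagster DAG declares them.
dag = {
    "extract_prices": [],
    "extract_dividends": [],
    "clean_prices": ["extract_prices"],
    "join_datasets": ["clean_prices", "extract_dividends"],
    "publish_tables": ["join_datasets"],
}

def run_order(dag):
    """Return one valid execution order for the DAG (raises on cycles)."""
    return list(TopologicalSorter(dag).static_order())

print(run_order(dag))
```

Real orchestrators add scheduling, retries, and backfills on top, but the dependency-resolution step is the same.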
Required Qualifications
- Expert proficiency in Python and working knowledge of Scala or Java.
- Extensive hands-on experience with Databricks, Apache Spark, and Delta Lake.
- Strong analytical SQL skills and modular data-model design using dbt.
- Production experience managing cloud data services on AWS or GCP, such as S3, Glue, or Dataflow.
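"Analytical SQL" in roles like this usually means window functions and aggregations over time-series data. A small illustration using Python's built-in `sqlite3`; the table, columns, and values are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (ticker TEXT, day INTEGER, close REAL)")
conn.executemany(
    "INSERT INTO prices VALUES (?, ?, ?)",
    [("AAA", 1, 10.0), ("AAA", 2, 12.0), ("AAA", 3, 11.0)],
)

# 2-day moving average per ticker: a typical analytical-SQL pattern.
rows = conn.execute("""
    SELECT ticker, day, close,
           AVG(close) OVER (
               PARTITION BY ticker ORDER BY day
               ROWS BETWEEN 1 PRECEDING AND CURRENT ROW
           ) AS ma2
    FROM prices
    ORDER BY day
""").fetchall()
for r in rows:
    print(r)
```

The same window-function pattern carries over directly to Spark SQL and dbt models at warehouse scale.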
Work Responsibilities
- Design and optimize large-scale ETL workflows using Spark and Delta Lake in the Databricks environment.
- Develop sophisticated algorithms to transform raw financial market data into actionable insights.
- Maintain data quality and lineage by implementing robust testing, monitoring, and alerting for all pipelines.
- Evolve the cloud data infrastructure for better scale, performance, and cost efficiency.
- Mentor team members and champion best practices in code reviews and data documentation.
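The data-quality responsibility above typically starts with simple row-level assertions (null checks, uniqueness) run after each load, before reaching for a dedicated framework. A minimal stdlib sketch; the rule names and sample records are hypothetical:

```python
# Minimal data-quality checks of the kind a pipeline might run after each load.

def check_not_null(rows, field):
    """Return indices of rows where `field` is missing or None."""
    return [i for i, r in enumerate(rows) if r.get(field) is None]

def check_unique(rows, field):
    """Return values of `field` that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(field)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

records = [
    {"ticker": "AAA", "yield": 0.03},
    {"ticker": "BBB", "yield": None},
    {"ticker": "AAA", "yield": 0.04},
]

print(check_not_null(records, "yield"))  # rows with a missing yield
print(check_unique(records, "ticker"))   # duplicated tickers
```

In a Databricks setting the same checks would run as expectations against Delta tables, with failures wired into monitoring and alerting.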
Salary Range
$160k – $180k
This role offers competitive compensation. Additional benefits and perks may be available based on performance.
Important Note: The recruitment information provided above is purely for informational purposes. The details have been sourced from the official website of the organization. We do not offer any recruitment guarantees. All recruitment procedures must strictly adhere to the official process outlined by the company. We do not charge any fees for providing this job information.