Tech Lead Databricks Data Engineer
REMOTE
USA or Canada (US timezones)
Are you looking for a remote job at Mitre Media? Work Remotely brings you this exciting opportunity. Here are the details:
Company: Mitre Media
Position: Tech Lead Databricks Data Engineer
Location: USA or Canada (US timezones)
Salary: $160k – $180k
More about Mitre Media
Mitre Media is redefining FinTech with AI-driven tools that empower millions of investors through a portfolio including Dividend.com and MutualFunds.com. For over a decade, they have served individual investors, financial advisors, and top asset managers like BlackRock and Vanguard. The company operates as a lean, entrepreneurial team that values a remote-first culture and high-impact data insights.
Job Description
As the Tech Lead Databricks Data Engineer, you will architect and maintain the data backbone powering every feature across Mitre Media’s product suite. Reporting directly to the CTO, you will design Databricks-based ETL pipelines and model complex investment data to surface low-latency datasets. You will collaborate in a remote-first environment using the ShapeUp methodology for project planning and execution. This role is critical for delivering high-quality data for both user-facing features and internal AI/analytics workloads.
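To give a concrete flavor of that work, here is a minimal PySpark sketch of such a pipeline, assuming a hypothetical raw-quotes feed; the bucket path, column names, and table names are illustrative, not Mitre Media's actual schema:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("market-data-etl").getOrCreate()

# Ingest raw market data (hypothetical S3 location).
raw = spark.read.json("s3://example-bucket/raw/quotes/")

# Normalize and enrich into an analytics-ready shape.
quotes = (
    raw.withColumn("trade_date", F.to_date("timestamp"))
       .withColumn("spread", F.col("ask") - F.col("bid"))
       .dropDuplicates(["symbol", "timestamp"])
)

# Write as a Delta table so downstream features get ACID guarantees,
# time travel, and low-latency incremental reads.
(quotes.write.format("delta")
       .mode("overwrite")
       .partitionBy("trade_date")
       .saveAsTable("analytics.quotes_clean"))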
Preferred Qualifications
- Experience with orchestration tools like Apache Airflow, Luigi, or Dagster (a minimal Airflow sketch follows this list).
- Familiarity with visualization stacks such as Looker, Tableau, or similar BI tools.
- Financial-markets domain knowledge regarding equities, ETFs, and mutual funds.
- Background in machine-learning engineering or statistical analysis.
- Familiarity with building vector databases or embedding pipelines for LLM workflows.
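For the orchestration point above, a minimal Airflow sketch (TaskFlow API, assuming Airflow 2.4+) might look like the following; the DAG name, schedule, and task bodies are hypothetical placeholders, not part of the posting:

from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def market_data_pipeline():
    @task
    def extract() -> str:
        # In practice this might trigger a Databricks job or pull vendor files.
        return "s3://example-bucket/raw/quotes/2024-01-01/"

    @task
    def load(path: str) -> None:
        # Placeholder for a Databricks/Delta load step.
        print(f"loading {path}")

    load(extract())

market_data_pipeline()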
Required Qualifications
- Expert proficiency in Python plus working knowledge of Scala or Java.
- Hands-on experience with Databricks, Spark cluster tuning, and Delta Tables.
- Strong analytical SQL skills and modular data-model design using dbt (see the Spark SQL sketch after this list).
- Production experience on AWS or GCP data services such as S3, EMR, or Glue.
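As an illustration of the analytical-SQL expectation above, here is a sketch that runs a window-function query through Spark SQL; the dividends table and the trailing-sum metric are assumptions made for the example:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

trailing = spark.sql("""
    SELECT
        symbol,
        trade_date,
        -- Trailing 4-quarter dividend sum per symbol (hypothetical schema).
        SUM(dividend) OVER (
            PARTITION BY symbol
            ORDER BY trade_date
            ROWS BETWEEN 3 PRECEDING AND CURRENT ROW
        ) AS trailing_dividends
    FROM analytics.dividends
""")

# Persist as a Delta table for downstream models to build on.
trailing.write.format("delta").mode("overwrite").saveAsTable("analytics.trailing_dividends")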
Work Responsibilities
- Design, implement, and optimize large-scale ETL workflows in Databricks using Apache Spark and Delta Lake.
- Develop algorithms that transform raw market data into actionable insights for investors.
- Own data quality and lineage by instituting tests, monitoring, and alerting for critical pipelines (a minimal quality gate is sketched after this list).
- Evolve the cloud data platform for scale, performance, and cost efficiency on AWS and GCP.
- Mentor engineers and champion best practices in code reviews, documentation, and DevOps for data.
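As one example of the quality-gate idea above, a minimal check might look like this; the table name, invariants, and failure handling are hypothetical:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.table("analytics.quotes_clean")

# Simple invariants: the table is non-empty and key columns have no nulls.
row_count = df.count()
null_symbols = df.filter(F.col("symbol").isNull()).count()

if row_count == 0 or null_symbols > 0:
    # In production this would page an on-call channel rather than raise.
    raise ValueError(
        f"quality check failed: rows={row_count}, null symbols={null_symbols}"
    )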
Salary Range
$160k – $180k
This role offers competitive compensation. Additional benefits and perks may be available based on performance.
Important Note: The recruitment information above is provided for informational purposes only and has been sourced from the organization’s official website. We do not offer any recruitment guarantees; all applications must follow the official process outlined by the company. We do not charge any fees for providing this job information.