Lead Data Engineer, Business Intelligence

Location(s)

US - MA - Boston

Team(s)

Business Support, Product & Engineering


Rapid7 (Nasdaq: RPD) is advancing security with visibility, analytics, and automation delivered through our Insight cloud. Our solutions simplify the complex, allowing security teams to work more effectively with IT and development to reduce vulnerabilities, monitor for malicious behavior, investigate and shut down attacks, and automate routine tasks. Over 9,300 customers rely on Rapid7 technology, services, and research to improve security outcomes and securely advance their organization. For more information, visit our website, check out our blog, or follow us on LinkedIn.

The Opportunity

The Lead Data Engineer will lead a team of Data Engineers responsible for successful data pipeline and data product delivery. You will lead the design and deployment of data architecture, data pipelines, machine learning models, and DevOps/DataOps practices within the Snowflake data ecosystem. You must be comfortable managing other Data Engineers, adept with agile tools and methodologies, and able to demonstrate advanced analytical skills, technical and business knowledge, and a strong understanding of how to apply industry-standard tools and methods to solve problems. The Lead Data Engineer will work closely with Data Analysts, Data Scientists, Enterprise Architects, and the Product Analytics team, providing data mapping and data wrangling expertise.

In the role you will:

  • Lead a team of Data Engineers, providing mentoring, day-to-day direction, task prioritization, and resource balancing within the squad/team

  • Lead and manage Rapid7's core data infrastructure alongside teammates, using Fivetran, Snowflake, Airflow, GitHub, Docker, and AWS

  • Lead data engineering sprint planning and execution, including prioritization, updates, and monthly review of Asana/JIRA tasks for team members

  • Lead & evaluate data tooling as needed and research opportunities for data acquisition and processing of batch and streaming data

  • Architect and develop data practices for data integration, data modeling, unit testing, and data productionization

  • Optimize data lifecycle management and processes to improve delivery efficiency, data quality, and data integrity

  • Refine and manage data security policies and role-based access control (RBAC)

  • Lead & collaborate with enterprise architect to engineer a reverse ETL process providing systems data integration between Snowflake and operational data platforms such as Salesforce, Gainsight, and Marketo

In return you will bring:

  • 5+ years in a data & analytics role working on data pipelines, infrastructure, integration, and/or technical development of data architecture

  • At least 3 years of experience working with relational databases (Snowflake, Redshift, Postgres, MySQL, etc.) with excellent SQL skills

  • Minimum 3 years of experience with a major cloud provider (AWS, Azure, GCP)

  • Hands-on experience with code deployment in cloud environments using tools such as Docker, Kubernetes, EC2, etc.

  • Experience leading and mentoring others on a team

  • Excellent written and verbal communication skills 

  • Expertise in data architecture, data warehousing, and metadata management

  • High proficiency in at least one object-oriented language such as Python, Java, or Scala

  • Extensive experience with a CI/CD tool such as Jenkins, GitHub Actions, or GitLab, a plus

  • Experience with data governance tools, data profiling, and process improvement, a plus