
Senior Data Engineer

Onit

Software Engineering · Full-time
Israel
Posted on Apr 6, 2026

About the job

About Us

We're reimagining how enterprise security teams operate in an AI-first world, transforming weeks of manual work into intelligent, automated workflows. Backed by tier-1 VCs, with second-time founders who've successfully built and sold cybersecurity companies, we're on a mission to empower security professionals to focus on strategic, high-impact work while AI handles the rest. If you're passionate about building the future of enterprise security, we want to hear from you.

About the Role

We are seeking a highly skilled Senior Data Engineer to design, build, and maintain scalable data pipelines and infrastructure for our platform. The ideal candidate will have extensive experience with ETL/ELT processes, analytical databases, and workflow orchestration. You will work on high-volume security data processing, building the data backbone that powers our AI-driven security platform. You will collaborate closely with Data Scientists, Security Researchers, and Full-stack Engineers.

Responsibilities

  • Design and build end-to-end data pipelines that ingest high-volume security data from multiple third-party sources, transform and normalize raw connector data into deduplicated schemas with memory-efficient stream processing, and orchestrate complex workflows across multi-tenant environments.

  • Write complex analytical SQL including aggregations, window functions, CTEs, and materialized views.

  • Manage database schemas across both the application database and the OLAP database, including migrations and cross-database federation.

  • Optimize stream processing pipelines to handle high-volume data ingestion with memory efficiency.

  • Apply best practices for data governance, security, and compliance including multi-tenant data isolation patterns, while continuously monitoring and optimizing pipeline performance for cost efficiency and reliability.

Requirements

  • 5+ years of experience in data engineering or a similar role.

  • Strong proficiency in TypeScript/Node.js for building data pipeline code.

  • Expertise in OLAP databases such as Snowflake, BigQuery, Redshift, or ClickHouse.

  • Experience with relational database schema design and migrations (e.g., Aurora PostgreSQL).

  • Hands-on experience with workflow orchestration tools (Temporal, Airflow, or similar).

  • Experience building ETL pipelines with streaming data transformations (e.g., AWS Kinesis, Amazon MSK for Kafka).

  • Strong experience with AWS cloud services.

  • Proficiency in SQL for complex analytical queries.

  • Knowledge of data modeling, deduplication strategies, and distributed computing concepts (Apache Spark, dbt, Apache Parquet).

  • Excellent problem-solving skills with the ability to troubleshoot complex pipeline issues.

  • Knowledge of event-driven architectures and multi-tenant data isolation patterns.

  • Experience with real-time data streaming and high-volume JSON processing (Kafka Streams, AWS Kinesis Data Streams).

Nice to Haves

  • Experience in the cybersecurity domain.

  • Experience with ClickHouse.


Onit is an equal opportunity employer.