
Experienced Data Engineer - Streaming Platform

Voodoo
Full-time
On-site
Paris, 11

JobsCloseBy Editorial Insights

Voodoo is seeking an Experienced Data Engineer to join the Ad-Network Team in Paris, onsite three days a week, to design and maintain real-time data pipelines processing bid requests, impressions, and user engagement. You’ll build with Flink or Spark Structured Streaming, integrate OpenRTB signals, work with GCP Pub/Sub, Kinesis, Pulsar, or Kafka, and manage Avro or Protobuf schemas. Expect Java/Scala/Python code, Kubernetes deployments, familiarity with CI/CD workflows, and strong monitoring. To apply, showcase 3-5+ years in streaming, concrete throughput and latency wins, collaboration with backend engineers, and any open-source or cost-optimization work. Highlight relocation readiness and why Paris excites you.


Founded in 2013, Voodoo is a tech company that creates mobile games and apps with a mission to entertain the world. With 800 employees, 7 billion downloads, and over 200 million active users, Voodoo is the #3 mobile publisher worldwide in terms of downloads after Google and Meta. Our portfolio includes chart-topping games like Mob Control and Block Jam, alongside popular apps such as BeReal and Wizz.
Team
The Engineering & Data team builds innovative tech products and platforms to support the impressive growth of Voodoo's gaming and consumer apps, keeping the company at the forefront of the mobile industry. Within the Data team, you’ll join the Ad-Network Team, an autonomous squad of around 30 people composed of top-tier software engineers, infrastructure engineers, data engineers, mobile engineers, and data scientists (including 3 Kaggle Masters). The team's goal is to enable Voodoo to monetize its inventory directly with advertising partners; it relies on advanced technological solutions to optimize advertising in a real-time bidding environment. It is a strategic topic with significant impact on the business.
This role is Paris-based and requires being onsite 3 days per week.

Role

  • Build, maintain, and optimize real-time data pipelines to process bid requests, impressions, clicks, and user engagement data.
  • Develop scalable solutions using tools like Apache Flink, Spark Structured Streaming, or similar stream processing frameworks.
  • Collaborate with backend engineers to integrate OpenRTB signals into our data pipelines and ensure smooth data flow across systems.
  • Ensure data pipelines deliver high-throughput, low-latency, fault-tolerant processing in real time.
  • Write clean, well-documented code in Java, Scala, or Python for distributed systems.
  • Work with cloud-native messaging and event platforms such as GCP Pub/Sub, AWS Kinesis, Apache Pulsar, or Kafka to ensure reliable message delivery.
  • Assist in the management and evolution of event schemas (Protobuf, Avro), including data consistency and versioning.
  • Implement monitoring, logging, and alerting for streaming workloads to ensure data integrity and system health.
  • Continuously improve data infrastructure for better performance, cost-efficiency, and scalability.
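To give a flavor of the pipeline work described above, here is a minimal, framework-agnostic sketch of a tumbling-window aggregation over a stream of bid-request events. In production this logic would run inside Flink or Spark Structured Streaming; the event fields (`ts`, `app_id`, `bid_price`) and the 60-second window are hypothetical choices for illustration only.

```python
from collections import defaultdict

WINDOW_SECONDS = 60  # tumbling window size (hypothetical)

def window_start(ts: float) -> int:
    """Align an event timestamp to the start of its tumbling window."""
    return int(ts) - (int(ts) % WINDOW_SECONDS)

def aggregate_bids(events):
    """Count bid requests and sum bid prices per (window, app_id)."""
    counts = defaultdict(int)
    revenue = defaultdict(float)
    for e in events:
        key = (window_start(e["ts"]), e["app_id"])
        counts[key] += 1
        revenue[key] += e.get("bid_price", 0.0)
    return counts, revenue

# Example: three events falling into two windows
events = [
    {"ts": 0.0, "app_id": "mob_control", "bid_price": 0.5},
    {"ts": 30.0, "app_id": "mob_control", "bid_price": 0.7},
    {"ts": 65.0, "app_id": "block_jam", "bid_price": 0.2},
]
counts, revenue = aggregate_bids(events)
print(counts[(0, "mob_control")])   # 2 events in the first window
print(counts[(60, "block_jam")])    # 1 event in the second window
```

A real streaming job would add the concerns listed above that this sketch omits: event-time watermarks for late data, checkpointed state for fault tolerance, and exactly-once delivery from the message bus.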

Profile (Must have)

  • 3-5+ years of experience in data engineering, with a strong focus on real-time streaming systems.
  • Familiarity with stream processing tools like Apache Flink, Spark Structured Streaming, Beam, or similar frameworks.
  • Solid programming experience in Java, Scala, or Python, especially in distributed or event-driven systems.
  • Experience working with event streaming and messaging platforms like GCP Pub/Sub, AWS Kinesis, Apache Pulsar, or Kafka.
  • Hands-on knowledge of event schema management, including tools like Avro or Protobuf.
  • Understanding of real-time data pipelines, with experience handling large volumes of event-driven data.
  • Comfortable working in Kubernetes for deploying and managing data processing workloads in cloud environments (AWS, GCP, etc.).
  • Exposure to CI/CD workflows and infrastructure-as-code tools such as Terraform, Docker, and Helm.
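As an illustration of the schema-evolution skill mentioned above, the sketch below checks backward compatibility between two event schema versions in the spirit of Avro's resolution rules: a new field may only be added if it carries a default, so old events remain readable. This is plain Python with a hypothetical dict-based schema layout, not the Avro or Protobuf APIs themselves.

```python
def is_backward_compatible(old_schema: dict, new_schema: dict) -> bool:
    """New readers stay compatible with old data only if every field they
    add (relative to the old writer schema) has a default value."""
    for name, spec in new_schema.items():
        if name not in old_schema and "default" not in spec:
            return False  # new required field: old events cannot be decoded
    return True

v1 = {
    "bid_id": {"type": "string"},
    "app_id": {"type": "string"},
}
v2_ok = dict(v1, country={"type": "string", "default": "unknown"})
v2_bad = dict(v1, country={"type": "string"})  # added without a default

print(is_backward_compatible(v1, v2_ok))   # True
print(is_backward_compatible(v1, v2_bad))  # False
```

In practice this kind of check is enforced centrally (e.g. by a schema registry) before producers are allowed to publish a new version.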

Nice to have

  • Familiarity with real-time analytics platforms (e.g., ClickHouse, Pinot, Druid) for querying large volumes of event data.
  • Exposure to service mesh, auto-scaling, or cost optimization strategies in containerized environments.
  • Contributions to open-source projects related to data engineering or stream processing.

Benefits

  • Competitive salary based on experience
  • Comprehensive relocation package (including visa support)
  • Swile Lunch voucher
  • Gymlib (100% borne by Voodoo)
  • Premium healthcare coverage (SideCare) for you and your family, 100% borne by Voodoo
  • Child day care facilities (Les Petits Chaperons rouges)
  • Wellness activities in our Paris office
  • Unlimited vacation policy
  • Remote days