
Senior Data Engineer

BID Operations
Full-time
On-site
Sydney, NSW

JobsCloseBy Editorial Insights

The Senior Data Engineer at BID Operations will own the architecture, scalability, and reliability of a real-time data platform built on Kafka, RabbitMQ, Airflow, and ClickHouse to power financial analytics. The role combines hands-on engineering, mentoring, and partnering with leadership to align data strategy with business goals, alongside responsibility for cloud deployments and data governance. The core emphasis is on fault-tolerant ETL/ELT pipelines, sub-second data availability, optimised ClickHouse schemas, and monitoring with automated recovery. To apply, tailor your resume to show real-time pipeline work, strong Python and SQL skills, leadership impact, and concrete throughput and latency metrics; highlight cross-functional collaboration and the ability to translate technical concepts for non-technical stakeholders. The role is based in Sydney with hybrid options and a vibrant culture.


About the company:

At BID Operations, we are passionate about supporting our clients in their journey towards success. Our mission is to empower you to thrive by handling the essential yet time-consuming aspects of your business operations, allowing you to concentrate on strategic growth and innovation.

About the role:

As a Senior Data Engineer, you will be a technical leader responsible for the architecture, scalability, and reliability of our high-throughput, real-time data ecosystem. You will oversee the evolution of our data infrastructure, leveraging Kafka, RabbitMQ, Airflow, and ClickHouse to power mission-critical financial analytics. Your role is to bridge the gap between complex business requirements and high-performance engineering, ensuring our data pipelines can handle the rigours of real-time financial data processing.
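By way of illustration only (this is not part of the posting), a real-time ingestion path of the kind described above typically buffers streamed events into micro-batches before writing to an analytical store such as ClickHouse, trading a few hundred milliseconds of latency for far more efficient bulk inserts. A minimal, stdlib-only Python sketch of that pattern, with all names hypothetical:

```python
import time
from collections import deque

class MicroBatcher:
    """Buffer streamed events and flush them in batches, so an analytical
    store such as ClickHouse receives a few large inserts rather than many
    tiny ones. Hypothetical sketch: a real pipeline would consume from a
    Kafka client and write via a ClickHouse driver instead of a callback."""

    def __init__(self, flush_size=1000, flush_interval_s=0.5, sink=None):
        self.flush_size = flush_size              # flush once this many events are buffered...
        self.flush_interval_s = flush_interval_s  # ...or once this much time has elapsed
        self.sink = sink or (lambda batch: None)  # stand-in for a bulk INSERT
        self._buf = deque()
        self._last_flush = time.monotonic()
        self.flushed_batches = []                 # kept only for inspection in this sketch

    def add(self, event):
        self._buf.append(event)
        if (len(self._buf) >= self.flush_size
                or time.monotonic() - self._last_flush >= self.flush_interval_s):
            self.flush()

    def flush(self):
        if self._buf:
            batch = list(self._buf)
            self._buf.clear()
            self.sink(batch)
            self.flushed_batches.append(batch)
        self._last_flush = time.monotonic()

# Usage: seven events with a batch size of three yield batches of 3, 3, and 1.
batcher = MicroBatcher(flush_size=3)
for i in range(7):
    batcher.add({"trade_id": i, "price": 100.0 + i})
batcher.flush()  # drain the remainder
print([len(b) for b in batcher.flushed_batches])  # → [3, 3, 1]
```

The size-or-time flush condition is what keeps worst-case data availability bounded (here, roughly the flush interval) even when traffic is sparse.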

Requirements

Key Responsibilities:

  • Lead the design and evolution of highly scalable, fault-tolerant ETL/ELT pipelines.
  • Drive the strategy for real-time messaging and stream processing using Kafka and RabbitMQ to ensure sub-second data availability.
  • Act as the subject matter expert for ClickHouse, optimising complex schema designs, indexing strategies, and query performance for large-scale financial datasets.
  • Oversee the deployment of data services within cloud environments, implementing advanced security protocols and data governance standards essential for the finance industry.
  • Collaborate with senior leadership to align data strategy with business objectives.
  • Mentor data engineers through code reviews and technical guidance.
  • Implement advanced monitoring and automated recovery systems to ensure the integrity and quality of high-stakes financial data.
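As an illustration only (not from the posting), the monitoring-and-automated-recovery responsibility above often reduces to wrapping each pipeline step in retry logic with exponential backoff while emitting counters a monitoring system can scrape. A minimal, stdlib-only Python sketch, with all names hypothetical:

```python
import time
import random

def run_with_recovery(task, max_attempts=3, base_delay_s=0.01, metrics=None):
    """Run a pipeline step, retrying with exponential backoff on failure.
    `metrics` collects counters for a monitoring system to scrape.
    Hypothetical sketch: in production this role is usually played by
    Airflow task retries plus an external alerting stack."""
    metrics = metrics if metrics is not None else {}
    for attempt in range(1, max_attempts + 1):
        try:
            result = task()
            metrics["successes"] = metrics.get("successes", 0) + 1
            return result
        except Exception:
            metrics["failures"] = metrics.get("failures", 0) + 1
            if attempt == max_attempts:
                raise  # recovery exhausted; surface the failure for alerting
            # Exponential backoff with a little jitter before retrying.
            time.sleep(base_delay_s * (2 ** (attempt - 1)) * (1 + random.random() * 0.1))

# Usage: a flaky step that fails twice, then succeeds on the third attempt.
calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient insert failure")
    return "loaded"

metrics = {}
print(run_with_recovery(flaky_load, metrics=metrics))  # → loaded
print(metrics)  # → {'failures': 2, 'successes': 1}
```

The jitter matters in practice: without it, many failed tasks retry in lockstep and can re-overload the very service they are recovering from.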

Requirements:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • Proven experience in data engineering, with a strong background in designing and implementing ETL processes within cloud environments.
  • Experience within the Finance or Trading technology sector, with a proven track record of handling real-time market or transactional data.
  • Strong programming skills in Python, with experience in developing robust, maintainable, and scalable data processing pipelines.
  • Extensive SQL knowledge and experience.
  • Excellent problem-solving skills and the ability to work collaboratively in a team environment.
  • Strong communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.

Benefits

  • Hybrid working arrangement
  • Opportunities for enriching career growth, including exposure to regional contexts
  • Complimentary snacks and beverages available in the office pantry
  • Healthcare coverage (medical, dental, optical), gym benefits
  • Flexibility in smart casual dress code
  • Young, vibrant and open work culture