Pathway

Data Engineer - Stream Data Processing - Distributed Data Processing


Job Location

Paris, France

Job Description

About Pathway

Pathway is a deeptech start-up, founded in March 2020.

  • Our primary developer offering is an ultra-performant Data Processing Framework (unified streaming + batch) with a Python API, distributed Rust engine, and capabilities for data source integration & transformation at scale (Kafka, S3, databases/CDC,...).
  • The single-machine version is provided under a free-to-use license (`pip install pathway`); a minimal usage sketch follows this list.
  • Major data use cases are around event-stream data (including real-world data such as IoT), and graph data that changes over time.
  • Our enterprise offering is currently used by leaders of the logistics industry, such as DB Schenker or La Poste, and tested across multiple industries.
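
For illustration, here is a minimal sketch of what a Pathway pipeline can look like through the Python API. It is a sketch only: the connector and reducer names used below are assumptions based on the public documentation and should be checked against the current release.

```python
# Minimal sketch; connector/reducer names are assumptions to verify against the docs.
import pathway as pw


class PageView(pw.Schema):
    user_id: str
    duration_ms: int


# Read a growing directory of CSV files as an unbounded stream.
views = pw.io.csv.read("./page_views/", schema=PageView, mode="streaming")

# Incrementally maintained aggregate: total viewing time per user.
totals = views.groupby(pw.this.user_id).reduce(
    pw.this.user_id,
    total_ms=pw.reducers.sum(pw.this.duration_ms),
)

# Emit updates to the aggregate as new data arrives.
pw.io.csv.write(totals, "./totals.csv")

# Launch the engine.
pw.run()
```

The same dataflow definition is meant to cover both batch and streaming runs, in line with the unified streaming + batch design described above.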

Pathway is VC-funded, with amazing business angels from the AI space and industry. We have operations across Europe and in the US. We are headquartered in Paris, with significant support from the French ecosystem (BPI, Agoranov, WILCO,...).

The Team

Pathway is built by and for overachievers. Its co-founders and employees have worked in the best AI labs in the world (Microsoft Research, Google Brain, ETH Zurich), worked at Google, and graduated from top universities (Polytechnique, ENSAE, Sciences Po, HEC Paris, PhD obtained at the age of 20, etc.). Pathway’s CTO is a co-author with Geoff Hinton and Yoshua Bengio. The management team also includes the co-founder of Spoj.com (1M+ developer users) and NK.pl (13.5M+ users), and an experienced growth leader who has scaled companies with multiple exits.

The opportunity

We are searching for a person with a Data Processing or Data Engineering profile, willing to work with live client datasets, and to test, benchmark, and showcase our brand-new stream data processing technology.

The end-users of our product are mostly developers and data engineers working in a corporate environment. We expect our development framework to one day become part of their preferred development stack for analytics projects at work.

You Will

You will work closely with our CTO and Head of Product, as well as with key developers. You will be expected to:

  • Implement the flow of data from its location in clients' warehouses up to Pathway's ingress.
  • Set up CDC interfaces for change streams between client data stores and the input/output data processed by Pathway; ensure data persistence for Pathway outputs.
  • Design ETL pipelines within Pathway.
  • Contribute to benchmark framework design (throughput / latency / memory footprint; consistency), including in a distributed system setup.
  • Contribute to building open-source test frameworks for simulated streaming data scenarios on public datasets (a replay sketch follows this list).
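
As an illustration of the simulated-streaming work mentioned above, the following framework-agnostic sketch replays a public CSV dataset as a timed event stream so that throughput and end-to-end latency can be measured. The file name and columns are hypothetical examples.

```python
# Framework-agnostic sketch; file name and columns are hypothetical examples.
import csv
import json
import time


def replay(path: str, events_per_second: float):
    """Yield the rows of a CSV file as JSON events at a fixed rate,
    stamping each event with its emit time so that end-to-end latency
    can be measured downstream."""
    interval = 1.0 / events_per_second
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            row["emit_ts"] = time.time()
            yield json.dumps(row)
            time.sleep(interval)


if __name__ == "__main__":
    start = time.time()
    count = 0
    for event in replay("taxi_rides_sample.csv", events_per_second=1000):
        # In a real benchmark the event would be pushed to Kafka or fed to the
        # framework under test; here we only count events to report throughput.
        count += 1
    print(f"Replayed {count} events in {time.time() - start:.1f}s")
```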

Minimum Requirements

  • Inside-out understanding of at least one major distributed data processing framework (Spark, Dask, Ray,...)
  • 6+ months of experience working with a streaming dataflow framework (e.g., Flink, Kafka Streams or ksqlDB, Spark in streaming mode, Beam/Dataflow).
  • Ability to set up distributed dataflows independently.
  • Experience with data streams: message queues, message brokers (Kafka), CDC.
  • Working familiarity with data schema and schema versioning concepts; Avro, Protobuf, or others.
  • Familiarity with Kubernetes.
  • Familiarity with deployments in both Azure and AWS clouds.
  • Good working knowledge of Python.
  • Good working knowledge of SQL.
  • Experience working for an innovative tech company (SaaS, IT infrastructure, or similar preferred) with a long-term vision.
  • Warmly disposed towards open-source and open-core software, but pragmatic about licensing.

Bonus Points

  • Know the ways of developers in a corporate environment.
  • Passionate about trends in data.
  • Proficiency in Rust.
  • Experience with Machine Learning pipelines or MLOps.
  • Familiarity with any modern data transformation workflow tooling (dbt, Airflow, Dagster, Prefect,...)
  • Familiarity with Databricks Data Lakehouse architecture.
  • Familiarity with Snowflake's data product vision (2022+).
  • Experience in a startup environment.

Why You Should Apply

  • Intellectually stimulating work environment. Be a pioneer: you get to work with a new type of stream processing framework.
  • Work in one of the hottest data startups in France, with exciting career prospects.
  • Responsibilities and the ability to make a significant contribution to the company’s success.
  • Compensation: annual salary of €60K-€100K + Employee stock option plan.
  • Inclusive workplace culture.

Further details

  • Type of contract: Permanent employment contract.
  • Preferred joining date: early 2023.
  • Location: Remote work from home. Possibility to work or meet with other team members in one of our offices:
      • Paris – Agoranov (where Doctolib, Alan, and Criteo were born) near Saint-Placide Metro (75006).
      • Paris Area – Drahi X-Novation Center, Ecole Polytechnique, Palaiseau.
      • Wroclaw – University area.

Candidates based anywhere in the EU, United States, and Canada will be considered.

