Search results: 6 job openings
...recognition technologies.
Experience with Machine Learning systems at scale
Our stack
Our codebase is mainly written in Python on Spark; in some cases we also write Scala, Ruby, and Java.
Our servers live in AWS.
Our data is stored in S3, RDS MySQL, Redis,...
...kernel, networking, storage, to applications
Architect cloud infrastructure solutions based on Kubernetes, Kubeflow, OpenStack, and Spark
Deliver solutions either on-premises or in the public cloud (AWS, Azure, Google Cloud)
Collect customer business requirements and advise...
...data pipelines using Java within AWS infrastructure.
Implement scalable solutions for data analysis and transformation using Apache Spark and PySpark.
Utilise Airflow for efficient workflow orchestration in complex data processing tasks.
Ensure fast and interactive...
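The Spark/PySpark transformation duties above boil down to filter-and-aggregate operations over records. A minimal plain-Python sketch of that pattern (a stand-in only — a real pipeline would operate on a `pyspark.sql.DataFrame`; the column names `user_id` and `event_type` are invented for illustration):

```python
from collections import Counter

# Toy event records standing in for a Spark DataFrame; the columns
# (user_id, event_type) are hypothetical, chosen for illustration.
events = [
    {"user_id": 1, "event_type": "click"},
    {"user_id": 1, "event_type": "view"},
    {"user_id": 2, "event_type": "click"},
    {"user_id": 2, "event_type": "click"},
]

# The rough PySpark equivalent would be:
#   df.filter(df.event_type == "click").groupBy("user_id").count()
clicks_per_user = Counter(
    e["user_id"] for e in events if e["event_type"] == "click"
)
print(dict(clicks_per_user))  # {1: 1, 2: 2}
```

The same shape — select rows, group, aggregate — recurs whether the engine is Spark, Presto, or plain SQL; Spark's value is distributing it across a cluster.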
...re looking for:
~ Degree in Mathematics, Economics, Finance, Accounting or Computer Science
~3+ years of experience with Apache Spark, ideally PySpark
~4+ years of experience with SQL
~3+ years of programming experience in one of the programming languages, such...
...teamwork skills.
• Upper-intermediate English.
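The SQL experience asked for above is the bread-and-butter GROUP BY / aggregate kind. A self-contained sketch using Python's built-in sqlite3 (the `orders` table and its columns are invented for illustration):

```python
import sqlite3

# In-memory database with a toy orders table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("alice", 10.0), ("bob", 5.0), ("alice", 7.5)],
)

# Total spend per customer, highest first.
rows = conn.execute(
    "SELECT customer, SUM(amount) FROM orders "
    "GROUP BY customer ORDER BY SUM(amount) DESC"
).fetchall()
print(rows)  # [('alice', 17.5), ('bob', 5.0)]
```

The same query runs unchanged on MySQL (the RDS MySQL mentioned in one stack above) or, with minor dialect tweaks, on Presto/Trino.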
Familiarity with these technologies is a plus:
• Data engineering technologies (Apache Spark, PySpark, Airflow, and Presto)
• Experience with cloud services (preferably AWS) and containerization technologies (Docker,...
...-scale production challenges using software engineering principles, leveraging cutting-edge technologies such as Kubernetes, Trino, Spark, Kafka, Airflow, Flink, MLFlow, and more. Our infrastructure is entirely cloud-based, offering a dynamic and innovative environment...