Data Engineer (PySpark or Scala) @ Antal
Kraków
Antal
Must-have requirements:
- PySpark or Scala development and design
- Experience using scheduling tools such as Airflow
- Experience with most of the following technologies: Apache Hadoop, PySpark, Apache Spark, YARN, Hive, Python, ETL frameworks, Map…
Yesterday, via nofluffjobs.com
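For context, a minimal sketch of the kind of pipeline the requirements above imply: an Airflow DAG that schedules a nightly PySpark job. The DAG id, script path, schedule, and resource settings are illustrative assumptions, not details from the posting.

```python
# Illustrative sketch only: an Airflow DAG submitting a PySpark job on a schedule.
# All names and paths below are hypothetical examples, not from the job listing.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="nightly_events_etl",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",          # run once per day
    catchup=False,
) as dag:
    # Submit a PySpark ETL script to the cluster via the Spark connection.
    run_etl = SparkSubmitOperator(
        task_id="spark_etl",
        application="/opt/jobs/events_etl.py",  # hypothetical PySpark script path
        conn_id="spark_default",
        executor_memory="4g",
        num_executors=4,
    )
```

The operator shown requires the apache-airflow-providers-apache-spark package; an equivalent job could also be written in Scala and submitted the same way via spark-submit.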