Spark ETL Developer (Chevy Chase, MD)

Alderson Loop is looking for a Spark ETL Developer. This person is responsible for designing transformations and mapping data from source to target for jobs on the platform. They will develop ETL/ELT jobs on the Hadoop platform using Spark and other Big Data technologies.


  • 3+ years of programming experience in core Java

  • 2+ years of experience with the Hadoop ecosystem (HDFS, YARN, MapReduce, Oozie, and Hive)

  • 2+ years of Spark Core, Scala, and Spark SQL


  • Experience with ETL Tools

  • Experience with Spark Streaming

Please send your resume to (subject line: Spark ETL Developer)