
Senior Data Engineer

Association Analytics
Full-time
Hybrid or Remote
Arlington, Virginia, United States


Association Analytics is the leading analytics solution provider in the association space. Our Data Engineers (DEs) are engaged in all phases of the data management lifecycle, which includes:

  • Completing development efforts across the data pipeline to store, manage, and provision data to consumers.
  • Being an active, collaborative member of an Agile/Scrum team and following Agile/Scrum best practices.
  • Writing code to ensure the performance and reliability of data extraction and processing.
  • Supporting continuous process automation for data ingestion.
  • Achieving technical excellence by advocating for and adhering to lean-agile engineering principles and practices such as API-first design, simple design, continuous integration, version control, and automated testing.
  • Working with program management and engineers to implement and document complex and evolving requirements.
  • Collaborating with others as part of a cross-functional team.
  • Mentoring other team members in areas of expertise, ETL processes, and standards, and participating in code reviews.

This role can be based in the Arlington, VA HQ two days per week or be fully remote.




PREFERRED SKILLS AND QUALIFICATIONS

  • Candidates will be given special consideration for extensive experience in .NET Core web applications using Razor Pages and C#.
  • Bachelor's Degree in Information Technology or a related field preferred.
  • 2+ years' experience deploying applications on Amazon Web Services cloud infrastructure, including EC2, ECR, RDS, Lambda, S3, CloudFormation, Redshift, CloudTrail, and CloudWatch.
  • Proficiency in developing ETL processes and performing test and validation steps.
  • 4+ years' experience manipulating data with Python.
  • Candidates will be given special consideration for extensive experience with Python.
  • Expert-level proficiency in Python, with a strong understanding of its libraries and frameworks for data analysis and manipulation (e.g., Pandas, NumPy, SciPy).
  • Strong knowledge of big data analysis and storage tools and technologies (e.g., AWS Redshift).
  • Strong understanding of agile principles and the ability to apply them.
  • Experience developing, scheduling, and monitoring batch-oriented workflows as code (Airflow).