We are looking for an experienced Senior Data Engineer to join our growing team of data experts. As a Data Engineer at MedaLogix, you will be responsible for developing, maintaining, and optimizing our data warehouse and data pipelines. You will support multiple stakeholders, including software developers, database architects, data analysts, and data scientists, to ensure an optimal data delivery architecture. The ideal candidate has strong technical abilities for solving complex problems with data, a willingness to learn new technologies and tools as needed, and is comfortable supporting the data needs of multiple teams, stakeholders, and products.
• 7+ years of experience as a Data Engineer or in a related role, with a focus on designing and developing performant data pipelines.
• Intermediate- to expert-level knowledge of Kubernetes fundamentals (nodes, pods, services, deployments) and their interactions with the underlying infrastructure.
• 2+ years of hands-on experience with Docker, packaging applications for deployment in Kubernetes-managed distributed systems.
• 3+ years' experience with Apache Airflow for pipeline orchestration. Experience with Azure Data Factory is a plus but not required.
• Expertise in using dbt for data transformation.
• Strong programming skills in Python and SQL. Experience with Scala is a plus but not required.
• Strong experience with at least one cloud platform, such as AWS, Azure, or Google Cloud.
• Experience working with cloud data warehouse solutions such as Snowflake.
• Excellent problem-solving, communication, and organizational skills.
• Proven ability to work independently and with a team.
• Approachable, personable team player comfortable working in an Agile environment.
• Experience working with large datasets and distributed computing.
• Knowledge of electronic medical record (EMR) systems such as HCHB or MatrixCare.
• Prior experience as a Senior Data Engineer at a healthcare SaaS company.