Data Engineer, Data & Analytics Job in Ballwin, MO

at EyeCare Partners

EyeCare Partners is the nation’s leading provider of clinically integrated eye care. Our national network of over 300 ophthalmologists and 700 optometrists provides a lifetime of care to our patients with a mission to enhance vision, advance eye care and improve lives. Based in St. Louis, Missouri, over 650 ECP-affiliated practice locations provide care in 18 states and 80 markets, providing services that span the eye care continuum. For more information, visit www.eyecare-partners.com.

Data Engineer – Data & Analytics

Position Summary:  We are seeking an experienced professional to serve as the Data Engineer on our Data & Analytics team, which is deploying modern data platforms and analytics tools. The Data Engineer will be responsible and accountable for expanding and optimizing our data and data pipeline architecture, as well as optimizing data collection. This role is also responsible for planning and executing data migrations from various practice management (PM) and electronic health/medical record (EHR/EMR) systems into our proprietary PM/EMR systems as part of an overall business integration program. This role will support our software engineers, data architects, data analysts and data scientists on various enterprise data initiatives and will ensure SLA-based data delivery. The ideal candidate will be excited by the opportunity to design and build our company's data architecture to support our next generation of data and analytics solutions.

Essential Responsibilities:

Job responsibilities include but are not limited to:

  • Design, build and maintain data pipelines from various source systems into Snowflake
  • Analyze data elements from various systems, data flows, dependencies and relationships, and assist in designing conceptual, logical and physical data models
  • Design, build and maintain complex data sets designed to meet various business needs in the areas of reporting, advanced analytics and ad-hoc analysis
  • Coordinate the build and maintenance of data pipelines by third party service providers
  • Enable and execute data migrations across systems (e.g., SQL Server to Snowflake or other cloud data platforms)
  • Develop and implement scripts for data hub maintenance, monitoring and performance tuning
  • Work with data and business analysts to deploy and support a robust data quality platform
  • Work with data and business analysts to deploy and support a robust data cataloging strategy
  • Work with various business and technical stakeholders and assist with data-related technical needs and issues
  • Work with data and analytics teams and drive greater value from our data and analytics investments
  • Work closely with cross-functional teams to understand and transform business requirements into scalable and manageable solutions
  • Present solutions and options to leadership, project teams and other stakeholders adapting style to both technical and non-technical audiences
  • Ensure teams adhere to documented design and development patterns and standards
  • Proactively monitor and resolve ongoing production issues
  • Work closely with various technical teams to ensure consistency, quality of solutions and knowledge sharing across the enterprise
  • Educate organization on available and emerging tool sets
  • Ensure adherence to the approach of self-service data solutions and enable other teams with analytics solutions delivery via a ‘Data as a Service’ model

Requirements:

The requirements for this role include but are not limited to the following:

  • Bachelor’s degree with 2-3 years’ experience or Master’s degree with 1-2 years’ experience in a STEM (Science, Technology, Engineering, Math) field
  • 2+ years of hands-on experience in the design, development and implementation of data solutions
  • Advanced SQL knowledge with strong query-writing and stored-procedure skills
  • Experience with Snowflake development and support
  • Experience with object-oriented/functional scripting languages: Python, Java, Scala, etc.
  • Experience with AWS cloud services: EC2, EMR, RDS, DMS
  • Experience with relational databases such as SQL Server and object-relational databases such as PostgreSQL
  • Experience with stream-processing systems: Storm, Spark Streaming, etc.
  • Experience with data analysis, ETL, and workflow automation
  • Experience working with multiple ETL/ELT tools and cloud-based data hubs
  • Good scripting skills (PowerShell, Linux shell and/or cloud scripting tools)
  • Experience with Linux operating system
  • Good understanding of Java and related integrated development environments (IDEs) such as Eclipse
  • Demonstrated problem-solving skills
  • Demonstrated ability to think and work with a proactive mindset
  • A self-motivated personality with a passion for working in a fast-paced environment
  • Demonstrated ability to work efficiently and effectively in a fast-paced, matrixed environment, and ability to execute despite ambiguity