Michigan, Grand Rapids-Muskegon-
: Work in the Global Engineering Technologies team on modern data architecture, data governance, data pipelines, data warehousing (DW), and business intelligence (BI) solutions. The data engineer helps define and build the data pipelines that enable faster, better, data-informed decision-making within the business.
: Work closely with the Solutions Architect and Business Analysts to design, build, deploy, and operate data science, data analytics, data warehouse, and BI solutions.
: Work in a fast-moving development team using agile methodologies.
: Partner closely with Solutions Architect, BI developers and Product Managers to design and implement data models, database schemas, data structures, and processing logic to support various data science, analytics and BI workflows.
: Design and develop ETL (extract-transform-load) processes to validate and transform data, calculate metrics and attributes, and populate data models, using Informatica, Spark, SQL, and other technologies.
: Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and other 'big data' technologies.
: Lead by example, demonstrating best practices for code development and optimization, unit testing, SDLC, performance testing, QA, capacity planning, documentation, monitoring, alerting, and incident response to ensure data availability, data quality, usability, and required performance.
: Build analytics tools that use the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
: Automate data availability and quality monitoring and alerts. Respond to alerts when SLAs are not being met.
: Communicate progress to team members and management clearly and in actionable form.
: Demonstrate commitment to your professional development by participating in developer communities inside and outside the company.
: BS in Computer Science, Mathematics, or a similar field.
: Object-oriented programming skills in Python and Java, and a willingness to learn other languages (e.g., R) as needed. Strong scripting skills (Python, shell).
: SQL and query performance tuning are a must.
: Familiarity or working experience with Informatica ETL Tools.
: Functional understanding of Cloud Computing is a plus.
: 4+ years of experience integrating technical processes and business outcomes specifically: data architecture and models, data and process analysis, data quality metrics / monitoring, developing policies / standards & supporting processes.
: 4+ years of hands-on data engineering experience.
: 2+ years DevOps experience including configuration, monitoring and version control.
: Track record of working with data from multiple sources.
: Willingness to dig in, understand the data, and apply creative thinking and problem solving.
: Ability to communicate objectives, plans, status, and results clearly, focusing on the critical few key points.