Manager, Data Engineering

KPMG is currently seeking a Manager in Digital Lighthouse for our Consulting practice.

Responsibilities:

- Rapidly architect, design, prototype, implement, and optimize software systems to tackle data access, data management, and data science needs for a variety of Fortune 1000 corporations and other major organizations; develop modular code to solve real-world problems following industry best practices
- Manage the technical process from ideation to delivery: research, experiment with, and utilize leading data platforms and container methodologies to migrate and/or scale cloud data solutions; build pipelines and scalable analytic tools using leading technologies (Hadoop, Spark, Kafka, Kubernetes, Terraform, Airflow, AWS, Azure, GCP)
- Regularly conduct peer code reviews to ensure code quality; provide expert documentation and operating guidance for users of all levels; be a champion for high-quality software using TDD and other quality-control approaches
- Stay relevant and ahead of the curve on industry needs and technology trends by growing and participating in cross-disciplinary communities; lead, mentor, and teach team members; participate in developing and presenting thought leadership
- Develop engineering designs that impact business performance; translate between business and technical stakeholders; leverage technical background to lead technical work and build client relationships
- Help drive the engineering business by pursuing innovation funding and building targeted industry solutions and extendable platforms to expand our technology and solution delivery capabilities

Qualifications:

- Minimum five years of experience with programming languages (Python, Scala preferred) and the ability to pick up new languages and technologies quickly; ability to rapidly ingest, transform, engineer, and visualize data for both ad hoc and product-level (automated) data and analytics solutions; understanding of cloud and distributed systems principles (load balancing, networks, scaling, in-memory vs. disk); and experience with large-scale big data methods (Hadoop stack, cloud, Kubernetes)
- Bachelor's degree from an accredited college/university, or Master's degree from an accredited college/university with a minimum of two years of experience, or a PhD in Computer Science, Computer Engineering, Engineering, or a related field from an accredited college/university
- Architecture and integration experience with coding and testing patterns, working with existing commercial or open-source software platforms; familiarity with different architecture and development patterns (event-driven, API concepts, cloud managed services, functional programming, Lambda); ability to architect highly scalable distributed systems using different open-source tools
- Expertise with development tools and team methodologies (Agile, git, test-driven development, CI/CD release management); comfortable working in a Unix terminal and maintaining production data environments (database migrations, artifact infrastructure, and code deployment)
- Ability to work with other technical team members to assess needs, provide assistance, and resolve technical problems; understanding of engagement objectives, including a clear understanding of the client's business problem and how the engagement helps address the client's business needs
- Travel as needed
