Sr Data Engineer, Corporate

Join SADA as a Sr Data Engineer, Corporate!

Your Mission 

As a Sr Data Engineer, Corporate at SADA, you will have the opportunity to work with big data and emerging Google Cloud technologies to drive our corporate services.

You will have an opportunity to design, develop, and maintain an Enterprise Data Warehouse solution that best fits our corporate needs.

You will be interacting with all of our business units and Google Cloud subject matter experts.

Spanning business requirements, solution architecture, data modeling, ETL, metadata, and business continuity, you will have the opportunity to work collaboratively with architects and other engineers to recommend, prototype, build, and debug data infrastructure on Google Cloud Platform (GCP).

You will have an opportunity to work on real-world data problems facing our customers today.

Engagements vary from being purely consultative to requiring heavy hands-on work, and cover a diverse array of domain areas, such as data migrations, data archival and disaster recovery, and big data analytics solutions requiring batch or streaming data pipelines, data lakes and data warehouses. 

You will be expected to run point on whole projects, end-to-end, and to mentor less experienced Data Engineers.

You will be recognized as an expert within the team and will build a reputation with Google and our customers.

You will repeatedly deliver project architectures and critical components that other engineers defer to you on for lack of expertise.

You will also participate in early-stage opportunity qualification calls, as well as guide client-facing technical discussions for established projects. 

Pathway to Success 

#BeOneStepAhead: At SADA, we are in the business of change.

We are focused on leading-edge technology that is ever-evolving.

We embrace change enthusiastically and encourage agility.

This means that not only do our engineers know that change is inevitable, but they embrace this change to continuously expand their skills, preparing for future customer needs. 

Your success starts with positively impacting the direction of a fast-growing practice with vision and passion.

You will be measured quarterly by the breadth, magnitude, and quality of your contributions, your ability to estimate accurately, customer feedback at the close of projects, how well you collaborate with your peers, and the consultative polish you bring to customer interactions.  

As you continue to execute successfully, we will build a customized development plan together that takes you through the engineering or management growth tracks. 

Expectations

Internal Facing
– You will interact with internal customers and stakeholders on a regular basis, sometimes daily, other times weekly/bi-weekly.

You will be expected to capture requirements and deliver solutions that are suitable for our corporate divisions.

Training
– Ongoing, with a first-week orientation at HQ followed by a 90-day onboarding schedule.

Details of the timeline can be shared.

Due to the COVID-19 pandemic, all onboarding will be temporarily conducted remotely.

Job Requirements

Required Credentials:

  • Google Professional Data Engineer certification, or the ability to complete it within the first 45 days of employment

Required Qualifications: 

  • Mastery in the following domain area:
    • Data warehouse modernization: building complete data warehouse solutions on BigQuery, including technical architectures, star/snowflake schema designs, query optimization, ETL/ELT pipelines, and reporting/analytic tools. Must have hands-on experience with batch or streaming data processing software (such as Beam, Airflow, Hadoop, Spark, or Hive).

  • Proficiency in the following domain areas:
    • Big Data: managing Hadoop clusters (including all bundled services), troubleshooting cluster operation issues, migrating Hadoop workloads, architecting solutions on Hadoop, working with NoSQL data stores such as Cassandra and HBase, building batch/streaming ETL pipelines with frameworks such as Spark, Spark Streaming, and Apache Beam, and working with messaging systems such as Pub/Sub, Kafka, and RabbitMQ.
    • Data migration: migrating data stores to reliable and scalable cloud-based stores, including strategies for minimizing downtime. May involve conversion between relational and NoSQL data stores, in either direction.
    • Backup, restore & disaster recovery: building production-grade data backup, restore, and disaster recovery solutions at up to petabyte scale.
  • 4+ years of experience with data modeling, SQL, ETL, data warehousing, and data lakes
  • 4+ years of experience writing production-grade data solutions (relational and NoSQL) in an enterprise-class RDBMS
  • 2+ years of experience with enterprise-class Business Intelligence tools such as Looker, Power BI, and Tableau
  • Experience writing software in one or more languages such as Python, Java, R, or Go
  • Experience with systems monitoring/alerting, capacity planning and performance tuning

Useful Qualifications:

  • Experience working with Google Cloud data products (Cloud SQL, Spanner, Cloud Storage, Pub/Sub, Dataflow, Dataproc, Bigtable, BigQuery, Dataprep, Composer, etc.)
  • Experience with IoT architectures and building real-time data streaming pipelines
  • Experience operationalizing machine learning models on large datasets
  • Demonstrated leadership and self-direction; willingness to teach others and learn new techniques
  • Demonstrated skills in selecting the right statistical tools given a data analysis problem
  • Ability to balance and prioritize multiple conflicting requirements with high attention to detail
  • Excellent verbal/written communication & data presentation skills, including ability to succinctly summarize key findings and effectively communicate with both business and technical teams

About SADA

Values: We built our core values on themes that internally compel us to deliver our best to our partners, our customers, and each other.

  1. Make them rave
  2. Be data-driven
  3. Think one step ahead
  4. Drive purposeful impact
  5. Do the right thing

Ensuring a diverse and inclusive workplace where we learn from each other is core to SADA’s values.

We welcome people of different backgrounds, experiences, abilities, and perspectives.

We are an equal opportunity employer.

Work with the best: SADA has been the largest North American partner for the Google Cloud portfolio of products since 2016 and has been named the 2020, 2019, and 2018 Google Cloud Global Reseller Partner of the Year.

SADA has also been awarded Best Place to Work year after year by the Business Intelligence Group, Inc. Magazine, as well as the LA Business Journal!

Benefits: Unlimited PTO, paid parental leave, competitive and attractive compensation, performance-based bonuses, paid holidays, rich medical, dental, and vision plans, life insurance, short- and long-term disability insurance, 401K/RRSP with match, and a professional development reimbursement program, as well as Google Certified training programs.

Business Performance: SADA has been named to the Inc. 5000 Fastest-Growing Private Companies list for 15 years in a row, garnering Honoree status.

CRN has also named SADA among the Top 500 Global Solutions Providers for the past 5 years.

The overall culture continues to evolve with engineering at its core: 3200+ projects completed, 4000+ customers served, 10K+ workloads and 30M+ users migrated to the cloud.

SADA is committed to the safety of its employees and recommends that new hires receive a COVID vaccination before beginning work.

More Information on SADA


The company is located in Los Angeles, CA, San Francisco, CA, Seattle, WA, Denver, CO, Austin, TX, Chicago, IL, Atlanta, GA and New York, NY.

SADA was founded in 2000.

It has 1000 total employees.

