PROJECT DATABASE SUPPORT SERVICES (DSS) 2021

Twelve-month assignment starting 4/12/2021 to 4/11/2022. Work will initially be remote, but as we return to normal the contractor will need to work onsite in downtown Los Angeles.

Scope of Work
Our client is looking for a full-time contractor with expertise in the Data Engineering field to assist in support of the Google Cloud Platform Data Lake / Data Warehouse, data pipelines, and data analytics.

Duties to Be Performed
The contractor is expected to perform the following duties under the direction of the database section manager:
- Responsible for building and maintaining our cloud data infrastructure, including data warehouses and data pipelines
- Design, build, operationalize, secure, and monitor data processing systems, with a particular emphasis on security and compliance
- Assist in transformation of data into a format that is useful for analysis
- Create and maintain optimal data pipeline architecture
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Google Cloud Platform technologies
- Develop, implement, and maintain continuous integration and deployment for data pipelines
- Create data tools for analytics and data scientist team members
- Develop, implement, and maintain data governance
- Interface with various project team members and vendor representatives to resolve database-related software and hardware problems
- Have excellent written and verbal communications with a diverse group of people
- Perform other related duties incidental to the work described herein

Required Skills
Knowledge of and experience in:
- Programming in Python, R, or related analytical or programming languages such as SPSS, SAS, or JMP
- Google Cloud Platform services such as Cloud Storage, BigQuery, Cloud SQL, Dataproc, Cloud Run, Cloud Functions, Dataflow, Composer, and audit and monitoring tools
- Geospatial and columnar databases such as Postgres/PostGIS, Google BigQuery, etc., with a strong preference for PostGIS/Postgres along with the OpenGeo stack; experience with SQL databases and data management using Python, SQL, and Airflow is a plus
- Jupyter notebooks on a Dataproc cluster
- Terraform
- Cloud ETL tools
- Data APIs
- Strong experience using GitHub for version control and continuous integration and deployment
- General understanding of security concepts and policies
- Supporting and enhancing the existing data governance and standards
- Ability to communicate well with technical and non-technical end users to discuss changes, fixes, requirements, issues, concerns, and troubleshooting
- Strong oral communication, technical writing, presentation, and time management skills
- Strong ability in problem determination and resolution
- Strong ability to work effectively in emergency and real-time incident response situations

Education and Certification
- 4-year B.S. degree in Computer Science or related field
  o Equivalent coursework or technical training will be considered
- Certification in Google Data Engineering is a plus

Examples of Deliverables
- Project deployment plan
- Policy template
- Data governance

All qualified applicants will receive consideration for employment without regard to their race, religion, ancestry, national origin, sex, sexual orientation, age, disability, marital status, or medical condition. We will consider all applicants with criminal histories in a manner consistent with the City of Los Angeles Fair Chance Initiative.

Contact: Donna

Minimum Education Required: Bachelor's degree
Years of Experience Required: More than 5 years
Expected Travel Time: None