GCP Data Architect

Overview

LTIMindtree Limited is an Indian multinational information technology services and consulting company. A subsidiary of Larsen & Toubro, the company was incorporated in 1996 and employs more than 90,000 people.

Job Description

1) Minimum 10-15 years of IT experience, including 3 years of development on GCP projects and 5 projects implemented.

2) Possess in-depth knowledge and hands-on development experience in building distributed data solutions (including ingestion, processing, and consumption) (Must Have).

3) Experience with developing winning themes and then writing technical responses to bids (RFPs & RFIs).

4) Strong development experience with at least one distributed Big Data (batch) processing engine, preferably Spark on Dataproc or Dataflow (Must Have).

5) Strong understanding of and experience with cloud storage infrastructure and operationalizing GCP-based storage services & solutions, preferably Cloud Storage buckets or related (Must Have).

6) Strong experience with one or more MPP data warehouse platforms, preferably BigQuery, Cloud SQL, Cloud Spanner, Datastore, Firestore, or similar (Must Have).

7) Strong development experience with at least one event-driven streaming platform, preferably Pub/Sub, Kafka, or related (Must Have).

8) Strong development experience with networking on GCP (Must Have).

9) Strong data orchestration experience using tools such as Cloud Functions, Dataflow, Cloud Composer, Apache Airflow, or related (Must Have).

10) Strong development experience building data pipelines using Kubernetes (Must Have).

11) Strong development experience with IAM, KMS, and Container Registry.

12) Assess use cases for various teams within the client company, evaluate pros and cons, and justify recommended tooling and component solution options using GCP services, third-party, and open-source solutions (Must Have).

13) Strong technical communication skills and the ability to engage a variety of business and technical audiences, explaining features and metrics of Big Data technologies based on experience with previous solutions (Must Have).

14) Strong Data Cataloging experience preferably using Data Catalog (Must Have).

15) Strong understanding of and experience with Cloud Logging and Cloud Monitoring solutions (Must Have).

16) Strong understanding of one or more cluster managers and related Hadoop ecosystem tools (YARN, Hive, Pig, etc.) (Must Have).

17) Strong knowledge and understanding of CI/CD processes and tools (Must Have).

18) Interface with client project sponsors to gather, assess, and interpret client needs and requirements: advising on database performance, altering the ETL process, providing SQL transformations, discussing API integration, and deriving business and technical KPIs.

19) Develop a data model around stated use cases to capture client’s KPIs and data transformations.

20) Assess, document, and translate goals, objectives, problem statements, etc. for our offshore team and onshore management.

Skills & Requirements

GCP Data Architect with BigQuery, Dataflow, Airflow, and Java or Python.