Advance warning of scheduled maintenance

Due to scheduled system maintenance, our careers site will not be available between Friday 2nd June (11pm GMT) and Saturday 3rd June (5am GMT). During this time you will not be able to submit new applications or continue with existing applications. We apologise for any inconvenience this may cause.

Design Engineering Professional

Job Req ID:  15080
Posting Date:  26-May-2023
Function:  Data & AI
Location: 

RMZ Ecoworld, Devarabeesanahal, Bengaluru, India

Salary:  Competitive

Why this job matters

Our ability to deliver brilliant personalised experiences for all of our 30m customers is fundamental to our future success. As a business we are investing extensively in the automation of our operations and networks, and in building our base management and data capabilities. This will allow us to take better decisions with data and to act on them in an automated way across all our customer interactions. 
Data has never been more important to BT. We’re creating the best personal experiences for our customers to help them stay connected, and that mission is all underpinned by high quality, low latency data. This role will play a critical part in the creation and running of BT Consumer’s foundational data layer and the data products built on top. 
As part of a team of highly skilled data engineers, you will create and maintain our logical data models, integrating data from multiple sources into a central cloud repository. You will apply data cleansing and data standardisation rules, provide clear documentation of the business rules embedded in the system, and resolve data quality issues with your team. You will work closely with the wider team to understand what the data journey needs to look like, and with the Data Architect team to develop our products and services. You will write and maintain data engineering user documentation to provide transparency and maintain the knowledge base within the team.

What you’ll be doing

•    Applies specialist data expertise to develop, and advise on the approach for, a range of complex, high-impact data solutions and services
•    Responsible for developing data-focused cloud engineering solutions using DevOps and DataOps principles
•    Working as an individual contributor to design and build complex data pipelines on GCP. 
•    Assures data availability and quality in Consumer and across our systems
•    Helps to resolve technical problems
•    Proactively identifies potential new data sources and assesses the feasibility of ingesting them
•    Drives ongoing standardisation, simplification and efficiency of engineering processes, reviewing and acting on continuous improvement opportunities
•    Assures a high-quality, comprehensive data flow and manages a team that provides a consumption layer giving the business access to all the data it needs
•    Ensures all data is compliant
•    Ensures that all data acquired is fully described/understood and communicated using appropriate tools
•    Productionises any tactical data feeds, including documentation

You'll have the following skills & experience

Essential skills & experience
• Deep technical knowledge of complex (and simple) data architectures, covering all aspects (compliance, risk, security) of our requirements
• Detailed knowledge of the concepts and principles of data engineering
• Sound awareness of Data Management best practice, including data lifecycle management
• Extensive skills in SQL, both at production grade and at analytical level, gained through intensive application in a commercial business environment
• Experience of building large-scale data pipelines on at least one cloud platform (GCP or AWS preferred)
• Experienced in deploying data solutions and cloud infrastructure via CI/CD pipelines
• Experienced in deploying Infrastructure as Code (Terraform, CloudFormation, etc.)
• Solid Python data and development experience with major data-enabling SDKs and packages
• Automated testing in the Python ecosystem
• Extensive experience with PySpark, Pandas, and GCP-native services such as Cloud Functions, Cloud Run, Dataflow, and BigQuery, among others
• Excellent oral and written communication skills for all levels of an organisation
• Collaborates across teams to identify how work activities are related and to highlight inefficiencies, helping to remove barriers and find the resources or support needed to improve processes
• Some knowledge of cloud computing patterns, workflows and services, and how they relate to a big data platform

Desirable skills & experience
• Knowledge of REST/Graph APIs and how they can be used in a data environment.
• Can deliver complex big data solutions with structured and unstructured data
• Background in machine learning and software engineering, with frameworks such as TensorFlow or Keras
• Knowledge of Docker/Kubernetes, and how these can be used to simplify deployments.

About us

BT is part of BT Group, along with EE, Openreach, and Plusnet.


Millions of people rely on us every day to help them live their lives, power their businesses, and keep their public services running. We connect friends to family, clients to colleagues, people to possibilities. We keep the wheels of business spinning, and the emergency services responding. 


We value diversity and celebrate difference. As Philip Jansen, our CEO, says ‘We embed diversity and inclusion into everything that we do. It’s fundamental to our purpose: we connect for good.’


We all stick to the same values: Personal, Simple, and Brilliant. From day one, you'll get stuck into tough challenges, pitch in with ideas, and make things happen. But you won't be alone: we'll be there with help and support, learning and development.


This is your chance to make a real difference to the world: to be part of the digital transformation of countless lives and businesses. Grab it.