Thomson Reuters Off Campus 2024 Eligibility Criteria:
- Experience: 3-6 years in Data Engineering.
- Ability to translate business requirements to technical data requirements.
- Advanced knowledge of tools like dbt and Fivetran.
- Minimum 1 year of experience with Snowflake, along with a Snowflake certification.
- Knowledge of Spark and Python.
- Expert knowledge of SQL and NoSQL.
- Experience with data modeling (relational, star, and snowflake schemas).
- Experience with data preparation, transformation, and processing techniques at scale (a sketch follows this list).
- Experience tuning and automating data pipelines that handle large volumes of data (>250M records).
- Hands-on knowledge of AWS tools (Glue, Lambda, EMR, etc.).
- Experience with data analysis and visualization tools such as Tableau or Power BI.
- 1 year of experience with machine learning operations (MLOps) in the AWS environment.
- Experience building POCs using Generative AI techniques applicable to Data Engineering.
- Bachelor’s Degree from a reputable institution (Computer Science preferred).
- Knowledge of Machine Learning.
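To give a flavor of the Spark, Python, and star-schema skills listed above, here is a minimal PySpark sketch of a preparation-and-transformation step. The bucket paths, table names, and columns (raw_orders, dim_customer, fct_orders) are hypothetical placeholders, not details of the actual role.

```python
# Minimal PySpark sketch: load raw records, join a dimension, and write a
# partitioned fact table -- the kind of star-schema transformation the
# criteria above describe. All paths, tables, and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_fact_build").getOrCreate()

# Source data (system of record) -- assumed here to live in S3 as Parquet.
raw_orders = spark.read.parquet("s3://example-bucket/raw/orders/")
dim_customer = spark.read.parquet("s3://example-bucket/warehouse/dim_customer/")

# Prepare and transform: fix types, derive a measure, and resolve the
# customer surrogate key by joining against the dimension table.
fct_orders = (
    raw_orders
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("net_amount", F.col("gross_amount") - F.col("discount"))
    .join(dim_customer.select("customer_id", "customer_sk"), "customer_id")
    .select("order_id", "customer_sk", "order_ts", "net_amount")
)

# Partition the fact table by date so downstream queries over large
# volumes (hundreds of millions of records) can prune efficiently.
(fct_orders
    .withColumn("order_date", F.to_date("order_ts"))
    .write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/warehouse/fct_orders/"))
```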
Job Description:
Do you want to be part of the Data & Analytics organization? Its strategic goal is to create a world-class Data & Analytics company by building, embedding, and maturing a data-driven culture across Thomson Reuters.
We are looking for a highly motivated individual with strong organizational and technical skills for the position of Data Engineer. You will play a critical role in ensuring that data is effectively managed and utilized to drive business insights, facilitate informed decision-making, and help Thomson Reuters rapidly scale data-driven initiatives.
About The Role
In this opportunity as Data Engineer, you will:
- Engage with stakeholders, business analysts, and the project team to understand the data requirements.
- Source, prepare, and move large volumes of data at scale for use by the project team.
- Identify the sources and destinations of data, and determine the appropriate data transformations required.
- Ensure that the appropriate systems of record are being used for source data.
- Choose the appropriate data movement mechanism and build ETLs and/or data pipelines (a sketch follows this list).
- Automate processes and ensure the appropriate backups and high availability mechanisms exist.
- Ensure the appropriate alerting mechanisms exist in the event of failure.
- Tune the performance of the data movement mechanism so that it meets business operational needs.
- Design target data models if none exist at the destination, drawing on experience with relational schema design and star and snowflake schemas.
- Perform platform engineering and architecture work to support machine learning operations when required.
- Assist with data queries, data analysis, and dashboarding using tools such as Tableau and Power BI.
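As a rough illustration of the pipeline responsibilities above (building an ETL step, automating it, and alerting on failure), the following Python sketch wires an extract-transform-load run to a failure notification. The extract/transform/load bodies and the SNS topic ARN are hypothetical placeholders, assuming an AWS environment as described in the eligibility criteria.

```python
# Minimal sketch of an ETL run wrapper with failure alerting. The
# extract/transform/load steps and the SNS topic ARN are placeholders.
import logging

import boto3

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("orders_pipeline")

ALERT_TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:pipeline-alerts"  # placeholder


def extract() -> list[dict]:
    # Placeholder: pull records from the appropriate system of record.
    return [{"order_id": 1, "gross_amount": 100.0, "discount": 5.0}]


def transform(rows: list[dict]) -> list[dict]:
    # Placeholder: derive the measures needed by the target data model.
    return [{**r, "net_amount": r["gross_amount"] - r["discount"]} for r in rows]


def load(rows: list[dict]) -> None:
    # Placeholder: write to the destination (e.g., a warehouse fact table).
    log.info("loaded %d rows", len(rows))


def run_pipeline() -> None:
    try:
        load(transform(extract()))
        log.info("pipeline succeeded")
    except Exception:
        log.exception("pipeline failed; sending alert")
        # Alerting on failure -- here via an SNS notification.
        boto3.client("sns").publish(
            TopicArn=ALERT_TOPIC_ARN,
            Subject="orders_pipeline failure",
            Message="The orders pipeline failed; check the logs.",
        )
        raise


if __name__ == "__main__":
    run_pipeline()
```

In a scheduled setting, run_pipeline would typically be invoked by an orchestrator (for example AWS Glue or a cron-style scheduler), so the re-raised exception marks the run as failed while the alert reaches the on-call engineer.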