Cognizant’s Vacancies for Data Engineer
Are you a data enthusiast with a knack for AWS and PySpark development? Cognizant, a global leader in technology consulting and digital solutions, has fantastic news for professionals like you! They currently have multiple openings for the position of Data Engineer – AWS PySpark Developer, welcoming applicants with 3 to 9+ years of experience in this domain.
| Title | Data Engineer – AWS PySpark Developer |
| Experience | 3 to 9 years |
| Location | Gurgaon/Noida, India |
Why Choose Cognizant?
- Cutting-Edge Projects: Cognizant is at the forefront of digital transformation, working on projects that leverage advanced technologies like AWS, PySpark, and more. As a Data Engineer, you’ll have the opportunity to contribute to innovative solutions that drive business outcomes.
- Continuous Learning: Cognizant values skill development and offers a range of learning opportunities, including training programs, certifications, and hands-on experiences with emerging technologies.
- Global Exposure: With a global footprint and a diverse client base, Cognizant provides exposure to a wide range of industries and geographies, enriching your professional experience.
- Collaborative Culture: At Cognizant, collaboration is key. You’ll work in teams comprising talented individuals from diverse backgrounds, fostering creativity, teamwork, and mutual growth.
Skill requirements:
- 3 to 9 years of relevant experience in a Data Engineer role.
- Excellent hands-on experience with AWS data technologies such as Glue, S3, Athena, EMR, and IAM.
- Mandatory: hands-on experience in Python and PySpark. While Python is usable for practically anything, this role specifically requires application development, Extract/Transform/Load (ETL), and data lake curation experience using Python.
- Hands-on experience with version control tools such as Git.
- Experience with AWS analytics services such as Amazon Athena, DynamoDB, and AWS Glue.
- Experience with AWS compute services such as AWS Lambda and Amazon EC2, and storage services such as Amazon S3, among others.
- Experience with or knowledge of Bash/shell scripting is a plus.
- Experience building ETL processes that ingest, copy, and structurally transform data across a wide variety of formats, including CSV, fixed-width, XML, and JSON (see the sketch after this list).
- Experience with columnar storage formats such as Parquet, Avro, and ORC.
- Hands-on experience with tools such as Jenkins to build, test, and deploy applications.
- Excellent debugging skills.
- Ability to quickly perform critical analysis and use creative approaches for solving complex problems.
- Strong academic background.
- Excellent written and verbal communication skills, and strong relationship building skills.
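To illustrate the kind of work described above, here is a minimal PySpark ETL sketch: it reads raw CSV and JSON data from S3, applies a simple structural transformation, and writes curated Parquet into a data lake path. The bucket names, paths, and column names are hypothetical placeholders, not part of the job description.

```python
# Minimal PySpark ETL sketch: read raw CSV/JSON from S3, apply a simple
# structural transformation, and write curated Parquet to a data lake path.
# All bucket names, paths, and column names below are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-csv-to-parquet").getOrCreate()

# Read raw CSV landed in the "raw" zone of the data lake.
orders = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3://example-raw-bucket/orders/")  # hypothetical path
)

# Read reference data that arrives as JSON.
customers = spark.read.json("s3://example-raw-bucket/customers/")  # hypothetical path

# Structural transformation: join, derive a column, and standardise types.
curated = (
    orders.join(customers, on="customer_id", how="left")
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("order_total", F.col("order_total").cast("double"))
)

# Write columnar Parquet, partitioned by date, into the "curated" zone.
(
    curated.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/orders/")  # hypothetical path
)

spark.stop()
```

In practice, a job like this would typically run on AWS Glue or EMR, be versioned in Git, and be built and deployed through a CI tool such as Jenkins, in line with the skills listed above.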
Duties and Responsibilities
- Maintain knowledge of evolving industry trends, practices, techniques, and standards in Big Data, with a focus on AWS cloud technologies and Python/PySpark from an application development perspective.
- Be aware of DevOps practices and follow appropriate steps and tools to effect code migration during the development phase.
- Collaborate with cross-functional teams on the implementation of deliverables in all environments.
- Develop code and designs to meet evolving needs while adhering to policies and standards.
- Write unit and functional test cases as the application is developed (see the test sketch after this list).
- Conduct and participate in coding and design reviews.
- Manage the test plan and risks from the development phases through implementation, ensuring no defects are introduced into the production environment.
- Ensure required documentation for projects and/or enhancements is created and updated.
- Monitor and support production applications to ensure optimal stability and performance before handing the code over to the operations team.
- Understand data warehousing concepts and fact/dimension-based modelling approaches for implementation in a data lake environment.
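As a small illustration of the unit-testing responsibility mentioned above, here is a minimal pytest sketch for a PySpark transformation using a local SparkSession. The function and column names are hypothetical placeholders.

```python
# Minimal unit-test sketch for a PySpark transformation, using pytest and a
# local SparkSession. The function and column names are hypothetical.
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def add_order_total(df):
    """Hypothetical transformation: order_total = quantity * unit_price."""
    return df.withColumn("order_total", F.col("quantity") * F.col("unit_price"))


@pytest.fixture(scope="module")
def spark():
    session = SparkSession.builder.master("local[2]").appName("unit-tests").getOrCreate()
    yield session
    session.stop()


def test_add_order_total(spark):
    df = spark.createDataFrame(
        [(2, 10.0), (3, 5.0)],
        ["quantity", "unit_price"],
    )
    result = add_order_total(df).collect()
    assert [row.order_total for row in result] == [20.0, 15.0]
```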
How to Apply:
If you’re ready to take on exciting challenges in data engineering and AWS PySpark development, don’t miss this opportunity to join Cognizant’s dynamic team! You can apply directly through Cognizant’s Careers portal using the following link: Cognizant Careers – Data Engineer – AWS PySpark Developer Vacancy.
Seize the opportunity to work on impactful projects, enhance your skills, and be part of a global leader in technology and consulting. Apply now and embark on a rewarding journey with Cognizant!
For more job searches, visit the official careerflock page.