Mid Data Engineer with Python & Snowflake (Jefferies) – Pune
This job is with Capco, an inclusive employer and a member of myGwork – the largest global platform for the LGBTQ+ business community.
Please do not contact the recruiter directly.
Job Title:
Data Engineer with Python & Snowflake – Pune
About Us
Capco, a Wipro company, is a global technology and management consulting firm.
Capco was awarded Consultancy of the Year at the British Bank Awards and has been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount.
With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors.
We are recognized for our deep transformation execution and delivery.
WHY JOIN CAPCO?
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry; these are projects that will transform the financial services industry.
MAKE AN IMPACT
We bring innovative thinking, delivery excellence, and thought leadership to help our clients transform their business.
Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.
#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.
CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.
DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.
Role Description:
Key Skills: Data Engineering, Python, Snowflake, AWS, Git/Bitbucket
Experience: 9+ years
Location: Hinjewadi, Pune
Shift timings: 12:30 PM – 9:30 PM
3 days work-from-office (Tue, Wed, Thu)
Technical Requirements
Job Description: Python & Snowflake Engineer with AI/Cortex Development
4+ years of experience developing data engineering and data science projects using the Snowflake AI Data Cloud platform on AWS.
Snowpark experience preferred.
Experience with different data modeling techniques is required.
4+ years of experience with Python development.
Experience with tools such as VS Code or Anaconda, version control using Git or Bitbucket, and Python unit-testing frameworks.
Experience building Snowflake applications on the Snowflake Cortex AI platform (specifically Cortex Agents, Cortex Search, and Cortex LLM functions, with an understanding of context enrichment using prompts or Retrieval-Augmented Generation methods).
Deep understanding of object-oriented programming in Python and data structures such as pandas DataFrames, with the ability to write clean, maintainable engineering code.
Understanding of multi-threading concepts and concurrency implementation in server-side Python custom modules.
Experience implementing object-relational mapping (ORM) in Python using frameworks such as SQLAlchemy or equivalent.
Proficient in developing and deploying Python applications, such as AWS Lambda functions, on the AWS Cloud platform.
Proficient in deploying web applications on AWS using Docker containers or Kubernetes, with experience using CI/CD pipelines.
Proficient in developing applications with Snowpipe and Snowpark, moving data from cloud sources such as AWS S3, and handling unstructured data from data lakes.
Strong knowledge of Snowflake account hierarchy models and account-role-permission strategies.
Strong knowledge of data sharing, preferably using the internal Data Marketplace and Data Exchanges for various listings.
Strong grasp of data governance and security concepts within Snowflake, including row- and column-level dynamic data masking using Snowflake tags.
Good understanding of input query enrichment using Snowflake YAML files and integration with LLMs within Snowflake.
Good understanding of relevance search and of building custom interactive applications with LLMs.
Nice to have: experience building Snowflake native applications using Streamlit and deploying them to AWS instances (EC2 or Docker containers).
Continuously improves functionality through experimentation, performance tuning, and customer feedback.
Nice to have: experience implementing application-level caching within Python web applications.
Nice to have: experience with DuckDB and Apache Arrow.
Nice to have: experience implementing CI/CD pipelines for Snowflake applications.
Strong analytical and problem-solving skills, with the ability to communicate technical concepts clearly.
Experience using Agile and Scrum methodologies, preferably with JIRA.
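To illustrate the ORM requirement above, here is a minimal SQLAlchemy sketch. The Trade model, its fields, and the in-memory SQLite database are hypothetical choices for this example only; a real deployment at Capco would more likely target Snowflake (for instance via the snowflake-sqlalchemy dialect).

```python
# Minimal ORM sketch with SQLAlchemy. All names below (Trade, trades,
# symbol, quantity, price) are invented for illustration; they are not
# taken from the job posting.
from sqlalchemy import Column, Float, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Trade(Base):
    """A Python class mapped to a relational table by the ORM."""
    __tablename__ = "trades"
    id = Column(Integer, primary_key=True)
    symbol = Column(String(10), nullable=False)
    quantity = Column(Integer, nullable=False)
    price = Column(Float, nullable=False)

# An in-memory SQLite engine keeps the sketch self-contained; swapping
# the connection URL is all that changes for another backend.
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)

session = Session()
session.add(Trade(symbol="AAPL", quantity=100, price=189.5))
session.commit()

# Query through the mapped class rather than raw SQL.
total = session.query(Trade).count()
first = session.query(Trade).filter_by(symbol="AAPL").one()
```

The point of the pattern is that rows are created and queried as Python objects, so persistence logic stays in one mapped class instead of being scattered across hand-written SQL strings.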
If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth.
For more information, visit www.capco.com.
Follow us on Twitter, Facebook, LinkedIn, and YouTube.