

Databricks Platform Administrator



Job description

Hi,

Greetings of the day

As we discussed the Databricks Platform Administrator position, I am sharing the job description with you.

About us - Coders Brain is a global leader in IT services, digital, and business solutions, partnering with its clients to simplify, strengthen, and transform their businesses.

We ensure the highest levels of certainty and satisfaction through a deep-set commitment to our clients, comprehensive industry expertise and a global network of innovation and delivery centers.

We owe our success to how seamlessly we integrate with our clients.

  • Quick Implementation - We offer quick implementation for newly onboarded clients.

  • Experienced Team - We've built an elite and diverse team that brings its unique blend of talent, expertise, and experience to make you more successful, ensuring our services are uniquely customized to your specific needs.

  • One Stop Solution - Coders Brain provides end-to-end solutions for businesses at an affordable price, with uninterrupted and effortless services.

  • Ease of Use - All of our products are user-friendly and scalable across multiple platforms.

    Our dedicated team at Coders Brain implements them with the interests of the enterprise and its users in mind.

  • Secure - We understand and treat your security with the utmost importance.

    Hence we blend security and scalability into our implementation, considering the long-term impact on business benefit.

Position - Databricks Platform Administrator
Experience - 12+ Yrs
Notice Period - Immediate joiners to 10 days
Location - Remote

Payroll with CodersBrain

Also, kindly confirm that once you are selected, you will join the company immediately and give 8-9 hours daily, as discussed.

Job Summary: We are seeking a skilled Databricks Platform Administrator / Engineer with proven experience supporting modern data platforms in a Property & Casualty (P&C) insurance setting.

The ideal candidate will be responsible for managing, configuring, optimizing, and maintaining the Databricks environment to support data engineering, analytics, and machine learning workloads in a secure and compliant manner.

Education: Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.

Required Qualifications:

  • 10+ years of experience in data platform engineering or system administration.

  • 2+ years of hands-on experience with Databricks administration on Azure.

  • Solid understanding of Apache Spark, Delta Lake, and Lakehouse architecture.

  • Proficiency in scripting and automation using Python, Bash, PowerShell, or Terraform (see the sketch after this list).

  • Familiarity with data security, audit, and compliance frameworks relevant to P&C insurance (e.g., GDPR, NYDFS, SOC 2).

  • Experience with Azure Data Services, ADLS, Key Vault, and Azure DevOps.

  • Strong understanding of the P&C insurance data ecosystem (policy, claims, underwriting, actuarial, etc.).
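
Since the role stresses scripting and automation around Databricks on Azure, here is a minimal illustrative Python sketch of that kind of task: calling the workspace REST API to list clusters and flag any with auto-termination disabled. It assumes a workspace URL and personal access token in the DATABRICKS_HOST and DATABRICKS_TOKEN environment variables; the specific cost check is only an example, not a requirement from this posting.

    # Minimal sketch: list clusters in a Databricks workspace via the REST API
    # and flag any with auto-termination disabled (a common cost check).
    # Assumes DATABRICKS_HOST (e.g. https://adb-<id>.azuredatabricks.net) and a
    # DATABRICKS_TOKEN personal access token are set in the environment.
    import os
    import requests

    host = os.environ["DATABRICKS_HOST"].rstrip("/")
    headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

    resp = requests.get(f"{host}/api/2.0/clusters/list", headers=headers, timeout=30)
    resp.raise_for_status()

    for cluster in resp.json().get("clusters", []):
        name = cluster.get("cluster_name")
        state = cluster.get("state")
        auto_term = cluster.get("autotermination_minutes", 0)
        note = "  <-- no auto-termination" if auto_term == 0 else ""
        print(f"{name}: state={state}, autotermination_minutes={auto_term}{note}")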

Preferred Qualifications:

  • Databricks Certified Associate / Professional (Administrator / Data Engineer) certification.

  • Experience integrating Databricks with core systems like Guidewire, Duck Creek, or custom P&C platforms.

  • Knowledge of industry standards like ACORD data models and ISO data feeds.

  • Exposure to tools like Unity Catalog, Purview, Collibra, Informatica, or similar.

Soft Skills:

  • Strong problem-solving and analytical skills.

  • Ability to work cross-functionally in an agile team environment.

  • Excellent communication and stakeholder management skills.

  • Experience working in regulated enterprise environments with strong data governance standards.

Key Responsibilities:

  • Platform Administration & Management:

  • Manage and administer Databricks workspaces, clusters, jobs, libraries, and notebooks.

  • Implement access controls using Unity Catalog, SCIM, and workspace-level RBAC.

  • Configure and monitor autoscaling, cluster policies, and compute usage to optimize cost and performance (a brief policy sketch follows the responsibilities list).

  • Support platform upgrades, patching, and operational automation.

  • Security, Compliance & Governance:

  • Enforce security policies aligned with P&C insurance industry regulations.

  • Integrate with enterprise identity providers (Azure AD, Okta) and manage SSO.

  • Support data governance using tools like Unity Catalog, Immuta, or Collibra.

  • DevOps & Automation:

  • Build CI/CD pipelines for notebooks, jobs, and ML workflows using tools like Azure DevOps, GitHub Actions, or Jenkins.

  • Automate platform provisioning and configurations using Infrastructure-as-Code (Terraform, ARM, etc.).

  • Monitoring & Troubleshooting:

  • Set up logging, monitoring, and alerting using tools like Azure Monitor, Datadog, or Splunk.

  • Troubleshoot performance issues, job failures, and cluster errors.

  • Collaboration & Enablement:

  • Work closely with Data Engineers, Data Scientists, and Business Analysts to ensure optimal platform usage.

  • Develop platform usage guidelines, onboarding documentation, and best practices.

  • Provide technical support and training to users.
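
The administration items above mention cluster policies and compute-cost controls. As a rough illustration only, the Python sketch below creates a simple cluster policy through the Databricks Cluster Policies REST API, capping autoscaling and enforcing auto-termination. The policy name, limits, and Azure node types are hypothetical, and DATABRICKS_HOST / DATABRICKS_TOKEN are again assumed to be set in the environment.

    # Minimal sketch: create a cluster policy that caps worker count and enforces
    # auto-termination, via the Cluster Policies REST API.
    # The policy name, limits, and node types below are illustrative placeholders.
    import json
    import os
    import requests

    host = os.environ["DATABRICKS_HOST"].rstrip("/")
    headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

    # Each attribute path maps to a constraint applied when clusters are created.
    definition = {
        "autotermination_minutes": {"type": "range", "maxValue": 60, "defaultValue": 30},
        "autoscale.max_workers": {"type": "range", "maxValue": 8},
        "node_type_id": {"type": "allowlist", "values": ["Standard_DS3_v2", "Standard_DS4_v2"]},
    }

    resp = requests.post(
        f"{host}/api/2.0/policies/clusters/create",
        headers=headers,
        json={"name": "cost-controlled-jobs", "definition": json.dumps(definition)},
        timeout=30,
    )
    resp.raise_for_status()
    print("Created policy:", resp.json().get("policy_id"))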




