Location: Austin, Texas (Hybrid)
Local candidates only
Contract
Key Responsibilities
Systems Analysis & Solution Design
- Analyze business objectives, workflows, and technical requirements
- Conduct feasibility studies and cost/benefit analyses
- Develop detailed functional and technical specifications
- Evaluate system capabilities and recommend enhancements
Databricks Administration
- Administer and support Databricks workspaces in AWS environments
- Configure and manage clusters, job scheduling, and workspace settings
- Implement and enforce cluster policies and governance standards
- Monitor platform performance, availability, and reliability
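The cluster-policy and governance duties above are typically expressed as a JSON policy definition applied to the workspace. A minimal Python sketch, assuming illustrative rule values (the instance types, limits, and tag below are placeholders, not a specific production policy):

```python
import json

# Hypothetical cluster policy enforcing governance defaults.
# Databricks cluster policies are JSON documents that map cluster
# attributes to rules such as "fixed", "range", or "allowlist".
cluster_policy = {
    # Force auto-termination so idle clusters do not accrue cost
    "autotermination_minutes": {"type": "range", "minValue": 10, "maxValue": 120},
    # Restrict workers to approved instance types (illustrative values)
    "node_type_id": {"type": "allowlist", "values": ["i3.xlarge", "i3.2xlarge"]},
    # Cap cluster size
    "num_workers": {"type": "range", "maxValue": 8},
    # Require a cost-attribution tag on every cluster
    "custom_tags.team": {"type": "fixed", "value": "data-platform"},
}

# The policy is submitted to the workspace as a JSON string,
# e.g. via the Cluster Policies API or Terraform.
policy_json = json.dumps(cluster_policy, indent=2)
print(policy_json)
```

Enforcing limits this way keeps cost and sizing decisions out of individual users' hands, which is the point of the governance standards named above.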
Security & Access Management
- Manage user roles, permissions, and access controls (IAM, SCIM, RBAC)
- Ensure compliance with data security, encryption, and governance policies
- Support Unity Catalog for data governance (preferred)
Data Platform Integration
- Integrate Databricks with cloud storage services (e.g., Amazon S3)
- Support data lake/lakehouse architecture implementations
- Enable analytics, reporting, and AI/ML workloads
Performance Optimization & Monitoring
- Optimize Apache Spark workloads and troubleshoot performance issues
- Monitor system health and implement improvements
- Manage cost optimization strategies for Databricks usage
DevOps & Automation
- Utilize tools such as Terraform and CI/CD pipelines
- Automate deployment, configuration, and monitoring processes
- Support scripting using Python, SQL, or Scala
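Automating deployment in this stack usually comes down to generating and submitting API payloads from scripts. A minimal sketch of assembling a Databricks Jobs API 2.1 style create-job request in Python — the job name, notebook path, cluster ID, and cron expression are hypothetical placeholders:

```python
import json

def build_job_payload(job_name: str, notebook_path: str, cron: str) -> dict:
    """Assemble a Jobs API 2.1 style create-job payload.

    Field names follow the Jobs 2.1 request shape (name, tasks,
    schedule); the specific values are illustrative assumptions.
    """
    return {
        "name": job_name,
        "tasks": [
            {
                "task_key": "main",
                "notebook_task": {"notebook_path": notebook_path},
                # Reference an existing cluster or a new_cluster spec;
                # this cluster ID is a placeholder.
                "existing_cluster_id": "REPLACE-WITH-CLUSTER-ID",
            }
        ],
        "schedule": {
            "quartz_cron_expression": cron,
            "timezone_id": "America/Chicago",
        },
    }

payload = build_job_payload(
    job_name="nightly-etl",                       # hypothetical job
    notebook_path="/Repos/platform/etl/nightly",  # hypothetical path
    cron="0 0 2 * * ?",                           # 2:00 AM daily
)
print(json.dumps(payload, indent=2))
```

In practice the same payload shape is managed declaratively through the Terraform provider or pushed through a CI/CD pipeline rather than hand-submitted.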
III. Required Skills & Qualifications
Minimum Requirements (8+ Years)
- Experience administering Databricks on AWS
- Strong expertise in cluster configuration, job scheduling, and workspace management
- Experience with IAM, SCIM, and RBAC access control models
- Proficiency in Apache Spark (performance tuning & troubleshooting)
- Experience integrating Databricks with cloud storage (e.g., S3)
- Knowledge of cluster policies and governance standards
- Experience with Databricks SQL, notebooks, and orchestration
- Strong understanding of data security, encryption, and compliance
- Experience monitoring platform performance and availability
- Hands-on experience with DevOps tools (Terraform, CI/CD, scripting)
Preferred Qualifications (4+ Years)
- Experience in enterprise or government environments
- Familiarity with Databricks Unity Catalog
- Knowledge of cost optimization strategies
- Experience supporting AI/ML workloads (MLflow, Databricks ML)
- Understanding of data lake/lakehouse architectures
- Programming experience in Python, SQL, or Scala
IV. Level Description
- 8+ years of experience required
- Performs complex tasks independently
- Demonstrates advanced analytical and problem-solving skills
- Exercises creativity and sound judgment in solution delivery
Thanks & Regards
Mohammad Faisal