Job Description
- Administering and configuring the Databricks workspace using Terraform, the CLI, and the SDK
- Managing environment settings (clusters, libraries, compute policies, access permissions)
- Defining and maintaining catalogs, schemas, and tables using Unity Catalog
- Ensuring platform security, scalability, and high availability
- Supporting the implementation of the Data Mesh concept across domains
- Providing technical support and advisory to data analysts, data scientists, and business users
- Leading onboarding, training, and enablement activities to support platform adoption
- Creating and maintaining technical documentation, participating in agile development and sprint planning
- Working in a hybrid model, with 2–3 days per week onsite in Prague
Requirements
- Minimum 5 years of experience with development and administration of cloud-based big data platforms (Azure / GCP)
- At least a Bachelor’s degree in Computer Science or a related field
- Active English proficiency at minimum B2 level (daily communication in English)
Required technical skills:
- Strong knowledge of Databricks, Delta Lake, and Spark environments
- Experience with tools such as Apache Airflow, Azure Data Factory, or Apache Beam
- Advanced knowledge of Terraform, Python, and CI/CD tools (e.g., GitHub Actions)
- Strong understanding of data security, governance, monitoring, and platform management
- Excellent communication and coordination skills within an international team
Nice to have:
- Experience with technologies such as Kafka, Flink, or similar frameworks
- Previous experience working in Agile methodologies
- Experience in the financial markets domain

