Role: Databricks Architect (architects only, please; not developers or engineers)
Location: Remote
Rate: $50/hr on W2 and $63/hr on C2C
Skills Needed
• 8+ years of experience in data engineering, with at least 3 years on Databricks.
• Strong proficiency in PySpark, SQL, and Delta Lake.
• Hands-on experience with GCP Dataproc.
Responsibilities
• **Administration**:
• Lead the installation and configuration of Databricks on the GCP cloud platform.
• Monitor platform health, performance, and cost optimization.
• Implement governance, logging, and auditing mechanisms.
• **Development / Enhancements**:
• Design and develop scalable ETL/ELT pipelines using PySpark, SQL, and Delta Lake.
• Collaborate with data engineers and analysts to enhance data workflows and models.
• Optimize existing notebooks and jobs for performance and reliability.
• **Operations, Support & Troubleshooting**:
• Provide L2/L3 support for Databricks-related issues and incidents.
• Troubleshoot cluster failures, job errors, and performance bottlenecks.
• Maintain technical documentation for platform setup, operations, and development standards.
Best Regards,
Rahul Thakur
Phone: 760-349-0078
Email: rahul@dantatechnologies.net
Note: All qualified applicants will receive consideration for employment without regard to race, color, religion, religious creed, sex, national origin, ancestry, age, physical or mental disability, medical condition, genetic information, military and veteran status, marital status, pregnancy, gender, gender expression, gender identity, sexual orientation, or any other characteristic protected by local law, regulation, or ordinance.
Benefits: Danta offers all W2 employees a compensation package that is competitive in the industry. It consists of competitive pay, the option to elect healthcare insurance (dental, medical, vision), major holidays, and paid sick leave as per state law.
The rate/salary range depends on numerous factors, including qualifications, experience, and location.