
Big Data Platform Engineer / Infrastructure Engineer

Chennai, Bangalore, Hyderabad
Job Description
Position Overview:
Tiger Analytics is seeking a highly skilled and experienced Cloud and Big Data Platform Architect to join our team. The candidate will be responsible for designing, implementing, and optimizing data and analytics solutions, with a strong emphasis on infrastructure and platform strategies. The candidate will also collaborate with cross-functional teams, including developers, data engineers, and business stakeholders, to create robust and scalable platforms. The incumbent's expertise in Databricks, big data technologies, and cloud infrastructure will be crucial in driving successful projects.
Key Responsibilities:
  1. Infrastructure and Platform Strategy:
    • Develop and drive comprehensive infrastructure and platform strategies specifically tailored for big data solutions using Databricks and AWS cloud environments.
    • Design scalable and resilient architectures that optimize compute and storage resources for efficient processing and storage of big data.
  2. Compute and Storage Optimization:
    • Implement best practices and architectural patterns to optimize compute and storage resources, ensuring high performance and cost-effectiveness of big data solutions.
    • Conduct performance analysis and capacity planning to scale compute and storage resources based on workload demands and growth projections.
  3. Cost Optimization:
    • Design cost-effective solutions by leveraging cloud-native services and optimizing resource utilization, effectively managing cloud spend while maintaining performance and scalability.
    • Implement cost monitoring and governance mechanisms to track and optimize cloud expenses, identifying opportunities for cost savings and efficiency improvements.
  4. Security Standards and Strategy:
    • Define and enforce robust security standards and strategies for Databricks and AWS environments, ensuring compliance with industry regulations and internal policies.
    • Implement security controls and mechanisms to protect data at rest and in transit, including encryption, access controls, and network security configurations.
  5. Solution Architecture and Design:
    • Lead the design and architecture of complex big data solutions, incorporating Databricks and AWS services to meet business requirements while adhering to infrastructure and platform best practices.
    • Collaborate with cross-functional teams to define technical requirements, assess solution feasibility, and propose architectural recommendations.
  6. Technical Leadership and Guidance:
    • Provide technical leadership and guidance to project teams, mentoring and coaching team members on infrastructure and platform design principles and best practices.
    • Collaborate with stakeholders to align infrastructure and platform strategies with business objectives, driving innovation and continuous improvement initiatives.
Qualifications:
  • Bachelor's degree in Computer Science, IT, or a related field.
  • 15+ years of total experience, including 7-10 years in infrastructure and platform design roles, with a focus on big data solutions leveraging cloud platforms.
  • Strong proficiency in designing scalable and resilient architectures for big data solutions, optimizing compute, storage, and cost.
  • Extensive experience in defining and implementing security standards and strategies for cloud environments, ensuring compliance and data protection.
  • Proven track record of designing and implementing complex big data solutions, integrating Databricks and AWS services to meet business requirements.
  • Excellent analytical, problem-solving, and decision-making skills.
  • Strong communication and collaboration skills, capable of engaging with technical and non-technical stakeholders effectively.
Job Requirements
Databricks, Apache Spark, Linux/Unix, Lakehouse Platform, clusters, Big Data Platform, Infrastructure