
Architect - DBT

Bangalore, Chennai, Hyderabad
Job Description
Who we are
Tiger Analytics is a global leader in AI and analytics, helping Fortune 1000 companies solve their toughest challenges. We offer full-stack AI and analytics services and solutions that empower businesses to achieve real outcomes and value at scale. We are on a mission to push the boundaries of what AI and analytics can do to help enterprises navigate uncertainty and move forward decisively. Our purpose is to provide certainty to shape a better tomorrow.
Our team of 4000+ technologists and consultants is based in the US, Canada, the UK, India, Singapore and Australia, working closely with clients across CPG, Retail, Insurance, BFS, Manufacturing, Life Sciences, and Healthcare. Many of our team leaders rank in Top 10 and 40 Under 40 lists, exemplifying our dedication to innovation and excellence.
We are Great Place to Work-Certified™ (2022-24) and recognized by analyst firms such as Forrester, Gartner, HFS, Everest, and ISG. We have been ranked among the ‘Best’ and ‘Fastest Growing’ analytics firms by Inc., Financial Times, Economic Times, and Analytics India Magazine.

Curious about the role? What would your typical day look like?
As a Data Architect, you will play a pivotal role in designing and implementing solutions that adhere to best practices such as simplicity, flexibility, modularity, reusability, restorability, and traceability. Your responsibilities will include providing technical guidance and advisory services on data integration challenges to both clients and internal teams.
On a typical day, you might

• Facilitate requirement-gathering sessions involving multiple departments/clients and complex data ecosystems, and contribute effectively to technology roadmap workshops.
• Independently devise end-to-end technical solutions and architectures for data platforms of any complexity, analyzing and guiding designs produced by the team.
• Lead design review sessions with clients, actively participating in discussions comparing different design patterns and technology components.
• Apply expertise across multiple paradigms (Big Data/MPP/DWH, On-Prem/Cloud/Hybrid) and various data types.
• Design data models for multiple business domains and functional areas within modules, implementing best practices and standards in dimensional design and advising client architects on data modeling.
• Develop and implement advanced data governance components for projects of any scale and complexity, guiding the team in establishing robust data governance processes from requirement gathering to data provisioning.
• Engineer and implement batch and stream design patterns using cloud services, providing guidance on all aspects of cloud platforms and independently leading data pipeline builds and platform migrations (see the orchestration sketch after this list).
• Conduct high-level custom checks and triangulations, in addition to common quality-control checks, to ensure error-free outputs/solutions.
• Organize work clearly so that anyone can follow and understand it, with modular and appropriately commented code.
• Incorporate industry/vertical context from experience, in addition to client business context, and handle all business communication, report preparation, and proposals with the target audience in mind.
• Independently handle communication with client stakeholders and lead a team to execute the project.
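For illustration, below is a minimal sketch of the kind of batch pipeline orchestration described above. It assumes Airflow 2.4+ and dbt available on the worker; the project path (/opt/dbt/analytics), the load script, and the task names are hypothetical placeholders, so treat this as one common pattern rather than a prescribed implementation.

# Minimal daily batch ELT DAG (sketch). Assumes Airflow 2.4+ and a dbt
# project at /opt/dbt/analytics -- both the path and commands are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_elt",                  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Land raw data in the warehouse staging area (placeholder script).
    extract_load = BashOperator(
        task_id="extract_load",
        bash_command="python /opt/pipelines/load_raw.py",
    )

    # Transform staged data with dbt; --select scopes the run to one folder.
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --select staging",
    )

    # Run dbt tests so bad data never reaches downstream consumers.
    validate = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test --select staging",
    )

    extract_load >> transform >> validate

Keeping load, transform, and validation as separate tasks keeps failures isolated and restartable, in line with the restorability and traceability principles above.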

Job Requirements
• 10+ years of experience in data warehousing systems, with a minimum of 1 year building and implementing full-scale data warehouse solutions on cloud data warehouses.
• 2+ years of hands-on experience designing and developing data integration solutions using DBT, along with strong proficiency in Apache Airflow development and Python scripting.
• Proficient in creating DBT models, materializations, macros, snapshots, and Jinja templating, and in modifying YAML files.
• Extensive expertise in DBT CLI, DBT Cloud, and GitHub version control and repository workflows, using DBT scripting for complex ELT processes and data pipeline builds (see the programmatic invocation sketch after this list).
• Advanced SQL knowledge, with hands-on experience writing complex queries using analytical functions, and in troubleshooting, problem-solving, and performance tuning of queries against data warehouses (a window-function example follows this list).
• Working experience with cloud data warehouses such as Redshift, Snowflake, and BigQuery, including schema modeling, performance tuning, and use of both CLI and GUI tooling.
• Skilled in data architecture, data modeling, physical database design, and tuning, with experience in migration projects and in using Python-based DBT frameworks.
• Proficient in analyzing data quality, aligning technical designs with data governance, and addressing operational requirements during the design and build of data pipelines.
• Ability to independently envision and develop innovative ETL and reporting solutions, leading projects from conception to completion.
• Experience architecting and implementing end-to-end projects with multiple transformation layers, including designing and implementing multiple use cases in Airflow while following best coding practices.
• Strong understanding of CI/CD processes, with knowledge of GitHub, DBT, and Airflow integration, and skill in Python/Shell scripting.
• Experienced in leading project teams of data engineers, performing code reviews, and collaborating closely with cross-functional teams.
• Excellent verbal and written communication skills and strong interpersonal skills; comfortable working in dynamic, fast-paced, innovative environments with multiple concurrent projects.
• Strong analytical and problem-solving skills with high attention to detail.
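As a rough illustration of the DBT CLI fluency described above, the sketch below invokes dbt programmatically from Python. It assumes dbt-core 1.5+ (which exposes the dbtRunner entry point), an already-configured profiles.yml, and a hypothetical model named "orders":

# Programmatic dbt invocation (sketch); requires dbt-core 1.5+ and a working
# dbt project/profile. "orders" is a hypothetical model name.
from dbt.cli.main import dbtRunner, dbtRunnerResult

dbt = dbtRunner()

# Equivalent to `dbt run --select orders` on the command line.
res: dbtRunnerResult = dbt.invoke(["run", "--select", "orders"])

if not res.success:
    raise RuntimeError(f"dbt run failed: {res.exception}")

# Print the status of each executed node (model).
for r in res.result:
    print(f"{r.node.name}: {r.status}")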
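And as a small, self-contained example of the analytical (window) functions mentioned above, the snippet below uses Python's built-in sqlite3 module (window functions need SQLite 3.25+); the orders table and its rows are invented for the demo:

# Window-function demo on an in-memory SQLite database (needs SQLite 3.25+).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('acme', '2024-01-05', 120.0),
        ('acme', '2024-02-11',  80.0),
        ('beta', '2024-01-20', 200.0),
        ('beta', '2024-03-02',  50.0);
""")

# ROW_NUMBER ranks each customer's orders by recency (a common dedup
# pattern); SUM(...) OVER maintains a per-customer running total.
rows = conn.execute("""
    SELECT customer,
           order_date,
           ROW_NUMBER() OVER (PARTITION BY customer
                              ORDER BY order_date DESC) AS recency_rank,
           SUM(amount) OVER (PARTITION BY customer
                             ORDER BY order_date) AS running_total
    FROM orders
    ORDER BY customer, order_date
""").fetchall()

for row in rows:
    print(row)

The same patterns (ROW_NUMBER for deduplication, cumulative sums for running metrics) carry over directly to Redshift, Snowflake, and BigQuery SQL.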