
MS Fabric Architect

Chennai, Bangalore, Hyderabad
Job Description
Who we are

Tiger Analytics is a global leader in AI and analytics, helping Fortune 1000 companies solve their toughest challenges. We offer full-stack AI and analytics services and solutions to empower businesses to achieve real outcomes and value at scale. We are on a mission to push the boundaries of what AI and analytics can do to help enterprises navigate uncertainty and move forward decisively. Our purpose is to provide certainty to shape a better tomorrow.

Our team of 4000+ technologists and consultants is based in the US, Canada, the UK, India, Singapore, and Australia, working closely with clients across CPG, Retail, Insurance, BFS, Manufacturing, Life Sciences, and Healthcare. Many of our team leaders rank in Top 10 and 40 Under 40 lists, exemplifying our dedication to innovation and excellence.

We are a Great Place to Work-Certified™ (2022-24) company, recognized by analyst firms such as Forrester, Gartner, HFS, Everest, ISG, and others. We have been ranked among the 'Best' and 'Fastest Growing' analytics firms by Inc., Financial Times, Economic Times, and Analytics India Magazine.

Curious about the role? What would your typical day look like?


Job Requirements
  • 8+ years of overall technical experience, with at least 2 years working hands-on with Microsoft Fabric, preferably with an Azure Databricks/Synapse background.
  • Experience leading at least 2 end-to-end Data Lakehouse projects on MS Fabric involving the Medallion Architecture.
  • Deep expertise in the Fabric ecosystem, including: Data Factory, Notebooks, PySpark, Delta Live Tables, Dataflow Gen2, Shortcuts, Fabric Lakehouse/Warehouse, Copy Job, Mirroring, Eventstream, KQL Database, Fabric SQL DB, Semantic Model (optional), and Fabric Data Agent.
  • Experience integrating and modernizing data from on-prem, cloud, web service, API, and file sources: ADLS Gen2, SQL Database, Azure Synapse, Event Hub, SFTP, Salesforce, Dynamics 365, etc.
  • Experience designing and developing metadata-driven frameworks for Data Engineering processes.
  • Strong programming, debugging, and performance tuning skills in Python and SQL.
  • Strong architectural understanding of Fabric workloads, with awareness of the pros, cons, and cost implications when proposing the right component.
  • Good experience with setting up Fabric Workspaces, access provisioning, and capacity management/cost control.
  • Good experience with data modeling (both Dimensional and 3NF).
  • Good exposure to developing LLM/GenAI-powered applications.
  • Sound understanding of CI/CD processes using Azure DevOps and Fabric deployment pipelines.
  • Exposure to technologies like Neo4j, Cosmos DB, and vector databases is desirable.