Job Requirements
Strong in PySpark, Python, PL/SQL, and SQL.
Desirable: experience building ETL pipelines for both batch and streaming data (e.g., Kinesis); see the streaming sketch after this list.
Build solutions for the optimal extraction, transformation, and loading of data from a wide variety of sources using data ingestion and transformation components (a batch sketch follows this list). The following technology skills are required:
Advanced working SQL knowledge, including query authoring and experience with relational databases, as well as working familiarity with a variety of database systems.
Experience with Dataflow
Hands-on experience with Databricks.
Hands-on experience with at least one of the leading public cloud data platforms (Amazon Web Services or Google Cloud).
Experience with column-oriented database technologies (e.g., BigQuery, Redshift, Vertica), NoSQL database technologies (e.g., DynamoDB, Bigtable, Cosmos DB), and traditional database systems (e.g., SQL Server, Oracle, MySQL).
Experience with big data tools such as Delta Lake and Databricks.
Experience with storage and Unity Catalog.
Assemble large, complex data sets that meet functional / non-functional business requirements.
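As an illustration of the batch side of such a pipeline, here is a minimal PySpark sketch that reads raw files, applies a simple transformation, and writes a Delta Lake table. The paths, table layout, and column names (order_id, order_ts, amount) are hypothetical placeholders, and the Delta format is assumed to be available, as it is by default on Databricks.

```python
# Minimal batch ETL sketch in PySpark: extract, transform, load to Delta Lake.
# All paths and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-batch-etl").getOrCreate()

# Extract: read raw data landed as Parquet files (hypothetical bucket/path).
raw = spark.read.format("parquet").load("s3://example-bucket/raw/orders/")

# Transform: deduplicate, derive a business date column, and drop bad records.
orders = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Load: write the curated data as a partitioned Delta Lake table.
(orders.write
       .format("delta")
       .mode("overwrite")
       .partitionBy("order_date")
       .save("s3://example-bucket/curated/orders/"))
```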
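For the streaming side mentioned above, the sketch below reads a Kinesis stream and writes it to Delta. It assumes the Databricks-specific "kinesis" source; the stream name, region, and output paths are hypothetical, and on open-source Spark a separate Kinesis connector would be required instead.

```python
# Hedged streaming sketch: Kinesis -> Delta Lake using Structured Streaming.
# The "kinesis" source is Databricks-specific; names and paths are placeholders.
events = (
    spark.readStream
         .format("kinesis")
         .option("streamName", "example-orders-stream")
         .option("region", "us-east-1")
         .load()
)

# Kinesis records arrive as binary in the `data` column; cast before use.
parsed = events.selectExpr("CAST(data AS STRING) AS json_payload")

# Continuously append the parsed records to a Delta location with checkpointing.
(parsed.writeStream
       .format("delta")
       .option("checkpointLocation", "s3://example-bucket/checkpoints/orders_stream/")
       .start("s3://example-bucket/streaming/orders/"))
```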