Hands-on experience with Python, R, PySpark, Keras, TensorFlow, and APIs.
Experience working with cloud computing services such as AWS (S3, Redshift, Aurora) or GCP.
Experience working across the end-to-end data science life cycle, from ideation to deployment and monitoring.
Experience with data warehousing, data architecture, ETL data pipelines, and/or data engineering environments at enterprise scale.
Knowledge of extracting data from RDBMS and NoSQL databases by writing queries.
Advanced knowledge of Excel.