Data Engineers

Job Details

  • Job Posting Type: Curated from an External Source

Full Time Job Requirement Details

  • Designation: Data Engineers
  • Required Skills: R Programming, Python Programming, Statistical Analysis, Machine Learning, Predictive Modeling, Advanced Visualization, Big Data
  • Position Level: Mid-Senior Level
  • Years of Experience: 2-5 years
  • Indicative Package Range: Not Disclosed
  • Domain Expertise: Flexible/Generic
  • Highest Qualification: M.C.A, M.B.A, B.E / B.Tech, Any Post Graduation

JOB DESCRIPTION

What you will do



  • You'll design and develop scalable, reliable, secure, and fault-tolerant data systems, and continually improve them to deliver next-generation systems

  • You’ll work 100% in the public cloud (AWS, GCP, Azure)

  • You'll build readable, concise, reusable, extensible code that avoids re-inventing solutions to problems you've already solved

  • You'll migrate high-growth startups' massive datasets from one cloud to another, or from on-premises to the cloud

  • You'll take part in discovery sessions with clients and build state-of-the-art data architectures

  • You’ll research new technologies and tools that enable building the next generation systems

  • You’ll apply the knowledge gained from client projects to develop and improve our internal products.


The Role



  • At least 3-4 years of experience as a Data Engineer or Software Engineer

  • Strong coding skills in one of the following: Python, Java, or Scala

  • Experience with Spark or Beam

  • Experience with SQL databases as well as NoSQL data stores

  • Experience in SQL with an in-depth understanding of query optimization and a solid grasp of DDL

  • Excellent understanding of building data models for the target data warehouse

  • Hands-on experience with any of the cloud platforms and a thorough understanding of cloud concepts

  • Experience with any orchestration tool (e.g., Airflow, Kubeflow, Oozie, Luigi, Azkaban, Step Functions)

  • An inherent talent for understanding data and interpreting it for end business users

  • Ability to drive initiatives and work independently

  • Good communication skills, with the ability to express your point to clients concisely


Nice-To-Have



  • Experience building large-scale, high throughput, 24x7 data systems

  • Exposure to machine learning algorithms, with practical implementation experience

  • Experience with legacy big data systems (e.g., Hadoop)

  • Any big data or data engineering certification on any of the clouds

ABOUT COMPANY

CloudCover delivers the insane potential of the public cloud to start-ups & agile enterprises through a combination of weaponized geekiness, extreme automation, and battle-scarred experience.

The public cloud's transformative power isn't in scale, or speed, or price. It's in the API. The ability for software to request and control hardware leads to an unprecedented opportunity to automate IT. CloudCover is obsessed with automation, to the point where doing something manually twice is once too many times.


CloudCover is a cloud-native solutions company focused on helping organizations teleport into the future of IT. It consults on, designs, builds, and manages cloud-native applications and data systems for high-growth startups and agile enterprises. Part of ST Telemedia Cloud, a leading public cloud solutions provider across Asia-Pacific, CloudCover is headquartered in Singapore with a delivery center in Pune and sales offices in Delhi, Mumbai, and Bengaluru. Its 190+ employees are dedicated to enabling seamless collaboration between developers and IT. For more information, visit www.cldcvr.com.