Data Engineer

IT Services

Upstox

We are looking for a savvy Data Engineer to join our growing team of analytics experts at Upstox. The hire will be responsible for expanding and optimising our data and data pipeline architecture, as well as optimising data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimising data systems and building them from the ground up.
You will be responsible for –
– Creating complex data processing pipelines as part of diverse, high-energy teams.
– Designing scalable implementations of the models developed by our Data Scientists.
– Hands-on programming based on TDD, usually in a pair programming environment.
– Deploying data pipelines in production based on Continuous Delivery practices.
– Creating and maintaining clear documentation of data models/schemas as well as transformation/validation rules.
– Troubleshooting and remediating data quality issues raised by pipeline alerts or downstream consumers.
– Engaging with stakeholders to gather requirements and deliver data solutions.
– Advising clients on the use of different distributed storage and computing technologies from the many options available in the ecosystem.
Ideally, you should have –
– A good understanding of building and deploying large-scale data processing pipelines in a production environment.
– Experience building data pipelines and data-centric applications using distributed storage platforms such as HDFS, S3, and NoSQL databases (HBase, Cassandra, etc.) and distributed processing platforms such as Hadoop, Spark, Hive, Oozie, Airflow, etc. in a production setting.
– Hands-on experience with MapR, Cloudera, Hortonworks, and/or cloud-based Hadoop distributions (AWS EMR, Azure HDInsight, Qubole, etc.).
– Strong communication and client-facing skills, with the ability to work in a consulting environment.
Desired Skills and Experience
– Comfortable working in a Linux environment.
– Fluent in programming languages such as Node.js, Java, or Python, with experience on AWS
– SQL (expert level)
– Hands-on experience with distributed processing platforms such as AWS EMR, MapR, and Cloudera
– Distributed storage platforms like HDFS, S3, and NoSQL databases

To apply for this job, please visit jobs.lever.co.
