Job Details


Senior Hadoop Developer - Foundational Services

Job Info:


Category: Development, Infrastructure
Company Description: Leading global provider of data, news and analytics
Salary: Highly Competitive, Depending on Experience
Position Type: Permanent
Location:
Job Number: 8550

Job Description:

Our client's problems are common yet complex. Their application teams face challenges of large-scale data storage, low-latency retrieval, high-volume requests and high availability across a distributed environment. They create standardized solutions to these problems by building core services and technology frameworks for enterprise-wide use.

As an experienced Hadoop developer, you will help us refresh and evolve many facets of our data and analytics infrastructure across three of these systems: the Data Platform (BDP), Price History and the Query Language (BQL). BDP is an initiative to help standardize storage backends and structure data flows across our systems to improve discoverability and data provenance. Price History is our canonical end-of-day time-series datastore. BQL is a distributed analytics framework that allows internal and external users to express complex data retrieval, analytics and screening criteria.

All of the company's data flows through these systems. You'll gain exposure to our financial data sets and how they're used while building high-performance, low-latency, scalable software for these core infrastructure initiatives. Many of these applications are built on top of open-source Hadoop technologies, so there are plenty of avenues to innovate and contribute to the open-source community.

We'll trust you to:

  •  Take ownership of a component of the BQL, Price History or BDP platform
  •  Interact with development teams across the company to understand their application requirements and access patterns
  •  Design and develop systems that meet our latency, volume, storage and scale expectations
  •  Participate in daily scrums to help influence architectural decisions

You must have:

  •  Java expertise
  •  3+ years of experience with Hadoop/HDFS/MapReduce
  •  3+ years of experience with NoSQL data stores (preferably HBase or Cassandra)
  •  Experience developing, enhancing and maintaining high-throughput, low-latency Hadoop systems in a mission-critical production environment
  •  Experience working in a test-driven, Agile development environment

We'd love to see:

  •  Experience enhancing and maintaining mission-critical software in a fast-paced environment
  •  Experience with Spark, Kafka, Oozie, ZooKeeper, Flume or Storm


All qualified candidates are encouraged to apply by submitting their resume as an MS Word document, along with a cover letter summarizing relevant qualifications and clearly highlighting any special or relevant experience.


Please Note: All inquiries will be treated with the utmost confidentiality. Your resume will not be submitted to any client company without your prior knowledge and consent.


Contact Recruiter
brian.tesser@andiamogo.com
Senior Technical Recruiter
Andiamo Partners | 90 Broad Street, Suite 1501, New York, NY 10004

