Apache Spark Core Platform Developer (Committers Wanted)
Category: Development, Infrastructure
Company Description: Leading global provider of data, news and analytics
Salary: Highly Competitive, Depending on Experience
Position Type: Permanent
Job Number: 8167
The Spark Platform team is building low-latency, distributed analytics infrastructure, based on Apache Spark, for the entire firm.
Instead of building dozens of isolated Spark applications for individual problem domains, we are building a platform that lets teams plug in their business logic without duplicating common functionality: connecting to various datastores and real-time streams, serving Spark transforms as a service, and providing shared implementations of transforms, such as currency conversion, that recur throughout financial analytics. But we can't just use Apache Spark as-is. We need to enhance open-source Spark to meet our low-latency, high-throughput, and high-availability requirements. That's where you come in.
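To make the platform idea concrete, here is a minimal sketch of a shared-transform registry in plain Python. This is purely illustrative, not the team's actual design or a Spark API: every name below (`register_transform`, `convert_currency`, the static rates table) is hypothetical, and a real deployment would express such transforms over Spark DataFrames with live market data.

```python
# Illustrative sketch: a registry of reusable transforms, so each team plugs
# in business logic instead of re-implementing common steps like currency
# conversion. All names and the rates table below are hypothetical.

TRANSFORMS = {}

def register_transform(name):
    """Decorator that registers a reusable transform under a shared name."""
    def wrap(fn):
        TRANSFORMS[name] = fn
        return fn
    return wrap

# Hypothetical static rates; a real platform would source these from a feed.
RATES_TO_USD = {"USD": 1.0, "EUR": 1.1, "GBP": 1.3}

@register_transform("convert_currency")
def convert_currency(row, target="USD"):
    """Convert row['amount'] from row['currency'] into the target currency."""
    usd = row["amount"] * RATES_TO_USD[row["currency"]]
    return {**row, "amount": usd / RATES_TO_USD[target], "currency": target}

def apply_transform(name, rows, **kwargs):
    """Look up a shared transform by name and apply it to each record."""
    fn = TRANSFORMS[name]
    return [fn(row, **kwargs) for row in rows]

trades = [{"amount": 100.0, "currency": "EUR"},
          {"amount": 50.0, "currency": "USD"}]
print(apply_transform("convert_currency", trades))
```

The registry pattern is what keeps applications thin: a new team registers only its domain-specific transforms and composes them with the shared ones by name.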
As part of our team, you'll collaborate with several core Spark contributors to co-design and co-develop enhancements to Spark in areas ranging from performance to high availability.
We'll trust you to:
You need to have:
We'd love to see:
All qualified candidates are encouraged to apply by submitting their resume as an MS Word document, including a cover letter that summarizes relevant qualifications and clearly highlights any special or relevant experience.
Please Note: All inquiries will be treated with the utmost confidentiality. Your resume will not be submitted to any client company without your prior knowledge and consent.