https://adobe.taleo.net/careersection/2/jobdetail.ftl?job=142542&lang=en&sns_id=mailto#.UGCHd9Uy26c.mailto
Responsibilities
Design and implement scalable solutions for the collection, storage and
analysis of huge data sets.
Write elegant, maintainable code.
Debug and troubleshoot production issues.
Work with other team members to ensure interoperability with the rest of
the software stack.
Requirements
1-3+ years of experience developing large scale applications in Java or
C/C++ for the Linux platform.
A BS or MS in Computer Science or related discipline.
A strong foundation in computer science with a thorough understanding of
data structures, algorithms and object-oriented design principles.
Experience with one or more of Python, Ruby, PHP, or SQL is a plus.
Preferred
Experience with distributed platforms such as Hadoop, Hbase, Hive,
ZooKeeper, Cassandra.
Experience with scalable machine learning platforms such as Mahout.
Experience with messaging technologies such as RabbitMQ, HornetQ or
ActiveMQ.
Experience with data collection services such as Flume, Scribe, Chukwa
or Splunk.
Interested candidates can apply online or send a resume to h****[email protected].