Our Client is a high-frequency securities trading firm based in Chicago. The firm is one of the largest electronic trading firms in North America and specialises in high-frequency automated market making and algorithmic trading driven by machine learning and artificial intelligence. The firm has a sophisticated, data-driven approach which has enabled it to become a market leader in high-volume trading.
Key Responsibilities
Design and build production cloud-based batch and streaming data processing systems
Provide infrastructure tooling for cloud and hybrid computing
Provide operational support for production systems
Run and debug complex distributed environments
Evaluate new technologies by building and running PoCs
Assess and communicate the potential business impact of those technologies
Act as a cloud architectural consultant for internal application teams
Work with Front Office IT teams through entire project lifecycles
Skills and Qualifications
Experience designing and running large batch and streaming data processing systems, e.g. Hadoop or Spark.
Experience building and running large data stores, e.g. Cassandra, MongoDB, Riak, HBase.
Experience with DevOps practices such as configuration management and infrastructure automation (e.g. Ansible, Packer, Jenkins).
Experience building production systems on open source platforms.
Solid knowledge of various languages, preferably Python, C++ or Java.
Experience designing and deploying elastic applications to public or private clouds (AWS, GCE, OpenStack).
Solid understanding of encryption and security.
Solid Linux systems knowledge.
If you would like to be considered for the position of Big Data/Cluster Engineer - Market Making, or wish to discuss it further, please leave your details below. Your resume will be held in confidence until you connect with a member of our search team.