Our partner's product is a next-generation data intelligence solution delivering cost-effective, machine-learning-powered Business Intelligence as a Service.
Their vision is to create an easy-to-use platform that delivers automatic insights through machine learning and Smart Alerting. They are committed to surfacing insights automatically via anomaly detection and deep learning, while keeping the platform open and collaborative.
We are building a high-performance data ingestion platform capable of processing billions of rows and terabytes of data with ease. If you are interested in pushing the boundaries of what’s possible, this might be the job for you.
You will be heavily involved in the end-to-end development of ETL/data processing solutions, particularly in designing and building integrations with web-based platforms via APIs and SDKs.
Previous knowledge of ETL/data processing and workflow tools, such as Apache Spark, is preferable.

Candidates must demonstrate knowledge and experience of the following:
- Data processing, ETL pipeline development, and database & warehouse architecture
- Data modelling and analytics
- SQL (any variant)
- Experience working with REST/SOAP APIs
- Attention to detail (e.g. writing clear technical documentation, identifying and working around technical limitations, testing your own solutions)
- Working to tight deadlines
- Excellent communication skills, both written and oral
- Some hands-on experience with data analytics / data visualisation tools (e.g. Power BI, Tableau, Qlik, SSRS/SSAS)
- Java web server administration
- End-to-end development lifecycle experience (e.g. requirements gathering, development, and testing)