Data Infrastructure Developer (DevOps)
WatrHub accelerates sales to water utilities.
Canada and the United States need to rebuild their aging water infrastructure. This will cost $1 trillion over the next 20 years to replace pipes, pumps, valves, treatment equipment, and the software that runs them all. WatrHub is the only source of predictive market information on how, when, and why these water infrastructure investments will be made. Our AI engine transforms millions of public documents into actionable sales leads and market insights. Manufacturers subscribe to WatrHub to secure their next water utility clients and bring their advanced, efficient technologies to our aging water infrastructure.
Find out more about working at WatrHub: http://www.watrhub.com/jobs
We are looking for a Data Infrastructure Developer to direct our talented Development Team into the future. The position will be based out of our Toronto Headquarters.
Our Tech Stack:
- Python including Tensorflow, NLTK, scikit-learn, Flask, and Scrapy
- Kafka, Hbase, Elasticsearch, Tesseract, Selenium and Git
- We move quickly and deploy to Google Cloud Platform multiple times per day
What You Will Accomplish:
- Redesign and improve our data ingestion infrastructure to allow rapid expansion of our datasets
- Architect scalable solutions for all stages of our data pipeline: large-scale distributed data acquisition, data cleaning and normalization, storage, information extraction, RESTful APIs, authentication, and data visualization
- Design and build machine learning infrastructure including model training and serving API requests
- Design and develop solutions to seamlessly integrate layers of big data from thousands of disparate unstructured data sources into our data pipeline, while maintaining a fast customer experience
- Analyze development practices and take the lead in recommending and implementing improvements
What You Will Bring:
- Experience in at least two of the following areas: web scraping and crawlers, Elasticsearch, machine learning, NLP, distributed systems, microservices, Hadoop, Kafka
- Strong knowledge and experience with either AWS or GCP (Google Cloud Platform)
- Previous experience in a DevOps role is a plus
- Significant industry experience with Python or Java, and SQL
- Experience working with large unstructured data sets from multiple sources is a plus
- Bachelor's or Master's degree in Computer Science or a related field (non-degree candidates with exceptional programming skills are also welcome to apply!)
- You are an excellent communicator and collaborate well with front-end development teams and non-technical data experts
- You love what you do and you want to be part of a company with a strong social mission
Why You Will Love Working Here:
- As an early member of our Engineering Team, you can influence positive change within one of the biggest societal challenges of our generation (water)
- You will be part of a culture that believes in moving fast and crushing hard problems with no obvious solutions
- There is no bureaucracy here. You will accomplish more in a few months than you would in a few years at a large technology company.
- You get company-paid transportation, health & fitness benefits, a training budget, hardware of your choice, and work-from-wherever days
- You get support from our in-house subject matter experts, from data scientists to water industry titans
Application Deadline: Feb 17, 2017