SENIOR DATA ENGINEER (GCP)
We are a boutique, rapidly growing GCP (Google Cloud Platform) consulting company based in Toronto. We work with GCP's top customers (banking, telco, energy, retail, etc.) to help them with cloud transformation, security, analytics, ML, data governance, and more. Clients usually engage us to solve their hardest business problems and to help raise the bar at their organizations.
You get the best of both worlds: we operate like an early-stage startup, with all the associated benefits (talent, growth, learning opportunities, flexibility), but we get to solve enterprise-level technical challenges, and we are actually profitable ;)
- People: We hire top-tier talent. Our team includes ex-Googlers, Y Combinator alumni, and individuals who have built software for 400M users.
- GCP is the Best Cloud: Maybe we are biased, but GCP is the most cutting-edge cloud provider, building on technologies pioneered by Google: BigQuery, K8s/Anthos, Vertex AI, etc.
- Growth: We have doubled in size in the last six months and are looking to double again this year.
Responsibilities
- Work closely with technical leads and client teams to fully demonstrate the benefits of GCP technology
- Introduce clients to data architecture and analytics best practices
- Solve some of the most challenging and high scale data and IoT problems (telco, energy, financial market data, etc.)
- Write high-quality, testable batch and real-time data pipelines
- Work with Data Scientists to design pipelines that improve their productivity and enable them to implement their ideas
- Support clients in troubleshooting issues in their test and/or production environments and in identifying root causes and solutions
Requirements
- GCP experience (for junior candidates, or exceptional talent with other cloud experience, we will provide a training program)
- Experience building large-scale, secure, and highly available solutions in cloud environments such as Google Cloud (GCP)
- Experience with at least one of Dataflow or Spark
- Experience with orchestration frameworks such as Airflow, Kubeflow, Azkaban, etc.
- Extensive programming experience in Python, Java or Scala
- Good understanding of modern data architecture
- Good understanding of GCP services such as IAM and Google Cloud Storage
- Experience with business intelligence tools such as QuickSight, Looker, and Data Studio
- Technical writing skills, and experience preparing and presenting technical material to a variety of audiences
- Experience in working in, and with, Agile delivery teams
Nice To Have
- Streaming experience with Dataflow, Spark, Flink, Kafka Streams, etc.
- Strong understanding of data governance principles, and experience working with tools such as Collibra or Immuta
Qualities we look for (across all roles)
- Passionate about delivering high quality commercial software products and platforms to market
- Team player. We are a small team and enjoy working with each other and our clients, and we would like to keep it that way as we grow
- Strong understanding of modern software engineering processes
- Client-focused and passionate about delivering business value
- Able to communicate clearly and efficiently with a variety of audiences including developers, clients, customers, partners and executives
- Flexible and willing to use the right technology for each problem in the context of timelines and business goals
- Ability to complete the job regardless of circumstances