Sr Data Engineer

Location: Pittsburgh, PA

Job Type: Full Time / Permanent

The Data Services team builds and manages our data processing platform. Our vast number of data collectors and integrations gives our customers unrivaled knowledge of their fleets. The platform must run reliably and efficiently, and must make data accessible quickly and securely. The Data Services team works closely with customers and integration partners to help them realize their data and pipelining needs, establishing Safety Suite as the hub for all of a fleet's data.

Responsibilities:

  • Collaborate with our product, customer success, and technical teams to research, design, and enhance the Data Services Platform.
  • Innovate in an industry full of data, creating invaluable solutions to our customers' problems.
  • Develop new features, fix bugs, and optimize our Data Services Platform.
  • Build new integrations and features, and support our Data Services Platform.
  • Define internal development processes and practices for scaling our organization.
  • Lead and contribute to the development of tools that scale our business and serve our customers.
  • Participate in internal reviews of code, software components, and systems, and make data-driven decisions on how they should evolve.
  • Communicate effectively and collaborate with team members in an Agile environment.
  • Work on any task and help solve problems when needed — be humble and scrappy!

Education & Experience:

  • 3+ years of experience as a software developer
  • Strong proficiency with Go (Golang), Python, or a similar language
  • Experience reviewing code and mentoring less experienced developers
  • Strong quantitative and analytical background & process
  • Knowledge of streaming and messaging technologies (e.g., Kafka, RabbitMQ)
  • Experience writing unit, integration, and end-to-end test code
What will set you apart:

  • Experience in the Logistics / Transportation industry
  • Experience with cloud platforms (AWS, Azure) and services such as KMS, RDS, and SQS
  • Understanding of data pipelining tools such as Hadoop and Kafka
  • Experience with distributed technologies such as Cassandra and Kubernetes
  • Experience working in an entrepreneurial or enterprise environment