Principal Data Engineer (San Francisco) (Remote Eligible)

Location: San Francisco, California
Job Type: Full Time

This is an opportunity to join our fast-growing Security Intelligence Platform team to develop cutting-edge risk-based authentication and authorization policies. We are looking for principal-level data engineers who can architect and own the platform for deploying and optimizing the machine learning models that protect user authentication and security. You will also own the pipeline that processes hundreds of millions of events per day and feeds results back to the authentication system for real-time risk evaluation during user authentication. This project has a directive from engineering leadership to make Okta a leader in the use of data and machine learning to improve end-user security.

We hope you will share our passion and great pride in the work we do and will join an engineering team that strongly believes in automated testing and an iterative process to build high-quality next-generation cloud platforms.

Our elite team is fast, innovative, and flexible. We expect great things from our engineers and reward them with stimulating new projects and emerging technologies.

Job Duties and Responsibilities:

  • Take overall ownership of the architecture, platform, and pipeline for developing, deploying, and running new machine learning models in production
  • Work with data scientists to improve their productivity and implement their ideas
  • Design and maintain new data processing pipelines to support new decision and scoring models
  • Analyze performance metrics and logs to identify inefficiencies and opportunities to improve scalability and performance
  • Tune SQL queries: analyze and optimize complex query plans and (re)design schemas
  • Actively monitor production using tools such as Splunk, Wavefront, and CloudWatch
  • Maintain and enhance our performance monitoring and analysis telemetry, frameworks, and tools
  • Practice test-driven development and participate in design and code reviews

Minimum Required Knowledge, Skills, and Abilities:

  • 10+ years of experience building enterprise-grade, highly reliable, mission-critical software or big data systems
  • 5+ years of experience in production SaaS deployment
  • 5+ years of experience with streaming systems: MQ, Kafka, Storm, Spark, etc.
  • Expert-level understanding of relational databases (columnar and row-based) and NoSQL stores such as MongoDB, Cassandra, or similar
  • Experience with AWS data toolchains: EMR, Kinesis, Redshift, Glue
  • Advanced Python programming
  • Java or Scala development
  • Experience with Docker, Terraform, Chef, Jenkins, or similar build and infrastructure tools

Preferred Skills:

  • Working knowledge of AWS SageMaker, Lambda, and API Gateway, including production deployment
  • Jupyter Notebook Kernel maintenance
  • IPython, TensorFlow, PyTorch
  • Experience with Flink, Snowflake, and/or Elasticsearch

Okta is an equal opportunity employer.

Okta is rethinking the traditional work environment, providing our employees with the flexibility to be their most creative and successful versions of themselves, no matter where they are located. We enable a flexible approach to work, meaning for roles where it makes sense, you can work from the office, or from home, regardless of where you live. Okta invests in the best technologies and provides flexible benefits and collaborative work environments/experiences, empowering employees to work productively in a setting that best and uniquely suits their needs.