---
layout: page
title: Resume
cover-img: /assets/img/fiesta5.jpg
---

A curious and passionate self-learner who enjoys using technology and data to solve problems.

## Technical Skills

- **Engineering:** SQL, Ruby on Rails, Java, Python, R, Docker, Terraform, Kafka, Kubernetes
- **Data/ETL:** dbt, Redshift, MySQL, Postgres, SQL Server, Jupyter Notebooks, Mage, Stitch, Census, Metabase, Looker, Google Analytics
- **Operating Systems:** macOS, Linux
- **AWS Services:** Redshift, Lambda, S3, RDS, CloudWatch, Kinesis, EC2, MSK, EKS
- **Project Management:** Git, Notion, Kanban, Asana, Agile methodologies, JIRA, Confluence

## Education

**University of Denver** | June 2016
Bachelor of Science in Computer Science
Minors: Mathematics, Spanish

## Professional Experience

**Data Team Lead at eSpark Learning** | August 2021 – Present

- Served as the Data & Infrastructure team lead and Certified Scrum Master
  - Led scrum rituals, met with stakeholders to determine scope and deliverables, created team processes, and wrote technical documentation
- Optimized data utilization strategies to enhance product decision-making and improve company-wide data accessibility
  - Contributed to defining and implementing key company metrics
  - Utilized Redshift analytics data to develop internal and external dashboards in SQL, helping Customer and Sales teams showcase product effectiveness
  - Led a comprehensive cleanup and restructuring of the internal Metabase instance to improve usability and efficiency
- Helped orchestrate a streaming event system in which events are sent from the Ruby on Rails applications to Redshift via Kinesis Firehose
  - Developed new dbt models in SQL from streaming event data to support data science and analytics objectives
- Developed and implemented new processes to deliver accurate, clean data to the core Ruby on Rails application, improving in-product decision-making
  - Utilized the Redshift Data API to serve complex aggregations from the data warehouse to the Ruby on Rails application
  - Refactored the product licensing implementation in the Ruby on Rails application by integrating data directly from the Salesforce Data API
- Developed robust API integrations with key third-party data sources (Calendly, Salesforce, Intercom) to capture accurate data in the core Ruby on Rails application
- Enhanced and maintained vital AWS cloud infrastructure supporting core applications, the company website, and engineering teams
  - Configured new AWS infrastructure, contributed to CI/CD setup, and developed GitHub Actions workflows for new engineering projects
  - Migrated existing AWS infrastructure to Terraform and developed Python-based Lambda functions to automate processes

**Senior Software Engineer at Nasdaq** | July 2018 – August 2021

- Worked on a new project that generated order audit trail regulatory reports for clients and submitted them to FINRA on their behalf
  - Streamed drops of clients' daily trading activity using Java, Spring Boot, Kafka, Kubernetes, and various AWS services, and pieced together a cohesive timeline showing the lifecycle of linked orders
- Developed and delivered risk management reports to clients, built with Microsoft SQL Server, covering complex trading strategies in use, risk exposure levels, market inconsistencies, and billing tier calculations
  - Helped maintain a cluster of 12 physical and virtual Windows servers to process the data and deliver roughly 300 daily reports
  - Migrated the project's version control from VSS to Git
- Constructed and maintained a new Postgres data warehouse with a REST API interface to analyze team report delivery metrics and ensure SLA accuracy
- Leveraged the open-source project Poli to implement an internal dashboard and analytics tool for creating ad-hoc reports, visualizing reporting data, and minimizing manual processes
  - Configured JDBC connections to Postgres, MySQL, and SQL Server reporting databases and wrote queries to serve data to the dashboard components

**Data Warehouse Developer at NBCUniversal (acquired Craftsy)** | October 2017 – July 2018

- Expanded the scope of the data warehouse project by pipelining data from additional sources and refining the architecture to meet the needs of business analysts and data consumers
  - Configured the data visualization tool Looker to retrieve data from the data warehouse for dashboards, business-segment analysis, and self-service reporting by end users
- Created a marketing forecast using 30-day rolling averages to project key business metrics, marketing costs, and revenue across marketing channels
- Developed API connections for several marketing channels (YouTube, Facebook, Pinterest, and more) to gather ad engagement metrics and attribute costs to marketing actions
- Assembled a new development environment and created development standards for the data warehouse project
  - Put the project under version control in Git and created a task management workflow in JIRA
  - Set up a version-controlled password management tool for the project using Python SOPS

**Software Engineer at Nasdaq** | June 2016 – September 2017

- Created a workflow engine for a new equities dark pool using Java and Spring, and ingested data into separate secure client accounts in Redshift per regulatory requirements
- Helped launch a new Business Intelligence project and created a unified platform for five options and three equities exchanges to deliver BI reports
  - Established a snowflake-schema data model and developed ETL pipelines with Pentaho Data Integration
  - Documented and maintained source-to-target data mappings and collaborated with various business units on their reporting needs

**Software Engineering Intern at Nasdaq** | June 2015 – June 2016

- Assisted senior engineers in developing the Nasdaq Data Warehouse application, written in Java and Spring
  - Pipelined raw data from the trading system into AWS under strict regulatory requirements
  - Enforced data quality and data integrity while ingesting millions of records per day
  - Archived data from Redshift into long-term storage in Amazon S3 and Glacier

**Software Engineering Intern at Forsythe Technologies** | May 2014 – February 2015

- Developed a Java program to translate lists of each client's business processes and their dependencies into dynamic BPMN diagrams embedded in a webpage using JavaScript and AngularJS
- Documented the development process using several software modeling techniques, including UML class, sequence, and activity diagrams

## Certifications

- Certified Scrum Master | May 2024
- AWS Certified Solutions Architect Associate | April 2023
- AWS Certified Cloud Practitioner | February 2023
- NASM Certified Nutrition Coach | November 2023