Paul Wheeler




I am a Senior Software Engineer/Technical Lead with experience designing and implementing applications from the database to the front end. I have deep knowledge of the Microsoft platform, especially C#, ASP.Net Core, LINQ, WCF, and SQL, with additional experience in Java, Hadoop, JavaScript, and Haskell. I have a passion for technology and for solving real-world problems, and I maintain a sharp technical focus. I also have experience managing a team of developers, which has given me a keen awareness of what it takes to make a group of engineers cohesive and successful.


Earned a BS in Computer Science at Rochester Institute of Technology in Rochester, NY, from 2001 to May 2005.

Relevant coursework: Genetic Algorithms, Artificial Intelligence, Computer Graphics, 3D Computer Animation, Operating System Design



  • Software Engineer Lead @ RevaComm

    • Honolulu, HI
    • Apr 2022 to present

  • Principal Software Engineer @ Free-Side Software LLC

    • Honolulu, HI
    • Jul 2020 to Apr 2022

As Invio wound down in 2020, SourceDrive transitioned to operation as an independent consortium under the management of the Cedars-Sinai Health System. As the principal engineer developing, maintaining, and operating SourceDrive, I have had the opportunity to maximize my breadth in design, architecture, cloud operations, customer support, and project management.

  • Senior Software Engineer @ Invio

    • Honolulu, HI
    • Sep 2017 to Jul 2020

Invio is an early-stage start-up focused on bringing automation and technological solutions to FDA-regulated clinical trial operations. The company's focus is on improving efficiency, reducing costs, decreasing time to market, and ensuring quality. Years before I joined Invio, I had the opportunity to work with the founders at a Startup Weekend event on a precursor project that laid the groundwork for what eventually became Invio.

    • Developed web frontend features including interactive document editing UI.
    • Designed and implemented backend API capabilities and databases.
    • Created public REST API for importing medical device data.
  • Principal Engineer @ Amperity

    • Honolulu, HI
    • Jan 2017 to Aug 2017

Amperity is a fast-growing start-up that is revolutionizing the way consumer brands manage their data with advanced big-data matching and de-duplication techniques. Amperity was a great place to work and I learned a lot during my time there; however, when the opportunity at Invio became available I couldn't pass up the chance to rejoin that team.

    • Led a team tasked with overhauling Amperity's data ingestion pipeline to improve configurability, reliability, and performance.
    • Defined data ingestion system architecture, defined and prioritized project backlog, oversaw and mentored multiple developers.
  • Skipper @ S/V Serenity

    • The Pacific Ocean
    • Feb 2016 to Dec 2016

    After performing another round of upgrades on Serenity, we sailed from California to the Island of Hawaii, to French Polynesia, to Honolulu, HI. Responsibilities included navigation, seamanship, maintenance, and first and foremost, ensuring the safety of myself, my wife, and my crew.

  • Senior Software Engineer @ IMS Health (Appature)

    • Seattle, WA
    • Feb 2013 to Feb 2016

    Appature was a Marketing Automation and Customer Relationship Management system targeted specifically at the life sciences and pharmaceutical industry. We tackled challenges bringing messy, high volume data sets into a system where quality and predictability were essential.

    • Redesigned internal data access patterns to simplify code, increase encapsulation of data-specific functionality, and decrease coupling to the database layer.
    • Supervised a team of three engineers, provided mentoring and career guidance, managed projects and assignments.
    • Built a new version of the Appature Nexus API for loading data. Significantly improved performance and reliability.
    • Redesigned query engine to increase expressiveness, simplify optimization process, and improve code maintainability.
  • Senior Software Engineer @ Alpha Heavy Industries

    • San Francisco, CA
    • Jul 2011 to Dec 2012

    Alpha Heavy Industries was founded by some of my coworkers at a previous start-up, Positronic, to expand upon our application of internet text mining and natural language processing to financial market analysis. We applied advanced genetic programming and solver techniques to identify signal in high frequency financial market data.

    • Designed and implemented data processing, market simulation, backtesting, and fitness function algorithms for trading strategy evolution.
    • Designed and ran experiments to improve strategy evolution platforms.
    • Made improvements to machine learning components.
  • Skipper @ S/V Serenity

    • The Pacific Ocean
    • Oct 2010 to Jul 2011

    Having bought Serenity, a Hans Christian 38 Traditional Cutter, we refitted her and sailed from Berkeley, CA to La Paz, Mexico and back. Responsibilities included navigation, seamanship, maintenance, and first and foremost, ensuring the safety of myself, my wife, and my crew.

  • Senior Software Engineer @ eBay

    • San Jose, CA
    • Jan 2009 to Feb 2010

In late 2008 eBay acquired Positronic in order to "[leverage] machine learning to provide a more predictive and compelling customer experience."

    • Applied Positronic tools to eBay’s search system in order to tune the ranking algorithm.
• Created Hadoop jobs to process click logs and generate a data set for machine learning.
    • Was influential in the creation of a Hadoop installation that became an integral part of eBay’s infrastructure.
  • Software Engineer @ Positronic

    • Seattle, WA
    • Nov 2007 to Dec 2008

    Positronic was an early stage start-up when I joined it as the fourth member of the team, and second full time engineer. Positronic's mission was to push the limits of what data science and machine learning could be applied to. Our initial target was financial markets and internet text mining.

    • Architected, designed, and implemented a unique .Net based distributed computing architecture for machine learning jobs.
    • Implemented genetic programming algorithms for evolving trading strategies.
    • Designed and implemented a high performance market simulator and backtester.
  • SDE @ Judy's Book

    • Seattle, WA
    • Sep 2007 to Nov 2007

After two years at Microsoft, a desire to work in a more customer-focused environment prompted me to move to Judy's Book, a local business review and deal site. Unfortunately, my timing could have been better, and Judy's Book ceased operations shortly after I started there (although technically the site is still up under different ownership). On the upside, my short time at Judy's Book got me up to speed on ASP.Net MVC.

  • SDET & SDE @ Microsoft

    • Redmond, WA
    • Sep 2005 to Sep 2007

At Microsoft I was part of a team that acted as an early adopter and tester for next generation web service and workflow technologies as they were being developed. We got our hands dirty with the bleeding edge of Windows Communication Foundation and Windows Workflow, provided meaningful feedback to the core product teams, and built real-world applications while doing so.

    • Acted as Scrum Master.
    • Developed, automated, and maintained test cases.
    • Planned, designed, and developed applications to validate the latest components of WCF and Windows Workflow.
    • Worked with the core WCF team to expand RESTful web service capabilities.
    • Trained peers on best practices of API and Web Service design.
  • Software Developer/Intern @ USDA

    • Washington, DC
    • May 2003 to Sep 2004

During college I did two summer internships with the Livestock and Seed division of the USDA, and worked part time in between. I worked on two major projects while at the USDA: the first was an interactive media presentation about the Livestock and Seed division, which I implemented in Macromedia Flash and ActionScript. The second was an ASP.Net web application and API to facilitate operation of the OECD Seed Certification program in the U.S. In both cases I independently designed and implemented the project, saving the USDA significant time and expense compared to bidding out the projects to contractors.


Paul Wheeler, M. Morse, V. Misic, P. Anderson. "Image Dithering as N-Queens Problem." 2004 Proceedings, Conference on Artificial Neural Networks in Engineering.

By expanding on a basic genetic algorithm for solving the N-Queens problem, we developed a method for generating high-quality noise masks for use in image dithering, producing results superior to traditional algorithms. To produce the results for the paper, I implemented the expanded genetic algorithm in both Matlab and C++.
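The baseline the paper extends is a standard genetic algorithm for the N-Queens problem. As a rough sketch of that baseline only (not the expanded mask-generating algorithm from the paper, which is not reproduced here), a minimal version might look like the following; the encoding, operators, and parameter values are my own illustrative choices:

```python
import random

def conflicts(board):
    """Number of attacking queen pairs; board[c] is the row of the queen in column c."""
    n = len(board)
    return sum(
        1
        for i in range(n)
        for j in range(i + 1, n)
        if board[i] == board[j] or abs(board[i] - board[j]) == j - i
    )

def solve_n_queens_ga(n=8, pop_size=100, generations=2000, mutation_rate=0.3, seed=0):
    """Evolve an n-queens placement: fitness is the conflict count (lower is better),
    with elitist survivor selection, one-point crossover, and random-reset mutation."""
    rng = random.Random(seed)
    pop = [[rng.randrange(n) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=conflicts)
        if conflicts(pop[0]) == 0:
            break  # found a conflict-free placement
        survivors = pop[: pop_size // 2]  # keep the better half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]  # one-point crossover
            if rng.random() < mutation_rate:
                child[rng.randrange(n)] = rng.randrange(n)  # random-reset mutation
            children.append(child)
        pop = survivors + children
    pop.sort(key=conflicts)
    return pop[0]  # best placement found; may still have conflicts if it never converged
```

With parameters like these the population usually converges to a conflict-free board for n = 8, though convergence is not guaranteed on any particular run; the paper's contribution was adapting this style of search to evolve dither noise masks rather than queen placements.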