My Notes
Explain who our customers are (enterprise) * Sport coverage * Real-time nature * Describe how we use machine learning: 1. Years of historical data 2. Build features based on a deep understanding of the game 3. Refer to the ML panel
Explain what our free-to-play product is
We were not focusing on the last step, publishing markets
* Story starts in the early days * We were building the initial architecture of our odds feed * Many open questions
* Needed a means to publish our data * Focused on building the core element of our product * Integration with machine learning * Building a Rust ML framework
TODO a picture of our feed at a high level TODO highlight filtering and distribution
Apache Kafka is a distributed event streaming platform for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications.
These are the qualities that attracted us to Kafka initially: * Reputation for being fast * Nascent customer integration story Highlight that we picked Kafka because it seemed like a good-enough fit, but we really did not understand the needs of our customers yet
You can use a managed solution, BUT you still need to deal with the complexity locally. Deploying per customer: * Deployed a broker per customer * Terraformed out a topic per customer
* Customer packages (nfl \ mlb \ different markets)
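The per-customer Terraform setup above can be sketched as a simple expansion from a customer's package to its topic names. This is only illustrative: the customer name, sports, and the `<customer>.<sport>.<market>` naming scheme are assumptions, not the real production layout.

```python
# Hypothetical sketch of the per-customer topic layout described above.
# The "<customer>.<sport>.<market>" naming scheme is an assumption.

def topics_for(customer: str, packages: dict[str, list[str]]) -> list[str]:
    """Expand a customer's package (sport -> markets) into topic names."""
    return [
        f"{customer}.{sport}.{market}"
        for sport, markets in sorted(packages.items())
        for market in markets
    ]

# e.g. a customer buying NFL moneyline + spread and MLB moneyline:
acme_package = {"nfl": ["moneyline", "spread"], "mlb": ["moneyline"]}
print(topics_for("acme", acme_package))
# → ['acme.mlb.moneyline', 'acme.nfl.moneyline', 'acme.nfl.spread']
```

In practice each of these names would become one Terraformed topic, which is why the topic count grows multiplicatively with customers and packages.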
* Once a message is written, it is readable for a long time after * This means that stale messages remain until they are dropped from the log by the broker
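The retention behaviour above can be modelled with a toy time-based log (in the spirit of Kafka's `retention.ms` setting; the class and numbers here are illustrative, not Kafka code). The point: a message stays readable long after it is written, so stale odds linger until the broker ages them out.

```python
# Toy model of a Kafka-style log with time-based retention.
# Illustrates the note above: the broker only drops entries older than
# the retention window, so stale messages remain readable until then.

class RetentionLog:
    def __init__(self, retention_s: float):
        self.retention_s = retention_s
        self.entries: list[tuple[float, str]] = []

    def append(self, msg: str, now: float) -> None:
        self.entries.append((now, msg))

    def readable(self, now: float) -> list[str]:
        # Everything inside the retention window, fresh or stale,
        # is still served to consumers.
        return [m for t, m in self.entries if now - t <= self.retention_s]

log = RetentionLog(retention_s=60.0)
log.append("odds v1 (now stale)", now=0.0)
log.append("odds v2", now=30.0)
print(log.readable(now=45.0))  # both versions still readable
print(log.readable(now=80.0))  # v1 aged out, only v2 remains
```

A consumer reading this log cannot assume the first message it sees is current, which matters for a real-time odds feed.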
RBAC is offered for Kafka via the Confluent Metadata Service
* GraphQL * WebSockets/streaming HTTP: still need to build an intermediate layer
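The "intermediate layer" point can be sketched as a minimal in-memory fan-out: something has to sit between the broker and WebSocket/streaming-HTTP clients, duplicating each feed message into every connected client's buffer. The real transport (WebSocket server, broker consumer) is deliberately omitted; this only shows the shape of the component.

```python
import queue

# Minimal sketch of an intermediate fan-out layer between a broker
# consumer and WebSocket clients. Names and structure are illustrative.

class FanOut:
    def __init__(self):
        self.subscribers: list[queue.Queue] = []

    def subscribe(self) -> queue.Queue:
        """Called when a client connects; returns its private buffer."""
        q = queue.Queue()
        self.subscribers.append(q)
        return q

    def publish(self, msg: str) -> None:
        """Called by the broker consumer for each feed message."""
        for q in self.subscribers:
            q.put(msg)

hub = FanOut()
a, b = hub.subscribe(), hub.subscribe()
hub.publish("odds update")
msg_a, msg_b = a.get_nowait(), b.get_nowait()
print(msg_a, msg_b)  # both clients receive the same update
```

Neither Kafka nor Rabbit removes the need for this component when the delivery channel is a browser-facing socket.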
Running in production since June of last year. We now have multiple customers on Rabbit. Easy to provision new ones. We've expanded the use of Rabbit internally.
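One reason provisioning customers on Rabbit is easy is the routing model of topic exchanges: routing keys are dot-separated words, a binding pattern's `*` matches exactly one word, and `#` matches zero or more. The sketch below re-implements that matching rule for illustration only (the binding patterns are hypothetical); a real setup would simply declare queue bindings through a client library.

```python
# Sketch of RabbitMQ topic-exchange binding semantics:
# '*' matches exactly one dot-separated word, '#' matches zero or more.
# Re-implemented here purely to illustrate why per-customer packages
# map cleanly onto bindings; not actual RabbitMQ client code.

def matches(pattern: str, routing_key: str) -> bool:
    return _match(pattern.split("."), routing_key.split("."))

def _match(pat: list[str], key: list[str]) -> bool:
    if not pat:
        return not key
    if pat[0] == "#":
        # '#' absorbs zero or more words
        return any(_match(pat[1:], key[i:]) for i in range(len(key) + 1))
    if not key:
        return False
    return pat[0] in ("*", key[0]) and _match(pat[1:], key[1:])

# Hypothetical bindings for a customer's package:
print(matches("nfl.#", "nfl.game123.moneyline"))      # True
print(matches("nfl.*", "nfl.game123.moneyline"))      # False: '*' is one word
print(matches("#.moneyline", "mlb.game9.moneyline"))  # True
```

Under this model, adding a customer is mostly a matter of declaring a queue and a handful of binding patterns rather than terraforming new topics.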