**Summary**
This release includes an overhaul of the model APIs. Because this repository started as an internal tool for hosting inference logic, the APIs were tailored to an HTTP interface. With this release, we have made using `inference` directly from your Python code much smoother, and we have updated imports to be less verbose. See the README and docs for the new usage. Additionally, a new interface is provided for consuming video streams and broadcasting the results over UDP. This interface is tuned for low latency and is ideal for use cases that need the most up-to-date information possible from a video stream. See https://blog.roboflow.com/udp-inference/ for more details.
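As a rough sketch of what the stream interface looks like in practice (the `Stream` class name, its constructor arguments, and the callback signature below are assumptions for illustration, not confirmed API; consult the README and docs for exact usage):

```python
# Hypothetical usage sketch of the low-latency video stream interface.
# Names and parameters are illustrative assumptions, not the confirmed API.
import inference


def on_prediction(predictions, frame):
    # Called with results for the most recent frame; stale frames are skipped
    # so downstream consumers always see the freshest predictions.
    print(predictions)


inference.Stream(
    source=0,                     # webcam index or a video/RTSP URL (assumed parameter)
    model="my-project/1",         # placeholder Roboflow model ID
    on_prediction=on_prediction,  # results can then be broadcast, e.g. over UDP
)
```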
**Breaking Changes**
The main change is that model `infer()` functions now take keyword arguments instead of a single `request` argument. To continue inferring from request objects, a new `infer_from_request()` method is provided.
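A minimal sketch of the two call styles is shown below; the model loader, model ID, and parameter names are assumptions for illustration, so check the README and docs for the exact signatures:

```python
# Illustrative sketch only: loader name, model ID, and keyword arguments are assumed.
from inference.models.utils import get_roboflow_model

model = get_roboflow_model(model_id="my-project/1", api_key="YOUR_API_KEY")

# New style: pass the image and options directly as keyword arguments.
predictions = model.infer(image="path/to/image.jpg", confidence=0.5)

# Previous style: build a request object and pass it to the new method.
# request = ObjectDetectionInferenceRequest(image=..., confidence=0.5)  # assumed request type
# predictions = model.infer_from_request(request)
```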