How to Steal a Model

What does it mean to steal a model?  It means that someone (the thief, presumably) can re-create the model's predictions without having access to the algorithm itself or to the training data.  Sound far-fetched?  It isn't.  If an attacker can request predictions from the model, and asks just the right questions, the model can be reverse-engineered right out from under you.
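The core idea can be sketched in a few lines: query the victim model as a black box, record its answers, and fit a surrogate model to those query/answer pairs.  This is a minimal illustration, not any particular published attack; the victim model, the synthetic data, and the `predict_api` function are all hypothetical stand-ins.

```python
# Minimal sketch of query-based model extraction (hypothetical setup).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(0)
X, y = make_classification(n_samples=500, n_features=5, random_state=0)
victim = LogisticRegression().fit(X, y)  # the model being "stolen"

# The thief never sees X, y, or the victim's weights -- only a prediction API.
def predict_api(queries):
    return victim.predict(queries)

# Step 1: craft queries that cover the input space.
queries = rng.uniform(-4, 4, size=(2000, 5))
# Step 2: label each query with the victim's own answer.
stolen_labels = predict_api(queries)
# Step 3: fit a surrogate on the query/answer pairs.
surrogate = LogisticRegression().fit(queries, stolen_labels)

# How often does the surrogate match the victim on fresh inputs?
fresh = rng.uniform(-4, 4, size=(1000, 5))
agreement = (surrogate.predict(fresh) == predict_api(fresh)).mean()
print(f"surrogate agrees with victim on {agreement:.1%} of fresh queries")
```

With enough well-spread queries, the surrogate's decision boundary converges on the victim's, even though the thief never touched the training data.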

Relevant links: