Deploy Python decision models as hosted apps directly from a Jupyter Notebook

Push your Python decision model from a local file to a remote application in minutes – whether you’re using a notebook or running in another Python environment. Conduct tests with fully featured experimentation tooling, collaborate and share results with teammates, and get observability into model performance.

Democratizing operations research includes making it easier to get decision models to production, no matter where you start working on them. In the same way that platforms like Databricks provide MLOps workflows for getting machine learning models into production from notebook environments, updates to the Nextmv Python SDK allow you to deploy a decision (optimization) model directly from notebook tooling to enterprise-grade infrastructure with tools for testing, collaboration, and model management.  

From iterative local prototyping to quick chart creation, there are a number of reasons why decision modeling work often starts in notebooks. We’ve also seen more teams with varied backgrounds, such as data science, getting involved in the OR space – and we want to meet practitioners where they’re already working on their models: in this case, notebooks.

Let’s say you’ve been building a decision model in a Jupyter Notebook. Development is going well. The model runs locally, produces expected results, and you’re ready to share it with teammates. You can screen share with a teammate while you run the model, or send them the file to run – and hope it works (and produces similar results) on their machine. But what happens when you want to see how your new model compares to the one currently running in production? How do you efficiently get stakeholder buy-in to move forward? How do you make the model more accessible for debugging and further analysis? You could package up the model and hand it over to your software engineering team to deploy, but you know they’re juggling a long list of requests – and OR projects can be tricky to implement.

What if you could deploy it to remote infrastructure directly from the notebook you’re already working in right now? When you’re ready to take the next step in the workflow, a DecisionOps platform will accelerate the path with model management, larger-scale experimentation, and production operations. 

Nextmv + Python = Accelerated model workflows

We’ve been simplifying the decision modeling workflow in Python, making it easier to go from development to production all from the same Python environment. The Nextmv Python SDK follows a common decision modeling pattern: the model takes input from a source, consumes options (or parameters), and uses some technology (like a solver) to produce a solution. That solution is returned as part of an output that includes statistics for further analysis. The SDK formalizes this pattern so you can develop your model in fewer lines of code and get to production faster.
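
As a rough illustration, here’s a minimal sketch of that pattern using the nextmv package. The class and field names (Model, Input, Output, Options, Parameter, Statistics) follow the SDK’s documented pattern, but treat the exact signatures as assumptions and confirm them against the Nextmv docs for your SDK version.

```python
import nextmv

# Options (parameters) the model consumes, e.g., a maximum solve duration.
# The Parameter signature here is illustrative; check the Nextmv docs.
options = nextmv.Options(
    nextmv.Parameter("duration", int, 30, "Maximum solve time in seconds.", False),
)


class DecisionModel(nextmv.Model):
    """Input in, options consumed, solution and statistics out."""

    def solve(self, input: nextmv.Input) -> nextmv.Output:
        # input.data holds the payload (e.g., a dict loaded from JSON) and
        # input.options carries any parameters, such as "duration" above.
        items = input.data["items"]

        # Placeholder "solver": select every item. A real model would call a
        # solver here (HiGHS, Gurobi, nextroute, ...).
        solution = {"chosen": [item["id"] for item in items]}

        return nextmv.Output(
            solution=solution,
            statistics=nextmv.Statistics(
                result=nextmv.ResultStatistics(value=float(len(solution["chosen"]))),
            ),
        )
```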

Now you can use Nextmv to work with your model entirely in a Python environment. This means that you can deploy, run, and test the model without translating it to other formats like .mps or .lp files. With the new API, a decision model is handled as a regular Python class that you can run locally or push and run remotely – all without leaving Python.
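
Concretely, the model behaves like any other Python object. The snippet below is a sketch that assumes the DecisionModel class from the previous example and that nextmv.Input accepts a plain dict as its data payload; double-check constructor details in the Nextmv docs.

```python
# Build an input directly from Python data structures -- no .mps or .lp files.
# The Input constructor arguments are assumptions; see the Nextmv docs.
input = nextmv.Input(data={"items": [{"id": "pen"}, {"id": "notebook"}]})

# The model is just a class instance; solving locally is a method call.
model = DecisionModel()
output = model.solve(input)

print(output.solution)           # e.g., {"chosen": ["pen", "notebook"]}
print(output.statistics.result)  # statistics travel with the solution
```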

How to deploy your local decision model from a Jupyter Notebook

Follow these steps to push your Python model to Nextmv from your notebook in just a few minutes:

  1. Create a Nextmv account and start a plan  
  2. Create a custom app via the Nextmv UI
  3. Copy your API key from your account and add it as an environment variable
  4. Download a notebook template from one of our community apps in GitHub that showcases the new experience:
    1. Python nextroute VRP
    2. Python Gurobi knapsack
    3. Python HiGHS knapsack
  5. Follow the instructions in the notebook template (see the code sketch after this list):
    1. Install the necessary Python packages
    2. Add the necessary imports
    3. Create the decision model
    4. Run the model locally
    5. Push the model to Nextmv Cloud
    6. Run the model remotely

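As a rough sketch of what steps 5.4–5.6 can look like in code, the snippet below assumes the DecisionModel class and options from the earlier examples, a custom app with the hypothetical ID "knapsack" already created in the Nextmv UI (step 2), and the NEXTMV_API_KEY environment variable set (step 3). The cloud client, push, and run calls follow the SDK’s documented pattern, but verify names and arguments against the current Nextmv docs.

```python
import os

import nextmv
import nextmv.cloud

# Authenticate with the API key stored as an environment variable (step 3).
client = nextmv.cloud.Client(api_key=os.environ["NEXTMV_API_KEY"])

# Reference the custom app created in the Nextmv UI (step 2).
# "knapsack" is a hypothetical app ID used for illustration.
app = nextmv.cloud.Application(client=client, id="knapsack")

# Run the model locally first (step 5.4) to sanity-check the results.
local_output = DecisionModel().solve(
    nextmv.Input(data={"items": [{"id": "pen"}, {"id": "notebook"}]})
)

# Push the model to Nextmv Cloud (step 5.5). The model_configuration describes
# how to package the Python model; names and arguments here are assumptions.
app.push(
    model=DecisionModel(),
    model_configuration=nextmv.cloud.ModelConfiguration(
        name="knapsack-model",
        requirements=["nextmv"],
        options=options,
    ),
)

# Run the model remotely (step 5.6) with the same kind of input used locally.
result = app.new_run_with_result(
    input={"items": [{"id": "pen"}, {"id": "notebook"}]},
)
print(result.output)
```
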
Here are a few video examples that walk through deploying a local model to Nextmv and running it remotely using various optimization tooling.

Pushing a HiGHS knapsack model from a notebook to Nextmv

Pushing a Gurobi knapsack model from a notebook to Nextmv

Pushing a VRP model from a notebook to Nextmv

Get started with Nextmv today

If you’re working in a notebook, get your decision model into production with Nextmv! Test it, share results with stakeholders, and manage the model on our DecisionOps platform.

Create a Nextmv account and start a free trial to try out this experience. Have questions? Reach out directly to our team.
