Spawn Training Jobs Directly From Notebooks

· 2 min read

You can now convert a notebook directly into a training job, making it easy to run independent experiments while you work on your projects. Unlike copying the notebook into another notebook job, training jobs run autonomously, send their output to the location you specify, and terminate automatically when finished.

Easy Notebook Forking For Rapid Experimentation

· 2 min read

proxiML notebooks can now be forked into new instances to enable easy parallel experimentation. Unlike other cloud notebooks, when you fork a proxiML notebook, the entire working directory is copied. All datasets, checkpoints, and other data are copied into the new notebook.

Making Datasets More Flexible and Expanding Environment Options

· 4 min read

Persistent Datasets just got even better. Not only can you use the same dataset across many jobs in parallel at no additional charge, you can now attach multiple datasets to a single job for free. You can also dynamically change the datasets attached to any notebook job as your needs evolve during model development. Additionally, more options have been added for job base environments, letting you save time and storage quota by using specific versions of popular frameworks.

Kaggle Datasets and API Integration

· 5 min read

Customers using proxiML to compete in Kaggle competitions or using public Kaggle datasets for analysis can now directly populate proxiML datasets from Kaggle competitions or datasets, as well as automatically load their Kaggle account credentials into notebook and training jobs to use for competition or kernel submissions.
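For background, the Kaggle API (which this integration builds on) authenticates via a `kaggle.json` credentials file in `~/.kaggle/`, containing your Kaggle username and API key. A minimal sketch of writing that file yourself, with placeholder values (the `write_kaggle_credentials` helper and the target directory are illustrative, not part of the proxiML or Kaggle SDKs):

```python
import json
from pathlib import Path

def write_kaggle_credentials(username: str, key: str, kaggle_dir: Path) -> Path:
    """Write a kaggle.json file in the format the Kaggle API expects.

    By default the Kaggle CLI looks for it at ~/.kaggle/kaggle.json;
    'username' and 'key' come from kaggle.com -> Account -> Create New API Token.
    """
    kaggle_dir.mkdir(parents=True, exist_ok=True)
    cred_path = kaggle_dir / "kaggle.json"
    cred_path.write_text(json.dumps({"username": username, "key": key}))
    # The Kaggle CLI warns unless the file is readable only by its owner.
    cred_path.chmod(0o600)
    return cred_path
```

With proxiML's integration, this step is handled for you when your Kaggle credentials are attached to a notebook or training job.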