Customer Provided Job Environments
Customers with prebuilt docker images can now use them as the job environment for any job type.
The proxiML platform has been extended to support deploying models as REST API endpoints. These fully managed endpoints give you the real-time predictions you need for production applications without having to worry about servers, certificates, networking, or web development.
proxiML jobs now accept lists of packages to be installed with apt, pip, or conda as part of the job creation process, and will automatically install any dependencies listed in the requirements.txt file at the root of the model code working directory.
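As a minimal sketch of the requirements.txt convention described above: place the file at the root of the model code directory you attach to a job, and its packages are installed with pip during job creation. The directory path and package names below are illustrative, not part of the proxiML documentation.

```shell
# Stand-in for your model code directory (illustrative; use your real path).
MODEL_DIR=$(mktemp -d)

# A requirements.txt at the root of the model code directory is picked up
# automatically; each line names a pip package to install.
cat > "$MODEL_DIR/requirements.txt" <<'EOF'
numpy
pandas
EOF
```

You would then point a job at that directory, e.g. with the CLI's --model-dir option shown later in these notes.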
Now your entire team or organization can share a single credit balance managed by a central account.
You can now start any job type from model code stored on your local computer without committing the code to a git repository. In combination with the proxiML CLI, starting a notebook from your local computer is as simple as:
proximl job create notebook --model-dir ~/model-code --data-dir ~/data "My Notebook"
Enjoy the "big ferocious" performance of NVIDIA's Ampere-based RTX 3090 for less than $1 an hour. Supplies are limited, so reserve one while you can.
The proxiML platform has been extended to support batch inference jobs, enabling customers to use proxiML for all stages of the machine learning pipeline that require GPU acceleration.
The proxiML platform now allows customers to store models permanently and reuse those models for as many notebook and training jobs as desired.
You can now view summary details of the contents of a created dataset from the user interface.