
HYPERSTACK WEEKLY RUNDOWN #12: Latest Edition

Written by Damanpreet Kaur Vohra | Dec 2, 2024 9:18:48 AM

Welcome to the Hyperstack Weekly! We’re excited to bring you this week’s highlights, packed with updates to enhance your experience. From new features to fresh tutorials, there’s something for everyone.

So, grab your coffee, settle in and let’s explore what’s new!

Snapshots Feature is Now Live

Snapshots allow you to capture a VM's exact state, including configuration and bootable volume data for quick restoration. During the process, active VMs are temporarily paused, snapshotted and restarted, providing a rollback point before significant changes. This feature helps optimise resources and minimise backup costs while preserving VM states efficiently. Learn more about our snapshot feature here.
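If you manage your infrastructure programmatically, snapshots can also be triggered through the Hyperstack API. The sketch below is a rough Python illustration only: the base URL, endpoint path, header name and payload fields are assumptions for this example, so check the snapshot documentation linked above for the exact request format.

```python
# Illustrative sketch only: create a snapshot of a VM via the Hyperstack API.
# The base URL, endpoint path, header name and payload fields below are
# assumptions for this example -- consult the snapshot documentation for
# the real request format.
import os
import requests

API_BASE = "https://infrahub-api.nexgencloud.com/v1"  # assumed base URL
API_KEY = os.environ["HYPERSTACK_API_KEY"]            # your API key

vm_id = 12345  # placeholder ID of the VM to snapshot

response = requests.post(
    f"{API_BASE}/core/virtual-machines/{vm_id}/snapshots",  # assumed path
    headers={"api_key": API_KEY},
    json={"name": "pre-upgrade", "description": "Rollback point before changes"},
    timeout=30,
)
response.raise_for_status()
print(response.json())
```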

To take a snapshot from the console, go to the VM details page, open the "More Options" dropdown and select the "Snapshot" option. See the image below for reference:

Enhanced Kubernetes Stability

Our Kubernetes service is now more reliable than ever. While the service is still in beta, we’ve shipped significant updates that let you create larger clusters with improved stability. Have you tried our Kubernetes Clusters Beta API yet? If not, check out the API documentation here!
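To give a feel for the workflow, here is a minimal Python sketch of creating a cluster through the beta API. The endpoint path, payload fields and flavour name are assumptions for illustration; the API documentation linked above has the actual request format.

```python
# Illustrative sketch only: create a Kubernetes cluster via the
# Hyperstack Kubernetes Clusters Beta API. Endpoint path, payload fields
# and flavour name are assumptions -- see the API documentation for the
# real request shape.
import os
import requests

API_BASE = "https://infrahub-api.nexgencloud.com/v1"  # assumed base URL
API_KEY = os.environ["HYPERSTACK_API_KEY"]

payload = {
    "name": "demo-cluster",                  # placeholder cluster name
    "environment_name": "default-CANADA-1",  # assumed environment name
    "node_count": 3,                         # assumed worker node count
    "node_flavor": "n1-RTX-A6000x1",         # assumed GPU flavour
}

response = requests.post(
    f"{API_BASE}/core/clusters",  # assumed endpoint path
    headers={"api_key": API_KEY},
    json=payload,
    timeout=60,
)
response.raise_for_status()
print(response.json())
```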

New in Our Blog

This week’s edition is full of tutorials. Here’s a quick look at what’s new:

What are Open Source Models: Get Started with Open Source Models on Hyperstack

Running large-scale open-source models can be resource-intensive. Hyperstack provides innovative features and straightforward deployment options for running open-source models while keeping costs in check. Check out our latest tutorial to get started.
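As a taste of what the tutorial covers, here is a minimal, generic sketch of running an open-source model on a GPU VM with the Hugging Face Transformers library. The model name is only an example, and the Hyperstack-specific deployment steps are covered in the tutorial itself.

```python
# Generic sketch: run an open-source model on a GPU VM with Hugging Face
# Transformers. The model name is only an example; Hyperstack-specific
# deployment steps are covered in the tutorial linked above.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="Qwen/Qwen2.5-1.5B-Instruct",  # example open-source model
    device_map="auto",                   # place weights on the available GPU(s)
    torch_dtype="auto",                  # use the checkpoint's native precision
)

result = generator(
    "Explain in one sentence what an open-source model is.",
    max_new_tokens=64,
)
print(result[0]["generated_text"])
```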

How to Run a Docker Container for AI Applications: A Comprehensive Guide on Hyperstack

Running Docker containers on Hyperstack is an efficient way to deploy AI applications, leveraging high-performance infrastructure such as NVIDIA GPUs and NVLink for faster, more reliable results. Want to get started? Check out our full tutorial here!
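One practical detail when containerising AI workloads is confirming the container can actually see the host's GPUs. The snippet below is a generic PyTorch check (not a Hyperstack-specific API) that you might run inside a container started with GPU access, for example via `docker run --gpus all`.

```python
# Generic check: confirm that GPUs passed through to the container
# (e.g. started with `docker run --gpus all ...`) are visible to PyTorch.
import torch

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        print(f"GPU {i}: {torch.cuda.get_device_name(i)}")
else:
    print("No GPU visible inside the container; check the --gpus flag "
          "and the NVIDIA Container Toolkit installation on the host.")
```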


Getting Started with Hyperstack LLM Inference Toolkit: A Detailed Guide

The Hyperstack LLM Inference Toolkit is an open-source tool designed to simplify the deployment, management and testing of Large Language Models (LLMs) using Hyperstack. Want to get started with this toolkit? See our tutorial here.
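For a flavour of what querying a deployed model might look like, here is a rough Python sketch. The address, endpoint path and request fields are assumptions for illustration only; the tutorial documents the toolkit's actual API.

```python
# Illustrative sketch only: send a prompt to an LLM deployed with the
# Hyperstack LLM Inference Toolkit. The address, endpoint path and request
# fields are assumptions -- the tutorial linked above documents the
# toolkit's actual API.
import requests

TOOLKIT_URL = "http://localhost:8000"  # assumed address of the toolkit's API

response = requests.post(
    f"{TOOLKIT_URL}/v1/completions",   # assumed OpenAI-style endpoint
    json={
        "model": "llama-3-8b-instruct",  # placeholder model name
        "prompt": "Summarise what an LLM inference toolkit does.",
        "max_tokens": 64,
    },
    timeout=60,
)
response.raise_for_status()
print(response.json())
```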

Hear It from Our Happy Customers 💬

Hear it from those who’ve partnered with us. Our community is always happy with our cloud services, and recently Jinxi shared his experience with Hyperstack:

Be the Next to Share Your Success Story with Hyperstack

We hope you enjoyed this week’s updates as much as we enjoyed putting them together. Stay tuned for the next edition. Until then, don’t forget to subscribe to our newsletter below for exclusive insights and the latest scoop on AI and GPUs, delivered right to your inbox!

Missed the Previous Editions? 

Catch up on everything you need to know from Hyperstack Weekly below:

👉 Hyperstack Weekly Rundown #10

👉 Hyperstack Weekly Rundown #11