
Hyperparameter tuning with MLOps platform

Watch the webinar on-demand

Watch now

Developing AI/ML models is no longer a novelty for most organisations, but controlling model behaviour is still challenging. Models must be tuned correctly to avoid suboptimal results. To address this need, MLOps platforms have started including dedicated tuning solutions.

What is hyperparameter tuning?

Hyperparameters configure how model parameters are computed. They are specific to the algorithm used for modelling, and their values cannot be estimated from the data. This distinguishes them from model parameters, which the algorithm learns or estimates from the data and keeps updating during the training process.

Hyperparameter tuning is the process of finding a set of optimal hyperparameter values for a learning algorithm. It is a necessary step to get the best performance out of an algorithm on any given data set.
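To make the distinction concrete, here is a minimal, self-contained sketch of the simplest tuning strategy, grid search, in plain Python. It is illustrative only and not tied to Kubeflow or any MLOps platform: the regularisation strength `lam` is a hyperparameter chosen before training, while the slope `w` is a model parameter learned from the data.

```python
import random

def fit_ridge(xs, ys, lam):
    """Closed-form 1-D ridge fit: w = sum(x*y) / (sum(x^2) + lam).
    The learned slope w is a model parameter; lam is a hyperparameter."""
    num = sum(x * y for x, y in zip(xs, ys))
    den = sum(x * x for x in xs) + lam
    return num / den

def mse(xs, ys, w):
    """Mean squared error of the linear model y = w * x."""
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

# Synthetic data: true slope 2.0 plus a little Gaussian noise.
random.seed(0)
train_x = [random.uniform(-1, 1) for _ in range(50)]
train_y = [2.0 * x + random.gauss(0, 0.1) for x in train_x]
val_x = [random.uniform(-1, 1) for _ in range(50)]
val_y = [2.0 * x + random.gauss(0, 0.1) for x in val_x]

# Grid search: train one model per candidate hyperparameter value and
# keep the value that gives the lowest error on held-out validation data.
best = None
for lam in [0.0, 0.01, 0.1, 1.0, 10.0]:
    w = fit_ridge(train_x, train_y, lam)
    err = mse(val_x, val_y, w)
    if best is None or err < best[1]:
        best = (lam, err, w)

print(f"best lam={best[0]}, val MSE={best[1]:.4f}, learned w={best[2]:.3f}")
```

Real tuning systems, such as Kubeflow's Katib component discussed in the webinar, apply the same train-and-evaluate loop at scale and with smarter search strategies than an exhaustive grid.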

Watch our webinar to learn about:

  • Hyperparameter tuning
  • MLOps’ role in hyperparameter tuning
  • How you can use Kubeflow for this process

Speakers:

  • Michal Hucko - Charmed Kubeflow engineer
  • Andreea Munteanu - MLOps Product Manager

Read more about Charmed Kubeflow and how to get started with AI.