Convex optimization has witnessed considerable progress, mainly due to the development of powerful algorithms and software. In particular, one class of convex problems, called self-scaled, can be solved especially efficiently. This class encompasses a large number of real-life convex optimization problems, including linear and semidefinite programming, and it is best described using an algebraic structure known as a formally real (or Euclidean) Jordan algebra, which provides an elegant and powerful unifying framework for its study. This book offers an extensive and self-contained description of these algebras. Our work focuses on the so-called spectral functions on formally real Jordan algebras, a natural generalization of spectral functions of symmetric matrices. Based on an original variational analysis of eigenvalues in Jordan algebras, we discuss their most important properties, such as differentiability and convexity. We show how these results can be applied to extend several algorithms that exist for linear or second-order cone programming to the general class of self-scaled problems, e.g. the powerful smoothing techniques of Nesterov.
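To illustrate the matrix case that the book generalizes: a spectral function of a symmetric matrix X is a function of the form F(X) = f(λ(X)), where λ(X) is the vector of eigenvalues of X and f is a symmetric (permutation-invariant) function. The following is a minimal NumPy sketch of this idea; the function names and the example matrix are illustrative, not taken from the book.

```python
import numpy as np

def spectral_function(X, f):
    """Evaluate the spectral function F(X) = f(lambda(X)),
    where f acts on the eigenvalue vector of the symmetric matrix X."""
    eigenvalues = np.linalg.eigvalsh(X)  # real eigenvalues of a symmetric matrix
    return f(eigenvalues)

# A small symmetric matrix as an example.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# The sum of squared eigenvalues equals the squared Frobenius norm of A,
# so this particular spectral function does not depend on the eigenbasis.
value = spectral_function(A, lambda lam: np.sum(lam**2))
print(value)
```

Because f is permutation-invariant, F depends on X only through its eigenvalues; properties of f such as convexity and differentiability transfer to F, which is the phenomenon the book studies in the broader setting of formally real Jordan algebras.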