Since its introduction by Bradley Efron in 1979, the bootstrap has become one of the most efficient and fastest-growing methods of statistical analysis, used not only by statisticians but also by researchers in economics, finance, the medical and life sciences, the social sciences, and business. However, applications of the bootstrap have so far focused largely on independent and identically distributed (iid) data and, to a lesser extent, on weakly dependent data structures; very little attention has been paid to the performance of the bootstrap under strongly dependent (long-memory) processes. This work lays the mathematical foundation for applying the parametric bootstrap to regression processes whose disturbance terms are strongly dependent. It is shown that, under certain conditions on the regression coefficients, the spectral density function, and the parameter values, the parametric bootstrap based on the plug-in log-likelihood (PLL) function of a linear regression process with Gaussian, stationary, long-memory errors provides higher-order improvements over the traditional delta method.
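To make the setting concrete, the following sketch (an illustration of the general idea, not the paper's PLL construction or its higher-order analysis) runs a parametric bootstrap for a linear regression whose errors are Gaussian and long-memory. The errors are modeled here as fractional Gaussian noise; the Hurst parameter `H` is estimated by a simple profile-likelihood grid search, and all names (`fgn_cov`, `gauss_loglik`, the grid bounds) are choices made for this example only.

```python
import numpy as np

rng = np.random.default_rng(0)

def fgn_cov(n, H, sigma2=1.0):
    """Covariance matrix of fractional Gaussian noise of length n."""
    k = np.arange(n)
    gamma = 0.5 * sigma2 * (np.abs(k + 1)**(2*H)
                            - 2.0 * np.abs(k)**(2*H)
                            + np.abs(k - 1)**(2*H))
    lags = np.abs(np.subtract.outer(k, k))
    return gamma[lags]

def gauss_loglik(resid, H):
    """Gaussian log-likelihood of residuals, with the scale profiled out."""
    n = len(resid)
    L = np.linalg.cholesky(fgn_cov(n, H))   # unit-scale correlation structure
    z = np.linalg.solve(L, resid)
    sigma2_hat = z @ z / n                  # profiled innovation variance
    logdet = 2.0 * np.sum(np.log(np.diag(L)))
    return -0.5 * (n * np.log(sigma2_hat) + logdet), sigma2_hat

# Simulate a regression with long-memory Gaussian errors (H = 0.8).
n, beta = 200, np.array([1.0, 0.5])
X = np.column_stack([np.ones(n), np.linspace(0.0, 1.0, n)])
L_true = np.linalg.cholesky(fgn_cov(n, 0.8))
y = X @ beta + L_true @ rng.standard_normal(n)

# Step 1: plug in the OLS coefficients, then profile the likelihood over H.
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta_hat
grid = np.linspace(0.55, 0.95, 41)
H_hat = grid[int(np.argmax([gauss_loglik(resid, H)[0] for H in grid]))]
_, s2_hat = gauss_loglik(resid, H_hat)

# Step 2: parametric bootstrap -- redraw errors from the *fitted* Gaussian
# long-memory model rather than resampling residuals, then re-estimate beta.
L_hat = np.linalg.cholesky(s2_hat * fgn_cov(n, H_hat))
B = 500
boot = np.empty((B, 2))
for b in range(B):
    y_b = X @ beta_hat + L_hat @ rng.standard_normal(n)
    boot[b] = np.linalg.lstsq(X, y_b, rcond=None)[0]

se_boot = boot.std(axis=0)   # bootstrap standard errors for the coefficients
```

The key point mirrored from the abstract is that the bootstrap samples are drawn from a fitted parametric model of the dependent errors, so the resulting standard errors account for the long-range dependence that an iid resampling scheme would miss.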