Molten salt reactors and pyroprocessing are widely considered for various nuclear applications. The main challenges in monitoring these systems are high temperature and strong radiation. These harsh conditions require a monitoring system that can measure nuclides at a stand-off distance with sufficient resolution to discriminate many different elements simultaneously. Among the available methodologies, laser-induced breakdown spectroscopy (LIBS) has been the most widely studied, as it provides the required stand-off distance and multi-element measurement capability. However, changes in the molten salt level introduce uncertainty into nuclide concentration measurements by LIBS: the spectra can vary with the focal point because the laser fluence and plasma shape differ. In this study, to address such uncertainties, we evaluated a LIBS monitoring system combined with machine learning. Although machine learning does not exploit prior knowledge of atomic spectra, it extracts new latent variables, as vectors, from the data as a whole, including the noise, the target spectrum, and their variability. Partial least squares (PLS) and an artificial neural network (ANN) were studied because they represent linear and nonlinear machine learning methods, respectively. Sr (580–7200 ppm) and Mo (480–4700 ppm), as representative fission products, were investigated to construct the prediction models. To acquire the data, experiments were conducted at 550°C in LiCl-KCl using a glassy carbon crucible, and the LIBS technique was used to accumulate spectral data. We successfully obtained reliable prediction models and compared the two approaches. Both models showed high linearity, with R² values above 0.98. In addition, the root-mean-square errors of calibration and cross-validation (RMSEC and RMSECV) were used to evaluate the prediction models quantitatively.
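To illustrate the linear branch of the workflow described above, the sketch below fits a PLS1 regression (NIPALS algorithm, implemented directly in NumPy) to synthetic LIBS-like spectra and reports RMSEC and leave-one-out RMSECV. This is not the authors' code: the emission-line shape, noise level, and number of latent components are illustrative assumptions; only the Sr concentration range (580–7200 ppm) is taken from the abstract.

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Fit a PLS1 regression by NIPALS on mean-centered X (n, m) and y (n,).
    Returns coefficients b such that y_hat = X @ b."""
    Xk, yk = X.copy(), y.astype(float).copy()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xk.T @ yk                       # weight: covariance direction
        w /= np.linalg.norm(w)
        t = Xk @ w                          # scores
        tt = t @ t
        p = Xk.T @ t / tt                   # X loadings
        qk = yk @ t / tt                    # y loading
        Xk = Xk - np.outer(t, p)            # deflate X
        yk = yk - qk * t                    # deflate y
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.solve(P.T @ W, q)  # b = W (P^T W)^{-1} q

# --- synthetic demo data (hypothetical Sr line; not measured spectra) ---
rng = np.random.default_rng(0)
n, m = 40, 200
wl = np.linspace(0.0, 1.0, m)                        # normalized wavelength axis
peak = np.exp(-((wl - 0.4) ** 2) / 0.002)            # assumed Gaussian emission line
conc = rng.uniform(580.0, 7200.0, n)                 # Sr range from the study, ppm
X = np.outer(conc / 7200.0, peak) + 0.01 * rng.standard_normal((n, m))

# RMSEC: fit on all spectra, predict the same spectra
Xc, yc = X - X.mean(axis=0), conc - conc.mean()
b = pls1_fit(Xc, yc, n_components=3)
rmsec = np.sqrt(np.mean((Xc @ b - yc) ** 2))

# RMSECV: leave-one-out cross-validation
errs = []
for i in range(n):
    mask = np.arange(n) != i
    Xm, ym = X[mask].mean(axis=0), conc[mask].mean()
    bi = pls1_fit(X[mask] - Xm, conc[mask] - ym, n_components=3)
    errs.append((X[i] - Xm) @ bi + ym - conc[i])
rmsecv = np.sqrt(np.mean(np.square(errs)))

print(f"RMSEC  = {rmsec:.1f} ppm")
print(f"RMSECV = {rmsecv:.1f} ppm")
```

An ANN would replace `pls1_fit` with a nonlinear regressor trained on the same centered spectra; the RMSEC/RMSECV comparison is what lets the two model families be ranked quantitatively, as done in the study.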