The constant improvement of silicon technology has resulted in a continuous reduction of the feature size of electronic devices. These technological developments strongly impact device performance. One of the most important reliability issues in recent technology nodes is Random Telegraph Noise (RTN), a phenomenon that affects the threshold voltage (VT) of the memory cell. A physics-based statistical model is presented and validated. Beyond the 60nm technology node, the granularity of the charge stored in the device is no longer negligible, and this fact limits the accuracy of the program operation. The statistical nature of electron injection into the floating gate spreads the VT distribution of the array cells. Feature size reduction worsens this spread, making it increasingly difficult to control the cell VT through the program algorithm. Another issue in sub-100nm technologies concerns cell-to-cell interference: the VT loss depends on the program pattern of the adjacent cells. All these phenomena together corrupt the stored data, causing memory failure and, ultimately, wear-out.
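The spread of the VT distribution caused by the discrete nature of injected charge can be illustrated with a small Monte Carlo sketch. The model below is a simplified assumption, not the physics-based model referenced above: the number of electrons injected per program pulse is taken as Poisson distributed, and each stored electron shifts VT by q/C, where the coupling capacitance `C_PP` and the mean electron count are hypothetical values chosen only for illustration.

```python
import math
import random
import statistics

Q = 1.602e-19                 # electron charge [C]
C_PP = 1.0e-16                # hypothetical floating-gate coupling capacitance [F]
DVT_PER_ELECTRON = Q / C_PP   # VT shift caused by one stored electron [V]

def sample_poisson(mean, rng):
    """Knuth's method: count uniform draws until their product falls below e^-mean."""
    limit = math.exp(-mean)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= limit:
            return k - 1

def program_pulse_vt_shifts(mean_electrons, n_cells, seed=0):
    """Monte Carlo: VT shift of each array cell after one program pulse.

    The injected electron count per cell is Poisson distributed, so even
    identical cells receiving the same pulse end up with different VT shifts.
    """
    rng = random.Random(seed)
    return [sample_poisson(mean_electrons, rng) * DVT_PER_ELECTRON
            for _ in range(n_cells)]

shifts = program_pulse_vt_shifts(mean_electrons=100, n_cells=5000)
mean_shift = statistics.fmean(shifts)   # average VT shift of the array [V]
spread = statistics.stdev(shifts)       # statistical spread of the VT shift [V]
```

With these assumed numbers the spread scales as sqrt(N)·q/C, so as scaling reduces the mean electron count N and the capacitance C, the relative spread grows, which is the trend the text describes.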