In recent years, much effort has been devoted to Collective Classification (CC) techniques for predicting labels of linked instances. Given a large amount of labeled data, conventional CC algorithms exploit local labeled neighbors to increase accuracy. However, in many real-world applications, labeled data are limited and expensive to obtain. In this situation, most instances have no connection to labeled data, so supervision knowledge cannot be obtained from local connections. Recently, Semi-Supervised Collective Classification (SSCC) has been studied to leverage unlabeled data to enhance the classification performance of CC. In this paper, we propose a probabilistic Generative Model with Network Regularization (GMNR) for SSCC. Our main idea is to compute label probability distributions for unlabeled instances by maximizing both the log-likelihood of the generative model and the label smoothness over the network topology of the data. The proposed generative model is based on Probabilistic Latent Semantic Analysis (PLSA) applied to the attribute features of all instances. A network regularizer is employed to smooth the label probability distributions over the network topology. Finally, we develop an effective EM algorithm to compute the label probability distributions for label prediction. Experimental results on three real sparsely labeled network datasets show that the proposed GMNR model outperforms state-of-the-art CC algorithms and other SSCC algorithms.
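To illustrate the network-regularization idea mentioned above, the following is a minimal, hypothetical sketch: per-instance label probability distributions are iteratively blended with the average distribution of their network neighbors, so that linked instances end up with similar distributions. The function name, the update rule, and all parameters are illustrative assumptions, not the paper's actual GMNR algorithm, which couples this smoothing with a PLSA log-likelihood inside an EM procedure.

```python
import numpy as np

def smooth_labels(P, adj, lam=0.5, iters=50):
    """Toy network smoother (illustrative assumption, not GMNR itself).

    P    : (n, k) array; row i is the label probability distribution of instance i
    adj  : (n, n) symmetric 0/1 adjacency matrix of the data network
    lam  : weight on the network-smoothness term (0 = no smoothing)
    """
    P = P.copy()
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0  # isolated nodes keep their own distribution
    for _ in range(iters):
        neighbor_avg = adj @ P / deg          # mean distribution over neighbors
        P = (1 - lam) * P + lam * neighbor_avg  # pull toward the neighborhood
        P /= P.sum(axis=1, keepdims=True)     # renormalize rows to distributions
    return P

# Toy example: three nodes in a chain; the middle node starts uncertain
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
P0 = np.array([[0.9, 0.1],
               [0.5, 0.5],
               [0.2, 0.8]])
P = smooth_labels(P0, adj)
```

In the full model, this smoothness pressure acts as the regularizer alongside the PLSA log-likelihood, so unlabeled instances far from any labeled node still receive informative label distributions through both their attributes and their links.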