With the explosion in the amount of data available for classification problems, obtaining labels for an entire dataset becomes increasingly difficult. As a result, we are frequently presented with only a small labeled fraction of the whole dataset. Semi-supervised methods can leverage the remaining unlabeled examples to improve classifier accuracy. We therefore propose a novel hybrid technique that extends the Self-Training and Help-Training semi-supervised approaches by incorporating Active Learning to determine the classifier's confidence in the unlabeled samples. Specifically, we employ the Query-by-Committee (QbC) approach and call the resulting method Summit-Training. We apply this method to a range of generative and discriminative classifiers and evaluate it on several benchmark datasets and real-world problems. Notably, the formulation of Summit-Training makes semi-supervised learning applicable to purely discriminative classifiers for which no probabilistic representation of the evaluated classes exists. Compared to other semi-supervised techniques (Self-Training and Help-Training), the proposed method achieves superior performance. It also has better generalization properties, since it reduces the number of hyper-parameters and relaxes the conditions on the classifiers to which it can be applied.
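The core idea described above, using committee agreement rather than a probabilistic confidence score to decide which unlabeled samples to pseudo-label, can be sketched as follows. This is a minimal illustration under our own assumptions (committee members, agreement criterion, and number of rounds are all illustrative choices), not the authors' implementation of Summit-Training:

```python
# Sketch of QbC-guided self-training: a committee votes on unlabeled
# samples, and only unanimously labeled ones join the labeled pool.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Only a fraction (10%) of the data carries labels.
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_lab, X_unlab, y_lab, _ = train_test_split(
    X, y, train_size=0.1, random_state=0
)

# Heterogeneous committee; note that only hard votes are needed, so purely
# discriminative classifiers without probabilistic outputs qualify.
committee = [SVC(), DecisionTreeClassifier(random_state=0),
             KNeighborsClassifier()]
main_clf = SVC()  # classifier trained semi-supervisedly

for _ in range(5):  # a few pseudo-labeling rounds
    for member in committee:
        member.fit(X_lab, y_lab)
    votes = np.stack([m.predict(X_unlab) for m in committee])  # (k, n)
    # Confidence = committee agreement: keep unanimously voted samples.
    agree = (votes == votes[0]).all(axis=0)
    if not agree.any():
        break
    X_lab = np.vstack([X_lab, X_unlab[agree]])
    y_lab = np.concatenate([y_lab, votes[0][agree]])
    X_unlab = X_unlab[~agree]
    if len(X_unlab) == 0:
        break

main_clf.fit(X_lab, y_lab)
```

In this sketch, disagreement within the committee plays the role that low posterior probability plays in classical Self-Training: contested samples stay unlabeled, so label noise from overconfident single-model predictions is reduced.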