Min_DBN is an algorithm that minimizes the number of distinct branching conditions in a decision forest by sharing thresholds of the same variable among different branching nodes, under the constraint that the decision paths of given feature vectors change at each branching node by no more than a specified percentage. This simplification is effective for compact hardware implementation of a decision forest, since it allows comparators to be shared. In this paper, we experimentally investigate how effective the algorithm is for decision forests constructed by various ensemble learning methods (random forest, extremely randomized trees, AdaBoost, and gradient boosting) in both classification and regression. Furthermore, we improve the algorithm so that, for bagging-based learning, the path-change condition is applied to each component tree of the ensemble using only the feature vectors actually used to train that tree, and we verify its effectiveness experimentally.
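To make the idea of threshold sharing concrete, the following is a minimal, hypothetical sketch of merging nearby thresholds on the same feature subject to a path-change tolerance; the function name `merge_thresholds`, the node representation, and the greedy merge order are illustrative assumptions and not the paper's actual Min_DBN procedure.

```python
# Hypothetical sketch of threshold sharing in the spirit of Min_DBN.
# Not the paper's algorithm: node layout and merge strategy are assumed.

from collections import defaultdict

def merge_thresholds(nodes, X, tolerance=0.01):
    """Greedily share thresholds among branching nodes on the same feature.

    nodes: list of dicts {"feature": int, "threshold": float,
                          "samples": list of row indices reaching the node}
    X: 2-D sequence of feature vectors used to check path changes
    tolerance: max fraction of a node's samples whose branch may flip
    """
    by_feature = defaultdict(list)
    for node in nodes:
        by_feature[node["feature"]].append(node)

    for feature, group in by_feature.items():
        group.sort(key=lambda n: n["threshold"])
        shared = group[0]["threshold"]          # current shared threshold
        for node in group[1:]:
            t = node["threshold"]
            # Count samples at this node whose branch would flip if its
            # threshold were replaced by the shared one.
            flips = sum(
                (X[i][feature] <= t) != (X[i][feature] <= shared)
                for i in node["samples"]
            )
            if node["samples"] and flips / len(node["samples"]) <= tolerance:
                node["threshold"] = shared      # merge: one comparator serves both
            else:
                shared = t                      # start a new shared threshold
    return nodes
```

Under this reading, every node that adopts a shared threshold on the same feature can reuse a single comparator in hardware, which is the source of the circuit-size reduction the abstract refers to.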