Bayesian networks are graphical tools for representing high-dimensional probability distributions. They are widely used in machine learning and in many application areas such as medical science. This paper studies whether the concept classes induced by a Bayesian network can be embedded into a low-dimensional inner product space. We focus on two-label classification tasks over the Boolean domain. For full Bayesian networks and almost full Bayesian networks with n variables, we show that the VC dimension and the minimum dimension of the inner product space induced by them are both 2^n - 1. Also, for each Bayesian network we show that if the network constructed from it by removing X_n satisfies either (i) it is a full Bayesian network with n-1 variables, the number of parents of X_n is i, and i …
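As a rough consistency check on the stated dimension (a sketch based on standard parameter counting, not an argument taken from the paper): in a full Bayesian network over Boolean variables X_1, ..., X_n, each X_i has all of X_1, ..., X_{i-1} as parents, so its conditional probability table ranges over 2^{i-1} parent configurations and contributes 2^{i-1} free parameters. Summing over all variables gives

\[
\sum_{i=1}^{n} 2^{\,i-1} = 2^{n} - 1,
\]

which matches the value 2^n - 1 stated above for both the VC dimension and the minimum dimension of the induced inner product space.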