Authors
Sebastian Berns, Simon Colton, Christian Guckelsberger
Publication date
2023/6/22
Conference
International Conference on Computational Creativity (ICCC)
Description
Large data-driven image models are extensively used to support creative and artistic work. Under the currently predominant distribution-fitting paradigm, a dataset is treated as ground truth to be approximated as closely as possible. Yet many creative applications demand a diverse range of outputs, and creators often strive to actively diverge from a given data distribution. We argue that an adjustment of modelling objectives, from pure mode coverage towards mode balancing, is necessary to accommodate the goal of higher output diversity. We present diversity weights, a training scheme that increases a model's output diversity by balancing the modes in the training dataset. Initial experiments in a controlled setting demonstrate the potential of our method. We discuss how our approach connects to diversity, equity, and inclusion in generative machine learning generally, and in computational creativity specifically. An implementation of our algorithm is available at https://github.com/sebastianberns/diversity-weights
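To illustrate the general idea of mode balancing, the minimal sketch below (not the authors' released implementation; see the repository above for that) assumes that modes are approximated by k-means clusters over per-sample feature embeddings and that each sample is weighted inversely to the size of its cluster, so that every estimated mode contributes roughly equally to training. The function name diversity_weights, the clustering choice, and the hyperparameter n_modes are illustrative assumptions.

# Hypothetical sketch: estimate modes via k-means, then weight samples by
# inverse mode frequency so all modes contribute equally to the objective.
import numpy as np
from sklearn.cluster import KMeans

def diversity_weights(features: np.ndarray, n_modes: int = 10, seed: int = 0) -> np.ndarray:
    """Return one weight per sample that balances the estimated modes.

    features: (n_samples, n_dims) array of per-sample embeddings.
    n_modes:  assumed number of modes to balance (a free hyperparameter).
    """
    labels = KMeans(n_clusters=n_modes, random_state=seed, n_init=10).fit_predict(features)
    counts = np.bincount(labels, minlength=n_modes)   # samples per estimated mode
    weights = 1.0 / counts[labels]                    # inverse mode frequency per sample
    return weights * len(weights) / weights.sum()     # normalize to mean 1

# Toy usage: an imbalanced two-mode dataset (900 vs. 100 samples).
rng = np.random.default_rng(0)
x = np.vstack([rng.normal(0.0, 1.0, size=(900, 2)),
               rng.normal(6.0, 1.0, size=(100, 2))])
w = diversity_weights(x, n_modes=2)
# Minority-mode samples receive roughly 9x the weight of majority-mode samples,
# so a weighted loss or weighted sampler treats both modes equally.

Such weights could, for instance, be used to re-weight a training loss or to drive weighted sampling of the dataset; the paper and repository describe the authors' actual training scheme.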
Scholar articles
S Berns, S Colton, C Guckelsberger - arXiv preprint arXiv:2304.11961, 2023