I like to think of an artist as a scientist who builds assumptions and experiments on observations of modified worlds, worlds formed by including the "insignificant" things that are usually left behind. Artists are often unbound by default rules and formulas, and so everything seems possible within their framework.
About fifty years ago, similar thinking led to utopian world models, some of which were transferred into reality and, at the same time, into their inevitable failure. It seems that Utopia has suffered a major loss of popularity since then. But did it really fail altogether? What could Utopia be today, and how can we use it to shape our future way of living?
Not only the future plays a role in investigating these modified worlds; the present, too, can be seen in various ways. Many technologies were invented to unveil processes that run parallel to our bounded perception. Unfortunately, they rarely let us perceive anything we did not program them to show. I am therefore fascinated by diverting technologies from their intended use in order to open up spaces for, and beyond, imagination.
The night sky is certainly one of those spaces. It harbors a vast range of possibilities for imagination and utopian vision that has been drawn upon for hundreds of years. The night sky has also been one of the major influences on my artistic thinking, not only because it offers room for different worlds in a very direct sense, but also because we know so little about it: every assumption of any kind has an equal right to exist until it is "proven wrong".
I am working on a project called "Habitable Zone" (since 2014) that includes all of the above: constructing an imaginary world with endless possibilities, experimenting with technologies as images of what is not perceivable, and reflecting on the night sky and on the discovery of yet unknown planetary systems.
Beyond the Horizon
"Beyond the Horizon" (working title) is an interactive setting that allows participants to alter and experience certain parameters of a technological environment in real time.
The system consists of three parts: the projection of an image (a landscape, a representation of the horizon line), a (human) participant, and a sensor device mounted on the participant's head.
Participants manipulate the projected image by moving around the space.
Certain areas of the image are transformed into black-and-white pixel arrays depending on the distance between participant and projection.
These transformations lead to many different results: dark pixel arrays in the outer realm of the projection can dissolve the frame of the projection, for example, while bright pixel arrays in the image center can shift and blur the horizon line of the landscape.
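The mapping described above, from a participant's distance to the coarseness of a black-and-white pixel transformation, can be sketched in code. This is a minimal illustration, not the installation's actual software: the distance range, block sizes, and brightness threshold are all assumed values standing in for the head-mounted sensor's real parameters.

```python
def pixelate(image, distance, min_d=0.5, max_d=4.0, max_block=16):
    """Coarsen a grayscale image (list of rows, values 0-255) into
    black-and-white blocks.

    `distance` stands in for the reading from the head-mounted sensor;
    the closer the participant, the coarser (more transformed) the
    region becomes. All thresholds here are illustrative assumptions.
    """
    h, w = len(image), len(image[0])
    # Map distance onto [0, 1]: near the projection -> t close to 1.
    t = max(0.0, min(1.0, (max_d - distance) / (max_d - min_d)))
    block = max(1, int(t * max_block))
    out = [[0] * w for _ in range(h)]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            # Average each block, then threshold it to pure black or white.
            cells = [image[y][x]
                     for y in range(by, min(by + block, h))
                     for x in range(bx, min(bx + block, w))]
            value = 255 if sum(cells) / len(cells) > 127 else 0
            for y in range(by, min(by + block, h)):
                for x in range(bx, min(bx + block, w)):
                    out[y][x] = value
    return out
```

Applied to the outer realm of a dark image, the resulting all-black blocks would merge with the surrounding darkness and so dissolve the projection's frame, which is the effect the text describes.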
The concept developed from an interest in the limitations of human perception, as well as in the limitations and artifacts of commonly used technologies. These include machine learning algorithms (AI) that draw lines between data sets in order to build categories and calculate decisions.
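The idea of an algorithm "drawing a line between data sets" can be made concrete with a toy classifier. This sketch is an assumption-laden stand-in for the far more complex systems the text alludes to: it assigns a point to whichever class centroid is nearer, which implicitly draws a straight boundary (the perpendicular bisector) between the two groups.

```python
def centroid(points):
    """Mean position of a list of (x, y) points."""
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

def classify(point, class_a, class_b):
    """Assign `point` to the class with the nearer centroid.

    The decision boundary this rule implies is a literal line drawn
    between the two data sets. (Toy illustration only, not the
    algorithms referred to in the text.)
    """
    ca, cb = centroid(class_a), centroid(class_b)
    da = (point[0] - ca[0]) ** 2 + (point[1] - ca[1]) ** 2
    db = (point[0] - cb[0]) ** 2 + (point[1] - cb[1]) ** 2
    return "A" if da <= db else "B"
```

Everything on one side of that line is forced into one category, everything on the other side into the other, regardless of how meaningful the division is, which is precisely the kind of artifact the concept reflects on.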