Citizen science in the BioWaWi project
On March 27, 2025, learning group 5b went on an exciting excursion to the Waldhägenich nature reserve. Accompanied by Dr. Wachinger from the BioWaWi citizen science project (KIT), the pupils explored nature in a special way.
After a hike lasting just under an hour, they experienced the habitat with all their senses and made audio recordings of wild animals. These will later be analyzed in BNT lessons to understand how environmental changes affect biodiversity.
The project aims to develop a species monitoring system based on sound patterns, so that changes in species composition can be identified and long-term environmental protection measures derived. It was a great educational day, with valuable insights into climate protection and a chance to discover nature from a scientific perspective!
Report by Steffen Busam, teacher at the Aloys Schreiber School in Bühl
In our student project, we approach the question of how we can actually learn to recognize the invisible. Drought is often not immediately visible: sometimes the landscape looks completely normal - and yet the soil lacks water, the river lacks movement, the plant lacks life. Together with school classes, we want to go in search of clues at three points in the year. The starting point is our own perception: What actually looks "normal"? What seems strange? What do we not even notice?
In the first session, we look at the idea of *intuition* - the feeling we have before we think. We work with optical illusions, games with perspective and perceptual irritations. The children deliberately photograph things that are not what they appear to be. They also explore whether and how an illusion can be captured in a photo at all: they try to photograph optical illusions so that the illusion remains recognizable - and ideally take a second picture that shows how things really are. In this way they learn that cameras do not always show the truth, and that machines do not automatically "see" what we see either.
The second session is about change and prediction. The children choose certain places - a path, a riverbank, a plant - and make a prediction: what will this place look like in a month's time? They record their expectation with a photo and write down what will change. At the same time, their image is processed by an image generation model - such as Stable Diffusion: Based on the photo, the AI creates its own forecast of how this place could change. On the next visit, a second real photo is taken and the three versions - child, AI, reality - are compared with each other. This creates a dialog between time, imagination and reality.
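The image-to-image step described above can be sketched in a few lines. The following example is only a minimal illustration, assuming the Hugging Face diffusers library and a publicly available Stable Diffusion checkpoint; the checkpoint name, prompt, file names and parameters are assumptions for demonstration, not details taken from the project.

```python
# Minimal sketch of an AI "forecast" from a child's photo (assumptions:
# diffusers library, a public Stable Diffusion checkpoint, illustrative paths).
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1",   # any available SD checkpoint works here
    torch_dtype=torch.float16,
).to("cuda")

# The child's photo of the chosen place (hypothetical file name).
init_image = Image.open("riverbank_march.jpg").convert("RGB").resize((768, 512))

# Ask the model for its own prediction of the same place one month later.
prompt = "the same riverbank one month later, late spring, drier soil, lower water level"
forecast = pipe(
    prompt=prompt,
    image=init_image,
    strength=0.6,        # how far the result may drift from the original photo
    guidance_scale=7.5,
).images[0]

forecast.save("riverbank_ai_forecast.jpg")
```

The `strength` parameter controls how much the generated forecast may deviate from the child's original photo, and therefore how pronounced the later comparison between child, AI and reality turns out.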
The third session is about how such images can feed into scientific work. The children load their photos into a system that uses AI to search for visible signs of drought: Which signs does the model recognize? Which does it miss? The children receive visual feedback - for example an assessment such as "This picture indicates moderate drought." This makes it tangible how visual data is fed into physics-based AI systems - and what can be learned from it.

The result is a project that is scientifically relevant but also remains very playful. It is not just about data, but about learning to see, about curiosity, about an eye for change - and about marveling at how much machines can recognize, while still depending on us. Because everything starts with an image. With a glance. With a question.
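To make the drought-assessment step of the third session more concrete, here is a minimal sketch of how a photo could be mapped to a drought category with zero-shot image classification (CLIP via the transformers library). The categories, model choice and file name are illustrative assumptions; the report does not specify which system the project actually uses.

```python
# Sketch of visual drought feedback via zero-shot classification (assumption:
# CLIP through the transformers library; labels and file name are illustrative).
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Illustrative drought categories; a real system would use validated labels.
labels = [
    "a healthy, well-watered landscape",
    "a landscape showing moderate drought",
    "a severely dried-out landscape",
]

image = Image.open("riverbank_april.jpg")  # the child's follow-up photo

inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
probs = model(**inputs).logits_per_image.softmax(dim=1)[0]

best = int(probs.argmax())
print(f"This picture indicates: {labels[best]} ({probs[best].item():.0%})")
```

Printing the best-matching category with its confidence corresponds to the kind of feedback sentence quoted in the report; in a real setup this would be only one building block feeding into the larger analysis.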