Resources

Integrating the diverse applications of artificial intelligence, and generative AI in particular, into teaching requires a solid conceptual understanding. The following compilation brings together a range of resources that lay these foundations and allow you to explore the subject in greater depth.

Introduction to generative AI

Generative artificial intelligence (AI) is based on deep learning, in which large artificial neural networks, known in the case of text generation as Large Language Models (LLMs), are trained. The models are trained on large amounts of data in order to generate new output based on the patterns they have learnt.

This technology, which has been researched for many years, achieved a breakthrough in 2017 when the Transformer architecture was published in the article "Attention Is All You Need". A key concept is the principle of self-attention, which allows each word to be interpreted in the context of a sentence or even an entire text. As an introduction, the article "Generative AI exists because of the transformer" illustrates these relationships in a very accessible way.
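To make the idea of self-attention more concrete, the following is a minimal sketch of scaled dot-product self-attention in Python/NumPy. The variable names, dimensions and random data are illustrative assumptions, not the reference implementation from the paper:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    """Scaled dot-product self-attention for one sequence.

    X: (seq_len, d_model) token embeddings
    W_q, W_k, W_v: (d_model, d_k) projection matrices
    """
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # how strongly each token relates to the others
    weights = softmax(scores, axis=-1)        # one attention distribution per token
    return weights @ V                        # context-aware representation of each token

# toy example: 4 tokens with 8-dimensional embeddings
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, W_q, W_k, W_v).shape)  # (4, 8)
```

Each token's output is a weighted mix of all tokens in the sequence, which is how a word ends up represented in the context of the whole text.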

Generative AI is always based on probability calculations: it predicts likely continuations rather than verified facts. It can therefore produce hallucinations and incorrect references. Biases can also arise from flawed training data or from the way it is processed. Further development of the models will reduce these problems, but the output must still always be checked for correctness.
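A small, purely illustrative sketch of why this happens: the model only produces a probability distribution over possible next tokens and samples from it, so a fluent but factually wrong continuation can be drawn instead of the correct one. The vocabulary and probabilities below are invented for illustration, not real model output:

```python
import numpy as np

# invented toy distribution over possible next tokens after
# "The Transformer paper was published in ..."
next_tokens = ["2017", "2016", "2018", "Nature", "a blog post"]
probs = np.array([0.55, 0.20, 0.15, 0.06, 0.04])

rng = np.random.default_rng()
sampled = rng.choice(next_tokens, p=probs)
print(sampled)  # usually "2017", but occasionally a plausible-sounding wrong answer
```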

Basics

The following resources offer an opportunity to learn the basics of generative AI, gain insight into its technical implementation and try out initial applications.

KI-Campus is a learning platform for artificial intelligence and offers online courses on the basics of artificial intelligence, machine learning and the opportunities that language assistants offer for university teaching.

The Hochschulforum Digitalisierung is also addressing the topic and examining the effects of generative AI from various perspectives.

Prompting

Prompting refers to formulating the input (prompt) given to a language model. The quality and precision of a prompt largely determine how effective generative AI is: with good prompting, the model can be instructed to generate targeted output.
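As a simple illustration of the difference precision makes, a prompt can spell out role, context, task and desired format rather than a one-line request. The wording below is an invented example, not an official template:

```python
vague_prompt = "Summarise this text."

structured_prompt = """\
Role: You are a teaching assistant for an undergraduate biology course.
Context: The text below is a three-page excerpt from a lecture script on photosynthesis.
Task: Summarise the key concepts for students preparing for an exam.
Format: At most five bullet points, each a single sentence, in English.

Text: <insert text here>"""

print(structured_prompt)
```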

Data entered into publicly available generative AI applications should be treated with caution, as the input is usually not protected and can be reused by providers as training data.

Both Microsoft Copilot and Google Gemini offer a protected environment for employees and students via the ETH account. Further details can be found under Tools & Licences.

Information and discussion rounds

The Refresh Teaching event series regularly addresses the topic of AI and its impact on teaching. Lecturers provide insights into their teaching and report on their own experiences.

LeLa, the learning lab for digital skills in university didactics, addressed the implications for the design of university teaching in the webinar series "AI or what the ChatGPT?". Across the two series of events, various topics were taken up and examined in depth:

  • Volume 1: Implications for language, writing and didactics; reflections on aesthetics and ethics.
  • Volume 2: What do the data say? Performance records, AI in research, teaching and university operations.