
A danger for students? Why ChatGPT belongs in the classroom


When using ChatGPT, you sometimes get the feeling that computers have started to think. "Write a eulogy for my mother's 80th birthday", "Explain how the holes get in the Swiss cheese" or "Write a computer code to recognize prime numbers" - the algorithm is not at a loss for an answer.
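To give a sense of what the prime-number prompt mentioned above produces, here is a minimal sketch of the kind of code such a request typically yields. The function name and the trial-division approach are illustrative assumptions, not ChatGPT's actual output:

```python
def is_prime(n: int) -> bool:
    """Return True if n is a prime number, False otherwise."""
    if n < 2:
        return False
    # Trial division: a composite n must have a divisor no larger than sqrt(n)
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

# List all primes below 20
print([x for x in range(20) if is_prime(x)])  # → [2, 3, 5, 7, 11, 13, 17, 19]
```

Code of this kind is short, idiomatic and correct - which is exactly why it is so hard to tell apart from what a human programmer would write.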

The fascinating thing is that the written texts or codes hardly differ from those written by humans. Or would you have realized that the entire first paragraph of this article was written by ChatGPT?

Don't worry, you are not alone. The renowned science journal "Nature" recently reported on an experiment in which researchers were asked to judge whether abstracts of scientific publications were genuine or had been written by ChatGPT. In around a third of cases, the researchers failed to recognize the artificially generated texts.

That is because ChatGPT was trained to write answers that appear as authentic to us as possible. The algorithm argues eloquently, structures even longer texts well and, if required, can be funny or profound - whether it is a letter or a poem, a newspaper article or an advertising mailer. ChatGPT will even churn out entire presentations and term papers in seconds.

At first glance, it seems understandable that the education authorities in several major US cities such as New York, Los Angeles and Seattle reacted immediately and blocked access to ChatGPT. The concern is that learners will simply cheat and no longer develop critical thinking and problem-solving skills - skills that are becoming ever more important in the 21st century. But this approach is wrong.

First, ChatGPT cannot be banned: students will always find creative ways to use it. Second, progress cannot be stopped: applications like ChatGPT have the potential to deliver a real productivity boost.

Why should companies still employ countless employees in the future to write simple customer emails, advertising flyers or operating instructions? Why should offices and authorities waste time writing simple letters or info mails?

Even software code will not always need programmers in the future. What it will need are people who recognize the possible uses of ChatGPT and its successors - employees who can work with artificial intelligence and automate processes, and who may even develop entirely new business models with the help of algorithms.

Their education must begin now! That is why the use of ChatGPT in schools is so important: children learn not only how to use artificial intelligence, but also where its limits lie.

Third, ChatGPT shows how important critical thinking and the competent handling of knowledge will be in the future. As convincing as the texts of chat algorithms may sound, they are not always true. ChatGPT can even spout completely fact-free nonsense with great aplomb.

For example, when we ask it about errors in Albert Einstein's general theory of relativity and it eloquently calls the mass-energy relationship into question. Or when it writes about voter fraud in the last US election. The internet will soon be flooded with perfectly worded texts and information, but also with more misinformation than we could ever have dreamed of.

In order to distinguish right from wrong, true from untrue in the future, contextual knowledge and experience are needed, but also critical questioning and exchange with others. And you can learn that!

For example, in exercises in which students try to spot computer-generated texts. By learning when ChatGPT is appropriate for research and when it is not. By writing entire papers or essays with ChatGPT and then checking them against sources for accuracy. By engaging critically with computer-generated texts and forming their own opinion.

At the same time, the classic essay or homework assignment will not become obsolete - quite the opposite. However, the focus should not primarily be on the result, but on the process of creation: the research, the structuring, the development of one's own opinion, the discussion. In other words: "how we think", not "what we think".

Because unlike us humans, applications like ChatGPT cannot think for themselves. They process existing information and draw on knowledge that has long been thought up.

Let's now create the basis for future generations to be able to think for themselves and use artificial intelligence responsibly. "Because in the end it's only the spirit that brings every technology to life." Incidentally, it wasn't ChatGPT that said that, but Goethe.

Swantje Dettmers is a scientist and psychological consultant. Sebastian Dettmers is CEO of the global recruiting platform StepStone, which like WELT belongs to Axel Springer SE, and author.

"Everything on shares" is the daily stock exchange shot from the WELT business editorial team. Every morning from 7 a.m. with the financial journalists from WELT. For stock market experts and beginners. Subscribe to the podcast on Spotify, Apple Podcast, Amazon Music and Deezer. Or directly via RSS feed.
