
[Misc] University? A.I. World...??



Moshe Gariani

Well-known member
Mar 10, 2005
12,305
Paper. The others would be plain silly. Biros and felt tip pens would be allowed, no insistence on quills. OK?

Yes, of course. I was trying to be clever but I don't feel very clever.

What WILL the pros and cons be if children born this year end up leaving school being unable to write using pen and paper..?

History seems to tell us that technological advances lead to improvements in work efficiency and living standards but the importance of current schoolchildren learning to handwrite isn't clear to me. (I know, I'll ask ChatGPT that question and become instantly relatively expert in the matter...)
 






The Antikythera Mechanism

The oldest known computer
NSC Patron
Aug 7, 2003
8,273
How times change. For example, when I did my A-Levels - Physics, Pure Maths and Applied Maths, only log tables could be used, all workings had to be shown and the grade / pass / fail was based solely on the exam, nothing on course work. :laugh:
 


GoldstoneVintage

Well-known member
Oct 20, 2024
322
Europe
May not be a completely foolproof solution, but it might help to insist that exams, theses and dissertations should be handwritten in the student's own handwriting.
Yes, I would go further and say that the only way to ensure that the work is the student's, is to return to 100% exams for assessments. Doesn't mean AI can't be utilised to support learning though.
 






chickens

Have you considered masterly inactivity?
NSC Patron
Oct 12, 2022
2,989
Haha, well, at least our new AI overlords are polite, articulate, and terrifyingly efficient!

On the bright side, if universities adapt quickly, this could lead to a much-needed shake-up of outdated assessment models. On the downside… well, we might all end up working for an AI that writes better than us, thinks faster than us, and never needs coffee breaks.

Would you say you’re optimistic or a bit wary about where this is all heading?

You may regret asking. I’m positive on the possibilities of AI. I think it can already do incredible things, and will only improve from here.

I’m cynical about the organisation of society. Most technological advances, however brilliant, have been weaponised at one point or another to try and shape society, usually to ensure the current custodians of that technology achieve maximum wealth and power.

Now, this is the first technology that can potentially think for itself. E.g. if somebody keeps feeding it false information, there’s the potential for it to learn to discount information from that source, or at least assign it a lower value.

How does that play out in a society where quite often outright falsehood or at least unnecessary obfuscation are used as standard tools of politics and industry?

I can’t help but feel that very quickly there are going to be controls placed on these things, and that as they’re privately owned, we’ll have no visibility as to what those controls are, or how they affect each model’s outputs.

For those still awake, my final point is, if we assume that the growth of AI leads to significantly improved progress in the field of robotics, which doesn’t feel like an impossible stretch, how does society organize itself when humanity is not required in the workforce?

If it no longer holds true that a human being is the best tool for the job, work becomes a hobby, fetish or affectation. Almost all world societies (except the French) are based around work being integral to our sense of worth and wellbeing. What comes next?

Scary and exciting at the same time.
 


Blues Guitarist

Well-known member
Oct 19, 2020
740
St Johann in Tirol
We are moving back to more in-person exams in some cases for this very reason, but because kids today only communicate via texting, their handwriting is unreadable!
I’m studying for a Master’s degree in Innsbruck where at least half of my exams are 20-minute conversations. I hate it because it tests your memory as much as your understanding, but it gets round the problem of students using AI to write essays.
 


Harry Wilson's tackle

Harry Wilson's Tackle
NSC Patron
Oct 8, 2003
58,332
Faversham
There must be lots of people on here with good experience and views on the changes caused by A.I. technology advances...

I am pretty ignorant myself. Only really know a little bit about capability of A.I. within the Higher Education context.

It feels like the situation is progressing noticeably on a monthly timescale if not weekly. A short while ago we were complacently talking about A.I. generated work being mostly "rubbish" and easy to spot. I attended a webinar this week and am now convinced that A.I. will very soon be able to generate post-graduate expert level writing on any topic. Traditional essays where the challenge is to review available theory/research literature and make application to answering a question will be meat and drink to it. It will also be able to write high level "research reports" having completed sophisticated quantitative or qualitative data analysis.

None of this will be detectable. The key aspect it needs to brush up is referencing. But even if it were detectable, what would be the argument for denying students access to available tools? We don't stop them from using word processors, calculators or Statistical Packages...??

What does this mean for the much vaunted "transferable skills" that graduates are meant to benefit from acquiring...? Are graduate skills in analysis and different forms of communication, that used to have real-life value, already devalued?

Is the benefit of university education and a degree certificate changing radically? Would your advice to an 18 year old about going to university (to read Sociology or Business Studies, say) be any different now compared to a couple of years ago because of what is happening with A.I. ...???
This is a brilliant question. I am on an AI impact committee at the uni where I work. There are several strands of concern.

1. Detection of plagiarism. Here the main concern is the cutting and pasting of someone else's published words and representing them as your own. That is intellectual theft, and cheating. The concern is that AI text is undetectable. My solution is that I tell students to not use generative AI. Not because it is cheating but because you learn nothing from it. As far as I am concerned if it is not detectable in Turnitin, it is not plagiarism, and it is not my concern.

Madly, the college is focused on telling students to edit AI generated (and other stolen) text so it does not come up as a hit in Turnitin. My argument is that this is teaching students how to cheat better. If the answer is generated by someone else it is not your answer.

2. How to use generative AI productively. I have to say that the only use I would countenance would be to improve the quality and readability of the English of something constructed without AI input. I don't have a problem with this.

3. How to set tasks that cannot be done by the press of a few buttons in ChatGPT. I set essay questions that are not amenable to AI because they are based on bespoke interpretations of the literature and structures invented by me to explain drug actions. They are not searchable. Yet.... I suspect that if students upload their essays into an AI-searchable space all that will change.

I have found that if you ask students to describe, they can all do it, and ChatGPT will do it better for them. If you ask them to explain, or to compare and contrast, or to justify, it gets much harder. I spent most of today marking essays. Very few answered the question. It was mostly entry level description, with very weak critical analysis.

4. Process. We warn students about plagiarism but increasing numbers are being caught. The problem is that many colleagues seem happy with a colour mosaic and a 40% hit rate in Turnitin. But when I see the mosaic, which can ONLY be generated by cut, paste and an edit to disguise the source, I see cheating and bad scholarship. My solution is that because I know the student will not be zero marked by the misconduct committee, I give them a very low mark for bad scholarship. Such work is usually crap anyway because the students don't understand what they are trying to edit.

5. How does AI help students write exams? The answer is it doesn't, because they don't have access to AI in the exam hall. In fact an AI-dependent student (coursework) will be finished in an exam. We had a lad here 2 years ago who fell into this fail-space. Sadly the college is moving towards not using essays and going the MCQ route in exams. Then students could get a first via a combination of AI for coursework and rote learning for exams. Grade inflation. Then who gets the job? The reality is that students make mistakes, and poodle-faking via AI won't help them in the long run if they don't bother to learn and understand what the software has generated for them.
 




Harry Wilson's tackle

Harry Wilson's Tackle
NSC Patron
Oct 8, 2003
58,332
Faversham
May not be a completely foolproof solution, but it might help to insist that exams, theses and dissertations should be handwritten in the student's own handwriting.
I proposed this when concerns were expressed about computers in an exam hall having online access, 2 years ago.

I was told 'students don't know how to use a pen'.

That was by a 'senior' administrative academic colleague (someone promoted to Professor as a reward for chairing committees).

We have sadly promoted people who were shit at research and shit at teaching to positions of administrative power.

Needless to say we had an avalanche of exam hall cheating that has largely been covered up.
Students from a 'certain part of the world' see it as foolhardy to not take advantage of a loophole.
We have had students logging into the system and uploading work via pals located outside the exam hall.
We have scores of kids from this 'particular country' having to explain to parents why they have been expelled in their final year.

In reality, for coursework, and now exams, we have online submission systems that are necessary to facilitate process, the collation of work and the curation of marking. This includes doing plagiarism checks. Handwritten work will never return. But students do know how to use a pen, and senior administrative academics need to learn a bit of humility.
 


Harry Wilson's tackle

Harry Wilson's Tackle
NSC Patron
Oct 8, 2003
58,332
Faversham
When I was at Uni I had a viva exam on my final dissertation where they would question me about what I had written. It's all well and good to use AI to create your dissertation, but you need to understand and back up what is written.
AI could be used as a tool to maybe write something that would achieve good marks, but the viva would sort out what knowledge the student actually has on the subject.
I think people who have a good knowledge on the subject will write their own dissertations while people who have been coasting through uni will use AI and would be found out in a viva assessment.
The problem with a viva is it is hard to standardize.

We used to viva all students, then realized we were not doing anything with the outcome.

Around 30 years ago we decided that a viva for a student on a grade boundary could be used to take the student up a grade, but not down. Then we had external examiners who needed a crib sheet to ask questions to students about exam essays. So we had to write model answers for them. Pointless.

In the end even using a viva to test if a bit of coursework was done by the student became impossible to manage. Some students claim they 'freeze' in a viva. If we can't get an answer out of them what are we supposed to do to the mark for the written work?

These days we don't do any vivas.

There is no point doing something that generates an outcome you cannot do anything with.
This degenerates into a discussion equivalent to the one on here recently about what to do about the discontent in the dressing room.
(whose discontent, identified how and reported by whom, etc.)
 


GT49er

Well-known member
NSC Patron
Feb 1, 2009
50,337
Gloucester
Yes, of course. I was trying to be clever but I don't feel very clever.

What WILL the pros and cons be if children born this year end up leaving school being unable to write using pen and paper..?

History seems to tell us that technological advances lead to improvements in work efficiency and living standards but the importance of current schoolchildren learning to handwrite isn't clear to me. (I know, I'll ask ChatGPT that question and become instantly relatively expert in the matter...)
Well, those born in 2020 are now in school, learning to write with pen and paper. Long may it continue! I never want to be dealt with by, say, a surgeon who passed his medical exams with flying colours but actually used AI and really doesn't have a clue about the actual job!
 




GT49er

Well-known member
NSC Patron
Feb 1, 2009
50,337
Gloucester
I proposed this when concerns were expressed about computers in an exam hall having online access, 2 years ago.

I was told 'students don't know how to use a pen'.

That was by a 'senior' administrative academic colleague (someone promoted to Professor as a reward for chairing committees).

We have sadly promoted people who were shit at research and shit at teaching to positions of administrative power.
Exactly. What worries me is that we'll also have 'highly qualified' (cough, AI helped) people who are shit surgeons, shit pharmacists, shit teachers (and lecturers) and shit civil and structural engineers who haven't got a clue about the bending moment of an I-beam, or the breaking strength of reinforced concrete etc, etc.
 


Harry Wilson's tackle

Harry Wilson's Tackle
NSC Patron
Oct 8, 2003
58,332
Faversham
I am now asking final year students to synthesise a narrative about something we don't know about, using information and narratives about things we do know about, and tapping into processes of logic and experimental practice concerning the generation of safe information (safe being reliable, factual and relevant). This makes it hard to make lists and expect the lists to somehow magically create the narrative.

Most of my colleagues don't do this because they have better things to do (that are more likely to get them tenure, then promoted) and because they don't know how to do this, and don't see it as a priority.
 


albionalba

Football with optimism
NSC Patron
Aug 31, 2023
373
sadly in Scotland
Interesting discussion. Away from the AI issue I would really caution against advising 18 year olds to follow a university study route. Most of the UK universities are castles built on sand and there will be massive turbulence to come from low international recruitment, the pension deficit and general disenchantment with middle and lower ranking institutes. There are so many learning sources out there (not always easy to verify truths but that's part of the learning process). I think the advice to young people is to learn how to learn and create original output and publish it digitally, build it physically or whatever is most appropriate and use that 'portfolio' as your resume for employment.

AI will have massive impact on the employment market (and learning sources and activity), so better to stay in reactive mode for learners and teachers and iterate as things change.

I do understand @Harry Wilson's tackle points about vivas (although they are appealing as a check for AI assistance) - as an employer, though, I'd rather sit with colleagues and judge a young person's portfolio selectively alongside a chat than rely on qualifications. The trouble with doing that in an academic environment is that it will always provoke debate for debate's sake, whereas in a business the interviewers are solution-oriented towards making a good hire.
 




Harry Wilson's tackle

Harry Wilson's Tackle
NSC Patron
Oct 8, 2003
58,332
Faversham
Exactly. What worries me is that we'll also have 'highly qualified' (cough, AI helped) people who are shit surgeons, shit pharmacists, shit teachers (and lecturers) and shit civil and structural engineers who haven't got a clue about the bending moment of an I-beam, or the breaking strength of reinforced concrete etc, etc.
At the end of the day, if you have trained to qualify in something by means that don't train you to do it, you won't last five minutes, and the more critical the job the sooner you will be found out.

Training as a surgeon will never be done based on AI.
Nor Albion mid field maestro.
Can you imagine? :lolol:

All AI does is generate documents that use available information to explain what is already known, or speculate on what cannot yet be proven using plausible and unoriginal narratives. This includes AI-generated music.

If the job entails nothing more than the generation of manuals then a proficiency in ChatGPT may be an appropriate qualification.
 


GT49er

Well-known member
NSC Patron
Feb 1, 2009
50,337
Gloucester
At the end of the day, if you have trained to qualify in something by means that don't train you to do it, you won't last five minutes, and the more critical the job the sooner you will be found out...........
At which point the Peter Principle frequently kicks in. Kicking the incompetent failure upstairs is all too easy and all too common (see under Service: Civil!)
Training as a surgeon will never be done based on AI.
Nor Albion mid field maestro.
Can you imagine? :lolol:
Wasn't that Dahoud?
 


Harry Wilson's tackle

Harry Wilson's Tackle
NSC Patron
Oct 8, 2003
58,332
Faversham
At which point the Peter Principle frequently kicks in. Kicking the incompetent failure upstairs is all too easy and all too common (see under Service: Civil!)

Wasn't that Dahoud?
The Peter principle is the phenomenon of doing a good job that gets you promoted, and so on, till you are promoted into a position for which you are incompetent.

If you got the initial appointment based on AI fakery of genuine competence, it is unlikely you'll climb much further up the tree.
 


Cotton Socks

Skint Supporter
Feb 20, 2017
2,297
I finished my degree last year. I was astounded by how many people confessed to using AI throughout the whole course after they were flashing their first class degrees on FB.
One person in particular was shit at everything, constantly asking questions of other students about how to do this & that, then suddenly started excelling on every essay. Sudden distinctions etc. When she was pissed after the grad ceremony she said she'd used AI a lot, as she really needed a decent grade. So she has no understanding of the subject matter and, more importantly, she has the potential to affect people's lives with no clue what she's doing.
I did my degree as I'm interested in the subject & wanted to understand it more. I got the grade I deserved for my standard of work at the end & being quite a few words short on my intro & conclusion as I was running out of time (not a 1st), but I know I understand the subject matter.
I feel that a lot of people 'cheated', what's the point in doing it if you're going to cheat? If I had got a 1st due to using AI, I would feel guilty forever!
 




GT49er

Well-known member
NSC Patron
Feb 1, 2009
50,337
Gloucester
The Peter principle is the phenomenon of doing a good job that gets you promoted, and so on, till you are promoted into a position for which you are incompetent.

If you got the initial appointment based on AI fakery of genuine competence, it is unlikely you'll climb much further up the tree.
Perhaps that's the sequence then -
1). The Peter Principle;
2). Kicked upstairs;
3). Early retirement on health grounds (or in some cases, the House of Lords)!
 


Harry Wilson's tackle

Harry Wilson's Tackle
NSC Patron
Oct 8, 2003
58,332
Faversham
Perhaps that's the sequence then -
1). The Peter Principle;
2). Kicked upstairs;
3). Early retirement on health grounds (or in some cases, the House of Lords)!
I clearly managed my 'career' all wrong(ly). :lolol:
 

